Tuesday, October 27, 2009

"The site with the id {GUID} could not be found" and BLOBCache


A memo to myself; nothing here is my own discovery.

Anyway, it was really hard to figure out. But I was lucky that it happened in our testing environment, so I did not have to start crying yet.

The problem was the following.
I was getting a “The site with the id {GUID} could not be found” error when fetching some CSS files from the Style Library. But only on the extended web application and, furthermore, only when a user logs in to the site. # Our site accepts anonymous visitors too.

I re-extended the web app and re-restored the data from our production site. Nothing helped. # I did not have the problem on the production site.

As always, I went to the net for some help, or at least a hint. But this time I did not think I would find anything helpful; I was starting to believe that the content database might have been corrupted somehow, or something…

However, there are people who have had the exact same problem and overcome it.
http://blogs.msdn.com/joshuag/archive/2008/05/22/filenotfoundexception-the-site-with-the-id-guid-could-not-be-found.aspx

This was a big help. I did not even think about the cache.
Unfortunately, however, the solution did not work for me.

But now I knew that all I needed was a way to clear the cache. And found this: http://blogs.pointbridge.com/Blogs/monnette_jeff/Pages/Post.aspx?_ID=15.

A more (seemingly) in-depth explanation can be found at http://sharepointinterface.com/2009/06/18/we-drift-deeper-into-the-sound-as-the-flush-comes/.
I did not have enough patience to read it thoroughly though…

In short, the UI does not work for a farm like ours, and the stsadm solution (http://msdn.microsoft.com/en-us/library/aa622758.aspx) stinks.
We have only two front-ends, so I went for the manual flush.
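As I understand it, the manual flush itself is nothing more than emptying the BLOB cache folder on each front-end (with the application pool stopped, so the cache index and the files on disk stay consistent). A minimal sketch of the emptying part in Python; the cache location is whatever the location attribute of the <BlobCache> element in your web.config points to, so the path is an assumption you must check:

```python
import os
import shutil

def flush_blob_cache(cache_dir: str) -> None:
    """Delete everything under the BLOB cache folder.

    cache_dir is the 'location' attribute of <BlobCache> in web.config
    (check your own config; there is no standard path).
    Run this on EVERY front-end, ideally with the app pool stopped.
    """
    for name in os.listdir(cache_dir):
        path = os.path.join(cache_dir, name)
        if os.path.isdir(path):
            shutil.rmtree(path)   # remove a cached subfolder
        else:
            os.remove(path)       # remove a cached file
```

With two front-ends, running this twice by hand is quicker than fighting the UI.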


Tuesday, July 21, 2009

Accessibility : Accessibility Kit for SharePoint


There is this tool (add-on?). But in the end it did not help much.

You find a couple of things in the package, but we were only interested in what they call HCCE (HiSoftware Compliant Code Engine) and the WebZonePart control adapter.

The HCCE is nothing more than a custom base class for page layouts, and what it does is simple string replacement.
For instance, the MOSS Publishing feature generates <SCRIPT> in its output HTML, which violates XHTML; it has to be <script type="text/javascript">.
It performs that replacement in the output, but only if you specify it, exactly as above, one by one, in its config file.
So it is not as if it does the work when you say you want to “comply” with XHTML or HTML. “Compliant Code Engine”… Funny…
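To be concrete, the engine’s behavior amounts to something like this toy sketch (Python, my own illustration, not AKS code): each pair has to be spelled out in its config, and anything not listed verbatim, such as <SCRIPT > with a stray space, sails through untouched.

```python
# Each (search, replace) pair must be listed explicitly in the config;
# the engine knows nothing about XHTML itself.
REPLACEMENTS = [
    ("<SCRIPT>", '<script type="text/javascript">'),
    ("</SCRIPT>", "</script>"),
]

def hcce_like_fixup(html: str) -> str:
    """Apply the configured literal replacements, in order."""
    for search, replace in REPLACEMENTS:
        html = html.replace(search, replace)
    return html
```

That is why "Compliant Code Engine" is a generous name: compliance extends exactly as far as your replacement list.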

And the other cr*p, the WebZonePart control adapter: it is a control adapter, so, as you know, it replaces tables with divs.
By default, MOSS generates lines like the one below for each web part on the page. Of course it all violates the standard.
<td id="MSOZoneCell_WebPartWPQ3" orientation="Vertical" name="MSOZoneCell" relatedWebPart="WebPartWPQ3"…

The adapter replaces those with divs. Sounds nice. But my problem was that it does not produce the title and border of the part, even if you configure it to show them.
Luckily, the package contains the source (not for HCCE; HCCE comes only as a binary, of a debug build…). So I managed to get them back.


What still remains broken is the XHTML-incompatible markup generated mainly by the Rich HTML editor.
The HCCE’s static string replacement does nothing for this. Here you want to replace, for instance, border=0 with border="0". And there could be border=1, or 2, or 3; you never know…
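What you want here is a pattern, not a table of literals. A sketch of the kind of rule I mean (Python regex; the attribute names are illustrative, not an exhaustive list of what the editor emits):

```python
import re

# Quote unquoted numeric values of a few known attributes:
# border=0 -> border="0", width=120 -> width="120", etc.
_UNQUOTED = re.compile(
    r'\b(border|cellpadding|cellspacing|width|height)=(\d+)',
    re.IGNORECASE,
)

def quote_numeric_attributes(html: str) -> str:
    """Wrap bare numeric attribute values in double quotes."""
    return _UNQUOTED.sub(r'\1="\2"', html)
```

One expression covers border=0, border=1, border=2… without listing each value, which is exactly what the HCCE cannot do.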

For this, I did not find any turnkey solution. However, I found someone who solved a similar problem in a different CMS by plugging in an httpmodule of his own.
http://www.codeplex.com/compfilter4umbraco
He uses this library http://www.codeproject.com/KB/dotnet/apmilhtml.aspx for basic clean-up of markup.

I was not sure whether I could plug an httpmodule into MOSS, but since I did not have any more promising alternative, I gave it a try. And voilà, it works!!
I created the module from scratch; I did not need any of the logic placed specifically for that CMS, and it was quicker to build one from scratch than to remove all that.
But the library, thanks to the author, I did use, although I changed it a bit.

The challenge was that at the beginning I had every page go through this filter, and it screwed things up.
So, in the end, I made it clean up only the content, not the parts coming from the master page. That works; the content area is where the markup generated by the rich editor appears.
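The shape of the final version can be sketched like this (Python rather than the actual C# HttpModule, and the marker comments are an invention for this sketch; the real module recognizes the content region in the response stream):

```python
START = "<!-- content:start -->"
END = "<!-- content:end -->"

def clean_content_only(page: str, clean) -> str:
    """Apply `clean` only to the region between the markers,
    leaving the master-page chrome before and after untouched."""
    try:
        head, rest = page.split(START, 1)
        body, tail = rest.split(END, 1)
    except ValueError:
        return page  # markers missing: leave the page alone
    return head + START + clean(body) + END + tail
```

Limiting the filter to the content region is what stopped it from screwing up the rest of the page.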


Saturday, July 18, 2009

Accessibility : XHTML or HTML


The first step toward an accessible site is conformance to a standard, a colleague of mine who closely follows this subject told me. It can be XHTML or HTML; it does not matter much, as long as we conform to one, he added.

Please correct me if I say something stupid, but in a word, a site created using the MOSS 2007 Publishing Feature conforms to neither of the two. You cannot configure it to; you have to customize it.

It is based on ASP.NET 2.0, which I believe supports XHTML more than HTML. One of the places where you clearly see this is that it inserts lines like the one below, which is invalid HTML (the self-closing slash).

<input type="hidden" name="__VIEWSTATE" id="__VIEWSTATE" value="..." />

And you can not configure it to stop doing so.

OK, let us go for XHTML then. But thank you, MS! It violates that too...
You can see it quite easily, just by inserting an image using its Rich HTML Editor. It generates a line like this.

<img border=0 ... >

Actually, I have the impression that the MOSS Publishing feature is designed(?) to create HTML sites: its OOTB master page files have an HTML doctype declaration. I really do not understand why it comes out like this...

Anyway, after having tried both, I concluded that going for XHTML was the realistic solution, and started customizing.
In the next couple of posts, I would like to talk about the series of tricks I employed.

Tuesday, May 26, 2009

IIS7 Application Request Routing

While the memory is still fresh…

URL rewriting and/or reverse proxying on IIS.
If I am not wrong, it was not possible up to IIS6, i.e. Windows 2003, unless you bought a third-party tool (which has been our case) or developed something in-house.
Now it has got everything.

I installed the module (plug-in?) called Application Request Routing.
One small note: when you install it with the version 1.0 installation exe, it also installs the URL Rewrite module, version 1.0.
That went fine on W2008, the original release, but not on R2 (the R stands for Release, it seems).
I had to install the URL Rewrite module v1.1 separately.

BTW, do we all know what CTP stands for?
The installation instructions say that I first need to uninstall CTP1, if I have it. ???
I googled it and found that it is Community Technology Preview.

When I went to install ARR (Application Request Routing), having found that URL Rewrite alone does not do reverse proxying, I already had URL Rewrite v1.1 installed.
Since v1.1 was not the “CTP”, the instruction to remove it first was not written anywhere, so I went on installing ARR and failed.

Anyway, once I got it installed, the rest was quite easy (for me; the goal I was tasked with this time was quite simple: reverse proxy everything to a given host).

I created the server farm (of, in fact, one host). A pop-up came and said it would create the routing rule for me. That was just fine; it turned out to be all I needed.
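For the record, the generated rule looks roughly like the following in the IIS configuration (reconstructed from memory, with myfarm as a placeholder for the farm name, so verify against what the wizard actually wrote):

```xml
<system.webServer>
  <rewrite>
    <globalRules>
      <!-- Route every request to the server farm; "myfarm" is a
           placeholder for the farm name given in the wizard. -->
      <rule name="ARR_myfarm_loadbalance" patternSyntax="Wildcard" stopProcessing="true">
        <match url="*" />
        <action type="Rewrite" url="http://myfarm/{R:0}" />
      </rule>
    </globalRules>
  </rewrite>
</system.webServer>
```

Knowing the shape of the rule makes it easier to tweak the pattern later, instead of re-running the wizard.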

Friday, April 24, 2009

Customize list form (cont.)


I wrote on this topic a little bit in the past.
http://murmurofawebmaster.blogspot.com/2008/11/customize-list-form.html

The motivation at that time was, and still is, to evaluate: to see what this technique can do, how easily, and what it cannot.
The thing I picked (some functionality of the web site to implement using this technique) was the feedback form (which I have to have anyway, in one way or another).
The challenges were:
1. It sends an email message. The address is given.
# The current form takes the address from the QueryString. With the new SharePoint-based website, you find it in the PropertyBag of each site. So ideally, depending on where the link to the feedback form is clicked, the form should already know, when it opens, where to send the email.
2. An anonymous visitor can give his/her feedback.

For the email, I used the SPD Workflow.
The first prototype takes the address from the QueryString, just like our current form. I do not think I can get it from the site’s PropertyBag unless I make the form an Application Page with code-behind.
With some JavaScript, I pick up the email address from the QueryString and set it in a field of the list; the Workflow can then send a message to it.
Nice. Done. (I thought…)

Then I realized that I cannot access the custom insert form of the list anonymously.
After having spent some time on it, I now guess it is probably because the site is in lockdown mode.
http://technet.microsoft.com/en-us/library/cc263468.aspx
OK, then, I can make it as a page under the Pages library.

An error: “The data source control failed to execute the insert command.”
Again I spent time on the net, to find someone saying the following.
“I’m finding out that the dealbreaker with anonymous access is the association with the SPD created workflow. … Design a workflow and associate it with the list and the form will bomb any time that the workflow is initiated.”
http://blogs.devhorizon.com/reza/?p=498

This person ended up doing it with an Application Page with code-behind. I guess I will follow the same path…

Wednesday, April 8, 2009

Content Query Web Part


My goal was to have multilingual summary links.
I tried to implement it through a combination of a custom list and the Content Query Web Part.
Our web site is in multiple languages. Imagine you have the OOTB Summary Links placed on pages in different languages, and that you update them one by one. It would be so painful for a person who does not speak the language(s), and very tedious as well.
So my idea was to have a custom list containing all the links, each marked with a language, and use the Content Query Web Part to filter them. For instance, on an Arabic page it picks up only the links marked as Arabic.

First, I defined my multilingual link content type as a type inheriting from the built-in Link type.
Then I added: a Language column (built-in); an ImageUrl column (custom, of type Publishing Image, which comes with a UI that allows my users to browse through image libraries to select an image); and a LinkSortOrder column (custom; useful also to group links to the same content in different languages).

Next, I created a custom list of that type.
# I do not think defining the type first, and then setting it as the type of the list, was required. You could have a list with those columns defined directly. It was a design choice.

Then I prepared a page layout per language. Again a design choice; I might be able to have just one layout serve all the different languages.

Now to customize the presentation.

As you know, this means writing XSLT code to render the XML returned by the CQWP the way you like.
First, we want to have a look at that XML. I found a good post here:
http://www.sharepointblogs.com/radi/archive/2009/03/17/content-query-web-part-getting-a-full-dump-of-the-raw-xml.aspx.

However, the challenge was to figure out how to tell the web part to use my XSL rather than the default one.
I have the CQWP placed on a page layout, not a page, and I did not find many people doing this on the net. At the beginning I did not think the export/upload approach was a solution for my case. I gave it a try nevertheless and found that it is a hidden attribute. “Hidden” here means that you do not see it just by creating an instance of the OOTB CQWP; so all you have to do is add it manually.

<PublishingWebControls:ContentByQueryWebPart … MainXslLink="/Style Library/XSL Style Sheets/myContentQueryMain.xsl" … >

In the XML that I finally managed to look into, of course, I did not see the columns I added. So I specified the following value in another hidden attribute.

CommonViewFields="URL, URL;ImageUrl, Image"

URL is a column defined with the built-in Link type.
Determining the types (the part after the comma for each column) was a challenge too. It still is, whenever I do this again for other purposes.
This document,
http://msdn.microsoft.com/en-us/library/aa981241.aspx, is the most explanatory one I found, but still not satisfactorily clear.
For instance, it says that in addition to the URL type we now have another type called Link. I tried both but did not see any difference; they are the same in the XML. I think Microsoft should give us a table matching each Content Column type to the type to be used in CommonViewFields and the other similar properties of the CQWP…

Once the columns you added appear in the XML, the rest is just the usual XSLT coding that many already discuss on the net.
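For what it is worth, the language filtering ends up as a template along these lines (a sketch only; whether the columns arrive as @Language, @ImageUrl and so on depends on your CommonViewFields, so check the XML dump first):

```xml
<!-- Sketch: render only the rows marked Arabic. Attribute names are
     assumptions based on my CommonViewFields; the URL value may also
     arrive as "url, description" and need splitting. -->
<xsl:template match="Row[@Language = 'Arabic']">
  <li>
    <a href="{@URL}">
      <img src="{@ImageUrl}" alt="" />
      <xsl:value-of select="@Title" />
    </a>
  </li>
</xsl:template>
```

The per-language page layouts then only differ in which language value the template filters on.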

After having gone through this exercise, I now think that to deal with a custom list such as this one, a better, more beautiful way would be to have a completely custom set of columns appear in the XML, using yet another hidden attribute called ViewFieldsOverride. There is a nice article:
http://sharepoint-tweaking.blogspot.com/2008/04/displaying-listname-and-sitename-when.html.

Better still would be for Microsoft, or somebody (though I would rather not go to a third-party solution), to come up with a CQWP-like web part that generates XML according to the custom list and/or content type specified. It may be difficult though, since what could the default XSL then be…

Wednesday, April 1, 2009

Cache-Control: public and Session


Recently one of our developers reported that session variables were not being kept for him, and asked me to look into it.
After a couple of hours of investigation, I finally concluded that this is probably because he set Cache-Control to public in his page.

Below is my conclusion.
When Cache-Control is set to public on your page, it is understood that the page is the same for everybody at any time.
When a proxy server sees it, it caches the page, and the cookie in which your session ID is stored will not reach the user’s browser.
Consequently, when the user makes the next request to your server, no cookie is sent, and the server sees it as a new session.

That the cookie does not reach the client is true: I do not see the Set-Cookie header in the response. I would see it if Cache-Control were set to private, the default.
But I was surprised by two things.

One: I found nobody talking specifically about this on the net. I think this is a pitfall that is easy to fall into.
Many say that when a page requires authentication, it has to be private, without explaining why in detail, so you may not see the connection.

Two: the fact that the proxy strips the cookie when Cache-Control is set to public.
Hmm… maybe because it first caches the response and then sends it to you. Anyway, it was good to know.
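To make the reasoning concrete, here is a toy model of what I believe happens (pure Python, my own illustration; real proxies follow the HTTP caching rules rather than this simplification, and stripping the cookie even on the first response is my guess to match what I observed):

```python
class ToyProxy:
    """A shared cache sitting between browsers and the server.

    A response marked Cache-Control: public is treated as the same
    for everybody, so it is stored and replayed; since a shared copy
    must not carry one user's cookie, Set-Cookie never gets through.
    """

    def __init__(self, origin):
        self.origin = origin  # callable: url -> {"headers": ..., "body": ...}
        self.cache = {}

    def get(self, url):
        if url in self.cache:
            return self.cache[url]      # replayed for every later user
        resp = self.origin(url)
        if resp["headers"].get("Cache-Control") == "public":
            resp = {
                "headers": {k: v for k, v in resp["headers"].items()
                            if k != "Set-Cookie"},
                "body": resp["body"],
            }
            self.cache[url] = resp      # stored without the cookie
        return resp
```

In this model a private response passes through untouched, cookie and all, which matches what I saw once the developer changed the setting back.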

Monday, March 30, 2009

default.aspx with custom type and layout under Variations


Gaaa!! This was difficult, and I am still not sure this was the best way.

The objective (I guess this is one for everybody who uses the SharePoint Publishing portal with Variations enabled; for us, that is how we support multiple languages) was to have a custom content type, and a page layout associated with it, for default.aspx when a subsite is created, and to have the subsite propagated by Variations as it is.
That is: you create a subsite in the source label; its default.aspx is of your custom type and layout; and you see the same on all target labels.

First, having default.aspx of a custom type and layout is achieved by defining a custom Site Definition and specifying it upon creation of subsites.

If the objective were just this, you might not need a custom Site Definition.
(Although it is a bit of a backdoor way) I think we could use the Site Template technique as well. The GUI is provided, so it can be done with a series of clicks.
This was not an option for me, because of Variations.

Suppose you create a subsite in the source label, based on a Site Template whose default.aspx is of your custom type and layout.
The propagation of default.aspx fails, because the type does not match.
This is because when Variations creates the subsite in a target label (propagation), it does so based on the Site Template you selected when you created the source label: either Publishing Site with Workflow or Publishing Site. For both, the content type of default.aspx is set to Welcome Page.

How to have our custom Site Template selected there? The answer I found was to come up with a custom Site Definition.
What I did at the beginning was rather simple: I created my definition with BLANKINTERNET as the basis, and just changed the default.aspx provisioning part so that my type and layout are used. This is what I have in my onet.xml.

<Configuration ID="2">
  ...
  <Modules>
    <Module Name="SubWebWelcome" />
  </Modules>
</Configuration>
<Modules>
  <Module Name="SubWebWelcome" Url="$Resources:cmscore,List_Pages_UrlName;" Path="">
    <File Url="default.aspx" Type="GhostableInLibrary" Level="Draft" >
      <Property Name="Title" Value="$Resources:cmscore,IPPT_HomeWelcomePage_Title;" />
      <Property Name="PublishingPageLayout" Value="~SiteCollection/_catalogs/masterpage/mylayout.aspx, ~SiteCollection/_catalogs/masterpage/mylayout.aspx" />
      <Property Name="ContentType" Value="mytype" />
    </File>
  </Module>
</Modules>

OK, fine. It looked nice at first. But then I realized that the layout is OK, but not the type: the type is Page, not “mytype”.

Actually, up to this point, there seem to be many doing something similar; I found blog posts discussing it.
But I found none specifically talking about the type. They all seemed happy having their custom layout on default.aspx, and do not say what happened with the type.
It could be my mistake, an oversight somewhere, but I could not figure it out…

After some more time on the net, I found people adding custom types to the Pages document library using a Feature invoked when a site is created. I gave it a try.
# I later found that the same can be achieved using a Feature XML element (
http://msdn.microsoft.com/en-us/library/aa543152.aspx), in which case you do not have to do any coding.
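For reference, the declarative version is a Feature element file with a ContentTypeBinding, roughly like this (the ContentTypeId is a placeholder; the real one comes from your own content type definition):

```xml
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <!-- Attach the custom type to the Pages library of the site the
       Feature is activated on. ContentTypeId here is a placeholder. -->
  <ContentTypeBinding
    ContentTypeId="0x010100...your-content-type-id..."
    ListUrl="Pages" />
</Elements>
```

Scoped to Web and stapled to the site definition, this runs on every subsite creation without any code.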

Now mytype is added, after the three default types. But still, default.aspx is of type Page, not of my type.

I thought this might be because the default type of the Pages library is Page, although that does not make sense: when we create a page through the UI, it becomes the type and layout we specify, not the default type. Actually, when we do it through the UI, we do not even have to add the type first.

I looked for a way to change this default (the type selected by default in a document library), but found no such method on SPList. It appears that a type is the default simply because it comes first in the list.

So I decided to see what happens if I delete the three built-in types after having added mine.
Bingo!! Now default.aspx is created with my type as well as my layout. And since the same definition is used when the site is created in a target label by Variations, there too default.aspx is of my type and layout, so the contents are properly propagated.


Monday, March 2, 2009

Zip size limit, 32-bit and 64-bit


Still FTP migration.

Sometimes one folder is really big: many files, each rather big. I had problems when unarchiving.
I will not copy it here, but from the error message it seems the unzip cannot find the end of the file.

Found this page.
http://www.info-zip.org/FAQ.html#limits
It says the actual size limit for an archive file is 2G in a 32-bit and 4G in a 64-bit environment.
That explains it.

Actually, in my case, the target is 64-bit Windows 2008, and we are migrating files from a Tru64 UNIX box, so 4G should be big enough.
However, the Info-ZIP unzip.exe that I mentioned getting in the last post is compiled against 32-bit DLLs, and it reads here (http://www.info-zip.org/board/board.pl?m-1235603338/) that there is no 64-bit Windows version available as of Feb 27, 2009.
It also says that I should be able to compile it myself and report any problems. I might try that some other time when I am bored, but not now.

So I used the -i and -x options of the zip command to include some file types in one archive and exclude them from another, keeping each archive under 2G.
The mission seemed completed, but in the course of doing this I lost the timestamps of some of the files…
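The same split can be sketched with Python's zipfile module, dividing by extension, which is essentially what -i/-x amount to (my own illustration, not what I actually ran on the Tru64 box):

```python
import os
import zipfile

def split_by_extension(folder: str, exts: set, zip_a: str, zip_b: str) -> None:
    """Walk `folder`: files whose extension is in `exts` go to zip_a,
    everything else to zip_b. allowZip64 lets an archive grow past the
    32-bit limit if it must, but the point here is to keep them apart."""
    with zipfile.ZipFile(zip_a, "w", allowZip64=True) as a, \
         zipfile.ZipFile(zip_b, "w", allowZip64=True) as b:
        for root, _dirs, files in os.walk(folder):
            for name in files:
                path = os.path.join(root, name)
                arcname = os.path.relpath(path, folder)
                target = a if os.path.splitext(name)[1].lower() in exts else b
                target.write(path, arcname)
```

Unlike my zip -i/-x runs, a scripted split also makes it easy to verify afterwards that nothing landed in both archives or in neither.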

Tuesday, February 24, 2009

Copy data from UNIX to Windows, keeping structure and timestamp


We are migrating our FTP server from UNIX, Tru64 to a Windows box, Server 2008.
We want to keep the structure and timestamp.

First I tried tar.gz, thinking that Server 2008 would be smart enough to handle it without me doing any tricks.
No. It recognizes neither .gz nor .tar. BTW, I did not give the SUA, the Subsystem for UNIX-based Applications, a try.
Since it is already a production server, I wanted to do this as simply as possible, without installing anything.

I tried GNU tar, which you do not really have to “install”; you just need the executable.
Funnily, the timestamp is not preserved: the date is OK but the time changes. (We can easily guess that, in that case, the date is not really OK either.)
Probably some difference in the way datetime info is stored. I did not dig into it further.

Then I went for ZIP. I should have tried it first; both platforms claim to support it natively. It was just that I do not use it often.

On UNIX, I zipped the folder structure, just normally without specifying any special option,

% zip -r (resulting zipfile) (folder to archive)
# This is just for me to remember the syntax. The man entry was not so easy to understand...

copied it over to Windows, and simply did the Extract All from the Right-Click menu.
This time, the timestamps of the files are OK, but not those of the folders. It appears that folders are created as it unarchives.
Looking into the zip file without extracting it shows that the dates are there. So it is just a problem with unarchiving, not with the way the zip file was created.

As far as my research goes, Windows 2008 does not seem to have a command-line interface for unzip.
I found one at
http://www.info-zip.org/Info-ZIP.html. Seems a decent project. I downloaded it, and finally my folder timestamps are there!!
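Had I not found it, a fallback would have been to fix the folder dates myself after Extract All, reading them back from the archive with Python's zipfile (a sketch; note that zip timestamps have two-second DOS precision):

```python
import os
import time
import zipfile

def restore_dir_times(zip_path: str, extracted_root: str) -> None:
    """After extraction, copy the directory timestamps stored in the
    zip back onto the extracted folders (files assumed already OK)."""
    with zipfile.ZipFile(zip_path) as zf:
        for info in zf.infolist():
            if not info.filename.endswith("/"):
                continue  # only directory entries carry folder dates
            target = os.path.join(extracted_root, info.filename.rstrip("/"))
            if os.path.isdir(target):
                # date_time is a (Y, M, D, h, m, s) tuple in local time
                mtime = time.mktime(info.date_time + (0, 0, -1))
                os.utime(target, (mtime, mtime))
```

This only works if the archive actually contains directory entries, which `zip -r` does create.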


Tuesday, February 17, 2009

A small good-to-know with FireBug


Like me, I think you like it, Firebug. It is really nice and helpful, especially when you work with CSS.
But have you ever been frustrated when working on the styles for hyperlinks, i.e. a:link, a:hover, etc.?
You want to know which of the classes takes effect as the element appears in the browser, but Firebug tells you that only when you click the element.
And in the case of a hyperlink, when you do so, what takes effect is the “hover” style, not the one in effect when the page is loaded.

I was very frustrated and helpless, but found out, just by coincidence, that once you have selected the element, if you refresh the page, it shows you the CSS class that takes effect at page load time.