Google Launches Personalized Search … For EVERYONE!

Last Friday afternoon, Google made a rather large change to its search engine results that went pretty much unnoticed (I didn't find out about it until yesterday). Google is now personalizing everyone's search results, whether you have opted in or not. For more details on how search personalization works, and how you can technically opt out of it (because by default you are now opted in), please see this article: Google Now Personalizes Everyone's Search Results.

Google personalized search is not new. In fact, Google has been showing personalized results since 2007, but only to people logged into their Google accounts. For example, if you were logged into Gmail and ran a search before logging out, you would see personalized results. Google, along with other search engines, has been tracking users' search history (which Google calls "Web History") for years. The only difference is that NOW Google is using that data to refine each individual's search results, whether the user has opted in or not.

To be quite honest, as both a search marketer and a regular Internet user, I am not a fan of having my search results personalized based on what results I have clicked on in the past.

As a search marketer, I am not a fan of this because it is going to make it unbelievably difficult to gauge the effectiveness of any Search Engine Optimization (SEO) effort. In addition, now that search is personalized, I think the reach of my SEO efforts is limited. Why is it limited? Because everyone's search query results will be different. Yes, the same themes will hold true with SEO – relevant content with targeted keywords, relevant backlinks, etc. will remain critical factors – but the reach will be limited because the branding element of SEO is now severed. Not every user is going to click on your search engine listing, which is fine. However, there is a branding element in SEO that people often neglect. Even though users may not click on your search engine listing, they still SEE your listing, and thus they see your URL and your company name. This is free branding that appears within Google search results, on one of the largest web properties on the Internet and the largest search engine. Now with personalized search, the reach of this free branding is limited because everyone is going to have different search results, and those results may not include your site.

As a regular Internet user, I don't like personalized search, not only because I'm opted into personalization by default, but also because I will now be shown search results that I am already familiar with, which in my opinion negates the point of search – to find new information. In other words, the diversity of web sites in my search results could be reduced. Why do I want to see stuff I already like? I want to discover new web sites and information that I never knew about. With the new personalized search, am I going to constantly see the same results that I clicked on in the past at the top? That would make using Google less useful and maybe a little bit more… boring. It would be nice to see some facts regarding what percentage of queries will be personalized.

Mark Medina
Director of Marketing

Part II: PDC 2009 – Thoughts and Observations on PDC

Can’t believe another PDC came and went. The year is whizzing by so fast. Here are some thoughts and observations from PDC 2009.

1. Overall, I thought that this year's PDC was more mellow than last year's. I didn't feel the same energy level. Perhaps it was partly due to the economy – fewer vendors, fewer attendees. Perhaps it was due to holding the PDC in back-to-back years. Remember, there were three years between PDC 2005 and PDC 2008, so there was a lot of built-up excitement for developers to get together and check out what was new. To keep the energy level high, it may be in Microsoft's interest to keep at least two years between PDC conferences.

2. During the first day keynote, Ray Ozzie, Microsoft's chief software architect, kept coming back to the vision of "Three Screens and a Cloud". He was referring to Microsoft's goal of providing seamless integration between the computer, mobile phones and devices, the television, and the cloud. However, this PDC 2009 conference was more about one screen – the computer – and the cloud. We will need to wait until next year to hear more of the story on the other two screens.

3. During the first day keynotes, there were two guests that surprised me. The first surprise was Matt Mullenweg, the creator of the open source blogging engine WordPress, who came out to announce that they are working with Microsoft to run WordPress on Windows Azure. Along with that, it was announced that Azure supports MySQL, Java, and PHP. Microsoft has been putting a lot of resources into partnering with the open source community. Now I'd like to see WordPress working with SQL Server.

4. The second surprise was Vivek Kundra, Chief Information Officer at the White House, who addressed the attendees via satellite. It's pretty cool that the Federal CIO took time out to speak to the PDC audience. He spoke on the partnership between government and industry and the future of exciting new applications driven by innovations like Cloud computing and the democratization of data. One example he mentioned was the Department of Defense opening up GPS data to enable innovations in GPS-enabled devices. He also talked about how NASA is making Mars Pathfinder data available for the Pathfinder information challenge, a citizen scientist contest.

5. Vivek's presentation helped reinforce the announcement of Microsoft Dallas, a new data set subscription service available on the Windows Azure platform. This is pretty cool. This service should help developers more easily access data sets for their applications that would otherwise be difficult or too expensive to get. It's in its early stages, so we'll see how this evolves.

6. Another surprise was the presence of the iPhone in two demos during the keynotes, especially after the hoopla over Bill Gates banning iPhones from his house earlier this year. First, during his video presentation, Vivek Kundra demoed an iPhone app for job searches, in part using data from data.gov. Then on Day 2, Scott Guthrie attempted to demo Silverlight streaming on the iPhone. I say "attempted" because the demo failed, and the several backup iPhones also failed. You've got to feel for any speaker, especially a keynote speaker, who has a demo blow up. Scott Guthrie handled it well, though. I took a look at the video posted at iis.net/iphone on my iPhone, and I thought it looked pretty cool.

7. Scott Guthrie devoted his keynote to Silverlight 4, which he announced was going to beta during his talk. He used some gratuitous demos to show off the different new features of Silverlight 4. Then Brian Goldfarb came out and demoed a Silverlight app that interfaced with Facebook. With the next version of Silverlight, I can see the possibility of some really cool business applications. It should be an exciting launch. Scott Guthrie also announced some recent wins for Silverlight, which will power huge upcoming online events like the Winter Olympics and the Victoria's Secret Fashion Show.

8. There were fewer Expo vendors compared to last year. You can tell the economy is affecting Microsoft partners. However, they used the same conference hall footprint, so booths were far apart. Even with people on the expo floor, things looked sparse.

9. I was able to attend the PDC Underground Party at the Conga Room. There was a long line to get in. Mr. Scott Guthrie even did a talk on Silverlight during the party. Does this guy ever rest? And our friends at Neudesic sponsored a lounge area where they were hand rolling cigars.


10. Microsoft also had one of their server containers on the Expo floor, which you could walk through. This is the kind of container that will be used to house Windows Azure. This particular unit was one that will sit outside. In the cloud data center, you will see stacks of these in many rows. The servers in this demo container looked to be Dell rackmount servers; however, in a real production environment, I'm not sure what servers they will use. A while ago I saw a picture of a Google server (sorry, I wasn't able to find the link) that they use for their cloud, and it was a really stripped-down thing. I asked the Microsoft staff about it, but they didn't say what the production servers would be.


11. Ray Ozzie announced that Windows Azure will go live into production on January 1, 2010. For a period of one month, it will still be free as Microsoft tests out their billing system. I've seen many different types of billing systems, and even for something seemingly simple like, say, billing for web hosting, billing can get very complicated. Now, with something like the pay-as-you-go model of the Cloud and different metered resources triggering microcharges, this billing system is going to be really complex. I don't know if a month is going to be enough time for testing. In any case, on February 1, 2010, Microsoft plans to start billing for Windows Azure.

12. I met with many people, and some of them asked me how Windows Azure was going to affect us. I've discussed this topic before, and having talked with Azure staff and seen the presentations, I still believe that shared hosting is not threatened. The user will have another choice for hosting, and depending on their needs they will choose what is right for them. The Cloud is not the right choice for every hosting need. I spoke to many attendees as well, and they have a lot of concerns about the Cloud, so I think adoption is not going to be as fast as reported.

13. Ray Ozzie also talked about the two Cloud data centers in the US that are live now. Two more are being built in Europe and two more in Asia. These international data centers are expected to go live sometime in the next year or two.

14. At the MSDev booths, they were giving away T-shirts and had you fill in the blank on a "_______ was my idea" shirt, going along with the "Windows 7 was my idea" campaign. They took pictures of people with their shirts. Some other shirts proclaimed "The Internet was my idea" and "Web 2.0 was my idea", so here is the one I did right quick:


15. And oh, did I mention they gave out laptops?

Takeshi Eto
VP Marketing and Business Development

It’s not easy being green

You may have seen many hosts touting their “green” status lately. Hey, green is good, right? But what does it really mean?

At home you may do things to reduce your energy use, recycle waste, reuse, reduce – all the steps an individual can take to lessen their negative impact on the environment. But just about every web hosting business consists of two things: an office and a data center. A company can certainly take steps to reduce consumption within an office: turning off unused lights and monitors, using less air conditioning, encouraging car pooling, etc. But the bulk of a host's consumption takes place at the data center, and that's where things get a little fuzzy.

Servers require a large amount of electrical power and a cool atmosphere. If the temperature in a data center starts to nudge its way up into the high 80s or low 90s, servers begin to fail. Cooling entire floors of large buildings that contain tens of thousands of servers is an expensive proposition, both in cost and in environmental impact. But it is an unavoidable cost, since the servers can only function within a narrow temperature range. Additionally, all computers contain small amounts of toxic materials necessary to their construction. That's why your city may have a specific site where they ask you to bring your old electronics, rather than dumping them into a landfill. The harsh reality is that a host cannot do much to reduce its environmental "footprint" where its data centers are concerned.

So if a host wants to declare itself green, what can they do? Well, typically they buy “carbon credits.” When you see a host sporting a “green” logo, that is very likely the path they have taken. The problem with these credits is they allow a business to make claims of being green while otherwise not changing a thing that they are doing. They still have to power and cool their servers, so there is always that inevitable level of energy use that cannot be sidestepped.

Buying carbon credits seems disingenuous. If there is no effort to reduce your actual use of energy or toxic materials, what have you really done by buying a “credit”? All you have done is pay for a logo to place on your web site. You are assuaging your guilt, or the wishes of your customers, by writing a check. What does that really accomplish?

Everyone in our business is being forced to look for ways to consume less power, simply because power in most data centers is now at a premium. You can get all the space you want, and all the bandwidth you want, but electricity is harder and harder to come by in the vast majority of large facilities. But if you choose to rule out carbon credits, your options become much more limited. We do not tout ourselves as a "green" host because – at least at this point in time – we do not believe such a thing is really possible. As the technology improves and becomes more widespread, and an honest "green" label comes within our reach, it is something that we will certainly pursue.

Having said that…

There are a few options. One is using a “green” data center, meaning a facility that gets its power from renewable sources. As you might imagine, they are few and far between, they are rarely really “green,” and there are none at all anywhere near our base of operations. The second option is the aforementioned “carbon credits.”

Where we can make a difference by our choices, we do so. Aside from utilizing newer, less power-hungry CPUs in our new servers, we have also introduced a sophisticated power monitoring system in our Irvine, CA data center. This allows us to maximize the load on our existing power supply in order to avoid bringing in additional power. Without a monitoring system it is more or less a guessing game – “How many boxes can we load onto this circuit?” and obviously in most cases you will err on the side of caution (to avoid overloading a circuit and taking down a dozen servers). Erring on the side of caution inevitably leaves you with an underutilized supply of power.

But with our new systems we can monitor power usage in real time and maximize the use of our power allotment. This has reduced our power “waste” (power provisioned but unused) by 20 – 25%. You can imagine the overall savings that could be realized if all large data centers provided such a monitoring service for their customers, but very few do. And those that do typically charge a premium that forces most smaller companies to opt out of the monitoring. Our Irvine data center – a very large place – offers no such detailed power consumption analysis service, which is why we had to bring in the hardware ourselves.

But probably the major step we have taken is with our newly opened data center in the U.K. The architecture there is completely different from the U.S. network. We utilize central file servers and "virtual machines" for the web servers. What this means is that a handful of centralized units can do the work that used to require dozens of discrete servers. This configuration has the added benefit of actually being more efficient at serving files as well, so we save a tremendous amount of power and provide a speedier service. A rare win/win situation, but so far it is performing quite well. If we continue to see positive results in the U.K., we will consider moving to a similar architecture in our U.S. data center, where the bulk of our accounts reside.

Now, when it comes right down to it these were business decisions. Saving energy is good for our bottom line. But whatever the intention, the end result is the same. So if it takes a cost analysis to convince some people that reducing energy consumption is a good thing, we’ll take that as a small victory. Most people who are driving smaller, more fuel efficient cars these days are not doing so out of noble intent. They are trying to save money. But we still reap the environmental benefits, so it’s a good trend overall.

Until truly green data centers start cropping up in Los Angeles we will continue to work on our windmill up on the roof of the building. It’s not much to look at now, but give us some time.

Those misleading custom errors

With .NET you can configure your own custom error pages, which means that rather than letting the server generate its default error page, you can display your own page depending on the HTTP status code that produced the error. The problem is that the custom error page can itself generate its own error.

Here is an example: you have an ASP.NET application and it was working fine, but suddenly you get an "HTTP Error 404 - File or Directory not found" error. You check all your files and folders and nothing has changed. All the files and folders appear to be there. A few minutes later the application is back up. No big deal, right? You go on with your daily routine. Sometime later it happens again; the same error: "HTTP Error 404 - File or Directory not found."

The first thing you should look at is the URL in the browser. If you see "aspxerrorpath=" in the address, then your application is trying to generate a custom error page. The "404 file or directory not found" message means that the path set to display the custom error page is either incorrect or that the custom error page itself is not where the path says it should be. The application is generating an error, yes, but the error you are seeing is being generated by your custom error configuration.
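For reference, here is a minimal sketch of the sort of customErrors section in web.config that produces this behavior (the page names below are hypothetical examples, not a prescribed layout):

<configuration>
  <system.web>
    <!-- ASP.NET redirects failed requests to these pages and appends
         ?aspxerrorpath=/the/original/url to the redirect -->
    <customErrors mode="On" defaultRedirect="~/Errors/GeneralError.aspx">
      <error statusCode="404" redirect="~/Errors/NotFound.aspx" />
    </customErrors>
  </system.web>
</configuration>

If ~/Errors/NotFound.aspx does not actually exist at that path, the redirect itself fails with a 404, which is the misleading error described above.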

You have two basic options to stop the "HTTP Error 404 - File or Directory not found" error. First, make sure that the custom error page exists where the custom error settings say it does. Once you fix any problems with that path you will no longer get the "HTTP Error 404 - File or Directory not found" error, but rather the custom error page you designed. The problem with this is that your application can still generate an error and you will not know what it is, because the custom error page is hiding it.

The second – and in my opinion the better – option is to simply disable custom error handling. That way, when your application generates an error, you can see it and take the necessary steps to correct it.
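A minimal sketch of what that looks like in web.config (only the relevant section is shown):

<configuration>
  <system.web>
    <!-- Show the real error details instead of redirecting to a custom error page -->
    <customErrors mode="Off" />
  </system.web>
</configuration>

Once you have found and fixed the underlying problem, it is a good idea to switch back to "On" or "RemoteOnly", since detailed error pages can expose information about your application to visitors.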

We all know errors are bad, but if you do not know what the problem is, it is next to impossible to fix it.

Raymond Penalosa
Technical Support

PDC 2009 Part I: The Big Laptop Reveal

The Microsoft PDC 2009 conference was in Los Angeles again, which was great for us since we are local. In the interest of saving money, we decided to get just one full conference pass this year. Rather than sending a bunch of staff to the conference at the same time, we picked up one-day expo passes when needed.

On the first day, I picked up my badge and conference bag. This bag was really lame compared to the cool Tech-Ed bag earlier this year. You be the judge.

The conference schedule was also not printed in a book; it was on a Xeroxed piece of paper. And on top of that, there was no scheduled end-of-conference blowout party like last year's PDC 2008 party at Universal Studios. So the whole first day, I, along with many others I spoke to, was thinking that Microsoft was really tightening its belt for this one.

So we were all taken aback during the second day keynote when Steven Sinofsky, President of the Windows and Windows Live Division, announced that Microsoft was making a special Acer laptop SKU available to all PDC attendees… for FREE! All the developers in the room were stunned.


This was a brilliant move by Microsoft. Instead of just giving out disks of Windows 7, they got Windows 7 on a working laptop into developers' hands – and not an ordinary laptop, but a convertible Tablet PC with a multitouch screen and a lot of other bells and whistles. Basically, Microsoft gave developers a box to play with and develop against to take advantage of the new features in the Windows 7 platform. Here is a link to some more information on the specs of the PDC 2009 laptop.

Now I kinda wish we had got our hands on additional conference passes…

Takeshi Eto
VP Marketing and Business Development

*Free* ASP.NET 4.0 Hosting (beta 2) Sandbox now available!

You have been asking for it, and it is finally available: a .NET 4.0 beta hosting sandbox!

Signup is easy: just go to the DiscountASP.NET labs page and follow the link.

The number of accounts on the .NET 4.0 beta server is strictly limited, so if you are interested you should sign up today. This is a public beta, so you do not have to be a current DiscountASP.NET customer to take advantage of the offer. We are including MS SQL 2008 space in the sandbox so you have the opportunity to thoroughly test drive the latest .NET features.

In their press release for the new Visual Studio and .NET Framework, “Microsoft described the next release through the following five focus areas: riding the next-generation platform wave, inspiring developer delight, powering breakthrough departmental applications, enabling emerging trends such as cloud computing, and democratizing application life-cycle management (ALM).”

What? Boy, I don’t know, but “inspiring developer delight” sounds good to me. If you are interested in democratizing emerging breakthroughs and trends and riding waves you should probably get on board as well.

Sandbox? Waves? See you at the beach! Or better yet, at http://labs.discountasp.net.

How to Deploy your Development Web Site and Database to your Live DiscountASP.NET Server using the Web Deployment Tool in IIS Manager

We launched support for Microsoft's newly released Web Deployment Tool 1.0, also known as MSDeploy, on our Windows 2008 hosting platform. The tool allows developers to package their web application configurations and content, including databases, and to use the packages for both deployment and archiving. This blog post contains guidelines for deploying your web application to our Windows 2008 servers using IIS Manager.

The first step is to install the Web Deployment Tool on your computer.  You will need to select the 32-bit or 64-bit version depending on your computer. Select custom installation.  Along with the Web Deployment Framework, only the IIS Manager UI Module is required on the client.

The 32-bit link and 64-bit link can be found on the iis.net web site.

After you have installed the Web Deployment Tool on your computer, log into your DiscountASP.NET Control Panel, navigate to the IIS Tools | IIS Manager tab, and enable access to the Web Deployment Tool.

Now you are ready to package your application and database for deployment.

1. EXPORT your local web site or application to a package

a. Open IIS Manager.

b. Select your local web site or application.

c. Click "Export Application" from the Deploy section of the Actions pane on the right. This will bring up the Export wizard.

d. In the "Select the Contents of the Package" window, click "Manage Components". You will see that the "iisApp" provider is already added. Add the "dbFullSql" provider, and enter the connection string to your local SQL database as the path. For example:

Data Source=.\SQLEXPRESS;Initial Catalog=MyDatabase;Integrated Security=True

Click OK, then Next.

e. In the "Select Parameters" window, you will see the parameters. Click "Add Parameter Entry" for the dbFullSql parameter, which should be Parameter 2.

Type:  Xml File
Scope:  \\Web.config$
Match:  //connectionStrings/add/@connectionString

Click OK, then Next.

This allows you to change the connection string in the web.config when deploying to the live environment later (see the example connectionStrings section after step 1).

f. In the "Save Package" window, enter the path where you would like to save the package. Click Finish.
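For reference, the Match expression in step 1.e targets the connectionString attribute of an entry like the following in your web.config (the name "MyDatabase" here is just a hypothetical example):

<connectionStrings>
  <!-- The dbFullSql parameter rewrites this connectionString value at import time -->
  <add name="MyDatabase"
       connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=MyDatabase;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>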

2. IMPORT the package

a. Connect to your DiscountASP.NET site in IIS Manager.

b. Click "Import Application".

c. In the "Select the Package" window, select the package file you just created. Click Next.

d. In the "Select the Contents of the Package" window, click Next.

e. In the "Enter Application Package Information" window, enter the Application Path where you would like to deploy the application. If deploying to the root of your live web site, leave it blank. The Connection String should connect to your DiscountASP.NET SQL database. For example:

Data Source=DbServer;Initial Catalog=DbName;uid=DbUser;password=DbPassword

Click Next to start importing.
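If you prefer the command line over the IIS Manager wizard, the Web Deployment Tool also ships with msdeploy.exe, and a package sync along the lines of the sketch below should work. This is only a rough example: the package path, server address, site name, and credentials are placeholders you would replace with your own values, and you should verify the exact options against the msdeploy documentation for the version you installed.

msdeploy.exe -verb:sync ^
  -source:package="C:\packages\MySitePackage.zip" ^
  -dest:auto,computerName="https://<your-web-server>:8172/msdeploy.axd?site=<your-site-name>",userName="<your-username>",password="<your-password>",authType=Basic ^
  -allowUntrusted -whatif

The -whatif switch makes msdeploy report what it would change without actually deploying; remove it to perform the real import. The -allowUntrusted switch is only needed if the server uses a self-signed SSL certificate.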

Aristotle de la Cruz
Developer

Twitter – In Control and Has Power

Here’s an interesting article I found that says Twitter is in talks with both Google and Microsoft regarding access to Twitter’s real-time Tweets and Tweeps:

There is data in the real-time Tweets and Tweeps that both Google and Microsoft value and both companies could be willing to dish out large sums of cash for access to it.  So for those of you asking how Twitter can make money, well there you go.

The two things I found particularly interesting about this article (and the two things I liked most) were:

  1. Twitter isn’t just working with Google. Twitter is also choosing to work with Microsoft Bing, which I hope gets under the skin of Google (take that Google!).
  2. Because Twitter isn’t choosing to work with one or the other exclusively, but rather both companies, Twitter is maintaining an independent stance.

It is no secret that I dislike the search monster that Google has become so I love the fact that Twitter isn’t giving in and working with just Google.  I want Bing to be successful and if that means Twitter chooses to provide access to their data to both Microsoft and Google, then so be it; just as long as it is NOT only Google.

I also really like the fact that Twitter is remaining independent and choosing to work with both companies. There have been numerous rumblings about who will acquire Twitter. Will it be Google, Microsoft, Facebook, etc.? If Twitter were only in talks with Google, then there would be mass speculation that Google was going to acquire Twitter. Why can't Twitter remain its own company? Who's to say that Twitter can't remain a profitable stand-alone company that actually competes against Facebook, MySpace, and yes, even the almighty Google (many people consider that Twitter's "real-time search" competes against Google)?

I think Twitter is an absolute phenomenon that has already changed the Internet and will continue to do so. It would be fantastic if Twitter remains independent and continues to change the landscape of Web 2.0. I also like the fact that Google, Microsoft, and even Facebook (remember, they acquired FriendFeed) want something that Twitter has – that ultimately gives Twitter power.

Mark Medina
Director of Marketing