Research Article: Windows Phone 7 Application Category Interests of Global ASP.NET Developers

Silverlight is becoming an important foundational technology for Microsoft, and the company is betting on Silverlight to power the mobile applications of tomorrow on its upcoming Windows Phone 7 platform. Silverlight developers will be the core group driving innovation for Microsoft’s new mobile platform. Therefore, it is important for Microsoft and the community to have more insight into and knowledge of the Silverlight developer.

In our previous research paper, Profile of Silverlight Adopters and Implications for Windows Phone 7 Applications Development Strategies, we focused on our USA-based customers and reported on their attitudes toward and perceptions of Microsoft’s mobile strategy.

Just recently, new research results have started to be published about mobile developers’ attitudes and perceptions of different mobile platforms and of the mobile application development business. There are interesting findings in studies from Appcelerator and VisionMobile, for instance. We believe that this type of research will accelerate as the mobile marketplace continues to heat up.

To add to this ongoing conversation, in our new research article we take a more global view and report our observations on our customers’ mobile application development experience, their interest level in developing Windows Phone 7 applications, and their Windows Phone 7 application category interests. We found that the level of interest in the types of mobile applications our customers want to build varies depending on their global region.

Here is a link to the research article:

Windows Phone 7 Application Category Interests of Global ASP.NET Developers

Let us know what you think.

Takeshi Eto
VP Marketing and Business Development

Having trouble keeping up with Microsoft code names?

There is always so much happening at Microsoft that it is difficult to keep up, and with all the code names in use, things can get very confusing.

So, for those who want to try to keep up, Mary-Jo Foley, ZDNet’s Microsoft watcher, puts together a Microsoft CodeTracker Whitepaper that you can download. The June 2010 edition is 22 pages long!

Takeshi Eto
VP Marketing and Business Development

Happy 10th Birthday .NET!

Wow, time just flies by.

It was 10 years ago, on June 22, 2000, that .NET was unveiled. I wasn’t keeping track personally, but I did come upon this article on NetworkWorld.com: 10 years ago today, Microsoft unveiled the .NET Framework.

.NET was first released Feb. 13, 2002 and is still going strong.

We bet on .NET when we launched DiscountASP.NET in 2003. It’s been an exciting ride and we look forward to the next 10 years.

Takeshi Eto
VP Marketing and Business Development

DiscountASP.NET Voted Best ASP.NET Hosting Service 2010 in The Code Project’s Members Choice Poll for Second Consecutive Year

I am extremely happy to announce that DiscountASP.NET was voted the Best ASP.NET Hosting Service for 2010 in The Code Project’s second annual Members Choice Poll. This is our second year in a row!

THANK YOU to The Code Project community for all your support.

Takeshi Eto
VP Marketing and Business Development

Checklist of 29 Worst SEO Practices

Earlier today I was browsing around SearchEngineLand and found this helpful little article about the 29 Worst Practices & Most Common Failures: SEO Checklist.

Basically, this article is a simple checklist of 29 of the worst SEO practices with a brief explanation of why each practice is bad. Some of the practices are pretty basic for well-versed search engine optimizers (although a few may shock even the more expert SEO marketers), but for those who are new to the SEO game, this list is pretty helpful. I’m not exactly sure how many of our customers are SEO experts and how many are SEO newbies, but regardless of your expertise, I would recommend printing out the checklist and seeing where you and your site stand.

Mark Medina
Director of Marketing

Research Paper: Profile of Silverlight Adopters and Implications for Windows Phone 7 Applications Development Strategies

We are interested in understanding more about Silverlight adoption because Silverlight is becoming an increasingly important technology for Microsoft’s business strategy. Our interest in Silverlight is further fueled by the announcement that Silverlight will be a key development platform for Microsoft’s next-generation mobile phone OS, Windows Phone 7.

In our research, we found data only on the share of users who can consume the Silverlight experience, but we didn’t find information on Silverlight adoption within the sites that deliver the Silverlight experience. So we decided to do our own research on Silverlight adoption, and we previously published a Research Paper on the State of Silverlight based on a survey targeting customers who had hosted their web sites at DiscountASP.NET for two or more years.

In April 2010, we conducted a similar survey of DiscountASP.NET customers who had hosted their web sites with us for less than two years, and we used that data for a second research paper. In this study, we focus on USA-based customers and look at Silverlight adoption trends, explore the characteristic differences between Silverlight adopters and our general customer base, and get a measure of customer attitudes toward Microsoft’s mobile strategy.

Here is a link to the research paper:

Profile of Silverlight Adopters and Implications for Windows Phone 7 Applications Development Strategies

Let us know what you think.

Takeshi Eto
VP Marketing and Business Development

Google’s Fully Caffeinated

Google has been working on a huge update to their search indexing infrastructure called “Caffeine.” Over the past year they have been rolling out the Caffeine update incrementally to test and tweak it, but in a blog post on June 8 earlier this week, they claimed to have completed the Caffeine update.

The thing is that Google won’t really reveal everything that has been updated with Caffeine. They claim that all the important search engine optimization factors that search marketers have been pointing out over the past years have not changed or become less important. Google says that Caffeine simply makes more crawled content available in their index much faster. So search users should be getting more relevant results with fresher content.

That’s great for search users, but you may be asking how these changes will affect your natural search engine ranking. Over here, we’ve been monitoring the Caffeine rollout, and I’ve seen DiscountASP.NET’s natural search listing position on Google fluctuate a lot over the last year.

It’s always been a battle to get a site to rank high, but with Caffeine the battle has become even more difficult. Google on Caffeine means that other sites’ new content will be indexed faster and real-time web content will be available faster. So I would expect that you will continue to see more fluctuations in your search ranking.

But while the activities of others will have an effect on your search ranking, this also means that your own activities can affect others’ rankings too. So it is even more important now to continually contribute new, relevant web content to your site and to participate in real-time web 2.0 activities. Not only does this make your business more relevant, it will help your search engine ranking.

Google named their update appropriately; we are all going to need some caffeine to work on improving search rankings.

Takeshi Eto
VP Marketing and Business Development

SQL Trace for Query Optimization

DiscountASP recently introduced a SQL profiling service. The profiling service produces a trace file, and the results of the trace are especially useful in determining the source of real-world problems that you are unable to replicate in your development environment. The goal is query optimization, and this article will give you some guidelines to help you get the most out of your SQL database.
The query optimization process can be broken into two main steps:

  • Isolating long-running queries.
  • Identifying the cause of long-running queries.

The first step in the query optimization process is to locate the queries with the longest run times, and the best tool for that is SQL Trace. Next, you need to analyze the long-running queries to determine where exactly they are spending their time and whether they can be improved.

Replicating the Problem

In order to successfully locate the troublesome queries, the problem should be replicated while SQL Trace is running. This can be the most difficult part of the process because the issue can be random and intermittent, and the trace can only be run for a very short period of time. Try to determine when your site is most likely to experience the problem and use that time in your trace request.

Requesting the Trace

When requesting the trace, the following information should be provided:

  • The amount of time to capture, ranging from 2 to 15 minutes in 1-minute increments.
  • The exact date and time that the capture should start running.
  • The exact database name that the trace will be running against.

Getting the Trace File

When the trace session is complete, the trace file will be placed in your site and you will be notified by email that your request is complete. At this point you can do a number of things with the trace file, including, but not limited to:

  • Opening and analyzing trace results in Profiler. Microsoft SQL Server Profiler is a graphical user interface to SQL Trace. Although you can view and analyze the trace results in Profiler, it is not well suited to this task because of the large amount of information presented and because it lacks the ability to sort columns.
  • Importing trace results to a table. This is the most powerful way to analyze the trace results since you can run custom queries against this table and thus have the most flexibility. A trace table can be used by Tuning Advisor and Profiler in the same way the trace file is used.
  • Using other third-party tools to analyze trace and tune your database.

Importing Trace into a Table

Start Management Studio, open a new query window and use the following T-SQL code to import your trace file to a table:

 USE [MyDatabase]
 GO
 SELECT * INTO MyTraceTable
 FROM ::fn_trace_gettable('c:\MyTraceFile.trc', default)

Replace [MyDatabase] with the name of the database where you are creating MyTraceTable, and replace 'c:\MyTraceFile.trc' with the actual name and path of your trace file.
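
To confirm that the import worked, you can run a quick sanity check against the new table (this assumes the MyTraceTable name used above):

 -- Count the imported trace events; a non-zero result means the import succeeded
 SELECT COUNT(*) AS EventCount
 FROM dbo.MyTraceTable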

Locating the Long-running Queries

You are now free to run any custom query against your trace table, or you can use the sample ad hoc query below. This query locates queries that run for more than 1 millisecond and sorts them by duration, with the highest duration on top. In this query I selected only three columns: the trace event name (looked up from EventClass), TextData, and Duration. The TextData column contains the complete T-SQL code of the captured queries, which can be run directly against your production database.

 SELECT
     TE.name           AS [Trace Event],
     T.TextData        AS [T-SQL Code],   -- actual T-SQL code of the captured query
     (T.Duration/1000) AS [Duration (ms)] -- duration converted from microseconds to milliseconds
 FROM
     dbo.MyTraceTable T
 JOIN
     sys.trace_events TE
 ON
     T.EventClass = TE.trace_event_id
 WHERE T.Duration > 1000 -- filter out queries that run for less than 1 ms
 ORDER BY T.Duration DESC

Beginning with SQL Server 2005, the duration of an event is reported in microseconds. That’s why I divided it by 1000 to convert it to milliseconds.

Examining Long-running Queries

You can now see the troublesome queries in the query results window after you run the query above against your trace table. Select the T-SQL code you want to examine by right-clicking the T-SQL Code cell, then copy and paste it into a new query window. You can now format the code in the query window any way you want and examine it, noting which tables and columns are being used.

Replaying the Query

You can run any query from the trace results against your database again. The good news is that you can replay those queries directly against your production database, whereas you cannot replay the trace using Profiler, and you cannot run the Tuning Advisor against your production database because that requires sysadmin permissions, which cannot be granted on a production server. At this point you may be able to fix the timing issue by indexing the tables and columns used in the troublesome query. You can also examine the query execution plan to locate the exact problem if you have that more advanced knowledge.
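
As a rough sketch of what that last step might look like, suppose the trace shows a slow query filtering a table on a single column. The table, column, and index names below are hypothetical placeholders, not objects created by the profiling service:

 -- Hypothetical example: dbo.Orders, CustomerID, and IX_Orders_CustomerID are placeholders
 -- Create a nonclustered index on the column the slow query filters on
 CREATE NONCLUSTERED INDEX IX_Orders_CustomerID
     ON dbo.Orders (CustomerID)
 GO
 -- Re-run the captured query with timing and I/O statistics on,
 -- then compare the numbers before and after adding the index
 SET STATISTICS TIME ON
 SET STATISTICS IO ON
 SELECT OrderID, OrderDate
 FROM dbo.Orders
 WHERE CustomerID = 42
 SET STATISTICS TIME OFF
 SET STATISTICS IO OFF

You can also enable “Include Actual Execution Plan” in Management Studio before running the query to see which operators account for most of the cost.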

Dmitri Gropen
Technical Support