15. December 2008 09:44
by Troy
1 Comment

New Computer - Time for a major upgrade

This story actually starts over a year ago, when I decided I might like to upgrade my computer.  I used to be an avid gamer... that dried up a little, partly because my hardware was a little deficient for the latest titles.  I don't play Xbox or any of the other consoles... I just never got into them.  When I do play video games, it is on the computer.  These days, it is probably more like 85% power user and 15% gamer.  I don't need the latest and greatest hardware, but it is a nice indulgence once in a while.

As I said, I started researching new hardware over a year ago.  I got about halfway through picking components and got sidetracked by something else.  By the time I got back to it a couple of weeks ago, I had to start all over again.

For this upgrade I would be replacing almost the entire box and its contents.  First a look at what I am replacing:

  • AMD Athlon 64 3400+ CPU @ 2.2GHz (socket 754)
  • ABIT KV8 Pro v1.1 motherboard
  • 1GB RAM Kingston Value RAM PC3200
  • 200 GB Western Digital Caviar SATA WD2000JD Hard Drive (Configured with a 12GB System partition and a 188GB Data partition)
  • DVD-RW - NEC ND-3500A
  • ATI All-In-Wonder X800XT Video Card
  • Cambridge Soundworks 4 Point Surround speakers
  • Windows XP SP2

It is a bit funny to look back now.  I first got the CPU in November 2004.  Back then, 64-bit computing was going to be all the rage, and I thought my purchase would help future-proof my machine.  Well, here we are 4 years later, and the machine never did see a 64-bit operating system.  Only now is 64-bit computing becoming widely adopted and stable/supported enough to make the switch.

During this round of research, I was surprised to discover that SLI (Scalable Link Interface for the semi-techs; "hooking up multiple graphics cards so they work as one" for the non-techs) had made a comeback.  However, I was disappointed to find that NVIDIA held exclusive licensing rights, which meant the only chipsets that supported SLI were NVIDIA's own nForce chipsets.  Based on what I read, these chipsets are generally of lesser quality than their Intel-based competition.  So choosing a motherboard/chipset now required choosing which video card manufacturer would get your business (ATI or NVIDIA), because supporting SLI (or CrossFireX) required support from the motherboard.

This power play by NVIDIA has finally come to an end with the X58 chipset from Intel.  This is the latest chipset, supporting the socket 1366 Core i7 CPUs recently released by Intel.  These processors are quad core and represent the first round of some fundamental changes to CPU architecture compared with their predecessors.

Now normally, I am an analytical, common sense kind of guy.  I don't buy brand new cars... I buy one-year-old cars with 30,000 km on them.  The same goes for computer hardware: don't buy the latest, greatest model of anything, otherwise you pay through the nose now, and in 6 months the same item can be had for half the price.  I would much rather buy the 2nd generation product and save money.

<beginJustification>
This time, I couldn't get over the NVIDIA chipset thing.  I wanted to try NVIDIA video card(s) this time, because I had seen firsthand some of the low quality drivers/software created by ATI over the years.  To be fair, my research indicated that ATI has gotten better in the last year or two... but I still wanted to try walking on the green grass on the other side of the fence this time.  Choosing NVIDIA meant wanting to support SLI (just in case I ever move to 2 cards).  Supporting SLI meant being force-fed the nForce chipset, or biting the expensive bullet, jumping to the top of price mountain, and going for the X58 chipset that supports both SLI and CrossFireX.  Guess what I did.
</beginJustification>

Edit: Well, about 2 or 3 weeks after I bought my NVIDIA GTX280 based video card, they mothballed that model in favour of the GTX285.  I guess I won't be going SLI anytime soon.  I do still like the versatility of supporting both ATI and NVIDIA, though.

Now, for a look at what I upgraded to:

  • Intel Core i7 920
  • ASUS P6T Deluxe motherboard
  • 6GB RAM G.Skill PC10666 triple channel
  • 150GB Western Digital VelociRaptor WD1500HLFS (10,000 RPM for System Drive)
  • 1TB Western Digital Caviar Black (Data Drive)
  • DVD-RW - NEC ND-3500A
  • XFX GTX280 1024MB Video Card
  • Coolermaster Stacker 830 Evolution case
  • Logitech X-540 5.1 Speakers
  • Windows Vista Business x64

So far, things are running great, and I am happy with the upgrade.

9. November 2008 02:27
by Troy
0 Comments

PDC 2008 Scott Guthrie @ Open Space

After returning from PDC 2008, I've finally had a chance to process some video that I recorded of Scott Guthrie who participated in an Open Space meeting for about an hour.  Unfortunately, I only recorded about 15 minutes of the action before my memory card filled up, but I guess that is better than nothing.

I did my best to clean up the audio, which had a lot of ambient room noise.  I wish it could be better, but I think it is clear enough now to be understandable, if not somewhat enjoyable.

The video is hosted on YouTube, and due to time restrictions on uploaded video, the content is split into 2 parts.

In the first part, Scott talks about:

  • ASP.NET and MVC Framework - the future of both and them co-existing with each other
  • Data Access - LINQ and the Entity Framework

In the second part, he discusses:

  • Functional Programming - programming WHAT you want done, versus the more typical HOW to do something, citing LINQ as a simple example
  • Moore's Law - gradually being replaced by new rules where the number of machine cores will begin doubling
  • Parallelism - efforts to make it easier for developers to take advantage of multiple cores through explicit APIs and implicitly through improvements to the CLR
  • F# and .NET 4.0
  • Dynamic Languages
  • WPF and Silverlight

Part 1
http://www.youtube.com/watch?v=9-62sSDTIMY

Part 2
http://www.youtube.com/watch?v=lXm60VnyZwo

2. November 2008 03:42
by Troy
0 Comments

Azure manages to avoid a Hailstorm of criticism -- extended

I just got back from the PDC 2008 in Los Angeles where Microsoft unveiled their plans for Windows Azure.  Azure is their answer to computing in the cloud.

After having spent nearly an hour at the Azure desk, speaking with a Group Program Manager from Microsoft, I was approached by a CNET news reporter and asked if I would be willing to comment on Azure.  They were looking for developer reaction to the announcement.  Of course I said yes.

The questions (and subsequent article) seemed like they were looking to find a negative angle to explore, but I don't think they were able to find one.  In the end, I think the article came out reasonably fair.

Azure has some challenges ahead, as Ray Ozzie conceded.  There will be companies who will not adopt computing in the cloud, and trusting Microsoft is an issue for some.  However, I think that, like many other technologies, it is right for some and not for others.  Azure offers some definite benefits, and I, for one, welcome the choice, regardless of whether or not it is right for my projects.

Not all of my comments were included in the article, of course.  We chatted for probably 5 or 10 minutes, and only a soundbite quote made it into the article.  The quote is ok... it is neither overly positive nor negative.  The intent of the quote was meant to be positive.  Prior to the quote that was used, I indicated that I would be evaluating Azure with an open mind for our projects.  I pointed out that there are lots of third party hosting companies located in North America, and that it is common for companies to pay for hosting, thereby placing their intellectual property into the hands of others; that already requires a degree of trust.

The quote that made it into the article followed this, and basically said that I don't see much difference between paying for hosting with another company and paying for hosting on the Microsoft Windows Azure platform.  In fact, it appears that the Azure platform would offer additional benefits beyond simple hosting, including a number of open standards based framework services.

Overall, I was glad to be interviewed.  It was fun.  You can read the full article here:

http://news.cnet.com/8301-13860_3-10078496-56.html

View from my chair at the keynote where Ray Ozzie announced Windows Azure.

3. August 2008 01:55
by Troy
1 Comment

Maximum file path length - Windows and TFS

260 characters.  This seems like a lot, and it is, however...

I ran into this issue a while back on an existing project I worked on, and it was a royal pain.  When I tried to follow the naming conventions adopted for the folders and projects within a Visual Studio solution, the newly added filename + path exceeded this hardcoded limit within Windows (many core Windows APIs still have this hardcoded limit, and many of the more recent APIs, including the .NET Framework, still depend upon those core APIs).  The issue became apparent when an error was thrown while trying to check the file into Visual SourceSafe.

Now having moved on to a new project, and new technology (Team Foundation Server), I somehow thought that the issue would magically disappear.  Not so.

While 260 characters seems like a lot, it is quite possible to hit this limit when you:

  • use nice descriptive names for folders within projects instead of more cryptic abbreviations
  • root your TFS workspace in a subfolder that ultimately adds unnecessary characters to the total path (D:\work vs. D:\CompanyName\ProjectName\Source)
  • use a VS.NET database project, which has its own built-in folder structure (Schema Objects\Tables\Keys\) and file naming conventions; with descriptive table names, the file name of a foreign key constraint SQL file can be really long.  In a recent database that I reverse engineered, one such filename alone was just over 100 characters, before the path was even counted

There is no real fix for this that I have found, except using a shorter path.  Being aware of this limitation when setting up the naming conventions on a new project can save a lot of hassle later on, and could avoid having to rename existing files/paths or change your naming conventions partway through a project to accommodate this limitation.
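To get a feel for the budget, here is a quick sketch (every folder and file name below is hypothetical, invented for illustration) of how fast descriptive names eat into the 260-character limit before your actual source tree even starts:

```python
# Rough sketch: how quickly descriptive folder and file names consume the
# classic 260-character Windows MAX_PATH budget.  All names are made up.
MAX_PATH = 260

workspace_root = r"D:\CompanyName\ProjectName\Source"
project_folders = [
    "CompanyName.ProductName.DataAccess",   # descriptive project folder
    "Schema Objects", "Tables", "Keys",     # database project structure
]
# A foreign key constraint file named after two descriptive tables:
file_name = ("FK_CustomerOrderLineItemHistory_"
             "CustomerOrderLineItem.fkey.sql")

full_path = "\\".join([workspace_root, *project_folders, file_name])
print(len(full_path), "of", MAX_PATH, "characters used")
```

Well over half the budget is gone before any deeper solution folders, branch names, or build output paths are added on top.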

References:
http://www.shifd.net/post/2008/02/Maximum-file-path-length-in-TFS-Team-Build.aspx
http://blogs.msdn.com/aaronhallberg/archive/2007/06/20/team-build-and-260-character-paths.aspx
http://neovolve.com/archive/2006/11/09/So-you-still-can_2700_t-have-a-path-more-than-260-characters_3F003F003F00_.aspx
http://blogs.msdn.com/saraford/archive/2005/12/15/504240.aspx

Edit:  I've posted Part 2 on this topic here.

15. March 2008 05:15
by Troy
0 Comments

Windows Server 2003 Distorted Audio

After setting up a new workstation at work with Windows Server 2003 x64 I discovered an annoying problem that was hard to live with and wound up being harder to solve than I expected.

For work, I often watch a lot of MSDN videos and webcasts.  The problem I was having was distorted audio during playback of these webcasts.  Other MP3 or YouTube sources were problem free.  The distorted audio was distracting and very annoying, but you could still hear the audio, which for these webcasts was typically just the voice of the presenter.  Because the other audio sources were problem free, I initially thought that the audio for these webcasts was just poor quality, and I lived with it for a while... until I tested one at home and found that the audio was perfectly fine there.  That caused me to investigate further, and for some reason finding a solution was much more difficult than I think it should have been.

Maybe the patch had just been released at that point and had not been indexed by Google yet; I'm not sure.  I was very glad to find a fix though.

Anyway, the fix is available from Microsoft and cures a problem with audio that has been encoded using the Windows Media Audio Voice 9 codec, which was the case with many of the MSDN webcasts.

The fix can be found here:

http://support.microsoft.com/kb/940666

7. March 2008 09:44
by Troy
1 Comment

ASP.NET ViewState Explained

While doing some research for something I was working on, I came across this blog post about ASP.NET and the ViewState.  It is a must read both for those who are somewhat familiar with the ViewState and for those who believe they know it all.

http://weblogs.asp.net/infinitiesloop/archive/2006/08/03/Truly-Understanding-Viewstate.aspx

The following are some points that I took away from the article:

  • an ASP.NET control will automatically restore entered/selected values even if the ViewState is DISABLED!  (wait, what? I thought that is what the ViewState was for?)
  • the ViewState is typically only required for simple controls if the controls will be hidden between postbacks (otherwise, see the point above)
  • if you databind your control on each request in the OnInit event, you can disable the ViewState and the control will still automatically restore entered/selected values

2. March 2008 06:47
by Troy
0 Comments

Music Downloading - Good for Artists, Bad for Business

Let me start by saying that I am not an audiophile.  I do not download a lot of music.  I do not own an iPod.  Mostly, I listen to the radio in the car on my way to work, or stream music from the website of a local radio station when I am near the computer.  I am sure that this preference is due to my inherent inability to be a good DJ.

Despite this, I am a consumer, and I think this entitles me to an opinion.

I recently read an article in the Toronto Star by Ben Rayner with great interest.  The basic ideas I got from the article were these:

  • The majority of bands make their money from touring, not from CD sales.
  • The record companies get rich from CD sales, in return for promoting the talents of their signed artist.
  • The record companies are now including a cut of the proceeds of touring, management and even merchandising in their contracts with new artists, due to reduced CD sales.

In the years before the internet and modern home computers, artists needed the record companies to record and promote their music.  Without the record companies, an artist would starve back in those days, but the technology of today has changed all of that.  The most unfortunate aspect of that recipe was that the record companies were essentially deciding for the public which artists would become stars, because they decided who would get a contract.  I have no doubt that a lot of dubious criteria were applied to new and upcoming artists, and simply having talent was not enough.

In the world of today, it is much easier for artists to record and distribute their own music through the internet, reaching millions of people and building legions of fans worldwide.  The only requirement is talent.  Just the way it should be as far as I am concerned.

I believe the hoopla surrounding music downloading is largely the noise that is generated by the record companies realizing that their services are no longer required, and they are desperately trying to find a way to prove their continued worth and justify their continued collection of millions of dollars worth of fees from their artists.

Change can be difficult.  Being fired or getting dumped does not feel good, and coming to terms with the thought that you are not needed anymore kinda sucks.  On one hand, I don't blame them for trying to hang on to their cash cow.  Hanging on to the cow is easier than the alternative choice of changing your business model, or finding a new business altogether.

Nonetheless, this article helped to solidify what I already knew on some level.  It is time for the record companies to go away.  It is time to let the market dictate the price of a CD.  I find it interesting that the manufacturing cost of a CD has decreased over the years, and yet the average price of a music CD is still, what, $15?  It has been $15 for as long as I can remember.  The price of HD televisions has dropped 50% or more in the last year or two, and yet the price of a CD hasn't changed in years?  Give me a break.

I also think the focus for artists SHOULD be on touring and performing live.  It is their job.  I get up every day and go to work to earn a living; touring is the job of a successful artist.  Especially in this world of technology, where a poor vocalist can be digitally enhanced in a recording to sound spectacular, touring is required, in my opinion, for an artist to demonstrate that they actually do have talent.

Talented artists can record and distribute their music for next to nothing.  Why shouldn't it be free?  Many artists have realized that the money is in touring, and if you give the music away for free, you can build a huge fanbase.  A huge fanbase means sold out shows when touring.  Even an artist as popular as Madonna has realized this, opting to drop her record label of 25 years and instead sign an agreement with a concert promotion company.

The record companies like to say that music downloading will kill the music industry.  It won't.  It will only kill them.  The artists will always do what they do, because they love it and because they can still make a good living at it.

Through the years, the evolution of technology has created new jobs/industries and killed others.  The record company of today is the blacksmith of yesterday.  In the end, I think the artists may be better off without them.

19. February 2008 21:05
by Troy
0 Comments

Free Antivirus Product Musings

Recently my Symantec Antivirus 2005 edition support contract ran out and I began to get the nagging, "fear of god" messages from the program telling me that my computer wasn't protected and that I could no longer get the latest virus definition updates.  It was very scary.  Ok, scary may not be the right word.  Annoying may be a better word.

Avast! Home Edition - http://www.avast.com/eng/avast_4_home.html
I started by installing and trying this free antivirus software product.  Early on, I ran into conflicts.  Avast! reported that it could not properly initialize all of its scanning services while Symantec was installed.  My contract had run out, and I was using the older 2005 edition of Symantec anyway, so I uninstalled it to get a clean taste of what Avast! could do for me.

The biggest problem with the free version of Avast! is that you cannot schedule the virus scans to occur on a regular basis automatically.  You must run the scan yourself.  Overall, the software seemed pretty decent and easy to use; however, I want my computer to do things for me, not the other way around, so having to run scans manually was a bit of a deal breaker.  Of course, I could always upgrade to the paid version of Avast!, but my intention was to see what you could get for free.

AVG Antivirus Free Edition - http://free.grisoft.com/
I am currently trying this product, after having uninstalled Avast!.  From my point of view, there isn't much difference between AVG and Avast!.  They both appear to be competent at the function they perform, and AVG is generally user friendly, as was Avast!.  The big limitation with Avast! is not present in AVG, and I am able to schedule a scan; however, only one scheduled task is permitted, and that task is limited in its definition.  It only provides for a daily scheduled scan, and you can only define the time at which it runs.

The most annoying thing I've found about AVG is that updates to virus definitions seem to always require a reboot of the computer.  The scheduled task for updates is similar to that for scans... only one is permitted, and it is daily, or a manual process.  Neither option is perfect, because you have to choose between manually updating the virus software yourself, or possibly having to reboot the computer on a daily basis to finish installing the update.  This annoyance is heightened by the fact that after an update is downloaded and installed, the AVG virus scanner is deactivated until you do reboot, which means delaying the reboot leaves you unprotected.

I am going to leave it another day or so, to see if the updates and reboots are truly a daily event, or if I've been unlucky with a steady stream of updates these last couple days.

So I left it a few more days; here are some of the results.  The following list shows the number of days between reboots, where 1 means the very next day.  This assumes the software updates itself on a nightly basis, so the list provides a glimpse of just how often a reboot is required.

2 > reboot > 2 > reboot > 2 > reboot > 4 > reboot > 2 > reboot > 2 > reboot > 0 > reboot > 4 > reboot > 2

As you can see, AVG has me rebooting my machine almost every other day.  It is unfortunate that their update process involves a reboot so frequently.  It might be time to try out another offering...
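To put a number on "almost every other day", the gaps from the list above average out like this (a quick sketch; 0 is read here as a back-to-back reboot):

```python
# Days between forced reboots, taken from the list above
# (0 is read as a reboot required back-to-back with the previous one).
gaps = [2, 2, 2, 4, 2, 2, 0, 4, 2]

average_gap = sum(gaps) / len(gaps)
print(f"{len(gaps)} reboots, averaging one every {average_gap:.1f} days")
```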

I'll let you know what I find.

Update:
Ok, another oddity I've discovered with AVG.  It detected a virus on my machine, in an old archive file, so I'm not too concerned about it.  But here is the thing: it didn't notify me that it had found a virus.  I had to go into the Test History (double checking to make sure the scans were still running), and that is when I saw that it had found a virus.  In fact, it has been finding this virus with each scan, every day, for over a week.  If I hadn't gone into the Test History on my own, I still wouldn't have known about it.  Weird.

Update (2008-04-09):
I recently came across this site which provides the results of some comparison tests of many different antivirus products.  Worth a look if you are shopping around.
http://www.av-comparatives.org/

Update (2008-09-30):
Ok, well, time has passed.  I am still using AVG.  The latest version (8.0) seems to have fixed the frequent reboot issues that their update process used to cause, which is nice.  I don't notice it running now, which is a good thing... you want it doing the job, but you don't want it annoying you while it does it.

13. January 2008 06:32
by Troy
1 Comment

IIS 6 on a Development box and Premature Session End

I started working on a legacy web application made up of both classic ASP and ASP.NET.  The development box was running Windows Server 2003 x64 with IIS 6.

The application typically uses the standard 20 minute session timeout period, however, for my dev box, I wanted to lengthen that time to avoid having to log in more times than necessary.  I set the classic ASP session to 240 minutes, and the ASP.NET session to 240 minutes in the web.config file.

I then proceeded to go to work.  A short time later, I was redirected to the login page.  Hmm, weird, I thought.  Ah, let it go this time.  Log in and keep working.

The next day, after my initial log in, once again my session timed out after about 40 minutes and I was redirected to the login page.  I double checked my session times, both were still set to 240 minutes.

After a little head scratching, I found my way into the properties of the Application Pool for my web app in IIS.  My attention was drawn to the Recycling tab, where I noted that my application pool would be recycled every 1740 minutes.  Since my sessions were stored "In Process", this recycling would kill my session; however, the timing didn't seem to line up.  Nonetheless, I changed the property to configure the worker process to recycle at a specific time instead: 3am.

On the Performance tab, I found something even more interesting: an "Idle Timeout" setting that would shut down the worker process after it had been idle for 20 minutes.  Bingo!

So even though my session timeouts were set correctly at 240 minutes, I could still lose my session after 20 minutes because the worker process would be shut down after 20 minutes of inactivity... which might be common on a development machine where you are the only web site visitor and you are distracted by other work or working in other applications for a period of time.
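In other words, with in-process session state the session can never outlive the worker process, so during a quiet stretch the effective session lifetime is simply the smaller of the two timeouts.  A trivial sketch with the numbers from my setup:

```python
# With in-process session state, a session dies when the worker process does,
# so during total inactivity the shorter timeout always wins.
session_timeout = 240   # minutes, the configured ASP / ASP.NET session timeout
idle_timeout = 20       # minutes, the Application Pool "Idle Timeout"

effective_lifetime = min(session_timeout, idle_timeout)
print(f"Effective session lifetime while idle: {effective_lifetime} minutes")
```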

Once I turned off the "Idle Timeout" feature for my Application Pool, premature session death became a thing of the past.

12. January 2008 01:41
by Troy
0 Comments

This is my first blog post.

I'm not sure that I have that much to say yet.

I am still getting my feet wet when it comes to the whole blogging thing.  For instance, WTF is a slug?  I see that it is optional, so that is good.  I also see that I can extract a slug from a file.  Hmm, can't wait to google that in the context of blogging and find out what it is.

Well, that is probably enough for my first post.  I don't want to burn myself out too quickly.