Wednesday, December 17, 2008

Degree Complete, Now What?

Those who have followed me know that I've been completing a Master's degree in Economics for the past couple of years. I had to study at a slower pace due to job demands, and I'm pleased to announce that I received my diploma this past weekend, so yours truly is now done with schoolwork.

A common question I receive is "What are you going to do with a Master's in Economics?" It's a good question since I primarily work with computers. I considered getting an M.S. in MIS, but thought that would be too redundant. I wanted to go the econ route because it has always interested me, I can shift my career in that direction one day, and studying econ gives a very good understanding of the vagaries of the economy, which affect every single one of us. Working with data, it has also given me a huge leg up in efficiency and in understanding the business practices of the clients I work with. I was told that my degree had been shopped to a client considering a large data warehouse project because they wanted someone with both data warehousing and analytical experience, and the degree became the differentiator that put me in a strong competitive position.

In short, it's like having a quiver full of arrows and I just added one more. The short-term benefits are limited to the extra money I can make with the education, but the long-term benefits are unlimited. I'm glad I did it.

Tuesday, December 16, 2008

Extracting SQL in Data Warehouses

In working with potential clients, I often get the following question in tech interviews: "What is the best way to extract data from a data source?" To use the name of Andy Warren's blog, the answer is "It Depends".

I was asked this specific question today and gave the response above. Knowing that's an incomplete answer, I elaborated with something like "It depends on the machines that are running both the database and the ETL process."

It goes something like this. Let's say our source system is an order entry system on a mainframe running DB2, and our data warehouse is a Windows-based SSIS implementation running 16 dual-core processors with 32 GB of RAM. We can run the extraction a couple of different ways, including:
- Perform a simple select statement and pull all rows out of the database on the source
- Perform a simple select statement and pull rows with a constraint
- Run a complex query on the database engine to pull the minimum amount of data.

The key differentiator in this dilemma is how much data we need. I've worked on ETL processes that pulled 20 million records in a single ETL jobstream only to use the majority for lookups and then update only about 1 million records. I've also worked on ETL jobs that looked like the third case above, running a complex query with where clauses out the wazoo and returning just a couple rows to the ETL process. I generally try to avoid the first choice above unless we need all those rows. It's also possible and common to constrain the query and have indexes and other tweaks on the database side to speed query performance.
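
To make the first two options concrete, here is a minimal sketch of a full pull versus a constrained pull. The table and column names are made up for illustration, and the constrained version uses DB2-flavored date syntax since that's the source in our example; the real extract would run through the ETL tool's source component.

-- Option 1: full pull - every row crosses the network to the ETL server
select * from ORDER_HEADER;

-- Option 2: constrained pull - only yesterday's orders leave the source system,
-- ideally supported by an index on ORDER_DATE
select order_id, customer_id, order_date, order_total
from ORDER_HEADER
where order_date >= current date - 1 day;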

The way I would do it depends on the relative performance of the two machines. I always try to minimize long-running network pulls, but if we run a significant query on the source system we may negatively impact its performance. Thus, it's a localized question that depends entirely on the environment.

Some good benchmark rules for extracting data are:
- Don't run queries on source systems that have negative performance impacts
- Don't have long running network connections to pull data
- Do utilize the machine with the most power for complex tasks
- Use all available processors to complete tasks faster
- Use specialized database connections (vs plain ODBC) to improve performance

Finally, always remember the correct answer is "It depends".

Monday, December 01, 2008

Computer Update

Hello again. As if you're not tired of hearing my long-running discussion about work computers, here's another one, and probably the last for a while.

After my latest reformat of my Dell computer, it's been running like a champ.  Running so well that it seems brand new.  I'm very pleased with XP Pro, and every day I'm reminded why I don't like Vista.  I was visiting one of my clients the other day and a mass comm email went out to the group saying that in 2009 there will be no upgrading to Vista, and this is a company with thousands of PCs.  I have yet to work with any clients running it, and it seems that Microsoft acknowledges the issues with Vista and is pushing Windows 7 out for an early 2010 release.  From the specs I read a few weeks ago, Windows 7 is going to be a great system and use significantly fewer resources than Vista.

My biggest concern with laptops is the screen.  I prefer to use 17" lappies, and the screens vary from good to poor depending on the model.   When I bought the Dell in 2005, I didn't get the HD screen since it had a glossy finish, and there is not a single Windows-based consumer machine on the market today that doesn't have a glossy finish.    I purchased a cheaper HP laptop this year to see if the gloss would be as bad as I thought, and I'm sorry to say that it is.  The colors are very vivid, but that computer is worthless in a majority of the settings in which I need to do work, typically corporate offices or outside on a beach.   Dell offers a business 17" model (Precision line) which is very expensive and the same form factor as my current comp, which looks dated.    My searching led me to the Apple MacBook Pro 17, and with the current new line of MacBooks having gone only to glossy screens, it was now or never for the 17, lest Apple only offer glossy on that one as well.

I'm typing this post from my new Apple MacBook Pro 17 with an HD screen (1920x1200) and a matte finish.  I must say this is the most beautiful computer screen I have ever worked on.  It's LED-backlit so it's bright and easy to read, and it offers a lot of real estate for remote desktops and development apps.  It has a 2.53 GHz C2D processor, 4 GB of RAM, and a 320 GB hard drive.  I installed VMware Fusion 2.0 and run Windows XP Pro on it.  It took me less than 2 hours to have this computer set up exactly like I wanted, and I even have a W2K3 virtual machine with all my servers on it.  In short, I finally own my dream machine.

I'm sure I'll catch hell from the guys for using an Apple, but the screen sealed the deal.  I would only be happier with this purchase if it were free.

For now, the laptop search is over and I'll try to get back with some consulting stories.  Remember it's all about the Damn data.  

Tuesday, October 21, 2008

Code Camps vs. SQL Saturday

I had a good conversation with our developer evangelist (Joe Healy) the other day, and he mentioned that SQL content is way down at code camps due to the pull of SQL Saturday. As I was preparing my latest presentation for this weekend, I felt a little excited. I then began to wonder why I was excited to prepare slide decks, and realized that I hadn't given a talk since May, and that I've only done four this whole year (February in Miami, March in Orlando, May in Jacksonville, and October in Orlando). In 2006 and 2007 I did 7-8 talks per year, so either people are getting tired of hearing me talk, or I am not getting out as much as I used to.


After a few minutes of thought, it dawned on me that Tampa code camp was moved to December, and I missed Jacksonville and Tallahassee due to traveling to client sites. So the problem is not that I'm doing less, it's that I'm doing different things. I would definitely rather work and make money than give talks for free, but I do get a great return of satisfaction from having passed on some knowledge at these events.


Going back to Joe's comment, he is correct that as the events have pretty much split in two, there is less database content at code camps. I begged and pleaded with Keith to give me a SQL track at Tampa code camp a few years ago, and while we had good representation, there was never the flow of content that I think would make it an awesome winner. I'm hoping to keep some sort of presence for the code campers who want to learn database topics but aren't interested enough to go to a SQL Saturday.


As I learned when I was a kid, if you can't find a road, make one.    Keep your eyes tuned to the upcoming 'Day of Data' event.  

Monday, October 20, 2008

Installing SQL Server 2008

I recently installed my first copy of SQL Server 2008. I used the Enterprise edition and installed it on Windows Server 2003 running in a VPC. This is certainly not the ideal setup for any kind of benchmark analysis, but it was a good experience.

The installation proceeded very easily. I first had to install the .NET 3.5 Framework, reboot the machine, and start the full installation. I chose Windows authentication and left everything at the defaults, using the same user accounts throughout. This is not a good practice for a production environment, but if my laptop is considered a full-scale production center, consider me guilty.

Having done a 'select all' on the components, it took just over an hour for the full software to load, but I was very pleased to see all the services come up without so much as a reboot. I've been playing around all night with Change Data Capture (CDC for short), so I hope to write about it soon.
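
If you want to play along, here is a minimal sketch of turning CDC on for a database and a single table. The database, schema, and table names are made up for illustration, and SQL Server Agent needs to be running for the capture job to harvest changes.

-- Enable CDC at the database level (run as sysadmin)
USE DemoDW;
EXEC sys.sp_cdc_enable_db;

-- Enable CDC on one table; SQL Server creates the change table and capture job
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'DAILY_ORDERS',
    @role_name     = NULL;   -- no gating role for this demo

After that, changes to the table land in the cdc.dbo_DAILY_ORDERS_CT change table and can be queried with the generated cdc.fn_cdc_get_all_changes_dbo_DAILY_ORDERS function.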

We're up and running, and that's a start.

Sunday, October 19, 2008

SQL Saturday - Orlando

I'll be attending the 2nd annual Orlando SQL Saturday event on October 25. Andy Warren and the OPass crew host this event; they did a heckuva job last year, and I'm sure this year will be even better.

My session, "Building a Data Warehouse using SQL Server 2008", is all new, with a great new data warehouse example. I have been using the airline example, but we'll step it up a notch and look not only at how a major airline can use SSIS, but also at how a major manufacturer can. I'm really excited about this presentation, as I believe SSIS 2008 will be the version that finally gets the respect it deserves in the marketplace.

I already have the slide deck done and posted on the website (wow, that's a record for me) so if you want to print it off beforehand you can get it at http://durableimpact.com/resources.aspx

Hope to see you there.

Saturday, October 18, 2008

Complete System Reformat

I have written before about my desire to replace the Dell laptop that I use for business with a newer model. After months of fruitless searching, I found the only suitable replacement - an Apple MacBook Pro 17. The steep price tag of the Apple had me reconsider exactly why I needed a new laptop, and I came up with the following reason: my Dell is slow. Slow is hard to quantify, but the hard drive is constantly seeking, and then I had an ah-ha moment. I remember this Dell was quite snappy under Windows XP Pro but seemed to labor under Vista. What if I took it back to XP Pro? Would the speed be acceptable enough to get another year's life from this machine, which has served me well?

Today I found the answer. I just spent 6 hours of my life reinstalling everything from Windows XP Pro to Office 2007 to SQL Server. I was able to save my virtual machines, and VPC is running 10x faster now than it did before. The only negative I have seen is that the Windows Mobile Device Center is no longer available, so it's back to ActiveSync for the cellphone.

Why do so many people have problems with Vista? I think the software itself is not too bad, but it is definitely a bloated O/S. My Dell is maxed out at 2 GB of RAM, and with Vista and VPC I would fill it all up. Doing the same on XP Pro, I am still at 1.2 GB with everything running. I think I will be very happy with this reformat, and thus be able to squeeze another year out of this machine.

Aside from dealing with a slow Vista system, I have had some problems with WM 6.1 on my Sprint Mogul. This version of the software seems to have memory leaks, such that the memory fills up and necessitates a reboot every 3-4 days to keep everything running OK. I'm slightly surprised and disappointed that Microsoft has not yet fixed these errors. The 6.1 software is 10x better than the old 5.0 that I had on my last phone, but I am definitely looking forward to Windows 7 for both the laptop and phone. Let's just hope that MSFT puts a lot of work into slimming down both systems to make them run better.

Saturday, October 11, 2008

IE 8.0 Review

Hello readers. It's been a while since I posted and I apologize for that. Let me start with a comment about Internet Explorer 8, which I have installed on my Dell laptop. Yes, this is still my main computer, since I have not found anything on the market to satisfy my needs (see earlier post about laptops). Apple is coming out with a new line this week which might be promising.

Anyway, I installed IE 8.0, which is supposed to be more stable and quicker than IE7. Let me say that I use IE for everything, so I can't compare it to Firefox. IE8 does have some nice right-click menus (called Accelerators) which enable different functions (Blog with Windows Live is one). The concept of accelerators holds a lot of promise. With regards to websites, I've had no issues with about 95% of pages; some of them work best in compatibility mode, which is a great inclusion for the browser. It does seem quicker with less overhead, and the 3 or 4 times that it's crashed, it pulls itself right back up where it left off, which is nice recovery work. The one complaint I do have is how much the browser slows down with Flash pages and Adobe PDF files. I'm not sure why that is, but let's hope it gets fixed for the final release.

All in all, a nice upgrade from IE7.

Sunday, August 31, 2008

The Life of a Traveling Consultant

Hello from here and there. Today I was flying on AA, sitting in my seat waiting for takeoff, when a beautiful AA Boeing 777 came by on departure. The percussion from the jet engines caused my heart to skip a beat, and as I watched that beautiful jet gracefully defy gravity and lumber into the sky, I thought it would be a good time to write about life on the road, so here it is.

I had often dreamed about the glamorous life on the road, thinking it would be all about staying in nice hotels, flying commercial airplanes all the time, eating in different restaurants, and meeting some very interesting people, and I'm pleased to report that's exactly what it's like. In the past I would just stand in awe at the airport, watching the lucky business travelers work their way through the system and know the behind-the-ropes scene of current-day air travel, and now I'm one of them. I had a great inoculation from the excessive personal travel I've done the past few years, so much so that I had already attained status on airlines and hotels prior to becoming a modern-day road warrior.

Being a road warrior is not for everyone. For instance, my car spends more time at the airport and in the garage at home than it does being driven - for me that's a plus since I'm not a fan of manejando (driving, in Spanish). This kind of lifestyle becomes readily apparent when working at a client site. I have more than once found myself with a corporate mentality of "heck, it's 5 PM, time to go home," only to remember that I'm not home and leaving would just mean more hours sitting in a hotel room reading, so why not stay at the office a little longer and get more work completed. Thankfully most hotels today have wonderful service and beds, and I have found myself enjoying the accommodations quite well, even if sometimes I would like to just be in my own home, watching a baseball game while sitting on my couch.

Another benefit (or not, depending on your perception) is continuously learning how other companies operate. Just like any corporate job, I have to get in there and play a game of politics to find out who the power players are, who gets stuff done, who knows what, and importantly, who not to listen to. In past experience, I've found some of the most interesting information from people in the last category, as they might be clammed up in the corporate world but have a lot to offer to an outsider (like me). One thing I've learned is that each organization has its own quirks and monkey hoops that you have to jump through to get even something menial completed, but accepting this is just part of the game.

I'm going to write more on this later and tell stories of some of the battles I've fought and share some of the knowledge I've gained, but for now I have some free time, and personal time is short for us road warriors, so I ask that you carry on, enjoy your life, appreciate the small things, and have a great Labor Day.

Monday, August 04, 2008

TDWI Membership, Is it worthwhile?

The primary organization in data warehousing is TDWI - The Data Warehousing Institute. Many of you might also be aware of PASS for SQL Server, or the Oracle user groups for Oracle. Today I want to talk about TDWI and get some feedback from you as to the value you find in your membership.

I was a TDWI member but am not currently. I attended two TDWI World Conferences, in 2005 and 2006, and while I learned a lot at the first conference, the second was interesting but almost the same content. A lot of the comments I've heard from contacts are that TDWI conferences often have the same courses time after time, which is good for new attendees but not so good for repeat attendees. The biggest complaint I have about the conference was that my contact information was sold and I received about 20 phone calls per week for six months after the conference, and at least 1-2x a week for 2 years following my attendance.

Along with the membership comes access to reports and website information. There are some good case studies and other reports, and I often receive emails asking me to fill out a survey about salaries and projects. One complaint I have about these surveys is that by filling them out, I give them information which they resell to others without receiving anything that helps me in my own competitive situation. Most of the salary information is also available at salary.com. To be fair, I have found some surveys to be worthwhile (companies starting projects, etc.), but again not worth the $299 per year.

Now TDWI is trying to branch out into forming user groups. My suspicion is that their membership is declining and they are looking for more ways to reach out to the community (and generate more membership sales). The brilliant part of this strategy is that they get a huge networking community together, and that definitely has a lot of value for the membership.

In short, I think TDWI is useful and a conference is worthwhile to attend every couple of years, but I'd like to hear any comments about the value you have received from your membership.

Tuesday, July 29, 2008

Continued: How many packages do I need?

The other day we looked at a scenario that described the difference between one and many packages. I also put a ghoulish picture of a nasty-looking package in my post to scare you away from building one large package when organized, smaller packages are the better choice.

Let's look at the following case scenario. Great Plains Foods, Inc. has hired us to create a database load process. Their requirements are that the data is loaded from flat files every day by 10 AM and that any failure generates an email to their production support control. We are to load 5 data files into 5 tables. The information below shows the filenames, expected daily row counts, and destination tables.

File Name     Expected Count     Target Table
Orders A      5500               DAILY_ORDERS
Orders B      1200               DAILY_ORDERS
Production    7500               DAILY_PRODUCTION
Employees     350                DAILY_EMPLOYEES
Shipments     900                DAILY_SHIPMENTS

From observing the data, we see that orders files A & B load into the same table. A question for the client should pop into your head at this point: "What is the difference between the Orders A and Orders B files?" Our business contact at Great Plains tells us that the files are the same, but A comes from computer orders and B comes from telephone orders. Now, from an architecture standpoint, how many packages would you create for this job?

I would create 4: Orders, Production, Shipments, and Employees. The Orders package will read from two files while the others will each read from a single source file. I have seen single packages load all of these tables in one job, but that kind of architecture does not leave any room to manually load a single file or to handle data-related issues and failures.
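
As a minimal sketch (assuming comma-delimited files with a header row and hypothetical file paths; the real implementation would use SSIS flat-file sources and a data flow), the Orders package essentially performs two loads into the same table:

-- Load Orders A (computer orders) - path and delimiters are assumptions
BULK INSERT dbo.DAILY_ORDERS
FROM 'D:\feeds\orders_a.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- Load Orders B (telephone orders) into the same target table
BULK INSERT dbo.DAILY_ORDERS
FROM 'D:\feeds\orders_b.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

Keeping this in its own package means either file can be reloaded by itself if one of them arrives late or fails validation.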

What would you do in this scenario?

Monday, July 28, 2008

Package Failure Notification

One way to guarantee disagreement in a room full of ETL developers is to begin a discussion on how to send notifications when a package fails - and that argumentative topic is the subject of our discussion today.

When I'm working with clients designing ETL jobs, one of the first questions I ask from my fact-finding checklist is "What are the package failure notification requirements?" The answer varies not only by client but also by team member within the same organization. This simple question often has to be escalated up to a senior manager to get the correct answer. Developers often want to be notified by email, preferably next-day notification, while team leaders and managers want people notified by pager or phone call when the package is a critical data task. I've even found developers on the same team split between next-day email and an immediate phone call from operations control, even at 2 AM.

There has to be a middle ground in notifications. When business-critical metrics are at risk of not being met, immediate notification is a reasonable requirement. If no one will be looking at the data until Monday and it's Wednesday, an email should be sufficient. Thankfully, SSIS has a Send Mail task that can be used to generate emails for failures or abnormal operation. A good example of using the Send Mail task is when a process runs but only processes 10,000 records when the daily average is 100,000. In this case, a proactive email can be sent to the developers to do some research before the business metric is published wrong and senior management notices. DataStage 7.x also has notification capabilities when using the Sequencer and a notification activity.
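
Here is a minimal T-SQL sketch of that row-count alert idea (the table, threshold, mail profile, and recipient are all made up for illustration; in SSIS the same check would typically use a Row Count transform, a variable, and the Send Mail task):

-- Compare today's load against the expected volume and alert if it is suspiciously low
DECLARE @loaded INT;

SELECT @loaded = COUNT(*)
FROM dbo.DAILY_ORDERS
WHERE load_date = CAST(GETDATE() AS DATE);

IF @loaded < 50000   -- well under the ~100,000 daily average
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'ETL Alerts',             -- assumed Database Mail profile
        @recipients   = 'etl-team@example.com',   -- assumed distribution list
        @subject      = 'Daily load row count below threshold',
        @body         = 'The daily load processed far fewer rows than expected. Please investigate before the metrics are published.';
END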

The decision on notification activities should be reached with as much consensus as possible within the organization, and thankfully the major ETL players in the market make it easy to build notification processes into ETL jobs.

Tomorrow we will do a deeper dive into yesterday's topic of splitting up ETL jobs so stay tuned.

Sunday, July 27, 2008

One, Two, or More Packages?


For today's topic we will look at a major ETL architecture consideration: how many tasks should we have in an ETL package?

One of the beautiful things about ETL is the visual data flows, which really help the architect consider the best way to move and transform data. Alas, each ETL architect has a different opinion on when to split tasks out into separate packages versus putting everything in a single package.

I'll use another consulting experience I have seen - clients who want to put all kinds of data derivations in a single package. My general rule of thumb is to split off packages where there is a logical split in the data - for example, have a separate package for order data and another for sales data. Sometimes there are other reasons to have separate packages - maybe you want one package that loads a SQL Server database and another that writes to an Oracle instance. Normally I would just use Expressions to keep from creating multiple jobs, but consider the case where there are two processes that are similar but not identical, where using an Expression would not be practical.

Another key consideration is for restart capability in event of failure. I don't want to design a package that loads up data but has no ability to restart in case there is a referential integrity violation in the middle of the load.

For more information, see my ETL Architecture design document and/or the ETL Architecture Considerations presentation I did in June 2007 on the Durable Impact website to get a better idea of some of the considerations used to determine how to break up processes into separate packages.

As a final note, don't have a package that looks like the one above.


Saturday, July 26, 2008

Laptop as primary work computer?

Today I'll diverge a bit from talking about software and discuss hardware, and a topic that everyone has a strong opinion about: does a laptop suffice as a primary work computer for us techies?

For my situation the answer is "yes, but." The "but" includes a nice external widescreen LCD monitor and an external keyboard and mouse for the house. My personal config is a 24" Dell LCD, and ideally when I get a bigger desk I'll get a second 24" LCD, but two monitors might be problematic with laptops that generally have one VGA out. Some of the nicer laptops have VGA and DVI, but more on that later.

I prefer to use laptops because * ding ding ding * I bet you can guess the answer - they are portable. The newer generation of laptops is sufficiently powerful to run every piece of software I need, and it keeps me from having a server stack in my office, which is good because I sold them all last year.

This topic came up today because I am looking for another DTR laptop (DTR = desktop replacement). My choices are the Dell Studio line and the HP dv7t. I'm not overly excited about either one of them because they cut a lot of important things like DVI output and high-end speakers, and removed the option for matte screens, which is a nice option in an office setting. I don't buy the business machines because they do not offer a 17" screen for whatever reason.

My ultimate goal is to have two laptops set up and ready to go at any time: one set up with the Microsoft BI suite and another set up with the IBM BI suite. Once I choose a computer and get them set up, I will write a review about it.

Do you use a laptop in place of a desktop?

Friday, July 25, 2008

DataStage 8 New Features

I have recently taken a good look at the IBM WebSphere DataStage version 8 product offering. v8 has very tight integration with the WebSphere product and is the first true IBM iteration of the previous Ascential tool suite, and it shows. Installation on a Windows 2003 Server was a really painful process. First, I read the entire manual, which has a lot of steps to create users and set permissions on files and folders. Then I installed the software and had repeated failures when configuring the WebSphere product. The installation was failing for unknown reasons, and each time I would have to remove DB2 and WebSphere and restart the process. Finally I was able to successfully complete the installation and get the software running.

v8 installs both DB2 and the WebSphere server, but it's possible to use SQL Server or Oracle for the metadata server instance. DB2 is not a bad database for those who haven't used it, but there are definitely some quirks (especially with dates and times) for those used to the other platforms. For right now I am using v8 with DB2 to keep things simple, but I will demonstrate the v8 platform using SQL Server, DB2, and Oracle as the metadata repository to test the performance of each platform.

After I finally got the software installed and rebooted the machine in preparation for taking screenshots for this blog, I was unable to get the WebSphere server running, which means I cannot log into the Information Server portal. Once I troubleshoot this issue I'll post a fix, as I suspect I'm not the only person to have this problem.

Thursday, July 24, 2008

Business Intelligence Development Studio, or BIDS for Short

Part of the allure of the Microsoft platform for business intelligence is BIDS, the development environment used across the platform. BIDS is built directly on Visual Studio and makes for an easy transition for anyone familiar with VS. I actually like BIDS, although I do have some gripes, and I understand there are a number of changes in store for BIDS 2008 that will be detailed in a later post.

One of my main beefs with BIDS is that sometimes it's overly complicated to do something. Microsoft has always been known for giving developers full control of the environment, and BIDS is no exception, almost to the point of there being so many options in the tool that it's easy to get overwhelmed. I always put the options I use most often on the toolbar so I don't have to wade through menu after menu. It would be great if Microsoft used the ribbon in BIDS 2008 as they have in Office 2007. The first time I saw the ribbon I was intrigued, and I soon became a fan and don't even like to use older versions of Office anymore.

The main positive about BIDS is that there is full control over every aspect of the environment. A key annoyance is all the docked windows, but it really is a love/hate relationship for me: I love having access to them and I like being able to hide them, but I find myself constantly pinning and unpinning windows when working on my laptop, although it's not so bad when I'm working on the 24" LCD in the office.

What changes are in store for BIDS 2008? You'll have to wait a couple of weeks to find out - I am giving a presentation on August 11 about the 'Features of SQL Server 2008' that you will find interesting.

Wednesday, July 23, 2008

The Importance of Data Quality

The key concern in data warehousing is data quality. If you have seen me present, you probably won't forget my emphasis on having data quality mechanisms built into ETL processes. Let me tell you about a scenario I dealt with at a client recently.

The client wanted an ETL process built that would do the following:
1. Pull all records from table A with an insert date of yesterday into a flat file
2. Delete all records from table A
3. Run this process daily

Okay, so this is simple enough. I set up an ETL job that read the rows from the table and put them into a flat file. I then prepared a second job that would do the delete with the same date criteria. The purpose of doing this in two jobs was to ensure the backup file was created and populated before any delete process executed.

I tested this process and the counts looked great. I implemented it in production and checked the counts of the two processes in the morning, and they were different. I thought something must be horribly wrong because I had tested this process. After doing a little research with the client, it was determined that Table A was being updated over a 24-hour window, so by the time my process ran in the morning, not all of the previous night's records had been loaded, resulting in records being deleted that were not in the backup file.

The solution to the problem was to set the date back to 2 days prior, and voila, the record counts between archive and delete matched up. Having the most recent records excluded from the archive was not a problem because that information is still replicated on the source system.
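
For illustration, here is a minimal T-SQL sketch of the archive-then-delete pattern with the count check added (table and column names are made up; the actual jobs were built in the ETL tool and wrote the archive to a flat file):

-- Archive and delete the slice of records from two days ago, verifying the counts match
DECLARE @cutoff DATE = DATEADD(DAY, -2, CAST(GETDATE() AS DATE));
DECLARE @archived INT, @deleted INT;

-- Job 1: count (and export) the rows being archived
SELECT @archived = COUNT(*) FROM dbo.TableA WHERE insert_date = @cutoff;

-- Job 2: delete only the same slice of data
DELETE FROM dbo.TableA WHERE insert_date = @cutoff;
SET @deleted = @@ROWCOUNT;

-- Simple data quality check: the two counts must agree
IF @archived <> @deleted
    RAISERROR('Archive and delete counts do not match - investigate before continuing.', 16, 1);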

The purpose of this post is to reiterate that doing simple checks of record counts between processes (especially delete processes) is a pressing and fundamental part of data quality. As data systems architects and developers, it is our responsibility to build in checks to ensure requirements are met. Had I never checked the process after moving it into production, the discrepancy might never have been discovered. Either way, these checks are part of the data quality stewardship that is so important in today's business intelligence systems.

Friday, July 11, 2008

Datastage Lookup vs. Join

Today's topic is a deep dive into ETL architecture considerations for the IBM WebSphere DataStage product. Specifically, I am going to discuss the difference between the Lookup and Join stages in version 7.5. For those who might be confused, version 7.5 here covers 7.5.1, 7.5.2, 7.5.x2, 7.5.3, etc. I've worked with clients on different variations of the tool, but the functionality of these stages is the same.

This is a real-life case study of a problem I had designing a job for a client. The job read from a source table, with a driving table containing the rows that were to be pulled. A generic SQL query to do the same thing would look like:

select * from BIG_TABLE where record_type IN (select record_type from REF_TABLE where process_date = current date)

In DataStage I could have done a couple of different things. I could have used a database stage to pull the rows with the join in a single stage and had good performance by forcing the DB engine to do the majority of the work. However, for maintenance purposes I chose to use a couple of different stages, because the client liked having the stages on the palette so the people who support the jobs can easily see what is happening. Fair enough....

Thus I had the option of using the Join, Lookup, or Merge stages. Merge is designed to take two similar datasets (an original set and an update set) and merge them together. Since my two tables had different metadata, this was not the best tool for the job. Eliminated from consideration.

The Join stage has options to perform a left, right, inner, or full outer join. Without going into extreme detail on the four types of joins (research SQL joins), I could set up the driver as the left table and the reference as the right and match the rows on the date column. Okay, so this solution would work, but it wasn't really what I wanted to do, so I went to the Lookup stage.

Lookup is similar to Join except that it looks up a row in the reference dataset for each row in the source dataset. This is memory intensive and should be avoided with large reference datasets, and I was dealing with millions of source rows. However, I knew that my reference dataset would only have a maximum of 9 rows per run, and there is a nifty little option to have the lookup table loaded into memory. Ah ha! Now I can load the reference table into memory, perform the lookup against the rows in memory, and get some great performance. For very large reference datasets the Join will use less memory, but the Lookup worked perfectly for this application.

With some further tuning of the job, which initially ran 45 minutes per run, I was able to get it down to 6 minutes. That's a huge performance increase for a little time spent thinking about better ways to architect the job.

Do you have a good example of where a simple ETL architecture change saved you a large amount of processing time?

Tuesday, July 01, 2008

The Great Debate: Employee vs. Consultant, Part 3

This is part 3 of my series on the debate between being a W2 employee and a 1099 contractor as it relates to the technology world.  In my previous posts, I concentrated on the following points:

-          There is too much focus on provided benefits and no easy way to attach a dollar value to them

-          Without a dollar value on benefits, it's hard to determine your real wage

-          It’s hard to make a good decision on what to do without knowing your real wage

My earlier conclusion was that many people would be better off as contractors instead of employees if they truly looked at the dollar amount of their benefits, but this summation comes with the following major caveat:  the person should be entrepreneurial in nature and be willing to accept the uncertainty of continued employment.  Working as a W2 employee, you have some protections against zealous bosses and other events out of your control that you do not have as a 1099.   Furthermore, many companies have a vested interest in promoting a stable workforce for support purposes, and having a staff of consultants who come and go does not help with this goal.    This is a key concern of many hiring managers I've talked to – they are concerned they will get a 1099 who will come in, develop some stuff, and bail quickly to go make $10 more an hour on another project.  It's a valid concern because this has happened a lot in the past, or the consultant walks away without documenting what he/she did, leaving the rest of the people in a lurch.  These kinds of situations can be rectified by open, honest communication between the supervising manager and the consultant, but the key point to remember as a consultant is that unhappy clients do not result in additional work.

Some of the feedback I received from parts 1 & 2 revolved around the uncertainty (risk) factor.  Using economic techniques, risk can be modeled if you know the probability of each outcome, but there is no real way to know the outcomes in most job situations, so it's best to plan on billing the client a rate that takes into account your desired risk premium.  For example, let's look at a DBA who can work as a W2 for $80,000 or as a 1099 for an hourly wage.  This DBA decides that he needs a risk premium of $50/hr to do the job, putting his hourly rate at $150/hr.  This is a pretty high rate for a DBA, so unless there are special skills involved it's not likely this DBA will find many clients, and he should probably stick to a W2 job.   Another example (quite common) is the consultant who likes to work 6-9 months a year with some downtime to travel the world or do whatever.  In this case, the risk premium would be substantially less because the person is hoping to be out of projects for a time.  Someone wanting to work 6-9 months a year is definitely not a candidate for a W2 position.

The biggest and most desired benefit is health insurance.  If I were to put on my economics hat, I could come up with 10 reasons that employers should NOT provide health insurance (and a few reasons they should, mainly adverse selection), but since this is a technology blog, let's accept that most employers make health insurance the key benefit in their compensation package.  My first W2 job provided 'free' health insurance and it was a good plan.   I've seen good plans and I've seen bad plans, but this is definitely a consideration for anyone who has a family to support.     I'm not going to say a whole lot more about this benefit except: look at the plan closely, try to attach a free-market price to it, and use that in your job search.     Two plans with a similar bi-weekly deduction may have drastically different out-of-pocket expenses or doctor networks.   Caveat emptor!

By and large, the biggest benefit I see for companies is the ability to have flexible scheduling. Having a flexible, motivated workforce is key to economic growth.  This is why you will mostly see large development projects staffed with 1099s who have the specialized knowledge to perform the work, leaving the post-implementation support to the employee base.  The best projects I've worked on have done exactly that – design, develop, and build the solution, then train the employees and move on.   These projects are very rewarding and a key part of the reason that I enjoy working as a 1099 worker.

In summary, working as a W2 employee vs. a 1099 worker can be drastically different depending on your personal values and associated preferred wage.  I like to manage my own career, benefits, and time off versus having to follow a corporate template.  Good luck in whatever YOU decide to do, but I hope this candid discussion has informed you of some of the considerations.

Wednesday, June 18, 2008

Community Speaking - Thoughts

My buddy Andy Warren over at End2End put up an interesting blog about what drives people to speak at community events. I thought I would share my own story here.

The first time I spoke, it was about increasing my own knowledge of the product. If you can teach it, you've mastered it, or so the saying goes. Teaching has forced me to stay sharper and more on top of my game (less complacent) than I would otherwise be.
Andy surmises that some people speak to do de facto networking. I will vouch for this – I have had many great discussions with people who've attended my sessions, asking about future projects or my thoughts on a topic. He also talks about the couple of guys who just love contributing to the community – and there are a few of them out there, with the key word being few. Some people just love giving something to others, and speaking about technology is a form of community service, if you will.

I can say from personal experience that the sheer number of events in the state of FL is starting to become overwhelming. The code camp circuit has recently expanded to add Southwest Florida (now totaling 6), with 4 SQL Saturday events per year. Add in the local user groups who need speakers, and by the end of the year I will have spoken in Tampa twice, Jacksonville twice, Tallahassee twice, Orlando three times, Miami, and Naples. That's a lot of money spent on travel, but IMO it's worth every penny. My primary concern about the high number of events is speaker fatigue – most of us are working schleps in one way or another and have to give up what little free time we have to give back to the community. It's worth it for now – but if another layer of events is added I will have to pick and choose which ones provide the best forums for both myself and my audience.

Tuesday, June 17, 2008

Employee vs. Consulting, Part 2

Let me begin by clarifying that W2 workers are employees and 1099 workers are contractors. 1099 workers are not employees of anyone as defined by the IRS, so I want to correct my previous post, where I was too liberal in my use of the word employee.

My earlier synopsis was that most people would be better off working as 1099 contractors. There are numerous reasons, but the first is to make the cost of benefits transparent. The true cost of seeing the doctor when you sneeze twice in the morning is much higher than your $20 co-pay. Cash consumers also have a lot of leverage because they eliminate the middleman insurance company. Vacation time is very hard to value – some people don't use it all, and worse yet, some companies don't allow their employees to use accumulated vacation time. For those economists out there, there is a labor/leisure scenario that goes like this: "As a 1099 I make $800 a day when I'm in the office, and if I want to take a day off work to go fishing I need to get at least $800 of value out of my fishing, else I should work." It's pretty easy to put a valuation on leisure activities when you know your wage, but as a W2 employee it's more like: "I get 2 weeks off a year, so I'm going to be fishing at least 2 weeks this year and hopefully 10 more days when I'm supposed to be home sick in bed." It's pretty clear to see that scenario 1 is more efficient for both parties as to the valuation of time and money.

There are also some rules that companies must abide by with 1099 workers that are not true for W2 employees. For starters, 1099 workers set their own schedule. They are also given project milestones but not direction. They are sometimes provided with office equipment. W2 employees are told when they can and can't work; are given training, direction, and specific instructions; and are provided everything they need to complete the job. The reason for these rules is that 1099 workers do not collect workers' compensation insurance. One of my friends had a 1099 worker who didn't pay his personal taxes and came back and claimed he should have been a W2 employee. My buddy was not impressed, and it's sad that some people reap the benefits and expect others to pay the piper. As a 1099, I prefer to use my own laptop during client visits because I have all the software I need to document and develop, and I always have unfettered email and internet access via my Sprint card. At some client sites it's difficult to carry on a phone conversation because of privacy issues, and email is a confidential and quick substitute.

I also know people who have done the contract-to-hire routine. This is generally touted as a way for both the employer and the worker to get to know each other before committing to a long-term (W2) relationship. Unfortunately, I think it's biased toward the employer. The employer can decide whether or not to extend an offer, just as the worker can choose whether or not to accept it. But the worker is given a hard deadline to convert or leave, and this is generally at the whim of the hiring manager's budget. I've known friends who wanted to roll W2 but were waiting 8 months to start the project they were K-to-permed for, and didn't feel comfortable rolling until the project started. That situation turned out to be a lose/lose for both parties, because the company didn't get a good worker and the worker lost a good long-term relationship, all because the contracted 'dating' period wasn't long enough to provide any benefits.

This post is becoming long enough for a part 3, so I will post more about this in a couple weeks when I get some time. Next I’ll explore the difficulty of finding 1099 work vs. W2 work, and where that trend is heading in the future. Stay tuned and happy data warehousing.

Thursday, June 12, 2008

The Great Debate - Employee vs. Consulting

Thought I would share an interesting dilemma I found myself dealing with over the past couple of weeks.  This is the consultant's dilemma of whether to stay on as a consultant (or 1099 employee, for commonality) vs. going direct and working for someone as a W2 employee.

First, there are numerous benefits to being a W2 employee.  The true benefits are the ability to receive workman's compensation in the event of unemployment or injury, and the protection of a corporate HR department if there is a personnel issue.  Other ancillary benefits generally include medical insurance, vacation and sick time, and a retirement plan.    The drawbacks of being a W2 employee are that it's more difficult to move on to other projects quickly, and generally some amount of pay is given up to provide the above benefits.

There are also benefits to being a 1099 employee.  The major benefit is freedom.  This freedom is being able to pack up and leave at a moment's notice – which truthfully isn't used that much if one wants to maintain satisfied clients.  There are also other benefits, like being able to manage your own career vs. following a corporate career path, and much more flexibility with scheduling.  The drawbacks are that you are responsible for providing all benefits and there is no nanny organization wanting to take care of you – i.e., companies do not have a vested interest in the well-being of consultants as they do their own employees.

All that said, I've been studying and practicing economics long enough to know that there is a certain opportunity cost associated with being an employee.  I did a little research into the origin of medical insurance being provided by corporations, and it generally was a benefit introduced after WW2 to attract the best employees in a brutally competitive labor market.  Another offshoot was the good ol' pension plan, which has almost universally disappeared from corporate offerings.   Vacation and sick time provisions were also introduced around this time, and now we have the model for the current workplace.

Is this model broken?  I'm inclined to believe yes.  Here's the economic argument: job seekers blindly look at salary and benefits to determine their offered compensation package, yet it is very difficult to rank different packages side by side.  If company A is offering a salary of $90,000 + $200 a month for health insurance + 2 weeks vacation + 2 weeks sick time per year, is that better than company B, which is offering $93,000 + $300 a month for health insurance + 3 weeks vacation + 2 weeks sick time per year?  Out of the gate B looks better because of the salary difference ($3,000 - $1,200 health care difference = $1,800).  However, what if company A will roll over vacation and pay it out upon departure but B does not?  Now the equation is different.   Job seekers also don't know that company A gives 4% raises per year while B gives 2%.

Everyone who has insurance knows full well that costs have risen substantially in the past 10 years.  A good portion of those costs can be attributed to the prevalence of insurance that subsidizes unnecessary procedures.  An invention to fix this, the HSA account paired with a high-deductible plan, is designed to put consumers in charge of their own insurance, on the assumption that someone paying cash will do only necessary, value-oriented procedures.  It has been shown time and time again that in a cash society the consumer is a winner.

 

Being a 1099 worker is not for the faint of heart – it's easy to see steady, well-paying work cancelled at the drop of a hat.  A couple years ago I worked with a number of contractors who were brought in for a project, moved from out of state, and were cut after 2 days on site due to cancellation of the project.  The upside is that contractors are generally paid an hourly wage that includes a premium for accepting this kind of risk and for the loss of benefits.  Someone who is smart can take advantage of this situation to provide themselves with a higher level of salary and benefits at a much lower cost than what could be provided by a corporation.

 

Notwithstanding the obvious corporate goal of promoting employee stability and retention, I believe most people would be better off as 1099 employees.  This would allow the following:

1.       People are judged by their output and not by padding the clock by showing up on Sunday to clean their desk for 4 hours

2.       You don’t get paid to socialize around the water cooler discussing last-night’s episode of Survivor (Reduction in office politics)

3.       You are able to best manage your own finances and benefits, which in turn reduces the cost of providing insurance to all

4.       Employers are best able to keep around high performers and lose those employees who do not perform

One major hurdle to this is recent changes from the government that more narrowly define the W2/1099 world to keep employers from having contractors who are de facto employees.

Obviously this is going to be a controversial post, but stay tuned for part 2……

 

 

Saturday, June 07, 2008

Presentation Series

For the past 2.5 years I've been traveling around the southeast US giving lectures on data warehousing and ETL.   The first presentation I did was "It's all about the Data: Building a Data Warehouse using SQL Server 2005".  Since then I have done numerous variations of that presentation.  In addition, I have done various deep dives into ETL design, architecture, and development for local organizations.   Thus, I have decided to organize my presentations into two distinct areas:   It's all about the Data and Data in a Nutshell.

The "It's all about the Data" series will focus on general overviews of data warehousing and ETL.   These sessions are less technical in nature, with a lot of overview of the subject matter and some demonstration of data warehousing processes.    In contrast, the "Data in a Nutshell" (or DINS) series will be technical deep dives with a lot less overview and more hands-on action designing and building data structures.  Some DINS talks will include DBA tasks such as permissions, replication, and table design, while others will strictly complement the "It's all about the Data" series by doing technical deep dives into ETL architecture and design.

It will take a couple of events to get everything lined up, but I'm sure when it's done there will be a clear delineation of the subject matter, and I hope to build these two series into must-see sessions at events around the country.

 

Thanks again for your support and I welcome any comments. 

Windows Vista Update

In a previous post I had written about my displeasure with running Windows Vista on my old Dell laptop.  After installing Service Pack 1, the computer has sped up noticeably.  The batteries are so old that they don't last more than 20 minutes per charge, so SP1 didn't make a difference there, but the boot-up speed and hard drive access issues have noticeably improved.  At this time I have shelved plans to go back down to Windows XP Pro.   I'm also working with a Virtual PC image of Windows Server 2008 and I've been quite impressed with the performance and reliability; more to come on that.

 

 

Orlando Tweener

I’ll be speaking at the Orlando Tweener event at the Orange County Convention Center in Orlando, FL.  This event is making use of the TechEd facilities during the weekend to provide free education to the 1300 registered attendees.

 

My session is “Data in a Nutshell:  Using SSIS to Solve Common Business Problems”.  This session is the first in my Data in a Nutshell series (see other posting for an explanation) and will cover the SSIS tool package and use it to solve common scenarios that may be found throughout businesses.

 

I hope you join me!

Thursday, May 08, 2008

Windows Vista.....Is it good for business?

Hello again.  Today's topic is a discussion of Windows Vista.  I use my trusty Dell Inspiron 9300 for all my consulting work.  It originally came with Windows XP Professional and worked pretty well, but in my experience Windows has a way of mucking up the file system so that the computer drastically slows down over time.  This has happened with every machine I've had, from Windows 95 to the present.   Two solutions to this problem are not installing/removing software all the time, and doing a fresh reformat about once a year.

So the Dell came with XP Pro and I moved it to Windows Vista Ultimate 32-bit in January 2007.  I bought the Dell because it has a great video card for gaming (although in the 2.5 years I've had it, I can count on one hand the number of times I've played games), so the computer was ready for Vista.  Running the Windows Experience Index shows the processor is the slowest part of the machine; the rest scores fine.

The Dell has struggled a lot under Vista, especially when running SQL Server and BIDS.  It's also become slower booting up.   An error a couple of weeks ago led me to do a complete wipe and reinstall of the basic applications.  However, I was still not pleased with the performance (and especially the battery life).   The biggest dealbreaker was that I had a difficult time docking my computer to my 24" LCD at home.  The video card crashes and it then takes a lot of reboots to get it running, and there is no 'official' Nvidia driver for my card under Vista (since it came with XP).

Keith suggested that I give Service Pack 1 a try to see if it would help with my performance issues.  I installed it without a problem yesterday, and my first impression is that the computer is definitely much faster.  SP1 supposedly addressed many performance and battery life issues, so I'm going to give it a couple of weeks to check.  I also haven't docked the system yet to see if that was affected.

Overall, I can certainly understand the reluctance of businesses to deal with a Vista upgrade.  MSFT continues to create a more monstrous OS each time they release a new version.  I have found Vista (and XP) to be very stable; I am able to crash Vista if I remove my broadband card while it's running, but otherwise it's been solid.  I also appreciate the extra work on security.   I'm sure more people will begin to use Vista in corporate offices, but I'm going to wait and give Windows Server 2008 a try before I comment further.  Stay tuned.

Saturday, May 03, 2008

BBQ Quest

Howdy all.

Amazingly enough, I am speaking at SQL Saturday – Jacksonville (#3) today.  As usual, I rode up to Jacksonville with Joe Healy (Microsoft dude), and both of us, being from the rural part of the country, enjoy BBQ.  When I say BBQ, I don't mean chicken smothered in BBQ sauce like you will find at your local Applebee's (priced $12 for $2 worth of food + $1 of preparation expense).  I'm talking good, southern BBQ – the kind where a local guy owns a smoker and cooks ribs, pork, or chicken all day in smoke at 250 degrees until the meat is so tenderized it falls off the bone.  On our road trips in the past we have discovered some local gem BBQ places, generally on the old country roads between major cities.  There are good places we've discovered outside of Tampa, Lakeland, Suwanee, and Gainesville.

Interestingly enough, we planned our route to intersect a new BBQ joint.  We drove up I-75 to Ocala, 301 to Gainesville, SR 20 to Palatka, and SR 17 to Green Cove Springs outside of Jax.  Ironically, we passed a grand total of two BBQ places, of which exactly 0 were open.  We arrived at Tommy's BBQ just north of Ocala on 301 at 3:05 PM.  They were open but had stopped serving lunch at 3 PM, and the lady was eager to head on home, so we moved on up the road.  Two Sonny's BBQs later, we arrived in Jax without having succeeded in the mission.  A BBQ Bill's place north of Palatka was closed as well.

I’m not a huge Sonny’s fan, but it would have sufficed given the precarious situation; Joe, however, was hell-bent on finding a real place or going hungry, so we arrived in Jax about 5 PM, starved and disappointed.  I planned the route home on different roads, hoping to intersect with a new, hidden gem of a BBQ restaurant.

Monday, April 21, 2008

Update from Busy World

Hello folks.  It's been a while, so I thought I would give an update on what's been happening in my life related to data warehousing.

 

1.  I'll be speaking at the Jacksonville SQL Saturday on May 3, 2008.  This should be a really good event and a new one for north Florida.  Durable Impact Consulting is also a sponsor of the event and is providing pens and goodie bags.

 

2. I've revamped my entire lineup of code camp presentations.  Now that it's up to about 7 per year, I'm consolidating all of them into the "It's all about the Data" series.  I've done this lineup at Miami and Orlando and it gets more popular every year.  This year in Miami it was standing room only, which leads me to wonder: why throw away a good thing?

 

3. SQL Saturday Tampa was a great success.  I had promised to write a full review of the event, but I'm just too tired, so I'll paraphrase.  Logistics turned out to be a nightmare, but the 220 attendees and 15 sponsors were all around pleased with the event.  I'll probably not be running the event next year - instead I am planning to run smaller, quarterly events.  The Day of Data was also great: all 41 attendees had great remarks about the program.  Look for a future event soon.

 

4. I'm not so happy with my HP laptop anymore.  It's really fast, but I've had some issues with the network connections, and the battery is lucky to last 1.5 hours.  In retrospect, I should have gone with the Dell Latitude line, and I'm strongly considering going that way and selling the HP.  I want a computer that can go three hours on the battery.  Plus, the HP gets REALLY hot.

 

5. Thankfully my trusty Dell is still going strong, notwithstanding the dreaded "Cannot find bootloader" message I received after a failed Windows Update last week.  I had to reformat the entire computer (and lost a few documents and pictures in the process), which took about 8 hours.  I found the best solution is to set Windows Updates to manual and do a full backup to an external drive before each update.  I hope this computer will last another couple of years, as I'm looking to buy another battery on eBay.

 

6. Still haven't heard anything about the Tampa Code Camp, but I assume it will be held in July and I'm holding July 12 free for the event.  Last I heard from Keith, they were having issues getting the venue straightened out.

 

7.  There is a great program shaping up: the "All-Florida" Code Camp to be held the weekend between the two weeks of Tech Ed (in Orlando).  I won't write the date because it's not 100% confirmed, but if this event gets off the ground it will be awesome.  Look for me to be a track leader at this event to provide another great opportunity for the community.

 

Thanks and stay tuned for some more posts. 

Tuesday, February 19, 2008

SQL Saturday Jacksonville

(Check back later this week for a full write up on the Tampa SQL Saturday)

 

SQL Saturday has moved on to Jacksonville on May 3, 2008.  If you were at the Tampa or Orlando event, Jacksonville should offer more of the same: good presentations and knowledge from a lot of the top speakers in the southeastern United States.  This is going to be another top-notch event driven by Andy Warren, Brian Knight, and co.

I plan on attending and speaking at this event.  I always go in August for the Code camp, and this is another chance to have a great time in the fun city of Jacksonville.  Hope to see you all there.

 

 

Wednesday, February 13, 2008

SQL Saturday Update

Well folks, we’re just two days away from SQL Saturday – Tampa.  Our venue is still the same (DoubleTree Suites – Westshore), and we and the DoubleTree are working hard to put together a good event.  We do have one predicament I wanted to share with those who follow this blog: arrive early.

 

One of the problems with free events is that people who have no intention of coming can sign up without recourse.  This does two negative things for code camps: it causes us to have inaccurate counts for event goodies (including prizes, food, and giveaways), and when the registration cap is reached, it keeps people who want to come from signing up.  This is a typical economist dilemma (on one hand, on the other hand).

 

We have over 380 people signed up for the SQL Saturday event and another 20+ on the waitlist.  While experience has shown these kinds of events to have about a 40% no-show rate, I made the decision to close off registration for a couple of reasons.  One, our event is set up to handle 200 attendees.  We have lunches, seating, and prizes confirmed for about 200 people.  We can handle more people, but at a certain point we will bump up against fire code regulations and have to cut off attendance in sessions.  If we don’t manage the crowd, our facility will, and this isn’t good for us or attendees.  We will be providing lunch tickets for the first 200 people who register and indicate they will be staying for lunch.  Experience has shown us that people come and go during the day, so I expect that with 200 lunches we can handle maybe 240 attendees, since some will leave early, some don’t want lunch, some want to go out, etc.  There will be plenty of water and soft drinks for all starting at 11 AM.
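A quick back-of-the-envelope check (rough numbers on my part, not official counts): 380 registered × (1 − 0.40 no-show) ≈ 228 people through the door, so 200 lunches plus the normal coming and going during the day lands us right around that 240 figure.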

 

We’re not having any keynote or closing session either.  Prizes will be given out at the end of the day (3:45 PM).     Just show up, go to sessions, spend a little time lounging around the pool if that’s your thing.

 

I’m going to give a little reasoning now on why we chose to do our event at a hotel.  This kind of venue poses unique budgetary constraints and would have been a horrible choice had it not been for our great sponsors.  Thankfully we were able to negotiate a rate comparable to what other events have paid around the state, and we’ll have a unique atmosphere.  It’s going to be free, but the hope is that it will feel like a professional conference that people pay thousands to attend.  I attended the TDWI Conference in 2005, and our event will be very comparable on a micro level.  Our venue is nice and professional, and our volunteers are rehearsed and will present a uniform appearance.  We were hoping to have speakers stay at the same hotel, but there was an issue with being able to do that, so hopefully next time.  We’re hosting the Day of Data on the Friday before, which allows a more specialized learning experience for those willing to pay a little extra.

 

To handle the ‘phantom registration’ problem, there are discussions going on among organizers about charging attendees a nominal fee ($5 to $20) to hold a reservation.  The money could be used to pay for lunch or, more importantly, so people have ‘skin in the game’.  Read my sister blog about economics to see what happens to a housing market when people have no ‘skin in the game’, but I digress.  I’m not fond of charging to enter events, but I feel a small fee (closer to $5 than $20) is fair to hold a registration and provide lunch.  If everyone who attended SQL Saturday had paid just $10, we would have been able to provide a full-day buffet for everyone.  But it’s something to think about for next year.
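For the curious, a quick bit of math using my own rough turnout guess rather than a final headcount: about 230 attendees × $10 ≈ $2,300 to put toward food for the day.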

 

We’ll be paying close attention to speaker evaluations, but please give us some room to handle the crowding and food situation.   Looking forward to seeing you at the events.

 

Wes

Tuesday, January 29, 2008

Visual Studio 2008 Launch

Visual Studio 2008 is finally here!  Early reports from the battleground have a lot of people excited about the reduced level of effort.  One local company reported they thought VS 2008 would reduce development time for their products by 50% this year.  As an economist, I’d call that a good return on the money.

 

I installed a copy last night at the Florida.NET user group install party.  If you want to get a free copy of VS 2008 Professional, you have the chance all week to attend one of the events and install the software.  There are events tonight, tomorrow, and Thursday where Joe Healy is providing copies to those who bring computers (laptops or desktops with peripherals) to the Microsoft office in Tampa and install the software.

 

Check out Joe’s blog for more details:  www.devfish.net

 

Happy hunting.    

 

 

Monday, January 14, 2008

Day of Data

Introducing my newest creation:  a SQL Server training series called the ‘Day of Data’.

 

DoD is a one-day conference matching the top speakers on SQL Server with attendees willing to pay a small fee to acquire knowledge and get some one-on-one time with presenters.

 

The idea for ‘Day of Data’ came out of my original idea of a SQL Saturday (data camp).  We have the event on Saturday in a hotel.  Speakers and attendees from out of town are staying at the hotel on Friday night, so why not have an event during the day Friday for people in the following three categories:

1. People who work M-F and can’t attend SQL Saturday because of time constraints

2. People who have a training budget to spend on SQL Server training but face a general lack of local options

3. People who want to attend the SQL Saturday and also get more in-depth knowledge than a one-hour session provides.

 

The first DoD is going to be held at the DoubleTree Westshore in Tampa.  We have two great speakers lined up: the infamous Joe Celko and SQL BI expert Rushabh Mehta.  We’re hoping we can get enough people to fill both rooms, and possibly a third room with another speaker who is interested in joining for a trifecta.

 

The price of the event is $99.  This amount merely covers expenses.  We provide all attendees with a continental breakfast, coffee and drinks, a good plate lunch, and a snack in the afternoon.  We think this event is a great value.

 

Check it out at www.dayofdata.com

 

The long range goal is to have these events at cities around the country.

 

 

New Year, New Computer

Happy New Year everyone.  I wanted to start off the new year with a story about a computer I purchased.

 

I sold both of my desktop computers last year and swore off buying any more, thinking I would want to stick primarily with laptops.  I still agree with that assertion and have been watching the notebook design wars with interest.  My current notebook, a Dell Inspiron 9300, has been a great computer.  It has a 17” screen, is quite peppy, and even 2.5 years later, its Vista Ultimate ratings for video blow the doors off the ratings of the best computer for sale at Best Buy (the Dell has the dedicated Nvidia GeForce 6800 card – I specifically purchased this computer for that card).  Regardless, as great as the large screen has been for work and for using the computer at home, it’s not quite as portable as I wanted, and I’ve been specifically contemplating the purchase of a 14” laptop to supplement it.  To give you a breakdown of my computer replacement schedule, I buy a good machine and depreciate it over three years.  Thus, the Dell will be replaced in December 2008.  So it occurred to me that I should buy something small that would be compatible with any accessories I buy for the Dell’s replacement.

 

My specific requirements were:

1. Intel Core 2 Duo processor

2. 2 GB+ of RAM

3. 250 GB hard drive

4. DVD burner

5. Good battery life

6. Less than $1,000

7. Light and thin

8. Able to be carried in a backpack for long periods without breaking my back

 

I began looking at computers during the holiday season.  First stop: Dell.  My last three computer purchases have been Dells, and I’ve influenced quite a few more among family and friends.  However, I strongly dislike the current design of Dell laptops.  The Inspiron line is bland and bloated.  My Inspiron 9300 looks a lot better than the 1721 that replaced it, even if the 9300 has an obvious miss in its lack of a numeric keypad.  The Dell XPS was a nice-looking computer, but the 13.3” screen was a tad small for my liking.  Thus, Dell was eliminated from consideration.  That leaves Gateway, HP, and Toshiba as the only other brands I would consider.  Gateway looks cheap: eliminated.  Toshiba looks good but only offers integrated video on the 17” lappys, so it’s eliminated.  Thus only HP is left.  (Apple, for all its pluses, is out of my desired price range.)

 

I bought an HP.  I began to look at HP computers after buying and setting one up for a friend, and I was impressed with the build quality and design.  Yes, the HP computer looks great.  The 14.1” laptop has integrated video, but I don’t care because I’ll only play games on the big boy.  The HP dv2XXX series met specifications 1-8 above and, as a plus, included 3 GB of RAM.  I could get the dv6000 series with a 15.4” screen for less than a 14.1” lappy, but it’s too close in size to the Dell, so I don’t really gain anything.  I started looking in stores.  Best Buy had a computer to my liking, but it had an AMD Turion X2 64 and Nvidia graphics, and the Windows Experience score was 2.5 – a horrible rating on the graphics card.  Combined with the lack of Bluetooth, Best Buy’s offerings were eliminated from my choices.  I went to Circuit City.  They had the perfect computer with all the specs I wanted, and it had a WEI rating of 3.5 – no problem.  However, the computer was silver and white – big problem.  I specifically wanted something that looked great, and black/silver looks are preferable to white/silver, although the Dell is white/silver.  So I checked around and played with the computer for a while to determine if I could live with the colors.  I decided to buy it, and when I get the HP dv9000 series 17” laptop at Christmas, I’ll make sure to get that one in black/silver.

 

I’m already pleased because I purchased a notebook dock that will work with both the dv2000 and dv9000 series.  The power adapters are also compatible.  Sweet – I’m already happy.  Now let’s hope that HP doesn’t change the dv9000 too much before I can purchase one.  After a complete reformat of the computer to remove the bloatware, I’m really happy with everything and the way it runs.