I sometimes miss the Dynamic Properties Task in DTS. I used it for all sorts of functionality - and I'm struggling to replace some of that flexibility in complex SSIS development.
I recently developed a package with dynamic SQL serving as the data source for a Data Flow. I accepted the mission and here's an example to demonstrate how I accomplished this:
Drag a Data Flow Task onto the Control Flow canvas:
Double-click the Data Flow task to edit it. Drag a DataReader Source onto the Data Flow canvas and double-click it for editing. The Advanced Editor opens. Select or create an ADO.Net connection manager on the Connection Managers tab:
On the Component Properties tab, enter a SQL statement in the SqlCommand property (I query a table I created named dbo.TestTable, which contains three columns: TestID [Identity PK], TestName [VarChar(20)], and TestValue [int]):
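If you want to follow along, dbo.TestTable can be created and seeded with a script along these lines (a minimal sketch - the column definitions match the description above, but the constraint name and sample row are mine):

```sql
-- Minimal sketch of the source table described above.
CREATE TABLE dbo.TestTable
 (TestID int IDENTITY(1,1) NOT NULL CONSTRAINT PK_TestTable PRIMARY KEY
 ,TestName varchar(20) NULL
 ,TestValue int NULL);

-- A sample row so the Data Flow has something to move.
INSERT INTO dbo.TestTable (TestName, TestValue)
VALUES ('One', 1);
```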
Verify the SqlCommand field names on the Column Mappings tab:
Drag a Flat File Destination onto the Data Flow canvas and (this is important) connect the output of the DataReader Source to the Flat File Destination with a Data Path (green arrow) [Note: Previously, I incorrectly identified this as a Precedence Constraint. Precedence Constraints are found only on the Control Flow. Data Paths are the connecting arrows on the Data Flow. My apologies for any readers misled.]:
Double-click the Flat File Destination to edit it. On the Connection Manager page, click the New button to create a new Flat File Connection Manager:
Assign the Flat File Connection Manager a name (I used "OutFile") on the General page. Enter a filename ("OutFile.txt") and specifics regarding collation and row delimiters:
Click Columns to configure column details.
Why? I always create Flat File Connection Managers for new files from the Flat File Destination. If the target file does not exist yet, creating the connection manager here lets it infer the expected columns from the Data Flow.
Note the Preview grid displays the expected columns (read from the DataReader Source via the Data Path connector):
Click OK to create the Connection Manager and return to the Flat File Destination Editor:
Click the Mappings item in the page list to view (and auto-map) the DataReader Source columns to the Flat File Destination columns:
Return to the Control Flow tab and click any empty space on the Control Flow canvas.
Why? You are about to create a new Variable. Unlike DTS variables (which were all global in scope), SSIS variables have specific scope. Scope is determined, for better or worse, by the Task or SSIS Object that has focus when you create a new variable. I say "for better or worse" because I've not yet discovered a method for changing variable scope during the variable creation process. Clicking on the Control Flow canvas gives it focus, and Package-scoped (global) variables may then be created.
Click View, then Other Windows, then Variables to display the Variables dialog. Set the Data Type to String and enter some query in the Value field. Note: the query used here must expose the same columns as the query in the SqlCommand property of the DataReader Source or the Data Flow will fail during pre-execution package validation (There is most likely a method to address this - more later, perhaps...). I created a variable named SQLOut with a default value of "SELECT 0 AS TestID, 'Zero' AS TestName, 0 AS TestValue":
Drag a Script Task onto the Control Flow canvas. Connect it to the Data Flow task with a Precedence Constraint (the green arrow on the Control Flow) as shown:
Double-click the Script Task to open the Script Task Editor. Click on Script in the page list and add the variable name created earlier (SQLOut) to the ReadWriteVariables property:
Click the Design Script button to open the Visual Studio for Applications (VSA) Script Editor. Enter the following script in the Main subroutine of the ScriptMain class:
MsgBox(Dts.Variables("SQLOut").Value.ToString)
Dts.Variables("SQLOut").Value = "SELECT 1 AS TestID, 'Unos' AS TestName, 12 AS TestValue"
MsgBox(Dts.Variables("SQLOut").Value.ToString)
Dts.TaskResult = Dts.Results.Success
To test, close the VSA Editor and click the OK button to close the Script Task Editor. Right-click the Script Task and click Execute Task:
The first MsgBox command should execute displaying the default contents of the SQLOut variable:
Click the OK button. The second MsgBox command should display the updated SQLOut variable contents:
Click the OK button. The Script task should complete without error:
Frank La Vigne and I are proud to announce the formation of the Richmond SQL Server Users Group!
The first meeting is scheduled for 6:30 PM 11 May 2006 at 4600 Cox Road, Glen Allen, VA. I'll be speaking, and since this is the first meeting I think I will start at the beginning - with a presentation on installing and configuring SQL Server 2005.
We're working on a website and hope to have it online sometime this month.
Special thanks to Fahrenheit Technology for sponsoring our inaugural meeting!
I hope to see you there Thursday!
Technorati Tags: SSIS Sql Server Developer Community Richmond Virginia
SQL Server Integration Services offers vast functional enhancements over DTS. One impressive change is the flexibility of the new Precedence Constraints.
Right-clicking a precedence constraint reveals a context menu:
Clicking Edit reveals most of the new functionality. To demonstrate, let's walk through a simple example.
First, add a new variable named Now as shown below:
Drag an Execute SQL Task onto the Control Flow workspace:
Double-click the task to open the editor. Enter some generic SQL in the task as shown:
Change the ResultSet property from "None" to "Single row":
Click the Result Set navigation link. Add "Now" as the Result Name and "User::Now" as the Variable Name. This maps the single row returned by the query to the user variable name "Now" - which assigns the current server time to the [User::Now] variable:
Test the package by clicking the Run button (or F5). The package fails with the following error:
Clicking the Execution Results tab reveals the error to be related to the Result Binding:
To correct the error, add an alias ("Now") to the query:
The package now succeeds:
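The difference between the failing and succeeding queries looks like this (assuming GETDATE() as the "generic SQL" returning the current server time):

```sql
-- Fails result set binding: the returned column has no name for SSIS to map.
SELECT GETDATE();

-- Succeeds: the alias supplies the Result Name ("Now") that binds to User::Now.
SELECT GETDATE() AS [Now];
```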
Add two more Execute SQL Tasks. Configure them as shown below. Change the Name property of the second task to "Execute SQL Task 31 - 60":
Drag a new precedence constraint from the first task to the second. Right-click the precedence constraint and select Edit:
Change the Expression operation from "Constraint" to "Expression and Constraint":
Leave the Value set to Success. Enter "DatePart("s", @[User::Now])<=30" in the Expression textbox. Leave the Multiple Constraints option set to Logical AND - this means the previous task must succeed (the Success value must be satisfied) and the Expression must evaluate to True (the seconds of the current minute must be between 0 and 30 inclusive) for the second task to execute. Click OK to close the precedence constraint editor:
The precedence constraint now displays an fx symbol to indicate it is controlled by an Expression:
Drag another precedence constraint from the first Execute SQL Task to the third (31 - 60) task. As before, enter "DatePart("s", @[User::Now])>30" in the Expression textbox. Leave the Multiple Constraints option set to Logical AND - this means the previous task must succeed (the Success value must be satisfied) and the Expression must evaluate to True (the seconds of the current minute must be between 31 and 59 inclusive) for the third task to execute. Click OK to close the precedence constraint editor:
The second precedence constraint now indicates an Expression. Click the Play (F5) button to start execution. Depending on which half of the current minute the execution occurs in, one of the two images below will display, indicating that precedence constraint's Expression evaluated True. Clicking Play 30 seconds later will cause the other precedence constraint's Expression to evaluate True:
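The routing logic of the two Expressions can be sketched in T-SQL (illustrative only - the branch labels are mine):

```sql
-- Mirrors the two precedence constraint Expressions:
--   DatePart("s", @[User::Now]) <= 30  -> first branch
--   DatePart("s", @[User::Now]) >  30  -> second branch
DECLARE @Now datetime;
SET @Now = GETDATE();

SELECT CASE WHEN DATEPART(second, @Now) <= 30
            THEN 'Seconds 0 - 30 branch'
            ELSE 'Seconds 31 - 59 branch'
       END AS ExecutedBranch;
```

Because the two Expressions are complements, exactly one branch executes on any given run.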
Precedence Constraints in SSIS provide lots more functionality than their predecessors (no pun intended!).
Technorati Tags: SSIS Sql Server Precedence Constraint
I had an opportunity to speak to the Richmond .Net Users Group last night as a last-minute fill-in guest speaker.
The folks in attendance were simply awesome! It was an honor (as always) to be able to present to a cool group of folks. Frank LaVigne, MVP (congratulations Frank on being awarded the 21st Tablet PC MVP on the planet!) and I had concerns about presenting a database-centric topic to a developer group. But it went very well. It turns out there are a lot of developers delving into SQL Server 2005 - either by choice or necessity.
"Do you know how to spell SQL?"
"You're our new database resource."
You laugh... this is a paraphrase of the discussion that converted Andy from a web developer to a SQL Server DBA...
I'm working on more Team System presentations for upcoming events, like the Richmond Code Camp. I will be presenting "SSIS with Team System" there and at upcoming user group engagements.
Plus, Frank and I are (re-) starting the Richmond SQL Server Users Group. The first meeting is planned for May 11, 2006 and we plan to hold monthly meetings the 2nd Thursday of each month. More details will follow...
I ran into a couple errors recently when trying to create new SQL Server Integration Services (SSIS) projects. One error stated:
Failed to save package file "C:\Documents and Settings\Administrator\Local Settings\Temp\1\tmp2B.tmp" with error 0x80040155 "Interface not registered".
The other stated:
Failed to save package file "C:\Documents and Settings\Administrator\Local Settings\Temp\1\tmp2B.tmp" with error 0x80040154 "Class not registered".
These errors were encountered on a new laptop and a new virtual PC. Both had Visual Studio 2005 installed, so I suspected some sort of conflict. On the VPC, I loaded VS 2005 first, followed by SQL Server 2005 Developer. I noticed the client tools didn't install, although I thought I'd checked that box. To correct, I executed the setup for client tools. When completed, my client tools were available and creating an SSIS project succeeded.
But the laptop wasn't so simple. I thought "I have a solution!" and went about re-installing the client tools, but it didn't correct the issue!
So I popped on Google and searched for similar issues. I found a helpful post at MSDN Forums: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=53694&SiteID=1. Executing two RegSvr32 commands did the trick:
When all else fails, read the instructions!
I was honored to present a session on Beginning SSIS Development at the NoVa (Northern Virginia) SQL Server Users Group last night. I met some great people and learned a lot.
I can hear you thinking "What? You learned a lot?" Absolutely! I always learn when presenting. In fact, I would argue presenting / teaching is the best way to learn.
Presenting forces me to remain knowledgeable in my field. And it provides immediate feedback about my success at this endeavor.
Any new technology - especially one as flexible and powerful as SSIS - will be used in unanticipated ways. This will generate unexpected results, which will in turn generate perplexing questions for presenters - usually in front of a large crowd! :)
It is all good.
It's called "learning." Experience has taught me learning is best done in groups. No one can master it all, and certainly not all at once. This is why teaching teaches.
Thanks, Nova SSUG, for the opportunity and knowledge shared!
So... I'm on a call yesterday afternoon and the topic of replication comes up. Before I can catch myself, my kneejerk response spills out "There has to be a better way to accomplish this than replication!" My colleague is taken aback by the statement. "You're the SQL guy" he says... and I am left to ponder why I reacted the way I did.
So... I ponder it most of the evening and again this morning. And I've reached a conclusion: I reacted because of the trouble I've experienced in the past making changes to SQL Server 2000 published databases.
About half the time, if memory serves, I ended up poking around in a sys% table turning bits off so I could actually make the desired change. This, after dropping the publication while standing on one leg facing East. Yes, it was tricky. And I never attained that warm fuzzy feeling of comfort that accompanies knowing what to expect from the technology.
But it isn't fair to judge SQL Server 2005 by the same standard - which would place me in the same camp as those java folks complaining about things Microsoft fixed last century (a place I choose not to camp).
So... I'm walking through setting up a local pushed publication on my laptop - just to see what it's like.
Clicking Report, View Report reveals:
Resolution: Change the SQLAgent service properties and start the service:
On to Attempt #2...
Clicking Report, View Report reveals:
Now I remember Brett, the network guy, telling me I have access to the domain, but I am not a member of the domain. I wonder if this has anything to do with this issue. Hmm...
I check my account properties - I have a local account and I'm a local administrator:
But this isn't the captechventures\aleonard - this is the local aleonard. Checking the local Administrators group, aleonard is in there, but no domain account:
Resolution attempt 2: Since I'm not part of the domain, adding my domain account is out (I can't see it). But I'm prompted to enter a domain\account when setting up the publication... what to do, what to do:
Resolution: Since I cannot see the domain, I use my local machine account credentials.
I need to work on a method to test the scalability of this architecture, but Step 1 is complete.
Technorati Tags: Sql Server 2005 Sql Server Replication
Professional SQL Server 2005 Integration Services is now on book shelves.
The work isn't over, by any measure. There are updates to maintain. The authors publicize the book at speaking engagements and conferences. In this case, it's not because it's required by anyone (no one's told me I'm required to promote the book anyway...) - I'm proud to be part of the team that authored this work.
I believe we formed a bond while working together. I certainly made new friends - most notably Haidong Ji - whom I enjoyed chatting and swapping emails with during the writing and editing phase.
I was privileged to be employed alongside several of the other authors during the writing: Brian Knight, Douglas Hinson, Jason Gerard, and Mike Murphy. Writing together while working together was a blast! At various times I believe we all took turns serving as coach, mentor, editor, encourager, co-conspirator, commiserator, brother and friend. It was a team effort I will treasure the remainder of my life.
The book has received awesome accolades - both public and private. My couple of chapters would not have been possible without lots of help from others - some of whom were not mentioned in the credits section of the book. First and foremost, Jeff Beehler provided support "above and beyond" both any duty and my expectations. Without his help, the chapter about SDLC would not have been possible. Thanks, Jeff!
Ashvini Sharma and Donald Farmer provided moral support, encouragement, and unfettered access to the SQL Server Integration Services team.
Kamal Hahti deserves recognition for an inspiring demonstration that motivated me to dig into some of the atypical functionality exposed by SSIS. This proved especially useful while writing the chapter on interfacing with external applications.
I recently documented building a virtual server for use with the Team Foundation Server Dec 2005 CTP. It was a lot of fun / work - and it's documented at VSTeamSystemCentral.com.
An opportunity at work prompted use of another virtual server to facilitate a data-transformation-intense software migration. In order to make this particular migration work, I needed to install an old ODBC driver. I did not have access to the driver installation - apart from installing a suite of software products purchased back in the day. I needed to land the migrated data on an existing test server and I did not want to corrupt other software currently installed there.
So... I built a VPC, installed Windows and SQL Server 2k. I loaded the legacy software package, which installed the driver, then built some DTS packages to migrate the data from the source to my target server. Voila!
It's worked wonderfully. Among the many benefits is the low impact to our existing Development / Test environment.
This experience - coupled with clustering functionality built into Virtual Server 2005 R2 - has me seriously considering Production Virtual Servers. Thoughts?
As promised, you can learn more about my Team Foundation Server setup experience at VSTeamSystemCentral.com.
I have started posting a series of articles (Team Foundation Virtual Server) describing the process I used to build and configure a virtual PC similar to the VPC Microsoft distributed with their releases of Visual Studio 2005. At the time of this writing, 6 of an anticipated 12 to 15 articles have been posted. I attempted to capture screenshots for every step I used to build a functional TFS server.
I do not provide software. The series assumes you have access to an MSDN subscription or licensed copies of Windows 2003 Server Enterprise, SQL Server 2005 Developer, Visual Studio 2005 Team System (or the trial version), and Microsoft Virtual PC 2004 SP1.
Registration is required to access the content.
I just finished installing the Dec CTP of TFS.
I haven't tested the installation yet (that's the task du jour), but I am quite impressed with the installation. Concise instructions, well-written and well-formatted help. And (for me at least), no ugly and unexpected dialogs.
I installed on a VPC running Windows 2003 Server Enterprise configured as a standalone AD / DNS server on its own local domain. My only issue was with SQL Server Surface Area Configuration: I needed to expose TCP for communications. The pre-installation checks caught this and warned me before the TFS installation even started - which was really nice.
All in all, installation of the Dec CTP was a positive experience. Now, on to testing...
For arguably the first time ever, DBAs have a stable Microsoft development environment with integrated source control.
There's a chapter in the upcoming Wrox Press book "Professional SQL Server 2005 Integration Services" on this very topic. It serves as an introduction to some source control and project management features available to DBAs using SQL Server 2005.
There's lots more information available online about Team System - I find most of my information on blogs:
There is also a Wrox Press book due out in February entitled "Professional Visual Studio 2005 Team System".
"Why should we care about Visual Studio, Andy?" I'm glad you asked. If you haven't yet tinkered with the new SQL Server 2005 development environments, you may not know this: When you develop SQL Server applications in SQL Server 2005, one of the tools available to you is the Visual Studio 2005 environment. In fact, if you develop business intelligence applications (Analysis Services, Integration Services, etc.), you utilize a tool called the SQL Server Business Intelligence Development Studio (BIDS), which is a version of Visual Studio with tools for developing SQL Server business intelligence applications.
If Visual Studio is a new environment for you, learning about it will make you a more effective DBA when you move to SQL Server 2005.
If you don't use a source control product currently - or even if you do - I encourage you to examine the new version of Visual SourceSafe (2005) and Team System.
Now for the fun part: the work is done.
For me, the work ended with one last all-nighter. I was a week late on the copy-edits and was basically holding up the book. It was definitely not intentional, but facts are harsher realities than intentions.
"There I was..." in the middle of moving myself and my family from Jacksonville, Florida - our home of 3-1/2 years - to Virginia. The purpose of the move was to get my family closer to our extended families in Virginia. I had relocated my wife, children, cats, and most of our stuff and was preparing to fly out of Richmond for one last day of packing and work. Sitting in the Richmond airport, I popped open my laptop, connected to the free airport wireless service (is anyone in Atlanta reading this?) and began checking email. I hadn't checked email in a few days, so there were a few messages waiting. Some marked "URGENT" - from people you don't want to receive URGENT messages from when you're writing.
It wasn't a pretty situation. Months earlier, I'd snagged my screenshots using Beta 2. I saw a webcast earlier in the week about Team System (one of my chapters is an introduction to SDLC for DBAs), and the wizards looked really different. So I knew I would have to snag new screenshots. But to snag new screenshots I needed access to the Release Candidate 1 (RC1) applications, which I did not have. This meant the book could go to press with inaccurate screenshots or the chapter could be pulled. Not good.
Luckily, some good folks at Microsoft responded to my repeated barrages of tear-stained spam and provided a copy of the RC1 software. It turned out a disk had been FedEx'd to my home in Jax - and that email was among those I read while sitting at RIC awaiting my delayed flight. As I was proof-reading my response to the URGENT message, stating the material I needed to complete my work was waiting for me in Jax, a followup URGENT message arrived - this one with PLEASE RESPOND IMMEDIATELY appended to the subject line.
I stopped proof-reading and clicked Send.
Don't get me wrong, I am not complaining. These folks had more than enough reason to be demanding, upset, and worried. I gave every indication of dropping the ball, and the publishing business is extremely time-critical in nature. Drop a few balls and you have no book to show for lots of time and effort. So I understood.
When I arrived in Jax (late), my FedEx was propped against the front door. I opened the package and began snagging and editing. I snagged into the wee hours of the morning, then into the morning, then past the time I was supposed to show up for my last day at work. I worked into the early afternoon and finally it was done. And accurate.
I shipped it off to the editors for another final once-over, put in an appearance at work to say some tough good-byes, headed back to the house, paid the carpet cleaner, packed up my old pickup, grabbed four hours of sleep, then aimed the old Jeep north for 11 hours of interstate travel. It was the hardest thing I've done in years. Was it worth it? Are you kidding?
Amazon has the book listed with updated cover art showing the authors. It's almost showtime!
In an earlier post, I shared some revelations I experienced as I participated in authoring a couple chapters for an upcoming SQL Server 2005 Integration Services book. I feel inclined to share some more, so here goes:
1. It's a lot of work. I've heard that from authors before, and I suppose it just didn't take. An experienced author and very good friend shared that he wouldn't wish writing on his worst enemy. I understand the sentiment and the editing has just started - I'm certain there's more to come.
2. It's a once-in-a-decade (perhaps once-in-a-lifetime) honor and opportunity. I've been reading books published by this publishing label for a decade. I always admired the style and content of their books. Some were better than others but all in all, they publish cool stuff. To have an opportunity to write for them is humbling and amazing all at the same time.
On balance, the work is well worth the honor and opportunity.
I received a cool compliment today from a peer who's a developer. He said, "You know, I really like having a DBA on my team!" I have to tell you, it made my whole day!
It led to a discussion about past experiences and expectations, and I shared something I thought was pretty much common knowledge: there are three types of DBAs. My peer was shocked, so maybe the knowledge isn't so common after all.
The three "flavors" of DBAs I define are:
- System, Operations, or Production Support DBAs - these DBAs write maintenance plans in notepad and have no qualms whatsoever about executing in command-line. They were DBAs in the old days, when we carved our own ICs out of wood. They will get your server and database back online fast - and with less data corruption than anyone else on the planet. They live for torn pages and I/O faults.
- Application Support DBAs - these DBAs are familiar with one or more (usually complex) applications. I'm talking PeopleSoft, Siebel, and SAP here. If you want to customize a screen or write a one-off web application, you desperately need these folks.
- Database Developers - these DBAs are ruthless bit-heads. They use bigint and byte fields for masking binary states. They can optimize a stored procedure in their sleep and wrap an API around a database so developers never have to consider writing SQL that directly hits tables. They are performance freaks that will work 18-hour days on weekends to test in Production - when it's "safe."
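As an aside, the bit-masking habit mentioned above looks something like this in T-SQL (a hypothetical example - the flag names and bit assignments are mine):

```sql
-- Pack several on/off states into a single int column.
DECLARE @Flags int;
SET @Flags = 0;

SET @Flags = @Flags | 1;  -- set bit 1: "IsActive"
SET @Flags = @Flags | 4;  -- set bit 3: "IsVerified"

-- Test a single state with bitwise AND.
SELECT CASE WHEN @Flags & 4 = 4
            THEN 'Verified'
            ELSE 'Not Verified'
       END AS VerifiedState;
```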
Do you think DBAs fall into these categories? Do you know any that do? Do you see yourself in there anywhere? Do you have more or less or different "flavors" for classifying DBAs?
Technorati Tags: Sql Server DBA flavor