Welcome to the mystical and spellbinding world of Softworks. We write software … and read and write blog posts, and email, and do time sheets, and answer support requests, and do other stuff … but mostly we write software. Our focus is great software for healthcare. If you are looking for information about Synoptec™ or our other healthcare solutions – try http://www.synoptec.com or http://www.softworksgroup.com. This is our internal blog site, but if you aren’t a geek (a.k.a. software developer), you’re still welcome. The blog has some technical categories that require you to get your geek on, but you can browse around those or pick a category of interest from the choices on the left. If you see something you like – or don’t like – leave a comment. I’ll feel less lonely. Thanks!
This is just a quick post so that when we come back to do another release branch we can avoid over-complicating the process; developers are good at doing that. With a simple 2-level branch scheme like ours, all you really need to do is create a release branch off the main development branch. To do this, run the TFS New Team Project wizard, and on the last step, where you set up the source control repository, select the project you want to branch from instead of creating a brand-new source control folder. That’s it. Simple.
Here is a short article on using indexed views in SQL Server. This may be of use once we get into application tuning during the Sybase to SQL Server conversion (I’m so excited!).
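For reference, here is a minimal sketch of what an indexed view looks like. The table, view, and index names are made up for illustration; the key points are that the view must be created WITH SCHEMABINDING, must include COUNT_BIG(*) when it has a GROUP BY, and it is the unique clustered index that actually materializes it:

```sql
-- Hypothetical schema for illustration only.
CREATE VIEW dbo.vOrderTotals
WITH SCHEMABINDING
AS
SELECT CustomerId,
       COUNT_BIG(*)          AS OrderCount,   -- required with GROUP BY
       SUM(ISNULL(Total, 0)) AS TotalAmount   -- SUM needs a non-nullable expression
FROM dbo.Orders
GROUP BY CustomerId;
GO

-- The unique clustered index is what persists (materializes) the view data.
CREATE UNIQUE CLUSTERED INDEX IX_vOrderTotals
    ON dbo.vOrderTotals (CustomerId);
```

Note that a long list of SET options (ANSI_NULLS, QUOTED_IDENTIFIER, etc.) must be in their required states when the view and index are created, so check the documentation before leaning on this in production.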
It’s been a while coming, but Microsoft has finally seen fit to release a version of their Business Intelligence tools for Visual Studio 2012. I just found out about it, though it was released back at the beginning of March 2013. These tools are an addition to the SQL Server Data Tools and allow you to work on SSIS projects (amongst other things) entirely in VS2012. Previously, you had to keep VS2010 around with the older version of the BI tools. You can read more about it and grab the download at the following link:
PDF generation is a feature of most of our systems. We traditionally do this in a domain service, but there are options for executing the process through the backend (via the CLR, of course). This article provides a good description of how that can be accomplished.
After a little more than a year of planning, design, coding and extensive testing, the Saskatchewan Finance site for SETS (Saskatchewan Electronic Tax Service) is finally live. It took a lot of diligent work from Ji, Tim, myself and the team of testers at Saskatchewan Finance to upgrade and convert this 12-year-old application from its Classic ASP/Oracle roots to its shiny, new ASP.NET/jQuery/SQL Server 2012 incarnation.
Yesterday was a little nail-biting while 12 years’ worth of data was converted and brought into SQL Server from Oracle, but it all went smoothly. They even took my advice about not officially launching the system until this morning so that there would be people manning their support phone lines. There have been a couple of minor issues today as the public started using it, but that’s to be expected with a conversion of this magnitude, and they were all fixed quickly.
In the end, they have a system that’s friendlier, easier to use, more secure, and branded under their own domain name for further acceptance by the public. And it still talks to several mainframe systems at their end in the round trip of data about the tax filers and their returns.
It also gets us a few steps closer to being able to shut down a few aging servers.
If you want to have a look at the login page, head on over to https://www.sets.gov.sk.ca/SETS/.
Yet another interesting use of regular expressions: converting PL/SQL to TSQL. It relies on the regular-expression support in the “search and replace” feature of SSMS. I think this technique could be relevant to a certain Sybase to SQL conversion project I can think of.
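To give a flavour of the approach, here is a small Python sketch of the kind of pattern-based substitutions involved. The specific rules below are my own illustrative examples, not the article’s list; in SSMS you would run the same patterns through Find &amp; Replace with regular expressions enabled.

```python
import re

# A few illustrative PL/SQL -> TSQL substitutions (not an exhaustive list).
RULES = [
    (re.compile(r"\bNVL\s*\(", re.IGNORECASE), "ISNULL("),
    (re.compile(r"\bVARCHAR2\b", re.IGNORECASE), "VARCHAR"),
    (re.compile(r"\bSYSDATE\b", re.IGNORECASE), "GETDATE()"),
]

def plsql_to_tsql(sql: str) -> str:
    """Apply each regex rule in turn to a PL/SQL fragment."""
    for pattern, replacement in RULES:
        sql = pattern.sub(replacement, sql)
    return sql

print(plsql_to_tsql("SELECT NVL(name, 'n/a'), SYSDATE FROM t"))
# SELECT ISNULL(name, 'n/a'), GETDATE() FROM t
```

The real work in a conversion like ours would be building out the rule table; the mechanics stay this simple.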
As I’ve been working through some scripts in Iguana, I hit upon an issue where I needed to send more than a single value through Iguana’s inter-messaging queue. In this particular case, I needed to pass the SID of the stg.ChannelMessage record I was working with, along with the HL7 message that had been constructed and needed to be sent off to Cloverleaf (and then on to Netcare). I needed the SID so that the status of the stg.ChannelMessage record could be updated as processing progressed through its different stages.
My first attempt, in the Filter script of the channel, was to put the SID into an unused field of the HL7 message being pushed onto the queue, and then, in the Destination script, parse the HL7 message, pull out the SID, and remove the unused field so that Cloverleaf/Netcare wouldn’t complain about a value they weren’t expecting to receive. It worked fine right up to the point where I tried to remove the unused field and was told that that portion of the HL7 message is read-only.
So then I tried passing a table of values through the queue, only to find that the queue accepts nothing but strings, and a single string value at that. What finally worked was building a snippet of Lua code as a string, pushing that through the queue, and executing it on the other side:
- In the Source or Filter script of a channel, create a string variable in the following manner:
queueVariable = [[queueVar1, queueVar2 = value1, "value2"]]
(Note that queueVariable can be any variable name, as can queueVar1 and queueVar2, and value1 and value2 can be anything, with strings enclosed in single or double quotes. Make sure that queueVar1 and queueVar2 don’t collide with variables you declare in the follow-on script.)
- Push queueVariable onto the queue.
- In the channel script which follows the one you created queueVariable in, it will be received as “Data” by default. Put the following line of code into your script:
loadstring(Data)()
(Yes, that extra “()” at the end of it is not a typo and is very important.)
- Once that “loadstring(Data)()” call has run, you will have access to the queueVar1 and queueVar2 variables, which were just created.
If you need to pass a complex or formatted string value, like an HL7 message, base64-encode it into queueVariable and then base64-decode it in the follow-on script. You can pass any number of variables this way.
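The trick is easier to see outside of Iguana: loadstring compiles a string into a chunk of Lua code, and the trailing () runs it, creating the variables. Here is the same idea sketched in Python (exec standing in for loadstring, with a made-up SID and message purely for illustration):

```python
# Python analogue of the Lua queue trick, for illustration only:
# the "Filter" side packs several values into one string of code,
# and the "Destination" side executes that string to recreate them.
import base64

hl7_message = "MSH|^~\\&|SENDER|..."  # hypothetical payload

# Filter side: serialize the SID and the base64-encoded message into one string.
encoded = base64.b64encode(hl7_message.encode()).decode()
Data = f'sid, encoded_msg = 12345, "{encoded}"'

# Destination side: execute the string to materialize the variables.
scope = {}
exec(Data, scope)                      # like loadstring(Data)() in Lua
sid = scope["sid"]
message = base64.b64decode(scope["encoded_msg"]).decode()
```

The base64 step matters for the same reason it does in Iguana: it keeps quote characters and field separators in the payload from breaking the generated code.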
The link below is a great article on the full set of “window” functions now available in TSQL. The article covers what we had in SQL 2005 (unchanged in 2008) and what is new in SQL 2012. It has simple, clear examples and shows the subtle differences between default and custom clauses for each command. The article takes patience to work through, but the investment is worth it. Take 10 minutes for the next few mornings to work your way through it and you will have new set-based tools to attack challenging analysis requirements that would otherwise require far more complex RBAR-procedural approaches.
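For a quick runnable taste of the SQL 2012-style framing clauses the article covers, here is a running-total example. It uses SQLite through Python rather than SQL Server (SQLite has supported these window functions since version 3.25, so this assumes a reasonably recent Python build); the OVER clause itself reads the same in TSQL:

```python
import sqlite3  # assumes the bundled SQLite is 3.25+ (window function support)

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (mth INTEGER, amt INTEGER);
    INSERT INTO sales VALUES (1, 10), (2, 20), (3, 30), (4, 40);
""")

# Running total using an explicit ROWS frame, one of the SQL 2012 additions.
rows = conn.execute("""
    SELECT mth,
           SUM(amt) OVER (ORDER BY mth
                          ROWS BETWEEN UNBOUNDED PRECEDING
                               AND CURRENT ROW) AS running_total
    FROM sales
    ORDER BY mth
""").fetchall()
print(rows)  # [(1, 10), (2, 30), (3, 60), (4, 100)]
```

Swapping the frame for something like ROWS BETWEEN 1 PRECEDING AND CURRENT ROW turns the same query into a moving sum, which is exactly the kind of variation the article walks through.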
The link below is to an article about fixing an implicit conversion problem in TSQL. The implicit conversion was causing poor performance. This single “nvarchar” to “varchar” implicit conversion was taking up 83% of the time to process the query and was triggering high-CPU-usage alarms. I found this surprising for what I would have assumed was a fairly simple string-to-string conversion. The rest of the article goes on to talk about how the problem was solved using a computed column – since the DBA investigating did not have access to the code. The fix wasn’t of much interest to me, but the fact that this simple implicit conversion caused so much grief was.
For the past few years at Softworks our standard has been to treat all of Microsoft’s “code-analysis” rule violations as errors when building database projects. One of the included checks blocks the build when implicit conversions are detected. This story is a good example of why avoiding implicit conversions is worth the effort. After a while, including the simple “cast(@MyColumn as nvarchar(35))” becomes second nature and eliminates what could be a significant production problem before it ever materializes.
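A sketch of the pattern, with invented table, column, and variable names for illustration:

```sql
-- Illustrative only: the parameter arrives as nvarchar, but the
-- indexed column is varchar.
DECLARE @Code nvarchar(35) = N'ABC123';

-- Implicit conversion: data-type precedence makes SQL Server convert
-- CustomerCode on every row, which can defeat the index on that column.
SELECT * FROM dbo.Customers WHERE CustomerCode = @Code;

-- Explicit cast on the parameter side leaves the column untouched,
-- keeping an index seek possible. (Just be sure the cast can't
-- silently lose characters for the data involved.)
SELECT * FROM dbo.Customers WHERE CustomerCode = cast(@Code as varchar(35));
```

The code-analysis rule flags the first form at build time, so the cast gets written before the query ever ships.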