DevOps Case study links


•GE Case Study
•AWS Sydney Australia
https://aws.amazon.com/blogs/aws/ge-oil-gas-digital-transformation-in-the-cloud/
•GE Oil and Gas Presentation
https://www.youtube.com/watch?v=DFGFaJZFtuk
•GE, AWS adoption
http://www.slideshare.net/AmazonWebServices/ism209-acceleration-of-aws-enterprise-adoption-in-ge

•Microsoft Developer Division Case Study
•Interview with Sam Guckenheimer on Microsoft’s Journey to Cloud Cadence
http://www.infoq.com/articles/agile2014-guckenheimer
•ALM DevOps features
https://www.visualstudio.com/en-us/features/alm-devops-vs.aspx
•Microsoft Continuous Delivery Discussion
https://www.youtube.com/watch?v=caM0DojhV7w

•Rakuten and Microsoft talk DevOps in Real World
http://www.slideshare.net/TsuyoshiUshio/rakuten-and-microsoft-talk-devops-in-real-world
•Detailed retrospective
http://blogs.msdn.com/b/bharry/archive/2013/11/25/a-rough-patch.aspx

•Knight Capital Case Study
https://www.sec.gov/News/PressRelease/Detail/PressRelease/1370539879795
•Knight Capital Group Wikipedia
https://en.wikipedia.org/wiki/Knight_Capital_Group
•Loss Swamps Trading Firm
http://www.wsj.com/articles/SB10000872396390443866404577564772083961412
•Knight Capital Group Comments
https://www.kcg.com/news-perspectives/article/knight-capital-group-comments-on-contributions-to-stabilize-the-u.s.-equity
•Knight Capital Says Trading Glitch cost it $440M
http://dealbook.nytimes.com/2012/08/02/knight-capital-says-trading-mishap-cost-it-440-million/?_r=0


Open Source Development: Installing Java, Maven, Git and Gnome on Ubuntu

Before you can begin building that groundbreaking new app you have been dreaming about, you first need to get your development environment set up. The following steps represent the path of least resistance to set up a development environment and begin development of your new killer app. The tools mentioned will also allow your new software delivery pipeline to be automated with another tool or two and some additional configuration.

Operating System
First we need an OS to host all of our tools, and for our purposes that OS will be Ubuntu Server 18.04 hosted on the AWS Cloud.


Once we have our AWS EC2 instance up and running we can use PuTTY to SSH into our server and finish up the configuration of the new server.


When we open the connection for the first time we will be prompted to add the server’s host key to the cache if we trust the server and intend to connect to it again in the future. We do trust the server, as we just created it, and we do intend to connect to it again, so click “Yes”.


Once we click “Yes” we are successfully connected to our new Ubuntu Server in AWS.


Server Configuration

Now we can complete the configuration of our new server to allow us to install and use our development tools. First, we will run “sudo apt-get update” to download the package lists from the repositories and “update” them to get information about the newest versions of packages and any dependencies they may have.
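For reference, here is that step as it would be typed at the shell; the optional second command is simply a quick way to see what the refreshed lists report as upgradable:

    # refresh the package lists from all configured repositories
    sudo apt-get update

    # optional: list packages that now have newer versions available
    apt list --upgradable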


Configure passwords and create developers account

Next, we will set a password for the root user so that we can log in to the console later if needed.


Now we can create a new account to use when we connect to the server to do development. To accomplish this, we will use the command “sudo adduser developer”.


Once the user has been created we will add them to the admin group using the command “sudo usermod -aG admin developer”
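Putting the account setup together in one place: the post does not show the root password command, so “sudo passwd root” below is an assumption, and on a stock Ubuntu 18.04 image the “admin” group may not exist by default (the built-in administrative group is “sudo”), so treat the group name as the post’s convention rather than a given.

    # set a password for the root user (command assumed; not shown in the post)
    sudo passwd root

    # create the developer account and add it to the admin group
    sudo adduser developer
    sudo usermod -aG admin developer

    # verify the new account's group membership
    id developer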


Runtime Environment

Now that we have our server up and running and have configured our users, we can begin installing the tools. First we will need to load the Java Runtime Environment with the command “sudo apt install default-jre”. This will install the latest version of the runtime. It is important to note that while the latest version of the runtime is what we want to use for any new development, it may still be necessary to load a specific version of the runtime to support a particular tool. For example, Jenkins requires that version 8 be installed; otherwise the installation will fail. For this reason we will also install the version required by Jenkins, first adding the repository with the command “sudo add-apt-repository ppa:webupd8team/java”.


Then we will execute the command “sudo apt install oracle-java8-installer” to complete the Java 8 installation.
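The Java steps collected as a sketch. Note that the webupd8team PPA hosted the Oracle Java 8 installer when this was written; if it is no longer available, installing “openjdk-8-jdk” from the standard repositories is a common substitute, which is my suggestion rather than part of the original post.

    # install the default Java Runtime Environment for new development
    sudo apt install default-jre

    # add the PPA that provides the Oracle Java 8 installer, then install it
    sudo add-apt-repository ppa:webupd8team/java
    sudo apt-get update
    sudo apt install oracle-java8-installer

    # confirm which Java version is active
    java -version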


Build Tool

We will use Maven to build our solution and specify dependencies in Eclipse projects. We will use the command “sudo apt install maven” to install Maven.
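A quick sanity check after the install; “mvn -version” reports both the Maven version and the JDK it has picked up, which helps confirm the Java setup above:

    # install Maven and confirm it can see the installed JDK
    sudo apt install maven
    mvn -version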

Version Control

Once Maven has installed successfully we can install Git with the following command “sudo apt install git”.


Then we can set the user name Git will record on commits with the command git config --global user.name "developer"


Using the git config --global user.email "devopsstudent@outlook.com" command we can set an email address for our developer.
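The Git identity setup in one block, using the example values from the post; the final command simply prints the resulting global configuration so you can confirm both entries:

    # install Git and set the identity recorded on commits
    sudo apt install git
    git config --global user.name "developer"
    git config --global user.email "devopsstudent@outlook.com"

    # review the resulting global configuration
    git config --global --list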

Install Jenkins for Continuous Integration

Jenkins can be used for many things, from Continuous Integration to Continuous Delivery and Deployment, depending on the plugins employed. Since Jenkins has so many uses in optimizing our software delivery pipeline we will install it now and complete the final configuration in a later post. To install Jenkins (after we have installed the appropriate runtime version), first add the repository signing key with “wget -q -O - https://pkg.jenkins.io/debian/jenkins.io.key | sudo apt-key add -”. Then add the Jenkins repository to the package sources with “sudo sh -c 'echo deb http://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'”.


Then we will need to run “sudo apt-get update” to get the latest package lists, and finally “sudo apt install jenkins” to install Jenkins. After the install completes we can use “systemctl status jenkins” to verify that the installation was successful.

Note: The package name is case sensitive, so “jenkins” must be typed with a lowercase “j”.
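Here is the full Jenkins sequence with the commands cleaned up into copy-and-paste form (straight quotes and plain hyphens, which the blog formatting tends to mangle):

    # add the Jenkins repository signing key
    wget -q -O - https://pkg.jenkins.io/debian/jenkins.io.key | sudo apt-key add -

    # add the Jenkins stable repository to the package sources
    sudo sh -c 'echo deb http://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'

    # refresh the package lists, install jenkins, and verify the service
    sudo apt-get update
    sudo apt install jenkins
    systemctl status jenkins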

Graphical User Interface

Before installing anything else we will install the Gnome GUI so that we can remote into this server and use the GUI to navigate between tools and locate files. To install Gnome we will first add the repository with “sudo add-apt-repository ppa:gnome3-team/gnome3”, then perform an update with “sudo apt-get update”, and finally install Gnome with “sudo apt-get install gnome-shell ubuntu-gnome-desktop”.
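As a single sequence (the desktop packages are large, so expect this step to take a while on a small EC2 instance; that timing note is my own, not the post’s):

    # add the GNOME 3 PPA, refresh the package lists, and install the desktop
    sudo add-apt-repository ppa:gnome3-team/gnome3
    sudo apt-get update
    sudo apt-get install gnome-shell ubuntu-gnome-desktop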

Install RDP and Open Ports

Once we have all of the system-level tools installed we can install xRDP and open the web and RDP ports so that we can connect to our server remotely. To install xRDP we can use the “sudo apt install xrdp” command and then enable the xrdp service so that it starts automatically.


Then open ports 3389 and 8080 with the commands “sudo ufw allow 3389” and “sudo ufw allow 8080”
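A sketch of the remote access setup. The post leaves the enable command blank, so “sudo systemctl enable xrdp” is assumed here; it is the usual way to have the service start on boot.

    # install xRDP and enable the service (enable command assumed; not shown in the post)
    sudo apt install xrdp
    sudo systemctl enable xrdp

    # open the RDP (3389) and web (8080) ports in the firewall
    sudo ufw allow 3389
    sudo ufw allow 8080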

Now we can use Remote Desktop to connect to our server and install the Integrated Development Environment using the Marketplace in the GUI.

A dialog will notify you that the identity of the remote computer cannot be verified. Since we just created this server we will check the “Don’t ask me again for connections to this computer” checkbox then click “Yes”

Then we can login as the developer user we created earlier.

From this point we can use the Marketplace to search for and install additional tools using the GUI.

Technical Debt Pay It Forward

Pay it Forward

You can pay now or pay later, but trust me, you’re gonna pay! I’m talking about Technical Debt… Technical Debt, like any other debt, has interest, so you can pay now or pay later, but if you pay later it will be much more expensive. See the post below on the increasing cost of Technical Debt for more info.

The increasing cost of technical debt

The short version is that defects are said to have a 1x cost at design time, but costs increase exponentially as you move through build and test toward production. The simple point is that the earlier we find and resolve issues and defects, the less it costs. Anything we can do to simplify or speed up the process of finding and documenting defects reduces our debt. We want to make a Shift Left in quality by moving quality checks closer to the beginning of the delivery pipeline, where defects are cheaper to fix. By creating Test Cases based on customer requirements and success criteria we can ensure that our tests are mapped to business value. This is no substitute for Unit Tests written at the function or method level. Plan for chaos, write tests to detect it, and buffer team capacity to fix it (more on that in a future post).

Follow ProDataMan on Facebook, YouTube and Twitter

We geek out with no limits on Facebook @ProDataMan; there will be posts about underwater hotels that have nothing to do with Programming, SQL Server or DevOps, but it will always be cool high-tech stuff…

Follow us on Twitter @ProDataMan to be notified when we add a new video to a playlist on the YouTube channel. We are currently curating Cucumber Acceptance Testing videos for a course I’m working on. Twitter followers will get a notification every time I add a new Cucumber video to the Agile Testing playlist.

The ProDataMan YouTube channel @ProDataManTrains focuses mainly on Information Technology topics, with demos and how-to videos. ProDataManTrains also live streams Agile and DevOps related topics from live courses.

On the ProDataMan blog we try to stick to Information Technology related topics. There will occasionally be a topic too good to pass up, like the recent sale of Google Assistant smart speakers at Best Buy for $29. The Insignia (Best Buy’s brand) smart speaker with Google Assistant has far better sound and bass response than the $129 Google Home.

Facebook – www.Facebook.com/ProDataMan
Twitter – www.Twitter.com/ProDataMan (@ProDataMan)
YouTube – www.YouTube.com/ProDataManTrains
Blog – http://www.prodataman.com/Home/Blog

Best Agile / DevOps Open Source Tool Chain

Historically I have been a Microsoft C# guy, but the more I work with non-Microsoft shops, hybrid environments, and Java guys running around everywhere, the more curious I have become about open source tool chains for Agile and DevOps.
We use Team Foundation Services for Work Item Tracking, Planning, Continuous Integration, and Continuous Deployment to QA and Stage in Azure. That’s all fine and good for projects built almost entirely on the Microsoft platform, but when there are more Java guys on the team than C# guys the holy wars begin.
I love the deep integration between the tools on the Microsoft stack (obviously born from vendor lock-in), but I am totally open to a more open-source, vendor-agnostic solution. I just haven’t been able to find one that provides the features I’m looking for.
Base level requirements are as follows:
•A tool that provides Epic / Story management and visualization (Kanban / Burndown).
•A tool for source / version control that integrates well with the work item tracking tool and the CI server to allow gated check-ins (reject the check-in if the build or tests fail).
•A Continuous Integration server that can notify source control of failed builds and tests so the check-in can be rejected, and that can notify the work item tracking tool so a bug work item is created and assigned to the user who committed the bad code.
•A release automation tool / plug-in that can trigger a release based on a successful CI build and test.
Does this tool chain only exist in the land of flying reindeer and unicorns?

Git and GitHub work fine for source / version control and integrate with almost everything, but gated check-ins and automatic bug creation have been elusive thus far.

Anyone have this working already? Any suggestions?

Create a Time Dimension using the SQL Server Data Tools Dimension Wizard

If your database has no Time or Date table you can use the Dimension Wizard in SQL Server Data Tools (SSDT) to generate your Time Dimension.  You can have the tool generate a Time table either in the data source (if you have permissions) or on the server.  When creating your Time table using the Wizard you have the option to specify the Time or Date range; the table will include the dates / times between your specified start and end points.

See the article on MS Docs below for more details on creating Time Dimensions automatically using the Dimension Wizard

https://docs.microsoft.com/en-us/sql/analysis-services/multidimensional-models/create-a-time-dimension-by-generating-a-time-table?view=sql-server-2017

Currently Reviewing Open Source Agile Tools

Looking for the best open source tools for running agile projects.  The goal of this little experiment is to create a CI / CD pipeline including planning, task management, source control / versioning, triggered build and test and deployment to the cloud.

Today I’m experimenting with Taiga, an open source planning and task management tool.  So far the interface is intuitive and it has most of the features and data points that I would expect to capture during planning.

For free you can have 3 team members and 1 private project (unlimited public projects).  There are Epics, Stories and Sub-tasks to track.  There are Sprints, Backlogs and Kanban boards to view.  It even has an issue tracker and a wiki.  You can also link your project timeline to a Slack channel to share project updates.

So far this tool is looking pretty good for free.  Are there other free tools that I should be looking at?  I’m looking for integration with Git and Jenkins to automate builds and tests.  The golden feature is gated check-ins!  If there is a free open source solution that associates a check-in with an assigned sub-task in version control, triggers a build in Jenkins, creates an issue (bug) in work item tracking if the build or tests fail, and deploys to the cloud if they succeed, the contest is over!  If you know of this magical free toolset please leave links in the comments.

I’ll post a video and screenshots shortly with a more detailed review.

CRUD Script and SSMS Toolkit

Using stored procedures in your data access code from ASP.NET applications stops most (not all) SQL injection attacks and also ensures that the query is executed with the same parameters in the same order and format each time, allowing the query optimizer to reuse the same query plan on subsequent executions.  So it makes good sense to use stored procedures for almost all access to your database.  The only problem with this practice is the time it takes to create at least four stored procedures for each table in your database.  We need a procedure for Insert, one for Select, one for Update and one for Delete.  We may also need additional stored procedures to get customers by email or to search for customers by FirstName or LastName.  In a database that has 1,000 tables, that means at a minimum we are creating 4,000 stored procedures.
So in order to lighten the DBA’s workload we can use an SSMS add-in (SSMS = SQL Server Management Studio) or a CRUD script (CRUD = Create, Read, Update, Delete) to automate the creation of our Insert, Select, Update and Delete statements.
I found a nifty little script that creates stored procedures for Select, Insert, Update and Delete for all of the tables in a database, or for a single table when a TableName variable is set. You can find this script in the Demos folder on the ProDataMan Portal under the name ISUD with Prefix and Schema Support.sql, or you can use the following link to download it: CRUDScript
*New: I finally updated CRUDScript for Schema support!
Someone told me about a feature of the SSMS Toolkit, an SSMS add-in available here: SSMS Toolkit
This tool allows you to create CRUD stored procedures for tables based on fully customizable templates that you can change to suit your needs. But this tool does so much more! See the Features page for more details.