Free DevOps Webinar Series Starting May 2nd at 3:00PM Pacific

Intro to Agile Webinar
Thursday May 2nd 3:00PM to 4:00PM Pacific Time

Is your team Agile in name only? Do you hear terms like WaterScrum or AgileFall around the office? Is your Daily Stand-up a 45-minute Sit-Down with little to no value to your team? Are you having trouble seeing the value and benefits Agile was supposed to provide? Or do you just want to get a better handle on what it means to be Agile? If so, this webinar is for you!

In the Intro to Agile Webinar we will discuss the topics below and clarify some common misconceptions about Agile. At the end of the webinar, registered attendees will have the opportunity to ask questions and learn how to access the step-by-step instructions and virtual machine images used in the technical demos shown during the webinar.

If you are an IT Manager looking for a high-level overview of Agile to help you and your team get started on your Agile journey, this webinar has the info you need!
Click the link below to begin the registration process for the free Intro to Agile Webinar, May 2nd at 3:00PM Pacific:
https://www.messenger.com/t/ProDataMan

The Triage Triangle: A Useful Agile Tool

In traditional Waterfall projects we had to beware of the dreaded Scope Creep… that evil demon that creeps into your backlog as overachieving team members agree to undocumented change requests that come in after project planning and budgeting are already done. Historically, to avoid scope creep, organizations have required adherence to strict change management policies that restrict changes of any kind with extensive approvals, requirements document updates and sign-offs, and negotiation of an amended contract. That’s NOT Agile!

The Agile Manifesto states that we prefer Customer Collaboration over Contract Negotiation and Responding to Change over Following a Plan. What this really means is the customer wants what the customer wants… If the planned solution as defined by the requirements does not meet the customer’s needs, then we are doing the customer a disservice by saying “The contract says we are only allowed to build what’s in this requirements document.” You won’t get much repeat business with that line of thinking. In my experience the customer doesn’t seem to know what they want until you build what they asked for; then they know for sure that is NOT what they want. So it’s best if we work in short iterations and get feedback from the customer confirming we are building what they want and asking whether there are any adjustments they would like to make. While this idea of Customer Collaboration over Contract Negotiation allows the Agile delivery team to Respond to Change over Following a Plan, if not managed correctly it can quickly erode the profits from App Dev.

Enter the Triage Triangle. The Triage Triangle is a tool we can use to illustrate to customers and stakeholders the impact of unplanned changes on delivery date and cost. In traditional Waterfall projects the scope was rigid and fixed (tied to the contract); as a result, any unplanned changes that were accepted expanded the scope, causing the delivery date and cost to slip. On an Agile project we keep the delivery date and cost fixed and allow changes in scope even late in the development process, with the understanding that if the project scope is expanded then one or both of the other sides of the Triage Triangle must be adjusted.

For example, if we have 100 story points worth of planned work to complete and the customer decides that they absolutely must have a new unplanned feature 10 story points in size, then we can do one of three things:
1. Trade a feature of equal size and story point value
2. Keep existing features and add the new feature, increasing cost to cover the additional work and pushing the delivery date.
3. Keep existing features and add the new feature, increasing cost enough to cover the additional work without pushing the delivery date (hiring additional resources will cost more).

Installing Cucumber Plugin for Eclipse

If you are using the Eclipse IDE and want to get started with Behavior Driven Development, you will most likely want to install the Cucumber Eclipse plugin. With the plugin installed and a couple of supporting files created, you can run and review automated acceptance tests. If you don’t already have Eclipse installed, it can be installed easily from the Software Marketplace in Gnome on Ubuntu.


Eclipse in the Software Marketplace in Gnome on Ubuntu

To install Eclipse simply click Eclipse then click the blue install button.


The path of least resistance for installing the Cucumber Eclipse plugin is to use the Eclipse Marketplace. The installation is pretty straightforward: you just search for Cucumber and click Install.

Once Eclipse is installed and launched, you can use the Eclipse Marketplace to add Cucumber for acceptance tests.


Select Help > Eclipse Marketplace to search the Eclipse Marketplace and install Cucumber

On the Search tab in the Eclipse Marketplace type cucumber in the Find box to find and install Cucumber.


Cucumber in the Eclipse Marketplace

You must accept the license agreement to complete the Cucumber installation.


Accept the license agreement to complete the installation

Once the Cucumber Plugin is installed, we can create a Maven Project and add a CucumberRunner.

See the next post for details

Staging and Committing Changes in Git and GitHub

Staging Files
As we generate new solution artifacts they need to be added to the repository, integrated with work from other team members, and tested to ensure that no defects were introduced by our changes. Until we add new files to the repository they are considered untracked files, even though they are present in a folder in the repo. Simply creating files in the folder structure of the local repository is not sufficient.


Create Echo.java in local repo folder structure

After creating a new file, we can use git status to check the status of files in the repository’s directory structure. Untracked files will be displayed in red. Red bad, green good!


Use git status to see untracked files
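
A minimal sketch of the check above (the repository path is hypothetical; Echo.java is the file from the example):

    cd ~/repos/echo-utils        # local repository folder (hypothetical path)
    touch Echo.java              # create a new file inside the repo's folder structure
    git status                   # Echo.java is listed in red under "Untracked files"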

Depending on the number of files we have created, we can use git add with various options to track one or more untracked files. If we create a single file (or a short list of files) we can use git add <filename(s)> to stage a particular file or list of files.


Stage single file or list of files with git add <filename(s)>

If we create multiple files, we can use git add . to stage everything new or modified in the current directory and its subdirectories. We can use git status at any time to determine whether there are any untracked files or uncommitted changes in the local repository.


Create multiple untracked files and check status

Once we have determined that there are two untracked files and a tracked file with changes to be committed, we can use git add . (or git add --all) to stage all of the new and modified files at once.

Stage all files in current directory
git add .

If we create multiple files in multiple subdirectories, we can use git add --all to stage every change in the working tree with a single command, no matter which directory we run it from.
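
The staging options above, side by side (Echo.java is the file from the earlier example):

    git add Echo.java            # stage a single named file
    git add .                    # stage new and modified files under the current directory
    git add --all                # stage every change in the working tree, including deletions
    git status                   # staged changes now show in green under "Changes to be committed"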

Committing changes to the Local Repository
Once we have created or edited files and staged the changes, we can commit the changes to the repository. However, if we attempt to commit without first telling Git who we are (user name and email), we will get a warning message.


Please tell me who you are

We can add the email address and user name using the git config command.


Use git config to add email and user name
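
A minimal sketch of the identity setup (the values shown are the ones used later in this series; add --global to apply them to every repository on the machine):

    git config user.name  "developer"
    git config user.email "devopsstudent@outlook.com"
    git config --list            # confirm what Git will record on each commit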

If we attempt to commit without adding a commit message, we will be prompted to enter one in the dreaded vim editor, as in the image below. If you find yourself in this editor, enter your message, press <esc> to exit edit mode, then type :x <enter> to save and exit the editor.


Enter Added Echo Util Class, Unit and Acceptance Tests

Adding a Commit Message

Once we have added our commit message, pressed <esc>, and typed :x <enter> to save and exit the editor, we will see our commit message followed by details about the files committed to the repository, as in the image below.


Commit complete
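
To skip the vim prompt entirely, the message can be supplied on the command line; a minimal sketch using the message from the example above:

    git commit -m "Added Echo Util Class, Unit and Acceptance Tests"
    git log --oneline -1         # confirm the new commit and its message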

Pushing Changes to a Remote Repository
In order to share our changes with other team members we need to push them to a remote repository. With a small team and little to no branching, pushing changes to a remote repo is as simple as running git push origin master. This command pushes commits from the current branch in the local repo to the master branch of the remote repo.


git push origin master

The git push origin master command only works if you cloned the remote repository to create your local repository or if you configured the remote repo in advance. Even then, if you are not already logged into the remote repo you will be prompted to log in when you execute git push origin master.


Authenticate to remote repo for push to complete

Once you have authenticated, the push will complete and a status message will be displayed showing what was pushed as well as the source and destination branches.


Push complete
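
Putting the push workflow together in one sketch (the remote URL is hypothetical; the git remote add step is only needed if the repository was not cloned):

    git remote add origin https://github.com/example/echo-utils.git   # register the remote, if not already configured
    git remote -v                # verify the configured remote
    git push origin master       # push local commits to the master branch of the remote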

In the next post we will clone this remote repo to a new local repo on a different computer, pull the changes, add a couple of new files, commit, and push our changes before we move into branching and merging.

Creating a Git Repository

There are a few ways to create a Git repository, but before we talk about How, let’s talk about Why.
Why do we need a Git repository, or any repository for that matter? And what is a Git repository anyway? In its simplest form, a Git repository is a folder that contains files we want to keep track of. As changes are made to the files, the modifications and different versions of the files are tracked and the changes are committed to the repository. This is very useful when many different team members are working on the same set of files. For this reason, tools like Git are commonly referred to as Version Control Systems. Historically, systems such as Git and Subversion (SVN) have been used to store source code for software applications, so they have also been referred to as Source Code Management (SCM) or Software Configuration Management systems. However, with the recent push to streamline the Software Delivery Life Cycle (SDLC), there has been a move to place ALL solution-related artifacts in version control. This allows for configuration of Continuous Integration, automated testing, easier automated deployments, and simplified scripts.

So that makes the “why do we need a Git repository” question easier to answer. A Git repository provides a central source of truth for the current, tested, ready-to-deploy version of the application. It also provides a central source from which to build and deploy our solutions, as well as the ability to isolate changes made to different features, or by different developers, on their own branch of the repository.

So now that we know what it is and why we need one, let’s look at how to create a Git repository.

Git Init

If we are starting a brand new project and do not yet have a repository, we can convert an existing directory into a repository using the git init command.
git init initializes the directory as a Git repository and creates a hidden folder called .git as a subdirectory of the directory being initialized.
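
A minimal sketch (the project directory path is hypothetical):

    cd ~/projects/killer-app     # an existing project directory (hypothetical path)
    git init                     # initializes the directory and creates the hidden .git subdirectory
    git status                   # any existing files show up as untracked, ready to be staged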

Git Clone

If we already have a remote repository in GitHub, we can clone that repository to create a local workspace with files we can edit.

GitHub

Cloning a repository is a fast way for a new team member to get a new workstation set up and start being productive immediately.

Simply copy the URL of the remote repository you wish to clone, then run git clone <paste URL here>. Depending on the size of the repository, the cloning process takes a matter of seconds.
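
A minimal sketch; the repository URL below is hypothetical:

    git clone https://github.com/example/echo-utils.git   # copies the remote repo into a new local folder
    cd echo-utils                                         # the clone lands in a folder named after the repo
    git status                                            # clean working tree, ready for edits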

Once you have initialized or cloned your repository you can begin to add files, edit them, and commit your changes. See the next post in this series for details on adding files and committing staged changes.

DevOps Case Study Links

GE Case Study
• AWS Sydney, Australia
https://aws.amazon.com/blogs/aws/ge-oil-gas-digital-transformation-in-the-cloud/
• GE Oil and Gas Presentation
https://www.youtube.com/watch?v=DFGFaJZFtuk
• GE, AWS adoption
http://www.slideshare.net/AmazonWebServices/ism209-acceleration-of-aws-enterprise-adoption-in-ge

Microsoft Developer Division Case Study
• Interview with Sam Guckenheimer on Microsoft’s Journey to Cloud Cadence
http://www.infoq.com/articles/agile2014-guckenheimer
• ALM DevOps features
https://www.visualstudio.com/en-us/features/alm-devops-vs.aspx
• Microsoft Continuous Delivery Discussion
https://www.youtube.com/watch?v=caM0DojhV7w
• Rakuten and Microsoft talk DevOps in the Real World
http://www.slideshare.net/TsuyoshiUshio/rakuten-and-microsoft-talk-devops-in-real-world
• Detailed retrospective
http://blogs.msdn.com/b/bharry/archive/2013/11/25/a-rough-patch.aspx

Knight Capital Case Study
• SEC press release
https://www.sec.gov/News/PressRelease/Detail/PressRelease/1370539879795
• Knight Capital Group on Wikipedia
https://en.wikipedia.org/wiki/Knight_Capital_Group
• Loss Swamps Trading Firm
http://www.wsj.com/articles/SB10000872396390443866404577564772083961412
• Knight Capital Group Comments
https://www.kcg.com/news-perspectives/article/knight-capital-group-comments-on-contributions-to-stabilize-the-u.s.-equity
• Knight Capital Says Trading Glitch Cost It $440M
http://dealbook.nytimes.com/2012/08/02/knight-capital-says-trading-mishap-cost-it-440-million/?_r=0

Open Source Development: Installing Java, Maven, Git and Gnome on Ubuntu

Before you can begin building that groundbreaking new app you have been dreaming about, you first need to get your development environment set up. The following steps represent the path of least resistance to set up a development environment and begin development of your new killer app. The tools mentioned will also allow your new software delivery pipeline to be automated with another tool or two and some additional configuration.

Operating System
First we need an OS to host all of our tools, and for our purposes that OS will be Ubuntu Server 18.04 hosted on the AWS cloud.


Once we have our AWS EC2 instance up and running, we can use PuTTY to SSH into the server and finish its configuration.


When we open the connection for the first time we will be prompted to add the server’s host key to the cache if we trust the server and intend to connect to it again in the future. We do trust the server, since we just created it, and we do intend to connect again, so click “Yes”.


Once we click “Yes” we are successfully connected to our new Ubuntu Server in AWS.
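
If you prefer a plain OpenSSH client over PuTTY, the equivalent connection looks roughly like this; the key file name and host address below are hypothetical:

    ssh -i ~/.ssh/devops-keypair.pem ubuntu@ec2-203-0-113-10.compute-1.amazonaws.com   # "ubuntu" is the default user on Ubuntu AMIs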


Server Configuration

Now we can complete the configuration of the new server so that we can install and use our development tools. First, we will run “sudo apt-get update” to download the package lists from the repositories and update them with information about the newest versions of packages and their dependencies.


Configure passwords and create the developer account

Next, we will set a password for the root user so that we can log in at the console later if needed.


Now we can create a new account to use when we connect to the server to do development. To accomplish this, we will use the command “sudo adduser developer”.


Once the user has been created we will add them to the admin group using the command “sudo usermod -aG admin developer”
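
Collected into one sketch (the root password command is not given in the text above, so sudo passwd root is an assumption):

    sudo apt-get update                  # refresh package lists, as above
    sudo passwd root                     # assumed command: set a password for the root account
    sudo adduser developer               # create the developer account (prompts for a password and details)
    sudo usermod -aG admin developer     # add developer to the admin group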


Runtime Environment

Now that we have our server up and running and have configured our users, we can begin installing the tools. First we will need to load the Java Runtime Environment with the command “sudo apt install default-jre”. This will install the latest version of the runtime. It is important to note that while the latest version of the runtime is what we want for new development, it may still be necessary to load a specific version to support a particular tool. For example, Jenkins requires that version 8 be installed; otherwise the installation will fail. For this reason we will also install the version required by Jenkins, first adding the repository with the command “sudo add-apt-repository ppa:webupd8team/java”.


Then we will execute the command “sudo apt install oracle-java8-installer” to complete the Java 8 installation.
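
Collected into one sketch (the extra package-list refresh after adding the PPA is an assumption, added so the new packages are visible):

    sudo apt install default-jre                    # latest default Java runtime
    sudo add-apt-repository ppa:webupd8team/java    # PPA that carries the Java 8 installer
    sudo apt-get update                             # assumed step: refresh lists so the new PPA is visible
    sudo apt install oracle-java8-installer         # Java 8, required by Jenkins
    java -version                                   # confirm which runtime is active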


Build Tool

We will use Maven to build our solution and to specify dependencies in Eclipse projects. We will use the command “sudo apt install maven” to install Maven.
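
As a quick check after the install, something like:

    sudo apt install maven
    mvn -version          # reports the Maven version and the JDK it found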

Version Control

Once Maven has installed successfully we can install Git with the following command: “sudo apt install git”.


Then we can set the Git user name for our developer with the command git config --global user.name "developer".


Using the git config --global user.email "devopsstudent@outlook.com" command, we can include an email address for our developer.
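
Collected into one sketch, matching the commands above:

    sudo apt install git
    git --version                                                # confirm the installed version
    git config --global user.name  "developer"
    git config --global user.email "devopsstudent@outlook.com"
    git config --global --list                                   # verify the identity settings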

Install Jenkins for Continuous Integration

Jenkins can be used for many things, from Continuous Integration to Continuous Delivery and Deployment, depending on the plugins employed. Since Jenkins has so many uses in optimizing our software delivery pipeline, we will install it now and complete final configuration in a later post. The installation (after we have installed the appropriate runtime version) starts with “wget -q -O - https://pkg.jenkins.io/debian/jenkins.io.key | sudo apt-key add -” to add the repository signing key. Then “sudo sh -c 'echo deb http://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'” adds the repository to the sources list.


Then we will need a “sudo apt-get update” to get the latest package lists, and finally “sudo apt install jenkins” to install Jenkins. After the install completes we can use “systemctl status jenkins” to verify that the installation was successful.

Note: The package name is case sensitive; jenkins must be spelled with a lowercase “j”.
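
The same sequence with plain ASCII quoting, collected in one place:

    wget -q -O - https://pkg.jenkins.io/debian/jenkins.io.key | sudo apt-key add -   # trust the Jenkins signing key
    sudo sh -c 'echo deb http://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'   # register the repository
    sudo apt-get update            # pick up the new repository
    sudo apt install jenkins       # package name must be lowercase
    systemctl status jenkins       # verify the service is running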

Graphical User Interface

Before installing anything else we will install the Gnome GUI so that we can remote into this server and use the GUI to navigate between tools and locate files. To install Gnome we will first add the repository with “sudo add-apt-repository ppa:gnome3-team/gnome3”, then perform an update with “sudo apt-get update”, and finally install Gnome with “sudo apt-get install gnome-shell ubuntu-gnome-desktop”.

Install RDP and Open Ports

Once we have all system-level tools installed we can install xRDP and open the web and RDP ports so that we can connect to the server remotely. To install xRDP we can use the “sudo apt install xrdp” command and then enable the service so it starts automatically.


Then open ports 3389 and 8080 with the commands “sudo ufw allow 3389” and “sudo ufw allow 8080”
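
Collected into one sketch; the original text omits the enable command, so the systemctl line below is an assumption:

    sudo apt install xrdp          # remote desktop server
    sudo systemctl enable xrdp     # assumed command: start xrdp automatically at boot
    sudo ufw allow 3389            # RDP
    sudo ufw allow 8080            # Jenkins web UI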

Now we can use Remote Desktop to connect to our server and install the Integrated Development Environment using the Marketplace in the GUI.

A dialog will notify you that the identity of the remote computer cannot be verified. Since we just created this server, check the “Don’t ask me again for connections to this computer” checkbox, then click “Yes”.

Then we can login as the developer user we created earlier.

From this point we can use the Marketplace to search for and install additional tools using the GUI.

Technical Debt Pay It Forward

Pay it Forward

You can pay now or pay later, but trust me, you’re gonna pay! I’m talking about Technical Debt… Technical Debt, like any other debt, has interest, so you can pay now or pay later, but if you pay later it will be much more expensive. See this post on the increasing cost of Technical Debt for more info.

The increasing cost of technical debt

The short version is that defects are said to have a 1x cost at design time, but costs increase exponentially as you move through build and test toward production. The simple point is that the earlier we find and resolve issues and defects, the less it costs. Anything we can do to simplify or speed up the process of finding and documenting defects reduces our debt. We want to make a Shift Left in quality by moving quality checks closer to the beginning of the delivery pipeline, where defects are cheaper to fix. By creating test cases based on the customer’s requirements and success criteria we can ensure that our tests are mapped to business value. This is no substitute for unit tests written at the function or method level. Plan for chaos, write tests to detect it, and buffer team capacity to fix it (more on that in a future post).

Follow ProDataMan on Facebook, YouTube and Twitter

We geek out with no limits on Facebook @ProDataMan; there will be posts about underwater hotels that have nothing to do with programming, SQL Server, or DevOps, but it will always be cool high-tech stuff…

Follow us on Twitter @ProDataMan to be notified when we add a new video to a playlist on the YouTube channel. I’m currently curating Cucumber acceptance testing videos for a course I’m working on. Twitter followers will get a notification every time I add a new Cucumber video to the Agile Testing playlist.

The ProDataMan YouTube channel @ProDataManTrains focuses mainly on Information Technology Topics with Demos and How To Videos.  ProDataManTrains also Live Streams Agile and DevOps related topics from live courses.

On the ProDataMan blog we try to stick to information technology related topics. There will occasionally be a topic too good to pass up, like the recent sale of Google Assistant smart speakers at Best Buy for $29. The Insignia (Best Buy’s brand) smart speaker with Google Assistant has far better sound and bass response than the $129 Google Home.

Facebook – www.Facebook.com/ProDataMan
Twitter – www.Twitter.com/ProDataMan (@ProDataMan)
YouTube – www.YouTube.com/ProDataManTrains
Blog – http://www.prodataman.com/Home/Blog

Blockchain Part 4: Decentralized Applications (Dapps)

While cryptocurrencies in general and Bitcoin in particular are the primary application that brought blockchain into the limelight, the potential of blockchain in the enterprise extends far beyond cryptocurrencies. As we’ll see in subsequent posts, companies in the financial, energy, medical, supply chain management, logistics, and cloud computing sectors are now investing in and launching proofs of concept to get ahead of the coming operations and technology changes in their enterprise systems.

The real power of blockchain will be realized as the technology gets adopted by industries such as:

  • Healthcare
  • Banking & Finance
  • Logistics
  • Ecommerce
  • Cloud Computing

Some of the startups launching Decentralized Applications (Dapps) include Storj, which is a decentralized storage solution. There is also technical conversation about how to run the Domain Name System (DNS) on blockchain to prevent denial of service (DoS) attacks.

Blockchain is rapidly gaining adoption in the enterprise. A significant percentage of Fortune 500 companies are either building external-facing commercial blockchain platforms (Microsoft, Amazon, Oracle, etc.) or deploying proofs of concept within their operations. This raises several practical questions and trends worth tracking:

  • How to introduce blockchain to your organization, especially to teams already using DevOps
  • The knowledge base needed for blockchain solutions and strategies
  • The increased demand for blockchain developers
  • The timeline for widespread blockchain adoption

We will be exploring these topics in subsequent articles.