Simple Echo App in Eclipse with TDD

In this tutorial, we will walk through creating a basic JUnit test class in Eclipse to make sure that our environment is set up properly. We will use Test-Driven Development (also known as Test-First Development) to ensure that we are writing just enough code to pass our test, not writing our test to pass our code. This process can be used to confirm that you are getting the correct output from your code.

We will create a single JUnit test method, review the various dependencies, and confirm that there are 0 errors in your code. Using this process can help save time and energy that may otherwise be spent reviewing and rewriting code that doesn’t work.

 

First we will create a new Java project, then we will create a JUnit test that describes the functionality that we expect from our code. Along the way, we will use code generation tools in Eclipse (templates) to create the class and its methods. Let’s get started!

 

  1. On the File menu, select: New > Java Project
  2. Name the Project EchoApp, and click Finish

  3. Expand the EchoApp project
  4. Right click on the src folder
  5. On the popup menu select: New > Class


  6. Name your new class TestEcho, and accept all defaults by clicking Finish
  7. Type the following code into your project (Notice the red squigglies beneath some of your annotations)
  8. Hover over the @Test annotation with the red squiggly beneath it
  9. Select “Add JUnit 4 library to the build path” to make the red squiggly disappear
  10. Hover over the assertEquals method call
  11. Select Add static import ‘org.junit.Assert.*’ to
    add the assert import and make the red squiggly go away
  12. Hover your mouse over Echo and select Create class ‘Echo’ to create the Echo class and make the red squiggly disappear

  13. Accept default settings and click Finish

    An empty public class has now been created:
  14. Click on the TestEcho.java tab to return to your class (Notice that now, a red squiggly is beneath echo)
  15. Hover over echo, and you will see ‘1 quick fix available:’
  16. Select Create method ‘echo(String)’ in type ‘Echo’ to create a public static String method
  17. Right click on the @Test annotation and select Run As > 1 JUnit Test
  18. Eclipse will prompt you to save your TestEcho and your new Echo class


  19. Click OK to save the files
  20. Uh oh! It looks like your test has failed
  21. Click the Echo.java tab (Notice that the default method template has set the return value to null)
  22. Change the return value from null to expected
  23. Click the TestEcho.java tab to return to your test class
  24. Right click on the @Test annotation
  25. Select Run as > 1 JUnit Test

  26. A dialog box will appear asking if you want to save your resources. Select OK

Your test will now run and pass with 0 errors and 0 failures:
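The code for this walkthrough appears only in screenshots, so here is a rough sketch of the finished result; the test string and method body are assumptions, and a plain main method stands in for the JUnit @Test method so the sketch compiles without JUnit 4 on the classpath:

```java
// Sketch of the finished Echo class after step 22's fix. In the tutorial the
// Eclipse-generated method stub returned null; changing it to return its
// input makes assertEquals(expected, Echo.echo(expected)) in TestEcho pass.
public class Echo {

    public static String echo(String input) {
        return input; // was `return null;` in the generated stub
    }

    // Stand-in for the JUnit test method (an assumption, since the original
    // listing is not reproduced here):
    //   @Test public void testEcho() { assertEquals(expected, Echo.echo(expected)); }
    public static void main(String[] args) {
        String expected = "Hello, World!";
        if (!expected.equals(echo(expected))) {
            throw new AssertionError("echo failed");
        }
        System.out.println("0 errors, 0 failures");
    }
}
```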

In this demo, we created a single JUnit test method, reviewed the dependencies, and ran the test to confirm that there were 0 errors in our code.

Resources:

Java Development Kit

Maven

Eclipse

See the post on installing Jenkins to convert this simple Java Project to a Maven Project and build and test with Jenkins.


Acceptance Test Automation with Cucumber / SpecFlow in Visual Studio

Acceptance Criteria, Test Automation and Gherkin

What do Acceptance Criteria, Test Automation and a Cucumber have in common? To the uninitiated it may seem that these 3 things have nothing in common; however, if you are an “Agilist” or Test Automation specialist then you are probably very familiar with the similarities. Let’s start with the Cucumber, specifically the Gherkin. The Gherkin, as it happens, is the most common type of cucumber used to make pickles. Gherkin is also the language used to write Automated Acceptance Tests in a tool called Cucumber. Acceptance Criteria are the conditions for success for a new feature (or part of a feature) being added to a solution. If the Acceptance Criteria are written in Gherkin format, most of the heavy lifting of Test Automation has already been done. To automate acceptance tests in Visual Studio we use the Visual Studio version of Cucumber: SpecFlow. To get started using Gherkin Acceptance Criteria to automate Acceptance Testing, the Visual Studio IDE needs to be configured with the Cucumber / SpecFlow extension.

In this post we will create a Virtual Machine in AWS, install the SpecFlow extension in Visual Studio Community Edition, and create an Automated Acceptance Test for a basic Stack object. Detailed step-by-step instructions follow.

Install SpecFlow Extension for Visual Studio

  1. Launch Visual Studio 2019 Community Edition
  2. In the Launch dialog choose Continue without code
    VS 2019 Continue without code
  3. In the Extensions Menu Select Manage Extensions
    VS 2019 Manage Extensions
  4. Select Online from the Navigation menu
  5. In the Search box type “SpecFlow”
  6. Select SpecFlow for Visual Studio 2019 and click Download
    VS 2019 Install SpecFlow Extension
  7. Click Close to close the Manage Extensions dialog
    VS 2019 Close Visual Studio to complete SpecFlow Extension Installation
  8. Exit Visual Studio
  9. Click Modify in the VSIX Installer dialog
    VSIX Installer – Modify
  10. When Installation completes click Close in the VSIX Installer dialog
    VSIX Installer – Installation Complete
  11. Restart Visual Studio

Note: Visual Studio will not Install the Cucumber Extension until all Visual Studio windows are closed.

Create a Unit Test Project to Map Acceptance Criteria

  1. In order to Add Gherkin Feature Descriptions we will need to add a Unit Test Project.
  2. In Visual Studio from the File menu select New Project
  3. In the New Project dialog, under Language select C#, under Platform select Windows, and under Project Type select Test
  4. Select Unit Test Project (.NET Framework) and click Next
    Create new Test Project
  5. Name the Project Stack App Acceptance Tests
  6. Name the Solution Stack App
  7. Click Create

    Configure new Project and Solution names
  8. In the Solution Explorer
  9. Right Click UnitTest1.cs and select Delete
  10. Right click the Project and select Add New Item
  11. Select the SpecFlow folder on the left
  12. Select SpecFlow Feature File
  13. Name the File StackAppFeatures.feature
  14. Click Add

    Create new Stack Feature File
  15. In the Solution Explorer select StackAppFeatures.feature
  16. In the Properties window, remove SpecFlowSingleFileGenerator from the Custom Tool property
    Remove SpecFlowSingleFileGenerator from the Custom Tool property

Note: The Custom tool error will disappear once the SpecFlowSingleFileGenerator text is removed from the Custom Tool Property of the StackAppFeatures.feature file.

Add SpecFlow NuGet Packages to Test Project

  1. Right click the Stack App Acceptance Tests Project
  2. Select Manage NuGet Packages… from the Menu
  3. In the NuGet – Stack App Acceptance Tests window select Browse from the menu
  4. In the Search (Ctrl+L) box type SpecFlow and press Enter
  5. Select SpecFlow by TechTalk
  6. In the Detail window to the right click Install

    Install SpecFlow NuGet package in Test Project
  7. In the License Acceptance dialog click I Accept

    License Acceptance – I Agree to accept license terms
  8. Select SpecRun.SpecFlow
  9. In the Detail window to the right click Install
  10. In the License Acceptance dialog click I Accept
  11. In the Search (Ctrl+L) box type SpecFlow.Tools and press Enter
  12. Select SpecFlow.Tools.MsBuild.Generation
  13. In the Detail window to the right click Install

Step Definitions

In the Solution Explorer select StackAppFeatures.feature and replace the default Math feature (User Story) and Acceptance Criteria with the text below:

Feature: StackAppFeatures
    As a User
    I need to Stack stuff
    So that I can move it around

@EmptyStack
Scenario: IsEmpty should be true
Given An Empty Stack
When I check IsEmpty
Then IsEmpty should be "true"

@NonEmptyStack
Scenario: IsEmpty should be false
Given A nonEmpty Stack
When I check IsEmpty
Then IsEmpty should be "false"

@PushTests
Scenario: Push Check IsEmpty
Given An Empty Stack
When I Push "Bugga Boo" onto the Stack
And I check IsEmpty
Then IsEmpty should be "false"

@PushPopTests
Scenario: Push Gets Popped
Given An Empty Stack
When I Push "Item 1" onto the Stack
And I Pop a value off the Stack
Then The result should be "Item 1"

@PushPeekTests
Scenario: Push Gets Peeked
Given An Empty Stack
When I Push "Item 1" onto the Stack
And I Peek at a value on the Stack
And I check IsEmpty
Then The result should be "Item 1"
And IsEmpty should be "false"

The purple text indicates that the Gherkin statement does not yet have associated step definitions. We’ll use the code generation tools in the SpecFlow extension to generate the step definitions from the Gherkin Acceptance Criteria.

  1. Right click on Given an Empty Stack and select Generate Step Definitions
  2. In the Generate Step Definition Skeleton – SpecFlow dialog click Generate

    Generate Step Definitions
  3. In the Select target step definition class file dialog accept the defaults and click Save

    Accept Default location for Step Definitions class file

Feature Implementation – Test First

Now we will replace the ScenarioContext.Current.Pending(); placeholder code in each of the Given…When…Then… functions in the StackAppFeaturesSteps.cs class file with calls to our code under test.

  1. In the Solution Explorer select the StackAppFeaturesSteps.cs class file
  2. Add the following code to the StackAppFeaturesSteps class
    Stack stack;
    String _actual;
    Boolean _isEmpty;
  3. In the public void GivenAnEmptyStack() function replace the ScenarioContext.Current.Pending(); with the code below
    stack = new Stack();
  4. Replace the ScenarioContext.Current.Pending(); code in the GivenANonEmptyStack function with the code below
    stack = new Stack();
    stack.Push("Hello, World!");
  5. Replace the ScenarioContext.Current.Pending(); code in the WhenICheckIsEmpty function with the code below
    _isEmpty = stack.IsEmpty();
  6. Replace the ScenarioContext.Current.Pending(); code in the WhenIPushOntoTheStack function with the code below
    stack.Push(p0);
  7. Replace the ScenarioContext.Current.Pending(); code in the WhenIPopAValueOffTheStack function with the code below
    _actual = stack.Pop();
  8. Replace the ScenarioContext.Current.Pending(); code in the WhenIPeekAtAValueOnTheStack function with the code below
    _actual = stack.Peek();
  9. Replace the ScenarioContext.Current.Pending(); code in the ThenTheResultShouldBe function with the code below
    Assert.AreEqual(p0, _actual);
  10. Replace the ScenarioContext.Current.Pending(); code in the ThenIsEmptyShouldBe function with the code below
    Assert.AreEqual(Boolean.Parse(p0), _isEmpty);

Now that we have our tests, we can use code generation tools to create the class under test. At this point we will also have 5 syntax/reference errors; we will resolve these in the next steps.

  1. In the Solution Explorer right click Solution ‘Stack App’ (1 of 1 project), select Add > New Project, choose Class Library (.NET Framework), and name the project Stack Utility
    In the Solution Explorer you should now see the Stack Utility project
    Add new Class Library project
  2. In the Solution Explorer select the StackAppFeaturesSteps.cs class file
  3. In the StackAppFeaturesSteps.cs class file hover the mouse over the Stack type with the red squiggles
    In the Quick Actions menu (The lightbulb) select Generate type ‘Stack’ > Generate new type…

    Use Quick Actions to Generate Stack Type
  4. In the Generate Type dialog select Stack Utility from the Project dropdown, select the Create new file radio button and click OK
    Note: the red squiggles under Stack are now gone
    Stack exists – No more red Squiggles
  5. In the StackAppFeaturesSteps.cs class file hover the mouse over the call to the stack.Push method and select Generate method ‘Stack.Push’ from the Quick Actions menu
    Use Quick Actions to Generate Push method
  6. Repeat the same process for the IsEmpty, Pop and Peek methods
  7. In the StackAppFeaturesSteps.cs class file hover the mouse over the Assert in Assert.AreEqual and select using Microsoft.VisualStudio.TestTools.UnitTesting; from the Quick Actions menu
    Note: You may now run the tests and see them fail
  8. From the Test menu select Run > All Tests
    Note: the Test Explorer shows 6 tests with 3 failures. Only 5 were actual tests (1 was the SpecRun evaluation-version delay); 3 failed and 2 were ignored. The ignored tests call methods that had already failed in this test run.
    Run the Tests and see them Fail

Now that the class skeleton has been created, we can implement the 4 methods: IsEmpty, Push, Pop and Peek.

  1. In the Solution Explorer select the Stack.cs class file in the Stack Utility project
  2. In the Stack class create a new stack variable of type ArrayList by adding the code below:
    ArrayList stack = new ArrayList();
  3. Hover the mouse over the ArrayList type and select using System.Collections; from the Quick Actions menu
  4. In the Push method replace throw new NotImplementedException(); with the code below
    stack.Insert(0,v);
  5. In the IsEmpty method replace throw new NotImplementedException(); with the code below
    return stack.Count == 0;
  6. In the Pop method replace throw new NotImplementedException(); with the code below
    String result = stack[0].ToString();
    stack.RemoveAt(0);
    return result;
  7. In the Peek method replace throw new NotImplementedException(); with the code below
    return stack[0].ToString();
    Note: You may now run the tests and see them pass
  8. From the Test menu select Run > All Tests
    Note: All tests should display a green check mark, indicating a pass.
    Green = Good, Red = Bad – Run the Tests and see them Pass


Now that we have the Step Definitions (glue code) defined, adding new Automated Acceptance Tests is as simple as pasting in new Gherkin statements. If the Product Owner adds new Acceptance Criteria to a user story, we can simply copy and paste from the Work Item Tracking (Project Management) tool into our Feature file and we are done. No new coding. For example, the Gherkin statement below can simply be added to the feature file with

    @MultiPushPopTests
    Scenario: Multi Push Pop
    Given An Empty Stack
    When I Push "Item 1" onto the Stack
    And I Push "Item 2" onto the Stack
    And I Pop a value off the Stack
    Then The result should be "Item 2"

no additional changes necessary.

Add Cucumber Acceptance Tests to a Maven Project

In order to automate acceptance tests with Cucumber in Eclipse we must first install the Cucumber Plugin from the Eclipse Marketplace. Once the Cucumber Plugin is installed in Eclipse, we can create a Maven Project and add Feature, Step Definition and CucumberRunner classes to integrate our Gherkin Acceptance Criteria statements with our JUnit test runner.

See the previous post for details on installing the Cucumber Plugin


Create a new Project (select other in the new project menu)


In the New Project Dialog Select Maven Project from the Maven folder

For this simple example we will forgo archetype selection in favor of a simple project

Create a simple project (skip archetype selection)


Add a Group Id and Artifact Id


Create a new CucumberRunner class in the src/test/java folder

Add the code below to the CucumberRunner class

import org.junit.runner.RunWith;
import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;

@RunWith(Cucumber.class)
@CucumberOptions()
public class CucumberRunner {
}


In a more complex project we may choose to better organize our Feature Files and Step Definitions by storing them in specific packages or folders. The syntax for specifying a feature folder is features = "path"; multiple locations can be listed as features = {"path1", "path2"}. The same is true when specifying the package for the "glue" code, e.g. glue = {"steps"}.

import org.junit.runner.RunWith;
import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;
@RunWith(Cucumber.class)
@CucumberOptions(
features = "src/test/resources"
,glue= {"steps"}
)
public class CucumberRunner {
}

Cucumber Runner with options for features folder and glue code package

After updating the CucumberRunner class we will need to update the pom.xml file to declare dependencies and resolve the reference issues (get rid of the red squigglies).


In our maven project we need to update the pom.xml file to include a dependency for cucumber as well as a dependency for junit integration

Add the code below to the pom.xml file before the closing </project> tag

<dependencies>
  <!-- Cucumber -->
  <dependency>
    <groupId>io.cucumber</groupId>
    <artifactId>cucumber-java</artifactId>
    <version>4.2.0</version>
    <scope>test</scope>
  </dependency>
  <!-- Cucumber JUnit Integration -->
  <dependency>
    <groupId>io.cucumber</groupId>
    <artifactId>cucumber-junit</artifactId>
    <version>4.2.0</version>
    <scope>test</scope>
  </dependency>
  <!-- JUnit -->
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.12</version>
    <scope>test</scope>
  </dependency>
</dependencies>

Once the pom.xml file has been updated with the required dependencies and saved, the reference issues in the CucumberRunner class will be resolved.

Now we can add a feature file then generate step definitions

Create Feature File

Create Feature File

Once we have created a new file with the .feature extension, the Eclipse plugin will provide us with a sample feature file to get us started.

Default Feature file Eclipse

For the purpose of this demo we will paste in some existing Acceptance Criteria from a user story in our Sprint Backlog in Azure DevOps (Team Foundation Server Online)

Copy Acceptance Criteria from a User Story in Azure DevOps
Paste Acceptance Criteria into Feature File
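The feature file contents appear only in the screenshots; based on the Stack scenarios shown in the SpecFlow section of this post, the pasted Gherkin acceptance criteria look something like this (a sketch, not the exact pasted text):

```gherkin
Feature: StackAppFeatures
    As a User
    I need to Stack stuff
    So that I can move it around

@EmptyStack
Scenario: IsEmpty should be true
Given An Empty Stack
When I check IsEmpty
Then IsEmpty should be "true"

@PushPopTests
Scenario: Push Gets Popped
Given An Empty Stack
When I Push "Item 1" onto the Stack
And I Pop a value off the Stack
Then The result should be "Item 1"
```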

Once we have created and populated our feature file, the Gherkin statements that do not yet have matching "glue" code (which is all of them) will be highlighted in yellow.

If we attempt to execute the tests we will get suggestions in the output window for creating the glue code.

import static org.junit.Assert.assertEquals;

import cucumber.api.java.en.Given;
import cucumber.api.java.en.Then;
import cucumber.api.java.en.When;
import stackApp.Stack;

public class StackSteps {

    Stack stack;
    Boolean _isEmpty;
    String _actual;

    @Given("An Empty Stack")
    public void an_Empty_Stack() {
        stack = new Stack();
    }

    @When("I check IsEmpty")
    public void i_check_IsEmpty() {
        _isEmpty = stack.isEmpty();
    }

    @Then("IsEmpty should be {string}")
    public void isempty_should_be(String string) {
        assertEquals(Boolean.parseBoolean(string), _isEmpty);
    }

    @Given("A nonEmpty Stack")
    public void a_nonEmpty_Stack() {
        stack = new Stack();
        String expected = "Hello, World!";
        stack.push(expected);
    }

    @When("I Push {string} onto the Stack")
    public void i_Push_onto_the_Stack(String string) {
        stack.push(string);
    }

    @When("I Pop a value off the Stack")
    public void i_Pop_a_value_off_the_Stack() {
        _actual = stack.pop();
    }

    @Then("The result should be {string}")
    public void the_result_should_be(String string) {
        assertEquals(string, _actual);
    }

    @When("I Peek at a value on the Stack")
    public void i_Peek_at_a_value_on_the_Stack() {
        _actual = stack.peek();
    }
}

Finally, we will create a Stack class and write just enough code to pass our tests.

Create Stack Class Java
import java.util.ArrayList;

public class Stack {

    // A typed list; with a raw ArrayList, get(0) would return Object
    // and peek() would not compile.
    ArrayList<String> stack = new ArrayList<String>();

    public Boolean isEmpty() {
        return stack.isEmpty();
    }

    public void push(String input) {
        stack.add(0, input);
    }

    public String pop() {
        return stack.remove(0);
    }

    public String peek() {
        return stack.get(0);
    }
}
Execute Cucumber Runner as a JUnit Test
View Cucumber Test Results


The Triage Triangle a useful Agile Tool

In traditional Waterfall projects we had to beware of the dreaded Scope Creep: that evil demon that creeps into your backlog as overachieving team members agree to undocumented change requests that come in after project planning and budgeting have already been done. Historically, to avoid scope creep, organizations have required adherence to strict change management policies that restrict changes of any form with extensive approvals, requirements document updates, sign-offs and negotiation of an amended contract. That’s NOT Agile!

The Agile Manifesto states that we prefer Customer Collaboration over Contract Negotiation, and Responding to Change over Following a Plan. What this really means is that the customer wants what the customer wants. If the planned solution as defined by the requirements does not meet the customer’s needs, then we are doing the customer a disservice by saying “the contract says we are only allowed to build what’s in this requirements document”. You won’t get much repeat business with that line of thinking. In my experience the customer doesn’t seem to know what they want until you build what they asked for; then they know for sure that is NOT what they want. So it’s best if we work in short iterations and get feedback from the customer, confirming we are building what they want and asking whether there are any adjustments they would like to make. While this idea of Customer Collaboration over Contract Negotiation allows the Agile Delivery Team to Respond to Change over Following a Plan, if not managed correctly it can quickly erode the profits from App Dev.

Enter the Triage Triangle. The Triage Triangle is a tool that we can use to illustrate to customers and stakeholders the impact of unplanned changes on Delivery Date and Cost. In traditional Waterfall projects the scope was rigid and fixed (tied to the contract); as a result, any unplanned changes that were included expanded the scope, causing the delivery date and cost to slip. On an Agile project we keep the Delivery Date and Cost fixed and allow changes in scope even late in the development process, with the understanding that if the project Scope is expanded then one or both of the other sides of the Triage Triangle must be adjusted.

For example, if we have 100 story points worth of planned work to complete and the customer decides that they absolutely must have a new unplanned feature 10 story points in size, then we can do 1 of 3 things:
1. Trade out a planned feature of equal size and story point value
2. Keep the existing features and add the new feature, increasing cost to cover the additional work and pushing the delivery date
3. Keep the existing features and add the new feature, increasing cost enough to cover the additional work without pushing the delivery date (hiring additional resources will cost more)

Installing Cucumber Plugin for Eclipse

If you are using the Eclipse IDE and want to get started with Behavior Driven Development then you will most likely want to install the Cucumber Eclipse Plugin. With the plugin installed and a couple of supporting files created you can run and review automated acceptance tests. If you don’t already have Eclipse installed it can also be easily installed from the Software Marketplace in Gnome on Ubuntu.


Eclipse in the Software Marketplace in Gnome on Ubuntu

To install Eclipse simply click Eclipse then click the blue install button.


The path of least resistance to install the Cucumber Eclipse plugin is to use the Eclipse Marketplace. The installation is pretty straightforward: you just search for Cucumber and click Install.

Once Eclipse is loaded and launched, you can use the Eclipse Marketplace to install Cucumber for Acceptance Tests


Select Help > Eclipse Marketplace to search the Eclipse Marketplace and install Cucumber

On the Search tab in the Eclipse Marketplace type cucumber in the Find box to find and install Cucumber.


Cucumber in the Eclipse Marketplace

You must accept the license agreement to complete the Cucumber installation


Accept the license agreement to complete the installation

Once the Cucumber Plugin is installed, we can create a Maven Project and add a CucumberRunner.

See the next post for details

Staging and Committing Changes in Git and GitHub

Staging Files
As we generate new solution artifacts they need to be added to the repository, integrated with work by other team members and tested to ensure that no defects were introduced by our changes. Until we add the files to the repository they are considered untracked files even though they are present in a folder in the repo. Simply creating files in the folder structure of the local repository is not sufficient.


Create Echo.java in local repo folder structure

After creating a new file, we can use git status to check the status of files in the repository’s directory structure. Untracked files will be displayed in red. Red is bad, green is good!


Use git Status to see untracked files

Depending on the number of files we have created we can use git add with various options to track one or more untracked files. If we create a single file (or a short list of files) we can use git add <filename(s)> to Stage a particular file or list of files


Stage single file or list of files with git add <filename(s)>

If we create multiple files in a single directory we can use git add . to Stage all files in the current directory but not sub directories. We can use git status at any time to determine if there are any untracked files or uncommitted changes in the local repository.


Create multiple untracked files and check status

Once we have determined that there are 2 untracked files and a tracked file with changes to be committed, we can use git add . to stage all new and changed files in the current directory, or git add --all to stage all files in the current directory and its subdirectories.

Stage all files in current directory
Git add .

If we create multiple files in multiple subdirectories we can use git add --all to stage all files in the directory tree with a single command
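The staging commands above can be exercised end to end in a scratch directory; this is a minimal sketch, where "demo" and the file names are placeholders:

```shell
# Minimal, self-contained run of the staging workflow described above.
git init demo                                     # create a new local repository
git -C demo config user.email "you@example.com"   # identity needed before committing
git -C demo config user.name "Your Name"
echo "public class Echo {}" > demo/Echo.java      # create an untracked file
git -C demo status --short                        # shows "?? Echo.java" (untracked, red)
git -C demo add Echo.java                         # stage a single file
mkdir -p demo/util
echo "helper" > demo/util/Helper.java             # untracked file in a subdirectory
git -C demo add --all                             # stage everything, subdirectories included
git -C demo commit -m "Added Echo util class"     # commit the staged changes
git -C demo status --short                        # clean working tree: no output
```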

Committing changes to the Local Repository
Once we have created or edited files and staged the changes, we can commit them to the repository. However, if we attempt to commit before telling Git who we are, we will get a warning message


Please tell me who you are

We can add the email and username info using the git config command


Use git config to add email and user name

Committing changes to the Local Repository
If we attempt to commit without adding a commit message we will be prompted to enter a commit message in the dreaded vim editor, as in the image below. If you find yourself in this editor, enter your message, press <esc> to exit edit mode, then type :x <enter> to exit the editor


Enter Added Echo Util Class, Unit and Acceptance Tests

Adding a Commit Message

Once we have added our commit message, pressed <esc> and typed :x <enter> to save and exit the editor, we will see our commit message followed by details about the files committed to the repository, as in the image below.


Commit complete

Pushing Changes to a Remote Repository
In order to share our changes with other team members we need to push our changes to a remote repository. With a small team and little to no branching, pushing changes to a remote repo is as simple as running git push origin master. This command pushes changes from the current branch in the local repo to the master branch of the remote repo.


git push origin master

The git push origin master command only works if you cloned your remote repository to the local repository or if you have configured the remote repo in advance. Even then, if you are not already logged into the remote repo you will be prompted to log in when you execute git push origin master


Authenticate to remote repo for push to complete

Once you have authenticated, the commit will complete and a status message will be displayed showing the number of files committed as well as the source and destination branch.


Push complete
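The push workflow can be sketched locally, with a bare repository standing in for the GitHub remote; all paths and names here are placeholders, and a real remote would prompt for credentials as described above:

```shell
# Simulate a push using a local bare repository as the "remote".
git init --bare remote.git                        # stand-in for the GitHub repo
git clone remote.git work                         # clone it, configuring origin
git -C work config user.email "dev@example.com"
git -C work config user.name "Dev"
echo "hello" > work/README.md
git -C work add README.md
git -C work commit -m "Initial commit"
git -C work branch -M master                      # ensure the branch is named master
git -C work push origin master                    # push local master to remote master
```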

In the next post we will clone this remote repo to a new local repo on a different computer, pull the changes, add a couple of new files, then commit and push our changes before we move on to branching and merging.

Creating a Git Repository

There are a few ways to create a Git Repository but before we talk about How let’s talk about Why.
Why do we need a Git repository, or any repository for that matter? And what is a Git repository anyway? In its simplest form, a Git repository is simply a folder that contains files that we want to keep track of. As changes are made to the files, the modifications and different versions of the files are tracked and the changes are committed to the repository. This is very useful when many different team members are working on the same set of files. For this reason, tools like Git are commonly referred to as Version Control Systems. Historically, systems such as Git and Subversion (SVN) have been used to store source code for software applications, so they have also been referred to as Source Code Management (SCM) or Software Configuration Management systems. However, with the recent push to streamline the Software Delivery Life Cycle (SDLC) there has been a move to place ALL solution-related artifacts in version control. This allows for configuration of Continuous Integration, automated testing, easier automated deployments and simplified scripts.

So that makes the “why do we need a Git repository” question easier to answer. A Git repository provides a central source of truth as to the current, tested and ready-to-deploy version of the application. It also provides a central source from which to build and deploy our solutions, as well as the ability to isolate changes made to different features, or by different developers, in their own branch of the repository.

So now that we know what it is and why we need one, let’s look at how to create a Git repository.

Git Init

If we are starting a brand new project and do not yet have a repository, we can convert an existing directory into a repository using the git init command.
git init initializes the directory as a Git repository and creates a hidden folder called .git as a subdirectory of the directory being initialized.

Git Clone

If we already have a remote repository in GitHub, we can clone that repository to create a local workspace with files we can edit.

GitHub

Cloning a repository is a fast way for a new team member to quickly get a new workstation set up and start being productive immediately.

Simply copy the URL of the remote repository you wish to clone, then run git clone <paste URL here>. Depending on the size of the repository, the cloning process takes a matter of seconds.

Once you have initialized or cloned your repository you can begin to add files, edit, and commit your changes. See the next post in this series for details on adding files and committing staged changes.

DevOps Case study links


• GE Case Study (AWS Sydney Australia)
https://aws.amazon.com/blogs/aws/ge-oil-gas-digital-transformation-in-the-cloud/
• GE Oil and Gas Presentation
https://www.youtube.com/watch?v=DFGFaJZFtuk
• GE, AWS adoption
http://www.slideshare.net/AmazonWebServices/ism209-acceleration-of-aws-enterprise-adoption-in-ge

• Microsoft Developer Division Case Study
• Interview with Sam Guckenheimer on Microsoft’s Journey to Cloud Cadence
http://www.infoq.com/articles/agile2014-guckenheimer
• ALM DevOps features
https://www.visualstudio.com/en-us/features/alm-devops-vs.aspx
• Microsoft Continuous Delivery Discussion
https://www.youtube.com/watch?v=caM0DojhV7w
• Rakuten and Microsoft talk DevOps in Real World
http://www.slideshare.net/TsuyoshiUshio/rakuten-and-microsoft-talk-devops-in-real-world
• Detailed retrospective
http://blogs.msdn.com/b/bharry/archive/2013/11/25/a-rough-patch.aspx

• Knight Capital Case Study
https://www.sec.gov/News/PressRelease/Detail/PressRelease/1370539879795
• Knight Capital Group Wikipedia
https://en.wikipedia.org/wiki/Knight_Capital_Group
• Loss Swamps Trading Firm
http://www.wsj.com/articles/SB10000872396390443866404577564772083961412
• Knight Capital Group Comments
https://www.kcg.com/news-perspectives/article/knight-capital-group-comments-on-contributions-to-stabilize-the-u.s.-equity
• Knight Capital Says Trading Glitch cost it $440M
http://dealbook.nytimes.com/2012/08/02/knight-capital-says-trading-mishap-cost-it-440-million/?_r=0







Open Source Development: Installing Java, Maven, Git and Gnome on Ubuntu

Before you can begin building that groundbreaking new app you have been dreaming about, you first need to get your development environment set up. The following steps represent the path of least resistance to set up a development environment and begin development of your new killer app. The tools mentioned will also allow your new software delivery pipeline to be automated with another tool or two and some additional configuration.

Operating System
First we need an OS to host all of our tools, and for our purposes that OS will be Ubuntu Server 18.04 hosted on the AWS Cloud. Once we have our AWS EC2 instance up and running, we can use PuTTY to SSH into the server and finish up its configuration.


When we open the connection for the first time, we will be prompted to add the server’s host key to the cache if we trust the server and intend to connect to it again in the future. We do trust the server, since we just created it, and we do intend to connect again, so click “Yes”.


Once we click “Yes” we are successfully connected to our new Ubuntu Server in AWS.


Server Configuration

Now we can complete the configuration of our new server so that we can install and use our development tools. First, we will run “sudo apt-get update” to download the package lists from the repositories and get information about the newest versions of packages and any dependencies they may have.


Configure passwords and create developers account

Next, we will set a password for the root user so that we can log in to the console later if needed.


Now we can create a new account to use when we connect to the server to do development. To accomplish this, we will use the command “sudo adduser developer”.


Once the user has been created, we will add it to the admin group using the command “sudo usermod -aG admin developer”.
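The account-setup steps above can be sketched as a short session. Note that on Ubuntu 18.04 the built-in administrative group is usually named "sudo" rather than "admin", so the commented variant may be needed on a stock install:

```shell
# Set a root password and create the developer account (run as a sudo-capable user).
sudo passwd root                    # optional: allow root console logins later
sudo adduser developer              # prompts for a password and user details
sudo usermod -aG admin developer    # add developer to the admin group
# sudo usermod -aG sudo developer   # variant for systems where "sudo" is the admin group
id developer                        # verify the group membership took effect
```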


Runtime Environment

Now that we have our server up and running and have configured our users, we can begin installing the tools. First we will need to load the Java Runtime Environment with the command “sudo apt install default-jre”. This will install the latest version of the runtime. It is important to note that while the latest version of the runtime is what we want to use for any new development, it may still be necessary to load a specific version of the runtime to support a particular tool. For example, Jenkins requires that version 8 be installed, otherwise the installation will fail. For this reason we will also install the version required by Jenkins, first adding the repository with the command “sudo add-apt-repository ppa:webupd8team/java”.


Then we will execute the command “sudo apt install oracle-java8-installer” to complete the Java 8 installation.
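The Java steps above, collected in order. (The webupd8team PPA used here has since been discontinued; on current systems “sudo apt install openjdk-8-jre” is the usual way to get Java 8.)

```shell
# Install the default Java runtime plus Java 8 for Jenkins compatibility.
sudo apt install default-jre                   # latest default JRE
sudo add-apt-repository ppa:webupd8team/java   # Java 8 PPA (since discontinued)
sudo apt-get update                            # refresh package lists after adding the PPA
sudo apt install oracle-java8-installer        # complete the Java 8 installation
java -version                                  # verify the active runtime
```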


Build Tool

We will use Maven to build our solution and specify dependencies in Eclipse projects. We will use the command “sudo apt install maven” to install Maven.

Version Control

Once Maven has installed successfully, we can install Git with the command “sudo apt install git”.


Then we can set the Git user name for our developer account with the command git config --global user.name "developer".


Using the git config --global user.email "devopsstudent@outlook.com" command, we can set an email address for our developer.
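Together, the two config commands set the identity Git records on every commit:

```shell
# Configure the commit identity for the developer account.
git config --global user.name "developer"
git config --global user.email "devopsstudent@outlook.com"
git config --global --list    # verify both values were saved
```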

Install Jenkins for Continuous Integration

Jenkins can be used for many things, from Continuous Integration to Continuous Delivery and Deployment, depending on the plugins employed. Since Jenkins has so many uses in optimizing our software delivery pipeline, we will install it now and complete final configuration in a later post. After we have installed the appropriate runtime version, we first add the repository signing key with “wget -q -O - https://pkg.jenkins.io/debian/jenkins.io.key | sudo apt-key add -”. Then “sudo sh -c 'echo deb http://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'” adds the Jenkins repository to our package sources.


Then we will need a “sudo apt-get update” to get the latest package lists, and finally “sudo apt install jenkins” to install Jenkins. After the install completes, we can use “systemctl status jenkins” to verify that the installation was successful.
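The full Jenkins install sequence from the steps above:

```shell
# Add the Jenkins signing key and repository, then install and verify Jenkins.
wget -q -O - https://pkg.jenkins.io/debian/jenkins.io.key | sudo apt-key add -
sudo sh -c 'echo deb http://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'
sudo apt-get update          # refresh package lists after adding the repository
sudo apt install jenkins     # note the lowercase "j"
systemctl status jenkins     # verify the service is running
```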

Note: The package name is case sensitive, and “jenkins” must have a lowercase “j”.

Graphical User Interface

Before installing anything else, we will install the Gnome GUI so that we can remote into this server and use the GUI to navigate between tools and locate files. To install Gnome, we will first add the repository (“sudo add-apt-repository ppa:gnome3-team/gnome3”), then perform an update (“sudo apt-get update”), and finally install Gnome (“sudo apt-get install gnome-shell ubuntu-gnome-desktop”).
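The three Gnome steps in order:

```shell
# Add the Gnome 3 PPA, refresh package lists, and install the desktop environment.
sudo add-apt-repository ppa:gnome3-team/gnome3
sudo apt-get update
sudo apt-get install gnome-shell ubuntu-gnome-desktop
```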

Install RDP and Open Ports

Once we have all system-level tools installed, we can install xRDP and open the Web and RDP ports so that we can connect to our server remotely. To install xRDP we can use the “sudo apt install xrdp” command and then enable the service so that it starts at boot, typically with “sudo systemctl enable xrdp”.


Then open ports 3389 and 8080 with the commands “sudo ufw allow 3389” and “sudo ufw allow 8080”.
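The remote-access steps above, collected together. The systemctl enable command is an assumption, since the original install command sequence does not show it:

```shell
# Install xRDP, enable it at boot, and open the RDP and web ports.
sudo apt install xrdp
sudo systemctl enable xrdp    # assumed enable command (not shown in the original steps)
sudo ufw allow 3389           # RDP
sudo ufw allow 8080           # Jenkins web UI
```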

Now we can use Remote Desktop to connect to our server and install the Integrated Development Environment using the Marketplace in the GUI.

A dialog will notify you that the identity of the remote computer cannot be verified. Since we just created this server we will check the “Don’t ask me again for connections to this computer” checkbox then click “Yes”

Then we can login as the developer user we created earlier.

From this point we can use the Marketplace to search for and install additional tools using the GUI.

Technical Debt Pay It Forward

Pay it Forward

You can pay now or pay later, but trust me, you’re gonna pay! I’m talking about Technical Debt. Technical Debt, like any other debt, has interest, so you can pay now or pay later, but if you pay later it will be much more expensive. See this post on the increasing cost of technical debt for more info.

The Increasing Cost of Technical Debt

The short version is defects are said to have a 1x cost at design time but costs increase exponentially as you build, test approaching production. The simple point is that the earlier that we find and resolve issues and defects the less it costs. Anything that we can do simplify or speed up this process of finding and documenting defects reduces our Debt. We want to make a Shift Left in quality by moving quality checks closer to the beginning of the delivery pipeline where defects are cheaper to fix. By creating Test Cases based on the customer requirements and success criteria we can ensure that our tests are mapped to business value. This is no substitute for Unit Tests written at the Function or Method level as used in Unit Testing.  Plan for chaos, write test to detect it and buffer team capacity to fix it (more on that in a future post).