In the previous post we set up an Ubuntu server. We also installed and configured AWX. Now it is time to finish up the Ubuntu side of our lab environment.
Our lab environment will eventually contain a domain controller, a management box, and two SQL Servers. See the image below:
Prepare for GitHub
To avoid entering your GitHub credentials each time, create an SSH key pair to authenticate with GitHub. With the -C parameter you can provide a descriptive comment for your SSH key.
ssh-keygen -t rsa -b 4096 -C "lab-awx"
This will create two files in the .ssh folder of your home directory. Copy the contents of ~/.ssh/id_rsa.pub (you can display it with cat ~/.ssh/id_rsa.pub) to GitHub: log in to your GitHub account, go to Settings (located under your avatar in the top right), choose SSH and GPG keys, and paste the public key in there.
To test the authentication, issue this command:
ssh -T git@github.com
One final thing to configure in Git is your user name and email address.
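Setting these is a one-time action (the name and address below are placeholders, use your own):

```shell
# Configure the identity Git records on your commits (placeholder values)
git config --global user.name "John Doe"
git config --global user.email "john@example.com"

# Verify the configured values
git config --global user.name
git config --global user.email
```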
AWX provides a web-based user interface, REST API, and task engine built on top of Ansible. It is one of the upstream projects for Red Hat Ansible Automation Platform. Installing it is not a trivial task (for me at least).
What I will cover in this post is how to prepare your environment so you can start and play with AWX. I am using a Hyper-V virtual machine with a clean Ubuntu 22.04 server.
Ubuntu in Hyper-V VM
So we start with a plain and simple Ubuntu installation in a VM. I downloaded Ubuntu 22.04 from the official website (https://ubuntu.com/download/server). The next step is creating a new VM in my Hyper-V environment; I called mine LAB-AWX. Select the Generation 2 option while creating the VM, and before installing Ubuntu, enable the Secure Boot option and select the Microsoft UEFI Certificate Authority template.
Like I said, install Ubuntu with all default options. Make sure your VM has a network adapter connected to an External Virtual Switch so you can download all required software. Make a note of the IP address of your VM, it is shown on screen after you log in.
Visual Studio Code
You could use any terminal application to connect to your Ubuntu environment, including Hyper-V’s native “Connect” option. I will use Visual Studio Code, however. In VSCode, install the extension called “Remote - SSH”. This extension allows you to connect to the Ubuntu virtual machine over SSH.
To connect to the VM, click the green remote indicator in the lower left corner. At the top of the window, select Connect to Host and enter the VM’s IP address.
Once connected, you can open a new terminal window by pressing Ctrl+Shift+`, or via the menu Terminal -> New Terminal.
Install Kubernetes
Now it is finally time to install Kubernetes. I will use k3s, a lightweight Kubernetes distribution that can be installed with a single command:
curl -sfL https://get.k3s.io | sh -
This downloads the installation script and installs k3s.
To check the installation, issue:
kubectl version
This will fail because of missing permissions. You could get it working by prefixing every command with sudo, but that gets annoying. To fix this, grant yourself permissions on the file /etc/rancher/k3s/k3s.yaml. First determine your user name and group using whoami and groups. In my case both are roger, so grant the permissions like so:
sudo chown roger:roger /etc/rancher/k3s/k3s.yaml
Running the kubectl version statement will output the version information.
Kustomize
Before installing AWX we will download and install Kustomize. This tool makes it easier to deploy configurations to your Kubernetes cluster.
Once Kustomize is downloaded, move the kustomize binary to the /usr/local/bin/ directory so it is available on your PATH.
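The download steps, following the official Kustomize documentation (the install script drops a kustomize binary in the current directory), look like this:

```shell
# Download and run the official Kustomize install script
curl -s "https://raw.githubusercontent.com/kubernetes-sigs/kustomize/master/hack/install_kustomize.sh" | bash

# Move the binary onto the PATH
sudo mv kustomize /usr/local/bin/

# Verify the installation
kustomize version
```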
AWX
Now it is finally time to install AWX. The preferred way to install it is by making use of the so-called AWX Operator. The Operator will take care of everything, and we will use Kustomize to install and configure it for us.
In your terminal, create a new file (you can use nano kustomization.yaml), and paste in the following code (indentation is important!):
---
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  # Find the latest tag here: https://github.com/ansible/awx-operator/releases
  - github.com/ansible/awx-operator/config/default?ref=1.1.3

# Set the image tags to match the git version from above
images:
  - name: quay.io/ansible/awx-operator
    newTag: 1.1.3

# Specify a custom namespace in which to install AWX
namespace: awx
Save and exit your kustomization.yaml file, and run this:
kustomize build . | kubectl apply -f -
This will take a while to finish; you can check the progress using this:
kubectl get pods --namespace awx
When it is finished, create a new file (nano awx.yaml) that describes the AWX instance we want the Operator to deploy:
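A minimal awx.yaml, following the AWX Operator documentation, could look like this (the resource name awx is an assumption; note the NodePort on the last line):

```yaml
---
apiVersion: awx.ansible.com/v1beta1
kind: AWX
metadata:
  name: awx
spec:
  service_type: nodeport
  nodeport_port: 30080
```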
As you can see on the last line, the resulting AWX web portal will be accessible on TCP port 30080. You can change this port if you want. Save and exit this file.
Now open your kustomization.yaml file again and add an additional resource. Final result will look like this:
---
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  # Find the latest tag here: https://github.com/ansible/awx-operator/releases
  - github.com/ansible/awx-operator/config/default?ref=1.1.3
  - awx.yaml

# Set the image tags to match the git version from above
images:
  - name: quay.io/ansible/awx-operator
    newTag: 1.1.3

# Specify a custom namespace in which to install AWX
namespace: awx
Save and exit the file again, and re-apply the configuration:
kustomize build . | kubectl apply -f -
Allow it some time to finish; you can check the progress by using:
kubectl get pods --namespace awx
When everything went fine, you will now have a bunch of pods, and your AWX website can be accessed. In order to log into the environment, you need to extract the generated admin password. You can do this like so:
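Per the AWX Operator documentation the password is stored in a Kubernetes secret named <resourcename>-admin-password, so assuming the AWX resource is called awx:

```shell
# Read the generated admin password from the Kubernetes secret and decode it
kubectl get secret awx-admin-password --namespace awx -o jsonpath="{.data.password}" | base64 --decode
```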
After all this work, we are finally able to open the portal for the first time. On your host, open a web browser and navigate to <ip of your ubuntu vm>:30080. Log in with user admin and the password you just extracted (tip: change it to a more user-friendly password in the portal ;-))
Hopefully AWX is now running. In the next post we will add some Windows VMs to our test lab, and see what we can actually do with our beautiful AWX test environment.
When working in a team, it can be difficult to keep development environments in sync. To avoid issues, it is really important to make sure each team member is using the same tools, versions, editions, etc. And as we all know, it takes a lot of time to set up a new machine when a new team member joins the team.
To make things easier (and quicker) you could use a so-called dev container. To quote the official website: “The Visual Studio Code Dev Containers extension lets you use a Docker container as a full-featured development environment. It allows you to open any folder inside (or mounted into) a container and take advantage of Visual Studio Code’s full feature set. A devcontainer.json file in your project tells VS Code how to access (or create) a development container with a well-defined tool and runtime stack. This container can be used to run an application or to separate tools, libraries, or runtimes needed for working with a codebase.”
This post will describe what is needed to get started, and how to work with SQL Server database projects. After reading this post you will be able to create a SQL Server database project, use the tSQLt framework for unit testing, and deploy dacpac files.
As a side note: yes I know, the scripts are using hard coded passwords.
To get started quickly, I have used one of the examples provided by Microsoft, which can be found here. In your VSCode root folder, create a subfolder called .devcontainer and create some new files in it:
devcontainer.json
// For format details, see https://aka.ms/devcontainer.json. For config options, see the README at:
// https://github.com/microsoft/vscode-dev-containers/tree/v0.245.2/containers/dotnet-mssql
{
    "name": "C# (.NET) and MS SQL",
    "dockerComposeFile": "docker-compose.yml",
    "service": "app",
    "workspaceFolder": "/workspace",

    // Configure tool-specific properties.
    "customizations": {
        // Configure properties specific to VS Code.
        "vscode": {
            // Set *default* container specific settings.json values on container create.
            "settings": {
                "mssql.connections": [
                    {
                        "server": "localhost,1433",
                        "database": "",
                        "authenticationType": "SqlLogin",
                        "user": "sa",
                        "password": "P@ssw0rd",
                        "emptyPasswordInput": false,
                        "savePassword": false,
                        "profileName": "mssql-container"
                    }
                ],
                "sqlDatabaseProjects.dotnetSDK Location": "/usr/bin/"
            },
            // Add the IDs of extensions you want installed when the container is created.
            "extensions": [
                "ms-dotnettools.csharp",
                "ms-mssql.mssql"
            ]
        }
    },

    "features": {
        "ghcr.io/devcontainers/features/azure-cli:1": {
            "version": "latest"
        }
    },

    // Use 'forwardPorts' to make a list of ports inside the container available locally.
    "forwardPorts": [1433],

    // postCreateCommand.sh parameters: $1=SA password, $2=dacpac path, $3=sql script(s) path
    "postCreateCommand": "bash .devcontainer/mssql/postCreateCommand.sh 'P@ssw0rd' './.devcontainer/dacpac' './.devcontainer/mssql/sql'",

    "remoteEnv": {
        "PATH": "${containerEnv:PATH}:/opt/mssql-tools/bin"
    }
}
docker-compose.yml
version: '3'
services:
  app:
    build:
      context: .
      dockerfile: dockerfile
      args:
        # Update 'VARIANT' to pick a version of .NET: 3.1-focal, 6.0-focal
        VARIANT: "6.0-focal"
        # Optional version of Node.js
        NODE_VERSION: "none"
    volumes:
      - ..:/workspace:cached
    # Overrides default command so things don't shut down after the process ends.
    command: sleep infinity
    # Runs app on the same network as the database container, allows "forwardPorts" in devcontainer.json function.
    network_mode: service:db
    # Uncomment the next line to use a non-root user for all processes.
    # user: vscode
    # Use "forwardPorts" in **devcontainer.json** to forward an app port locally.
    # (Adding the "ports" property to this file will not forward from a Codespace.)

  db:
    image: mcr.microsoft.com/mssql/server:2022-latest
    restart: unless-stopped
    environment:
      SA_PASSWORD: "P@ssw0rd"
      ACCEPT_EULA: "Y"
    ports:
      - "1433:1433"
The result of this Docker Compose file will be two containers (named db-1 and app-1). The app-1 container will host the .NET stuff, but also the SQL tools needed to build database projects and deploy dacpac files. The db-1 container will run the latest version of SQL Server 2022.
I had to add the last two lines because I want to be able to connect to the SQL instance in the container from outside the container (using SQL Server Management Studio). They map port 1433 inside the container to port 1433 on your local machine. If that port is already in use in your local environment, you can pick any available host port. This way you’re able to connect from SSMS by using “localhost,1433”.
dockerfile
# [Choice] .NET version: 6.0-focal, 3.1-focal
ARG VARIANT="3.1-focal"
FROM mcr.microsoft.com/vscode/devcontainers/dotnet:0-${VARIANT}
# [Choice] Node.js version: none, lts/*, 18, 16, 14
ARG NODE_VERSION="none"
RUN if [ "${NODE_VERSION}" != "none" ]; then su vscode -c "umask 0002 && . /usr/local/share/nvm/nvm.sh && nvm install ${NODE_VERSION} 2>&1"; fi
# Install SQL Tools: SQLPackage and sqlcmd
COPY mssql/installSQLtools.sh installSQLtools.sh
RUN bash ./installSQLtools.sh \
&& apt-get clean -y && rm -rf /var/lib/apt/lists/* /tmp/library-scripts
SHELL [ "/usr/bin/pwsh", "-c"]
RUN $ErrorActionPreference='Stop'; Install-Module -Name Pester, dbatools, PSScriptAnalyzer -Force
The last two lines are used to install PowerShell modules. I am not using them yet, but it is mainly to show how to do it (thanks MarkV for the inspiration).
installSQLtools.sh
Now, in the .devcontainer folder, create two subfolders: dacpac and mssql.
Add file installSQLtools.sh in folder .devcontainer/mssql:
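A sketch of what this script does, based on Microsoft's dotnet-mssql dev container sample, is shown below; treat the repository URLs and package names as assumptions and compare with the version you copied from the example:

```shell
#!/bin/bash
# Sketch based on Microsoft's dotnet-mssql sample -- verify against your copy.
echo "Installing mssql-tools (sqlcmd) and sqlpackage"

# Register the Microsoft package repository and install the command line tools
curl -sSL https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
curl -sSL https://packages.microsoft.com/config/ubuntu/20.04/prod.list > /etc/apt/sources.list.d/mssql-release.list
apt-get update
ACCEPT_EULA=Y apt-get -y install mssql-tools unixodbc-dev

# Install sqlpackage (used later by postCreateCommand.sh to deploy dacpac files)
curl -sSL -o sqlpackage.zip https://aka.ms/sqlpackage-linux
mkdir -p /opt/sqlpackage
unzip sqlpackage.zip -d /opt/sqlpackage && rm sqlpackage.zip
chmod a+x /opt/sqlpackage/sqlpackage
```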
Since this is a bash script (running in the container, which is Linux), be aware that Linux is case sensitive, and make sure the line endings are set correctly in VSCode. Windows uses CRLF, but Linux files use LF. You can check this easily in the lower right corner of the VSCode window, and change the line ending by clicking on CRLF/LF and selecting the correct value:
When the bash file has incorrect line endings, you will get an error like the one shown below. I made this mistake on purpose, of course, just to show the error 😉
postCreateCommand.sh
Create a new file postCreateCommand.sh in subfolder .devcontainer/mssql (and make sure the line endings are set correctly ;-)). This script is executed after the Docker containers are configured and running. It loops through each .sql file in the given folder (if any) and executes it. It also deploys any dacpac files stored in the dacpac folder.
#!/bin/bash
dacpac="false"
sqlfiles="false"
SApassword=$1
dacpath=$2
sqlpath=$3

# Wait (up to 60 seconds) until the SQL Server instance accepts connections
echo "SELECT * FROM sys.databases;" > testsqlconnection.sql
for i in {1..60};
do
    /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P "$SApassword" -d master -i testsqlconnection.sql > /dev/null
    if [ $? -eq 0 ]
    then
        echo "SQL Server ready"
        break
    else
        echo "Not ready yet..."
        sleep 1
    fi
done
rm testsqlconnection.sql

# Check whether the dacpac folder contains any .dacpac files
for f in "$dacpath"/*
do
    if [[ $f == *.dacpac ]]
    then
        dacpac="true"
        echo "Found dacpac $f"
    fi
done

# Execute every .sql file in the sql scripts folder
for f in "$sqlpath"/*.sql
do
    echo "Executing:"
    echo "$f"
    /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P "$SApassword" -d master -i "$f"
done

# Deploy every dacpac file found earlier
if [ "$dacpac" == "true" ]
then
    for f in "$dacpath"/*
    do
        if [[ $f == *.dacpac ]]
        then
            dbname=$(basename "$f" ".dacpac")
            echo "Deploying dacpac $f"
            /opt/sqlpackage/sqlpackage /Action:Publish /SourceFile:"$f" /TargetServerName:localhost /TargetDatabaseName:"$dbname" /TargetUser:sa /TargetPassword:"$SApassword"
        fi
    done
fi

if [ "$SApassword" == "P@ssw0rd" ]
then
    echo "$(tput setaf 1)WARNING$(tput sgr0): you are using the default sample password. If you want to change it, execute the following command"
    echo "/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P $SApassword -d master -Q \"ALTER LOGIN sa WITH PASSWORD = '<enterStrongPasswordHere>' \""
fi
First test
Now that the initial scripts are set up, it is time to give the dev container a try. In VSCode, press F1 and type “dev containers rebuild”. Click on “Dev Containers: Rebuild and Reopen in Container”.
In the lower right corner you have the option to show the log. If this is the first time you run this, it will download the required Docker images so it may take a few minutes to finish.
After a while you will see in the left bottom corner VSCode is connected to the dev container 🙂
If you click on the Ports tab, it will display the ports that are forwarded from inside the containers to the outside world.
This means that everything is working, and you are able to connect to the SQL instance running in the container (user name sa, password P@ssw0rd):
Happy days!
First database project
Let’s start with the first database project. For this to work you need the SQL Database Projects extension. After installing this one, you can create a new database project.
Click on Create new and use the wizard to create the project:
Select Database Project Type: SQL Server Database
Enter Project Name: Movies
Select Project Location: /workspace/
Select Target Platform: SQL Server 2022
SDK-style project (Preview): Yes
If you are familiar with database projects in Visual Studio, the interface will look quite familiar. If you already have a database set up, it is possible to import the database objects. It is also possible to generate objects based on an OpenAPI/Swagger specification (which I will ignore for now).
Usually I organize database objects by schema, so let’s create some folders and add table Director to the database project in folder dbo/Tables:
CREATE TABLE [dbo].[Director]
(
[Id] INT NOT NULL IDENTITY CONSTRAINT Director_PK PRIMARY KEY CLUSTERED
, [Name] VARCHAR(50) NOT NULL CONSTRAINT Director_Name_UIX UNIQUE
, [DateAdded] DATETIME NOT NULL CONSTRAINT Director_DateAdded_DF DEFAULT (GETDATE())
, [DateModified] DATETIME NULL
);
To fill the dbo.Director table with some initial data, add a Post-Deployment script to the Scripts folder.
Leave the default name for the Post-Deployment Script and paste in this code:
;WITH Source(Name, DateAdded)
AS
(
SELECT Name, DateAdded
FROM
(
VALUES
('Quentin Tarantino', GETDATE())
, ('Robert Rodriguez', GETDATE())
) AS Source(Name, DateAdded)
)
MERGE [dbo].[Director] AS T
USING (SELECT Name, DateAdded FROM Source) AS S
ON T.Name = S.Name
WHEN NOT MATCHED BY TARGET THEN
INSERT (Name, DateAdded) VALUES (S.Name, S.DateAdded);
To create a stored procedure, right click folder dbo\StoredProcedures in the database project and select Add Stored Procedure. Name it GetDirectorByName.
CREATE PROCEDURE [dbo].[GetDirectorByName]
@Name VARCHAR(50)
AS
BEGIN
SET NOCOUNT ON;
SELECT
Id
, Name
, DateAdded
, DateModified
FROM
dbo.Director
WHERE
Name = @Name;
END
Now right click on your database project, and select Build. In the OUTPUT tab you should see that the build succeeded:
Now it is time to actually deploy the database project onto the SQL instance in the container. Again, right click on the database project and select Publish. A wizard opens:
Select where to publish the project to: Publish to an existing SQL server
Select publish profile to load: Don’t use profile
Choose a connection profile from the list below: mssql-container
Password: P@ssw0rd
Select database: Movies
Choose action: publish
Assuming there are no errors, you can check that the table was actually created by opening SSMS, connecting to the SQL instance, and being amazed.
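If you prefer a quick check from a query window instead, the post-deployment data and the stored procedure we just created can be verified directly:

```sql
SELECT Id, Name, DateAdded FROM dbo.Director ORDER BY Name;

EXEC dbo.GetDirectorByName @Name = 'Quentin Tarantino';
```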
tSQLt Framework
It is time to add the tSQLt framework. You can read all about it here. This post is not about using the framework per se, but I will discuss how you can set up your dev container to use it. The tSQLt framework uses a CLR assembly and some tSQLt-specific tables and stored procedures. It is possible to include them all in your database project, but that makes it really hard to deploy the resulting dacpac to environments other than your development environment.
What I like to do is create three separate database projects:
Core database project: This project contains your core database objects like tables, stored procedures, views, etc. The resulting dacpac of this project will not contain any tSQLt related objects and therefore can safely be deployed to your test, acceptance and production environment.
tSQLt framework database project: This project contains the database objects required by the tSQLt framework.
tSQLt unit test database project: This project contains your unit tests.
Prepare SQL Server for tSQLt
VSCode is still connected to the dev container. In order to get tSQLt working, we need to open the local workspace again. You can do this by clicking in the lower left corner and select Reopen Folder Locally (ignore the message to Reopen in Container for the moment).
Download the tSQLt framework from the tSQLt website. From the zip file, copy the file PrepareServer.sql to your .devcontainer/mssql/sql folder. This script creates the required CLR assembly and enables the CLR option on the instance. It needs to run only once, so a good moment to do this is immediately after the dev containers are created. Remember, we already created the code for this in the bash script postCreateCommand.sh. To rebuild the containers, press F1 in VSCode and choose Dev Containers: Rebuild and Reopen in Container. In the Terminal output window you should see the PrepareServer script being executed.
Database Project tSQLt_Framework
So let’s create a new Database Project called tSQLt_Framework:
Select Database Project Type: SQL Server Database
Enter Project Name: tSQLt_Framework
Select Project Location: /workspace/
Select Target Platform: SQL Server 2022
SDK-style project (Preview): Yes
Like I mentioned earlier, this database project will contain the tSQLt related database objects. The tSQLt download contains file tSQLt.class.sql which will create all required objects for us.
In this new project, make a subfolder Scripts and add a Pre-Deployment Script (leave the default name). From the tSQLt zipfile, copy the contents of file tSQLt.class.sql into Scripts\Script.PreDeployment1.sql.
Build the tSQLt_Framework project by right clicking it and selecting Build. Make sure the build succeeds.
Now it is time to publish this project into our Movies database.
Select where to publish the project to: Publish to an existing SQL server
Select publish profile to load: Don’t use profile
Choose a connection profile from the list below: mssql-container
Select database: Movies
Choose action: publish
And… crap. The deployment fails! The reason it fails is that we filled the dbo.Director table with some initial data. The tSQLt_Framework project is not aware of the database objects we created in the Movies project, so by default those objects will be dropped, and sqlpackage errors out when it has to drop an unknown table that contains data. Another issue is that the tSQLt.class.sql script is not idempotent (it can run only once).
Instead of modifying the tSQLt.class.sql file, I made the decision to build and deploy the projects from the CLI. That way, I can drop and recreate the Movies database each time. So in the root of your workspace add a bash file called bd.sh (remember to set the line endings to LF).
The first line in this bash script will build the database project into a dacpac file, and the second line will publish the dacpac to the SQL instance. Note the /p:CreateNewDatabase=True option: this drops and recreates the database. The last line has parameter DropObjectsNotInSource set to False, so it will not drop objects that are “unknown” to the project.
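A sketch of bd.sh matching this description (the project and dacpac paths, and the password, are assumptions based on the wizard defaults used earlier):

```shell
#!/bin/bash
# Build the projects into dacpac files
dotnet build Movies/Movies.sqlproj
dotnet build tSQLt_Framework/tSQLt_Framework.sqlproj

# Publish the core project; CreateNewDatabase=True drops and recreates the database
/opt/sqlpackage/sqlpackage /Action:Publish \
    /SourceFile:Movies/bin/Debug/Movies.dacpac \
    /TargetServerName:localhost /TargetDatabaseName:Movies \
    /TargetUser:sa /TargetPassword:'P@ssw0rd' \
    /p:CreateNewDatabase=True

# Publish the tSQLt framework on top, without dropping objects unknown to this project
/opt/sqlpackage/sqlpackage /Action:Publish \
    /SourceFile:tSQLt_Framework/bin/Debug/tSQLt_Framework.dacpac \
    /TargetServerName:localhost /TargetDatabaseName:Movies \
    /TargetUser:sa /TargetPassword:'P@ssw0rd' \
    /p:DropObjectsNotInSource=False
```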
You can run the above script in VSCode Terminal, by issuing:
sh bd.sh
If you haven’t built a project before (in the current session, that is), the commands will fail with an error about a missing namespace. If you run into this issue, publish the Movies database project first using the GUI (right click -> Publish). I have not figured out how to fix this yet.
Database Project tSQLt_UnitTests
Now it is finally time to actually write some unit tests. So let’s create a new Database Project called tSQLt_UnitTests:
Select Database Project Type: SQL Server Database
Enter Project Name: tSQLt_UnitTests
Select Project Location: /workspace/
Select Target Platform: SQL Server 2022
SDK-style project (Preview): Yes
In the tSQLt_UnitTests project create a new folder Security, add a new Script TestBasic.sql and paste in this code:
CREATE SCHEMA [TestBasic] Authorization [dbo];
GO
EXECUTE sp_addextendedproperty @name='tSQLt.TestClass', @value = 1, @level0type = 'SCHEMA', @level0name = 'TestBasic';
GO
In the tSQLt_UnitTests project create a new folder TestBasic/StoredProcedures. Right click this folder, select Add Stored Procedure, and name it [test If database has correct collation]. Add this code, and make sure the procedure is created in the TestBasic schema:
CREATE PROCEDURE [TestBasic].[test If database has correct collation]
AS
BEGIN
SET NOCOUNT ON;
----- ASSEMBLE -----------------------------------------------
DECLARE @expected VARCHAR(255),
@actual VARCHAR(255)
SELECT @expected = 'SQL_Latin1_General_CP1_CI_AS'
----- ACT ----------------------------------------------------
SELECT @actual = CAST(DATABASEPROPERTYEX(DB_NAME(), 'Collation') AS VARCHAR(255));
----- ASSERT -------------------------------------------------
EXEC tSQLt.AssertEquals @expected, @actual;
END;
Add another stored procedure called [test If table dbo.Director exists] and paste in this code:
CREATE PROCEDURE [TestBasic].[test If table dbo.Director exists]
AS
BEGIN
SET NOCOUNT ON;
----- ASSERT -------------------------------------------------
EXEC tSQLt.AssertObjectExists @ObjectName = N'dbo.Director';
END;
The last thing we need to do is add this last project to the bd.sh script, and we’re good to go. Below you will find the complete bd.sh script. It builds and deploys all three database projects, and finally kicks off the unit tests.
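A sketch of the complete script, consistent with the steps above (paths and password are assumptions matching the earlier wizard defaults):

```shell
#!/bin/bash
# Build all three database projects into dacpac files
dotnet build Movies/Movies.sqlproj
dotnet build tSQLt_Framework/tSQLt_Framework.sqlproj
dotnet build tSQLt_UnitTests/tSQLt_UnitTests.sqlproj

# Deploy the core project; CreateNewDatabase=True drops and recreates the Movies database
/opt/sqlpackage/sqlpackage /Action:Publish \
    /SourceFile:Movies/bin/Debug/Movies.dacpac \
    /TargetServerName:localhost /TargetDatabaseName:Movies \
    /TargetUser:sa /TargetPassword:'P@ssw0rd' \
    /p:CreateNewDatabase=True

# Deploy the tSQLt framework and the unit tests on top, keeping "unknown" objects
/opt/sqlpackage/sqlpackage /Action:Publish \
    /SourceFile:tSQLt_Framework/bin/Debug/tSQLt_Framework.dacpac \
    /TargetServerName:localhost /TargetDatabaseName:Movies \
    /TargetUser:sa /TargetPassword:'P@ssw0rd' \
    /p:DropObjectsNotInSource=False
/opt/sqlpackage/sqlpackage /Action:Publish \
    /SourceFile:tSQLt_UnitTests/bin/Debug/tSQLt_UnitTests.dacpac \
    /TargetServerName:localhost /TargetDatabaseName:Movies \
    /TargetUser:sa /TargetPassword:'P@ssw0rd' \
    /p:DropObjectsNotInSource=False

# Finally, kick off all tSQLt unit tests
/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'P@ssw0rd' -d Movies -Q "EXEC tSQLt.RunAll;"
```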