DotNetZero vNext
2018-01-09
I've been working on a new version of DotNetZero (formerly PsakeZero). This v2 release includes a number of changes that were fun to build.
Large refactoring into smaller PowerShell functions
I'm trying out an approach that places each PowerShell function into its own file. This was for a couple of reasons. One, it forces me to start removing some of the global state from the 'application'; two, there are functions that are useful, to me, in other areas, and I wanted to be able to access them easily.
The old PsakeZero version just wasn't set up very well. It was a bit paradoxical in that it was a monolithic script that still depended on a number of HTTP URIs where other data persisted. You don't have to read far ahead to see that a barrage of HTTP requests would fire off when it executed.
With a directory structure full of scripts I still wanted the end result to be a single script. I created a highly meta 'build script' which spins through the appropriate directories and 'compiles' them into the 'application' that is served up to the end user.
Embedded Content
One of the user-selected options is to install Psake as the task runner. Psake is built on and uses PowerShell, which means that in some scenarios I need to create or lay down some PowerShell for the user. I wanted to keep the source of these content slugs in plain PowerShell so the next person (read-as: me) had a chance of 'seeing' the intent without having to visually un-escape everything while reading. I think that if I had tried to have the build script escape all the content I would have run into edge cases that the escapinator missed, so I opted for a bit of a nuclear option.
Again using the build process, I take those selected embedded content directories and encode the file content so that there is no way it could be interpreted by PowerShell as valid executable script or commands, which would cause problems at run time for dotnetzero.
Here is an example of those compression functions being used:
» $functionContent = Get-Content -Raw .\New-Directory.ps1
» $compressedFunction = $functionContent | Compress-String
» Write-Host $compressedFunction
H4sIAAAAAAAEAEsrzUsuyczPU/BLLdd1ySxKTS7JL6pUqOblUgCCaOfclJzUEqfMvJTMvHQNzViIcEFiUWKuggaEAwIqBYklGRCuJoQCGedZkpqroOuWX5ScqgDmhFQWpCogLNENAOqC6FWoUfAvLdH1K83J4eWq5eUCACei7sKXAAAA
» ($compressedFunction | Expand-String) -eq $functionContent
True
» $compressedFunction | Expand-String
function New-Directory {
    [CmdletBinding()]
    param (
        $path
    )
    New-Item -Force -ItemType Directory -Path $path | Out-Null
}
And the sample implementation
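The post doesn't show how Compress-String and Expand-String themselves work, but the H4sI prefix in the sample output is the Base64 form of the gzip magic bytes, so a minimal sketch of the pair, assuming a gzip-over-UTF-8 round trip with Base64 on the outside, might look like this:

```powershell
# Sketch, not the real dotnetzero source: gzip the UTF-8 bytes, then Base64-encode.
function Compress-String {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory, ValueFromPipeline)]
        [string] $value
    )
    $bytes  = [System.Text.Encoding]::UTF8.GetBytes($value)
    $buffer = New-Object System.IO.MemoryStream
    $gzip   = New-Object System.IO.Compression.GZipStream($buffer, [System.IO.Compression.CompressionMode]::Compress)
    $gzip.Write($bytes, 0, $bytes.Length)
    $gzip.Dispose()   # flush the gzip footer before reading the buffer back out
    [System.Convert]::ToBase64String($buffer.ToArray())
}

# Reverse the steps: Base64-decode, inflate through GZip, read back as UTF-8.
function Expand-String {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory, ValueFromPipeline)]
        [string] $value
    )
    $buffer = New-Object System.IO.MemoryStream (, [System.Convert]::FromBase64String($value))
    $gzip   = New-Object System.IO.Compression.GZipStream($buffer, [System.IO.Compression.CompressionMode]::Decompress)
    $reader = New-Object System.IO.StreamReader($gzip, [System.Text.Encoding]::UTF8)
    $reader.ReadToEnd()
}
```

Whatever the real implementation does, the important property is the one demonstrated above: the round trip is lossless, and the encoded form is inert text that PowerShell will never parse as script.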
Wraps up dotnet new for adding projects to the source tree
One of the things that the old version did not do was give the user a starting point for new .NET projects. Part of this was because the dotnet templating feature wasn't around yet, and I wasn't about to implement some convoluted clone-a-repo strategy - I don't care for that approach to starting projects. Now that we have the dotnet CLI template feature, I wanted to at least wrap up the template list information from dotnet new so that the user has a jumping-off point to get some source code projects stubbed out for their repo.
If you have the dotnet CLI installed you can run dotnet new --list, which will display the templates installed on the machine. Below is a sample:
Templates                            Short Name    Language      Tags
-------------------------------------------------------------------------------
Console Application                  console       [C#], F#, VB  Common/Console
Class library                        classlib      [C#], F#, VB  Common/Library
Unit Test Project                    mstest        [C#], F#, VB  Test/MSTest
xUnit Test Project                   xunit         [C#], F#, VB  Test/xUnit
ASP.NET Core Empty                   web           [C#], F#      Web/Empty
ASP.NET Core Web App (mvc)           mvc           [C#], F#      Web/MVC
ASP.NET Core Web App                 razor         [C#]          Web/MVC/Razor Pages
ASP.NET Core w/ Angular              angular       [C#]          Web/MVC/SPA
ASP.NET Core w/ React.js             react         [C#]          Web/MVC/SPA
ASP.NET Core w/ React.js and Redux   reactredux    [C#]          Web/MVC/SPA
ASP.NET Core Web API                 webapi        [C#], F#      Web/WebAPI
global.json file                     globaljson                  Config
NuGet Config                         nugetconfig                 Config
Web Config                           webconfig                   Config
Solution File                        sln                         Solution
Razor Page                           page                        Web/ASP.NET
MVC ViewImports                      viewimports                 Web/ASP.NET
MVC ViewStart                        viewstart                   Web/ASP.NET
----------------------------------------------------------------
Installed dotnet templates
----------------------------------------------------------------
1 Console Application
2 Class library
3 Unit Test Project
4 xUnit Test Project
5 ASP.NET Core Empty
6 ASP.NET Core Web App (mvc)
7 ASP.NET Core Web App
8 ASP.NET Core w/ Angular
9 ASP.NET Core w/ React.js
10 ASP.NET Core w/ React.js and Redux
11 ASP.NET Core Web API
12 global.json file
13 NuGet Config
14 Web Config
15 Solution File
16 Razor Page
17 MVC ViewImports
18 MVC ViewStart
----------------------------------------------------------------
Adding projects to your solution
Select dotnet item # to add to your solution
(blank to quit/finish):
This wizard will collect the project type and name, add them to a collection, and pass that on to the New-DotNetSolution function, which will then issue all the appropriate commands on the dotnet CLI to create your solution.
One of the things I like about this feature is that it will also prompt you to add a test project after each application-based template.
Adding projects to your solution
Select dotnet item # to add to your solution
(blank to quit/finish): 11
ASP.NET Core Web API Name: API
Do you want to add unit test project?
[N] No [M] mstest [X] xunit [?] Help (default is "N"):
It also gives you a status section so you can see what you are building, with the name and type of each project.
----------------------------------------------------------------
2 Selected Project(s)
----------------------------------------------------------------
- API
ASP.NET Core Web API
- API.Tests
xUnit Test Project
----------------------------------------------------------------
Adding projects to your solution
Select dotnet item # to add to your solution
(blank to quit/finish):
If you keep adding application-type items to the collection, the test question changes to a simple yes or no, since you've already decided on xUnit or one of the other testing frameworks installed in the local template cache.
Adding projects to your solution
Select dotnet item # to add to your solution
(blank to quit/finish): 7
ASP.NET Core Web App Name: Web
Do you want to add unit test project?
[N] No [Y] Yes [?] Help (default is "Y"):
And when you are done, you enter an empty item and it starts actually building the project. The next item in the pipeline takes the collection and the incoming solution name and creates your application.
For the most part this process is fairly straightforward. It creates the solution, creates the projects, adds the projects to the solution, checks for test project names by convention and sets up the connecting references to their respective projects, runs the restore and the build, and (you're going to always add test projects, right?) runs the test command for those.
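Under the covers those steps map onto ordinary dotnet CLI calls. A rough sketch for the API + API.Tests session above - the solution name and folder layout here are my assumptions, not dotnetzero's actual conventions:

```powershell
# Rough sketch of the CLI calls for the sample session; names/paths are illustrative.
dotnet new sln --name MyProduct
dotnet new webapi --name API --output src/API
dotnet new xunit --name API.Tests --output test/API.Tests

# Add both projects to the solution, then wire the test project to its target by convention.
dotnet sln add src/API/API.csproj
dotnet sln add test/API.Tests/API.Tests.csproj
dotnet add test/API.Tests/API.Tests.csproj reference src/API/API.csproj

# Restore, build, and run the tests.
dotnet restore
dotnet build
dotnet test test/API.Tests/API.Tests.csproj
```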
An item of note here: as part of the restore section it will check the directories for a package.json and run npm install in those respective directories. This keeps those web-based projects in a state that allows them to initially compile as part of this wizard process.
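A sketch of that check, assuming a straightforward recursive search for package.json files:

```powershell
# Find every directory in the tree carrying a package.json and run npm install there,
# skipping anything already inside node_modules. A sketch, not the real implementation.
Get-ChildItem -Recurse -Filter package.json |
    Where-Object { $_.FullName -notmatch 'node_modules' } |
    ForEach-Object {
        Push-Location $_.Directory
        npm install
        Pop-Location
    }
```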
Allows for selectable task runners
Another new feature is the ability to select different task runners for your new project. The default is still Psake but I've also added a basic install of Cake Build. Invoke-Build will be added shortly as well.
All of the runners are tucked behind a run.ps1 file that handles all the bootstrapping of the runner and any dependencies it might have. This is a common pattern I use across my projects so that the default build can easily be run with no args, but specific tasks that developers may need for daily work can also be executed. For example, .\run.ps1 compile-css or .\run.ps1 update-database, etc.
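Sketched for the Psake case, a run.ps1 of that shape might look like the following - the build file name and the install-on-demand strategy are assumptions:

```powershell
# run.ps1 sketch: bootstrap the task runner, then hand off whatever tasks were asked for.
[CmdletBinding()]
param (
    # Task(s) to pass through to the runner; no args means the default build.
    [string[]] $taskList = @('default')
)

# Install Psake for the current user if it isn't already available.
if (-not (Get-Module -ListAvailable -Name psake)) {
    Install-Module psake -Scope CurrentUser -Force
}
Import-Module psake

# Invoke the requested tasks against the build script (file name assumed here).
Invoke-psake -buildFile .\default.ps1 -taskList $taskList
```

With this in place, .\run.ps1 runs the full default build while .\run.ps1 compile-css runs just that one task.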
Hosted on Azure Functions
I think one of the best features of dotnetzero is how you execute it: irm dotnetzero.com | iex
or Invoke-RestMethod dotnetzero.com | Invoke-Expression
and you are off to the races. You can also use Invoke-WebRequest
or iwr,
but your machine needs to have gone through those initial IE first-run steps; otherwise you need the -UseBasicParsing
flag, and that former simple, easy-to-remember command is gone.
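Side by side, the two forms look like this (same result, more ceremony in the second):

```powershell
# The short form, when the machine cooperates.
irm dotnetzero.com | iex

# The Invoke-WebRequest form needs the extra flag plus the .Content hop.
(iwr dotnetzero.com -UseBasicParsing).Content | iex
```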
You can also visit https://dotnetzero.com in your browser, and you'll see the brochure page I created for the project. The reason this works is an HTTP principle called Content Negotiation, or conneg for short. This allows the site to serve HTML when it sees a browser request and serve the PowerShell script when the request comes from PowerShell, which sends an identifying User-Agent header - yes, I'm a bad person for sniffing the User-Agent. When curl sends a similar request it gets a shell script, which is really out of date and underpowered at the moment.
The back end for this is on Azure Functions and Azure Blob Storage. I have an Azure Function Proxy sitting in front of this so that stand-alone commands can be requested. The current command list is just one, but if you issue irm dotnetzero.com/dotnetcli | iex
you can run just the wizard portion of the application.
Here is what that looks like
{
    "$schema": "http://json.schemastore.org/proxies",
    "proxies": {
        "appcli": {
            "matchCondition": {
                "route": "/dotnetcli"
            },
            "backendUri": "https://example.com/api/fullapp",
            "requestOverrides": {
                "backend.request.method": "get",
                "backend.request.querystring.clicmd": "Get-DotNetProjects | New-DotNetSolution",
                "backend.request.querystring.beta": "false"
            }
        },
        "approot": {
            "matchCondition": {
                "route": "/"
            },
            "backendUri": "https://example.com/api/fullapp",
            "requestOverrides": {
                "backend.request.method": "get",
                "backend.request.querystring.clicmd": "New-SourceTree",
                "backend.request.querystring.beta": "false"
            }
        }
    }
}
This basically converts the route match into additional query string parameters that the Azure Function parses. The default is just to run the New-SourceTree
command, but when /dotnetcli
is appended to the end, the wizard - the Get-DotNetProjects | New-DotNetSolution
command chain - is used instead. This is where having all the functions sent down together comes in handy. It allows for PowerShell Command Composition™.
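Sketched with hypothetical names, the composition amounts to appending a different entry point onto the same compiled script body - the file name and variables below are illustrative only, not the real function:

```powershell
# Hypothetical sketch of what the function serves: every function in one script,
# plus whichever trailing command the clicmd query string selected.
$app   = Get-Content -Raw .\dotnetzero.ps1             # the compiled application (name assumed)
$entry = 'Get-DotNetProjects | New-DotNetSolution'     # value taken from the clicmd parameter
"$app`n$entry"                                         # returned to the client; iex runs it there
```

Because every function definition travels in the same payload, any pipeline of those functions is a valid entry point - the server only has to vary the last line.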
Feel free to take it for a spin and let me know what you think.