Taming Visual Studio Code for Golang

Here are a few tips I have collected over time while using Visual Studio Code to edit my Go code.

Using gogetdoc as the default documentation provider

Sometimes I came across code which wasn't showing tooltips when I hovered over a function. It looks to me like godoc doesn't handle non-local sources well.

Luckily, we can enable gogetdoc quite easily. Open Visual Studio Code, press Shift+Cmd+P (Ctrl+Shift+P on Windows/Linux) to open the Command Palette and type Open Settings (JSON); after hitting Enter you will get VS Code's settings.json opened. Check whether you have go.docsTool set and change it to gogetdoc like this:

"go.docsTool": "gogetdoc",
Your tooltips should now work well.

Speed up search in a folder by keeping the file cache alive

By default, a simple search in the workspace/folder is quite slow. It looks like the editor starts a dedicated search process every time you begin a new search. It is possible to keep that process in memory, which makes search pretty fast. Open settings.json as in the previous tip and set:

"search.maintainFileSearchCache": true,

You can compare how fast it is against when the cache is turned off. The keyboard shortcut for the search is Shift+Cmd+F.

 

Enlarge the debugging tooltips for variable content

While debugging under Delve, the default limit used in VS Code is 64 characters, which is often not enough. You can work around it with slices in the debug window, which also works: evaluating myvar[64:128] prints the next chunk.

Without changes it looks something like this:

debugging-long-text.png

Fortunately, it is possible to configure Delve to increase the limit as much as you want.

In settings.json, place the following configuration snippet:

    "go.delveConfig": {
        "dlvLoadConfig": {
            "followPointers": true,
            "maxVariableRecurse": 1,
            "maxStringLen": 1000,
            "maxArrayValues": 1000,
            "maxStructFields": -1
        },
        "apiVersion": 2,
        "showGlobalVariables": true
    }
It increases the length of the displayed text nicely.
By the way, did you know that there is a Game of Thrones Lorem Ipsum generator, for example the one by Rich Finelli? The result can be observed here:
Screenshot 2019-05-28 at 20.21.08.png

Enabling Go test log messages

Just like when running go test ./… on the command line, any log messages produced by your tests, like:

    t.Logf("logf Score %v\n", score)
    fmt.Printf("fmtf Score %v\n", score)

are ignored unless we run go test -v ./…
Of those two, fmt.Printf prints immediately, while t.Logf waits until the test completes, so sometimes it might be more useful to use fmt.Printf in tests.

Fortunately, the editor allows us to set up tests to use this flag. Open settings.json again and add:

"go.testFlags": ["-v"],
That will show the log messages in the Output window just right.

How to Import a GPX track to Garmin 920XT

I recently bought a Forerunner 920 and, despite trying before my first cycling trip, I was unable to import a GPX file, which I had created from a map, to navigate me on the road.

Several days later, I returned to the topic, hoping to get rid of my dependency on my phone for map navigation. After another round of searching, I was much more successful this time.

Thanks to Cyberbob99 on the Garmin Forums, I found the way. The trick is that you have to have ..

gpsies.com to the rescue

Fortunately, there is a site, www.gpsies.com, which can convert a GPX file that is missing the route (connections among points), timing data, altitude, or is otherwise less complete than a GPX file obtained from a real exercise. Go there, register, and load your GPX data by clicking Create/Upload.

Then simply click download in the bottom-left corner. You will get another GPX file; this time it will be bigger, because it contains the track and a generated exercise.

www.gpsies.com before clicking download

Garmin Connect imports GPX

With the newly obtained, fully featured GPX file, including some generated “cycling”, I was able to import it into my online (modern) Garmin Connect. I just needed to add it as an activity first, because courses do not have an import option.

In the graphical menu, I chose Activity and Import (GPX).

Importing a new activity, as if you had exercised for real

 

The import should succeed, and I got a new activity dated sometime in 2010 (I can remove it later).

Then I can go to the activity itself (it should contain the map too) and click the small gear wheel in the top-right corner. This expands a menu which has the option Save as Course.

 

We have a Course now

Now we can switch in Garmin Connect to Courses and select the new one we just created; in its details there should be an option to Send to Device. Click that option.

Sending course to the device

Now it will ask you to connect your device (I used Garmin Express, which needs to be installed on your computer).

The rest is the work of Garmin Express. It asked me to connect the device to its USB cradle; while the page was searching for the device, I set my device to sync and it was synchronised. I then returned to the web and closed the (still waiting) window on the site.

Then, on my Garmin, I pressed the three-dot button and went to Navigation > Tracks, and the new map was there.


Configuring EntityFramework for .NET Core

In this article I will try to show a real example of the .NET Core way of configuring Entity Framework Code First. The Microsoft documentation shows some examples, but the solution there creates one DbContext and stores it in the DI container.

The downside of that solution is that you cannot dispose of the context obtained from DI.

A better solution is to create a factory class, store it as a singleton in the DI container, and create the DbContext with this service.

public class MyDataContextFactory : IMyDataContextFactory
{
    const string ConnectionStringName = "MyDbConn";

    IConfigurationRoot configuration;

    public MyDataContextFactory(IConfigurationRoot Configuration)
    {
        configuration = Configuration;
    }

    public MyDataContext Create()
    {
        var optionsBuilder = new DbContextOptionsBuilder();
        optionsBuilder.UseSqlServer(
            configuration.GetConnectionString(ConnectionStringName));
        return new MyDataContext(optionsBuilder.Options);
    }
}

Also, for this class, we create the interface IMyDataContextFactory, which looks like this:

public interface IMyDataContextFactory
{
    MyDataContext Create();
}

The meat of this class is the Create method, which reads the DB connection string from the configuration and creates a particular instance of DbContext, in my case called MyDataContext, passing it the SQL connection string read from the configuration.

The class takes IConfigurationRoot as its only constructor parameter so that it can read from the .NET Core configuration. This is usually appsettings.json, but it's up to your application.
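For completeness, MyDataContext itself can be a plain Code First DbContext. A minimal sketch might look like the following; note that the TodoItem entity is only an illustration matching the controller example further below:

using Microsoft.EntityFrameworkCore;

// A minimal Code First context consumed by the factory above.
// The TodoItem entity is illustrative only.
public class MyDataContext : DbContext
{
    public MyDataContext(DbContextOptions options)
        : base(options)
    {
    }

    public DbSet<TodoItem> TodoItems { get; set; }
}

public class TodoItem
{
    public long Id { get; set; }
    public string Name { get; set; }
}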

Configuration and registration of the service

When you look into your Program.cs, you can see the configuration built, for example, like this:

public class Program
{
    public static IConfigurationRoot Configuration { get; set; }

    public static void Main(string[] args = null)
    {
        var builder = new ConfigurationBuilder()
            .SetBasePath(Directory.GetCurrentDirectory())
            .AddJsonFile("appsettings.json");
        Configuration = builder.Build();
        // and so on...

Please notice that the Configuration, once built, is stored in a static class member so that it can be used later.
The next step is to add the factory to the .NET Core dependency injection container. In the Startup.cs file, locate your existing ConfigureServices method, which is used for exactly these things, and add our registration, for example at the end of it:

public void ConfigureServices(IServiceCollection services)
{
    services.AddMvc();
    // Register MyDataContextFactory under its interface here
    services.AddSingleton<IMyDataContextFactory>(
        new MyDataContextFactory(Configuration));
}

How to use it in Controllers

Usage is then similar to other .NET Core DI services. We need to add our factory interface as a parameter of the particular controller's constructor so that it gets injected by DI automatically. It will look like this:

public class ValueController : Controller
{
    private IMyDataContextFactory myDataContextFactory;

    public ValueController(IMyDataContextFactory myDataContextFactory)
    {
        this.myDataContextFactory = myDataContextFactory;
    }
    // and so on

And usage is then very simple; for example, in the same controller I can have a method like this:

public class ValueController : Controller
{
    // ...
    [HttpGet("{id}", Name = "GetTodo")]
    public IActionResult GetById(long id)
    {
        using (var context = myDataContextFactory.Create())
        {
            var item = context.TodoItems.FirstOrDefault(t => t.Id == id);
            if (item == null)
            {
                return NotFound();
            }
            return new ObjectResult(item);
        }
    }
    // ...

So that's it; hopefully this will help someone.

Thanks for reading.

Queueing items to ThreadPool

The simplicity of this approach is astonishing. It has been in .NET since .NET 4, but our eyes are not always looking in the direction they should.

ThreadPool.QueueUserWorkItem

The method offers an overload taking a callback delegate and a state argument. The argument is actually significant: if it comes from a blocking call, the next work item will only be queued once that call stops blocking.

Thus a simple HTTP server can be as small as this:

// Assuming an HttpListener that has already been created and started, e.g.:
//   var listener = new HttpListener();
//   listener.Prefixes.Add("http://localhost:8080/");   // the prefix is just an example
//   listener.Start();
while (true)
{
    // GetContext blocks until a request arrives; only then is the next work item queued
    ThreadPool.QueueUserWorkItem(Process, listener.GetContext());
}

void Process(object o)
{
    var context = o as HttpListenerContext;
    // process the request and write the response
}

We wait until someone connects. And yes, it only processes at most a pool-size number of concurrent connections, but that is another story, I think. I just wanted to share the beauty of this solution.
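If you are curious what that limit actually is on your machine, you can ask the thread pool directly. A small standalone check (not part of the server above; the reported values vary by machine and runtime) could look like this:

using System;
using System.Threading;

class PoolInfo
{
    static void Main()
    {
        // Query the current thread pool limits.
        ThreadPool.GetMaxThreads(out int maxWorkers, out int maxIo);
        ThreadPool.GetMinThreads(out int minWorkers, out int minIo);
        Console.WriteLine($"Worker threads: {minWorkers}..{maxWorkers}, I/O threads: {minIo}..{maxIo}");
    }
}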

The inspiration came from:
https://codehosting.net/blog/BlogEngine/post/Simple-C-Web-Server
https://stackoverflow.com/questions/9034721/handling-multiple-requests-with-c-sharp-httplistener

Inspecting Wix Managed Custom Actions


The WiX Toolset allows creating .NET managed custom actions for Windows Installer. Windows Installer itself only supports custom actions in the form of .exe, VBScript, JScript, and native .dll.

The tool which marshals the Windows Installer call to a native DLL function into managed code is called the Deployment Tools Foundation (DTF).
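For reference, a managed custom action built on DTF is just a public static method marked with the CustomAction attribute; a minimal sketch (the class and method names here are only illustrative) might look like this:

using Microsoft.Deployment.WindowsInstaller;

public class CustomActions
{
    // The method name becomes the entry point exported from the generated .CA.dll.
    [CustomAction]
    public static ActionResult MyManagedAction(Session session)
    {
        session.Log("Begin MyManagedAction");
        return ActionResult.Success;
    }
}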


When you open an MSI with a managed CA in Orca, you will see it as a native .dll in the Binary table. You can also notice the WixCA binary attachment named after your binary, which basically encapsulates all the DTF code that extracts the managed CA.

For example, your binary in a .wxs file:

<Binary Id="CustomActionBinary" SourceFile="$(var.CA.TargetDir)$(var.CA.TargetName).CA.dll"/>

If you export the binary as a .dll and open it in Dependency Walker, you will see your custom action name as an exported function of the .dll.


The CA binary is actually created by MakeSfxCA, which packs all the managed CA binaries together with the WixCA code into a .dll as a Windows cabinet file.

For example, 7-Zip can extract the cabinet file to its actual content:


And here is how it looks in ILSpy.
