I was recently working on an Azure Resource Manager (ARM) template and ran into an issue setting a resource id. Someone helped me figure it out, but I figured I would share it. I was trying to get the resource id of the subnet for the NIC I was creating. If you look closely, you can see I am getting the id of the public IP address: I use the resource type and then add one argument, the name of the public IP address. This is a two-segment resource because the type 'Microsoft.Network/publicIPAddresses' names two segments. With most resources it's resource type, then resource name. There are some cases where that is not true, and subnets are one of them. Subnets use a three-segment setup, and if you don't understand that you get segment errors. The subnet resource id uses 'Microsoft.Network/virtualNetworks/subnets', which means I have to pass in the subnet name, but I also need the vnet name. So if there are three segments, you have to pass in the values for both named resources, in this case the virtual network and the subnet. Compare the two ids below and you can see the difference between a two- and a three-segment resource:
"id": "[resourceId('Microsoft.Network/publicIPAddresses', parameters('publicIPAddresses_CEBuild_ip_name'))]"
"id": "[resourceId('Microsoft.Network/virtualNetworks/subnets', parameters('virtualNetworks_CloudEngine_RG_vnet_name'), parameters('subnets_default_name'))]"
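To put those ids in context, here is a rough sketch of how they end up sitting inside the NIC's ipConfigurations. The parameter names match the ones above; the NIC name parameter and apiVersion are just placeholders I am assuming, and most other NIC properties are trimmed out:

```json
{
  "type": "Microsoft.Network/networkInterfaces",
  "apiVersion": "2017-06-01",
  "name": "[parameters('networkInterfaces_name')]",
  "location": "[resourceGroup().location]",
  "properties": {
    "ipConfigurations": [
      {
        "name": "ipconfig1",
        "properties": {
          "privateIPAllocationMethod": "Dynamic",
          "publicIPAddress": {
            "id": "[resourceId('Microsoft.Network/publicIPAddresses', parameters('publicIPAddresses_CEBuild_ip_name'))]"
          },
          "subnet": {
            "id": "[resourceId('Microsoft.Network/virtualNetworks/subnets', parameters('virtualNetworks_CloudEngine_RG_vnet_name'), parameters('subnets_default_name'))]"
          }
        }
      }
    ]
  }
}
```

Notice the subnet id takes two name arguments (vnet, then subnet) while the public IP id takes one, which is exactly the two- versus three-segment difference.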
Once I cleared that up, with assistance from Stack Overflow 🙂, I started getting another error. Apparently, when you create the subnet in the ARM template, the name must also follow the three-segment setup. Normally you just enter the name of the resource you are creating, and in my case I had used just the subnet name. I was not aware I also needed the vnet name as part of the actual name. The error pointed me to the line where the subnet was being created, and based on the last fix I decided to try the same thing with the name, which fixed it. Like so much with Azure ARM, it's a lot of trial and error sometimes.
"comments": "Generalized from resource: '/subscriptions/24b189e9-d338-435f-a60f-3491b68bc8a2/resourceGroups/CloudEngine_RG/providers/Microsoft.Network/virtualNetworks/CloudEngine_RG-vnet/subnets/default'.",
"name": "[concat(parameters('virtualNetworks_CloudEngine_RG_vnet_name'), '/', parameters('subnets_default_name'))]",
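For reference, here is roughly what the whole subnet resource looks like once the name is fixed. This is a sketch, not my exact template: the apiVersion and the addressPrefix value are placeholders I am assuming, and the dependsOn entry just illustrates that the vnet must exist first:

```json
{
  "type": "Microsoft.Network/virtualNetworks/subnets",
  "apiVersion": "2017-06-01",
  "name": "[concat(parameters('virtualNetworks_CloudEngine_RG_vnet_name'), '/', parameters('subnets_default_name'))]",
  "properties": {
    "addressPrefix": "10.0.0.0/24"
  },
  "dependsOn": [
    "[resourceId('Microsoft.Network/virtualNetworks', parameters('virtualNetworks_CloudEngine_RG_vnet_name'))]"
  ]
}
```

The type has three segments, so the name has to carry two: the parent vnet name and the subnet name, joined with a slash.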
I was glad to hear that Microsoft has built conditional logic for builds/releases into TFS 2017 Update 2 RC. I have always thought we needed that functionality; I remember trying to do it with the old workflow editor and it was a pain. Here are the conditions that can be used. Note that custom conditions are not yet available for release tasks, but they are working on it. So if you are on the fence about the updates, I encourage you to take a look at Update 2 for TFS 2017. There is much more than this in the update.
- Only when all previous tasks have succeeded
- Even if a previous task has failed, unless the build was canceled
- Even if a previous task has failed, even if the build was canceled
- Only when a previous task has failed
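On the build side, beyond the four built-in choices above, a task can also take a custom condition expression. As a sketch (the branch name here is just an example), custom conditions look like this:

```
# Run only when everything before succeeded AND we are building master
and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'))

# Run a cleanup task no matter what happened earlier, even on cancel
always()
```

Functions like succeeded(), failed(), and always() combined with variables are what the four built-in options boil down to under the covers.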
I am going to try to do a walkthrough of the setup and usage of these in upcoming posts.
People seem surprised that I create videos of my code camp talks and give them away for free. I think it's a great idea: people who attended can go back and watch, and so can anyone else for that matter. We all learn in different ways. Some read books, some watch videos, and some need in-person training. I am a video learner; I love to watch videos on my topics of interest and find it works best for me. That is one reason why I like to create training videos. The other reason is that I feel I have an obligation to give back to the tech community. Some folks say in-person training seems like a high cost for what is gained, but then again others may disagree. I feel that you need to do what works for you. I have added a Videos page to my site to list out all the videos I am going to be creating. I am a bit behind on the topics I speak on, but I will catch up quickly.

Now back to the giving-back part. Here is where I get funny looks when I say I have an obligation to give back. Most folks want to keep the knowledge they have close to the chest. I like to call this tribal knowledge, as if no one else could find it out some other way. When I started out, someone took the time to create a training plan for me to learn programming from. It was fairly in-depth, and I was given it for free. The only thing I was asked was to give back some day. I have been doing that ever since I learned the basics. This is one reason I speak at so many code camps and tech fests, as you will see on the Speaking page on my site.

What I would like to do is encourage anyone in the tech trade to give back to the community. Even if you are just learning, you know more than someone else. So blog, speak, write, but do something to give back. Keeping all that knowledge inside does not help anyone. If you think no one cares what you have to say, don't let that stop you. Your style may be exactly what someone is looking for. So let's all start giving back.
There is no doubt that everyone has heard about the value of implementing DevOps in their organization. I consult with a variety of companies trying to get on the bandwagon and get their dev and ops folks working together, and I have noticed that in every one of them there is usually a team left out: the database team. I have yet to hear a company I've worked with ask about database DevOps. It's a suggestion I always make, and I get similar responses: "Our database team works fine" or "We don't do auto deploys." This is really short-sighted, since the DBAs and database devs are still part of the team and need to be included. So if we are going to include them, what are some good practices to follow? I am sure, given some time, I could create a long list of them. For now I just want to list a few important ones.
Start treating Database as Code
I am always encountering DBAs and database devs writing one-off scripts to deploy their changes to various environments. We need to get away from this practice; it is error-prone and time-consuming. The idea here is to put the database code into version control so that all changes are tracked. This allows us to deploy in an automated, repeatable fashion to a development environment and to run tests. Some benefits of this practice are:
- It allows the DBAs to focus on what's really important to them, such as managing database changes.
- It eliminates the deployment errors that come from hand-written scripts and manual deployments.
- It takes the guesswork out of knowing which database version is in which environment, since an automation tool can track that.
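As a concrete, hypothetical example of what database as code can look like in the Microsoft stack: an SSDT database project checked into the same repository as the application, with one file per database object (the project and file names below are made up for illustration):

```
/src
  /MyApp.Database            <- hypothetical SSDT .sqlproj project
    /dbo
      /Tables
        Customers.sql
        Orders.sql
      /StoredProcedures
        usp_GetCustomerOrders.sql
```

Every schema change becomes a commit, and the build can compile the project into a dacpac that the release pipeline deploys, instead of someone hand-running a script.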
Single source of truth
One of the issues I run into quite often is that database devs store their version of the database code on file shares or cloud storage. This means that there may be many versions of the database out there and understanding what needs to be deployed is made that much harder. With Database as Code we can eliminate this practice and therefore have one source of truth for all database code. Treat the database files the same way you are treating your software code.
Include the database team

One of the tenets of DevOps is to be all-inclusive with regard to development and operations teams. So why are the database developers left out? It makes no sense to exclude them; they are developers too. My suggestion is to get them involved in the daily meetings your dev and ops teams have. If you are using a Scrum or agile approach, get them in the daily stand-up. Not only will they feel more a part of the team, they can also make other teams aware of what's going on in their world. Allow them to give updates and track their tasks in the same task-tracking system the development teams are using. If you use Kanban, get their tasks on the board and be sure to talk about them. They need to be able to answer the same three questions that developers ask and answer:
- What did you work on yesterday?
- What are you working on today?
- Are there any impediments or issues the team needs to know about?
Like I said, this is not an all-inclusive list, but it is a great start to a database team's DevOps journey. I hope to see more and more companies adopting DevOps practices for their database teams.