
In this post, I will explain how to migrate Team Foundation Server to Azure DevOps. We will cover the prerequisites and some drawbacks you may encounter while migrating. The migration includes the project data and the elements around it, such as issues and work items.

Why is migrating Team Foundation Server important? It comes down to the following reality: it is important to maintain the history for learning, audit, security, and transition purposes. Once you understand what you want from the migration, you are a good candidate for it. Please follow this article to the end and see how the whole process went for me.

Part of this article that explains how to migrate Team Foundation Server was written using the online documentation from the Microsoft Azure portal, which you can find here.

Pre-requisites

First, make sure you have the required versions of the products involved; one of my first problems was that I did not have them. In general, these are the things you need to check before you begin:

  • Your server must have enough disk space (at least 50 GB, more depending on how many projects you have; count on at least 250 MB per project)
  • At least 8 GB of RAM (my server had only 8 GB, but 16 GB is better)
  • Windows Server 2008 R2 or newer
  • The database server where your projects reside needs to be at least SQL Server 2016
  • Make sure your database and table collations are set to one of these: SQL_Latin1_General_CP1_CI_AS or Latin1_General_CI_AS
  • Direct access to the internet, because you’ll use a couple of cmdlets that require a connection to Azure services
  • A Visual Studio subscription with the required licenses for DevOps Online
  • An Azure subscription and a blob storage account
  • Microsoft Azure Storage Explorer
  • Admin permission to install tools and required features on the server before migrating to the cloud
  • Access to the database (in case a separate server handles the TFS project databases)
  • A lot of patience and positive energy
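The disk-space guideline above can be sketched as a small pre-flight check. This is a hypothetical helper of my own, not part of any Microsoft tooling; the 50 GB baseline and 250 MB-per-project figures simply mirror the checklist, so adjust them to your environment:

```python
import shutil

def required_bytes(project_count, base_gb=50, per_project_mb=250):
    """Bytes of free space the checklist suggests for `project_count` projects."""
    return base_gb * 1024**3 + project_count * per_project_mb * 1024**2

def has_enough_space(path, project_count):
    """Compare free space on the drive holding `path` against the guideline."""
    return shutil.disk_usage(path).free >= required_bytes(project_count)
```

For example, `has_enough_space("D:\\", 40)` would check the data drive against the baseline plus 40 projects’ worth of headroom.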

Let’s start the migration process

In my case, the migration process started with taking into account that there is no direct path to migrate from TFS 2015 to DevOps Online; I quickly noticed that I would have to upgrade at least twice before reaching the desired state. Remember, all the steps presented here need to be done on the server where TFS is installed; if there is a way to do this from a remote machine, it is beyond my knowledge.

1. First we will migrate TFS 2015 to TFS 2018. This process is called an in-place upgrade. Download the installer

2. Double click the installer and follow the instructions as shown:

Decide whether you want to participate in the Visual Studio Experience Improvement Program and click Next

3. For the deployment type, choose the second option, “I have existing databases to use for this Team Foundation Server deployment,” because remember, we are doing an upgrade.

After choosing the right option, click the Next button

4. Specify the TFS configuration database your instance is using:

Check the lower box to confirm you have a backup and are good to go, then click Next

5. Select your desired scenario. In my case I selected the production upgrade because I was ready and had the corresponding backups in case something went wrong

After choosing your scenario, click the Next button

6. Specify the service account for running your instance, or choose to run it as a network service; it depends on your situation and environment

Set the username and password and click the Next button

7. Finally, you can specify your public and local URLs. In my case I left them unchanged. You can now click the Next button and follow the final steps (the next-next-finish approach)

8. In my case I did not configure the reporting server or SharePoint integration; I just skipped all of those steps.

Click Next here and the upgrade will start

9. Finally it will show you a window like this:

You can close it by clicking the Close button

Verifying the upgrade

After this upgrade, open your browser and verify that the web server is working and you can see the new TFS 2018 interface. It is also good to open the Team Foundation Server Administration Console and verify that everything is working as expected.

Connect your project to the new environment; pull, push, and check in to see if everything is working well. Soon we will move on to DevOps Online.

Upgrade TFS 2018 to Azure DevOps Server 2019 Update 1

1. To upgrade your TFS 2018 to Azure DevOps Server 2019 Update 1 (at the time of writing, the latest stable version to install), first download the installer

2. Follow the steps as in the previous upgrade; it is almost the same process. Let’s go to the final screen:

Now we are running Azure DevOps Server 2019 Update 1

Upgrading from Azure DevOps Server 2019 to DevOps Online

Project collections in TFS are now Organizations in DevOps Online

Microsoft

First, I’m going to save you a lot of time: if you start the migration you will notice two obstacles in your road. I’ll show you how to solve them, because they cost me a lot of time.

  1. You won’t be able to generate the export package if you don’t have the Microsoft SQL Data-Tier Application Framework installed. Grab it from here
  2. You may need to update your project templates to the new standard. To do this you will need a tool, which you can grab from here. Also give the documentation a read, because you can migrate using a .CSV file listing all of your projects; I discovered that after a lot of trial and error

Import the process template for every project in your collection using this command:

./ConformProject.ps1 "http://myServer/tfs/DefaultCollection" "foo" "c:\folder\agile"
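Since the tool also accepts a CSV listing your projects, one way to conform a whole collection is to generate one ConformProject.ps1 call per project from that list. Below is a hypothetical sketch of my own: it assumes a CSV with a "project" column, and the collection URL and template path are placeholders, not anything prescribed by the tool:

```python
import csv
import io

def conform_commands(csv_text, collection_url, template_path):
    """Build one ConformProject.ps1 invocation per project listed in the CSV."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [
        f'./ConformProject.ps1 "{collection_url}" "{row["project"]}" "{template_path}"'
        for row in reader
    ]
```

For example, `conform_commands("project\nfoo\nbar", "http://myServer/tfs/DefaultCollection", r"c:\folder\agile")` yields two ready-to-run command lines.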

Once you have completed the previous two steps, let’s start the migration, which consists of the following steps. But first remember: you need an internet connection for running the tool, since it connects to the Azure services online:

1. Validate your collection using this command (inside the migration tool folder):

Migrator validate /collection:http://localhost:8080/tfs/DefaultCollection

The validation command will print information about this process. This step is very important because it lets you address any validation errors before running the definitive migration command.

2. Review the logs and work on the errors shown in the previous step. Once you run the validation command, a /logs folder will be created inside the migration tool folder, named after the collection, like this:

Inside the folder with the project name you’ll find a folder named with the date of your validation or migration run; it contains a structure like this

3. Generate the import file. This is a JSON file with all the information about your collection (we will edit this file in further steps). Let’s run this command inside the migration tool folder:

.\Migrator.exe prepare /collection:http://MyServer:8080/MyCollection /tenantDomainName:my.contoso.com /Region:CUS

After running the command, browse to the output folder and you should find a file with the .json extension, like this:

What I did was move every generated file to one location so I could run the import once everything was set; my ImportFiles folder contained all these files:

Note: every collection has its own JSON file, and each will become an Organization on the DevOps Online platform.
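If you have collected several of these files in one folder, a quick way to review the collection-to-organization mapping is to read the Target.Name field from each one. This is a hypothetical helper of my own; the folder layout is the ImportFiles arrangement described above:

```python
import json
from pathlib import Path

def planned_organizations(folder):
    """Map each import .json filename to the organization it will create."""
    mapping = {}
    for spec in Path(folder).glob("*.json"):
        data = json.loads(spec.read_text())
        mapping[spec.name] = data["Target"]["Name"]
    return mapping
```

Running it over the ImportFiles folder prints one organization name per collection file, which is handy for spotting a duplicate or misnamed organization before queuing anything.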

4. Generate the DACPAC file; this will extract all data and tables from the database, one DACPAC per collection. The command looks like this:

SqlPackage /sourceconnectionstring:"Data Source=MYSERVER;Initial Catalog=MyProjectDB;Integrated Security=True" /targetFile:C:\DACPACK\MyProjectDB.dacpac /action:extract /p:ExtractAllTableData=true /p:IgnoreUserLoginMappings=true /p:IgnorePermissions=true /p:Storage=Memory
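If you have several collections, you end up typing that command once per database. A small sketch like the following can build the argument list for each one; the server, database, and output folder names are placeholders, and actually running it still requires SqlPackage installed on the machine:

```python
def sqlpackage_args(server, database, out_dir):
    """Build the SqlPackage extract arguments for one collection database.

    Passed as a list (e.g. to subprocess.run), the connection string needs
    no surrounding quotes, unlike the shell command above.
    """
    return [
        "SqlPackage",
        f"/sourceconnectionstring:Data Source={server};"
        f"Initial Catalog={database};Integrated Security=True",
        f"/targetFile:{out_dir}\\{database}.dacpac",
        "/action:extract",
        "/p:ExtractAllTableData=true",
        "/p:IgnoreUserLoginMappings=true",
        "/p:IgnorePermissions=true",
        "/p:Storage=Memory",
    ]
```

You would then loop over your collection databases and hand each list to `subprocess.run(...)`.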

Once finished, I uploaded the generated .dacpac file to my blob container using Microsoft Azure Storage Explorer: I right-clicked the file, selected “Generate Shared Access Signature,” and copied the full link.

5. Once the DACPAC file is uploaded and you have the Shared Access Signature link copied, open the .json file for your current collection again and let’s edit it:

{
  "Source": {
    "Location": "THE COPIED SAS link goes here",
    "Files": {
      "Dacpac": "MyProjectDB.dacpac"
    }
  },
  "Target": {
    "Name": "MyOrganizationName (Must not exist)"
  },
  "Properties": {
    "ImportType": "ProductionRun"
  },
  "ValidationData": {
    "SourceCollectionId": "THIS VALUE WILL BE AUTOGENERATED",
    "DataImportCollectionId": "THIS VALUE WILL BE AUTOGENERATED",
    "ServicesToInclude": "Team Foundation Server;Release Management;Package Management",
    "ServicesToExclude": "Analytics",
    "ActiveUserCount": 35,
    "TenantId": "THIS VALUE WILL BE AUTOGENERATED",
    "TfsVersion": "Dev17.M153.3",
    "CommandExecutionCount": 4149,
    "CommandExecutionTime": 21.182599294000003,
    "DatabaseTotalSize": 2999,
    "DatabaseBlobSize": 2567,
    "DatabaseTableSize": 432,
    "DatabaseLargestTableSize": 2701,
    "TfsMigratorVersion": "17.153.29207.2",
    "Region": "CUS",
    "ValidationChecksumVersion": 1,
    "ValidationChecksum": "THIS VALUE WILL BE AUTOGENERATED"
  },
  "Identities": [
    "THIS VALUE WILL BE AUTOGENERATED"

  ]
}
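The hand edits above can also be scripted. Below is a hypothetical sketch, assuming the spec file that `Migrator.exe prepare` generated: it fills in the three fields you would otherwise edit by hand (the SAS link, the DACPAC filename, and the new organization name) and writes the file back; the function name is my own:

```python
import json

def fill_import_spec(path, sas_url, dacpac_name, org_name):
    """Fill the hand-edited fields of a prepare-generated import spec."""
    with open(path) as f:
        spec = json.load(f)
    spec["Source"]["Location"] = sas_url
    spec["Source"]["Files"]["Dacpac"] = dacpac_name
    spec["Target"]["Name"] = org_name  # must not already exist online
    with open(path, "w") as f:
        json.dump(spec, f, indent=2)
    return spec
```

All the autogenerated values (checksums, collection IDs, sizes) are left untouched, which is the point: only the three fields the process asks you to supply change.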

6. Once you have edited your file with the needed values (basically the .dacpac filename, the organization name, and the SAS file URL), you are ready to queue the import operation (this will finally upload the data and create the organization with your projects inside). For that purpose, run the following command:

.\Migrator.exe import /importFile:C:\New\ImportFiles\myproject.json

After you run the previous command, it will tell you that your collection is queued for import. After a couple of minutes or hours, you will see the new organization and projects in full in the Azure DevOps portal. Take into account that this service has a daily limit, as you can see here:

The tenant queue has a limit

Closing the chapter

Finally, I would like to thank the Microsoft Azure team for the Caribbean and Latam; they have such impressive professionals there. Special mention to Anthony Rugama for helping me a lot in this process. I was really stuck with a lot of work to do, and after a couple of calls we finally arrived at a solution.

To my dear readers: I hope you can migrate all of your collections without any problem. If you need some help, I’m here; go to the contact section of this blog or reach me on my social networks.

walalm

2 Replies to “Migrate Team Foundation Server 2015 to Azure DevOps Online”
