Migrating legacy UWP projects to the MSBuild-based project system

When Microsoft decided to adopt MSBuild on the .NET Core platform, project.json was not dropped immediately; it stayed around until the first RTM of the new toolchain arrived. .NET development on the Universal Windows Platform leverages .NET Core too, but the deprecation there has been significantly slower than on other .NET Core platforms for historical reasons: UWP uses project.json for package management and MSBuild for builds.

With the Visual Studio 2017 April update, Microsoft finally moved new UWP projects to a fully MSBuild-based project system. But our projects, created in early 2015, didn't get the automatic migration we expected. Hence we decided to migrate them manually, for additional benefits like better toolchain stability and more advanced customization.

Reminder: do not attempt to use the "dotnet migrate" CLI command; it won't work for UWP projects.

Migration Prerequisites

  • Notify all your team members. Make sure everyone has Visual Studio 2017 with the April update installed.
  • If you have a continuous integration environment configured, make sure the build agents have NuGet 4.1 or higher installed (3.5 or 4.0 won't work).
  • Lock the VCS during migration to prevent additional incidents. (We're using TFVC for source management, so that's easy.)

Migration

  • Clean up all projects (including the bin and obj directories).
  • Iterate over all project directories.
  • Find the C# project file and open it with your favorite editor.
  • Add the following property group before the item (file) lists in the project file:
<PropertyGroup>
    <RestoreProjectStyle>PackageReference</RestoreProjectStyle>
</PropertyGroup>

Okay, you've completed the first step. Then open your project.json file and migrate all NuGet package references, as in the picture below (a hand-typed example follows the picture).

Package Reference
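
If the screenshot is hard to read, the mapping is mechanical: each entry under "dependencies" in project.json becomes one PackageReference item in the .csproj. As a rough illustration (the package names and versions below are placeholders, not our actual dependencies; copy whatever your project.json pinned):

<ItemGroup>
    <!-- Each "dependencies" entry from project.json becomes one PackageReference item. -->
    <!-- Names/versions here are placeholders for illustration only. -->
    <PackageReference Include="Microsoft.NETCore.UniversalWindowsPlatform" Version="5.3.3" />
    <PackageReference Include="Newtonsoft.Json" Version="10.0.2" />
</ItemGroup>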

Finally, remove project.json and the additional files like project.lock.json, *.nuget.targets and *.nuget.props. (Otherwise you will get lots of warnings that may lead to .NET Native compilation failures.)

Do this for every project. Then open Visual Studio, restore NuGet packages for all projects, build to validate, and submit your changes.

The Windows “Gatekeeper” Internals

Windows 10 Insider Preview build 15046 introduces a Windows-flavored “Gatekeeper”. It is similar to Gatekeeper on macOS, with some minor differences.

First of all, Windows “Gatekeeper” doesn’t block the execution of applications that don’t require installation. I tried to run PuTTY, a popular tool on Windows, and it worked.

Secondly, Windows “Gatekeeper” is based on Microsoft SmartScreen, which means disabling SmartScreen turns it off too. Prior to application execution, SmartScreen sends the file hash and publisher information (including the certificate thumbprint) to Microsoft’s servers; the SmartScreen service then sends back metadata, including the application’s reputation. The response is signed with a specific key that is checked on the client side for message integrity.
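
To make that flow concrete, here is a minimal C# sketch of what the client-side half conceptually looks like: hash the binary, send the hash plus publisher info, and verify the signature on the returned reputation blob. Everything here (method names, the pinned key, the response layout) is my own guess for illustration, not the actual SmartScreen wire format.

using System.IO;
using System.Security.Cryptography;

static class SmartScreenSketch
{
    // Hypothetical: compute the SHA-256 file hash that would be sent to the reputation service.
    public static byte[] HashExecutable(string path)
    {
        using (var sha256 = SHA256.Create())
        using (var stream = File.OpenRead(path))
        {
            return sha256.ComputeHash(stream);
        }
    }

    // Hypothetical: verify the signed reputation response against a pinned service key,
    // mirroring the client-side integrity check described above.
    public static bool VerifyReputationResponse(byte[] responseBody, byte[] signature, RSA pinnedServiceKey)
    {
        return pinnedServiceKey.VerifyData(
            responseBody, signature,
            HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);
    }
}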

Unlike macOS, attempting to start an application from a console (e.g. Command Prompt or PowerShell) will still trigger “Gatekeeper”.

Attempt to start application from PowerShell

The window is web-based. Although you can’t modify the response directly (no one wants to deal with sha256RSA unless the key leaks), you can attach a debugger and have some fun with it.

“Rickrolling” in Windows SmartScreen

Microsoft claims that this feature is opt-in for most Windows SKUs (except Windows 10 Cloud, AFAIK), and that it is not related to UMCI (User-Mode Code Integrity), which is enforced in Windows 10 Cloud.

Deep dive into UnityFS: structure and implementation

Someone asked me if I could extract some images from a popular Chinese mobile game. I accepted the challenge, but things were far more complicated than I expected.

What I knew

  • This game is Unity3D-based.
  • The original assets were encrypted with a known algorithm and key. DISCLAIMER: I will not share details about the encryption.

The story began

I thought I could extract the assets I needed with existing tools (e.g. Disunity), but I was proved wrong. Disunity has been refactored, and the remaining work is still in progress (at least at the moment I am writing this article). Since resource extraction has not been implemented yet, Disunity couldn’t be my choice.

Then I turned to a tool called Unity Assets Bundle Extractor. It did a great job extracting the resources I needed through its GUI. However, acquiring thousands of texture assets from 2000+ isolated files is not an easy job. I tried the command-line support but failed (maybe I was too stupid).

Luckily this toolkit provides some APIs and documentation. Since it was compiled with Microsoft Visual C++ 2010, I was unable to use it directly (the C++ ABI changes with every MSVC release), and I was too lazy to write a C wrapper for P/Invoke. But those C++ header files pointed to a perfect solution – parse the format myself and implement my own UnityFS parser/reader.

Special thanks to the UABE project – without those generous headers, I would not have been able to implement my own parser or compose this article.

Wow so many projects

UnityFS

UnityFS is a new asset bundle format introduced in Unity 5. I am not a Unity3D developer, and I honestly don’t know why Unity introduced a new bundle format. But anyway, let’s analyze it.

Things you need to know

  • UnityFS is just a bundle of several Unity assets. Each asset contains a collection of serialized Unity objects (e.g. 2D textures, text resources, scene objects, etc.).
  • UnityFS follows the standard Unity file header structure. Let’s call it AssetsBundleHeader06.
  • You have to parse the asset files in order to extract what you need. There’s a bunch of documentation about this; look into the old Disunity source code for some ideas.
UnityFS Header Structure

So the header goes like this. There’s a DWORD flags field that matters – it contains some critical information required for decompression and directory parsing. The rules go like this (a small sketch follows the list):

  • (Flags & 0x3F) is the compression mode: 0 means no compression, 1 means LZMA, and 2/3 mean LZ4/LZ4HC.
  • (Flags & 0x40) indicates whether the bundle has directory info.
  • (Flags & 0x80) indicates whether the block and directory list is at the end of the bundle file.
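
In C#, that flag handling can be sketched like this, assuming the flags value has already been read from the header (the names are mine, not Unity’s):

enum CompressionMode : uint
{
    None = 0,
    Lzma = 1,
    Lz4 = 2,
    Lz4Hc = 3
}

static class BundleFlags
{
    public const uint CompressionMask = 0x3F;   // low 6 bits select the compressor
    public const uint HasDirectoryInfo = 0x40;  // bundle carries a directory listing
    public const uint BlockInfoAtEnd = 0x80;    // block/directory list lives at the end of the file

    public static CompressionMode GetCompression(uint flags) =>
        (CompressionMode)(flags & CompressionMask);

    public static bool HasDirectory(uint flags) => (flags & HasDirectoryInfo) != 0;

    public static bool InfoAtEnd(uint flags) => (flags & BlockInfoAtEnd) != 0;
}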

C# provides a good BinaryReader that makes things a bit easier, but it can be improved with better null-terminated string and big-endian support. Be careful with endianness: Unity utilizes both big-endian and little-endian values in a single file, and personally I don’t get why. For the sake of convenience, I extended the original BinaryReader to support these. The length of each data type matters – but that’s basic stuff for CS students.

Code snippet of my simple parser
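
I can’t paste the original screenshot here, so below is a minimal sketch of the kind of extension I mean: a BinaryReader subclass that adds big-endian and null-terminated-string reads (the method names are mine):

using System;
using System.Collections.Generic;
using System.IO;
using System.Text;

class EndianBinaryReader : BinaryReader
{
    public EndianBinaryReader(Stream input) : base(input, Encoding.UTF8) { }

    // Big-endian helpers: read the raw bytes, then flip them on little-endian hosts.
    public ushort ReadUInt16BE() => BitConverter.ToUInt16(ReadBytesBE(2), 0);
    public uint ReadUInt32BE() => BitConverter.ToUInt32(ReadBytesBE(4), 0);
    public ulong ReadUInt64BE() => BitConverter.ToUInt64(ReadBytesBE(8), 0);

    private byte[] ReadBytesBE(int count)
    {
        var bytes = ReadBytes(count);
        if (BitConverter.IsLittleEndian)
            Array.Reverse(bytes);
        return bytes;
    }

    // Null-terminated (C-style) string, as used for file names in the directory list.
    public string ReadNullTerminatedString()
    {
        var buffer = new List<byte>();
        byte b;
        while ((b = ReadByte()) != 0)
            buffer.Add(b);
        return Encoding.UTF8.GetString(buffer.ToArray());
    }
}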

Compression

UnityFS uses optional block-based compression for streaming (you can read a specific bundle without downloading the whole file). Both LZMA and LZ4* (LZ4HC, etc.) are supported. Unity’s proprietary parser and Disunity respect this design, but I just wanted the bundled files, so I decided to read all the blocks at once and decompress them into a single memory stream.

The decompressed size you get should match what the block information claims. If not, something must have gone wrong.

You can implement your own block-based reader – but my time budget didn’t allow me to do that.
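
A rough sketch of the “decompress everything into one MemoryStream” approach looks like this. The LZMA/LZ4 calls are stubs you would back with whatever library you prefer, and treating the low bits of each block’s flag as the compression mode is my assumption (the same scheme as the header flags), not something stated above.

using System;
using System.IO;

static class BlockDecompressor
{
    // blocks: (compressedSize, decompressedSize, flags) triples parsed from the block list.
    public static MemoryStream DecompressAll(Stream bundle,
        (uint compressed, uint decompressed, ushort flags)[] blocks)
    {
        var output = new MemoryStream();
        foreach (var block in blocks)
        {
            var compressed = new byte[block.compressed];
            ReadExactly(bundle, compressed);

            // Assumption: low bits of the per-block flag select the compressor.
            int mode = block.flags & 0x3F;
            byte[] plain;
            if (mode == 0)
                plain = compressed;                                     // stored, no compression
            else if (mode == 1)
                plain = DecompressLzma(compressed, block.decompressed); // plug in an LZMA library
            else if (mode == 2 || mode == 3)
                plain = DecompressLz4(compressed, block.decompressed);  // plug in an LZ4/LZ4HC library
            else
                throw new InvalidDataException("Unknown compression mode");

            if (plain.Length != block.decompressed)
                throw new InvalidDataException("Decompressed size mismatch");

            output.Write(plain, 0, plain.Length);
        }
        output.Position = 0;
        return output;
    }

    private static void ReadExactly(Stream s, byte[] buffer)
    {
        int offset = 0;
        while (offset < buffer.Length)
        {
            int n = s.Read(buffer, offset, buffer.Length - offset);
            if (n == 0) throw new EndOfStreamException();
            offset += n;
        }
    }

    // Stubs – back these with the decompression library of your choice.
    private static byte[] DecompressLzma(byte[] data, uint expectedSize) =>
        throw new NotImplementedException();
    private static byte[] DecompressLz4(byte[] data, uint expectedSize) =>
        throw new NotImplementedException();
}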

There we go…block and file information!

Following an unknown 16-byte block, there’s a big-endian UInt32 value that represents the block count in a single package. Each block entry contains a big-endian UInt32 decompressed size, a big-endian UInt32 compressed size, and a flag that we might not be interested in.

Then a BE UInt32 value represents the file count in a single package. Each file entry contains the file offset we need (BE UInt64), the decompressed size (BE UInt64), a BE UInt32 flag, and a null-terminated string for the file name. A sketch of parsing both lists follows.
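
Putting the two lists together, and reusing the extended reader from above, the directory parsing could look roughly like this (the exact width of the per-block flag is my assumption from the UABE headers; the article only calls it “a flag”):

struct BlockInfo
{
    public uint DecompressedSize;
    public uint CompressedSize;
    public ushort Flags;
}

struct FileEntry
{
    public ulong Offset;
    public ulong DecompressedSize;
    public uint Flags;
    public string Name;
}

static class DirectoryParser
{
    public static (BlockInfo[] blocks, FileEntry[] files) Parse(EndianBinaryReader reader)
    {
        reader.ReadBytes(16);                         // unknown 16-byte block

        var blocks = new BlockInfo[reader.ReadUInt32BE()];
        for (int i = 0; i < blocks.Length; i++)
        {
            blocks[i].DecompressedSize = reader.ReadUInt32BE();
            blocks[i].CompressedSize = reader.ReadUInt32BE();
            blocks[i].Flags = reader.ReadUInt16BE();  // width assumed
        }

        var files = new FileEntry[reader.ReadUInt32BE()];
        for (int i = 0; i < files.Length; i++)
        {
            files[i].Offset = reader.ReadUInt64BE();
            files[i].DecompressedSize = reader.ReadUInt64BE();
            files[i].Flags = reader.ReadUInt32BE();
            files[i].Name = reader.ReadNullTerminatedString();
        }

        return (blocks, files);
    }
}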

Parse your assets now

With the information we retrieved, we were able to extract the raw asset files from a UnityFS bundle. Then what you need to do is search the Internet for ideas on extracting objects (text resources, 2D textures, etc.) from Unity assets. Good luck exploring!

Conclusion

In this article, we discussed the structure and parsing of the UnityFS resource bundle format. For more information about UnityFS and Unity asset files, please look into the projects I mentioned in this article.

OpenConnect and Active Directory Certificate Services

Previously, my OpenConnect server deployment used PAM authentication (via Kerberos/Active Directory) as the primary authentication method. It works, but it’s complicated (a password prompt every time). I enabled certificate authentication today and it works fine.

Things to note

  • Enable certificate authentication as an alternative authentication method (up to you, but some people in our domain don’t use certificate-capable devices)
  • Use the “Smartcard Logon” certificate template with subject information in “Common Name” style
  • Set OID 2.5.4.3 (the Common Name) as the user identifier in the OpenConnect server configuration (see the snippet after this list)
  • Provision the root CA, CRL and OCSP (CRL and OCSP are optional, but essential as part of best practice)
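
For reference, the relevant bits of an ocserv configuration look roughly like this. The option names are from memory and the paths are placeholders, so double-check them against your ocserv manual:

# Keep PAM as the primary method, with certificate auth as an alternative
auth = "pam"
enable-auth = "certificate"

# Trust the AD CS root CA and map the client certificate's CN (OID 2.5.4.3) to the username
ca-cert = /etc/ocserv/ad-cs-root.pem
cert-user-oid = 2.5.4.3

# Revocation checking (optional but recommended)
crl = /etc/ocserv/ad-cs.crl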

Something else

I provisioned the same certificate on my YubiKey PIV and a TPM-based virtual smart card, but neither works with the AnyConnect client. A certificate in the user certificate store is fine.

AnyConnect client refused to use smartcard

Friendzoned and the end

I hesitated to face the fact, but I finally did, bravely. At least I have experienced it (though maybe not for the last time) in my life.

I should have learned that it is extremely hard to maintain a cross-country relationship, given the lack of physical presence.

Take care.

Changing Microsoft Account alias is painful

Access denied for my new alias

A short update: the recent MSDN subscription migration broke my migrated account alias too. After contacting Microsoft support, I removed the legacy alias from my account, created a new Microsoft Account using the legacy alias, and restored my access to the new Visual Studio Subscription portal. In the same way, I removed the legacy Microsoft Account from my Azure AD, linked the two separate Microsoft Accounts (legacy and new alias), and resolved my issue accessing Visual Studio Team Services.

Such inconsistencies happen all the time, and remove & re-add is usually the universal solution in most cases.

After using the legacy alias for almost 7 years, I decided to replace my Microsoft Account alias with a new Outlook.com email address, due to increasing security concerns about Netease Mail (my previous email service provider). Though I changed the alternate recovery email to my own domain email after several major security incidents, it still looks weird to have an @163.com email alias linked to my Microsoft Account.

Okay, I changed my alias the day before yesterday. It works. I didn’t delete the old one because I wanted to maintain some sort of backward compatibility. It works across my personal devices without any pain.

The annoying things came a few days later.

Let’s talk about SSO/Federated Logon

Before talking about the terrible things that happened after switching to the new alias, let’s talk about federated logon. Technically speaking, federated login is an authentication workflow based on trust relationships. Suppose Identity Provider A and Application B have successfully established a two-way trust relationship through service provisioning. When a new user login attempt occurs, B redirects the authentication challenge to Identity Provider A, along with the necessary metadata: a secure token ID, a timestamp, a nonce, and finally something that validates the request, for example a digital signature or even token encryption. Since Application B has its own way to understand Identity Provider A’s payload (and vice versa), the communication is secured.

When Identity Provider A completes the user’s authentication challenges (password, client certificate, fingerprint, etc.), it signs (and maybe encrypts) the authenticated user’s claims (user ID, user name and so on) and posts them to B. The WS-Federation diagram below represents such a workflow. OAuth and OpenID Connect have similar workflows with slight differences (multiple modes for retrieving user claims).

WS-Fed workflow from docs.oasis-open.org

Microsoft Azure, Visual Studio Team Services and most Microsoft services use OpenID Connect. Believe it or not, you use Federated Logon and SSO every day.

Microsoft Account and Azure AD Account

They are two separate systems, though they have some things in common. Each Microsoft Account has a CID, a unique identifier in the Microsoft Account system. All Microsoft consumer services use the CID to recognize your identity. For example, your Outlook.com email account is identified by your CID.

Azure AD accounts are handled differently. Each Azure Active Directory has a tenant ID that identifies it in the AAD system. Each AAD tenant contains objects: users, groups, computers, trust relationships… and more. Each AAD user has a unique alias within a specific AAD tenant, so the coexistence of 2ea6c0b4-cc49-42b8-9f1b-3f4aa653c719\imbushuo and b5093785-af31-4819-bf75-728d4474769c\imbushuo is possible.

Microsoft Accounts can be linked into Azure AD too: during the linking procedure, a new external user backed by the Microsoft Account is created in the AAD tenant, so you may have 2ea6c0b4-cc49-42b8-9f1b-3f4aa653c719\bill@live.com. When Bill wants to access resources in his tenant’s AAD, he types bill@live.com into the AAD Federation Service (“Work or school account”), a single sign-on portal for Azure AD. AAD FS then redirects the authentication challenge to the Microsoft Account login portal. If Bill is authenticated by the Microsoft Account login portal, he is redirected back to AAD FS with claims provided by Microsoft Account. Finally, AAD FS tells the application that the user is Bill.

My blog uses such a login mechanism too. Take a look at my management portal to get some idea of it if this doesn’t make sense yet.

But…there’s no CID in Azure AD

But there’s something that works just like the CID: the user alias. Another mapping! The Microsoft Account is mapped to an Azure AD account, and the application then uses the Azure AD account identity. After changing my alias in Microsoft Account, my Azure AD user alias remained the same, so I can still log in to my blog management portal with the same identity:

Logged in with the same ID

Do you remember that a federated logon can carry multiple attributes at once? So here’s the problem. My team’s source control service, Visual Studio Team Services, seems to use the email address (which changes after rotating the primary Microsoft Account alias) to identify the user. After logging in with my organization account, I found that my email address there hadn’t changed after the rotation. To make the whole thing worse, I am the account creator, hence I cannot remove my Microsoft Account from VSTS to address the issue.

In short, the primary alias rotation didn’t change my user alias in Azure AD, but applications behave differently depending on how they deal with user claims.
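
To illustrate the difference with a toy example (this is my own sketch, not how VSTS is actually implemented): an application that keys its user records on an immutable identifier survives an alias rotation, while one that keys on the email claim does not.

// Claims as they might arrive from the identity provider (field names are made up).
class UserClaims
{
    public string ObjectId { get; set; }    // immutable AAD object ID
    public string Email { get; set; }       // rotates when the primary alias changes
    public string DisplayName { get; set; }
}

static class UserStore
{
    // Robust: the record follows the user across alias rotations.
    public static string KeyByObjectId(UserClaims claims) => claims.ObjectId;

    // Fragile: after an alias rotation this produces a "new" user and orphans the old record.
    public static string KeyByEmail(UserClaims claims) => claims.Email;
}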


Seems that I have to change my alias back. Yuck.