The case of UEFI for Windows on ARM, and comparison with LK/ABoot

Nights before trips are always boring, and I decided to draft some words to pass the time. So we have Windows 10 on ARM running on the Dragonboard 410c and the Lumia 950 XL (article in Chinese, sorry). It will be helpful to write down some firmware-related information about these platform bring-ups for future reference. Meanwhile, a comparison with Little Kernel (LK), the common Android bootloader (well, Qualcomm says so), should provide useful information for the Android on Lumia project.

I recommend you read this article if you are not familiar with UEFI.

Assumptions, assumptions

Compared to Linux, the Windows kernel assumes that its platform firmware and bootloader (aka the Windows Boot Manager) have prepared the basic environment for a successful kernel initialization. If certain components are not initialized, bugchecks may occur. Even if the system launches successfully, it may exhibit unexpected behavior (weird things). An official document explains this in detail.

Little Kernel initializes basic hardware too (at the very least you need serial for debugging). Certain peripherals, including clocks, regulators, and USB, are initialized as well for application purposes (e.g. Fastboot). Beyond that, it initializes as few peripherals as possible. Sometimes even the panel is not brought up (I’ve seen such a case on an Android phone).

In short, you have to do more for a successful Windows bring-up:

  • If you know certain components are already in a usable state, skip their initialization procedures. For example, on the Lumia 950 XL, our UEFI implementation does not need to initialize USB since our bootstrapper (Qualcomm UEFI) has already done so.
  • If your platform has PCIe components, clock them up, set up regulators and mappings, etc.
  • Initialize at least one debug resource described in your DBG2 table (if applicable; likely on all ARM platforms).
  • Bring up the panel, set basic display parameters, and pass a framebuffer pointer to Windows.

So how about Linux? If your Linux platform uses DT instead of ACPI, you are likely not required to do most of the things Windows requires. On Qualcomm platforms, the Linux kernel will clock up the PCIe cores and set up regulators and mappings to bring them into a usable state. If your platform uses standard ACPI and the platform drivers do not perform additional initialization, initialize these components in firmware.

Fill the hole

Both UEFI with ACPI and LK perform fix-up tasks before transferring control to the kernel. On Qualcomm platforms, chipset metadata (revision, foundry ID, etc.) is filled into the DSDT; certain logic in the DSDT depends on it. A typical Android device ships with a large DT covering multiple variants. LK selects the best fit using the chipset ID/PMIC ID/board ID, then fills in some memory region information for kernel use.

The ACPI tables in the firmware for Windows 10 on ARM are pre-patched, so I did not have to implement additional fix-up logic.

Multi-processor Startup, Again

Why am I discussing this again? Because it is important.

Little Kernel (and likely other Android bootloaders) only uses a single processor during its lifecycle (a notable exception is the Raspberry Pi, which uses a spin table, except on the 3+). When it transfers control to Linux, Linux brings the other cores out of reset and makes them available for use.

Windows platforms that implement the ACPI Multi-Processor Parking Protocol behave differently. Although the firmware itself runs on a single core, the other CPU cores are brought out of reset and instructed to run a special piece of code. The code flow looks like this:

parking:
    Wait for an interrupt.
    Am I the processor being woken up?
    If yes, go to the address the OS told me.
    If not, go back to parking.

(Interrupt acknowledgment and memory barriers ignored. Sorry, I don’t want to write assembly at 11 PM.)

Because different platforms handle core startup differently (on Qualcomm platforms, TrustZone is involved), booting the Linux kernel and starting cores the usual Linux way on top of a UEFI firmware that implements this protocol may fail. Someone told me he was unable to bring up the other three cores on a Lumia 640. That is reasonable: LK on recent Lumia phones is launched via a special UEFI application in Windows Boot Application form, and Qualcomm UEFI has already put the other three cores into a running state (parked in WFI). Neither LK nor Linux is aware of that (both assume the cores are still in reset), so core startup fails.

Since it is not possible to ditch Qualcomm UEFI (unlike the exploit for first-generation Lumia WP8 devices), we have to conform to the parking protocol in AArch32 mode (you have PSCI on AArch64 SoCs):

  • Ignore the other cores (unicore is the best).
  • Implement the parking protocol for systems that do not support it (not too hard). Linux already has support for the protocol; you just have to enable it.
  • Go AArch64 and use PSCI (remember to use HVC mode on 8992/8994).

 

Good night. (And to my girlfriend: if you see this article, sorry that I said “good night” too early.)

 

Migrate legacy UWP project system to MSBuild-based

When Microsoft decided to adopt MSBuild for the .NET Core platform, project.json was not dropped immediately; it stayed until the first toolchain RTM arrived. .NET development on the Universal Windows Platform leverages .NET Core too, but the deprecation has progressed significantly more slowly than on other .NET Core platforms for historical reasons: UWP uses project.json for package management and MSBuild for builds.

In the Visual Studio 2017 April update, Microsoft finally migrated new UWP projects to the fully MSBuild-based project system. But our projects, created in early 2015, did not get migrated automatically as expected. Hence we decided to migrate them manually for additional benefits such as better toolchain stability and advanced customization features.

Reminder: do not attempt to use the “dotnet migrate” CLI command; it won’t work for UWP projects.

Migration Prerequisites

  • Notify all your team members. Make sure everyone has Visual Studio 2017 with the April update installed.
  • If you have a continuous integration environment configured, make sure the build agents have NuGet 4.1 or higher installed (3.5 or 4.0 won’t work).
  • Lock the VCS during the migration to prevent additional incidents. (We use TFVC for source control, so this is easy.)

Migration

  • Clean up all projects (including bin and obj directories)
  • Iterate over all project directories
  • Find the C# project file and open it with your favorite editor.
  • Add the following property group before the project’s file lists:
<PropertyGroup>
    <RestoreProjectStyle>PackageReference</RestoreProjectStyle>
</PropertyGroup>

Okay, you’ve completed the first step. Next, open your project.json file and migrate all NuGet package references as shown in the picture and the sketch below.

Package Reference
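In essence, every entry under “dependencies” in project.json becomes a PackageReference item in the .csproj. A minimal sketch (the package names and versions below are placeholders, not necessarily your actual dependencies):

project.json (before):

{
    "dependencies": {
        "Microsoft.NETCore.UniversalWindowsPlatform": "5.3.3",
        "Newtonsoft.Json": "10.0.2"
    }
}

.csproj (after):

<ItemGroup>
    <PackageReference Include="Microsoft.NETCore.UniversalWindowsPlatform" Version="5.3.3" />
    <PackageReference Include="Newtonsoft.Json" Version="10.0.2" />
</ItemGroup>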

Finally, remove project.json and related files such as project.lock.json, *.nuget.targets, and *.nuget.props. (Otherwise you will get lots of warnings that may cause .NET Native compilation to fail.)

Do this for every project. Then open Visual Studio, restore NuGet packages for all projects, build to validate, and submit the changes.

Deep dive into UnityFS: structure and implementation

Someone asked me if I could extract some images from a popular Chinese mobile game. I accepted the challenge, but things were far more complicated than I expected.

What I knew

  • This game is Unity3D-based.
  • Original assets were encrypted with a known algorithm and key. DISCLAIMER: I will not tell you any details about the encryption.

The story began

I thought I could extract the assets I needed with existing tools (e.g. Disunity), but I was proven wrong. Disunity is being refactored, and the remaining work is still in progress (at least at the moment I write this article). Since resource extraction had not been implemented yet, Disunity couldn’t be my choice.

Then I turned to a tool called Unity Assets Bundle Extractor. It did a great job of extracting the resources I needed through its GUI. However, acquiring thousands of texture assets from 2000+ isolated files is not an easy job. I tried its command-line support but failed (maybe I was just too stupid).

Luckily this toolkit provides some APIs and documentation. Since it was compiled with Microsoft Visual C++ 2010, I was unable to use it directly (the C++ ABI changes with every MSVC release), and I was too lazy to write a C wrapper for P/Invoke. But the C++ header files pointed to a perfect solution – parse the file format myself and implement my own UnityFS parser/reader.

Special thanks to the UABE project – without these generous headers, I would not have been able to implement my own parser and compose this article.

Wow so many projects

UnityFS

UnityFS is a new asset bundle format introduced in Unity 5. I am not a Unity3D developer, and I honestly don’t know why Unity introduced a new bundle format. But anyway, let’s analyze it.

Things you need to know

  • UnityFS is just a bundle of several Unity assets. Each asset contains a collection of serialized Unity objects (e.g. 2D textures, text resources, scene objects, etc.).
  • UnityFS follows a standard Unity file header structure. Let’s call it AssetsBundleHeader06.
  • You have to parse the asset files in order to extract what you need. There’s a bunch of documentation about this; look into the old Disunity source code for some ideas.
UnityFS Header Structure

So the header goes like this. There’s a DWORD flags field that matters – it contains some critical information required for decompression and directory parsing. The rules go like this (a small decoding sketch follows the list):

  • (Flags & 0x3F) is the compression mode: 0 means no compression, 1 means LZMA, and 2/3 mean LZ4/LZ4HC.
  • (Flags & 0x40) says whether the bundle has directory info.
  • (Flags & 0x80) says whether the block and directory list is at the end of the bundle file.
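For example, decoding the flags could look like this (a minimal C# sketch; the enum and helper names are mine, not from any of the projects mentioned above):

[Flags]
enum BundleFlags : uint
{
    CompressionMask  = 0x3F, // 0 = none, 1 = LZMA, 2/3 = LZ4/LZ4HC
    HasDirectoryInfo = 0x40,
    BlockInfoAtEnd   = 0x80
}

static class BundleFlagsHelper
{
    // The lowest six bits select the compression algorithm.
    public static uint CompressionMode(uint flags) => flags & (uint)BundleFlags.CompressionMask;

    public static bool HasDirectoryInfo(uint flags) => (flags & (uint)BundleFlags.HasDirectoryInfo) != 0;

    public static bool BlockAndDirectoryListAtEnd(uint flags) => (flags & (uint)BundleFlags.BlockInfoAtEnd) != 0;
}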

C# provides a decent BinaryReader that makes things a bit easier, but it can be improved with better null-terminated string and big-endian support. Be careful with endianness: Unity uses both big endian and little endian in a single file, and personally I don’t get why. For convenience, I extended the original BinaryReader with this support. The length of each data type matters – but that’s basic stuff for CS students.

Code snippet of my simple parser
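The screenshot isn’t reproduced here, but the extension is essentially a thin wrapper over BinaryReader. A minimal sketch of what such a reader might look like (type and method names are mine, assuming the code runs on a little-endian host):

using System;
using System.Collections.Generic;
using System.IO;
using System.Text;

// Minimal BinaryReader extension with big-endian and null-terminated string support.
class EndianBinaryReader : BinaryReader
{
    public EndianBinaryReader(Stream input) : base(input, Encoding.UTF8) { }

    public ushort ReadUInt16BE() { var b = ReadBytes(2); Array.Reverse(b); return BitConverter.ToUInt16(b, 0); }

    public uint ReadUInt32BE() { var b = ReadBytes(4); Array.Reverse(b); return BitConverter.ToUInt32(b, 0); }

    public ulong ReadUInt64BE() { var b = ReadBytes(8); Array.Reverse(b); return BitConverter.ToUInt64(b, 0); }

    // Reads UTF-8 bytes until the terminating zero byte.
    public string ReadNullTerminatedString()
    {
        var bytes = new List<byte>();
        byte b;
        while ((b = ReadByte()) != 0)
            bytes.Add(b);
        return Encoding.UTF8.GetString(bytes.ToArray());
    }
}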

Compression

UnityFS uses optional block-based compression for streaming (you can read a specific bundle without downloading the whole file). Both LZMA and LZ4* (LZ4HC, etc.) are supported. Unity’s proprietary parser and Disunity respect this design, but I just wanted the bundled files, so I decided to read all blocks at once and decompress them into a single memory stream.

The decompressed size should match what you get. If not, something must have gone wrong.

You can implement your own block-based reader – but my time budget didn’t allow me to do this.
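The “read everything at once” approach boils down to something like this (a sketch only; the decompress delegate stands in for whatever LZMA/LZ4 library you use, and the block sizes come from the block list described in the next section):

using System;
using System.IO;

static class BlockConcatenator
{
    // Reads every data block in order and decompresses it into one MemoryStream.
    // 'decompress' takes the compressed bytes plus the expected decompressed size
    // and returns the raw bytes.
    public static MemoryStream ReadAllBlocks(
        BinaryReader bundle,
        (uint decompressedSize, uint compressedSize)[] blocks,
        uint compressionMode,                    // (headerFlags & 0x3F)
        Func<byte[], int, byte[]> decompress)
    {
        var output = new MemoryStream();
        foreach (var block in blocks)
        {
            var compressed = bundle.ReadBytes((int)block.compressedSize);

            // Mode 0 means the block is stored uncompressed.
            var data = compressionMode == 0
                ? compressed
                : decompress(compressed, (int)block.decompressedSize);

            if (data.Length != block.decompressedSize)
                throw new InvalidDataException("Decompressed size mismatch – something went wrong.");

            output.Write(data, 0, data.Length);
        }

        output.Position = 0;
        return output;
    }
}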

There we go…block and file information!

Following an unknown 16-byte block, there’s a big-endian UInt32 value that represents the block count in a single package. Each block-info entry contains a big-endian UInt32 decompressed size, a big-endian UInt32 compressed size, and a flag that we might not be interested in.

Then a BE UInt32 value represents the file count in a single package. Each file-info entry contains the file offset we need (BE UInt64), the decompressed size (BE UInt64), a BE UInt32 flag, and a null-terminated string holding the file name.
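Put together with the extended reader from earlier, parsing that directory might look roughly like this (a sketch; the struct names are mine, and I’m assuming the per-block flag is a 16-bit value):

// Sketch of parsing the block list and the file list; field order follows the text above.
struct BlockInfo { public uint DecompressedSize; public uint CompressedSize; public ushort Flags; }
struct NodeInfo  { public ulong Offset; public ulong Size; public uint Flags; public string Name; }

static class BundleDirectoryParser
{
    public static (BlockInfo[] blocks, NodeInfo[] files) Parse(EndianBinaryReader reader)
    {
        reader.ReadBytes(16);                         // the unknown 16-byte block

        var blocks = new BlockInfo[reader.ReadUInt32BE()];
        for (int i = 0; i < blocks.Length; i++)
        {
            blocks[i].DecompressedSize = reader.ReadUInt32BE();
            blocks[i].CompressedSize   = reader.ReadUInt32BE();
            blocks[i].Flags            = reader.ReadUInt16BE();
        }

        var files = new NodeInfo[reader.ReadUInt32BE()];
        for (int i = 0; i < files.Length; i++)
        {
            files[i].Offset = reader.ReadUInt64BE();
            files[i].Size   = reader.ReadUInt64BE();
            files[i].Flags  = reader.ReadUInt32BE();
            files[i].Name   = reader.ReadNullTerminatedString();
        }

        return (blocks, files);
    }
}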

Parse your assets now

With the information we retrieved, we were able to extract raw asset files from the UnityFS bundle. Then all you need to do is search the Internet for ideas on extracting objects (text resources, 2D textures, etc.) from Unity assets. Good luck exploring!

Conclusion

In this article, we discussed the structure and parsing of the UnityFS resource bundle format. For more information about UnityFS and Unity asset files, look into the projects I mentioned in this article.

A homemade federation STS: MediaWiki x ASP.NET OWIN Identity

While working on a project related to MediaWiki extension management recently, I ran into a problem: how to securely pass identity credentials to an ASP.NET MVC backend while sharing a single account system.

This article briefly describes how it was done.

Understanding WS-Federation

In ASP.NET OWIN Identity, the most convenient way to implement claims-based identity credentials is WS-Federation. The most typical example of WS-Federation is Active Directory Federation Services, and its workflow can be simplified into the following diagram:

WS-Fed workflow from docs.oasis-open.org

In this article’s scenario, MediaWiki plays the role of the IdP.

Obtaining the Federation Metadata

The federation metadata is an XML file signed with XML-DSig. It contains all the information the SP needs, such as the claim types offered, the public key, and the service endpoints. A simplified federation metadata document looks like this:

<?xml version="1.0"?>
<EntityDescriptor xmlns="urn:oasis:names:tc:SAML:2.0:metadata" ID="_3ef47b02-f5a9-4a32-a48d-3ba56d6b270f" entityID="">
    <ds:Signature xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
  <!-- The signature goes here: enveloped-signature + xml-exc-c14n -->
  <!-- sha256RSA is recommended -->
  <!-- Remember to include the public key -->
    </ds:Signature>
    <RoleDescriptor xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
 xmlns:fed="http://docs.oasis-open.org/wsfed/federation/200706" 
 xsi:type="fed:SecurityTokenServiceType" 
 protocolSupportEnumeration="http://docs.oasis-open.org/wsfed/federation/200706">
        <KeyDescriptor use="signing">
            <KeyInfo xmlns="http://www.w3.org/2000/09/xmldsig#">
                <X509Data>
     <!-- Public key used for STS signing, base64 encoded -->
                    <X509Certificate></X509Certificate>
                </X509Data>
            </KeyInfo>
        </KeyDescriptor>
        <fed:ClaimTypesOffered>
   <!-- The claim types offered; four examples below -->
            <auth:ClaimType xmlns:auth="http://docs.oasis-open.org/wsfed/authorization/200706" Uri="http://schemas.xmlsoap.org/ws/2005/05/identity/claims/upn" Optional="true">
                <auth:DisplayName>UPN</auth:DisplayName>
                <auth:Description>User Principal Name</auth:Description>
            </auth:ClaimType>
            <auth:ClaimType xmlns:auth="http://docs.oasis-open.org/wsfed/authorization/200706" Uri="http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name" Optional="true">
                <auth:DisplayName>User Name</auth:DisplayName>
                <auth:Description>The mutable display name of the user.</auth:Description>
            </auth:ClaimType>
            <auth:ClaimType xmlns:auth="http://docs.oasis-open.org/wsfed/authorization/200706" Uri="http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress" Optional="true">
                <auth:DisplayName>Email</auth:DisplayName>
                <auth:Description>Email address of the user.</auth:Description>
            </auth:ClaimType>
            <auth:ClaimType xmlns:auth="http://docs.oasis-open.org/wsfed/authorization/200706" Uri="http://schemas.microsoft.com/ws/2008/06/identity/claims/groups" Optional="true">
                <auth:DisplayName>Groups</auth:DisplayName>
                <auth:Description>Groups of the user.</auth:Description>
            </auth:ClaimType>
        </fed:ClaimTypesOffered>
  <!-- Service endpoints -->
        <fed:PassiveRequestorEndpoint>
            <EndpointReference xmlns="http://www.w3.org/2005/08/addressing">
                <Address></Address>
            </EndpointReference>
        </fed:PassiveRequestorEndpoint>
        <fed:SecurityTokenServiceEndpoint>
            <EndpointReference xmlns="http://www.w3.org/2005/08/addressing">
                <Address></Address>
            </EndpointReference>
        </fed:SecurityTokenServiceEndpoint>
    </RoleDescriptor>
    <RoleDescriptor xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:fed="http://docs.oasis-open.org/wsfed/federation/200706" xsi:type="fed:SecurityTokenServiceType" protocolSupportEnumeration="http://docs.oasis-open.org/wsfed/federation/200706">
        <KeyDescriptor use="signing">
            <KeyInfo xmlns="http://www.w3.org/2000/09/xmldsig#">
                <X509Data>
     <!-- Public key used for STS signing, base64 encoded -->
                    <X509Certificate></X509Certificate>
                </X509Data>
            </KeyInfo>
        </KeyDescriptor>
  <!-- Service scope -->
  <!-- Make sure the scope matches the audience used later -->
        <TargetScopes>
            <EndpointReference xmlns="http://www.w3.org/2005/08/addressing">
                <Address></Address>
            </EndpointReference>
        </TargetScopes>
  <!-- Service endpoints -->
        <fed:ApplicationServiceEndpoint>
            <EndpointReference xmlns="http://www.w3.org/2005/08/addressing">
                <Address></Address>
            </EndpointReference>
        </fed:ApplicationServiceEndpoint>
        <fed:PassiveRequestorEndpoint>
            <EndpointReference xmlns="http://www.w3.org/2005/08/addressing">
                <Address></Address>
            </EndpointReference>
        </fed:PassiveRequestorEndpoint>
    </RoleDescriptor>
    <IDPSSODescriptor protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol">
        <KeyDescriptor use="signing">
            <KeyInfo xmlns="http://www.w3.org/2000/09/xmldsig#">
                <X509Data>
     <!-- Public key used for STS signing, base64 encoded -->
                    <X509Certificate></X509Certificate>
                </X509Data>
            </KeyInfo>
        </KeyDescriptor>
  <!-- Service endpoints -->
        <SingleLogoutService Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect" Location=""/>
        <SingleLogoutService Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect" Location=""/>
        <SingleLogoutService Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST" Location=""/>
    </IDPSSODescriptor>
</EntityDescriptor>

The STS Service

After parsing the federation metadata, the SP will issue requests. An OWIN-based SP typically sends the following parameters (an example request follows the list):
– wa: the action; either wsignin1.0 (sign-in) or wsignout1.0 (sign-out)
– wctx: the context; simply echo it back
– wp: may be present
– wreply: if specified, the response is returned to this page; if not, the application’s default registration is used
– wtrealm: the application ID
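A typical sign-in redirect therefore looks something like this (the host and special-page path are placeholders for wherever your IdP endpoint lives; the realm matches the audience used later):

https://wiki.example.org/wiki/Special:STS?wa=wsignin1.0&wtrealm=https%3A%2F%2Fligstd.com%2FSTSTest&wctx=rm%3D0%26id%3Dpassive&wreply=https%3A%2F%2Fapp.example.org%2F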

All of these parameters are carried in the query string. After receiving an STS request, the IdP first decides whether the user needs to enter their password again, then verifies the identity and issues the token. The issued token is also an XML document that contains a SAML assertion (signed with XML-DSig); a simplified version looks like this:

<?xml version="1.0" encoding="utf-8"?>
<t:RequestSecurityTokenResponse xmlns:saml="urn:oasis:names:tc:SAML:1.0:assertion" xmlns:ds="http://www.w3.org/2000/09/xmldsig#" xmlns:t="http://schemas.xmlsoap.org/ws/2005/02/trust">
<t:Lifetime>
<!-- The validity period must be specified -->
<wsu:Created xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">2016-02-12T05:13:30+0000</wsu:Created>
<wsu:Expires xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">2016-02-12T06:13:30+0000</wsu:Expires>
</t:Lifetime>
<wsp:AppliesTo xmlns:wsp="http://schemas.xmlsoap.org/ws/2004/09/policy">
<wsa:EndpointReference xmlns:wsa="http://www.w3.org/2005/08/addressing">
<wsa:Address>https://ligstd.com/STSTest</wsa:Address>
</wsa:EndpointReference>
</wsp:AppliesTo>
<t:RequestedSecurityToken xmlns:saml="urn:oasis:names:tc:SAML:1.0:assertion" xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
<!-- The validity period must be specified; the AssertionID is usually just a UUID -->
<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:1.0:assertion" xmlns:ds="http://www.w3.org/2000/09/xmldsig#" MajorVersion="1" MinorVersion="1" AssertionID="_7529070a-0300-4b65-bdee-df0e55c2c775" Issuer="the issuer; see the metadata" IssueInstant="2016-02-12T05:13:30+0000">
<saml:Conditions NotBefore="2016-02-12T05:13:30+0000" NotAfter="2016-02-12T06:13:30+0000">
<saml:AudienceRestrictionCondition>
<!-- Note that this must match the scope -->
<saml:Audience>https://ligstd.com/STSTest</saml:Audience>
</saml:AudienceRestrictionCondition>
</saml:Conditions>
<saml:AttributeStatement>
<!-- The various claim results -->
<!-- Note that attributes must not be empty -->
<saml:Subject>
<saml:NameIdentifier>Imbushuo</saml:NameIdentifier>
<saml:SubjectConfirmation>
<saml:ConfirmationMethod>urn:oasis:names:tc:SAML:1.0:cm:bearer</saml:ConfirmationMethod>
</saml:SubjectConfirmation>
</saml:Subject>
<saml:Attribute AttributeName="upn" AttributeNamespace="http://schemas.xmlsoap.org/ws/2005/05/identity/claims">
<saml:AttributeValue>Imbushuo@xxxx</saml:AttributeValue>
</saml:Attribute>
<saml:Attribute AttributeName="emailaddress" AttributeNamespace="http://schemas.xmlsoap.org/ws/2005/05/identity/claims">
<saml:AttributeValue>i@xxxx</saml:AttributeValue>
</saml:Attribute>
<saml:Attribute AttributeName="name" AttributeNamespace="http://schemas.xmlsoap.org/ws/2005/05/identity/claims">
<saml:AttributeValue>Imbushuo</saml:AttributeValue>
</saml:Attribute>
</saml:AttributeStatement>
<saml:AuthenticationStatement AuthenticationMethod="urn:oasis:names:tc:SAML:2.0:ac:classes:PasswordProtectedTransport" AuthenticationInstant="2016-02-12T05:13:30+0000">
<saml:Subject>
<saml:NameIdentifier>Imbushuo</saml:NameIdentifier>
<saml:SubjectConfirmation>
<saml:ConfirmationMethod>urn:oasis:names:tc:SAML:1.0:cm:bearer</saml:ConfirmationMethod>
</saml:SubjectConfirmation>
</saml:Subject>
</saml:AuthenticationStatement>
<ds:Signature xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
<!-- Signature section; only the SAML assertion needs to be signed. See the XML signature notes above -->
</ds:Signature>
</saml:Assertion>
</t:RequestedSecurityToken>
<t:TokenType>urn:oasis:names:tc:SAML:1.0:assertion</t:TokenType>
<t:RequestType>http://schemas.xmlsoap.org/ws/2005/02/trust/Issue</t:RequestType>
<t:KeyType>http://schemas.xmlsoap.org/ws/2005/05/identity/NoProofKey</t:KeyType>
</t:RequestSecurityTokenResponse>

Then HTML-form-encode it and wrap it in the response. Here is a piece of PHP code for reference:

        $output->addHTML("<form method=\"POST\" name=\"hiddenform\" action=\"{$escapedToken}/\">");
        $output->addHTML('<input type="hidden" name="wa" value="wsignin1.0" />');
        $output->addHTML("<input type=\"hidden\" name=\"wresult\" value=\"{$resultXml}\" />");
        $output->addHTML("<input type=\"hidden\" name=\"wctx\" value=\"{$this->wCtx}\" />");
        if (isset($this->wp)) {
            $output->addHTML("<input type=\"hidden\" name=\"wp\" value=\"{$this->wp}\" />");
        }
        $output->addHTML('<noscript><p>Script is disabled. Click Submit to continue.</p><input type="submit" value="Submit" /></noscript>');
        $output->addHTML('</form>');
        $output->addHTML('<script language="javascript">window.setTimeout("document.forms[0].submit()", 0);</script>');

The Workflow on the MediaWiki Side

  • Verify the user’s login state and permissions. If the requirements are not met, return a permission error.
  • Obtain the necessary information through the functions in MediaWiki’s User.php.
  • Build the SOAP XML and the SAML XML. Sign the SAML XML with xmlseclibs, wrap it in the SOAP XML, wrap that in the form, and return it from a special page.
  • ASP.NET OWIN Identity completes the remaining validation.

Usage

Create a new ASP.NET MVC (.NET 4.6) project, choose “Work and School Accounts” as the authentication mode, then select “On-Premises”. Enter the federation metadata location and the app URI (if you have implemented app registration, enter the corresponding URI).
Debug the project; it should already work.
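Under the hood, the project template wires up the OWIN WS-Federation middleware roughly like this (a minimal sketch; the metadata URL below is a placeholder for wherever your MediaWiki extension exposes it, and the realm must match the scope/audience from the metadata):

using Microsoft.Owin.Security.Cookies;
using Microsoft.Owin.Security.WsFederation;
using Owin;

public partial class Startup
{
    public void ConfigureAuth(IAppBuilder app)
    {
        // Sign in with a cookie after the WS-Federation handshake completes.
        app.SetDefaultSignInAsAuthenticationType(CookieAuthenticationDefaults.AuthenticationType);
        app.UseCookieAuthentication(new CookieAuthenticationOptions());

        app.UseWsFederationAuthentication(new WsFederationAuthenticationOptions
        {
            // Federation metadata exposed by the MediaWiki extension (placeholder URL).
            MetadataAddress = "https://wiki.example.org/extensions/YourExtension/StsMetadata.php",
            // Must match the scope/audience configured on the STS side.
            Wtrealm = "https://ligstd.com/STSTest"
        });
    }
}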

Notes

  • Making a MediaWiki special page output something other than text/html is a bit of a hassle, so I chose to simply expose a file at extensions/<extension folder>/StsMetadata.php directly.
  • The certificate the STS needs can be generated with OpenSSL.
  • Because of how xmlseclibs handles URIs, it currently does not seem to work directly with Azure ACS, but it does work with ASP.NET OWIN Identity.
  • Generally, handling wsignout1.0 just means destroying the cookie and signing out the session on the STS side; on the ASP.NET side, OWIN handles it by itself. The response is similar to the wsignin1.0 form, but no SAML data needs to be returned.

Recommended Reading

Understanding WS-Federation – MSDN
Web Services Federation Language (WS-Federation) Version 1.2

To be continued.

Direct2D based blur effect in Windows Runtime Apps

Effects such as DropShadowEffect and BlurEffect were removed from Windows Runtime XAML. To achieve certain goals, I needed to write something like them myself.

Luckily, Direct2D provides many useful effects, including Gaussian blur, which is the effect I want.

At first I tried SharpDX. It worked well on Intel-based devices, but not on ARM-based devices. To make matters worse, SharpDX’s performance was not as good as I thought. So I had to write a C++/CX Windows Runtime component and use it in my own Windows Runtime XAML project.
Here’s the result.

Windows Runtime XAML render-to-bitmap sample with blur effect

To use Direct2D, I need to create the device resources first: create the Direct3D 11 API device object, and then get the Direct2D device object.

Note: To convert stream, see here: http://blogs.msdn.com/b/win8devsupport/archive/2013/05/15/how-to-do-data-conversion-in-windows-store-app.aspx

Then read the bitmap and create the WIC objects. Finally, get everything ready, draw, and generate the output file.

Note: set D2D1_GAUSSIANBLUR_PROP_BORDER_MODE to D2D1_BORDER_MODE_HARD and you will get an iOS 7-like blur style.

Here’s the main source code:

D2DEffect.cpp


#include "pch.h"
#include "D2DBlurEffect.h"

using namespace Light::UI::Effects::Direct2D::BlurEffect;
using namespace Platform;
using namespace concurrency;

using namespace Microsoft::WRL;
using namespace Windows::ApplicationModel;
using namespace Windows::System;
using namespace Windows::Foundation;
using namespace Windows::Graphics::Display;
using namespace Windows::Storage;
using namespace Windows::UI::Core;

// Initialize hardware-dependent resources.
void BlurEffectImageProcessor::CreateDeviceResources()
{
// This flag adds support for surfaces with a different color channel ordering
// than the API default. It is required for compatibility with Direct2D.
UINT creationFlags = D3D11_CREATE_DEVICE_BGRA_SUPPORT;

#if defined(_DEBUG)
// If the project is in a debug build, enable debugging via SDK Layers.
creationFlags |= D3D11_CREATE_DEVICE_DEBUG;
#endif

// This array defines the set of DirectX hardware feature levels this app will support.
// Note the ordering should be preserved.
// Don't forget to declare your application's minimum required feature level in its
// description. All applications are assumed to support 9.1 unless otherwise stated.
const D3D_FEATURE_LEVEL featureLevels[] =
{
D3D_FEATURE_LEVEL_11_1,
D3D_FEATURE_LEVEL_11_0,
D3D_FEATURE_LEVEL_10_1,
D3D_FEATURE_LEVEL_10_0,
D3D_FEATURE_LEVEL_9_3,
D3D_FEATURE_LEVEL_9_2,
D3D_FEATURE_LEVEL_9_1,
};

// Create the Direct3D 11 API device object.
DX::ThrowIfFailed(
D3D11CreateDevice(
nullptr, // Specify nullptr to use the default adapter.
D3D_DRIVER_TYPE_HARDWARE,
nullptr,
creationFlags, // Set debug and Direct2D compatibility flags.
featureLevels, // List of feature levels this app can support.
ARRAYSIZE(featureLevels),
D3D11_SDK_VERSION, // Always set this to D3D11_SDK_VERSION for Windows Store apps.
&m_d3dDevice, // Returns the Direct3D device created.
nullptr,
nullptr
)
);

// Get the Direct3D 11.1 API device.
ComPtr<IDXGIDevice> dxgiDevice;
DX::ThrowIfFailed(
m_d3dDevice.As(&dxgiDevice)
);

// Create the Direct2D device object and a corresponding context.
DX::ThrowIfFailed(
D2D1CreateDevice(
dxgiDevice.Get(),
nullptr,
&m_d2dDevice
)
);

DX::ThrowIfFailed(
m_d2dDevice->CreateDeviceContext(
D2D1_DEVICE_CONTEXT_OPTIONS_NONE,
&m_d2dContext
)
);
}

///
/// Internal method referred from Bing.
/// Convert IBuffer to IStream.
///

///The buffer to convert.
IStream* createIStreamFromIBuffer(Streams::IBuffer^ buffer) {
// convert the IBuffer into an IStream to be used with WIC
IStream *fileContentsStream;
HRESULT res = CreateStreamOnHGlobal(NULL, TRUE, &fileContentsStream);
if (FAILED(res) || !fileContentsStream) {
throw ref new FailureException();
}
Streams::DataReader^ dataReader = Streams::DataReader::FromBuffer(buffer);
// read the data into the stream in chunks of 1MB to preserve memory
while (dataReader->UnconsumedBufferLength > 0) {
UINT chunkSize = min(1024 * 1024, dataReader->UnconsumedBufferLength);
auto data = ref new Platform::Array<byte>(chunkSize);
dataReader->ReadBytes(data);
ULONG written;
res = fileContentsStream->Write(data->Data, chunkSize, &written);
if (FAILED(res) || written != chunkSize) {
fileContentsStream->Release();
throw ref new FailureException();
}
}
return fileContentsStream;
}

BlurEffectImageProcessor::BlurEffectImageProcessor()
{
IsInitialized = false;
}

///
/// Render image but not get the final image.
/// REMEMBER call DataInitialize method first.
///

///gaussianBlurStDev: indicates the blur amount. DPI: indicates the current display's DPI.
IAsyncAction^ BlurEffectImageProcessor::RenderImage(float gaussianBlurStDev, float DPI){
return create_async([this, gaussianBlurStDev,DPI]{
if (!IsInitialized){
throw ref new Platform::Exception(1, "The class has not initialized.");
}

// Render it
UINT imageWidth;
UINT imageHeight;
m_wicFormatConverter->GetSize(&imageWidth, &imageHeight);

// Create a Bitmap Source Effect.
DX::ThrowIfFailed(m_d2dContext->CreateEffect(CLSID_D2D1BitmapSource, &m_bitmapSourceEffect));

// Set the BitmapSource Property to the BitmapSource generated earlier.
DX::ThrowIfFailed(
m_bitmapSourceEffect->SetValue(D2D1_BITMAPSOURCE_PROP_WIC_BITMAP_SOURCE, m_wicFormatConverter.Get())
);

// Create the Gaussian Blur Effect.
DX::ThrowIfFailed(m_d2dContext->CreateEffect(CLSID_D2D1GaussianBlur, &m_gaussianBlurEffect));

// Set the input to receive the bitmap from the BitmapSourceEffect.
m_gaussianBlurEffect->SetInputEffect(0, m_bitmapSourceEffect.Get());

// Set the blur amount.
DX::ThrowIfFailed(m_gaussianBlurEffect->SetValue(D2D1_GAUSSIANBLUR_PROP_STANDARD_DEVIATION, gaussianBlurStDev));
DX::ThrowIfFailed(m_gaussianBlurEffect->SetValue(D2D1_GAUSSIANBLUR_PROP_BORDER_MODE, D2D1_BORDER_MODE_HARD));

// Begin drawing.
m_d2dContext->BeginDraw();

m_d2dContext->Clear(D2D1::ColorF(D2D1::ColorF::CornflowerBlue));

// Draw the scaled and blurred image.
m_d2dContext->DrawImage(m_gaussianBlurEffect.Get());

// We ignore D2DERR_RECREATE_TARGET here. This error indicates that the device
// is lost. It will be handled during the next call to Present.
HRESULT hr = m_d2dContext->EndDraw();
if (hr != D2DERR_RECREATE_TARGET)
{
DX::ThrowIfFailed(hr);
}

});
}

///
/// Initializes all device resources and the image.
/// You need to call this method before doing other things.
///

IAsyncAction^ BlurEffectImageProcessor::DataInitialize(IRandomAccessStream^ ImageDataStream,float DPI){
// DirectXBase::Initialize(Window, DPI);
return create_async([this,ImageDataStream, DPI]{
// Initialize Devices
CreateDeviceResources();

DX::ThrowIfFailed(CoCreateInstance(
CLSID_WICImagingFactory1,
nullptr,
CLSCTX_INPROC_SERVER,
IID_PPV_ARGS(&m_wicImagingFactory)
)
);

DX::ThrowIfFailed(
CoCreateInstance(
CLSID_WICImagingFactory,
nullptr,
CLSCTX_INPROC_SERVER,
IID_PPV_ARGS(&m_wicImagingFactory2)
)
);

// Now we have the image source and we can decode it.
ImageBuffer = ref new Buffer(ImageDataStream->Size);
auto op = create_task(ImageDataStream->ReadAsync(ImageBuffer, ImageDataStream->Size, InputStreamOptions::None)).then([this,DPI](IBuffer^ ImageBufferData){
DX::ThrowIfFailed(
m_wicImagingFactory2->CreateDecoderFromStream(createIStreamFromIBuffer(ImageBufferData), nullptr, WICDecodeMetadataCacheOnDemand,
&m_wicDecoder)
);

// Get data ready
DX::ThrowIfFailed(
m_wicDecoder->GetFrame(0, &m_wicFrameDecode)
);
DX::ThrowIfFailed(
m_wicImagingFactory2->CreateFormatConverter(&m_wicFormatConverter)
);

DX::ThrowIfFailed(
m_wicFormatConverter->Initialize(
m_wicFrameDecode.Get(),
GUID_WICPixelFormat32bppBGRA,
WICBitmapDitherTypeNone,
nullptr,
0.0f,
WICBitmapPaletteTypeCustom
)
);

// Create output bitmap & get it ready
UINT Width;
UINT Height;
m_wicFrameDecode->GetSize(&Width, &Height);
m_wicImagingFactory2->CreateBitmap(Width, Height, GUID_WICPixelFormat32bppBGRA, WICBitmapCreateCacheOption::WICBitmapCacheOnDemand, &m_wicBitmap);
D2D1_SIZE_U bitmapSize = D2D1::SizeU(Width, Height);
D2D1_PIXEL_FORMAT bitmapPixelFormat = D2D1::PixelFormat(DXGI_FORMAT_B8G8R8A8_UNORM, D2D1_ALPHA_MODE_IGNORE);
D2D1_BITMAP_PROPERTIES1 bitmapProp1 = D2D1::BitmapProperties1(D2D1_BITMAP_OPTIONS_TARGET,bitmapPixelFormat, DPI, DPI);
m_d2dContext->CreateBitmap(
D2D1::SizeU(Width, Height),
nullptr,
Width * 4, // 4 bytes for B8G8R8A8
bitmapProp1,
&m_d2dBitmap1
);

m_d2dContext->SetTarget(m_d2dBitmap1.Get());

IsInitialized = true;

return;
});

op.wait();
});
}

///
/// Get the final image.
/// REMEMBER call DataInitialize method first.
/// You can call this method before calling RenderImage, but you will get the original image.
///

///DPI: indicates the current display's DPI.
IAsyncOperation<IRandomAccessStream^>^ BlurEffectImageProcessor::GetImageAsBitmap(float DPI){
return create_async([this,DPI]{
if (!IsInitialized){
throw ref new Platform::Exception(1, "The class has not initialized.");
}
// Render the bitmap use WIC.
ComPtr<IWICBitmap> m_iwicBitmap;
ComPtr<IWICStream> m_iwicStream;
ComPtr<IWICBitmapEncoder> m_iwicBitmapEncoder;
ComPtr<IWICBitmapFrameEncode> m_iwicBitmapFrameEncode;
ComPtr<IWICImageEncoder> m_iwicImageEncoder;
WICImageParameters* m_imageparm = new WICImageParameters();
D2D1_PIXEL_FORMAT m_pixel_format = D2D1_PIXEL_FORMAT();
ComPtr<IStream> m_iStream;
ID2D1Image* m_id2d1image;
UINT height;
UINT width;

// Since we can't create IStream directly in Windows Runtime, we need creating InMemoryRandomAccessStream and convert it
IRandomAccessStream^ data = ref new InMemoryRandomAccessStream();

DX::ThrowIfFailed(
CreateStreamOverRandomAccessStream(data, IID_PPV_ARGS(&m_iStream))
);

// Get size, we need it later
DX::ThrowIfFailed(
m_wicFrameDecode->GetSize(&width, &height)
);

// Create bitmap
DX::ThrowIfFailed(
m_wicImagingFactory2->CreateBitmap(width, height, GUID_WICPixelFormat32bppBGRA, WICBitmapCreateCacheOption::WICBitmapCacheOnDemand, &m_iwicBitmap)
);

// Create WIC Stream
DX::ThrowIfFailed(
m_wicImagingFactory->CreateStream(&m_iwicStream)
);

// Initialize WIC Stream from IStream that we converted
DX::ThrowIfFailed(
m_iwicStream->InitializeFromIStream(m_iStream.Get())
);

// Create encoder
DX::ThrowIfFailed(
m_wicImagingFactory2->CreateEncoder(GUID_ContainerFormatPng, nullptr, &m_iwicBitmapEncoder)
);

// Create image encoder
DX::ThrowIfFailed(
m_wicImagingFactory2->CreateImageEncoder(m_d2dDevice.Get(), &m_iwicImageEncoder)
);

// Initialize
DX::ThrowIfFailed(
m_iwicBitmapEncoder->Initialize(m_iwicStream.Get(), WICBitmapEncoderCacheOption::WICBitmapEncoderNoCache)
);

// Create new frame for the bitmap
DX::ThrowIfFailed(
m_iwicBitmapEncoder->CreateNewFrame(&m_iwicBitmapFrameEncode,nullptr)
);

// Set properties
m_iwicBitmapFrameEncode->Initialize(nullptr);
m_iwicBitmapFrameEncode->SetSize(width, height);
WICPixelFormatGUID format = GUID_WICPixelFormat32bppBGRA;
m_iwicBitmapFrameEncode->SetPixelFormat(&format);
m_d2dContext->GetTarget(&m_id2d1image);
m_imageparm->DpiX = DPI;
m_imageparm->DpiY = DPI;
m_pixel_format.alphaMode = D2D1_ALPHA_MODE_IGNORE;
m_pixel_format.format = DXGI_FORMAT_B8G8R8A8_UNORM;
m_imageparm->PixelFormat = m_pixel_format;
m_imageparm->PixelHeight = height;
m_imageparm->PixelWidth = width;

// Write frame
DX::ThrowIfFailed(
m_iwicImageEncoder->WriteFrame(m_id2d1image, m_iwicBitmapFrameEncode.Get(), m_imageparm)
);

// Commit
DX::ThrowIfFailed(
m_iwicBitmapFrameEncode->Commit()
);

DX::ThrowIfFailed(
m_iwicBitmapEncoder->Commit()
);

// Now we successfully got the image
// Convert it to stream.
// Reference: MSDN
Windows::Storage::Streams::IRandomAccessStream^ comRAS;
IUnknown* p11 = reinterpret_cast<IUnknown*>(comRAS);

static const GUID guidIRandomAccessStream =
{ 0x905a0fe1, 0xbc53, 0x11df, { 0x8c, 0x49, 0x00, 0x1e, 0x4f, 0xc6, 0x86, 0xda } };

DX::ThrowIfFailed(
CreateRandomAccessStreamOverStream(m_iwicStream.Get(), BSOS_DEFAULT, guidIRandomAccessStream, (void**)&p11)
);

// Return result
return reinterpret_cast<IRandomAccessStream^>(p11);
});
}


D2DEffects.h

#pragma once

#include "DirectXBase.h"

using namespace Windows::Storage::Streams;
using namespace Windows::Foundation;
using namespace Windows::UI::Core;

namespace Light{
namespace UI{
namespace Effects{
namespace Direct2D{
namespace BlurEffect{
public ref class BlurEffectImageProcessor sealed
{
public:
BlurEffectImageProcessor();
IAsyncAction^ DataInitialize(IRandomAccessStream^ ImageDataStream,float DPI);
IAsyncAction^ RenderImage(float gaussianBlurStDev, float DPI);
IAsyncOperation<IRandomAccessStream^>^ GetImageAsBitmap(float DPI);
void CreateDeviceResources();
private:
Microsoft::WRL::ComPtr<ID2D1Effect> m_bitmapSourceEffect;
Microsoft::WRL::ComPtr<ID2D1Effect> m_gaussianBlurEffect;

Microsoft::WRL::ComPtr<IWICBitmapDecoder> m_wicDecoder;
Microsoft::WRL::ComPtr<IWICBitmapFrameDecode> m_wicFrameDecode;
Microsoft::WRL::ComPtr<IWICFormatConverter> m_wicFormatConverter;
Microsoft::WRL::ComPtr<IWICImagingFactory2> m_wicImagingFactory2;
Microsoft::WRL::ComPtr<IWICImagingFactory> m_wicImagingFactory;
Microsoft::WRL::ComPtr<ID2D1Device1> m_d2ddevice1;
Microsoft::WRL::ComPtr<ID2D1Device> m_d2ddevice;
Microsoft::WRL::ComPtr<ID3D11Device> m_d3d11device;

// Direct3D device
Microsoft::WRL::ComPtr<ID3D11Device> m_d3dDevice;

// Direct2D objects
Microsoft::WRL::ComPtr<ID2D1Device> m_d2dDevice;
Microsoft::WRL::ComPtr<ID2D1DeviceContext> m_d2dContext;
Microsoft::WRL::ComPtr<ID2D1Bitmap1> m_d2dBitmap1;
Microsoft::WRL::ComPtr<IWICBitmap> m_wicBitmap;

int m_width;
int m_height;
IBuffer^ ImageBuffer;
bool IsInitialized;
};
}
}
}
}
}


And don’t forget the linker input files: dxgi.lib, dwrite.lib, d2d1.lib, d3d11.lib, windowscodecs.lib, etc.

[Project] Zhihu Zhuanlan (知乎专栏) RSS

RSS Demo

2014-4-4 update: added full-text support, using the same method as WordPress full-text output. Images don’t seem to work.

2014-4-5 update: added output of all articles. See below.

At @韩学森’s invitation, I spent two hours hacking together a simple RSS generator.

URL: http://beta.imbushuo.net/rss/Zhihu.Zhuanlan.RSSGenerator.php

Usage:

http://beta.imbushuo.net/rss/Zhihu.Zhuanlan.RSSGenerator.php?columnid={column ID}&full={1|0}

For example, for the column “急救室” (http://zhuanlan.zhihu.com/first-aid), the columnid is first-aid, so the URL is http://beta.imbushuo.net/rss/Zhihu.Zhuanlan.RSSGenerator.php?columnid=first-aid

See the screenshot for the result. To fetch all articles, use:

http://beta.imbushuo.net/rss/Zhihu.Zhuanlan.RSSGenerator.php?columnid=first-aid&full=1

KNOWN ISSUES:

a. The code quality is terrible. Go find it on GitHub yourselves… I don’t want to post the link (runs away)