Hack TwinUI to force Windows Store Apps to run on low-resolution screens

Windows Store Apps on Lumia 640 XL.

Windows 8 and Windows 8.1 enforce a minimum screen resolution for Windows Store Apps (a.k.a. Metro apps or whatever). If the screen resolution doesn't meet the requirement, users see a prompt saying the resolution is too low for these applications.

However, on certain platforms (such as phones and single-board computers), it is not convenient to change the resolution. Recently I have been trying Windows RT 8.1 on a Lumia 640 XL. Qualcomm hard-codes the resolution in the platform configuration, so I was unable to change it, and 1280 × 720 is not sufficient for Store Apps.

But there was one exception: the PC Settings (a.k.a. Immersive Control Panel) app. It always opens regardless of the current resolution. So how can I force other applications to launch?

Let's turn to TwinUI.dll, one of the core components of the shell infrastructure. Start IDA Pro, load TwinUI with symbols from Microsoft, and search for traces of the PC Settings app. All Windows Store Apps are associated with a package family identifier, so let's search for it. In this case, it's windows.immersivecontrolpanel_cw5n1h2txyewy.

Bingo. We found it in some functions.

The PC Settings package family ID is hardcoded in TwinUI.dll. (The function shown here has already been patched by me, so it doesn't reflect what you'd see in the official Microsoft binary.)

By checking its references, we learn that when the resolution doesn't meet the requirement, the layout-checking routine verifies whether the app is a desktop application or the PC Settings app. You can patch either the layout-checking routine or the PC Settings PFN verification routine. I decided to patch the latter, although patching the former is probably a better idea.

On the ARMv7-A platform, I simply patched the initial register store operation and the branch: the BLX call was replaced with a simple NOP (MOV R0, R0).

Patched function

There are two versions of the PC Settings check routine, so I had to patch both; the second is similar to the first. Patching the layout verification routine instead (actually a better idea, since my patch causes some trouble when launching files from the desktop) or patching on other architectures should work much the same way.
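For illustration, the byte-level patching step boils down to a search-and-replace over the binary image. The pattern and replacement bytes below are hypothetical placeholders, not the actual TwinUI.dll instruction encodings:

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

// Replaces every occurrence of `pattern` in `image` with `patch`
// (same length), returning the number of sites patched.
// Sketch only: real patching must target the exact function offsets
// found in the disassembler, not a blind whole-image scan.
std::size_t patch_bytes(std::vector<std::uint8_t>& image,
                        const std::vector<std::uint8_t>& pattern,
                        const std::vector<std::uint8_t>& patch)
{
    if (pattern.empty() || pattern.size() != patch.size())
        return 0;
    std::size_t count = 0;
    for (std::size_t i = 0; i + pattern.size() <= image.size(); ++i) {
        if (std::equal(pattern.begin(), pattern.end(), image.begin() + i)) {
            std::copy(patch.begin(), patch.end(), image.begin() + i);
            ++count;
            i += pattern.size() - 1; // skip past the patched site
        }
    }
    return count;
}
```

The example bytes in any call to this helper (e.g. a two-byte stand-in for a Thumb instruction) are placeholders; look up the real encodings in the disassembly before patching.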

Deep dive into UnityFS: structure and implementation

Someone asked me if I could extract some images from a popular Chinese mobile game. I accepted the challenge, but things were far more complicated than I expected.

What I knew

  • This game is Unity3D-based.
  • Original assets were encrypted with a known algorithm and key. DISCLAIMER: I will not disclose details of the encryption.

The story began

I thought I could extract the assets I needed with existing tools (e.g. Disunity), but I was proven wrong. Disunity has been refactored, and the remaining work is still in progress (at least at the moment I write this article). Since resource extraction has not been implemented yet, Disunity couldn't be my choice.

Then I turned to a tool called Unity Assets Bundle Extractor. It did a great job extracting the resources I needed through its GUI. However, acquiring thousands of texture assets from 2000+ separate files is not an easy job. I tried the command-line support but failed (maybe I was too stupid).

Luckily this toolkit provides an API and documentation. Since it was compiled with Microsoft Visual C++ 2010, I was unable to use it directly (the C++ ABI changes with every MSVC release), and I was too lazy to write a C wrapper for P/Invoke. But the C++ header files pointed to a better solution: parse the file format and implement my own UnityFS parser/reader.

Special thanks to the UABE project – without its generous headers, I would not have been able to implement my own parser or compose this article.

Wow, so many projects

UnityFS

UnityFS is a new asset bundle format introduced in Unity 5. I am not a Unity3D developer, and I honestly don't know why Unity introduced a new bundle format. But anyway, let's analyze it.

Things you need to know

  • UnityFS is just a bundle of several Unity asset files. Each asset contains a collection of serialized Unity objects (e.g. 2D textures, text resources, scene objects, etc.).
  • UnityFS follows a standard Unity file header structure. Let's call it AssetsBundleHeader06.
  • You have to parse the asset files in order to extract what you need. There's a bunch of documentation about this; look into the old Disunity source code for ideas.
UnityFS Header Structure

So the header goes like this. There's a DWORD flags field that matters – it contains critical information required for decompression and directory parsing. The rules go like this:

  • (Flags & 0x3F) is the compression mode: 0 means no compression, 1 means LZMA, and 2/3 mean LZ4/LZ4HC.
  • (Flags & 0x40) indicates whether the bundle has directory info.
  • (Flags & 0x80) indicates whether the block and directory list is at the end of the bundle file.
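As a sketch (in C++ here, with names of my own choosing), the flag rules above translate to:

```cpp
#include <cstdint>

// Decoding of the UnityFS header flags field described above.
enum class CompressionMode : std::uint32_t {
    None = 0,   // block stored as-is
    Lzma = 1,   // LZMA-compressed
    Lz4  = 2,   // LZ4-compressed
    Lz4Hc = 3   // LZ4HC-compressed
};

// Low 6 bits select the compression mode.
inline CompressionMode compression_mode(std::uint32_t flags)
{
    return static_cast<CompressionMode>(flags & 0x3F);
}

// Bit 0x40: bundle carries directory info.
inline bool has_directory_info(std::uint32_t flags)
{
    return (flags & 0x40) != 0;
}

// Bit 0x80: block/directory list sits at the end of the file.
inline bool directory_at_end(std::uint32_t flags)
{
    return (flags & 0x80) != 0;
}
```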

C# provides a good BinaryReader that makes things a bit easier, but it can be improved with better null-terminated string and big-endian support. Be careful with endianness: Unity uses both big-endian and little-endian values in a single file, and I personally don't get why. For convenience, I extended the original BinaryReader to support both. The length of each data type matters – but that's basic stuff for CS students.
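A minimal big-endian reader along these lines might look like the following C++ sketch (the class and method names are mine, not from the original code):

```cpp
#include <cstdint>
#include <stdexcept>
#include <string>
#include <vector>

// Minimal big-endian reader over a byte buffer, mirroring the extended
// BinaryReader described above.
class BigEndianReader {
public:
    explicit BigEndianReader(std::vector<std::uint8_t> data)
        : data_(std::move(data)), pos_(0) {}

    std::uint32_t read_u32_be() {
        require(4);
        std::uint32_t v = (std::uint32_t(data_[pos_]) << 24) |
                          (std::uint32_t(data_[pos_ + 1]) << 16) |
                          (std::uint32_t(data_[pos_ + 2]) << 8) |
                           std::uint32_t(data_[pos_ + 3]);
        pos_ += 4;
        return v;
    }

    std::uint64_t read_u64_be() {
        std::uint64_t hi = read_u32_be();
        return (hi << 32) | read_u32_be();
    }

    // Reads bytes up to (and consuming) the NUL terminator.
    std::string read_cstring() {
        std::string s;
        for (;;) {
            require(1);
            char c = static_cast<char>(data_[pos_++]);
            if (c == '\0') break;
            s.push_back(c);
        }
        return s;
    }

private:
    void require(std::size_t n) const {
        if (pos_ + n > data_.size())
            throw std::out_of_range("unexpected end of buffer");
    }
    std::vector<std::uint8_t> data_;
    std::size_t pos_;
};
```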

Code snippet of my simple parser

Compression

UnityFS uses optional block-based compression for streaming (you can read a specific bundle without downloading the whole file). Both LZMA and LZ4* (LZ4HC, etc.) are supported. Unity's proprietary parser and Disunity respect this design, but I just wanted the bundled files, so I decided to read all blocks at once and decompress them into a single memory stream.

The decompressed size should match what the block info promises. If it doesn't, something went wrong.

You can implement your own block-based reader – but my time budget didn't allow it.
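The read-everything-then-decompress step can be sketched like this. Only the uncompressed path is implemented; the LZMA/LZ4 paths need an external codec (the LZMA SDK or liblz4) and are left as comments, since wiring those up is outside this sketch:

```cpp
#include <cstdint>
#include <stdexcept>
#include <vector>

// Decompress one block according to the compression mode from the flags
// field (0 = none, 1 = LZMA, 2/3 = LZ4/LZ4HC), then verify the size.
std::vector<std::uint8_t> decompress_block(
    const std::vector<std::uint8_t>& compressed,
    std::uint32_t decompressed_size,
    std::uint32_t compression_mode)
{
    std::vector<std::uint8_t> out;
    switch (compression_mode) {
    case 0: // no compression: the block is stored as-is
        out = compressed;
        break;
    case 1: // LZMA: feed `compressed` to the LZMA SDK decoder here
    case 2: // LZ4
    case 3: // LZ4HC: both decode with LZ4_decompress_safe from liblz4
        throw std::runtime_error("compressed modes need an external codec");
    default:
        throw std::runtime_error("unknown compression mode");
    }
    // The decompressed size should match what the block info promised;
    // if it doesn't, the bundle is corrupt (or the parse went wrong).
    if (out.size() != decompressed_size)
        throw std::runtime_error("decompressed size mismatch");
    return out;
}
```

Concatenating the outputs of `decompress_block` over all blocks, in order, yields the single flat stream the rest of the parsing works on.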

There we go… block and file information!

Following an unknown 16-byte block, there's a big-endian UInt32 value representing the block count in a single package. Each block-info entry contains a big-endian UInt32 decompressed size, a big-endian UInt32 compressed size, and a flag we're not really interested in.

Then a BE UInt32 value gives the file count in a single package. Each file entry contains the file offset we need (BE UInt64), the decompressed size (BE UInt64), a BE UInt32 flag, and a null-terminated file-name string.
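Putting the two lists together, a directory parser might look like the sketch below. Struct and field names are mine; the 16-bit width of the per-block flag is my assumption (the text above only calls it "a flag"):

```cpp
#include <cstddef>
#include <cstdint>
#include <string>
#include <vector>

// Layout of the block list and file list described above.
// All integers are big-endian; file names are NUL-terminated.
struct BlockInfo {
    std::uint32_t decompressed_size;
    std::uint32_t compressed_size;
    std::uint16_t flags; // width assumed; usually ignorable
};

struct FileEntry {
    std::uint64_t offset;            // where the asset starts in the bundle
    std::uint64_t decompressed_size;
    std::uint32_t flags;
    std::string   name;
};

struct Directory {
    std::vector<BlockInfo> blocks;
    std::vector<FileEntry> files;
};

Directory parse_directory(const std::vector<std::uint8_t>& buf)
{
    std::size_t pos = 16; // skip the unknown 16-byte block
    auto u16 = [&] {
        std::uint16_t v = std::uint16_t((buf[pos] << 8) | buf[pos + 1]);
        pos += 2;
        return v;
    };
    auto u32 = [&] {
        std::uint32_t v = (std::uint32_t(buf[pos]) << 24) |
                          (std::uint32_t(buf[pos + 1]) << 16) |
                          (std::uint32_t(buf[pos + 2]) << 8) |
                           std::uint32_t(buf[pos + 3]);
        pos += 4;
        return v;
    };
    auto u64 = [&]() -> std::uint64_t {
        std::uint64_t hi = u32();
        return (hi << 32) | u32();
    };

    Directory dir;
    for (std::uint32_t i = 0, n = u32(); i < n; ++i) {
        BlockInfo b;
        b.decompressed_size = u32();
        b.compressed_size = u32();
        b.flags = u16();
        dir.blocks.push_back(b);
    }
    for (std::uint32_t i = 0, n = u32(); i < n; ++i) {
        FileEntry f;
        f.offset = u64();
        f.decompressed_size = u64();
        f.flags = u32();
        while (buf[pos] != 0)
            f.name.push_back(static_cast<char>(buf[pos++]));
        ++pos; // skip the NUL terminator
        dir.files.push_back(f);
    }
    return dir;
}
```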

Parse your assets now

With the information we retrieved, we can extract raw asset files from a UnityFS bundle. Then all you need to do is search the Internet for ideas on extracting objects (text resources, 2D textures, etc.) from Unity asset files. Good luck exploring!

Conclusion

In this article, we discussed the structure and parsing of the UnityFS resource bundle format. For more information about UnityFS and Unity asset files, check out the projects mentioned above.

Fixing Single Sign Out for Auth0 WordPress Integration

Since I set up Active Directory and Azure Active Directory for my workgroup and myself, I decided to switch to SSO for my web services. I chose Auth0 as the WordPress identity middleware, as it is pretty flexible. However, the logout function doesn't work properly with federated logons: it only signs out of Auth0 instead of signing out of all IdPs.

Luckily, fixing that is pretty easy. Open lib/WP_Auth0_LoginManager.php and find the logout function:

If you don't see the federated parameter appended to the logout URL, add it. They have fixed this on GitHub, but I don't know why they haven't pushed it to the WordPress release.

My Personal Take on Connext

I've known about Connext for quite a while. From my point of view, an event like this for teenagers is not a bad thing at all: it at least gives teenage developers a platform to get to know each other, which is especially rare in China. After the event I did see some unpleasant things on the Telegram group and on Weibo, but those were predictable.
Update 2 – Note: the organizers stated that lunch and dinner were never promised or guaranteed.
Judging from the sessions, this year's Connext was much better than last year's ADC, but some problems remain. As a conference driven mainly by teenage student developers, its overall level is inherently limited (not counting god-tier experts like 比尔盖子, 雪碧虾, Awc Zhang, and Jeff Bai), so don't expect it to become something like //build/, Google I/O, WWDC, or even the various JavaScript / Linux User Group meetups. The organizers also don't need to make it overly formal at the organizational level; the high-tier ticket packages, for instance, were unnecessary. I later learned that the planned lunch and dinner fell through as well – things with that much uncertainty should not be promised before the event. It would be better to keep things loose, more like a LUG meetup.
Last year's ADC had Microsoft sponsorship; I just learned that this year there was none, although many Microsoft fans attended. I'm not sure about last year, but this year there was reportedly an MVP present (citation needed). I heard the atmosphere was harmonious in the morning, but conflicts broke out in the afternoon. I won't go into the bigger clashes between extreme fans here – including the campaign to evangelize Visual Studio to people who don't use it, which really wasn't necessary. I also heard that someone compiled publicly available material from the Internet into a collection and presented it on site as "internal documents". If that were really internal material, they'd be having a chat with Global Security at 北京西. 23333
So in the end, the key to running this kind of event is to reduce the sense of hierarchy – and everyone should be friendlier.
Recommended reading: https://www.tombu.info/contents/%e7%a7%81%e8%b4%a7-%e5%85%b3%e4%ba%8e-connext-%e7%9a%84%e4%b8%aa%e4%ba%ba%e8%a7%82%e7%82%b9/
————————————————-
Don't ask why I didn't attend – I've just been busy these past few years: I was in class this time last year, I'm in class this year, and next year I'll be in the US. Argh.
Also, I'm not any kind of guru.

Direct2D based blur effect in Windows Runtime Apps

Effects such as DropShadowEffect and BlurEffect were removed from Windows Runtime XAML. To achieve certain goals, I needed to write something similar myself.

Luckily Direct2D provides many useful effects, including the Gaussian blur I wanted.

At first I tried SharpDX. It worked well on Intel devices, but not on ARM-based devices. To make matters worse, SharpDX's performance was not as good as I had expected. So I had to write a C++/CX Windows Runtime Component and use it from my own Windows Runtime XAML project.
Here’s the result.

Windows Runtime XAML render-to-bitmap sample with blur effect

To use Direct2D, I need to create device resources first: create the Direct3D 11 API device object, then get the Direct2D device object from it.

Note: To convert stream, see here: http://blogs.msdn.com/b/win8devsupport/archive/2013/05/15/how-to-do-data-conversion-in-windows-store-app.aspx

Then read the bitmap and create the WIC objects. Finally, get everything ready, draw, and generate the output file.

Note: set D2D1_GAUSSIANBLUR_PROP_BORDER_MODE to D2D1_BORDER_MODE_HARD and you will get an iOS 7-like blur style.

Here’s the main source code:

D2DEffect.cpp


#include "pch.h"
#include "D2DBlurEffect.h"

using namespace Light::UI::Effects::Direct2D::BlurEffect;
using namespace Platform;
using namespace concurrency;

using namespace Microsoft::WRL;
using namespace Windows::ApplicationModel;
using namespace Windows::System;
using namespace Windows::Foundation;
using namespace Windows::Graphics::Display;
using namespace Windows::Storage;
using namespace Windows::UI::Core;

// Initialize hardware-dependent resources.
void BlurEffectImageProcessor::CreateDeviceResources()
{
// This flag adds support for surfaces with a different color channel ordering
// than the API default. It is required for compatibility with Direct2D.
UINT creationFlags = D3D11_CREATE_DEVICE_BGRA_SUPPORT;

#if defined(_DEBUG)
// If the project is in a debug build, enable debugging via SDK Layers.
creationFlags |= D3D11_CREATE_DEVICE_DEBUG;
#endif

// This array defines the set of DirectX hardware feature levels this app will support.
// Note the ordering should be preserved.
// Don't forget to declare your application's minimum required feature level in its
// description. All applications are assumed to support 9.1 unless otherwise stated.
const D3D_FEATURE_LEVEL featureLevels[] =
{
D3D_FEATURE_LEVEL_11_1,
D3D_FEATURE_LEVEL_11_0,
D3D_FEATURE_LEVEL_10_1,
D3D_FEATURE_LEVEL_10_0,
D3D_FEATURE_LEVEL_9_3,
D3D_FEATURE_LEVEL_9_2,
D3D_FEATURE_LEVEL_9_1,
};

// Create the Direct3D 11 API device object.
DX::ThrowIfFailed(
D3D11CreateDevice(
nullptr, // Specify nullptr to use the default adapter.
D3D_DRIVER_TYPE_HARDWARE,
nullptr,
creationFlags, // Set debug and Direct2D compatibility flags.
featureLevels, // List of feature levels this app can support.
ARRAYSIZE(featureLevels),
D3D11_SDK_VERSION, // Always set this to D3D11_SDK_VERSION for Windows Store apps.
&m_d3dDevice, // Returns the Direct3D device created.
nullptr,
nullptr
)
);

// Get the Direct3D 11.1 API device.
ComPtr<IDXGIDevice> dxgiDevice;
DX::ThrowIfFailed(
m_d3dDevice.As(&dxgiDevice)
);

// Create the Direct2D device object and a corresponding context.
DX::ThrowIfFailed(
D2D1CreateDevice(
dxgiDevice.Get(),
nullptr,
&m_d2dDevice
)
);

DX::ThrowIfFailed(
m_d2dDevice->CreateDeviceContext(
D2D1_DEVICE_CONTEXT_OPTIONS_NONE,
&m_d2dContext
)
);
}

/// <summary>
/// Internal method referred from Bing.
/// Convert IBuffer to IStream.
/// </summary>
/// <param name="buffer">The buffer to convert.</param>
IStream* createIStreamFromIBuffer(Streams::IBuffer ^buffer) {
// convert the IBuffer into an IStream to be used with WIC
IStream *fileContentsStream;
HRESULT res = CreateStreamOnHGlobal(NULL, TRUE, &fileContentsStream);
if (FAILED(res) || !fileContentsStream) {
throw ref new FailureException();
}
Streams::DataReader^ dataReader = Streams::DataReader::FromBuffer(buffer);
// read the data into the stream in chunks of 1MB to preserve memory
while (dataReader->UnconsumedBufferLength > 0) {
UINT chunkSize = min(1024 * 1024, dataReader->UnconsumedBufferLength);
auto data = ref new Platform::Array<byte>(chunkSize);
dataReader->ReadBytes(data);
ULONG written;
res = fileContentsStream->Write(data->Data, chunkSize, &written);
if (FAILED(res) || written != chunkSize) {
fileContentsStream->Release();
throw ref new FailureException();
}
}
return fileContentsStream;
}

BlurEffectImageProcessor::BlurEffectImageProcessor()
{
IsInitialized = false;
}

/// <summary>
/// Render the image but do not produce the final output.
/// REMEMBER to call the DataInitialize method first.
/// </summary>
/// <param name="gaussianBlurStDev">Indicates the blur amount.</param>
/// <param name="DPI">Indicates the current display's DPI.</param>
IAsyncAction^ BlurEffectImageProcessor::RenderImage(float gaussianBlurStDev, float DPI){
return create_async([this, gaussianBlurStDev,DPI]{
if (!IsInitialized){
throw ref new Platform::Exception(E_FAIL, "The class has not been initialized.");
}

// Render it
UINT imageWidth;
UINT imageHeight;
m_wicFormatConverter->GetSize(&imageWidth, &imageHeight);

// Create a Bitmap Source Effect.
DX::ThrowIfFailed(m_d2dContext->CreateEffect(CLSID_D2D1BitmapSource, &m_bitmapSourceEffect));

// Set the BitmapSource Property to the BitmapSource generated earlier.
DX::ThrowIfFailed(
m_bitmapSourceEffect->SetValue(D2D1_BITMAPSOURCE_PROP_WIC_BITMAP_SOURCE, m_wicFormatConverter.Get())
);

// Create the Gaussian Blur Effect.
DX::ThrowIfFailed(m_d2dContext->CreateEffect(CLSID_D2D1GaussianBlur, &m_gaussianBlurEffect));

// Set the input to receive the bitmap from the BitmapSourceEffect.
m_gaussianBlurEffect->SetInputEffect(0, m_bitmapSourceEffect.Get());

// Set the blur amount.
DX::ThrowIfFailed(m_gaussianBlurEffect->SetValue(D2D1_GAUSSIANBLUR_PROP_STANDARD_DEVIATION, gaussianBlurStDev));
DX::ThrowIfFailed(m_gaussianBlurEffect->SetValue(D2D1_GAUSSIANBLUR_PROP_BORDER_MODE, D2D1_BORDER_MODE_HARD));

// Begin drawing.
m_d2dContext->BeginDraw();

m_d2dContext->Clear(D2D1::ColorF(D2D1::ColorF::CornflowerBlue));

// Draw the scaled and blurred image.
m_d2dContext->DrawImage(m_gaussianBlurEffect.Get());

// We ignore D2DERR_RECREATE_TARGET here. This error indicates that the device
// is lost. It will be handled during the next call to Present.
HRESULT hr = m_d2dContext->EndDraw();
if (hr != D2DERR_RECREATE_TARGET)
{
DX::ThrowIfFailed(hr);
}

});
}

/// <summary>
/// Initializes all device resources and the image.
/// You need to call this method before doing other things.
/// </summary>
/// <param name="ImageDataStream">The stream containing the source image.</param>
/// <param name="DPI">Indicates the current display's DPI.</param>
IAsyncAction^ BlurEffectImageProcessor::DataInitialize(IRandomAccessStream^ ImageDataStream, float DPI){
// DirectXBase::Initialize(Window, DPI);
return create_async([this,ImageDataStream, DPI]{
// Initialize Devices
CreateDeviceResources();

DX::ThrowIfFailed(CoCreateInstance(
CLSID_WICImagingFactory1,
nullptr,
CLSCTX_INPROC_SERVER,
IID_PPV_ARGS(&m_wicImagingFactory)
)
);

DX::ThrowIfFailed(
CoCreateInstance(
CLSID_WICImagingFactory,
nullptr,
CLSCTX_INPROC_SERVER,
IID_PPV_ARGS(&m_wicImagingFactory2)
)
);

// Now we have the image source and we can decode it.
ImageBuffer = ref new Buffer(ImageDataStream->Size);
auto op = create_task(ImageDataStream->ReadAsync(ImageBuffer, ImageDataStream->Size, InputStreamOptions::None)).then([this,DPI](IBuffer^ ImageBufferData){
DX::ThrowIfFailed(
m_wicImagingFactory2->CreateDecoderFromStream(createIStreamFromIBuffer(ImageBufferData), nullptr, WICDecodeMetadataCacheOnDemand,
&m_wicDecoder)
);

// Get data ready
DX::ThrowIfFailed(
m_wicDecoder->GetFrame(0, &m_wicFrameDecode)
);
DX::ThrowIfFailed(
m_wicImagingFactory2->CreateFormatConverter(&m_wicFormatConverter)
);

DX::ThrowIfFailed(
m_wicFormatConverter->Initialize(
m_wicFrameDecode.Get(),
GUID_WICPixelFormat32bppBGRA,
WICBitmapDitherTypeNone,
nullptr,
0.0f,
WICBitmapPaletteTypeCustom
)
);

// Create output bitmap & get it ready
UINT Width;
UINT Height;
m_wicFrameDecode->GetSize(&Width, &Height);
m_wicImagingFactory2->CreateBitmap(Width, Height, GUID_WICPixelFormat32bppBGRA, WICBitmapCreateCacheOption::WICBitmapCacheOnDemand, &m_wicBitmap);
D2D1_SIZE_U bitmapSize = D2D1::SizeU(Width, Height);
D2D1_PIXEL_FORMAT bitmapPixelFormat = D2D1::PixelFormat(DXGI_FORMAT_B8G8R8A8_UNORM, D2D1_ALPHA_MODE_IGNORE);
D2D1_BITMAP_PROPERTIES1 bitmapProp1 = D2D1::BitmapProperties1(D2D1_BITMAP_OPTIONS_TARGET,bitmapPixelFormat, DPI, DPI);
m_d2dContext->CreateBitmap(
D2D1::SizeU(Width, Height),
nullptr,
Width * 4, // 4 bytes for B8G8R8A8
bitmapProp1,
&m_d2dBitmap1
);

m_d2dContext->SetTarget(m_d2dBitmap1.Get());

IsInitialized = true;

return;
});

op.wait();
});
}

/// <summary>
/// Get the final image.
/// REMEMBER to call the DataInitialize method first.
/// You can call this method before calling RenderImage, but you will get the original image.
/// </summary>
/// <param name="DPI">Indicates the current display's DPI.</param>
IAsyncOperation<IRandomAccessStream^>^ BlurEffectImageProcessor::GetImageAsBitmap(float DPI){
return create_async([this,DPI]{
if (!IsInitialized){
throw ref new Platform::Exception(E_FAIL, "The class has not been initialized.");
}
// Render the bitmap use WIC.
ComPtr<IWICBitmap> m_iwicBitmap;
ComPtr<IWICStream> m_iwicStream;
ComPtr<IWICBitmapEncoder> m_iwicBitmapEncoder;
ComPtr<IWICBitmapFrameEncode> m_iwicBitmapFrameEncode;
ComPtr<IWICImageEncoder> m_iwicImageEncoder;
WICImageParameters* m_imageparm = new WICImageParameters();
D2D1_PIXEL_FORMAT m_pixel_format = D2D1_PIXEL_FORMAT();
ComPtr<IStream> m_iStream;
ID2D1Image* m_id2d1image;
UINT height;
UINT width;

// Since we can't create IStream directly in Windows Runtime, we need creating InMemoryRandomAccessStream and convert it
IRandomAccessStream^ data = ref new InMemoryRandomAccessStream();

DX::ThrowIfFailed(
CreateStreamOverRandomAccessStream(data, IID_PPV_ARGS(&m_iStream))
);

// Get size, we need it later
DX::ThrowIfFailed(
m_wicFrameDecode->GetSize(&width, &height)
);

// Create bitmap
DX::ThrowIfFailed(
m_wicImagingFactory2->CreateBitmap(width, height, GUID_WICPixelFormat32bppBGRA, WICBitmapCreateCacheOption::WICBitmapCacheOnDemand, &m_iwicBitmap)
);

// Create WIC Stream
DX::ThrowIfFailed(
m_wicImagingFactory->CreateStream(&m_iwicStream)
);

// Initialize WIC Stream from IStream that we converted
DX::ThrowIfFailed(
m_iwicStream->InitializeFromIStream(m_iStream.Get())
);

// Create encoder
DX::ThrowIfFailed(
m_wicImagingFactory2->CreateEncoder(GUID_ContainerFormatPng, nullptr, &m_iwicBitmapEncoder)
);

// Create image encoder
DX::ThrowIfFailed(
m_wicImagingFactory2->CreateImageEncoder(m_d2dDevice.Get(), &m_iwicImageEncoder)
);

// Initialize
DX::ThrowIfFailed(
m_iwicBitmapEncoder->Initialize(m_iwicStream.Get(), WICBitmapEncoderCacheOption::WICBitmapEncoderNoCache)
);

// Create new frame for the bitmap
DX::ThrowIfFailed(
m_iwicBitmapEncoder->CreateNewFrame(&m_iwicBitmapFrameEncode,nullptr)
);

// Set properties
m_iwicBitmapFrameEncode->Initialize(nullptr);
m_iwicBitmapFrameEncode->SetSize(width, height);
WICPixelFormatGUID format = GUID_WICPixelFormat32bppBGRA;
m_iwicBitmapFrameEncode->SetPixelFormat(&format);
m_d2dContext->GetTarget(&m_id2d1image);
m_imageparm->DpiX = DPI;
m_imageparm->DpiY = DPI;
m_pixel_format.alphaMode = D2D1_ALPHA_MODE_IGNORE;
m_pixel_format.format = DXGI_FORMAT_B8G8R8A8_UNORM;
m_imageparm->PixelFormat = m_pixel_format;
m_imageparm->PixelHeight = height;
m_imageparm->PixelWidth = width;

// Write frame
DX::ThrowIfFailed(
m_iwicImageEncoder->WriteFrame(m_id2d1image, m_iwicBitmapFrameEncode.Get(), m_imageparm)
);

// Commit
DX::ThrowIfFailed(
m_iwicBitmapFrameEncode->Commit()
);

DX::ThrowIfFailed(
m_iwicBitmapEncoder->Commit()
);

// Now we successfully got the image
// Convert it to stream.
// Reference: MSDN
Windows::Storage::Streams::IRandomAccessStream^ comRAS;
IUnknown* p11 = reinterpret_cast<IUnknown*>(comRAS);

static const GUID guidIRandomAccessStream =
{ 0x905a0fe1, 0xbc53, 0x11df, { 0x8c, 0x49, 0x00, 0x1e, 0x4f, 0xc6, 0x86, 0xda } };

DX::ThrowIfFailed(
CreateRandomAccessStreamOverStream(m_iwicStream.Get(), BSOS_DEFAULT, guidIRandomAccessStream, (void**)&p11)
);

// Return result
return reinterpret_cast<IRandomAccessStream^>(p11);
});
}


D2DEffects.h

#pragma once

#include "DirectXBase.h"

using namespace Windows::Storage::Streams;
using namespace Windows::Foundation;
using namespace Windows::UI::Core;

namespace Light{
namespace UI{
namespace Effects{
namespace Direct2D{
namespace BlurEffect{
public ref class BlurEffectImageProcessor sealed
{
public:
BlurEffectImageProcessor();
IAsyncAction^ DataInitialize(IRandomAccessStream^ ImageDataStream,float DPI);
IAsyncAction^ RenderImage(float gaussianBlurStDev, float DPI);
IAsyncOperation<IRandomAccessStream^>^ GetImageAsBitmap(float DPI);
void CreateDeviceResources();
private:
Microsoft::WRL::ComPtr<ID2D1Effect> m_bitmapSourceEffect;
Microsoft::WRL::ComPtr<ID2D1Effect> m_gaussianBlurEffect;

Microsoft::WRL::ComPtr<IWICBitmapDecoder> m_wicDecoder;
Microsoft::WRL::ComPtr<IWICBitmapFrameDecode> m_wicFrameDecode;
Microsoft::WRL::ComPtr<IWICFormatConverter> m_wicFormatConverter;
Microsoft::WRL::ComPtr<IWICImagingFactory2> m_wicImagingFactory2;
Microsoft::WRL::ComPtr<IWICImagingFactory> m_wicImagingFactory;
Microsoft::WRL::ComPtr<ID2D1Device1> m_d2ddevice1;
Microsoft::WRL::ComPtr<ID2D1Device> m_d2ddevice;
Microsoft::WRL::ComPtr<ID3D11Device> m_d3d11device;

// Direct3D device
Microsoft::WRL::ComPtr<ID3D11Device> m_d3dDevice;

// Direct2D objects
Microsoft::WRL::ComPtr<ID2D1Device> m_d2dDevice;
Microsoft::WRL::ComPtr<ID2D1DeviceContext> m_d2dContext;
Microsoft::WRL::ComPtr<ID2D1Bitmap1> m_d2dBitmap1;
Microsoft::WRL::ComPtr<IWICBitmap> m_wicBitmap;

int m_width;
int m_height;
IBuffer^ ImageBuffer;
bool IsInitialized;
};
}
}
}
}
}


And don't forget to link the input libraries: dxgi.lib, dwrite.lib, d2d1.lib, d3d11.lib, windowscodecs.lib, etc.

A Peek at Hangzhou Xuejun High School (杭州学军中学半日游)

Hi, I'm Xuejun. :)

2014/5/1 7:50 PM Update 1:

1. The lectures were pretty good; the Chinese session carried a lot of information.

2. Why does Xuejun have so many Wi-Fi networks?

3. One point from the English session: high school repackages and extends a lot of junior-high material.

4. The parent in that presentation is actually our teacher 😉

WARNING: lots of images ahead. Think twice before continuing if you are on a metered connection.

Actually, I walk past Xuejun every day.
With Zhongkao (the senior high school entrance exam) approaching, all sorts of things have been happening.
Last week our school handed out a flyer.

Advertisement

Since I had nothing else to do and Xuejun is close to home, I went.

What follows is mostly pictures.

ENVIRONMENT

WP_20140501_16_59_02_Pro
An ordinary one.
WP_20140501_16_58_50_Pro
I love the feeling of sunlight shooting straight into the camera.
WP_20140501_16_07_52_Pro

Taken on the 4th floor of the science building. Also peeked at the Robotics Lab. Who says you can't see tall buildings?

MEETING

Hi, I'm Xuejun :) (video)

Link: Here, password: 123

WP_20140501_14_07_50_Pro_processed

Processed with Office Lens whiteboard mode; no comment on the text content.

WP_20140501_15_35_37_Pro WP_20140501_15_31_06_Pro
Quite unusual (let's say novel) problems.
WP_20140501_16_35_50_Pro
I still haven't figured out why.
WP_20140501_16_46_44_Pro
On the importance of connective words.

SOMETHING INTERESTING

WP_20140501_17_18_23_Pro (2) WP_20140501_17_13_09_Pro (2) WP_20140501_17_11_45_Pro (2)

Other notes:
1. Wenlan is not much different from Xuejun: nice environment, same class bells.
2. The lectures were actually quite interesting.
3. Dinner first; I'll update if I think of more.

Images: http://1drv.ms/1mgACiY