Vi love for the *nix coders and programmers - VsVim mode in Visual Studio
Unit Testing OData Web API/Entity Framework Applications
The cost of fixing bugs is lower if they are found early in the software development cycle. Unit tests greatly reduce this cost by pushing quality upstream. They help prevent regression bugs and, when coupled with a TDD approach, provide a solid foundation for a complete test suite. Furthermore, teams are striving to push code into production more frequently and with higher quality. To flourish in this DevOps space, it is imperative to have a pipeline of solid automated tests. Traditionally, unit tests have been written for class library projects or middle-tier/business logic projects.
In this blog post, I will demonstrate how to easily unit test an OData-based Web API service that consumes data from a SQL database using Entity Framework. The goal is to write unit tests that stub out the actual database calls. We will fake the DB context and use the common DI pattern of constructor injection. The result is a highly testable, maintainable, and cohesive codebase.
Create the ASP.NET Web Application
First, open Visual Studio 2013, create an ASP.NET Web Application, choose the MVC and Web API template, and check the Add Unit Tests checkbox.
Create the DB
Next, create a SQL database as the backend for this project. Use the SQL Server edition of your choice; I am using SQL Server 2014 for this post. We will use Entity Framework (EF 6.0.0.0) to connect to this database.
Name the database ODataDB.
Next up, create a table in the database named Application with the following properties.
Create the Application Table
Reverse Engineer Code First from the DB
Right-click the Web API project we created to reverse engineer code from the database. You will need to install the Entity Framework extensions for Visual Studio.
Make sure the Entity Framework DLLs are referenced in the project.
We now have ApplicationMap.cs and ODataDBContext.cs created, along with the mapping.
Create ODataController
Next up, install the Web API OData libraries from NuGet for the Web API project. I am using the latest version of ASP.NET Web API 2.2 for OData v1-3.
Now let's create an ODataController based on Entity Framework and call it ApplicationController.
We will use ApplicationMap and ODataDBContext as the model and data context class, respectively.
ApplicationController is now created, and we have our Get/Post/Put operations in place in the controller class.
Press F5, and the OData service is up and running!
Create Unit Test for the Application Controller
Now that the application is running, we want to write unit tests for it.
First and foremost, we need to fake out the DB context in our application. Instead of using the ODataDBContext, we will use a FakeDBContext. To do that, we first need to create an interface for ODataDBContext. We will call it IODataDBContext and have ODataDBContext implement it.
Create Interface IODataDBContext
Implement IODataDBContext from ODataDBContext
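Since the original code screenshots are not reproduced here, a minimal sketch of what the interface and implementation might look like follows (the Applications property name and member set are assumptions based on a typical EF code-first context, not necessarily the post's exact code):

```csharp
using System;
using System.Data.Entity;

// Interface extracted from the generated context so it can be faked in unit tests.
public interface IODataDBContext : IDisposable
{
    IDbSet<Application> Applications { get; }
    int SaveChanges();
}

// The generated EF context now implements the interface; DbContext already
// provides SaveChanges() and Dispose(), so no extra members are needed.
public partial class ODataDBContext : DbContext, IODataDBContext
{
    public IDbSet<Application> Applications { get; set; }
}
```

Because the interface exposes only what the controller needs, any class that can supply an in-memory IDbSet can stand in for the real context.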
Run the application to verify that everything works as before. Now that IODataDBContext is in place, let's create the FakeDBContext in the unit test project.
Unit Test Project Plumbing
Add the following to the Unit Test Project.
- Add FakeODataDBContext to the unit test project. Ensure that FakeODataDBContext implements the IODataDBContext Interface.
- Add Entity framework dll to the references.
- Add System.Web.Http dll to the references.
- Add System.Web.Http.OData dll to the references.
- Add FakeDBSet class to the project.
Notice the constructor in the FakeODataDBContext: it calls a method named GetApp(), which creates the fake Application object that is returned to the unit test when it calls the ODataController for Application.
Below is the code for GetApp(). Here we are using the FakeDbSet to create a fake Application object in memory for the purposes of the unit test.
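The code images from the original post are not available here; a minimal sketch of the fake context, assuming a FakeDbSet&lt;T&gt; that implements IDbSet&lt;T&gt; in memory and a hypothetical Application shape with Id and Name columns, might look like this:

```csharp
using System.Data.Entity;

// In-memory stand-in for ODataDBContext; no database is ever touched.
public class FakeODataDBContext : IODataDBContext
{
    public FakeODataDBContext()
    {
        Applications = GetApp();
    }

    public IDbSet<Application> Applications { get; private set; }

    // Creates a fake Application set in memory for the unit tests.
    private static IDbSet<Application> GetApp()
    {
        return new FakeDbSet<Application>
        {
            new Application { Id = 1, Name = "Test App" }
        };
    }

    public int SaveChanges()
    {
        return 0; // nothing to persist for the in-memory fake
    }

    public void Dispose() { }
}
```

The fake set lives only for the lifetime of the test, so each test starts from a known, isolated data state.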
Before we start writing unit tests, we need to tweak the controller code to expose the ODataDBContext object as a constructor parameter. This is a classic case of constructor dependency injection, which makes the code testable.
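The tweak might look like the following sketch (the Queryable attribute and the Get action are illustrative of a typical Web API 2 OData v3 controller, not necessarily the post's actual code):

```csharp
using System.Linq;
using System.Web.Http.OData;

public class ApplicationController : ODataController
{
    private readonly IODataDBContext _db;

    // Default constructor preserves normal runtime behavior.
    public ApplicationController() : this(new ODataDBContext()) { }

    // Unit tests inject a fake context through this constructor.
    public ApplicationController(IODataDBContext db)
    {
        _db = db;
    }

    [Queryable]
    public IQueryable<Application> Get()
    {
        return _db.Applications;
    }
}
```

Keeping the parameterless constructor means the Web API pipeline can still create the controller normally, while tests use the overload.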
Now, let's write the unit test for the ApplicationController. Here is the code for the unit test; it is a Visual Studio unit test.
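Along the lines of the sketches above, a Visual Studio (MSTest) unit test might look like this (names such as FakeODataDBContext and the seeded "Test App" record are assumptions carried over from the fake-context sketch):

```csharp
using System.Linq;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class ApplicationControllerTests
{
    [TestMethod]
    public void Get_ReturnsApplications_WithoutHittingTheDatabase()
    {
        // Arrange: inject the fake context instead of the real EF context.
        var controller = new ApplicationController(new FakeODataDBContext());

        // Act: call the controller action directly.
        var result = controller.Get().ToList();

        // Assert: the data came from the in-memory fake.
        Assert.AreEqual(1, result.Count);
        Assert.AreEqual("Test App", result[0].Name);
    }
}
```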
As you can see above, we have successfully written a unit test for the ODataController API. The unit test exercises the code paths of your controller and executes any logic inside it without making an actual database call. The FakeDBContext lets you get/set data in memory and disposes of it once the unit test exits.
You can potentially create unit tests for all your controller methods (Get, Post, Put, Delete) across your application and ensure very high code coverage and a highly testable and maintainable application.
Hopefully this post helps you start writing unit tests for ODataController services using Entity Framework.
National Consumer Fraud Week 18 - 24 May 2015
This week marks the start of this year's National Consumer Fraud Week. Microsoft is proud to be a continued supporter of this important initiative from the Australasian Consumer Fraud Taskforce.
If you haven’t stopped and thought about how you keep your private information secure, chances are you could be leaving it wide open for scammers to steal. When scammers get your details, they can use them for all sorts of identity crime such as making unauthorised purchases on your credit card, or using your identity to open accounts such as banking, telephone or energy services, take out loans or carry out other illegal business under your name.
Dynamics CRM Online 2015 Update 1: New SharePoint Integration Features
Hello, everyone.
In this post I will introduce the new SharePoint integration features delivered in Microsoft Dynamics CRM Online 2015 Update 1.
Overview
This release provides the following new features:
- Integration with on-premises SharePoint servers
- A new configuration wizard
- OneNote integration (requires SharePoint integration)
Setting Up SharePoint Integration
This walkthrough assumes that SharePoint Online is already deployed on the same Office 365 tenant as Dynamics CRM Online. A trial subscription works as well.
Follow these steps:
1. Connect to Microsoft Dynamics CRM Online in a browser.
2. Click Settings | Document Management | Enable Server-Based SharePoint Integration.
3. An overview of the integration is displayed. Click "Next".
4. Specify the location of the SharePoint site. An on-premises server can also be selected; here we will configure the online option. Click "Next".
Note: For details on using an on-premises server, see the following site:
https://technet.microsoft.com/en-us/library/dn894709.aspx (English)
5. Specify the SharePoint site to connect to, and click "Next".
6. Once validation succeeds, click the "Enable" button.
7. When the following screen appears, the setup is complete. To continue with the document management settings, check "Open the Document Management Wizard" and click the "Finish" button.
8. Select the entities for which you want to enable document management. Enter the SharePoint site in the site URL field and click "Next".
9. Decide on the folder structure. Here we will keep the default. Click "Next".
10. A confirmation pop-up appears; click "OK".
11. Once the operation succeeds, click the "Finish" button.
Setting Up OneNote Integration
Once SharePoint integration is complete, the OneNote integration menu appears.
Enable OneNote integration with the following steps:
1. Click the newly displayed OneNote Integration.
2. Select the entities you want to integrate with OneNote, and click "Finish". If an entity you expect is not listed, go back to the document management settings and confirm that the entity is enabled there.
Verifying the Setup
Finally, let's verify that everything works.
1. Open any record of an entity for which OneNote integration is enabled.
2. Click the ONENOTE tab in the social pane. If no notebook exists yet, one is created at this point, so the first access takes a little while. A notebook named "Untitled" is created by default.
3. Click the "Untitled" notebook. The integrated SharePoint site opens.
Summary
As shown above, the setup is very simple. Next time I will cover OneNote integration in more detail. Stay tuned!
- Kenichiro Nakamura
One-Click Installation of a TFS 2015 Server and Build Environment
Yesterday I came across the line "programmers spend 300 hours falling in love, but 1,000 hours installing dependencies," and it inexplicably hit home. So I decided on the spot to use automation to boost productivity from now on.
The installation and configuration of TFS 2015 itself is fairly streamlined, but because it supports so many build systems, provisioning a machine still involves quite a few headaches. This script tool automatically installs TFS 2015 RC and the software environment used by its build system, which saves system administrators a lot of trouble.
Please download the installation scripts from the attachment. They cover TFS 2015 RC, VS 2015 RC, and the Python, Node.js, and Java development environments; the SQL database is not included.
Everything is hosted on GitHub: https://github.com/8106040/tfs2015
Notes:
There are two scripts, and both are simple to use: just run a script to perform its installation. All installation files are downloaded from the network, so installation time depends on your connection speed.
tfs-server-install.ps1 is the TFS server installation script.
tfs-linuxagent.sh is the Linux build agent installation script.
Caveats:
1. An Azure virtual machine is recommended for trying this out.
2. If you use another environment, make sure the machine has an internet connection.
3. The scripts do not yet handle configuration; they only pre-install the environment software.
Channel9 Edge chat, "View of DevOps inside Microsoft"
I recently stumbled onto an interesting chat with Brian Harry and Donovan Brown about the experiences of teams within Microsoft adopting DevOps. I wrote up an interpretation of their chat that hopefully captures and shares their perspective and thinking.
View of DevOps inside Microsoft
http://channel9.msdn.com/Shows/Edge/View-of-DevOps-Inside-Microsoft
00:00 Intros
01:45 Before DevOps... Ops and Dev teams' goals misaligned.
- Operations team's goal is to keep the system up, so it is understandably resistant to changes, since changes are the #1 cause of instability.
- Dev team goal is to deliver new value to customers, requiring changes.
02:45 After DevOps... Combined Ops and Dev into single Org, better alignment for accountability/processes.
04:00 More balanced delivery of features, capability, value and stability.
04:30 Can you have change and stability at the same time? Yeah, but you have to be careful.
05:30 In the pre-DevOps world you'd reboot the machine, which restores the system, but the problems recur.
06:00 In DevOps world you focus on root cause, improving rapid reaction, improving building and pushing out fixes, continuously improving system stability.
07:00 People, Process and Products need to be aligned.
08:00 Need automation and may require shift in culture where Devs are accountable for the service.
08:30 Most systems fire alerts, lots of alerts. Results in missing production issues, low signal to noise ratio.
08:45 Noisy alerts need to be treated as bugs and fixed.
09:45 Make alerts the Dev's problem, they'll fix it.
10:45 Devs need to feel the consequences for their actions, otherwise they don't learn.
11:00 Traditionally, getting the site back up is the #1 thing.
11:30 Now, higher pri is capturing info required to root cause and fix.
13:00 "Automate it all" to help up level the work/analysis Ops and Devs are doing.
13:45 Technical debt is anything that slows down progress, inefficient processes/practices.
15:00 Release management involves Ops to review deployments, but Devs do the deployment.
16:45 Regularly organize Red and Blue teams where one team attacks the service, while other defends. Great way to find issues before Customers or Hackers do.
18:30 Unit Testing - Build up over time, earlier in the cycle the better/cheaper to implement.
19:30 Test in production - Monitor Synthetic transactions, usage, performance, exceptions and logs.
21:40 Monthly Service Review (20-30 slides), look at availability, time to detect, time to mitigate, usage, adoption and leaving rates, financial revenue and costs.
23:30 Application Insights provides live site info/reports.
24:20 http://www.sonarqube.org helps to track technical debt, provides code and test metrics/analysis/graphing.
25:00 Release Management tool - Provides info on versions deployed to environment, metrics, time to move, time to release.
25:30 Combined Engineering resulted in one team having common goals where folks are better aligned. Team has much better shared view on balance between stability and change.
27:30 Harry may use http://en.wikipedia.org/wiki/5_Whys to root cause; he believes this often leads to needing better a) automation, b) training, c) documentation.
Avoid stopping asking questions too early; keep asking and looking deeper for the real root cause. Then fix that root cause and the systematic issues.
Using PowerShell to get VM IP addresses
Here is a handy PowerShell snippet:
Get-VM | Where-Object {$_.ReplicationMode -ne "Replica"} | Select-Object -ExpandProperty NetworkAdapters | Select-Object VMName, IPAddresses, Status
Which delivers an output like this:
As you can see, it lists the IP addresses of all the virtual machines running under Hyper-V. A couple of notes to make about this:
- I use Hyper-V Replica heavily. So I have developed the habit of always filtering out Replicas - so I do not worry about them.
- I look at the network adapter status, because (as you can see) it allows me to tell the difference between a VM without an IP address - and a VM that is not reporting whether it has an IP address or not.
Hopefully you will find this useful in your environments.
Cheers,
Ben
[MVA Course Guide] Microsoft Azure IaaS Deep Dive (Arabic)
Arabic Speaker? Interested in watching Microsoft Virtual Academy courses in your local language?
Here you go, the first Arabic course on MVA "Microsoft Azure IaaS Deep Dive".
The course is currently being recorded in the 4Afrika MVA Studio, and I will share in this post everything about the course, from the overview and the slide decks to the links for the online/offline Q&A sessions that will follow the course's publication. Any updates related to the course will also be added here.
Overview
Learn how to leverage Microsoft Azure infrastructure-as-a-service capabilities; no previous experience with Azure is needed. The course walks through creating and configuring Azure Virtual Machines, taking into consideration storage, Azure Virtual Networks, and cross-premises connectivity to get things up and running in the cloud.
The course mainly targets IT pros and requires familiarity with virtual machines, networks, Windows Server, and Hyper-V; no previous experience with cloud computing is needed.
Course Modules
0/ Azure Virtual Machines
1/ Closer Look on Azure Virtual Machines
2/ Storage
3/ Virtual Networks
4/ Closer Look on Virtual Networks
5/ Migrating Hyper-V with PowerShell
Watch the Course on MVA
Not Published Yet [URL Placeholder]
Recording in progress...
Slide Decks
Once the course is published, you will find the slide decks here: http://aka.ms/azureiaasarabic
Q&A Sessions
Online Q&A Session
Date: 18th of June, 2015
Time: 2 PM (UTC +02:00)
Kindly share with your community.
For post alerts and more updates, please feel free to follow me on Twitter @_SherifElMahdi
Building Mobile Apps with Ionic and Monaca
Cordova makes it easy to build an app for iOS, Android, or Windows using web technologies. While the Blank project templates provide all the raw materials to build your basic “Hello World”, you’ve told us that you’d like to see richer starting points that use best practices vetted by the community, well-modeled design patterns, and advice on how to make slick-looking apps in less time. We hear you!
We’ve collaborated with some of the most respected mobile developers in the industry to bring great starting points into Visual Studio. Specifically, we called upon the wizards at Ionic and Monaca to give you project templates based on their libraries which use AngularJS routers, modules, controllers and factories. The templates allow you to use the Ionic or Onsen UI frameworks to build mobile apps with a native look and feel, right within Visual Studio.
To build Apache Cordova apps based on these project templates, simply head over to the Visual Studio Gallery and install the Ionic templates and Monaca templates individually. In a few clicks, you will be able to see these templates in the New Project dialog.
Note: Currently, the Ionic templates are only available for JavaScript. We are working with the Ionic team to provide a TypeScript version in the near future.
After creating your project, you’re all set to begin developing in Visual Studio. You can run and debug these apps in Ripple, devices, and emulators just like the default Apache Cordova projects. Improved JS and HTML Intellisense for AngularJS also means that you’ll spend less time consulting documentation and more time coding.
If you’re familiar with the Ionic CLI, you can even drop into the command line thanks to the interoperability improvements we introduced in Visual Studio 2015 RC. What you see in Visual Studio mirrors what you see in the file system!
Better Intellisense Support
A few months ago, Jordan Matthiesen's blog post covered some of the editor improvements that make it easier to work with AngularJS. Because both the Ionic and Monaca frameworks are based on AngularJS, the experience working with these frameworks is much better thanks to those editor enhancements and a little addition.
First, both the Ionic and Monaca project templates provide d.ts files which contain metadata to help Visual Studio show class and method definitions for JavaScript libraries. This means you get great intellisense out-of-the-box:
Second, Visual Studio 2015 RC ships with Intellisense support for Angular directives and ng- attributes in your HTML markup. This means you get Intellisense for your web components in both Ionic and Onsen UI:
This is just the beginning!
In the future, we would like to provide even more templates in Visual Studio that rely on more of the frameworks, libraries, and tools that you care about. If you are interested in helping out or have ideas on what templates we should deliver next, please post on UserVoice, Twitter, or StackOverflow. You can also ping me directly via e-mail.
Cheers,
Kirupa
Kirupa Chinnathambi, Program Manager, Visual Studio Client Platform Team (@kirupa)
Kirupa Chinnathambi is a Program Manager on the Visual Studio team where he spends a lot of time thinking about how to improve the HTML developer experience. When he isn't busy writing short bios of himself for blog posts, he can be found on Twitter, Facebook, and on kirupa.com.
Windows Server 2016 Failover Cluster Troubleshooting Enhancements - Active Dump
Active Dump
The following enhancement is not specific to Failover Cluster or even Windows Server. However, it has significant advantages when you are troubleshooting and getting memory.dmp files from servers running Hyper-V.
Memory Dump Enhancement – Active memory dump
Servers used as Hyper-V hosts tend to have a significant amount of RAM, and a complete memory dump includes processor state as well as a dump of everything in RAM, which makes the .dmp file for a full dump extremely large. On these Hyper-V hosts, the parent partition usually accounts for a small percentage of the system's overall RAM, with the majority allocated to virtual machines (VMs). It is the parent partition's memory that is interesting when debugging a bugcheck or other bluescreen; the VM memory pages are not important for diagnosing most problems.
Windows Server 2016 introduces a dump type of “Active memory dump”, which filters out most memory pages allocated to VMs and therefore makes the memory.dmp much smaller and easier to save/copy.
As an example, I have a system with 16 GB of RAM running Hyper-V, and I initiated bluescreens with different crash dump settings to see what the resulting memory.dmp file size would be. I also tried "Active memory dump" with no VMs running and with two VMs taking up 8 of the 16 GB of memory to see how effective it would be:
| | Memory.dmp in KB | % Compared to Complete |
| --- | --- | --- |
| Complete Dump | 16,683,673 | 100% |
| Active Dump (no VMs) | 1,586,493 | 10% |
| Active Dump (VMs with 8GB RAM total) | 1,629,497 | 10% |
| Kernel Dump (VMs with 8GB RAM total) | 582,261 | 3% |
| Automatic Dump (VMs with 8GB RAM total) | 587,941 | 4% |
*The size of the Active Dump as compared to a complete dump will vary depending on the total host memory and what is running on the system.
In looking at the numbers in the table above, keep in mind that the Active Dump is larger than the kernel dump, but it includes the user-mode space of the parent partition while being only 10% of the size of the complete dump that would otherwise have been required to capture that user-mode space.
Configuration
The new dump type can be chosen through the Startup and Recovery dialog as shown here:
The memory.dmp type can also be set through the registry under the following key (when changing it directly in the registry, the change will not take effect until the system is restarted): HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\CrashControl\
Note: Information on setting memory dump types directly in the registry for previous versions can be found in a blog here.
To configure the Active memory.dmp, there are two values that need to be set; both are REG_DWORD values.
HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\CrashControl\CrashDumpEnabled
The CrashDumpEnabled value needs to be 1, which is the same as a complete dump.
And
HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\CrashControl\FilterPages.
The FilterPages value needs to be set to 1.
Note: The FilterPages value will not be found under the HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\CrashControl\ key unless the "Startup and Recovery" GUI dialog is used to set the dump type to "Active memory dump", or you manually create and set the value.
If you would like to set this via Windows PowerShell, here is the flow and example:
- Gets the value of CrashDumpEnabled
- Sets the value of CrashDumpEnabled to 1 (so effectively this is now set to Complete dump).
- Gets the value of FilterPages (note that there is an error because this value doesn’t exist yet).
- Sets the value of FilterPages to 1 (this changes it from Complete dump to Active dump)
- Gets the value of FilterPages to verify it was set correctly and exists now.
Here is a text version of what is shown above, to make it easier to copy/paste:
Get-ItemProperty -Path HKLM:\System\CurrentControlSet\Control\CrashControl -Name CrashDumpEnabled
Set-ItemProperty -Path HKLM:\System\CurrentControlSet\Control\CrashControl -Name CrashDumpEnabled -Value 1
Get-ItemProperty -Path HKLM:\System\CurrentControlSet\Control\CrashControl -Name FilterPages
Set-ItemProperty -Path HKLM:\System\CurrentControlSet\Control\CrashControl -Name FilterPages -Value 1
Get-ItemProperty -Path HKLM:\System\CurrentControlSet\Control\CrashControl -Name FilterPages
"Updating the content type failed" while publishing the InfoPath form
While trying to publish an InfoPath form to a SharePoint forms library, the publish might fail after a long time with the error message "Updating the content type failed". You might also see duplicate columns in the content type in the library settings.
The publish can fail during the final "validating the content type" step.
If we check the columns and content types in the settings of the forms library, we can see duplicate entries for most of the columns. Publishing an InfoPath form that contains promoted fields to the same library repeatedly can create these duplicate columns.
Steps to fix the issue:
1. We need to remove the duplicate entries.
2. We need to publish the InfoPath form by removing the promoted fields.
3. As it takes a lot of time to validate the content type, we need to increase the execution timeout in the web.config file of the web application.
<httpRuntime maxRequestLength="51200" executionTimeout="600" />
4. We need to do an IISRESET /noforce or recycle the app pool.
5. Once it is done, we need to republish the InfoPath form.
Published By: Madhan Kumar
Creating "Libros Gratis": Part 1 - Why?
This is the first in a series of posts about the creation of the "Libros Gratis para Windows" (Free Books for Windows) apps for Windows and Windows Phone. In this entry, I will talk about the motivations and considerations for building the app(s).
Being involved in the software business is like being involved in something magical. Have you watched Fullmetal Alchemist? Remember the "Equivalent Exchange" rule? Basically, you can't create stuff out of nothing: you have to give something in exchange, something equivalent. If you want ice, you have to give up water and heat. If you want a piece of furniture, you have to give up some wood, plastic, metal, and hard work... There are, however, a few ways in which we can create stuff whose result greatly exceeds what we give in exchange. And software is a great example.
It's like having a superpower. Software and information have been redefining the world at an incredibly fast pace for the last few decades. That's sped up with the popularization of the internet, and again as our devices become smaller, more portable, more social, and more capable. A lot of attention is given to the economic and technical enrichment derived from that magic; but I'm afraid many engineers (not all of them, of course) forget to pay attention to the social improvement potential of their superpowers...
Flash back to many years ago. Like, I don't know... 20 years ago, or something. When I was a little kid in Mexico City with an extreme hunger for knowledge. Knowledge back then (before the internet was popular, at least down there) came in the shape of books... But my family was neither fond of books nor in the position of acquiring them with enough pace to keep up with my hunger. "All right", my mom would say every 3 months or something, "we're going to the book store... I'm going to buy you 3 books, not expensive ones, and you better make them last 3 months"… They rarely were enough to keep me entertained for over a week.
My salvation came in the form of a monthly subscription to a delivery service of public domain or almost-public-domain books: "The Little Prince", "Robinson Crusoe", "Frankenstein", et al. They were relatively cheap, and a few years later you could find the whole collection for insanely cheap prices at some subway stations (Chabacano transfer, anyone :) ?). Then the internet came and, leaving aside the ridiculously slow dial-up connection speeds, everything was easier. But I wonder: had I not known that such a thing as "public domain" books existed, wouldn't I be either downloading illegal books or, even worse, finding a way to entertain my mind with something less healthy?
Let's come back to the present, then. There I was, wondering what I could do now to improve, at least a little bit, people's lives and success potential. I needed something that:
- Required the least possible amount of time, for reasons I'll explain in post #2,
- Could have an impact. In order to have an impact it would most likely need to be free for end users, from start to end,
- Preferably used the Windows Phone ecosystem for the most part, so it'd both reach the lower end of the Spanish-speaking countries' income bracket AND guarantee that my buddy Chris would get involved :P , and,
- Was based in something that I felt passionate about, so I had a good motivation to keep up with it.
So there we go... #2: Free. #3: Windows Phone. #4... Uhm... Coffee, dogs, my daughter, history, books, Canada, gardening... Wait wait, what did you say? Books? Free. Windows Phone. Books. FreeBooks for Windows and Windows Phone!
And eureka! The idea was born. There, I have it! An app to search for free (public domain) books on the Windows and Windows Phone platforms! Let's get coding then, right now!
Eh... Wait. Except for, you know, time :( … Well, I guess it'll have to wait...
Hold on. Wasn't there a set of tools to create apps from scratch, using data sources, web-based, that would basically end up giving you the app packages, super quickly? What was its name... Windows App Studio? All right, let's check it out...
And that's how this story continues in post #2: how the app itself was actually made :) .
Monday, Tuesday... Learn Day!
MSDN NZ Flash
Win10 development in VB and .NET - getting started
This is Part 1 of my "VB Win10 Apps" series. (this list is just my rough thoughts on what to write. I'll amend it based on what I get sucked into, and what areas you folks ask me to do in comments).
- Part 1: getting started (this post)
- Part 2: issues with common libraries - JSON.Net, SignalR, SharpDX, SQLite, LiveSDK
- Part 3: full-screen, tablet mode, and windowed apps
- Part 4: Direct2d graphics acceleration with Win2d
- Part 5: how to move forwards with Win8.1 or WP8.1 PCLs
- Part 6: experiences in porting a large "forms-over-data" app from PhoneSilverlight to Win10
- Part 7: writing an Azure service, and interacting with it from my Win10 app.
Win10 development in VB and .NET - getting started
I'm excited to be writing Win10 apps in VB. For me the biggest promise is
Write just a single app and it can run windowed on Desktops, and can run on tablets and phones and even HoloLens.
Over the coming weeks I'll be writing more odds and ends about Win10 development as I learn about them. Here's my first "getting started" post.
- Your machine should be running Windows 10, build 10074 or higher, from here: https://insider.windows.com/
- I installed it "clean" on my SurfacePro1. To do this, first download the ISO of Windows 10 from http://windows.microsoft.com/en-us/windows/preview-download, and then use "Rufus" http://rufus.akeo.ie/ to create an installable USB image from the ISO. Within Rufus, for my SurfacePro1, I had to choose "FAT32" and "GPT for UEFI". I don't know what you need on other machines.
- Install VS2015 RC from here: https://www.visualstudio.com/en-us/downloads/visual-studio-2015-downloads-vs.aspx
- During installation, you get to choose which components to install. You should choose "Windows 10 tools and emulators", and also "Windows Phone 8.1". Installation takes from 1 to several hours.
- It will install Windows 10 SDK version 10069.
- There are some glitches with the installer. If you go back to AddRemovePrograms and modify which components of VS2015 RC are installed, then it ends up installing more than it really should.
- When VS runs for the first time, it asks if you want to sign in. I always used to click "no" and it'd then ask me more configuration questions. But if I answer "yes" and I sign in, then it bypasses all those configuration questions, and ends up being quicker. Lesson learned: I now always sign in.
- You can run Win10 apps on emulators. The emulators came with the Win10 SDK, and so are running version 10069 of the Win10 operating system.
- You can run Win10 apps on your local machine. To do this, your local machine must be running Win10.Desktop version 10074 or higher and must be unlocked.
- Instructions for unlocking are at https://msdn.microsoft.com/en-us/library/windows/apps/dn706236.aspx. Here's a "TL;DR" version:
- Run gpedit.msc > Local Computer Policy > Computer Configuration > Administrative Templates > Windows Components > App Package Deployment. Then right-click to enable two things,
- "Allow all trusted apps to install"
- "Allow development of Windows Store apps"
- You can run Win10 apps on your phone. To do this your phone must be Win10.Mobile version 10080 or higher (which came out on May 14th). Your phone must be unlocked.
- Instructions for unlocking are at https://msdn.microsoft.com/en-us/library/windows/apps/dn706236.aspx. Here's a "TL;DR" version:
- Plug your phone in by USB. Run "Windows Phone Developer Registration" tool from your desktop. (I believe this tool is only installed when you check the option for Windows Phone 8.1 Development Tools during VS install).
- File > New > VB > Windows > Blank App (Windows Universal)
- To run on local machine: select Debug, then "x86", then "LocalMachine" in the debug toolbar.
- To run on emulator: select Debug, then "x86", then in the dropdown (that at first says "LocalMachine") pick Emulator.
- To run on your phone: connect by USB, then select Debug, then "ARM", then in the dropdown pick "Device"
Here are some exciting things about coding on VS2015 RC and Win10:
- Edit and Continue now works! At least, it works on local machine and emulator. It doesn't yet work on ARM.
- LINQ and lambdas can now be used in the Immediate and Watch windows!
- There are some great new features in the VB language. My favorites are
- string interpolation, e.g. Debug.WriteLine($"Point {pt} has length {pt.x:0.00}")
- null-conditional operator, e.g. Dim x As String = customers.FirstOrDefault()?.Name
Learning resources
Here's the basic "getting started" documentation on MSDN:
The first question you'll ask is: "If my app is supposed to be able to run windowed on desktop, and on tablets, and on phones, and on HoloLens, how should I design an adaptive-UI so it looks great on all of them?" This area is outside my expertise. Here are the training videos I've queued up but haven't yet got around to watching.
- What's New in XAML for Universal Apps
- Developing UWAs
- From small to big screen: XAML for UWAs
- New XAML tools
- Deep dive into XAML and UWA development
- Porting from Win8.1 or WPSilverlight to Win10
I'm also curious about how things will work on "Internet Of Things" devices.
And there are a few more talks on my to-watch list:
How to Collect and Analyze Azure Security Logs
Approximately two months ago, Mahesh Nayak, a Senior Program Manager on the Azure security team, published a whitepaper titled Microsoft Azure Security and Audit Log Management. This has been a very popular topic for many customers who have recently moved to Azure and are looking to collect and analyze the security logs for their cloud-based applications and VMs. In addition, many customers frequently ask during introductory meetings how they can detect threats and policy violations, achieve regulatory compliance, or search for potential network, host, or user anomalies in their deployments. So, I thought I would post a quick introduction to the broader community to share the availability of this information, which we regularly update on the Azure Trust Center.
At a high level, it is straightforward to begin collecting logs using Windows Event Forwarding (WEF) or the more advanced Azure Diagnostics once you have deployed Windows-based IaaS VMs in Azure. Azure Diagnostics can also be configured to collect logs and events from PaaS role instances. For IaaS VMs, you simply configure and enable the desired security events the same way you enable audit logging on Windows Servers in your on-premises datacenter. You have several configuration options, depending on whether your machines are joined to a domain or whether you need to use local policy configuration. For web applications, you can also enable IIS logging if that is your primary workload in Azure. Security data can be stored in storage accounts in the supported geographic locations of your choice to meet your data sovereignty requirements.
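As a rough sketch, an Azure Diagnostics (WAD 1.x) configuration fragment along these lines tells the diagnostics agent to collect Windows Security and System event logs and transfer them to the configured storage account. The quota, transfer period, and channel selectors shown here are illustrative assumptions, not values taken from the whitepaper:

```xml
<WadCfg>
  <!-- Local buffer quota before events are transferred to the storage account -->
  <DiagnosticMonitorConfiguration overallQuotaInMB="4096">
    <!-- Transfer collected events to Azure storage every minute -->
    <WindowsEventLog scheduledTransferPeriod="PT1M">
      <!-- Channel selectors: all System events and all Security events -->
      <DataSource name="System!*" />
      <DataSource name="Security!*" />
    </WindowsEventLog>
  </DiagnosticMonitorConfiguration>
</WadCfg>
```

In practice you would scope the Security selector with an XPath filter to the audit categories you actually enabled, rather than collecting the whole channel.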
Last, but not least, many customers also ask how they can export the logs to other systems, such as a SIEM. The whitepaper provides guidance and options not only for downloading the stored blobs from Azure storage and the operations logs, but also for viewing and using the access and usage reports from Azure Active Directory.
Would you like to learn more and configure your public cloud environment the same way you have enabled logging in your on-premises datacenter(s)? Download the Microsoft Azure Security and Audit Log Management whitepaper and you will be collecting and analyzing logs in short order!
We look forward to your feedback; stay tuned for future updates on this blog!
David B. Cross
Engineering Director, Azure Security
Cumulative Update #16 for SQL Server 2012 SP1
Cumulative Update #6 for SQL Server 2012 SP2
What is the IP address 168.63.129.16?
The IP address 168.63.129.16 is a virtual public IP address that is used to facilitate a communication channel to internal platform resources for the bring-your-own-IP Virtual Network scenario. Because the Azure platform allows customers to define any private or customer address space, this resource must be a unique public IP address. It cannot be a private IP address, because it must not duplicate any address space the customer defines. This virtual public IP address facilitates the following:
- Enables the VM Agent to communicate with the platform to signal that it is in a “Ready” state
- Enables communication with the DNS virtual server to provide filtered name resolution to customers that do not define custom DNS servers. This filtering ensures that customers can only resolve the hostnames of their deployment.
- Enables monitoring probes from the load balancer to determine health state for VMs in a load balanced set
- Enables PaaS role Guest Agent heartbeat messages
The virtual public IP address 168.63.129.16 is used in all regions and will not change. It is therefore recommended that this IP be allowed in any local firewall policies; not doing so will result in unexpected behavior in a variety of scenarios. The address should not be considered a security risk, as only the internal Azure platform can source a message from it.
Additionally, traffic from the virtual public IP address 168.63.129.16 to the endpoint configured for a load-balanced set's monitor probe should not be considered attack traffic. In a non-virtual-network scenario, the monitor probe is sourced from a private IP.
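If you build custom log-analysis or intrusion-detection rules on top of collected traffic logs, it can help to special-case this platform address explicitly so probe and agent traffic is never flagged. A minimal sketch in Python (the function name and labels are illustrative, not part of any Azure SDK):

```python
import ipaddress

# The fixed virtual public IP the Azure platform uses in all regions.
AZURE_PLATFORM_IP = ipaddress.ip_address("168.63.129.16")

def classify_source(source_ip: str) -> str:
    """Label inbound traffic for a simple log-analysis rule.

    Traffic sourced from 168.63.129.16 is Azure platform traffic
    (VM Agent signaling, DNS, load-balancer monitor probes) and
    should not be treated as an attack.
    """
    if ipaddress.ip_address(source_ip) == AZURE_PLATFORM_IP:
        return "azure-platform"
    return "external"

print(classify_source("168.63.129.16"))  # azure-platform
print(classify_source("10.0.0.4"))       # external
```

The same fixed-address check is what you would encode in a local firewall allow rule, since the address is identical in every region.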
Azure DNS Server Redundancy
Customers may observe that their PaaS role instances and IaaS virtual machines are issued only one DNS server IP address by DHCP. However, this does not mean that name resolution in Azure has a single point of failure.
The Azure DNS infrastructure is highly redundant. The IP address exposed to the customer virtual machine is a virtual IP address in the Azure platform. That virtual IP address maps to a cluster of DNS servers in the same region behind a load-balanced IP, so the failure of any particular server is not a concern. In the event that a DNS server cluster in the region fails, the virtual IP address exposed to customers will fail over to a DNS server cluster in a nearby region. The only impact of such a failure on customers will be a slight increase in latency.
SQL Server 2014 Service Pack 1 is now available
Approximately one year ago, we launched SQL Server 2014. Today, May 15, we are pleased to announce the release of SQL Server 2014 Service Pack 1 (SP1). The Service Pack is initially available for download on the Microsoft Download Center. SQL Server 2014 with SP1 will be rolling out to additional venues including MSDN/TechNet, the Volume Licensing center, and other channels starting May 21, 2015.
SQL Server 2014 SP1 contains fixes provided in SQL Server 2014 CU 1 up to and including CU 5, as well as a rollup of fixes previously shipped in SQL Server 2012 SP2. For highlights of the release, please read the Knowledge Base Article for Microsoft SQL Server 2014 SP1.
As part of our continued commitment to software excellence for our customers, this upgrade is available to all customers with existing SQL Server 2014 deployments via the download links below.
Microsoft® SQL Server® 2014 SP1
Microsoft® SQL Server® 2014 SP1 Express
Microsoft® SQL Server® 2014 SP1 Feature Pack
Thank You!
SQL Server Engineering Team