Channel: MSDN Blogs

Configuring APN settings on the Surface Pro with LTE


Hello, this is Oki from the Surface business support team.

Since the release of the Surface Pro with LTE Advanced, it has been adopted by many customers, and we have received a number of questions about LTE.

This article summarizes the points that most often need attention; we hope you find it useful.

When you create an APN manually on the Surface Pro with LTE Advanced, set [APN type] to [Internet and attach]. With any other setting the device may fail to connect to the internet, so be sure to select [Internet and attach].

The APN configuration procedure is also covered in the following public documentation:

Cellular settings in Windows 10

https://support.microsoft.com/ja-jp/help/10739/windows-10-cellular-settings

See the section "Add an APN".


Now in Public Preview: Visual Studio tools for Azure Stream Analytics on IoT Edge


Azure Stream Analytics (ASA) on IoT Edge empowers developers to deploy real-time analytical intelligence closer to IoT devices so that they can unlock the full value of device-generated data. Today we are happy to announce that the Visual Studio tools for ASA now support development of ASA on IoT Edge, in addition to development of cloud jobs. These tools can greatly simplify the experience of developing ASA Edge jobs.

With these Visual Studio tools, you can easily author an ASA Edge script with IntelliSense support, test it on your local machine against local data inputs and then create a corresponding job in the cloud for deployment on the IoT Edge.

Below we take a look at some of the key features available, or you can quickly get started by following the tutorial.

Script Authoring

As with an ASA cloud project, you can start by creating an ASA on IoT Edge project. This lets you create and open queries in an editor that understands IoT Edge syntax and offers several cool features, including keyword completion, error markers, and syntax highlighting. These features save editing time and help you catch compilation errors as early as possible.

Local testing

Since we released Visual Studio tools for ASA cloud jobs in 2017, local testing has become the most popular feature among users. The tools encapsulate a single-box local runtime which allows you to run the query solely on the local machine. In this way you can focus on verifying the query logic even in a disconnected mode before deploying to devices.
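To make the idea concrete, here is a rough Python sketch (not ASA's SQL-like query language, and not the local runtime itself) of the kind of tumbling-window aggregation an ASA Edge query typically expresses, run against a small local input:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group timestamped events into fixed, non-overlapping windows,
    the same shape of aggregation a tumbling-window ASA query expresses."""
    counts = defaultdict(int)
    for ts, _payload in events:
        # Each event belongs to exactly one window, keyed by its start time.
        window_start = ts - (ts % window_seconds)
        counts[window_start] += 1
    return dict(counts)

# Sample "local input": (unix_timestamp, reading) pairs.
events = [(0, 21.0), (5, 21.4), (12, 22.1), (17, 21.9), (25, 23.0)]
print(tumbling_window_counts(events, 10))  # {0: 2, 10: 2, 20: 1}
```

Running logic like this against a local data file is exactly the kind of verification the local runtime lets you do before any cloud deployment.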

Submit an ASA Edge job

When the local development and testing is done, you can submit an ASA Edge job to Azure and then use IoT Hub to deploy it to your IoT Edge device(s).

If you encounter any issues, please contact ASAToolsFeedback@microsoft.com.

You are also very welcome to share your feedback at https://feedback.azure.com/forums/270577-stream-analytics

Build time improvement recommendation: turn off /MAP, use PDBs


[Original post] https://blogs.msdn.microsoft.com/vcblog/2018/03/14/build-time-improvement-recommendation-turn-off-map-use-pdbs/

[Original author] YongKang Zhu [MSFT]

[Originally published] 3/14/2018

A map file is a plain-text file that contains information about some of the names and symbols present in a binary generated by the linker. It also contains details about every section in the binary (code, data, and so on), and records which OBJ/LIB each symbol was defined in. Windows debuggers such as windbg.exe can use map files to help locate where a program crashed. Map files are an old technology: with modern versions of the MSVC toolset, PDB (program database) files do everything map files do.

Generating a map file takes a long time. If you do not need one but see the /MAP linker option in your build, remove it to speed up your build. We have recently done work to speed up map file generation, but it remains a slow process.

If you are one of the few who genuinely need map files (for example, to quickly check whether a set of functions or data of interest is laid out in the binary in the expected order), rest assured that we are not removing the feature. The points below, however, explain why you should turn off /MAP and use PDBs:

  • Turning off map file generation shortens build times. Although we recently improved the throughput of map file generation in the full-link scenario, the linker cannot incrementally update a map file produced by a previous link, because doing so would hurt incremental-link throughput. PDB files are different: the linker can update them precisely during an incremental link.
  • Unlike with PDB files, there is no tight binding between a binary and its corresponding map file, so tracking which map file corresponds to which version of a binary is difficult.
  • Unlike PDB files, map files are not supported by symbol servers.
  • The information in a PDB file is a superset of the contents of a map file. In fact, almost all builds generate PDB files by default.
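In project-file terms, turning this off is a small change. A sketch of the relevant .vcxproj fragment (the Condition value here is illustrative; GenerateMapFile corresponds to /MAP and GenerateDebugInformation to /DEBUG):

```xml
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'">
  <Link>
    <!-- Stop emitting the .map file -->
    <GenerateMapFile>false</GenerateMapFile>
    <!-- Keep the PDB, whose contents are a superset of the map file -->
    <GenerateDebugInformation>true</GenerateDebugInformation>
  </Link>
</ItemDefinitionGroup>
```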

Finally, we have published the DIA API, which you can use to write your own tools to retrieve from PDB files all the information currently available in map files.

In closing

We know build throughput is very important to developers, and we are continuing to improve the throughput of our toolset. You can read our recent post Visual Studio 2017 Throughput Improvements and Recommendations to learn more about what we have done to improve throughput. And remember to check your build options to make sure you are not generating map files you do not need!

If you have any comments or suggestions for us, please let us know. You can reach us through the comments below, by email (visualcpp@microsoft.com), through Help > Report a Problem in the product, or via Developer Community. You can also find us on Twitter (@VisualC) and Facebook (msftvisualcpp).

Become a Skype Expert


Looking for ways to easily bring technology into your classroom? Want to connect your students globally and provide them with unique learning opportunities? Then look no further than Skype in the Classroom! This amazing resource is available for free on our Microsoft Educator Community.


New to Skype in the Classroom? 

Follow these 3 simple steps to get started

Step 1: Register on the Microsoft Educator Community and Complete your Profile.


Step 2: Take the Intro Course

Step 3: Request a Skype Activity



Already using Skype in the Classroom?

Become a Skype Expert by completing this simple learning path

To become a Skype in the Classroom Expert simply complete the four courses below:

• Introduction to Skype in the Classroom
• Virtual Field Trips with Skype in the Classroom
• Skype collaborations
• Become a Mystery Skype Master

Join the Community: Follow @SkypeClassroom


The private AOS


For the reason the private AOS machine (PAOS) was removed from sandbox environments, see the following documentation:
https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/deployment/cloud-deployment-overview#frequently-asked-questions

<FAQ excerpt>
Why did the private AOS machine disappear from my Tier 2 through Tier 5 sandbox environments?

The private AOS was previously required to secure communication between the AOS and BI machines.
With a recent update, all communication between the AOS and BI machines is now secured directly.
The private AOS machine is therefore no longer needed, and we are removing it from environments on a rolling basis.
This change has no impact on functionality or security.

Linux C++ workload improvements to the project system, Linux Console Window, rsync, and Attach to Process


[Original post] Linux C++ Workload improvements to the Project System, Linux Console Window, rsync and Attach to Process

[Originally published] 2018/03/13

MSBuild project system improvements

We have added some new properties on the General C/C++ property page for Linux projects. Max Parallel Compilation Jobs lets you launch additional compilation processes; the default is 1, but it can be increased to improve build throughput. Public Project Include Directories lets you specify directories in your project to expose to other projects in the solution. In a consuming project, add a reference to the project exposing its include directories, and you can now include headers from those directories in your sources.

Linux Console Window improvements

The Linux Console Window is now shown when you run or debug a Linux project. If you dock this window, its position is remembered in subsequent runs, and the window closes when you leave debug mode. We also fixed the handling of echo on/off so that messages from the remote system are displayed correctly.

rsync improvements for CMake and Open Folder

Our rsync support in the Open Folder and CMake scenarios has also seen some improvements. Previously, an rsync would run to completion even if you had canceled the task that started it; this has been fixed, so if the rsync was triggered by a build, for example, canceling the build now cancels the rsync as well. We also made some performance improvements and enabled rsync for the root user. You can now also pass additional command-line arguments to rsync using the rsyncCommandArgs option in CMakeSettings.json.
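For example (a sketch: rsyncCommandArgs is the option named above, while the surrounding fields are the usual configuration entries and may differ in your file):

```json
{
  "configurations": [
    {
      "name": "Linux-Debug",
      "configurationType": "Debug",
      "rsyncCommandArgs": "--compress --exclude=.git"
    }
  ]
}
```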

Attach to Process improvements

You have given us feedback that you need more control over Attach to Process for remote Linux debugging. We have added many of the controls available for a normal debug launch of Linux projects or Open Folder, such as enabling child-process debugging, pre-attach commands, and more. To enable them, use a file named Microsoft.MIEngine.Options.xml in the root of your solution or workspace. Here is a simple example.

<?xml version="1.0" encoding="utf-8"?>
<SupplementalLaunchOptions>
    <AttachOptions>
      <AttachOptionsForConnection AdditionalSOLibSearchPath="/home/user/solibs">
        <ServerOptions MIDebuggerPath="C:\Program Files (x86)\Microsoft Visual Studio\Preview\Enterprise\Common7\IDE\VC\Linux\bin\gdb\7.9\x86_64-linux-gnu-gdb.exe"
ExePath="C:\temp\ConsoleApplication17\ConsoleApplication17\bin\x64\Debug\ConsoleApplication17.out"/>
        <SetupCommands>
          <Command IgnoreFailures="true">-enable-pretty-printing</Command>
        </SetupCommands>
      </AttachOptionsForConnection>
    </AttachOptions>
</SupplementalLaunchOptions>

The AttachOptionsForConnection element has most of the attributes you may need. The example above shows passing a location to search for additional .so libraries. The child element ServerOptions enables attaching to the remote process with gdbserver instead. To do that you need to specify a local gdb client (the one shipped in VS is shown above) and a local copy of the binary with symbols. The other child element, SetupCommands, allows you to pass commands directly to gdb. You can find all the available options in the LaunchOptions.xsd schema; look at the root element for this file, SupplementalLaunchOptions.

What's next

Download the Visual Studio 2017 Preview, install the Linux C++ Workload, and try it with your projects.

The best way to reach us is via our GitHub-hosted issues list, directly by mail at vcpplinux-support@microsoft.com, or on Twitter @robotdad.

Data retention period in IoT Hub


Here is a question and answer about the IoT Hub data retention period.

Q. When IoT Hub cannot deliver data to Stream Analytics or other consumers, it appears to buffer the information from devices internally. How long is this information retained? And if there is a retention period, is it measured in days (such as one day), or is it determined by the amount of data?

A. IoT Hub stores messages received from devices for a period measured in days. This period is called the retention period; the default is 1 day, and it is not affected by the amount of data. You can configure messages to be retained for up to 7 days. This is documented in the following material:

 

- Read device-to-cloud messages from the built-in endpoint

https://docs.microsoft.com/ja-jp/azure/iot-hub/iot-hub-devguide-messages-read-builtin

 

*** Excerpt from the above documentation ***

Retention period

This property specifies how long, in days, messages are retained by IoT Hub. The default is 1 day, but it can be increased to 7 days.

************************

 


 

 

 

We hope the above is helpful.

Azure IoT Developer Support Team, Tsuda

Trying out the Minispy File System Minifilter Driver sample


This time, we introduce the Minispy File System Minifilter Driver sample, a sample file system minifilter driver.

This sample shows how to monitor and log arbitrary I/O on the system.

Minispy consists of a user-mode application, minispy.exe, and a kernel-mode driver, minispy.sys. Minispy.sys registers callbacks for various I/O operations with the filter manager, and these callbacks record the I/O occurring on the system. When the user requests the recorded information, minispy.sys passes it to minispy.exe, which prints it to the screen or logs it to a file.

To monitor I/O on a particular device, you must explicitly attach minispy.sys to that device using minispy.exe. You also use minispy.exe to stop monitoring a device.

In this article, we install the sample on Windows 10 (1709) x86 and show the I/O monitored on drive C being written to the command prompt and to a file. The development PC used to build the sample runs Windows 10 (1709) x64 with Visual Studio 2017 and the WDK for Windows 10, version 1709, installed.

 

1. Getting the sample

On the site below, press the green [Clone or Download] button on the right, then press [Download ZIP] to download Windows-driver-samples-master.zip. The Minispy File System Minifilter Driver sample is in the Windows-driver-samples-master\filesys\miniFilter\minispy folder.

 

https://github.com/Microsoft/Windows-driver-samples

 

2. Building the sample

Open minispy.sln in this folder with Visual Studio 2017. The Filter project builds minispy.sys, and the User project builds minispy.exe.

 


 

Right-click [Solution 'minispy'] and click [Configuration Manager].

 


 

For this walkthrough, set [Active solution configuration] to [Debug] and [Active solution platform] to [Win32].

Also, so that minispy.exe does not require the Visual C++ runtime (VCRUNTIME140D.dll) to be installed, right-click the minispy project under the User folder, open [Properties], and change [Configuration Properties] - [C/C++] - [Code Generation] - [Runtime Library] to [Multi-threaded Debug (/MTd)].

 


 

 

Right-click [Solution 'minispy'] and click [Rebuild Solution].

This produces minispy.sys and minispy.exe.

The files needed for the next step, and their locations, are as follows.

 


File         Location
minispy.sys  minispy\filter\Debug\minispy
minispy.exe  minispy\user\Debug
minispy.inf  minispy

 

 

3. Installing the sample and preparing to verify it

Copy the files above to the Windows 10 (1709) x86 environment, for example into a folder you create named C:\minispy. Right-click minispy.inf and click [Install] to install the sample.

 

Even after installation, the sample driver is not yet loaded, as shown below. (For details on using fltmc.exe, see our earlier article "How to use fltmc.exe" <https://blogs.msdn.microsoft.com/jpwdkblog/2013/02/27/fltmc-exe/>.)

 

>fltmc

 

Filter Name                     Num Instances  Altitude      Frame

------------------------------  -------------  ------------  -----

WdFilter                                3       328010         0

storqosflt                              0       244000         0

wcifs                                   1       189900         0

CldFlt                                  0       180451         0

FileCrypt                               0       141100         0

luafv                                   1       135000         0

npsvctrig                               1        46000         0

Wof                                     2        40700         0

FileInfo                                3        40500         0

 

Now load the sample driver with the following command:

 

> fltmc load minispy

 

Minispy is now loaded, as shown below. However, the instance count is still 0, so it is not yet attached to any volume.

 

>fltmc

 

Filter Name                     Num Instances  Altitude      Frame

------------------------------  -------------  ------------  -----

Minispy                                 0       385100         0

WdFilter                                3       328010         0

storqosflt                              0       244000         0

wcifs                                   1       189900         0

CldFlt                                  0       180451         0

FileCrypt                               0       141100         0

luafv                                   1       135000         0

npsvctrig                               1        46000         0

Wof                                     2        40700         0

FileInfo                                3        40500         0

 

Note that until you complete the steps above, running minispy.exe fails with the following error:

 

C:\minispy>minispy

Connecting to filter's port...

Could not connect to filter: 0x80070002

 

Once minispy.sys is loaded as above, minispy.exe runs as follows:

 

C:\minispy>minispy

Connecting to filter's port...

Creating logging thread...

 

Dos Name        Volume Name                            Status

--------------  ------------------------------------  --------

                \Device\Mup

C:              \Device\HarddiskVolume2

                \Device\HarddiskVolume1

                \Device\NamedPipe

                \Device\Mailslot

 

Hit [Enter] to begin command mode...

 

As the last line instructs, press Enter to enter command mode.

Type ? to see which commands are available.

 

>?

Valid switches: [/a <drive>] [/d <drive>] [/l] [/s] [/f [<file name>]]

    [/a <drive>] starts monitoring <drive>

    [/d <drive> [<instance id>]] detaches filter <instance id> from <drive>

    [/l] lists all the drives the monitor is currently attached to

    [/s] turns on and off showing logging output on the screen

    [/f [<file name>]] turns on and off logging to the specified file

  If you are in command mode:

    [enter] will enter command mode

    [go|g] will exit command mode

    [exit] will terminate this program

>

 

The options are summarized in the following table.

 


Option                       Description
/a <drive>                   Starts monitoring <drive>; /a "attaches" minispy.sys to that drive.
/d <drive> [<instance id>]   Detaches minispy.sys instance <instance id> from <drive>, which stops monitoring.
/l                           Lists all drives minispy.sys is currently attached to.
/s                           Toggles logging output to the screen (on by default).
/f [<file name>]             Toggles logging to a file. <file name> is required when turning logging on,
                             and omitted when turning it off.

 

In command mode, the following operations are available.

Press Enter to enter command mode.

Type go or g to exit command mode.

Type exit to quit minispy.exe.

 

With that understood, stay in command mode and run the following to attach to drive C.

 

>/a c:

    Attaching to c:...     Instance name: Minispy - Top Instance

 

If you then run the /l option, the output shows that drive C is attached (Status is Attached).

 

>/l

 

Dos Name        Volume Name                            Status

--------------  ------------------------------------  --------

                \Device\Mup

C:              \Device\HarddiskVolume2               Attached

                \Device\HarddiskVolume1

                \Device\NamedPipe

                \Device\Mailslot

>

 

 

4. Verifying the sample

With the preparation complete, let's verify the sample by watching the I/O monitored on drive C being written to the command prompt and to a file.

 

Still in command mode, type g to output the log to the screen.

 

>g

Should be logging to screen...

Opr  SeqNum   PreOp Time  PostOp Time   Process.Thrd      Major/Minor
Operation          IrpFlags      DevObj   FileObj  Transact   status:inform                               Arguments                             Name

--- -------- ------------ ------------ ------------- ----------------------------------- ------------- -------- -------- -------- ----------------- ----------------------------------------------------------------- -----------------------------------

IRP 00000D69 14:43:32:574 14:43:32:634        4.a54  IRP_MJ_WRITE                        00060a01 N--- 8DF72388 8DF74398 00000000 00000000:00001000 1:00001000 2:00000000 3:0006F000 4:00000000 5:9F1BC000 6:00000000 \Device\HarddiskVolume2\ProgramData\Microsoft\Windows Defender\Support\MpWppTracing-03172018-123129-00000003-ffffffff.bin

                                                                     IRP_MN_NORMAL

 

The excerpt above shows the output header and the first log line. In practice, it is easier to see which header column corresponds to which log field if you either adjust the command prompt settings so that each entry fits on one line, or output the log to a file.

The example above maps to the following table.

 


Header          Meaning                                                       Example
Opr             Whether the operation is an IRP, Fast I/O, or FsFilter        IRP
                operation:
                  IRP: FLT_CALLBACK_DATA_IRP_OPERATION
                  FIO: FLT_CALLBACK_DATA_FAST_IO_OPERATION
                  FSF: FLT_CALLBACK_DATA_FS_FILTER_OPERATION
SeqNum          Sequence number                                               00000D69
PreOp Time      Time the pre-operation callback was called                    14:43:32:574
PostOp Time     Time the post-operation callback was called                   14:43:32:634
Process.Thrd    Process ID and thread ID                                      4.a54
Major/Minor     Major function and minor function                             IRP_MJ_WRITE
Operation                                                                     IRP_MN_NORMAL
IrpFlags        IRP flags. A letter marks each of the following flags         00060a01 N---
                when present:
                  N: IRP_NOCACHE
                  P: IRP_PAGING_IO
                  S: IRP_SYNCHRONOUS_API
                  Y: IRP_SYNCHRONOUS_PAGING_IO
DevObj          Device object address                                         8DF72388
FileObj         File object address                                           8DF74398
Transact        Transaction member of the FLT_RELATED_OBJECTS structure       00000000
status:inform   IoStatus.Status and IoStatus.Information of the               00000000:00001000
                FLT_CALLBACK_DATA structure
Arguments       Arguments. Arg1 through Arg6 are                              1:00001000 2:00000000
                Data->Iopb->Parameters.Others.Argument*                       3:0006F000 4:00000000
                                                                              5:9F1BC000 6:00000000
Name            File name                                                     \Device\HarddiskVolume2\ProgramData\Microsoft\Windows Defender\Support\MpWppTracing-03172018-123129-00000003-ffffffff.bin
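The IrpFlags letters can be reproduced mechanically from the hex value. A small Python sketch (the flag bit values below are the wdm.h definitions; verify them against your WDK headers):

```python
# IRP flag bits as defined in wdm.h
IRP_NOCACHE               = 0x00000001  # printed as N
IRP_PAGING_IO             = 0x00000002  # printed as P
IRP_SYNCHRONOUS_API       = 0x00000004  # printed as S
IRP_SYNCHRONOUS_PAGING_IO = 0x00000040  # printed as Y

def decode_irp_flags(flags):
    """Return the 4-letter marker minispy prints next to the hex IrpFlags."""
    letters = [("N", IRP_NOCACHE), ("P", IRP_PAGING_IO),
               ("S", IRP_SYNCHRONOUS_API), ("Y", IRP_SYNCHRONOUS_PAGING_IO)]
    return "".join(letter if flags & bit else "-" for letter, bit in letters)

print(decode_irp_flags(0x00060a01))  # N--- , matching the log line above
```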

 

 

Next, let's output the log to a file. Run /f with a file name, then type g.

 

>/f c:\minispy\log.txt

    Log to file c:\minispy\log.txt

>g

Should be logging to screen...

IRP 0000DDD2 15:26:04:616 15:26:04:616        4.a54  IRP_MJ_WRITE                        00060a01 N--- 8DF72388 8DF74398 00000000 00000000:00001000 1:00001000 2:00000000 3:001C9000 4:00000000 5:9F15E000 6:00000000 \Device\HarddiskVolume2\ProgramData\Microsoft\Windows Defender\Support\MpWppTracing-03172018-123129-00000003-ffffffff.bin

                                                                     IRP_MN_NORMAL

 

To stop logging to the file, press Enter to return to command mode and type /f. (If you try to open the log while file logging is still on, you get an error saying the file is in use by another process.)

 

>/f

    Stop logging to file

 

Open the log file. It contains the same entries as above, but many more of them were recorded than were displayed in the command prompt (on screen).

 


 

 

We hope the above is useful to those developing file system minifilter drivers.

WDK Support Team, Tsuda

 


Learn How To View Your Real-Time Carbon Emissions With Azure, And More On The Friday Five


View Your Real-time Carbon Emissions - Yes, It’s Possible Now With Microsoft Azure!

Chourouk Hjaij is an Azure MVP, blogger and speaker. She writes on her blog here, has published many code samples on MSDN and has spoken at events around the world. Being a Microsoft Certified Trainer has allowed her to become proficient in simplifying complex technologies, making her an expert in Azure, Office 365 and Xamarin. Sharing expertise and leadership has allowed her to become an MCT Regional Lead for the France region. Chourouk works as head of the Microsoft department at LK Technology. She helps startups and enterprises build the best architecture around Microsoft technologies, and helps them architect their cloud projects by mentoring others in the latest software development techniques. Chourouk was a very active Microsoft Student Partner and was awarded the MEA first prize in the Microsoft technical competition for cross-platform mobile development & Azure. Follow her on Twitter @ChouroukHJ

Angular Filtering On Multiple Options Using Anonymous Functions

Oscar Garcia is a Principal Software Architect who resides in sunny South Florida. He is a Microsoft MVP and certified solutions developer with many years of experience building solutions using the .NET framework and multiple open-source frameworks. Currently, he specializes in building solutions using technologies like SharePoint, ASP.NET, NodeJS, AngularJS, and other JavaScript frameworks. You can follow Oscar on Twitter via @ozkary or by visiting his blog at ozkary.com.

    Monitoring Azure PaaS

    Steve Buchanan is a “Group Manager – Cloud Transformation/DevOps” with Avanade, a six-time Microsoft MVP, and author of several technical books focused on cloud and data center management. In his 17+ years as an IT professional, Steve has held positions ranging from infrastructure architect to IT manager. Steve was recently featured on the “25 ITSM Experts to Watch in 2017” list and as an “IT Unity Community Champ” in 2017. He has presented at several technical conferences including MMS, Microsoft Ignite, and OSCON. Steve is currently focused on transforming the position of IT into a strategic partner of the business and driver of digital transformation through ITSM, DevOps, and CloudOps. He stays active in the technical community and enjoys blogging about his adventures in the world of IT at www.buchatech.com. Follow him on Twitter @buchatech

    Microsoft MVP Global Summit — 2018

    David Pine is a Technical Evangelist and Microsoft MVP working at Centare in Milwaukee, Wisconsin. David's passion for learning has led to his desire to give back to the developer community. David is a technical blogger whose posts have been featured on asp.net, msdn webdev, and msdn dotnet. David is active in the developer community, speaking at technical conferences, contributing to popular open-source projects, serving as a mentor, and giving back to stackoverflow.com. Follow him on Twitter @davidpine7.

    SharePoint Migration Best Practices: Take Your Customizations To The Cloud

    Erwin van Hunen has more than 20 years of experience in application development in complex corporate environments. He has worked in various roles, including systems development, project management and system architecture. Erwin is experienced with large corporate accounts and has many years of experience working for international organizations. Today Erwin is focused on architectural design for solutions and infrastructures based upon Office 365. Erwin is a member of the Office 365 Patterns and Practices Core Team, and is actively involved in promoting and extending the solutions, samples and scenarios in that project. Erwin has been certified both as a Microsoft Certified Master: SharePoint 2010 and as a Microsoft Certified Solution Master: SharePoint, which makes him a member of a very small worldwide community of certified masters. Follow him on Twitter @erwinvanhunen

Creating a LUIS App


     

This time, we walk through two examples of creating a LUIS app, following the public documentation below.

  1. Following the steps in Create your first LUIS app, build a "Home Automation" LUIS app (steps 7-15).
  2. Following the steps from Create an app onward, build a "TravelAgent" LUIS app (steps 16-23).

Steps 1-6 are common to both.

     

Steps
=====

  1. Create a LUIS resource in the Azure portal.

1-1.      Log in to the Azure portal (https://portal.azure.com/).

1-2.      In the left pane, click [Create a resource] - [AI + Cognitive Services] - [Language Understanding].

1-3.      Enter the Name, subscription, location, pricing tier (F0 in this example), and Resource group, then click [Create].

     


     

     

  2. In the Azure portal, click the resource with the Name you created in step 1-3.

  3. On the following screen, click [Language Understanding Portal].

     


     

  4. https://www.luis.ai/home opens; click [Login/Sign up].

     


     

  5. Select the same account you used for the Azure portal.


     

If you cannot log in, see the following blog post:

- What to do when you cannot sign in to the LUIS portal (www.luis.ai)

< https://blogs.msdn.microsoft.com/jpcognitiveblog/2018/03/14/cannot-sign-in-luis-portal/ >

     

  6. https://www.luis.ai/applications opens; click [Create new app].

     


     

The remaining steps build the following two LUIS apps as examples. Use whichever suits you.

  1. Following the steps in Create your first LUIS app, build a "Home Automation" LUIS app (steps 7-15).
  2. Following the steps from Create an app onward, build a "TravelAgent" LUIS app (steps 16-23).

     

  7. Enter Home Automation for Name and click [Done].

     


     

  8. Click [Prebuilt Domains] at the bottom of the left pane, type Home in the search box on the right, and click [Add domain] on the [HomeAutomation] domain that appears.

When the button changes to [Remove domain], the domain has been added.

     


     

  9. Click [Intents] in the left pane to confirm that the HomeAutomation domain intents are registered as shown below.

     


     

  10. Click the HomeAutomation.TurnOff intent to see the list of registered utterances.

     


     

  11. Click [Train] at the top right to train the app. When training completes, the indicator changes from red to green.

  12. Click [Test], to the right of [Train].

  13. In the text box labeled "Type a test utterance", enter turn off the lights, for example. You can see that HomeAutomation.TurnOff was chosen with a score of 0.99.

     


     

  14. Click the [PUBLISH] tab at the top of the screen, then click [Publish to production slot] to publish this Home Automation app.

     


     

  15. When publishing succeeds, the endpoint URL appears at the bottom of the same page.
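The endpoint URL shown at this step can then be queried over HTTP. A Python sketch of building such a query URL (the region, app ID, and key are placeholders, and the v2.0 URL shape is an assumption based on the format the portal displays; use the exact URL shown on your PUBLISH page):

```python
from urllib.parse import urlencode

def build_luis_query_url(region, app_id, subscription_key, utterance):
    """Build a GET URL for a published LUIS v2.0 endpoint.
    All parameter values here are placeholders for illustration."""
    base = f"https://{region}.api.cognitive.microsoft.com/luis/v2.0/apps/{app_id}"
    return base + "?" + urlencode({"subscription-key": subscription_key,
                                   "q": utterance})

url = build_luis_query_url("westus", "<app-id>", "<key>", "turn off the lights")
print(url)
# The JSON response to a GET on this URL includes the top-scoring intent,
# e.g. HomeAutomation.TurnOff for the utterance used in step 13.
```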

     


     

     

  16. To create the next app, TravelAgent, click [My apps] at the top of the screen, then click [Import new app].

     


     

  17. Save the LUIS sample app travel-agent-sample-01.json from the JSON at < https://github.com/Microsoft/LUIS-Samples/tree/master/documentation-samples/Examples-BookFlight > to a local .json file.

  18. On the following screen, select the .json file from step 17 and click [Done].

     


     

  19. Click [Intents] and [Entities] in the left pane to confirm that the sample content has been imported.

     


     

  20. Click [Train] at the top right to train the app.

  21. Click [Test] on the right and enter book a flight to seattle in the "Type a test utterance" text box. You can confirm that the BookFlight intent was chosen with a score of 0.79.

     


     

  22. Click the [PUBLISH] tab at the top of the screen, then click [Publish to production slot].

     


     

     

  23. When publishing succeeds, the endpoint URL appears at the bottom of the same page.

     

     

You have now created two LUIS apps.

Here are some related questions and answers.

Q1. Are there any Japanese-language samples?

A1. At this time we do not offer any.

Q2. The Prebuilt Domain used in example (1) does not appear when I create a Japanese LUIS app. It is in preview even for English; is it not yet implemented for Japanese?

A2. Correct; at this time it is not implemented for Japanese.

Culture-specific understanding in LUIS apps

https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-supported-languages

     


     

     

For prebuilt entities, the ones that are available are listed in the table in the following document.

Prebuilt entities reference

https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-reference-prebuilt-entities

     


     

     

     

We hope the above is helpful.

Cognitive Services Developer Support Team, Tsuda

     

    The Microsoft Education Roadshow is back!


    Here’s what to expect and how to sign up!

    What is the Microsoft Education Roadshow?

    The Microsoft UK Education Roadshow will help fulfil our mission to empower the students and teachers of today to create the world of tomorrow. With over 100 events taking place across the UK in 2017 and 2018, this is the perfect opportunity for educators to see first-hand how Microsoft technologies can enhance teaching and learning.

    Events are completely FREE and perfect for those at the very beginning of their digital transformation journeys. All events will involve hands-on training workshops led by our specialist Microsoft Learning Consultants and/or Microsoft Training Academies, and will focus specifically on how Office 365 and Windows 10 can help transform learning.


    Agenda:

    • Opening Keynote
    • Microsoft Educator Community
    • Microsoft Teams
    • Office 365
    • Windows 10
    • Paint 3D
    • Digital Skills Program
    • Survey and Feedback
    • Closing Keynote and Networking to explore device offerings

    What to expect and how to sign up?

    The events hosted around the UK aim to give delegates a hands-on workshop experience where they can interact within an Office 365 tenancy on Windows 10 devices to see Microsoft in Education in action. With sessions led by Microsoft Learning Consultants who are also teachers, there are plenty of opportunities for Q & A to hear real-life experiences. In addition, you can network with Microsoft Education partners and discuss the device offerings and programs available. Simply click on the links below to register for our upcoming events or visit our UK Roadshow page for updated information.


    Where we’re going?

    Scotland and Wales

    18/04/2018 Fort William: SIGN UP HERE

    Lochaber High School, Camaghael

    19/04/2018 Inverness: SIGN UP HERE

    STEM HUB, University of the Highlands and Islands, An Lochran, 10 Inverness Campus IV2 5NA

    01/05/2018 Dundee: SIGN UP HERE

    Harris Academy, Perth Road

    30/05/2018 Aberdeenshire: SIGN UP HERE

    The Gordon Schools, Huntly AB54 4SE

     

    North of England

    24/04/2018 Cumbria: SIGN UP HERE

    Yarlside Academy, Redoak Avenue, Barrow-in-Furness, Cumbria LA13 0LH

     

     

    South of England

    17/04/2018 Surrey: SIGN UP HERE

    St Hilary's School, Holloway Hill, Godalming. GU7 1RZ

    19/04/2018 Hertfordshire: SIGN UP HERE

    Jupiter Community Free School, Jupiter Drive, Hemel Hempstead, HP2 5NT

    25/04/2018 Oxfordshire: SIGN UP HERE

    Manor School, 28 Lydalls Cl, Didcot

    27/04/2018 Weston: SIGN UP HERE

    Weston College, Winter Gardens (Italian Gardens Entrance), Royal Parade, Weston-Super-Mare BS23 1AJ

    04/05/2018 Milton Keynes College: SIGN UP HERE

    Chaffron Way Campus, Woughton Campus West, Leadenhall MK6 5LP 


    Microsoft Training Academies

    Alternatively, we also host specialist events in our 6 Microsoft Training Academies located around the UK. Our Microsoft Showcase Schools provide opportunities for you and your staff to step into a Microsoft in the Classroom environment and experience hands-on learning in Microsoft Education with a teacher who has first-hand experience of using it in the classroom.

    Click on the links below to register for the events at our Microsoft Showcase School Training Academies:

    St Joseph's Primary and Nursery School in Derbyshire

    Danesfield School in Marlow

    Shireland Collegiate Academy in Smethwick

    Ribblesdale High School in Clitheroe

    Treorchy Comprehensive School in Treorchy


    Microsoft Training Academy in Paddington at Microsoft HQ

    Address: 2 Kingdom Street, Paddington, W2 6BD – to book your personalised session with one of our Microsoft Learning Consultants at Microsoft's HQ in London, please send an email to mstrainingacademy@microsoft.com with details of your institution, your digital transformation journey, and a range of dates. Our events usually run from 10am-3pm and are tailored to the needs of your individual institution, and refreshments and lunch are provided on the day!


    Our Microsoft Learning Consultants can provide expert advice and training to Microsoft Schools around the UK. If you would like to speak to a Learning Consultant, or even arrange for one to come and visit your school, please email MTAsupport@microsoft.com.

    Modernizing “Did my dad influence me?” – Part 2


    In Part 1 we saw how we can capture the LastFM data from the API.  This of course just gives me some raw data on an Azure Storage Account.  Having data stored on a storage account gives us a wide range of options to process the data, we can use tools like Polybase to read it into Azure SQL Data Warehouse, connect directly from Power BI or when the dataset needs pre-processing or it becomes too large to handle with traditional systems we can use solutions like Azure Databricks to process the data.

    In Part 2 we will focus on processing the data, so we can build beautiful dashboards in Power BI.  Note that we are going to focus on Azure Databricks even though I do realize that the dataset is not huge, but we are learning remember?

    Prerequisites:

    Azure Databricks is an easy-to-use Spark platform with a strong focus on collaboration.  Microsoft and Databricks collaborated on this offering, and it truly makes using Spark a walk in the park.  Azure Databricks went GA recently, so we are working in a fully supported environment.

    First, we'll create an Azure Databricks Workspace which is simple, all you need is a name, location and the Pricing tier.  Azure Databricks is secured with Azure Active Directory which is of course important to align your efforts around identity in the organization.  After about a minute you will be able to launch the Workspace.  Just push the big "Launch Workspace" button to get started!

    Once you launch the Workspace you will see the Azure Databricks welcome page.  There are links immediately on the page to help you get started, and the navigation is on the left side.

    The first thing we need to do is create a cluster, as we need a place to run the code we are going to write.  So let's go ahead and click Clusters - + Create Cluster.  For the "Cluster Type" you have two choices, Serverless or Standard.  A serverless pool is a self-managed pool of cloud resources that is auto-configured; it has benefits like optimizing the configuration to get the best performance for your workload, better concurrency, and isolated environments for each notebook.  You can read more details about serverless pools here.  We are, however, going to choose Standard, simply because serverless pools are currently in beta.  Creating a cluster takes a couple of arguments like the name and the versions of the Databricks runtime and Python.  Now comes the interesting part, choosing the size of the nodes to support your cluster; this comes with a couple of interesting options like auto scale, where you define the minimum and maximum number of workers.  From a cost perspective it is very useful to enable Auto Termination if you know you will not be needing the cluster 24/7: just tell Databricks after how many minutes of inactivity it can drop the cluster.  You can also assign Tags for tracking on the Azure side, override the Spark configuration, and enable logging.

    While the cluster is provisioning, let's go ahead and start writing code.  We'll go back to the Welcome Page and click Notebook under New.  I'm going to give my notebook a name and choose the default language; here I'm going for Scala because @nathan_gs was my Spark mentor and he's a Scala fan.

    Now we can start to crunch the data using Scala, I'll add the code to the GitHub repo so you can replicate this easily.
    Azure Databricks allows you to mount storage which makes it easy to access the data.
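As a sketch, mounting an Azure Blob Storage container from a Scala notebook looks roughly like this (the storage account, container and secret scope names below are placeholders, not values from this project):

```scala
// Mount an Azure Blob Storage container under /mnt/lastfm.
// Account, container and secret-scope names are illustrative placeholders.
dbutils.fs.mount(
  source = "wasbs://lastfm@mystorageaccount.blob.core.windows.net/",
  mountPoint = "/mnt/lastfm",
  extraConfigs = Map(
    "fs.azure.account.key.mystorageaccount.blob.core.windows.net" ->
      dbutils.secrets.get(scope = "lastfm-scope", key = "storage-key")))

// Once mounted, the raw files can be read like any other path:
val raw = spark.read.json("/mnt/lastfm/")
```

After the mount exists, every notebook attached to the workspace can read the data through the /mnt path without handling storage keys directly.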

    We can also do visualization right inside the notebook which is a great addition while you are doing data exploration.

    Now we have done some of the data wrangling we can start visualizing the data to get some of the insights and answer the question if my father has influenced me, and whether I am influencing my daughter.  Connecting Power BI to Azure Databricks is documented here.

    We can see a couple of interesting things now that we have visualized the data.  It looks like there is quite some overlap between what I'm listening to and what my father is listening to.  On my side, I have not been successful in convincing my father of System Of A Down, though.  I am very happy to see that at least I am having some impact on my daughter too, although she hasn't been able to convince me of Ariana Grande.  I am quite proud if you look at some of the top songs my daughter has in her list.

    Power BI also allows you to import the data into its own in-memory model, which obviously makes sense from a performance point of view (given your dataset is not in the terabyte range, of course).  A neat thing you can do if the data is in memory is use Q&A to get insights.  This way you can just ask questions of your data instead of building your visualizations with drag & drop.

    Both Azure Databricks and Power BI have a lot more interesting capabilities, obviously, but this post would be endless if I had to dive into all of them.

    In Part 3 we will look at how we can use Serverless solutions to automate the deployment of the Docker containers to Azure Containers Instances.

    Beginner’s Guide to Azure Automation


    Azure Automation

    For Azure IaaS enthusiasts, Microsoft has provided a platform to automate all of the Azure services using PowerShell. The language is tweaked and used as "PowerShell Workflow".

    Why to Use

    • Reduces manual effort and helps with consistent testing
    • Manages resources (deployments, VMs, etc.)

    How to Use

    • Create a PowerShell workflow in the Azure portal and execute it.

    Runbook

    • Deployment and execution of tasks written in PowerShell.
    • Provisioning/Deployment/Maintenance/Monitoring.

    Things to know!

    Automation Account – A dedicated account to perform runbook design/execution/management.

    Asset – Global resources used by runbooks to assist in common tasks and value-specific operations.

    Windows PowerShell Workflow – Implementation of Azure Automation using PowerShell workflows. A workflow is a group of individual steps performing an action.

    Management Certificates – Authenticate Azure resources for Azure Automation in an Azure subscription.

    Tips to remember!

    • An automation account name is unique per region and per subscription. Multiple accounts are possible: a maximum of 30 per subscription, across different regions.

     

    Sample One: Creating a runbook to connect to an Azure subscription using Azure AD

    Create Automation Account

    1. Go to https://portal.azure.com

     

    2. Click Browse and select Automation Accounts.

     

    3. Click Add in the Automation accounts.

    4. Fill details in Add Automation Account and click Create.

    5. Automation Account is created.

     

    6. In Automation Resources, Select Runbooks.

    7.  Click Add Runbook. Enter details and click create.

     

    8. Runbook is created. Authoring status is New.

     

    9. Click Edit in the Runbook details page.

     

    10. Create an Azure AD user in the subscription and set it as a co-administrator. Add the user to the assets as a credential type.

    11. Edit the Runbook with the below code and Save the runbook.
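The runbook body referred to in step 11 might look roughly like the sketch below, using the classic Azure PowerShell cmdlets of that era (the asset name "AzureADCredential" is a placeholder; use the name of the credential asset created in step 10):

```powershell
workflow Connect-AzureSubscription
{
    # Pull the Azure AD credential stored as an automation asset (step 10).
    # "AzureADCredential" is a placeholder asset name.
    $Cred = Get-AutomationPSCredential -Name "AzureADCredential"

    # Authenticate against the subscription with the Azure AD account.
    Add-AzureAccount -Credential $Cred

    # Simple check that the connection works.
    Get-AzureSubscription
}
```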

    12. Test the runbook by clicking Start in the Test Pane.

     

    13. Test is passed.

    14. Publish the runbook by clicking publish in the Runbook detail page.

    15. You can schedule the run book based on the Recurrence/Date etc.

    Developer Preview – March Update


    We're pleased to announce the March update of the Developer Preview. We have been working hard on improving the capabilities of the toolset as well as fixing incoming issues reported by you. Below you can see the changes that we're announcing for this update. The preview is already available if you sign up for the Ready to Go program. Read more at http://aka.ms/readytogo.

    After April 2nd the build will become public and you can get it through http://aka.ms/bcsandbox.

    Please note that the improvements announced in this blog post are not available in Dynamics NAV 2018 and the following cumulative updates of Dynamics NAV 2018.

     

    Static Code Analysis

    Specifying "al.enableCodeAnalysis": true in your settings will enable static code analysis for AL projects.  Three analyzers have been implemented that will support general AL coding guidelines, AppSource, and per-tenant extension analysis. Analyzers can be individually enabled by specifiying them in the al.codeAnalyzers setting.

    "al.enableCodeAnalysis": true,

    "al.codeAnalyzers": [

    "${CodeCop}"

    ]

    You can customize how the diagnostics generated by the analyzers are reported by adding a custom ruleset file <myruleset>.ruleset.json to the project and specifying the path to it in the "al.ruleSetPath" setting.

    "al.ruleSetPath": "myruleset.ruleset.json"

    Using the snippets truleset and trule will get you started quickly.

    For more information, see Using the Code Analysis Tool.

    Help for new pages

    When creating new Pages, Reports, and XMLPorts in Extensions V2, it is now possible to specify the help link that will be used when the user presses the Help button in the user interface.

    You can do this by using the property HelpLink on Pages, for example:

    page 50100 MyPageWithHelp
    {
        HelpLink = 'https://www.github.com/Microsoft/AL';
    }

    And by using the property HelpLink on the request page of Reports and XmlPorts:

    report 50100 MyReportWithHelp
    {
        requestpage
        {
            HelpLink = 'https://www.github.com/Microsoft/AL';
        }
    }

    For more information, see Adding Help Links.

    Creating Role Center Headlines

    You can set up a Role Center to display a series of headlines, where headlines appear one at a time for a predefined period of time before moving to the next. The headlines can provide users with up-to-date information and insights into the business and their daily work.
    For more information, see Creating Role Center Headlines.

     

    Improved experience for event subscribers

    We improved the snippets and IntelliSense around event subscribers, for both the attribute arguments and the method parameters. This now works for trigger events, and for integration and business events. In the case of business and integration events, the method parameters are suggested based on the attributes of the event publisher, in order to determine whether the global variables and/or the sender should also be suggested.

    Here is what it looks like to subscribe to an integration event when using the snippets:

    Here is what it looks like when writing the event subscriber from scratch:

     

    Working with data?

    You can now inspect the contents of a table when you publish an AL project (F5 and Ctrl+F5) from Visual Studio Code. Simply modify the launch.json file of the project to include the "startupObjectType": "table" and "startupObjectId": "<table ID>" settings, replacing <table ID> with the ID of the table that you want to see. The table will display in the client as read-only.
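For example, a sketch of the relevant launch.json fragment (the table ID 18 is just an illustrative value):

```json
{
    "startupObjectType": "table",
    "startupObjectId": 18
}
```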

    From the client, you can also view a specific table by appending the URL with "&table=<table ID>", such as:
    https://businesscentral.dynamics.com/?company=CRONUS%20Inc.&table=18

    For more information, see Viewing Table Data.

     

    Choose your cue layout on Role Centers

    We now offer a wide layout option for cues. The wide layout is designed to display large values and gives you a way to emphasize a group of cues. When set to the wide layout, a cue group is placed in its own area, spanning the entire width of the workspace.

    For more information, see Cues and Action Tiles.

     

    As usual we encourage you to let us know how you like working with these additions and keep submitting suggestions and bugs. You can see all the filed bugs on our GitHub issues list (https://github.com/Microsoft/AL/issues).

     

    For a list of our previous blog posts, see the links at the end of this post.

    NAV Development Tools Preview - February 2018 Update

    NAV Development Tools Preview - Anniversary Update

    NAV Development Tools Preview - December 2017 Update

    NAV Development Tools Preview - November 2017 Update

    NAV Development Tools Preview - October 2017 Update

    NAV Development Tools Preview - September 2017 Update

    NAV Development Tools Preview - August 2017 Update

    NAV Development Tools Preview - July 2017 Update

    NAV Development Tools Preview - June 2017  Update

    NAV Development Tools Preview - April 2017 Update

    NAV Development Tools Preview - March 2017 Update

    NAV Development Tools Preview - February 2017 Update

    NAV Development Tools Preview - January 2017 Update

    Announcing the Preview of Modern Development Tools for Dynamics NAV

    Unchanged projects are rebuilt when building a solution that contains project references


    Hello, this is the Visual Studio support team.

    In this post, we would like to share some points to keep in mind about the rebuild behavior when building solutions in Visual Studio.

     

    Symptom

    When you open a solution that contains project references (*1) in Visual Studio, switch the solution configuration (e.g. Debug/Release), and build, projects may be rebuilt even though no changes have been made to them.

     

    (*1) A "project reference" is a way of referencing, from one project, another project contained in the same solution; when the referenced project changes, not only that project but also the referencing project is considered to need a build. It is a convenient form of reference when multiple projects are developed together in a single solution.

    In contrast, an "assembly reference", which references a DLL directly, is used when the target project and the referenced DLL are developed separately, for example third-party DLLs or DLLs provided as in-house shared libraries.

     

    Cause

    This behavior is a limitation arising from the designed behavior of Visual Studio.

    In a solution that contains project references, due to constraints in the logic Visual Studio uses to determine whether a build is required, the projects that reference other projects are rebuilt whenever the solution configuration differs from the one used for the most recent build. (*2)

    The "most recently built solution configuration" is kept in memory while Visual Studio is running, and is persisted in the .suo file (*3) after Visual Studio exits. When no .suo file exists, the default configuration (Debug) is assumed as the "most recently built solution configuration". Therefore, building the Release configuration with no .suo file present is always judged to be a change from the most recently built solution configuration, and the projects with project references are rebuilt.

    For this reason, a rebuild can occur every time if you build a solution in the Release configuration using the Visual Studio command-line tool devenv.exe while there is no .suo file recording a previous Release build.

     

    (*2) This behavior has been confirmed in versions of Visual Studio earlier than Visual Studio 2017. In Visual Studio 2017, builds from the IDE no longer show this behavior.

    (*3) The .suo file is a hidden file written automatically when Visual Studio exits, and it holds information such as user settings. Its file name differs by version: ***.v12.suo for Visual Studio 2013, and just the extension .suo for Visual Studio 2015.

     

    <Example: no .suo file>

    devenv.exe is run twice in a row with the Release configuration specified, yet the build is executed each time instead of being reported as up-to-date.

    c:\temp\VS2013\ConsoleApplication1_vb2013>devenv.exe /Build "Release|Any CPU" ConsoleApplication1_vb2013.sln
    Microsoft Visual Studio 2013 Version 12.0.40629.0
    Copyright (C) Microsoft Corp. All rights reserved.
    ------ Build started: Project: ConsoleApplication1_vb2013, Configuration: Release Any CPU ------

    ========== Build: 1 succeeded, 0 failed, 1 up-to-date, 0 skipped ==========

    c:\temp\VS2013\ConsoleApplication1_vb2013>devenv.exe /Build "Release|Any CPU" ConsoleApplication1_vb2013.sln

    Microsoft Visual Studio 2013 Version 12.0.40629.0

    Copyright (C) Microsoft Corp. All rights reserved.

    ------ Build started: Project: ConsoleApplication1_vb2013, Configuration: Release Any CPU ------

    ========== Build: 1 succeeded, 0 failed, 1 up-to-date, 0 skipped ==========

     

    Workaround

    If you open the target solution in Visual Studio once, build it in the Release configuration, and then exit Visual Studio so that an .suo file is written, unnecessary rebuilds can be avoided even when building the Release configuration with devenv.exe.

    <Example: .suo file present, written after building the Release configuration and exiting>

    c:\temp\VS2013\ConsoleApplication1_vb2013>devenv.exe /Build "Release|Any CPU" ConsoleApplication1_vb2013.sln
    Microsoft Visual Studio 2013 Version 12.0.40629.0
    Copyright (C) Microsoft Corp. All rights reserved.
    ========== Build: 0 succeeded, 0 failed, 2 up-to-date, 0 skipped ==========

    c:\temp\VS2013\ConsoleApplication1_vb2013>devenv.exe /Build "Release|Any CPU" ConsoleApplication1_vb2013.sln

    Microsoft Visual Studio 2013 Version 12.0.40629.0

    Copyright (C) Microsoft Corp. All rights reserved.

    ========== Build: 0 succeeded, 0 failed, 2 up-to-date, 0 skipped ==========

     

    We hope this information is helpful if you want to avoid unnecessary rebuilds as much as possible, for example when your projects are large.


    SQL Updates Newsletter – March 2018


    Recent Releases and Announcements

     

    Troubleshooting and Issue Alerts

    • Critical: Do NOT delete files from the Windows Installer folder. C:\windows\Installer is not a temporary folder and files in it should not be deleted. If you do this on machines on which you have SQL Server installed, you may have to rebuild the operating system and reinstall SQL Server.
    • Critical: Please be aware of a critical Microsoft Visual C++ 2013 runtime pre-requisite update that may be required on machines where SQL Server 2016 will be, or has been, installed.
      • https://blogs.msdn.microsoft.com/sqlcat/2016/07/28/installing-sql-server-2016-rtm-you-must-do-this/
      • If KB3164398 or KB3138367 are installed, then no further action is necessary. To check, run the following from a command prompt:
      • powershell get-hotfix KB3164398
      • powershell get-hotfix KB3138367
      • If the version of %SystemRoot%\system32\msvcr120.dll is 12.0.40649.5 or later, then no further action is necessary. To check, run the following from a command prompt:
      • powershell "get-item %systemroot%\system32\msvcr120.dll | select versioninfo | fl"
    • Important: If the Update Cache folder or some patches are removed from this folder, you can no longer uninstall an update to your SQL Server instance and then revert to an earlier update build.
      • In that situation, Add/Remove Programs entries point to non-existing binaries, and therefore the uninstall process does not work. Therefore, Microsoft strongly encourages you to keep the folder and its contents intact.
      • https://support.microsoft.com/en-us/kb/3196535
    • Important: You must precede all Unicode strings with a prefix N when you deal with Unicode string constants in SQL Server
    • Important: Default auto statistics update threshold change for SQL Server 2016
    • Audit SQL Server stop, start, restart
    • Heuristic DNS detections in Azure Security Center
      • We have heard from many customers about their challenges with detecting highly evasive threats. To help provide guidance, we published Windows DNS server logging for network forensics and the introduction of the Azure DNS Analytics solution
      • The benefits of examining DNS is its ability to observe connections across all possible network protocols from all client operating systems in a relatively small dataset. The compactness of this data is further aided by the default behavior of on-host caching of common domains.
      • https://azure.microsoft.com/en-us/blog/heuristic-dns-detections-in-azure-security-center/
    • How to configure tempdb in Azure SQL Managed Instance(preview)
      • One limitation in the current public preview is that tempdb settings are not maintained after fail-over. If you add new files to tempdb or change file size, these settings will not be preserved after fail-over, and original tempdb will be re-created on the new instance. This is a temporary limitation and it will be fixed during public preview.
      • However, since Managed Instance supports SQL Agent, and SQL Agent can be configured to execute a script when the SQL Agent starts, you can work around this issue and create a SQL Agent job that will pre-configure your tempdb.
      • https://blogs.msdn.microsoft.com/sqlserverstorageengine/2018/03/13/how-to-configure-tempdb-in-azure-sql-managed-instance/
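As a sketch, such a startup job might re-apply settings along these lines (the logical file name and size are illustrative; check sys.master_files for the actual tempdb file names on your instance):

```sql
-- Illustrative only: grow the primary tempdb data file from a SQL Agent
-- job scheduled to run when the Agent service starts.
ALTER DATABASE tempdb
MODIFY FILE (NAME = N'tempdev', SIZE = 4096MB);
```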

     

    Recent Blog Articles

     

    Recent Training and Technical Guides

     

    Script and Tool Tips

     

    Fany Carolina Vargas | SQL Dedicated Premier Field Engineer | Microsoft Services

    Running .NET applications client-side in the browser


    In this post, App Dev Managers Robert Schumann and Ben Hlaban introduce us to Blazor – an experimental web UI framework based on C#, Razor, and HTML that runs in the browser via WebAssembly.


    This journey started with a blog by Daniel Roth. Other than the YouTube video of Steve Sanderson’s prototype demo at NDC Oslo, there wasn’t much information to draw from.

    A few days later I mention Blazor to my colleague Ben, and he starts asking a bunch of rapid-fire questions. Whoa! Time-out. With coffee top-offs, we start a Skype call, launch Visual Studio, git clone repo, and intrigue quickly ensued.

    This blog is about getting started with Blazor. We’ll provide setup guidance, develop a cursory ToDo List application using the MVC pattern, and even do some unit testing. A second blog is intended to delve into E2E testing the application using Selenium and demonstrate how to position the project for CI/CD.

    Pre-requisites*

    * If you had to install any of the above, please do a cursory system reboot.

    Setup

    • Launch Visual Studio Installer
      • Make sure Visual Studio is up-to-date
      • Make sure “ASP.NET and web development” is enabled
      • Make sure “.NET Core cross-platform development” is enabled
    • Install Blazor project template
      • Double-click the previously downloaded file Blazor.VSExtension.VSIX
        or
      • At a command prompt, via VSIXInstaller.exe Blazor.VSExtension.VSIX

    Here we go…

    • In Visual Studio 2017, select File | New Project | Visual Studio | Web | Blazor application
    • Name this new project “HelloBlazor”. Click OK button.
    • Press CTRL + F5 to make sure the default baseline project works. IIS Express should spin up. The project eventually loads and is a typical Visual Studio templated SPA with Home, Counter, and Fetch Data features out of the box.


    • Right-click HelloBlazor project | Add | Class | Name = “Todo.cs” | OK

    namespace HelloBlazor
    {
        public class Todo
        {
            public string Description { get; set; }

            public bool IsComplete { get; set; }
        }
    }

    • Right-click HelloBlazor project | Add | Class | Name = “TodoComponent.cs” | OK

    using Blazor.Components;
    using System.Collections.Generic;

    namespace HelloBlazor
    {
        public class TodoComponent : RazorComponent
        {
            public IList<Todo> Todos = new List<Todo>();

            public Todo NewTodo = new Todo();

            public void AddTodo()
            {
                if (!string.IsNullOrWhiteSpace(NewTodo.Description))
                {
                    Todos.Add(new Todo { Description = NewTodo.Description, IsComplete = NewTodo.IsComplete });

                    NewTodo = new Todo();
                }
            }
        }
    }


    • Right-click HelloBlazor project | Add | New Item | Web | ASP.NET | Razor View | Name = “TodoList.cshtml” | OK

    @using HelloBlazor
    @inherits TodoComponent

    <h1>Todo List (@Todos.Count(todo => !todo.IsComplete))</h1>

    <ul style="list-style: none">
        @foreach (var todo in Todos)
        {
            <li>
                <input @bind(todo.Description) />
                <input type="checkbox" @bind(todo.IsComplete) />
            </li>
        }
        <li>
            <input @bind(NewTodo.Description) />
            <input type="checkbox" @bind(NewTodo.IsComplete) />
            <button @onclick(AddTodo)>Add</button>
        </li>
    </ul>

    • Finally, let’s add a menu link to the new page
      • Double-click or open the file Shared/NavMenu.cshtml
      • Add a new list item to the existing unordered list;

    <ul class='nav navbar-nav'>
        . . .
        <li>
            <a href='~/TodoList'>
                <span class='glyphicon glyphicon-th-list'></span> Todo List
            </a>
        </li>
    </ul>

    • Press CTRL + F5 to make sure the modified project works. The new page should be available on the left navbar from the “Todo List” link.


    Unit Testing

    • Right-click HelloBlazor solution | Add | New Project | Installed | Visual C# | Web | .NET Core | MSTest Test Project (.NET Core) | Name = HelloBlazor.Test | OK
    • Right-click Dependencies | Add Reference | Projects | Solution | HelloBlazor | OK
    • Right-click UnitTest1.cs file | Rename | Name = TodoComponentTests.cs | Yes
    • Within the TodoComponentTests class, rename TestMethod1 to AddTodo

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    namespace HelloBlazor.Tests
    {
        [TestClass]
        public class TodoComponentTests
        {
            [TestMethod]
            public void AddTodo()
            {
                //arrange
                var todoComponent = new TodoComponent();
                var description = "this is a test";
                var isComplete = false;

                //act
                todoComponent.NewTodo.Description = description;
                todoComponent.NewTodo.IsComplete = isComplete;
                todoComponent.AddTodo();

                //assert
                Assert.IsTrue(todoComponent.Todos.Count == 1);
                Assert.IsTrue(todoComponent.Todos[0].Description == description);
                Assert.IsTrue(todoComponent.Todos[0].IsComplete == isComplete);
            }
        }
    }

    • Press CTRL+R,A to run all tests.


    References


    Premier Support for Developers provides strategic technology guidance, critical support coverage, and a range of essential services to help teams optimize development lifecycles and improve software quality.  Contact your Application Development Manager (ADM) or email us to learn more about what we can do for you.

    Top stories from the VSTS community – 2018.03.23


    Here are top stories we found in our streams this week related to DevOps, VSTS, TFS and other interesting topics.

    Top Stories

    Podcasts & Videos

    • How Windows, Windows Store, Xbox & Surface streamlined their planning and centralized on VSTS - Donovan Brown
      A very in-depth interview with Jill Campbell, a Group Program Manager in the Windows and Devices Group (WDG) at Microsoft. Jill describes how Windows moved from various tools to a unified process powered by VSTS, and how VSTS scales for them, along with the flexibility and customizability that enabled them to be successful in managing over 11 million work items, across 33,000 users (across all of Microsoft), with teams whose processes range from Agile through to Waterfall.
    • Radio TFS 157: Getting into the Groove
      Josh and Paul catch Greg up on the MVP Summit and while they can't share the content, sounds like they can share some of the colds they caught. Lots of news including TFS updates and TFS to VSTS migration tips.

    If you have feedback or would like to see a story included in next week's round-up, then please leave a comment below or use the #VSTS hashtag on Twitter.

    OAUTH 2.0 protocol support level for ADFS 2012R2 vs ADFS 2016


    Active Directory Federation Services (ADFS) is a software component developed by Microsoft that can be installed on Windows Server operating systems to provide users with single sign-on access to systems and applications located across organizational boundaries. It uses a claims-based access control authorization model to maintain application security and implement federated identity.

    OAuth 2.0 is an open standard created by the IETF for authorization and is documented by RFC 6749 (https://tools.ietf.org/html/rfc6749). Generally, OAuth provides clients with "secure delegated access" to server resources on behalf of a resource owner. It specifies a process for resource owners to authorize third-party access to their server resources without sharing their credentials. Designed specifically to work with Hypertext Transfer Protocol (HTTP), OAuth essentially allows access tokens to be issued to third-party clients by an authorization server, with the approval of the resource owner. The third party then uses the access token to access the protected resources hosted by the resource server.

    Starting with Windows Server 2012 R2, ADFS (version 3.0) supports the OAuth 2.0 authorization protocol, and this post tries to clarify what this means. OAuth 2.0 defines various authorization grant, client, and token types. ADFS started with support for a subset of these, and increased this support over time with Windows Server 2016 and its ADFS version 4.0.

    Authorization Grants

    • Authorization code grant (ADFS 2012 R2: yes; ADFS 2016: yes): used to obtain both access tokens and refresh tokens; optimized for confidential clients (i.e. mobile apps).

    • Implicit grant (ADFS 2012 R2: no; ADFS 2016: yes): used to obtain access tokens (it does not support the issuance of refresh tokens); optimized for public clients known to operate a particular redirection URI. These clients are typically implemented in a browser using a scripting language such as JavaScript.

    • Resource owner password credentials grant (ADFS 2012 R2: no; ADFS 2016: yes).

    • Client credentials grant (ADFS 2012 R2: no; ADFS 2016: yes).
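As an illustration of the client credentials grant added in ADFS 2016, here is a sketch of how the form-encoded token request body is assembled (the endpoint, client ID, secret and resource identifier are hypothetical placeholders, not values from a real deployment):

```python
from urllib.parse import urlencode

# Hypothetical ADFS 2016 token endpoint; replace with your federation
# service name.
token_endpoint = "https://adfs.contoso.com/adfs/oauth2/token"

def client_credentials_body(client_id, client_secret, resource):
    """Build the form-encoded body of an OAuth 2.0 client credentials
    token request (POSTed to the token endpoint)."""
    return urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": resource,
    })

body = client_credentials_body("my-client", "s3cret", "https://api.contoso.com")
print(body)
```

The resulting string is sent as the POST body with a Content-Type of application/x-www-form-urlencoded; the authorization server responds with a JSON document containing the access token.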

    Client Types

    • Public client (ADFS 2012 R2: yes; ADFS 2016: yes)
    • Confidential client (ADFS 2012 R2: no; ADFS 2016: yes)

    OAuth confidential client authentication methods:

    • Symmetric (shared secret / password)
    • Asymmetric keys
    • Windows Integrated Authentication (WIA)

    Token Types

    • id_token (ADFS 2012 R2: no; ADFS 2016: yes): a JWT token used to represent the identity of the user. The 'aud' or audience claim of the id_token matches the client ID of the native or server application.

    • access_token (ADFS 2012 R2: yes; ADFS 2016: yes): a JWT token used in OAuth and OpenID Connect scenarios, intended to be consumed by the resource. The 'aud' or audience claim of this token must match the identifier of the resource or Web API.

    • refresh_token (ADFS 2012 R2: yes; ADFS 2016: yes): this token is submitted in place of collecting user credentials to provide a single sign-on experience. It is both issued and consumed by AD FS, and is not readable by clients or resources.

    ADFS issues access tokens and refresh tokens in the JWT (JSON Web Token) format in response to successful authorization requests using the OAuth 2.0 protocol. ADFS does not issue SAML tokens over the OAuth 2.0 authorization protocol.
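Because these are standard JWTs, their claims (such as the 'aud' audience claim described above) can be inspected by base64url-decoding the payload segment. A minimal sketch follows; the sample token is built inline for illustration, is not a real ADFS token, and no signature validation is performed:

```python
import base64
import json

def jwt_claims(token):
    """Decode the (unverified) payload segment of a JWT."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64url padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

def _segment(obj):
    # base64url-encode a JSON object without padding, as JWTs do.
    raw = base64.urlsafe_b64encode(json.dumps(obj).encode())
    return raw.rstrip(b"=").decode()

# Sample token with hypothetical issuer/audience values.
sample = ".".join([
    _segment({"alg": "RS256", "typ": "JWT"}),
    _segment({"aud": "https://api.contoso.com",
              "iss": "http://adfs.contoso.com/adfs/services/trust"}),
    "signature",
])

claims = jwt_claims(sample)
print(claims["aud"])  # the resource identifier the token is intended for
```

In production the signature must of course be validated against the token-signing certificate before any claim is trusted.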

    Further information

    YOLO on Azure Deep Learning Virtual Machine (Windows)


    Video credits: Greenwood Campbell at CherryBot Launch

    YOLO "You only look once" by Joseph Redmon

    A state of the art real-time object detection system that works using what I can only describe as magic: https://arxiv.org/abs/1612.08242

    If you've seen any of the youtube.com videos demonstrating it, you may be curious to set up a development rig and play with it yourself.  However, setting up an environment isn't that simple, as there are quite a few dependencies, all evolving at a different cadence.  Depending on what you've already got installed on your machine, you can end up in a bit of a mess, scratching your head as to why it's not working.

    I spent a couple of hours this week setting up my Surface Book (with GPU) to experiment with YOLO.  However, after doing so I could only get Tiny YOLO to work, as I kept hitting CUDA out-of-memory errors.  Then I had a dawning moment: why don't I just use Azure's Deep Learning Virtual Machine (DLVM) with a GPU?  Here is a guide to getting your own DLVM setup working with YOLO.

    Note: There are numerous guides out there to walk through setting up a development environment for YOLO.  I strongly encourage you to read the following first: https://pjreddie.com/darknet/ and https://github.com/AlexeyAB/darknet/blob/master/README.md.

    Solution

    In the Azure Portal, create a Deep Learning Virtual Machine (DLVM) on an NC-Series GPU size, on Windows (Linux is also available).  Once provisioned, remote in:

    Dependencies

    The DLVM has a bucket load of tools already installed on it.  However, there are a few updates and additional libraries you need to install to get YOLO up and running.

    Environment setup

    • Tweak the c:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v9.1\include\crt\host_config.h file to update it with the latest VS version:

    • Clone the darknet repo: https://github.com/AlexeyAB/darknet.git
    • Copy the C:\Program Files (x86)\Microsoft Visual Studio\Preview\Community\VC\Auxiliary\Build\14.11 props file to the darknet\build\darknet solution folder, e.g.:

    • Open the darknet.sln solution - retarget the project

    • Build (Release, x64)
    • Open the solution/x64 output directory and you should see darknet.exe 🙂
    • Copy the OpenCV dependencies into the output directory above:
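With the build done, a typical way to smoke-test the detector from the output directory looks roughly like this (the cfg and weights file names are illustrative and depend on what you download from the darknet site):

```bat
REM Assumes yolov2.weights has been downloaded separately
REM (see https://pjreddie.com/darknet/ for the cfg/weights pairs).
darknet.exe detector test cfg\coco.data cfg\yolov2.cfg yolov2.weights data\dog.jpg
```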

    Enjoy.
