Channel: MSDN Blogs

No environment monitoring on customer/partner-owned environments


Using a purchased Azure subscription, customers and partners can deploy a Microsoft Dynamics 365 for Finance and Operations, Enterprise edition environment through the Cloud Hosted Environments feature in LCS. These environments are owned and controlled by the customer or partner. In the past, you could use the Activity monitoring page to view the usage and telemetry data that is used to diagnose issues and build a storyboard view of the system during a specific timeframe. The page could be accessed from the Environment monitoring link on the Environment details page.

 

Due to recent changes in compliance and security, the collection of telemetry data from environments that are hosted in customer and partner subscriptions will no longer be enabled. Going forward, the Activity monitoring page will be blank for these environments. This blog post is to notify you that in the next release of LCS, we will temporarily remove the Activity monitoring and Health metrics tabs from the Environment monitoring portal. You will still be able to access the performance dashboard on the SQL Insights page. We will update this blog post when we re-enable telemetry collection on these environments.


Accelerate your career path with Microsoft Azure cloud skills!


The more you understand how to apply cloud computing skills, the more opportunities open up for you: new career paths, new jobs, and more. According to IDC, 38 percent of IT positions will be cloud related by 2021, so there's no better time to learn cloud skills. Once you do, certification is the industry-recognised way to validate your Microsoft Azure expertise, and there are options for everyone, no matter where you are in your career journey.

> Find the cloud certification skill for you

Role-based Azure learning paths

If you're an IT professional or a developer, you can start building practical job skills you can use right away with role-based learning paths. They include free self-paced courses provided in partnership with Pluralsight. These paths allow you to learn at your own speed, in a style that works for you. They include hands-on labs, virtual instructor–led labs, and assessments to test your skill level. Upon completion of those assessments, you receive a digital badge that showcases your achievement.

> Explore role-based Azure learning paths

Advanced skills training with a community to support you

You can continue your learning journey or advance your existing skills by learning from industry experts and gaining a formal qualification. Microsoft Next Up Exam Camps use a variety of modalities, so you can study at a time and place that best fits in with you. This helps you build your skills and gain the expertise you need to achieve certification and advance your career. When you certify your skills by taking exams, you earn validation recognised across the industry.

> Get support on your certification journey with Next Up

Azure skills certification options for everyone

Show off your hard work and expertise by validating your Azure skills with industry-recognised certification. There is a solid portfolio of Azure exams, and between now and March 30, 2018, you can take them for a reduced cost. Kick-start your career by learning—and proving—new cloud skills.

Find the certification that's right for you, start learning, and earn the certification that will open doors for your career!

> Start learning 

 

Git and Visual Studio 2017 Part 9: Sharing a Solution with Git


In the previous article, we looked at temporarily stashing work in progress when switching branches. This time we will look at how to share your local solution with others using Git. There are many ways to share, but here we will share via GitHub.com. If you don't have an account, go to http://github.com and sign up.

Creating a repository on GitHub

First, create the repository we will use on GitHub.com.

1. Log in to http://github.com.

2. Click “Start a project”.

3. Enter a name and a description, and create the repository with all settings at their defaults.

Adding a remote and pushing

Next, register the information for the remote server you just created with Git.

1. In the GitHub repository you created, check the git commands shown for adding a remote.

2. As displayed, run ‘git remote add origin https://github.com/kenakamu/VS_Git.git’. This makes the specified URL available under the name ‘origin’. Any name will do, but ‘origin’ is the common convention, so we use it here.

3. Run ‘git remote -v’ to check the current remotes.

4. Run ‘git branch -a’. The -a option also lists remote-tracking branches. At this point there are only the two local branches.

5. Run ‘git push -u origin master’ to push the local master branch to the remote. When you push, Git adds a special branch called a remote-tracking branch. The -u option also links the local master branch to the remote-tracking branch, so from now on a plain ‘git push’ is enough. According to the log, 34 items were synchronized.

6. Check on GitHub as well. You can see that there are 6 commits.

7. Click the “6 commits” link to see the commit history. You can check the commit IDs and the user who added each commit. I will explain in a later article why no photo is shown.

8. To see what happens when another developer commits a change, add a change on GitHub. Click “Add a README” on the Code tab.

9. Without editing anything, click “Commit new file” to commit.

10. Check README.md in the file list.
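The sequence above boils down to a few commands. Here is a minimal sketch you can run locally, using a local bare repository as a stand-in for the GitHub URL (all paths are illustrative):

```shell
# Scratch area; a local bare repo stands in for the GitHub repository
root=$(mktemp -d)
git init --bare "$root/VS_Git_remote.git"

# A working repository with one commit on master
git init "$root/VS_Git" && cd "$root/VS_Git"
git config user.email you@example.com
git config user.name  "You"
echo hello > Program.cs
git add . && git commit -m "initial commit"
git branch -M master                  # ensure the branch is named master

# Register the remote under the conventional name 'origin'
git remote add origin "$root/VS_Git_remote.git"
git remote -v                         # verify the configured remotes

# Push master and set upstream tracking (-u) so that a plain
# 'git push' is enough from now on
git push -u origin master
git branch -a                         # now also lists remotes/origin/master
```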

Pulling changes from the remote

Git has two ways to receive changes from a remote, fetch and pull, and they play different roles. Let's see them in action.

1. Run ‘git branch -a’ to check the local branches and remote-tracking branches. remotes/origin/master, shown in red, is the remote-tracking branch.

2. The .git\refs\heads folder contains only the local branches.

3. The remote-tracking branch lives in .git\refs\remotes\origin.

4. Check the file contents with ‘type .git\refs\remotes\origin\master’. Like a local branch, it contains a SHA-1 hash. At the moment it points at the same commit as the local master.

5. Run ‘git fetch’. Fetch synchronizes the remote information into the remote-tracking branch only.

6. Check the contents of each branch file again. The remote-tracking branch now points at the new commit.

7. Looking at the commit history on GitHub, you can see the same ID.

8. Next, run ‘git pull’. A message reports that the commit is being updated, and README.md is created.

9. Check where the local master branch points with ‘type .git\refs\heads\master’. It points at the same hash as the remote-tracking branch.
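The fetch/pull distinction can be reproduced without GitHub, using two clones of a local bare repository that stands in for the remote (paths illustrative):

```shell
root=$(mktemp -d)
git init --bare "$root/origin.git"

# Our repository: one commit pushed to the remote
git clone "$root/origin.git" "$root/local" 2>/dev/null
cd "$root/local"
git config user.email you@example.com && git config user.name "You"
echo v1 > file.txt && git add . && git commit -qm "initial"
git branch -M master && git push -qu origin master

# Another developer's clone adds README.md and pushes it
git clone "$root/origin.git" "$root/other" 2>/dev/null
cd "$root/other"
git config user.email other@example.com && git config user.name "Other"
echo readme > README.md && git add . && git commit -qm "Add README"
git push -q origin master

# Back in our repository: fetch moves only the remote-tracking branch...
cd "$root/local"
git fetch -q
git rev-parse origin/master    # the new commit
git rev-parse master           # still the old commit

# ...while pull also merges it into the local branch
git pull --ff-only -q
ls README.md                   # README.md now exists locally
```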

Resolving conflicts

We have already seen how to resolve conflicts between local branches; let's check whether conflicts with the remote work the same way.

1. Run ‘echo "Change added locally" >> README.md’ to edit README.md.

2. Run ‘git commit -am "Updated README.md locally"’ to create a commit.

3. Check the current state with ‘git log --oneline --graph --all’. The local master and the remote-tracking master point at different commits.

4. Edit README.md from GitHub as well. Select the file and click the pencil icon.

5. Change the file and commit.

6. Run ‘git push’ to send the local change to the remote. It is rejected because the remote has been updated.

7. Run ‘git pull’ to bring in the remote change first. The fetch part succeeds, but a conflict occurs when merging into the local branch.

8. Confirm that the fetch succeeded with ‘git log --oneline --graph --all’. The remote-tracking master now points at a newer commit.

9. Run ‘git mergetool’ to merge the conflict.

10. After resolving the conflict, commit with ‘git commit -m "Merge README.md"’.

11. Run ‘git log --oneline --graph --all’ to check the merge history. The local master now points at a newer commit than the remote-tracking branch.

12. Run ‘git push’. This time it succeeds. Check README.md on the GitHub side as well.
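The push-rejected / pull-conflict / merge cycle above can be reproduced end to end with a local bare repository standing in for GitHub (all names illustrative; the conflict is resolved here by keeping both lines instead of running a mergetool):

```shell
root=$(mktemp -d)
git init --bare "$root/origin.git"
git clone "$root/origin.git" "$root/a" 2>/dev/null
cd "$root/a"
git config user.email a@example.com && git config user.name "A"
echo base > README.md && git add . && git commit -qm "base"
git branch -M master && git push -qu origin master

# A second clone pushes a competing change to the same line
git clone "$root/origin.git" "$root/b" 2>/dev/null
cd "$root/b"
git config user.email b@example.com && git config user.name "B"
echo remote-change > README.md
git commit -qam "edit on remote" && git push -q origin master

# Back in the first clone: commit a local edit, see the push rejected,
# then see the pull stop on a conflict
cd "$root/a"
echo local-change > README.md
git commit -qam "edit locally"
git push -q origin master || echo "push rejected"
git pull --no-rebase 2>&1 | grep -i conflict || true

# Resolve (here we simply keep both lines), commit the merge, and push
printf 'local-change\nremote-change\n' > README.md
git add README.md
git commit -qm "Merge README.md"
git push -q origin master
```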

Synchronizing branches

At the moment only the master branch is synchronized, so let's synchronize the dev branch too.

1. Run ‘git push -u origin dev’. As with master, the branch is created on GitHub.

2. Next, let's consider what happens when a branch is deleted on the remote side. First run ‘git branch -a’ to check the current state.

3. For more detail, run ‘git remote -v show origin’. Each branch is paired with its remote-tracking branch for pull and push.

4. Delete the “dev” branch on GitHub by clicking the trash-can icon in the branch list.

5. Run ‘git remote prune origin’. Because the remote dev branch no longer exists, origin/dev is reported as pruned.

6. List the branches with ‘git branch -a’ and confirm that remotes/origin/dev is gone from the remote-tracking branches.

7. Run ‘git checkout dev’. If the branch is still linked to the remote-tracking branch, a warning is shown; in that case, run ‘git branch --unset-upstream’.

8. To test another scenario, push the dev branch to the remote again, this time without the -u option. After running ‘git push origin dev’, check the state with ‘git remote show origin’.

9. Run ‘git push’ on the dev branch. Because -u was not used when pushing, the branch is not paired with a remote-tracking branch, so this fails. Running ‘git push -u origin dev’ again sets up the pairing.

10. You can also delete a GitHub branch from your local machine. Run ‘git push origin -d dev’; the -d option pushes the deletion.

11. Run ‘git branch -a’ and confirm that the local remote-tracking branch is gone as well.

12. Run ‘git push -u origin dev’ once more to re-create the remote branch.
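Branch publication and deletion can likewise be sketched against a local bare repository (paths illustrative):

```shell
root=$(mktemp -d)
git init --bare "$root/origin.git"
git clone "$root/origin.git" "$root/work" 2>/dev/null
cd "$root/work"
git config user.email you@example.com && git config user.name "You"
echo x > file.txt && git add . && git commit -qm "initial"
git branch -M master && git push -qu origin master

# Publish a dev branch, then delete it on the remote from here
git checkout -qb dev
git push -qu origin dev
git push -q origin -d dev      # -d pushes a deletion to the remote

# A push with -d also drops the local remotes/origin/dev; if the branch
# had instead been deleted on the server side (as on GitHub above),
# 'git remote prune origin' would remove the stale tracking branch
git remote prune origin
git branch -a                  # remotes/origin/dev is no longer listed
```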

Cloning the remote

If you get a new PC or accidentally delete your local solution, you can restore everything from GitHub.

1. Delete the local VS_Git folder.

2. Go to GitHub, click the “Clone or download” button, and copy the address.

3. Open a command prompt and move to the folder where you want to clone the solution.

4. Run ‘git clone https://github.com/kenakamu/VS_Git.git’.

5. Confirm that the VS_Git folder is created.

6. Move into the VS_Git folder and check the state with ‘git log --oneline --graph --all’. All the commits are there, but there is no local dev branch.

7. Check with ‘git branch -a’. Indeed, there is no local dev branch.

8. Run ‘git checkout dev’. The local dev branch is created and, at the same time, paired with the remote-tracking branch.

Removing the remote

Finally, let's remove the remote.

1. Run ‘git remote remove origin’, then check the branches with ‘git branch -a’.

2. That is all it takes, but since we will run the same experiments with Visual Studio next time, do a hard reset on each branch as shown below.

Summary

Understanding the concepts behind remotes and remote-tracking branches is the important part. If you remember how to clone and how to synchronize branches, you don't need to worry even if you lose your local files. Next time we will look at the remote features of Visual Studio 2017.

Kenichiro Nakamura

RyuJIT Just-in-Time Compiler Optimization Enhancements


I'd like to tell you about some of the recent changes we've made as part of our ongoing work to extend the optimization capabilities of RyuJIT, the MSIL-to-native code generator used by .NET Core and .NET Framework. I hope it will make for an interesting read, and offer some insight into the sorts of optimization opportunities we have our eyes on.

Note: The changes described here landed after the release fork for .NET Core 2.0 was created, so they are available in daily preview builds but not the released 2.0 bits. Similarly, these changes landed after the fork for .NET Framework 4.7.1 was created. The changes to struct argument passing and block layout, which are purely JIT changes, will automatically propagate to subsequent .NET Framework releases with the new JIT bits (the RyuJIT sources are shared between .NET Core and .NET Framework); the other changes depend on their runtime components to propagate to .NET Framework.

Improvements for Span

Some of our work was motivated by the introduction of Span<T>, so that it and similar types could better deliver on their performance promises.

One such change was #10910, which made the JIT recognize the Item property getters of Span<T> and ReadOnlySpan<T> as intrinsics -- the JIT now recognizes calls to these getters and, rather than generate code for them the same way it would for other calls, it transforms them directly into code sequences in its intermediate representation that are similar to the sequences used for the ldelem MSIL opcode that fetches an element from an array. As noted in the PR's performance assessment (n.b., if you follow that link, see also the follow-up where the initially-discovered regressions were fixed with subsequent improvements in #10956 and dotnet/roslyn#20548), this improved several benchmarks in the tests we added to track Span<T> performance, by allowing the existing JIT code that optimized array bound checks that are redundant with prior checks, or that are against arrays with known constant length, to kick in for Span<T> as well. This is what some of those improved benchmark methods look like, and their improvements:

Building on that, change #11521 updated the analysis machinery the JIT uses to eliminate bounds checks for other provably in-bounds array accesses, to similarly eliminate bounds checks for provably in-bounds Span<T> accesses (in particular, bounds checks in for loops bounded by span.Length). As noted in the PR (numbers here), this brought the codegen for four more microbenchmarks in the Span<T> tests up to par with the codegen for equivalent patterns with arrays; here are two of them:

One key fact that these bounds-check removal optimizations exploit is that array lengths are immutable; any two loads of a.Length, if a refers to the same array each time, will load the same length value. It's common for the JIT to encounter different accesses to the same array, where the reference to the array is held in a local or parameter of type T[], such that it can determine that intervening code hasn't modified the local/parameter in question, even if that intervening code has unknown side-effects. The same isn't true for parameters of type ref T[], since intervening code with unknown side-effects might change which array object is referenced. Consider:
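A minimal sketch of the contrast being described (all names here are hypothetical):

```csharp
using System;

class Example
{
    // Stands in for an opaque call with unknown side effects.
    static void Helper() { }

    static void Zero(int[] a, ref int[] b)
    {
        // 'a' is an ordinary array parameter: no matter what Helper() does,
        // 'a' still refers to the same array, so a.Length is stable and the
        // JIT can prove a[i] is in bounds and drop the bounds check.
        for (int i = 0; i < a.Length; i++)
        {
            Helper();
            a[i] = 0;
        }

        // 'b' is passed by reference: Helper() (or anything it calls) might
        // change which array object 'b' refers to, so b.Length must be
        // reloaded and the bounds check must stay.
        for (int i = 0; i < b.Length; i++)
        {
            Helper();
            b[i] = 0;
        }
    }
}
```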

Since Span<T> is a struct, some platforms' ABIs specify that passing an argument of type Span<T> actually be done by creating a copy of the struct in the caller's stack frame, and passing a pointer to that copy in to the callee via the argument registers/stack. The JIT's internal modeling of this convention is to rewrite Span<T> parameters as ref Span<T> parameters. That internal rewrite at first caused problems for applying bounds-check removal optimizations to spans passed as parameters. The problem was that methods written with by-value Span<T> parameters, which at source look analogous to by-value array parameter a in the example above, when rewritten looked to the JIT like by-reference parameters, analogous to by-reference array parameter b above. This caused the JIT to handle references to such parameters' Length fields with the same conservativism needed for b above. Change #10453 taught the JIT to make local copies of such parameters before doing that rewrite (in beneficial cases), so that bounds-check removal optimizations can equally apply to spans passed by value. As noted in the PR, this change allowed these optimizations to fire in 9 more of the Span<T> micro-benchmarks in our test suite; here are three of them:

This last change applies more generally to any structs passed as parameters (not just Span<T>); the JIT is now better able to analyze value propagation through their fields.

Enum.HasFlag Optimization

The Enum.HasFlag method offers nice readability (compare targets.HasFlag(AttributeTargets.Class | AttributeTargets.Struct) vs (targets & (AttributeTargets.Class | AttributeTargets.Struct)) == (AttributeTargets.Class | AttributeTargets.Struct)), but, since it needs to handle reflection cases where the exact enum type isn't known until run-time, it is notoriously expensive. Change #13748 taught the JIT to recognize when the enum type is known (and known to equal the argument type) at JIT time, and to generate the simple bit test rather than the expensive Enum.HasFlag call. Here's a micro-benchmark to demonstrate, comparing .NET Core 2.0 (which doesn't have this change) to a recent daily preview build (which does). Much thanks to @adamsitnik for making it easy to use BenchmarkDotNet with daily preview builds of .NET Core!
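A sketch of what such a BenchmarkDotNet micro-benchmark looks like (the type names, iteration count, and flag values here are inferred from the disassembly further below, so treat the exact shape as illustrative):

```csharp
using System;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

namespace HasFlagBench
{
    public class Bench
    {
        private AttributeTargets targets = AttributeTargets.Class;
        private bool result;

        [Benchmark]
        public void HasFlag()
        {
            // 1000 iterations = 0x3E8, the loop bound visible in the disassembly;
            // Class | Struct = 0x0C, the constant in the optimized bit test
            for (int i = 0; i < 1000; i++)
                result = targets.HasFlag(AttributeTargets.Class | AttributeTargets.Struct);
        }
    }

    class Program
    {
        static void Main() => BenchmarkRunner.Run<Bench>();
    }
}
```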

Output:

BenchmarkDotNet=v0.10.9.313-nightly, OS=Windows 10 Redstone 2 [1703, Creators Update] (10.0.15063)
Processor=Intel Core i7-4790 CPU 3.60GHz (Haswell), ProcessorCount=8
Frequency=3507517 Hz, Resolution=285.1020 ns, Timer=TSC
.NET Core SDK=2.1.0-preview1-007228
  [Host]     : .NET Core 2.1.0-preview1-25719-04 (Framework 4.6.25718.02), 64bit RyuJIT
  Job-WFNGKY : .NET Core 2.0.0 (Framework 4.6.00001.0), 64bit RyuJIT
  Job-VIXUQP : .NET Core 2.1.0-preview1-25719-04 (Framework 4.6.25718.02), 64bit RyuJIT
Method   Toolchain                          Mean         Error      StdDev
HasFlag  .NET Core 2.0                      14,917.4 ns  80.147 ns  71.048 ns
HasFlag  .NET Core 2.1.0-preview1-25719-04     449.3 ns   1.239 ns   1.034 ns

With the cool new BenchmarkDotNet DisassemblyDiagnoser (again thanks to @adamsitnik), we can see that the optimized code really is a simple bit test:

Bench.HasFlag

RyuJIT x64 .NET Core 2.0:
HasFlagBench.Bench.HasFlag():
push    rdi
push    rsi
push    rbx
sub     rsp,20h
mov     rsi,rcx
xor     edi,edi
L1:
mov rcx, [[AttributeTargets type]]
call    [[box]]
mov     rbx,rax
mov rcx, [[AttributeTargets type]]
call    [[box]]
mov     ecx,dword ptr [rsi+8]
mov     dword ptr [rbx+8],ecx
mov     rcx,rbx
mov     dword ptr [rax+8],0Ch
mov     rdx,rax
call    [[System.Enum.HasFlag]]
mov     byte ptr [rsi+0Ch],al
inc     edi
cmp     edi,3E8h
jl      L1
add     rsp,20h
pop     rbx
pop     rsi
pop     rdi
ret

RyuJIT x64 .NET Core 2.1.0-preview1-25719-04:
HasFlagBench.Bench.HasFlag():
xor     eax,eax
mov     edx,dword ptr [rcx+8]
L1:
mov     r8d,edx
and     r8d,0Ch
cmp     r8d,0Ch
sete    r8b
mov     byte ptr [rcx+0Ch],r8b
inc     eax
cmp     eax,3E8h
jl      L1
ret

What's more, implementing this optimization involved implementing a new scheme for recognizing intrinsics in the JIT, which is more flexible than the previous scheme, and which is being leveraged in the implementation of Intel SIMD intrinsics for .NET Core.

Block Layout for Search Loops

Outside of profile-guided optimization, the JIT has traditionally been conservative about rearranging the basic blocks of methods it compiles, leaving them in MSIL order except to segregate code it identifies as "rarely-run" (e.g. blocks that throw or catch exceptions). Of course, MSIL order isn't always the most performant one; notably, in the case of loops with conditional exits/returns, it's generally a good idea to keep the in-loop code together, and move everything on the exit path after the conditional branch out of the loop. For particularly hot loops, this can cause a significant enough difference that developers have been using gotos to make the MSIL order reflect the desired machine code order. Change #13314 updated the JIT's loop detection to effect this layout automatically. As usual, the PR included a performance assessment, which noted speed-ups in 5 of the benchmarks in our performance test suite.

Again comparing .NET Core 2.0 (which didn't have this change) to a recent daily preview build (which does), let's look at the effect on the repro case from the GitHub issue describing this opportunity:

The results confirm that the new JIT brings the performance of the loop with the in-place return in line with the performance of the loop with the goto, and that doing so constituted a 15% speed-up:

BenchmarkDotNet=v0.10.9.313-nightly, OS=Windows 10 Redstone 2 [1703, Creators Update] (10.0.15063)
Processor=Intel Core i7-4790 CPU 3.60GHz (Haswell), ProcessorCount=8
Frequency=3507517 Hz, Resolution=285.1020 ns, Timer=TSC
.NET Core SDK=2.1.0-preview1-007228
  [Host]     : .NET Core 2.0.0 (Framework 4.6.00001.0), 64bit RyuJIT
  Job-NHAVNC : .NET Core 2.0.0 (Framework 4.6.00001.0), 64bit RyuJIT
  Job-CTEHPT : .NET Core 2.1.0-preview1-25719-04 (Framework 4.6.25718.02), 64bit RyuJIT

Method      Toolchain                          Mean      Error      StdDev
LoopReturn  .NET Core 2.0                      61.97 ns  0.1254 ns  0.1111 ns
LoopGoto    .NET Core 2.0                      53.63 ns  0.5171 ns  0.4837 ns
LoopReturn  .NET Core 2.1.0-preview1-25719-04  53.75 ns  0.5089 ns  0.4511 ns
LoopGoto    .NET Core 2.1.0-preview1-25719-04  53.52 ns  0.0999 ns  0.0934 ns

Disassembly confirms that the difference is entirely block placement:

LoopWithExit.LoopReturn

RyuJIT x64 .NET Core 2.0:
LoopLayoutBench.LoopWithExit.LoopReturn_
(System.String, System.String):
sub     rsp,18h
xor     eax,eax
mov     qword ptr [rsp+10h],rax
mov     qword ptr [rsp+8],rax
mov     ecx,dword ptr [rdx+8]
mov     qword ptr [rsp+10h],rdx
mov     rax,rdx
test    rax,rax
je      L1
add     rax,0Ch
L1:
mov     qword ptr [rsp+8],r8
mov     rdx,r8
test    rdx,rdx
je      L2
add     rdx,0Ch
L2:
test    ecx,ecx
je      L5
L3:
movzx   r8d,word ptr [rax]
movzx   r9d,word ptr [rdx]
cmp     r8d,r9d
je      L4
xor     eax,eax
add     rsp,18h
ret
L4:
add     rax,2
add     rdx,2
dec     ecx
test    ecx,ecx
jne     L3
L5:
mov     eax,1
add     rsp,18h
ret

RyuJIT x64 .NET Core 2.1.0-preview1-25719-04:
LoopLayoutBench.LoopWithExit.LoopReturn_
(System.String, System.String):
sub     rsp,18h
xor     eax,eax
mov     qword ptr [rsp+10h],rax
mov     qword ptr [rsp+8],rax
mov     eax,dword ptr [rdx+8]
mov     qword ptr [rsp+10h],rdx
test    rdx,rdx
je      L1
add     rdx,0Ch
L1:
mov     qword ptr [rsp+8],r8
mov     rcx,r8
test    rcx,rcx
je      L2
add     rcx,0Ch
L2:
test    eax,eax
je      L4
L3:
movzx   r8d,word ptr [rdx]
movzx   r9d,word ptr [rcx]
cmp     r8d,r9d
jne     L5
add     rdx,2
add     rcx,2
dec     eax
test    eax,eax
jne     L3
L4:
mov     eax,1
add     rsp,18h
ret
L5:
xor     eax,eax
add     rsp,18h
ret
LoopWithExit.LoopGoto

RyuJIT x64 .NET Core 2.0:
LoopLayoutBench.LoopWithExit.LoopGoto_
(System.String, System.String):
sub     rsp,18h
xor     eax,eax
mov     qword ptr [rsp+10h],rax
mov     qword ptr [rsp+8],rax
mov     eax,dword ptr [rcx+8]
mov     qword ptr [rsp+10h],rcx
test    rcx,rcx
je      L1
add     rcx,0Ch
L1:
mov     qword ptr [rsp+8],rdx
test    rdx,rdx
je      L2
add     rdx,0Ch
L2:
test    eax,eax
je      L4
L3:
movzx   r8d,word ptr [rcx]
movzx   r9d,word ptr [rdx]
cmp     r8d,r9d
jne     L5
add     rcx,2
add     rdx,2
dec     eax
test    eax,eax
jne     L3
L4:
mov     eax,1
add     rsp,18h
ret
L5:
xor     eax,eax
add     rsp,18h
ret

RyuJIT x64 .NET Core 2.1.0-preview1-25719-04:
LoopLayoutBench.LoopWithExit.LoopGoto_
(System.String, System.String):
sub     rsp,18h
xor     eax,eax
mov     qword ptr [rsp+10h],rax
mov     qword ptr [rsp+8],rax
mov     eax,dword ptr [rcx+8]
mov     qword ptr [rsp+10h],rcx
test    rcx,rcx
je      L1
add     rcx,0Ch
L1:
mov     qword ptr [rsp+8],rdx
test    rdx,rdx
je      L2
add     rdx,0Ch
L2:
test    eax,eax
je      L4
L3:
movzx   r8d,word ptr [rcx]
movzx   r9d,word ptr [rdx]
cmp     r8d,r9d
jne     L5
add     rcx,2
add     rdx,2
dec     eax
test    eax,eax
jne     L3
L4:
mov     eax,1
add     rsp,18h
ret
L5:
xor     eax,eax
add     rsp,18h
ret

Conclusion

We're constantly pushing to improve our codegen, whether it's to enable new scenarios/features (like Span<T>), or to ensure good performance for natural/readable code (like calls to HasFlag and returns from loops). As always, we invite anyone interested to join the community pushing this work forward. RyuJIT documentation available online includes an overview and a recently added tutorial, and our GitHub issues are open for (and full of) active discussions!


What’s New in EDU UK


 

This week's updates from the Microsoft EDU Team UK! Every Tuesday, we will be sharing the top news coming from Microsoft in Education globally and highlighting any must read blogs and videos to keep you up to date with Microsoft in Education in the UK and globally.

 

Our 'What's New in EDU UK' blog will focus on:

  1. Must read blogs
  2. Must see videos
  3. What's happening on Twitter

Here's this week's Latest News:

Education Stories Blog - Making inclusive and accessible classrooms for students

Find out all about today's TweetMeet focused on accessibility in support of National Dyslexia Awareness Month!

Making inclusive and accessible classrooms for students: Learn from the #MSFTEduChat TweetMeet on Oct. 17th

 


Microsoft Education Videos - What's New in Microsoft EDU October Edition

In the October episode of What's New in EDU, our monthly round-up of the latest efforts and products from Microsoft Education, we put wheels on the whole classroom for a cross-country learning adventure through Minecraft: Education Edition, and we redesign it to ignite innovative ideas for our upcoming Hack the Classroom event!

If you'd like to see what else is new in Microsoft Education, or discover what other educators are doing in their classrooms, visit and join our Microsoft Educator Community: https://education.microsoft.com/

 


Twitter feed Updates for Microsoft Edu UK

Check out what is happening in Microsoft in Education here in the UK by viewing the Microsoft Education Twitter updates below.



So that wraps up this week's What's New in Edu UK. Remember to follow @Microsofteduk for all our latest updates daily!

Git and Visual Studio 2017 Part 10: Sharing a Solution with Visual Studio


In the previous article, we shared a solution using Git and GitHub. This time we will share a solution using Visual Studio and GitHub.

Deleting the GitHub repository

First, delete the GitHub.com repository created last time. You can delete it from “Settings” | “Danger Zone”.

GitHub Extension for Visual Studio

Visual Studio 2017 does not support GitHub out of the box, but a handy extension is available.

1. Click Tools | Extensions and Updates.

2. Select Online and search for “github”. Install “GitHub Extension for Visual Studio”.

3. Close Visual Studio if necessary and follow the installation wizard.

4. Next, set up your account. Select Settings in Team Explorer.

5. We want these settings to apply to this project only, so click Repository Settings.

6. Select the override checkboxes, enter the same user name and email address you use on GitHub, and click Update.

7. Click Manage Connections at the top of Team Explorer.

8. Select “Connect to GitHub” under Manage Connections.

9. Enter your user name and password. Visual Studio now remembers your GitHub account.

Adding a remote and pushing

Next, let's create a repository on GitHub and synchronize the solution. Unlike last time, the repository can be created from inside Visual Studio.

1. Select the “Sync” menu on the Team Explorer home page. The push screen appears; click “Publish to GitHub”.

2. Leave everything at the defaults and click the Publish button. This creates the repository on GitHub, adds the remote locally, and runs the initial synchronization.

3. Confirm that the repository was created on the GitHub side and that the remote was added, using ‘git remote show’.

4. Check the state with ‘git branch -a’ and ‘git log --oneline --graph --all’.

5. The items and commits exist on GitHub as well.

6. As last time, add README.md on GitHub.

7. Notice that new GitHub-related menus have been added to Team Explorer.

Pulling changes from the remote

Let's retrieve the README.md added on GitHub.

1. Select Sync in Team Explorer.

2. Click the “Fetch” link.

3. The items to be fetched are displayed; click the “Fetch” link. The items do not disappear when the fetch completes.

4. Run ‘git log --oneline --graph --all’ to confirm that the fetch worked.

5. Next, click the “Pull” link. Confirm that the items disappear, then check the commit history again.

Resolving conflicts

Let's test the conflict scenario, just as in the previous article.

1. Update README.md on GitHub.

2. Edit README.md in Visual Studio as well. To do so, first add the item at the solution level.

3. Edit the file and commit. The sln file has also changed, so it is part of the commit.

4. When you commit, a message tells you to “Sync”; click the “Sync” link.

5. Click the “Sync” link. “Sync” is a Visual Studio feature that runs a pull and then lets you push.

6. As in the previous article, a conflict occurs.

7. We already know how to resolve conflicts, so resolve it; here, accept both the remote and the local change.

8. When the conflict is resolved, click “Commit Merge”.

9. Enter a commit comment and commit.

10. Since a new commit was added, click “Sync” again.

11. The merge commit is displayed in addition to the original commit; click “Sync” again. At this point “Push” would also work, but other changes may have arrived in the meantime, so personally I always use “Sync”.

12. When the synchronization completes, check the history on the master branch.

Synchronizing branches

Branches can also be synchronized from Visual Studio.

1. Select the Branches menu in Team Explorer.

2. Confirm that remotes/origin contains only master.

3. Right-click the dev branch and click “Push Branch”. This runs ‘git push -u origin dev’.

4. The dev branch is created both under remotes/origin and on GitHub.

5. Right-click the dev branch under remotes/origin and click “Delete Branch from Remote”. This deletes the branch from GitHub as well.

6. Let's test another scenario: right-click the local dev branch and delete it.

7. Create a dev branch on GitHub. Click “Branch: master” on the Code page, type the name, and press Enter.

8. Back in Visual Studio, click Sync in Team Explorer and run a fetch. No results are shown.

9. However, when you move to the Branches menu, the dev branch appears under remotes/origin.

10. Right-click the dev branch and run “Checkout”. This creates dev as a local branch and pairs it with the remote-tracking branch. If you need finer control, such as giving the local branch a different name, you can use the “New Local Branch From” menu.

11. Next, delete the dev branch from GitHub.

12. Unfortunately I could not find a prune feature in Visual Studio, so run ‘git remote prune origin’ from a command prompt.

13. For the next experiment, push the local dev branch again. This re-creates the dev branch on GitHub.

Cloning from the remote

Visual Studio supports cloning as well.

1. Close Visual Studio 2017 and delete the solution folder.

2. Open Visual Studio and go to Team Explorer.

3. Click “Manage Connections”.

4. Click “Clone” in the GitHub section.

5. Select the repository you want to clone and run the clone.

6. When the clone completes, the top folder is shown in Solution Explorer.

7. As the hint suggests, click the “Solutions and Folders” button and select the sln file. The solution opens.

8. Select Branches in Team Explorer. As with plain Git, only master exists locally.

Removing the remote

Finally, remove the remote.

1. Select Team Explorer | Settings | Repository Settings.

2. Click “Remove” in the Remotes section. To add a remote individually, click Add.

Summary

With the help of an extension, we could do a great deal from inside Visual Studio, although the lack of a prune feature is a pity. Next time we will look at Git configuration.

Kenichiro Nakamura

How to open a driver-development inquiry with technical support using MSDN incidents


This article explains how to use the technical support incidents included as a benefit of your MSDN subscription to open an inquiry related to driver development.

The process has two major stages:

[1] Activate the incidents by phone and obtain an Access ID and Contract ID (required only the first time, or the first time after renewing your contract)

[2] Submit your inquiry on the web through the “Technical support for business customers” page

If you already have the IDs from [1], you can go straight to [2]. Note, however, that after renewing your MSDN subscription, [1] is required again.

[1] Activate the incidents by phone and obtain an Access ID and Contract ID

Call one of the numbers below to activate the support incidents included with your MSDN subscription and obtain an Access ID and Contract ID.

    • Customer Service (0120-750052)

    • Support Contract Center (0120-17-0196)

Both lines are open 9:00-17:30 (excluding weekends, national holidays, and designated company holidays).

 

[2] Submit your inquiry on the web through the “Technical support for business customers” page

(1) On the page below, click the [Developer Tools] [Windows Driver Kit] link, or type “Windows Driver Kit” into the search box and click the link for the version you use.

Technical support for business customers

https://support.microsoft.com/ja-jp/assistedsupportproducts

    • If you click the Windows Driver Kit link

    • If you type Windows Driver Kit into the search box

Here, as an example, we click the Windows Driver Kit 10 link. Whichever one you choose, the procedure is basically the same.

(2) Select the “problem type” and “category” that best match your situation.

There are several problem types, each with its own categories. Whichever you choose, your inquiry will be routed to our driver development support team, so don't worry about picking the “wrong” one.

Here, as an example, we select “Bus drivers” and “USB”. A USB-related inquiry does not have to use this exact combination.

(3) Click “Start inquiry” on the page above and sign in with your Microsoft account.

(4) On the “Create incident - select a payment option” screen, select “Use MSDN/Visual Studio subscription”, then click “Use another contract” to enter your Access ID and Contract ID.

 

 

We hope you find this information useful.

WDK Support Team, Tsuda

    • References:

Creating a support incident using an Access ID and Contract ID

https://support.microsoft.com/ja-jp/help/3020636/using-an-access-id-and-a-contract-id-to-create-on-premises-professiona

Business support services

https://www.microsoft.com/ja-jp/services/support.aspx

 


Can I define a UDT inside an Oracle function package and use it with my WCF-Oracle Adapter?


The other day I was asked to help with a problem involving the WCF-Oracle Adapter for BizTalk. Being very comfortable with the WCF part of the issue, I thought I could provide some help in the discussion.

So basically, a team was trying to consume an Oracle function package which has a User Defined Type (UDT) defined inside it (big hint here). 🙂

The developers had already implemented the solution by following our recommendation in the article below, which explains how to configure the WCF-Oracle adapter correctly.
Invoke Functions and Procedures with REF CURSORS in Oracle Database using BizTalk Server

But unfortunately, they were unable to generate the schemas for their internal pipeline functions. They told me the failure occurred because the adapter could not retrieve the metadata of the function, and therefore could not generate the schemas (second hint here).

They even shared with me the errors they were seeing.

Error while retrieving or generating the WSDL. Adapter message: Retrieval of Operation Metadata has failed while building WSDL
at 'http://Microsoft.LobServices.OracleDB/2007/03/......'
Microsoft.ServiceModel.Channels.Common.MetadataException: Retrieval of Operation Metadata has failed while building WSDL at 'http://Microsoft.LobServices.OracleDB/2007/03/....' ---> Microsoft.ServiceModel.Channels.Common.MetadataException: Invalid Metadata. Check if the database user has permissions to UDT '..._TABLE'
at Microsoft.Adapters.OracleDB.OracleCommonMetadataResolverHandler.CreateProcedureParameter(OracleCommonConnectionWrapper connection, DataTable metadataTable, Int32 i, DataRow row, String dataType, ProcedureMetadata operation, OracleCommonExecutionHelper executionHelper, OracleCommonTypeMetadataPreResolver preResolver)
at Microsoft.Adapters.OracleDB.OracleCommonMetadataResolverHandler.ResolveProcedureMetadata(OracleCommonConnectionWrapper connection, DataTable metadataTable, ProcedureMetadata operation, OracleCommonExecutionHelper executionHelper, OracleCommonTypeMetadataPreResolver preResolver)
at Microsoft.Adapters.OracleDB.OracleCommonMetadataResolverHandler.ResolveOperationMetadata(String operationId, TimeSpan timeout, TypeMetadataCollection& extraTypeMetadataResolved)
at Microsoft.ServiceModel.Channels.Common.Design.MetadataCache.GetOperationMetadata(String uniqueId, Guid clientId, TimeSpan timeout)
at Microsoft.ServiceModel.Channels.Common.Design.WsdlBuilder.SearchBrowseNodes(MetadataRetrievalNode[] nodes, WsdlBuilderHelper helper, TimeoutHelper timeoutHelper)
--- End of inner exception stack trace ---

Looking at all this information, I had a pretty good idea of what was happening, but of course we first needed to calm everyone down and get the information flowing to the developers, so they could understand what was happening too.

So, the natural next step was to ask the developers to share the problematic function with me so I could check the inner workings of the procedure they were trying to implement. Once I got the function, we confirmed that they were indeed declaring a User Defined Type inside the function package.

Unfortunately, except for PL/SQL tables (which of course was not the case), the Oracle Database adapter does not support UDTs that are defined inside a package.

This is stated on the below article.

Limitations of BizTalk Adapter for Oracle Database

So, after that we entered a different discussion, explaining that the behavior they were seeing was indeed by design, which is never an easy conversation.

Hope that helps.

 

Get Started with Microsoft Graph API


Guest post by Dolga Rares, Microsoft Student Partner at University College London


About Me

Hello! My name is Dolga Rares and I am a second-year computer science student at UCL. I have been passionate about programming and the exact sciences from an early age. During high school I participated in computer science contests and won prizes at regional stages; these competitions gave me a passion for optimization. During my first year at university I gained experience in software development, and this summer I built an API and an Android library for connecting to a database.

Besides software development, I am interested in machine learning and mixed reality.

My LinkedIn: https://uk.linkedin.com/in/rares-dolga

Introduction:

Microsoft Graph API (v1.0) is perfect both for beginners and for senior developers. It provides a wealth of well-documented information and functionality. Even if you have no idea about the Graph API, from my experience this API is the perfect place to start.

I will guide you through learning about the Graph by building an Android application that reads your emails and gets access to your Azure drive. It helps if you have programming experience, but I will try to explain the steps for beginners. Note that you can use the Graph API with other languages as well; the principles are the same, and Microsoft even provides SDKs for the most common languages. Even though we could use C# for our app, I want to show that Microsoft services can be used from other platforms too, not just from other Microsoft tools.

Main goals:

· Learn about Microsoft Graph API

· Extract data from your Office 365 account.

· Learn how to integrate Microsoft technologies with other platforms such as Android

· Discover the Microsoft android SDK

· Learn what a RESTful API is

What you should have before starting:

· You should have a Microsoft Office 365 Account so that you can extract data and test the application. If you are a student get it free from https://products.office.com/en-us/student/office-in-education .

Or if you are not in university environment: https://products.office.com/en-gb/buy/office

· You should have Android Studio Installed: https://developer.android.com/studio/index.html

Steps that we will make:

1. What is Microsoft Graph API and how to use it?

2. I will explain the terminology for those who are new to these technologies

3. We will have a practical example of how to use the API in Android

o Register your app with Microsoft

o Create an Android project and add needed libraries and dependencies.

o Create the User Interface

o Create classes for getting an access token from Azure V2.0 endpoint and processing of data

o Read emails from your account and get your drive id

4. Conclusion

5. References

The complete code can be found here: https://github.com/raresdolga/Microsoft_Graph_API 

1) What is Microsoft Graph API?

As the name says, it is the API that lets you build apps in which users can extract data from Office 365. You can read and write emails, get contacts and users, and take information from Azure Active Directory. You can even access the calendar and OneNote. You can build all kinds of apps that make use of the data and possibilities the Graph offers, and Xamarin is a great tool for this. However, I will show you that the API is not only compatible with Microsoft tools but with other platforms as well. The documentation offers support for different languages such as C# (for Xamarin), Angular JS, iOS, PHP, Python, and many others.

So how does it work?

It communicates with your application through URL requests. You ask for particular data that has an identifier. For example, with the URL "https://graph.microsoft.com/v1.0/users?$top=5" we ask for users and apply a filter, taking just the first 5 of them. A web server parses the information sent over the internet and extracts the data from Microsoft's data centres.

You may wonder whether this is secure. It is well protected: the application must authenticate a user and obtain an access token. The access token is then sent along with every request (in the Authorization header), so data is returned only to callers that are authorized to request it.
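To make that concrete, here is a minimal sketch of the headers every Graph call carries. The class name and the Accept header are my own illustration, not part of the Graph SDK; the SDK's authenticateRequest method shown later in this post does the same thing via request.addHeader(...):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class AuthHeaderDemo {
    // Builds the headers attached to every Graph request: the bearer
    // token authorizes the call, and Accept asks for a JSON response.
    public static Map<String, String> headersFor(String accessToken) {
        Map<String, String> headers = new LinkedHashMap<>();
        headers.put("Authorization", "Bearer " + accessToken);
        headers.put("Accept", "application/json");
        return headers;
    }
}
```

The "Bearer " prefix is the standard OAuth 2.0 way of presenting an access token.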

A schematic view:


More information about what you can do with the Graph: https://developer.microsoft.com/en-us/graph/docs/concepts/overview

2) Terminology

What is a RESTful API?

API = Application Programming Interface. If you do not know what an interface is, imagine it as a bridge of communication between different pieces of software. An API defines a set of functions that you can call in your program to get or send information to existing software. For example, in the Microsoft Graph API you can get personal information with this request: "https://graph.microsoft.com/v1.0/me/". You may wonder why this is a URL and not a function call in some language. In fact, this is what makes it a RESTful API: it works by sending requests over the internet to a server. The server processes the request you sent in the URL and sends back a response, usually (and certainly in the Microsoft Graph API's case) in JSON (JavaScript Object Notation) format.
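To make this concrete, the sketch below shows how such a request URL is assembled from a base endpoint, a resource path, and an OData query option like $top. The class and method names are my own, not part of any SDK:

```java
public class GraphUrlBuilder {
    private static final String BASE = "https://graph.microsoft.com/v1.0";

    // Joins the base endpoint, a resource path, and an optional query option.
    public static String build(String resource, String queryOption) {
        String url = BASE + "/" + resource;
        if (queryOption != null && !queryOption.isEmpty()) {
            url = url + "?" + queryOption;
        }
        return url;
    }
}
```

Usage: build("users", "$top=5") produces the users URL quoted earlier, and build("me", null) produces the personal-information request.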


What is an SDK and why do we use it?

SDK stands for Software Development Kit. It is a set of classes (in Java's case) and methods defined for a specific purpose; the Microsoft Android SDK's purpose is communicating with the API. We use it because it provides a solid implementation for sending information to, and receiving it from, the RESTful API. You could write your own classes for that, but you would need to construct the URLs and make the HTTP requests from Java code yourself. Also, the response does not arrive instantly; it is a stream of information, so you need to implement an asynchronous task, meaning your code must be ready to process the data whenever it arrives. With the SDK you just make function calls, which are easier to understand.
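The asynchronous idea can be reduced to a small sketch. The Callback interface below is a simplified stand-in for the SDK's ICallback, not the real interface; the point is only that the caller hands over an object whose success or failure method fires when the response eventually arrives:

```java
public class CallbackDemo {
    // Simplified stand-in for the SDK's ICallback<T> interface.
    interface Callback<T> {
        void success(T result);
        void failure(Exception e);
    }

    // Simulates an asynchronous fetch: the work runs on another thread
    // and the callback fires when the "response" arrives. (We join the
    // worker here only to keep the demo deterministic.)
    static void fetchGreeting(Callback<String> cb) {
        Thread worker = new Thread(() -> {
            try {
                cb.success("hello from the fake server");
            } catch (Exception e) {
                cb.failure(e);
            }
        });
        worker.start();
        try {
            worker.join();
        } catch (InterruptedException ignored) {
            Thread.currentThread().interrupt();
        }
    }
}
```

In the real app the worker thread is the HTTP call made by the SDK, and success receives the parsed Graph response.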

3) Let’s start the app:

1) How should the app look?


2) Create an Empty Application

· Create a new Android project and name it as you like:


It should have the following characteristics:


Please remember to choose an Empty Activity as a starting point.

· Add libraries

Our project will depend on one library and one SDK. The MSAL (Microsoft Authentication Library) library obtains the authentication token from the Azure v2.0 endpoint. The SDK makes our project easier to code efficiently.

In build.gradle (Module: app), add the following to the dependencies:

compile ('com.microsoft.identity.client:msal:0.1.+') {
exclude group: 'com.android.support', module: 'appcompat-v7'
}
// Include the SDK as a dependency
compile 'com.microsoft.graph:msgraph-sdk-android:1.3.2'
// Include the gson dependency
compile('com.google.code.gson:gson:2.3.1')

Above dependencies add:

repositories {

jcenter()

}

Also change minifyEnabled to true. -> For faster performance.

Now we need to add the necessary permissions into the Manifest File.

Add this code between package = “your package...” and “>”:


xmlns:tools="http://schemas.android.com/tools"

After the closing symbol “>” add the permissions you need:

<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-sdk tools:overrideLibrary="com.microsoft.identity.msal" />

Now we just have to allow our activity to open a browser window and add the redirect URL for getting the token.

Under the </activity> closing tag, add this code:

<activity
android:name="com.microsoft.identity.client.BrowserTabActivity">
<intent-filter>
<action android:name="android.intent.action.VIEW" />
<category android:name="android.intent.category.BROWSABLE" />
<category android:name="android.intent.category.DEFAULT" />
<category android:name="android.intent.category.LAUNCHER" />
<!--Add in your scheme/host from registered redirect URI-->
<data android:scheme="msalc6eb0d0a-28fa-4424-8e6f-62b0f865181d"
android:host="auth" />
</intent-filter>
</activity>

3) Register your app with Microsoft:

Go to: https://developer.microsoft.com/en-us/graph/quick-start. Choose Android and follow the steps. If you are not already signed in, please sign in.

After you have registered your app, the Microsoft page should look like this:

image

You should go to the "app registration page" in step 3. There, click "Add URI" near Custom Redirect URIs and paste this: msalc6eb0d0a-28fa-4424-8e6f-62b0f865181d://auth

Notice that this is the redirect URL that we used in our Manifest file!

Click Save at the bottom of the page, and you are ready to code!

Create the User Interface:

Add this code to the XML layout file of the activity (res -> layout -> activity_connect.xml; the name depends on your activity). The code should be placed between the RelativeLayout tags:

<Button
     android:text="Sign Out"
     android:layout_width="wrap_content"
     android:layout_height="wrap_content"
     android:id="@+id/sign_out"
     android:layout_alignParentEnd="true" />

<ProgressBar
     style="@android:style/Widget.Material.Light.ProgressBar.Large"
     android:layout_width="wrap_content"
     android:layout_height="wrap_content"
     android:id="@+id/progressBar"
     android:layout_centerVertical="true"
     android:layout_centerHorizontal="true" />

<Button
     android:text="Sign In"
     android:layout_width="wrap_content"
     android:layout_height="wrap_content"
     android:id="@+id/sign_in"
     android:layout_alignParentTop="true"
     android:layout_alignParentStart="true"
     android:layout_marginStart="20dp" />

<TextView
     android:layout_width="wrap_content"
     android:layout_height="wrap_content"
     android:id="@+id/driveText"
     android:text="sddsdsdsds"
     android:layout_below="@+id/sign_in"
     android:layout_alignStart="@+id/sign_in"
     android:layout_marginTop="56dp" />

<Button
     android:text="Me"
     android:layout_width="wrap_content"
     android:layout_height="wrap_content"
     android:id="@+id/info_me"
     android:layout_alignParentTop="true"
     android:layout_alignParentStart="true" />

<Button
     android:text="Your Emails"
     android:layout_width="wrap_content"
     android:layout_height="wrap_content"
     android:id="@+id/read_email"
     android:layout_alignParentTop="true"
     android:layout_centerHorizontal="true" />

<WebView
     android:layout_width="match_parent"
     android:layout_height="match_parent"
     android:id="@+id/myEmail"
     android:layout_below="@+id/sign_out"
     android:layout_alignParentBottom="true"
     android:layout_alignParentStart="true">

</WebView>

<Button
     android:text="Next"
     android:layout_width="wrap_content"
     android:layout_height="wrap_content"
     android:id="@+id/next"
     style="@style/Widget.AppCompat.Button.Small"
     android:layout_alignParentTop="true"
     android:layout_toStartOf="@+id/sign_out" />

<Button
     android:text="Prev"
     android:layout_width="wrap_content"
     android:layout_height="wrap_content"
     android:id="@+id/prev"
     style="@style/Widget.AppCompat.Button.Small"
     android:layout_alignParentTop="true"
     android:layout_toStartOf="@+id/next" />

Writing the code:

The code has a structure: one class per task. The general overview of the app:

Authenticate the user.

Create the Graph class, which represents an abstraction of the Microsoft Graph.

Construct the Activity class. This represents the logic behind the user interface and handles requests to the Graph.

Create a class that helps you make your own requests, apart from those provided by the SDK.

Create the authentication class:

This class should have a single instance, as we do not want different objects asking for access to the account at the same time. We achieve this by making the constructor private and exposing a getInstance method that creates the object if it does not exist or returns the existing one.
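Stripped of the authentication details, the pattern just described is the classic double-checked-locking singleton; here is a generic sketch (not the Authentication class itself):

```java
public class Singleton {
    // volatile ensures other threads never observe a half-constructed
    // instance during double-checked locking.
    private static volatile Singleton instance;

    // Private constructor: no other class can instantiate it.
    private Singleton() { }

    public static Singleton getInstance() {
        if (instance == null) {                 // first check, lock-free
            synchronized (Singleton.class) {
                if (instance == null) {         // second check, under the lock
                    instance = new Singleton();
                }
            }
        }
        return instance;
    }
}
```

Every caller of getInstance() receives the same object, which is exactly what we want for the class that holds the access token.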

The purpose of this class is to obtain an authentication token from Azure. We can achieve this in two ways.

Silently – without user action. In this case we have already asked for permissions and they were accepted, and nothing has changed in the user's account (such as a password change).

Interactively – we involve the user in the process by asking for permissions (for example, to read email). This happens when the user signs in for the first time or when something changed in the user's credentials. Also, if the user previously signed out of the app, this type of authentication is needed.

Please put your package name at the top and replace the class name if you used a different one. Preferably, use the same names as in my code. Also, do not forget to import the necessary packages.

public class Authentication {
     // URL constants
     private static String app_id = "c6eb0d0a-28fa-4424-8e6f-62b0f865181d";
     private String basicUrl = "https://graph.microsoft.com/";
     private String [] scope = { "Mail.ReadWrite","User.ReadBasic.All", "Mail.Read"};
     private String graph_endPoint = "https://graph.microsoft.com/v1.0/me";

     // App variables and constants
     private String TAG = "ConnectAct";
     private volatile static Authentication single;
     private static PublicClientApplication myApp = null;
     private AuthenticationResult authR = null;
     private ConnectAct connect_activ;
     private Authentication() {
     }

     public static Authentication getInstance() {
         if (single == null) {
             synchronized (Authentication.class) {
                 if(single == null) {
                     single = new Authentication();
                     if (myApp == null) {
                         myApp = new PublicClientApplication(ConnectAct.getAppContext(), app_id);
                     }
                 }
             }
         }
         return single;
     }

     public String  getAccessToken(){
         return authR.getAccessToken();
     }

     public PublicClientApplication getAppClient(){
         return myApp;
     }

     public void aquire_acessTokenInteractive(Activity activity, final ConnectAct passedActivity){
             connect_activ = passedActivity;
             myApp.acquireToken(activity, scope, getInteractiveCallBack());
     }

     public void acquire_accessTokenSilent(User user, boolean forceRefresh, ConnectAct passedActivity) {
         connect_activ = passedActivity;
         myApp.acquireTokenSilentAsync(scope, user, null, forceRefresh, getSilentCallBack());
     }

     public void disconnect(){
         List<User> users = null;
         try {
             users = myApp.getUsers();
             if(users != null) {
                 for (User u : users) {
                     myApp.remove(u);
                 }
             }

         } catch(Exception e){
             e.printStackTrace();
         }
     }

     public AuthenticationCallback getSilentCallBack(){
         return new AuthenticationCallback() {
             @Override
             public void onSuccess(AuthenticationResult result) {
                 authR = result;
                 Log.v(TAG,"This is the token got silently: " + authR.getAccessToken());

             }

             @Override
             public void onError(MsalException exception) {
                 exception.printStackTrace();
             }

             @Override
             public void onCancel() {
                 Log.d(TAG, "logged in canceled");
             }
         };
     }
     public AuthenticationCallback getInteractiveCallBack(){
         // anonymous class
         return new AuthenticationCallback(){
             @Override
             public void onSuccess(AuthenticationResult result){
                 authR = result;
                 Log.v(TAG,"interactive token"+authR.getAccessToken());
                 connect_activ.onSuccessAuth();
             }
             @Override
             public void onError(MsalException e){
                 connect_activ.onErrorAuth();
             }
             @Override
             public void onCancel(){
                 connect_activ.onCancelAuth();
             }
         };
     }
}

Create your graph class:

This class must implement the IAuthenticationProvider interface provided by the Microsoft SDK. The class has a single instance, and it is unique in the sense that when the object is created it carries the authentication token provided by the class above. Remember that it is an abstraction of the real Graph: we use it to call methods implemented in the SDK just as we would build a request to the API.

public class Graph implements IAuthenticationProvider {
    private volatile static Graph singleInst;
     private IGraphServiceClient graphServiceClient = null;
     private Graph(){
     }

     public static Graph getInstance() {
         if (singleInst == null) {
             synchronized (Authentication.class) {
                 if(singleInst == null) {
                     singleInst = new Graph();
                 }
             }
         }
         return singleInst;
     }

     public synchronized IGraphServiceClient getGraphServieClient(){
         if(graphServiceClient == null){
             IClientConfig myConfig = DefaultClientConfig.createWithAuthenticationProvider(this);
             graphServiceClient = new GraphServiceClient.Builder().fromConfig(myConfig).buildClient();
         }
         return graphServiceClient;
     }

     @Override
     public void authenticateRequest(IHttpRequest request) {
         try {
             request.addHeader("Authorization", "Bearer " + Authentication.getInstance().getAccessToken());

             Log.v("Connect", "Request: " + request.toString());
         } catch (Exception e) {
             e.printStackTrace();
         }
     }
}

Create your main activity:

This class handles what happens when a user clicks a button and interacts with the app. For example, when the "YOUR EMAILS" button is clicked, a request is made that asks the Graph API for recent emails.

Here I will show you the pattern between how the request URL looks and how the request made with the SDK looks: for example, the URL "https://graph.microsoft.com/v1.0/me/messages" corresponds to the SDK call myConfigured_graph.getMe().getMessages().buildRequest().get(...), which you will see in the code below.

The code for the class:

public class ConnectAct extends AppCompatActivity {

     private String TAG = "ConnectAct";
     Button signIn;
     Button signOut;
     Button me;
     ProgressBar pb;
     TextView driveT;
     Button getEmails;
     WebView emailsTo_read;
     Button next, prev;
     private static Context appContext = null;
     private List<Message> globalEm = null;
     private int iterator = 1;
     IGraphServiceClient myConfigured_graph;
     private AuthenticationResult authR;
     private Authentication authManager = null;
     private ConnectAct outerObj = this;

     @Override
     protected void onCreate(Bundle savedInstanceState) {
         super.onCreate(savedInstanceState);
         setContentView(R.layout.activity_connect);
         appContext = this.getApplicationContext();
         authManager = Authentication.getInstance();
         myConfigured_graph = Graph.getInstance().getGraphServieClient();
         // UI implementation
         signIn = (Button) findViewById(R.id.sign_in);
         signOut = (Button) findViewById(R.id.sign_out);
         me = (Button) findViewById(R.id.info_me);
         prev = (Button) findViewById(R.id.prev);
         next = (Button) findViewById(R.id.next);
         pb = (ProgressBar) findViewById(R.id.progressBar);
         getEmails = (Button) findViewById(R.id.read_email);
         emailsTo_read = (WebView) findViewById(R.id.myEmail);
         driveT = (TextView) findViewById(R.id.driveText);
         setGui();

         signIn.setOnClickListener(new View.OnClickListener() {
             @Override
             public void onClick(View v) {
                 pb.setVisibility(View.VISIBLE);
                 try {
                     List<User> users = authManager.getAppClient().getUsers();
                     // try a silent token acquisition
                     if (users != null && users.size() == 1) {
                         authManager.acquire_accessTokenSilent(users.get(0), false, outerObj);
                         updateGui();
                     } else {
                         authManager.aquire_acessTokenInteractive(outerObj, outerObj);
                     }
                 } catch (Exception e) {
                     e.printStackTrace();
                 }
             }
         });
         signOut.setOnClickListener(new View.OnClickListener() {
             @Override
             public void onClick(View v) {
                 authManager.disconnect();
                 Toast.makeText(getBaseContext(), "You had signed out!", Toast.LENGTH_LONG);
                 setGui();
             }
         });

         me.setOnClickListener(new View.OnClickListener(){
             @Override
             public void onClick(View v){

                 pb.setVisibility(View.VISIBLE);
                 driveT.setVisibility(View.VISIBLE);
                 emailsTo_read.setVisibility(View.GONE);
                 getEmails.setVisibility(View.VISIBLE);
                 next.setVisibility(View.GONE);
                 prev.setVisibility(View.GONE);
                 myConfigured_graph.getMe()
                         .getDrive()
                         .buildRequest()
                         .get(new ICallback<Drive>() {
                             @Override
                             public void success(final Drive result) {
                                 final String msg = "Found Drive, the ID =  " +         result.id;
                                driveT.setText(msg);
                                 pb.setVisibility(View.GONE);
                             }
                             @Override
                             public void failure(ClientException e){
                                 e.printStackTrace();
                                 driveT.setText("An error occurred, please try again");
                                 pb.setVisibility(View.GONE);
                             }

                        });
             }
         });
         getEmails.setOnClickListener(new View.OnClickListener(){
             @Override
             public void onClick(View v) {
                 pb.setVisibility(View.VISIBLE);
                 myConfigured_graph.getMe().getMessages().buildRequest().get(new ICallback<IMessageCollectionPage>() {
                     @Override
                     public void success(final IMessageCollectionPage result) {
                         emailsTo_read.setVisibility(View.VISIBLE);
                         driveT.setVisibility(View.GONE);
                         getEmails.setVisibility(View.GONE);
                        
List <Message> emails = result.getCurrentPage();
                         setEmailList(emails);
                         navigateEmails(0);
                         pb.setVisibility(View.GONE);
                         next.setVisibility(View.VISIBLE);
                         prev.setVisibility(View.VISIBLE);
                     }

                     @Override
                     public void failure(ClientException e) {
                         e.printStackTrace();
                         Toast.makeText(getAppContext(),"An error occurred, please try again",Toast.LENGTH_LONG);
                         pb.setVisibility(View.GONE);
                     }
                 });
             }
         });
         next.setOnClickListener(new View.OnClickListener(){
             @Override
             public void onClick(View v){
                 navigateEmails(++iterator);
             }
         });
         prev.setOnClickListener(new View.OnClickListener(){
             @Override
             public void onClick(View v){
                 navigateEmails(--iterator);
             }
         });
     }
     /* Handles the redirect from the System Browser */
     @Override
     protected void onActivityResult(int requestCode, int resultCode, Intent data) {
         authManager.getAppClient().handleInteractiveRequestRedirect(requestCode, resultCode, data);


     }

     public Activity getOutActivity(){
         return this;
     }

     public static Context getAppContext(){
         return appContext;
     }

     public void onSuccessAuth(){

         updateGui();
     }

     public void onErrorAuth(){
         Toast.makeText(getBaseContext(), "Sorry Something went wrong", Toast.LENGTH_LONG);
     }

     public void onCancelAuth(){

     }

     private void updateGui(){
         pb.setVisibility(View.INVISIBLE);
         me.setVisibility(View.VISIBLE);
         signIn.setVisibility(View.GONE);
         me.setVisibility(View.VISIBLE);
         getEmails.setVisibility(View.VISIBLE);
     }

     private void setGui(){
         iterator = 1;
         driveT.setText(null);
         signIn.setVisibility(View.VISIBLE);
         pb.setVisibility(View.GONE);
         me.setVisibility(View.GONE);
         getEmails.setVisibility(View.GONE);
         driveT.setVisibility(View.GONE);
         next.setVisibility(View.GONE);
         prev.setVisibility(View.GONE);
         emailsTo_read.setVisibility(View.GONE);
         globalEm = null;
     }

     private void setEmailList(List<Message> messages){
         globalEm = messages;
     }
     private void navigateEmails( int i){
         if(i >= globalEm.size() || i < 0){
             Toast.makeText(getBaseContext(), "No more emails to find", Toast.LENGTH_LONG);
             Log.v(TAG,i + "");
             if(iterator >= globalEm.size())
                 iterator--;
             else
                 iterator++;
         }
         else {
             Message m = globalEm.get(i);
             emailsTo_read.loadData(m.body.content.toString(), "text/html; charset=utf-8", "UTF-8");
             Log.v(TAG, i + "");
         }
     }
}

If you are asking yourself what those lambda-like constructs are, I will quickly explain: they are anonymous classes. We basically define a class inside a method: instead of passing an object of a named class, we call new on a class that we define in the same place. For example, in this section:

myConfigured_graph.getMe().getMessages().buildRequest().get(new ICallback<IMessageCollectionPage>() {
    @Override
    public void success(final IMessageCollectionPage result) {
        List<Message> emails = result.getCurrentPage();
        setEmailList(emails);
    }

    @Override
    public void failure(ClientException e) {
        e.printStackTrace();
        Toast.makeText(getAppContext(), "An error occurred, please try again", Toast.LENGTH_LONG);
        pb.setVisibility(View.GONE);
    }
});

Here we implement the request with the Microsoft SDK. We must handle both the case when the HTTP request fails and the case when it succeeds; in the latter case, we need to process the information.

This is the most important part, as here we call the Microsoft Graph API to give you the information you asked for.

Create your own requests:

The purpose of this class is to extend the SDK's base class that handles requests, so you can make your own requests apart from those provided by the SDK. You pass your own request URL, as a string, for an operation that is not already implemented in the SDK.

public class YourOwnRequest extends BaseRequest {
     public YourOwnRequest(final String requestUrl, final IBaseClient client, final java.util.List<Option> requestOptions){
         super(requestUrl,client,requestOptions,Void.class);
     }
     public String get() throws ClientException {
         return send(HttpMethod.GET, null);
     }
}

An example of the call would be:

YourOwnRequest request = new YourOwnRequest("https://graph.microsoft.com/v1.0/custom", myConfigured_graph, new ArrayList<Option>());
request.get();

The Final App

Conclusion:

I hope that by working through this practical example you now understand the principles of the Microsoft Graph API. As you can see, it is easy to use and you can build a lot of functionality with it. You can now build more complicated applications that make use of the other capabilities it provides.

I hope you understood and enjoyed it!

The complete code can be found here: https://github.com/raresdolga/Microsoft_Graph_API

References:

Microsoft Graph API: https://developer.microsoft.com/en-us/graph/docs/concepts/overview

Microsoft Android SDK: https://github.com/microsoftgraph/msgraph-sdk-android

Microsoft Azure V2.0 Endpoint: https://docs.microsoft.com/en-us/azure/active-directory/develop/active-directory-appmodel-v2-overview

The Windows Mixed Reality platform has arrived!

Developers: grab your tools! We can't wait to see what you can create!!

To develop your applications and games,

Install the Windows 10 Fall Creators Update, available today. It is version 1709, Build 16299.15
Download the final Unity3D 2017.2 release (shipped on October 12th), available at https://store.unity.com/download
Finally, update to Visual Studio 2017 Version 15.4 to target Windows Mixed Reality - https://blogs.msdn.microsoft.com/visualstudio/2017/10/10/visual-studio-2017-version-15-4-released/

The most useful entry point is surely the GitHub project Microsoft/MixedRealityToolkit-Unity!

Have fun!

\ Sebastien.

Windows Mixed Reality platform is here!



Developers: We can’t wait to see what you will create!!

To develop your apps and games,

Install the latest Windows 10 version, named Fall Creators Update, available today. It is version 1709, Build 16299.15
Download the final Unity3D 2017.2 release, available since October 12th at https://store.unity.com/download
Last but not least, update to Visual Studio 2017 Version 15.4 in order to be able to target the Windows Mixed Reality SDK 10.0.16299.0 - https://blogs.msdn.microsoft.com/visualstudio/2017/10/10/visual-studio-2017-version-15-4-released/

 

The best entry point would be the GitHub project Microsoft/MixedRealityToolkit-Unity !

Enjoy!

\ Sebastien.

Cloud Tech 10: Only have 10mins to learn about Azure then…


Take a look at the 10-minute sessions Mark delivers; here is this week's:

https://aka.ms/CloudTech10

Others can be found here

This week's Cloud Tech 10 is available now! In less than 10 minutes, learn more about

New Flushing principle


This blog post describes a new flushing principle, Available at location, which was introduced in Dynamics 365 for Finance and Operations, Enterprise edition, in the Spring 2017 release.

The flushing principles reflect different consumption strategies for raw materials that are used in production processes. Consumption is the process that deducts material from the on-hand inventory and sets the value of the consumed materials to Work in progress (WIP) for production orders and batch orders. Raw materials are usually consumed from a location that is configured for the process that consumes the material. This location is known as the production input location. Material consumption for production and batch orders is accounted for in the Production picking list journal.

Flushing principle available at location

The Available at location flushing principle indicates that the material will be automatically consumed when it's registered as picked for production. The material is registered as picked from location when work for the raw material picking is completed, or when material is available on the production input location and the material line is released to the warehouse. The picking list that is generated during the process is posted in a batch job. This principle is relevant if, for example, you have many picking activities against one production order. In this case, you don't have to update the picking list manually, and you can get a current view of the WIP balance.

Let's walk through some simple scenarios that show how the new flushing principle works.

In the following example, we will use product L0101 from the USMF demo data.

1. Go to the BOM version for L0101 and remove products L0100 and M9204, so that the resulting bill of material looks like the one below.

2. Go to the details of the line item for M9203 and change the flushing principle to “Available at location”.

3. Create a production order for L0101 with the default quantity of 20.
4. Estimate and Release the production order.
    a. Verify that work has been created for M9203.
5. Start the production order with the settings below.

Note: When starting the production order, no picking list was created, because no material had yet been made available at the production input location by completing warehouse work.
6. Process the warehouse work using the hand-held device.
7. After completing the work, go to the picking list of the production order.

Note: Because the BOM line has the flushing principle Available at location, the picking list journal was automatically created when the pick work was completed. The posting of the picking list is handled by a batch job, which is why the journal can appear as not posted when opened just after the work is completed. A couple of minutes after work completion, the picking list will be posted by the batch job.

8. Create a new production order for L0101
9. Estimate and Release the order
Note: No work was created, because M9203 is already available at the production input location: we staged the full quantity of the license plate (200 pcs) when we processed the work for the first production order, and we consumed only 20 pcs, so 180 pcs remain.
10. Start the production order with Auto BOM Consumption = Flushing principle
Note: A production picking list was created and posted. When material is already available at the production input location, the flushing principle “Available at location” behaves the same as the flushing principle “Start”.

MIEE Spotlight- Gordon Wardlaw



Today's MIEE Spotlight is shining brightly on Gordon Wardlaw, a Chemistry teacher from Grangemouth High School in Scotland. Gordon loves being involved in wider school life and has been dubbed 'Mr ICT'!

Gordon is a teacher who is willing to take risks and try new things in the classroom, evaluating and improving them to ensure they effectively enhance teaching and learning. He uses Office 365 tools across his lessons and school life through Scotland's online portal GLOW. He has utilised OneNote Class Notebook to host course content for his students and to enable them to collaborate anytime, anywhere. Using his own OneNote he is able to track student progress and assessment easily and quickly, ready for evaluation at any time.

As well as this, Gordon has used Yammer effectively to allow his pupils to take responsibility for their own learning and collaborate and communicate together online.

You can follow Gordon on Twitter @misterwardlaw to keep up to date with the innovative work he is doing in his classroom.


Interact with the Sway below to hear more about Gordon's development and classroom practices in his own words!

Follow in the footsteps of our fantastic MIEEs and learn more about how Microsoft can transform your classroom with the Microsoft Educator Community.


Department of Justice Issues a FedRAMP High ATO for Azure Government


The Department of Justice (DOJ) Justice Management Division (JMD) has issued two FedRAMP High ATOs for Azure Government. The DOJ performs critical services for U.S. citizens with the mission "[t]o enforce the law and defend the interests of the United States according to the law; to ensure public safety against threats foreign and domestic; to provide federal leadership in preventing and controlling crime; to seek just punishment for those guilty of unlawful behavior; and to ensure fair and impartial administration of justice for all Americans."

JMD is the principal organizational unit responsible for management and administrative support in the Department of Justice. Much of the work of this Division is internal facing and works to make sure that the infrastructure is in place so that all other components of the Department can complete the more visible Departmental work. Under the direction of the Assistant Attorney General for Administration (AAG/A), JMD provides Department-wide policy guidance to the Department's offices, boards, divisions and, to a limited extent, its bureaus. The Division has four key areas of responsibility:

  • Fiscal responsibility
  • Human resources management
  • Information resources and management
  • Policy, management and planning

The authorizations will allow DOJ to take advantage of cloud-based core networking and application hosting environments to standardize cloud implementations for application owners within the organization. Having consistent implementations for individual mission owners within DOJ will streamline cloud adoption practices and security authorizations for future DOJ systems moving to Azure Government cloud.

By issuing the Azure Government ATO, DOJ can also take advantage of the cloud benefits provided by IaaS and PaaS capabilities. Azure Site Recovery (ASR) allows streamlined migration options for customers migrating from on-premises datacenters to Azure Government. ASR automates the replication of virtual machines for disaster recovery scenarios, cloud migration, and replication for hybrid environments.

In addition to an Agency FedRAMP High ATO from DOJ, Microsoft Azure Government has also received a FedRAMP High P-ATO from the Joint Authorization Board (JAB). Microsoft Azure Government leads the industry with 32 FedRAMP-approved services spanning both infrastructure-as-a-service and platform-as-a-service offerings. A complete list of the Azure services covered under the Azure Government FedRAMP High ATO can be found by visiting the Microsoft Trust Center.

Microsoft is committed to providing the most trusted, comprehensive cloud for mission-critical workloads so that our nearly 6 million government users across 7,000-plus federal, state and local organizations can achieve more in carrying out their mission-critical workloads. Please visit:

We welcome your comments and suggestions to help us continually improve your Azure Government experience. To stay up to date on all things Azure Government, be sure to subscribe to our RSS feed and to receive emails, click "Subscribe by Email!" on the Azure Government Blog. To experience the power of Azure Government for your organization, sign up for an Azure Government Trial.

Modernizing a Monolithic Application using Microservices and Azure


This post on modernizing your application for the cloud comes to us from Premier Developer consultant Najib Zarrari.


Context

As companies embark on their digital transformation journeys, pressure on IT organizations has been mounting to levels never experienced before.  Businesses have the following expectations, among others, of their IT organizations:

  • Applications need to be developed, deployed and enhanced at a rapid pace
  • Applications are always available, resilient and performant
  • Features expected to match or exceed competitors
  • Applications need to run on different form factors including PCs and mobile
  • All applications delivered must be secure and compliant

To meet those expectations, companies not only need the capabilities to build these kinds of solutions, they also have to build them faster than ever before.  This is why many organizations are rethinking how they architect and build solutions so that they can better respond to the demands and expectations placed on them.  IT organizations are also constantly on the lookout for ways to enhance agility, optimize resource consumption, and minimize solutions' time to market.  One way businesses are achieving those goals is by embracing the cloud.  Trends show that organizations of all sizes have either moved toward scaling down their on-premises data centers in favor of the cloud or are contemplating cloud adoption.

Read more on Najib’s blog here.

Minecraft Video Tutorial – Part 2


Would you like to use the #MinecraftEducationEdition in your lessons? In our video tutorials we explain the basics step by step! Part 2: the game modes.

Schools can now benefit from a particularly attractive offer: with every newly purchased Windows 10 device, they receive the Minecraft Education Edition free of charge for one year. You can find more about the Minecraft Education Edition and the details of the offer here.

You can find the first part of our Minecraft video tutorials here.

Ignite 2017 Demonstration: PowerShell with the Dynamics 365 Online Management API


Today’s post contains the sample script we used to walk through our PowerShell demos at Ignite – to all those who watched remotely or in person: thank you!  The purpose of this script is to give admins a ready-to-run script that demonstrates some of the Online Management API features in PowerShell, and it uses an optional module (Microsoft.Xrm.Data.PowerShell GitHub link & PowerShell Gallery link) to also review and edit data in a given Dynamics 365 Customer Engagement instance.  This script may require you to change your execution policy (specifically for the Microsoft.Xrm.Data.PowerShell module, as it’s not signed with a public certificate at this point – I am hoping to sign it in our next release, though it will be self-signed), but the online management PowerShell module is signed for your consumption.

For those looking for more content or an API reference for our new Online Management API, you can find it here: https://docs.microsoft.com/en-us/dynamics365/customer-engagement/developer/online-management-api/get-started-online-management-api

And for those who just want to install the PowerShell Module – use the script below to get started.

Thanks for reading!

Sean McNellis | Twitter: @seanmcne


#Install the Online Management API to the current user profile
install-module Microsoft.Xrm.OnlineManagementAPI -Scope CurrentUser -force

#interactive prompt
$cred = Get-Credential

$apiUrl = "https://admin.services.crm.dynamics.com/" #Northamerica Service Instance

Import-Module Microsoft.Xrm.OnlineManagementAPI -Verbose

#get instance info
$instances = Get-CrmInstances -Credential $cred -ApiUrl $apiUrl -verbose

#retrieve all the current deployed versions of Dynamics 365
$Versions = Get-CrmServiceVersions -ApiUrl $apiUrl -Credential $cred

#find version 8.2
$v8dot2 = $Versions | where Version -like "8.2*"

#now create new instance information for our new instance that we wish to create
$instanceInfo = New-CrmInstanceInfo -BaseLanguage 1033 `
    -ServiceVersionId $v8dot2.Id `
    -InstanceType Sandbox `
    -DomainName "pfecrmonline.onmicrosoft.com" `
    -InitialUserEmail "user@tenantname.onmicrosoft.com" `
    -FriendlyName "Ignite 2017"

#create that new instance using the info from above
$newInstance = New-CrmInstance -ApiUrl $apiUrl -Credential $cred -NewInstanceInfo $instanceInfo

#now parse the resource ID (instance ID)
$resource = $newInstance.ResourceLocation.Split("/")
$instanceId = $resource[$resource.Count-1]

Write-Output "the instance ID is: $instanceId"
Get-CrmInstance -ApiUrl $apiUrl -Credential $cred -Id $instanceId -Verbose

#now get backups for another instance by the instance's uniquename
$instance = Get-CrmInstance -ApiUrl $apiUrl -Credential $cred -Id ($instances|where UniqueName -eq "uniqueinstancename").Id -Verbose
$instance | Select-Object UniqueName, Version, State, ApplicationUrl, Id | Format-Table
#note: don't pipe to Format-Table inside the assignment, or $backups would hold formatting objects instead of backup records
$backups = Get-CrmInstanceBackups -ApiUrl $apiUrl -Credential $cred -InstanceId $instance.Id
$backups | Format-Table

#next let's use Xrm Data PowerShell to explore data within a particular instance

#install from PowerShell Gallery
Install-Module Microsoft.Xrm.Data.PowerShell -Scope CurrentUser

#load the module
Import-Module Microsoft.Xrm.Data.PowerShell -Verbose

Connect-CrmOnline -ServerUrl naosrtw.crm.dynamics.com -Cred $cred

Invoke-CrmWhoAmI

$iam = Invoke-CrmWhoAmI

Get-CrmRecords -EntityLogicalName account

Get-CrmEntityAttributes -EntityLogicalName systemuser | select logicalname,AttributeType | sort logicalname

Get-CrmRecord -EntityLogicalName systemuser -Id $iam.UserId -fields personalemailaddress

Set-CrmRecord -EntityLogicalName systemuser -Id $iam.UserId -Fields @{"personalemailaddress"="jim@outlook.com"}

$accounts = Get-CrmRecords -EntityLogicalName account -Fields name -TopCount 400

Submissions using Windows 10, version 1709 are now being accepted!


The Windows Hardware Lab Kit (HLK) has been updated to support Windows 10, version 1709.

The HLK is available for download on the Hardware Dev Center:

HLK version 1709 enforces the Windows 10 hardware requirements and policies posted at https://aka.ms/compatreq and is designed for testing Windows 10, version 1709.

Note: Starting with HLK version 1709, the HLK will support testing a single version of Windows 10. (Previous versions of the HLK supported testing multiple versions.)

The following support scenarios will be accepted:

HLK version 1709
  • Windows 10 versions supported: 1709 – Client
  • Device/Component submissions accepted: 1709 Client Device/Component
  • System submissions accepted: 1709 Client Systems

HLK version 1703
  • Windows 10 versions supported: 1703 – Client and 1607 – Client
  • Device/Component submissions accepted: 1703 Client Device/Component and 1607 Client Device/Component
  • System submissions accepted: 1703 Client Systems

HLK version 1607
  • Windows 10 versions supported: 1607 – Client; 1607 – Server, Azure Stack, SDDC; 1511 – Client
  • Device/Component submissions accepted: 1607 Client Device/Component, 1607 Server Device/Component, and 1511 Client Device/Component
  • System submissions accepted: 1607 Server Systems

When submitting a Windows 10, version 1709 HLK package for validation, you must use Windows 10, version 1709 (build 16299) or newer on the test device; otherwise, the submission will be rejected.
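As a quick pre-flight check, the build number of the test device can be read from the registry before packaging a submission. This is only an illustrative sketch, not part of the HLK tooling:

```powershell
# Read the installed Windows build number from the registry and compare against 16299
$cv = Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion'
$build = [int]$cv.CurrentBuildNumber
if ($build -lt 16299) {
    Write-Warning "Build $build is older than 16299; a version 1709 submission from this device will be rejected."
} else {
    Write-Output "Build $build meets the Windows 10, version 1709 requirement."
}
```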

You must continue to use the Windows Hardware Certification Kit (HCK) version 2.1 to certify for the following operating systems:

  • Windows 7
  • Windows 8
  • Windows 8.1
  • Windows Server 2012
  • Windows Server 2012 R2

You must continue to use the Windows Logo Kit (WLK) version 1.6 to certify for the following operating systems:

  • Windows Server 2008 R2 (x64 and ia64)
  • Windows Server 2008 (x86, x64 and ia64)

Certification for Windows Server 2016, Azure Stack, and SDDC must meet the Windows Hardware Compatibility Requirements as stated in version 1607 of the documentation, use the 1607 version of the Windows Server 2016 operating system, and use HLK version 1607 (build 14393) with the matching playlist and supplemental content to generate logs, following the policies stated in the Windows Server Policy. Questions about the Azure Stack or SDDC programs, or about how to submit results for solution validation, should be directed to the appropriate Microsoft contact – technical account manager or partner management contact.

Playlists to support the incremental Windows releases

With the change in the policy regarding which versions of Windows 10 the HLK will validate, it becomes important to note which tests are required with each kit. Playlists must match the HLK version used, not the Windows 10 version under test.

The required playlists pairing are:

  • HLK version 1709, x86 or x64: HLK Version 1709 CompatPlaylist x86_x64
  • HLK version 1709, ARM64 Desktop*: HLK Version 1709 CompatPlaylist ARM64 and HLK Version 1709 CompatPlaylist ARM64_x86 on ARM64
  • HLK version 1703, x86 or x64: HLK Version 1703 CompatPlaylist
  • HLK version 1607, x86 or x64: HLK Version 1607 CompatPlaylist

*Testing ARM64 Desktop requires two playlists. Please see HLK client setup and testing guidance here for additional information.

https://msdn.microsoft.com/en-us/library/windows/hardware/dn914975(v=vs.85).aspx

All playlists are available at http://aka.ms/HLKPlaylist

Systems shipping Windows 10, version 1709 may ship with drivers that achieved Compatibility with Windows 10, version 1703 until February 1, 2018!

Partners looking to achieve Compatibility for systems shipping Windows 10, version 1709 may factory-install drivers for components that achieved Compatibility with Windows 10, version 1703. Historically, this has been limited to a 90-day period after the RTM of a Windows 10 version; however, to support the transition to Universal Windows Drivers, this Windows 10 release will allow this combination of OS and drivers until February 1, 2018.

The WHCP requirements for this release that articulate what is required of drivers to be in compliance with the Universal Windows Driver effort are:

  • Device.DevFund.Reliability.ProperINF – enforced at Windows 10, version 1709 RTM
  • Device.DevFund.CDA.Application – enforced February 1, 2018
  • Device.DevFund.INF.DDInstall.CoInstallers – enforced February 1, 2018
  • System.Fundamentals.SignedDrivers.DigitalSignature – enforced February 1, 2018

It is recommended that drivers modified to align with the Universal Windows Driver effort be retested using the HLK version 1709 kit to ensure there are no regressions as an outcome of the changes in the driver. For more information about Universal drivers, please visit http://aka.ms/udriver.

The Errata 26874 filter is available to mask the particular failures seen when testing a system; the latest errata filter package can be found at http://aka.ms/hlkfilters. This policy does not apply to partners certifying for Windows Server 2016, as all components within the system must be certified for Windows Server 2016 in order to be considered compatible.


Submitting test results for Windows 10, version 1709

As previously mentioned, submissions for Windows 10, version 1709 must use HLK version 1709. Note that if you are submitting results for Windows 10 version 1709, the results must be packaged for submission using a controller with HLK version 1709 installed. If a controller with HLK version 1607 is used to package and submit results for Windows 10, version 1709, the submission will fail because the data in the package will not be in the correct format. Submissions affected by this will appear to be stuck in the Validating HCK/HLK Submission Package step. We will monitor for this scenario and reach out to partners with affected submissions. A controller with HLK version 1709 installed can be used to submit results from previous kits (HCK/HLK).
