Channel: MSDN Blogs

Pricing and Discount – Product Price Precision


In the United States, the common price precision is two decimal places - for example, a mug costs $9.99 - except when it is not.

Example: gasoline

At the gas pump, the price has three decimal places - for example, $2.499 per gallon.

Example: nails

In some hardware stores, you pay $4.99 for 100 nails, so the price per nail is $0.0499. One option is to model 100 nails as a unit of measure - say "hundreds" - but then what about 150 nails? A partial quantity, like 1.5 "hundreds" of nails, could be tricky. And if the price is $4.99 for 70 nails, "seventy" becomes a unit of measure - then how do you model 80 nails?

Per product price precision

Introduce per-product price precision. By default, we still have a common price precision; on top of that, we can override it per product. In the gasoline case, precision is three decimal places for gasoline and two for everything else.

It is a simple solution, yet not perfect. If we take unit of measure into account, then price precision can also vary by unit of measure: the price for one nail can be $0.0499, but for a box (say, 200 nails per box) you may want a different precision.

Before adding this complexity, ask yourself whether you really need it. If you absolutely need precision per unit of measure, there are two options:

  1. Model price precision per product per unit of measure
  2. Model price precision per product for the default unit of measure, and convert the precision when converting the unit of measure. (There may be some rounding of precision.)
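The per-product override with a common default can be sketched in a few lines of Python (an illustrative sketch; the product names and precision values are the examples from this post, not a real schema):

```python
# Common default precision, overridable per product, as described above.
DEFAULT_PRECISION = 2

# Per-product overrides ("per product price precision").
PRODUCT_PRECISION = {
    "gasoline": 3,   # $2.499 per gallon
    "nail": 4,       # $0.0499 each
}

def price_precision(product: str) -> int:
    """Number of decimal places to use for a product's price,
    falling back to the common default."""
    return PRODUCT_PRECISION.get(product, DEFAULT_PRECISION)

def format_price(product: str, price: float) -> str:
    prec = price_precision(product)
    return f"${price:.{prec}f}"

print(format_price("mug", 9.99))       # $9.99
print(format_price("gasoline", 2.499)) # $2.499
print(format_price("nail", 0.0499))    # $0.0499
```

Extending the lookup key from `product` to `(product, unit_of_measure)` gives option 1 above; option 2 keeps one entry per product and converts the precision alongside the unit-of-measure conversion.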

Integration Boot Camp 2017: September 21 - September 22, 2017


It's that time again. Yep - time for the 2017 Integration Boot Camp (formerly known as the BizTalk Boot Camp). As always, it's free, and open to anyone and everyone. With the new features and easier interaction with Azure Services, we have so many new and different things to show. The key technologies in this boot camp are BizTalk Server 2016 Feature Pack 1, Azure Logic Apps, Service Bus, and much more.

The agenda is being created, and will be added when the details are finalized.

Requirements

  • In-person event; Skype is not offered
  • Laptop: Many discussions include hands-on activities
  • Azure subscription: Azure virtual machines, Logic Apps, and other Azure services require an Azure subscription
  • Non-disclosure agreement (NDA): An NDA is required and is available to sign upon arrival. Many companies already have an NDA with Microsoft. When you register, we’ll confirm whether one already exists.

Location

Microsoft Corporation
8050 Microsoft Way
Charlotte, NC 28273

Bing map  Google map

Note: This year's boot camp is in a different building than previous boot camps.

When you arrive, check-in is required. If you are driving to the event, car registration is also required (download and complete Vehicle and Customer Registration).

Event Time: 8 AM - 4:30 PM EST

Registration

Link is coming soon

Attendance is limited. The event is targeted towards a technical audience, including administrators and developers. Registration includes:

  • Attendance both days
  • Breakfast and lunch both days
  • Access to the Microsoft company store

Hotel

Hotel options coming soon.

Questions

Ask in this post or contact me: mandi dot ohlinger at microsoft dot com.

We hope to see you here!
Mandi Ohlinger

Visual Studio Code C/C++ extension July 2017 Update – time to try out the new IntelliSense!


2 million downloads! This is very exciting for the Visual Studio Code C/C++ extension, considering it was just 4 months ago when we hit the 1 million milestone!

Today we are shipping the July 2017 update to the extension. Besides several bug fixes, we are continuing to polish the new IntelliSense experience that we shipped last month. This new experience is still on by default for VS Code Insiders and off for everyone else, but we encourage everyone to try the new, improved IntelliSense. You will get more accurate results for several IntelliSense features, including auto-complete suggestions for class/struct/namespace members, quick info tooltips, and error squiggles (linting), all powered by a new semantic engine. You can turn on the new IntelliSense by changing the “C_Cpp.intelliSenseEngine” setting in your settings.json file (opened from the menu File > Preferences > Settings) from “Tag Parser” to “Default”, as shown in the screenshot below.
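The setting change described above amounts to this fragment in settings.json (only the `C_Cpp.intelliSenseEngine` key and its values come from the post; the surrounding braces are the usual settings.json structure):

```json
{
    // Switch from the fuzzy tag-parser results to the new semantic engine
    "C_Cpp.intelliSenseEngine": "Default"
}
```

Setting the value back to `"Tag Parser"` restores the original fuzzy behavior.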

The following screenshot shows auto-complete suggestions for a class and quick info tooltips using the new IntelliSense engine.

The extension enables the new IntelliSense experience by first trying to fully parse any opened file using the new IntelliSense engine. If it discovers that it cannot find a header file or a dependency, it will fall back to the tag parser and provide the original fuzzy IntelliSense behavior. This blog post Visual Studio Code C/C++ extension June 2017 Update details how the fallback behavior works and how you can control it.

Tell us what you think

We encourage everyone to try out the new IntelliSense and send us feedback. Download the C/C++ extension for Visual Studio Code, try it out and let us know what you think. File issues and suggestions on GitHub. If you haven’t already provided us feedback, please take this quick survey to help shape this extension for your needs.

Release Update 2017-07-14


NOTE: Weekly updates will now show up directly in the Azure Portal. We will still post some updates (about once a month) here, but for the latest updates please open a Logic App and select "Release Notes" as pictured below:

Release Notes (aggregate):

  • Setting to disable the async polling pattern for actions (fire and forget)
  • Templates are now hosted in GitHub here
  • Added "landing tips" for creating a logic app
  • Added ability to mark a foreach loop as "sequential" from the designer
  • Webhook triggers for OneDrive for Business
  • Retry policy settings added to action settings
  • Expression authoring in the designer (dynamic content tab)
  • XSLT now has option to disable BOM (Byte Order Mark)
  • Pagination support in the designer (e.g. SQL get > 256 rows in an action)
  • Support for 8 levels of nesting in the designer (used to be 6)
  • Pan and zoom controls added to the designer
  • Can configure action timeout within the action settings in designer
  • Server-side paging for getting multiple pages of results (e.g. > 256 SQL records)
  • Accessibility enhancements
  • More special characters supported in the designer
  • Can switch an array to pass in an entire array instead of only single properties
  • Can modify and enable splitOn within the designer for a trigger
  • Decrement variable support
  • API Management run-history now renders based on OpenAPI definition
  • New video added to the template screen
  • Accessibility improvements to folder/file picker and dropdown menus
  • Performance and optimization improvements

Bug Fixes (aggregate):

  • Renaming the foreach loop in code-view wouldn't always reflect on reload
  • Foreach using new @items() syntax to enable nested loops
  • Rendering of the file picker would sometimes not wrap correctly
  • Sending full array would disappear when clicking "Switch to raw inputs"
  • Trigger wouldn't have delete enabled even if outputs weren't being accessed
  • IE11 had issues with multiple triggers
  • Function name was over the API Management card
  • Filter array from value would change to string on designer load
  • Tracked properties for variables weren't being correctly preserved
  • Copy/paste would replace the entire field when pasting in values
  • Fixed issue where trigger history display would show "Loading"
  • Dynamic content would sometimes be cut-off the top of the screen
  • Portal wouldn't show the status for all Logic Apps when > 35
  • Fixed issues when opening run history with new browse mode
  • Webhook trigger operations would sometimes ask user for polling frequency
  • Initialize variable in designer correctly lists supported types
  • Fixed reload of designer when using x-ms-dynamic-values in swagger for HTTP + Swagger

 

SSIS on Linux supports RedHat in SQL Server 2017 RC1


Dear all,

I am very pleased to announce that SSIS on Linux supports RedHat in SQL Server 2017 RC1. Besides Ubuntu, you can install SSIS on RedHat and execute your packages.

Install SSIS on RedHat

To install the mssql-server-is package on RedHat, follow these steps:

  1. Enter superuser mode.

          $ sudo su

  2. Download the Microsoft SQL Server Red Hat repository configuration file.

          $ curl https://packages.microsoft.com/config/rhel/7/mssql-server.repo > /etc/yum.repos.d/mssql-server.repo

  3. Exit superuser mode.

          $ exit

  4. Run the following command to install SQL Server Integration Services:

          $ sudo yum install -y mssql-server-is

  5. After installation, run ssis-conf:

          $ sudo /opt/ssis/bin/ssis-conf setup

  6. Once the configuration is done, set the path:

          $ export PATH=/opt/ssis/bin:$PATH

  7. Copy your SSIS package to your Linux machine and run:

          $ dtexec /F "your package" /DE "protection password"

Using C++ Coroutines with Boost C++ Libraries


Original post: Using C++ Coroutines with Boost C++ Libraries

Originally published: 2017/5/19

Last month, Jim Springfield wrote an article about using C++ coroutines with Libuv (a multi-platform C library for asynchronous I/O). This month we will look at how coroutines work with components of the Boost C++ libraries, namely boost::future and boost::asio.

Getting Boost

If you already have Boost installed, you can skip this step. If not, we recommend using vcpkg to install Boost on your machine quickly. Follow the instructions to get vcpkg, then use the following command line to install the 32-bit and 64-bit versions of Boost:

.\vcpkg install boost boost:x64-windows

To make sure everything is installed correctly, open Visual Studio and create a C++ Win32 console application:

When you run the program, it should print 42.

Boost::Future: the coroutine part

When the compiler encounters co_await, co_yield, or co_return in a function, it treats the function as a coroutine. C++ by itself does not define the semantics of coroutines; a user or a library writer needs to provide a specialization of the std::experimental::coroutine_traits template that tells the compiler what to do. (The compiler instantiates it with the return type of the function and the types of all of the parameters passed to the function.)

We would like to be able to write coroutines that return boost::future. To do that, we specialize coroutine_traits as follows:

A coroutine needs to return a future that will be satisfied when the coroutine runs to completion, or that carries the exception if the coroutine ends with one.

The member function promise_type::get_return_object explains how to obtain the future connected to a particular instance of the coroutine. The member function promise_type::set_exception specifies what happens if an unhandled exception occurs inside the coroutine. In our case, we want to store the exception in the promise connected to the future we returned from the coroutine.

The member function promise_type::return_void specifies what happens when execution reaches a co_return statement or when control flows off the end of the coroutine.

The member functions initial_suspend and final_suspend, as we defined them, tell the compiler that we want to start executing the coroutine immediately after it is invoked, and to destroy the coroutine as soon as it runs to completion.

To handle non-void futures, we define a specialization of coroutine_traits for boost::future of an arbitrary type:

Note that in this case we defined return_value, as opposed to return_void in the previous example. It tells the compiler that we expect the coroutine to eventually return some non-void value (via a co_return statement), and that value will be propagated to the future associated with the coroutine. (There is a lot of common code between these two specializations; it could be factored out if desired.)

Now we are ready to test it. Add the compiler option /await on the command line to enable coroutine support in the compiler (since coroutines are not yet part of the C++ standard, an explicit opt-in is required to enable them).

Also, add the include file for coroutine support, which defines the templates in std::experimental that matter to us:

When the program runs, it should print "Hi" followed by 42.

Boost::Future: the await part

The next step is to explain to the compiler what to do if you try to 'await' on a boost::future.

Given an expression to await on, the compiler needs to know three things:

  1. Is it ready?
  2. If it is ready, how do we get the result?
  3. If it is not ready, how do we subscribe to get notified when it becomes ready?

To get the answers to these questions, the compiler looks for three member functions: await_ready, which returns 'true' or 'false'; await_resume, which the compiler calls to get the result once the expression is ready (the result of the call to await_resume becomes the result of the whole await expression); and finally await_suspend, which the compiler calls to subscribe to get notified when the result is ready, passing in a coroutine handle that can be used to resume or destroy the coroutine.

In the case of boost::future, it has the functionality required to give the answers, but it does not have the member functions described in the previous paragraph. To deal with that, we can define an operator co_await that adapts what boost::future has to what the compiler wants.

With this setup, when the future is already ready, the coroutine bypasses suspension via await_ready and immediately obtains the result via await_resume.

Which behavior is more efficient depends on the application. If you are writing a client application, your program will naturally run a bit faster when the future is already ready, since you avoid the cost of suspending and later resuming the coroutine. In a server application that handles hundreds of concurrent requests, resuming coroutines fairly as requests arrive can matter a great deal, because it keeps response times predictable. Imagine a coroutine that is lucky, and whose futures always happen to be ready whenever it checks; such a coroutine could monopolize the thread and starve other users.

Pick whichever option you like and try out our brand new operator co_await:

As usual, when you run this snippet, it prints 42. Note that in function f we no longer need co_return; the compiler knows it is a coroutine due to the presence of the await expression.

Boost::asio

Using the adapters we have developed so far, you can now freely write coroutines that return boost::future and await on any APIs and libraries that return boost::future. But what if you have a library that does not return boost::future and uses callbacks as its continuation mechanism?

As a model, we will use the async_wait member function of boost::asio::system_timer. Without coroutines, you would use system_timer as follows:

When you run this program, it prints "waiting for a tick", followed 100 milliseconds later by "tick".

Let's create a wrapper around the timer's async_wait that makes it usable with coroutines. We would like to be able to use this construct to suspend execution for the desired duration using the specified timer:

The overall structure looks the same as how we defined operator co_await for boost::future. We need to return from async_await an object that tells the compiler when to suspend, when to resume, and what the result of the operator is.

Note that when we construct the Awaiter, we pass it the parameters t and d. We need to store them in the Awaiter so that we can access them in the await_ready and await_suspend member functions.

You may also have noticed that in the system_timer example, async_wait takes a parameter that receives an error code indicating whether the wait completed or an error occurred (for example, the timer was cancelled). We need to add a member variable to the awaiter to store the error code until it is consumed by await_resume.

The member function await_ready tells us whether we need to suspend at all. If we set it up as follows, we tell the compiler not to suspend the coroutine when the wait duration is zero.

In await_suspend, we call timer.async_wait to subscribe a continuation. When boost::asio calls us back, we remember the error code and resume the coroutine.

Finally, when the coroutine is resumed, we check the error code and turn it into an exception if the wait was not successful.

And, for convenience, here is the whole adapter in one piece:

A simple example that uses it:

When you run it, it prints tick1, tick2, and tick3 at 100-millisecond intervals.
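The awaiter protocol described in this section has a loose analogue in Python's awaitable protocol. As an illustrative aside (this is not the post's C++ code, and the class name is made up), a similar timer awaitable in Python plays the roles of await_ready/await_suspend/await_resume:

```python
import asyncio

class Sleep:
    """Awaitable that suspends the coroutine for `delay` seconds,
    mirroring the awaiter wrapped around system_timer's async_wait."""
    def __init__(self, delay: float):
        self.delay = delay

    def __await__(self):
        # "await_ready": a zero-length wait never suspends
        if self.delay <= 0:
            return None
        # "await_suspend": delegate the actual suspension to the
        # event loop's timer, resuming when it fires
        return (yield from asyncio.sleep(self.delay).__await__())

async def ticks():
    for n in (1, 2, 3):
        await Sleep(0.1)       # suspend for 100 ms
        print(f"tick{n}")

asyncio.run(ticks())
```

As in the C++ version, the zero-duration fast path skips the suspension entirely, with the same fairness trade-off discussed above.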

Summary

We took a quick look at how to develop adapters that allow coroutines to consume existing C++ libraries. Give it a try, and try adding more adapters. Also watch for an upcoming blog post on how to use boost::asio's CompletionToken traits to create coroutine adapters without writing them by hand.

 

Building VSTS Extensions with feature flags – Part 2


In Building VSTS Extensions with feature flags we started a discussion about feature flags and how we’re using LaunchDarkly to eliminate risk and deliver value. We closed with a brief mention that we’re trying to find a way to exchange a more secure user key as part of the communication between our extensions and the LaunchDarkly service. We believe we found a solution, which is the focus of this post.

Continued from Part 1

Context

image

Read http://docs.launchdarkly.com/docs/js-sdk-reference#section-secure-mode for details on the secure mode.

Our proposed solution

After some investigation, and inspired by the Create a VSTS Extension that uses Azure Functions blog post, we experimented with using Azure Functions to call server-side code. The VSTS extension calls the Azure Function, which processes and loads the hash key and returns it to the extension. This solution, however, raised a security challenge around passing secure parameters through the Azure Function: a malicious user could call the Azure Function with the user key of another user and recover the values of that user's flags.

So, we had to find suitably secure parameters.

The flow in simple steps:

  1. The VSTS extension calls the Azure Function with a user token parameter.
  2. The Azure Function checks the user token and returns the hashed userkey.
  3. The VSTS extension gets the response from the Azure Function and continues processing by calling the LaunchDarkly SDK's Initialize method with this hash key as a parameter.

image

Here’s the specification for the Azure Function.

Inputs parameters

  • User Token: session type token provided by VSTS service. For example: eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJuYW1laWQiOiI4NGVhNDg0NS01N2JhLTQwOTYtOTA5Zi0yOGYyM2NlNTRmZTUiLCJ0aWQiOiJXaW5kb3dzIExpdmUgSUQiLCJpc3MiOiJhcHAudnNzcHMudmlzdWFsc3R1ZGlvLmNvbSIsImF1ZCI6ImZiMzk4ZTkyLWY2ODktNDAyOS05ZDhhLWQwNmI2YzdjODc2YyIsIm5iZiI6MTQ5ODY2Njk2MywiZXhwIjoxNDk4NjcwNTYzfQ.yioxdiH6AGMpSzoTWmf3953yjqg46DS0N2TWhR8EX1E
  • VSTS user account. For example: mikaelkrief

Azure Function process (see sample source code at the end of this post)

  1. Check the validity of the user token (using the certificate of the extension).

    Read Auth and security for details.

  2. If the user token is valid, extract the user id from the principalClaims encrypted in the token. For example, 22544b1b-d4cd-489b-a2ea-ed932c8853b6.

    -> If not valid, return a 500 error
  3. Create the LaunchDarkly userkey with this pattern: userid:vstsaccount
  4. Generate the Hash for the userkey created in (3).

    See public string SecureModeHash(User user) for details.

Output

  • The Azure Function returns the hash key for the current user.
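The secure-mode hash the function returns is simply an HMAC-SHA256 of the user key, keyed with the SDK key and hex-encoded in lowercase, as the C# `getHashKey` sample at the end of this post shows. A minimal Python sketch of the same computation (the keys shown are placeholders):

```python
import hashlib
import hmac

def secure_mode_hash(user_key: str, sdk_key: str) -> str:
    """HMAC-SHA256 of the user key, keyed with the SDK key, as a
    lowercase hex string -- the value LaunchDarkly expects in secure mode."""
    return hmac.new(sdk_key.encode("utf-8"),
                    user_key.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# userkey pattern from step 3 above: "userid:vstsaccount"
print(secure_mode_hash("22544b1b-d4cd-489b-a2ea-ed932c8853b6:mikaelkrief",
                       "my-sdk-key"))
```

Because the SDK key never leaves the server side, a client cannot forge this hash for another user's key.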

Let’s validate and test this solution

SCENARIO 1 - Change the user token

Test: A malicious user tries to change the user token, knowing that it is impossible to have a valid token of another user who is currently connected (that is the point of the session token).

Result: The token validation fails and the Azure function returns error 500

SCENARIO 2 - Change the VSTS account

Test: Malicious user tries to change the VSTS account by passing another VSTS account and a correct user token.

Result: The Azure Function does not fail, since the user token check returns true, and it returns a hash key. However, that hash key does not match the userkey passed to the LaunchDarkly Initialize method, resulting in a validation failure in the LaunchDarkly service, which returns a 400 status code (Bad Request) with the message "Environment is in secure mode, and user hash does not match.".

What’s Next?

Now that we have found a secure solution for calling an Azure Function from our VSTS extension, we are using it to call the LaunchDarkly REST APIs, which will certainly be covered in a future blog article. We're also polishing the team-services-extension-featureflag-sample and implementing feature flags in our Roll-Up-Board-Widget-Extension and Work-Item-Details-Widget-Extension solutions. Once we're done, we'll summarize the learnings and recommendations in an article "Phase the features of your application with feature flags" on https://aka.ms/techarticles. Watch this space.

References

https://blogs.msdn.microsoft.com/visualstudioalmrangers/2017/06/27/building-vsts-extensions-with-feature-flags/

https://launchdarkly.com/

SAMPLE CODE

    #r "D:\home\site\wwwroot\CheckToken\System.IdentityModel.dll"

    using System.Net;
    using System.Collections.Generic;
    using System.Security.Cryptography;
    using System.IdentityModel.Tokens;
    using System.ServiceModel.Security.Tokens;

    public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
    {
        try
        {
            // Get the input POST parameters
            var data = req.Content.ReadAsStringAsync().Result;
            var formValues = data.Split('&')
                .Select(value => value.Split('='))
                .ToDictionary(pair => Uri.UnescapeDataString(pair[0]).Replace("+", " "),
                              pair => Uri.UnescapeDataString(pair[1]).Replace("+", " "));
            var issuedToken = formValues["token"];
            var account = formValues["account"];

            // Check the token, and extract the user id encrypted in the token
            var userId = checkTokenValidityAndGetUserId(issuedToken);
            if (userId != null)
            {
                // Hash the user key
                string hash = getHashKey(userId + ":" + account);

                // Return the hash key
                return req.CreateResponse(HttpStatusCode.OK, hash);
            }
            else
            {
                return req.CreateResponse(HttpStatusCode.InternalServerError, HttpStatusCode.InternalServerError);
            }
        }
        catch
        {
            return req.CreateResponse(HttpStatusCode.InternalServerError, HttpStatusCode.InternalServerError);
        }
    }

    public static string checkTokenValidityAndGetUserId(string issuedToken)
    {
        try
        {
            string secret = "<My extension certificate>"; // Load your extension's secret
            var validationParameters = new TokenValidationParameters()
            {
                IssuerSigningTokens = new List<BinarySecretSecurityToken>()
                {
                    new BinarySecretSecurityToken(System.Text.UTF8Encoding.UTF8.GetBytes(secret))
                },
                ValidateIssuer = false,
                RequireSignedTokens = true,
                RequireExpirationTime = true,
                ValidateLifetime = true,
                ValidateAudience = false,
                ValidateActor = false
            };
            SecurityToken token = null;
            var tokenHandler = new JwtSecurityTokenHandler();
            var principal = tokenHandler.ValidateToken(issuedToken, validationParameters, out token);

            // Extract the user id from the principal claims
            string principalUserId = principal.Claims.FirstOrDefault(q => string.Compare(q.Type,
                "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier", true) == 0).Value;
            return principalUserId;
        }
        catch
        {
            return null;
        }
    }

    // Hash the secure userkey
    // Source: https://github.com/launchdarkly/.net-client/blob/eafb706589ba57e72f93f58cfb80f48c6fba03ec/src/LaunchDarkly.Client/LdClient.cs#L189
    public static string getHashKey(string userkey)
    {
        if (string.IsNullOrEmpty(userkey))
        {
            return null;
        }
        System.Text.UTF8Encoding encoding = new System.Text.UTF8Encoding();
        byte[] keyBytes = encoding.GetBytes("sdk-59baef5c-3851-4fef-a6a6-05a6e9c38ea9");
        HMACSHA256 hmacSha256 = new HMACSHA256(keyBytes);
        byte[] hashedMessage = hmacSha256.ComputeHash(encoding.GetBytes(userkey));
        return BitConverter.ToString(hashedMessage).Replace("-", "").ToLower();
    }

The June 2017 security update (Sec Patch) for Lync 2010 has been released


Hello, this is the Japan Lync/Skype support team.
The June 2017 security update (Sec Patch) for Lync 2010 has been released.

Description of the security update for Microsoft Lync 2010: June 13, 2017
https://support.microsoft.com/ja-jp/help/4020732

Description of the security update for Microsoft Lync 2010 Attendee (admin level install): June 13, 2017
https://support.microsoft.com/en-us/help/4020733/

Description of the security update for Microsoft Lync 2010 Attendee (user level install): June 13, 2017
https://support.microsoft.com/en-us/help/4020734/

After applying the update, the file version is 4.0.7577.4534.
The update fixes the following issue:

A remote code execution vulnerability exists in the way Windows Uniscribe handles objects in memory. An attacker who successfully exploited this vulnerability could take control of the affected system. The attacker could then install programs; view, change, or delete data; or create new accounts with full user rights. Users whose accounts are configured to have fewer user rights on the system could be less impacted than users who operate with administrative user rights.

The information in this post (including attachments and links) is current as of the date of writing and is subject to change without notice.


How to Set Up and Host Your Own Website with an SQL Server on Azure for Students


Guest post by James Tavernor, Microsoft Student Partner from Imperial College London

About Me

image

Hi, I’m James, and I’ve just completed my first year studying Joint Mathematics and Computer Science at Imperial College London. I’m from Cheshire. As well as enjoying programming and video games, I’ve played the piano from a young age and picked up the trombone a few years ago.

Setting Up Azure for Students

Sign up for an Imagine student account and verify student status to get free access to Azure web apps on the cloud at https://imagine.microsoft.com/en-us/Account.

Then you can get the student Azure offer at https://azure.microsoft.com/en-gb/pricing/member-offers/imagine/.

Once these steps have been completed go to https://portal.azure.com where you should now be able to access the Azure portal for free.

Setting up the SQL Server and Database

We are going to set up an SQL server and host an SQL database on this server. To do this choose SQL databases on the left panel.

image

Then choose the add option. You will then be prompted to choose the name for your SQL database and for an SQL server to host the database on, though you can just choose to create a new server.

image

You’ll want to ensure when creating the server that the little checkbox is ticked to allow Azure services to access the server so our web app can access it.

I have used an existing resource group in the above screenshot but you will probably want to create a new one, resource groups are generally used to group together resources for one project to help with organisation.

Now we will probably want to add a table to our SQL database. A nice tool to help with this is SQL Server Data Tools (https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt), or you can use the Query editor if you’re confident with your SQL.

Select your database, either from the dashboard or by navigating to SQL databases on the left panel and choosing the database we just created. Then choose Overview; along the top of the window that opens there will be an option called Tools - click this.

image

A new window will then open giving us two choices, we can open the database in visual studio like I did to create a table, or you can open the query editor to run SQL Queries on the database.

image

Creating a table in Visual Studio using SQL Server Data Tools is easy: we choose open in Visual Studio, and once it opens we are met with a connection window that is already filled in - we just need to enter the password we made for the server when we created it.

image

After connecting, there is a file explorer on the left. Use this to navigate to your database and then into the “Tables” folder inside the database and click add new table. A new page will load up and it’s very intuitive to add new columns - you just fill in the details and it’ll dynamically expand as you add more.

Once it is ready just choose update in the top left and once it has prepared the script choose update database.

image

After creating the table in Visual Studio and updating the database I then used the Query editor on Azure to add values to my table.

After opening the query editor, before we do anything we’ll have to login using the credentials we provided when creating the SQL server, and then we can enter SQL commands and execute them on our database.

image

Click run to execute the SQL on the database.

The window below the query editor should output a message saying how the table was affected by the code (mine said “Query succeeded: Affected rows: 6”).

We can then output the table to check that it all worked correctly.

image

We will also need to whitelist any IP connecting to this SQL server, or Azure will deny access as a security feature. To do this, navigate back to the overview of our SQL database; along the top, where Tools was, there is another option, "Set server firewall", where we can easily choose to add the client IP or add any manually ourselves.

To connect, we can then use standard PHP connection code. We shouldn’t need to whitelist an IP when we host the web app, because we checked the box allowing connections from Azure services when we created the SQL server.
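The same connection string works from any language, not just PHP. As an illustrative aside (the server, database, and user names below are hypothetical placeholders, not values from this post), here is how you might assemble the standard Azure SQL ODBC connection string in Python:

```python
def azure_sql_connection_string(server: str, database: str,
                                user: str, password: str) -> str:
    """Assemble the standard ODBC connection string for an Azure SQL
    database (the same shape Azure shows under the connection strings blade)."""
    return (
        "Driver={ODBC Driver 17 for SQL Server};"
        f"Server=tcp:{server}.database.windows.net,1433;"
        f"Database={database};Uid={user};Pwd={password};"
        "Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;"
    )

conn_str = azure_sql_connection_string("myserver", "mydb", "azureuser", "S3cret!")
print(conn_str)
# To actually connect, you would pass this string to a driver such as pyodbc:
#   import pyodbc
#   conn = pyodbc.connect(conn_str)
```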

If you’re struggling with the code, Azure makes it even easier by giving you example connection code for a lot of common languages. Again, on the overview, choose “Show database connection strings”.

image

Uploading Your Website to Azure

I’m now going to upload my PHP website to Azure which will then connect to the SQL database we created.

First navigate to App Services.

image

Choose the add option at the top of the screen, as we did with the SQL database/server, and choose an app name - this will be the start of your URL. Use the same resource group as the rest of the project; this helps with organisation if you decide to host several websites that use different SQL servers or resources.

image

Then from the dashboard choose your App Service, then from inside your web app on the left panel choose “Deployment options”.

image

As our source, we are going to choose GitHub.

image

Then authorise Azure to view your repositories and choose the repository of your website from GitHub and which branch you want to publish and continue.

This will essentially just git clone your repository into the wwwroot of the server. The best thing about using GitHub for this is that the server automatically updates when you push to your repository, which makes it simple and efficient to roll out updates and bug fixes.

If you set up your index webpage to be in the top-most directory of your GitHub repository, your website should now be set up. If you didn’t, you’ll have to move some files around in the console section of the Azure page (which I will briefly demonstrate, as I will be moving my database login details up a directory to be completely sure they can’t be accessed from the web).

So, on the left panel of the App Service we have open, scroll down to the Developer Tools section and choose Console - you should then have a screen like this.

image

Here is the code I used to move the file up a directory, I won’t go into much detail about this section because it is standard bash commands.

image

If you need to edit a file (like I did, to put in my database login details, which I didn’t want to upload to GitHub), a good way to do that is to go underneath Console on the left panel and choose Advanced Tools, then choose Debug console in the header bar and then CMD.

image

Then simply navigate to the file you want to edit by clicking the site folder and then wwwroot as this is where your GitHub repository will have been uploaded and then click the little pencil next to the name.

image

And then just edit as though it were a normal text file.

If the name you chose for your web app was “name” your URL will be “http://name.azurewebsites.net” and your website should now be live at this URL. The code for my website can be viewed at https://github.com/jtavernor/MyWebsite.

Our Git Fails (Top 5)


Here is my Top 5 of the strangest things I noticed in our ARDA repository.

Here are the repository links:

Old: https://github.com/DXBrazil/Arda_old
Recent: https://github.com/DXBrazil/Arda

Good developers keep the Git history linear and concise by using a defined workflow (e.g. Gitflow), squash commits, rebase, and so on. Our team is good, but we ignored those practices and kept all the intermediate commits. Fortunately, since the history remained intact, today it is possible to understand the story of ARDA in full detail, especially our mistakes.

Fail #5: Working without feature branches

For a while, we kept individual branches per user, and that produced a visually non-linear history:

image

In our project we did not adopt rebase and always created merge points. Although rebase makes the history more linear, the advantage of merging is that the history stays faithful to the commits that were actually made.

Just for fun, I took a screenshot from SourceTree and rotated the image, making our history musically more interesting:

image

What a mess! Today I can see that we could have had a cleaner history if we had adopted the Gitflow workflow (or something similar) from the beginning of the project.

Fail #4: Storing passwords in the repository

The worst part is that this mistake is very common in projects. In our case, we added the secrets.json file, with login information for Redis, SQL Server, and Azure Blob Storage.

image

We then tried to remove the passwords with a commit titled "removed the sql server connection string".

image

That commit could just as well have been renamed "hackers, there are passwords here -- see the commit details". Although the passwords were removed from the files, the commits are kept inside the Git repository, and anyone with access can look at the file's history.

image

Lesson: this mistake will always happen, and happen again, so be prepared to define a password-reset process for your development, test, and production environments.

Fail #3: Misconfiguring the Git client

When you install the Git client, you configure a user name and email for accessing the repository. If you have several machines, you must repeat the configuration on every one of them.

We failed at that. We have several emails registered:

Fabricio Sanchez, for example, used several different email configurations:

Configuring the wrong user and email has a very low impact, but it produced an interesting statistic: Fabricio Sanchez's contribution was to delete 165 thousand lines from the project against 135 thousand added! He destroyed more than he created (he deserves a prize, heh).

image

Obrigado Fabricio Sanchez por todos os seus deletes no projeto!

image

Mas suspeito que, pelo fato de configurar o usuário/email errado, fez com que suas adições fossem ignoradas.

image

No final, isso não tira o mérito dele em ser um grande contribuidor do projeto ARDA.

Failure #2: Mistakes in the Git ignore file

Another common mistake is forgetting to configure the Git ignore file (.gitignore) and including unnecessary files in version control. The most common offenders are the bin, out, obj, bower_components, and node_modules directories.

What is wrong with including the NodeJS dependencies (the node_modules directory)?

image

Simple: that directory holds 7,302 files that have no reason to be versioned in Git.

image

But believe it or not, our ARDA team did not make this mistake!!! Visual Studio automatically creates the file with the extensions to be ignored, so the .gitignore was correct from the beginning of the project.
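For reference, the relevant entries in such a file look like the following (a hand-written sketch; the file Visual Studio generates is far more complete):

```
# build output
[Bb]in/
[Oo]bj/
out/

# restored packages
node_modules/
bower_components/
```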

image

However, we did one better: our intern (him again!) added the path of our Arda.Main project to the Git ignore file.

image

When you do that, strange errors start happening in the Arda.Main project:

  • Existing files: remain tracked by Git, and updates still work
  • New files: are ignored by the .gitignore and are never added to the project

That behavior caused random-looking errors for quite a while. Certain changes to Arda.Main worked, others did not. We only discovered the root cause days later.
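The asymmetry is easy to reproduce in a throwaway repository (paths invented to mirror the Arda.Main case):

```shell
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email demo@example.com && git config user.name demo
mkdir Arda.Main
echo v1 > Arda.Main/Startup.cs
git add . && git commit -qm "initial"
echo "Arda.Main/" > .gitignore           # the mistake: ignore an already-tracked folder
echo v2 > Arda.Main/Startup.cs           # change to an existing, tracked file
echo new > Arda.Main/NewFile.cs          # brand-new file in the same folder
git status --porcelain                   # Startup.cs shows as modified; NewFile.cs is missing
```

Because .gitignore only applies to untracked files, the already-tracked Startup.cs still shows up as modified, while the brand-new NewFile.cs is silently ignored.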

Failure #1: No control over master

Our repositories (GitHub and VSTS) are password-protected, but anyone on the team could go into the repository and edit files. From the descriptions alone you can spot a few quick hacks. For example, commits corresponding to Work in Progress (WIP) appear in several places in the Git history.

image

There are other commits like “adjusting dockerfile again” or “once again, trying to adjust the path” made directly on master.

image

Not to mention that we used the master repository for demos at Tech Summit 2016.

image

Things get ugly when root shows up making commits.

image

To this day I don't know where that commit came from, but I think it was someone using VIM inside an Ubuntu box to run the containers.

image

I understand that we used this application many times for demos at events. Even so, our master branch became a mess. If this project were critical (more critical than it currently is), we should have forked the repository and worked there. In short, there are many ways to organize access to repositories.

 

Conclusion

Does your team really know how to work with Git?

In our case, we are (always) learning…

What about you? Share your bad experiences in the comments - we already know the good ones!

 

Using MinGW and Cygwin with Visual C++ and Open Folder


Building cross-platform C and C++ code is easier than ever with Visual Studio 15.3 Preview 4.  The latest preview improves support for alternative compilers and build environments such as MinGW and Cygwin.  MinGW (Minimalist GNU for Windows), in case you are not familiar with it, is a compiler in the GCC family designed to run natively on Windows.  If you are interested in a quick rundown of this new functionality, check out our latest GoingNative episode on Channel 9.

Most MinGW installations, however, include much more than just a compiler. Distributions of MinGW typically bundle a whole host of other tools that can be used to build and assemble software on Windows using the familiar GNU toolset.  MinGW build environments often contain an entire POSIX development environment that can be used to develop both native Windows software using POSIX build tools and POSIX software that runs on Windows with the help of an emulation layer.

Please download the preview and try out the latest C++ features.  You can learn more about Open Folder on the Microsoft docs site.  We are looking forward to your feedback.

Why MinGW

There are three reasons why you might want to use MinGW on Windows.  The first is simply compiler compatibility.  We all love MSVC, but the reality is that some codebases are designed from the ground up under the expectation that they will be built with GCC.  This is especially true for many cross-platform projects.  Porting these codebases to support MSVC can be a costly endeavor when all you want to do is get up and running with your code.  With MinGW you can be up and running in minutes instead of days.

There is, however, an even more compelling reason to use MinGW than source compatibility with GCC.  MinGW environments are not just a compiler, but include entire POSIX build environments.  These development environments allow cross-platform projects to build on Windows with few if any modifications to their build logic.  Need Make or Autotools? They are already installed, and if you are using an environment like MSYS2, nearly every tool you might need is only a single package-management command away.

Finally, some MinGW distributions such as MSYS2 allow you to build native POSIX software on Windows without porting the code to Win32.  This is accomplished by an emulation layer that allows native POSIX tools to run on Windows.  This same layer is what allows all your favorite POSIX build tools to run in the MinGW environment.

Install MinGW on Windows

Getting started with MinGW is simple once you have installed the latest Visual Studio Preview.  First, you will need to make sure that you select the C++ workload when you install Visual Studio.  Next, you will need to download MinGW itself.  There are actually many ways to install MinGW.  If you are particularly adventurous you can build it from source, but it is much easier to install any of the popular binary distributions.  One of the more popular distributions is MSYS2 (getting started guide).  A standard MSYS2 installation installs three build environments: POSIX build environments to natively target 32-bit and 64-bit Windows, and an emulation layer to build POSIX software that targets Windows.

If you have a particular project in mind that you are working with, it is also worth checking out if they have any project-specific instructions on how to install MinGW.  Some projects even include their own custom tailored MinGW distributions.

Use MinGW with Visual Studio

It has been possible to use GCC based compilers with Visual Studio projects for some time already, but many cross-platform projects that build with MinGW on Windows are not organized into Solution and Visual C++ project files.  Creating these assets for a real-world project can be time consuming.  To streamline onboarding cross-platform projects into Visual Studio, we now support opening folders directly, and in the latest preview it is easier than ever to use alternative compilers and build environments such as MinGW.

With “Open Folder” you can edit your code with full IntelliSense, build, and debug with the same fidelity as is available with full Solution and C++ project files.  However, instead of authoring hundreds of lines of XML you just need to write a small amount of JSON.  Even better, you only need to write the JSON for the features you need.  For instance, if you only want to edit code in the IDE and stick to the command line for everything else, you only need to write a few lines of JSON to enable IntelliSense.

Edit with Accurate IntelliSense

To get the most out of IntelliSense when using “Open Folder” you will need to create a CppProperties.json file.  You can create this file by selecting “Project->Edit Settings->CppProperties.json…” from the main menu.

Create CppProperties.json

This file is automatically populated with four configurations: “Debug” and “Release” for the x86 and x64 architectures.  If you don’t need all of these configurations you can remove them to reduce clutter.  Next, you will need to configure a few options to get IntelliSense that is consistent with your MinGW build.

In each configuration, you may notice an "intelliSenseMode" key.  For MinGW and other GCC based compilers you should set this to "windows-clang-x86" or "windows-clang-x64" depending on your architecture.

Next, you will need to add the include paths for your project.  Configuration specific include paths can be added to the "includePath" array under each configuration.  You can add project-wide include paths here as well but it may be easier to add them to the “INCLUDE” environment variable, by adding an "environments" block to your configuration file as a peer to "configurations":

"environments": [
  {
    "MINGW_PREFIX": "C:/msys64/mingw64",
    "MINGW_CHOST": "x86_64-w64-mingw32",
    "MINGW_PACKAGE_PREFIX": "mingw-w64-x86_64",
    "MSYSTEM": "MINGW64",
    "MSYSTEM_CARCH": "x86_64",
    "MSYSTEM_PREFIX": "${env.MINGW_PREFIX}",
    "SHELL": "${env.MINGW_PREFIX}/../usr/bin/bash",
    "TEMP": "${env.MINGW_PREFIX}/../tmp",
    "TMP": "${env.TEMP}",
    "PATH": "${env.MINGW_PREFIX}/bin;${env.MINGW_PREFIX}/../usr/local/bin;${env.MINGW_PREFIX}/../usr/bin;${env.PATH}",
    "INCLUDE": "project/lib/includes;${env.MINGW_PREFIX}/mingw/includes"
  }
],

Note: MINGW_PREFIX should point to your MinGW installation, if you installed MSYS2 with default settings this directory will be “C:/msys64/mingw64”.

Most GCC based compilers, including MinGW, automatically include certain system header directories.  For full IntelliSense, you will need to add these to the include path list.  You can get a list of these standard includes by running:

echo | gcc -Wp,-v -x c++ - -fsyntax-only

Build with MinGW

Once you have set up IntelliSense (with “MINGW_PREFIX” and its related environment variables defined and added to “PATH”), building with MinGW is quite simple.  First, you will need to create a build task.  Right click your build script (or any other file you would like to associate with the task) and select “Configure Tasks”:

Configure Tasks

This will create a “tasks.vs.json” file if you don’t already have one and create a stub.  Since MinGW and its tools are already on the PATH, as configured in CppProperties.json, you can configure the task to use them like this:

{
  "version": "0.2.1",
  "tasks": [
    {
      "taskName": "build-all",
      "appliesTo": "hello.cpp",
      "contextType": "build",
      "type": "launch",
      "command": "${env.comspec}",
      "args": [
        "g++ -o bin/helloworld.exe -g hello.cpp"
      ]
    }
  ]
}

This example is simple – it only builds one file – but this can call into Make or any other build tools supported by MinGW as well.  Once you save this file, you should have a “Build” option available for any files matched by the "appliesTo" tag.

Selecting “Build” from the context menu will run the command and stream any console output to the Output Window.

Debug MinGW Applications

Once you have configured your project to build, debugging can be configured by selecting a template.  Once you have built an executable, find it in the folder view and select “Debug and Launch Settings” from the executable’s context menu in the Solution Explorer:

Debug and Launch Settings

This will allow you to select a debugger template for this executable:

Select a Debugger

This will create an entry in the “launch.vs.json” file that configures the debugger:

{
  "version": "0.2.1",
  "defaults": {},
  "configurations": [
    {
      "type": "cppdbg",
      "name": "helloworld.exe",
      "project": "bin\\helloworld.exe",
      "cwd": "${workspaceRoot}",
      "program": "${debugInfo.target}",
      "MIMode": "gdb",
      "miDebuggerPath": "${env.MINGW_PREFIX}\\bin\\gdb.exe",
      "externalConsole": true
    }
  ]
}

In most cases, you won’t have to modify this at all, as long as you defined “MINGW_PREFIX” in the “CppProperties.json” file.  Now you are ready to edit, build, and debug the code.  You can right click the executable and select “Debug” in the context menu or select the newly added launch configuration in the debug dropdown menu to use F5.  You should be able to use breakpoints and other debugging features like the Locals, Watch, and Autos Windows.  If you are having trouble hitting breakpoints, make sure your build command was configured to emit GDB compatible symbols and debugging information.

Finally, to complete the full inner loop, you can add an "output" tag to your build task in “tasks.vs.json”.  For instance:

"output": "${workspaceRoot}\\bin\\helloworld.exe",

What About Cygwin and Other Compilers in the GCC Family

While we have discussed MinGW specifically in this post, it is worth keeping in mind that very little of the content above is specific to MinGW.  These topics apply equally well to other compilers and build environments so long as they can produce binaries that can be debugged with GDB.  With minimal modification, the instructions under the “Use MinGW with Visual Studio” section above should work fine with other environments such as Cygwin and even other compilers like Clang.  In the future, Open Folder will support an even greater variety of compilers and debuggers - some out of the box, some with a little bit of extra configuration that may be specific to your projects.

Send Us Feedback

To try out the latest and greatest C++ features and give us some early feedback, please download and install the latest Visual Studio 2017 Preview.  As always, we welcome your feedback.  Feel free to send any comments through e-mail at visualcpp@microsoft.com, through Twitter @visualc, or Facebook at Microsoft Visual Cpp.

If you encounter other problems with Visual Studio 2017 please let us know via Report a Problem, which is available in both the installer and the IDE itself.  For suggestions, let us know through UserVoice. We look forward to your feedback!

Do people write insane code with multiple overlapping side effects with a straight face?



On an internal C# discussion list, a topic that comes up every so often is asking about the correct interpretation of statements like

a -= a *= a;
p[x++] = ++x;

I asked,

Who writes code like that with a straight face? It's one thing to write it because you're trying to win the IOCCC or you're writing a puzzle, but in both cases, you know that you're doing something bizarre. Are there people who write a -= a *= a and p[x++] = ++x; and think, "Gosh, I'm writing really good code?"

Eric Lippert replied, "Yes, there are most certainly such people." He gave as one example a book from an apparently-successful author (sales of over four million and counting) who firmly believed that the terser your code, the faster it ran. The author crammed multiple side effects into a single expression, used ternary operators like they were going out of style, and generally believed that run time was proportional to the number of semicolons executed, and that every variable killed a puppy.

Sure, with enough effort, you could do enough flow analysis to have the compiler emit a warning like "The result of this operation may vary depending upon the order of evaluation", but then you have to deal with other problems.

First of all, there will be a lot of false positives. For example, you might write

total_cost = p->base_price + p->calculate_tax();

This would raise the warning because the compiler observes that the calculate_tax method is not const, so it is worried that executing the method may modify the base_price, in which case it matters whether you add the tax to the original base price or the updated one. Now, you may know (by using knowledge not available to the compiler) that the calculate_tax method updates the tax locale for the object, but does not update the base price, so you know that this is a false alarm.

The problem is that there are going to be an awful lot of these false alarms, and people are just going to disable the warning.

Okay, so you dial things back and warn only for more blatant cases, where a variable is modified and evaluated within the same expression: "Warning: Expression relies on the order of evaluation."

Super-Confident Joe Expert programmer knows that his code is awesome and the compiler is just being a wuss. "Well, obviously the variable is incremented first, and then it is used to calculate the array index, and then the result of the array lookup is stored back to the variable. There's no order of evaluation conflict here. Stupid compiler." Super-Confident Joe Expert turns off the warning. But then again, Super-Confident Joe Expert is probably a lost cause, so maybe we don't worry about him.

Joe Beginner programmer doesn't really understand the warning. "Well, let's see. I compiled this function five times, and I got the same result each time. The result looks reliable to me. Looks like a spurious warning." The people who would benefit from the warning don't have the necessary background to understand it.

Sure enough, some time later, it came up again. Somebody asked why x ^= y ^= x ^= y doesn't work in C#, even though it works in C++. More proof that people write code that relies upon multiple side effects, and that they passionately believe that what they are doing is obvious and guaranteed.

Mapping a custom naked domain to your Azure App Service


A naked URL means that there is no www or any other sub-domain prefix before the domain name and domain extension.  I wrote an updated version on how to configure a common www based custom domain here.

There are numerous articles about this which are posted here, but I thought an update was in order.  To configure a custom domain for your Azure App Service, the following steps are required:  (Note: you can also purchase a custom domain right from within the portal itself, as described here.  If you do that, many of these complexities are removed.)

  • Create an A Record with your DNS provider
  • Add the naked custom domain to the Azure App Service

Create an A Record with your DNS provider

If you attempt to configure the naked domain without an A Record, the Azure portal will likely show an error similar to the following.

A records were found pointing to the following IP addresses: ###.###.###.###, etc...
Please create an A record pointing to the following IP address: ###.###.###.###
No TXT records were found.
Please create a TXT record pointing from ****.com to ****.azurewebsites.net

You need to access the portal of your DNS provider and enter both the A Record and the TXT records as requested.  The end result may look something like the following.  You will also need to create a second CNAME record called AWVERIFY.
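To illustrate, the resulting entries at your DNS provider might look like the following zone-file fragment (the IP address and hostnames are placeholders; use the exact values the portal shows you):

```
; contoso.com zone (illustrative)
@         IN  A      13.94.0.1                            ; IP shown in the portal
@         IN  TXT    "contoso.azurewebsites.net"          ; verification record
awverify  IN  CNAME  awverify.contoso.azurewebsites.net.  ; verification CNAME
www       IN  CNAME  contoso.azurewebsites.net.
```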

image

Figure 1, configure a naked custom domain, url with no www to an Azure App Service

Add the naked custom domain to the Azure App Service

In the Azure portal navigate to the Azure App Service to which you want to configure the naked custom domain to.  Then click on the Custom domain link, followed by the Add hostname link as shown in Figure 2.

image

Figure 2, add a naked custom domain to an azure app service

Enter the hostname and press the validate button.  Once validated, as shown in Figure 3, press the Add hostname button and you are all set.

image

Figure 3, add a naked hostname to an Azure App Service

If all works out then you see both the www and the naked domain on the Custom domain blade as shown in Figure 4.

image

Figure 4, configuring custom domains for an Azure App Service

Lync/Skype for Business の更新プログラム(CU) 一覧


Good evening. This is the Japan Lync Support Team.

1. Skype for Business Server 2015

Version KB Date
6.0.9319.281 4012620 2017/05/17 (Note: the KB listed is the one for the core components, as a representative example.)
6.0.9319.277 3061064 2017/02/13
6.0.9319.272 3061064 2016/11/04
6.0.9319.259 3149226 2016/06/30
6.0.9319.235 3061064 2016/03/18
6.0.9319.102 - 2015/11/17
6.0.9319.55 - 2015/06/19

 

2. Skype for Business 2016

Version KB Date
16.0.4561.1000 3213548 2017/07/05
16.0.4546.1000 3203382 2017/06/13
16.0.4534.1000 3178717 2017/05/09
16.0.4522.1000 3178717 2017/04/04
16.0.4510.1000 3178656 2017/03/14
16.0.4498.1000 3141501 2017/02/07
16.0.4471.1000 3127980 2016/12/06
16.0.4432.1000 3118288 2016/09/06
16.0.4405.1000 3115268 2016/07/05
16.0.4393.1000  3115087 2016/06/07
16.0.4339.1000 3114846 2016/03/08
16.0.4351.1000 3114696 2016/02/09
16.0.4324.1000 3114516 2016/01/12
16.0.4312.1000 3114372 2015/12/08
16.0.4300.1001 3085634 2015/11/10
16.0.4288.1000 2910994 2015/09/30

 

3. Lync Server 2013

Note: Only the latest version is available for download from the public site. However, in the case of security patches (Sec), the previous patch may also still be published.

Version KB Date
5.0.8308.992 2809243 2017/07/11
5.0.8308.987 4014154 2017/03/22
5.0.8308.984 3210166 2017/01/18
5.0.8308.974 3200079 2016/11/01
5.0.8308.965 2809243 2016/08/23
5.0.8308.956 3140581 2016/04/02
5.0.8308.945 3126637, 3126638 2016/01/07
5.0.8308.941 3121213, 3121215, 3120728 2015/12/15
5.0.8308.887 3051951 2015/05/01
5.0.8308.871 3131061 2015/03/19
5.0.8308.857 3018232 2014/12/12
5.0.8308.815 2937305 2014/09/23
5.0.8308.803 2986072 2014/09/08
5.0.8308.738 2937310 2014/08/05
5.0.8308.577 2905048 2014/01/08
5.0.8308.556 2881684 2013/10/07
5.0.8308.420 2819565 2013/07/01
5.0.8308.291 2781547 2013/02/27

 

4. Lync 2013 Client (Skype for Business)

The same versions can also be applied to Lync Basic and the VDI plug-in.

Version KB Date
15.0.4945.1000 3213574 2017/07/05
15.0.4937.1000 (Lynchelploc) 3191937 2017/06/13
15.0.4937.1000 3191939 2017/06/13
15.0.4927.1000 (Lynchelploc) 3191873 2017/05/02
15.0.4927.1000 3191876 2017/05/02
15.0.4919.1000 (Lynchelploc) 3172492 2017/04/04
15.0.4919.1000 3178731 2017/04/04
15.0.4911.1000 3172539 2017/03/15
15.0.4903.1001 3161988 2017/02/07
15.0.4893.1000 3141468 2017/01/03
15.0.4885.1000 3127976 2016/12/06
15.0.4875.1001 3127934 2016/11/01
15.0.4867.1000 3118348 2016/10/11
15.0.4859.1002 3118281 2016/09/06
15.0.4849.1000 3115431 2016/08/09
15.0.4841.1000 3115261 2016/07/05
15.0.4833.1000 3115033 2016/06/07
15.0.4809.1000 3114944 2016/04/12
15.0.4797.1000 3114732 2016/02/09
15.0.4787.1001 3114502 2016/01/07
15.0.4779.1001 3114351 2015/12/08
15.0.4771.1001 3101496 2015/11/10
15.0.4763.1001 3085581 2015/10/13
15.0.4753.1000 3085500 2015/09/08
15.0.4745.1000 3055014 2015/08/14
15.0.4727.1001 3054791 2015/06/09
15.0.4719.1000 3039779 2015/05/12
15.0.4711.1002 2889923 2015/04/14
15.0.4701.1000 2956174 2015/03/10
15.0.4693.1000 2920744 2015/02/10
15.0.4659.1001 2889929 2014/10/29
15.0.4649.1000 2889860 2014/09/09
15.0.4641.1000 2881070 2014/08/12
15.0.4623.1000 2850074 2014/06/10
15.0.4615.1001 2880980 2014/05/13
15.0.4605.1003 2880474 2014/04/11
15.0.4569.1508 2863908 2014/03/11
15.0.4551.1001 2817678 2013/11/12
15.0.4551.1005 2825630 2013/11/07
15.0.4517.1504 2817621 2013/08/13
15.0.4517.1004 2817465 2013/07/09
15.0.4517.1001 2768354 2013/06/11
15.0.4481.1004 2768004 2013/05/20
15.0.4481.1000 2760556 2013/03/20
15.0.4454.1509 2812461 2013/02/27

 

5. Lync Server 2010

Version KB Date
4.0.7577.728 2493736 2016/05/13
4.0.7577.726 2493736 2016/04/18
4.0.7577.713 3057803 2015/05/01
4.0.7577.710 3030726 2015/02/06
4.0.7577.230 2957044 2014/04/24
4.0.7577.225 2909888 2014/01/08
4.0.7577.223 2889610 2013/10/07
4.0.7577.217 2860700 2013/07/12
4.0.7577.216 2791381 2013/03/15
4.0.7577.211 2791665 2013/01/29
4.0.7577.206 2772405 2012/11/06
4.0.7577.203 2737915 2012/10/11
4.0.7577.199 2701585 2012/06/16
4.0.7577.198 2698370 2012/04/20
4.0.7577.197 2689846 2012/03/29
4.0.7577.190 2670352 2012/03/01
4.0.7577.189 2670430 2012/02/07
4.0.7577.188 2658818 2012/01/23
4.0.7577.183 2650982 2011/12/13
4.0.7577.183 2514980 2011/11/19
4.0.7577.170 2616433 2011/09/13
4.0.7577.167 2592292 2011/08/29
4.0.7577.166 2571546 2011/07/25
4.0.7577.137 2500442 2011/04/20

 

6. Lync 2010 Client

Version KB Date
4.0.7577.4534 4020732 2017/06/13
4.0.7577.4525 4010299 2017/03/15
4.0.7577.4521 3188397 2016/10/12
4.0.7577.4510 3174301 2016/08/05
4.0.7577.4484 3096735 2015/11/10
4.0.7577.4478 3081087 2015/ 9/ 8
4.0.7577.4476 3075593 2015/ 8/11
4.0.7577.4474 3072611 2015/ 7/ 7
4.0.7577.4456 3006209 2014/11/11
4.0.7577.4446 2953593 2014/ 6/10
4.0.7577.4445 2953593 2014/ 4/17
4.0.7577.4419 2912208 2014/ 1/8
4.0.7577.4409 2884632 2013/10/7
4.0.7577.4398 2842627 2013/ 7/12
4.0.7577.4392 2843160 2013/ 7/9
4.0.7577.4388 2827750 2013/ 5/14
4.0.7577.4384 2815347 2013/ 4/9
4.0.7577.4378 2791382 2013/ 3/14
4.0.7577.4374 2793351 2013/ 1/29
4.0.7577.4356 2737155 2012/10/11
4.0.7577.4109 2726382 2012/10/9
4.0.7577.4103 2701664 2012/ 6/16
4.0.7577.4098 2693282 2012/ 6/12
4.0.7577.4097 2710584 2012/ 5/14
4.0.7577.4087 2684739 2012/ 3/28
4.0.7577.4072 2670326 2012/ 3/1
4.0.7577.4063 2669896 2012/ 2/7
4.0.7577.4061 2670498 2012/ 1/28
4.0.7577.4053 2647415 2011/11/21
4.0.7577.4051 2514982 2011/11/19
4.0.7577.336 2630294 2011/10/19
4.0.7577.330 2624907 2011/10/11
4.0.7577.314 2571543 2011/ 7/25
4.0.7577.280 2551268 2011/ 5/24
4.0.7577.275 2540951 2011/ 4/27
4.0.7577.253 2496325 2011/ 4/4
4.0.7577.108 2467763 2010/ 1/20

Please also make use of the following resource:
https://technet.microsoft.com/en-us/office/dn788954.aspx

Disclaimer:
The content of this information (including attachments and links) is current as of the date of writing and is subject to change without notice.

 

Open Source at Microsoft


Thousands of Microsoft engineers use, contribute to and release open source every day across every platform, from the cloud to client operating systems, programming languages and more. So if you're struggling to find repos at http://github.com, there is now a search feature built into the http://opensource.microsoft.com site, which also lists the top trending repos.

image

Popular projects include Visual Studio Code, TypeScript, and .NET. Microsoft's open source code is released under Open Source Initiative-approved licenses such as MIT and Apache 2.0.

Explore Microsoft open source

Community resources

Discover Microsoft-released open source

At opensource.microsoft.com you can explore open source that Microsoft teams have released and are collaborating with the broader community of software engineers.

Browse opensource.microsoft.com

Microsoft Open Source Code of Conduct

The Microsoft Open Source Code of Conduct outlines expectations for participation in Microsoft-managed open source communities, as well as steps for reporting unacceptable behavior.

Microsoft Open Source Code of Conduct

Microsoft Contributor License Agreements

We appreciate community contributions to code repositories governed by Microsoft. By signing a contribution license agreement, we ensure that the community is free to use your contributions.

Microsoft contributor license agreement

Third-party disclosures

An archive of notices and source code for certain third-party components shipped with Microsoft products, in accordance with the corresponding licenses that contain disclosure obligations.

Review notices and archives


Other resources

Microsoft + Open Source Marketing

Open dialogue about openness at Microsoft – open source, standards, interoperability, and the people and companies who inspire our commitment.

Check out open.microsoft.com

.NET Foundation

The .NET Foundation is an independent organization to foster open development and collaboration around the .NET ecosystem. It serves as a forum for community and commercial developers alike to broaden and strengthen the future of the .NET ecosystem by promoting openness and community participation to encourage innovation.

About the .NET Foundation



How to Track Impediments in VSTS


This post is from Premier Developer consultant Assaf Stone.


Note: While this post discusses impediments in VSTS, everything mentioned can be applied to TFS as well.

Visual Studio Team Services (or VSTS) has great tools to support Scrum teams. The product owner can use the backlog and board to track the progress of individual teams or the entire product, at the Product Backlog Item (PBI) level, at the Feature level or even the Epic level, throughout the entire lifetime of the product. The developers can track the progress that they are making within the sprint, and see how their work (tasks) fit into the larger picture, by associating them with the PBIs that make up the product.

But what about the Scrum Masters? What tools do they have in VSTS to help them track their work and their progress?

What are Impediments?

According to Scrum.org's official Scrum guide, one of the services that a Scrum Master provides to the development team is the removal of impediments to the team's progress. An impediment is anything that causes a developer to be unable to make progress towards completing the sprint's goal. Whenever a developer has a problem that cannot be solved within the scope of the team, it is the Scrum Master's responsibility to remove it.

Impediments in VSTS

Visual Studio Team Services has a work item type dedicated to tracking impediments and the progress of their removal. For projects using the Scrum template, this work item type is called an Impediment. For projects using the Agile or CMMI templates, it is called an Issue. Regardless of template, both serve the same purpose: they record a problem, and their state machine tracks progress on resolving it.

Unfortunately, impediments and issues do not show up in VSTS’ backlogs or boards. Those are designed for tracking progress on the delivery of the product, and the Impediment work item type is not included. That said, how should a Scrum Master and the Scrum team track these impediments, especially in large distributed projects, where face-to-face communication and jotting a note on a pad is not a viable solution?
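In the meantime, one lightweight option (my own sketch, not from Assaf's post) is a flat-list work item query. In WIQL, a query that surfaces open impediments might look like:

```sql
SELECT [System.Id], [System.Title], [System.State], [System.AssignedTo]
FROM WorkItems
WHERE [System.TeamProject] = @project
  AND [System.WorkItemType] IN ('Impediment', 'Issue')
  AND [System.State] <> 'Closed'
ORDER BY [System.CreatedDate] DESC
```

The same query can be built visually in the Queries hub and pinned to a dashboard or chart.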

Continue reading on Assaf’s blog here.

Wednesday Featured Forums: Microsoft Office Forum


Greetings Everyone

Welcome to another Wednesday featured forum, today we will be looking at the Microsoft Office Forum.

Like the Windows Server forum, the Microsoft Office forum has been around for a very long time as well, with the product going all the way back to Office 1.5, released in March 1991 (WOW), and we are now up to Office 2016. Each version brings in new features, but its own list of issues as well. That's not to say it's a bad thing, but you might, for example, install a Windows update that blocks PDF attachments in Outlook.

Office, I personally think, is one of the main applications that users work with every day on their computers, whether it's checking email in Outlook or putting together a sales presentation in Excel or PowerPoint. Every single one of these applications has an important function in users' daily operations; here are some of them:

  • Word
  • Excel
  • PowerPoint
  • OneNote
  • Outlook
  • Project

These days, the entire suite is available on all major mobile operating systems, such as Android, Windows Mobile, and iOS. Productivity is no longer tied to a desktop or laptop; you have it with you on the go as well. Now you can respond to your boss's email by reading your mail on a tablet, without having to wait until you are in the office. Talk about a major shift in cloud technology.

If you are an Office 365 user and have a subscription that includes Office, you will be able to use the latest version of Office available to you.

Users on Mac also have the latest version of Office available to them.

If you are eager to test-drive new features, you can subscribe to the Insider program, the same as with Windows, and get the fast/slow releases before they become publicly available.

If we take a look at the forums for Office, this forum is huge: 80,000+ pages where people ask questions. To access this forum, you can click the link below:

Some of the questions asked in this forum:

You will see that some of the questions asked have more than 40 responses; this is great community contribution. Some questions relate to activation, Outlook, security patches, and many other topics. I encourage you to be part of this, from proposing answers to moderating the forum if you are a Moderator.

With so many shortcuts, tweaks, and options available, please create wiki pages that will help the community. Share your knowledge of the Office products.

As with the other forums, there are a lot of MVPs and Microsoft staff answering these questions.

For those interested in training, you can look up who offers training in your region.

That is all for now folks.

All the Best!

Edward

Model Comparison and Merging for Analysis Services

Relational-database schema comparison and merging is a well-established market. Leading products include SSDT Schema Compare and Redgate SQL Compare, which is partially integrated into Visual Studio. These tools are used by organizations seeking to adopt a DevOps culture to automate build-and-deployment processes and increase the reliability and repeatability of mission-critical systems.

Comparison and merging of BI models also introduces opportunities to bridge the gap between self-service and IT-owned “corporate BI”. This helps organizations seeking to adopt a “bi-modal BI” strategy to mitigate the risk of competing IT-owned and business-owned models offering redundant solutions with conflicting definitions.

Such functionality is available for Analysis Services tabular models. Please see the Model Comparison and Merging for Analysis Services whitepaper for detailed usage scenarios, instructions and workflows.

This is made possible with BISM Normalizer, which we are pleased to announce now resides on the Analysis Services Git repo. BISM Normalizer is a popular open-source tool that works with Azure Analysis Services and SQL Server Analysis Services. All tabular model objects and compatibility levels, including the new 1400 compatibility level, are supported. As a Visual Studio extension, it is tightly integrated with source control systems, build and deployment processes, and model management workflows.

Schema-Diff

Recommended Lifecycle Services (LCS) Blogs

Hi everyone, I'm sharing some blogs with information related to Lifecycle Services (LCS); I hope you find them useful.

How do I login to the new AX as a demo user persona?

https://blogs.msdn.microsoft.com/lcs/2016/03/17/how-do-i-login-to-the-new-ax-as-a-demo-user-persona/

Scratching your head on how to use your existing O365 account for AX7?

https://blogs.msdn.microsoft.com/lcs/2016/03/07/scratching-your-head-on-how-to-use-your-existing-o365-account-for-ax7/

How can I setup a Dynamics AX solution trial instance in Azure with my customization and demo data?

https://blogs.msdn.microsoft.com/lcs/2016/03/03/how-can-i-setup-a-dynamics-ax-solution-trial-instance-in-azure-with-my-customization-and-demo-data/

New Dynamics AX solutions trial - How can I invite a prospect to my trial instance?

https://blogs.msdn.microsoft.com/lcs/2016/03/03/new-dynamics-ax-solutions-trial-how-can-i-invite-a-prospect-to-my-trial-instance/

February 2016 release notes

https://blogs.msdn.microsoft.com/lcs/2016/02/25/february-2016-release-notes/

Regards.

Announcing the availability of Keyword Planner API

We are excited to announce that Keyword Planner features are now available in the Ad Insight service. The Keyword Planner APIs (like Keyword Planner in the Bing Ads web application) enable you to identify the most effective ad groups and keywords to boost your campaign performance.

 

The new GetKeywordIdeas operation suggests new ad groups and keywords based on your existing keywords, website, and product category. You can also request historical statistics for keywords, e.g., monthly searches, competition, average CPC, and ad impression share. You can further set various targets, including location, language, and network, as well as filters such as keywords to include or exclude, to customize the results. Additionally, you can use the returned suggested keyword bids as input to the new GetKeywordTrafficEstimates operation. Note: you can use the GetKeywordIdeaCategories operation to get product category details if you want to use the category criterion for keyword ideas.

 

The new GetKeywordTrafficEstimates operation provides traffic estimates for keywords, e.g., average CPC, average position, clicks, CTR, impressions, and total cost. As input, you provide the bid and an optional daily budget, along with targeting settings and negative keywords.

Supported Targeting:

  • Location: US, UK, CA, AU, FR, and DE as whole countries, plus fine-grained locations down to the city level within these countries
  • Language: English, French, and German

 

We encourage you to leverage the Keyword Planner API to get keyword suggestions with historical statistics and traffic estimates for the supported locations. However, if your market is not supported, you can continue to use the existing Ad Insight operations:

 

The new GetKeywordIdeas and GetKeywordTrafficEstimates operations are already supported in the June release of the Bing Ads SDKs, so now is a good time to upgrade! As always, if you have any questions, please feel free to contact support or post in the Bing Ads API developer forum.
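To make the workflow concrete, here is a minimal Python sketch of how the two operations chain together: ideas with suggested bids come back from GetKeywordIdeas, and those bids feed the traffic-estimate request. The payload field names and the sample idea values are illustrative assumptions, not the actual Bing Ads SDK surface; consult the Ad Insight service reference for the real request contracts.

```python
# Hypothetical sketch of the Keyword Planner workflow. The payload shapes
# below are assumptions for illustration; the real Bing Ads SDK exposes
# GetKeywordIdeas / GetKeywordTrafficEstimates as Ad Insight operations.

def build_keyword_ideas_request(seed_keywords, location_id, language):
    """Assemble a GetKeywordIdeas-style request: seed keywords plus
    location and language targeting (field names assumed)."""
    return {
        "Operation": "GetKeywordIdeas",
        "SearchParameters": {
            "Keywords": list(seed_keywords),
            "LocationId": location_id,  # numeric location id (assumed)
            "Language": language,
        },
    }

def build_traffic_estimates_request(ideas, daily_budget):
    """Feed each idea's suggested bid into a GetKeywordTrafficEstimates-style
    request, as the post describes."""
    return {
        "Operation": "GetKeywordTrafficEstimates",
        "KeywordBids": [
            {"Keyword": i["Keyword"], "Bid": i["SuggestedBid"]} for i in ideas
        ],
        "DailyBudget": daily_budget,
    }

# Example: suppose GetKeywordIdeas returned these ideas (fabricated values).
ideas = [
    {"Keyword": "running shoes", "SuggestedBid": 1.25},
    {"Keyword": "trail shoes", "SuggestedBid": 0.90},
]
estimate_req = build_traffic_estimates_request(ideas, daily_budget=50.0)
print(estimate_req["KeywordBids"][0]["Bid"])  # 1.25
```

The point of the chaining is that you never have to guess starting bids: the suggested bids returned with each keyword idea are exactly the shape the traffic-estimate operation expects as input.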
