
Creating a 3D model for 3D App Launcher


Bored with the classic 2D tile representing your Windows Mixed Reality app in the Cliff House? Go to the next level:

Use a 3D model that characterizes and differentiates your app!

 

Here is a summary of the steps to achieve this.

1. Make your model ready for Windows Mixed Reality

    • From your favorite apps/tools, export a 3D model as glTF 2.0 with the .glb extension (as an example, I took a model from Remix 3D and exported it with Paint 3D)
    • Download WindowsMRAssetConverter.exe from the glTF-Toolkit releases on GitHub - https://github.com/Microsoft/glTF-Toolkit/releases
    • Simply run the WindowsMRAssetConverter.exe tool with your model (.glb file) as a parameter:
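
For example, a minimal invocation (assuming the converter and your MyModel.glb file are in the current directory; by default the output is written next to the input as MyModel_converted.glb):

    WindowsMRAssetConverter.exe MyModel.glb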

         

2. Add your converted model as content to the App project

    • Build your Windows Mixed Reality project from Unity, generating the Visual Studio C# solution

    Unity settings for building the C# Visual Studio solution

    • Open the generated Windows Mixed Reality C# solution in Visual Studio. Right-click the UWP project in the SOLUTION EXPLORER window and choose ADD / EXISTING ITEM… to add your converted .glb file

    Right click on the project / ADD / Existing Item...

    • Make sure that the BUILD ACTION for the added .glb file is set to CONTENT

    Build Action: Content

         

3. Use your converted model in the App manifest

    • Open the Package.appxmanifest with a text editor (or right-click it in Visual Studio and choose VIEW CODE)
    • Add the uap5 schema to the Package element at the beginning of the file:
    xmlns:uap5="http://schemas.microsoft.com/appx/manifest/uap/windows10/5"
    • Also add uap5 to the IgnorableNamespaces list:
    IgnorableNamespaces="uap uap2 uap5 mp"

    The Package.appxmanifest file will then start like the following:
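
A sketch of how the opening Package element might look after these edits (the other namespace declarations shown here are typical Visual Studio output and may differ in your file):

    <Package
      xmlns="http://schemas.microsoft.com/appx/manifest/foundation/windows10"
      xmlns:mp="http://schemas.microsoft.com/appx/manifest/phone/windows10"
      xmlns:uap="http://schemas.microsoft.com/appx/manifest/uap/windows10"
      xmlns:uap2="http://schemas.microsoft.com/appx/manifest/uap/windows10/2"
      xmlns:uap5="http://schemas.microsoft.com/appx/manifest/uap/windows10/5"
      IgnorableNamespaces="uap uap2 uap5 mp">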

    • Modify the DefaultTile element to include a uap5:MixedRealityModel element like:
    <uap5:MixedRealityModel Path="MyModel_converted.glb"/>

    The entire uap:DefaultTile element will look like this:
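
A sketch of the resulting element (the logo attributes are illustrative; keep whatever your manifest already declares):

    <uap:DefaultTile ShortName="MyApp" Wide310x150Logo="Assets\Wide310x150Logo.png">
      <uap5:MixedRealityModel Path="MyModel_converted.glb" />
    </uap:DefaultTile>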

           

4. Build and deploy the app for testing

    • Build your UWP project for either x86 or x64
    • Lastly, deploy the app to the local machine (right-click the project in SOLUTION EXPLORER and then DEPLOY)

          Deploy app

           

Here is the result when we place the App tile in the Cliff House 🎊🎉 (Tomato model from Remix 3D)

          3D App Launcher

           

          Happy 3D App Launcher modeling 😉

          @sbovo for the 💻 🎮 Windows AppConsult team.

           



Azure Media Functions (Serverless Functions)


I've been working on a set of Azure serverless functions for Azure Media Services to complement the original series developed in this GitHub repo.

I focused on building a set of functions that can migrate assets from AWS S3 to Azure Media Services when integrated with the Azure Logic App workflow built in the same GitHub repo mentioned above.

Another set focuses on migration from Limelight.

The last set focuses on improving how you extract Azure Media Services telemetry, by migrating it from an Azure Storage table to a single Azure SQL DB table.

You can find the source code in my GitHub repo.

          Let me know if you have more ideas to build up in the functions.


          The Food Truck Paradigm : Why should I trust open source software?


          In this post, App Dev Manager Daniel Setlock reflects on the trust of open source software using the Food Truck Paradigm.


          Introduction

It is somewhat shocking that in 2017 I still receive questions about whether open source software, libraries, plug-ins, packages, etc. can be trusted, and about the pushback such things receive from cyber security personnel at organizations. While everyone has their own job to do and their own criteria for success, limiting the availability of a robust and organized code base or tools, due to a lack of understanding of what open source actually means, only hinders and never helps.

          When asked about this topic, my go-to analogy is a comparison with deciding whether to eat at a food truck or not. Strange, right? But stay with me, as I explain how these things, while not necessarily related, can be perceived with the same mentality as to whether or not it’s a good idea to use a JavaScript library on someone’s blog, or eat the ceviche being served out of the back of a van.

Why should I trust a food truck not to poison me?

Food trucks can get a bad rap, but I have had some absolutely delicious dishes out of a white panel van. For some people whose passion is food, it makes economic sense. Rather than the big presence of a brick-and-mortar store and all the overhead that it brings into the business, a freelance sort of thing is appealing.

          If a food truck comes to the same location regularly and you see other people partaking of the fare, it’s a reasonable assumption that it is safe to ingest. Should you be the first to try the brand-new food truck? Maybe not, unless you are the adventurous sort, but an established presence can be thought of as a safe bet for your dietary needs.

          Food trucks that make people sick don’t stick around. They fade quickly and it only takes a single bad experience. People don’t make much of a habit of posting positive reviews about an experience, but will take any opportunity to vent about a poor experience, and in the age of social media and connectivity, word of mouth has been replaced with hashtags and memes. No food truck tagged with #dysentery is going to last long.

          Food Truck vs OSS

          The analogy makes a lot of sense to me. If you have a consistent, well known, presence serving reliable wares (whether that is food or code) that has catered to numerous individuals, you have fostered trust inside the community and that trust is rewarded by people ordering at your window, or referencing your library.

          If a food truck poisons someone, or a code library goes rogue and allows malicious actions to take place, both cease to be used extremely quickly. Additionally, for an open source code base, library, plug-in, etc. you have the ability to make it your own. If it is saved locally, rather than referenced through a CDN, you know unequivocally that you have the authoritative version that you have used in the past and you trust. Think of it like this, if you forget your lunch one day, and have to venture out to the food trucks. You get your food, eat it, don’t get sick, and go on about your day. It is similar to saving the code and making it your own. You took a chance, and it worked. I am not saying that every time you use OSS, or a food truck, you are playing with fire, but it is simply another way to think about it. As a result, though, you now own that code, and whether that version had vulnerabilities or not, it must be maintained, Whether it is maintained through contributing to the code base or updating the version as new functionality or increased security is added. Studies have shown that individuals who pursue art for pleasure and passion, rather than profit, produce higher quality results. If you have a community of thousands of software developers contributing to an idea because it’s their passion, and a much smaller group of people creating an idea for profit, which idea would you think produces better results?

          Conclusion

Shunning something as pervasive as jQuery or Bootstrap, which have become industry standards, and deeming it unsafe because it is open source strikes me as silly in the extreme. I have seen organizations decide to pursue .NET 4.6 instead of .NET Core, or choose between competing Entity Framework versions, simply because one is open source and the other is not. While there are many reasons to prefer one framework over another, success criteria that rule out open source, unless it is deemed less secure or insufficient in some other way, should be evaluated closely.

Open source software is not inherently insecure, malicious, or riddled with bugs. It could be someone's passion project, a good idea that lacked the funding of a larger implementation, or possibly an individual simply wanting to contribute to the larger community. Likewise, eating at a food truck is not the same as purposefully ingesting improperly cooked food. It could be someone's passion, and this is how they pursue it. Ultimately the decision to use OSS for your project, or eat at a food truck on your lunch break, is up to you, with all the risks and benefits that brings.


          Premier Support for Developers provides strategic technology guidance, critical support coverage, and a range of essential services to help teams optimize development lifecycles and improve software quality.  Contact your Application Development Manager (ADM) or email us to learn more about what we can do for you.

About the Azure IoT, Cognitive, Bot Service forum


Can you believe January is almost over already? Time flies! How is everyone doing?

This is Ishizawa from Cognitive Services developer support.

A forum category for Azure IoT, Cognitive, and Bot Service has been added to the MSDN forums that we manage and operate.

Welcome to the Azure IoT, Cognitive, Bot Service forum

https://social.msdn.microsoft.com/Forums/ja-JP/f49b0282-78c6-4878-b1aa-2a81b57f0cf4/azure-iot-cognitive-bot-service-?forum=azureiotcognitivebotja

As explained in the article below on this blog, a support plan of "DEVELOPER" or higher is required to open a technical support case for development issues.

If you don't have a support plan, for example, we hope you will make good use of this forum.

How to contact technical support for Cognitive Services

https://blogs.msdn.microsoft.com/jpcognitiveblog/2017/12/24/cognitive-services-の技術サポートのお問い合わせ方法/

Have a happy Setsubun!

Cognitive Services developer support team, 石沢 望夢 (Ishizawa)


Introducing the Microsoft Campus - BETT, Day One



Bett 2018 is finally upon us; thousands of educators will be flocking to ExCeL London to see what this year has in store for transforming education. Bringing the whole education community together, Bett aims to inspire you to do amazing, game-changing things, helping shape the future of education! Microsoft, as a global Bett partner, is very excited to be back this year, ready to deliver inspiring content that shares next-generation ideas and innovative, technology-led classroom practices in the UK and around the world.

          Microsoft’s educator-led sessions will showcase the ideas, practices and technologies that have the most impact on educational outcomes, along with inspiring concepts to help develop 21st century skills and find solutions to global challenges.

Here are today's highlights on the agenda to ensure you don't miss any of the truly hands-on experience with Microsoft. We'll be showcasing the latest in future-ready education tools and solutions right across Bett, so here's what not to miss today!


          Day One

           

          Microsoft has lots in store for the first day of Bett, with an exclusive Keynote Session from Anthony Salcito, Microsoft's VP of Worldwide Education, in the arena talking about 'The Future of Digital Learning' at 3.15pm. 


          Microsoft Campus Stand E300

           

          We also kick off Bett at the Microsoft stand (E300) with Director of Education in the UK, Ian Fordham, talking about 'What's New in Microsoft Edu?'

          10.30am-11.00am and 2.00pm-3.00pm daily in the Microsoft Learn Live Theatre.

           

           


          Schools Theatre 

           

Over in the Schools Theatre, Skype Master Teachers Amy Rosenstein and Steve Auslander will be showcasing Skype in the Classroom in their session 'Connecting 1000s of classrooms globally across languages & cultures'

          Wednesday 24th January 12.45pm-1.15pm


           

Don't forget to swing by the Microsoft Campus at 4pm today to catch our 'What's New in EDU' live streaming. Microsoft's monthly series that rounds up What's New in EDU will be recorded live every day from BETT!

           

           


Download the Microsoft Map to access our easy-to-read agenda, with a simple map showing all of our daily sessions across the different theatres.

           


Wanted: Kids learning to program – mentors needed


Do you enjoy working with kids? The CoderDojos are looking for mentors!

The CoderDojos are a club for children and young people between 8 and 17, where they can learn about technology and programming. Various exercises are worked through in regular meetups.

We are in contact with the CoderDojos in Vienna and Linz: https://wien.coderdojo.net/ and http://coderdojo-linz.github.io/, and on February 16 two related events will take place at our office:

1. The CoderDojo Hackathon, where, after a tour of the Microsoft Learning Hub, new exercises for the kids and teens will be developed: https://www.eventbrite.de/e/coderdojo-wien-hackathon-at-microsoft-tickets-41918976788?aff=erelpanelorg
2. The actual CoderDojo for the kids themselves: https://www.eventbrite.de/e/coderdojo-wien-at-microsoft-tickets-41919099154

For both events we are still looking for people who want to help create new exercises for the kids, and who would ideally also join the CoderDojos from time to time to support the interested kids.


          New Features in Visual Studio Code IoT Edge Extension 0.1.3 & IoT Toolkit 0.5.0


At Connect(); 2017 last November, we announced the public preview of Azure IoT Edge. Now you can bring the intelligence of the Cloud right to the IoT Edge, as well as easily create and manage business logic for your devices. The new Azure IoT Edge extension for Visual Studio Code, along with the updated Azure IoT Toolkit extension, will make your IoT Edge development a real pleasure, providing a set of functionalities including:

          • Creating new IoT Edge projects
          • Building and publishing IoT Edge modules
          • Debugging IoT Edge modules locally
          • Managing IoT Edge devices in IoT Hub
          • Deploying IoT solutions to IoT Edge devices
          • Stopping and restarting IoT Edge

Recently, an updated version of the Visual Studio Code IoT Edge and IoT Toolkit extensions, with a bunch of new features for Azure IoT Edge developers, became available in the marketplace. Here is a quick walkthrough of what's new:

          Edge device and module list

          The first thing you might notice after the update is the new IoT Hub device list. Edge devices now have a new icon which helps you distinguish them from general IoT Hub devices. In addition, you will see a list of Edge modules deployed to an Edge device by clicking on the Edge device. The color of the icon also gives you a quick idea of the module status.

          To view the module twin, simply right click on a module and select "Get Module Twin".

          Deployment manifest snippet

The deployment manifest is a JSON document that describes the IoT Edge modules to be deployed and the various configurations and properties of those modules. Previously, you could get a deployment manifest template as a starting point with the "Generate Edge deployment manifest" command. In this update, we've added powerful code snippet features to streamline the deployment manifest authoring experience.

          Module snippet

When you want to add a module to your IoT Edge solution, place your cursor in the modules property; the snippet can be triggered by typing a quotation mark or by using the IntelliSense shortcut (Ctrl+Space or Command+Space by default). You can move between cursor locations inside the snippet by pressing the Tab key. For the status and restartPolicy properties, you can see a list of available options and pick one of them from the list.

          Route snippet

Adding a new route is also easier thanks to the route snippet. Place the cursor in the routes property, and you can trigger the route snippet in a similar way to the module snippet. What's cool about this is that the route snippet will automatically load the names of the modules currently added in the deployment manifest.
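
To give an idea of what the snippets produce, here is a minimal sketch of the relevant parts of a deployment manifest (the module name, container image, and route are illustrative, not output copied from the extension):

    "modules": {
      "filtermodule": {
        "version": "1.0",
        "type": "docker",
        "status": "running",
        "restartPolicy": "always",
        "settings": {
          "image": "myregistry.azurecr.io/filtermodule:latest",
          "createOptions": "{}"
        }
      }
    },
    ...
    "routes": {
      "sensorToFilter": "FROM /messages/modules/tempSensor/outputs/* INTO BrokeredEndpoint(\"/modules/filtermodule/inputs/input1\")"
    }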

          After you finish editing your deployment manifest, don't forget to apply it by right clicking on the Edge device which you want to deploy it to and selecting "Create deployment for Edge device".

          More additions

• We've added commands to create an IoT Hub and IoT Hub devices (including Edge devices). With this addition, you now have the option to simulate an Edge device right in your favorite editor. We will have another post about the end-to-end experience soon.
• A new command to log in to a container registry has been added.
• The "Build IoT Edge module" command has been added to F#'s .fsproj file.

          What's next

We believe these new features will make Edge developers' work even easier, and we also have tons of new ideas on the drawing board. Stay tuned!

          What Root Certificates exist on an Azure App Service, CA Root


As you may already know, SSL/TLS is offloaded on the Front Ends (*), and this is where certificates are validated (AFAIK). There is no way for you to access those machines to dump out what CAs are there. The next best thing I can think of is to dump them out via the KUDU PowerShell console. I discuss KUDU here.

          After you login to KUDU, open the PowerShell console, as illustrated in Figure 1.


          Figure 1, what CAs exist on Azure App Services

Execute this PowerShell command, which dumps out the certificate store names; see Figure 2.

dir Cert:\LocalMachine | select Name


          Figure 2, what certificates exist on an Azure App Service

Then, if I want to look at the trusted root certificates, I execute another PowerShell command; see Figure 3.
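
The exact command was only visible in the original screenshot; something like the following should reproduce it (whether the store shown was AuthRoot or Root is an assumption on my part):

    # List the trusted root CAs (store name is an assumption)
    dir Cert:\LocalMachine\AuthRoot | select Subject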


          Figure 3, what Authorized Root Certificates exist on an Azure App Service

I would expect the output seen in Figure 3 to be the same as if you were to start CERTMGR, add the Local Computer store, and navigate to Trusted Root Certification Authorities -> Certificates, as seen in Figure 4.

Figure 4, CERTMGR Trusted Root Certification Authorities

          You cannot add Root Certificates to an App Service.  You can add intermediate certificates, see here.

Like I mentioned at the beginning, this is what you would see on the App Service machine that is running your application; it is not the place where you would expect to see your SSL/TLS certificate. Those are installed and configured on the Front End, see here.

          Also, you can install public certificates on Azure App Service now.  See here "App Service Certificates now supports public certificates (.cer)".

Note that you only have access to store into the CurrentUser\My certificate store, which can then be retrieved via code on an Azure App Service like the following:

          var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
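
A slightly fuller sketch of reading such a certificate back (the thumbprint is a placeholder; also note that the WEBSITE_LOAD_CERTIFICATES app setting, for example set to *, must be configured for uploaded certificates to be loaded into the store):

    using System;
    using System.Security.Cryptography.X509Certificates;

    class CertReader
    {
        static void Main()
        {
            // Open the only store the App Service sandbox lets you read
            var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
            store.Open(OpenFlags.ReadOnly);

            // Look the certificate up by thumbprint (placeholder value)
            var certs = store.Certificates.Find(
                X509FindType.FindByThumbprint, "YOUR-CERT-THUMBPRINT", false);

            if (certs.Count > 0)
                Console.WriteLine(certs[0].Subject);

            store.Close();
        }
    }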
          

*NOTE: If you are going to do some work with certificates on the Azure platform, seriously consider Azure Key Vault. Read about that here and here: "Get started with Azure Key Vault certificates".

          No response from bot emulator after the new update to Bot Framework/Azure Bot Service?


Why can't I debug with my bot emulator after the new update to Bot Framework/Azure Bot Service?

Did Microsoft just break Bot Framework?


          Not really. But I spent a good few hours searching for the cause.

          The cause?

          bot.set('storage', tableStorage);

          Comment this little guy out when you’re doing a local debug. Upon publish, re-enable it.

          //bot.set('storage', tableStorage);
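
If you'd rather not comment and uncomment by hand, a small sketch along these lines (the NODE_ENV check is just one possible convention, not part of the original post) does it automatically:

    // Only use Azure Table storage when not debugging locally
    if (process.env.NODE_ENV === 'production') {
        bot.set('storage', tableStorage);
    }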

          You’re welcome if this save you a couple of minutes/hours.

          Cannot get Microsoft Teams @mentions to work?


          Checking out the code at the official documentation?

          Node.js example code: Check for and strip @bot mention

          If it doesn’t work for you, don’t worry. I have found the cause.

          You just have to declare var botId = "YOUR BOT ID";

           

Also, use session.message where the documentation's sample references message.text. Good luck.
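
For reference, a minimal sketch of the whole thing (this assumes the botbuilder v3 message schema, and the dialog body is illustrative rather than the documentation's exact sample):

    var botId = "YOUR BOT ID";

    bot.dialog('/', function (session) {
        var text = session.message.text;
        // Remove any @mention entities that refer to the bot itself
        (session.message.entities || [])
            .filter(function (e) {
                return e.type === 'mention' && e.mentioned && e.mentioned.id === botId;
            })
            .forEach(function (e) { text = text.replace(e.text, '').trim(); });
        session.send('You said: ' + text);
    });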

          Simplify data access using de-normalized models


Classic relational databases enable you to create highly normalized data models whose schemas might contain a lot of tables. Logical entities are broken into several tables, and every complex property of the primary entity (for example, a list, array, or collection) is placed into a separate table.

An example of a simple Person entity that has many email addresses and phone numbers is shown in the following diagram:

This simple but still highly normalized data model is optimized for a large number of concurrent users who can update any of the tables in the diagram. We can have one user who is updating a Person row, another who is concurrently updating EmailAddress for the same person, a third who is updating PersonPhone, and a fourth who is updating the PhoneNumberType table, all without affecting each other. Furthermore, we can have several users updating different email addresses for the same person without affecting each other, because locks are applied at the row level. As we can see, this model is designed for highly concurrent updates of the same logical records.

The drawback of this approach is that it requires many JOINs or separate queries to retrieve all the necessary data for a single person. Also, in order to insert a new person that is physically split across several tables, you would need to use transactions to update many tables, and in many cases follow the parent/child order defined by referential integrity, read the primary key of the parent row before inserting the child row, lock several tables, etc.

          There are many workloads that cannot fully leverage this design because they have the following characteristics:

1. There is no large number of separate users updating fine-grained pieces of information for the same entity at the same time.
2. Users who read information about the primary/logical entity (Person) usually must read all related information.
3. Users update the logical entity in one transaction, not individual pieces (emails, phones). Complex entities are updated in batches and involve inserts/updates both in the primary table (Person in this case) and all related secondary tables, which might cause locking issues.

In the schema displayed above, we can notice that email addresses and phone numbers for persons are not frequently changed (how many times have you changed your email address or phone number in the last month?). For these types of workloads, a highly normalized schema doesn't provide the main high-concurrency benefit but introduces the cost of reading and updating highly normalized data.

Architects sometimes choose NoSQL databases for these kinds of models and workloads, because NoSQL collections enable them to physically store logical entities as a single record (usually formatted as JSON). However, this choice is a problem if some parts of the schema must stay in SQL Database, because the application layer would then need to join relational and non-relational data from different sources.

SQL Server and Azure SQL Database are general-purpose multi-model databases that enable you to fine-tune your models and combine both relational and non-relational design concepts in the same database. You can identify the parts of the database schema that do not benefit from a highly normalized design and de-normalize them into hybrid relational + semi-structured tables that are optimized for the workload.

In the example above, instead of dividing person-related data that will not be frequently updated into separate tables, you can store this information in the Person table as collections formatted as JSON arrays.

| ID | FirstName | LastName | Emails | Phones |
|---|---|---|---|---|
| 274 | Stephen | Jiang | [{"EmailAddress":"stephen0@adventure-works.com"},{"EmailAddress":"stephen.jiang@outlook.com"}] | [{"PhoneNumber":"112-555-6207","PhoneNumberType":"Work"},{"PhoneNumber":"238-555-0197","PhoneNumberType":"Cell"},{"PhoneNumber":"817-555-1797","PhoneNumberType":"Home"}] |
| 275 | John | Finley | [{"EmailAddress":"johnf@adventure-works.com"}] | [{"PhoneNumber":"112-555-6207","PhoneNumberType":"Work"}] |

This is equivalent to the design approach used in NoSQL databases, where the primary logical entity and all related information are placed together. Access operations in this design become simple:

          SELECT *
          FROM Person
          WHERE BusinessEntityID = @id;
          
          INSERT INTO Person(FirstName, LastName,Emails,Phones)
          VALUES (@firstName, @lastName,@emails,@phones);
          
          UPDATE Person
          SET Emails = @emails
          WHERE BusinessEntityID = @id;
          
          DELETE Person
          WHERE BusinessEntityID = @id;

Instead of transactions and locks across several tables, you can have single atomic requests that update entities at the logical level.

          With native JSON functions in SQL Server 2016, you can even perform fine-grained updates in the collections of entities:

UPDATE Person
SET Emails = JSON_MODIFY(Emails, '$[2].EmailAddress', @email)
WHERE BusinessEntityID = @id;

In this case, we are updating the third email address in the collection of emails (JSON array indexes are zero-based) using the parameter @email. Although updating a JSON field with the JSON_MODIFY function cannot match the performance of updating scalar columns in a separate table, this might be a valid trade-off if most of the read/write queries run faster.
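If you later need the collection back as rows (for example, to filter on individual addresses), OPENJSON can re-normalize it on the fly. A quick sketch, using the columns from the example above:

SELECT p.BusinessEntityID, p.FirstName, p.LastName, e.EmailAddress
FROM Person p
CROSS APPLY OPENJSON(p.Emails)
    WITH (EmailAddress nvarchar(128) '$.EmailAddress') AS e;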

The FOR JSON clause in SQL Server 2016+ enables you to easily de-normalize parts of your highly normalized schema using a simple T-SQL query:

UPDATE Person
SET Emails = (SELECT EmailAddress FROM EmailAddresses e WHERE e.BusinessEntityID = Person.BusinessEntityID FOR JSON PATH)

          Conclusion

SQL Server and Azure SQL Database enable you to combine both relational and non-relational models in the same database. If you identify parts of the database schema that are suitable for NoSQL-style design models, you don't need to export those parts into a separate NoSQL database. You can leverage the same design concepts without going outside the database.

          Learn about quick school PC set up @ Microsoft Training Academy, Bett 2018


Visit the Microsoft Training Academy, hosted on the Microsoft Campus at stand E300, for daily sessions run by our Microsoft Learning Consultants & Innovative Educators, giving you a hands-on opportunity to learn about the latest and greatest tools being delivered to drive student and teacher outcomes - with repeated sessions throughout each day so you won't miss out!


Your Cloud Connected Classroom - Wednesday to Saturday, 11.00am and 2.30pm daily

          Session Overview

          The main components of the session can be broken down into:

• Set up School PCs app - for setting up your new Windows 10 devices and joining them to Azure Active Directory (aka the Microsoft Cloud).
• Intune for Education - for managing your cloud-connected devices in an easy way.
• Education Store - your purpose-built app store from Microsoft to deploy software to your cloud-connected devices.
• Windows 10 S - the new cloud-based Windows 10 OS especially built for cloud-connected devices in the classroom.

Follow the step-by-step Sway tutorial below from presenter Kevin Sait, Microsoft Learning Consultant; it will all be demoed live in the Microsoft Training Academy session, with additional Q & A.


           




          Visit the Microsoft Experience Bett 2018 page to find out more and get signed up.


          Report the value of physical locations


          Key concepts

           

          Inventory dimension

          An inventory dimension can have a Physical value and a Financial value. The setting of the Physical value controls whether a dimension is active for Inventory management and Warehouse management. The setting of the Financial value controls whether a dimension is active for Inventory accounting in Cost management.

          Note: An inventory dimension can have an active Physical value but no Financial value. However, it can’t have an active Financial value but no Physical value.

          Cost object

The term cost object was introduced in Microsoft Dynamics 365 for Finance and Operations, Enterprise edition. It represents a key concept that is used in the management of business costs. A cost object is an entity that costs and quantities are accumulated for. For example, a cost object entity can be either a product or product variants, such as variants for style and color.

          A cost object is derived from the item ID and the inventory dimensions that have an active Financial value.

There are three groups of inventory dimensions: product, storage, and tracking. Each group can have a set of dimensions associated with it. For each dimension, you can set up the following inventory dimension values.

| Product dimension group | | Storage dimension group | | Tracking dimension group | |
|---|---|---|---|---|---|
| Dimension | Value | Dimension | Value | Dimension | Value |
| Color | Financial by default | Site | Financial by default | Batch | Physical and/or Financial |
| Size | Financial by default | Warehouse | Physical and/or Financial | Serial | Physical and/or Financial |
| Configuration | Financial by default | Location | Physical | Owner | Financial by default |
| Style | Financial by default | Inventory status | Physical | | |
| | | License plate | Physical | | |

          Example: Define a cost object

          When an item is created, a set of inventory dimension groups can be assigned to it. The following table shows how you can define a cost object.

| Product dimension group | | Storage dimension group | | Tracking dimension group | |
|---|---|---|---|---|---|
| Dimension | Value | Dimension | Value | Dimension | Value |
| Color | Financial by default | Site | Financial by default | | |
| | | Warehouse | Physical | | |

           

          In the following example, the cost objects are defined by Item + Color + Site.

          Examples:

          • Speaker + Black + Site 1
          • Speaker + White + Site 1
          • Speaker + Black + Site 2

          The inventory objects can be used to report the physical quantity at any level in the warehouse that is defined by Item + Color + Site + Warehouse.

          Examples:

          • Speaker + Black + Site 1 + WH 11
          • Speaker + Black + Site 1 + WH 12
          • Speaker + White + Site 1 + WH12

          It’s crucial that you understand the concepts. The configuration and implementation of these concepts have a significant impact on the whole system, especially in Inventory management and Cost management.

          After the configuration is implemented, it’s almost irreversible. Any change will require significant resources and will affect system usage itself.

          In the rest of the document, we will use the Speaker item as an example. The inventory valuation method is set to Moving average.

          Cost object:

          • Speaker + Black + Site 1

          Inventory objects:

          • Speaker + Black + Site 1 + WH 11
          • Speaker + Black + Site 1 + WH 12

          After a few transactions have been posted, the following inventory transaction entries are generated in the Inventory subledger.

| Color | Site | Warehouse | Financial date | Reference | Status | Quantity | Cost amount |
|---|---|---|---|---|---|---|---|
| Black | Site 1 | WH11 | 1/1/2017 | Purchase order 01 | Purchased | 1.00 | 10.00 |
| Black | Site 1 | WH12 | 2/1/2017 | Purchase order 02 | Purchased | 2.00 | 26.00 |
| Black | Site 1 | WH11 | 3/1/2017 | Sales order 01 | Sold | -1.00 | -12.00 |

           

          The inventory close job was run as of January 31, 2017. Because the inventory valuation method was Moving average, no adjustments were posted.

          As part of the fiscal period–end process, an Inventory value report that shows the ending inventory balance in quantity and value is required. To meet this requirement, the inventory value report framework was introduced. The framework lets you create custom reports by including more data points that depend on the type of business. It also lets you define the level of aggregation for cost objects.

          Note: The inventory value report is designed to print only the values per cost object or aggregations of cost objects.

          You create an Inventory by cost object report, based on the configuration in the following table.

| FastTab group | Field group | Field | Setup value | Setup value |
|---|---|---|---|---|
| General | Range | Posting date | | |
| Columns | Financial position | Inventory | Yes | |
| | Resource ID | | View | Yes |
| | Inventory dimensions | Color | View (Column) | Yes |
| | Inventory dimensions | Site | View (Column) | Yes |
| | Inventory dimensions | Warehouse | View (Column) | Yes |
| | Average unit cost | Calculate average unit cost | Yes | |
| Rows | Resource type | Materials | Yes | |
| | Detail level | Level | Totals | |

           

          The report will look like this.

| Resource | Color | Site | Warehouse | Quantity | Value | Avg. unit cost |
|---|---|---|---|---|---|---|
| Speaker | Black | 1 | | 2.00 | 24.00 | 12.00 |

           

Note: The Warehouse column will remain blank, because the Speaker item doesn't have any cost object that includes the Warehouse dimension. The Warehouse inventory dimension is set to Physical only.

          View the inventory value by physical location Warehouse

          Configure the storage dimension group

          To meet the customer’s request, you could configure the storage dimension group differently. In this case, the Warehouse dimension is configured so that it has a Financial value.

| Product dimension group | | Storage dimension group | | Tracking dimension group | |
|---|---|---|---|---|---|
| Dimension | Value | Dimension | Value | Dimension | Value |
| Color | Financial by default | Site | Financial by default | | |
| | | Warehouse | Financial | | |

           

          This configuration affects how the Speaker item is handled by the system. The cost object and inventory object now have the same level of granularity.

          Cost objects:

          • Speaker + Black + Site 1 + WH 11
          • Speaker + Black + Site 1 + WH 12

          Inventory objects:

          • Speaker + Black + Site 1 + WH 11
          • Speaker + Black + Site 1 + WH 12

          The configuration also directly affects the inventory valuation. In this example, the FIFO, Weighted average, or Moving average inventory valuation method will be applied per cost object, and the overall result will differ.

| Color | Site | Warehouse | Financial date | Reference | Status | Quantity | Cost amount |
|---|---|---|---|---|---|---|---|
| Black | Site 1 | WH11 | 1/1/2017 | Purchase order 01 | Purchased | 1.00 | 10.00 |
| Black | Site 1 | WH11 | 3/1/2017 | Sales order 01 | Sold | -1.00 | -10.00 |

           

| Color | Site | Warehouse | Financial date | Reference | Status | Quantity | Cost amount |
|---|---|---|---|---|---|---|---|
| Black | Site 1 | WH12 | 2/1/2017 | Purchase order 02 | Purchased | 2.00 | 26.00 |

           

          The result will also differ when the Inventory value report is run by using the same configuration that is described in the previous section.

| Resource | Color | Site | Warehouse | Quantity | Value | Avg. unit cost |
|---|---|---|---|---|---|---|
| Speaker | Black | 1 | WH12 | 2.00 | 26.00 | 13.00 |

           

          The Warehouse column now has a value, and the inventory value is 26.00 instead of 24.00.

Note: When you activate the Financial value for the Warehouse inventory dimension, you might affect performance. All transfers between warehouses are now considered financial movements, and financial movements can cause cycles in the Inventory close job. If the Warehouse inventory dimension is used only to physically track inventory, these transfers will be closed as non-financial transfers before the cost calculation begins, and the chance of cycles is reduced.

Create a custom report that looks at inventory transactions and settlements

          You can create a custom report that sums inventory transactions and settlements by InventDim ID.

          The old Physical inventory by inventory dimension report was designed for that purpose. In the report dialog box, users could select any inventory dimensions, regardless of whether they were part of the defined cost objects.

          This approach works, provided that the inventory dimensions that you select are part of the defined cost object. However, if you select an inventory dimension that isn’t part of the cost object, the report starts to print incorrect results.

          The following table shows the result of printing balances per item and inventory dimensions.

| Resource | Color | Site | Warehouse | Quantity | Value | Avg. unit cost |
|---|---|---|---|---|---|---|
| Speaker | Black | 1 | WH11 | 0.00 | -2.00 | 0.00 |
| Speaker | Black | 1 | WH12 | 2.00 | 26.00 | 13.00 |

           

Note: The inventory cost is calculated at a level above the Warehouse inventory dimension. Therefore, the cost on the issue transaction from warehouse WH11 explains why the inventory value per warehouse can become negative.

          Report the value of physical locations

          If the value of a physical location must be reported, a sum of transactions per location, as described in the previous section, isn’t the solution.

          The correct approach is to calculate the value per location by using the following simple formula:

Value = cost object's average unit cost × inventory object's physical quantity

          Cost object:

• Speaker + Black + Site 1

| Resource | Color | Site | Quantity | Value | Avg. unit cost |
|---|---|---|---|---|---|
| Speaker | Black | 1 | 2.00 | 24.00 | 12.00 |

           

          Inventory objects:

• Speaker + Black + Site 1 + WH 11
• Speaker + Black + Site 1 + WH 12

| Resource | Color | Site | Warehouse | Quantity | Formula | Value |
|---|---|---|---|---|---|---|
| Speaker | Black | 1 | WH11 | 0.00 | 12.00 × 0.00 | 0.00 |
| Speaker | Black | 1 | WH12 | 2.00 | 12.00 × 2.00 | 24.00 |

           

In Microsoft Dynamics AX 2012 R3, a new report named Inventory aging was introduced. As the name implies, this report does more than just report the value by physical location. It can also show the age of the current inventory in user-defined buckets.

          In the report dialog box, enter the following information.

| Field group | Field | Setup value |
|---|---|---|
| | As of date | 31-01-2017 |
| View | Item number | View |
| | Color | View |
| | Site | View |
| | Warehouse | View |
| Aging period | Unit of aging period | Days |
| | Aging period 1 | 30 |
| | Aging period 2 | 60 |
| | Aging period 3 | 90 |
| | Aging period 4 | 120 |

           

The following table shows only the first section of the Inventory aging report; the user-defined buckets have been omitted.

| Resource | Color | Site | Warehouse | On-hand quantity | On-hand value | Inventory value quantity | Inventory value | Avg. unit cost |
|---|---|---|---|---|---|---|---|---|
| Speaker | Black | 1 | WH11 | 0.00 | 0.00 | 2.00 | 24.00 | 12.00 |
| Speaker | Black | 1 | WH12 | 2.00 | 24.00 | 2.00 | 24.00 | 12.00 |

           

          Conclusion

If your organization must provide inventory value by physical location, you don't have to update the current configuration of inventory dimension groups. Such a change can be very intrusive and also affects the cost calculation and performance. We also don't recommend that you build a custom report for this purpose.

The Inventory aging report is designed to calculate the cost per cost object and can then apply it to any physical level that is selected on the report. The report automatically detects the level at which the cost object is defined per item. It then applies the formula to calculate the value by physical location.

           

          Negative inventory in inventory accounting


          Allowing physical negative inventory may have undesirable consequences in inventory accounting, especially if the inventory costing principle is Actual and the valuation method is either FIFO or Weighted average.

          Most of the issues that are related to physical negative inventory can be mitigated by using the correct configuration and maintenance of data.

          Example: Why is the cost out of sync?

          The following table lists the required setup for the Item model groups.

| Item | Inventory model | Physical negative inventory | Latest cost price | Active planned cost |
|---|---|---|---|---|
| A | FIFO | Yes | Yes | No |

           

          The purchase from the supplier always takes place at a unit cost of 7,500.00.

          The following table lists the events as they occur, in chronological order.

| Financial date | Reference | Receipt | Issue | Quantity | Cost amount |
|---|---|---|---|---|---|
| 10/6/2017 | Sales order 01 | | Sold | -3.00 | |
| 10/6/2017 | Purchase order 01 | Purchased | | 2.00 | 15,000.00 |
| 10/6/2017 | Sales order 02 | | Sold | -3.00 | -22,500.00 |
| 10/6/2017 | Purchase order 02 | Purchased | | 1.00 | 7,500.00 |
| 10/6/2017 | Sales order 03 | | Sold | -3.00 | -22,500.00 |
| 10/6/2017 | Purchase order 03 | Purchased | | 3.00 | 22,500.00 |
| 10/6/2017 | Purchase order 04 | Purchased | | 2.00 | 15,000.00 |
| 10/6/2017 | Purchase order 05 | Purchased | | 3.00 | 22,500.00 |
| 10/6/2017 | Sales order 04 | | Sold | -1.00 | -18,750.00 |

           

The system starts issuing from inventory at a cost per unit of 18,750.00, even though the purchase cost has never exceeded 7,500.00.

          Why does this happen?

To explain this in more detail, let's add a few more columns on the right (shown in green in the original post). These new columns represent the inventory balance after posting each transaction. The inventory balance is also known as InventSum.

| Financial date | Reference | Receipt | Issue | Quantity | Cost amount | Quantity | Value | Avg. unit cost |
|---|---|---|---|---|---|---|---|---|
| 10/6/2017 | Sales order 01 | | Sold | -3.00 | 1) | -3.00 | $0.00 | $0.00 |
| 10/6/2017 | Purchase order 01 | Purchased | | 2.00 | 15,000.00 | -1.00 | $15,000.00 | -$15,000.00 |
| 10/6/2017 | Sales order 02 | | Sold | -3.00 | -22,500.00 2) | -4.00 | -$7,500.00 | $1,875.00 |
| 10/6/2017 | Purchase order 02 | Purchased | | 1.00 | 7,500.00 | -3.00 | $0.00 | $0.00 |
| 10/6/2017 | Sales order 03 | | Sold | -3.00 | -22,500.00 3) | -6.00 | -$22,500.00 | $3,750.00 |
| 10/6/2017 | Purchase order 03 | Purchased | | 3.00 | 22,500.00 | -3.00 | $0.00 | $0.00 |
| 10/6/2017 | Purchase order 04 | Purchased | | 2.00 | 15,000.00 | -1.00 | $15,000.00 | -$15,000.00 |
| 10/6/2017 | Purchase order 05 | Purchased | | 3.00 | 22,500.00 | 2.00 | $37,500.00 | $18,750.00 |
| 10/6/2017 | Sales order 04 | | Sold | -1.00 | -18,750.00 4) | 1.00 | $18,750.00 | $18,750.00 |

           

Notes:

1. The cost per unit is 0.00. When the balance is negative, the system looks for a fallback cost to apply.
   1. First, the system looks for an active cost. This fails.
   2. Second, the system looks for the cost set up in the Cost price field on the item master record. This cost is equal to 0.00.
2. The cost per unit is 7,500.00. The balance is still negative, so the system again looks for a fallback cost to apply.
   1. The system looks for an active cost. This succeeds.
   2. Latest cost price was set to Yes on the item record, so the prior purchase order unit cost of 7,500.00 has become the active cost.
3. The same condition applies as in note 2.
4. The cost per unit is 18,750.00. If the balance is positive at the time the issue transaction is posted, the system applies the average cost of the balance.
   1. The average cost is calculated as: 37,500.00 / 2.00 = 18,750.00

The issue is that the inventory balance of 1 piece at 18,750.00 is overvalued, because the item has never been purchased at a cost higher than 7,500.00. The root cause is that the first issue transaction leaves inventory at a cost per unit of 0.00, and the wrong cost estimate continues to ripple through the following transactions.

The only way to get the inventory balance back in sync is to run either the Recalculation or the Inventory close job.

| Financial date | Reference | Receipt | Issue | Quantity | Cost amount | Qty | Value | Avg. unit cost |
|---|---|---|---|---|---|---|---|---|
| 10/6/2017 | Sales order 01 | | Sold | -3.00 | | -3.00 | $0.00 | $0.00 |
| 10/6/2017 | Purchase order 01 | Purchased | | 2.00 | 15,000.00 | -1.00 | $15,000.00 | -$15,000.00 |
| 10/6/2017 | Sales order 02 | | Sold | -3.00 | -22,500.00 | -4.00 | -$7,500.00 | $1,875.00 |
| 10/6/2017 | Purchase order 02 | Purchased | | 1.00 | 7,500.00 | -3.00 | $0.00 | $0.00 |
| 10/6/2017 | Sales order 03 | | Sold | -3.00 | -22,500.00 | -6.00 | -$22,500.00 | $3,750.00 |
| 10/6/2017 | Purchase order 03 | Purchased | | 3.00 | 22,500.00 | -3.00 | $0.00 | $0.00 |
| 10/6/2017 | Purchase order 04 | Purchased | | 2.00 | 15,000.00 | -1.00 | $15,000.00 | -$15,000.00 |
| 10/6/2017 | Purchase order 05 | Purchased | | 3.00 | 22,500.00 | 2.00 | $37,500.00 | $18,750.00 |
| 10/6/2017 | Sales order 04 | | Sold | -1.00 | -18,750.00 | 1.00 | $18,750.00 | $18,750.00 |
| 30/6/2017 | Sales order 01 | | | | -22,500.00 | | | |
| 30/6/2017 | Sales order 04 | | | | 11,250.00 | 1.00 | 7,500.00 | |

           

The Inventory close will apply the selected inventory model, in this case FIFO, and then adjust the cost on the issue transactions accordingly.

Note: If the inventory balance is negative when the inventory close is executed, the balance will not be adjusted. A full adjustment can only occur when the balance is positive and enough receipts exist to adjust the issues.

          Conclusion

By default, all items should have an active cost. If you plan to allow temporary physical negative inventory, which is a valid scenario in certain businesses, it's essential to apply an active cost before creating any transactions for the item.

          Finding QnAKnowledgebaseId and QnASubscriptionKey to configure the new Azure Bot Service


Bot Service went through an overhaul again; the configuration to connect your Bot Service to QnA Maker is different now. I am not sure how long the information here will stay relevant, but this is how you configure your Azure Bot Service to talk to QnAMaker.ai at the time of writing.

When you use the new Bot Service template in Azure to deploy your QnA Maker bot, there are 2 settings you need to configure in App Settings - QnAKnowledgebaseId and QnASubscriptionKey.

          Here's where you can find them:

Go to https://qnamaker.ai/Home/MyServices and click on your bot's "View Code".

The first string right after "knowledgebases" is your QnAKnowledgebaseId, whereas the one right after Ocp-Apim-Subscription-Key is your QnASubscriptionKey.
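
For orientation, the "View Code" sample of that era looked roughly like the following (the host, API version, and placeholder values here are illustrative, not copied from the portal):

    POST /knowledgebases/<QnAKnowledgebaseId>/generateAnswer
    Host: https://westus.api.cognitive.microsoft.com/qnamaker/v2.0
    Ocp-Apim-Subscription-Key: <QnASubscriptionKey>
    Content-Type: application/json

    {"question":"hi"}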

           

          Hope that helps.

          SQL Setup ToolSuite Introduction (3) – SQL Registry Viewer


You may want to know what registry keys are added to the system by a SQL Server installation. If you use a registry snapshot tool to compare the Windows registry before and after a SQL installation, you will find 40,000~60,000 modifications. However, if you study the modifications carefully, you will find that most of them aren't very meaningful; for example, lots of modifications go to the "HKLM\DRIVERS\DriverDatabase\DeviceIds" entry. The most interesting modifications are:

• Installer-related registry keys under
HKEY_CLASSES_ROOT\Installer and
Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Installer\UserData\S-1-5-18

• COM+ related, like
Computer\HKEY_CLASSES_ROOT\CLSID
Computer\HKEY_CLASSES_ROOT\Interface
Computer\HKEY_CLASSES_ROOT\TypeLib

• SQL specific:
Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server

• Service and performance log related keys, and other keys under WOW6432Node

• Others...

So is it possible to list all of these important SQL Server related registry keys? My answer is the SQL Registry Viewer tool on GitHub:
https://github.com/suyouquan/SQLSetupTools

This tool pre-reads all SQL Server metadata from the setup source (the setup media containing the MSI/MSP files). The metadata includes the product code, patch code, package code, files, registry keys that will be added to the system, etc. It then uses this prepared metadata to scan the registry and display only the SQL Server related keys in the UI. For an accurate report, the tool will ask you to specify which service pack or CU you have installed on your system. You can browse/search/export the keys in the UI easily. Just be aware that this tool is not intended to list every SQL Server related key; only the important ones are displayed.

          Enjoy and have fun!

          DevOps for Data Science – Load Testing and Auto-Scale


          In this series on DevOps for Data Science, I’ve explained the concept of a DevOps “Maturity Model” – a list of things you can do, in order, that will set you on the path for implementing DevOps in Data Science. You can find each Maturity Model article in the series here:

          1. Infrastructure as Code (IaC)
          2. Continuous Integration (CI) and Automated Testing
          3. Continuous Delivery (CD)
          4. Release Management (RM)
          5. Application Performance Monitoring
          6. Load Testing and Auto-Scale (This article)

The final DevOps maturity level is Load Testing and Auto-Scale. Note that you want to follow this progression - there's no way to do proper load testing if you aren't automating the Infrastructure as Code, CI, CD, RM and APM phases. The reason is that the automatic scaling you'll do depends on the automation that precedes it - there's no reason to scale something that you're about to change.

          Load Testing

I covered automated testing in a previous article, but that type of testing focuses primarily on functionality and integration. For load testing, you're running the system with as many inputs as you can, until it fails. For the Data Science team, you should inform the larger testing team about any load testing you've done on your trained model (or the re-training task, if that is incorporated into your part of the solution), using any load-testing tools you can run in R or Python or whatever language/runtime you are using.
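
As one concrete example, here's a minimal load-test sketch in Python (the scoring URL and payload are hypothetical placeholders for your own model endpoint, and the concurrency numbers are arbitrary):

    import time
    import concurrent.futures
    import requests

    URL = "http://localhost:5000/score"      # hypothetical model endpoint
    PAYLOAD = {"features": [1.0, 2.0, 3.0]}  # hypothetical input row

    def one_call(_):
        start = time.perf_counter()
        status = requests.post(URL, json=PAYLOAD, timeout=10).status_code
        return time.perf_counter() - start, status

    # Hammer the endpoint with 50 concurrent callers, 1000 calls total
    with concurrent.futures.ThreadPoolExecutor(max_workers=50) as pool:
        results = list(pool.map(one_call, range(1000)))

    latencies = sorted(t for t, _ in results)
    errors = sum(1 for _, s in results if s != 200)
    print("p95 latency: %.3fs, errors: %d"
          % (latencies[int(len(latencies) * 0.95)], errors))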

The larger testing team will incorporate those numbers and run a "hammer" test on the entire solution, to see when the application becomes overloaded.

          An interesting development I’m seeing lately is that the Data Science team is asking for the metrics from the load (which also contains performance information of course) to do data analysis and even prediction. That’s a great value-add.

          Auto-Scale

          The Auto-Scale maturity level is where you really need to interact with the entire team, from the very earliest planning phase - of course, that is the very definition of DevOps. You need to find out how large the system will be as early as possible, because it can affect the design of your system. Certain technologies allow scale (Spark, Hadoop, Docker, others) and other technologies don’t parallelize or scale well. Writing your code in an efficient but unscalable technology will come back to hurt the application in the end, if the solution will grow. If you create a huge architecture and the solution should scale down to an “Internet of Things” environment, you’ll likewise face issues. Of course, some languages can be used on scalable technologies and smaller ones, so it’s up to you to know the limits and features of the various ways of working through these scenarios.

          With that, we’re done with my series on DevOps for Data Science. Follow the Maturity Model, develop the DevOps mindset, and take it one step at a time. It’s a journey worth taking.
