Azure Foundry Local: What It Is, Why It’s Different, and When It Matters

So what is Foundry Local?

Foundry Local is Microsoft’s way of letting apps run AI models directly on your device. No cloud, no Azure account, no token costs. It runs fully offline on Windows, macOS with Apple Silicon, and Android.

At first glance, it looks similar to tools like Ollama or LM Studio. They all run models locally and expose APIs. But the real difference is how it’s packaged.

Ollama and similar tools run as separate services. Users install them, then your app talks to them over localhost.

Foundry Local flips that idea. It’s an SDK you bundle inside your own app. You ship it with your installer. The runtime is small, around 20 MB, and your app becomes self-contained. It downloads models when needed, caches them locally, and automatically uses the right hardware, whether that’s NVIDIA, AMD, Apple Silicon, or even NPUs.

In simple terms, instead of asking users to install an AI runtime, your app is the runtime.
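Because the embedded runtime exposes an OpenAI-compatible REST endpoint on localhost, talking to it needs nothing beyond the standard library. Here is a minimal sketch; the port, endpoint path, and model alias are illustrative assumptions (in practice the bundled SDK resolves the actual values for you):

```python
import json
import urllib.request

# Assumed values for illustration only; the bundled SDK normally resolves
# the real port and model identifier at runtime.
ENDPOINT = "http://localhost:5273/v1/chat/completions"
MODEL_ALIAS = "phi-3.5-mini"

# An OpenAI-style chat request, built as plain data.
payload = {
    "model": MODEL_ALIAS,
    "messages": [{"role": "user", "content": "Summarize this document in one line."}],
}

def ask_local_model(endpoint: str = ENDPOINT) -> str:
    """Send the chat request to the locally embedded runtime and return the reply."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Note what is absent: no API key, no cloud round trip. The request never leaves the machine.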


Why that packaging actually matters

If you’ve ever depended on something like Ollama in a real product, you already know the pain.

Some users won’t install it. Others install the wrong version. Ports conflict. IT blocks background services. Suddenly you’re debugging someone else’s setup instead of your own app.

Foundry Local removes all of that. Everything lives inside your application. No external dependency, no background service, no guessing what environment the user has.

That’s really the core idea. It’s built for companies shipping software, not for people experimenting with models.


How it’s used

There are two main ways this shows up.

One is on devices, which is what most people will touch. You use SDKs in common languages and embed AI directly into your app.

The other is a more enterprise setup running on Azure Local with Kubernetes for edge environments like factories or hospitals.

One important detail. Foundry Local is designed for single-user scenarios. One app, one user, one model at a time. It’s not trying to be a shared AI server.

If you need high concurrency, something like vLLM is still the better choice.


Where it sits compared to other tools

To make sense of it, it helps to see the roles each tool plays.

llama.cpp is the core engine that started local LLMs. Fast, simple, and widely used.

Ollama makes it easy to download and run models quickly. It’s the easiest entry point for most developers.

LM Studio is more of a user-friendly interface for exploring models.

vLLM is built for scale and handling multiple users at once.

Foundry Local sits in a different spot. It’s not about running or serving models. It’s about shipping them inside applications.


The important part most people miss: model formats

Each system is built around a specific format.

GGUF is what most local tools use. It’s simple, portable, and heavily optimized for running models efficiently on CPUs and GPUs.

ONNX, which Foundry Local uses, is different. It doesn’t just store weights. It stores the full computation graph. Basically, it describes exactly how the model runs.

That makes it hardware-agnostic. You can run the same model across different devices and let the runtime figure out whether to use CPU, GPU, or NPU.

There’s also MLX, which is optimized specifically for Apple Silicon and performs very well there, but doesn’t really exist outside that ecosystem.

So the tradeoff is pretty clear. GGUF gives you the biggest ecosystem. ONNX gives you the most flexibility across hardware. MLX gives you peak performance on Apple devices.


Why Microsoft is doing this now

This part is actually the real story.

Hardware is changing. CPUs aren’t getting dramatically better for this kind of workload. GPUs are great but not always available on enterprise machines.

Meanwhile, NPUs are showing up everywhere. Intel, AMD, Qualcomm, Apple. New laptops increasingly have dedicated AI hardware.

The problem is each vendor has its own way of using that hardware. Without a common layer, developers would have to write separate code for each one.

That doesn’t scale.

This is where ONNX Runtime comes in. It acts as a bridge. One model, one API, and it runs on whatever hardware is available.

Foundry Local is essentially Microsoft building a developer-friendly layer on top of that idea.


So does it actually matter?

It depends on what you’re doing.

If you’re building a real application and want AI built in, this matters a lot. It solves distribution, compatibility, and hardware acceleration in one go.

If you’re experimenting or running models for yourself, it probably doesn’t. Tools like Ollama are still faster to get started with and have way more ready-to-use models.

If you’re building something that needs to serve many users, Foundry Local isn’t the right fit yet. That’s still a job for vLLM or similar systems.


The simple way to think about it

Each tool has a clear role.

Ollama is for running models.
vLLM is for serving models.
Foundry Local is for shipping models inside apps.

That’s really it.

The bigger picture is where things get interesting. If NPUs become standard in everyday devices, then whoever controls the layer that connects apps to that hardware becomes very important. Microsoft is betting that layer will be ONNX Runtime, and Foundry Local is how developers interact with it.

Whether that bet pays off depends on how many real apps start using it. But the direction is already clear.

Preparing for Azure’s Deprecation of TLS 1.0 and 1.1: What You Need to Know

Microsoft Azure is set to deprecate support for TLS (Transport Layer Security) versions 1.0 and 1.1 on 31 October 2024. This move is part of Microsoft’s ongoing commitment to enhance security and ensure that only the most secure protocols are used across its services. As these older versions become obsolete, it’s crucial for businesses and developers to understand the impact of this change and prepare accordingly.


In this blog post, we’ll delve into:

  • Why Microsoft is deprecating TLS 1.0 and 1.1
  • What the deprecation means for your applications and services
  • Whether you need to update your Azure services or if the change is automatic
  • Potential impacts on your business and solutions
  • How to prepare for the transition with a comprehensive checklist

Why Is Microsoft Deprecating TLS 1.0 and 1.1?

Microsoft is deprecating TLS 1.0 and 1.1 to strengthen security and comply with industry standards. These older versions have known vulnerabilities and are less secure by today’s standards. By moving exclusively to TLS 1.2 and higher, Microsoft aims to:

  • Enhance Security Posture: TLS 1.2 and 1.3 offer stronger encryption algorithms, reducing the risk of data breaches and unauthorized access.
  • Meet Compliance Standards: Many regulations now mandate the use of secure protocols like TLS 1.2 or higher.
  • Promote Best Practices: Encouraging the adoption of modern security protocols ensures a safer ecosystem for all Azure users.

What Does This Mean for Your Applications and Services?

Azure’s Automatic Enforcement

Azure will automatically enforce the deprecation of TLS 1.0 and 1.1 on its services. While Azure handles the enforcement on its end, it’s essential to ensure that your applications and services interacting with Azure are compatible with TLS 1.2 or higher.

Customer Action Required

  • Updating Applications and Configurations: If your applications or services currently use TLS 1.0 or 1.1, you must update both your application code and SSL/TLS configurations to support TLS 1.2 or 1.3.
  • Certificates and Cipher Suites: Review and update your SSL/TLS certificates and cipher suites to ensure compatibility with TLS 1.2 or higher.

Do You Need to Update Your Azure Services, or Will It Happen Automatically?

While Azure services will be updated automatically to enforce TLS 1.2 and higher, customer applications and services will not be updated by Azure. You are responsible for:

  • Ensuring Compatibility: Update your applications, services, and any client-side components to support TLS 1.2 or higher.
  • Testing and Validation: Proactively test your systems to identify any issues arising from the deprecation of TLS 1.0 and 1.1.

Potential Impacts on Your Business and Solutions

Connectivity Issues

  • Service Disruptions: Applications or services not updated to support TLS 1.2 or higher may fail to connect to Azure services, leading to downtime.
  • Third-Party Dependencies: Integrations with third-party services or clients that still use older TLS versions may break.

Business Disruption

  • Operational Interruptions: Downtime can affect productivity, revenue, and customer satisfaction.
  • Compliance Risks: Non-compliance with security standards may result in penalties or legal issues.

Security Enhancements

  • Improved Data Protection: Stronger encryption methods protect data integrity and privacy.
  • Reduced Vulnerabilities: Eliminating outdated protocols minimizes the risk of security breaches.

How to Prepare: A Comprehensive Checklist

To ensure a smooth transition, follow this detailed checklist:

1. Inventory Your Systems

  • Identify Applications and Services: List all applications, services, and devices that connect to Azure.
  • Determine TLS Usage: Check which TLS versions are currently in use.

2. Update Applications and Services

Application Code and Configurations

  • Modify Application Code:
    • Update Libraries and Frameworks: Ensure you’re using updated versions that support TLS 1.2 or 1.3.
      • .NET Applications: Use .NET Framework 4.6 or higher.
      • Java Applications: Update to a JDK version that supports TLS 1.2 or 1.3.
      • Python Applications: Use Python 2.7.9+ or 3.4+.
    • Specify TLS Version: Explicitly set TLS 1.2 or higher in your application’s code or configuration files.
  • Configuration Settings:
    • Update Configuration Files: Modify files like web.config or appsettings.json to enforce TLS 1.2 or higher.
    • Enable Strong Cryptography: Adjust registry settings on Windows systems to enable strong cryptography.
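In Python, for example, the standard library’s ssl module can enforce this floor directly. A minimal sketch of a client-side context that refuses anything below TLS 1.2 (the hostname in the comment is a placeholder):

```python
import ssl

# A client context with secure defaults, then pinned to TLS 1.2 as the minimum.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # TLS 1.0 and 1.1 handshakes now fail

# Any socket wrapped with this context will negotiate TLS 1.2 or 1.3 only:
# ssl_sock = ctx.wrap_socket(raw_sock, server_hostname="example.azurewebsites.net")
```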

Certificates and SSL/TLS Configurations

  • Review SSL/TLS Certificates:
    • Check Compatibility: Ensure certificates use strong signature algorithms (e.g., SHA-256).
    • Renew if Necessary: Obtain new certificates if current ones are outdated.
  • Update Server SSL/TLS Settings:
    • Enable TLS 1.2/1.3 Protocols: Configure servers to support only TLS 1.2 and 1.3.
    • Configure Cipher Suites: Use strong cipher suites compatible with TLS 1.2 or higher.
    • Disable Deprecated Protocols: Explicitly disable TLS 1.0 and 1.1 in server settings.
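The same floor applies server-side. As a sketch in Python (the certificate and key paths are placeholders you would replace with your own):

```python
import ssl

# A server-side context restricted to TLS 1.2 and 1.3.
server_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
server_ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # disables TLS 1.0/1.1
server_ctx.maximum_version = ssl.TLSVersion.TLSv1_3

# Load your certificate chain and private key before serving traffic:
# server_ctx.load_cert_chain(certfile="server.pem", keyfile="server.key")
```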

3. Assess Third-Party Dependencies

  • Contact Vendors: Confirm that third-party services support TLS 1.2 or higher.
  • Update Integrations: Modify integrations using older TLS versions.
  • Replace Outdated Components: Find alternatives for components that don’t support newer TLS versions.

4. Review Certificates and Configurations

  • Check Certificate Chain: Ensure the entire chain is valid and uses strong encryption.
  • Test SSL/TLS Configurations: Use tools like SSL Labs’ SSL Server Test to analyze your server.

5. Test in a Staging Environment

  • Simulate the Environment: Disable TLS 1.0 and 1.1 in a test setting.
  • Comprehensive Testing: Test all functionalities and monitor for issues.
  • Monitor Logs and Errors: Identify any TLS-related errors.

6. Update Client Software

  • Ensure Client Compatibility: Verify that client software supports TLS 1.2 or higher.
  • Distribute Updates: Release updates for client applications as needed.
  • User Communication: Inform users about necessary updates.

7. Prepare Your Infrastructure

  • Update Server Software:
    • Operating Systems: Use OS versions that support TLS 1.2 or higher (e.g., Windows Server 2012 R2+).
    • Web Servers: Update IIS, Apache, Nginx, etc., to the latest versions.
  • Configure Network Devices:
    • Firewalls and Load Balancers: Ensure they support and are configured for TLS 1.2 or higher.
    • VPN Gateways: Update configurations to use secure protocols.

8. Plan the Transition

  • Set a Timeline: Schedule updates before Azure’s deprecation date.
  • Communicate Internally: Inform stakeholders about upcoming changes.
  • Risk Mitigation: Develop contingency and rollback plans.

9. Update Development and Deployment Tools

  • CI/CD Pipelines: Ensure tools are compatible with TLS 1.2 or higher.
  • SDKs and APIs: Update to the latest versions.
  • Automation Scripts: Review and update scripts interacting with Azure.

10. Monitor and Support

  • Implement Monitoring:
    • Set Up Alerts: Configure alerts for TLS-related errors.
    • Continuous Monitoring: Use monitoring tools to track performance post-migration.
  • Provide Support Channels:
    • Support Teams: Train staff to handle TLS-related issues.
    • Documentation: Update documentation to reflect the changes.

Specific Steps to Update Applications and Certificates

Updating Applications

  • Audit Your Codebase: Look for instances where TLS versions are hard-coded.
  • Update Security Protocols:
    • .NET Example: Set ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
    • Java Example: Configure the JVM with -Djdk.tls.client.protocols="TLSv1.2"
  • Test Third-Party Libraries: Ensure they support TLS 1.2 or higher.
  • Recompile Applications: Rebuild and redeploy so the changes take effect.

Updating Certificates

  • Verify Certificate Details: Check signature algorithms and key lengths.
  • Obtain New Certificates: If necessary, get new ones with stronger encryption.
  • Update Certificate Stores: Install new certificates on all relevant servers.

Conclusion

Azure’s deprecation of TLS 1.0 and 1.1 is a significant move towards enhancing security and ensuring that only the most secure protocols are used. While Azure will handle updates on its end, it’s crucial for you to:

  • Proactively Update: Ensure your applications, services, and certificates are compatible with TLS 1.2 or higher.
  • Thoroughly Test: Identify and resolve issues before they impact production.
  • Stay Informed: Keep abreast of Azure’s timelines and updates.

By taking these steps, you can mitigate risks associated with the deprecation, ensuring a smooth transition and maintaining uninterrupted access to Azure services.

Rethinking Microsoft’s Ecosystem: The Missing Piece

Microsoft has made significant strides in AI, cloud computing, and PC technologies, establishing itself as a leader in these domains. The introduction of Copilot+ PCs is a testament to their innovative approach, leveraging AI to enhance user experience. However, there remains a crucial element that could elevate Microsoft’s ecosystem to new heights: mobile phones.

The Current Landscape

Microsoft’s ecosystem is robust, with cloud-ready applications like Microsoft 365 and Office 365 seamlessly integrating with AI-enabled PCs. This creates a powerful synergy between cloud services and desktop applications. However, the mobile segment is conspicuously absent from this ecosystem. While Microsoft has ventured into the mobile space before, the timing and strategy were perhaps misaligned with market demands. Today, with an open-minded and adaptive approach, Microsoft has the opportunity to rethink and reintegrate mobile phones into their ecosystem.

A New Vision: Microsoft-Integrated Android

Imagine a mobile operating system based on Android, but with deep integration of Microsoft products and services. This approach could offer several benefits:

  1. Familiarity and App Compatibility: By using Android as the base, Microsoft can ensure compatibility with the vast array of existing Android apps. This addresses the initial challenge of app availability that plagued their previous mobile efforts.
  2. Seamless Integration: Similar to how Microsoft revamped the Edge browser by adopting Chromium, they can create a mobile OS that integrates seamlessly with their cloud and PC ecosystem. Features like cross-device file sharing, universal clipboard, and cloud synchronization can provide a user experience on par with, or even surpassing, Apple’s ecosystem.
  3. Enhanced Productivity: With Office 365, OneDrive, and other Microsoft tools natively integrated, users can transition effortlessly between their desktop and mobile devices. This continuity boosts productivity and simplifies workflows for both consumers and enterprise users.

Building on the Success of Microsoft Edge

The success of Microsoft Edge is a prime example of how adopting a robust foundation and layering it with Microsoft’s unique value proposition can lead to a superior product. By transitioning Edge to the Chromium engine, Microsoft not only improved performance and compatibility but also added unique features that distinguished Edge from other browsers. Similarly, using Android as the foundation for a new mobile OS allows Microsoft to leverage the strengths of a well-established platform while infusing it with their own innovative features.

Marketing and Technological Benefits

Marketing

  1. Brand Loyalty: Offering a mobile solution that integrates perfectly with existing Microsoft products can strengthen brand loyalty. Users who rely on Microsoft for their PC and cloud needs will find it appealing to extend this trust to their mobile devices.
  2. Targeted Campaigns: Highlighting the benefits of a unified ecosystem in marketing campaigns can attract both individual consumers and businesses looking for a cohesive IT environment.
  3. Strategic Partnerships: Licensing this new mobile OS to various manufacturers can increase market penetration and provide diverse device options for consumers.

Technological

  1. Innovation Leadership: By combining the power of AI, cloud services, and mobile technology, Microsoft can position itself as a leader in technological innovation.
  2. Security Enhancements: Building a mobile OS with security at its core can offer robust protection against modern threats. Integration with Microsoft Defender and other security tools can provide a secure environment for both personal and enterprise use.
  3. Unified Management: Enterprises can benefit from a unified management system for all devices, simplifying IT administration and enhancing security policies across platforms.

Security Benefits

  1. Enhanced Security: By controlling the mobile OS environment, Microsoft can ensure higher security standards. Features like integrated Microsoft Defender, secure boot processes, and regular security updates can provide a secure platform for users.
  2. Enterprise Control: For enterprise users, a Microsoft-integrated mobile OS can offer advanced security features and management tools, allowing IT departments to enforce security policies uniformly across all devices.
  3. Data Protection: Seamless integration with Microsoft’s cloud services ensures that data is protected through encryption and secure access controls, whether it is stored locally on the device or in the cloud.

Conclusion

Rethinking and reintegrating mobile phones into Microsoft’s ecosystem is not just a strategic move, but a necessary one to provide a comprehensive, seamless user experience. By leveraging Android as a base and building upon it with Microsoft’s products and services, the potential for a cohesive and secure ecosystem is immense. Building on the success seen with Microsoft Edge, this approach could redefine mobile productivity and set new standards in the tech industry, making Microsoft an even more integral part of our digital lives.

Creating a Clean Python Development Environment using Docker and Visual Studio Code

Python

Python is a high-level, dynamically-typed programming language that has taken the software development industry by storm. It’s known for its simplicity, readability, and vast library ecosystem. Python has become the language of choice for many in web development, data science, artificial intelligence, scientific computing, and more. Its versatile nature makes it ideal for both beginners and experienced developers.

Docker

Docker is a revolutionary tool that allows developers to create, deploy, and run applications in containers. Containers can be thought of as lightweight, stand-alone packages that contain everything needed to run an application, including the code, runtime, libraries, and system tools. Docker ensures that an application runs consistently across different environments, eliminating the infamous “it works on my machine” problem. It simplifies the process of setting up, distributing, and scaling applications, making it an invaluable tool for modern development.

Visual Studio Code

Visual Studio Code (VS Code) is a powerful, open-source code editor developed by Microsoft. It provides a lightweight yet feature-rich environment that supports a multitude of programming languages, including Python. With a vast ecosystem of extensions, integrated Git support, debugging capabilities, and an intuitive interface, VS Code has quickly become the editor of choice for many developers around the world.

Why Combine Python, Docker, and Visual Studio Code?

You might be wondering why one would want to combine Python, Docker, and Visual Studio Code. The answer lies in the fusion of simplicity, consistency, and efficiency. By using Docker, you can ensure that your Python application runs the same way, irrespective of where it’s deployed. This means no more headaches about dependency issues or system incompatibilities. On the other hand, VS Code provides a seamless development experience, with features that play nicely with both Python and Docker. Combining these three tools gives you a streamlined, consistent, and efficient development workflow.

Steps to Set Up Your Dev Environment:

  1. Install Prerequisites:
    • Install Docker and ensure it’s running.
    • Download and install Visual Studio Code.
    • Install the ‘Python’ and ‘Docker’ extensions from the Visual Studio Code marketplace.
  2. Setup Docker:
    • Create a new directory for your project.
    • Inside this directory, create a file named Dockerfile.
    • In the Dockerfile, start with the following content:
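The original Dockerfile content did not survive here, so the following is a minimal reconstruction consistent with the rest of the walkthrough. The Python version is an assumption, and the working directory matches the mount target used in the docker run command later:

```dockerfile
# Base image; pick the Python version your project needs.
FROM python:3.11-slim

# Matches the mount target used later in the docker run command.
WORKDIR /usr/src/app

# Install dependencies first so they are cached between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Keep the container alive with an interactive shell for development.
CMD ["bash"]
```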

    • Create a requirements.txt file in the same directory, listing any Python libraries your project depends on, with the following content:

      numpy
      pandas

      or you can specify the library version:

      tensorflow==2.3.1
      uvicorn==0.12.2
      fastapi==0.63.0

  3. Build the Docker Container Image:
    • In VS Code, open the folder containing your Dockerfile and other project files.
    • Use the Docker extension to build your Docker image by right-clicking the Dockerfile and selecting ‘Build Image’, or run the following command:
      docker build -t mypythonenv .



    • Run the container, mounting the folder that contains your Python code into the container:
      docker run -it --rm -v C:\Users\Sarmad\Projects\MyPythonProject:/usr/src/app mypythonenv



  4. Attach to the running Docker container
    • To run and debug your Python code, attach the running container to Visual Studio Code: click the Docker icon, right-click the running container (in our example, “mypythonenv”), and select the option to attach it to Visual Studio Code.


    • Visual Studio Code now has access to the Python environment running inside the Docker container, and the container can see the Python code files that were mounted in the docker run command.


  5. Run the Python code
    • To run our “hello-world.py” code, click the Run and Debug icon, then the blue “Run and Debug” button, and select Python File.


    • The Python Code will be running inside your container


  6. Clean Up & Share:
    • Once done with development, you can push your Docker image to a registry (like Docker Hub) or your own private registry for sharing or deployment.
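For completeness, the “hello-world.py” referenced in step 5 can be as simple as this (the filename is just the one assumed in this walkthrough):

```python
# hello-world.py — a minimal script to confirm the containerized interpreter works.
import sys

message = f"Hello from Python {sys.version_info.major}.{sys.version_info.minor}!"
print(message)
```

When run through the attached VS Code session, the output comes from the interpreter inside the container, not the one on your host.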

By following these steps, you’ll have a Python development environment that’s clean, consistent, and easy to use.

Happy coding!

Get Database and System Information from SQL Server

Before migrating or upgrading, we need some critical information: how SQL Server and the system will be licensed based on CPUs, sockets, and cores, plus collation details that help determine the best way to consolidate databases.

So I made this script to gather the basic information I needed. I decided to share it with the community in the hope that it will be helpful for you.
