Technow: Context Managers Using contextlib, Phi-3 family, Verba RAG
In this article, we’ll dive into four exciting developments that are reshaping the landscape of AI, machine learning, and Python programming. First, learn how Python’s contextlib module simplifies resource management with the with statement, making tasks like file operations, database connections, and timing operations more efficient and error-proof. Then, discover Microsoft’s latest strides in the small language model race with the Phi-3 family, including the new Phi-3-vision multimodal model and Copilot+ PCs that redefine AI-powered computing. We’ll also explore the enhanced capabilities of Microsoft Copilots, which now support team collaboration and customizable AI agents for complex business processes. Finally, get acquainted with Verba RAG, Weaviate’s open-source tool for Retrieval-Augmented Generation, offering a user-friendly interface and versatile deployment options for advanced text generation tasks. This post will give you a comprehensive overview of these innovations and why they matter in today’s tech ecosystem.
Context Managers Using contextlib
Python’s contextlib module provides utilities for working with the with statement, which is used for resource management. The with statement ensures that resources are properly acquired and released, even in the presence of exceptions.
The contextlib module offers several classes and functions to create context managers, which are objects that define the __enter__ and __exit__ methods. The __enter__ method is called when entering the with block, and the __exit__ method is called when exiting, regardless of whether an exception occurred or not.
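To see what contextlib abstracts away, here is a minimal hand-written context manager: a hypothetical ManagedFile class (the name is illustrative, not a library class) that implements __enter__ and __exit__ directly.

```python
class ManagedFile:
    """Minimal hand-written context manager for a file (illustrative)."""

    def __init__(self, path, mode="r"):
        self.path = path
        self.mode = mode
        self.file = None

    def __enter__(self):
        # Called when entering the with block; the return value is bound by `as`.
        self.file = open(self.path, self.mode)
        return self.file

    def __exit__(self, exc_type, exc_value, traceback):
        # Called when leaving the block, even if an exception was raised inside it.
        if self.file is not None:
            self.file.close()
        return False  # returning False means exceptions are not suppressed

# Usage:
with ManagedFile("example.txt", "w") as f:
    f.write("hello")
# f is guaranteed to be closed here, exception or not
```

The contextmanager decorator shown later in this section generates exactly this kind of object from a single generator function, so you rarely need to write the class by hand.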
Context managers can be used for a variety of tasks, including:
- File operations: Automatically closing files after use.
- Thread and process locking: Acquiring and releasing locks.
- Database connections: Connecting to and disconnecting from databases.
- Temporary environments: Creating and cleaning up temporary environments.
- Logging: Temporarily changing logging levels.
- Timing operations: Measuring the execution time of a block of code.
Here’s a quick example of how you can create a custom context manager using contextlib to time a block of code:
```python
import time
from contextlib import contextmanager

@contextmanager
def timed_block(label):
    start = time.time()
    try:
        yield
    finally:
        end = time.time()
        print(f"{label}: {end - start:.2f} seconds")

# Usage:
with timed_block("quick computation"):
    # Some quick computations
    sum(range(1000000))
```
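Beyond the contextmanager decorator, contextlib also ships ready-made helpers that cover several of the use cases listed above. The sketch below shows two of them: suppress, which ignores a named exception, and ExitStack, which manages a dynamic number of resources (the file names here are just placeholders).

```python
import os
from contextlib import suppress, ExitStack

# suppress: silently ignore a specific, expected exception.
with suppress(FileNotFoundError):
    os.remove("does_not_exist.tmp")  # no error even if the file is absent

# ExitStack: acquire a variable number of resources and release them all on exit.
paths = ["a.txt", "b.txt"]
for p in paths:
    with open(p, "w") as f:
        f.write(p)

with ExitStack() as stack:
    files = [stack.enter_context(open(p)) for p in paths]
    contents = [f.read() for f in files]
# Every file opened via the stack is closed here.
print(contents)  # -> ['a.txt', 'b.txt']
```

ExitStack is handy precisely when the number of resources is not known until runtime, which a plain nested with statement cannot express.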
Phi-3 family
Small Language Models
Microsoft wants to take the lead in the small language model race. After releasing the tiny but mighty Phi-3 a few months ago, it has just added new models to the Phi-3 family:
- Phi-3-vision: a 4.2B-parameter model, the first multimodal model in the Phi family
- Phi-3-small: a new 7B-parameter language model that rivals models twice its size
- A new 14B-parameter Phi-3 model trained on 4.8T tokens, achieving an MMLU score of 78 (comparable to Llama 3 70B)
- Phi-Silica: included in every Copilot+ PC, designed for NPUs, with prompt processing at 650 tokens/sec
Microsoft Copilots and GitHub Copilot
Many updates and new features were added to the Copilot family.
- GitHub Copilot Extensions: Microsoft is empowering any developer to customize the Copilot experience with services like Azure, Docker, or Sentry directly within GitHub Copilot Chat.
- Team Copilot: An expansion of Copilot from a personal AI assistant to a collaborative team member. Team Copilot can act as a meeting facilitator, note-taker, project manager, and collaborator across Microsoft 365 apps like Teams, Loop, and Planner. This moves Copilot capabilities from just assisting individuals to also assisting teams.
- Copilot Studio Agents: New agent capabilities were added in Copilot Studio to allow developers to build proactive Copilots that can independently manage complex business processes. These Copilots can leverage memory, learn from feedback, and ask for help when needed.
New Copilot+ PCs
Microsoft introduced the world to a new category of Windows PCs designed for AI: Copilot+ PCs, which it bills as the fastest, most intelligent Windows PCs ever built.
- The Windows Copilot Library and the new Phi-Silica, a state-of-the-art small language model, included in every Copilot+ PC
- PyTorch and WebNN (Web Neural Network), a web-native machine learning framework, now native on Windows
- Powerful new silicon capable of 40+ TOPS
- AI tools like Recall (to easily find and remember what you have seen in your PC), Cocreator (for AI image generation), Live Captions (for real-time audio translation), and more
- Up to 20x more powerful and up to 100x as efficient for running AI workloads, delivering industry-leading AI acceleration
Other announcements
Microsoft also announced Real-Time Intelligence within Fabric, its AI-powered analytics platform on Azure, now in preview, as well as a long list of partnerships, from AMD to Khan Academy and Cognition AI.
Access
- You can try Phi-3-vision now on the Azure AI platform.
- Phi-3 is also available on Hugging Face.
- GitHub Copilot Extensions are supported in GitHub Copilot Chat on GitHub.com, Visual Studio, as well as VS Code.
Verba RAG
Verba is Weaviate’s open-source application designed to offer an end-to-end, streamlined, and user-friendly interface for Retrieval-Augmented Generation (RAG) out of the box.
You can run Verba either locally with Hugging Face and Ollama or through LLM providers such as OpenAI, Cohere, and Google. You can also choose between different RAG frameworks, data types, and chunking and retrieval techniques in a highly customizable way.
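Setting Verba's own APIs aside, the retrieve-then-generate loop at the heart of any RAG tool can be sketched with nothing but the standard library: chunk the documents, score each chunk against the query, and prepend the best chunks to the prompt. All names below are illustrative (this is not Verba's API), and the bag-of-words cosine similarity stands in for the vector search a real system like Weaviate would perform.

```python
import math
from collections import Counter

def chunk(text, size=40):
    """Split a document into fixed-size word chunks (one simple chunking strategy)."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, chunks, k=2):
    """Return the k chunks most similar to the query."""
    q = Counter(query.lower().split())
    scored = [(cosine(q, Counter(c.lower().split())), c) for c in chunks]
    return [c for _, c in sorted(scored, key=lambda t: t[0], reverse=True)[:k]]

docs = [
    "Weaviate is an open-source vector database.",
    "Retrieval-Augmented Generation combines retrieval with text generation.",
]
chunks = [c for d in docs for c in chunk(d)]
context = retrieve("what is retrieval augmented generation", chunks, k=1)
# The retrieved context is injected into the prompt sent to the LLM.
prompt = f"Context: {context[0]}\nQuestion: what is retrieval augmented generation?"
```

A production system swaps the word-count vectors for learned embeddings and the list scan for an indexed vector search, but the pipeline shape (chunk, retrieve, augment the prompt) is the same one Verba lets you configure.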