With this MCP configuration file you will be able to use Smart Data Models locally from your AI agent.

{
  "mcpServers": {
    "smartdatamodels": {
      "type": "http",
      "serverUrl": "https://opendatamodels.org/mcp/v1"
    }
  }
}
Are you building AI applications for Smart Cities, Energy, or IoT? Interoperability is often the biggest hurdle—but a new tool is making it easier to keep your LLMs “in the loop” with global standards.
Introducing the Smart Data Models MCP Server, a new reference implementation that connects Large Language Models directly to the Smart Data Models ecosystem. Built on the Model Context Protocol (MCP), this server allows AI agents to browse, retrieve, and implement standardized data schemas in real-time.
Key Features:
Instant Schema Access: Give your LLM the ability to look up official data models for everything from street lighting to soil sensors.
Enhanced Accuracy: Reduce hallucinations by providing the AI with the exact JSON-LD and NGSI-LD structures required for your project.
Seamless Integration: Designed for easy setup with MCP-compatible clients (like Claude Desktop), enabling a smoother developer workflow for digital twin and IoT projects.
By providing LLMs with a “dictionary” of standardized data, the Smart Data Models MCP Server ensures that your AI-driven solutions are born interoperable.
Check it out on GitHub: agaldemas/smartdatamodels-mcp
These slides contain a detailed explanation and can be reached here. See the architecture below, and the easy configuration in the README.

Here is the configuration file (there are several options):
{
  "mcpServers": {
    "smart-data-models-http": {
      "autoApprove": [],
      "disabled": false,
      "type": "streamableHttp",
      "timeout": 180,
      "url": "http://127.0.0.1:3210/mcp"
    }
  }
}
The source code is available here.
Thanks to Alain Galdemas for the contribution.
We are thrilled to announce a significant new release of our Python package, pysmartdatamodels, designed to empower developers and streamline the contribution process for our community.
This update is packed with new data models published up to 6-3-26.

You do not need to update the package: calling the following function refreshes the information about newly published data models:
from pysmartdatamodels import pysmartdatamodels as sdm
sdm.update_data()
This may take several minutes depending on your connection, since it downloads about 140 MB of data.
The tool automates the generation of Smart Data Models (SDM) from visual models, bridging the gap between high-level domain design and technical implementation for Digital Twins and IoT ecosystems.
Input: Users define domain entities and relationships using B-UML (a simplified UML dialect) within the BESSER Pearl editor.
Transformation: The engine maps these models to the NGSI-LD standard and Schema.org vocabularies.
Output: For every entity, it automatically generates a compliant folder containing:
schema.json: The technical JSON Schema definition.
Examples: Multi-format payloads including JSON-LD, NGSI-v2, and Normalized NGSI-LD.
Documentation: Automatically derived human-readable specifications.
Interoperability: Ensures 100% compliance with ETSI NGSI-LD and SDM contribution guidelines.
Model-Driven Engineering (MDE): Moves the “source of truth” to a visual model, reducing manual coding errors in complex JSON-LD structures.
Efficiency: Accelerates the deployment of standardized data spaces by automating the boilerplate required for context brokers (e.g., Orion).
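As an illustration of the generated output, a schema.json for a hypothetical StreetLamp entity might look like the fragment below. The entity, its attributes, and the enum values are invented for the example; real generated files follow the full SDM schema layout, including references to the common NGSI-LD definitions.

```json
{
  "$schema": "http://json-schema.org/schema#",
  "title": "Smart Data Models - StreetLamp schema (illustrative)",
  "description": "Example of the structure the generator emits for one entity",
  "type": "object",
  "allOf": [
    {
      "properties": {
        "type": {
          "type": "string",
          "enum": ["StreetLamp"],
          "description": "Property. NGSI Entity type. It has to be StreetLamp"
        },
        "status": {
          "type": "string",
          "enum": ["ok", "defective"],
          "description": "Property. Operational status of the lamp"
        }
      }
    }
  ],
  "required": ["id", "type"]
}
```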
Calling all SDM contributors!
To ensure a smooth integration process, we want to remind everyone that all contributions undergo automated testing prior to submission. To help you streamline your workflow and catch potential issues early, we’ve made our testing suite available for local use.
You can access the full source code for our testing framework under an open-source license. Running these tests on your own machine not only boosts the quality of your code but also saves valuable time for both you and the maintainers.
Download the Source: Head over to our repository to grab the testing suite.
Configure: Follow the simple instructions in the README. (Quick tip: You only need to update config.json with your specific local directories).
Run Anywhere: The suite is flexible—it supports testing against both remote repositories and local file systems.
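The steps above boil down to pointing the suite at your directories. As a purely hypothetical sketch of that configuration step (the actual key names are documented in the suite's README, not here), writing a config.json could look like:

```python
import json

# Purely hypothetical config.json sketch for the SDM testing suite.
# Only the idea is shown: pointing the tests at your local data-model
# directories. Check the README for the real key names.
config = {
    "local_repository": "/home/user/dataModel.Weather",   # hypothetical key
    "output_directory": "/home/user/sdm-test-results",    # hypothetical key
}
with open("config.json", "w") as f:
    json.dump(config, f, indent=2)
```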
By validating your work locally before you push, you’re helping us maintain a high standard for the SDM project.
Happy coding!

List of tests
Two new data models have been published in the subject dataModel.AAS (Asset Administration Shell) in the Smart Manufacturing domain.
The data models were created thanks to the contributor Inga Miadowicz.
I4SubmodelElementRelationship. Based on IDTA-01001-3-0, it describes a generic RAMI4.0 SubmodelElement representing a Relationship of a referenced Asset Administration Shell.

A new data model, MaterialAddition, has been published in the subject dataModel.IndustrialProcess.
Thanks to the ALCHIMIA project and to the contributing entities BFI (VDEh-Betriebsforschungsinstitut) and Scuola Superiore Sant’Anna.
There are two new data models, ProcessChemicalAnalysis and ProcessEvent, in the subject dataModel.IndustrialProcess.
Thanks to the ALCHIMIA project and to the contributing entities BFI (VDEh-Betriebsforschungsinstitut) and Scuola Superiore Sant’Anna.
ProcessChemicalAnalysis. Schema for chemical analyses from industrial processes
ProcessEvent. Schema for generic events in industrial processes
More models will be published soon.
Contributors of new data models can test them in their local repositories with the source code of the testing tool, which can also be used online:
Home -> tools -> test your data model
It has been updated to handle attributes coming from languageMap properties in NGSI-LD.
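In NGSI-LD, a LanguageProperty carries its values in a languageMap keyed by language tag. A minimal sketch of reading such an attribute from a normalized payload (the entity and attribute names are invented for the example; the languageMap structure itself is defined by the NGSI-LD specification):

```python
# Minimal normalized NGSI-LD entity with a LanguageProperty (illustrative
# entity and attribute; the languageMap shape is standard NGSI-LD).
entity = {
    "id": "urn:ngsi-ld:Park:001",
    "type": "Park",
    "name": {
        "type": "LanguageProperty",
        "languageMap": {"en": "Central Park", "es": "Parque Central"},
    },
}

def get_localized(attr: dict, lang: str, fallback: str = "en") -> str:
    """Pick one language variant from a languageMap, falling back if absent."""
    lm = attr.get("languageMap", {})
    return lm.get(lang, lm.get(fallback, ""))

spanish = get_localized(entity["name"], "es")  # "Parque Central"
french = get_localized(entity["name"], "fr")   # no "fr" entry, falls back to English
```

Attributes like this are what the updated testing tool now recognizes when it validates examples.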
We are thrilled to announce a significant new release of our Python package, pysmartdatamodels, designed to empower developers and streamline the contribution process for our community.
This update is packed with new data models and, most notably, powerful new validation and testing services.
The heart of the Smart Data Models initiative is our comprehensive library of open-licensed data models. With this new release, we have expanded it further, adding new models across our thirteen domains, from Smart Cities to Smart Energy. This package continues to provide powerful functions to integrate more than 1,000 standardized data models into your projects, digital twins, and data spaces.
To improve the quality and speed of contributions, we are excited to launch a brand new service for our contributors. You can now automatically test your data models before submitting them. This automated validation ensures that new models comply with our standards, making the review and integration process smoother for everyone.
For those who want to integrate this validation into their own workflows, the source code for the testing tool is also available.
In addition to the new contributor tool, you can also use the online service to validate payloads against existing data model examples.
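Conceptually, payload validation checks that an entity uses only attributes defined by the model and carries the mandatory ones. A toy sketch of that idea in pure Python (the real service validates against the model's full JSON Schema, not this simplification; the attribute set below is invented for the example):

```python
def check_payload(payload: dict, allowed: set[str], required: set[str]) -> list[str]:
    """Toy validation: report unknown attributes and missing required ones.
    The real SDM service validates against the model's full JSON Schema."""
    errors = []
    for key in payload:
        if key not in allowed:
            errors.append(f"unknown attribute: {key}")
    for key in required:
        if key not in payload:
            errors.append(f"missing required attribute: {key}")
    return errors

# Hypothetical attribute set for a WeatherObserved-like model.
allowed = {"id", "type", "temperature", "dateObserved"}
payload = {"id": "urn:ngsi-ld:WeatherObserved:001", "type": "WeatherObserved",
           "temprature": 21.5}  # note the typo, which validation should catch
errors = check_payload(payload, allowed, required={"id", "type"})
```

This kind of early feedback on misspelled or non-standard attributes is what the online service automates against the published model examples.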
You can find all the details, explore the functions, and get the latest package from the Python Package Index (PyPI). The updated README file includes comprehensive documentation on all the new features.
We are committed to making data interoperability easier for everyone. These updates are a direct result of community feedback and the hard work of our contributors. A huge thank you to everyone who has helped make this possible!