A new data model, Bollard, in the subject dataModel.MarineTransport
Thanks to the contributors of the Puerto de Huelva

- Bollard. This data model describes a bollard in a port facility, used for mooring vessels.
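Like every Smart Data Model, a Bollard entity can be exchanged as a Normalized NGSI-LD payload. The sketch below is illustrative only: the id value and the coordinates are made up for this example, not taken from the official example files of the model.

```json
{
  "id": "urn:ngsi-ld:Bollard:Huelva-001",
  "type": "Bollard",
  "location": {
    "type": "GeoProperty",
    "value": {
      "type": "Point",
      "coordinates": [-6.9447, 37.2571]
    }
  },
  "@context": [
    "https://raw.githubusercontent.com/smart-data-models/data-models/master/context.jsonld"
  ]
}
```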

A new data model, AisVessel, in the subject dataModel.MarineTransport
Thanks to the contributors of the Puerto de Huelva

The tool automates the generation of Smart Data Models (SDM) from visual models, bridging the gap between high-level domain design and technical implementation for Digital Twins and IoT ecosystems.
- Input: Users define domain entities and relationships using B-UML (a simplified UML dialect) within the BESSER Pearl editor.
- Transformation: The engine maps these models to the NGSI-LD standard and Schema.org vocabularies.
- Output: For every entity, it automatically generates a compliant folder containing:
  - schema.json: the technical JSON Schema definition.
  - Examples: multi-format payloads including JSON-LD, NGSI-v2, and Normalized NGSI-LD.
  - Documentation: automatically derived human-readable specifications.
- Interoperability: ensures compliance with ETSI NGSI-LD and the SDM contribution guidelines.
- Model-Driven Engineering (MDE): moves the “source of truth” to a visual model, reducing manual coding errors in complex JSON-LD structures.
- Efficiency: accelerates the deployment of standardized data spaces by automating the boilerplate required for context brokers (e.g., Orion).
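For illustration, a minimal schema.json of the kind the generator emits might look like the sketch below. The entity name and the attribute are hypothetical, and real SDM schemas also pull in the shared common definitions via allOf; treat this only as a shape, not as actual generator output.

```json
{
  "$schema": "http://json-schema.org/schema#",
  "title": "Smart Data Models - MyEntity",
  "description": "Hypothetical entity generated from a B-UML model",
  "type": "object",
  "allOf": [
    {
      "properties": {
        "type": {
          "type": "string",
          "enum": ["MyEntity"],
          "description": "Property. NGSI Entity type. It has to be MyEntity"
        },
        "mooringCapacity": {
          "type": "number",
          "description": "Property. Hypothetical numeric attribute"
        }
      }
    }
  ],
  "required": ["id", "type"]
}
```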
Calling all SDM contributors!
To ensure a smooth integration process, we want to remind everyone that all contributions undergo automated testing prior to submission. To help you streamline your workflow and catch potential issues early, we’ve made our testing suite available for local use.
You can access the full source code for our testing framework under an open-source license. Running these tests on your own machine not only boosts the quality of your code but also saves valuable time for both you and the maintainers.
Download the Source: Head over to our repository to grab the testing suite.
Configure: Follow the simple instructions in the README. (Quick tip: You only need to update config.json with your specific local directories).
Run Anywhere: The suite is flexible—it supports testing against both remote repositories and local file systems.
By validating your work locally before you push, you’re helping us maintain a high standard for the SDM project.
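As a hedged illustration of that quick tip, a config.json of the kind the README describes might only need local paths filled in. The key names below are hypothetical; check the actual file shipped with the suite.

```json
{
  "local_repository_path": "/home/user/smart-data-models/dataModel.MarineTransport",
  "output_directory": "/home/user/sdm-test-results"
}
```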
Happy coding!

List of tests
Contributors of new data models can test them in their local repositories with the source code of the testing tool, which can also be used online:
Home -> tools -> test your data model
It has been updated to deal with attributes coming from languageMap properties in NGSI-LD.
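To see why languageMap attributes need special handling, note that an NGSI-LD LanguageProperty carries a map of language tags to strings instead of a plain value. The sketch below is not the project's actual test code; it is a minimal, self-contained check of that shape with a hypothetical function name.

```python
def is_language_property(attr: dict) -> bool:
    """Check that an NGSI-LD attribute is a well-formed LanguageProperty:
    type 'LanguageProperty' plus a non-empty languageMap of tag -> string."""
    if attr.get("type") != "LanguageProperty":
        return False
    lang_map = attr.get("languageMap")
    return (
        isinstance(lang_map, dict)
        and len(lang_map) > 0
        and all(isinstance(k, str) and isinstance(v, str) for k, v in lang_map.items())
    )

# Example attribute as it would appear in a normalized NGSI-LD payload
name_attr = {"type": "LanguageProperty", "languageMap": {"en": "Bollard", "es": "Bolardo"}}
```

A test walking the entity's attributes can then branch on this check and validate each language variant instead of expecting a single value.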
We are thrilled to announce a significant new release of our Python package, pysmartdatamodels, designed to empower developers and streamline the contribution process for our community.
This update is packed with new data models and, most notably, powerful new validation and testing services.
The heart of the Smart Data Models initiative is our comprehensive library of open-licensed data models. With this new release, we have expanded it further, adding new models across our thirteen domains, from Smart Cities to Smart Energy. This package continues to provide powerful functions to integrate more than 1,000 standardized data models into your projects, digital twins, and data spaces.
To improve the quality and speed of contributions, we are excited to launch a brand new service for our contributors. You can now automatically test your data models before submitting them. This automated validation ensures that new models comply with our standards, making the review and integration process smoother for everyone.
For those who want to integrate this validation into their own workflows, the source code for the testing tool is also available.
In addition to the new contributor tool, you can also use the online service to validate payloads against existing data model examples.
You can find all the details, explore the functions, and get the latest package from the Python Package Index (PyPI). The updated README file includes comprehensive documentation on all the new features.
We are committed to making data interoperability easier for everyone. These updates are a direct result of community feedback and the hard work of our contributors. A huge thank you to everyone who has helped make this possible!
In the data-models repository you can access the first version for using the Smart Data Models as a service. Thanks to the work done for the Cyclops project.
The files available provide a wrapper around the pysmartdatamodels package and also add a service for the online validation of NGSI-LD payloads.
Here are the README contents, which explain this first version:
This project consists of two main components:
- A REST API (pysdm_api3.py) that provides access to Smart Data Models functionality
- A demo script (demo_script2.py) that demonstrates the API endpoints

REST API (pysdm_api3.py)
A RESTful API that interfaces with the pysmartdatamodels library to provide access to Smart Data Models functionality.
| Endpoint | Method | Description |
|---|---|---|
| /validate-url | GET | Validate a JSON payload from a URL against Smart Data Models |
| /subjects | GET | List all available subjects |
| /datamodels/{subject_name} | GET | List data models for a subject |
| /datamodels/{subject_name}/{datamodel_name}/attributes | GET | Get attributes of a data model |
| /datamodels/{subject_name}/{datamodel_name}/example | GET | Get an example payload of a data model |
| /search/datamodels/{name_pattern}/{likelihood} | GET | Search for data models by approximate name |
| /datamodels/exact-match/{datamodel_name} | GET | Find a data model by exact name |
| /subjects/exact-match/{subject_name} | GET | Check if a subject exists by exact name |
| /datamodels/{datamodel_name}/contexts | GET | Get @context(s) for a data model name |
The /validate-url endpoint performs comprehensive validation of the payload against the matching Smart Data Model.
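For a quick sense of how the path parameters compose, here is a stdlib-only sketch of a client-side URL builder for the routes above. The host and port are assumptions (whatever uvicorn binds to in your deployment), and the helper name is hypothetical.

```python
from urllib.parse import quote

BASE_URL = "http://localhost:8000"  # assumed host/port; adjust to your deployment

def build_endpoint(template: str, **params: str) -> str:
    """Fill a route template such as '/datamodels/{subject}/{datamodel}/attributes'
    with URL-encoded path parameters."""
    encoded = {k: quote(str(v), safe="") for k, v in params.items()}
    return BASE_URL + template.format(**encoded)

# Build the attributes URL for the Bollard data model
attrs_url = build_endpoint(
    "/datamodels/{subject}/{datamodel}/attributes",
    subject="dataModel.MarineTransport",
    datamodel="Bollard",
)
```

Pointing a browser or an HTTP client at the resulting URL would exercise the corresponding GET endpoint.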
Demo Script (demo_script2.py)
A simple interactive script that demonstrates the API endpoints by opening a series of pre-configured URLs in your default web browser.
The URLs to open are defined in the my_web_urls list, and the script is run with python demo_script2.py. The demo includes examples of the endpoints listed above.
Install the dependencies, then start the API and run the demo:

```
pip install fastapi uvicorn httpx pydantic jsonschema
python pysdm_api3.py
python demo_script2.py
```
Edit the my_web_urls list in demo_script2.py to change which endpoints are demonstrated.
License: Apache 2.0
In the new testing process (4th option in the tools menu), a new test is now available that checks whether the example-normalized.jsonld is a valid NGSI-LD file.
This process helps contributors to debug their data models before submitting them officially (there will be further tests before final approval).
The source code for the test is available in the repo.
Remember that if you want to improve or create a new test, just open a PR on the repo.
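A full NGSI-LD check is schema-driven, but the core of what makes example-normalized.jsonld "a valid NGSI-LD file" can be sketched with a few structural assertions. This is an illustrative simplification with hypothetical names, not the actual test source.

```python
def normalized_ngsi_ld_problems(entity: dict) -> list:
    """Return a list of structural problems; an empty list means the basic
    Normalized NGSI-LD shape holds (URN id, type, @context, attribute objects)."""
    problems = []
    if not str(entity.get("id", "")).startswith("urn:"):
        problems.append("id must be a URN, e.g. urn:ngsi-ld:Bollard:id:1234")
    if not isinstance(entity.get("type"), str):
        problems.append("type must be a string")
    if "@context" not in entity:
        problems.append("@context is missing")
    for name, attr in entity.items():
        if name in ("id", "type", "@context"):
            continue  # top-level members, not attributes
        if not (isinstance(attr, dict) and attr.get("type") in
                ("Property", "GeoProperty", "Relationship", "LanguageProperty")):
            problems.append(f"attribute '{name}' is not a normalized NGSI-LD attribute")
    return problems

good = {
    "id": "urn:ngsi-ld:Bollard:id:1234",
    "type": "Bollard",
    "@context": ["https://raw.githubusercontent.com/smart-data-models/data-models/master/context.jsonld"],
    "name": {"type": "Property", "value": "B-12"},
}
```

Returning a list of explicit messages, rather than a bare pass/fail, is what makes errors easy for contributors to act on.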
In the new testing process (4th option in the tools menu), a new test is now available that checks whether the example-normalized.json is a valid NGSIv2 file.
This process helps contributors to debug their data models before submitting them officially (there will be further tests before final approval).
The source code for the test is available in the repo.
Remember that if you want to improve or create a new test, just open a PR on the repo.
Current test process for new and extended data models
In order to approve a new data model, a test needs to be passed. It can be accessed via the 3rd option in the tools menu on the front page:
Pro: it is currently working
Con: it is mostly implemented in a single file, and the error messages are not very explicit about the errors detected
The new process
1) Every test is an independent file.
2) To test a new data model, the tool first copies the files locally and then runs the tests, which is quicker.
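Under the new layout, each check lives in its own file exposing a small, uniform entry point that the runner can aggregate. The sketch below is hedged: the file name, function name, and return convention are assumptions, not the repository's actual API.

```python
# Hypothetical standalone test module, e.g. test_valid_json.py
import json
import tempfile

def run_test(file_path: str):
    """Uniform entry point: return (passed, message) so a runner can
    collect results and print an explicit message per test file."""
    try:
        with open(file_path, encoding="utf-8") as f:
            json.load(f)
        return True, f"{file_path}: valid JSON"
    except (OSError, json.JSONDecodeError) as exc:
        return False, f"{file_path}: invalid JSON ({exc})"

# Usage against a local copy of the data model files
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as tmp:
    tmp.write('{"id": "urn:ngsi-ld:Bollard:id:1234", "type": "Bollard"}')
    path = tmp.name
passed, message = run_test(path)
```

Because every test file follows the same shape, adding a new check is just a matter of dropping in another file and opening a PR.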
What you can do with basic knowledge of Python (or with a good AI service)
Here you can see the current files available in the test_data_model subdirectory of the data-models GitHub repository.
Instructions