Joining the Metaverse Standards Forum

FIWARE Foundation has joined the “Metaverse Standards Forum”. The Metaverse Standards Forum is a non-profit, member-funded consortium of standards-related organizations, companies, and institutions cooperating to foster interoperability for an open and inclusive metaverse.

The Smart Data Models initiative intends to contribute all of its data models to those standards, enabling their use in the metaverse.


New pysmartdatamodels package 0.6.3 Release

We are thrilled to announce the latest update to the pysmartdatamodels Python package, version 0.6.3, featuring a new function: generate_sql_schema(). This addition enables seamless generation of SQL schemas with just a few lines of code!

Introducing the generate_sql_schema() Function:

With the new function generate_sql_schema(), pysmartdatamodels simplifies the process of creating SQL schemas for your data models: you provide as input the model.yaml representation of a Smart Data Model.
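As a rough illustration of the kind of mapping such a tool performs, the sketch below converts a property list (like the one found in a model.yaml) into a CREATE TABLE script. The type mapping and the example properties are assumptions for illustration; this is not the package's actual implementation.

```python
# Illustrative sketch (not pysmartdatamodels' actual code) of mapping
# model.yaml-style property definitions to a PostgreSQL CREATE TABLE script.

def yaml_properties_to_sql(entity: str, properties: dict) -> str:
    """Map Smart Data Models property types to PostgreSQL column types."""
    # Assumed type mapping for the sketch
    type_map = {
        "string": "TEXT",
        "number": "NUMERIC",
        "integer": "BIGINT",
        "boolean": "BOOLEAN",
        "object": "JSON",
        "array": "JSON",
    }
    columns = [f"{name} {type_map.get(spec.get('type', 'string'), 'TEXT')}"
               for name, spec in properties.items()]
    return f"CREATE TABLE {entity} (\n  " + ",\n  ".join(columns) + "\n);"

# Tiny excerpt resembling a model.yaml property list (hypothetical values)
props = {
    "id": {"type": "string"},
    "temperature": {"type": "number"},
    "dateObserved": {"type": "string"},
}
print(yaml_properties_to_sql("WeatherObserved", props))
```

The real function works from the full model.yaml, which also carries descriptions and nested structures; the sketch only shows the core property-to-column idea.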

Korean translation available for all data models

The Korean translation of the specifications has been incorporated, so every specification folder now contains a new file with the Korean translation.

See an example:

The translated specification is also linked from the file at the front of every data model.

New service: Export your data models to an SQL schema

We provide a service to generate a PostgreSQL schema SQL script from the model.yaml representation of a Smart Data Model.

You can access this service via this link, following Tools > SQL service.

You need to provide as input the standard GitHub link to the model.yaml file or the raw version GitHub link.
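The two accepted link forms are related by a fixed rewrite: a standard github.com “blob” URL becomes a raw.githubusercontent.com URL by swapping the host and dropping the `/blob/` segment. A small helper showing the relationship (the repository path below is illustrative):

```python
# Convert a standard github.com 'blob' URL to its raw.githubusercontent.com
# form; the SQL service accepts either form as input.

def github_blob_to_raw(url: str) -> str:
    """Rewrite github.com/<owner>/<repo>/blob/<branch>/<path> to its raw URL."""
    return (url
            .replace("https://github.com/", "https://raw.githubusercontent.com/", 1)
            .replace("/blob/", "/", 1))

standard = ("https://github.com/smart-data-models/dataModel.Weather/"
            "blob/master/WeatherObserved/model.yaml")
print(github_blob_to_raw(standard))
# -> https://raw.githubusercontent.com/smart-data-models/dataModel.Weather/master/WeatherObserved/model.yaml
```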

Update: Generate the schema out of your payload

We provide a service to generate JSON schemas from the example payloads you provide. This service is designed to assist contributors who may have limited experience with JSON schemas but who do have actual examples. You can access the service through this link.
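The core idea behind such a service is schema inference: walking an example payload and deriving the JSON-schema "type" of each property. The sketch below is a heavily simplified illustration (the real service does much more, e.g. sub-property descriptions), and the payload is made up:

```python
import json

# Minimal schema inference from an example payload: derive the JSON-schema
# "type" of each value recursively. Simplified illustration only.

def infer_schema(value):
    """Infer a JSON-schema fragment from an example value."""
    if isinstance(value, bool):      # check bool before int (bool is an int subtype)
        return {"type": "boolean"}
    if isinstance(value, int):
        return {"type": "integer"}
    if isinstance(value, float):
        return {"type": "number"}
    if isinstance(value, str):
        return {"type": "string"}
    if isinstance(value, list):
        return {"type": "array",
                "items": infer_schema(value[0]) if value else {}}
    if isinstance(value, dict):
        return {"type": "object",
                "properties": {k: infer_schema(v) for k, v in value.items()}}
    return {}

payload = json.loads(
    '{"id": "urn:ngsi-ld:Device:001", "temperature": 21.5, "tags": ["indoor"]}')
schema = infer_schema(payload)
print(json.dumps(schema, indent=2))
```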

We are pleased to announce that we have made some updates to this service:

  • Added support for generating sub-property descriptions
  • Fixed the bug of duplicated property generation

Please feel free to try it out and share your comments.

New Sections in README

We have introduced several new sections in the README for all the Smart Data Models that we’ve published.

These are:

  • Links to CSV extensions of example payloads
  • Links to SQL schema
  • Services we designed to help with self-contribution

For example, in the WeatherForecast data model:

We hope you find it useful; feel free to share your comments.

DCAT-AP catalogue service in beta version

DCAT-AP is possibly the most relevant standard for cataloguing datasets (and other resources as well). The data models building block of a data space needs a semantic catalogue of resources in DCAT-AP format.

Here you have the beta version of a service providing a DCAT-AP catalogue containing all semantic resources of Smart Data Models.

You just connect to this URL to get it. It is updated once a day, and whenever a data model is published.

You can also reach it at home -> tools -> DCAT-AP catalog (last option)
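Once fetched, such a catalogue can be processed with ordinary JSON tooling when served as JSON-LD. The excerpt below is a hypothetical, heavily simplified record of the kind a DCAT-AP catalogue exposes; the actual catalogue structure and URLs may differ:

```python
import json

# Hypothetical, simplified JSON-LD excerpt of a DCAT-AP catalogue record.
# Keys follow the DCAT/DCT vocabularies; values here are made up.
catalog_jsonld = """
{
  "@type": "dcat:Catalog",
  "dct:title": "Smart Data Models",
  "dcat:dataset": [
    {"@type": "dcat:Dataset",
     "dct:title": "WeatherObserved",
     "dcat:distribution": [
       {"dcat:accessURL": "https://example.org/WeatherObserved/model.yaml"}
     ]}
  ]
}
"""

catalog = json.loads(catalog_jsonld)
# List the titles of the datasets (i.e. the data models) in the catalogue
titles = [ds["dct:title"] for ds in catalog["dcat:dataset"]]
print(titles)
# -> ['WeatherObserved']
```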

Since the catalogue and its structure can still be enriched, we would like to hear any comments, additions, and critiques you wish to share at

so you can enrich it before the official release.

New Version of the Python Package pysmartdatamodels 0.6.1

There is a new version of the Python package pysmartdatamodels: 0.6.1.

This Python package includes all the data models and several functions for using them in your developments.


– Two updated functions

  • New extension of the update_broker() function, allowing it to update nonexistent attributes in the broker
  • Updated validate_data_model_schema() function, with a wider range of validation

– Minor changes: links to the repository and issues, updates to the latest statistics, etc.
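To convey what schema validation of a payload involves, here is a minimal sketch of the kind of check such a function performs: required properties are present and values have the expected JSON type. This is an illustration, not the actual pysmartdatamodels implementation, and the schema and payload are made up:

```python
# Illustrative sketch of payload validation against a JSON-schema-like
# definition (not the actual validate_data_model_schema() implementation).

TYPE_MAP = {"string": str, "number": (int, float), "boolean": bool,
            "object": dict, "array": list}

def check_payload(payload: dict, schema: dict) -> list:
    """Return a list of human-readable validation problems (empty if valid)."""
    problems = []
    for key in schema.get("required", []):
        if key not in payload:
            problems.append(f"missing required property: {key}")
    for key, spec in schema.get("properties", {}).items():
        if key in payload:
            expected = TYPE_MAP.get(spec.get("type"))
            if expected and not isinstance(payload[key], expected):
                problems.append(f"{key}: expected {spec['type']}")
    return problems

schema = {"required": ["id", "type"],
          "properties": {"id": {"type": "string"},
                         "temperature": {"type": "number"}}}
payload = {"id": "urn:ngsi-ld:Device:001", "temperature": "hot"}
print(check_payload(payload, schema))
# -> ['missing required property: type', 'temperature: expected number']
```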

Get more details on the pypi page and feel free to try it out!

Vote for the new SQL service (Just two questions)

We are very close to launching a new service, and we want your feedback on how to create it.

So we made a survey, and you can help us by submitting your preference.

This is a survey about the type of service that we want to create for the users of the Smart Data Models, one that generates a PostgreSQL schema SQL script.


New version of the contribution manual

The contribution manual receives minor updates frequently, but now we have created a brand new version:

– Restructuring the contents

– Including an explanation of how to use the test service for new data models

– Making the contribution workflow more understandable

– Guidelines for contributing

– The list of support channels, including the new one on Discord

– Moving to the annex the slides for the documents automatically generated during the publication process

All of this helps make the manual more understandable to users who have recently joined the Smart Data Models initiative.

As always, the shortcut to reach it works:

But you can also find a direct link in these locations:

Upper menu -> Contribution manual (7th option)

Main menu -> Documentation -> Contribution manual (5th option in the drop-down menu)

Feel free to share your comments, or leave them as comments in the manual.