Use smart data models as a service. First draft

In the data-models repository you can access the first version of the tools to use Smart Data Models as a service, thanks to the work done for the Cyclops project.

The available files provide a wrapper around the pysmartdatamodels package and also add a service for the online validation of NGSI-LD payloads.

Here are the README contents explaining this first version.


Smart Data Models API and Demo

This project consists of the following components:

  1. A FastAPI-based web service (pysdm_api3.py) that provides access to Smart Data Models functionality
  2. A demo script (demo_script2.py) that demonstrates the API endpoints
  3. A requirements file listing the dependencies of these components
  4. A bash script to run the service

API Service (pysdm_api3.py)

A RESTful API that interfaces with the pysmartdatamodels library to provide access to Smart Data Models functionality.

Key Features

  • Payload Validation: Validate JSON payloads against Smart Data Models schemas
  • Data Model Exploration: Browse subjects, data models, and their attributes
  • Search Functionality: Find data models by exact or approximate name matching
  • Context Retrieval: Get @context information for data models
  • If you need other endpoints, please email us at info@smartdatamodels.org

Endpoints

All endpoints use the GET method:

  • /validate-url: Validate a JSON payload from a URL against Smart Data Models
  • /subjects: List all available subjects
  • /datamodels/{subject_name}: List data models for a subject
  • /datamodels/{subject_name}/{datamodel_name}/attributes: Get attributes of a data model
  • /datamodels/{subject_name}/{datamodel_name}/example: Get an example payload of a data model
  • /search/datamodels/{name_pattern}/{likelihood}: Search for data models by approximate name
  • /datamodels/exact-match/{datamodel_name}: Find a data model by exact name
  • /subjects/exact-match/{subject_name}: Check if a subject exists by exact name
  • /datamodels/{datamodel_name}/contexts: Get the @context(s) for a data model name
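As a hedged illustration of how a client might call these endpoints, here is a small sketch. The base URL, port, and the example subject/data-model names are assumptions (they depend on how you launch the service); httpx is listed in the requirements for making the actual requests.

```python
from urllib.parse import urlencode

BASE = "http://localhost:8000"  # assumed host/port for a local run

def endpoint(path, **params):
    """Build a request URL for one of the GET endpoints listed above."""
    url = f"{BASE}{path}"
    if params:
        url += "?" + urlencode(params)  # query-string parameters, URL-encoded
    return url

subjects_url = endpoint("/subjects")
attrs_url = endpoint("/datamodels/dataModel.Weather/WeatherObserved/attributes")

# With the service running you could then fetch them, e.g.:
#   import httpx
#   print(httpx.get(subjects_url).json())
```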

Validation Process

The /validate-url endpoint performs comprehensive validation:

  1. Fetches JSON from the provided URL
  2. Normalizes NGSI-LD payloads to key-values format
  3. Extracts the payload type
  4. Finds all subjects containing this type
  5. Retrieves schemas for each subject
  6. Validates against all schemas
  7. Returns consolidated results
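Steps 2 and 3 above can be sketched in plain Python. This is only an illustrative reading of the process, not the endpoint's actual implementation (which relies on pysmartdatamodels); the sample entity is an assumption.

```python
def to_key_values(entity: dict) -> dict:
    """Collapse a normalized NGSI-LD entity to key-values format."""
    out = {}
    for key, val in entity.items():
        if isinstance(val, dict) and "value" in val:
            out[key] = val["value"]    # Property / GeoProperty
        elif isinstance(val, dict) and "object" in val:
            out[key] = val["object"]   # Relationship
        else:
            out[key] = val             # id, type, @context, ...
    return out

# Hypothetical normalized payload for illustration
entity = {
    "id": "urn:ngsi-ld:WeatherObserved:001",
    "type": "WeatherObserved",
    "temperature": {"type": "Property", "value": 21.5},
}
key_values = to_key_values(entity)
payload_type = key_values.get("type")  # step 3: extract the payload type
```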

Demo Script (demo_script2.py)

A simple interactive script that demonstrates the API endpoints by opening a series of pre-configured URLs in your default web browser.

Features

  • Opens each URL in a new browser tab
  • Pauses between URLs for user input
  • Allows early termination with the 'exit' command
  • Provides clear progress indicators

Usage

  1. Configure the URLs in the my_web_urls list
  2. Run the script: python demo_script2.py
  3. Follow the on-screen instructions
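The loop described above can be sketched as follows. This is a hedged reconstruction from the description, not the actual demo_script2.py; the URLs in my_web_urls are placeholders, and the opener/ask parameters are an added convenience so the loop can be driven without a browser.

```python
import webbrowser

my_web_urls = [  # placeholder URLs; configure with your own endpoints
    "http://localhost:8000/subjects",
    "http://localhost:8000/validate-url?url=https://example.org/payload.json",
]

def run_demo(urls, opener=webbrowser.open_new_tab, ask=input):
    """Open each URL in a new tab, pausing between them; 'exit' stops early."""
    for i, url in enumerate(urls, 1):
        print(f"[{i}/{len(urls)}] opening {url}")   # progress indicator
        opener(url)
        answer = ask("Press Enter for the next URL (or type 'exit'): ")
        if answer.strip().lower() == "exit":
            print("Demo stopped early.")
            return i
    return len(urls)

# run_demo(my_web_urls)  # uncomment to launch the interactive demo
```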

The demo includes examples of:

  • Payload validation
  • Subject listing
  • Data model exploration
  • Attribute retrieval
  • Example payloads
  • Search functionality
  • Exact matching
  • Context retrieval

Requirements

  • Python 3.7+
  • FastAPI
  • Uvicorn
  • httpx
  • pydantic
  • jsonschema
  • webbrowser (standard library)

Installation

pip install fastapi uvicorn httpx pydantic jsonschema

Running the API

python pysdm_api3.py

Running the demo

python demo_script2.py

Configuration

Edit the my_web_urls list in demo_script2.py to change which endpoints are demonstrated.

License

Apache 2.0

New script for testing several data models at the same time.

Most of the files of the testing process have been updated, and the source code is now available:

https://github.com/smart-data-models/data-models/tree/master/test_data_model

There is also a new file:

multiple_tests.py

This file enables you to test all the data models located in an internal subject (the subdirectories of the root one). Currently this option is not available as a form, but if you send us an email at info@smartdatamodels.org we can create a specific form for it.

See an example of the outcome here.

Another tiny improvement on the new testing process (NGSI-LD payloads)

In the new testing process (the 4th option in the tools menu), a new test is now available that checks whether the example-normalized.jsonld is a valid NGSI-LD file.
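To give an idea of the kind of structural check such a test can apply to a normalized NGSI-LD entity, here is a hedged sketch; the official test in the repository is more thorough. Field names follow the NGSI-LD information model (Property/GeoProperty carry a value, Relationship carries an object).

```python
VALID_NODE_TYPES = {"Property", "Relationship", "GeoProperty"}

def looks_like_normalized_ngsild(entity: dict) -> bool:
    """Shallow structural check for a normalized NGSI-LD entity."""
    if not isinstance(entity.get("id"), str) or not isinstance(entity.get("type"), str):
        return False
    for key, val in entity.items():
        if key in ("id", "type", "@context"):
            continue
        # every attribute must be an object with a valid node type ...
        if not isinstance(val, dict) or val.get("type") not in VALID_NODE_TYPES:
            return False
        # ... and carry its payload under "value" or "object"
        payload_key = "object" if val["type"] == "Relationship" else "value"
        if payload_key not in val:
            return False
    return True
```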

This process helps contributors debug their data models before submitting them officially (there will be additional tests before final approval).

The source code for the test is available at the repo.

Remember that if you want to improve / create a new test, just create a PR on the repo.

Tiny improvement on the new testing process

In the new testing process (the 4th option in the tools menu), a new test is now available that checks whether the example-normalized.json is a valid NGSIv2 file.

This process helps contributors debug their data models before submitting them officially (there will be additional tests before final approval).

The source code for the test is available at the repo.

Remember that if you want to improve / create a new test, just create a PR on the repo.

Improved test method for data models

When you want to contribute a new data model (or an improvement in an existing one) you need to pass a test.

The current process (the 3rd option in the tools menu) keeps working as it was.

But we have drafted a new method because

– We need to be more explicit about the tests passed and the errors

– We need to improve the performance

So you can check the new method in the 4th option of the Tools menu

Besides this, the tests are very modular, so if you are a Python programmer you can use them in your own system, because the code has been released. You can even write new tests to be included in the official site: make a PR on the data-models repo and we will add it eventually. Check this post.

New testing process in progress where you can contribute your code

Current test process for new and extended data models

In order to approve a new data model, a test needs to be passed. It can be accessed in the 3rd option of the tools menu on the front page:

Pro: it is currently working

Con: it is mostly implemented in a single file, and its error messages are not very explicit about the errors detected

The new process

1) Every test is an independent file:

2) To test the new data model, it copies the files locally and then runs the tests, which is quicker.

What can you do with basic knowledge of Python (or with a good AI service)

Here you can see the current files available in the test_data_model subdirectory of the data-models GitHub repository.

Instructions

New subject Smart Data Models and new data model Attribute

Eating your own dog food is, for SDM, a demonstration that agile standardization works

We have created a new subject, SmartDataModels, where the structure of the assets of SDM will be released. We have started with the Attribute data model, built from the global database of attributes (more than 157,000 currently; it is over 100 MB, so it takes a while to download).


  • Attribute. Description of the data model Attribute