Reference

datadoc package

Subpackages

Submodules

datadoc.app module

Top-level entrypoint, configuration and layout for the datadoc app.

Members of this module should not be imported into any sub-modules, as doing so will cause circular imports.

build_app(app)

Define the layout, register callbacks.

Return type:

Dash

Parameters:

app (type[Dash])

collect_data_from_external_sources(executor)

Call classes and methods which collect data from external sources.

Must be non-blocking to prevent delays in app startup.

Return type:

None

Parameters:

executor (ThreadPoolExecutor)
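
The non-blocking requirement can be illustrated with a stdlib-only sketch. The collector below is a hypothetical stand-in for a slow external-source call, and unlike the real function (which returns None) this sketch returns the Future for demonstration:

```python
from concurrent.futures import Future, ThreadPoolExecutor


def fetch_unit_types() -> list[str]:
    # Hypothetical stand-in for a slow call to an external source (e.g. Klass).
    return ["PERSON", "HOUSEHOLD"]


def collect_data_from_external_sources(executor: ThreadPoolExecutor) -> Future:
    # submit() returns immediately; the work runs on a background thread,
    # so app startup is not delayed.
    return executor.submit(fetch_unit_types)


with ThreadPoolExecutor(max_workers=2) as executor:
    future = collect_data_from_external_sources(executor)
    # Later, when the data is actually needed:
    unit_types = future.result()
```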

get_app(executor, dataset_path=None)

Centralize all the ugliness around initializing the app.

Return type:

tuple[Dash, int]

Parameters:
  • executor (ThreadPoolExecutor)

  • dataset_path (str | None)

main(dataset_path=None)

Entrypoint when running as a script.

Return type:

None

Parameters:

dataset_path (str | None)

datadoc.config module

Centralised configuration management for Datadoc.

get_app_name()

Get the name of the app. Defaults to ‘Datadoc’.

Return type:

str

get_dapla_manual_naming_standard_url()

Get the URL to naming standard in the DAPLA manual.

Return type:

dict | None

get_dapla_region()

Get the Dapla region we’re running on.

Return type:

DaplaRegion | None

get_dapla_service()

Get the Dapla service we’re running on.

Return type:

DaplaService | None

get_dash_development_mode()

Get the development mode for Dash.

Return type:

bool

get_data_source_code()

The code for the Data Source code list in Klass.

Return type:

int | None

get_datadoc_dataset_path()

Get the path to the dataset.

Return type:

str | None

get_jupyterhub_http_referrer()

Get the JupyterHub http referrer.

Return type:

str | None

get_jupyterhub_service_prefix()

Get the JupyterHub service prefix.

Return type:

str | None

get_jupyterhub_user()

Get the JupyterHub user name.

Return type:

str | None

get_log_formatter()

Get log formatter configuration.

Return type:

Literal['simple', 'json']

get_log_level()

Get the log level.

Return type:

int

get_measurement_unit_code()

The code for the Measurement Unit code list in Klass.

Return type:

int | None

get_oidc_token()

Get the JWT token from the environment.

Return type:

str | None

get_organisational_unit_code()

The code for the organisational units code list in Klass.

Return type:

int | None

get_port()

Get the port to run the app on.

Return type:

int

get_statistical_subject_source_url()

Get the URL to the statistical subject source.

Return type:

str | None

get_unit_code()

The code for the Unit Type code list in Klass.

Return type:

int | None
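
These getters typically wrap environment variables with sensible defaults. A minimal sketch of the pattern, where the variable names DATADOC_APP_NAME and DATADOC_PORT and the 8050 fallback are assumptions for illustration, not the real configuration keys:

```python
import os


def get_app_name() -> str:
    # Assumed variable name; defaults to 'Datadoc' as documented above.
    return os.environ.get("DATADOC_APP_NAME", "Datadoc")


def get_port(default: int = 8050) -> int:
    # Assumed variable name and assumed default port.
    raw = os.environ.get("DATADOC_PORT")
    return int(raw) if raw is not None else default
```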

datadoc.constants module

Repository for constant values in Datadoc.

datadoc.enums module

Enumerations used in Datadoc.

class Assessment(language_strings)

Bases: LanguageStringsEnum

Sensitivity of data.

OPEN = 'OPEN'
PROTECTED = 'PROTECTED'
SENSITIVE = 'SENSITIVE'
class DataSetState(language_strings)

Bases: LanguageStringsEnum

Processing state of a dataset.

INPUT_DATA = 'INPUT_DATA'
OUTPUT_DATA = 'OUTPUT_DATA'
PROCESSED_DATA = 'PROCESSED_DATA'
SOURCE_DATA = 'SOURCE_DATA'
STATISTICS = 'STATISTICS'
class DataSetStatus(language_strings)

Bases: LanguageStringsEnum

Lifecycle status of a dataset.

DEPRECATED = 'DEPRECATED'
DRAFT = 'DRAFT'
EXTERNAL = 'EXTERNAL'
INTERNAL = 'INTERNAL'
class DataType(language_strings)

Bases: LanguageStringsEnum

Simplified data types for metadata purposes.

BOOLEAN = 'BOOLEAN'
DATETIME = 'DATETIME'
FLOAT = 'FLOAT'
INTEGER = 'INTEGER'
STRING = 'STRING'
class IsPersonalData(language_strings)

Bases: LanguageStringsEnum

Whether the variable instance is personal data and, if so, how it is encrypted.

NON_PSEUDONYMISED_ENCRYPTED_PERSONAL_DATA = 'NON_PSEUDONYMISED_ENCRYPTED_PERSONAL_DATA'
NOT_PERSONAL_DATA = 'NOT_PERSONAL_DATA'
PSEUDONYMISED_ENCRYPTED_PERSONAL_DATA = 'PSEUDONYMISED_ENCRYPTED_PERSONAL_DATA'
class LanguageStringsEnum(language_strings)

Bases: Enum

Enum class for storing LanguageStringType objects.

get_value_for_language(language)

Retrieve the string for the relevant language.

Return type:

str | None

Parameters:

language (SupportedLanguages)

class TemporalityTypeType(language_strings)

Bases: LanguageStringsEnum

Temporality of a dataset.

More information about temporality type: https://statistics-norway.atlassian.net/l/c/HV12q90R

ACCUMULATED = 'ACCUMULATED'
EVENT = 'EVENT'
FIXED = 'FIXED'
STATUS = 'STATUS'
class UseRestriction(language_strings)

Bases: LanguageStringsEnum

Use restrictions applying to a dataset.

DELETION_ANONYMIZATION = 'DELETION_ANONYMIZATION'
PROCESS_LIMITATIONS = 'PROCESS_LIMITATIONS'
SECONDARY_USE_RESTRICTIONS = 'SECONDARY_USE_RESTRICTIONS'
class VariableRole(language_strings)

Bases: LanguageStringsEnum

The role of a variable in a dataset.

ATTRIBUTE = 'ATTRIBUTE'
IDENTIFIER = 'IDENTIFIER'
MEASURE = 'MEASURE'
START_TIME = 'START_TIME'
STOP_TIME = 'STOP_TIME'

datadoc.state module

Global state.

DANGER: This global state is safe when Datadoc is run as designed, with an individual instance per user, running within a Jupyter Notebook.

If Datadoc is redeployed as a multi-user web app then this storage strategy must be modified, since users would modify each other's data. See here: https://dash.plotly.com/sharing-data-between-callbacks

data_sources: CodeList
measurement_units: CodeList
metadata: Datadoc
organisational_units: CodeList
statistic_subject_mapping: StatisticSubjectMapping
unit_types: CodeList

datadoc.utils module

General utilities.

get_app_version()

Get the version of the Datadoc package.

Return type:

str

get_timestamp_now()

Return a timestamp for the current moment.

Return type:

datetime

pick_random_port()

Pick a random free port number.

The function will bind a socket to port 0, and a random free port from 1024 to 65535 will be selected by the operating system.

Return type:

int
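
The described technique can be sketched directly with the stdlib socket module:

```python
import socket


def pick_random_port() -> int:
    # Binding to port 0 asks the operating system to select a free
    # ephemeral port; we read the chosen port back before closing.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.bind(("127.0.0.1", 0))
        return sock.getsockname()[1]
```

Note the port is released again when the socket closes, so a small race window exists before the app binds it.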

running_in_notebook()

Return True if running in Jupyter Notebook.

Return type:

bool

datadoc.wsgi module

Entrypoint for Gunicorn.

Module contents

Datadoc: Document datasets in Statistics Norway.

datadoc.logging_configuration package

Submodules

datadoc.logging_configuration.json_formatter module

class DatadocJSONFormatter(*, fmt_keys=None)

Bases: Formatter

Class for formatting JSON for log files.

Parameters:

fmt_keys (dict[str, str] | None)

format(record)

Create the JSON structure from a message prepared by the _prepare_log_dict method.

Return type:

str

Parameters:

record (LogRecord)

datadoc.logging_configuration.logging_config module

get_log_config()

Configure logging for the application.

Return type:

dict[str, Any]
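
The returned dictionary follows the logging.config.dictConfig schema; a minimal sketch of such a configuration (the formatter and handler names here are illustrative, not the real Datadoc ones):

```python
import logging
import logging.config

log_config = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "simple": {"format": "%(levelname)s %(name)s: %(message)s"},
    },
    "handlers": {
        "console": {"class": "logging.StreamHandler", "formatter": "simple"},
    },
    "root": {"level": "INFO", "handlers": ["console"]},
}

# Applying the configuration wires up handlers, formatters and levels.
logging.config.dictConfig(log_config)
logger = logging.getLogger("datadoc")
```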

Module contents

Code that is used for setting up the logging for the Datadoc app.

datadoc.frontend package

Subpackages

Submodules

datadoc.frontend.constants module

Repository for constant values in Datadoc frontend module.

Module contents

Code relating to Dash and the user interface.

datadoc.frontend.fields package

Submodules

datadoc.frontend.fields.display_base module

Functionality common to displaying dataset and variables metadata.

class DisplayMetadata(identifier, display_name, description, obligatory=False, editable=True)

Bases: ABC

Controls how a given metadata field should be displayed.

Parameters:
  • identifier (str)

  • display_name (str)

  • description (str)

  • obligatory (bool)

  • editable (bool)

description: str
display_name: str
editable: bool = True
identifier: str
obligatory: bool = False
abstract render(component_id, metadata)

Build a component.

Return type:

Component

Parameters:
  • component_id (dict)

  • metadata (BaseModel)

url_encode_shortname_ids(component_id)

Encode the id to handle non-ASCII values.

Return type:

None

Parameters:

component_id (dict)

class MetadataCheckboxField(identifier, display_name, description, obligatory=False, editable=True)

Bases: DisplayMetadata

Controls how a checkbox metadata field should be displayed.

Parameters:
  • identifier (str)

  • display_name (str)

  • description (str)

  • obligatory (bool)

  • editable (bool)

render(component_id, metadata)

Build Checkbox component.

Return type:

Checkbox

Parameters:
  • component_id (dict)

  • metadata (BaseModel)

class MetadataDateField(identifier, display_name, description, obligatory=False, editable=True)

Bases: DisplayMetadata

Controls how fields which define a single date are displayed.

Parameters:
  • identifier (str)

  • display_name (str)

  • description (str)

  • obligatory (bool)

  • editable (bool)

render(component_id, metadata)

Build Input date component.

Return type:

Input

Parameters:
  • component_id (dict)

  • metadata (BaseModel)

class MetadataDropdownField(identifier, display_name, description, obligatory=False, editable=True, options_getter=<class 'list'>)

Bases: DisplayMetadata

Controls how a Dropdown should be displayed.

Parameters:
  • identifier (str)

  • display_name (str)

  • description (str)

  • obligatory (bool)

  • editable (bool)

  • options_getter (Callable[[], list[dict[str, str]]])

options_getter

alias of list

render(component_id, metadata)

Build Dropdown component.

Return type:

Dropdown

Parameters:
  • component_id (dict)

  • metadata (BaseModel)

class MetadataInputField(identifier, display_name, description, obligatory=False, editable=True, type='text', value_getter=<function get_metadata_and_stringify>)

Bases: DisplayMetadata

Controls how an input field should be displayed.

Parameters:
  • identifier (str)

  • display_name (str)

  • description (str)

  • obligatory (bool)

  • editable (bool)

  • type (str)

  • value_getter (Callable[[BaseModel, str], Any])

render(component_id, metadata)

Build an Input component.

Return type:

Input

Parameters:
  • component_id (dict)

  • metadata (BaseModel)

type: str = 'text'
value_getter(metadata, identifier)

Get a metadata value from the model and cast to string.

Return type:

str | None

Parameters:
  • metadata (BaseModel)

  • identifier (str)

class MetadataMultiLanguageField(identifier, display_name, description, obligatory=False, editable=True, id_type='', type='text')

Bases: DisplayMetadata

Controls how fields which support multi-language are displayed.

These are a special case since they return a group of input fields.

Parameters:
  • identifier (str)

  • display_name (str)

  • description (str)

  • obligatory (bool)

  • editable (bool)

  • id_type (str)

  • type (str)

id_type: str = ''
render(component_id, metadata)

Build fieldset group.

Return type:

Fieldset

Parameters:
  • component_id (dict)

  • metadata (BaseModel)

render_input_group(component_id, metadata)

Build section with Input components for each language.

Return type:

Section

Parameters:
  • component_id (dict)

  • metadata (BaseModel)

type: str = 'text'
class MetadataPeriodField(identifier, display_name, description, obligatory=False, editable=True, id_type='')

Bases: DisplayMetadata

Controls how fields which define a time period are displayed.

These are a special case since two fields have a relationship to one another.

Parameters:
  • identifier (str)

  • display_name (str)

  • description (str)

  • obligatory (bool)

  • editable (bool)

  • id_type (str)

id_type: str = ''
render(component_id, metadata)

Build Input date component.

Return type:

Input

Parameters:
  • component_id (dict)

  • metadata (BaseModel)

get_comma_separated_string(metadata, identifier)

Get a metadata value which is a list of strings from the model and convert it to a comma separated string.

Return type:

str

Parameters:
  • metadata (BaseModel)

  • identifier (str)

get_data_source_options()

Collect the data source options.

Return type:

list[dict[str, str]]

get_enum_options(enum)

Generate the list of options based on the currently chosen language.

Return type:

list[dict[str, str]]

Parameters:

enum (type[LanguageStringsEnum])

get_metadata_and_stringify(metadata, identifier)

Get a metadata value from the model and cast to string.

Return type:

str | None

Parameters:
  • metadata (BaseModel)

  • identifier (str)

get_multi_language_metadata_and_stringify(metadata, identifier, language)

Get a metadata value supporting multiple languages from the model.

Return type:

str | None

Parameters:
  • metadata (BaseModel)

  • identifier (str)

  • language (SupportedLanguages)

get_standard_metadata(metadata, identifier)

Get a metadata value from the model.

Return type:

str | list[str] | int | float | bool | date | None

Parameters:
  • metadata (BaseModel)

  • identifier (str)

datadoc.frontend.fields.display_dataset module

Functionality for displaying dataset metadata.

class DatasetIdentifiers(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)

Bases: str, Enum

As defined here: https://statistics-norway.atlassian.net/l/c/aoSfEWJU.

ASSESSMENT = 'assessment'
CONTAINS_DATA_FROM = 'contains_data_from'
CONTAINS_DATA_UNTIL = 'contains_data_until'
CONTAINS_PERSONAL_DATA = 'contains_personal_data'
DATASET_STATE = 'dataset_state'
DATASET_STATUS = 'dataset_status'
DATA_SOURCE = 'data_source'
DESCRIPTION = 'description'
FILE_PATH = 'file_path'
ID = 'id'
KEYWORD = 'keyword'
METADATA_CREATED_BY = 'metadata_created_by'
METADATA_CREATED_DATE = 'metadata_created_date'
METADATA_LAST_UPDATED_BY = 'metadata_last_updated_by'
METADATA_LAST_UPDATED_DATE = 'metadata_last_updated_date'
NAME = 'name'
OWNER = 'owner'
POPULATION_DESCRIPTION = 'population_description'
SHORT_NAME = 'short_name'
SPATIAL_COVERAGE_DESCRIPTION = 'spatial_coverage_description'
SUBJECT_FIELD = 'subject_field'
TEMPORALITY_TYPE = 'temporality_type'
UNIT_TYPE = 'unit_type'
USE_RESTRICTION = 'use_restriction'
USE_RESTRICTION_DATE = 'use_restriction_date'
VERSION = 'version'
VERSION_DESCRIPTION = 'version_description'
get_owner_options()

Collect the owner options.

Return type:

list[dict[str, str]]

get_statistical_subject_options()

Generate the list of options for statistical subject.

Return type:

list[dict[str, str]]

get_unit_type_options()

Collect the unit type options.

Return type:

list[dict[str, str]]

datadoc.frontend.fields.display_variables module

Functionality for displaying variables metadata.

class VariableIdentifiers(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)

Bases: str, Enum

As defined here: https://statistics-norway.atlassian.net/wiki/spaces/MPD/pages/3042869256/Variabelforekomst.

CLASSIFICATION_URI = 'classification_uri'
COMMENT = 'comment'
CONTAINS_DATA_FROM = 'contains_data_from'
CONTAINS_DATA_UNTIL = 'contains_data_until'
DATA_ELEMENT_PATH = 'data_element_path'
DATA_SOURCE = 'data_source'
DATA_TYPE = 'data_type'
DEFINITION_URI = 'definition_uri'
FORMAT = 'format'
IDENTIFIER = 'id'
INVALID_VALUE_DESCRIPTION = 'invalid_value_description'
IS_PERSONAL_DATA = 'is_personal_data'
MEASUREMENT_UNIT = 'measurement_unit'
MULTIPLICATION_FACTOR = 'multiplication_factor'
NAME = 'name'
POPULATION_DESCRIPTION = 'population_description'
SHORT_NAME = 'short_name'
TEMPORALITY_TYPE = 'temporality_type'
VARIABLE_ROLE = 'variable_role'
get_measurement_unit_options()

Collect the measurement unit options.

Return type:

list[dict[str, str]]

Module contents

Functionality for displaying dataset and variables metadata fields.

datadoc.frontend.components package

Submodules

datadoc.frontend.components.builders module

Factory functions for different components are defined here.

class AlertType(color)

Bases: object

Attributes of a concrete alert type.

Parameters:

color (str)

color: str
static get_type(alert_type)

Get a concrete alert type based on the given enum values.

Return type:

AlertType

Parameters:

alert_type (AlertTypes)

class AlertTypes(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)

Bases: Enum

Types of alerts.

ERROR = 3
SUCCESS = 1
WARNING = 2
build_dataset_edit_section(metadata_inputs, dataset, key)

Create edit section for dataset workspace.

Return type:

Section

build_dataset_machine_section(title, metadata_inputs, dataset, key)

Create section for dataset machine generated workspace.

Return type:

Section

build_edit_section(metadata_inputs, variable)

Create input section for variable workspace.

Return type:

Section

build_input_field_section(metadata_fields, side, variable, field_id='')

Create form with input fields for variable workspace.

Return type:

Form


Build link object with text and URL.

Return type:

dict | None

Parameters:
  • text (str)

  • href (str)

build_ssb_accordion(header, key, variable_short_name, children)

Build Accordion for one variable in variable workspace.

Return type:

Accordion

Parameters:
  • header (str)

  • key (dict)

  • variable_short_name (str)

  • children (list)

build_ssb_alert(alert_type, title, message=None, link=None, alert_list=None)

Make a Dash Alert according to SSBs Design System.

Return type:

Alert

Parameters:
  • alert_type (AlertTypes)

  • title (str)

  • message (str | None)

  • link (dict | None)

  • alert_list (list | None)

build_variables_machine_section(metadata_inputs, title, variable)

Create input section for variable workspace.

Return type:

Section

Parameters:
  • metadata_inputs (list)

  • title (str)

  • variable (Variable)

datadoc.frontend.components.control_bars module

Components and layout which are not inside a tab.

build_controls_bar()

Build the Controls Bar.

This contains:
  • A text input to specify the path to a dataset

  • A button to open a dataset

  • A button to save metadata to disk

Return type:

Section

Build footer control bar which resides below all the content.

Return type:

Aside

datadoc.frontend.components.identifiers module

Module contents

All components (UI elements) for datadoc are defined in this package.

When components use repeated code, we should make a factory for that component, as a function in builders.py

datadoc.frontend.callbacks package

Submodules

datadoc.frontend.callbacks.dataset module

Callbacks relating to datasets.

accept_dataset_metadata_date_input(dataset_identifier, contains_data_from, contains_data_until)

Validate and save date range inputs.

Return type:

tuple[bool, str, bool, str]

Parameters:
  • dataset_identifier (DatasetIdentifiers)

  • contains_data_from (str | None)

  • contains_data_until (str | None)

accept_dataset_metadata_input(value, metadata_identifier, language=None)

Handle user inputs of dataset metadata values.

Return type:

tuple[bool, str]

Parameters:
  • value (str | list[str] | int | float | bool | date | None | LanguageStringType)

  • metadata_identifier (str)

  • language (str | None)

open_dataset_handling(n_clicks, file_path, dataset_opened_counter)

Handle errors and other logic around opening a dataset file.

Return type:

tuple[Alert, int]

Parameters:
  • n_clicks (int)

  • file_path (str)

  • dataset_opened_counter (int)

open_file(file_path=None)

Load the given dataset into a DataDocMetadata instance.

Return type:

Datadoc

Parameters:

file_path (str | None)

process_keyword(value)

Convert a comma separated string to a list of strings.

e.g. ‘a,b ,c’ -> [‘a’, ‘b’, ‘c’]

Return type:

list[str]

Parameters:

value (str)
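
A sketch consistent with the example above (the real implementation may handle edge cases such as empty strings differently):

```python
def process_keyword(value: str) -> list[str]:
    # Split on commas and strip surrounding whitespace from each keyword.
    return [keyword.strip() for keyword in value.split(",")]
```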

process_special_cases(value, metadata_identifier, language=None)

Pre-process metadata where needed.

Some types of metadata need processing before being saved to the model. Handle these cases here, other values are returned unchanged.

Return type:

str | list[str] | int | float | bool | date | None | LanguageStringType

Parameters:
  • value (str | list[str] | int | float | bool | date | None | LanguageStringType)

  • metadata_identifier (str)

  • language (str | None)

datadoc.frontend.callbacks.register_callbacks module

All decorated callback functions should be defined here.

Implementations of the callback functionality should be in other functions (in other files), to enable unit testing.

register_callbacks(app)

Define and register callbacks.

Return type:

None

Parameters:

app (Dash)

datadoc.frontend.callbacks.utils module

Functions which aren’t directly called from a decorated callback.

check_variable_names(variables)

Check whether variable short names comply with the naming standard.

Return type:

Alert | None

Returns:

An SSB alert with a message listing the names that don't comply with the naming standard.

Parameters:

variables (list)

dataset_control(error_message)

Check obligatory metadata values for dataset.

Parameters:

error_message (str) – A message generated by ObligatoryDatasetWarning containing names of fields missing value.

Return type:

Alert | None

find_existing_language_string(metadata_model_object, value, metadata_identifier, language)

Get or create a LanguageStrings object and return it.

Return type:

LanguageStringType | None

Parameters:
  • metadata_model_object (BaseModel)

  • value (str)

  • metadata_identifier (str)

  • language (str)

get_dataset_path()

Extract the path to the dataset from the potential sources.

Return type:

Path | CloudPath | str

parse_and_validate_dates(start_date, end_date)

Parse and validate the given dates.

Return type:

tuple[datetime | None, datetime | None]

Parameters:
  • start_date (str | datetime | None)

  • end_date (str | datetime | None)

Validate that:
  • The dates are in YYYY-MM-DD format

  • The start date is earlier or identical to the end date.

Examples:
>>> parse_and_validate_dates("2021-01-01", "2021-01-01")
(datetime.datetime(2021, 1, 1, 0, 0, tzinfo=datetime.timezone.utc), datetime.datetime(2021, 1, 1, 0, 0, tzinfo=datetime.timezone.utc))

>>> parse_and_validate_dates("1990-01-01", "2050-01-01")
(datetime.datetime(1990, 1, 1, 0, 0, tzinfo=datetime.timezone.utc), datetime.datetime(2050, 1, 1, 0, 0, tzinfo=datetime.timezone.utc))
>>> parse_and_validate_dates(None, None)
(None, None)
>>> parse_and_validate_dates("1st January 2021", "1st January 2021")
Traceback (most recent call last):
...
ValueError: Validation error: Expected an ISO 8601-like string, but was given '1st January 2021'. Try passing in a format string to resolve this.
>>> parse_and_validate_dates(datetime.datetime(2050, 1, 1, 0, 0, tzinfo=datetime.timezone.utc), "1990-01-01")
Traceback (most recent call last):
...
ValueError: Validation error: contains_data_from must be the same or earlier date than contains_data_until
>>> parse_and_validate_dates("2050-01-01", "1990-01-01")
Traceback (most recent call last):
...
ValueError: Validation error: contains_data_from must be the same or earlier date than contains_data_until
render_tabs(tab)

Render tab content.

Return type:

Article | None

Parameters:

tab (str)

save_metadata_and_generate_alerts(metadata)

Save the metadata document to disk and check obligatory metadata.

Return type:

list

Returns:

List of alerts including obligatory metadata warnings if missing, and success alert if metadata is saved correctly.

Parameters:

metadata (Datadoc)

variables_control(error_message, variables)

Check obligatory metadata for variables and return an alert if any metadata is missing.

This function parses an error message to identify missing obligatory metadata fields for variables. If missing metadata is found, it generates an alert.

Parameters:
  • error_message (str | None) – A message generated by ObligatoryVariableWarning containing the variable short name and a list of field names with missing values.

  • variables (list) – list of datadoc variables

Return type:

Alert | None

Returns:

An alert object if there are missing metadata fields, otherwise None.

datadoc.frontend.callbacks.variables module

Callback functions to do with variables metadata.

accept_variable_metadata_date_input(variable_identifier, variable_short_name, contains_data_from, contains_data_until)

Validate and save date range inputs.

Return type:

tuple[bool, str, bool, str]

Parameters:
  • variable_identifier (VariableIdentifiers)

  • variable_short_name (str)

  • contains_data_from (str)

  • contains_data_until (str)

accept_variable_metadata_input(value, variable_short_name, metadata_field, language=None)

Validate and save the value when variable metadata is updated.

Returns an error message if an exception was raised, otherwise returns None.

Return type:

str | None

Parameters:
  • value (str | list[str] | int | float | bool | date | None)

  • variable_short_name (str)

  • metadata_field (str)

  • language (str | None)

handle_multi_language_metadata(metadata_field, new_value, updated_row_id, language)

Handle updates to fields which support multiple languages.

Return type:

str | list[str] | int | float | bool | date | None | LanguageStringType

Parameters:
  • metadata_field (str)

  • new_value (str | list[str] | int | float | bool | date | None | LanguageStringType)

  • updated_row_id (str)

  • language (str)

populate_variables_workspace(variables, search_query, dataset_opened_counter)

Create variable workspace with accordions for variables.

Allows for filtering which variables are displayed via the search box.

Return type:

list

Parameters:
  • variables (list[Variable])

  • search_query (str)

  • dataset_opened_counter (int)

set_variables_value_multilanguage_inherit_dataset_values(value, metadata_identifier, language)

Set variable multilanguage value based on dataset value.

Return type:

None

Parameters:
  • value (str | list[str] | int | float | bool | date | None | LanguageStringType)

  • metadata_identifier (str)

  • language (str)

set_variables_values_inherit_dataset_derived_date_values()

Set variable date values if variables date values are not set.

Covers the case of inherited dataset date values, where dates are derived from the dataset path and must be set on file opening.

Return type:

None

set_variables_values_inherit_dataset_values(value, metadata_identifier)

Set variable value based on dataset value.

Return type:

None

Parameters:
  • value (str | list[str] | int | float | bool | date | None | LanguageStringType)

  • metadata_identifier (str)

variable_identifier(dataset_identifier)

Pair corresponding identifiers.

Return type:

str | None

Parameters:

dataset_identifier (str)

variable_identifier_multilanguage(dataset_identifier)

Pair corresponding identifiers for multilanguage fields.

Return type:

str | None

Parameters:

dataset_identifier (str)

Module contents

Dash callback functions.

The actual decorated callbacks should be very minimal functions which call other functions where the logic lives. This is done to support unit testing because the decorated functions are difficult to test.

The functions where the logic lives should be categorised into files.
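
The division of labour described above can be sketched without Dash installed. A no-op decorator stands in for @app.callback, and the function names are illustrative:

```python
def build_missing_fields_message(missing_fields: list[str]) -> str:
    # All real logic lives in a plain function that unit tests call directly.
    return "Missing obligatory fields: " + ", ".join(missing_fields)


def callback(func):
    # No-op stand-in for Dash's @app.callback(Output(...), Input(...)) decorator.
    return func


@callback
def show_missing_fields_alert(missing_fields: list[str]) -> str:
    # The decorated callback is deliberately minimal: it only delegates.
    return build_missing_fields_message(missing_fields)
```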