Traditional Glossary

  • a

  • Ad Hoc Report
    A report created for a unique, one-time request.
  • Algorithm
    A mathematical formula or set of statements that performs a specific calculation on a set of data, either to solve a problem or to produce a numerical result.
  • API
    The abbreviation of Application Programming Interface: a set of protocols, standards, and tools that software applications use to communicate with one another, and which helps in building web-based software (see the sketch at the end of this section).
  • Application
    A software environment that supports one or more business processes or informational needs.
  • Archive
    A data store that is not used for operational or informational purposes but only for storing data that can be brought back to an operational or informational environment.
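
As a concrete illustration of the API entry above, here is a minimal Python sketch that calls a hypothetical JSON-over-HTTP endpoint; the URL and field names are invented and do not refer to a real service.

```python
import json
import urllib.request

# Hypothetical endpoint -- any JSON-over-HTTP API would follow the same pattern.
URL = "https://api.example.com/v1/customers/42"

def fetch_customer(url: str) -> dict:
    """Call the API over HTTP and decode its JSON response into a dict."""
    with urllib.request.urlopen(url) as response:
        return json.load(response)

if __name__ == "__main__":
    customer = fetch_customer(URL)
    print(customer.get("name"), customer.get("country"))
```
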
  • b

  • Business Intelligence (BI)
    A term used for the ability of an enterprise to gain a deeper understanding of its current state and to enable data-driven decision making by exploring and analysing its data resources.
  • Business Process Model
    A graphical representation of a business process, which may be cross-functional or cross-organisational. The model may provide different levels of detail.
  • Business Rule
    A rule that governs the behaviour of a business by defining policies and procedures, and tends to establish data integrity guidelines.
  • Business Rules Engine
    A service that executes business rules that are declared rather than coded in a programming language (see the sketch at the end of this section).
  • Business Term
    A phrase, concept or terminology that is related to any aspect of an enterprise.
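
To make the "declared rather than coded" idea in the Business Rules Engine entry above concrete, here is a minimal Python sketch of such an engine; the rules and sample record are invented for illustration.

```python
# Each rule is declared as data (name, predicate, message) rather than
# being hard-coded into the application logic.
RULES = [
    ("age_must_be_positive", lambda r: r["age"] > 0, "age must be > 0"),
    ("country_code_is_iso2", lambda r: len(r["country"]) == 2, "country must be a 2-letter code"),
]

def evaluate(record: dict) -> list[str]:
    """Run every declared rule against a record and collect the violations."""
    return [msg for _, predicate, msg in RULES if not predicate(record)]

print(evaluate({"age": -5, "country": "GBR"}))
# ['age must be > 0', 'country must be a 2-letter code']
```
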
  • c

  • Change Control
    A defined process of an organisation to ensure that any technology, infrastructure, service, product, system or application is amended only in accordance with agreed-upon rules. This may be overseen by a formal Change Control Board of the organisation.
  • Cloud Computing
    In short "Cloud". It's used to refer to a practice of providing application, service, or storing, managing, processing data on a server hosted remotely.
  • Completeness
    A data quality metric that measures the percentage of required data in a column, table or file that is actually known (see the sketch at the end of this section).
  • Configuration
    The selection of specific options and the specification of parameters that make a generalized software application or package behave in a precise way.
  • Consistency
    The degree to which a value, column or entity is consistent with another value, column or entity. Usually this is implemented in Data Quality rules to cross-reference facts such as gender and title.
  • Cross-Reference Management
    Some entities can be identified in many different ways in business terms (i.e. excluding surrogate keys). An example would be Asset being identified by CUSIP, SEDOL, ISIN, etc. The cross-reference between the various identification schemes has to be managed. This is cross-reference management.
  • Crowdsourcing
    The practice of publishing a given problem or task to the public in order to find a solution.
  • CRUD
    Describes the following set of access rights for data: Create, Read, Update, Delete (see the sketch below).
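
To illustrate the CRUD entry just above, here is a minimal Python sketch using the standard library's sqlite3 module; the table and values are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")

# Create
conn.execute("INSERT INTO customer (id, name) VALUES (1, 'Alice')")
# Read
print(conn.execute("SELECT name FROM customer WHERE id = 1").fetchone())
# Update
conn.execute("UPDATE customer SET name = 'Alicia' WHERE id = 1")
# Delete
conn.execute("DELETE FROM customer WHERE id = 1")
conn.close()
```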
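
Further up in this section, Completeness was defined as the percentage of required data that is known; the following sketch, using invented sample values, shows one way that metric might be computed.

```python
def completeness(values: list) -> float:
    """Percentage of values that are known (not None and not empty)."""
    known = sum(1 for v in values if v not in (None, ""))
    return 100.0 * known / len(values) if values else 0.0

# Invented sample column: two of five e-mail addresses are missing.
emails = ["a@example.com", None, "b@example.com", "", "c@example.com"]
print(f"Completeness: {completeness(emails):.0f}%")   # Completeness: 60%
```
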
  • d

  • Data Analytics
    An activity to examine raw data in order to derive information or draw conclusions. It can be a manual exercise or application-driven.
  • Data Architecture
    An information technology discipline focused on the standards, rules and data models used to govern the data systems and architecture of an organisation.
  • Data Center
    A centralised facility, either belonging to a single organisation or used by many as a service, for accommodating computer systems and related devices such as servers and data storage.
  • Data Cleansing
    An activity to review and amend data in order to achieve a higher data quality level. For example, correcting typos, adding missing data or removing duplicates.
  • Data Dictionary
    A metadata repository that stores data about the data and database structures of an organisation. It provides a list of all data elements along with their data types, structures, source systems and information about their usage.
  • Data Entitlement
    A rule that determines user access level (also known as CRUD = Create, Read, Update, Delete) to a specific data element.
  • Data Federation
    A methodology of linking autonomous data sources together and providing access to them using a centralised interface without storing the data in a central repository, like a Data Warehouse. It is a form of data virtualisation.
  • Data Feed
    A mechanism for receiving real-time updates of data, for instance RSS (Rich Site Summary).
  • Data Formatting
    The transformation of data in terms of its structure, representation, and code values to meet a specified target requirement. Often this involves placing data in non-relational objects such as flat files. The actual target may be different to the dataset in which the data is transported (e.g. a flat file may move data to a remote database).
  • Data Governance
    A set of principles, rules and procedures to manage the availability, integrity, usability and ownership of data across an organisation. These concepts may be defined and enforced by a governing body or council within the organisation.
  • Data Governance Framework
    A logical structure for facilitating and coordinating a data governance program of an organisation.
  • Data Integration
    Data is integrated when it is semantically reconciled, all foreign keys are resolved, and transformations and computations associated with new data being integrated are re-executed. The latter is often referred to as business rules.
  • Data Lineage
    Also known as a data flow diagram. A visual representation of an end-to-end data flow in an organisation: how data flows from the source systems through all applications, services, databases and messages right to the end systems and reports.
  • Data Management
    A practice for governing the full data lifecycle of an organisation.
  • Data Management Association (DAMA)
    An international non-profit organisation also known as DAMA. Its members are professionals with technical and/or business backgrounds who are dedicated to the continuous improvement of information and data management practices.
  • Data Mapping
    The activity of assigning target data elements to data elements in the source system. It may be a one-to-one, many-to-one or one-to-many association between elements, possibly including transformations.
  • Data Mart
    A subset of a data warehouse built to support the data-driven decision making of a specific team or business line.
  • Data Migration
    A process of moving data from one computer system or storage type to another one. It may be part of a system implementation, upgrade or consolidation exercise.
  • Data Minimisation
    Data that is collected and processed should not be retained or further used unless necessary. This helps prevent data misuse and leaks.
  • Data Mining
    A computational process for analysing large amounts of data in order to discover relationships, patterns and trends, using a variety of techniques from artificial intelligence, machine learning and statistics.
  • Data Modeling
    A process for defining and analysing data requirements needed to support the business processes within the scope of corresponding information systems in organizations. There are three different types of data models produced while progressing from requirements through data objects and their relationships to the actual database to be used for the information system: conceptual model, logical model and physical model.
  • Data Movement
    A process to copy data reliably from a source to a destination. The format of the data during transport may be different to that in which either the source or the destination store it.
  • Data Owner
    An individual responsible for setting policies, standards and practices around a certain data element or domain. The responsibilities are defined by the data governance body of the organisation.
  • Data Profiling
    A process of reverse-engineering a particular data set in order to determine how accurate the available metadata of that data set is. Furthermore, it may discover inaccuracy, redundancy or inconsistency in the data set (see the sketch at the end of this section).
  • Data Publication
    The staging of a dataset so that it can be moved to a data subscriber. The subscriber may be notified that the instance of the data is available to be moved. Alternatively the publisher may "push" it to the subscriber, or there may be some other notification and movement mechanism. The data is often filtered and formatted in a special way for the subscriber.
  • Data Quality Assessment
    A process of discovering inconsistencies, inaccuracy, incompleteness, redundancy and other anomalies in a dataset in order to provide comprehensive, consistent, relevant and timely data for data consumers.
  • Data Quality Metric
    Quantitative (and possibly qualitative) measure of data quality obtained from the application of data quality rules.
  • Data Science
    An interdisciplinary field of information technology which applies a variety of techniques from different fields, such as data mining, machine learning, computer programming and data visualisation, in order to solve complex problems and discover insights from massive datasets.
  • Data Standard
    A set of rules agreed in order to establish common format and meaning of data. It may cover guidelines for naming conventions, standard formats, definitions, business rules and valid values.
  • Data Steward
    An individual responsible for enforcing the agreed-upon policies and maintaining high data quality, accessibility, performance and security around a certain data element or domain. The responsibilities are defined by the data governance body of the organisation.
  • Data Store
    A repository for storing data persistently, which may vary from a simple file format through data tables to a collection of databases.
  • Data Transformation Rule
    A business rule for changing the representation of a data item from one state to another. It may include steps of amending data format, joining data from multiple sources, calculating aggregates or deriving new values. It is used in data profiling or data migration exercises.
  • Data Validation Rule
    A data quality rule for testing the representation of data rather than any aspect of its meaning, content, or business relationship to other data. It may include valid-value, data format or datatype checks (see the sketch at the end of this section).
  • Data Virtualisation
    A technology that provides data consumers with a consistent platform for accessing disparate data sources with varying technical characteristics.
  • Data Visualisation
    Visual representation of objects in order to support exploration, examination, and communication of data. It makes it possible to interact with data dynamically and to drill down to atomic elements.
  • Data Warehouse
    A relational database for integrating disparate data sources of an organisation in order to provide a central reporting and data analysis platform. Also known as an Enterprise Data Warehouse (EDW), or DW/DWH for short.
  • Database
    A repository that stores a large collection of data organised for easy access, management and updating. It has many types, such as relational database, NoSQL database, graph database or document store.
  • Database Extract
    The copying of data from a source database, in the format of the source database. The data may be filtered and/or placed in a view of that database.
  • Database Load
    A process by which a dataset containing multiple records is loaded into a database at one time.
  • Datatype
    A standard for representing a particular class of data in a relational database management system. The classes of data recognized by different vendors and products vary.
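
As a rough sketch of what Data Profiling (defined earlier in this section) might look like in code, the example below summarises a single, invented column: row count, null count, distinct values and the most frequent value.

```python
from collections import Counter

# Invented sample column to profile.
country = ["UK", "US", None, "UK", "DE", "UK", None]

profile = {
    "rows":      len(country),
    "nulls":     sum(1 for v in country if v is None),
    "distinct":  len({v for v in country if v is not None}),
    "top_value": Counter(v for v in country if v is not None).most_common(1)[0],
}
print(profile)
# {'rows': 7, 'nulls': 2, 'distinct': 3, 'top_value': ('UK', 3)}
```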
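
And as a sketch of the Data Validation Rule entry earlier in this section, the example below tests only representation (format and datatype), not meaning; the field formats checked here are assumptions chosen for illustration.

```python
import re
from datetime import datetime

def valid_isin(value: str) -> bool:
    """Format check only: 2 letters, 9 alphanumerics, 1 check digit."""
    return bool(re.fullmatch(r"[A-Z]{2}[A-Z0-9]{9}[0-9]", value))

def valid_date(value: str) -> bool:
    """Datatype check: the string must parse as an ISO date."""
    try:
        datetime.strptime(value, "%Y-%m-%d")
        return True
    except ValueError:
        return False

print(valid_isin("US0378331005"), valid_date("2024-02-30"))   # True False
```
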
  • e

  • Exception Log
    A record of any errors that are detected at any point in any aspect of the operation of a piece of software.
  • Extract-Transform-and-Load (ETL)
    A technology that combines extracting data from a source database with some degree of reformatting and quality checking, and the movement of the data to a target environment. The degree to which the data can be formatted and quality checked is limited (see the sketch below).
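
As a minimal sketch of the Extract-Transform-and-Load pattern described above, the example below extracts rows from an in-memory CSV source, applies a simple reformatting transformation, and loads the result into an in-memory SQLite target; the data and column names are invented.

```python
import csv
import io
import sqlite3

# Extract: read rows from a (here in-memory) CSV source.
source = io.StringIO("id,name,amount\n1, alice ,10.5\n2, bob ,7\n")
rows = list(csv.DictReader(source))

# Transform: trim and capitalise names, cast amounts to float.
rows = [{"id": int(r["id"]), "name": r["name"].strip().title(),
         "amount": float(r["amount"])} for r in rows]

# Load: write the transformed rows into the target database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payment (id INTEGER, name TEXT, amount REAL)")
conn.executemany("INSERT INTO payment VALUES (:id, :name, :amount)", rows)
print(conn.execute("SELECT * FROM payment").fetchall())
# [(1, 'Alice', 10.5), (2, 'Bob', 7.0)]
conn.close()
```
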
  • f

  • Flat File
    A stand-alone batch data set that is purely a character representation of data, and has no indexing or internal structure for relating data content.
  • i

  • Integrity
    The degree to which a value, column or entity's relationships are complete. Usually this is implemented in Data Quality rules to identify missing parent, child or other relationships.
  • m

  • Master Data
    The most commonly used data within the organisation.
  • Message
    A packet of data that is passed from application to application. This movement is in real time, although the applications may not produce or consume it in real time. The message contains metadata for transport control and may contain additional metadata (e.g. XML tags). Ideally a message (or set of messages) will model a business event.
  • Metadata Repository
    A data store that is populated with metadata rather than with data, and that usually serves a specific requirement, e.g. end-to-end data lineage or risk management.
  • o

  • Operational Metadata
    Metadata that is generated as a process executes. This certainly includes process metrics. However, it may also include things like the results of applying data quality validation rules.
  • p

  • Process Control
    A high level service or set of services that control or test a process to achieve a desirable outcome.
  • r

  • Reference Architecture
    An architecture that is not specific to any enterprise or specific context, but represents the current IT industry best practice for the domain being designed.
  • s

  • Semantic Rule
    A data quality rule for testing some aspect of the content, or meaning of data, or its relationship to other data. It may include data cross-validation of multiple fields.
  • Service
    A set of processes that are logically related and perform a distinct task. The service can be provided by one or more components. For example, a business rules engine is a service that can invoke many processes, including data quality checking and data transformation.
  • u

  • Uniqueness
    The degree to which a value, column or entity is unique within a dataset. Usually this is implemented in Data Quality rules to identify duplicates (see the sketch below).
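
As a small sketch of how the Uniqueness metric above might be checked, the example below counts duplicate values in an invented column and reports the percentage of distinct values.

```python
from collections import Counter

# Invented sample column of customer identifiers.
customer_ids = ["C001", "C002", "C003", "C002", "C004", "C002"]

counts = Counter(customer_ids)
duplicates = {value: n for value, n in counts.items() if n > 1}
uniqueness = 100.0 * len(counts) / len(customer_ids)

print(f"Duplicates: {duplicates}")          # Duplicates: {'C002': 3}
print(f"Uniqueness: {uniqueness:.0f}%")     # Uniqueness: 67%
```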