Wednesday, September 24, 2025

Information Classification and Role Management

In continuation of the previous post:

Approach for Implementation

 

Step 1: Assign Role Clearance Level (One-time activity)

• Assign a single clearance level to every role in the User Management screen; this clearance defines the highest sensitivity of data the role is permitted to access.

| Role | Clearance Level |
|---|---|
| Admin | Secret |
| Finance Manager | Confidential |
| Lab Manager | Confidential |
| Lab Technician | Internal |

 

Step 2: Classify Attributes (One-time activity via DACPAC)

• Classify each attribute once; classifications are stored directly in the database via DACPAC seed scripts (a sketch of the seeded catalogue follows the table below).

• Attributes appear clearly labelled with their classification levels within the Role Management UI.

| Entity | Attribute Name | Classification |
|---|---|---|
| Account | Account Number | Secret |
| Account | Financial Data | Confidential |
| Instrument | Serial Number | Confidential |
| Contact | Email Address | Internal |
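For illustration, the seeded classifications above could surface to services as a typed catalogue like the following; the type and property names are assumptions, not the actual schema:

```typescript
// Classification levels, from least to most sensitive.
type Classification = "Public" | "Internal" | "Confidential" | "Secret";

interface AttributeClassification {
  entity: string;
  attribute: string;
  classification: Classification;
}

// Mirrors the DACPAC-seeded rows shown in the table above.
const attributeClassifications: AttributeClassification[] = [
  { entity: "Account",    attribute: "Account Number", classification: "Secret" },
  { entity: "Account",    attribute: "Financial Data", classification: "Confidential" },
  { entity: "Instrument", attribute: "Serial Number",  classification: "Confidential" },
  { entity: "Contact",    attribute: "Email Address",  classification: "Internal" },
];
```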

 

Step 3: Automatic Permission (Minimal effort)

• View Permissions:

  • A default "View" permission is granted automatically if the role's assigned Clearance Level >= the attribute's Classification Level (a sketch of this check follows the table below).

| Role Clearance | Attribute Classification | Automatically Assigned View Permission |
|---|---|---|
| Secret | Secret, Confidential, Internal, Public | Yes |
| Confidential | Confidential, Internal, Public | Yes |
| Internal | Internal, Public | Yes |
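A minimal sketch of this comparison, assuming the four levels rank Public < Internal < Confidential < Secret:

```typescript
// Rank each level so "higher clearance" is a simple numeric comparison.
const rank: Record<string, number> = { Public: 0, Internal: 1, Confidential: 2, Secret: 3 };

// Default View permission: granted when the role's clearance meets or
// exceeds the attribute's classification.
function hasDefaultView(roleClearance: string, attributeClassification: string): boolean {
  return rank[roleClearance] >= rank[attributeClassification];
}

// Examples matching the table above:
console.log(hasDefaultView("Confidential", "Internal"));  // true
console.log(hasDefaultView("Internal", "Confidential"));  // false
```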

 

• Edit Permissions:

  • No Edit permission is inherited automatically based on clearance level.

  • Edit permissions are explicitly assigned by administrators via the Role Management UI.

Note:

• If a role has feature-level Edit permission, it implicitly has View permission, and all attributes within that feature (entity) automatically inherit Edit permission by default.

 

Step 4: Role Management (Overrides Only, Exception-based)

o   Role Management UI can override any permissions inherited by the system

·        Inherited permissions are clearly displayed by default (based on classification and role clearance).

·        Admins explicitly manage deviations only (typically minimal).

o   Planned Usability Enhancements:

·        "View All" and "Edit All" toggles for bulk actions.

  

Specific Use Case Examples

  1. Use Case 1: Feature-level Edit Permission
    • Granting Edit permission to a feature (entity) automatically grants View and Edit permissions to all attributes within that entity.
  2. Use Case 2: Partial Edit Permissions (Attribute-level exceptions)

UI Form Behavior:

    • Fields without Edit permission are read-only.
    • On Save (POST/PUT), only authorized edited attributes are submitted.
    • The backend explicitly validates submissions; unauthorized edits trigger explicit authorization errors (see the sketch below).
    • Transactions succeed only if no unauthorized attributes are submitted.
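A minimal sketch of the server-side validation step; the function name and payload shape are assumptions:

```typescript
interface FieldPermission { attribute: string; canEdit: boolean; }

// Hypothetical server-side guard: reject the whole transaction if the
// payload contains any attribute the role cannot edit.
function validateEdit(
  payload: Record<string, unknown>,
  permissions: FieldPermission[],
): { ok: boolean; unauthorized: string[] } {
  const editable = new Set(
    permissions.filter((p) => p.canEdit).map((p) => p.attribute),
  );
  const unauthorized = Object.keys(payload).filter((a) => !editable.has(a));
  return { ok: unauthorized.length === 0, unauthorized };
}

// A submission touching a read-only field fails with an explicit error:
const result = validateEdit(
  { "Financial Data": 1200, "Account Number": "ACC-1" },
  [
    { attribute: "Financial Data", canEdit: true },
    { attribute: "Account Number", canEdit: false },
  ],
);
// result: { ok: false, unauthorized: ["Account Number"] }
```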
  3. Use Case 3: Bulk Imports (CSF Import Scenario)
    • A CSF import contains 4 samples; the user lacks Edit permission on 2 attributes.
    • The import succeeds for samples that do not involve restricted attributes.
    • The import explicitly fails or skips samples that touch attributes the user cannot edit, providing clear feedback to the user (see the sketch below):

"Some attributes weren't imported due to insufficient permissions."

  4. Use Case 4: Masking Behaviour
    • If a user has sufficient clearance but the View permission is disabled, the attribute appears masked in the UI (e.g., "****").
    • If a user has the View permission enabled but lacks sufficient clearance, the attribute likewise appears masked.
    • Both sufficient clearance and an explicit View permission are required to see an attribute's value.

| Scenario | Outcome | UX Benefit |
|---|---|---|
| Partial edit permissions | Non-editable fields disabled | Prevents confusion |
| No View permission | Attribute visible as masked (****) | Clearly indicates a permission/clearance issue |
| Insufficient clearance | Attribute visible as masked (****) | Clearly indicates a permission/clearance issue |

 


Field Level Authorization and Data Classification

Authorization architecture

• Implement field-level role management, allowing administrators to define View and Edit permissions for individual data fields based on user roles.

• Incorporate data classification (e.g., Secret, Confidential, Internal, Public) for data fields and corresponding clearance levels for roles.

• Implement data masking for sensitive fields when a user lacks sufficient permission or clearance to view the actual data.










UI Enhancements

  1. Extend the Hierarchy Tree:

The main tree structure needs to be extended to include Fields as the lowest level under Entities.

  • New structure: Module -> Entity -> Field

The existing "Action" items (like Edit Account, Export Account) representing feature-level permissions should likely remain, grouped under Module. (An illustrative sketch of the extended tree model appears after this list.)

  2. Introduce Field-Level View/Edit Controls:

New checkboxes: add specific View and Edit checkboxes that appear only next to the Field-level items in the tree.

  • These checkboxes control the permissions stored in the new RoleFieldPermission table.
  • These could be new columns aligned similarly to the existing Read/Write, but clearly designated for fields.
  3. Existing Read/Write Controls:

The existing Read and Write checkboxes should remain for the Module, Feature Group, and Action level items.

  4. Add Classification Column:

Introduce a new read-only column in the tree/list view, aligned with the rows.

  • For rows corresponding to Fields, this column should display the field's ClassificationLevel (e.g., 'Public', 'Internal', 'Confidential', 'Secret') fetched from the FieldClassification table.
  5. Display Role Clearance:

Somewhere near the selected Role name (e.g., below the dropdown), display the MaxClearanceLevel assigned to that Role (read-only).

  6. Add Aggregate Checkboxes:

Add aggregate View and Edit checkboxes at the Entity and Module levels within the tree.

These reflect and control the state of all underlying field permissions within that scope; clicking them allows bulk grant/revoke operations for fields.

  7. Add Search/Filtering:

Include a search input above the tree to allow administrators to quickly filter the potentially long list of fields by name.

  8. Update Save/Copy Logic:

The Save button's action now needs to update potentially both the RolePermission (for feature-level changes) and RoleFieldPermission (for field-level changes) tables.

The Copy Privileges functionality needs to be updated to intelligently copy both sets of permissions if desired.
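To make the extended structure concrete, here is an illustrative TypeScript model of a tree node; all names (NodeType, PermissionTreeNode, the flags) are assumptions for illustration, not the actual schema:

```typescript
// One node of the extended permission tree: Module -> Entity -> Field,
// with the existing Action items kept under Module.
type NodeType = "Module" | "FeatureGroup" | "Action" | "Entity" | "Field";

interface PermissionTreeNode {
  type: NodeType;
  name: string;
  read?: boolean;          // existing Read/Write at Module/FeatureGroup/Action level
  write?: boolean;
  view?: boolean;          // new field-level View/Edit (Field rows only)
  edit?: boolean;
  classification?: string; // read-only classification column, Field rows only
  children?: PermissionTreeNode[];
}
```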

 

User Management

Role Clearance Levels: Each role is assigned a maximum sensitivity level it can access.

For instance, a Lab Technician role might have clearance = Internal (can see Class 3 and Class 4 data, but not Confidential or Secret), whereas a Lab Manager might have clearance = Confidential (Class 2) and an Admin clearance = Secret (Class 1).

We will represent this in the Role table (e.g. a column MaxSensitivity or numeric level). The Authorization service will interpret this such that:

  • A user’s effective clearance is the highest of any role they possess.
  • If a user has multiple roles, whichever role grants the higher clearance prevails, since that user is trusted up to that level.
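A minimal sketch of that rule, assuming the levels rank Public < Internal < Confidential < Secret:

```typescript
const rank: Record<string, number> = { Public: 0, Internal: 1, Confidential: 2, Secret: 3 };

// A user's effective clearance is the highest clearance of any role held.
function effectiveClearance(roleClearances: string[]): string {
  return roleClearances.reduce((best, c) => (rank[c] > rank[best] ? c : best), "Public");
}

console.log(effectiveClearance(["Internal", "Confidential"])); // "Confidential"
```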

Masking

Server-Side Masking Implementation: when a microservice determines that a field's value should not be revealed, instead of dropping it, it replaces the value with a masked representation:

  • Use a constant string or asterisks of matching length. E.g., the customer name Peter Lynch might be sent as *********ch (preserving the last 2 characters).
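A minimal sketch of such a masking helper; the function name and suffix length are illustrative:

```typescript
// Server-side mask: replace all but the trailing characters with asterisks,
// preserving the original length (e.g. "Peter Lynch" -> "*********ch").
function mask(value: string, visibleSuffix = 2): string {
  if (value.length <= visibleSuffix) return "*".repeat(value.length);
  return "*".repeat(value.length - visibleSuffix) + value.slice(-visibleSuffix);
}

console.log(mask("Peter Lynch")); // "*********ch"
```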

Authorization Logic:

When a microservice is determining whether to show a field, it must consider both the user's explicit field permission and the user's clearance level:

  • Role-based check: does the user's role normally allow access to this field? (From the field-level permissions discussed above.)
  • Classification check: is the field's sensitivity <= the user's clearance level? If not, the field must be treated as disallowed, even if the role would otherwise permit it.
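Combining the two checks, a sketch of the per-field decision; names are illustrative, and "****" stands in for the masked representation above:

```typescript
const rank: Record<string, number> = { Public: 0, Internal: 1, Confidential: 2, Secret: 3 };

// Both checks must pass: the explicit field-level View permission AND the
// classification check against the user's clearance. Otherwise, mask.
function resolveField(
  value: string,
  hasViewPermission: boolean,
  fieldClassification: string,
  userClearance: string,
): string {
  const clearanceOk = rank[fieldClassification] <= rank[userClearance];
  return hasViewPermission && clearanceOk ? value : "****";
}
```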
ER Diagram



Friday, November 1, 2024

Barcode Printing with AWS AppStream

Amazon AppStream 2.0 supports local printer redirection, enabling users to print documents, including barcodes, from their streaming applications to printers connected to their local computers. The section below explores some of the options we evaluated for barcode printing on ZPL printers.


  1. Direct ZPL Command Printing:
  • When sending ZPL commands from a local laptop using the Zebra JavaScript SDK, the printer successfully receives and prints the output as intended.
  • However, when attempting the same from within the AppStream environment, we consistently encounter connection refusals.
  • AppStream instances run in a cloud-based, isolated environment, which means that direct access to local printers over HTTP (e.g., 127.0.0.1:9100) is not possible. This is why the connection succeeds locally but fails when attempted from AppStream.
  • The Zebra SDK works well in a local setting by communicating directly with printers over localhost. In AppStream, however, "localhost" refers to the AppStream server itself, not the client machine, leading to a connection refusal.
  2. Image and PDF Printing:
  • Store barcodes as images or PDFs and print from AppStream using print job redirection and local printing.
    • Local computer: configure printer preferences, adjustments, label types, etc., and save these as default preferences.
    • In AppStream: select the saved local preferences and print; the job prints successfully. With this process, a barcode can be printed from AppStream using the barcode settings configured on the local computer.

      

Possible Solutions Moving Forward

  • Option 1 - Print Job Redirection: leverage print job redirection for generic print jobs such as PDFs or images, and print the barcode images/PDFs from AppStream (as above).
  • Option 2 - Save barcode images/PDFs to a local path mapped to AppStream, allowing users to select and print them manually.
  • Option 3 - Save barcode images/PDFs to a local path mapped to AppStream, and develop a listener application on the local machine that monitors a specific folder for new barcode files and automatically prints them (see the sketch below).
  • Option 4 - Utilize a backend service to handle printing tasks (refer to the earlier post), which communicates with network printers to execute print jobs using ZPL commands.
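For Option 3, a minimal Node/TypeScript sketch of such a listener, assuming a watched folder mapped to AppStream and a Zebra printer reachable on the raw port 9100; the paths and printer address are placeholders:

```typescript
// Hypothetical local listener: watch a folder the AppStream session writes
// to, and send any new ZPL file to the printer's raw port 9100.
import { watch, readFileSync, existsSync } from "fs";
import { join } from "path";
import { Socket } from "net";

const WATCH_DIR = "C:/AppStreamShare/labels"; // folder mapped to AppStream (placeholder)
const PRINTER_HOST = "192.168.1.50";          // local Zebra printer (placeholder)
const PRINTER_PORT = 9100;                    // standard raw ZPL port

watch(WATCH_DIR, (_event, filename) => {
  if (!filename || !filename.endsWith(".zpl")) return;
  const path = join(WATCH_DIR, filename);
  if (!existsSync(path)) return; // file may already have been processed/removed
  const zpl = readFileSync(path, "utf8");
  const socket = new Socket();
  socket.connect(PRINTER_PORT, PRINTER_HOST, () => {
    socket.end(zpl); // send the raw ZPL commands, then close the connection
    console.log(`Printed ${filename}`);
  });
  socket.on("error", (err) => console.error(`Print failed: ${err.message}`));
});
```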

Friday, October 18, 2024

Dynamic Entity Attribute Value Model

Creating a database model that supports dynamic entity creation requires a flexible and scalable design. The goal is to design tables that can accommodate the creation of new entities on the fly, without structural database changes.

The applications are built on a microservices architecture, where each microservice manages its own database. This architecture enhances service independence and scalability. Several of these microservices also implement the dynamic entity model in their databases, offering flexibility for managing dynamic and diverse data types and relationships.

The proposed dynamic entity design allows individual microservices to be updated, scaled, and maintained independently, promoting agility and robustness in the system's overall functionality.

This model is ideal for scenarios where entity attributes are numerous and varied, and where new attributes are frequently added.
Components:
  1. BaseEntity: Represents generic entity types in the system.
     Example: Study, Parcel, Sample, etc.
  2. EntityInstance: Each instance of an entity is recorded here, storing unique occurrences of BaseEntity types.
     Example: a specific study or a particular task.
  3. EntityAttribute: Defines the set of attributes applicable to each entity type.
     • Attributes can carry validation rules such as minimum length and maximum length.
     • Metadata for attributes is stored in the attributes table.
  4. EntityAttributeValue: Stores values for these attributes for each entity instance.
  5. EntityMetadata: Provides additional descriptive information or configuration settings for each entity type or instance, enhancing the contextual understanding of the entities.
  6. EntityRelationship: Manages the relationships between different entities, crucial for representing complex associations such as the linkage between studies and samples or tasks and their parent entities.
     • The EntityRelationship table essentially serves as a cross-reference (XRef) table, especially in many-to-many relationship scenarios.
     • EntityInstance records are joined through EntityRelationship to find all related child entities for a given parent entity.


Managing Dynamic Attributes
Key Challenges
  • Attributes differ significantly across entity types and instances, requiring a system that can accommodate a wide range of data structures.
  • As the number of entities and attributes grows, the system must scale efficiently without compromising performance.
  • Ensuring accuracy and consistency of data across various entities with diverse attributes.
Strategies for Managing Dynamic Attributes
  • The DB model allows our microservices to define and modify attributes without restructuring the database schema.
  • Attribute Metadata Management: utilize EntityMetadata to store additional information about attributes, such as data validation rules, which helps in maintaining data quality and integrity.
  • APIs will be designed to dynamically generate responses based on the attributes of the requested entities, ensuring flexibility and relevance in data delivery.
Use Cases
  • When a new attribute is introduced, it is registered in the EntityAttribute table, making it immediately available for association with entity instances.
  • Values for these attributes are stored in the EntityAttributeValue table, allowing for efficient retrieval and manipulation as per business logic requirements.
  • The frontend dynamically adjusts to display and interact with these attributes, providing a seamless user experience regardless of the underlying data model.
Design Considerations

UI Structure
  • A generic page that can adapt to display any entity type. This is already in place.
  • A dynamic form to render different types of inputs/display fields based on the data type of each attribute (text fields, date pickers, dropdowns, etc.).
  • Metadata can be used to add additional information or influence the rendering of the entity (e.g., adding tooltips, conditional formatting).
  • The UI calls endpoints passing the relevant entity and instance IDs.
Fetching Data
  • API endpoints fetch data for a specific entity, including its attributes and metadata.
Example
  • To fetch a specific Study instance, the query should join the BaseEntity, EntityInstance, EntityAttribute, and EntityAttributeValue tables.
  • If these tables hold a large number of rows, performance can degrade significantly. To mitigate this, consider indexes on frequently queried columns such as EntityId.
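Once the join returns the tall attribute/value rows, they can be pivoted into one flat object per instance; a sketch assuming a simplified row shape:

```typescript
interface EavRow {
  instanceId: number;
  attributeName: string; // from EntityAttribute
  value: string;         // from EntityAttributeValue
}

// Pivot the tall EAV result set into one flat object per entity instance.
function pivot(rows: EavRow[]): Map<number, Record<string, string>> {
  const out = new Map<number, Record<string, string>>();
  for (const row of rows) {
    const obj = out.get(row.instanceId) ?? {};
    obj[row.attributeName] = row.value;
    out.set(row.instanceId, obj);
  }
  return out;
}

// e.g. the rows for Study #42 become { Title: "...", StartDate: "2024-01-01" }
```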
Data Integrity and Consistency
  • Enforce data type consistency. For instance, if an attribute 'StartDate' is designed to store dates, the schema restricts this field to date data types only.
  • Data validation: before inserting or updating data, the application layer checks that a 'StartDate' is indeed a valid date and not just a random string or number (see the sketch below).
  • When a new 'Study' entity is created and multiple attributes are added, the process is wrapped in a transaction. If adding any attribute fails, the entire operation is rolled back to avoid partial updates.
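A sketch of the application-layer date check; the names are illustrative, and in practice the subsequent inserts would run inside a database transaction as described above:

```typescript
// Validate an attribute value against its declared data type before any
// insert/update; throwing here aborts the surrounding transaction.
function validateAttribute(name: string, dataType: string, value: string): void {
  if (dataType === "date" && Number.isNaN(Date.parse(value))) {
    throw new Error(`${name} must be a valid date, got "${value}"`);
  }
}

validateAttribute("StartDate", "date", "2024-10-18");   // ok
// validateAttribute("StartDate", "date", "not-a-date"); // throws
```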
Optimization and Performance:
  • Leveraging indexing on the EntityAttributeValue table for faster query execution.
  • Dynamic model queries can be complex and might impact performance. We should consider query optimization techniques, caching, or even indexed views if the performance becomes a concern.

Wednesday, August 28, 2024

Data Governance compliance List

A sequenced list of relevant compliance and data protection controls to implement data governance in the system.








Friday, April 26, 2024

Claim Based Authorization

 

1. Claim Based Authorization

• Token Validation: as requests come into the Ocelot API Gateway, the first step is to validate the JWT token issued by FAMS. This validation checks the token's integrity and authenticity.

• Fetch User Claims: once the token is validated, Ocelot communicates with the admin microservice to retrieve specific claims related to the user's roles and permissions. This is crucial for implementing fine-grained access control based on the roles associated with the token's user.

• Validate Token

  • Custom middleware in Ocelot intercepts incoming requests, extracts the JWT token from the Authorization header, and validates the token's signature, issuer, and expiration using FAMS's KID (the same strategy as H2M token validation).

• Retrieve User Claims

  • After successful token validation, extract the user identifier from the token (the claim that identifies the user).

  • Make an API call from Ocelot to the admin microservice, passing the user identifier to fetch the corresponding roles and permissions.

  • The admin microservice responds with the necessary claims, which define what actions the user is authorized to perform.

• Enforce Authorization

  • Utilize the fetched claims to enforce authorization policies within Ocelot. This can be done through route rules in the Ocelot configuration.

  • Based on the claims, decide whether to forward the request to downstream services or reject it.

• Caching

  • Cache roles and permissions in Ocelot if they do not change frequently, to reduce the number of requests to the admin microservice. (A sketch of this flow follows below.)
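Ocelot middleware itself is written in C#; as a language-neutral sketch of the same flow (validate token, fetch claims from the admin microservice, cache, authorize), here is a TypeScript version in which the admin-service URL, claim shape, and cache TTL are assumptions:

```typescript
interface Claims { userId: string; permissions: string[]; }

const claimsCache = new Map<string, { claims: Claims; expires: number }>();
const CACHE_TTL_MS = 5 * 60 * 1000; // cache claims for 5 minutes (assumed TTL)

async function fetchClaims(userId: string): Promise<Claims> {
  const cached = claimsCache.get(userId);
  if (cached && cached.expires > Date.now()) return cached.claims;
  // Stand-in for the call to the admin microservice; URL is a placeholder.
  const res = await fetch(`https://admin-service/api/users/${userId}/claims`);
  const claims = (await res.json()) as Claims;
  claimsCache.set(userId, { claims, expires: Date.now() + CACHE_TTL_MS });
  return claims;
}

// Gateway decision: forward to the downstream service only if the user's
// claims include the permission the route requires.
async function authorize(userId: string, requiredPermission: string): Promise<boolean> {
  const claims = await fetchClaims(userId);
  return claims.permissions.includes(requiredPermission);
}
```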

2. Cross Zone Authorization

Users who are allowed to make a cross-zone call will have a role defined in the admin microservice (or in IAM). That scope is added to the authorization header and used to permit the cross-zone API call; requests without it are rejected within their own zone.

For cross-zone calls, add a custom Boolean claim flag indicating cross-zone access (see the sketch below).

• Ocelot receives the cross-zone request with the role and extracts the JWT token.

• It forwards the token to the authorization service.

• The authorization service validates the token and checks the cross-zone permission.

• The authorization service allows or denies the request.
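A minimal sketch of the cross-zone check; the claim name crossZoneAccess is an assumption:

```typescript
// The cross-zone decision reduces to a Boolean custom claim on the token.
interface TokenClaims { sub: string; crossZoneAccess?: boolean; }

function allowCrossZone(claims: TokenClaims): boolean {
  return claims.crossZoneAccess === true; // deny unless explicitly granted
}
```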

 

Monday, April 15, 2024

Barcode Printing Solution


In the labs, there will be different types of barcode label printers. When designing a solution for barcode systems, it is important to streamline the processes, protocols, and network connectivity to optimize operations and maintenance costs effectively.


1. Network Connectivity
Transition USB printers to Ethernet connections (TCP).
Printers that require USB connections and are currently connected to desktops in Zone 3 should gradually transition to Ethernet connections.
Users can print from different locations without a physically connected printer.
- More secure (firewall protection)
- Queue management
2. Printer Communications and Protocols
Implement IPP (Internet Printing Protocol) for networked printers, facilitating secure and standardized communication over IP networks. IPP is supported by most modern printers and provides features like encryption and authentication.
Ensure ZPL II-compatible label printers are used and that the Print Management microservice can generate and send ZPL II commands.

3. Local Network Printing Solutions
To access printers on an on-premises network:
Establish a VPN connection from AWS to the network, or use AWS Direct Connect.
Network print server: the print server acts as an intermediary, receiving print jobs from the print microservice over VPN or Direct Connect and forwarding these jobs to isolated printers.

4. Cloud Printing Solution
A cloud printing solution simplifies the architecture: it offers direct IP printing, queue management, and driver management, and reduces the need for on-premises print servers.
Authentication and authorization are integral, ensuring that only authorized users can execute print jobs and that sensitive documents are handled securely. The cloud printing provider will supply the necessary API keys.

With a cloud printing service, on-premises print servers may not be required; the service typically manages print queues, job distribution, and driver management.



Options
If the cloud service has to send print jobs directly to a printer behind the firewall, we need to open port 631 to allow incoming IPP traffic to the on-premises printers, and establish a VPN connection between the cloud service and the on-premises network.

Alternatively, printers can be configured to make an outbound connection to the cloud provider over the network. In that case we only need to configure the outbound rules.


Local Network Printing Solution

API Gateway: acts as the entry point for print job submissions from the eLIMS UI.
Print Microservice: processes the print jobs and interacts with SNS and SQS for messaging and queue management.
SNS/SQS Queue: holds print jobs sent by the microservice and manages the delivery of these jobs to the print servers in a scalable and fault-tolerant manner.
Local and Isolated Print Servers: pull print jobs from the SQS queue and manage the actual printing process on their respective printers. (A sketch of the enqueue step follows below.)
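A sketch of the enqueue step using the AWS SDK for JavaScript v3; the queue URL and job shape are placeholders:

```typescript
// The Print Microservice enqueues a job; print servers poll this queue,
// pull the job, and send the ZPL to their attached printer.
import { SQSClient, SendMessageCommand } from "@aws-sdk/client-sqs";

const sqs = new SQSClient({ region: "us-east-1" });

async function enqueuePrintJob(printerId: string, zpl: string): Promise<void> {
  await sqs.send(
    new SendMessageCommand({
      QueueUrl: "https://sqs.us-east-1.amazonaws.com/123456789012/print-jobs", // placeholder
      MessageBody: JSON.stringify({ printerId, zpl }),
    }),
  );
}
```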




UI Printing
If there are no connectivity issues between the UI and the printers, UI printing can also be used.







