Announcement
· Apr 19, 2023

Registration is still open: join the InterSystems China Technical Training and Certification program

To support talent development in the healthcare IT industry, InterSystems has tailored a needs-driven, flexible, and highly practical technical certification training program for the Chinese market. Courses are taught by senior InterSystems technical experts, helping users quickly master InterSystems technology and benefit from its rapid development, so as to better serve hospital IT construction. Click here to view course details: InterSystems China Technical Training and Certification

Your best learning path

 

Why attend InterSystems technical certification training?

  • InterSystems data platform technology has become one of the mainstream technologies in China's healthcare IT field, supporting the stable operation of the core systems of hundreds of large public hospitals nationwide for more than 20 years;
  • The program is tailored for Chinese technical users and is needs-driven, flexible, and highly hands-on;
  • Courses are taught by senior InterSystems technical experts, helping users quickly master InterSystems technology and best practices;
  • Official InterSystems technical certification training carries greater authority, helping users apply InterSystems technology more effectively, benefit from its rapid development, and stay technically up to date.

Who can attend the certification training?

Any IT professional or organization that uses InterSystems technology, or is interested in it, can attend.

What skills and growth can you gain from the certification training?
  • Continuously updated courses and a learning approach that combines theory and practice help you keep improving your command of InterSystems technology;
  • By taking part in InterSystems' tiered training program and passing the assessment, you receive a certification;
  • Offline courses and events help you expand your professional network.

Who are the instructors of the InterSystems China certification training?

Courses are taught by the senior engineering team of InterSystems China.

How are registration and course scheduling arranged?

A class starts once at least 5 people have registered and runs once per quarter. Training is delivered in person, and the exam includes both a written test and hands-on practice. For course fees, please consult your InterSystems account manager. Hospitals and healthcare IT companies are encouraged to participate as organizations.

To register or for more details, please contact your InterSystems account manager, or reach the InterSystems China team through the following channels:

Phone: 400-601-9890

Email: GCDPsales@InterSystems.com

Article
· Apr 18, 2023 · 2 min read

AI generated text detection using IntegratedML

In recent years, artificial intelligence technologies for text generation have developed significantly. For example, text generation models based on neural networks can produce texts that are almost indistinguishable from texts written by humans.
ChatGPT is one such service. It is a huge neural network trained on a large number of texts, which can generate texts on various topics and adapt to a given context.

This creates a new task: developing ways to recognize whether a text was written by a person or by artificial intelligence (AI).

There are two main methods for AI-written text recognition:

  • Use machine learning algorithms to analyze the statistical characteristics of the text;
  • Use cryptographic methods that can help determine the authorship of the text.

In general, the task of AI text recognition is difficult but important.

I am happy to present an application for recognizing texts generated by AI. During development, I took advantage of InterSystems Cloud SQL and IntegratedML, which offer:

  • Fast and efficient data requests with high performance and speed;
  • User-friendly interface for non-experts in databases and machine learning;
  • Scalability and flexibility to quickly adjust ML models according to requirements;

For the development and further training of the model, I used an open dataset of 35 thousand texts. Half of the texts were written by a large number of human authors, and the other half were generated by AI with ChatGPT.
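
Under the hood, IntegratedML reduces training to a few SQL statements. The following is only a minimal sketch, run here through embedded Python's iris.sql; the table and column names (TextSamples, IsAI) are hypothetical and not taken from the actual application.

import iris  # embedded Python module available inside IRIS

# Hypothetical training table: one row per text, with feature columns and an IsAI label (0/1)
iris.sql.exec("CREATE MODEL TextOriginModel PREDICTING (IsAI) FROM TextSamples")
iris.sql.exec("TRAIN MODEL TextOriginModel")

# Apply the trained model and read the predictions back into Python
rs = iris.sql.exec("SELECT ID, PREDICT(TextOriginModel) AS PredictedIsAI FROM TextSamples")
for row in rs:
    print(row)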

Configuration used for GPT model:

model="text-curie-001"
temperature=0.7
max_tokens=300
top_p=1
frequency_penalty=0.4
presence_penalty=0.1
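
The article does not show the generation code itself, but as a hedged illustration these settings map directly onto the legacy OpenAI completions API (pre-1.0 openai package); the prompt and API key below are placeholders.

import openai  # legacy openai package (pre-1.0 interface)

openai.api_key = "YOUR_API_KEY"  # placeholder

# Generate one synthetic text with the configuration listed above
response = openai.Completion.create(
    model="text-curie-001",
    prompt="Write a short essay about renewable energy.",  # hypothetical prompt
    temperature=0.7,
    max_tokens=300,
    top_p=1,
    frequency_penalty=0.4,
    presence_penalty=0.1,
)
print(response["choices"][0]["text"])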

Next, about 20 basic features were determined, on which further training was carried out. Here are some of the features I used (a minimal extraction sketch follows the list):

  • Character count
  • Word count
  • Average word length
  • Sentence count
  • Average sentence length
  • Unique word count
  • Stop word count
  • Unique word ratio
  • Punctuation count
  • Punctuation ratio
  • Question count
  • Exclamation count
  • Digit count
  • Capital letter count
  • Repeated word count
  • Unique bigram count
  • Unique trigram count
  • Unique four-gram count
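
A minimal sketch of how a few of these features might be computed in plain Python (the tokenization and feature names are my own simplification, not the application's actual code):

import re
import string

def extract_features(text):
    """Compute a handful of the statistical features listed above."""
    words = re.findall(r"[A-Za-z']+", text)
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    punctuation = [c for c in text if c in string.punctuation]
    bigrams = set(zip(words, words[1:]))
    return {
        "char_count": len(text),
        "word_count": len(words),
        "avg_word_length": sum(len(w) for w in words) / max(len(words), 1),
        "sentence_count": len(sentences),
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        "unique_word_ratio": len(set(w.lower() for w in words)) / max(len(words), 1),
        "punctuation_count": len(punctuation),
        "question_count": text.count("?"),
        "exclamation_count": text.count("!"),
        "digit_count": sum(c.isdigit() for c in text),
        "capital_letter_count": sum(c.isupper() for c in text),
        "unique_bigram_count": len(bigrams),
    }

print(extract_features("This is a tiny example. Is it written by AI? Probably not!"))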

As a result, I got a simple application that you can use for your own tasks or just for fun.

This is what it looks like:

To try the application you can use the online demo or run it locally with your own Cloud SQL account.

Also, this application is participating in the contest. If you like it, please vote for it.

You are welcome to discuss this app in the comments if you are interested.
 

Article
· Apr 16, 2023 · 4 min read

Tuples ahead

Overview

Cross-skilling from IRIS ObjectScript to Python, it becomes clear there are some fascinating differences in syntax.

One of these areas is how Python returns tuples from a method, with automatic unpacking.

Effectively this presents as a method that returns multiple values. What an awesome invention :)

out1, out2 = some_function(in1, in2)

ObjectScript has an alternative approach with ByRef and Output parameters.

Do ##class(some_class).SomeMethod(.inAndOut1, in2, .out2)

Where:

  • inAndOut1 is ByRef
  • out2 is Output

The leading dot (".") in front of the variable name passes the argument ByRef or as Output.

The purpose of this article is to describe how the community PyHelper utility has been enhanced to give a pythonic way to take advantage of ByRef and Output parameters. It also gives access to %objlasterror and offers an approach to handling the Python None type.
 

    Example ByRef

    Normal invocation from embedded Python would be:

    oHL7=iris.cls("EnsLib.HL7.Message")._OpenId('er12345')

    When this method fails to open an object, the variable "oHL7" is an empty string.
    In the signature of this method there is a status parameter, available to ObjectScript, that explains the exact problem.
    For example:

    • The record may not exist
    • The record couldn't be opened in the default exclusive concurrency mode ("1") within the timeout

    The method signature is:

    ClassMethod %OpenId(id As %String = "", concurrency As %Integer = -1, ByRef sc As %Status = {$$$OK}) As %ObjectHandle

    The TupleOut method can assist in returning the value of the argument sc back to a Python context.
     

    > oHL7,tsc=iris.cls("alwo.PyHelper").TupleOut("EnsLib.HL7.Message","%OpenId",['sc'],1,'er145999', 0)
    > oHL7
    ''
    > iris.cls("%SYSTEM.Status").DisplayError(tsc)
    ERROR #5809: Object to Load not found, class 'EnsLib.HL7.Message', ID 'er145999'1

    The list ['sc'] contains a single item in this case, but it can name multiple ByRef arguments, returned in the order specified, which is useful for automatically unpacking into the intended Python variables. A hypothetical multi-argument example is sketched below.
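
    For instance, a hypothetical method My.Class::SomeMethod with two ByRef arguments named out1 and out2 (class, method, and argument names invented for illustration) could be unpacked like this:

    > ret, out1, out2 = iris.cls("alwo.PyHelper").TupleOut("My.Class","SomeMethod",['out1','out2'],1,"inputValue")
    > # ret is the method's return value; out1 and out2 arrive in the order named in the list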

    Example Output parameter handling

    Python code:

    > oHL7=iris.cls("EnsLib.HL7.Message")._OpenId('145')
    > oHL7.GetValueAt('<%MSH:9.1')
    ''

    The returned string is empty, but is this because the element is actually empty or because something went wrong?
    In ObjectScript there is also an output status parameter (pStatus) that can be accessed to determine this.

    ObjectScript code:

    > write oHL7.GetValueAt("<%MSH:9.1",,.pStatus)
    ''
    > Do $System.Status.DisplayError(pStatus)
    ERROR <Ens>ErrGeneral: No segment found at path '<%MSH'

    With TupleOut the equivalent functionality can be attained by returning and unpacking both the method return value AND the status output parameter.

    Python code:

    > hl7=iris.cls("EnsLib.HL7.Message")._OpenId(145,0)
    > val, status = iris.cls("alwo.PyHelper").TupleOut(hl7,"GetValueAt",['pStatus'],1,"<&$BadMSH:9.1")
    > val==''
    True
    > iris.cls("%SYSTEM.Status").IsError(status)
    1
    > iris.cls("%SYSTEM.Status").DisplayError(status)
    ERROR <Ens>ErrGeneral: No segment found at path '<&$BadMSH'1


    Special variable %objlasterror

    In ObjectScript, percent variables are accessible across method scope.
    There are scenarios where detecting or accessing the special variable %objlasterror is useful after calling a core or third-party API.
    The TupleOut method allows access to %objlasterror, as though it had been defined as an Output parameter, when invoking methods from Python.

    > del _objlasterror
    
    > out,_objlasterror=iris.cls("alwo.PyHelper").TupleOut("EnsLib.HL7.Message","%OpenId",['%objlasterror'],1,'er145999', 0) 
    
    > iris.cls("%SYSTEM.Status").DisplayError(_objlasterror)
    ERROR #5809: Object to Load not found, class 'EnsLib.HL7.Message', ID 'er145999'1

    When None is not a String

    TupleOut handles Python None references as ObjectScript undefined. This allows parameters to take their defaults and methods to behave consistently.
    This is significant, for example, with %Persistent::%OnNew, where the %OnNew method is not triggered when None is supplied for initvalue, but would be triggered if an empty string were supplied.

    In ObjectScript the invocation might be:

    do oHL7.myMethod("val1",,,"val2")

    Note the lack of variables between commas.

    TupleOut facilitates the same behavior with:

    Python:

    iris.cls("alwo.PyHelper").TupleOut(oHL7,"myMethod",[],0,"val1",None,None,"val2")

    Another way to consider this is being able to have a one-line invocation that behaves flexibly depending on which variables have been set up beforehand:

    ObjectScript:

    set arg1="val1"
    kill arg2
    kill arg3
    set arg4="val2"
    do oHL7.myMethod(.arg1, .arg2, .arg3, .arg4)

    TupleOut facilitates the same behavior with:

    Python:

    arg1="val1"
    arg2=None
    arg3=None
    arg4="val2"
    iris.cls("alwo.PyHelper").TupleOut(oHL7,"myMethod",[],0,arg1,arg2,arg3,arg4)

    Lists and Dictionaries

    When handling parameters for input, ByRef, and Output, TupleOut uses PyHelper's automatic mapping between:

    • IRIS lists and Python lists
    • IRIS arrays and Python dicts

    It takes care to always use strings for dictionary keys when moving from IRIS arrays to Python Dict types.

    Conclusion

    I hope this article helps inspire new ideas and discussion around embedded Python.

    I also hope it encourages you to explore how flexibly IRIS can bend to meet new challenges.

    Article
    · Apr 13, 2023 · 8 min read

    What you always wanted to know about InterSystems IRIS but were afraid to ask :)

    What is InterSystems IRIS?

    InterSystems IRIS is a high-performance data platform designed for developing and deploying mission-critical applications. It is a unified data platform that combines transaction processing, analytics, and machine learning in a single product.

    InterSystems IRIS provides a comprehensive set of data management and development tools that enable developers to build, integrate, and deploy applications with ease. It supports a wide range of data models, including relational, object-oriented, hierarchical, and document-based models, and provides a powerful set of APIs for accessing data.

    InterSystems IRIS is used in a variety of industries, including healthcare, finance, logistics, and more, to power critical applications such as electronic health records, financial trading systems, and supply chain management platforms. It is known for its scalability, reliability, and ease of use, and is used by some of the world's largest organizations to manage their most important data-driven applications.

    What type of database is InterSystems IRIS?

    InterSystems IRIS is a multi-model database, which means that it supports multiple data models including relational, object-oriented, and document-based models. It is designed to be highly flexible and adaptable, allowing developers to choose the data model that best fits their application's requirements.

    InterSystems IRIS supports standard SQL for relational data management, and it also provides advanced indexing capabilities and query optimization to improve performance. In addition, it supports NoSQL document-oriented data storage, allowing developers to work with unstructured and semi-structured data. The object-oriented data model in InterSystems IRIS allows developers to work with complex data structures and build object-oriented applications.

    The multi-model architecture of InterSystems IRIS provides developers with the flexibility to work with different types of data in a single database, simplifying application development and management. This makes it a popular choice for building high-performance, data-driven applications in a variety of industries.
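
    As a small illustration of the multi-model idea, here is a hedged sketch assuming embedded Python and a hypothetical persistent class Demo.Person projected as the SQL table Demo.Person:

    import iris  # embedded Python inside IRIS

    # Object access: create and save an instance of the hypothetical class
    p = iris.cls("Demo.Person")._New()
    p.Name = "Ada"
    p._Save()

    # Relational access to the same data through SQL
    for row in iris.sql.exec("SELECT ID, Name FROM Demo.Person"):
        print(row)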

    What is the InterSystems IRIS database?

    InterSystems IRIS is a high-performance database management system (DBMS) that is designed to handle a wide variety of data management tasks. It is developed by InterSystems Corporation, a software company that specializes in providing data management, interoperability, and analytics solutions to businesses and organizations around the world.

    InterSystems IRIS is a powerful and flexible database platform that can handle both structured and unstructured data, and can be used for a variety of applications, including transaction processing, analytics, and machine learning. It provides a rich set of features and tools for managing data, including support for SQL, object-oriented data modeling, multi-dimensional data analysis, and integrated development and deployment tools.

    One of the key features of InterSystems IRIS is its ability to handle large amounts of data with high performance and scalability. It uses advanced caching and indexing techniques to optimize data access, and can be configured to work with a wide range of hardware configurations and operating systems.

    InterSystems IRIS also includes advanced security features, such as role-based access control, encryption, and auditing, to ensure the confidentiality, integrity, and availability of data.

    Overall, InterSystems IRIS is a powerful and flexible database platform that can help businesses and organizations manage their data more effectively and efficiently.

    What is InterSystems IRIS HealthShare?

    InterSystems IRIS HealthShare is a healthcare-specific platform that builds on top of the InterSystems IRIS database and integration engine to provide a comprehensive solution for healthcare organizations. It is designed to enable healthcare organizations to securely and efficiently share patient data across different systems and applications, while also providing advanced analytics and insights to improve patient care.

    InterSystems IRIS HealthShare includes a wide range of features and tools, including:

    1. Health Information Exchange (HIE) capabilities that enable healthcare organizations to securely exchange patient data across different systems and providers.
    2. Master Patient Index (MPI) functionality that ensures accurate patient identification and record matching, even in the face of incomplete or inconsistent data.
    3. Clinical Viewer and Patient Portal that enable patients and clinicians to view and interact with patient data in a secure and intuitive way.
    4. Analytics and Business Intelligence tools that enable healthcare organizations to analyze patient data and identify patterns and trends that can improve patient outcomes and drive operational efficiencies.
    5. Interoperability capabilities that enable healthcare organizations to connect to and exchange data with a wide range of external systems and devices.

    Overall, InterSystems IRIS HealthShare is a powerful and flexible platform that can help healthcare organizations improve the quality of patient care while also reducing costs and improving operational efficiency.

    How is InterSystems IRIS data stored?

    InterSystems IRIS stores data using a hierarchical, multidimensional data model. The basic storage element is called a Global. A Global is a persistent, hierarchical data structure that can be thought of as a collection of nodes organized into a tree-like structure. A Global's name begins with a caret (^) character, and each node in the tree is identified by a unique path formed by the Global name plus an ordered list of subscripts.
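
    As a small, hedged illustration (assuming embedded Python and a scratch global name ^DemoPatient), a global node can be set and read like this:

    import iris  # embedded Python inside IRIS

    g = iris.gref("^DemoPatient")       # reference to the global ^DemoPatient (hypothetical name)
    g["1", "Name"] = "Ada Lovelace"     # sets the node ^DemoPatient(1,"Name")
    g["1", "City"] = "London"           # sets the node ^DemoPatient(1,"City")
    print(g["1", "Name"])               # prints: Ada Lovelace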

    A Global can store a wide variety of data types, including strings, numbers, and binary data, and can be accessed using a variety of programming languages and APIs, including SQL, object-oriented programming, and Web services.

    InterSystems IRIS also provides a flexible and scalable indexing system that enables efficient retrieval of data from Globals. The indexing system allows developers to define custom indexes on specific attributes of the data, which can be used to quickly retrieve subsets of data that meet specific criteria.

    In addition to Globals, InterSystems IRIS also supports other data storage mechanisms, including relational tables, multidimensional arrays, and JSON documents. Relational tables are based on the SQL standard and provide a structured, tabular way to store data. Multidimensional arrays are used to store data that is organized into matrices or cubes, while JSON documents are used to store unstructured or semi-structured data.

    Overall, InterSystems IRIS provides a flexible and powerful data storage system that can handle a wide variety of data types and data models, making it well-suited for a wide range of applications and use cases.

    Is InterSystems IRIS a programming language?

    InterSystems IRIS is not a programming language itself, but it provides support for a variety of programming languages and APIs. Some of the programming languages that are supported by InterSystems IRIS include:

    1. ObjectScript: InterSystems' proprietary programming language, which is used for developing applications that run on the InterSystems IRIS platform.
    2. SQL: InterSystems IRIS provides full support for the SQL programming language, which can be used to interact with relational data stored in InterSystems IRIS.
    3. Java and .NET: InterSystems IRIS provides support for both the Java and .NET programming languages, which can be used to develop applications that interact with InterSystems IRIS.
    4. REST and SOAP APIs: InterSystems IRIS provides support for both RESTful and SOAP-based APIs, which can be used to develop Web services that interact with InterSystems IRIS.
    5. Node.js: InterSystems IRIS also provides support for Node.js, a popular JavaScript runtime environment, which can be used to develop server-side applications that interact with InterSystems IRIS.
    6. Python: InterSystems provides support for the Python programming language, which can be used as one of the languages to develop applications that run on the InterSystems IRIS platform and as an API (a small connection sketch follows below).

    Overall, while InterSystems IRIS is not a programming language in and of itself, it provides a wide range of tools and APIs that enable developers to build and deploy applications using a variety of programming languages and frameworks.
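
    As an illustration of the Python support, a client application might connect with the InterSystems Native SDK for Python; this is a sketch only, and the connection details below are placeholders rather than values from this article:

    import iris  # intersystems-irispython package (client-side), not embedded Python here

    # Placeholder connection details for a local IRIS instance
    conn = iris.connect("localhost", 1972, "USER", "_SYSTEM", "SYS")
    native = iris.createIRIS(conn)
    print(native.classMethodValue("%SYSTEM.Version", "GetVersion"))
    conn.close()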

    Which model is best for InterSystems IRIS data?

    InterSystems IRIS supports a variety of data models, including hierarchical, relational, multidimensional, and document-based (JSON). The best model for your specific use case will depend on a variety of factors, including the nature of the data, the types of queries and analysis you need to perform, and the overall architecture of your application. Here are some general guidelines for choosing the best data model for your InterSystems IRIS implementation:

    1. Hierarchical Model: If your data has a hierarchical structure, such as patient records in a healthcare application or parts and subassemblies in a manufacturing system, a hierarchical model may be the best choice. Hierarchical models are optimized for fast traversal of tree-like structures and can provide excellent performance for certain types of queries and updates.
    2. Relational Model: If your data is highly structured and requires complex queries or joins, a relational model may be the best choice. Relational databases are well-suited for handling large amounts of structured data and provide powerful querying and reporting capabilities.
    3. Multidimensional Model: If your data is organized into matrices or cubes, such as financial data or scientific data, a multidimensional model may be the best choice. Multidimensional databases are optimized for fast querying and analysis of complex data structures.
    4. Document-Based Model: If your data is unstructured or semi-structured, such as social media posts or log files, a document-based model may be the best choice. Document databases are optimized for storing and querying unstructured data and can provide excellent performance for certain types of queries.

    What is InterSystems IRIS famous for?

    InterSystems IRIS is famous for several reasons, including:

    1. High Performance: InterSystems IRIS is known for its high performance and scalability, making it a popular choice for data-intensive applications that require fast and reliable data access.
    2. Integration Capabilities: InterSystems IRIS provides powerful integration capabilities, allowing it to connect to and exchange data with a wide range of external systems and applications. This makes it an ideal choice for organizations that need to integrate data from multiple sources or build complex data-driven applications.
    3. Flexibility: InterSystems IRIS supports a wide range of data models, programming languages, and APIs, giving developers the flexibility to choose the tools and approaches that work best for their specific needs.
    4. Healthcare Focus: InterSystems IRIS is widely used in the healthcare industry, where it is known for its powerful clinical data management capabilities and support for industry-standard data exchange formats.
    5. Developer Community: InterSystems has a large and active developer community, which provides support, resources, and best practices for building and deploying applications using InterSystems IRIS.

    Overall, InterSystems IRIS has earned a reputation as a powerful and flexible data management platform that can support a wide range of use cases and industries, making it a popular choice for organizations around the world.

    Article
    · Apr 10, 2023 · 9 min read

    Sending DICOM files between IRIS for Health and PACS software

    Welcome, community members, to a new article! This time we are going to test the interoperability capabilities of IRIS for Health when working with DICOM files.

    Let's configure a short workshop using Docker. At the end of the article you'll find the GitHub URL if you want to run it on your own computer.

    Before any configuration, let's explain what DICOM is:

    • DICOM is the acronym for Digital Imaging and Communications in Medicine, a standard for transmitting images and medical data. It covers both the DICOM file format and a communication protocol based on TCP/IP.
    • DICOM files support images and clinical documentation (a DICOM file can contain images or documents "dicomized" as images).
    • The DICOM protocol defines services/operations on DICOM files: you can request storage of an image (C-STORE), execute queries (C-FIND), or move images among the systems of a medical organization (C-MOVE). You can review all the available services at this URL. A minimal test client is sketched right after this list.
    • All the systems involved in a DICOM-based exchange reply with a DICOM message as the response.
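
    If you want to generate test traffic yourself, a minimal C-STORE client can be sketched with the pydicom/pynetdicom Python libraries; the file name, AE titles and port below are placeholders, and this is not part of the workshop code:

    from pydicom import dcmread
    from pynetdicom import AE

    ds = dcmread("sample.dcm")                      # placeholder DICOM file
    ae = AE(ae_title="TESTSCU")                     # our (calling) AE title
    # Request a presentation context matching the SOP class and transfer syntax of the file
    ae.add_requested_context(ds.SOPClassUID, ds.file_meta.TransferSyntaxUID)

    assoc = ae.associate("localhost", 4242, ae_title="ORTHANC")  # Orthanc's DICOM port from the docker-compose
    if assoc.is_established:
        status = assoc.send_c_store(ds)             # C-STORE-RQ; the returned status carries the C-STORE-RSP result
        print("C-STORE response status: 0x{0:04X}".format(status.Status))
        assoc.release()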

    You can see here a typical example of the architecture for a system designed to work with DICOM:

    General scheme of DICOM Network architecture

    We have some "modalities" (these modalities could be machines such as scanners or MRIs, or simply the software that stores the images) identified by an AE Title or AET (Application Entity Title). This AET must be unique for each modality and must be configured in the other modalities or systems that are going to communicate with it, so that communication between both modalities is allowed.

    As you can see in the graph, the modalities are configured to store their images in a DICOM file server that may or may not belong to a PACS (Picture Archiving and Communication System) that is later consulted from a PACS web interface. It is increasingly common to include a VNA (Vendor Neutral Archive) system in organizations that is responsible for centralized storage and viewing of all DICOM files used by the organization.

    In general, in the most modern modalities the destination of the generated images can be configured, but on many occasions it may be necessary either to carry out some action on the DICOM image fields (modify the patient identifier, add the clinical episode it relates to, etc.) or, when the modality cannot do it itself, to take charge of capturing and forwarding the generated image to the system responsible for archiving. It is in these cases that an integration engine providing such functionality becomes necessary, and there is none better than IRIS for Health!

    For our example we will consider the following scenario:

    • A certain modality is generating images that need to be sent to a PACS for registration.
    • Our DICOM or PACS server will receive these images and must forward them to a specific VNA.


    To simulate our PACS we will use Orthanc, an open-source tool that provides the basic functionality for archiving and viewing DICOM images (more information here). Orthanc kindly provides an image that we can run in Docker without any complications. Finally, we will deploy an IRIS for Health container (depending on when you read this article, the license may have expired; in that case you just have to update the docker-compose file in the code) in which we can set up our production.

    Let's take a look at the docker-compose we've configured:

    version: '3.1'  # Secrets are only available since this version of Docker Compose
    services:
      orthanc:
        image: jodogne/orthanc-plugins:1.11.0
        command: /run/secrets/  # Path to the configuration files (stored as secrets)
        ports:
          - 4242:4242
          - 8042:8042
        secrets:
          - orthanc.json
        environment:
          - ORTHANC_NAME=orthanc
        volumes:
          - /tmp/orthanc-db/:/var/lib/orthanc/db/
        hostname: orthanc
      iris:
        container_name: iris
        build:
          context: .
          dockerfile: iris/Dockerfile
        ports:
        - "52773:52773"
        - "2010:2010"
        - "23:2323"
        - "1972:1972"
        volumes:
        - ./shared:/shared
        command:
          --check-caps false
        hostname: iris
    secrets:
      orthanc.json:
        file: orthanc.json

    Access to the Orthanc web viewer will be through port 8042 (http://localhost:8042), the port destined to receive images via TCP/IP will be 4242, and its configuration is done from the orthanc.json file. The management portal of our IRIS for Health will be on port 52773.

    Let's see what orthanc.json contains:

    {
        "Name" : "${ORTHANC_NAME} in Docker Compose",
        "RemoteAccessAllowed" : true,
        "AuthenticationEnabled": true,
        "RegisteredUsers": {
            "demo": "demo-pwd"
        },
        "DicomAssociationCloseDelay": 0,
        "DicomModalities" : {
            "iris" : [ "IRIS", "host.docker.internal", 2010 ]
          }
    }

     

    As you can see, we have defined a user demo with the password demo-pwd, and we have declared a modality called IRIS that will use port 2010 to receive images from Orthanc; "host.docker.internal" is the name Docker provides to reach other deployed containers.

    Let's check that after running docker-compose build and docker-compose up -d we can access our IRIS for Health and Orthanc without problems:

    IRIS for Health is successfully deployed.

    Orthanc works too, so come on, let's get our hands dirty!

    Let's access the namespace called DICOM and open its production. We can see in it the following business components:

    For now we are going to review only the components needed to handle the first case we presented: a modality that generates DICOM images but cannot send them to our PACS itself. To do this we will use a Business Service of the standard class EnsLib.DICOM.Service.File, configured to read all the .dcm files stored in the /shared/durable/in/ directory and send them to the Business Process of the Workshop.DICOM.Production.StorageFile class.

    Let's take a closer look at the main method of this Business Process:

    /// Messages received here are instances of EnsLib.DICOM.Document sent to this
    /// process by the service or operation config items. In this demo, the process is ever
    /// in one of two states, the Operation is connected or not.
    Method OnMessage(pSourceConfigName As %String, pInput As %Library.Persistent) As %Status
    {
        #dim tSC As %Status = $$$OK
        #dim tMsgType As %String
        do {
            
            If pInput.%Extends("Ens.AlarmResponse") {
                
                #; We are retrying, simulate 1st call
                #; Make sure we have a document
                Set pInput=..DocumentFromService
                $$$ASSERT(..CurrentState="OperationNotConnected")
            }
                
            #; If its a document sent from the service
            If pSourceConfigName'=..OperationDuplexName {
                
                #; If the operation has not been connected yet
                If ..CurrentState="OperationNotConnected" {
                    
                    #; We need to establish a connection to the operation,
                    #; Keep hold of the incoming document
                    Set ..DocumentFromService=pInput
                    
                    #; We will be called back at OnAssociationEstablished()
                    Set tSC=..EstablishAssociation(..OperationDuplexName)
                    
                } elseif ..CurrentState="OperationConnected" {
                    
                    #; The Operation is connected
                    #; Get the CommandField, it contains the type of request, it should ALWAYS be present
                    Set tMsgType=$$$MsgTyp2Str(pInput.GetValueAt("CommandSet.CommandField",,.tSC))
                    If $$$ISERR(tSC) Quit
                    #; We are only handling storage requests at present
                    $$$ASSERT(tMsgType="C-STORE-RQ")
            		
            		// set patientId = pInput.GetValueAt("DataSet.PatientID",,.tSC)
            		// Set ^PatientImageReceived(patientId) = pInput.GetValueAt("DataSet.PatientName",,.tSC)
                    #; We can forward the document to the operation
                    Set tSC=..SendRequestAsync(..OperationDuplexName,pInput,0)
                }
                
            } elseif pSourceConfigName=..OperationDuplexName {
                
                #; We have received a document from the operation
                Set tMsgType=$$$MsgTyp2Str(pInput.GetValueAt("CommandSet.CommandField",,.tSC))
                If $$$ISERR(tSC) Quit
                #; Should only EVER get a C-STORE-RSP
                $$$ASSERT(tMsgType="C-STORE-RSP")
    
                #; Now close the Association with the operation, we will be called back at
                #; OnAssociationReleased()
                Set tSC=..ReleaseAssociation(..OperationDuplexName)
                
                #; Finished with this document
                Set ..DocumentFromService="",..OriginatingMessageID=""
            }
        } while (0)
        
        Quit tSC
    }

    As we can see, this class checks the origin of the DICOM document: if it does not come from the Business Operation defined in the OperationDuplexName parameter, it means we must forward it to the PACS, and therefore the metadata of the DICOM message located in the CommandSet section under the name CommandField must be of type C-STORE-RQ (storage request) before the association is established. At this URL you can check the different values this metadata can take (in hexadecimal).

    If the message comes from the indicated Business Operation, it corresponds to a DICOM response to the document we previously sent, so the process validates that the CommandField of that message is of type C-STORE-RSP.

    Let's analyze in a little more detail the key configuration of the Business Operation EnsLib.DICOM.Operation.TCP used to send our DICOM documents to the PACS via TCP/IP:

    As the IP address we have declared the hostname specified in the docker-compose for the container where Orthanc is deployed, as well as the port.

    We have configured two key elements for sending to PACS: the AET of our IRIS for Health (IRIS) and the AET of our PACS (ORTHANC). Without this configuration, no image sending is possible, as both IRIS and ORTHANC will validate that the sending/receiving modality has permission to do so.

    Where do we configure which modalities IRIS can send images to and which modalities can send images to us? It's very simple: the DICOM configuration functionality is available from the IRIS management portal:

    From this menu we can not only indicate which modalities can send us DICOM images and which ones we can send them to, but also what types of images we will be able to send and receive, in such a way that we can reject any image that falls outside this configuration. As you can see in the image above, we have configured connections both from IRIS to Orthanc and from Orthanc to IRIS. By default Orthanc accepts any type of image, so we don't need to modify anything in its configuration.

    So as not to have problems with the images we can send and receive from IRIS, we will configure the so-called "Presentation Context", made up of the "Abstract Syntax" (the combination of a DICOM service (Store, Get, Find...) and an object (MR images, CT, etc.)) and the "Transfer Syntax", which defines how information is exchanged and how data is represented.

    Well, we have now configured every possible connection between IRIS and Orthanc and vice versa. Let's launch a test by placing a DICOM file in the path defined in our Business Service:


    Very good! Here we have registered our DICOM files and we can see how they have gone through our production until they are sent to Orthanc. Let's go into more detail by checking out a message.

    Here we have our message with its CommandField set to 1, corresponding to C-STORE-RQ, now let's review the response we received from Orthanc:

    We can see that the CommandField value 32769 corresponds to 8001 in hexadecimal which, as we saw at the URL above, is equivalent to type C-STORE-RSP. We can also see that the response is a DICOM message that only contains the values defined in the Command Set.
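
    A quick way to double-check that correspondence from any Python prompt:

    >>> hex(32769)
    '0x8001'
    >>> int("8001", 16)
    32769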

    Let's check from Orthanc that we have received the messages correctly:

    Here are our messages successfully archived in our PACS. Goal achieved! We can now store the DICOM images of our modality in our PACS without any problem.

    In the next article we will deal with the opposite direction of communication, sending from the PACS to our modality configured in IRIS.

    Here you have available the code used for this article: https://github.com/intersystems-ib/workshop-dicom-orthanc
