Article
· Dec 15, 2024 · 4 min read

Chapter 51 File Input/Output - File Pathname Tools

File Pathname Tools

If the current device is a sequential file, $ZIO contains the complete pathname of that file.

$ZSEARCH can be used to return the full file specification (pathname and filename) of a specified file or directory. The filename may contain wildcards; $ZSEARCH uses the wildcard to return a series of fully qualified pathnames that satisfy it.

The %Library.File class contains many methods that provide file system services.
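For example, a minimal ObjectScript sketch of walking every match of a wildcard with $ZSEARCH (the directory and wildcard here are assumptions, not part of the original text):

```objectscript
 ; List every .txt file in C:\temp\ (path is an assumption)
 SET file=$ZSEARCH("C:\temp\*.txt")
 WHILE file'="" {
     WRITE file,!
     SET file=$ZSEARCH("")  ; an empty argument continues the previous search
 }
```

Each call with an empty argument returns the next fully qualified pathname, and an empty string when the matches are exhausted.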

Article
· Dec 15, 2024 · 3 min read

Setup OAuth2 Client for iris-http-calls to Epic on FHIR

I started working with Epic on FHIR about a month ago.

Creating a Public Private Key Pair

mkdir /home/ec2-user/path_to_key
openssl genrsa -out ./path_to_key/privatekey.pem 2048

For backend apps, you can export the public key to a base64 encoded X.509 certificate named publickey509.pem using this command...
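The exact export command is elided above; as an assumption, a common openssl invocation that wraps the private key's public half in a (base64-encoded PEM) self-signed X.509 certificate looks like this, with the subject name being a placeholder:

```shell
# Sketch: generate the key pair, then export the public key as a
# self-signed X.509 certificate (subject CN is a placeholder, not Epic's value)
mkdir -p ./path_to_key
openssl genrsa -out ./path_to_key/privatekey.pem 2048
openssl req -new -x509 -key ./path_to_key/privatekey.pem \
  -out ./path_to_key/publickey509.pem \
  -subj "/CN=epic-backend-app" -days 365
```

The resulting publickey509.pem is what gets uploaded when registering the backend app.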

Question
· Dec 15, 2024

Caché evaluation version

As you can see, my version of Caché is no longer on the product list.

Quite a while ago I asked some questions about an even older version. That version had 5 databases; at the time I only installed the main one I needed, on the 2010 version. I am now trying to migrate the other four and I am running into the following problem:

         - When mounting the database, it is mounted read-only, with no option to change this either through the System Management Portal or with ^DATABASE

I already identified myself back then as a dinosaur for the current times; my conclusion at this point is that, being an evaluation license, it does not allow mounting more than one database read/write.

Is my conclusion correct? If not, could someone point me to where the problem lies?

Thanks

Question
· Dec 15, 2024

Is it possible to feed a question to the D.C. A.I. via URL parameter?

I want to provide a shortcut to the D.C. A.I. from a hotkey which will automatically populate the question field here - https://community.intersystems.com/ask-dc-ai.

Is there a way to construct this URL such that it will populate the "Ask a Programming Question" field (and better yet, execute the query)?

Thanks!

Article
· Dec 14, 2024 · 6 min read

Rivian GeoLocation Plotting with IRIS Cloud Document and Databricks


 

Plotting the gnssLocation data from my Rivian R1S across Michigan with InterSystems Cloud Document and Databricks

If you have been looking for a use case for a document database, I came to the realization that my favorite dead-simple one is the ability to query a pile of JSON right alongside my other data with SQL, without really doing much. That is the dream realized by the powerful multi-model InterSystems Data Platform, shown here in a simple notebook to visualize the geolocation data my Rivian R1S is emitting for DeezWatts (A Rivian Data Adventure).

So here is the two-step approach: ingestion to, and visualization from, InterSystems Cloud Document, using the JDBC document driver.

InterSystems Cloud Document Deployment

For starters, I fired up a small Cloud Document deployment on the InterSystems Cloud Services Portal, with an enabled listener.

I downloaded the SSL certificate, and grabbed the JDBC driver and the accompanying document driver as well.

Ingestion

For ingestion, I wanted to get a grip on how to lift a JSON document from the file system and persist it as a collection in the document database over the listener. For this I wrote a standalone Java app. This was more of a utility, as the fun all happened in the notebook after the data was up there.
 

 
RivianDocDB.java

The above is quite close to Java trash, but it worked; we can see the collection in the collection browser in the deployment.

Databricks

Now, this takes a little bit of Databricks setup, but it is well worth it to work with PySpark for the fun part.

I added the two InterSystems drivers to the cluster, and put the certificate in the import_cloudsql_certficiate.sh cluster init script so it gets added to the keystore.

For completeness: the cluster is running Databricks 16, Spark 3.5.0, and Scala 2.12.
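For reference, the keystore import in such an init script would look roughly like this. This is a hedged sketch only: the certificate path, alias, and keystore password are my assumptions (changeit is merely the JVM default), not values from the article.

```shell
#!/bin/bash
# import_cloudsql_certficiate.sh (sketch) - adds the deployment's SSL
# certificate to the default JVM truststore on each cluster node.
# Certificate path, alias, and storepass below are assumptions.
keytool -importcert -noprompt \
  -keystore "$JAVA_HOME/lib/security/cacerts" \
  -storepass changeit \
  -alias cloudsql \
  -file /Volumes/cloudsql/iris/certificate.pem
```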

Visualization

So we should be set to run a PySpark job and plot where my whip has been in the subset of data I'll drag in.

We are using geopandas and geodatasets for a straightforward approach to plotting.

import geopandas as gpd
import geodatasets
from shapely.geometry import Polygon

Now, this takes a little getting used to, but here is the query to InterSystems Cloud Document using the JSON path syntax and JSON_TABLE.

dbtablequery = f"(SELECT TOP 1000 lat,longitude FROM JSON_TABLE(deezwatts2 FORMAT COLLECTION, '$' COLUMNS (lat VARCHAR(20) path '$.whip2.data.vehicleState.gnssLocation.latitude', longitude VARCHAR(20) path '$.whip2.data.vehicleState.gnssLocation.longitude' ))) AS temp_table;"

 

I did manage to find a site that makes it dead simple to create the JSON path: jsonpath.com.

Next we set up the connection to the IRIS Document Database deployment and read it into a dataframe.

# Read data from InterSystems Document Database via query above
df = (spark.read.format("jdbc") \
  .option("url", "jdbc:IRIS://k8s-05868f04-a88b7ecb-5c5e41660d-404345a22ba1370c.elb.us-east-1.amazonaws.com:443/USER") \
  .option("jars", "/Volumes/cloudsql/iris/irisvolume/intersystems-document-1.0.1.jar") \
  .option("driver", "com.intersystems.jdbc.IRISDriver") \
  .option("dbtable", dbtablequery) \
  .option("sql", "SELECT * FROM temp_table;") \
  .option("user", "SQLAdmin") \
  .option("password", "REDACTED") \
  .option("connection security level","10") \
  .option("sslConnection","true") \
  .load())


Next we grab an available map from geodatasets; the sdoh one is great for generic use of the United States.
 

# sdoh map is fantastic with bounding boxes
michigan = gpd.read_file(geodatasets.get_path("geoda.us_sdoh"))

# Convert to pandas once, then build the GeoDataFrame from the points
pdf = df.toPandas()
gdf = gpd.GeoDataFrame(
    pdf,
    geometry=gpd.points_from_xy(pdf['longitude'].astype(float), pdf['lat'].astype(float)),
    crs=michigan.crs  # "EPSG:4326"
)

Now the cool part: we want to zoom in to contain the geolocation points of where the R1S has driven, and for this we need a bounding box for the state of Michigan.

For this I used a really slick tool from Keene to draw the geofence bounding box, and it gives me the coordinates array!

Now that we have the coordinates array of the bounding box, we need to slap it into a Polygon object.

polygon = Polygon([
    (-87.286377, 45.9664245),
    (-81.6503906, 45.8134865),
    (-82.3864746, 42.1063737),
    (-84.7814941, 41.3520721),
    (-87.253418, 42.5045029),
    (-87.5610352, 45.8823607),
])
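As a quick sanity check before clipping, we can ask shapely whether sample coordinates fall inside the bounding box. The two test points below are my assumptions (roughly central Michigan and a point well outside the state), not values from the telemetry data:

```python
from shapely.geometry import Point, Polygon

# Same bounding box as above, drawn around Michigan
polygon = Polygon([
    (-87.286377, 45.9664245),
    (-81.6503906, 45.8134865),
    (-82.3864746, 42.1063737),
    (-84.7814941, 41.3520721),
    (-87.253418, 42.5045029),
    (-87.5610352, 45.8823607),
])

inside = Point(-84.5, 42.7)    # roughly Lansing, MI - should be inside
outside = Point(-80.0, 40.0)   # well outside Michigan
print(polygon.contains(inside), polygon.contains(outside))  # True False
```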

 

Now, let's plot the trail of the Rivian R1S! This covers about 1,000 records (I used a TOP statement in the query above to limit the results).
 

ax = michigan.clip(polygon).plot(color="lightblue", alpha=0.5,linewidth=0.8, edgecolor='gray')
ax.axis('off')
ax.annotate("Data: Rivian R1S Telemetry Data via InterSystems Document Database", xy=(0.01, .085), xycoords='figure fraction', fontsize=14, color='#555555')

gdf.plot(ax=ax, color="red", markersize=1.50, alpha=0.5)  # figsize is ignored when plotting onto an existing ax

 

And there we have it... Detroit, Traverse City, Silver Lake Sand Dunes, Holland, Mullet Lake, Interlochen... Pure Michigan, Rivian style.


 
