Datometry Hyper-Q uses user-defined functions (UDFs) to implement standard functionality; these are referred to as standard UDFs. The standard UDFs are distributed with each Hyper-Q release, and you must install them whenever you install or upgrade your Hyper-Q deployment.
Prerequisites
Before installing or upgrading the Hyper-Q UDFs, you must install Hyper-Q. To learn more, see the Hyper-Q installation documentation.
Important: If you have customized UDFs with the same names as those located in the directory /opt/datometry/dtm/udf/standard, you must preserve those custom UDFs and reinstall them after upgrading to the latest Hyper-Q release. Back up your custom UDFs and keep track of which UDFs in your Hyper-Q deployment have been modified. To learn more, see Upgrade Custom UDFs.
Upgrade Custom UDFs
In addition to the standard UDFs included with Hyper-Q, many deployments rely on custom UDFs. Custom UDFs perform specific tasks within your database environment that fall outside the functionality provided by the standard UDFs. When upgrading your Hyper-Q environment, you must preserve these custom UDFs and reinstall them after upgrading to the latest Hyper-Q release.
For example, Hyper-Q provides a standard HashRow UDF (labeled udf.database_name.hashrow.sql), which is located in the directory /opt/datometry/dtm/udf/standard. If your Hyper-Q deployment uses a customized version of the HashRow UDF with the same filename, it will be overwritten by the standard HashRow UDF when you upgrade. To preserve the custom HashRow UDF, create a backup copy of the custom definition before upgrading the standard UDFs, and reinstall it after upgrading Hyper-Q.
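As a concrete illustration, the backup-and-restore flow might look like the following shell sketch. The database name sales_db and the backup directory are hypothetical examples; only the /opt/datometry/dtm/udf/standard path comes from this article, and psql is shown as just one possible client.

```bash
# Sketch: back up a customized HashRow UDF before upgrading, then restore it.
# "sales_db" stands in for the database_name part of the filename, and
# ~/udf-backup is an arbitrary safe location.

# 1. Before the upgrade: copy the customized definition somewhere safe.
mkdir -p ~/udf-backup
cp /opt/datometry/dtm/udf/standard/udf.sales_db.hashrow.sql ~/udf-backup/

# 2. Upgrade Hyper-Q (the upgrade overwrites files in udf/standard).

# 3. After the upgrade: restore the custom definition and reinstall it
#    on the target data warehouse (psql shown here as one example client).
cp ~/udf-backup/udf.sales_db.hashrow.sql /opt/datometry/dtm/udf/standard/
psql -h <target-host> -d <target-db> -f ~/udf-backup/udf.sales_db.hashrow.sql
```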
Procedure
- Install or upgrade Hyper-Q using the RPM packages provided by Datometry.
- Locate the UDF source files in the directory /opt/datometry/dtm/udf/standard.
- (Optional) If your Hyper-Q deployment uses a non-standard (or modified) Metadata Store (MD Store), adjust the schema in the UDF source files to match your MD Store configuration by replacing __DTM_MDSTORE with the appropriate schema name (see the first sketch after this procedure).
- Execute the SQL contained in all of the UDF source files directly on the target data warehouse, using any suitable database client to connect to it, for example the psql utility (PostgreSQL), the sqlcmd utility for Microsoft Azure Synapse Analytics, SQL Server Management Studio (SSMS), or SQL Workbench/J (see the second sketch after this procedure).
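If the MD Store uses a non-default schema, a search-and-replace over the source files is one way to perform the substitution. The following is a minimal sketch: the working copy under ~/udf-install and the schema name my_mdstore are hypothetical examples, not part of the Hyper-Q distribution.

```bash
# Sketch: substitute the __DTM_MDSTORE placeholder with the actual MD Store
# schema name ("my_mdstore" here is a hypothetical example).
# Work on a copy so the shipped source files stay pristine.
cp -r /opt/datometry/dtm/udf/standard ~/udf-install
sed -i 's/__DTM_MDSTORE/my_mdstore/g' ~/udf-install/*.sql
```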
The UDFs are installed into the MD Store schema. Therefore, a database user with permission to create functions in that schema is required to perform this step.
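As one example of the execution step, the source files could be applied with psql in a loop. This is a sketch, not the only supported method: the connection parameters are placeholders, ~/udf-install refers to the working copy from the previous sketch (use /opt/datometry/dtm/udf/standard directly if no schema adjustment was needed), and the connecting user must be able to create functions in the MD Store schema.

```bash
# Sketch: install every UDF source file on the target data warehouse via psql.
# Connection parameters are placeholders; the user must have permission to
# create functions in the MD Store schema.
# ON_ERROR_STOP makes psql abort on the first failing statement.
for f in ~/udf-install/*.sql; do
    psql -h <target-host> -d <target-db> -U <install-user> \
         -v ON_ERROR_STOP=1 -f "$f"
done
```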
Note: You only need to perform the above procedure once per data warehouse instance, independent of the size of the Hyper-Q cluster connecting to that instance.
Some database clients require explicit batch handling to split multiple requests contained in the same file. For instructions on how to process batch files with the sqlcmd utility, see Process SQL Batches for Azure Synapse Analytics Using sqlcmd.
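For sqlcmd specifically, an invocation along the following lines could apply a single source file; the server, database, and credentials are placeholders. sqlcmd splits its input on GO separator lines, which is the batch handling the linked article describes.

```bash
# Sketch: run one UDF source file against Azure Synapse Analytics with sqlcmd.
# Server, database, and credentials are placeholders. sqlcmd executes the
# file batch by batch, splitting on GO separator lines.
# -I turns on quoted identifiers; -i names the input file.
sqlcmd -S <workspace>.sql.azuresynapse.net -d <target-db> \
       -U <install-user> -P <password> -I -i udf.database_name.hashrow.sql
```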