Setting up the event stream
To be completed. Set up using Amazon EventBridge.
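Until this is completed, the sketch below shows the general shape of the AWS side, assuming the Amazon EventBridge integration has already been added to your Genesys Cloud org (which creates a partner event source in your AWS account). The source, bus, and rule names here are illustrative only, not values CX Vizz requires:

# List the partner event source created by the Genesys Cloud integration
aws events list-event-sources
# Associate it with a new event bus (substitute the source name listed above)
aws events create-event-bus --name genesys-events --event-source-name "aws.partner/genesys.com/cloud/EXAMPLE"
# Route everything from that source onto the bus with a minimal prefix match
# (a target would still need attaching afterwards with aws events put-targets)
aws events put-rule --name genesys-all-events --event-bus-name genesys-events --event-pattern '{"source":[{"prefix":"aws.partner/genesys.com"}]}'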
Initial configuration
Here is an example configuration for Postgres databases:
# Postgres database
STORAGE=PGSQL
# Genesys region - this is EU-WEST-2 (London); choose yours
BASE_URL=https://api.euw2.pure.cloud
# Your CX Vizz OAuth Client ID
CLIENT_ID=########-####-####-####-############
# Your CX Vizz OAuth Secret
SECRET=###########################################
# The name of your organisation for diagnostic purposes
NAME=kerv-genesys
# Your database connection string
SQL_PARAM=Host=a.b.c.d;Database=####;User ID=postgres;Password=########
# Only required if you export dynamic tables, for example, outbound campaign dialling lists
DYNAMIC_SQL_PARAM=Host=a.b.c.d;Database=######;User ID=postgres;Password=#########
# The earliest date that will be used for all data extraction
START_DATE=2024-01-01
# Used for downloading very recent conversations; use the URL that you have been allocated by Kerv
GENESYS_EVENTS_URL=https://u2fm4sjmyj.execute-api.eu-west-2.amazonaws.com
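Before starting CX Vizz, it is worth confirming that the connection string actually works. A quick sanity check with psql (assuming it is installed on the same host) might look like this, substituting your real values:

psql "host=a.b.c.d dbname=#### user=postgres password=########" -c "SELECT 1;"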
Here is the same example for Microsoft SQL Server:
# Microsoft SQL database
STORAGE=MSSQL
# Genesys region - this is EU-WEST-2 (London); choose yours
BASE_URL=https://api.euw2.pure.cloud
# Your CX Vizz OAuth Client ID
CLIENT_ID=########-####-####-####-############
# Your CX Vizz OAuth Secret
SECRET=###########################################
# The name of your organisation for diagnostic purposes
NAME=kerv-genesys
# Your database connection string
SQL_PARAM=Server=a.b.c.d,1433;Initial Catalog=#######;Persist Security Info=True;User ID=#####;Password=######;MultipleActiveResultSets=True;Encrypt=True;TrustServerCertificate=True;Connection Timeout=30;
# Only required if you export dynamic tables, for example, outbound campaign dialling lists
DYNAMIC_SQL_PARAM=Server=a.b.c.d,1433;Initial Catalog=######;Persist Security Info=True;User ID=#####;Password=######;MultipleActiveResultSets=True;Encrypt=True;TrustServerCertificate=True;Connection Timeout=30;
# The earliest date that will be used for all data extraction
START_DATE=2024-01-01
# Used for downloading very recent conversations; use the URL that you have been allocated by Kerv
GENESYS_EVENTS_URL=https://u2fm4sjmyj.execute-api.eu-west-2.amazonaws.com
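As with Postgres, a quick connectivity check before starting CX Vizz can save time. One option, assuming sqlcmd is installed, is the following (in newer versions of sqlcmd, -C trusts the server certificate, matching TrustServerCertificate=True above):

sqlcmd -S a.b.c.d,1433 -d ###### -U ##### -P ###### -C -Q "SELECT 1"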
Downloading initial metadata
INSERT the following rows into the command_action table:
INSERT INTO command_action ("key", command) VALUES ('00000000-0000-0000-0000-000000000001', 'queues');
INSERT INTO command_action ("key", command) VALUES ('00000000-0000-0000-0000-000000000002', 'wrap_codes');
INSERT INTO command_action ("key", command) VALUES ('00000000-0000-0000-0000-000000000003', 'presence_definitions');
INSERT INTO command_action ("key", command) VALUES ('00000000-0000-0000-0000-000000000004', 'groups');
INSERT INTO command_action ("key", command) VALUES ('00000000-0000-0000-0000-000000000005', 'users');
INSERT INTO command_action ("key", command) VALUES ('00000000-0000-0000-0000-000000000006', 'divisions');
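Before starting CX Vizz, you can confirm that all six commands are queued:

SELECT "key", command FROM command_action ORDER BY command;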
Now run CX Vizz.
This will, as a one-off action, populate the database with the basics:
- All the queue definitions
- All the wrap-up code definitions
- All the presence definitions
- All groups
- All users
- All divisions
Depending on your volume and what your START_DATE is set to, this may take a while - it normally completes within a few hours for a moderately busy site with a few hundred agents operating 24/7. The log file will give you a good estimate of how long it has left to run.
Downloading initial conversations
INSERT INTO command_action ("key", command) VALUES ('00000000-0000-0000-0000-000000000001', 'conversations_from_datalake');
This will download all conversations from START_DATE until the most recent Genesys Datalake date.
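If you want to gauge progress from the database rather than the log file, a simple row count works; note that the table name below is an assumption, so check your schema for the actual name:

-- Hypothetical progress check; adjust the table name to match your schema
SELECT count(*) AS rows_so_far FROM conversations;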
Creating an initial schedule
INSERT INTO job_schedule ("key", cron_expression, "status") VALUES ('DECAFBAD-F00D-CAFE-0000-000000000000', '0 0 5,6,7,8,10,12,14,16,18,19 * * *', 'Idle');
INSERT INTO job_schedule_action( "key", command, job_schedule_key, "order") VALUES ('DECAFBAD-F00D-CAFE-1111-000000000001', 'queues', 'DECAFBAD-F00D-CAFE-0000-000000000000', 20);
INSERT INTO job_schedule_action( "key", command, job_schedule_key, "order") VALUES ('DECAFBAD-F00D-CAFE-1111-000000000003', 'wrapup_codes', 'DECAFBAD-F00D-CAFE-0000-000000000000', 40);
INSERT INTO job_schedule_action( "key", command, job_schedule_key, "order") VALUES ('DECAFBAD-F00D-CAFE-1111-000000000005', 'presence_definitions', 'DECAFBAD-F00D-CAFE-0000-000000000000', 60);
INSERT INTO job_schedule_action( "key", command, job_schedule_key, "order") VALUES ('DECAFBAD-F00D-CAFE-1111-000000000008', 'groups', 'DECAFBAD-F00D-CAFE-0000-000000000000', 90);
INSERT INTO job_schedule_action( "key", command, job_schedule_key, "order") VALUES ('DECAFBAD-F00D-CAFE-1111-00000000000A', 'users', 'DECAFBAD-F00D-CAFE-0000-000000000000', 110);
INSERT INTO job_schedule_action( "key", command, job_schedule_key, "order") VALUES ('DECAFBAD-F00D-CAFE-1111-00000000000C', 'divisions', 'DECAFBAD-F00D-CAFE-0000-000000000000', 130);
INSERT INTO job_schedule_action( "key", command, job_schedule_key, "order") VALUES ('DECAFBAD-F00D-CAFE-1111-000000000012', 'conversations_from_datalake', 'DECAFBAD-F00D-CAFE-0000-000000000000', 200);
This will perform the same actions as above, but on a recurring schedule: every day at 5am, 6am, 7am, 8am, 10am, 12pm, 2pm, 4pm, 6pm and 7pm.
Note: all times are in UTC, not in local time.
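You can review what the schedule will run, and in what order, by joining the two tables back together:

-- Show each scheduled action in execution order
SELECT js.cron_expression, a.command, a."order"
FROM job_schedule js
JOIN job_schedule_action a ON a.job_schedule_key = js."key"
ORDER BY a."order";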
Setting up background tasks
INSERT INTO background_tasks( "key", command) VALUES ('DECAFBAD-F00D-CAFE-2222-000000000001', 'realtime_queue_conversations');
INSERT INTO background_tasks( "key", command) VALUES ('DECAFBAD-F00D-CAFE-2222-000000000002', 'realtime_queue_observations');
INSERT INTO background_tasks( "key", command) VALUES ('DECAFBAD-F00D-CAFE-2222-000000000003', 'realtime_user_presence');
This will create three background tasks, which keep the queue_conversations, queue_observations, and user_presence tables up to date in real time.
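To confirm the tasks are registered, query the table back:

SELECT "key", command FROM background_tasks;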
