Sameera De Silva
2 min read · Jul 30, 2021

Esrally: how to write to multiple indices in parallel to performance-test Elasticsearch

The track.json below ingests data sequentially: Rally first ingests documents into the geocustom index, and then into customrecords.

{
  "version": 2,
  "description": "Tutorial benchmark for Rally",
  "indices": [
    {
      "name": "customrecords",
      "body": "",
      "types": [ "docs" ]
    }
  ],
  "corpora": [
    {
      "name": "rally-tutorial",
      "documents": [
        {
          "source-file": "documents.json",
          "document-count": 12109130,
          "uncompressed-bytes": 12258880607,
          "target-index": "geocustom"
        },
        {
          "source-file": "samples.json",
          "document-count": 85901,
          "uncompressed-bytes": 52313709,
          "target-index": "customrecords"
        }
      ]
    }
  ],
  "schedule": [
    {
      "operation": {
        "operation-type": "create-index"
      }
    },
    {
      "operation": {
        "operation-type": "cluster-health",
        "request-params": {
          "wait_for_status": "green"
        }
      }
    },
    {
      "operation": {
        "operation-type": "bulk",
        "bulk-size": 100
      },
      "warmup-time-period": 120,
      "clients": 8
    }
  ]
}
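Rally checks `document-count` and `uncompressed-bytes` against the corpus file, so it is worth verifying both before starting a race. A minimal sketch of how to compute them for a JSON-lines corpus (the three-line sample file here is made up purely for illustration; in practice run `wc` against your real documents.json):

```shell
# Create a tiny JSON-lines corpus just to illustrate the check.
printf '{"id": 1}\n{"id": 2}\n{"id": 3}\n' > documents.json

# One document per line, so:
#   document-count     = line count
#   uncompressed-bytes = file size in bytes
wc -l < documents.json
wc -c < documents.json
```

Plug the two numbers into the corresponding fields of the track; a mismatch makes Rally abort the race with an error.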

In the real world, documents arrive at multiple indices in parallel. To simulate that, you can use the track.json below and run it with the command that follows.

{
  "version": 2,
  "description": "Tutorial benchmark for Rally",
  "indices": [
    {
      "name": "customrecords",
      "body": ""
    },
    {
      "name": "geocustom",
      "body": ""
    }
  ],
  "corpora": [
    {
      "name": "rally-tutorial",
      "documents": [
        {
          "source-file": "samples.json",
          "document-count": 100000,
          "uncompressed-bytes": 3295600000,
          "target-index": "customrecords"
        },
        {
          "source-file": "samples.json",
          "document-count": 100000,
          "uncompressed-bytes": 3295600000,
          "target-index": "geocustom"
        }
      ]
    }
  ],
  "schedule": [
    {
      "operation": {
        "operation-type": "create-index"
      }
    },
    {
      "operation": {
        "operation-type": "cluster-health",
        "request-params": {
          "wait_for_status": "yellow"
        }
      }
    },
    {
      "parallel": {
        "tasks": [
          {
            "name": "bulk1",
            "operation": {
              "operation-type": "bulk",
              "indices": "customrecords",
              "bulk-size": 100
            },
            "warmup-time-period": 0,
            "clients": 1
          },
          {
            "name": "bulk2",
            "operation": {
              "operation-type": "bulk",
              "indices": "geocustom",
              "bulk-size": 100
            },
            "warmup-time-period": 0,
            "clients": 1
          }
        ]
      }
    }
  ]
}
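A quick sanity check before running a track is that every `target-index` in `corpora` is also declared under `indices`; otherwise the `create-index` step never creates it and bulk requests go to an index with default mappings. A minimal sketch of such a check (this is a standalone helper, not part of Rally, and the embedded track is abbreviated for illustration):

```python
import json

# Abbreviated copy of the parallel track above (illustrative only).
track = json.loads("""
{
  "indices": [{"name": "customrecords", "body": ""},
              {"name": "geocustom", "body": ""}],
  "corpora": [{"name": "rally-tutorial",
               "documents": [{"source-file": "samples.json", "target-index": "customrecords"},
                             {"source-file": "samples.json", "target-index": "geocustom"}]}]
}
""")

declared = {idx["name"] for idx in track["indices"]}
targets = {doc["target-index"]
           for corpus in track["corpora"]
           for doc in corpus["documents"]}

# Any target index that is never declared would be missed by create-index.
missing = targets - declared
print(sorted(missing))  # [] when every target index is declared
```

Running the same check against the first (sequential) track would flag geocustom, since it receives documents but is never declared under `indices`.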

Save the file as track.json and run it on CentOS with the command below.

nohup /usr/local/bin/esrally --track-path=/home/usernamefolder/samfiles --target-hosts=https://demo-site.tools --pipeline=benchmark-only --client-options="use_ssl:true,verify_certs:false,basic_auth_user:'username',basic_auth_password:'password',timeout:120" >> rally_stdout.log 2>> rally_stderr.log < /dev/null &

If existing Rally processes are still running, the new race will not start, so kill them first:
pgrep esrally
kill -KILL <pid>
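The two steps above can be combined: `pkill` matches processes by name and signals them in one command (assuming the procps tools are installed, as they are on a default CentOS system):

```shell
# Kill any esrally processes by name. pkill exits non-zero when nothing
# matches, so the fallback message keeps this safe to run unconditionally.
pkill -KILL esrally || echo "no esrally processes found"
```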
If ingestion is running in parallel, Rally's progress output shows the two bulk tasks (bulk1 and bulk2) advancing together.

Credit: https://discuss.elastic.co/t/esrally-ingesting-to-two-indices-parallelly/279933/4
