OpenSearch encryption/tokenization of data at rest

Hello there:

First of all, we are very excited about the new, truly open-source flavor of Elasticsearch.

I am interested to know whether there are any plans to make data-at-rest encryption publicly available in the future, and also any plans to support tokenizing and detokenizing data.

Please let me know.

Regards,

Kiran.

Glad to hear you share in the excitement :slight_smile: Thanks for your question, Kiran. Could you please expand on your use case or the problem you are trying to solve? Also, can you elaborate on what you mean by tokenizing / de-tokenizing data?

Hi - We have a security requirement to tokenize data such as personally identifiable data when it is stored on disk, so that if anyone gets access to the disk they cannot read it, because the data is scrambled/tokenized using a key. At the same time, when the data needs to be read via OpenSearch Dashboards, we need the ability to de-tokenize/descramble it back to the original text using a key and present it to the user.

In addition, it would also help if OpenSearch could store data on disk in an encrypted format at rest.
I see that the Amazon OpenSearch Service provides this capability; how can we get data-at-rest encryption on Azure? Any input would be appreciated.
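For illustration only, here is a minimal sketch of doing the tokenize / de-tokenize step in the application layer before data ever reaches OpenSearch, using the opensearch-py client and a Fernet symmetric key. The index name, field names, and key handling are assumptions for the example; this is not a built-in OpenSearch feature.

```python
# Sketch: application-side tokenization of a PII field before indexing,
# and de-tokenization when reading the document back for display.
# Key management (storage in a vault/KMS, rotation, etc.) is out of scope here.
from cryptography.fernet import Fernet
from opensearchpy import OpenSearch

key = Fernet.generate_key()  # in practice, load this from a KMS or vault
fernet = Fernet(key)

client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])

def tokenize(value: str) -> str:
    """Encrypt a sensitive field so only the ciphertext reaches the cluster."""
    return fernet.encrypt(value.encode("utf-8")).decode("ascii")

def detokenize(token: str) -> str:
    """Recover the original value for display, e.g. in a dashboard backend."""
    return fernet.decrypt(token.encode("ascii")).decode("utf-8")

# Hypothetical document and index used only for this example.
doc = {"customer_name": tokenize("Jane Doe"), "order_total": 42.50}
client.index(index="orders", id="1", body=doc, refresh=True)

stored = client.get(index="orders", id="1")["_source"]
print(detokenize(stored["customer_name"]))  # -> "Jane Doe"
```

Note that with randomized encryption like this, the protected field can no longer be searched by its plaintext value on the cluster; that gap is what the vendor approaches below aim to address.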

Kiran Venkatesan,
We have a solution (an OpenSearch plugin) that encrypts all sensitive data before OpenSearch indexes it and fulfills all searches normally, with <8% overhead on ingest and <3% overhead on search.
Check us out. If you like what you see, you can schedule a consultation on our website or email me at pakshi@titaniam.io.

Hi, Kiran,

We have a proxy that sits in front of OpenSearch and handles tokenization of fields that are configured as protected. It is primarily targeted at multi-tenant applications, so the tokenization uses different keys per tenant to prevent data leakage across tenants.

More info at Cloaked Search Overview | IronCore Labs
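For illustration, here is a rough sketch of the per-tenant keying idea described above: derive an independent key per tenant from a master key, then tokenize protected fields under that key so identical values in different tenants produce unrelated tokens. This is not IronCore's actual implementation; the master key, tenant IDs, and HMAC-based tokens are assumptions for the example (a real product would use reversible or searchable encryption rather than a plain HMAC).

```python
# Sketch: deterministic tokenization of protected fields with per-tenant keys,
# so the same plaintext maps to different tokens for different tenants.
import hmac
import hashlib
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

MASTER_KEY = b"\x00" * 32  # placeholder; a real deployment would load this from a KMS

def tenant_key(tenant_id: str) -> bytes:
    """Derive an independent tokenization key for each tenant from the master key."""
    hkdf = HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=tenant_id.encode("utf-8"),
    )
    return hkdf.derive(MASTER_KEY)

def tokenize_field(tenant_id: str, value: str) -> str:
    """HMAC the field value under the tenant's key; the token is what gets indexed,
    so exact-match lookups still work but the plaintext never reaches the index."""
    return hmac.new(tenant_key(tenant_id), value.encode("utf-8"), hashlib.sha256).hexdigest()

# The same email under two tenants yields two unrelated tokens, so one
# tenant's data cannot be correlated with another's.
print(tokenize_field("tenant-a", "jane@example.com"))
print(tokenize_field("tenant-b", "jane@example.com"))
```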