
provide Olm as a natively linked module for riot-desktop #888

Closed
ara4n opened this issue Apr 27, 2017 · 8 comments

Comments

@ara4n
Member

ara4n commented Apr 27, 2017

No description provided.

@lampholder
Member

@ara4n what does this get us? I don't understand well enough to put any sort of priority on this :P

@ara4n
Member Author

ara4n commented Apr 30, 2017

It'd make crypto run ~50x faster on desktop, and make it more resistant to timing attacks. But given we need crypto to run at a usable speed on the web anyway, the 50x speedup is probably not going to change much in practice; it would mostly mean using less CPU. It's probably a P3 E2E feature.

@richvdh
Member

richvdh commented Jul 6, 2017

I'd prioritise element-hq/element-web#2503 and matrix-org/olm#3 over this.

@uhoreg
Member

uhoreg commented Jan 6, 2018

I've created a preliminary native module for olm, available at https://gitlab.com/uhoreg/olm-native. I haven't done much checking to make sure that it's correct, other than running the olm unit tests, but the results show a definite improvement in speed. On my computer:

transpiled JavaScript:

> jasmine-node test --verbose --junitreport --captureExceptions

têst1 -> { plaintext: 'têst1', message_index: 0 }
hot beverage: ☕ -> { plaintext: 'hot beverage: ☕', message_index: 1 }
☕ -> { plaintext: '☕', message_index: 2 }
têst1 -> têst1
hot beverage: ☕ -> hot beverage: ☕

megolm - 638 ms
    should encrypt and decrypt - 637 ms

olm - 1273 ms
    should encrypt and decrypt - 1273 ms

Finished in 1.916 seconds
2 tests, 11 assertions, 0 failures, 0 skipped

native library:

> jasmine-node test --verbose --junitreport --captureExceptions

têst1 -> { plaintext: 'têst1', message_index: 0 }
hot beverage: ☕ -> { plaintext: 'hot beverage: ☕', message_index: 1 }
☕ -> { plaintext: '☕', message_index: 2 }
têst1 -> têst1
hot beverage: ☕ -> hot beverage: ☕

megolm - 17 ms
    should encrypt and decrypt - 15 ms

olm - 5 ms
    should encrypt and decrypt - 4 ms

Finished in 0.027 seconds
2 tests, 11 assertions, 0 failures, 0 skipped

I ran the test a few times, and the numbers are fairly consistent. With the native version, megolm is about 37 times faster, and olm is about 250 times faster.
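The quoted speedup factors follow directly from the suite timings above; a quick sanity check (not part of the original test output):

```javascript
// Suite timings (ms) as reported in the two test runs above.
const transpiled = { megolm: 638, olm: 1273 };
const native = { megolm: 17, olm: 5 };

// Speedup factor = transpiled time / native time.
for (const suite of ["megolm", "olm"]) {
  const factor = transpiled[suite] / native[suite];
  console.log(`${suite}: ${factor.toFixed(1)}x faster`);
}
// megolm: 37.5x faster
// olm: 254.6x faster
```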

@MTRNord

MTRNord commented Jan 6, 2018

Regarding packaging native modules with Electron: see my spellcheck PR. I already added commands there that recompile everything against the correct Electron version at packaging time.
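As a general illustration (this is not the content of that PR), the usual way to glue a native addon to the packaged Electron's ABI is to rebuild it against Electron's headers at install time, e.g. with the `electron-rebuild` tool wired into `package.json` (version number here is illustrative):

```json
{
  "scripts": {
    "postinstall": "electron-rebuild"
  },
  "devDependencies": {
    "electron-rebuild": "^1.8.0"
  }
}
```

This recompiles every native dependency for the Electron runtime's Node ABI rather than the system Node's, which is exactly the "glue it to the correct electron" step.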

@turt2live
Member

Pardon my thickness, but isn't this WASM?

@jryans
Contributor

jryans commented Feb 15, 2019

I think this issue is suggesting we use Olm as a native binary library, such as *.dll / *.dylib / *.so and friends, with some JS bindings as an interface layer (instead of compiling the C code to WASM). That appears to be what https://gitlab.com/uhoreg/olm-native works towards.
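A minimal sketch of what the loading side of such an interface layer could look like, assuming hypothetical module names (neither `olm-native` nor the fallback path is implied by this issue):

```javascript
// Hypothetical loader: prefer a natively compiled olm addon (a .node
// binary built with node-gyp/N-API against the libolm shared library)
// when it is available, and fall back to the Emscripten/WASM build
// otherwise. Both module names below are illustrative.
function loadOlm(requireFn = require) {
  try {
    return requireFn("olm-native"); // native addon wrapping libolm
  } catch (e) {
    return requireFn("@matrix-org/olm"); // compiled-to-JS/WASM build
  }
}

module.exports = { loadOlm };
```

Because both builds would expose the same JS API, callers need not care which one they got; the desktop app simply gains the native speedup when the addon compiled successfully.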

@t3chguy t3chguy transferred this issue from element-hq/element-web Apr 19, 2023
@richvdh
Member

richvdh commented Apr 19, 2023

I don't think this is relevant to Element desktop any more.

@richvdh richvdh closed this as completed Apr 19, 2023
7 participants