Replies: 18 comments 51 replies
-
Testing now. The platform is NethServer 7 (CentOS 7), which previously had 2.4 running from source.
Installation was fine; it didn't add the symlinks for the system-wide variable, but from the instructions above that looks to be correct. I haven't been able to run it yet; I'm getting the following errors:
-
@tdcockers You may need to delete your PR download location to ensure you get the updated build.
-
The client just crashed and restarted, but this is quite possibly environment related, as I'm running it on a system with 4GB RAM and it's handling a large number of files. Errors attached.
-
OK .. The client is generating a compatible /delta response because
Please ensure you add
The options
That should be the correct experience, as you are using
Possibly not. If you are running out of inotify watches, the application should be advising that this is the case (hence the original error you got). There are potentially two issues going on here:
I am going to switch to using a CentOS 7 system for some SharePoint testing and see if I can re-create some of these issues you are seeing.
-
@tdcockers
-
@tdcockers If you rebuild, you should get the updated build now. I am testing a 100K fileset against a SharePoint Library at the moment.
-
@tdcockers
-
@bpozdena, @tdcockers
-
One performance suggestion is that the deltaLink should only be updated once we have actually processed all changes. Although it was mentioned earlier that this was not the intended use case, another concern is that the design of this version causes a lot of memory usage: the temporary storage of 300k items takes up about 2.5GB of memory, so for me a startup sync on this version probably requires at least 10GB of RAM.
-
@JC-comp
Many thanks for calling this out - and yes, you are right. The DB should only be updated once the processing of all JSON items has been completed, so that if there is a failure for whatever reason, the last known deltaLink is used on the next application startup. That is an easy fix to add.
Thank you for providing this real-world insight. In v2.4.x the scope of the JSON responses was reduced, so that only the elements that were needed were requested. In the current v2.5.0-alpha-X code, I have specifically unset that filter so that all JSON elements are returned - thus this inflates the actual memory being used. At some point before a -beta or -release-candidate release, this was going to be re-examined to ensure that only the required items and elements are requested and returned, which would help with the local memory aspect, but also with the time to receive and process objects, as there is less data being sent|received and processed. The challenge here is the following:
If we process each changeset bundle as it is received, this leads to longer processing times and limits how the download and other operations can be pipelined|parallelised. This is why, when re-developing the code after looking at and testing many different options, the approach of quickly validating the JSON (should this be included or excluded), processing root objects first, then storing any valid request for further processing worked the best performance-wise.
Potentially not - even if you are getting a /delta response for your entire 1.7M objects online, whilst you will receive many changeset bundles, that initial inclusion|exclusion filtering is going to discard many objects, and these won't be kept in memory - only the objects that are going to be processed will be held in memory until they are processed. When I was performing a
-
Another performance concern is that the throughput drops off dramatically over time in
-
In this release, the full scan on startup is deactivated. If this is by design, perhaps we need something like a database consistency check (without touching remote files) before syncing changes from online to local, since the database can no longer reflect the true local state. And when performing a full scan, should we first mark all items in the DB as out-of-sync and then remove the items still marked out-of-sync at the end? Say a remote item is deleted: the DB would otherwise not be aware of this.
-
I've noticed that the
-
@JC-comp As such, could you please test the following PR version. Main updates:
Using this updated version, if you are able to gather some performance metrics following https://github.com/abraunegg/onedrive/wiki/Generate-system-performance-data-for-performance-related-issues that would be greatly appreciated.
-
Is there an ETA for this version being feature complete with the stable releases (the listed not-working features)? And an ETA for when enough testing will have been done for this to go live?
-
@abraunegg
Steps to reproduce
-
However, when I do a ^C at that exact point - querying the API, getting the data at the same point you are performing the ^C - there is no segfault:
-
@JC-comp Default Ubuntu 22.04 using 'ldc' Ubuntu Package
Remove Ubuntu 'ldc' package and use 'ldc' from GitHub
Note: There is no segfault when not using the Ubuntu 'ldc' package - this segfault / core creation is solely a problem of the Ubuntu-provided 'ldc' package.
-
As part of the ongoing development of v2.5.0, client version v2.5.0-alpha-2 is now available for testing.
Changelog from alpha-1:
What's still missing?
You can use your normal account if you wish (this is your choice); however, this is 100% alpha-quality software at the moment. Experiencing data loss or other problems with your real data is a genuine possibility, so please make every effort to not use this code with your real data.
There will be functionality differences, there will be bugs or things that are not working right.
My ask is that you create a test report - what works, what does not work, what could potentially be changed | improved - basically I am soliciting feedback on where this total re-write is heading as it matures towards a v2.5.0 release.
How can you get v2.5.0-alpha-2?
You need to build this client from source. Follow the high level directions here to get the development requirements installed first: https://github.com/abraunegg/onedrive/blob/master/docs/INSTALL.md#building-from-source---high-level-requirements
Next clone and checkout the PR:
To run the PR, you need to run the client from the PR build directory:
When running the PR, your version should be as per below or greater:
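The exact commands were not captured in this page; a sketch of the usual flow would be as below. `NNNN` is a placeholder PR number (the real one is referenced in the PR link above), and the build steps assume the project's standard configure/make build described in INSTALL.md:

```shell
# Clone the repository and check out the PR branch.
# 'NNNN' is a placeholder - substitute the actual PR number.
git clone https://github.com/abraunegg/onedrive.git
cd onedrive
git fetch origin pull/NNNN/head:pr-branch
git checkout pr-branch

# Build in place (install the prerequisites from INSTALL.md first)
./configure
make clean; make

# Run the client directly from the build directory and check the version
./onedrive --version
```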
The important point is `v2.5.0-alpha-2`. The `v2.4.25-XX-abcdefg` portion will change as new commits are put into the PR branch, whilst things are added/fixed/updated.

What is known to be not functioning right at this stage:

These items will not be addressed in `alpha-2`, but will need to be focused upon for future releases once feature parity with v2.4.25 is reached. Please do not raise an issue ticket because of this.

I have no test files or test data - what can I use?
In developing this application I use a script as per below to generate random data:
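The script itself was not captured in this page. A minimal sketch consistent with the description that follows - `DESTDIR`, `FOLDERS`, `FILES` and `BLOCKSIZE` are the variables the original mentions, while the exact random-sizing logic (1-30 blocks per file) is an assumption - might look like:

```shell
#!/bin/sh
# Hypothetical reconstruction of the random test-data generator.
DESTDIR="${DESTDIR:-/tmp/onedrive-testdata}"  # where to create the data
FOLDERS="${FOLDERS:-10}"                      # number of folders
FILES="${FILES:-10}"                          # files per folder
BLOCKSIZE="${BLOCKSIZE:-4}"                   # dd block size in KB (use ^2 values)

mkdir -p "$DESTDIR"
i=1
while [ "$i" -le "$FOLDERS" ]; do
  mkdir -p "$DESTDIR/folder_$i"
  j=1
  while [ "$j" -le "$FILES" ]; do
    # a random block count gives each file a random size
    count=$(shuf -i 1-30 -n 1)
    dd if=/dev/urandom of="$DESTDIR/folder_$i/file_$j.bin" \
       bs="${BLOCKSIZE}k" count="$count" status=none
    j=$((j + 1))
  done
  i=$((i + 1))
done
```

With these defaults the file sizes land in roughly the 4KB-120KB range, close to the 6KB-125KB the description quotes; raising `BLOCKSIZE` scales all files up proportionally.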
Update `DESTDIR` with the right directory and adjust `FILES` and `FOLDERS` as desired. The above settings will create 10 folders, with each folder having 10 files, and each file sized between 6KB and 125KB.

To increase (or decrease) the random file size, change `BLOCKSIZE`; generally use ^2 values: 1, 2, 4, 8, 16, 32, 64, 128, 256 and so on. At 128 you will get files from 100KB -> ~3.8MB; at 256 most files will be between 500KB -> 7.5MB.