Nextcloud and S3 #6954
P.S. Permissions are full. `$fs_s3->createDir("../dir1");` works, and so does creating a file with `$fs_s3->write("../dir2/subdir5/subdir6/test.txt")`.
With the "Check for changes" option set, when NC accesses the directory, the ceph radosgw log shows that the directory contents are being read, but for some reason NC does not display them.
For testing, I added dir4 and subdirectories by writing with external scripts.
For some reason, the function filetype($path) in AmazonS3.php returns false on 'dir4'. NC log:
One possible fix: use the code from league\flysystem's AwsS3Adapter.php to determine whether a directory exists, and change the filetype() (file/dir) logic in AmazonS3.php accordingly:

```php
protected function doesDirectoryExist($location)
{
    // Maybe this isn't an actual key, but a prefix.
    // Do a prefix listing of objects to determine.
    $command = $this->s3Client->getCommand(
        'listObjects',
        [
            'Bucket'  => $this->bucket,
            'Prefix'  => rtrim($location, '/') . '/',
            'MaxKeys' => 1,
        ]
    );

    try {
        $result = $this->s3Client->execute($command);

        return $result['Contents'] || $result['CommonPrefixes'];
    } catch (S3Exception $e) {
        if ($e->getStatusCode() === 403) {
            return false;
        }

        throw $e;
    }
}
```
I'm having the exact same issue (NC 12.0.3). I do have the … However, a whole tree that I synced (i.e. …) does not appear.
@ldapomni I've spotted this in AmazonS3.php (lines 288–308):

```php
public function filetype($path) {
    $path = $this->normalizePath($path);

    if ($this->isRoot($path)) {
        return 'dir';
    }

    try {
        if ($this->getConnection()->doesObjectExist($this->bucket, $path)) {
            return 'file';
        }
        if ($this->getConnection()->doesObjectExist($this->bucket, $path . '/')) {
            return 'dir';
        }
    } catch (S3Exception $e) {
        \OCP\Util::logException('files_external', $e);
        return false;
    }

    return false;
}
```

However, I'm not sure which changes I should make based on your prior comment to test. Also, how can I debug this further and gather the needed info?
cc @icewind1991
I can't reproduce the problem with master (to become NC13) and Amazon S3 when creating a folder through the AWS console, adding a file to it, and then running a scan. Can anyone experiencing this problem try either the latest NC13 beta (as always when using beta software: make a backup) or 12.0.4, which should include the same S3 changes?
After updating to NC13, the problem still exists.
I have a very similar error when trying to delete a folder from an S3 bucket within NC.
Nextcloud 12.0.5
@kruffin I have the same issue.
This issue needs to be resolved ASAP. I can't see any files even after erasing and re-initializing the whole database, and deleting and re-adding the S3 storage in the NC web UI doesn't work.
I have the same problem. Snap Nextcloud 13.0.2 on Ubuntu 16.04. Any workarounds for this?
I am facing the same issue. The only workaround I found is to create the directory via the Nextcloud UI; the files will then appear. E.g. create the folder …
One weird thing I noticed is that my logs are not showing anything at all: no warnings, no errors. Also, …
The workaround made by @yashmehrotra worked for me. Thanks |
Same issue. Nextcloud 13.0.5.2. PHP 7.0.30. I'm using the S3 service from Digital Ocean, called "Spaces." The workaround mentioned by @yashmehrotra did not work for me. I can see the folder I "created" through the Nextcloud UI, but nothing under that folder, and occ files:scan generates the errors in the log. Note: I used s3cmd to create a top level folder called "books," many folders below that, and many files in each folder. In the log, the errors look like this: {"reqId":"...","level":0,"time":"2018-07-25T11:32:24-07:00","remoteAddr":"...","user":"...","app":"OC\Files\Cache\Scanner","method":"PROPFIND","url":"/remote.php/webdav/archive/books","message":"!!! Path 'books/bbc4' is not accessible or present !!!","userAgent":"...","version":"13.0.5.2"} There is no such error for the "books" folder. Only the subfolders. |
Same issue. The same issue exists with ownCloud; however, Pydio seems to handle the situation properly. It likely stems from the fact that S3 stores everything as objects, and "paths" are only implicit in that object names contain forward slashes ("/"). Therefore, some trickery is needed to emulate a standard directory hierarchy. NC is obviously aware of this, but it sounds like some small piece of the puzzle is missing.
Seeing the same issue after a fresh install of NC 14.0 on Docker |
The bucket is created automatically by Nextcloud when connecting to S3, so I think it uses the default setup for a bucket...
Thanks for suggesting versioning. For what it's worth, I'm using S3 storage provided by DigitalOcean, which does not yet support versioning. I have not reproduced the problem since my last comment. And I have not reproduced the problem on DigitalOcean and Amazon simultaneously to see if there is any difference. I hope to get back to that soon. It'd also be interesting to reproduce it with Storj or some other S3 service providers. |
I have found a workaround when all else fails... I wrote a C# script to copy the S3 directory structure (i.e. not download the files) to my computer. Then I used the Windows NC client (2.3.3) to sync the folder structure onto the NC server. Once you add the folders, the client will start showing the files, but it will also start syncing the files to your computer. To avoid downloading all the files, I added extensions like … Here's the C# script to sync the S3 directory structure to your computer:
I have the same issue here. Files uploaded via AWS CLI are not appearing on Nextcloud. All permissions are set correctly, not sure how to debug this too. |
I applied that a few days ago to 15.0.2, and it solved the problem for me. S3 external storages remain excruciatingly slow (at least using Digital Ocean Spaces), but new stuff does show up as expected after the patch. |
I think it now needs a button to manually trigger a file-list reload... Updating the list on every view can take half an hour for large buckets, and turning off refresh-on-view leaves new files invisible.
Hi, I agree that a sync button would be useful... I used the S3 upload API for the initial import, but afterwards I want to use only NC.
@kesselb Many people are saying that this patch (https://github.com/nextcloud/server/compare/bugfix/6954/scan-external-s3) fixed the problem. Will you be merging it into master?
@yroffin If you apply my patch, the scan command (https://docs.nextcloud.com/server/stable/admin_manual/configuration_server/occ_command.html#scan) should work. @yashmehrotra Yes.
I'm trying to work with S3 storage through Nextcloud, and both options it provides currently don't work well. So the solution might be either a manual rescan, or a tunable rescan period, like 10 minutes or half an hour, with no full rescans in between.
Picking "Scan never" and setting up the scan command (https://docs.nextcloud.com/server/stable/admin_manual/configuration_server/occ_command.html#scan) as a cron job could work for your use case.
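As a concrete sketch of that suggestion: with the mount's "Check for changes" set to "Never", a cron entry can run the scan periodically. The install path, log path, and mount path below are examples only; adjust them for your setup, and run the job as the web-server user.

```shell
# Hypothetical crontab entry for the web-server user (e.g. crontab -u www-data -e).
# Paths and the mount name are assumptions; adjust for your install.
# Every 15 minutes, rescan just the S3 mount so externally uploaded
# files show up without a full filesystem scan:
*/15 * * * * php /var/www/nextcloud/occ files:scan --path="/admin/files/s3mount" >> /var/log/nextcloud-scan.log 2>&1
```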
Would that be a recommended option then? Most people who use S3 have a bunch of files in it, that's the reason to use it |
No, but it could work for your use case. I think the cleanest way is to write the files to S3 through Nextcloud; in that case you can use "Scan never".
For some reason I did not have this issue in NC 15.0.5. However, once I upgraded to 15.0.7 I began to experience this issue. (I cannot say about 15.0.6)
I was unable to see these files from NC. I applied patch https://github.com/nextcloud/server/compare/bugfix/6954/scan-external-s3 (Here is the file: https://github.com/nextcloud/server/compare/bugfix/6954/scan-external-s3.patch) and it did correct this issue. Thank you! |
Like wanchic's reports, my Nextcloud has again lost access to my S3-compatible DigitalOcean Spaces external storage. I had installed that patch some time ago, and it worked. Since then, I have upgraded to 15.0.7, and my S3 external storage is again empty. Sorry, I can't say which version broke it. I'll try applying the patch again. I had understood the patch was adopted into the released Nextcloud. I wonder whether it got backed out somehow? I'll try to look through the source code history if I have time. Off topic: S3 external storages are basically unusable, even when they're working. They're just too slow. I tried setting "Check for Changes: Never," but it was still too slow to use. I'm no Nextcloud expert, but I have the impression Nextcloud keeps a database of metadata for known files, updated by "occ scan all." So my expectation is that Nextcloud will quickly display information from that database, whether or not it "checks for changes" in the background, even for S3 external storages. It appears instead it's crawling the directory every time I load the web page. That's a serious flaw that makes external storages completely unusable for me. I should search for that as a bug report and report it if I can't find anything. |
Not yet :( There is an open pull request, but I was a bit late for the Nextcloud 16 code freeze. It might be classified as a bugfix and backported.
I had the same issue with S3 DigitalOcean Spaces & Nextcloud, so to make Nextcloud efficient again, I had to switch to DO Block Storage. |
Unfortunately, I don't have a choice in the platform I use. Two weeks ago my client was mandated to use Heroku Shield in order to meet HIPAA compliance. Since Heroku uses an ephemeral disk platform, S3 was our only storage option that met HIPAA compliance for our client.
I tried copying relatively large directories from local disk to S3 with Nextcloud. Half of the folders are not visible, although they copied fine; I need to enable rescan again. So it's not a solution...
I applied the patch https://github.com/nextcloud/server/compare/bugfix/6954/scan-external-s3 and it fixed my issue. I'm running Nextcloud 15.0.7. Thank you!
still broken in 16.0.1 it seems but the patch still fixes it :( |
Same here:
I can confirm the patch mentioned above resolved the issue for me. I'm running nextcloud 16.0.1 and using Digital Ocean spaces for external storage, with folders/files populated using rclone directly to DO spaces. I basically just downloaded the modified AmazonS3.php file into |
@kesselb wrote, "@sbw Sounds good to me. Would you mind to create a new enhancement issue for this? (I think it should be like that maybe something broken)." Sorry, I can't tell whether that pertains to the original issue (S3 external storage stuff never shows up in Nextcloud) or my off-topic observation that "S3 external storages are basically unusable, even when they're working. They're just too slow." I hope it's the latter. I hope you're suggesting I create an "enhancement issue" to make S3 external storage fast enough to be usable. If that's what you're suggesting, I'll do that when I have time. (It's not an enhancement, really, but a defect, in my opinion: I believe Nextcloud is meant to immediately display the information it already has from a previous scan. I don't know what Nextcloud is doing instead, but whatever it is, it can't be the intended behavior.) |
Hey @alexklibisz, I was trying to follow what you did, but I couldn't modify or download the updated AmazonS3.php due to a read-only file system. How did you bypass it?
@sbw this one.
@hasangokdag If you are using the snap package applying this patch is not possible. |
I am running Nextcloud in a Docker container, so I have root access inside of it by default.
For those like me who use the snap and are impatient, you can upgrade to Nextcloud 17:
Nextcloud version: 12.0.0.3
Nextcloud is connected to S3. When files are copied into a directory of the bucket using the AWS S3 PHP SDK, Nextcloud does not see them. But if you look through the contents with S3 Browser, everything is there. If the files are written through Nextcloud, there is no problem. What could be the problem? Is there some kind of analogue of occ filesystem_external:scan?