AWS S3 sync multiple domains

Hey there,

is it possible to sync assets from multiple domains with https://github.com/Flownative/flow-aws-s3? Or is it only a single-domain storage extension?

Can I see the same storage folder / assets from different domains?

Thanks!

The concept of resource storages is a low-level Flow concept and concerns the whole installation, so yes, this applies to all your domains in the same installation. By default you can see the same assets from all sites. Asset collections (a higher-level concept) can be bound to certain sites; assets put into one of those collections are then only visible for that site.

Hi Christian,

the S3 plugin doesn’t really know about how the data it stores is used in your application, so it also doesn’t know about domains or sites.

That means: all your asset binary data for all domains / sites will usually be stored in the same S3 bucket. But since you never access this bucket / binary data directly, what counts is how Neos handles assets from different domains in the user interface.
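For context, the resource storage is configured once per installation in Settings.yaml, which is why it applies to all sites and domains alike. A rough sketch along the lines of the package's README (the bucket name and storage key below are placeholders; check the README for the exact options):

```yaml
Neos:
  Flow:
    resource:
      storages:
        # placeholder name for the storage definition
        s3PersistentResourcesStorage:
          storage: 'Flownative\Aws\S3\S3Storage'
          storageOptions:
            bucket: 'my-bucket'   # placeholder bucket name
      collections:
        # the default "persistent" collection now uses the S3 storage
        persistent:
          storage: 's3PersistentResourcesStorage'
```

Note that there is one such configuration per installation, not per site, which matches the behavior described above.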

To keep things organized and set access restrictions, you can use asset collections to your advantage: https://neos.readthedocs.io/en/stable/CreatingASite/MultiSite.html#separating-assets-between-sites

Ok thanks a lot!

But I don't get the updated asset list on each domain when something is changed, deleted, or uploaded, is that right? Do I have to trigger the asset sync again? I've got different databases …

maybe with:

./flow resource:copy --publish persistent tmpNewCollection

but it didn't help me … :frowning:

And is it possible to connect with TYPO3 storage folders? I can see that Neos is building cryptic folders on AWS, which is not helpful for connecting to TYPO3, I guess.

So I think it is not possible to get a resource list from AWS and import it into Neos from scratch. We have an existing asset list … which was not inserted by any Neos instance …

Yes, that's not possible. You need the Neos database as a registry for all the “resource” objects; the connection to the binary data in S3 is then made through the SHA1 hash.
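To illustrate why the bucket contents look “cryptic”: Flow addresses a resource's binary data by the SHA1 hash of its content rather than by a human-readable file name. A minimal sketch (the helper name here is my own, not a Flow API):

```python
import hashlib

def resource_storage_key(data: bytes) -> str:
    """Illustrative only: Flow identifies a resource's binary data
    by the SHA1 hash of its content, not by its original file name."""
    return hashlib.sha1(data).hexdigest()

key = resource_storage_key(b"example binary content")
print(key)  # a 40-character hex digest used to address the object
```

This is why an S3 bucket filled by Neos cannot be browsed meaningfully by another application, and why external files cannot simply be dropped into the bucket: without the database entry that maps the hash to an asset, Neos has no way to find them.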

If you want to connect existing data from S3 with Neos, you actually need to import it as assets again, like any other uploaded or imported asset.

But of course your scenario could also be implemented as a new feature …

Ok, thank you for the advice to import the assets. It would be great to implement this as a new feature :smiley:

Just a semi-good idea, because neither system would know when to actually delete an asset from the storage. Each would delete it as soon as it no longer holds any references, since that is how the system was designed and planned. IMHO it's not a good idea to share this storage across applications that do not share data.

Yes, I know it is not synced very well. But we have a huge archive of pictures, PDFs, and other files we want to provide for each product website. We grant only read access, so the management of these files is done somewhere else.

It makes no sense to share data between each product website and the company site, because they want to have different data on each site for the same product. So we started to use only the same picture database to build content elements.

I'll give you an example:

http://www.company-xy.de (access to all product pictures)
http://www.marke-xy.de (everything from this product)
http://www.merke-xy-experten.de (some product pictures)

We are talking about 20 different product sites and 5 company domains or more …

If we change one file in the S3 storage, it should be changed on every site. And if it's deleted, it should disappear everywhere.

I'm now working on a command to import those S3 files into Neos. I will see what's possible, and we will see whether it fits our needs …

You might have a look at https://pydio.com/; it's a central storage management tool that can also connect to S3.

By the way, what about WebDAV? It has restriction management, but I didn't find any existing package for Neos. That would also be good, maybe even better. Maybe I will try to build a connector.

Thanks.