• Where can I find folders and files stored in OpenKM?

  • OpenKM has many interesting features, but requires some configuration to show its full potential.
 #43604  by jllort
 
OpenKM separates metadata from binary content. All metadata, such as folder names, document names, and creation dates, is stored in the database. Depending on which OpenKM datastore is used, the location of the binary data may change (usually the file system, but it can also be the database). If you are using the default OpenKM datastore (which we suggest for best performance), the data is stored under $TOMCAT_HOME/repository/datastore

You should understand these database tables https://docs.openkm.com/kcenter/view/ok ... odeversion and, based on the UUID of the document version, you can work out the disk location. For example, the document version with uuid = '87220c50-1f5b-11e7-9598-0800200c9a66' will be in the folder $TOMCAT_HOME/repository/datastore/87/22/0c/50
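The mapping from UUID to datastore path can be sketched as follows. This is a minimal illustration of the layout described above (the class and method names here are made up for the example, not OpenKM's actual implementation):

```java
public class DatastorePath {
    // Derive the nested datastore directory from a node UUID: the first
    // eight hex characters are split into four two-character directories.
    static String pathFor(String uuid) {
        String hex = uuid.replace("-", "");
        return hex.substring(0, 2) + "/" + hex.substring(2, 4) + "/"
             + hex.substring(4, 6) + "/" + hex.substring(6, 8);
    }

    public static void main(String[] args) {
        // The example UUID from the post above:
        System.out.println("$TOMCAT_HOME/repository/datastore/"
            + pathFor("87220c50-1f5b-11e7-9598-0800200c9a66"));
    }
}
```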

Finally, you can build your own datastore and distribute documents based on other logic.
 #43608  by Florentin
 
Thank you for your explanation. I noticed that we can export the data and see it in the correct format through the Administration section. Is it possible to make the export automatic and periodic?
I would like to make a kind of backup of my archived files.
Do you have another solution for backing up OpenKM?

Thanks in advance!
 #43614  by jllort
 
First of all, if you are in a production scenario you should use MySQL, Oracle, PostgreSQL, SQL Server, etc. (if you are using the default configuration, you should migrate to one of these).
Are you on Linux or on Windows?
 #43633  by jllort
 
Please do not merge several questions in the same post; for each question add a new post (otherwise other readers get confused when reading a topic that talks about a lot of things).
 #45511  by rosario
 
@Florentin, could you please share the cronjob you used (its template of course) to periodically export the documents using the tool provided in the Administration section?

Thank you
Rosario
 #45515  by silverspr
 
Hi Rosario
You can find an export script here: https://docs.openkm.com/kcenter/view/ok ... orter.html
You will need to copy and modify it to suit your needs, i.e. the configuration parameters such as Fspath. Then save the file as a .bsh file, e.g. myfile.bsh.
Create a new cron job in OpenKM under Administration and upload your customized .bsh file.
Hope that helps
d
 #45527  by rosario
 
Hi silverspr, thank you for the quick reply.

I noted on the admin dashboard that I can choose to export both metadata and history, together with the files and the tree structure.

Is this also an option I have in the script you pointed me to, or is it something I need to work on?

What happens if I export twice on the same destination folder?

Is the log file of the cronjob appended or rewritten?

Thank you again for your support, have a great weekend.
Rosario
 #45529  by silverspr
 
Hello Rosario
Both metadata and history are configurable under the configuration parameters section; set each of those items to true.
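In the exporter script these are plain boolean configuration variables near the top. The names below are illustrative only; check the actual .bsh script for the exact identifiers:

```java
// Illustrative configuration block of the exporter .bsh script
// (the real variable names may differ):
boolean exportMetadata = true;  // also export document metadata
boolean exportHistory  = true;  // also export version history
```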
The destination folder is overwritten each time the export is run.
Sorry I can't answer your question regarding the Cron job log: If I knew where to look for it within OpenKM I could tell you (I'm a client/user like yourself). I did a quick search for a log file and didn't come up with anything.
thanks
d
 #45530  by jllort
 
If you export twice or more, the files will be overwritten. I suspect you want it for some reason; can you explain your scenario?

For the log file, there's an export log at $TOMCAT_HOME/logs named RepositoryExportXXXXX.log
 #45531  by rosario
 
Dear silverspr and jllort , thank you for your quick answer.

@jllort, while examining the script mentioned by silverspr to execute the export task as a cron job (see the link in his post), I noticed the following lines:
// Configuration parameters
String token = DbSessionManager.getInstance().getSystemToken();
String LOG_FILE_NAME = "CrontabRepositoryExporter";
so I thought that such a script, once uploaded as a cron job for OpenKM, could produce a log report at each execution.

Now, to answer your question and describe the scenario I have in mind: I want to understand whether it is possible to automatically create a backup of all documents in a readable format.

If the export function, whether run manually or from a cron job, rewrites the entire tree of documents every time, this may not be a viable option, at least when managing thousands or hundreds of thousands of documents (not to mention the fact that right now the repository on my system is on a ZFS file system with automated snapshots running hourly, daily, weekly, monthly and yearly).

Ideally, if the export were executed onto a previously used directory, it would be great if existing files were simply skipped.
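Such skip-existing behaviour could be scripted outside OpenKM over the exported tree. A minimal sketch under that assumption (this is not a built-in option of the OpenKM exporter):

```java
import java.io.IOException;
import java.nio.file.*;

public class CopyIfAbsent {
    // Copy every file from the staging export into the archive,
    // skipping paths that already exist in the archive.
    static void copyNew(Path staging, Path archive) throws IOException {
        try (var files = Files.walk(staging)) {
            for (Path src : (Iterable<Path>) files::iterator) {
                if (!Files.isRegularFile(src)) continue;
                Path dst = archive.resolve(staging.relativize(src));
                if (Files.exists(dst)) continue;          // skip existing
                Files.createDirectories(dst.getParent());
                Files.copy(src, dst);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Demonstration with temporary directories:
        Path staging = Files.createTempDirectory("staging");
        Path archive = Files.createTempDirectory("archive");
        Files.writeString(staging.resolve("doc.txt"), "v2");
        Files.writeString(archive.resolve("doc.txt"), "v1"); // pre-existing
        Files.writeString(staging.resolve("new.txt"), "n");
        copyNew(staging, archive);
        System.out.println(Files.readString(archive.resolve("doc.txt")));
        System.out.println(Files.exists(archive.resolve("new.txt")));
    }
}
```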

As for the log management: is the log mentioned in the above script the same log that you are pointing me to in $TOMCAT_HOME/logs, named RepositoryExportXXXXX.log?

I would guess no, since the log mentioned in the script should be the log of the cron job, while the log that you are referring to is the log of the export function.

In both cases, is there a rotation policy implemented for all the logs produced?

So, to summarize: the scenario is to automate a backup of all the documents stored in OpenKM, in a readable format (including metadata and history would be great).

I hope the export function can fit the purpose, but it remains to be seen whether an "incremental" export is possible.

Thank you in advance for your support.
Rosario
 #45563  by jllort
 
You can make something more interesting, but you should build your own script:
1- export all data as you do now
2- using the activity log you can easily identify documents and folders (created, updated and deleted); if you follow these actions in order, you will be able to update only the content changed in a single day.

You should be interested in the OKM_ACTIVITY table, and perhaps the LegacyDAO class for queries. From the select (based on a daily range) you will get the UUID of the affected node, the action, etc.
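The incremental pass described above can be sketched in plain Java. The Activity record and the action names below are assumptions standing in for real OKM_ACTIVITY rows (the actual column values and action names may differ); the point is to replay the day's actions in order so that only the affected nodes are re-exported:

```java
import java.util.*;

public class IncrementalSync {
    // Hypothetical shape of a row read from OKM_ACTIVITY
    // (field and action names are assumptions for this sketch).
    record Activity(String action, String uuid) {}

    // Replay the day's actions in order: create/update actions mark a
    // node for re-export, delete actions remove it from the pending set.
    static Set<String> nodesToExport(List<Activity> daily) {
        Set<String> pending = new LinkedHashSet<>();
        for (Activity a : daily) {
            switch (a.action()) {
                case "CREATE_DOCUMENT", "CHECKIN_DOCUMENT", "CREATE_FOLDER" ->
                    pending.add(a.uuid());
                case "DELETE_DOCUMENT", "DELETE_FOLDER", "PURGE_DOCUMENT" ->
                    pending.remove(a.uuid());
                default -> { /* ignore reads, logins, etc. */ }
            }
        }
        return pending;
    }

    public static void main(String[] args) {
        List<Activity> log = List.of(
            new Activity("CREATE_DOCUMENT", "aaa"),
            new Activity("CHECKIN_DOCUMENT", "bbb"),
            new Activity("DELETE_DOCUMENT", "aaa"));
        // "aaa" was created then deleted the same day, so only "bbb" remains.
        System.out.println(nodesToExport(log));
    }
}
```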
