About
The database replication process:
- creates the analytical data if needed
- replicates the data into the database.
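The result of the replication can be inspected directly in the database. As a minimal sketch, assuming the pages table and the path, analytics and date_replication columns used in the monitoring queries below:
-- one row per replicated page: its path, its analytical data (JSON) and its replication date
select
path,
analytics,
date_replication
from
pages
limit 5;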
Sync
Because this is an intensive process, it happens incrementally:
- for every page visited
- or for every page that gains or loses backlinks.
The output is cached.
If you want to sync the database completely, you can:
- run the CLI command (see below)
- or run the Search Index Manager.
Search Index Manager
The Search Index Manager is a plugin that replicates the page data. See how to install and run it on the Search Index Manager page.
Cli
On the server, you can replicate the data to the database with the following CLI command:
cd $DOKUWIKI_HOME
# optional (for a farm): select the animal
export animal=animal-directory-name
# command
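# the trailing / below is the start path (assumed here to be the root namespace, ie the whole wiki)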
php ./bin/plugin.php combo metadata-to-database --host serverHostName --port 80 /
Monitoring
The replication date
The replication date is stored in the date_replication metadata and can be seen via:
- the metadata manager (integration tab)
- the Metadata Viewer
- or the following query.
select
path,
date_replication
from
pages
order by date_replication desc
limit 10;
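To spot pages whose database copy is getting old, you can filter on this column. A sketch, assuming the default SQLite backend and an ISO 8601 value in date_replication:
-- pages that have not been replicated in the last 30 days
select
path,
date_replication
from
pages
where
date_replication < date('now', '-30 day')
order by date_replication asc;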
The analytics date
You can see the analytics date in the date field of the JSON object.
Example query:
select
path,
json_extract(analytics, '$.date') as date
from
pages
order by date desc
limit 10;
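Both dates can also be read side by side, for instance to check that the analytics were refreshed when a page was replicated. A sketch reusing the same columns:
select
path,
json_extract(analytics, '$.date') as analytics_date,
date_replication
from
pages
order by date_replication desc
limit 10;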
Replication Request
When backlinks are added or deleted, a replication request is created for the page in order to update them.
These requests are stored in the pages_to_replicate table. If you query this table, you get all the pages that should be replicated in the near future. As this is a temporary table, it should be empty 99% of the time.
select * from pages_to_replicate;
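To monitor the backlog without listing every row, a simple count is enough; zero is the expected value most of the time.
-- number of pages waiting for replication (normally 0)
select count(*) from pages_to_replicate;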