.. _statistics:

=====================
Repository statistics
=====================

Kallithea has a *repository statistics* feature, which is disabled by default.
When enabled, the number of commits per committer is visualized in a timeline.
The feature can be enabled using the ``Enable statistics`` checkbox on the
repository ``Settings`` page.

The statistics system makes heavy demands on server resources, so to keep a
balance between usability and performance, statistics are cached inside the
database and gathered incrementally.

When Celery is disabled:

  On each visit to the summary page, a set of 250 commits is parsed and added
  to the statistics cache. The same incremental gathering happens on each visit
  to the statistics page, until all commits have been fetched.

  Statistics stay cached until additional commits are added to the repository.
  In that case, Kallithea only fetches the new commits when updating its
  statistics cache.
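
  As a rough, purely illustrative sketch (the helper names below are
  hypothetical, not Kallithea's internal API), the incremental gathering on a
  page visit amounts to parsing the next batch of commits and persisting the
  aggregated result back to the database::

    BATCH_SIZE = 250  # commits parsed per visit

    def update_statistics_on_visit(repo):
        cache = load_statistics_cache(repo)            # hypothetical: read cached stats from the database
        start = cache.last_parsed_index
        commits = repo.get_commits(start, BATCH_SIZE)  # hypothetical: next unparsed batch
        for commit in commits:
            cache.count(commit.author, commit.date)    # aggregate commits per committer
        cache.last_parsed_index = start + len(commits)
        save_statistics_cache(repo, cache)             # hypothetical: write cache back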

When Celery is enabled:

  On the first visit to the summary page, Kallithea creates tasks that execute
  on Celery workers. These tasks keep gathering statistics until all commits
  are parsed: each task parses 250 commits and then launches a new task.
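
  A minimal sketch of that task-chaining pattern with Celery might look like
  the following; ``load_commits`` and ``update_statistics_cache`` are
  hypothetical placeholders, not actual Kallithea functions::

    from celery import shared_task

    BATCH_SIZE = 250  # commits parsed per task

    @shared_task
    def gather_statistics(repo_name, start=0):
        commits = load_commits(repo_name, start, BATCH_SIZE)  # hypothetical helper
        update_statistics_cache(repo_name, commits)           # hypothetical helper
        if len(commits) == BATCH_SIZE:
            # More history remains: queue a follow-up task for the next batch.
            gather_statistics.delay(repo_name, start + BATCH_SIZE)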