Changeset bf8ed0adbc66 (branch: beta, not reviewed)
Author: Marcin Kuzminski <marcin@python-works.com>
Date:   2012-02-28 17:25:30

fixes #371 fixed issues with beaker/sqlalchemy and non-ascii cache keys

3 files changed with 24 insertions and 8 deletions
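Background for the fix: the cache namespaces and keys built by RhodeCode interpolate user-supplied values (usernames, group names, query bind parameters), and once those values contain non-ascii characters they can no longer be safely coerced to the byte strings that Beaker's backends (and memcached's key rules in particular) expect. This changeset handles that in two places: query bind parameters are passed through safe_str() in caching_query.py, and the SQL-cache namespaces in db.py are built from an md5 digest of the value rather than the raw value. A minimal sketch of the idea follows; the safe_str body here is an assumption standing in for rhodecode.lib.safe_str:

    import hashlib

    def safe_str(value, encoding='utf-8'):
        # stand-in for rhodecode.lib.safe_str: always return a byte string
        if isinstance(value, bytes):
            return value
        return value.encode(encoding)

    # mirrors the _hash_key helper this changeset adds to rhodecode/model/db.py
    _hash_key = lambda k: hashlib.md5(safe_str(k)).hexdigest()

    # a namespace built from a non-ascii username is now short, ascii-only hex,
    # which file- and memcached-based Beaker backends accept without complaint
    namespace = "get_user_%s" % _hash_key(u'\u0142ukasz')   # u'łukasz'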
docs/changelog.rst
 
.. _changelog:

Changelog
=========


1.3.2 (**2012-XX-XX**)
----------------------

:status: in-progress
:branch: beta

news
++++


fixes
+++++

- fixed git protocol issues with repos-groups
- fixed git remote repos validator that prevented cloning remote git repos
- fixes #370 ending-slash fixes for repos and groups
- fixes #368 improved git-protocol detection to handle other clients
- fixes #366 when setting repository group to blank, repo group won't be
  moved to root
- fixes #371 fixed issues with beaker/sqlalchemy and non-ascii cache keys


1.3.1 (**2012-02-27**)
----------------------

news
++++


fixes
+++++

- fixed a redirection loop that occurred when remember-me wasn't checked
  during login
- fixed issues with git blob history generation
- don't fetch branches for git in the file history dropdown; it caused
  unneeded slowness

1.3.0 (**2012-02-26**)
----------------------

news
++++

- code review, inspired by github code-comments
- #215 rst and markdown README files support
- #252 Container-based and proxy pass-through authentication support
- #44 branch browser. Filtering of changelog by branches
- mercurial bookmarks support
- new hover top menu, optimized to add maximum size for important views
- configurable clone url template with the possibility to specify a protocol
  like ssh:// or http:// and to manually alter other parts of the clone_url
- enabled largefiles extension by default
- optimized summary file pages and saved a lot of unused space in them
- #239 option to manually mark repository as fork
- #320 mapping of commit authors to RhodeCode users
- #304 hashes are displayed using monospace font
- diff configuration, toggle white lines and context lines
- #307 configurable diffs, whitespace toggle, increasing context lines
- sorting on branches, tags and bookmarks using YUI datatable
- improved file filter on files page
- implements #330 api method for listing nodes at a particular revision
- #73 added linking of issues in commit messages to a chosen issue tracker
  url, based on a user-defined regular expression
- added linking of changesets in commit messages
- new compact changelog with expandable commit messages
- firstname and lastname are optional in user creation
- #348 added post-create repository hook
- #212 the global encoding setting is now configurable from .ini files
- #227 added repository groups permissions
- markdown gets codehilite extensions
rhodecode/lib/caching_query.py
 
"""caching_query.py
 

	
 
Represent persistence structures which allow the usage of
 
Beaker caching with SQLAlchemy.
 

	
 
The three new concepts introduced here are:
 

	
 
 * CachingQuery - a Query subclass that caches and
 
   retrieves results in/from Beaker.
 
 * FromCache - a query option that establishes caching
 
   parameters on a Query
 
 * RelationshipCache - a variant of FromCache which is specific
 
   to a query invoked during a lazy load.
 
 * _params_from_query - extracts value parameters from
 
   a Query.
 

	
 
The rest of what's here are standard SQLAlchemy and
 
Beaker constructs.
 

	
 
"""
 
import beaker
 
from beaker.exceptions import BeakerException
 

	
 
from sqlalchemy.orm.interfaces import MapperOption
 
from sqlalchemy.orm.query import Query
 
from sqlalchemy.sql import visitors
 
from rhodecode.lib import safe_str
 

	
 

	
 
class CachingQuery(Query):
 
    """A Query subclass which optionally loads full results from a Beaker
 
    cache region.
 

	
 
    The CachingQuery stores additional state that allows it to consult
 
    a Beaker cache before accessing the database:
 

	
 
    * A "region", which is a cache region argument passed to a
 
      Beaker CacheManager, specifies a particular cache configuration
 
      (including backend implementation, expiration times, etc.)
 
    * A "namespace", which is a qualifying name that identifies a
 
      group of keys within the cache.  A query that filters on a name
 
      might use the name "by_name", a query that filters on a date range
 
      to a joined table might use the name "related_date_range".
 

	
 
    When the above state is present, a Beaker cache is retrieved.
 

	
 
    The "namespace" name is first concatenated with
 
    a string composed of the individual entities and columns the Query
 
    requests, i.e. such as ``Query(User.id, User.name)``.
 

	
 
    The Beaker cache is then loaded from the cache manager based
 
    on the region and composed namespace.  The key within the cache
 
    itself is then constructed against the bind parameters specified
 
    by this query, which are usually literals defined in the
 
    WHERE clause.
 

	
 
    The FromCache and RelationshipCache mapper options below represent
 
    the "public" method of configuring this state upon the CachingQuery.
 

	
 
    """
 

	
 
    def __init__(self, manager, *args, **kw):
 
        self.cache_manager = manager
 
        Query.__init__(self, *args, **kw)
 

	
 
    def __iter__(self):
 
        """override __iter__ to pull results from Beaker
 
           if particular attributes have been configured.
 

	
 
           Note that this approach does *not* detach the loaded objects from
 
           the current session. If the cache backend is an in-process cache
 
           (like "memory") and lives beyond the scope of the current session's
 
           transaction, those objects may be expired. The method here can be
 
           modified to first expunge() each loaded item from the current
 
           session before returning the list of items, so that the items
 
@@ -92,99 +93,100 @@ class CachingQuery(Query):
 

	
 
        Raise KeyError if no value present and no
 
        createfunc specified.
 

	
 
        """
 
        cache, cache_key = _get_cache_parameters(self)
 
        ret = cache.get_value(cache_key, createfunc=createfunc)
 
        if merge:
 
            ret = self.merge_result(ret, load=False)
 
        return ret
 

	
 
    def set_value(self, value):
 
        """Set the value in the cache for this query."""
 

	
 
        cache, cache_key = _get_cache_parameters(self)
 
        cache.put(cache_key, value)
 

	
 

	
 
def query_callable(manager, query_cls=CachingQuery):
 
    def query(*arg, **kw):
 
        return query_cls(manager, *arg, **kw)
 
    return query
 

	
 

	
 
def get_cache_region(name, region):
 
    if region not in beaker.cache.cache_regions:
 
        raise BeakerException('Cache region `%s` not configured '
 
            'Check if proper cache settings are in the .ini files' % region)
 
    kw = beaker.cache.cache_regions[region]
 
    return beaker.cache.Cache._get_cache(name, kw)
 

	
 

	
 
def _get_cache_parameters(query):
 
    """For a query with cache_region and cache_namespace configured,
 
    return the correspoinding Cache instance and cache key, based
 
    on this query's current criterion and parameter values.
 

	
 
    """
 
    if not hasattr(query, '_cache_parameters'):
 
        raise ValueError("This Query does not have caching "
 
                         "parameters configured.")
 

	
 
    region, namespace, cache_key = query._cache_parameters
 

	
 
    namespace = _namespace_from_query(namespace, query)
 

	
 
    if cache_key is None:
 
        # cache key - the value arguments from this query's parameters.
 
        args = [str(x) for x in _params_from_query(query)]
 
        args.extend(filter(lambda k:k not in ['None', None, u'None'],
 
        args = [safe_str(x) for x in _params_from_query(query)]
 
        args.extend(filter(lambda k: k not in ['None', None, u'None'],
 
                           [str(query._limit), str(query._offset)]))
 

	
 
        cache_key = " ".join(args)
 

	
 
    if cache_key is None:
 
        raise Exception('Cache key cannot be None')
 

	
 
    # get cache
 
    #cache = query.cache_manager.get_cache_region(namespace, region)
 
    cache = get_cache_region(namespace, region)
 
    # optional - hash the cache_key too for consistent length
 
    # import uuid
 
    # cache_key= str(uuid.uuid5(uuid.NAMESPACE_DNS, cache_key))
 

	
 
    return cache, cache_key
 

	
 

	
 
def _namespace_from_query(namespace, query):
 
    # cache namespace - the token handed in by the
 
    # option + class we're querying against
 
    namespace = " ".join([namespace] + [str(x) for x in query._entities])
 

	
 
    # memcached wants this
 
    namespace = namespace.replace(' ', '_')
 

	
 
    return namespace
 

	
 

	
 
def _set_cache_parameters(query, region, namespace, cache_key):
 

	
 
    if hasattr(query, '_cache_parameters'):
 
        region, namespace, cache_key = query._cache_parameters
 
        raise ValueError("This query is already configured "
 
                        "for region %r namespace %r" %
 
                        (region, namespace)
 
                    )
 
    query._cache_parameters = region, namespace, cache_key
 

	
 

	
 
class FromCache(MapperOption):
 
    """Specifies that a Query should load results from a cache."""
 

	
 
    propagate_to_loaders = False
 

	
 
    def __init__(self, region, namespace, cache_key=None):
 
        """Construct a new FromCache.
 

	
 
        :param region: the cache region.  Should be a
 
        region configured in the Beaker CacheManager.
 

	
rhodecode/model/db.py
 
# -*- coding: utf-8 -*-
 
"""
 
    rhodecode.model.db
 
    ~~~~~~~~~~~~~~~~~~
 

	
 
    Database Models for RhodeCode
 

	
 
    :created_on: Apr 08, 2010
 
    :author: marcink
 
    :copyright: (C) 2010-2012 Marcin Kuzminski <marcin@python-works.com>
 
    :license: GPLv3, see COPYING for more details.
 
"""
 
# This program is free software: you can redistribute it and/or modify
 
# it under the terms of the GNU General Public License as published by
 
# the Free Software Foundation, either version 3 of the License, or
 
# (at your option) any later version.
 
#
 
# This program is distributed in the hope that it will be useful,
 
# but WITHOUT ANY WARRANTY; without even the implied warranty of
 
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 
# GNU General Public License for more details.
 
#
 
# You should have received a copy of the GNU General Public License
 
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
 

	
 
import os
 
import logging
 
import datetime
 
import traceback
 
from collections import defaultdict
 

	
 
from sqlalchemy import *
 
from sqlalchemy.ext.hybrid import hybrid_property
 
from sqlalchemy.orm import relationship, joinedload, class_mapper, validates
 
from beaker.cache import cache_region, region_invalidate
 

	
 
from rhodecode.lib.vcs import get_backend
 
from rhodecode.lib.vcs.utils.helpers import get_scm
 
from rhodecode.lib.vcs.exceptions import VCSError
 
from rhodecode.lib.vcs.utils.lazy import LazyProperty
 

	
 
from rhodecode.lib import str2bool, safe_str, get_changeset_safe, safe_unicode
 
from rhodecode.lib.compat import json
 
from rhodecode.lib.caching_query import FromCache
 

	
 
from rhodecode.model.meta import Base, Session
 
import hashlib
 

	
 

	
 
log = logging.getLogger(__name__)
 

	
 
#==============================================================================
 
# BASE CLASSES
 
#==============================================================================
 

	
 
# md5-hash a key through safe_str() so cache namespaces stay ascii-only (see #371)
_hash_key = lambda k: hashlib.md5(safe_str(k)).hexdigest()
 

	
 

	
 
class ModelSerializer(json.JSONEncoder):
 
    """
 
    Simple Serializer for JSON,
 

	
 
    usage::
 

	
 
        to make object customized for serialization implement a __json__
 
        method that will return a dict for serialization into json
 

	
 
    example::
 

	
 
        class Task(object):
 

	
 
            def __init__(self, name, value):
 
                self.name = name
 
                self.value = value
 

	
 
            def __json__(self):
 
                return dict(name=self.name,
 
                            value=self.value)
 

	
 
    """
 

	
 
    def default(self, obj):
 

	
 
        if hasattr(obj, '__json__'):
 
            return obj.__json__()
 
        else:
 
            return json.JSONEncoder.default(self, obj)
 

	
 

	
 
class BaseModel(object):
 
    """
 
    Base Model for all classes
 
    """
 

	
 
    @classmethod
 
    def _get_keys(cls):
 
        """return column names for this model """
 
        return class_mapper(cls).c.keys()
 

	
 
    def get_dict(self):
 
        """
 
        return dict with keys and values corresponding
 
        to this model data """
 

	
 
        d = {}
 
@@ -292,179 +295,185 @@ class User(Base, BaseModel):
 
    user_followers = relationship('UserFollowing', primaryjoin='UserFollowing.follows_user_id==User.user_id', cascade='all')
 
    repo_to_perm = relationship('UserRepoToPerm', primaryjoin='UserRepoToPerm.user_id==User.user_id', cascade='all')
 

	
 
    group_member = relationship('UsersGroupMember', cascade='all')
 

	
 
    notifications = relationship('UserNotification',)
 

	
 
    @hybrid_property
 
    def email(self):
 
        return self._email
 

	
 
    @email.setter
 
    def email(self, val):
 
        self._email = val.lower() if val else None
 

	
 
    @property
 
    def full_name(self):
 
        return '%s %s' % (self.name, self.lastname)
 

	
 
    @property
 
    def full_name_or_username(self):
 
        return ('%s %s' % (self.name, self.lastname)
 
                if (self.name and self.lastname) else self.username)
 

	
 
    @property
 
    def full_contact(self):
 
        return '%s %s <%s>' % (self.name, self.lastname, self.email)
 

	
 
    @property
 
    def short_contact(self):
 
        return '%s %s' % (self.name, self.lastname)
 

	
 
    @property
 
    def is_admin(self):
 
        return self.admin
 

	
 
    def __repr__(self):
 
        return "<%s('id:%s:%s')>" % (self.__class__.__name__,
 
                                     self.user_id, self.username)
 

	
 
    @classmethod
 
    def get_by_username(cls, username, case_insensitive=False, cache=False):
 
        if case_insensitive:
 
            q = cls.query().filter(cls.username.ilike(username))
 
        else:
 
            q = cls.query().filter(cls.username == username)
 

	
 
        if cache:
 
-            q = q.options(FromCache("sql_cache_short",
-                                    "get_user_%s" % username))
+            q = q.options(FromCache(
+                            "sql_cache_short",
+                            "get_user_%s" % _hash_key(username)
+                          )
+            )
 
        return q.scalar()
 

	
 
    @classmethod
 
    def get_by_api_key(cls, api_key, cache=False):
 
        q = cls.query().filter(cls.api_key == api_key)
 

	
 
        if cache:
 
            q = q.options(FromCache("sql_cache_short",
 
                                    "get_api_key_%s" % api_key))
 
        return q.scalar()
 

	
 
    @classmethod
 
    def get_by_email(cls, email, case_insensitive=False, cache=False):
 
        if case_insensitive:
 
            q = cls.query().filter(cls.email.ilike(email))
 
        else:
 
            q = cls.query().filter(cls.email == email)
 

	
 
        if cache:
 
            q = q.options(FromCache("sql_cache_short",
 
                                    "get_api_key_%s" % email))
 
        return q.scalar()
 

	
 
    def update_lastlogin(self):
 
        """Update user lastlogin"""
 
        self.last_login = datetime.datetime.now()
 
        Session.add(self)
 
        log.debug('updated user %s lastlogin' % self.username)
 

	
 
    def __json__(self):
 
        return dict(
 
            email=self.email,
 
            full_name=self.full_name,
 
            full_name_or_username=self.full_name_or_username,
 
            short_contact=self.short_contact,
 
            full_contact=self.full_contact
 
        )
 

	
 

	
 
class UserLog(Base, BaseModel):
 
    __tablename__ = 'user_logs'
 
    __table_args__ = {'extend_existing': True}
 
    user_log_id = Column("user_log_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
 
    user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
 
    repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=True)
 
    repository_name = Column("repository_name", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
 
    user_ip = Column("user_ip", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
 
    action = Column("action", UnicodeText(length=1200000, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
 
    action_date = Column("action_date", DateTime(timezone=False), nullable=True, unique=None, default=None)
 

	
 
    @property
 
    def action_as_day(self):
 
        return datetime.date(*self.action_date.timetuple()[:3])
 

	
 
    user = relationship('User')
 
    repository = relationship('Repository',cascade='')
 

	
 

	
 
class UsersGroup(Base, BaseModel):
 
    __tablename__ = 'users_groups'
 
    __table_args__ = {'extend_existing': True}
 

	
 
    users_group_id = Column("users_group_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
 
    users_group_name = Column("users_group_name", String(length=255, convert_unicode=False, assert_unicode=None), nullable=False, unique=True, default=None)
 
    users_group_active = Column("users_group_active", Boolean(), nullable=True, unique=None, default=None)
 

	
 
    members = relationship('UsersGroupMember', cascade="all, delete, delete-orphan", lazy="joined")
 

	
 
    def __repr__(self):
 
        return '<userGroup(%s)>' % (self.users_group_name)
 

	
 
    @classmethod
 
    def get_by_group_name(cls, group_name, cache=False,
 
                          case_insensitive=False):
 
        if case_insensitive:
 
            q = cls.query().filter(cls.users_group_name.ilike(group_name))
 
        else:
 
            q = cls.query().filter(cls.users_group_name == group_name)
 
        if cache:
 
-            q = q.options(FromCache("sql_cache_short",
-                                    "get_user_%s" % group_name))
+            q = q.options(FromCache(
+                            "sql_cache_short",
+                            "get_user_%s" % _hash_key(group_name)
+                          )
+            )
 
        return q.scalar()
 

	
 
    @classmethod
 
    def get(cls, users_group_id, cache=False):
 
        users_group = cls.query()
 
        if cache:
 
            users_group = users_group.options(FromCache("sql_cache_short",
 
                                    "get_users_group_%s" % users_group_id))
 
        return users_group.get(users_group_id)
 

	
 

	
 
class UsersGroupMember(Base, BaseModel):
 
    __tablename__ = 'users_groups_members'
 
    __table_args__ = {'extend_existing': True}
 

	
 
    users_group_member_id = Column("users_group_member_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
 
    users_group_id = Column("users_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
 
    user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
 

	
 
    user = relationship('User', lazy='joined')
 
    users_group = relationship('UsersGroup')
 

	
 
    def __init__(self, gr_id='', u_id=''):
 
        self.users_group_id = gr_id
 
        self.user_id = u_id
 

	
 

	
 
class Repository(Base, BaseModel):
 
    __tablename__ = 'repositories'
 
    __table_args__ = (
 
        UniqueConstraint('repo_name'),
 
        {'extend_existing': True},
 
    )
 

	
 
    repo_id = Column("repo_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
 
    repo_name = Column("repo_name", String(length=255, convert_unicode=False, assert_unicode=None), nullable=False, unique=True, default=None)
 
    clone_uri = Column("clone_uri", String(length=255, convert_unicode=False, assert_unicode=None), nullable=True, unique=False, default=None)
 
    repo_type = Column("repo_type", String(length=255, convert_unicode=False, assert_unicode=None), nullable=False, unique=False, default='hg')
 
    user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=False, default=None)
 
    private = Column("private", Boolean(), nullable=True, unique=None, default=None)
 
    enable_statistics = Column("statistics", Boolean(), nullable=True, unique=None, default=True)
 
    enable_downloads = Column("downloads", Boolean(), nullable=True, unique=None, default=True)
 
    description = Column("description", String(length=10000, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
 
    created_on = Column('created_on', DateTime(timezone=False), nullable=True, unique=None, default=datetime.datetime.now)
 

	
 
    fork_id = Column("fork_id", Integer(), ForeignKey('repositories.repo_id'), nullable=True, unique=False, default=None)
 
    group_id = Column("group_id", Integer(), ForeignKey('groups.group_id'), nullable=True, unique=False, default=None)
 

	
 
@@ -703,98 +712,101 @@ class RepoGroup(Base, BaseModel):
 
        {'extend_existing': True},
 
    )
 
    __mapper_args__ = {'order_by': 'group_name'}
 

	
 
    group_id = Column("group_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
 
    group_name = Column("group_name", String(length=255, convert_unicode=False, assert_unicode=None), nullable=False, unique=True, default=None)
 
    group_parent_id = Column("group_parent_id", Integer(), ForeignKey('groups.group_id'), nullable=True, unique=None, default=None)
 
    group_description = Column("group_description", String(length=10000, convert_unicode=False, assert_unicode=None), nullable=True, unique=None, default=None)
 

	
 
    repo_group_to_perm = relationship('UserRepoGroupToPerm', cascade='all', order_by='UserRepoGroupToPerm.group_to_perm_id')
 
    users_group_to_perm = relationship('UsersGroupRepoGroupToPerm', cascade='all')
 

	
 
    parent_group = relationship('RepoGroup', remote_side=group_id)
 

	
 
    def __init__(self, group_name='', parent_group=None):
 
        self.group_name = group_name
 
        self.parent_group = parent_group
 

	
 
    def __repr__(self):
 
        return "<%s('%s:%s')>" % (self.__class__.__name__, self.group_id,
 
                                  self.group_name)
 

	
 
    @classmethod
 
    def groups_choices(cls):
 
        from webhelpers.html import literal as _literal
 
        repo_groups = [('', '')]
 
        sep = ' &raquo; '
 
        _name = lambda k: _literal(sep.join(k))
 

	
 
        repo_groups.extend([(x.group_id, _name(x.full_path_splitted))
 
                              for x in cls.query().all()])
 

	
 
        repo_groups = sorted(repo_groups, key=lambda t: t[1].split(sep)[0])
 
        return repo_groups
 

	
 
    @classmethod
 
    def url_sep(cls):
 
        return '/'
 

	
 
    @classmethod
 
    def get_by_group_name(cls, group_name, cache=False, case_insensitive=False):
 
        if case_insensitive:
 
            gr = cls.query()\
 
                .filter(cls.group_name.ilike(group_name))
 
        else:
 
            gr = cls.query()\
 
                .filter(cls.group_name == group_name)
 
        if cache:
 
-            gr = gr.options(FromCache("sql_cache_short",
-                                          "get_group_%s" % group_name))
+            gr = gr.options(FromCache(
+                            "sql_cache_short",
+                            "get_group_%s" % _hash_key(group_name)
+                            )
+            )
 
        return gr.scalar()
 

	
 
    @property
 
    def parents(self):
 
        parents_recursion_limit = 5
 
        groups = []
 
        if self.parent_group is None:
 
            return groups
 
        cur_gr = self.parent_group
 
        groups.insert(0, cur_gr)
 
        cnt = 0
 
        while 1:
 
            cnt += 1
 
            gr = getattr(cur_gr, 'parent_group', None)
 
            cur_gr = cur_gr.parent_group
 
            if gr is None:
 
                break
 
            if cnt == parents_recursion_limit:
 
            # this will prevent accidental infinite loops
 
                log.error('group nested more than %s' %
 
                          parents_recursion_limit)
 
                break
 

	
 
            groups.insert(0, gr)
 
        return groups
 

	
 
    @property
 
    def children(self):
 
        return RepoGroup.query().filter(RepoGroup.parent_group == self)
 

	
 
    @property
 
    def name(self):
 
        return self.group_name.split(RepoGroup.url_sep())[-1]
 

	
 
    @property
 
    def full_path(self):
 
        return self.group_name
 

	
 
    @property
 
    def full_path_splitted(self):
 
        return self.group_name.split(RepoGroup.url_sep())
 

	
 
    @property
 
    def repositories(self):
 
        return Repository.query().filter(Repository.group == self)
 

	
 
    @property
 
    def repositories_recursive_count(self):
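A side note on why the str() to safe_str()/_hash_key changes above matter: under Python 2, which RhodeCode targeted at the time, converting a unicode value that contains non-ascii characters with str() implicitly encodes it as ascii and raises UnicodeEncodeError, so a user or group name could blow up cache-key construction. A minimal illustration, with a made-up group name:

    param = u'grupa-\u017c\u00f3\u0142w'   # u'grupa-żółw', e.g. a repository group name

    try:
        key = str(param)            # Python 2: implicit ascii encode -> UnicodeEncodeError
    except UnicodeEncodeError:
        key = param.encode('utf-8')  # what safe_str() effectively does

    # db.py goes one step further and stores an md5 hex digest of the encoded
    # value, so the cache namespace stays short and ascii-only in all cases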