Changeset - e1ab82613133
[Not reviewed]
default
Alessandro Molina - 9 years ago 2017-01-29 21:08:49
alessandro.molina@axant.it
backend: replace Pylons with TurboGears2

Replace the no-longer-supported Pylons application framework with TurboGears2,
which is largely compatible with and similar to Pylons.
Some interesting history is described at:
https://en.wikipedia.org/wiki/TurboGears

Changes by Dominik Ruf:
- fix sql config in test.ini

Changes by Thomas De Schampheleire:
- set-up of test suite
- tests: 'fix' repo archival test failure
Between Pylons and TurboGears2, there seems to be a small difference in the
headers sent for repository archive files, related to character encoding.
It is assumed that this difference is not important, and that the test
should just align with reality.
- remove need to import helpers/app_globals in lib
TurboGears2 by default expects helpers and app_globals to be available
in lib. For this reason kallithea/lib/__init__.py was originally changed
to import those modules. However, this triggered several kinds of
circular import problems: if module A imported something from lib (e.g.
lib.annotate), and lib.helpers imported (possibly indirectly) module A,
there was a circular import. Fix this by overriding the relevant method
of the tg AppConfig, an approach also hinted at in the TurboGears2 code.
With this change, importing something from lib no longer automatically
imports helpers, greatly reducing the chance of circular import problems.
- make sure HTTP error '400' uses the custom error pages
TurboGears2 does not by default handle HTTP status code
'400 (Bad Request)' via the custom error page handling, causing a
standard, unstyled error page to be shown (see the condensed
configuration sketch after this list).
- disable transaction manager
Kallithea currently handles its own transactions and does not need the
TurboGears2 transaction manager. However, TurboGears2 tries to enable it
by default and fails, throwing an error during application initialization.
The error itself seemed to be harmless for normal application functioning,
but was nevertheless confusing.
- add backlash as a required dependency: backlash is meant as the WebError
replacement in TurboGears2 (WebError was originally part of Pylons). When
debug==true, it provides an interactive debugger in the browser. When
debug==false, backlash is necessary to show backtraces on the console.
- misc fixes
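
For reference, below is a minimal sketch condensed from the
kallithea/config/app_cfg.py changes in this changeset, showing only the two
configurator settings discussed above (custom error pages including 400, and
the disabled transaction manager); it is not the complete class:

    from tg.configuration import AppConfig

    class KallitheaAppConfig(AppConfig):
        def __init__(self):
            super(KallitheaAppConfig, self).__init__()
            # Use the custom error pages for these status codes; by default
            # TurboGears2 does not include 400 (Bad Request) in the list.
            self['errorpage.status_codes'] = [400, 401, 403, 404]
            # Disable the TurboGears2 transaction manager -- Kallithea
            # currently handles its own transactions.
            self['tm.enabled'] = False
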
20 files changed:
dev_requirements.txt
 
babel >= 0.9.6, < 2.4
 
waitress >= 0.8.8, < 1.0
 
pytest ~= 3.0
 
pytest-runner
 
pytest-sugar>=0.7.0
 
pytest-catchlog
 
mock
 
sphinx
 
webtest < 3
development.ini
 
################################################################################
 
################################################################################
 
# Kallithea - Development config:                                              #
 
# listening on *:5000                                                          #
 
# sqlite and kallithea.db                                                      #
 
# initial_repo_scan = true                                                     #
 
# set debug = true                                                             #
 
# verbose and colorful logging                                                 #
 
#                                                                              #
 
# The %(here)s variable will be replaced with the parent directory of this file#
 
################################################################################
 
################################################################################
 

	
 
[DEFAULT]
 
debug = true
 
pdebug = false
 

	
 
################################################################################
 
## Email settings                                                             ##
 
##                                                                            ##
 
## Refer to the documentation ("Email settings") for more details.            ##
 
##                                                                            ##
 
## It is recommended to use a valid sender address that passes access         ##
 
## validation and spam filtering in mail servers.                             ##
 
################################################################################
 

	
 
## 'From' header for application emails. You can optionally add a name.
 
## Default:
 
#app_email_from = Kallithea
 
## Examples:
 
#app_email_from = Kallithea <kallithea-noreply@example.com>
 
#app_email_from = kallithea-noreply@example.com
 

	
 
## Subject prefix for application emails.
 
## A space between this prefix and the real subject is automatically added.
 
## Default:
 
#email_prefix =
 
## Example:
 
#email_prefix = [Kallithea]
 

	
 
## Recipients for error emails and fallback recipients of application mails.
 
## Multiple addresses can be specified, space-separated.
 
## Only addresses are allowed, do not add any name part.
 
## Default:
 
#email_to =
 
## Examples:
 
#email_to = admin@example.com
 
#email_to = admin@example.com another_admin@example.com
 

	
 
## 'From' header for error emails. You can optionally add a name.
 
## Default:
 
#error_email_from = pylons@yourapp.com
 
## Examples:
 
#error_email_from = Kallithea Errors <kallithea-noreply@example.com>
 
#error_email_from = paste_error@example.com
 

	
 
## SMTP server settings
 
## If specifying credentials, make sure to use secure connections.
 
## Default: Send unencrypted unauthenticated mails to the specified smtp_server.
 
## For "SSL", use smtp_use_ssl = true and smtp_port = 465.
 
## For "STARTTLS", use smtp_use_tls = true and smtp_port = 587.
 
#smtp_server = smtp.example.com
 
#smtp_username =
 
#smtp_password =
 
#smtp_port = 25
 
#smtp_use_ssl = false
 
#smtp_use_tls = false
 

	
 
[server:main]
 
## Gearbox default web server ##
 
#use = egg:gearbox#wsgiref
 
## nr of worker threads to spawn
 
#threadpool_workers = 1
 
## max requests before thread respawn
 
#threadpool_max_requests = 100
 
## option to use threads instead of processes
 
#use_threadpool = true
 

	
 
## Gearbox gevent web server ##
 
#use = egg:gearbox#gevent
 

	
 
## WAITRESS ##
 
use = egg:waitress#main
 
## number of worker threads
 
threads = 1
 
## MAX BODY SIZE 100GB
 
max_request_body_size = 107374182400
 
## use poll instead of select, fixes fd limits, may not work on old
 
## windows systems.
 
#asyncore_use_poll = True
 

	
 
## GUNICORN ##
 
#use = egg:gunicorn#main
 
## number of process workers. You must set `instance_id = *` when this option
 
## is set to more than one worker
 
#workers = 1
 
## process name
 
#proc_name = kallithea
 
## type of worker class, one of sync, eventlet, gevent, tornado
 
## for bigger setups, using a worker class other than sync is recommended
 
#worker_class = sync
 
#max_requests = 1000
 
## amount of time a worker can handle a request before it gets killed and
 
## restarted
 
#timeout = 3600
 

	
 
## UWSGI ##
 
## run with uwsgi --ini-paste-logged <inifile.ini>
 
#[uwsgi]
 
#socket = /tmp/uwsgi.sock
 
#master = true
 
#http = 127.0.0.1:5000
 

	
 
## run as daemon and redirect all output to a file
 
#daemonize = ./uwsgi_kallithea.log
 

	
 
## master process PID
 
#pidfile = ./uwsgi_kallithea.pid
 

	
 
## stats server with workers statistics, use uwsgitop
 
## for monitoring, `uwsgitop 127.0.0.1:1717`
 
#stats = 127.0.0.1:1717
 
#memory-report = true
 

	
 
## log 5XX errors
 
#log-5xx = true
 

	
 
## Set the socket listen queue size.
 
#listen = 256
 

	
 
## Gracefully reload workers after the specified number of managed requests
 
## (avoid memory leaks).
 
#max-requests = 1000
 

	
 
## enable large buffers
 
#buffer-size = 65535
 

	
 
## socket and http timeouts ##
 
#http-timeout = 3600
 
#socket-timeout = 3600
 

	
 
## Log requests slower than the specified number of milliseconds.
 
#log-slow = 10
 

	
 
## Exit if no app can be loaded.
 
#need-app = true
 

	
 
## Set lazy mode (load apps in workers instead of master).
 
#lazy = true
 

	
 
## scaling ##
 
## set the cheaper algorithm to use; if not set, the default will be used
 
#cheaper-algo = spare
 

	
 
## minimum number of workers to keep at all times
 
#cheaper = 1
 

	
 
## number of workers to spawn at startup
 
#cheaper-initial = 1
 

	
 
## maximum number of workers that can be spawned
 
#workers = 4
 

	
 
## how many workers should be spawned at a time
 
#cheaper-step = 1
 

	
 
## COMMON ##
 
#host = 127.0.0.1
 
host = 0.0.0.0
 
port = 5000
 

	
 
## middleware for hosting the WSGI application under a URL prefix
 
#[filter:proxy-prefix]
 
#use = egg:PasteDeploy#prefix
 
#prefix = /<your-prefix>
 

	
 
[app:main]
 
use = egg:kallithea
 
## enable proxy prefix middleware
 
#filter-with = proxy-prefix
 

	
 
full_stack = true
 
static_files = true
 
## Available Languages:
 
## cs de fr hu ja nl_BE pl pt_BR ru sk zh_CN zh_TW
 
lang =
 
cache_dir = %(here)s/data
 
index_dir = %(here)s/data/index
 

	
 
## perform a full repository scan on each server start, this should be
 
## set to false after first startup, to allow faster server restarts.
 
#initial_repo_scan = false
 
initial_repo_scan = true
 

	
 
## uncomment and set this path to use archive download cache
 
archive_cache_dir = %(here)s/tarballcache
 

	
 
## change this to unique ID for security
 
app_instance_uuid = development-not-secret
 

	
 
## cut off limit for large diffs (size in bytes)
 
cut_off_limit = 256000
 

	
 
## force https in Kallithea, fixes https redirects, assumes it's always https
 
force_https = false
 

	
 
## use Strict-Transport-Security headers
 
use_htsts = false
 

	
 
## number of commits stats will parse on each iteration
 
commit_parse_limit = 25
 

	
 
## path to git executable
 
git_path = git
 

	
 
## git rev filter option, --all is the default filter, if you need to
 
## hide all refs in changelog switch this to --branches --tags
 
#git_rev_filter = --branches --tags
 

	
 
## RSS feed options
 
rss_cut_off_limit = 256000
 
rss_items_per_page = 10
 
rss_include_diff = false
 

	
 
## options for showing and identifying changesets
 
show_sha_length = 12
 
show_revision_number = false
 

	
 
## Canonical URL to use when creating full URLs in UI and texts.
 
## Useful when the site is available under different names or protocols.
 
## Defaults to what is provided in the WSGI environment.
 
#canonical_url = https://kallithea.example.com/repos
 

	
 
## gist URL alias, used to create nicer URLs for gists. This should be a
 
## URL that does rewrites to _admin/gists/<gistid>.
 
## example: http://gist.example.com/{gistid}. Empty means use the internal
 
## Kallithea url, ie. http[s]://kallithea.example.com/_admin/gists/<gistid>
 
gist_alias_url =
 

	
 
## whitelist of API-enabled controllers. This allows adding a list of
 
## controllers to which access will be enabled by api_key. eg: to enable
 
## api access to raw_files put `FilesController:raw`, to enable access to patches
 
## add `ChangesetController:changeset_patch`. This list should be "," separated
 
## Syntax is <ControllerClass>:<function>. Check debug logs for generated names
 
## Recommended settings below are commented out:
 
api_access_controllers_whitelist =
 
#    ChangesetController:changeset_patch,
 
#    ChangesetController:changeset_raw,
 
#    FilesController:raw,
 
#    FilesController:archivefile
 

	
 
## default encoding used to convert from and to unicode
 
## can also be a comma separated list of encodings in case of mixed encodings
 
default_encoding = utf8
 

	
 
## issue tracker for Kallithea (leave blank to disable, absent for default)
 
#bugtracker = https://bitbucket.org/conservancy/kallithea/issues
 

	
 
## issue tracking mapping for commit messages
 
## comment out issue_pat, issue_server, issue_prefix to enable
 

	
 
## pattern to get the issues from commit messages
 
## the default used here is #<numbers> with a regex non-capturing group for `#`
 
## {id} will be all groups matched from this pattern
 

	
 
issue_pat = (?:\s*#)(\d+)
 

	
 
## server url to the issue, each {id} will be replaced with match
 
## fetched from the regex and {repo} is replaced with full repository name
 
## including groups {repo_name} is replaced with just name of repo
 

	
 
issue_server_link = https://issues.example.com/{repo}/issue/{id}
 

	
 
## prefix to add to the link to indicate it's a URL
 
## #314 will be replaced by <issue_prefix><id>
 

	
 
issue_prefix = #
 

	
 
## issue_pat, issue_server_link, issue_prefix can have suffixes to specify
 
## multiple patterns, e.g. for another issue server, a wiki, or others
 
## below is an example of how to create a wiki pattern
 
# wiki-some-id -> https://wiki.example.com/some-id
 

	
 
#issue_pat_wiki = (?:wiki-)(.+)
 
#issue_server_link_wiki = https://wiki.example.com/{id}
 
#issue_prefix_wiki = WIKI-
 

	
 
## alternative HTTP response code for failed authentication. The default HTTP
 
## response is 401 HTTPUnauthorized. Currently Mercurial clients have trouble with
 
## handling that. Set this variable to 403 to return HTTPForbidden
 
auth_ret_code =
 

	
 
## locking return code. When repository is locked return this HTTP code. 2XX
 
## codes don't break the transactions while 4XX codes do
 
lock_ret_code = 423
 

	
 
## allow changing the repository location in the settings page
 
allow_repo_location_change = True
 

	
 
## allow setting up custom hooks in the settings page
 
allow_custom_hooks_settings = True
 

	
 
## extra extensions for indexing, space separated and without the leading '.'.
 
# index.extensions =
 
#    gemfile
 
#    lock
 

	
 
## extra filenames for indexing, space separated
 
# index.filenames =
 
#    .dockerignore
 
#    .editorconfig
 
#    INSTALL
 
#    CHANGELOG
 

	
 
####################################
 
###        CELERY CONFIG        ####
 
####################################
 

	
 
use_celery = false
 

	
 
## Example: connect to the virtual host 'rabbitmqhost' on localhost as rabbitmq:
 
broker.url = amqp://rabbitmq:qewqew@localhost:5672/rabbitmqhost
 

	
 
celery.imports = kallithea.lib.celerylib.tasks
 
celery.accept.content = pickle
 
celery.result.backend = amqp
 
celery.result.dburi = amqp://
 
celery.result.serialier = json
 

	
 
#celery.send.task.error.emails = true
 
#celery.amqp.task.result.expires = 18000
 

	
 
celeryd.concurrency = 2
 
celeryd.max.tasks.per.child = 1
 

	
 
## If true, tasks will never be sent to the queue, but executed locally instead.
 
celery.always.eager = false
 

	
 
####################################
 
###         BEAKER CACHE        ####
 
####################################
 

	
 
beaker.cache.data_dir = %(here)s/data/cache/data
 
beaker.cache.lock_dir = %(here)s/data/cache/lock
 

	
 
beaker.cache.regions = short_term,long_term,sql_cache_short
 

	
 
beaker.cache.short_term.type = memory
 
beaker.cache.short_term.expire = 60
 
beaker.cache.short_term.key_length = 256
 

	
 
beaker.cache.long_term.type = memory
 
beaker.cache.long_term.expire = 36000
 
beaker.cache.long_term.key_length = 256
 

	
 
beaker.cache.sql_cache_short.type = memory
 
beaker.cache.sql_cache_short.expire = 10
 
beaker.cache.sql_cache_short.key_length = 256
 

	
 
####################################
 
###       BEAKER SESSION        ####
 
####################################
 

	
 
## Name of session cookie. Should be unique for a given host and path, even when running
 
## on different ports. Otherwise, cookie sessions will be shared and messed up.
 
beaker.session.key = kallithea
 
## Sessions should always only be accessible by the browser, not directly by JavaScript.
 
beaker.session.httponly = true
 
## Session lifetime. 2592000 seconds is 30 days.
 
beaker.session.timeout = 2592000
 

	
 
## Server secret used with HMAC to ensure integrity of cookies.
 
beaker.session.secret = development-not-secret
 
## Further, encrypt the data with AES.
 
#beaker.session.encrypt_key = <key_for_encryption>
 
#beaker.session.validate_key = <validation_key>
 

	
 
## Type of storage used for the session, current types are
 
## dbm, file, memcached, database, and memory.
 

	
 
## File system storage of session data. (default)
 
#beaker.session.type = file
 

	
 
## Cookie only, store all session data inside the cookie. Requires secure secrets.
 
#beaker.session.type = cookie
 

	
 
## Database storage of session data.
 
#beaker.session.type = ext:database
 
#beaker.session.sa.url = postgresql://postgres:qwe@localhost/kallithea
 
#beaker.session.table_name = db_session
 

	
 
############################
 
## ERROR HANDLING SYSTEMS ##
 
############################
 

	
 
####################
 
### [appenlight] ###
 
####################
 

	
 
## AppEnlight is tailored to work with Kallithea, see
 
## http://appenlight.com for details on how to obtain an account
 
## you must install the python package `appenlight_client` to make it work
 

	
 
## appenlight enabled
 
appenlight = false
 

	
 
appenlight.server_url = https://api.appenlight.com
 
appenlight.api_key = YOUR_API_KEY
 

	
 
## TWEAK AMOUNT OF INFO SENT HERE
 

	
 
## enables 404 error logging (default False)
 
appenlight.report_404 = false
 

	
 
## time in seconds after request is considered being slow (default 1)
 
appenlight.slow_request_time = 1
 

	
 
## record slow requests in application
 
## (needs to be enabled for slow datastore recording and time tracking)
 
appenlight.slow_requests = true
 

	
 
## enable hooking to application loggers
 
#appenlight.logging = true
 

	
 
## minimum log level for log capture
 
#appenlight.logging.level = WARNING
 

	
 
## send logs only from erroneous/slow requests
 
## (saves API quota for intensive logging)
 
appenlight.logging_on_error = false
 

	
 
## list of additional keywords that should be grabbed from environ object
 
## can be string with comma separated list of words in lowercase
 
## (by default the client will always send the following info:
 
## 'REMOTE_USER', 'REMOTE_ADDR', 'SERVER_NAME', 'CONTENT_TYPE' + all keys that
 
## start with HTTP*); this list can be extended with additional keywords here
 
appenlight.environ_keys_whitelist =
 

	
 
## list of keywords that should be blanked from request object
 
## can be string with comma separated list of words in lowercase
 
## (by default the client will always blank keys that contain the following words:
 
## 'password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf')
 
## this list can be extended with additional keywords set here
 
appenlight.request_keys_blacklist =
 

	
 
## list of namespaces that should be ignored when gathering log entries
 
## can be string with comma separated list of namespaces
 
## (by default the client ignores own entries: appenlight_client.client)
 
appenlight.log_namespace_blacklist =
 

	
 
################
 
### [sentry] ###
 
################
 

	
 
## sentry is an alternative open source error aggregator
 
## you must install python packages `sentry` and `raven` to enable
 

	
 
sentry.dsn = YOUR_DSN
 
sentry.servers =
 
sentry.name =
 
sentry.key =
 
sentry.public_key =
 
sentry.secret_key =
 
sentry.project =
 
sentry.site =
 
sentry.include_paths =
 
sentry.exclude_paths =
 

	
 
################################################################################
 
## WARNING: *THE LINE BELOW MUST BE UNCOMMENTED IN A PRODUCTION ENVIRONMENT*  ##
 
## Debug mode will enable the interactive debugging tool, allowing ANYONE to  ##
 
## execute malicious code after an exception is raised.                       ##
 
################################################################################
 
#set debug = false
 
set debug = true
 

	
 
##################################
 
###       LOGVIEW CONFIG       ###
 
##################################
 

	
 
logview.sqlalchemy = #faa
 
logview.pylons.templating = #bfb
 
logview.pylons.util = #eee
 

	
 
#########################################################
 
### DB CONFIGS - EACH DB WILL HAVE ITS OWN CONFIG     ###
 
#########################################################
 

	
 
# SQLITE [default]
 
sqlalchemy.url = sqlite:///%(here)s/kallithea.db?timeout=60
 

	
 
# POSTGRESQL
 
#sqlalchemy.url = postgresql://user:pass@localhost/kallithea
 

	
 
# MySQL
 
#sqlalchemy.url = mysql://user:pass@localhost/kallithea?charset=utf8
 

	
 
# see sqlalchemy docs for others
 

	
 
sqlalchemy.echo = false
 
sqlalchemy.pool_recycle = 3600
 

	
 
################################
 
### ALEMBIC CONFIGURATION   ####
 
################################
 

	
 
[alembic]
 
script_location = kallithea:alembic
 

	
 
################################
 
### LOGGING CONFIGURATION   ####
 
################################
 

	
 
[loggers]
 
keys = root, routes, kallithea, sqlalchemy, gearbox, beaker, templates, whoosh_indexer
 
keys = root, routes, kallithea, sqlalchemy, tg, gearbox, beaker, templates, whoosh_indexer
 

	
 
[handlers]
 
keys = console, console_sql
 

	
 
[formatters]
 
keys = generic, color_formatter, color_formatter_sql
 

	
 
#############
 
## LOGGERS ##
 
#############
 

	
 
[logger_root]
 
level = NOTSET
 
handlers = console
 

	
 
[logger_routes]
 
level = DEBUG
 
handlers =
 
qualname = routes.middleware
 
## "level = DEBUG" logs the route matched and routing variables.
 
propagate = 1
 

	
 
[logger_beaker]
 
level = DEBUG
 
handlers =
 
qualname = beaker.container
 
propagate = 1
 

	
 
[logger_templates]
 
level = INFO
 
handlers =
 
qualname = pylons.templating
 
propagate = 1
 

	
 
[logger_kallithea]
 
level = DEBUG
 
handlers =
 
qualname = kallithea
 
propagate = 1
 

	
 
[logger_tg]
 
level = DEBUG
 
handlers =
 
qualname = tg
 
propagate = 1
 

	
 
[logger_gearbox]
 
level = DEBUG
 
handlers =
 
qualname = gearbox
 
propagate = 1
 

	
 
[logger_sqlalchemy]
 
level = INFO
 
handlers = console_sql
 
qualname = sqlalchemy.engine
 
propagate = 0
 

	
 
[logger_whoosh_indexer]
 
level = DEBUG
 
handlers =
 
qualname = whoosh_indexer
 
propagate = 1
 

	
 
##############
 
## HANDLERS ##
 
##############
 

	
 
[handler_console]
 
class = StreamHandler
 
args = (sys.stderr,)
 
#level = INFO
 
level = DEBUG
 
#formatter = generic
 
formatter = color_formatter
 

	
 
[handler_console_sql]
 
class = StreamHandler
 
args = (sys.stderr,)
 
#level = WARN
 
level = DEBUG
 
#formatter = generic
 
formatter = color_formatter_sql
 

	
 
################
 
## FORMATTERS ##
 
################
 

	
 
[formatter_generic]
 
format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
 
datefmt = %Y-%m-%d %H:%M:%S
 

	
 
[formatter_color_formatter]
 
class = kallithea.lib.colored_formatter.ColorFormatter
 
format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
 
datefmt = %Y-%m-%d %H:%M:%S
 

	
 
[formatter_color_formatter_sql]
 
class = kallithea.lib.colored_formatter.ColorFormatterSql
 
format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
 
datefmt = %Y-%m-%d %H:%M:%S
kallithea/__init__.py
 
# -*- coding: utf-8 -*-
 
# This program is free software: you can redistribute it and/or modify
 
# it under the terms of the GNU General Public License as published by
 
# the Free Software Foundation, either version 3 of the License, or
 
# (at your option) any later version.
 
#
 
# This program is distributed in the hope that it will be useful,
 
# but WITHOUT ANY WARRANTY; without even the implied warranty of
 
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 
# GNU General Public License for more details.
 
#
 
# You should have received a copy of the GNU General Public License
 
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
"""
 
kallithea
 
~~~~~~~~~
 

	
 
Kallithea, a web based repository management system.
 

	
 
Versioning implementation: http://www.python.org/dev/peps/pep-0386/
 

	
 
This file was forked by the Kallithea project in July 2014.
 
Original author and date, and relevant copyright and licensing information is below:
 
:created_on: Apr 9, 2010
 
:author: marcink
 
:copyright: (c) 2013 RhodeCode GmbH, (C) 2014 Bradley M. Kuhn, and others.
 
:license: GPLv3, see LICENSE.md for more details.
 
"""
 

	
 
import sys
 
import platform
 

	
 
# temporary aliasing to allow early introduction of imports like 'from tg import request'
 
import pylons
 
sys.modules['tg'] = pylons
 

	
 
VERSION = (0, 3, 99)
 
BACKENDS = {
 
    'hg': 'Mercurial repository',
 
    'git': 'Git repository',
 
}
 

	
 
CELERY_ON = False
 
CELERY_EAGER = False
 

	
 
CONFIG = {}
 

	
 
# Linked module for extensions
 
EXTENSIONS = {}
 

	
 
try:
 
    import kallithea.brand
 
except ImportError:
 
    pass
 
else:
 
    assert False, 'Database rebranding is no longer supported; see README.'
 

	
 

	
 
__version__ = '.'.join(str(each) for each in VERSION)
 
__platform__ = platform.system()
 
__license__ = 'GPLv3'
 
__py_version__ = sys.version_info
 
__author__ = "Various Authors"
 
__url__ = 'https://kallithea-scm.org/'
 

	
 
is_windows = __platform__ in ['Windows']
 
is_unix = not is_windows
kallithea/config/app_cfg.py
 
# -*- coding: utf-8 -*-
 
# This program is free software: you can redistribute it and/or modify
 
# it under the terms of the GNU General Public License as published by
 
# the Free Software Foundation, either version 3 of the License, or
 
# (at your option) any later version.
 
#
 
# This program is distributed in the hope that it will be useful,
 
# but WITHOUT ANY WARRANTY; without even the implied warranty of
 
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 
# GNU General Public License for more details.
 
#
 
# You should have received a copy of the GNU General Public License
 
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
"""
 
Global configuration file for TurboGears2 specific settings in Kallithea.
 

	
 
import os
 
import kallithea
 
This file complements the .ini file.
 
"""
 

	
 
import platform
 

	
 
import pylons
 
import mako.lookup
 
import formencode
 
import os, sys
 

	
 
import kallithea.lib.app_globals as app_globals
 

	
 
from kallithea.config.routing import make_map
 
import tg
 
from tg import hooks
 
from tg.configuration import AppConfig
 
from tg.support.converters import asbool
 

	
 
from kallithea.lib import helpers
 
from kallithea.lib.middleware.https_fixup import HttpsFixup
 
from kallithea.lib.middleware.simplegit import SimpleGit
 
from kallithea.lib.middleware.simplehg import SimpleHg
 
from kallithea.config.routing import make_map
 
from kallithea.lib.auth import set_available_permissions
 
from kallithea.lib.utils import repo2db_mapper, make_ui, set_app_settings, \
 
    load_rcextensions, check_git_version, set_vcs_config, set_indexer_config
 
from kallithea.lib.utils2 import engine_from_config, str2bool
 
from kallithea.model.base import init_model
 
from kallithea.lib.db_manage import DbManage
 
from kallithea.lib.utils import load_rcextensions, make_ui, set_app_settings, set_vcs_config, \
 
    set_indexer_config, check_git_version, repo2db_mapper
 
from kallithea.lib.utils2 import str2bool
 
from kallithea.model.scm import ScmModel
 

	
 
from routes.middleware import RoutesMiddleware
 
from paste.cascade import Cascade
 
from paste.registry import RegistryManager
 
from paste.urlparser import StaticURLParser
 
from paste.deploy.converters import asbool
 
import formencode
 
import kallithea
 

	
 

	
 
class KallitheaAppConfig(AppConfig):
 
    # Note: AppConfig has a misleading name, as it's not the application
 
    # configuration, but the application configurator. The AppConfig values are
 
    # used as a template to create the actual configuration, which might
 
    # overwrite or extend the one provided by the configurator template.
 

	
 
    # To make it clear, AppConfig creates the config and sets into it the same
 
    # values that AppConfig itself has. Then the values from the config file and
 
    # gearbox options are loaded and merged into the configuration. Then an
 
    # after_init_config(conf) method of AppConfig is called for any change that
 
    # might depend on options provided by configuration files.
 

	
 
    def __init__(self):
 
        super(KallitheaAppConfig, self).__init__()
 

	
 
        self['package'] = kallithea
 

	
 
        self['prefer_toscawidgets2'] = False
 
        self['use_toscawidgets'] = False
 

	
 
        self['renderers'] = []
 

	
 
        # Enable json in expose
 
        self['renderers'].append('json')
 

	
 
from pylons.middleware import ErrorHandler, StatusCodeRedirect
 
from pylons.wsgiapp import PylonsApp
 
        # Configure template rendering
 
        self['renderers'].append('mako')
 
        self['default_renderer'] = 'mako'
 
        self['use_dotted_templatenames'] = False
 

	
 
        # Configure Sessions, store data as JSON to avoid pickle security issues
 
        self['session.enabled'] = True
 
        self['session.data_serializer'] = 'json'
 

	
 
        # Configure the base SQLALchemy Setup
 
        self['use_sqlalchemy'] = True
 
        self['model'] = kallithea.model.base
 
        self['DBSession'] = kallithea.model.meta.Session
 

	
 
        # Configure App without an authentication backend.
 
        self['auth_backend'] = None
 

	
 
from kallithea.lib.middleware.simplehg import SimpleHg
 
from kallithea.lib.middleware.simplegit import SimpleGit
 
from kallithea.lib.middleware.https_fixup import HttpsFixup
 
from kallithea.lib.middleware.sessionmiddleware import SecureSessionMiddleware
 
from kallithea.lib.middleware.wrapper import RequestWrapper
 
        # Use custom error page for these errors. By default, Turbogears2 does not add
 
        # 400 in this list.
 
        # Explicitly listing all is considered more robust than appending to defaults,
 
        # in light of possible future framework changes.
 
        self['errorpage.status_codes'] = [400, 401, 403, 404]
 

	
 
def setup_configuration(config, paths, app_conf, test_env, test_index):
 
        # Disable transaction manager -- currently Kallithea takes care of transactions itself
 
        self['tm.enabled'] = False
 

	
 
base_config = KallitheaAppConfig()
 

	
 
# TODO still needed as long as we use pylonslib
 
sys.modules['pylons'] = tg
 

	
 
def setup_configuration(app):
 
    config = app.config
 

	
 
    # store some globals into kallithea
 
    kallithea.CELERY_ON = str2bool(config['app_conf'].get('use_celery'))
 
    kallithea.CELERY_EAGER = str2bool(config['app_conf'].get('celery.always.eager'))
 
    kallithea.CONFIG = config
 

	
 
    config['routes.map'] = make_map(config)
 
    config['pylons.app_globals'] = app_globals.Globals(config)
 
    config['pylons.h'] = helpers
 
    kallithea.CONFIG = config
 
    # Provide routes mapper to the RoutedController
 
    root_controller = app.find_controller('root')
 
    root_controller.mapper = config['routes.map'] = make_map(config)
 

	
 
    load_rcextensions(root_path=config['here'])
 

	
 
    # Setup cache object as early as possible
 
    pylons.cache._push_object(config['pylons.app_globals'].cache)
 

	
 
    # Create the Mako TemplateLookup, with the default auto-escaping
 
    config['pylons.app_globals'].mako_lookup = mako.lookup.TemplateLookup(
 
        directories=paths['templates'],
 
        strict_undefined=True,
 
        module_directory=os.path.join(app_conf['cache_dir'], 'templates'),
 
        input_encoding='utf-8', default_filters=['escape'],
 
        imports=['from webhelpers.html import escape'])
 

	
 
    # raise an error when accessing an attribute that does not exist on the template context 'c'
 
    config['pylons.strict_tmpl_context'] = True
 
    # FIXME move test setup code out of here
 
    test = os.path.split(config['__file__'])[-1] == 'test.ini'
 
    if test:
 
        if test_env is None:
 
            test_env = not int(os.environ.get('KALLITHEA_NO_TMP_PATH', 0))
 
        if test_index is None:
 
            test_index = not int(os.environ.get('KALLITHEA_WHOOSH_TEST_DISABLE', 0))
 
        test_env = not int(os.environ.get('KALLITHEA_NO_TMP_PATH', 0))
 
        test_index = not int(os.environ.get('KALLITHEA_WHOOSH_TEST_DISABLE', 0))
 
        if os.environ.get('TEST_DB'):
 
            # swap config if we pass enviroment variable
 
            # swap config if we pass environment variable
 
            config['sqlalchemy.url'] = os.environ.get('TEST_DB')
 

	
 
        from kallithea.tests.fixture import create_test_env, create_test_index
 
        from kallithea.tests.base import TESTS_TMP_PATH
 
        #set KALLITHEA_NO_TMP_PATH=1 to disable re-creating the database and
 
        #test repos
 
        if test_env:
 
            create_test_env(TESTS_TMP_PATH, config)
 
        #set KALLITHEA_WHOOSH_TEST_DISABLE=1 to disable whoosh index during tests
 
        if test_index:
 
            create_test_index(TESTS_TMP_PATH, config, True)
 

	
 
    # MULTIPLE DB configs
 
    # Setup the SQLAlchemy database engine
 
    sa_engine = engine_from_config(config, 'sqlalchemy.')
 
    init_model(sa_engine)
 

	
 
    set_available_permissions(config)
 
    repos_path = make_ui('db').configitems('paths')[0][1]
 
    config['base_path'] = repos_path
 
    set_app_settings(config)
 

	
 
    instance_id = kallithea.CONFIG.get('instance_id', '*')
 
    if instance_id == '*':
 
        instance_id = '%s-%s' % (platform.uname()[1], os.getpid())
 
        kallithea.CONFIG['instance_id'] = instance_id
 

	
 
    # CONFIGURATION OPTIONS HERE (note: all config options will override
 
    # any Pylons config options)
 
    # update kallithea.CONFIG with the meanwhile changed 'config'
 
    kallithea.CONFIG.update(config)
 

	
 
    # store config reference into our module to skip import magic of
 
    # pylons
 
    kallithea.CONFIG.update(config)
 
    # configure vcs and indexer libraries (they are supposed to be independent
 
    # as much as possible and thus avoid importing tg.config or
 
    # kallithea.CONFIG).
 
    set_vcs_config(kallithea.CONFIG)
 
    set_indexer_config(kallithea.CONFIG)
 

	
 
    #check git version
 
    check_git_version()
 

	
 
    if str2bool(config.get('initial_repo_scan', True)):
 
        repo2db_mapper(ScmModel().repo_scan(repos_path),
 
                       remove_obsolete=False, install_git_hooks=False)
 

	
 
    formencode.api.set_stdtranslation(languages=[config.get('lang')])
 

	
 
    return config
 

	
 
def setup_application(config, global_conf, full_stack, static_files):
 
hooks.register('configure_new_app', setup_configuration)
 

	
 
    # The Pylons WSGI app
 
    app = PylonsApp(config=config)
 

	
 
    # Routing/Session/Cache Middleware
 
    app = RoutesMiddleware(app, config['routes.map'], use_method_override=False)
 
    app = SecureSessionMiddleware(app, config)
 

	
 
    # CUSTOM MIDDLEWARE HERE (filtered by error handling middlewares)
 
    if asbool(config['pdebug']):
 
        from kallithea.lib.profiler import ProfilingMiddleware
 
        app = ProfilingMiddleware(app)
 

	
 
    if asbool(full_stack):
 

	
 
        from kallithea.lib.middleware.sentry import Sentry
 
        from kallithea.lib.middleware.appenlight import AppEnlight
 
        if AppEnlight and asbool(config['app_conf'].get('appenlight')):
 
            app = AppEnlight(app, config)
 
        elif Sentry:
 
            app = Sentry(app, config)
 

	
 
        # Handle Python exceptions
 
        app = ErrorHandler(app, global_conf, **config['pylons.errorware'])
 
def setup_application(app):
 
    config = app.config
 

	
 
        # Display error documents for 401, 403, 404 status codes (and
 
        # 500 when debug is disabled)
 
        # Note: will buffer the output in memory!
 
        if asbool(config['debug']):
 
            app = StatusCodeRedirect(app)
 
        else:
 
            app = StatusCodeRedirect(app, [400, 401, 403, 404, 500])
 

	
 
        # we want our low level middleware to get to the request ASAP. We don't
 
        # need any pylons stack middleware in them - especially no StatusCodeRedirect buffering
 
        app = SimpleHg(app, config)
 
        app = SimpleGit(app, config)
 
    # we want our low level middleware to get to the request ASAP. We don't
 
    # need any stack middleware in them - especially no StatusCodeRedirect buffering
 
    app = SimpleHg(app, config)
 
    app = SimpleGit(app, config)
 

	
 
        # Enable https redirects based on HTTP_X_URL_SCHEME set by proxy
 
        if any(asbool(config.get(x)) for x in ['https_fixup', 'force_https', 'use_htsts']):
 
            app = HttpsFixup(app, config)
 

	
 
        app = RequestWrapper(app, config) # logging
 

	
 
    # Establish the Registry for this application
 
    app = RegistryManager(app) # thread / request-local module globals / variables
 
    # Enable https redirects based on HTTP_X_URL_SCHEME set by proxy
 
    if any(asbool(config.get(x)) for x in ['https_fixup', 'force_https', 'use_htsts']):
 
        app = HttpsFixup(app, config)
 
    return app
 

	
 
    if asbool(static_files):
 
        # Serve static files
 
        static_app = StaticURLParser(config['pylons.paths']['static_files'])
 
        app = Cascade([static_app, app])
 

	
 
    app.config = config
 

	
 
    return app
 
hooks.register('before_config', setup_application)
kallithea/config/environment.py
 
# -*- coding: utf-8 -*-
 
# This program is free software: you can redistribute it and/or modify
 
# it under the terms of the GNU General Public License as published by
 
# the Free Software Foundation, either version 3 of the License, or
 
# (at your option) any later version.
 
#
 
# This program is distributed in the hope that it will be useful,
 
# but WITHOUT ANY WARRANTY; without even the implied warranty of
 
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 
# GNU General Public License for more details.
 
#
 
# You should have received a copy of the GNU General Public License
 
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
"""
 
    Pylons environment configuration
 
"""
 
"""WSGI environment setup for Kallithea."""
 

	
 
import os
 
import kallithea
 
import pylons
 

	
 
from kallithea.config.app_cfg import setup_configuration
 
from kallithea.config.app_cfg import base_config
 

	
 
def load_environment(global_conf, app_conf,
 
                     test_env=None, test_index=None):
 
    """
 
    Configure the Pylons environment via the ``pylons.config``
 
    object
 
    """
 
    config = pylons.configuration.PylonsConfig()
 
__all__ = ['load_environment']
 

	
 
    # Pylons paths
 
    root = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
 
    paths = dict(
 
        root=root,
 
        controllers=os.path.join(root, 'controllers'),
 
        static_files=os.path.join(root, 'public'),
 
        templates=[os.path.join(root, 'templates')]
 
    )
 

	
 
    # Initialize config with the basic options
 
    config.init_app(global_conf, app_conf, package='kallithea', paths=paths)
 

	
 
    return setup_configuration(config, paths, app_conf, test_env, test_index)
 
# Use base_config to setup the environment loader function
 
load_environment = base_config.make_load_environment()
kallithea/config/middleware.py
 
# -*- coding: utf-8 -*-
 
# This program is free software: you can redistribute it and/or modify
 
# it under the terms of the GNU General Public License as published by
 
# the Free Software Foundation, either version 3 of the License, or
 
# (at your option) any later version.
 
#
 
# This program is distributed in the hope that it will be useful,
 
# but WITHOUT ANY WARRANTY; without even the implied warranty of
 
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 
# GNU General Public License for more details.
 
#
 
# You should have received a copy of the GNU General Public License
 
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
"""
 
    Pylons middleware initialization
 
"""
 
"""WSGI middleware initialization for the Kallithea application."""
 

	
 
from kallithea.config.app_cfg import setup_application
 
from kallithea.config.app_cfg import base_config
 
from kallithea.config.environment import load_environment
 

	
 
def make_app(global_conf, full_stack=True, static_files=True, **app_conf):
 
    """Create a Pylons WSGI application and return it
 
__all__ = ['make_app']
 

	
 
# Use base_config to setup the necessary PasteDeploy application factory.
 
# make_base_app will wrap the TurboGears2 app with all the middleware it needs.
 
make_base_app = base_config.setup_tg_wsgi_app(load_environment)
 

	
 
    ``global_conf``
 
        The inherited configuration for this application. Normally from
 
        the [DEFAULT] section of the Paste ini file.
 

	
 
def make_app(global_conf, full_stack=True, **app_conf):
 
    """
 
    Set up Kallithea with the settings found in the PasteDeploy configuration
 
    file used.
 

	
 
    ``full_stack``
 
        Whether or not this application provides a full WSGI stack (by
 
        default, meaning it handles its own exceptions and errors).
 
        Disable full_stack when this application is "managed" by
 
        another WSGI middleware.
 
    :param global_conf: The global settings for Kallithea (those
 
        defined under the ``[DEFAULT]`` section).
 
    :type global_conf: dict
 
    :param full_stack: Should the whole TurboGears2 stack be set up?
 
    :type full_stack: str or bool
 
    :return: The Kallithea application with all the relevant middleware
 
        loaded.
 

	
 
    ``app_conf``
 
        The application's local configuration. Normally specified in
 
        the [app:<name>] section of the Paste ini file (where <name>
 
        defaults to main).
 
    This is the PasteDeploy factory for the Kallithea application.
 

	
 
    ``app_conf`` contains all the application-specific settings (those defined
 
    under ``[app:main]``).
 
    """
 
    # Configure the Pylons environment
 
    config = load_environment(global_conf, app_conf)
 

	
 
    return setup_application(config, global_conf, full_stack, static_files)
 
    app = make_base_app(global_conf, full_stack=full_stack, **app_conf)
 
    return app
kallithea/config/routing.py
 
# -*- coding: utf-8 -*-
 
# This program is free software: you can redistribute it and/or modify
 
# it under the terms of the GNU General Public License as published by
 
# the Free Software Foundation, either version 3 of the License, or
 
# (at your option) any later version.
 
#
 
# This program is distributed in the hope that it will be useful,
 
# but WITHOUT ANY WARRANTY; without even the implied warranty of
 
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 
# GNU General Public License for more details.
 
#
 
# You should have received a copy of the GNU General Public License
 
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
"""
 
Routes configuration
 

	
 
The more specific and detailed routes should be defined first so they
 
may take precedence over the more generic routes. For more information
 
refer to the routes manual at http://routes.groovie.org/docs/
 
"""
 

	
 
from tg import request
 
from routes import Mapper
 

	
 
# prefix for non-repository-related links; needs to be prefixed with `/`
 
ADMIN_PREFIX = '/_admin'
 

	
 

	
 
def make_map(config):
 
    """Create, configure and return the routes Mapper"""
 
    rmap = Mapper(directory=config['pylons.paths']['controllers'],
 
    rmap = Mapper(directory=config['paths']['controllers'],
 
                  always_scan=config['debug'])
 
    rmap.minimization = False
 
    rmap.explicit = False
 

	
 
    from kallithea.lib.utils import (is_valid_repo, is_valid_repo_group,
 
                                     get_repo_by_id)
 

	
 
    def check_repo(environ, match_dict):
 
        """
 
        check for valid repository for proper 404 handling
 

	
 
        :param environ:
 
        :param match_dict:
 
        """
 
        repo_name = match_dict.get('repo_name')
 

	
 
        if match_dict.get('f_path'):
 
            #fix for multiple initial slashes that causes errors
 
            # fix for multiple initial slashes that causes errors
 
            match_dict['f_path'] = match_dict['f_path'].lstrip('/')
 

	
 
        by_id_match = get_repo_by_id(repo_name)
 
        if by_id_match:
 
            repo_name = by_id_match
 
            match_dict['repo_name'] = repo_name
 

	
 
        return is_valid_repo(repo_name, config['base_path'])
 

	
 
    def check_group(environ, match_dict):
 
        """
 
        check for valid repository group for proper 404 handling
 

	
 
        :param environ:
 
        :param match_dict:
 
        """
 
        repo_group_name = match_dict.get('group_name')
 
        return is_valid_repo_group(repo_group_name, config['base_path'])
 

	
 
    def check_group_skip_path(environ, match_dict):
 
        """
 
        check for valid repository group for proper 404 handling, but skips
 
        verification of existing path
 

	
 
        :param environ:
 
        :param match_dict:
 
        """
 
        repo_group_name = match_dict.get('group_name')
 
        return is_valid_repo_group(repo_group_name, config['base_path'],
 
                                   skip_path_check=True)
 

	
 
    def check_user_group(environ, match_dict):
 
        """
 
        check for valid user group for proper 404 handling
 

	
 
        :param environ:
 
        :param match_dict:
 
        """
 
        return True
 

	
 
    def check_int(environ, match_dict):
 
        return match_dict.get('id').isdigit()
 

	
 
    # The ErrorController route (handles 404/500 error pages); it should
 
    # likely stay at the top, ensuring it can always be resolved
 
    rmap.connect('/error/{action}', controller='error')
 
    rmap.connect('/error/{action}/{id}', controller='error')
 

	
 
    #==========================================================================
 
    # CUSTOM ROUTES HERE
 
    #==========================================================================
 

	
 
    #MAIN PAGE
 
    rmap.connect('home', '/', controller='home', action='index')
 
    rmap.connect('about', '/about', controller='home', action='about')
 
    rmap.connect('repo_switcher_data', '/_repos', controller='home',
 
                 action='repo_switcher_data')
 

	
 
    rmap.connect('rst_help',
 
                 "http://docutils.sourceforge.net/docs/user/rst/quickref.html",
 
                 _static=True)
 
    rmap.connect('kallithea_project_url', "https://kallithea-scm.org/", _static=True)
 
    rmap.connect('issues_url', 'https://bitbucket.org/conservancy/kallithea/issues', _static=True)
 

	
 
    #ADMIN REPOSITORY ROUTES
 
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
 
                        controller='admin/repos') as m:
 
        m.connect("repos", "/repos",
 
                  action="create", conditions=dict(method=["POST"]))
 
        m.connect("repos", "/repos",
 
                  action="index", conditions=dict(method=["GET"]))
 
        m.connect("new_repo", "/create_repository",
 
                  action="create_repository", conditions=dict(method=["GET"]))
 
        m.connect("update_repo", "/repos/{repo_name:.*?}",
 
                  action="update", conditions=dict(method=["POST"],
 
                  function=check_repo))
 
        m.connect("delete_repo", "/repos/{repo_name:.*?}/delete",
 
                  action="delete", conditions=dict(method=["POST"]))
 

	
 
    #ADMIN REPOSITORY GROUPS ROUTES
 
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
 
                        controller='admin/repo_groups') as m:
 
        m.connect("repos_groups", "/repo_groups",
 
                  action="create", conditions=dict(method=["POST"]))
 
        m.connect("repos_groups", "/repo_groups",
 
                  action="index", conditions=dict(method=["GET"]))
 
        m.connect("new_repos_group", "/repo_groups/new",
 
                  action="new", conditions=dict(method=["GET"]))
 
        m.connect("update_repos_group", "/repo_groups/{group_name:.*?}",
 
                  action="update", conditions=dict(method=["POST"],
 
                                                   function=check_group))
 

	
 
        m.connect("repos_group", "/repo_groups/{group_name:.*?}",
 
                  action="show", conditions=dict(method=["GET"],
 
                                                 function=check_group))
 

	
 
        #EXTRAS REPO GROUP ROUTES
 
        m.connect("edit_repo_group", "/repo_groups/{group_name:.*?}/edit",
 
                  action="edit",
 
                  conditions=dict(method=["GET"], function=check_group))
 

	
 
        m.connect("edit_repo_group_advanced", "/repo_groups/{group_name:.*?}/edit/advanced",
 
                  action="edit_repo_group_advanced",
 
                  conditions=dict(method=["GET"], function=check_group))
 

	
 
        m.connect("edit_repo_group_perms", "/repo_groups/{group_name:.*?}/edit/permissions",
 
                  action="edit_repo_group_perms",
 
                  conditions=dict(method=["GET"], function=check_group))
 
        m.connect("edit_repo_group_perms_update", "/repo_groups/{group_name:.*?}/edit/permissions",
 
                  action="update_perms",
 
                  conditions=dict(method=["POST"], function=check_group))
 
        m.connect("edit_repo_group_perms_delete", "/repo_groups/{group_name:.*?}/edit/permissions/delete",
 
                  action="delete_perms",
 
                  conditions=dict(method=["POST"], function=check_group))
 

	
 
        m.connect("delete_repo_group", "/repo_groups/{group_name:.*?}/delete",
 
                  action="delete", conditions=dict(method=["POST"],
 
                                                   function=check_group_skip_path))
 

	
 

	
 
    #ADMIN USER ROUTES
 
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
 
                        controller='admin/users') as m:
 
        m.connect("new_user", "/users/new",
 
                  action="create", conditions=dict(method=["POST"]))
 
        m.connect("users", "/users",
 
                  action="index", conditions=dict(method=["GET"]))
 
        m.connect("formatted_users", "/users.{format}",
 
                  action="index", conditions=dict(method=["GET"]))
 
        m.connect("new_user", "/users/new",
 
                  action="new", conditions=dict(method=["GET"]))
 
        m.connect("update_user", "/users/{id}",
 
                  action="update", conditions=dict(method=["POST"]))
 
        m.connect("delete_user", "/users/{id}/delete",
 
                  action="delete", conditions=dict(method=["POST"]))
 
        m.connect("edit_user", "/users/{id}/edit",
 
                  action="edit", conditions=dict(method=["GET"]))
 

	
 
        #EXTRAS USER ROUTES
 
        m.connect("edit_user_advanced", "/users/{id}/edit/advanced",
 
                  action="edit_advanced", conditions=dict(method=["GET"]))
 

	
 
        m.connect("edit_user_api_keys", "/users/{id}/edit/api_keys",
 
                  action="edit_api_keys", conditions=dict(method=["GET"]))
 
        m.connect("edit_user_api_keys_update", "/users/{id}/edit/api_keys",
 
                  action="add_api_key", conditions=dict(method=["POST"]))
 
        m.connect("edit_user_api_keys_delete", "/users/{id}/edit/api_keys/delete",
 
                  action="delete_api_key", conditions=dict(method=["POST"]))
 

	
 
        m.connect("edit_user_perms", "/users/{id}/edit/permissions",
 
                  action="edit_perms", conditions=dict(method=["GET"]))
 
        m.connect("edit_user_perms_update", "/users/{id}/edit/permissions",
 
                  action="update_perms", conditions=dict(method=["POST"]))
 

	
 
        m.connect("edit_user_emails", "/users/{id}/edit/emails",
 
                  action="edit_emails", conditions=dict(method=["GET"]))
 
        m.connect("edit_user_emails_update", "/users/{id}/edit/emails",
 
                  action="add_email", conditions=dict(method=["POST"]))
 
        m.connect("edit_user_emails_delete", "/users/{id}/edit/emails/delete",
 
                  action="delete_email", conditions=dict(method=["POST"]))
 

	
 
        m.connect("edit_user_ips", "/users/{id}/edit/ips",
 
                  action="edit_ips", conditions=dict(method=["GET"]))
 
        m.connect("edit_user_ips_update", "/users/{id}/edit/ips",
 
                  action="add_ip", conditions=dict(method=["POST"]))
 
        m.connect("edit_user_ips_delete", "/users/{id}/edit/ips/delete",
 
                  action="delete_ip", conditions=dict(method=["POST"]))
 

	
 
    #ADMIN USER GROUPS REST ROUTES
 
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
 
                        controller='admin/user_groups') as m:
 
        m.connect("users_groups", "/user_groups",
 
                  action="create", conditions=dict(method=["POST"]))
 
        m.connect("users_groups", "/user_groups",
 
                  action="index", conditions=dict(method=["GET"]))
 
        m.connect("new_users_group", "/user_groups/new",
 
                  action="new", conditions=dict(method=["GET"]))
 
        m.connect("update_users_group", "/user_groups/{id}",
 
                  action="update", conditions=dict(method=["POST"]))
 
        m.connect("delete_users_group", "/user_groups/{id}/delete",
 
                  action="delete", conditions=dict(method=["POST"]))
 
        m.connect("edit_users_group", "/user_groups/{id}/edit",
 
                  action="edit", conditions=dict(method=["GET"]),
 
                  function=check_user_group)
 

	
 
        #EXTRAS USER GROUP ROUTES
 
        m.connect("edit_user_group_default_perms", "/user_groups/{id}/edit/default_perms",
 
                  action="edit_default_perms", conditions=dict(method=["GET"]))
 
        m.connect("edit_user_group_default_perms_update", "/user_groups/{id}/edit/default_perms",
 
                  action="update_default_perms", conditions=dict(method=["POST"]))
 

	
 

	
 
        m.connect("edit_user_group_perms", "/user_groups/{id}/edit/perms",
 
                  action="edit_perms", conditions=dict(method=["GET"]))
 
        m.connect("edit_user_group_perms_update", "/user_groups/{id}/edit/perms",
 
                  action="update_perms", conditions=dict(method=["POST"]))
 
        m.connect("edit_user_group_perms_delete", "/user_groups/{id}/edit/perms/delete",
 
                  action="delete_perms", conditions=dict(method=["POST"]))
 

	
 
        m.connect("edit_user_group_advanced", "/user_groups/{id}/edit/advanced",
 
                  action="edit_advanced", conditions=dict(method=["GET"]))
 

	
 
        m.connect("edit_user_group_members", "/user_groups/{id}/edit/members",
 
                  action="edit_members", conditions=dict(method=["GET"]))
 

	
 

	
 

	
 
    #ADMIN PERMISSIONS ROUTES
 
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
 
                        controller='admin/permissions') as m:
 
        m.connect("admin_permissions", "/permissions",
 
                  action="permission_globals", conditions=dict(method=["POST"]))
 
        m.connect("admin_permissions", "/permissions",
 
                  action="permission_globals", conditions=dict(method=["GET"]))
 

	
 
        m.connect("admin_permissions_ips", "/permissions/ips",
 
                  action="permission_ips", conditions=dict(method=["GET"]))
 

	
 
        m.connect("admin_permissions_perms", "/permissions/perms",
 
                  action="permission_perms", conditions=dict(method=["GET"]))
 

	
 

	
 
    #ADMIN DEFAULTS ROUTES
 
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
 
                        controller='admin/defaults') as m:
 
        m.connect('defaults', 'defaults',
 
                  action="index")
 
        m.connect('defaults_update', 'defaults/{id}/update',
 
                  action="update", conditions=dict(method=["POST"]))
 

	
 
    #ADMIN AUTH SETTINGS
 
    rmap.connect('auth_settings', '%s/auth' % ADMIN_PREFIX,
 
                 controller='admin/auth_settings', action='auth_settings',
 
                 conditions=dict(method=["POST"]))
 
    rmap.connect('auth_home', '%s/auth' % ADMIN_PREFIX,
 
                 controller='admin/auth_settings')
 

	
 
    #ADMIN SETTINGS ROUTES
 
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
 
                        controller='admin/settings') as m:
 
        m.connect("admin_settings", "/settings",
 
                  action="settings_vcs", conditions=dict(method=["POST"]))
 
        m.connect("admin_settings", "/settings",
 
                  action="settings_vcs", conditions=dict(method=["GET"]))
 

	
 
        m.connect("admin_settings_mapping", "/settings/mapping",
 
                  action="settings_mapping", conditions=dict(method=["POST"]))
 
        m.connect("admin_settings_mapping", "/settings/mapping",
 
                  action="settings_mapping", conditions=dict(method=["GET"]))
 

	
 
        m.connect("admin_settings_global", "/settings/global",
 
                  action="settings_global", conditions=dict(method=["POST"]))
 
        m.connect("admin_settings_global", "/settings/global",
 
                  action="settings_global", conditions=dict(method=["GET"]))
 

	
 
        m.connect("admin_settings_visual", "/settings/visual",
 
                  action="settings_visual", conditions=dict(method=["POST"]))
 
        m.connect("admin_settings_visual", "/settings/visual",
 
                  action="settings_visual", conditions=dict(method=["GET"]))
 

	
 
        m.connect("admin_settings_email", "/settings/email",
 
                  action="settings_email", conditions=dict(method=["POST"]))
 
        m.connect("admin_settings_email", "/settings/email",
 
                  action="settings_email", conditions=dict(method=["GET"]))
 

	
 
        m.connect("admin_settings_hooks", "/settings/hooks",
 
                  action="settings_hooks", conditions=dict(method=["POST"]))
 
        m.connect("admin_settings_hooks_delete", "/settings/hooks/delete",
 
                  action="settings_hooks", conditions=dict(method=["POST"]))
 
        m.connect("admin_settings_hooks", "/settings/hooks",
 
                  action="settings_hooks", conditions=dict(method=["GET"]))
 

	
 
        m.connect("admin_settings_search", "/settings/search",
 
                  action="settings_search", conditions=dict(method=["POST"]))
 
        m.connect("admin_settings_search", "/settings/search",
 
                  action="settings_search", conditions=dict(method=["GET"]))
 

	
 
        m.connect("admin_settings_system", "/settings/system",
 
                  action="settings_system", conditions=dict(method=["POST"]))
 
        m.connect("admin_settings_system", "/settings/system",
 
                  action="settings_system", conditions=dict(method=["GET"]))
 
        m.connect("admin_settings_system_update", "/settings/system/updates",
 
                  action="settings_system_update", conditions=dict(method=["GET"]))
 

	
 
    #ADMIN MY ACCOUNT
 
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
 
                        controller='admin/my_account') as m:
 

	
 
        m.connect("my_account", "/my_account",
 
                  action="my_account", conditions=dict(method=["GET"]))
 
        m.connect("my_account", "/my_account",
 
                  action="my_account", conditions=dict(method=["POST"]))
 

	
 
        m.connect("my_account_password", "/my_account/password",
 
                  action="my_account_password", conditions=dict(method=["GET"]))
 
        m.connect("my_account_password", "/my_account/password",
 
                  action="my_account_password", conditions=dict(method=["POST"]))
 

	
 
        m.connect("my_account_repos", "/my_account/repos",
 
                  action="my_account_repos", conditions=dict(method=["GET"]))
 

	
 
        m.connect("my_account_watched", "/my_account/watched",
 
                  action="my_account_watched", conditions=dict(method=["GET"]))
 

	
 
        m.connect("my_account_perms", "/my_account/perms",
 
                  action="my_account_perms", conditions=dict(method=["GET"]))
 

	
 
        m.connect("my_account_emails", "/my_account/emails",
 
                  action="my_account_emails", conditions=dict(method=["GET"]))
 
        m.connect("my_account_emails", "/my_account/emails",
 
                  action="my_account_emails_add", conditions=dict(method=["POST"]))
 
        m.connect("my_account_emails_delete", "/my_account/emails/delete",
 
                  action="my_account_emails_delete", conditions=dict(method=["POST"]))
 

	
 
        m.connect("my_account_api_keys", "/my_account/api_keys",
 
                  action="my_account_api_keys", conditions=dict(method=["GET"]))
 
        m.connect("my_account_api_keys", "/my_account/api_keys",
 
                  action="my_account_api_keys_add", conditions=dict(method=["POST"]))
 
        m.connect("my_account_api_keys_delete", "/my_account/api_keys/delete",
 
                  action="my_account_api_keys_delete", conditions=dict(method=["POST"]))
 

	
 
    #NOTIFICATION REST ROUTES
 
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
 
                        controller='admin/notifications') as m:
 
        m.connect("notifications", "/notifications",
 
                  action="index", conditions=dict(method=["GET"]))
 
        m.connect("notifications_mark_all_read", "/notifications/mark_all_read",
 
                  action="mark_all_read", conditions=dict(method=["GET"]))
 
        m.connect("formatted_notifications", "/notifications.{format}",
 
                  action="index", conditions=dict(method=["GET"]))
 
        m.connect("notification_update", "/notifications/{notification_id}/update",
 
                  action="update", conditions=dict(method=["POST"]))
 
        m.connect("notification_delete", "/notifications/{notification_id}/delete",
 
                  action="delete", conditions=dict(method=["POST"]))
 
        m.connect("notification", "/notifications/{notification_id}",
 
                  action="show", conditions=dict(method=["GET"]))
 
        m.connect("formatted_notification", "/notifications/{notification_id}.{format}",
 
                  action="show", conditions=dict(method=["GET"]))
 

	
 
    #ADMIN GIST
 
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
 
                        controller='admin/gists') as m:
 
        m.connect("gists", "/gists",
 
                  action="create", conditions=dict(method=["POST"]))
 
        m.connect("gists", "/gists",
 
                  action="index", conditions=dict(method=["GET"]))
 
        m.connect("new_gist", "/gists/new",
 
                  action="new", conditions=dict(method=["GET"]))
 

	
 

	
 
        m.connect("gist_delete", "/gists/{gist_id}/delete",
 
                  action="delete", conditions=dict(method=["POST"]))
 
        m.connect("edit_gist", "/gists/{gist_id}/edit",
 
                  action="edit", conditions=dict(method=["GET", "POST"]))
 
        m.connect("edit_gist_check_revision", "/gists/{gist_id}/edit/check_revision",
 
                  action="check_revision", conditions=dict(method=["POST"]))
 

	
 

	
 
        m.connect("gist", "/gists/{gist_id}",
 
                  action="show", conditions=dict(method=["GET"]))
 
        m.connect("gist_rev", "/gists/{gist_id}/{revision}",
 
                  revision="tip",
 
                  action="show", conditions=dict(method=["GET"]))
 
        m.connect("formatted_gist", "/gists/{gist_id}/{revision}/{format}",
 
                  revision="tip",
 
                  action="show", conditions=dict(method=["GET"]))
 
        m.connect("formatted_gist_file", "/gists/{gist_id}/{revision}/{format}/{f_path:.*}",
 
                  revision='tip',
 
                  action="show", conditions=dict(method=["GET"]))
 

	
 
    #ADMIN MAIN PAGES
 
    with rmap.submapper(path_prefix=ADMIN_PREFIX,
 
                        controller='admin/admin') as m:
 
        m.connect('admin_home', '', action='index')
 
        m.connect('admin_add_repo', '/add_repo/{new_repo:[a-z0-9\. _-]*}',
 
                  action='add_repo')
 
    #==========================================================================
 
    # API V2
 
    #==========================================================================
 
    with rmap.submapper(path_prefix=ADMIN_PREFIX, controller='api/api',
                        action='_dispatch') as m:
 
        m.connect('api', '/api')
 

	
 
    #USER JOURNAL
 
    rmap.connect('journal', '%s/journal' % ADMIN_PREFIX,
 
                 controller='journal', action='index')
 
    rmap.connect('journal_rss', '%s/journal/rss' % ADMIN_PREFIX,
 
                 controller='journal', action='journal_rss')
 
    rmap.connect('journal_atom', '%s/journal/atom' % ADMIN_PREFIX,
 
                 controller='journal', action='journal_atom')
 

	
 
    rmap.connect('public_journal', '%s/public_journal' % ADMIN_PREFIX,
 
                 controller='journal', action="public_journal")
 

	
 
    rmap.connect('public_journal_rss', '%s/public_journal/rss' % ADMIN_PREFIX,
 
                 controller='journal', action="public_journal_rss")
 

	
 
    rmap.connect('public_journal_rss_old', '%s/public_journal_rss' % ADMIN_PREFIX,
 
                 controller='journal', action="public_journal_rss")
 

	
 
    rmap.connect('public_journal_atom',
 
                 '%s/public_journal/atom' % ADMIN_PREFIX, controller='journal',
 
                 action="public_journal_atom")
 

	
 
    rmap.connect('public_journal_atom_old',
 
                 '%s/public_journal_atom' % ADMIN_PREFIX, controller='journal',
 
                 action="public_journal_atom")
 

	
 
    rmap.connect('toggle_following', '%s/toggle_following' % ADMIN_PREFIX,
 
                 controller='journal', action='toggle_following',
 
                 conditions=dict(method=["POST"]))
 

	
 
    #SEARCH
 
    rmap.connect('search', '%s/search' % ADMIN_PREFIX, controller='search',)
 
    rmap.connect('search_repo_admin', '%s/search/{repo_name:.*}' % ADMIN_PREFIX,
 
                 controller='search',
 
                 conditions=dict(function=check_repo))
 
    rmap.connect('search_repo', '/{repo_name:.*?}/search',
 
                 controller='search',
 
                 conditions=dict(function=check_repo),
 
                 )
 

	
 
    #LOGIN/LOGOUT/REGISTER/SIGN IN
 
    rmap.connect('authentication_token', '%s/authentication_token' % ADMIN_PREFIX, controller='login', action='authentication_token')
 
    rmap.connect('login_home', '%s/login' % ADMIN_PREFIX, controller='login')
 
    rmap.connect('logout_home', '%s/logout' % ADMIN_PREFIX, controller='login',
 
                 action='logout')
 

	
 
    rmap.connect('register', '%s/register' % ADMIN_PREFIX, controller='login',
 
                 action='register')
 

	
 
    rmap.connect('reset_password', '%s/password_reset' % ADMIN_PREFIX,
 
                 controller='login', action='password_reset')
 

	
 
    rmap.connect('reset_password_confirmation',
 
                 '%s/password_reset_confirmation' % ADMIN_PREFIX,
 
                 controller='login', action='password_reset_confirmation')
 

	
 
    #FEEDS
 
    rmap.connect('rss_feed_home', '/{repo_name:.*?}/feed/rss',
 
                controller='feed', action='rss',
 
                conditions=dict(function=check_repo))
 

	
 
    rmap.connect('atom_feed_home', '/{repo_name:.*?}/feed/atom',
 
                controller='feed', action='atom',
 
                conditions=dict(function=check_repo))
 

	
 
    #==========================================================================
 
    # REPOSITORY ROUTES
 
    #==========================================================================
 
    rmap.connect('repo_creating_home', '/{repo_name:.*?}/repo_creating',
 
                controller='admin/repos', action='repo_creating')
 
    rmap.connect('repo_check_home', '/{repo_name:.*?}/crepo_check',
 
                controller='admin/repos', action='repo_check')
 

	
 
    rmap.connect('summary_home', '/{repo_name:.*?}',
 
                controller='summary',
 
                conditions=dict(function=check_repo))
 

	
 
    # must be here for proper group/repo catching
 
    rmap.connect('repos_group_home', '/{group_name:.*}',
 
                controller='admin/repo_groups', action="show_by_name",
 
                conditions=dict(function=check_group))
 
    rmap.connect('repo_stats_home', '/{repo_name:.*?}/statistics',
 
                controller='summary', action='statistics',
 
                conditions=dict(function=check_repo))
 

	
 
    rmap.connect('repo_size', '/{repo_name:.*?}/repo_size',
 
                controller='summary', action='repo_size',
 
                conditions=dict(function=check_repo))
 

	
 
    rmap.connect('repo_refs_data', '/{repo_name:.*?}/refs-data',
 
                 controller='home', action='repo_refs_data')
 

	
 
    rmap.connect('changeset_home', '/{repo_name:.*?}/changeset/{revision:.*}',
 
                controller='changeset', revision='tip',
 
                conditions=dict(function=check_repo))
 
    rmap.connect('changeset_children', '/{repo_name:.*?}/changeset_children/{revision}',
 
                controller='changeset', revision='tip', action="changeset_children",
 
                conditions=dict(function=check_repo))
 
    rmap.connect('changeset_parents', '/{repo_name:.*?}/changeset_parents/{revision}',
 
                controller='changeset', revision='tip', action="changeset_parents",
 
                conditions=dict(function=check_repo))
 

	
 
    # repo edit options
 
    rmap.connect("edit_repo", "/{repo_name:.*?}/settings",
 
                 controller='admin/repos', action="edit",
 
                 conditions=dict(method=["GET"], function=check_repo))
 

	
 
    rmap.connect("edit_repo_perms", "/{repo_name:.*?}/settings/permissions",
 
                 controller='admin/repos', action="edit_permissions",
 
                 conditions=dict(method=["GET"], function=check_repo))
 
    rmap.connect("edit_repo_perms_update", "/{repo_name:.*?}/settings/permissions",
 
                 controller='admin/repos', action="edit_permissions_update",
 
                 conditions=dict(method=["POST"], function=check_repo))
 
    rmap.connect("edit_repo_perms_revoke", "/{repo_name:.*?}/settings/permissions/delete",
 
                 controller='admin/repos', action="edit_permissions_revoke",
 
                 conditions=dict(method=["POST"], function=check_repo))
 

	
 
    rmap.connect("edit_repo_fields", "/{repo_name:.*?}/settings/fields",
 
                 controller='admin/repos', action="edit_fields",
 
                 conditions=dict(method=["GET"], function=check_repo))
 
    rmap.connect('create_repo_fields', "/{repo_name:.*?}/settings/fields/new",
 
                 controller='admin/repos', action="create_repo_field",
 
                 conditions=dict(method=["POST"], function=check_repo))
 
    rmap.connect('delete_repo_fields', "/{repo_name:.*?}/settings/fields/{field_id}/delete",
 
                 controller='admin/repos', action="delete_repo_field",
 
                 conditions=dict(method=["POST"], function=check_repo))
 

	
 

	
 
    rmap.connect("edit_repo_advanced", "/{repo_name:.*?}/settings/advanced",
 
                 controller='admin/repos', action="edit_advanced",
 
                 conditions=dict(method=["GET"], function=check_repo))
 

	
 
    rmap.connect("edit_repo_advanced_locking", "/{repo_name:.*?}/settings/advanced/locking",
 
                 controller='admin/repos', action="edit_advanced_locking",
 
                 conditions=dict(method=["POST"], function=check_repo))
 
    rmap.connect('toggle_locking', "/{repo_name:.*?}/settings/advanced/locking_toggle",
 
                 controller='admin/repos', action="toggle_locking",
 
                 conditions=dict(method=["GET"], function=check_repo))
 

	
 
    rmap.connect("edit_repo_advanced_journal", "/{repo_name:.*?}/settings/advanced/journal",
 
                 controller='admin/repos', action="edit_advanced_journal",
 
                 conditions=dict(method=["POST"], function=check_repo))
 

	
 
    rmap.connect("edit_repo_advanced_fork", "/{repo_name:.*?}/settings/advanced/fork",
 
                 controller='admin/repos', action="edit_advanced_fork",
 
                 conditions=dict(method=["POST"], function=check_repo))
 

	
 

	
 
    rmap.connect("edit_repo_caches", "/{repo_name:.*?}/settings/caches",
 
                 controller='admin/repos', action="edit_caches",
 
                 conditions=dict(method=["GET"], function=check_repo))
 
    rmap.connect("update_repo_caches", "/{repo_name:.*?}/settings/caches",
 
                 controller='admin/repos', action="edit_caches",
 
                 conditions=dict(method=["POST"], function=check_repo))
 

	
 

	
 
    rmap.connect("edit_repo_remote", "/{repo_name:.*?}/settings/remote",
 
                 controller='admin/repos', action="edit_remote",
 
                 conditions=dict(method=["GET"], function=check_repo))
 
    rmap.connect("edit_repo_remote_update", "/{repo_name:.*?}/settings/remote",
 
                 controller='admin/repos', action="edit_remote",
 
                 conditions=dict(method=["POST"], function=check_repo))
 

	
 
    rmap.connect("edit_repo_statistics", "/{repo_name:.*?}/settings/statistics",
 
                 controller='admin/repos', action="edit_statistics",
 
                 conditions=dict(method=["GET"], function=check_repo))
 
    rmap.connect("edit_repo_statistics_update", "/{repo_name:.*?}/settings/statistics",
 
                 controller='admin/repos', action="edit_statistics",
 
                 conditions=dict(method=["POST"], function=check_repo))
 

	
 
    #still working URL, kept for backward compatibility
 
    rmap.connect('raw_changeset_home_depraced',
 
                 '/{repo_name:.*?}/raw-changeset/{revision}',
 
                 controller='changeset', action='changeset_raw',
 
                 revision='tip', conditions=dict(function=check_repo))
 

	
 
    ## new URLs
 
    rmap.connect('changeset_raw_home',
 
                 '/{repo_name:.*?}/changeset-diff/{revision}',
 
                 controller='changeset', action='changeset_raw',
 
                 revision='tip', conditions=dict(function=check_repo))
 

	
 
    rmap.connect('changeset_patch_home',
 
                 '/{repo_name:.*?}/changeset-patch/{revision}',
 
                 controller='changeset', action='changeset_patch',
 
                 revision='tip', conditions=dict(function=check_repo))
 

	
 
    rmap.connect('changeset_download_home',
 
                 '/{repo_name:.*?}/changeset-download/{revision}',
 
                 controller='changeset', action='changeset_download',
 
                 revision='tip', conditions=dict(function=check_repo))
 

	
 
    rmap.connect('changeset_comment',
 
                 '/{repo_name:.*?}/changeset-comment/{revision}',
 
                controller='changeset', revision='tip', action='comment',
 
                conditions=dict(function=check_repo))
 

	
 
    rmap.connect('changeset_comment_delete',
 
                 '/{repo_name:.*?}/changeset-comment/{comment_id}/delete',
 
                controller='changeset', action='delete_comment',
 
                conditions=dict(function=check_repo, method=["POST"]))
 

	
 
    rmap.connect('changeset_info', '/changeset_info/{repo_name:.*?}/{revision}',
 
                 controller='changeset', action='changeset_info')
 

	
 
    rmap.connect('compare_home',
 
                 '/{repo_name:.*?}/compare',
 
                 controller='compare', action='index',
 
                 conditions=dict(function=check_repo))
 

	
 
    rmap.connect('compare_url',
 
                 '/{repo_name:.*?}/compare/{org_ref_type}@{org_ref_name:.*?}...{other_ref_type}@{other_ref_name:.*?}',
 
                 controller='compare', action='compare',
 
                 conditions=dict(function=check_repo),
 
                 requirements=dict(
 
                            org_ref_type='(branch|book|tag|rev|__other_ref_type__)',
 
                            other_ref_type='(branch|book|tag|rev|__org_ref_type__)')
 
                 )
 

	
 
    rmap.connect('pullrequest_home',
 
                 '/{repo_name:.*?}/pull-request/new', controller='pullrequests',
 
                 action='index', conditions=dict(function=check_repo,
 
                                                 method=["GET"]))
 

	
 
    rmap.connect('pullrequest_repo_info',
 
                 '/{repo_name:.*?}/pull-request-repo-info',
 
                 controller='pullrequests', action='repo_info',
 
                 conditions=dict(function=check_repo, method=["GET"]))
 

	
 
    rmap.connect('pullrequest',
 
                 '/{repo_name:.*?}/pull-request/new', controller='pullrequests',
 
                 action='create', conditions=dict(function=check_repo,
 
                                                  method=["POST"]))
 

	
 
    rmap.connect('pullrequest_show',
 
                 '/{repo_name:.*?}/pull-request/{pull_request_id:\\d+}{extra:(/.*)?}', extra='',
 
                 controller='pullrequests',
 
                 action='show', conditions=dict(function=check_repo,
 
                                                method=["GET"]))
 
    rmap.connect('pullrequest_post',
 
                 '/{repo_name:.*?}/pull-request/{pull_request_id}',
 
                 controller='pullrequests',
 
                 action='post', conditions=dict(function=check_repo,
 
                                                method=["POST"]))
 
    rmap.connect('pullrequest_delete',
 
                 '/{repo_name:.*?}/pull-request/{pull_request_id}/delete',
 
                 controller='pullrequests',
 
                 action='delete', conditions=dict(function=check_repo,
 
                                                  method=["POST"]))
 

	
 
    rmap.connect('pullrequest_show_all',
 
                 '/{repo_name:.*?}/pull-request',
 
                 controller='pullrequests',
 
                 action='show_all', conditions=dict(function=check_repo,
 
                                                method=["GET"]))
 

	
 
    rmap.connect('my_pullrequests',
 
                 '/my_pullrequests',
 
                 controller='pullrequests',
 
                 action='show_my', conditions=dict(method=["GET"]))
 

	
 
    rmap.connect('pullrequest_comment',
 
                 '/{repo_name:.*?}/pull-request-comment/{pull_request_id}',
 
                 controller='pullrequests',
 
                 action='comment', conditions=dict(function=check_repo,
 
                                                method=["POST"]))
 

	
 
    rmap.connect('pullrequest_comment_delete',
 
                 '/{repo_name:.*?}/pull-request-comment/{comment_id}/delete',
 
                controller='pullrequests', action='delete_comment',
 
                conditions=dict(function=check_repo, method=["POST"]))
 

	
 
    rmap.connect('summary_home_summary', '/{repo_name:.*?}/summary',
 
                controller='summary', conditions=dict(function=check_repo))
 

	
 
    rmap.connect('changelog_home', '/{repo_name:.*?}/changelog',
 
                controller='changelog', conditions=dict(function=check_repo))
 

	
 
    rmap.connect('changelog_summary_home', '/{repo_name:.*?}/changelog_summary',
 
                controller='changelog', action='changelog_summary',
 
                conditions=dict(function=check_repo))
 

	
 
    rmap.connect('changelog_file_home', '/{repo_name:.*?}/changelog/{revision}/{f_path:.*}',
 
                controller='changelog', f_path=None,
 
                conditions=dict(function=check_repo))
 

	
 
    rmap.connect('changelog_details', '/{repo_name:.*?}/changelog_details/{cs}',
 
                controller='changelog', action='changelog_details',
 
                conditions=dict(function=check_repo))
 

	
 
    rmap.connect('files_home', '/{repo_name:.*?}/files/{revision}/{f_path:.*}',
 
                controller='files', revision='tip', f_path='',
 
                conditions=dict(function=check_repo))
 

	
 
    rmap.connect('files_home_nopath', '/{repo_name:.*?}/files/{revision}',
 
                controller='files', revision='tip', f_path='',
 
                conditions=dict(function=check_repo))
 

	
 
    rmap.connect('files_history_home',
 
                 '/{repo_name:.*?}/history/{revision}/{f_path:.*}',
 
                 controller='files', action='history', revision='tip', f_path='',
 
                 conditions=dict(function=check_repo))
 

	
 
    rmap.connect('files_authors_home',
 
                 '/{repo_name:.*?}/authors/{revision}/{f_path:.*}',
 
                 controller='files', action='authors', revision='tip', f_path='',
 
                 conditions=dict(function=check_repo))
 

	
 
    rmap.connect('files_diff_home', '/{repo_name:.*?}/diff/{f_path:.*}',
 
                controller='files', action='diff', revision='tip', f_path='',
 
                conditions=dict(function=check_repo))
 

	
 
    rmap.connect('files_diff_2way_home', '/{repo_name:.*?}/diff-2way/{f_path:.+}',
 
                controller='files', action='diff_2way', revision='tip', f_path='',
 
                conditions=dict(function=check_repo))
 

	
 
    rmap.connect('files_rawfile_home',
 
                 '/{repo_name:.*?}/rawfile/{revision}/{f_path:.*}',
 
                 controller='files', action='rawfile', revision='tip',
 
                 f_path='', conditions=dict(function=check_repo))
 

	
 
    rmap.connect('files_raw_home',
 
                 '/{repo_name:.*?}/raw/{revision}/{f_path:.*}',
 
                 controller='files', action='raw', revision='tip', f_path='',
 
                 conditions=dict(function=check_repo))
 

	
 
    rmap.connect('files_annotate_home',
 
                 '/{repo_name:.*?}/annotate/{revision}/{f_path:.*}',
 
                 controller='files', action='index', revision='tip',
 
                 f_path='', annotate=True, conditions=dict(function=check_repo))
 

	
 
    rmap.connect('files_edit_home',
 
                 '/{repo_name:.*?}/edit/{revision}/{f_path:.*}',
 
                 controller='files', action='edit', revision='tip',
 
                 f_path='', conditions=dict(function=check_repo))
 

	
 
    rmap.connect('files_add_home',
 
                 '/{repo_name:.*?}/add/{revision}/{f_path:.*}',
 
                 controller='files', action='add', revision='tip',
 
                 f_path='', conditions=dict(function=check_repo))
 

	
 
    rmap.connect('files_delete_home',
 
                 '/{repo_name:.*?}/delete/{revision}/{f_path:.*}',
 
                 controller='files', action='delete', revision='tip',
 
                 f_path='', conditions=dict(function=check_repo))
 

	
 
    rmap.connect('files_archive_home', '/{repo_name:.*?}/archive/{fname}',
 
                controller='files', action='archivefile',
 
                conditions=dict(function=check_repo))
 

	
 
    rmap.connect('files_nodelist_home',
 
                 '/{repo_name:.*?}/nodelist/{revision}/{f_path:.*}',
 
                controller='files', action='nodelist',
 
                conditions=dict(function=check_repo))
 

	
 
    rmap.connect('repo_fork_create_home', '/{repo_name:.*?}/fork',
 
                controller='forks', action='fork_create',
 
                conditions=dict(function=check_repo, method=["POST"]))
 

	
 
    rmap.connect('repo_fork_home', '/{repo_name:.*?}/fork',
 
                controller='forks', action='fork',
 
                conditions=dict(function=check_repo))
 

	
 
    rmap.connect('repo_forks_home', '/{repo_name:.*?}/forks',
 
                 controller='forks', action='forks',
 
                 conditions=dict(function=check_repo))
 

	
 
    rmap.connect('repo_followers_home', '/{repo_name:.*?}/followers',
 
                 controller='followers', action='followers',
 
                 conditions=dict(function=check_repo))
 

	
 
    return rmap
 

	
 

	
 
class UrlGenerator(object):
 
    """Emulate pylons.url in providing a wrapper around routes.url
 

	
 
    This code was added during the migration from Pylons to TurboGears2. Pylons
    already provided a wrapper like this, but TurboGears2 does not.
 

	
 
    When the routing of Kallithea is changed to rely less on Routes and more on
    TurboGears2-style routing, this class may disappear or change.
 

	
 
    url() (the __call__ method) returns the URL based on a route name and
 
    arguments.
 
    url.current() returns the URL of the current page with arguments applied.
 

	
 
    Refer to documentation of Routes for details:
 
    https://routes.readthedocs.io/en/latest/generating.html#generation
 
    """
 
    def __call__(self, *args, **kwargs):
 
        return request.environ['routes.url'](*args, **kwargs)
 
    def current(self, *args, **kwargs):
 
        return request.environ['routes.url'].current(*args, **kwargs)
 

	
 
url = UrlGenerator()
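
For reference, a minimal usage sketch of the url object defined above. It is not part of this changeset: the function name example_links and its repo_name/group_name parameters are hypothetical, and the sketch assumes a TurboGears2 request context in which the mapper built by make_map() has been stored in request.environ['routes.url'].

    # Hypothetical illustration only - generating links from route names
    # defined in make_map() above.
    from kallithea.config.routing import url

    def example_links(repo_name, group_name):
        # url() builds the URL for a named route, filling in its path variables
        summary_link = url('summary_home', repo_name=repo_name)
        perms_link = url('edit_repo_group_perms', group_name=group_name)
        # url.current() returns the URL of the page currently being served,
        # with any extra arguments applied
        next_page = url.current(page=2)
        return summary_link, perms_link, next_page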
kallithea/controllers/admin/repo_groups.py
Show inline comments
 
# -*- coding: utf-8 -*-
 
# This program is free software: you can redistribute it and/or modify
 
# it under the terms of the GNU General Public License as published by
 
# the Free Software Foundation, either version 3 of the License, or
 
# (at your option) any later version.
 
#
 
# This program is distributed in the hope that it will be useful,
 
# but WITHOUT ANY WARRANTY; without even the implied warranty of
 
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 
# GNU General Public License for more details.
 
#
 
# You should have received a copy of the GNU General Public License
 
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
"""
 
kallithea.controllers.admin.repo_groups
 
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 

	
 
Repository groups controller for Kallithea
 

	
 
This file was forked by the Kallithea project in July 2014.
 
Original author and date, and relevant copyright and licensing information is below:
 
:created_on: Mar 23, 2010
 
:author: marcink
 
:copyright: (c) 2013 RhodeCode GmbH, and others.
 
:license: GPLv3, see LICENSE.md for more details.
 
"""
 

	
 
import logging
 
import traceback
 
import formencode
 
import itertools
 

	
 
from formencode import htmlfill
 

	
 
from tg import request, tmpl_context as c, app_globals
 
from tg.i18n import ugettext as _, ungettext
 
from webob.exc import HTTPFound, HTTPForbidden, HTTPNotFound, HTTPInternalServerError
 

	
 
import kallithea
 
from kallithea.config.routing import url
 
from kallithea.lib import helpers as h
 
from kallithea.lib.auth import LoginRequired, \
 
    HasRepoGroupPermissionLevelDecorator, HasRepoGroupPermissionLevel, \
 
    HasPermissionAny
 
from kallithea.lib.base import BaseController, render
 
from kallithea.model.db import RepoGroup, Repository
 
from kallithea.model.scm import RepoGroupList, AvailableRepoGroupChoices
 
from kallithea.model.repo_group import RepoGroupModel
 
from kallithea.model.forms import RepoGroupForm, RepoGroupPermsForm
 
from kallithea.model.meta import Session
 
from kallithea.model.repo import RepoModel
 
from kallithea.lib.utils2 import safe_int
 
from sqlalchemy.sql.expression import func
 

	
 

	
 
log = logging.getLogger(__name__)
 

	
 

	
 
class RepoGroupsController(BaseController):
 

	
 
    @LoginRequired()
 
    def _before(self, *args, **kwargs):
 
        super(RepoGroupsController, self)._before(*args, **kwargs)
 

	
 
    def __load_defaults(self, extras=(), exclude=()):
 
        """extras is used for keeping current parent ignoring permissions
 
        exclude is used for not moving group to itself TODO: also exclude descendants
 
        Note: only admin can create top level groups
 
        """
 
        repo_groups = AvailableRepoGroupChoices([], 'admin', extras)
 
        exclude_group_ids = set(rg.group_id for rg in exclude)
 
        c.repo_groups = [rg for rg in repo_groups
 
                         if rg[0] not in exclude_group_ids]
 

	
 
        repo_model = RepoModel()
 
        c.users_array = repo_model.get_users_js()
 
        c.user_groups_array = repo_model.get_user_groups_js()
 

	
 
    def __load_data(self, group_id):
 
        """
 
        Load default settings for edit and update
 

	
 
        :param group_id:
 
        """
 
        repo_group = RepoGroup.get_or_404(group_id)
 
        data = repo_group.get_dict()
 
        data['group_name'] = repo_group.name
 

	
 
        # fill repository group users
 
        for p in repo_group.repo_group_to_perm:
 
            data.update({'u_perm_%s' % p.user.username:
 
                             p.permission.permission_name})
 

	
 
        # fill repository group groups
 
        for p in repo_group.users_group_to_perm:
 
            data.update({'g_perm_%s' % p.users_group.users_group_name:
 
                             p.permission.permission_name})
 

	
 
        return data
 

	
 
    def _revoke_perms_on_yourself(self, form_result):
 
        _up = filter(lambda u: request.authuser.username == u[0],
 
                     form_result['perms_updates'])
 
        _new = filter(lambda u: request.authuser.username == u[0],
 
                      form_result['perms_new'])
 
        if _new and _new[0][1] != 'group.admin' or _up and _up[0][1] != 'group.admin':
 
            return True
 
        return False
 

	
 
    def index(self, format='html'):
 
        _list = RepoGroup.query(sorted=True).all()
 
        group_iter = RepoGroupList(_list, perm_level='admin')
 
        repo_groups_data = []
 
        total_records = len(group_iter)
 
        _tmpl_lookup = app_globals.mako_lookup
 
        template = _tmpl_lookup.get_template('data_table/_dt_elements.html')
 

	
 
        repo_group_name = lambda repo_group_name, children_groups: (
 
            template.get_def("repo_group_name")
 
            .render(repo_group_name, children_groups, _=_, h=h, c=c)
 
        )
 
        repo_group_actions = lambda repo_group_id, repo_group_name, gr_count: (
 
            template.get_def("repo_group_actions")
 
            .render(repo_group_id, repo_group_name, gr_count, _=_, h=h, c=c,
 
                    ungettext=ungettext)
 
        )
 

	
 
        for repo_gr in group_iter:
 
            children_groups = map(h.safe_unicode,
 
                itertools.chain((g.name for g in repo_gr.parents),
 
                                (x.name for x in [repo_gr])))
 
            repo_count = repo_gr.repositories.count()
 
            repo_groups_data.append({
 
                "raw_name": repo_gr.group_name,
 
                "group_name": repo_group_name(repo_gr.group_name, children_groups),
 
                "desc": h.escape(repo_gr.group_description),
 
                "repos": repo_count,
 
                "owner": h.person(repo_gr.owner),
 
                "action": repo_group_actions(repo_gr.group_id, repo_gr.group_name,
 
                                             repo_count)
 
            })
 

	
 
        c.data = {
 
            "totalRecords": total_records,
 
            "startIndex": 0,
 
            "sort": None,
 
            "dir": "asc",
 
            "records": repo_groups_data
 
        }
 

	
 
        return render('admin/repo_groups/repo_groups.html')
 

	
 
    def create(self):
 
        self.__load_defaults()
 

	
 
        # permissions for can create group based on parent_id are checked
 
        # here in the Form
 
        repo_group_form = RepoGroupForm(repo_groups=c.repo_groups)
 
        try:
 
            form_result = repo_group_form.to_python(dict(request.POST))
 
            gr = RepoGroupModel().create(
 
                group_name=form_result['group_name'],
 
                group_description=form_result['group_description'],
 
                parent=form_result['parent_group_id'],
 
                owner=request.authuser.user_id, # TODO: make editable
 
                copy_permissions=form_result['group_copy_permissions']
 
            )
 
            Session().commit()
 
            #TODO: in future action_logger(, '', '', '')
 
        except formencode.Invalid as errors:
 
            return htmlfill.render(
 
                render('admin/repo_groups/repo_group_add.html'),
 
                defaults=errors.value,
 
                errors=errors.error_dict or {},
 
                prefix_error=False,
 
                encoding="UTF-8",
 
                force_defaults=False)
 
        except Exception:
 
            log.error(traceback.format_exc())
 
            h.flash(_('Error occurred during creation of repository group %s') \
 
                    % request.POST.get('group_name'), category='error')
 
            parent_group_id = form_result['parent_group_id']
 
            #TODO: maybe we should get back to the main view, not the admin one
 
            raise HTTPFound(location=url('repos_groups', parent_group=parent_group_id))
 
        h.flash(_('Created repository group %s') % gr.group_name,
 
                category='success')
 
        raise HTTPFound(location=url('repos_group_home', group_name=gr.group_name))
 

	
 
    def new(self):
 
        if HasPermissionAny('hg.admin')('group create'):
 
            #we're a global admin, so we can create TOP level groups
 
            pass
 
        else:
 
            # the parent group is passed into the creation form, so we know
            # which group it would be and can check permissions here
 
            group_id = safe_int(request.GET.get('parent_group'))
 
            group = RepoGroup.get(group_id) if group_id else None
 
            group_name = group.group_name if group else None
 
            if HasRepoGroupPermissionLevel('admin')(group_name, 'group create'):
 
                pass
 
            else:
 
                raise HTTPForbidden()
 

	
 
        self.__load_defaults()
 
        return render('admin/repo_groups/repo_group_add.html')
 

	
 
    @HasRepoGroupPermissionLevelDecorator('admin')
 
    def update(self, group_name):
 
        c.repo_group = RepoGroup.guess_instance(group_name)
 
        self.__load_defaults(extras=[c.repo_group.parent_group],
 
                             exclude=[c.repo_group])
 

	
 
        # TODO: kill allow_empty_group - it is only used for redundant form validation!
 
        if HasPermissionAny('hg.admin')('group edit'):
 
            #we're a global admin, so we can create TOP level groups
 
            allow_empty_group = True
 
        elif not c.repo_group.parent_group:
 
            allow_empty_group = True
 
        else:
 
            allow_empty_group = False
 
        repo_group_form = RepoGroupForm(
 
            edit=True,
 
            old_data=c.repo_group.get_dict(),
 
            repo_groups=c.repo_groups,
 
            can_create_in_root=allow_empty_group,
 
        )()
 
        try:
 
            form_result = repo_group_form.to_python(dict(request.POST))
 

	
 
            new_gr = RepoGroupModel().update(group_name, form_result)
 
            Session().commit()
 
            h.flash(_('Updated repository group %s') \
 
                    % form_result['group_name'], category='success')
 
            # we now have new name !
 
            group_name = new_gr.group_name
 
            #TODO: in future action_logger(, '', '', '')
 
        except formencode.Invalid as errors:
 
            c.active = 'settings'
 
            return htmlfill.render(
 
                render('admin/repo_groups/repo_group_edit.html'),
 
                defaults=errors.value,
 
                errors=errors.error_dict or {},
 
                prefix_error=False,
 
                encoding="UTF-8",
 
                force_defaults=False)
 
        except Exception:
 
            log.error(traceback.format_exc())
 
            h.flash(_('Error occurred during update of repository group %s') \
 
                    % request.POST.get('group_name'), category='error')
 

	
 
        raise HTTPFound(location=url('edit_repo_group', group_name=group_name))
 

	
 
    @HasRepoGroupPermissionLevelDecorator('admin')
 
    def delete(self, group_name):
 
        gr = c.repo_group = RepoGroup.guess_instance(group_name)
 
        repos = gr.repositories.all()
 
        if repos:
 
            h.flash(_('This group contains %s repositories and cannot be '
 
                      'deleted') % len(repos), category='warning')
 
            raise HTTPFound(location=url('repos_groups'))
 

	
 
        children = gr.children.all()
 
        if children:
 
            h.flash(_('This group contains %s subgroups and cannot be deleted'
 
                      % (len(children))), category='warning')
 
            raise HTTPFound(location=url('repos_groups'))
 

	
 
        try:
 
            RepoGroupModel().delete(group_name)
 
            Session().commit()
 
            h.flash(_('Removed repository group %s') % group_name,
 
                    category='success')
 
            #TODO: in future action_logger(, '', '', '')
 
        except Exception:
 
            log.error(traceback.format_exc())
 
            h.flash(_('Error occurred during deletion of repository group %s')
 
                    % group_name, category='error')
 

	
 
        if gr.parent_group:
 
            raise HTTPFound(location=url('repos_group_home', group_name=gr.parent_group.group_name))
 
        raise HTTPFound(location=url('repos_groups'))
 

	
 
    def show_by_name(self, group_name):
 
        """
 
        This is a proxy that looks up the group by name and, if it exists,
        shows the corresponding group view
 
        """
 
        group_name = group_name.rstrip('/')
 
        id_ = RepoGroup.get_by_group_name(group_name)
 
        if id_:
 
            return self.show(group_name)
 
        raise HTTPNotFound
 

	
 
    @HasRepoGroupPermissionLevelDecorator('read')
 
    def show(self, group_name):
 
        c.active = 'settings'
 

	
 
        c.group = c.repo_group = RepoGroup.guess_instance(group_name)
 

	
 
        groups = RepoGroup.query(sorted=True).filter_by(parent_group=c.group).all()
 
        c.groups = self.scm_model.get_repo_groups(groups)
 

	
 
        repos_list = Repository.query(sorted=True).filter_by(group=c.group).all()
 
        repos_data = RepoModel().get_repos_as_dict(repos_list=repos_list,
 
                                                   admin=False, short_name=True)
 
        # data used to render the grid
 
        c.data = repos_data
 

	
 
        return render('admin/repo_groups/repo_group_show.html')
 

	
 
    @HasRepoGroupPermissionLevelDecorator('admin')
 
    def edit(self, group_name):
 
        c.active = 'settings'
 

	
 
        c.repo_group = RepoGroup.guess_instance(group_name)
 
        self.__load_defaults(extras=[c.repo_group.parent_group],
 
                             exclude=[c.repo_group])
 
        defaults = self.__load_data(c.repo_group.group_id)
 

	
 
        return htmlfill.render(
 
            render('admin/repo_groups/repo_group_edit.html'),
 
            defaults=defaults,
 
            encoding="UTF-8",
 
            force_defaults=False
 
        )
 

	
 
    @HasRepoGroupPermissionLevelDecorator('admin')
 
    def edit_repo_group_advanced(self, group_name):
 
        c.active = 'advanced'
 
        c.repo_group = RepoGroup.guess_instance(group_name)
 

	
 
        return render('admin/repo_groups/repo_group_edit.html')
 

	
 
    @HasRepoGroupPermissionLevelDecorator('admin')
 
    def edit_repo_group_perms(self, group_name):
 
        c.active = 'perms'
 
        c.repo_group = RepoGroup.guess_instance(group_name)
 
        self.__load_defaults()
 
        defaults = self.__load_data(c.repo_group.group_id)
 

	
 
        return htmlfill.render(
 
            render('admin/repo_groups/repo_group_edit.html'),
 
            defaults=defaults,
 
            encoding="UTF-8",
 
            force_defaults=False
 
        )
 

	
 
    @HasRepoGroupPermissionLevelDecorator('admin')
 
    def update_perms(self, group_name):
 
        """
 
        Update permissions for given repository group
 

	
 
        :param group_name:
 
        """
 

	
 
        c.repo_group = RepoGroup.guess_instance(group_name)
 
        valid_recursive_choices = ['none', 'repos', 'groups', 'all']
 
        form_result = RepoGroupPermsForm(valid_recursive_choices)().to_python(request.POST)
 
        if not request.authuser.is_admin:
 
            if self._revoke_perms_on_yourself(form_result):
 
                msg = _('Cannot revoke permission for yourself as admin')
 
                h.flash(msg, category='warning')
 
                raise HTTPFound(location=url('edit_repo_group_perms', group_name=group_name))
 
        recursive = form_result['recursive']
 
        # iterate over all members of this group (if in recursive mode) and
        # set the permissions - this can be a potentially heavy operation
 
        RepoGroupModel()._update_permissions(c.repo_group,
 
                                             form_result['perms_new'],
 
                                             form_result['perms_updates'],
 
                                             recursive)
 
        #TODO: implement this
 
        #action_logger(request.authuser, 'admin_changed_repo_permissions',
 
        #              repo_name, request.ip_addr)
 
        Session().commit()
 
        h.flash(_('Repository group permissions updated'), category='success')
 
        raise HTTPFound(location=url('edit_repo_group_perms', group_name=group_name))
 

	
 
    @HasRepoGroupPermissionLevelDecorator('admin')
 
    def delete_perms(self, group_name):
 
        try:
 
            obj_type = request.POST.get('obj_type')
 
            obj_id = None
 
            if obj_type == 'user':
 
                obj_id = safe_int(request.POST.get('user_id'))
 
            elif obj_type == 'user_group':
 
                obj_id = safe_int(request.POST.get('user_group_id'))
 

	
 
            if not request.authuser.is_admin:
 
                if obj_type == 'user' and request.authuser.user_id == obj_id:
 
                    msg = _('Cannot revoke permission for yourself as admin')
 
                    h.flash(msg, category='warning')
 
                    raise Exception('revoke admin permission on self')
 
            recursive = request.POST.get('recursive', 'none')
 
            if obj_type == 'user':
 
                RepoGroupModel().delete_permission(repo_group=group_name,
 
                                                   obj=obj_id, obj_type='user',
 
                                                   recursive=recursive)
 
            elif obj_type == 'user_group':
 
                RepoGroupModel().delete_permission(repo_group=group_name,
 
                                                   obj=obj_id,
 
                                                   obj_type='user_group',
 
                                                   recursive=recursive)
 

	
 
            Session().commit()
 
        except Exception:
 
            log.error(traceback.format_exc())
 
            h.flash(_('An error occurred during revoking of permission'),
 
                    category='error')
 
            raise HTTPInternalServerError()
kallithea/controllers/admin/user_groups.py
Show inline comments
 
# -*- coding: utf-8 -*-
 
# This program is free software: you can redistribute it and/or modify
 
# it under the terms of the GNU General Public License as published by
 
# the Free Software Foundation, either version 3 of the License, or
 
# (at your option) any later version.
 
#
 
# This program is distributed in the hope that it will be useful,
 
# but WITHOUT ANY WARRANTY; without even the implied warranty of
 
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 
# GNU General Public License for more details.
 
#
 
# You should have received a copy of the GNU General Public License
 
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
"""
 
kallithea.controllers.admin.user_groups
 
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 

	
 
User Groups crud controller
 

	
 
This file was forked by the Kallithea project in July 2014.
 
Original author and date, and relevant copyright and licensing information is below:
 
:created_on: Jan 25, 2011
 
:author: marcink
 
:copyright: (c) 2013 RhodeCode GmbH, and others.
 
:license: GPLv3, see LICENSE.md for more details.
 
"""
 

	
 
import logging
 
import traceback
 
import formencode
 

	
 
from formencode import htmlfill
 
from tg import request, tmpl_context as c, config, app_globals
 
from tg.i18n import ugettext as _
 
from webob.exc import HTTPFound
 

	
 
from sqlalchemy.orm import joinedload
 
from sqlalchemy.sql.expression import func
 
from webob.exc import HTTPInternalServerError
 

	
 
import kallithea
 
from kallithea.config.routing import url
 
from kallithea.lib import helpers as h
 
from kallithea.lib.exceptions import UserGroupsAssignedException, \
 
    RepoGroupAssignmentError
 
from kallithea.lib.utils2 import safe_unicode, safe_int
 
from kallithea.lib.auth import LoginRequired, \
 
    HasUserGroupPermissionLevelDecorator, HasPermissionAnyDecorator
 
from kallithea.lib.base import BaseController, render
 
from kallithea.model.scm import UserGroupList
 
from kallithea.model.user_group import UserGroupModel
 
from kallithea.model.repo import RepoModel
 
from kallithea.model.db import User, UserGroup, UserGroupToPerm, \
 
    UserGroupRepoToPerm, UserGroupRepoGroupToPerm
 
from kallithea.model.forms import UserGroupForm, UserGroupPermsForm, \
 
    CustomDefaultPermissionsForm
 
from kallithea.model.meta import Session
 
from kallithea.lib.utils import action_logger
 

	
 
log = logging.getLogger(__name__)
 

	
 

	
 
class UserGroupsController(BaseController):
 
    """REST Controller styled on the Atom Publishing Protocol"""
 

	
 
    @LoginRequired()
 
    def _before(self, *args, **kwargs):
 
        super(UserGroupsController, self)._before(*args, **kwargs)
 
        c.available_permissions = config['available_permissions']
 

	
 
    def __load_data(self, user_group_id):
 
        c.group_members_obj = sorted((x.user for x in c.user_group.members),
 
                                     key=lambda u: u.username.lower())
 

	
 
        c.group_members = [(x.user_id, x.username) for x in c.group_members_obj]
 
        c.available_members = sorted(((x.user_id, x.username) for x in
 
                                      User.query().all()),
 
                                     key=lambda u: u[1].lower())
 

	
 
    def __load_defaults(self, user_group_id):
 
        """
 
        Load default settings for edit and update
 

	
 
        :param user_group_id:
 
        """
 
        user_group = UserGroup.get_or_404(user_group_id)
 
        data = user_group.get_dict()
 
        return data
 

	
 
    def index(self, format='html'):
 
        _list = UserGroup.query() \
 
                        .order_by(func.lower(UserGroup.users_group_name)) \
 
                        .all()
 
        group_iter = UserGroupList(_list, perm_level='admin')
 
        user_groups_data = []
 
        total_records = len(group_iter)
 
        _tmpl_lookup = app_globals.mako_lookup
 
        template = _tmpl_lookup.get_template('data_table/_dt_elements.html')
 

	
 
        user_group_name = lambda user_group_id, user_group_name: (
 
            template.get_def("user_group_name")
 
            .render(user_group_id, user_group_name, _=_, h=h, c=c)
 
        )
 
        user_group_actions = lambda user_group_id, user_group_name: (
 
            template.get_def("user_group_actions")
 
            .render(user_group_id, user_group_name, _=_, h=h, c=c)
 
        )
 
        for user_gr in group_iter:
 

	
 
            user_groups_data.append({
 
                "raw_name": user_gr.users_group_name,
 
                "group_name": user_group_name(user_gr.users_group_id,
 
                                              user_gr.users_group_name),
 
                "desc": h.escape(user_gr.user_group_description),
 
                "members": len(user_gr.members),
 
                "active": h.boolicon(user_gr.users_group_active),
 
                "owner": h.person(user_gr.owner.username),
 
                "action": user_group_actions(user_gr.users_group_id, user_gr.users_group_name)
 
            })
 

	
 
        c.data = {
 
            "totalRecords": total_records,
 
            "startIndex": 0,
 
            "sort": None,
 
            "dir": "asc",
 
            "records": user_groups_data
 
        }
 

	
 
        return render('admin/user_groups/user_groups.html')
 

	
 
    @HasPermissionAnyDecorator('hg.admin', 'hg.usergroup.create.true')
 
    def create(self):
 
        users_group_form = UserGroupForm()()
 
        try:
 
            form_result = users_group_form.to_python(dict(request.POST))
 
            ug = UserGroupModel().create(name=form_result['users_group_name'],
 
                                         description=form_result['user_group_description'],
 
                                         owner=request.authuser.user_id,
 
                                         active=form_result['users_group_active'])
 

	
 
            gr = form_result['users_group_name']
 
            action_logger(request.authuser,
 
                          'admin_created_users_group:%s' % gr,
 
                          None, request.ip_addr)
 
            h.flash(h.literal(_('Created user group %s') % h.link_to(h.escape(gr), url('edit_users_group', id=ug.users_group_id))),
 
                category='success')
 
            Session().commit()
 
        except formencode.Invalid as errors:
 
            return htmlfill.render(
 
                render('admin/user_groups/user_group_add.html'),
 
                defaults=errors.value,
 
                errors=errors.error_dict or {},
 
                prefix_error=False,
 
                encoding="UTF-8",
 
                force_defaults=False)
 
        except Exception:
 
            log.error(traceback.format_exc())
 
            h.flash(_('Error occurred during creation of user group %s') \
 
                    % request.POST.get('users_group_name'), category='error')
 

	
 
        raise HTTPFound(location=url('users_groups'))
 

	
 
    @HasPermissionAnyDecorator('hg.admin', 'hg.usergroup.create.true')
 
    def new(self, format='html'):
 
        return render('admin/user_groups/user_group_add.html')
 

	
 
    @HasUserGroupPermissionLevelDecorator('admin')
 
    def update(self, id):
 
        c.user_group = UserGroup.get_or_404(id)
 
        c.active = 'settings'
 
        self.__load_data(id)
 

	
 
        available_members = [safe_unicode(x[0]) for x in c.available_members]
 

	
 
        users_group_form = UserGroupForm(edit=True,
 
                                         old_data=c.user_group.get_dict(),
 
                                         available_members=available_members)()
 

	
 
        try:
 
            form_result = users_group_form.to_python(request.POST)
 
            UserGroupModel().update(c.user_group, form_result)
 
            gr = form_result['users_group_name']
 
            action_logger(request.authuser,
 
                          'admin_updated_users_group:%s' % gr,
 
                          None, request.ip_addr)
 
            h.flash(_('Updated user group %s') % gr, category='success')
 
            Session().commit()
 
        except formencode.Invalid as errors:
 
            ug_model = UserGroupModel()
 
            defaults = errors.value
 
            e = errors.error_dict or {}
 
            defaults.update({
 
                'create_repo_perm': ug_model.has_perm(id,
 
                                                      'hg.create.repository'),
 
                'fork_repo_perm': ug_model.has_perm(id,
 
                                                    'hg.fork.repository'),
 
            })
 

	
 
            return htmlfill.render(
 
                render('admin/user_groups/user_group_edit.html'),
 
                defaults=defaults,
 
                errors=e,
 
                prefix_error=False,
 
                encoding="UTF-8",
 
                force_defaults=False)
 
        except Exception:
 
            log.error(traceback.format_exc())
 
            h.flash(_('Error occurred during update of user group %s') \
 
                    % request.POST.get('users_group_name'), category='error')
 

	
 
        raise HTTPFound(location=url('edit_users_group', id=id))
 

	
 
    @HasUserGroupPermissionLevelDecorator('admin')
 
    def delete(self, id):
 
        usr_gr = UserGroup.get_or_404(id)
 
        try:
 
            UserGroupModel().delete(usr_gr)
 
            Session().commit()
 
            h.flash(_('Successfully deleted user group'), category='success')
 
        except UserGroupsAssignedException as e:
 
            h.flash(e, category='error')
 
        except Exception:
 
            log.error(traceback.format_exc())
 
            h.flash(_('An error occurred during deletion of user group'),
 
                    category='error')
 
        raise HTTPFound(location=url('users_groups'))
 

	
 
    @HasUserGroupPermissionLevelDecorator('admin')
 
    def edit(self, id, format='html'):
 
        c.user_group = UserGroup.get_or_404(id)
 
        c.active = 'settings'
 
        self.__load_data(id)
 

	
 
        defaults = self.__load_defaults(id)
 

	
 
        return htmlfill.render(
 
            render('admin/user_groups/user_group_edit.html'),
 
            defaults=defaults,
 
            encoding="UTF-8",
 
            force_defaults=False
 
        )
 

	
 
    @HasUserGroupPermissionLevelDecorator('admin')
 
    def edit_perms(self, id):
 
        c.user_group = UserGroup.get_or_404(id)
 
        c.active = 'perms'
 

	
 
        repo_model = RepoModel()
 
        c.users_array = repo_model.get_users_js()
 
        c.user_groups_array = repo_model.get_user_groups_js()
 

	
 
        defaults = {}
 
        # fill user group users
 
        for p in c.user_group.user_user_group_to_perm:
 
            defaults.update({'u_perm_%s' % p.user.username:
 
                             p.permission.permission_name})
 

	
 
        for p in c.user_group.user_group_user_group_to_perm:
 
            defaults.update({'g_perm_%s' % p.user_group.users_group_name:
 
                             p.permission.permission_name})
 

	
 
        return htmlfill.render(
 
            render('admin/user_groups/user_group_edit.html'),
 
            defaults=defaults,
 
            encoding="UTF-8",
 
            force_defaults=False
 
        )
 

	
 
    @HasUserGroupPermissionLevelDecorator('admin')
 
    def update_perms(self, id):
 
        """
 
        grant permission for given usergroup
 

	
 
        :param id:
 
        """
 
        user_group = UserGroup.get_or_404(id)
 
        form = UserGroupPermsForm()().to_python(request.POST)
 

	
 
        # set the permissions !
 
        try:
 
            UserGroupModel()._update_permissions(user_group, form['perms_new'],
 
                                                 form['perms_updates'])
 
        except RepoGroupAssignmentError:
 
            h.flash(_('Target group cannot be the same'), category='error')
 
            raise HTTPFound(location=url('edit_user_group_perms', id=id))
 
        #TODO: implement this
 
        #action_logger(request.authuser, 'admin_changed_repo_permissions',
 
        #              repo_name, request.ip_addr)
 
        Session().commit()
 
        h.flash(_('User group permissions updated'), category='success')
 
        raise HTTPFound(location=url('edit_user_group_perms', id=id))
 

	
 
    @HasUserGroupPermissionLevelDecorator('admin')
 
    def delete_perms(self, id):
 
        try:
 
            obj_type = request.POST.get('obj_type')
 
            obj_id = None
 
            if obj_type == 'user':
 
                obj_id = safe_int(request.POST.get('user_id'))
 
            elif obj_type == 'user_group':
 
                obj_id = safe_int(request.POST.get('user_group_id'))
 

	
 
            if not request.authuser.is_admin:
 
                if obj_type == 'user' and request.authuser.user_id == obj_id:
 
                    msg = _('Cannot revoke permission for yourself as admin')
 
                    h.flash(msg, category='warning')
 
                    raise Exception('revoke admin permission on self')
 
            if obj_type == 'user':
 
                UserGroupModel().revoke_user_permission(user_group=id,
 
                                                        user=obj_id)
 
            elif obj_type == 'user_group':
 
                UserGroupModel().revoke_user_group_permission(target_user_group=id,
 
                                                              user_group=obj_id)
 
            Session().commit()
 
        except Exception:
 
            log.error(traceback.format_exc())
 
            h.flash(_('An error occurred during revoking of permission'),
 
                    category='error')
 
            raise HTTPInternalServerError()
 

	
 
    @HasUserGroupPermissionLevelDecorator('admin')
 
    def edit_default_perms(self, id):
 
        c.user_group = UserGroup.get_or_404(id)
 
        c.active = 'default_perms'
 

	
 
        permissions = {
 
            'repositories': {},
 
            'repositories_groups': {}
 
        }
 
        ugroup_repo_perms = UserGroupRepoToPerm.query() \
 
            .options(joinedload(UserGroupRepoToPerm.permission)) \
 
            .options(joinedload(UserGroupRepoToPerm.repository)) \
 
            .filter(UserGroupRepoToPerm.users_group_id == id) \
 
            .all()
 

	
 
        for gr in ugroup_repo_perms:
 
            permissions['repositories'][gr.repository.repo_name]  \
 
                = gr.permission.permission_name
 

	
 
        ugroup_group_perms = UserGroupRepoGroupToPerm.query() \
 
            .options(joinedload(UserGroupRepoGroupToPerm.permission)) \
 
            .options(joinedload(UserGroupRepoGroupToPerm.group)) \
 
            .filter(UserGroupRepoGroupToPerm.users_group_id == id) \
 
            .all()
 

	
 
        for gr in ugroup_group_perms:
 
            permissions['repositories_groups'][gr.group.group_name] \
 
                = gr.permission.permission_name
 
        c.permissions = permissions
 

	
 
        ug_model = UserGroupModel()
 

	
 
        defaults = c.user_group.get_dict()
 
        defaults.update({
 
            'create_repo_perm': ug_model.has_perm(c.user_group,
 
                                                  'hg.create.repository'),
 
            'create_user_group_perm': ug_model.has_perm(c.user_group,
 
                                                        'hg.usergroup.create.true'),
 
            'fork_repo_perm': ug_model.has_perm(c.user_group,
 
                                                'hg.fork.repository'),
 
        })
 

	
 
        return htmlfill.render(
 
            render('admin/user_groups/user_group_edit.html'),
 
            defaults=defaults,
 
            encoding="UTF-8",
 
            force_defaults=False
 
        )
 

	
 
    @HasUserGroupPermissionLevelDecorator('admin')
 
    def update_default_perms(self, id):
 
        user_group = UserGroup.get_or_404(id)
 

	
 
        try:
 
            form = CustomDefaultPermissionsForm()()
 
            form_result = form.to_python(request.POST)
 

	
 
            inherit_perms = form_result['inherit_default_permissions']
 
            user_group.inherit_default_permissions = inherit_perms
 
            usergroup_model = UserGroupModel()
 

	
 
            defs = UserGroupToPerm.query() \
 
                .filter(UserGroupToPerm.users_group == user_group) \
 
                .all()
 
            for ug in defs:
 
                Session().delete(ug)
 

	
 
            if form_result['create_repo_perm']:
 
                usergroup_model.grant_perm(id, 'hg.create.repository')
 
            else:
 
                usergroup_model.grant_perm(id, 'hg.create.none')
 
            if form_result['create_user_group_perm']:
 
                usergroup_model.grant_perm(id, 'hg.usergroup.create.true')
 
            else:
 
                usergroup_model.grant_perm(id, 'hg.usergroup.create.false')
 
            if form_result['fork_repo_perm']:
 
                usergroup_model.grant_perm(id, 'hg.fork.repository')
 
            else:
 
                usergroup_model.grant_perm(id, 'hg.fork.none')
 

	
 
            h.flash(_("Updated permissions"), category='success')
 
            Session().commit()
 
        except Exception:
 
            log.error(traceback.format_exc())
 
            h.flash(_('An error occurred during permissions saving'),
 
                    category='error')
 

	
 
        raise HTTPFound(location=url('edit_user_group_default_perms', id=id))
 

	
 
    @HasUserGroupPermissionLevelDecorator('admin')
 
    def edit_advanced(self, id):
 
        c.user_group = UserGroup.get_or_404(id)
 
        c.active = 'advanced'
 
        c.group_members_obj = sorted((x.user for x in c.user_group.members),
 
                                     key=lambda u: u.username.lower())
 
        return render('admin/user_groups/user_group_edit.html')
 

	
 

	
 
    @HasUserGroupPermissionLevelDecorator('admin')
 
    def edit_members(self, id):
 
        c.user_group = UserGroup.get_or_404(id)
 
        c.active = 'members'
 
        c.group_members_obj = sorted((x.user for x in c.user_group.members),
 
                                     key=lambda u: u.username.lower())
 

	
 
        c.group_members = [(x.user_id, x.username) for x in c.group_members_obj]
 
        return render('admin/user_groups/user_group_edit.html')
kallithea/controllers/admin/users.py
Show inline comments
 
# -*- coding: utf-8 -*-
 
# This program is free software: you can redistribute it and/or modify
 
# it under the terms of the GNU General Public License as published by
 
# the Free Software Foundation, either version 3 of the License, or
 
# (at your option) any later version.
 
#
 
# This program is distributed in the hope that it will be useful,
 
# but WITHOUT ANY WARRANTY; without even the implied warranty of
 
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 
# GNU General Public License for more details.
 
#
 
# You should have received a copy of the GNU General Public License
 
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
"""
 
kallithea.controllers.admin.users
 
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 

	
 
Users CRUD controller
 

	
 
This file was forked by the Kallithea project in July 2014.
 
Original author and date, and relevant copyright and licensing information is below:
 
:created_on: Apr 4, 2010
 
:author: marcink
 
:copyright: (c) 2013 RhodeCode GmbH, and others.
 
:license: GPLv3, see LICENSE.md for more details.
 
"""
 

	
 
import logging
 
import traceback
 
import formencode
 

	
 
from formencode import htmlfill
 
from tg import request, tmpl_context as c, config
 
from tg import request, tmpl_context as c, config, app_globals
 
from tg.i18n import ugettext as _
 
from sqlalchemy.sql.expression import func
 
from webob.exc import HTTPFound, HTTPNotFound
 

	
 
import kallithea
 
from kallithea.config.routing import url
 
from kallithea.lib.exceptions import DefaultUserException, \
 
    UserOwnsReposException, UserCreationError
 
from kallithea.lib import helpers as h
 
from kallithea.lib.auth import LoginRequired, HasPermissionAnyDecorator, \
 
    AuthUser
 
from kallithea.lib import auth_modules
 
from kallithea.lib.base import BaseController, render
 
from kallithea.model.api_key import ApiKeyModel
 

	
 
from kallithea.model.db import User, UserEmailMap, UserIpMap, UserToPerm
 
from kallithea.model.forms import UserForm, CustomDefaultPermissionsForm
 
from kallithea.model.user import UserModel
 
from kallithea.model.meta import Session
 
from kallithea.lib.utils import action_logger
 
from kallithea.lib.utils2 import datetime_to_time, safe_int, generate_api_key
 

	
 
log = logging.getLogger(__name__)
 

	
 

	
 
class UsersController(BaseController):
 
    """REST Controller styled on the Atom Publishing Protocol"""
 

	
 
    @LoginRequired()
 
    @HasPermissionAnyDecorator('hg.admin')
 
    def _before(self, *args, **kwargs):
 
        super(UsersController, self)._before(*args, **kwargs)
 
        c.available_permissions = config['available_permissions']
 

	
 
    def index(self, format='html'):
 
        c.users_list = User.query().order_by(User.username) \
 
                        .filter_by(is_default_user=False) \
 
                        .order_by(func.lower(User.username)) \
 
                        .all()
 

	
 
        users_data = []
 
        total_records = len(c.users_list)
 
        _tmpl_lookup = kallithea.CONFIG['pylons.app_globals'].mako_lookup
 
        _tmpl_lookup = app_globals.mako_lookup
 
        template = _tmpl_lookup.get_template('data_table/_dt_elements.html')
 

	
 
        grav_tmpl = '<div class="gravatar">%s</div>'
 

	
 
        username = lambda user_id, username: (
 
                template.get_def("user_name")
 
                .render(user_id, username, _=_, h=h, c=c))
 

	
 
        user_actions = lambda user_id, username: (
 
                template.get_def("user_actions")
 
                .render(user_id, username, _=_, h=h, c=c))
 

	
 
        for user in c.users_list:
 
            users_data.append({
 
                "gravatar": grav_tmpl % h.gravatar(user.email, size=20),
 
                "raw_name": user.username,
 
                "username": username(user.user_id, user.username),
 
                "firstname": h.escape(user.name),
 
                "lastname": h.escape(user.lastname),
 
                "last_login": h.fmt_date(user.last_login),
 
                "last_login_raw": datetime_to_time(user.last_login),
 
                "active": h.boolicon(user.active),
 
                "admin": h.boolicon(user.admin),
 
                "extern_type": user.extern_type,
 
                "extern_name": user.extern_name,
 
                "action": user_actions(user.user_id, user.username),
 
            })
 

	
 
        c.data = {
 
            "totalRecords": total_records,
 
            "startIndex": 0,
 
            "sort": None,
 
            "dir": "asc",
 
            "records": users_data
 
        }
 

	
 
        return render('admin/users/users.html')
 

	
 
    def create(self):
 
        c.default_extern_type = User.DEFAULT_AUTH_TYPE
 
        c.default_extern_name = ''
 
        user_model = UserModel()
 
        user_form = UserForm()()
 
        try:
 
            form_result = user_form.to_python(dict(request.POST))
 
            user = user_model.create(form_result)
 
            action_logger(request.authuser, 'admin_created_user:%s' % user.username,
 
                          None, request.ip_addr)
 
            h.flash(_('Created user %s') % user.username,
 
                    category='success')
 
            Session().commit()
 
        except formencode.Invalid as errors:
 
            return htmlfill.render(
 
                render('admin/users/user_add.html'),
 
                defaults=errors.value,
 
                errors=errors.error_dict or {},
 
                prefix_error=False,
 
                encoding="UTF-8",
 
                force_defaults=False)
 
        except UserCreationError as e:
 
            h.flash(e, 'error')
 
        except Exception:
 
            log.error(traceback.format_exc())
 
            h.flash(_('Error occurred during creation of user %s') \
 
                    % request.POST.get('username'), category='error')
 
        raise HTTPFound(location=url('edit_user', id=user.user_id))
 

	
 
    def new(self, format='html'):
 
        c.default_extern_type = User.DEFAULT_AUTH_TYPE
 
        c.default_extern_name = ''
 
        return render('admin/users/user_add.html')
 

	
 
    def update(self, id):
 
        user_model = UserModel()
 
        user = user_model.get(id)
 
        _form = UserForm(edit=True, old_data={'user_id': id,
 
                                              'email': user.email})()
 
        form_result = {}
 
        try:
 
            form_result = _form.to_python(dict(request.POST))
 
            skip_attrs = ['extern_type', 'extern_name',
 
                         ] + auth_modules.get_managed_fields(user)
 

	
 
            user_model.update(id, form_result, skip_attrs=skip_attrs)
 
            usr = form_result['username']
 
            action_logger(request.authuser, 'admin_updated_user:%s' % usr,
 
                          None, request.ip_addr)
 
            h.flash(_('User updated successfully'), category='success')
 
            Session().commit()
 
        except formencode.Invalid as errors:
 
            defaults = errors.value
 
            e = errors.error_dict or {}
 
            defaults.update({
 
                'create_repo_perm': user_model.has_perm(id,
 
                                                        'hg.create.repository'),
 
                'fork_repo_perm': user_model.has_perm(id, 'hg.fork.repository'),
 
            })
 
            return htmlfill.render(
 
                self._render_edit_profile(user),
 
                defaults=defaults,
 
                errors=e,
 
                prefix_error=False,
 
                encoding="UTF-8",
 
                force_defaults=False)
 
        except Exception:
 
            log.error(traceback.format_exc())
 
            h.flash(_('Error occurred during update of user %s') \
 
                    % form_result.get('username'), category='error')
 
        raise HTTPFound(location=url('edit_user', id=id))
 

	
 
    def delete(self, id):
 
        usr = User.get_or_404(id)
 
        try:
 
            UserModel().delete(usr)
 
            Session().commit()
 
            h.flash(_('Successfully deleted user'), category='success')
 
        except (UserOwnsReposException, DefaultUserException) as e:
 
            h.flash(e, category='warning')
 
        except Exception:
 
            log.error(traceback.format_exc())
 
            h.flash(_('An error occurred during deletion of user'),
 
                    category='error')
 
        raise HTTPFound(location=url('users'))
 

	
 
    def _get_user_or_raise_if_default(self, id):
 
        try:
 
            return User.get_or_404(id, allow_default=False)
 
        except DefaultUserException:
 
            h.flash(_("The default user cannot be edited"), category='warning')
 
            raise HTTPNotFound
 

	
 
    def _render_edit_profile(self, user):
 
        c.user = user
 
        c.active = 'profile'
 
        c.perm_user = AuthUser(dbuser=user)
 
        managed_fields = auth_modules.get_managed_fields(user)
 
        c.readonly = lambda n: 'readonly' if n in managed_fields else None
 
        return render('admin/users/user_edit.html')
 

	
 
    def edit(self, id, format='html'):
 
        user = self._get_user_or_raise_if_default(id)
 
        defaults = user.get_dict()
 

	
 
        return htmlfill.render(
 
            self._render_edit_profile(user),
 
            defaults=defaults,
 
            encoding="UTF-8",
 
            force_defaults=False)
 

	
 
    def edit_advanced(self, id):
 
        c.user = self._get_user_or_raise_if_default(id)
 
        c.active = 'advanced'
 
        c.perm_user = AuthUser(dbuser=c.user)
 

	
 
        umodel = UserModel()
 
        defaults = c.user.get_dict()
 
        defaults.update({
 
            'create_repo_perm': umodel.has_perm(c.user, 'hg.create.repository'),
 
            'create_user_group_perm': umodel.has_perm(c.user,
 
                                                      'hg.usergroup.create.true'),
 
            'fork_repo_perm': umodel.has_perm(c.user, 'hg.fork.repository'),
 
        })
 
        return htmlfill.render(
 
            render('admin/users/user_edit.html'),
 
            defaults=defaults,
 
            encoding="UTF-8",
 
            force_defaults=False)
 

	
 
    def edit_api_keys(self, id):
 
        c.user = self._get_user_or_raise_if_default(id)
 
        c.active = 'api_keys'
 
        show_expired = True
 
        c.lifetime_values = [
 
            (str(-1), _('Forever')),
 
            (str(5), _('5 minutes')),
 
            (str(60), _('1 hour')),
 
            (str(60 * 24), _('1 day')),
 
            (str(60 * 24 * 30), _('1 month')),
 
        ]
 
        c.lifetime_options = [(c.lifetime_values, _("Lifetime"))]
 
        c.user_api_keys = ApiKeyModel().get_api_keys(c.user.user_id,
 
                                                     show_expired=show_expired)
 
        defaults = c.user.get_dict()
 
        return htmlfill.render(
 
            render('admin/users/user_edit.html'),
 
            defaults=defaults,
 
            encoding="UTF-8",
 
            force_defaults=False)
 

	
 
    def add_api_key(self, id):
 
        c.user = self._get_user_or_raise_if_default(id)
 

	
 
        lifetime = safe_int(request.POST.get('lifetime'), -1)
 
        description = request.POST.get('description')
 
        ApiKeyModel().create(c.user.user_id, description, lifetime)
 
        Session().commit()
 
        h.flash(_("API key successfully created"), category='success')
 
        raise HTTPFound(location=url('edit_user_api_keys', id=c.user.user_id))
 

	
 
    def delete_api_key(self, id):
 
        c.user = self._get_user_or_raise_if_default(id)
 

	
 
        api_key = request.POST.get('del_api_key')
 
        if request.POST.get('del_api_key_builtin'):
 
            c.user.api_key = generate_api_key()
 
            Session().commit()
 
            h.flash(_("API key successfully reset"), category='success')
 
        elif api_key:
 
            ApiKeyModel().delete(api_key, c.user.user_id)
 
            Session().commit()
 
            h.flash(_("API key successfully deleted"), category='success')
 

	
 
        raise HTTPFound(location=url('edit_user_api_keys', id=c.user.user_id))
 

	
 
    def update_account(self, id):
 
        pass
 

	
 
    def edit_perms(self, id):
 
        c.user = self._get_user_or_raise_if_default(id)
 
        c.active = 'perms'
 
        c.perm_user = AuthUser(dbuser=c.user)
 

	
 
        umodel = UserModel()
 
        defaults = c.user.get_dict()
 
        defaults.update({
 
            'create_repo_perm': umodel.has_perm(c.user, 'hg.create.repository'),
 
            'create_user_group_perm': umodel.has_perm(c.user,
 
                                                      'hg.usergroup.create.true'),
 
            'fork_repo_perm': umodel.has_perm(c.user, 'hg.fork.repository'),
 
        })
 
        return htmlfill.render(
 
            render('admin/users/user_edit.html'),
 
            defaults=defaults,
 
            encoding="UTF-8",
 
            force_defaults=False)
 

	
 
    def update_perms(self, id):
 
        user = self._get_user_or_raise_if_default(id)
 

	
 
        try:
 
            form = CustomDefaultPermissionsForm()()
 
            form_result = form.to_python(request.POST)
 

	
 
            inherit_perms = form_result['inherit_default_permissions']
 
            user.inherit_default_permissions = inherit_perms
 
            user_model = UserModel()
 

	
 
            defs = UserToPerm.query() \
 
                .filter(UserToPerm.user == user) \
 
                .all()
 
            for ug in defs:
 
                Session().delete(ug)
 

	
 
            if form_result['create_repo_perm']:
 
                user_model.grant_perm(id, 'hg.create.repository')
 
            else:
 
                user_model.grant_perm(id, 'hg.create.none')
 
            if form_result['create_user_group_perm']:
 
                user_model.grant_perm(id, 'hg.usergroup.create.true')
 
            else:
 
                user_model.grant_perm(id, 'hg.usergroup.create.false')
 
            if form_result['fork_repo_perm']:
 
                user_model.grant_perm(id, 'hg.fork.repository')
 
            else:
 
                user_model.grant_perm(id, 'hg.fork.none')
 
            h.flash(_("Updated permissions"), category='success')
 
            Session().commit()
 
        except Exception:
 
            log.error(traceback.format_exc())
 
            h.flash(_('An error occurred during permissions saving'),
 
                    category='error')
 
        raise HTTPFound(location=url('edit_user_perms', id=id))
 

	
 
    def edit_emails(self, id):
 
        c.user = self._get_user_or_raise_if_default(id)
 
        c.active = 'emails'
 
        c.user_email_map = UserEmailMap.query() \
 
            .filter(UserEmailMap.user == c.user).all()
 

	
 
        defaults = c.user.get_dict()
 
        return htmlfill.render(
 
            render('admin/users/user_edit.html'),
 
            defaults=defaults,
 
            encoding="UTF-8",
 
            force_defaults=False)
 

	
 
    def add_email(self, id):
 
        user = self._get_user_or_raise_if_default(id)
 
        email = request.POST.get('new_email')
 
        user_model = UserModel()
 

	
 
        try:
 
            user_model.add_extra_email(id, email)
 
            Session().commit()
 
            h.flash(_("Added email %s to user") % email, category='success')
 
        except formencode.Invalid as error:
 
            msg = error.error_dict['email']
 
            h.flash(msg, category='error')
 
        except Exception:
 
            log.error(traceback.format_exc())
 
            h.flash(_('An error occurred during email saving'),
 
                    category='error')
 
        raise HTTPFound(location=url('edit_user_emails', id=id))
 

	
 
    def delete_email(self, id):
 
        user = self._get_user_or_raise_if_default(id)
 
        email_id = request.POST.get('del_email_id')
 
        user_model = UserModel()
 
        user_model.delete_extra_email(id, email_id)
 
        Session().commit()
 
        h.flash(_("Removed email from user"), category='success')
 
        raise HTTPFound(location=url('edit_user_emails', id=id))
 

	
 
    def edit_ips(self, id):
 
        c.user = self._get_user_or_raise_if_default(id)
 
        c.active = 'ips'
 
        c.user_ip_map = UserIpMap.query() \
 
            .filter(UserIpMap.user == c.user).all()
 

	
 
        c.inherit_default_ips = c.user.inherit_default_permissions
 
        c.default_user_ip_map = UserIpMap.query() \
 
            .filter(UserIpMap.user == User.get_default_user()).all()
 

	
 
        defaults = c.user.get_dict()
 
        return htmlfill.render(
 
            render('admin/users/user_edit.html'),
 
            defaults=defaults,
 
            encoding="UTF-8",
 
            force_defaults=False)
 

	
 
    def add_ip(self, id):
 
        ip = request.POST.get('new_ip')
 
        user_model = UserModel()
 

	
 
        try:
 
            user_model.add_extra_ip(id, ip)
 
            Session().commit()
 
            h.flash(_("Added IP address %s to user whitelist") % ip, category='success')
 
        except formencode.Invalid as error:
 
            msg = error.error_dict['ip']
 
            h.flash(msg, category='error')
 
        except Exception:
 
            log.error(traceback.format_exc())
 
            h.flash(_('An error occurred while adding IP address'),
 
                    category='error')
 

	
 
        if 'default_user' in request.POST:
 
            raise HTTPFound(location=url('admin_permissions_ips'))
 
        raise HTTPFound(location=url('edit_user_ips', id=id))
 

	
 
    def delete_ip(self, id):
 
        ip_id = request.POST.get('del_ip_id')
 
        user_model = UserModel()
 
        user_model.delete_extra_ip(id, ip_id)
 
        Session().commit()
 
        h.flash(_("Removed IP address from user whitelist"), category='success')
 

	
 
        if 'default_user' in request.POST:
 
            raise HTTPFound(location=url('admin_permissions_ips'))
 
        raise HTTPFound(location=url('edit_user_ips', id=id))
kallithea/controllers/api/__init__.py
Show inline comments
 
# -*- coding: utf-8 -*-
 
# This program is free software: you can redistribute it and/or modify
 
# it under the terms of the GNU General Public License as published by
 
# the Free Software Foundation, either version 3 of the License, or
 
# (at your option) any later version.
 
#
 
# This program is distributed in the hope that it will be useful,
 
# but WITHOUT ANY WARRANTY; without even the implied warranty of
 
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 
# GNU General Public License for more details.
 
#
 
# You should have received a copy of the GNU General Public License
 
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
"""
 
kallithea.controllers.api
 
~~~~~~~~~~~~~~~~~~~~~~~~~
 

	
 
JSON RPC controller
 

	
 
This file was forked by the Kallithea project in July 2014.
 
Original author and date, and relevant copyright and licensing information is below:
 
:created_on: Aug 20, 2011
 
:author: marcink
 
:copyright: (c) 2013 RhodeCode GmbH, and others.
 
:license: GPLv3, see LICENSE.md for more details.
 
"""
 

	
 
import inspect
 
import logging
 
import types
 
import traceback
 
import time
 
import itertools
 

	
 
from paste.response import replace_header
 
from pylons.controllers import WSGIController
 
from pylons.controllers.util import Response
 
from tg import request
 
from tg import Response, response, request, TGController
 

	
 
from webob.exc import HTTPError
 
from webob.exc import HTTPError, HTTPException, WSGIHTTPException
 

	
 
from kallithea.model.db import User
 
from kallithea.model import meta
 
from kallithea.lib.compat import json
 
from kallithea.lib.auth import AuthUser
 
from kallithea.lib.base import _get_ip_addr as _get_ip, _get_access_path
 
from kallithea.lib.utils2 import safe_unicode, safe_str
 

	
 
log = logging.getLogger('JSONRPC')
 

	
 

	
 
class JSONRPCError(BaseException):
 

	
 
    def __init__(self, message):
 
        self.message = message
 
        super(JSONRPCError, self).__init__()
 

	
 
    def __str__(self):
 
        return safe_str(self.message)
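
For reference, a hedged sketch of how an API method can surface failures through JSONRPCError so the client receives a meaningful error body (the method name, lookup and message are illustrative assumptions):

    # e.g. inside a controller deriving from the JSONRPCController below:
    def get_repo(self, apiuser, repoid):
        repo = Repository.get_by_repo_name(repoid)
        if repo is None:
            # the message ends up in the 'error' field of the JSON-RPC reply
            raise JSONRPCError("repository `%s` does not exist" % repoid)
        return {'repo_name': repo.repo_name}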
 

	
 

	
 
class JSONRPCErrorResponse(Response, Exception):
 
class JSONRPCErrorResponse(Response, HTTPException):
 
    """
 
    Generate a Response object with a JSON-RPC error body
 
    """
 

	
 
    def __init__(self, message=None, retid=None, code=None):
 
        HTTPException.__init__(self, message, self)
 
        Response.__init__(self,
 
                          body=json.dumps(dict(id=retid, result=None, error=message)),
 
                          json_body=dict(id=retid, result=None, error=message),
 
                          status=code,
 
                          content_type='application/json')
 

	
 

	
 
class JSONRPCController(WSGIController):
 
class JSONRPCController(TGController):
 
    """
 
     A WSGI-speaking JSON-RPC controller class
 

	
 
     See the specification:
 
     <http://json-rpc.org/wiki/specification>.
 

	
 
     Valid controller return values should be json-serializable objects.
 

	
 
     Sub-classes should catch their exceptions and raise JSONRPCError
 
     if they want to pass meaningful errors to the client.
 

	
 
     """
 

	
 
    def _get_ip_addr(self, environ):
 
        return _get_ip(environ)
 

	
 
    def _get_method_args(self):
 
        """
 
        Return `self._rpc_args` to dispatched controller method
 
        chosen by __call__
 
        """
 
        return self._rpc_args
 

	
 
    def __call__(self, environ, start_response):
 
    def _dispatch(self, state, remainder=None):
 
        """
 
        Parse the request body as JSON, look up the method on the
 
        controller and if it exists, dispatch to it.
 
        """
 
        try:
 
            return self._handle_request(environ, start_response)
 
        except JSONRPCErrorResponse as e:
 
            return e
 
        finally:
 
            meta.Session.remove()
 
        # Since we are here we should respond as JSON
 
        response.content_type = 'application/json'
 

	
 
    def _handle_request(self, environ, start_response):
 
        environ = state.request.environ
 
        start = time.time()
 
        ip_addr = request.ip_addr = self._get_ip_addr(environ)
 
        self._req_id = None
 
        if 'CONTENT_LENGTH' not in environ:
 
            log.debug("No Content-Length")
 
            raise JSONRPCErrorResponse(retid=self._req_id,
 
                                       message="No Content-Length in request")
 
        else:
 
            length = environ['CONTENT_LENGTH'] or 0
 
            length = int(environ['CONTENT_LENGTH'])
 
            log.debug('Content-Length: %s', length)
 

	
 
        if length == 0:
 
            raise JSONRPCErrorResponse(retid=self._req_id,
 
                                       message="Content-Length is 0")
 

	
 
        raw_body = environ['wsgi.input'].read(length)
 

	
 
        try:
 
            json_body = json.loads(raw_body)
 
        except ValueError as e:
 
            # catch JSON parse errors here
 
            raise JSONRPCErrorResponse(retid=self._req_id,
 
                                       message="JSON parse error ERR:%s RAW:%r"
 
                                                % (e, raw_body))
 

	
 
        # check AUTH based on API key
 
        try:
 
            self._req_api_key = json_body['api_key']
 
            self._req_id = json_body['id']
 
            self._req_method = json_body['method']
 
            self._request_params = json_body['args']
 
            if not isinstance(self._request_params, dict):
 
                self._request_params = {}
 

	
 
            log.debug('method: %s, params: %s',
 
                      self._req_method, self._request_params)
 
        except KeyError as e:
 
            raise JSONRPCErrorResponse(retid=self._req_id,
 
                                       message='Incorrect JSON query missing %s' % e)
 

	
 
        # check if we can find this session using api_key
 
        try:
 
            u = User.get_by_api_key(self._req_api_key)
 
            if u is None:
 
                raise JSONRPCErrorResponse(retid=self._req_id,
 
                                           message='Invalid API key')
 

	
 
            auth_u = AuthUser(dbuser=u)
 
            if not AuthUser.check_ip_allowed(auth_u, ip_addr):
 
                raise JSONRPCErrorResponse(retid=self._req_id,
 
                                           message='request from IP:%s not allowed' % (ip_addr,))
 
            else:
 
                log.info('Access for IP:%s allowed', ip_addr)
 

	
 
        except Exception as e:
 
            raise JSONRPCErrorResponse(retid=self._req_id,
 
                                       message='Invalid API key')
 

	
 
        self._error = None
 
        try:
 
            self._func = self._find_method()
 
        except AttributeError as e:
 
            raise JSONRPCErrorResponse(retid=self._req_id,
 
                                       message=str(e))
 

	
 
        # now that we have a method, add self._req_params to
 
        # self.kargs and dispatch control to WSGIController
 
        argspec = inspect.getargspec(self._func)
 
        arglist = argspec[0][1:]
 
        defaults = map(type, argspec[3] or [])
 
        default_empty = types.NotImplementedType
 

	
 
        # kw arguments required by this method
 
        func_kwargs = dict(itertools.izip_longest(reversed(arglist), reversed(defaults),
 
                                                  fillvalue=default_empty))
 

	
 
        # This is a little trick to inject the logged-in user for the permission

        # decorators to work: they expect the controller class to have an

        # 'authuser' attribute set.
 
        request.authuser = request.user = auth_u
 

	
 
        # This attribute must be the first parameter of a method that uses the

        # API key; it is translated to the authenticated user instance under that name.
 
        USER_SESSION_ATTR = 'apiuser'
 

	
 
        # get our arglist and check if we provided them as args
 
        for arg, default in func_kwargs.iteritems():
 
            if arg == USER_SESSION_ATTR:
 
                # USER_SESSION_ATTR is translated from the API key, and that was

                # already checked above, so we don't need to validate it here.
 
                continue
 

	
 
            # skip the required param check if its default value is
 
            # NotImplementedType (default_empty)
 
            if default == default_empty and arg not in self._request_params:
 
                raise JSONRPCErrorResponse(
 
                    retid=self._req_id,
 
                    message='Missing non optional `%s` arg in JSON DATA' % arg,
 
                )
 

	
 
        extra = set(self._request_params).difference(func_kwargs)
 
        if extra:
 
            raise JSONRPCErrorResponse(

                retid=self._req_id,

                message='Unknown %s arg in JSON DATA' %
                        ', '.join('`%s`' % arg for arg in extra),

            )
 

	
 
        self._rpc_args = {}
 

	
 
        self._rpc_args.update(self._request_params)
 

	
 
        self._rpc_args['action'] = self._req_method
 
        self._rpc_args['environ'] = environ
 
        self._rpc_args['start_response'] = start_response
 

	
 
        status = []
 
        headers = []
 
        exc_info = []
 

	
 
        def change_content(new_status, new_headers, new_exc_info=None):
 
            status.append(new_status)
 
            headers.extend(new_headers)
 
            exc_info.append(new_exc_info)
 

	
 
        output = WSGIController.__call__(self, environ, change_content)
 
        output = list(output) # expand iterator - just to ensure exact timing
 
        replace_header(headers, 'Content-Type', 'application/json')
 
        start_response(status[0], headers, exc_info[0])
 
        log.info('IP: %s Request to %s time: %.3fs' % (
 
            self._get_ip_addr(environ),
 
            safe_unicode(_get_access_path(environ)), time.time() - start)
 
        )
 
        return output
 

	
 
    def _dispatch_call(self):
 
        state.set_action(self._rpc_call, [])
 
        state.set_params(self._rpc_args)
 
        return state
 

	
 
    def _rpc_call(self, action, environ, **rpc_args):
 
        """
 
        Implement dispatch interface specified by WSGIController
 
        Call the specified RPC Method
 
        """
 
        raw_response = ''
 
        try:
 
            raw_response = self._inspect_call(self._func)
 
            raw_response = getattr(self, action)(**rpc_args)
 
            if isinstance(raw_response, HTTPError):
 
                self._error = str(raw_response)
 
        except JSONRPCError as e:
 
            self._error = safe_str(e)
 
        except Exception as e:
 
            log.error('Encountered unhandled exception: %s',
 
                      traceback.format_exc(),)
 
            json_exc = JSONRPCError('Internal server error')
 
            self._error = safe_str(json_exc)
 

	
 
        if self._error is not None:
 
            raw_response = None
 

	
 
        response = dict(id=self._req_id, result=raw_response, error=self._error)
 
        try:
 
            return json.dumps(response)
 
        except TypeError as e:
 
            log.error('API FAILED. Error encoding response: %s', e)
 
            return json.dumps(
 
                dict(
 
                    id=self._req_id,
 
                    result=None,
 
                    error="Error encoding response"
 
                )
 
            )
 

	
 
    def _find_method(self):
 
        """
 
        Return method named by `self._req_method` in controller if able
 
        """
 
        log.debug('Trying to find JSON-RPC method: %s', self._req_method)
 
        if self._req_method.startswith('_'):
 
            raise AttributeError("Method not allowed")
 

	
 
        try:
 
            func = getattr(self, self._req_method, None)
 
        except UnicodeEncodeError:
 
            raise AttributeError("Problem decoding unicode in requested "
 
                                 "method name.")
 

	
 
        if isinstance(func, types.MethodType):
 
            return func
 
        else:
 
            raise AttributeError("No such method: %s" % (self._req_method,))
kallithea/controllers/error.py
Show inline comments
 
# -*- coding: utf-8 -*-
 
# This program is free software: you can redistribute it and/or modify
 
# it under the terms of the GNU General Public License as published by
 
# the Free Software Foundation, either version 3 of the License, or
 
# (at your option) any later version.
 
#
 
# This program is distributed in the hope that it will be useful,
 
# but WITHOUT ANY WARRANTY; without even the implied warranty of
 
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 
# GNU General Public License for more details.
 
#
 
# You should have received a copy of the GNU General Public License
 
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
"""
 
kallithea.controllers.error
 
~~~~~~~~~~~~~~~~~~~~~~~~~~~
 

	
 
Kallithea error controller
 

	
 
This file was forked by the Kallithea project in July 2014.
 
Original author and date, and relevant copyright and licensing information is below:
 
:created_on: Dec 8, 2010
 
:author: marcink
 
:copyright: (c) 2013 RhodeCode GmbH, and others.
 
:license: GPLv3, see LICENSE.md for more details.
 
"""
 

	
 
import os
 
import cgi
 
import logging
 

	
 
from tg import tmpl_context as c, request, config
 
from tg import tmpl_context as c, request, config, expose
 
from tg.i18n import ugettext as _
 
from pylons.middleware import media_path
 

	
 
from kallithea.lib.base import BaseController, render
 

	
 
log = logging.getLogger(__name__)
 

	
 

	
 
class ErrorController(BaseController):
 
    """Generates error documents as and when they are required.
 

	
 
    The ErrorDocuments middleware forwards to ErrorController when error
 
    related status codes are returned from the application.
 

	
 
    This behavior can be altered by changing the parameters to the
 
    ErrorDocuments middleware in your config/middleware.py file.
 
    """
 

	
 
    def _before(self, *args, **kwargs):
 
        # disable all base actions since we don't need them here
 
        pass
 

	
 
    def document(self):
 
        resp = request.environ.get('pylons.original_response')
 
    @expose('/errors/error_document.html')
 
    def document(self, *args, **kwargs):
 
        resp = request.environ.get('tg.original_response')
 
        c.site_name = config.get('title')
 

	
 
        log.debug('### %s ###', resp and resp.status or 'no response')
 

	
 
        e = request.environ
 
        c.serv_p = r'%(protocol)s://%(host)s/' % {
 
            'protocol': e.get('wsgi.url_scheme'),
 
            'host': e.get('HTTP_HOST'), }
 
        if resp:
 
            c.error_message = cgi.escape(request.GET.get('code',
 
                                                         str(resp.status)))
 
            c.error_explanation = self.get_error_explanation(resp.status_int)
 
        else:
 
            c.error_message = _('No response')
 
            c.error_explanation = _('Unknown error')
 

	
 
        return render('/errors/error_document.html')
 
        return dict()
 

	
 
    def get_error_explanation(self, code):
 
        """ get the error explanations of int codes
 
            [400, 401, 403, 404, 500]"""
 
        try:
 
            code = int(code)
 
        except ValueError:
 
            code = 500
 

	
 
        if code == 400:
 
            return _('The request could not be understood by the server'
 
                     ' due to malformed syntax.')
 
        if code == 401:
 
            return _('Unauthorized access to resource')
 
        if code == 403:
 
            return _("You don't have permission to view this page")
 
        if code == 404:
 
            return _('The resource could not be found')
 
        if code == 500:
 
            return _('The server encountered an unexpected condition'
 
                     ' which prevented it from fulfilling the request.')
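
The @expose decorator plus `return dict()` used in the error controller above is the standard TurboGears2 pattern: the exposed template is rendered with the returned dict as its context. A minimal standalone sketch of the same idea, with made-up controller and template names:

from tg import TGController, expose

class HelloController(TGController):
    @expose('kallithea.templates.hello')  # hypothetical template
    def index(self):
        # TurboGears2 renders the exposed template using this dict as context
        return dict(greeting='Hello from TurboGears2')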
kallithea/controllers/root.py
Show inline comments
 
new file 100644
 
# -*- coding: utf-8 -*-
 
# This program is free software: you can redistribute it and/or modify
 
# it under the terms of the GNU General Public License as published by
 
# the Free Software Foundation, either version 3 of the License, or
 
# (at your option) any later version.
 
#
 
# This program is distributed in the hope that it will be useful,
 
# but WITHOUT ANY WARRANTY; without even the implied warranty of
 
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 
# GNU General Public License for more details.
 
#
 
# You should have received a copy of the GNU General Public License
 
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
from tgext.routes import RoutedController
 
from kallithea.lib.base import BaseController
 
from kallithea.controllers.error import ErrorController
 

	
 

	
 
# With TurboGears, the RootController is the controller from which all routing
 
# starts from. It is 'magically' found based on the fact that a controller
 
# 'foo' is expected to have a class name FooController, located in a file
 
# foo.py, inside config['paths']['controllers']. The name 'root' for the root
 
# controller is the default name. The dictionary config['paths'] determines the
 
# directories where templates, static files and controllers are found. It is
 
# set up in tg.AppConfig based on AppConfig['package'] ('kallithea') and the
 
# respective defaults 'templates', 'public' and 'controllers'.
 
# Inherit from RoutedController to allow Kallithea to use regex-based routing.
 
class RootController(RoutedController, BaseController):
 

	
 
    # the following assignment hooks in error handling
 
    error = ErrorController()
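
The comment block above describes TurboGears2 object dispatch; a hedged sketch of how that works in general (the names are hypothetical, and Kallithea itself keeps its regex-based routes via RoutedController):

from tg import TGController, expose

class PingController(TGController):
    @expose('json')
    def index(self):
        return dict(status='ok')

class DemoRootController(TGController):
    # object dispatch: requests under /ping are handled by PingController
    ping = PingController()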
kallithea/lib/app_globals.py
Show inline comments
 
# -*- coding: utf-8 -*-
 
# This program is free software: you can redistribute it and/or modify
 
# it under the terms of the GNU General Public License as published by
 
# the Free Software Foundation, either version 3 of the License, or
 
# (at your option) any later version.
 
#
 
# This program is distributed in the hope that it will be useful,
 
# but WITHOUT ANY WARRANTY; without even the implied warranty of
 
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 
# GNU General Public License for more details.
 
#
 
# You should have received a copy of the GNU General Public License
 
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
 

	
 
"""
 
kallithea.lib.app_globals
 
~~~~~~~~~~~~~~~~~~~~~~~~~
 

	
 
The application's Globals object
 

	
 
This file was forked by the Kallithea project in July 2014.
 
Original author and date, and relevant copyright and licensing information is below:
 
:created_on: Oct 06, 2010
 
:author: marcink
 
:copyright: (c) 2013 RhodeCode GmbH, and others.
 
:license: GPLv3, see LICENSE.md for more details.
 
"""
 

	
 
from beaker.cache import CacheManager
 
from beaker.util import parse_cache_config_options
 
import tg
 
from tg import config
 

	
 

	
 
class Globals(object):
 
    """
 
    Globals acts as a container for objects available throughout the
 
    life of the application
 
    """
 

	
 
    def __init__(self, config):
 
    def __init__(self):
 
        """One instance of Globals is created during application
 
        initialization and is available during requests via the
 
        'app_globals' variable
 

	
 
        """
 
        self.cache = CacheManager(**parse_cache_config_options(config))
 
        self.available_permissions = None   # propagated after init_model
 

	
 
    @property
 
    def cache(self):
 
        return tg.cache
 

	
 
    @property
 
    def mako_lookup(self):
 
        return config['render_functions']['mako'].normal_loader
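
For context, this mako_lookup property is what lets controller code elsewhere in this changeset fetch templates through app_globals; a short usage sketch (the template name is the one from the users controller above):

from tg import app_globals

# inside a request, e.g. in a controller method:
template = app_globals.mako_lookup.get_template('data_table/_dt_elements.html')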
kallithea/lib/base.py
Show inline comments
 
# -*- coding: utf-8 -*-
 
# This program is free software: you can redistribute it and/or modify
 
# it under the terms of the GNU General Public License as published by
 
# the Free Software Foundation, either version 3 of the License, or
 
# (at your option) any later version.
 
#
 
# This program is distributed in the hope that it will be useful,
 
# but WITHOUT ANY WARRANTY; without even the implied warranty of
 
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 
# GNU General Public License for more details.
 
#
 
# You should have received a copy of the GNU General Public License
 
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
 

	
 
"""
 
kallithea.lib.base
 
~~~~~~~~~~~~~~~~~~
 

	
 
The base Controller API
 
Provides the BaseController class for subclassing and use in the different

controllers.
 

	
 
This file was forked by the Kallithea project in July 2014.
 
Original author and date, and relevant copyright and licensing information is below:
 
:created_on: Oct 06, 2010
 
:author: marcink
 
:copyright: (c) 2013 RhodeCode GmbH, and others.
 
:license: GPLv3, see LICENSE.md for more details.
 
"""
 

	
 
import datetime
 
import decorator
 
import logging
 
import time
 
import traceback
 
import warnings
 

	
 
import webob.exc
 
import paste.httpexceptions
 
import paste.auth.basic
 
import paste.httpheaders
 
from webhelpers.pylonslib import secure_form
 

	
 
from tg import config, tmpl_context as c, request, response, session
 
from pylons.controllers import WSGIController
 
from pylons.templating import render_mako as render  # don't remove this import
 
from tg import config, tmpl_context as c, request, response, session, render_template
 
from tg import TGController
 
from tg.i18n import ugettext as _
 

	
 
from kallithea import __version__, BACKENDS
 

	
 
from kallithea.config.routing import url
 
from kallithea.lib.utils2 import str2bool, safe_unicode, AttributeDict, \
 
    safe_str, safe_int
 
from kallithea.lib import auth_modules
 
from kallithea.lib.auth import AuthUser, HasPermissionAnyMiddleware
 
from kallithea.lib.compat import json
 
from kallithea.lib.utils import get_repo_slug
 
from kallithea.lib.exceptions import UserCreationError
 
from kallithea.lib.vcs.exceptions import RepositoryError, EmptyRepositoryError, ChangesetDoesNotExistError
 
from kallithea.model import meta
 

	
 
from kallithea.model.db import PullRequest, Repository, Ui, User, Setting
 
from kallithea.model.notification import NotificationModel
 
from kallithea.model.scm import ScmModel
 

	
 
log = logging.getLogger(__name__)
 

	
 

	
 
def render(template_path):
 
    return render_template({'url': url}, 'mako', template_path)
 

	
 

	
 
def _filter_proxy(ip):
 
    """
 
    Forwarding headers can contain multiple IP addresses: the left-most is the

    original client, and each successive proxy that passed the request appends

    the IP address from which it received the request.
 

	
 
    :param ip:
 
    """
 
    if ',' in ip:
 
        _ips = ip.split(',')
 
        _first_ip = _ips[0].strip()
 
        log.debug('Got multiple IPs %s, using %s', ','.join(_ips), _first_ip)
 
        return _first_ip
 
    return ip
 

	
 

	
 
def _get_ip_addr(environ):
 
    proxy_key = 'HTTP_X_REAL_IP'
 
    proxy_key2 = 'HTTP_X_FORWARDED_FOR'
 
    def_key = 'REMOTE_ADDR'
 

	
 
    ip = environ.get(proxy_key)
 
    if ip:
 
        return _filter_proxy(ip)
 

	
 
    ip = environ.get(proxy_key2)
 
    if ip:
 
        return _filter_proxy(ip)
 

	
 
    ip = environ.get(def_key, '0.0.0.0')
 
    return _filter_proxy(ip)
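
A quick sketch of how the two helpers above behave, using made-up addresses:

environ = {
    'HTTP_X_FORWARDED_FOR': '203.0.113.7, 10.0.0.1',  # left-most entry is the client
    'REMOTE_ADDR': '10.0.0.1',
}
assert _get_ip_addr(environ) == '203.0.113.7'
assert _get_ip_addr({'REMOTE_ADDR': '10.0.0.1'}) == '10.0.0.1'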
 

	
 

	
 
def _get_access_path(environ):
 
    path = environ.get('PATH_INFO')
 
    org_req = environ.get('pylons.original_request')
 
    org_req = environ.get('tg.original_request')
 
    if org_req:
 
        path = org_req.environ.get('PATH_INFO')
 
    return path
 

	
 

	
 
def log_in_user(user, remember, is_external_auth):
 
    """
 
    Log a `User` in and update session and cookies. If `remember` is True,
 
    the session cookie is set to expire in a year; otherwise, it expires at
 
    the end of the browser session.
 

	
 
    Returns populated `AuthUser` object.
 
    """
 
    user.update_lastlogin()
 
    meta.Session().commit()
 

	
 
    auth_user = AuthUser(dbuser=user,
 
                         is_external_auth=is_external_auth)
 
    # It should not be possible to explicitly log in as the default user.
 
    assert not auth_user.is_default_user
 
    auth_user.is_authenticated = True
 

	
 
    # Start new session to prevent session fixation attacks.
 
    session.invalidate()
 
    session['authuser'] = cookie = auth_user.to_cookie()
 

	
 
    # If they want to be remembered, update the cookie.
 
    # NOTE: Assumes that beaker defaults to browser session cookie.
 
    if remember:
 
        t = datetime.datetime.now() + datetime.timedelta(days=365)
 
        session._set_cookie_expires(t)
 

	
 
    session.save()
 

	
 
    log.info('user %s is now authenticated and stored in '
 
             'session, session attrs %s', user.username, cookie)
 

	
 
    # dumps session attrs back to cookie
 
    session._update_cookie_out()
 

	
 
    return auth_user
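
A hedged sketch of a caller of log_in_user; the user lookup and the remember flag are assumptions, only the signature comes from the function above:

db_user = User.get_by_username('john')  # assumed lookup of an active, non-default user
auth_user = log_in_user(db_user, remember=True, is_external_auth=False)
# auth_user is now authenticated and the (possibly long-lived) session cookie is set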
 

	
 

	
 
class BasicAuth(paste.auth.basic.AuthBasicAuthenticator):
 

	
 
    def __init__(self, realm, authfunc, auth_http_code=None):
 
        self.realm = realm
 
        self.authfunc = authfunc
 
        self._rc_auth_http_code = auth_http_code
 

	
 
    def build_authentication(self):
 
        head = paste.httpheaders.WWW_AUTHENTICATE.tuples('Basic realm="%s"' % self.realm)
 
        if self._rc_auth_http_code and self._rc_auth_http_code == '403':
 
            # return 403 if alternative http return code is specified in
 
            # Kallithea config
 
            return paste.httpexceptions.HTTPForbidden(headers=head)
 
        return paste.httpexceptions.HTTPUnauthorized(headers=head)
 

	
 
    def authenticate(self, environ):
 
        authorization = paste.httpheaders.AUTHORIZATION(environ)
 
        if not authorization:
 
            return self.build_authentication()
 
        (authmeth, auth) = authorization.split(' ', 1)
 
        if 'basic' != authmeth.lower():
 
            return self.build_authentication()
 
        auth = auth.strip().decode('base64')
 
        _parts = auth.split(':', 1)
 
        if len(_parts) == 2:
 
            username, password = _parts
 
            if self.authfunc(username, password, environ) is not None:
 
                return username
 
        return self.build_authentication()
 

	
 
    __call__ = authenticate
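
A small sketch of BasicAuth in isolation, with a dummy authfunc and dummy credentials (both assumptions); on success it returns the username, otherwise a challenge/deny response object:

import base64  # Python 2, like the surrounding code

def dummy_authfunc(username, password, environ):
    # stand-in for auth_modules.authenticate: a non-None return means success
    return username if (username, password) == ('john', 'secret') else None

basic = BasicAuth('Kallithea', dummy_authfunc)
# assumes paste reads the Authorization header from HTTP_AUTHORIZATION in the environ
environ = {'HTTP_AUTHORIZATION': 'Basic ' + base64.b64encode('john:secret')}
assert basic(environ) == 'john'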
 

	
 

	
 
class BaseVCSController(object):
 
    """Base controller for handling Mercurial/Git protocol requests
 
    (coming from a VCS client, and not a browser).
 
    """
 

	
 
    def __init__(self, application, config):
 
        self.application = application
 
        self.config = config
 
        # base path of repo locations
 
        self.basepath = self.config['base_path']
 
        # authenticate this VCS request using the authentication modules
 
        self.authenticate = BasicAuth('', auth_modules.authenticate,
 
                                      config.get('auth_ret_code'))
 

	
 
    def _authorize(self, environ, start_response, action, repo_name, ip_addr):
 
        """Authenticate and authorize user.
 

	
 
        Since we're dealing with a VCS client and not a browser, we only
 
        support HTTP basic authentication, either directly via raw header
 
        inspection, or by using container authentication to delegate the
 
        authentication to the web server.
 

	
 
        Returns (user, None) on successful authentication and authorization.
 
        Returns (None, wsgi_app) to send the wsgi_app response to the client.
 
        """
 
        # Check if anonymous access is allowed.
 
        default_user = User.get_default_user(cache=True)
 
        is_default_user_allowed = (default_user.active and
 
            self._check_permission(action, default_user, repo_name, ip_addr))
 
        if is_default_user_allowed:
 
            return default_user, None
 

	
 
        if not default_user.active:
 
            log.debug('Anonymous access is disabled')
 
        else:
 
            log.debug('Not authorized to access this '
 
                      'repository as anonymous user')
 

	
 
        username = None
 
        #==============================================================
 
        # DEFAULT PERM FAILED OR ANONYMOUS ACCESS IS DISABLED SO WE
 
        # NEED TO AUTHENTICATE AND ASK FOR AUTH USER PERMISSIONS
 
        #==============================================================
 

	
 
        # try to auth based on environ, container auth methods
 
        log.debug('Running PRE-AUTH for container based authentication')
 
        pre_auth = auth_modules.authenticate('', '', environ)
 
        if pre_auth is not None and pre_auth.get('username'):
 
            username = pre_auth['username']
 
        log.debug('PRE-AUTH got %s as username', username)
 

	
 
        # If not authenticated by the container, running basic auth
 
        if not username:
 
            self.authenticate.realm = safe_str(self.config['realm'])
 
            result = self.authenticate(environ)
 
            if isinstance(result, str):
 
                paste.httpheaders.AUTH_TYPE.update(environ, 'basic')
 
                paste.httpheaders.REMOTE_USER.update(environ, result)
 
                username = result
 
            else:
 
                return None, result.wsgi_application
 

	
 
        #==============================================================
 
        # CHECK PERMISSIONS FOR THIS REQUEST USING GIVEN USERNAME
 
        #==============================================================
 
        try:
 
            user = User.get_by_username_or_email(username)
 
            if user is None or not user.active:
 
                return None, webob.exc.HTTPForbidden()
 
        except Exception:
 
            log.error(traceback.format_exc())
 
            return None, webob.exc.HTTPInternalServerError()
 

	
 
        #check permissions for this repository
 
        perm = self._check_permission(action, user, repo_name, ip_addr)
 
        if not perm:
 
            return None, webob.exc.HTTPForbidden()
 

	
 
        return user, None
 

	
 
    def _handle_request(self, environ, start_response):
 
        raise NotImplementedError()
 

	
 
    def _get_by_id(self, repo_name):
 
        """
 
        Get the special pattern _<ID> from the clone URL and try to replace it
 
        with the repository name, to support _<ID> permanent URLs
 

	
 
        :param repo_name:
 
        """
 

	
 
        data = repo_name.split('/')
 
        if len(data) >= 2:
 
            from kallithea.lib.utils import get_repo_by_id
 
            by_id_match = get_repo_by_id(repo_name)
 
            if by_id_match:
 
                data[1] = safe_str(by_id_match)
 

	
 
        return '/'.join(data)
 

	
 
    def _invalidate_cache(self, repo_name):
 
        """
 
        Sets cache for this repository for invalidation on next access
 

	
 
        :param repo_name: full repo name, also a cache key
 
        """
 
        ScmModel().mark_for_invalidation(repo_name)
 

	
 
    def _check_permission(self, action, user, repo_name, ip_addr=None):
 
        """
 
        Check permissions for the given action (push/pull), user and
 
        repository name
 

	
 
        :param action: push or pull action
 
        :param user: `User` instance
 
        :param repo_name: repository name
 
        """
 
        # check IP
 
        ip_allowed = AuthUser.check_ip_allowed(user, ip_addr)
 
        if ip_allowed:
 
            log.info('Access for IP:%s allowed', ip_addr)
 
        else:
 
            return False
 

	
 
        if action == 'push':
 
            if not HasPermissionAnyMiddleware('repository.write',
 
                                              'repository.admin')(user,
 
                                                                  repo_name):
 
                return False
 

	
 
        else:
 
            # any other action needs at least read permission
 
            if not HasPermissionAnyMiddleware('repository.read',
 
                                              'repository.write',
 
                                              'repository.admin')(user,
 
                                                                  repo_name):
 
                return False
 

	
 
        return True
 

	
 
    def _get_ip_addr(self, environ):
 
        return _get_ip_addr(environ)
 

	
 
    def _check_locking_state(self, environ, action, repo, user_id):
 
        """
 
        Check locking on this repository: if locking is enabled and a lock is
 
        present, return a tuple of (make_lock, locked, locked_by).
 
        make_lock can have 3 states: None (do nothing), True (make a lock) and
 
        False (release the lock). This value is later propagated to the hooks,
 
        which do the actual locking; think of it as a signal telling the hooks what to do.
 

	
 
        """
 
        locked = False  # defines that locked error should be thrown to user
 
        make_lock = None
 
        repo = Repository.get_by_repo_name(repo)
 
        user = User.get(user_id)
 

	
 
        # This is kind of hacky, but because of how Mercurial handles the
 
        # client-server protocol, the server sees operations on changesets,
 
        # bookmarks, phases and obsolescence markers in different
 
        # transactions; we don't want to check locking on those.
 
        obsolete_call = environ['QUERY_STRING'] in ['cmd=listkeys',]
 
        locked_by = repo.locked
 
        if repo and repo.enable_locking and not obsolete_call:
 
            if action == 'push':
 
                # check if it's already locked; if it is, compare users
 
                user_id, _date = repo.locked
 
                if user.user_id == user_id:
 
                    log.debug('Got push from user %s, now unlocking', user)
 
                    # unlock if we have push from user who locked
 
                    make_lock = False
 
                else:
 
                    # we're not the same user who locked it, so reject with HTTP 423
 
                    locked = True
 
            if action == 'pull':
 
                if repo.locked[0] and repo.locked[1]:
 
                    locked = True
 
                else:
 
                    log.debug('Setting lock on repo %s by %s', repo, user)
 
                    make_lock = True
 

	
 
        else:
 
            log.debug('Repository %s does not have locking enabled', repo)
 
        log.debug('FINAL locking values make_lock:%s,locked:%s,locked_by:%s',
 
                  make_lock, locked, locked_by)
 
        return make_lock, locked, locked_by
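    # Illustration (hypothetical caller, added for clarity): code using
    # _check_locking_state() is expected to act roughly like this, where
    # raise_locked() stands for a hypothetical helper that returns the
    # configured lock_ret_code (e.g. HTTP 423):
    #
    #   make_lock, locked, locked_by = self._check_locking_state(
    #       environ, action, repo_name, user_id)
    #   if locked:
    #       raise_locked(locked_by)
    #   # otherwise make_lock (True/False/None) is propagated to the hooks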
 

	
 
    def __call__(self, environ, start_response):
 
        start = time.time()
 
        try:
 
            return self._handle_request(environ, start_response)
 
        finally:
 
            log = logging.getLogger('kallithea.' + self.__class__.__name__)
 
            log.debug('Request time: %.3fs', time.time() - start)
 
            meta.Session.remove()
 

	
 

	
 
class BaseController(WSGIController):
 
class BaseController(TGController):
 

	
 
    def _before(self, *args, **kwargs):
 
        pass
 

	
 
    def __before__(self):
 
        """
 
        __before__ is called before controller methods and after __call__
 
        _before is called before controller methods and after __call__
 
        """
 
        c.kallithea_version = __version__
 
        rc_config = Setting.get_app_settings()
 

	
 
        # Visual options
 
        c.visual = AttributeDict({})
 

	
 
        ## DB stored
 
        c.visual.show_public_icon = str2bool(rc_config.get('show_public_icon'))
 
        c.visual.show_private_icon = str2bool(rc_config.get('show_private_icon'))
 
        c.visual.stylify_metatags = str2bool(rc_config.get('stylify_metatags'))
 
        c.visual.page_size = safe_int(rc_config.get('dashboard_items', 100))
 
        c.visual.admin_grid_items = safe_int(rc_config.get('admin_grid_items', 100))
 
        c.visual.repository_fields = str2bool(rc_config.get('repository_fields'))
 
        c.visual.show_version = str2bool(rc_config.get('show_version'))
 
        c.visual.use_gravatar = str2bool(rc_config.get('use_gravatar'))
 
        c.visual.gravatar_url = rc_config.get('gravatar_url')
 

	
 
        c.ga_code = rc_config.get('ga_code')
 
        # TODO: replace undocumented backwards compatibility hack with db upgrade and rename ga_code
 
        if c.ga_code and '<' not in c.ga_code:
 
            c.ga_code = '''<script type="text/javascript">
 
                var _gaq = _gaq || [];
 
                _gaq.push(['_setAccount', '%s']);
 
                _gaq.push(['_trackPageview']);
 

	
 
                (function() {
 
                    var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
 
                    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
 
                    var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
 
                    })();
 
            </script>''' % c.ga_code
 
        c.site_name = rc_config.get('title')
 
        c.clone_uri_tmpl = rc_config.get('clone_uri_tmpl')
 

	
 
        ## INI stored
 
        c.visual.allow_repo_location_change = str2bool(config.get('allow_repo_location_change', True))
 
        c.visual.allow_custom_hooks_settings = str2bool(config.get('allow_custom_hooks_settings', True))
 

	
 
        c.instance_id = config.get('instance_id')
 
        c.issues_url = config.get('bugtracker', url('issues_url'))
 
        # END CONFIG VARS
 

	
 
        c.repo_name = get_repo_slug(request)  # can be empty
 
        c.backends = BACKENDS.keys()
 
        c.unread_notifications = NotificationModel() \
 
                        .get_unread_cnt_for_user(request.authuser.user_id)
 

	
 
        self.cut_off_limit = safe_int(config.get('cut_off_limit'))
 

	
 
        c.my_pr_count = PullRequest.query(reviewer_id=request.authuser.user_id, include_closed=False).count()
 

	
 
        self.scm_model = ScmModel()
 

	
 
        # __before__ in Pylons is called _before in TurboGears2. As preparation
 
        # to the migration to TurboGears2, all __before__ methods were already
 
        # renamed to _before.  We call them from here to keep the behavior.
 
        # This is a temporary call that will be removed in the real TurboGears2
 
        # migration commit.
 
        self._before()
 

	
 
    @staticmethod
 
    def _determine_auth_user(api_key, bearer_token, session_authuser):
 
        """
 
        Create an `AuthUser` object given the API key/bearer token
 
        (if any) and the value of the authuser session cookie.
 
        """
 

	
 
        # Authenticate by bearer token
 
        if bearer_token is not None:
 
            api_key = bearer_token
 

	
 
        # Authenticate by API key
 
        if api_key is not None:
 
            au = AuthUser(dbuser=User.get_by_api_key(api_key),
 
                authenticating_api_key=api_key, is_external_auth=True)
 
            if au.is_anonymous:
 
                log.warning('API key ****%s is NOT valid', api_key[-4:])
 
                raise webob.exc.HTTPForbidden(_('Invalid API key'))
 
            return au
 

	
 
        # Authenticate by session cookie
 
        # In ancient login sessions, 'authuser' may not be a dict.
 
        # In that case, the user will have to log in again.
 
        # v0.3 and earlier included an 'is_authenticated' key; if present,
 
        # this must be True.
 
        if isinstance(session_authuser, dict) and session_authuser.get('is_authenticated', True):
 
            try:
 
                return AuthUser.from_cookie(session_authuser)
 
            except UserCreationError as e:
 
                # container auth or other auth functions that create users on
 
                # the fly can throw UserCreationError to signal issues with
 
                # user creation. Explanation should be provided in the
 
                # exception object.
 
                from kallithea.lib import helpers as h
 
                h.flash(e, 'error', logf=log.error)
 

	
 
        # Authenticate by auth_container plugin (if enabled)
 
        if any(
 
            plugin.is_container_auth
 
            for plugin in auth_modules.get_auth_plugins()
 
        ):
 
            try:
 
                user_info = auth_modules.authenticate('', '', request.environ)
 
            except UserCreationError as e:
 
                from kallithea.lib import helpers as h
 
                h.flash(e, 'error', logf=log.error)
 
            else:
 
                if user_info is not None:
 
                    username = user_info['username']
 
                    user = User.get_by_username(username, case_insensitive=True)
 
                    return log_in_user(user, remember=False,
 
                                       is_external_auth=True)
 

	
 
        # User is anonymous
 
        return AuthUser()
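    # Descriptive summary (added for clarity): _determine_auth_user() tries
    # the supplied credentials in this order:
    #   1. bearer token (treated as an API key)
    #   2. api_key request parameter
    #   3. 'authuser' session cookie
    #   4. container authentication plugin, if one is enabled
    #   5. otherwise fall back to an anonymous AuthUser()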
 

	
 
    @staticmethod
 
    def _basic_security_checks():
 
        """Perform basic security/sanity checks before processing the request."""
 

	
 
        # Only allow the following HTTP request methods.
 
        if request.method not in ['GET', 'HEAD', 'POST']:
 
            raise webob.exc.HTTPMethodNotAllowed()
 

	
 
        # Also verify the _method override - no longer allowed.
 
        if request.params.get('_method') is None:
 
            pass # no override, no problem
 
        else:
 
            raise webob.exc.HTTPMethodNotAllowed()
 

	
 
        # Make sure CSRF token never appears in the URL. If so, invalidate it.
 
        if secure_form.token_key in request.GET:
 
            log.error('CSRF key leak detected')
 
            session.pop(secure_form.token_key, None)
 
            session.save()
 
            from kallithea.lib import helpers as h
 
            h.flash(_('CSRF token leak has been detected - all form tokens have been expired'),
 
                    category='error')
 

	
 
        # WebOb already ignores request payload parameters for anything other
 
        # than POST/PUT, but double-check since other Kallithea code relies on
 
        # this assumption.
 
        if request.method not in ['POST', 'PUT'] and request.POST:
 
            log.error('%r request with payload parameters; WebOb should have stopped this', request.method)
 
            raise webob.exc.HTTPBadRequest()
 

	
 
    def __call__(self, environ, start_response):
 
        """Invoke the Controller"""
 

	
 
        # WSGIController.__call__ dispatches to the Controller method
 
        # the request is routed to. This routing information is
 
        # available in environ['pylons.routes_dict']
 
    def __call__(self, environ, context):
 
        try:
 
            request.ip_addr = _get_ip_addr(environ)
 
            # make sure that we update permissions each time we call controller
 

	
 
            self._basic_security_checks()
 

	
 
            #set globals for auth user
 

	
 
            bearer_token = None
 
            try:
 
                # Request.authorization may raise ValueError on invalid input
 
                type, params = request.authorization
 
            except (ValueError, TypeError):
 
                pass
 
            else:
 
                if type.lower() == 'bearer':
 
                    bearer_token = params
 

	
 
            request.authuser = request.user = self._determine_auth_user(
 
                request.GET.get('api_key'),
 
                bearer_token,
 
                session.get('authuser'),
 
            )
 

	
 
            log.info('IP: %s User: %s accessed %s',
 
                request.ip_addr, request.authuser,
 
                safe_unicode(_get_access_path(environ)),
 
            )
 
            return WSGIController.__call__(self, environ, start_response)
 
            return super(BaseController, self).__call__(environ, context)
 
        except webob.exc.HTTPException as e:
 
            return e(environ, start_response)
 
        finally:
 
            meta.Session.remove()
 
            return e
 

	
 

	
 
class BaseRepoController(BaseController):
 
    """
 
    Base class for controllers responsible for loading all needed data for a
 
    repository. The loaded items are:
 

	
 
    c.db_repo_scm_instance: instance of scm repository
 
    c.db_repo: database model instance of the repository
 
    c.repository_followers: number of followers
 
    c.repository_forks: number of forks
 
    c.repository_following: whether the current user is following the current repo
 
    """
 

	
 
    def _before(self, *args, **kwargs):
 
        super(BaseRepoController, self)._before(*args, **kwargs)
 
        if c.repo_name:  # extracted from routes
 
            _dbr = Repository.get_by_repo_name(c.repo_name)
 
            if not _dbr:
 
                return
 

	
 
            log.debug('Found repository in database %s with state `%s`',
 
                      safe_unicode(_dbr), safe_unicode(_dbr.repo_state))
 
            route = getattr(request.environ.get('routes.route'), 'name', '')
 

	
 
            # allow deleting repos that are somehow damaged in the filesystem
 
            if route in ['delete_repo']:
 
                return
 

	
 
            if _dbr.repo_state in [Repository.STATE_PENDING]:
 
                if route in ['repo_creating_home']:
 
                    return
 
                check_url = url('repo_creating_home', repo_name=c.repo_name)
 
                raise webob.exc.HTTPFound(location=check_url)
 

	
 
            dbr = c.db_repo = _dbr
 
            c.db_repo_scm_instance = c.db_repo.scm_instance
 
            if c.db_repo_scm_instance is None:
 
                log.error('%s this repository is present in database but it '
 
                          'cannot be created as an scm instance', c.repo_name)
 
                from kallithea.lib import helpers as h
 
                h.flash(h.literal(_('Repository not found in the filesystem')),
 
                        category='error')
 
                raise paste.httpexceptions.HTTPNotFound()
 

	
 
            # some globals counter for menu
 
            c.repository_followers = self.scm_model.get_followers(dbr)
 
            c.repository_forks = self.scm_model.get_forks(dbr)
 
            c.repository_pull_requests = self.scm_model.get_pull_requests(dbr)
 
            c.repository_following = self.scm_model.is_following_repo(
 
                                    c.repo_name, request.authuser.user_id)
 

	
 
    @staticmethod
 
    def _get_ref_rev(repo, ref_type, ref_name, returnempty=False):
 
        """
 
        Safe way to get a changeset. If an error occurs, show the error.
 
        """
 
        from kallithea.lib import helpers as h
 
        try:
 
            return repo.scm_instance.get_ref_revision(ref_type, ref_name)
 
        except EmptyRepositoryError as e:
 
            if returnempty:
 
                return repo.scm_instance.EMPTY_CHANGESET
 
            h.flash(h.literal(_('There are no changesets yet')),
 
                    category='error')
 
            raise webob.exc.HTTPNotFound()
 
        except ChangesetDoesNotExistError as e:
 
            h.flash(h.literal(_('Changeset for %s %s not found in %s') %
 
                              (ref_type, ref_name, repo.repo_name)),
 
                    category='error')
 
            raise webob.exc.HTTPNotFound()
 
        except RepositoryError as e:
 
            log.error(traceback.format_exc())
 
            h.flash(safe_str(e), category='error')
 
            raise webob.exc.HTTPBadRequest()
 

	
 

	
 
class WSGIResultCloseCallback(object):
 
    """Wrap a WSGI result and let close call close after calling the
 
    close method on the result.
 
    """
 
    def __init__(self, result, close):
 
        self._result = result
 
        self._close = close
 

	
 
    def __iter__(self):
 
        return iter(self._result)
 

	
 
    def close(self):
 
        if hasattr(self._result, 'close'):
 
            self._result.close()
 
        self._close()
 

	
 

	
 
@decorator.decorator
 
def jsonify(func, *args, **kwargs):
 
    """Action decorator that formats output for JSON
 

	
 
    Given a function that will return content, this decorator will turn
 
    the result into JSON, with a content-type of 'application/json' and
 
    output it.
 
    """
 
    response.headers['Content-Type'] = 'application/json; charset=utf-8'
 
    data = func(*args, **kwargs)
 
    if isinstance(data, (list, tuple)):
 
        # A JSON list response is syntactically valid JavaScript and can be
 
        # loaded and executed as JavaScript by a malicious third-party site
 
        # using <script>, which can lead to cross-site data leaks.
 
        # JSON responses should therefore be scalars or objects (i.e. Python
 
        # dicts), because a JSON object is a syntax error if interpreted as JS.
 
        msg = "JSON responses with Array envelopes are susceptible to " \
 
              "cross-site data leak attacks, see " \
 
              "https://web.archive.org/web/20120519231904/http://wiki.pylonshq.com/display/pylonsfaq/Warnings"
 
        warnings.warn(msg, Warning, 2)
 
        log.warning(msg)
 
    log.debug("Returning JSON wrapped action output")
 
    return json.dumps(data, encoding='utf-8')
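A minimal usage sketch of the decorator above (the action name and query are
hypothetical): the action returns a dict envelope rather than a bare list, for
the cross-site data leak reason explained in the comments.

    @jsonify
    def repo_names(self):
        # wrap the list in a JSON object instead of returning a bare array
        return {'repositories': [r.repo_name for r in Repository.query()]}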
kallithea/lib/paster_commands/common.py
Show inline comments
 
# -*- coding: utf-8 -*-
 
# This program is free software: you can redistribute it and/or modify
 
# it under the terms of the GNU General Public License as published by
 
# the Free Software Foundation, either version 3 of the License, or
 
# (at your option) any later version.
 
#
 
# This program is distributed in the hope that it will be useful,
 
# but WITHOUT ANY WARRANTY; without even the implied warranty of
 
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 
# GNU General Public License for more details.
 
#
 
# You should have received a copy of the GNU General Public License
 
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
"""
 
kallithea.lib.paster_commands.common
 
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 

	
 
Common code for gearbox commands.
 

	
 
This file was forked by the Kallithea project in July 2014.
 
Original author and date, and relevant copyright and licensing information is below:
 
:created_on: Apr 18, 2010
 
:author: marcink
 
:copyright: (c) 2013 RhodeCode GmbH, and others.
 
:license: GPLv3, see LICENSE.md for more details.
 
"""
 

	
 
import os
 
import sys
 
import logging.config
 

	
 
import paste.deploy
 
import gearbox.command
 

	
 

	
 
def ask_ok(prompt, retries=4, complaint='Yes or no please!'):
 
    while True:
 
        ok = raw_input(prompt)
 
        if ok in ('y', 'ye', 'yes'):
 
            return True
 
        if ok in ('n', 'no', 'nop', 'nope'):
 
            return False
 
        retries = retries - 1
 
        if retries < 0:
 
            raise IOError
 
        print complaint
 

	
 

	
 
class BasePasterCommand(gearbox.command.Command):
 
    """
 
    Abstract Base Class for gearbox commands.
 
    """
 

	
 
    # override to control how much get_parser and run should do:
 
    takes_config_file = True
 
    requires_db_session = True
 

	
 
    def run(self, args):
 
        """
 
        Overrides Command.run
 

	
 
        Checks for a config file argument and loads it.
 
        """
 
        if self.takes_config_file:
 
            self._bootstrap_config(args.config_file)
 
            if self.requires_db_session:
 
                self._init_session()
 

	
 
        return super(BasePasterCommand, self).run(args)
 

	
 
    def get_parser(self, prog_name):
 
        parser = super(BasePasterCommand, self).get_parser(prog_name)
 

	
 
        if self.takes_config_file:
 
            parser.add_argument("-c", "--config",
 
                help='Kallithea .ini file with configuration of database etc',
 
                dest='config_file', required=True)
 

	
 
        return parser
 

	
 
    def _bootstrap_config(self, config_file):
 
        """
 
        Read the config file and initialize logging and the application.
 
        """
 
        from tg import config as pylonsconfig
 
        from kallithea.config.middleware import make_app
 

	
 
        path_to_ini_file = os.path.realpath(config_file)
 
        conf = paste.deploy.appconfig('config:' + path_to_ini_file)
 
        logging.config.fileConfig(path_to_ini_file)
 
        pylonsconfig.init_app(conf.global_conf, conf.local_conf)
 
        make_app(conf.global_conf, **conf.local_conf)
 

	
 
    def _init_session(self):
 
        """
 
        Initialize SqlAlchemy Session from global config.
 
        """
 

	
 
        from tg import config
 
        from kallithea.model.base import init_model
 
        from kallithea.lib.utils2 import engine_from_config
 
        from kallithea.lib.utils import setup_cache_regions
 
        setup_cache_regions(config)
 
        engine = engine_from_config(config, 'sqlalchemy.')
 
        init_model(engine)
 

	
 
    def error(self, msg, exitcode=1):
 
        """Write error message and exit"""
 
        sys.stderr.write('%s\n' % msg)
 
        raise SystemExit(exitcode)
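A minimal sketch of a command built on BasePasterCommand (the class body and
model query are hypothetical; such commands are typically registered as
gearbox entry points):

    class Command(BasePasterCommand):
        "Kallithea: example command"

        def take_action(self, args):
            # by the time take_action runs, BasePasterCommand.run() has loaded
            # the config file and initialized the SQLAlchemy session
            from kallithea.model.db import Repository
            print Repository.query().count()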
kallithea/lib/paster_commands/make_index.py
Show inline comments
 
# -*- coding: utf-8 -*-
 
# This program is free software: you can redistribute it and/or modify
 
# it under the terms of the GNU General Public License as published by
 
# the Free Software Foundation, either version 3 of the License, or
 
# (at your option) any later version.
 
#
 
# This program is distributed in the hope that it will be useful,
 
# but WITHOUT ANY WARRANTY; without even the implied warranty of
 
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 
# GNU General Public License for more details.
 
#
 
# You should have received a copy of the GNU General Public License
 
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
"""
 
kallithea.lib.paster_commands.make_index
 
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 

	
 
make-index gearbox command for Kallithea
 

	
 
This file was forked by the Kallithea project in July 2014.
 
Original author and date, and relevant copyright and licensing information is below:
 
:created_on: Aug 17, 2010
 
:author: marcink
 
:copyright: (c) 2013 RhodeCode GmbH, and others.
 
:license: GPLv3, see LICENSE.md for more details.
 
"""
 

	
 

	
 
import os
 
import sys
 
from os.path import dirname
 

	
 
from string import strip
 
from kallithea.model.repo import RepoModel
 
from kallithea.lib.paster_commands.common import BasePasterCommand
 
from kallithea.lib.utils import load_rcextensions
 

	
 

	
 
class Command(BasePasterCommand):
 
    "Kallithea: Create or update full text search index"
 

	
 
    def take_action(self, args):
 
        from pylons import config
 
        from tg import config
 
        index_location = config['index_dir']
 
        load_rcextensions(config['here'])
 

	
 
        repo_location = args.repo_location \
 
            if args.repo_location else RepoModel().repos_path
 
        repo_list = map(strip, args.repo_list.split(',')) \
 
            if args.repo_list else None
 

	
 
        repo_update_list = map(strip, args.repo_update_list.split(',')) \
 
            if args.repo_update_list else None
 

	
 
        #======================================================================
 
        # WHOOSH DAEMON
 
        #======================================================================
 
        from kallithea.lib.pidlock import LockHeld, DaemonLock
 
        from kallithea.lib.indexers.daemon import WhooshIndexingDaemon
 
        try:
 
            l = DaemonLock(file_=os.path.join(dirname(dirname(index_location)),
 
                                              'make_index.lock'))
 
            WhooshIndexingDaemon(index_location=index_location,
 
                                 repo_location=repo_location,
 
                                 repo_list=repo_list,
 
                                 repo_update_list=repo_update_list) \
 
                .run(full_index=args.full_index)
 
            l.release()
 
        except LockHeld:
 
            sys.exit(1)
 

	
 
    def get_parser(self, prog_name):
 
        parser = super(Command, self).get_parser(prog_name)
 

	
 
        parser.add_argument('--repo-location',
 
                          action='store',
 
                          dest='repo_location',
 
                          help="Specifies repositories location to index OPTIONAL",
 
                          )
 
        parser.add_argument('--index-only',
 
                          action='store',
 
                          dest='repo_list',
 
                          help="Specifies a comma separated list of repositories "
 
                                "to build index on. If not given all repositories "
 
                                "are scanned for indexing. OPTIONAL",
 
                          )
 
        parser.add_argument('--update-only',
 
                          action='store',
 
                          dest='repo_update_list',
 
                          help="Specifies a comma separated list of repositories "
 
                                "to re-build index on. OPTIONAL",
 
                          )
 
        parser.add_argument('-f',
 
                          action='store_true',
 
                          dest='full_index',
 
                          help="Specifies that index should be made full i.e"
 
                                " destroy old and build from scratch",
 
                          default=False)
 

	
 
        return parser
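Example invocation (assuming the command is registered as make-index, per the
module docstring; the repository names are hypothetical, and -c comes from the
BasePasterCommand parser shown earlier):

    gearbox make-index -c my.ini
    gearbox make-index -c my.ini --index-only=repo1,group/repo2 -f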
kallithea/lib/paster_commands/make_rcextensions.py
Show inline comments
 
# -*- coding: utf-8 -*-
 
# This program is free software: you can redistribute it and/or modify
 
# it under the terms of the GNU General Public License as published by
 
# the Free Software Foundation, either version 3 of the License, or
 
# (at your option) any later version.
 
#
 
# This program is distributed in the hope that it will be useful,
 
# but WITHOUT ANY WARRANTY; without even the implied warranty of
 
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 
# GNU General Public License for more details.
 
#
 
# You should have received a copy of the GNU General Public License
 
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
"""
 
kallithea.lib.paster_commands.make_rcextensions
 
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 

	
 
make-rcext gearbox command for Kallithea
 

	
 
This file was forked by the Kallithea project in July 2014.
 
Original author and date, and relevant copyright and licensing information is below:
 
:created_on: Mar 6, 2012
 
:author: marcink
 
:copyright: (c) 2013 RhodeCode GmbH, and others.
 
:license: GPLv3, see LICENSE.md for more details.
 
"""
 

	
 

	
 
import os
 
import sys
 
import pkg_resources
 

	
 
from kallithea.lib.paster_commands.common import ask_ok, BasePasterCommand
 

	
 

	
 
class Command(BasePasterCommand):
 
    """Kallithea: Write template file for extending Kallithea in Python
 

	
 
    An rcextensions directory with an __init__.py file will be created next to
 
    the ini file. Local customizations in that file will survive upgrades.
 
    The file contains instructions on how it can be customized.
 
    """
 

	
 
    takes_config_file = False
 

	
 
    def take_action(self, args):
 
        from pylons import config
 
        from tg import config
 

	
 
        here = config['here']
 
        content = pkg_resources.resource_string(
 
            'kallithea', os.path.join('config', 'rcextensions', '__init__.py')
 
        )
 
        ext_file = os.path.join(here, 'rcextensions', '__init__.py')
 
        if os.path.exists(ext_file):
 
            msg = ('Extension file already exists, do you want '
 
                   'to overwrite it? [y/n]')
 
            if not ask_ok(msg):
 
                print 'Nothing done, exiting...'
 
                return
 

	
 
        dirname = os.path.dirname(ext_file)
 
        if not os.path.isdir(dirname):
 
            os.makedirs(dirname)
 
        with open(ext_file, 'wb') as f:
 
            f.write(content)
 
            print 'Wrote new extensions file to %s' % ext_file
kallithea/lib/paster_commands/template.ini.mako
Show inline comments
 
## -*- coding: utf-8 -*-
 
<%text>################################################################################</%text>
 
<%text>################################################################################</%text>
 
# Kallithea - config file generated with kallithea-config                      #
 
<%text>################################################################################</%text>
 
<%text>################################################################################</%text>
 

	
 
[DEFAULT]
 
debug = true
 
pdebug = false
 

	
 
<%text>################################################################################</%text>
 
<%text>## Email settings                                                             ##</%text>
 
<%text>##                                                                            ##</%text>
 
<%text>## Refer to the documentation ("Email settings") for more details.            ##</%text>
 
<%text>##                                                                            ##</%text>
 
<%text>## It is recommended to use a valid sender address that passes access         ##</%text>
 
<%text>## validation and spam filtering in mail servers.                             ##</%text>
 
<%text>################################################################################</%text>
 

	
 
<%text>## 'From' header for application emails. You can optionally add a name.</%text>
 
<%text>## Default:</%text>
 
#app_email_from = Kallithea
 
<%text>## Examples:</%text>
 
#app_email_from = Kallithea <kallithea-noreply@example.com>
 
#app_email_from = kallithea-noreply@example.com
 

	
 
<%text>## Subject prefix for application emails.</%text>
 
<%text>## A space between this prefix and the real subject is automatically added.</%text>
 
<%text>## Default:</%text>
 
#email_prefix =
 
<%text>## Example:</%text>
 
#email_prefix = [Kallithea]
 

	
 
<%text>## Recipients for error emails and fallback recipients of application mails.</%text>
 
<%text>## Multiple addresses can be specified, space-separated.</%text>
 
<%text>## Only addresses are allowed, do not add any name part.</%text>
 
<%text>## Default:</%text>
 
#email_to =
 
<%text>## Examples:</%text>
 
#email_to = admin@example.com
 
#email_to = admin@example.com another_admin@example.com
 

	
 
<%text>## 'From' header for error emails. You can optionally add a name.</%text>
 
<%text>## Default:</%text>
 
#error_email_from = pylons@yourapp.com
 
<%text>## Examples:</%text>
 
#error_email_from = Kallithea Errors <kallithea-noreply@example.com>
 
#error_email_from = paste_error@example.com
 

	
 
<%text>## SMTP server settings</%text>
 
<%text>## If specifying credentials, make sure to use secure connections.</%text>
 
<%text>## Default: Send unencrypted unauthenticated mails to the specified smtp_server.</%text>
 
<%text>## For "SSL", use smtp_use_ssl = true and smtp_port = 465.</%text>
 
<%text>## For "STARTTLS", use smtp_use_tls = true and smtp_port = 587.</%text>
 
#smtp_server = smtp.example.com
 
#smtp_username =
 
#smtp_password =
 
#smtp_port = 25
 
#smtp_use_ssl = false
 
#smtp_use_tls = false
 

	
 
[server:main]
 
%if http_server == 'gearbox':
 
<%text>## Gearbox default web server ##</%text>
 
use = egg:gearbox#wsgiref
 
<%text>## nr of worker threads to spawn</%text>
 
threadpool_workers = 1
 
<%text>## max requests before thread respawn</%text>
 
threadpool_max_requests = 100
 
<%text>## option to use a thread pool</%text>
 
use_threadpool = true
 

	
 
%elif http_server == 'gevent':
 
<%text>## Gearbox gevent web server ##</%text>
 
use = egg:gearbox#gevent
 

	
 
%elif http_server == 'waitress':
 
<%text>## WAITRESS ##</%text>
 
use = egg:waitress#main
 
<%text>## number of worker threads</%text>
 
threads = 1
 
<%text>## MAX BODY SIZE 100GB</%text>
 
max_request_body_size = 107374182400
 
<%text>## use poll instead of select, fixes fd limits, may not work on old</%text>
 
<%text>## windows systems.</%text>
 
#asyncore_use_poll = True
 

	
 
%elif http_server == 'gunicorn':
 
<%text>## GUNICORN ##</%text>
 
use = egg:gunicorn#main
 
<%text>## number of process workers. You must set `instance_id = *` when this option</%text>
 
<%text>## is set to more than one worker</%text>
 
workers = 1
 
<%text>## process name</%text>
 
proc_name = kallithea
 
<%text>## type of worker class, one of sync, eventlet, gevent, tornado</%text>
 
<%text>## for larger setups, a worker class other than sync is recommended</%text>
 
worker_class = sync
 
max_requests = 1000
 
<%text>## amount of time a worker can handle a request before it gets killed and</%text>
 
<%text>## restarted</%text>
 
timeout = 3600
 

	
 
%elif http_server == 'uwsgi':
 
<%text>## UWSGI ##</%text>
 
<%text>## run with uwsgi --ini-paste-logged <inifile.ini></%text>
 
[uwsgi]
 
socket = /tmp/uwsgi.sock
 
master = true
 
http = 127.0.0.1:5000
 

	
 
<%text>## run as a daemon and redirect all output to a file</%text>
 
#daemonize = ./uwsgi_kallithea.log
 

	
 
<%text>## master process PID</%text>
 
pidfile = ./uwsgi_kallithea.pid
 

	
 
<%text>## stats server with workers statistics, use uwsgitop</%text>
 
<%text>## for monitoring, `uwsgitop 127.0.0.1:1717`</%text>
 
stats = 127.0.0.1:1717
 
memory-report = true
 

	
 
<%text>## log 5XX errors</%text>
 
log-5xx = true
 

	
 
<%text>## Set the socket listen queue size.</%text>
 
listen = 256
 

	
 
<%text>## Gracefully Reload workers after the specified amount of managed requests</%text>
 
<%text>## (avoid memory leaks).</%text>
 
max-requests = 1000
 

	
 
<%text>## enable large buffers</%text>
 
buffer-size = 65535
 

	
 
<%text>## socket and http timeouts ##</%text>
 
http-timeout = 3600
 
socket-timeout = 3600
 

	
 
<%text>## Log requests slower than the specified number of milliseconds.</%text>
 
log-slow = 10
 

	
 
<%text>## Exit if no app can be loaded.</%text>
 
need-app = true
 

	
 
<%text>## Set lazy mode (load apps in workers instead of master).</%text>
 
lazy = true
 

	
 
<%text>## scaling ##</%text>
 
<%text>## cheaper algorithm to use; if not set, the default will be used</%text>
 
cheaper-algo = spare
 

	
 
<%text>## minimum number of workers to keep at all times</%text>
 
cheaper = 1
 

	
 
<%text>## number of workers to spawn at startup</%text>
 
cheaper-initial = 1
 

	
 
<%text>## maximum number of workers that can be spawned</%text>
 
workers = 4
 

	
 
<%text>## how many workers should be spawned at a time</%text>
 
cheaper-step = 1
 

	
 
%endif
 
<%text>## COMMON ##</%text>
 
host = ${host}
 
port = ${port}
 

	
 
<%text>## middleware for hosting the WSGI application under a URL prefix</%text>
 
#[filter:proxy-prefix]
 
#use = egg:PasteDeploy#prefix
 
#prefix = /<your-prefix>
 

	
 
[app:main]
 
use = egg:kallithea
 
<%text>## enable proxy prefix middleware</%text>
 
#filter-with = proxy-prefix
 

	
 
full_stack = true
 
static_files = true
 
<%text>## Available Languages:</%text>
 
<%text>## cs de fr hu ja nl_BE pl pt_BR ru sk zh_CN zh_TW</%text>
 
lang =
 
cache_dir = ${here}/data
 
index_dir = ${here}/data/index
 

	
 
<%text>## perform a full repository scan on each server start, this should be</%text>
 
<%text>## set to false after first startup, to allow faster server restarts.</%text>
 
initial_repo_scan = false
 

	
 
<%text>## uncomment and set this path to use archive download cache</%text>
 
archive_cache_dir = ${here}/tarballcache
 

	
 
<%text>## change this to unique ID for security</%text>
 
app_instance_uuid = ${uuid()}
 

	
 
<%text>## cut off limit for large diffs (size in bytes)</%text>
 
cut_off_limit = 256000
 

	
 
<%text>## force https in Kallithea, fixes https redirects, assumes it's always https</%text>
 
force_https = false
 

	
 
<%text>## use Strict-Transport-Security headers</%text>
 
use_htsts = false
 

	
 
<%text>## number of commits stats will parse on each iteration</%text>
 
commit_parse_limit = 25
 

	
 
<%text>## path to git executable</%text>
 
git_path = git
 

	
 
<%text>## git rev filter option, --all is the default filter, if you need to</%text>
 
<%text>## hide all refs in changelog switch this to --branches --tags</%text>
 
#git_rev_filter = --branches --tags
 

	
 
<%text>## RSS feed options</%text>
 
rss_cut_off_limit = 256000
 
rss_items_per_page = 10
 
rss_include_diff = false
 

	
 
<%text>## options for showing and identifying changesets</%text>
 
show_sha_length = 12
 
show_revision_number = false
 

	
 
<%text>## Canonical URL to use when creating full URLs in UI and texts.</%text>
 
<%text>## Useful when the site is available under different names or protocols.</%text>
 
<%text>## Defaults to what is provided in the WSGI environment.</%text>
 
#canonical_url = https://kallithea.example.com/repos
 

	
 
<%text>## gist URL alias, used to create nicer URLs for gists. This should be a</%text>
 
<%text>## URL that rewrites to _admin/gists/<gistid>.</%text>
 
<%text>## example: http://gist.example.com/{gistid}. Empty means use the internal</%text>
 
<%text>## Kallithea URL, i.e. http[s]://kallithea.example.com/_admin/gists/<gistid></%text>
 
gist_alias_url =
 

	
 
<%text>## whitelist of API-enabled controllers. This allows adding a list of</%text>
 
<%text>## controllers to which access will be enabled by api_key. E.g.: to enable</%text>
 
<%text>## api access to raw_files put `FilesController:raw`, to enable access to patches</%text>
 
<%text>## add `ChangesetController:changeset_patch`. This list should be "," separated</%text>
 
<%text>## Syntax is <ControllerClass>:<function>. Check debug logs for generated names</%text>
 
<%text>## Recommended settings below are commented out:</%text>
 
api_access_controllers_whitelist =
 
#    ChangesetController:changeset_patch,
 
#    ChangesetController:changeset_raw,
 
#    FilesController:raw,
 
#    FilesController:archivefile
 

	
 
<%text>## default encoding used to convert from and to unicode</%text>
 
<%text>## can be also a comma separated list of encoding in case of mixed encodings</%text>
 
default_encoding = utf8
 

	
 
<%text>## issue tracker for Kallithea (leave blank to disable, absent for default)</%text>
 
#bugtracker = https://bitbucket.org/conservancy/kallithea/issues
 

	
 
<%text>## issue tracking mapping for commits messages</%text>
 
<%text>## comment out issue_pat, issue_server, issue_prefix to enable</%text>
 

	
 
<%text>## pattern to get the issues from commit messages</%text>
 
<%text>## default one used here is #<numbers> with a regex passive group for `#`</%text>
 
<%text>## {id} will be all groups matched from this pattern</%text>
 

	
 
issue_pat = (?:\s*#)(\d+)
 

	
 
<%text>## server url to the issue; each {id} will be replaced with the match</%text>
 
<%text>## fetched from the regex, and {repo} is replaced with the full repository name</%text>
 
<%text>## (including groups) while {repo_name} is replaced with just the repository name</%text>
 

	
 
issue_server_link = https://issues.example.com/{repo}/issue/{id}
 

	
 
<%text>## prefix to add to the link to indicate it's a URL</%text>
 
<%text>## #314 will be replaced by <issue_prefix><id></%text>
 

	
 
issue_prefix = #
 

	
 
<%text>## issue_pat, issue_server_link, issue_prefix can have suffixes to specify</%text>
 
<%text>## multiple patterns, pointing to other issue servers, wikis or others;</%text>
 
<%text>## below is an example of how to create a wiki pattern</%text>
 
# wiki-some-id -> https://wiki.example.com/some-id
 

	
 
#issue_pat_wiki = (?:wiki-)(.+)
 
#issue_server_link_wiki = https://wiki.example.com/{id}
 
#issue_prefix_wiki = WIKI-
 

	
 
<%text>## alternative HTTP return code for failed authentication. Default HTTP</%text>
 
<%text>## response is 401 HTTPUnauthorized. Currently Mercurial clients have trouble with</%text>
 
<%text>## handling that. Set this variable to 403 to return HTTPForbidden</%text>
 
auth_ret_code =
 

	
 
<%text>## locking return code. When repository is locked return this HTTP code. 2XX</%text>
 
<%text>## codes don't break the transactions while 4XX codes do</%text>
 
lock_ret_code = 423
 

	
 
<%text>## allows changing the repository location in the settings page</%text>
 
allow_repo_location_change = True
 

	
 
<%text>## allows setting up custom hooks in the settings page</%text>
 
allow_custom_hooks_settings = True
 

	
 
<%text>## extra extensions for indexing, space separated and without the leading '.'.</%text>
 
# index.extensions =
 
#    gemfile
 
#    lock
 

	
 
<%text>## extra filenames for indexing, space separated</%text>
 
# index.filenames =
 
#    .dockerignore
 
#    .editorconfig
 
#    INSTALL
 
#    CHANGELOG
 

	
 
<%text>####################################</%text>
 
<%text>###        CELERY CONFIG        ####</%text>
 
<%text>####################################</%text>
 

	
 
use_celery = false
 

	
 
<%text>## Example: connect to the virtual host 'rabbitmqhost' on localhost as rabbitmq:</%text>
 
broker.url = amqp://rabbitmq:qewqew@localhost:5672/rabbitmqhost
 

	
 
celery.imports = kallithea.lib.celerylib.tasks
 
celery.accept.content = pickle
 
celery.result.backend = amqp
 
celery.result.dburi = amqp://
 
celery.result.serialier = json
 

	
 
#celery.send.task.error.emails = true
 
#celery.amqp.task.result.expires = 18000
 

	
 
celeryd.concurrency = 2
 
celeryd.max.tasks.per.child = 1
 

	
 
<%text>## If true, tasks will never be sent to the queue, but executed locally instead.</%text>
 
celery.always.eager = false
 

	
 
<%text>####################################</%text>
 
<%text>###         BEAKER CACHE        ####</%text>
 
<%text>####################################</%text>
 

	
 
beaker.cache.data_dir = ${here}/data/cache/data
 
beaker.cache.lock_dir = ${here}/data/cache/lock
 

	
 
beaker.cache.regions = short_term,long_term,sql_cache_short
 

	
 
beaker.cache.short_term.type = memory
 
beaker.cache.short_term.expire = 60
 
beaker.cache.short_term.key_length = 256
 

	
 
beaker.cache.long_term.type = memory
 
beaker.cache.long_term.expire = 36000
 
beaker.cache.long_term.key_length = 256
 

	
 
beaker.cache.sql_cache_short.type = memory
 
beaker.cache.sql_cache_short.expire = 10
 
beaker.cache.sql_cache_short.key_length = 256
 

	
 
<%text>####################################</%text>
 
<%text>###       BEAKER SESSION        ####</%text>
 
<%text>####################################</%text>
 

	
 
<%text>## Name of session cookie. Should be unique for a given host and path, even when running</%text>
 
<%text>## on different ports. Otherwise, cookie sessions will be shared and messed up.</%text>
 
beaker.session.key = kallithea
 
<%text>## Sessions should always only be accessible by the browser, not directly by JavaScript.</%text>
 
beaker.session.httponly = true
 
<%text>## Session lifetime. 2592000 seconds is 30 days.</%text>
 
beaker.session.timeout = 2592000
 

	
 
<%text>## Server secret used with HMAC to ensure integrity of cookies.</%text>
 
beaker.session.secret = ${uuid()}
 
<%text>## Further, encrypt the data with AES.</%text>
 
#beaker.session.encrypt_key = <key_for_encryption>
 
#beaker.session.validate_key = <validation_key>
 

	
 
<%text>## Type of storage used for the session, current types are</%text>
 
<%text>## dbm, file, memcached, database, and memory.</%text>
 

	
 
<%text>## File system storage of session data. (default)</%text>
 
#beaker.session.type = file
 

	
 
<%text>## Cookie only, store all session data inside the cookie. Requires secure secrets.</%text>
 
#beaker.session.type = cookie
 

	
 
<%text>## Database storage of session data.</%text>
 
#beaker.session.type = ext:database
 
#beaker.session.sa.url = postgresql://postgres:qwe@localhost/kallithea
 
#beaker.session.table_name = db_session
 

	
 
%if error_aggregation_service == 'appenlight':
 
<%text>############################</%text>
 
<%text>## ERROR HANDLING SYSTEMS ##</%text>
 
<%text>############################</%text>
 

	
 
<%text>####################</%text>
 
<%text>### [appenlight] ###</%text>
 
<%text>####################</%text>
 

	
 
<%text>## AppEnlight is tailored to work with Kallithea, see</%text>
 
<%text>## http://appenlight.com for details on how to obtain an account.</%text>
 
<%text>## You must install the python package `appenlight_client` to make it work.</%text>
 

	
 
<%text>## appenlight enabled</%text>
 
appenlight = false
 

	
 
appenlight.server_url = https://api.appenlight.com
 
appenlight.api_key = YOUR_API_KEY
 

	
 
<%text>## TWEAK AMOUNT OF INFO SENT HERE</%text>
 

	
 
<%text>## enables 404 error logging (default False)</%text>
 
appenlight.report_404 = false
 

	
 
<%text>## time in seconds after which a request is considered slow (default 1)</%text>
 
appenlight.slow_request_time = 1
 

	
 
<%text>## record slow requests in application</%text>
 
<%text>## (needs to be enabled for slow datastore recording and time tracking)</%text>
 
appenlight.slow_requests = true
 

	
 
<%text>## enable hooking to application loggers</%text>
 
#appenlight.logging = true
 

	
 
<%text>## minimum log level for log capture</%text>
 
#appenlight.logging.level = WARNING
 

	
 
<%text>## send logs only from erroneous/slow requests</%text>
 
<%text>## (saves API quota for intensive logging)</%text>
 
appenlight.logging_on_error = false
 

	
 
<%text>## list of additional keywords that should be grabbed from environ object</%text>
 
<%text>## can be string with comma separated list of words in lowercase</%text>
 
<%text>## (by default client will always send following info:</%text>
 
<%text>## 'REMOTE_USER', 'REMOTE_ADDR', 'SERVER_NAME', 'CONTENT_TYPE' + all keys that</%text>
 
<%text>## start with HTTP*); this list can be extended with additional keywords here</%text>
 
appenlight.environ_keys_whitelist =
 

	
 
<%text>## list of keywords that should be blanked from request object</%text>
 
<%text>## can be string with comma separated list of words in lowercase</%text>
 
<%text>## (by default client will always blank keys that contain following words</%text>
 
<%text>## 'password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf'</%text>
 
<%text>## this list can be extended with additional keywords set here)</%text>
 
appenlight.request_keys_blacklist =
 

	
 
<%text>## list of namespaces that should be ignored when gathering log entries</%text>
 
<%text>## can be string with comma separated list of namespaces</%text>
 
<%text>## (by default the client ignores own entries: appenlight_client.client)</%text>
 
appenlight.log_namespace_blacklist =
 

	
 
%elif error_aggregation_service == 'sentry':
 
<%text>################</%text>
 
<%text>### [sentry] ###</%text>
 
<%text>################</%text>
 

	
 
<%text>## Sentry is an alternative open source error aggregator.</%text>
 
<%text>## You must install the python packages `sentry` and `raven` to enable it.</%text>
 

	
 
sentry.dsn = YOUR_DSN
 
sentry.servers =
 
sentry.name =
 
sentry.key =
 
sentry.public_key =
 
sentry.secret_key =
 
sentry.project =
 
sentry.site =
 
sentry.include_paths =
 
sentry.exclude_paths =
 

	
 
%endif
 
<%text>################################################################################</%text>
 
<%text>## WARNING: *THE LINE BELOW MUST BE UNCOMMENTED ON A PRODUCTION ENVIRONMENT*  ##</%text>
 
<%text>## Debug mode will enable the interactive debugging tool, allowing ANYONE to  ##</%text>
 
<%text>## execute malicious code after an exception is raised.                       ##</%text>
 
<%text>################################################################################</%text>
 
set debug = false
 

	
 
<%text>##################################</%text>
 
<%text>###       LOGVIEW CONFIG       ###</%text>
 
<%text>##################################</%text>
 

	
 
logview.sqlalchemy = #faa
 
logview.pylons.templating = #bfb
 
logview.pylons.util = #eee
 

	
 
<%text>#########################################################</%text>
 
<%text>### DB CONFIGS - EACH DB WILL HAVE IT'S OWN CONFIG    ###</%text>
 
<%text>#########################################################</%text>
 

	
 
%if database_engine == 'sqlite':
 
# SQLITE [default]
 
sqlalchemy.url = sqlite:///${here}/kallithea.db?timeout=60
 

	
 
%elif database_engine == 'postgres':
 
# POSTGRESQL
 
sqlalchemy.url = postgresql://user:pass@localhost/kallithea
 

	
 
%elif database_engine == 'mysql':
 
# MySQL
 
sqlalchemy.url = mysql://user:pass@localhost/kallithea?charset=utf8
 

	
 
%endif
 
# see sqlalchemy docs for others
 

	
 
sqlalchemy.echo = false
 
sqlalchemy.pool_recycle = 3600
 

	
 
<%text>################################</%text>
 
<%text>### ALEMBIC CONFIGURATION   ####</%text>
 
<%text>################################</%text>
 

	
 
[alembic]
 
script_location = kallithea:alembic
 

	
 
<%text>################################</%text>
 
<%text>### LOGGING CONFIGURATION   ####</%text>
 
<%text>################################</%text>
 

	
 
[loggers]
 
keys = root, routes, kallithea, sqlalchemy, gearbox, beaker, templates, whoosh_indexer
 
keys = root, routes, kallithea, sqlalchemy, tg, gearbox, beaker, templates, whoosh_indexer
 

	
 
[handlers]
 
keys = console, console_sql
 

	
 
[formatters]
 
keys = generic, color_formatter, color_formatter_sql
 

	
 
<%text>#############</%text>
 
<%text>## LOGGERS ##</%text>
 
<%text>#############</%text>
 

	
 
[logger_root]
 
level = NOTSET
 
handlers = console
 

	
 
[logger_routes]
 
level = DEBUG
 
handlers =
 
qualname = routes.middleware
 
<%text>## "level = DEBUG" logs the route matched and routing variables.</%text>
 
propagate = 1
 

	
 
[logger_beaker]
 
level = DEBUG
 
handlers =
 
qualname = beaker.container
 
propagate = 1
 

	
 
[logger_templates]
 
level = INFO
 
handlers =
 
qualname = pylons.templating
 
propagate = 1
 

	
 
[logger_kallithea]
 
level = DEBUG
 
handlers =
 
qualname = kallithea
 
propagate = 1
 

	
 
[logger_tg]
 
level = DEBUG
 
handlers =
 
qualname = tg
 
propagate = 1
 

	
 
[logger_gearbox]
 
level = DEBUG
 
handlers =
 
qualname = gearbox
 
propagate = 1
 

	
 
[logger_sqlalchemy]
 
level = INFO
 
handlers = console_sql
 
qualname = sqlalchemy.engine
 
propagate = 0
 

	
 
[logger_whoosh_indexer]
 
level = DEBUG
 
handlers =
 
qualname = whoosh_indexer
 
propagate = 1
 

	
 
<%text>##############</%text>
 
<%text>## HANDLERS ##</%text>
 
<%text>##############</%text>
 

	
 
[handler_console]
 
class = StreamHandler
 
args = (sys.stderr,)
 
level = INFO
 
formatter = generic
 

	
 
[handler_console_sql]
 
class = StreamHandler
 
args = (sys.stderr,)
 
level = WARN
 
formatter = generic
 

	
 
<%text>################</%text>
 
<%text>## FORMATTERS ##</%text>
 
<%text>################</%text>
 

	
 
[formatter_generic]
 
format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
 
datefmt = %Y-%m-%d %H:%M:%S
 

	
 
[formatter_color_formatter]
 
class = kallithea.lib.colored_formatter.ColorFormatter
 
format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
 
datefmt = %Y-%m-%d %H:%M:%S
 

	
 
[formatter_color_formatter_sql]
 
class = kallithea.lib.colored_formatter.ColorFormatterSql
 
format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
 
datefmt = %Y-%m-%d %H:%M:%S
kallithea/lib/utils.py
Show inline comments
 
# -*- coding: utf-8 -*-
 
# This program is free software: you can redistribute it and/or modify
 
# it under the terms of the GNU General Public License as published by
 
# the Free Software Foundation, either version 3 of the License, or
 
# (at your option) any later version.
 
#
 
# This program is distributed in the hope that it will be useful,
 
# but WITHOUT ANY WARRANTY; without even the implied warranty of
 
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 
# GNU General Public License for more details.
 
#
 
# You should have received a copy of the GNU General Public License
 
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
"""
 
kallithea.lib.utils
 
~~~~~~~~~~~~~~~~~~~
 

	
 
Utilities library for Kallithea
 

	
 
This file was forked by the Kallithea project in July 2014.
 
Original author and date, and relevant copyright and licensing information is below:
 
:created_on: Apr 18, 2010
 
:author: marcink
 
:copyright: (c) 2013 RhodeCode GmbH, and others.
 
:license: GPLv3, see LICENSE.md for more details.
 
"""
 

	
 
import os
 
import re
 
import logging
 
import datetime
 
import traceback
 
import beaker
 

	
 
from tg import request, response
 
from webhelpers.text import collapse, remove_formatting, strip_tags
 
from beaker.cache import _cache_decorate
 

	
 
from kallithea.lib.vcs.utils.hgcompat import ui, config
 
from kallithea.lib.vcs.utils.helpers import get_scm
 
from kallithea.lib.vcs.exceptions import VCSError
 

	
 
from kallithea.model import meta
 
from kallithea.model.db import Repository, User, Ui, \
 
    UserLog, RepoGroup, Setting, UserGroup
 
from kallithea.model.repo_group import RepoGroupModel
 
from kallithea.lib.utils2 import safe_str, safe_unicode, get_current_authuser
 
from kallithea.lib.vcs.utils.fakemod import create_module
 

	
 
log = logging.getLogger(__name__)
 

	
 
REMOVED_REPO_PAT = re.compile(r'rm__\d{8}_\d{6}_\d{6}_.*')
 

	
 

	
 
def recursive_replace(str_, replace=' '):
 
    """
 
    Recursively replace repeated occurrences of the given character with a single one
 

	
 
    :param str_: given string
 
    :param replace: character to find and collapse repeated occurrences of
 

	
 
    Examples::
 
    >>> recursive_replace("Mighty---Mighty-Bo--sstones",'-')
 
    'Mighty-Mighty-Bo-sstones'
 
    """
 

	
 
    if str_.find(replace * 2) == -1:
 
        return str_
 
    else:
 
        str_ = str_.replace(replace * 2, replace)
 
        return recursive_replace(str_, replace)
 

	
 

	
 
def repo_name_slug(value):
 
    """
 
    Return a slug of the repository name.
 
    This function is called on each creation/modification
 
    of a repository to prevent bad repository names.
 
    """
 

	
 
    slug = remove_formatting(value)
 
    slug = strip_tags(slug)
 

	
 
    for c in """`?=[]\;'"<>,/~!@#$%^&*()+{}|: """:
 
        slug = slug.replace(c, '-')
 
    slug = recursive_replace(slug, '-')
 
    slug = collapse(slug, '-')
 
    return slug
 

	
 

	
 
#==============================================================================
 
# PERM DECORATOR HELPERS FOR EXTRACTING NAMES FOR PERM CHECKS
 
#==============================================================================
 
def get_repo_slug(request):
 
    _repo = request.environ['pylons.routes_dict'].get('repo_name')
 
    if _repo:
 
        _repo = _repo.rstrip('/')
 
    return _repo
 

	
 

	
 
def get_repo_group_slug(request):
 
    _group = request.environ['pylons.routes_dict'].get('group_name')
 
    if _group:
 
        _group = _group.rstrip('/')
 
    return _group
 

	
 

	
 
def get_user_group_slug(request):
 
    _group = request.environ['pylons.routes_dict'].get('id')
 
    _group = UserGroup.get(_group)
 
    if _group:
 
        return _group.users_group_name
 
    return None
 

	
 

	
 
def _extract_id_from_repo_name(repo_name):
 
    if repo_name.startswith('/'):
 
        repo_name = repo_name.lstrip('/')
 
    by_id_match = re.match(r'^_(\d{1,})', repo_name)
 
    if by_id_match:
 
        return by_id_match.groups()[0]
 

	
 

	
 
def get_repo_by_id(repo_name):
 
    """
 
    Extracts repo_name by id from special urls. Example url is _11/repo_name
 

	
 
    :param repo_name:
 
    :return: repo_name if matched else None
 
    """
 
    _repo_id = _extract_id_from_repo_name(repo_name)
 
    if _repo_id:
 
        from kallithea.model.db import Repository
 
        repo = Repository.get(_repo_id)
 
        if repo:
 
            # TODO: return repo instead of reponame? or would that be a layering violation?
 
            return repo.repo_name
 
    return None
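
A minimal sketch of the `_<id>/...` URL convention handled above (the id 11 is hypothetical and must exist in the database for a name to be returned):

    from kallithea.lib.utils import get_repo_by_id

    # '_11/anything' is parsed by _extract_id_from_repo_name() into id 11;
    # if a Repository with that id exists, its real repo_name is returned
    name = get_repo_by_id('_11/whatever')
    if name is None:
        print('no repository with id 11')
    else:
        print('repository id 11 is %s' % name)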
 

	
 

	
 
def action_logger(user, action, repo, ipaddr='', commit=False):
 
    """
 
    Action logger for various actions made by users
 

	
 
    :param user: user that made this action, can be a unique username string or
 
        object containing user_id attribute
 
    :param action: action to log, should be one of the predefined unique actions for
 
        easy translations
 
    :param repo: string name of repository or object containing repo_id,
 
        that action was made on
 
    :param ipaddr: optional IP address from which the action was made
 

	
 
    """
 

	
 
    # if we don't get an explicit IP address, try to get one from the
 
    # registered user in the template context var
 
    if not ipaddr:
 
        ipaddr = getattr(get_current_authuser(), 'ip_addr', '')
 

	
 
    if getattr(user, 'user_id', None):
 
        user_obj = User.get(user.user_id)
 
    elif isinstance(user, basestring):
 
        user_obj = User.get_by_username(user)
 
    else:
 
        raise Exception('You have to provide a user object or a username')
 

	
 
    if getattr(repo, 'repo_id', None):
 
        repo_obj = Repository.get(repo.repo_id)
 
        repo_name = repo_obj.repo_name
 
    elif isinstance(repo, basestring):
 
        repo_name = repo.lstrip('/')
 
        repo_obj = Repository.get_by_repo_name(repo_name)
 
    else:
 
        repo_obj = None
 
        repo_name = u''
 

	
 
    user_log = UserLog()
 
    user_log.user_id = user_obj.user_id
 
    user_log.username = user_obj.username
 
    user_log.action = safe_unicode(action)
 

	
 
    user_log.repository = repo_obj
 
    user_log.repository_name = repo_name
 

	
 
    user_log.action_date = datetime.datetime.now()
 
    user_log.user_ip = ipaddr
 
    meta.Session().add(user_log)
 

	
 
    log.info('Logging action:%s on %s by user:%s ip:%s',
 
             action, safe_unicode(repo), user_obj, ipaddr)
 
    if commit:
 
        meta.Session().commit()
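
A hedged usage sketch of action_logger (the username, action string and repository name are illustrative and must exist in the database; assumes the models/session are initialized):

    from kallithea.lib.utils import action_logger

    # user and repo can be passed either as objects or as plain strings;
    # commit=True commits the UserLog row immediately instead of leaving
    # that to the surrounding transaction handling
    action_logger(user='john', action='user_created_repo',
                  repo='john/my-repo', ipaddr='127.0.0.1', commit=True)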
 

	
 

	
 
def get_filesystem_repos(path):
 
    """
 
    Scans the given path for repositories and yields (name, (type, path)) tuples
 

	
 
    :param path: path to scan for repositories
 
    """
 

	
 
    # remove ending slash for better results
 
    path = safe_str(path.rstrip(os.sep))
 
    log.debug('now scanning in %s', path)
 

	
 
    def isdir(*n):
 
        return os.path.isdir(os.path.join(*n))
 

	
 
    for root, dirs, _files in os.walk(path):
 
        recurse_dirs = []
 
        for subdir in dirs:
 
            # skip removed repos
 
            if REMOVED_REPO_PAT.match(subdir):
 
                continue
 

	
 
            # skip .<something> dirs TODO: really? then we should prevent creating them ...
 
            if subdir.startswith('.'):
 
                continue
 

	
 
            cur_path = os.path.join(root, subdir)
 
            if isdir(cur_path, '.git'):
 
                log.warning('ignoring non-bare Git repo: %s', cur_path)
 
                continue
 

	
 
            if (isdir(cur_path, '.hg') or
 
                isdir(cur_path, '.svn') or
 
                isdir(cur_path, 'objects') and (isdir(cur_path, 'refs') or
 
                                                os.path.isfile(os.path.join(cur_path, 'packed-refs')))):
 

	
 
                if not os.access(cur_path, os.R_OK) or not os.access(cur_path, os.X_OK):
 
                    log.warning('ignoring repo path without access: %s', cur_path)
 
                    continue
 

	
 
                if not os.access(cur_path, os.W_OK):
 
                    log.warning('repo path without write access: %s', cur_path)
 

	
 
                try:
 
                    scm_info = get_scm(cur_path)
 
                    assert cur_path.startswith(path)
 
                    repo_path = cur_path[len(path) + 1:]
 
                    yield repo_path, scm_info
 
                    continue # no recursion
 
                except VCSError:
 
                    # We should perhaps ignore such broken repos, but especially
 
                    # the bare git detection is unreliable so we dive into it
 
                    pass
 

	
 
            recurse_dirs.append(subdir)
 

	
 
        dirs[:] = recurse_dirs
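
For illustration, a sketch of consuming the generator above (the scan path is hypothetical):

    from kallithea.lib.utils import get_filesystem_repos

    # yields (relative repo path, scm_info) tuples, where scm_info is
    # whatever get_scm() returned for that directory
    for repo_path, scm_info in get_filesystem_repos('/srv/repos'):
        print('found %s: %r' % (repo_path, scm_info))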
 

	
 

	
 
def is_valid_repo(repo_name, base_path, scm=None):
 
    """
 
    Returns True if the given path is a valid repository, False otherwise.
 
    If the scm parameter is given, also check that the repository type matches
 
    the given scm.
 

	
 
    :param repo_name:
 
    :param base_path:
 
    :param scm:
 

	
 
    :return True: if given path is a valid repository
 
    """
 
    full_path = os.path.join(safe_str(base_path), safe_str(repo_name))
 

	
 
    try:
 
        scm_ = get_scm(full_path)
 
        if scm:
 
            return scm_[0] == scm
 
        return True
 
    except VCSError:
 
        return False
 

	
 

	
 
def is_valid_repo_group(repo_group_name, base_path, skip_path_check=False):
 
    """
 
    Returns True if the given path is a repository group, False otherwise
 

	
 
    :param repo_group_name:
 
    :param base_path:
 
    """
 
    full_path = os.path.join(safe_str(base_path), safe_str(repo_group_name))
 

	
 
    # check if it's not a repo
 
    if is_valid_repo(repo_group_name, base_path):
 
        return False
 

	
 
    try:
 
        # we need to check bare git repos at higher level
 
        # since we might match branches/hooks/info/objects or possibly
 
        # other things inside a bare git repo
 
        get_scm(os.path.dirname(full_path))
 
        return False
 
    except VCSError:
 
        pass
 

	
 
    # check if it's a valid path
 
    if skip_path_check or os.path.isdir(full_path):
 
        return True
 

	
 
    return False
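
A hedged sketch of how the two validity helpers above can be combined (the base path and name are hypothetical):

    from kallithea.lib.utils import is_valid_repo, is_valid_repo_group

    base_path = '/srv/repos'
    name = 'team/project'

    if is_valid_repo(name, base_path, scm='hg'):
        print('%s is a Mercurial repository' % name)
    elif is_valid_repo_group(name, base_path):
        print('%s is a repository group' % name)
    else:
        print('%s is neither a repository nor a repository group' % name)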
 

	
 

	
 
# propagated from the Mercurial documentation
 
ui_sections = ['alias', 'auth',
 
                'decode/encode', 'defaults',
 
                'diff', 'email',
 
                'extensions', 'format',
 
                'merge-patterns', 'merge-tools',
 
                'hooks', 'http_proxy',
 
                'smtp', 'patch',
 
                'paths', 'profiling',
 
                'server', 'trusted',
 
                'ui', 'web', ]
 

	
 

	
 
def make_ui(read_from='file', path=None, checkpaths=True, clear_session=True):
 
    """
 
    A function that will read Mercurial rc (hgrc) files or the database
 
    and make a Mercurial ui object from the read options
 

	
 
    :param path: path to mercurial config file
 
    :param checkpaths: check the path
 
    :param read_from: read from 'file' or 'db'
 
    """
 

	
 
    baseui = ui.ui()
 

	
 
    # clean the baseui object
 
    baseui._ocfg = config.config()
 
    baseui._ucfg = config.config()
 
    baseui._tcfg = config.config()
 

	
 
    if read_from == 'file':
 
        if not os.path.isfile(path):
 
            log.debug('hgrc file is not present at %s, skipping...', path)
 
            return False
 
        log.debug('reading hgrc from %s', path)
 
        cfg = config.config()
 
        cfg.read(path)
 
        for section in ui_sections:
 
            for k, v in cfg.items(section):
 
                log.debug('setting ui from file: [%s] %s=%s', section, k, v)
 
                baseui.setconfig(safe_str(section), safe_str(k), safe_str(v))
 

	
 
    elif read_from == 'db':
 
        sa = meta.Session()
 
        ret = sa.query(Ui).all()
 

	
 
        hg_ui = ret
 
        for ui_ in hg_ui:
 
            if ui_.ui_active:
 
                ui_val = '' if ui_.ui_value is None else safe_str(ui_.ui_value)
 
                log.debug('setting ui from db: [%s] %s=%r', ui_.ui_section,
 
                          ui_.ui_key, ui_val)
 
                baseui.setconfig(safe_str(ui_.ui_section), safe_str(ui_.ui_key),
 
                                 ui_val)
 
        if clear_session:
 
            meta.Session.remove()
 

	
 
        # force set push_ssl requirement to False, Kallithea handles that
 
        baseui.setconfig('web', 'push_ssl', False)
 
        baseui.setconfig('web', 'allow_push', '*')
 
        # prevent interactive questions for ssh password / passphrase
 
        ssh = baseui.config('ui', 'ssh', default='ssh')
 
        baseui.setconfig('ui', 'ssh', '%s -oBatchMode=yes -oIdentitiesOnly=yes' % ssh)
 

	
 
    return baseui
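
A minimal usage sketch of make_ui (the hgrc path is hypothetical; the 'db' mode assumes a configured database session):

    from kallithea.lib.utils import make_ui

    # read ui settings from the Ui table in the database
    baseui = make_ui(read_from='db')

    # or read them from an hgrc-style file; returns False if the file is missing
    file_ui = make_ui(read_from='file', path='/srv/kallithea/hgrc')
    if file_ui is False:
        print('no hgrc file found')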
 

	
 

	
 
def set_app_settings(config):
 
    """
 
    Updates app config with new settings from database
 

	
 
    :param config:
 
    """
 
    hgsettings = Setting.get_app_settings()
 

	
 
    for k, v in hgsettings.items():
 
        config[k] = v
 

	
 

	
 
def set_vcs_config(config):
 
    """
 
    Patch VCS config with some Kallithea specific stuff
 

	
 
    :param config: kallithea.CONFIG
 
    """
 
    from kallithea.lib.vcs import conf
 
    from kallithea.lib.utils2 import aslist
 
    conf.settings.BACKENDS = {
 
        'hg': 'kallithea.lib.vcs.backends.hg.MercurialRepository',
 
        'git': 'kallithea.lib.vcs.backends.git.GitRepository',
 
    }
 

	
 
    conf.settings.GIT_EXECUTABLE_PATH = config.get('git_path', 'git')
 
    conf.settings.GIT_REV_FILTER = config.get('git_rev_filter', '--all').strip()
 
    conf.settings.DEFAULT_ENCODINGS = aslist(config.get('default_encoding',
 
                                                        'utf8'), sep=',')
 

	
 

	
 
def set_indexer_config(config):
 
    """
 
    Update Whoosh index mapping
 

	
 
    :param config: kallithea.CONFIG
 
    """
 
    from kallithea.config import conf
 

	
 
    log.debug('adding extra into INDEX_EXTENSIONS')
 
    conf.INDEX_EXTENSIONS.extend(re.split('\s+', config.get('index.extensions', '')))
 

	
 
    log.debug('adding extra into INDEX_FILENAMES')
 
    conf.INDEX_FILENAMES.extend(re.split('\s+', config.get('index.filenames', '')))
 

	
 

	
 
def map_groups(path):
 
    """
 
    Given a full path to a repository, create all nested groups that this
 
    repo is inside. This function creates parent-child relationships between
 
    groups and creates default perms for all new groups.
 

	
 
    :param path: full path to repository
 
    """
 
    sa = meta.Session()
 
    groups = path.split(Repository.url_sep())
 
    parent = None
 
    group = None
 

	
 
    # last element is repo in nested groups structure
 
    groups = groups[:-1]
 
    rgm = RepoGroupModel()
 
    owner = User.get_first_admin()
 
    for lvl, group_name in enumerate(groups):
 
        group_name = u'/'.join(groups[:lvl] + [group_name])
 
        group = RepoGroup.get_by_group_name(group_name)
 
        desc = '%s group' % group_name
 

	
 
        # skip folders that are now removed repos
 
        if REMOVED_REPO_PAT.match(group_name):
 
            break
 

	
 
        if group is None:
 
            log.debug('creating group level: %s group_name: %s',
 
                      lvl, group_name)
 
            group = RepoGroup(group_name, parent)
 
            group.group_description = desc
 
            group.owner = owner
 
            sa.add(group)
 
            rgm._create_default_perms(group)
 
            sa.flush()
 

	
 
        parent = group
 
    return group
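
A hedged sketch of what map_groups does for a nested path (the path is invented; assumes an admin user exists and a database session is active):

    from kallithea.lib.utils import map_groups
    from kallithea.model import meta

    # for 'customers/acme/website' the groups 'customers' and
    # 'customers/acme' are created if missing; the trailing 'website'
    # element is the repository itself and is not turned into a group
    group = map_groups('customers/acme/website')
    meta.Session().commit()  # map_groups only flushes; committing is up to the caller
    print(group.group_name if group is not None else 'top-level repo, no group')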
 

	
 

	
 
def repo2db_mapper(initial_repo_list, remove_obsolete=False,
 
                   install_git_hooks=False, user=None, overwrite_git_hooks=False):
 
    """
 
    Maps all repos given in initial_repo_list; non-existing repositories
 
    are created. If remove_obsolete is True, it also checks for db entries
 
    that are not in initial_repo_list and removes them.
 

	
 
    :param initial_repo_list: list of repositories found by scanning methods
 
    :param remove_obsolete: check for obsolete entries in database
 
    :param install_git_hooks: if this is True, also check and install git hook
 
        for a repo if missing
 
    :param overwrite_git_hooks: if this is True, overwrite any existing git hooks
 
        that may be encountered (even if user-deployed)
 
    """
 
    from kallithea.model.repo import RepoModel
 
    from kallithea.model.scm import ScmModel
 
    sa = meta.Session()
 
    repo_model = RepoModel()
 
    if user is None:
 
        user = User.get_first_admin()
 
    added = []
 

	
 
    ##creation defaults
 
    defs = Setting.get_default_repo_settings(strip_prefix=True)
 
    enable_statistics = defs.get('repo_enable_statistics')
 
    enable_locking = defs.get('repo_enable_locking')
 
    enable_downloads = defs.get('repo_enable_downloads')
 
    private = defs.get('repo_private')
 

	
 
    for name, repo in initial_repo_list.items():
 
        group = map_groups(name)
 
        unicode_name = safe_unicode(name)
 
        db_repo = repo_model.get_by_repo_name(unicode_name)
 
        # found a repo that is on the filesystem but not in the Kallithea database
 
        if not db_repo:
 
            log.info('repository %s not found, creating now', name)
 
            added.append(name)
 
            desc = (repo.description
 
                    if repo.description != 'unknown'
 
                    else '%s repository' % name)
 

	
 
            new_repo = repo_model._create_repo(
 
                repo_name=name,
 
                repo_type=repo.alias,
 
                description=desc,
 
                repo_group=getattr(group, 'group_id', None),
 
                owner=user,
 
                enable_locking=enable_locking,
 
                enable_downloads=enable_downloads,
 
                enable_statistics=enable_statistics,
 
                private=private,
 
                state=Repository.STATE_CREATED
 
            )
 
            sa.commit()
 
            # we just added that repo, so make sure it has the git hook
 
            # installed and update the server info
 
            if new_repo.repo_type == 'git':
 
                git_repo = new_repo.scm_instance
 
                ScmModel().install_git_hooks(git_repo)
 
                # update repository server-info
 
                log.debug('Running update server info')
 
                git_repo._update_server_info()
 
            new_repo.update_changeset_cache()
 
        elif install_git_hooks:
 
            if db_repo.repo_type == 'git':
 
                ScmModel().install_git_hooks(db_repo.scm_instance, force_create=overwrite_git_hooks)
 

	
 
    removed = []
 
    # remove from database those repositories that are not in the filesystem
 
    unicode_initial_repo_list = set(safe_unicode(name) for name in initial_repo_list)
 
    for repo in sa.query(Repository).all():
 
        if repo.repo_name not in unicode_initial_repo_list:
 
            if remove_obsolete:
 
                log.debug("Removing non-existing repository found in db `%s`",
 
                          repo.repo_name)
 
                try:
 
                    RepoModel().delete(repo, forks='detach', fs_remove=False)
 
                    sa.commit()
 
                except Exception:
 
                    # don't block further removals on error
 
                    log.error(traceback.format_exc())
 
                    sa.rollback()
 
            removed.append(repo.repo_name)
 
    return added, removed
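
For illustration, a sketch of how repo2db_mapper is typically driven (assuming ScmModel().repo_scan() is used to build the name-to-repository mapping, as in the rescan code elsewhere in Kallithea):

    from kallithea.lib.utils import repo2db_mapper
    from kallithea.model.scm import ScmModel

    # repo_scan() returns a dict of repo name -> vcs repository object,
    # which is the shape repo2db_mapper() expects
    repos = ScmModel().repo_scan()
    added, removed = repo2db_mapper(repos, remove_obsolete=False,
                                    install_git_hooks=True)
    print('added: %s, removed: %s' % (added, removed))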
 

	
 

	
 
def load_rcextensions(root_path):
 
    import kallithea
 
    from kallithea.config import conf
 

	
 
    path = os.path.join(root_path, 'rcextensions', '__init__.py')
 
    if os.path.isfile(path):
 
        rcext = create_module('rc', path)
 
        EXT = kallithea.EXTENSIONS = rcext
 
        log.debug('Found rcextensions now loading %s...', rcext)
 

	
 
        # Additional mappings that are not present in the pygments lexers
 
        conf.LANGUAGES_EXTENSIONS_MAP.update(getattr(EXT, 'EXTRA_MAPPINGS', {}))
 

	
 
        #OVERRIDE OUR EXTENSIONS FROM RC-EXTENSIONS (if present)
 

	
 
        if getattr(EXT, 'INDEX_EXTENSIONS', []):
 
            log.debug('setting custom INDEX_EXTENSIONS')
 
            conf.INDEX_EXTENSIONS = getattr(EXT, 'INDEX_EXTENSIONS', [])
 

	
 
        #ADDITIONAL MAPPINGS
 
        log.debug('adding extra into INDEX_EXTENSIONS')
 
        conf.INDEX_EXTENSIONS.extend(getattr(EXT, 'EXTRA_INDEX_EXTENSIONS', []))
 

	
 
        # auto check if the module is not missing any data, set to default if is
 
        # this will help autoupdate new feature of rcext module
 
        #from kallithea.config import rcextensions
 
        #for k in dir(rcextensions):
 
        #    if not k.startswith('_') and not hasattr(EXT, k):
 
        #        setattr(EXT, k, getattr(rcextensions, k))
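
A hedged sketch of what an rcextensions/__init__.py picked up by the loader above might contain (all values are invented examples):

    # <root_path>/rcextensions/__init__.py -- loaded via create_module('rc', path)

    # merged into conf.LANGUAGES_EXTENSIONS_MAP; the key/value shape must
    # match what kallithea.config.conf uses
    EXTRA_MAPPINGS = {}

    # if non-empty, replaces conf.INDEX_EXTENSIONS entirely
    INDEX_EXTENSIONS = []

    # always appended to conf.INDEX_EXTENSIONS
    EXTRA_INDEX_EXTENSIONS = ['cfg', 'toml']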
 

	
 

	
 
#==============================================================================
 
# MISC
 
#==============================================================================
 

	
 
def check_git_version():
 
    """
 
    Checks which version of git is installed on the system, and issues a warning
 
    if it's too old for Kallithea to work properly.
 
    """
 
    from kallithea import BACKENDS
 
    from kallithea.lib.vcs.backends.git.repository import GitRepository
 
    from kallithea.lib.vcs.conf import settings
 
    from distutils.version import StrictVersion
 

	
 
    if 'git' not in BACKENDS:
 
        return None
 

	
 
    stdout, stderr = GitRepository._run_git_command(['--version'], _bare=True,
 
                                                    _safe=True)
 

	
 
    m = re.search(r"\d+\.\d+\.\d+", stdout)
 
    if m:
 
        ver = StrictVersion(m.group(0))
 
    else:
 
        ver = StrictVersion('0.0.0')
 

	
 
    req_ver = StrictVersion('1.7.4')
 

	
 
    log.debug('Git executable: "%s" version %s detected: %s',
 
              settings.GIT_EXECUTABLE_PATH, ver, stdout)
 
    if stderr:
 
        log.warning('Error detecting git version: %r', stderr)
 
    elif ver < req_ver:
 
        log.warning('Kallithea detected git version %s, which is too old '
 
                    'for the system to function properly. '
 
                    'Please upgrade to version %s or later.' % (ver, req_ver))
 
    return ver
 

	
 

	
 
#===============================================================================
 
# CACHE RELATED METHODS
 
#===============================================================================
 

	
 
# set cache regions for beaker so celery can utilise it
 
def setup_cache_regions(settings):
 
    # Create dict with just beaker cache configs with prefix stripped
 
    cache_settings = {'regions': None}
 
    prefix = 'beaker.cache.'
 
    for key in settings:
 
        if key.startswith(prefix):
 
            name = key[len(prefix):]
 
            cache_settings[name] = settings[key]
 
    # Find all regions, apply defaults, and apply to beaker
 
    if cache_settings['regions']:
 
        for region in cache_settings['regions'].split(','):
 
            region = region.strip()
 
            prefix = region + '.'
 
            region_settings = {}
 
            for key in cache_settings:
 
                if key.startswith(prefix):
 
                    name = key[len(prefix):]
 
                    region_settings[name] = cache_settings[key]
 
            region_settings.setdefault('expire',
 
                                       cache_settings.get('expire', '60'))
 
            region_settings.setdefault('lock_dir',
 
                                       cache_settings.get('lock_dir'))
 
            region_settings.setdefault('data_dir',
 
                                       cache_settings.get('data_dir'))
 
            region_settings.setdefault('type',
 
                                       cache_settings.get('type', 'memory'))
 
            beaker.cache.cache_regions[region] = region_settings
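
A minimal sketch of the kind of settings dict setup_cache_regions() consumes (the region names and values are illustrative; the keys mirror the beaker.cache.* options usually set in the .ini file):

    import beaker.cache
    from kallithea.lib.utils import setup_cache_regions

    settings = {
        'beaker.cache.regions': 'short_term, long_term',
        'beaker.cache.short_term.type': 'memory',
        'beaker.cache.short_term.expire': '60',
        'beaker.cache.long_term.type': 'memory',
        'beaker.cache.long_term.expire': '36000',
    }
    setup_cache_regions(settings)
    print(beaker.cache.cache_regions['short_term'])  # e.g. {'type': 'memory', 'expire': '60', ...}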
 

	
 

	
 
def conditional_cache(region, prefix, condition, func):
 
    """
 

	
 
    Conditional caching function, use like::
 
        def _c(arg):
 
            #heavy computation function
 
            return data
 

	
 
        # depending on condition, the compute function is wrapped in a cache or not
 
        compute = conditional_cache('short_term', 'cache_desc', condition=True, func=func)
 
        return compute(arg)
 

	
 
    :param region: name of cache region
 
    :param prefix: cache region prefix
 
    :param condition: whether caching should be applied and the cached data returned
 
    :param func: wrapped heavy function to compute
 

	
 
    """
 
    wrapped = func
 
    if condition:
 
        log.debug('conditional_cache: True, wrapping call of '
 
                  'func: %s into %s region cache', func, region)
 
        wrapped = _cache_decorate((prefix,), None, None, region)(func)
 

	
 
    return wrapped

Changeset was too big and was cut off...
