......@@ -114,7 +114,7 @@ black:
before_script:
- pip install black
script:
- black --exclude "/(\.git|\.hg|\.mypy_cache|\.tox|\.venv|_build|buck-out|build|dist|migrations)/" --check --diff api/
- black --check --diff api/
flake8:
image: python:3.6
......
......@@ -10,6 +10,57 @@ This changelog is viewable on the web at https://docs.funkwhale.audio/changelog.
.. towncrier
0.18.3 (2019-03-21)
-------------------
Upgrade instructions are available at
https://docs.funkwhale.audio/index.html
Avoid mixed content when deploying mono-container behind proxy [Manual action required]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
*This only concerns you if you use the mono-container Docker deployment behind a reverse proxy.*
Because of `an issue in our mono-container configuration <https://github.com/thetarkus/docker-funkwhale/issues/19>`_, users deploying Funkwhale via Docker
using our ``funkwhale/all-in-one`` image could face mixed content warnings (and possibly other issues)
when browsing the Web UI.
This is fixed in this release, but on existing deployments, you'll need to add ``NESTED_PROXY=1`` in your container
environment (either in your ``.env`` file, or via your container management tool), then recreate your funkwhale container.
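
For instance, if you manage the container environment through an ``.env`` file, this amounts to adding a single line::

    NESTED_PROXY=1
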
Enhancements:
- Added title on hover for truncated content (#766)
- Ask for confirmation before leaving the upload page if there is an upload in progress (#630)
- Exclude in-place imported files from quota computation (#570)
- Truncate filename in library file table to ensure correct display of the table. (#735)
Bugfixes:
- Avoid mixed content when deploying mono-container behind HTTPS proxy (thetarkus/docker-funkwhale#19)
- Display new notifications immediately on the notifications page (#729)
- Ensure cover art from uploaded files is picked up properly on existing albums (#757)
- Fixed a crash when federating a track with unspecified position
- Fixed broken Activity and Actor modules in django admin (#767)
- Fixed broken sample apache configuration (#764)
- Fixed constant and unpredictable reordering during file upload (#716)
- Fixed delivery of local activities causing unintended side effects, such as rolling back changes (#737)
- Fixed escaping issues in translated strings (#652)
- Fixed saving moderation policy when clicking on "Cancel" (#751)
- i18n: Update page title when changing the App's language. (#511)
- Include disc number in Subsonic responses (#765)
- Do not send a notification when rejecting a follow on a local library (#743)
Documentation:
- Added documentation on mono-container docker upgrade (#713)
- Added documentation to set up a Let's Encrypt certificate (#745)
0.18.2 (2019-02-13)
-------------------
......
......@@ -358,6 +358,7 @@ Internationalization
--------------------
We're using https://github.com/Polyconseil/vue-gettext to manage i18n in the project.
When working on the front-end, any end-user string should be marked as a translatable string,
with the proper context, as described below.
......
......@@ -76,7 +76,8 @@ v1_patterns += [
include(("funkwhale_api.users.api_urls", "users"), namespace="users"),
),
url(
r"^auth/", include(("funkwhale_api.users.auth_urls", "auth"), namespace="auth")
r"^oauth/",
include(("funkwhale_api.users.oauth.urls", "oauth"), namespace="oauth"),
),
url(r"^token/$", jwt_views.obtain_jwt_token, name="token"),
url(r"^token/refresh/$", jwt_views.refresh_jwt_token, name="token_refresh"),
......
from channels.routing import ProtocolTypeRouter, URLRouter
from channels.sessions import SessionMiddlewareStack
from django.conf.urls import url
from funkwhale_api.common.auth import TokenAuthMiddleware
......@@ -8,12 +7,8 @@ from funkwhale_api.instance import consumers
application = ProtocolTypeRouter(
{
# Empty for now (http->django views is added by default)
"websocket": SessionMiddlewareStack(
TokenAuthMiddleware(
URLRouter(
[url("^api/v1/activity$", consumers.InstanceActivityConsumer)]
)
)
"websocket": TokenAuthMiddleware(
URLRouter([url("^api/v1/activity$", consumers.InstanceActivityConsumer)])
)
}
)
......@@ -121,6 +121,7 @@ THIRD_PARTY_APPS = (
"allauth.account", # registration
"allauth.socialaccount", # registration
"corsheaders",
"oauth2_provider",
"rest_framework",
"rest_framework.authtoken",
"taggit",
......@@ -152,6 +153,7 @@ LOCAL_APPS = (
"funkwhale_api.common.apps.CommonConfig",
"funkwhale_api.activity.apps.ActivityConfig",
"funkwhale_api.users", # custom users app
"funkwhale_api.users.oauth",
# Your stuff: custom apps go here
"funkwhale_api.instance",
"funkwhale_api.music",
......@@ -183,10 +185,6 @@ MIDDLEWARE = (
"funkwhale_api.users.middleware.RecordActivityMiddleware",
)
# MIGRATIONS CONFIGURATION
# ------------------------------------------------------------------------------
MIGRATION_MODULES = {"sites": "funkwhale_api.contrib.sites.migrations"}
# DEBUG
# ------------------------------------------------------------------------------
# See: https://docs.djangoproject.com/en/dev/ref/settings/#debug
......@@ -222,6 +220,16 @@ DATABASES = {
"default": env.db("DATABASE_URL")
}
DATABASES["default"]["ATOMIC_REQUESTS"] = True
DATABASES["default"]["CONN_MAX_AGE"] = env("DB_CONN_MAX_AGE", default=60 * 60)
MIGRATION_MODULES = {
# see https://github.com/jazzband/django-oauth-toolkit/issues/634
# swappable models are badly designed in oauth2_provider
# so we ignore its migrations and provide our own models.
"oauth2_provider": None,
"sites": "funkwhale_api.contrib.sites.migrations",
}
#
# DATABASES = {
# 'default': {
......@@ -298,6 +306,19 @@ STATIC_ROOT = env("STATIC_ROOT", default=str(ROOT_DIR("staticfiles")))
STATIC_URL = env("STATIC_URL", default="/staticfiles/")
DEFAULT_FILE_STORAGE = "funkwhale_api.common.storage.ASCIIFileSystemStorage"
AWS_DEFAULT_ACL = None
AWS_QUERYSTRING_AUTH = False
AWS_ACCESS_KEY_ID = env("AWS_ACCESS_KEY_ID", default=None)
if AWS_ACCESS_KEY_ID:
AWS_ACCESS_KEY_ID = AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY = env("AWS_SECRET_ACCESS_KEY")
AWS_STORAGE_BUCKET_NAME = env("AWS_STORAGE_BUCKET_NAME")
AWS_S3_ENDPOINT_URL = env("AWS_S3_ENDPOINT_URL", default=None)
AWS_LOCATION = env("AWS_LOCATION", default="")
DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
# See: https://docs.djangoproject.com/en/dev/ref/contrib/staticfiles/#std:setting-STATICFILES_DIRS
STATICFILES_DIRS = (str(APPS_DIR.path("static")),)
......@@ -330,7 +351,7 @@ AUTHENTICATION_BACKENDS = (
"funkwhale_api.users.auth_backends.ModelBackend",
"allauth.account.auth_backends.AuthenticationBackend",
)
SESSION_COOKIE_HTTPONLY = True
SESSION_COOKIE_HTTPONLY = False
# Some really nice defaults
ACCOUNT_AUTHENTICATION_METHOD = "username_email"
ACCOUNT_EMAIL_REQUIRED = True
......@@ -343,6 +364,23 @@ AUTH_USER_MODEL = "users.User"
LOGIN_REDIRECT_URL = "users:redirect"
LOGIN_URL = "account_login"
# OAuth configuration
from funkwhale_api.users.oauth import scopes # noqa
OAUTH2_PROVIDER = {
"SCOPES": {s.id: s.label for s in scopes.SCOPES_BY_ID.values()},
"ALLOWED_REDIRECT_URI_SCHEMES": ["http", "https", "urn"],
# we keep expired tokens for 15 days, for traceability
"REFRESH_TOKEN_EXPIRE_SECONDS": 3600 * 24 * 15,
"AUTHORIZATION_CODE_EXPIRE_SECONDS": 5 * 60,
"ACCESS_TOKEN_EXPIRE_SECONDS": 60 * 60 * 10,
"OAUTH2_SERVER_CLASS": "funkwhale_api.users.oauth.server.OAuth2Server",
}
OAUTH2_PROVIDER_APPLICATION_MODEL = "users.Application"
OAUTH2_PROVIDER_ACCESS_TOKEN_MODEL = "users.AccessToken"
OAUTH2_PROVIDER_GRANT_MODEL = "users.Grant"
OAUTH2_PROVIDER_REFRESH_TOKEN_MODEL = "users.RefreshToken"
# LDAP AUTHENTICATION CONFIGURATION
# ------------------------------------------------------------------------------
AUTH_LDAP_ENABLED = env.bool("LDAP_ENABLED", default=False)
......@@ -450,16 +488,28 @@ CELERY_TASK_TIME_LIMIT = 300
CELERY_BEAT_SCHEDULE = {
"federation.clean_music_cache": {
"task": "federation.clean_music_cache",
"schedule": crontab(hour="*/2"),
"schedule": crontab(minute="0", hour="*/2"),
"options": {"expires": 60 * 2},
},
"music.clean_transcoding_cache": {
"task": "music.clean_transcoding_cache",
"schedule": crontab(hour="*"),
"schedule": crontab(minute="0", hour="*"),
"options": {"expires": 60 * 2},
},
"oauth.clear_expired_tokens": {
"task": "oauth.clear_expired_tokens",
"schedule": crontab(minute="0", hour="0"),
"options": {"expires": 60 * 60 * 24},
},
"federation.refresh_nodeinfo_known_nodes": {
"task": "federation.refresh_nodeinfo_known_nodes",
"schedule": crontab(minute="0", hour="*"),
"options": {"expires": 60 * 60},
},
}
NODEINFO_REFRESH_DELAY = env.int("NODEINFO_REFRESH_DELAY", default=3600 * 24)
JWT_AUTH = {
"JWT_ALLOW_REFRESH": True,
"JWT_EXPIRATION_DELTA": datetime.timedelta(days=7),
......@@ -477,7 +527,6 @@ CORS_ORIGIN_ALLOW_ALL = True
CORS_ALLOW_CREDENTIALS = True
REST_FRAMEWORK = {
"DEFAULT_PERMISSION_CLASSES": ("rest_framework.permissions.IsAuthenticated",),
"DEFAULT_PAGINATION_CLASS": "funkwhale_api.common.pagination.FunkwhalePagination",
"PAGE_SIZE": 25,
"DEFAULT_PARSER_CLASSES": (
......@@ -487,11 +536,15 @@ REST_FRAMEWORK = {
"funkwhale_api.federation.parsers.ActivityParser",
),
"DEFAULT_AUTHENTICATION_CLASSES": (
"oauth2_provider.contrib.rest_framework.OAuth2Authentication",
"funkwhale_api.common.authentication.JSONWebTokenAuthenticationQS",
"funkwhale_api.common.authentication.BearerTokenHeaderAuth",
"funkwhale_api.common.authentication.JSONWebTokenAuthentication",
"rest_framework.authentication.SessionAuthentication",
"rest_framework.authentication.BasicAuthentication",
"rest_framework.authentication.SessionAuthentication",
),
"DEFAULT_PERMISSION_CLASSES": (
"funkwhale_api.users.oauth.permissions.ScopePermission",
),
"DEFAULT_FILTER_BACKENDS": (
"rest_framework.filters.OrderingFilter",
......@@ -537,7 +590,7 @@ MUSICBRAINZ_HOSTNAME = env("MUSICBRAINZ_HOSTNAME", default="musicbrainz.org")
# Custom Admin URL, use {% url 'admin:index' %}
ADMIN_URL = env("DJANGO_ADMIN_URL", default="^api/admin/")
CSRF_USE_SESSIONS = False
CSRF_USE_SESSIONS = True
SESSION_ENGINE = "django.contrib.sessions.backends.cache"
# Playlist settings
......
# -*- coding: utf-8 -*-
__version__ = "0.18.2"
__version__ = "0.19.0-rc1"
__version_info__ = tuple(
[
int(num) if num.isdigit() else num
......
from asgiref.sync import async_to_sync
from channels.generic.websocket import JsonWebsocketConsumer
from channels import auth
from funkwhale_api.common import channels
class JsonAuthConsumer(JsonWebsocketConsumer):
def connect(self):
if "user" not in self.scope:
try:
self.scope["user"] = async_to_sync(auth.get_user)(self.scope)
except (ValueError, AssertionError, AttributeError, KeyError):
return self.close()
if self.scope["user"] and self.scope["user"].is_authenticated:
return self.accept()
else:
try:
assert self.scope["user"].pk is not None
except (AssertionError, AttributeError, KeyError):
return self.close()
return self.accept()
def accept(self):
super().accept()
for group in self.groups:
......
......@@ -87,4 +87,6 @@ def mutations_route(types):
)
return response.Response(serializer.data, status=status.HTTP_201_CREATED)
return decorators.action(methods=["get", "post"], detail=True)(mutations)
return decorators.action(
methods=["get", "post"], detail=True, required_scope="edits"
)(mutations)
......@@ -49,6 +49,6 @@ class SmartSearchFilter(django_filters.CharFilter):
return qs
try:
cleaned = self.config.clean(value)
except forms.ValidationError:
except (forms.ValidationError):
return qs.none()
return search.apply(qs, cleaned)
......@@ -104,6 +104,31 @@ class MultipleQueryFilter(filters.TypedMultipleChoiceFilter):
self.lookup_expr = "in"
def filter_target(value):
config = {
"artist": ["artist", "target_id", int],
"album": ["album", "target_id", int],
"track": ["track", "target_id", int],
}
parts = value.lower().split(" ")
if parts[0].strip() not in config:
raise forms.ValidationError("Improper target")
conf = config[parts[0].strip()]
query = Q(target_content_type__model=conf[0])
if len(parts) > 1:
_, lookup_field, validator = conf
try:
lookup_value = validator(parts[1].strip())
except (TypeError, ValueError):
raise forms.ValidationError("Imparsable target id")
return query & Q(**{lookup_field: lookup_value})
return query
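# Illustration (values are hypothetical): the target filter expects a value of
# the form "<type> [<id>]", e.g.:
#
#   filter_target("artist 42")
#   # -> Q(target_content_type__model="artist") & Q(target_id=42)
#   filter_target("playlist 42")
#   # -> raises forms.ValidationError("Improper target")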
class MutationFilter(filters.FilterSet):
is_approved = NullBooleanFilter("is_approved")
q = fields.SmartSearchFilter(
......@@ -116,6 +141,7 @@ class MutationFilter(filters.FilterSet):
filter_fields={
"domain": {"to": "created_by__domain__name__iexact"},
"is_approved": get_null_boolean_filter("is_approved"),
"target": {"handler": filter_target},
"is_applied": {"to": "is_applied"},
},
)
......
......@@ -4,13 +4,63 @@ from django.contrib.postgres.fields import JSONField
from django.contrib.contenttypes.fields import GenericForeignKey
from django.contrib.contenttypes.models import ContentType
from django.conf import settings
from django.db import models, transaction
from django.core.serializers.json import DjangoJSONEncoder
from django.db import connections, models, transaction
from django.db.models import Lookup
from django.db.models.fields import Field
from django.db.models.sql.compiler import SQLCompiler
from django.utils import timezone
from django.urls import reverse
from funkwhale_api.federation import utils as federation_utils
@Field.register_lookup
class NotEqual(Lookup):
lookup_name = "ne"
def as_sql(self, compiler, connection):
lhs, lhs_params = self.process_lhs(compiler, connection)
rhs, rhs_params = self.process_rhs(compiler, connection)
params = lhs_params + rhs_params
return "%s <> %s" % (lhs, rhs), params
class NullsLastSQLCompiler(SQLCompiler):
def get_order_by(self):
result = super().get_order_by()
if result and self.connection.vendor == "postgresql":
return [
(
expr,
(
sql + " NULLS LAST" if not sql.endswith(" NULLS LAST") else sql,
params,
is_ref,
),
)
for (expr, (sql, params, is_ref)) in result
]
return result
class NullsLastQuery(models.sql.query.Query):
"""Use a custom compiler to inject 'NULLS LAST' (for PostgreSQL)."""
def get_compiler(self, using=None, connection=None):
if using is None and connection is None:
raise ValueError("Need either using or connection")
if using:
connection = connections[using]
return NullsLastSQLCompiler(self, connection, using)
class NullsLastQuerySet(models.QuerySet):
def __init__(self, model=None, query=None, using=None, hints=None):
super().__init__(model, query, using, hints)
self.query = query or NullsLastQuery(self.model)
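# Usage sketch (hypothetical model, not part of this module): attach the
# queryset as a manager to get NULLS LAST ordering on PostgreSQL, e.g.
#
#   class Album(models.Model):
#       release_date = models.DateField(null=True)
#       objects = NullsLastQuerySet.as_manager()
#
#   Album.objects.order_by("-release_date")
#   # ORDER BY release_date DESC NULLS LAST, so undated albums sort last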
class LocalFromFidQuerySet:
def local(self, include=True):
host = settings.FEDERATION_HOSTNAME
......@@ -57,8 +107,8 @@ class Mutation(models.Model):
applied_date = models.DateTimeField(null=True, blank=True, db_index=True)
summary = models.TextField(max_length=2000, null=True, blank=True)
payload = JSONField()
previous_state = JSONField(null=True, default=None)
payload = JSONField(encoder=DjangoJSONEncoder)
previous_state = JSONField(null=True, default=None, encoder=DjangoJSONEncoder)
target_id = models.IntegerField(null=True)
target_content_type = models.ForeignKey(
......
......@@ -2,7 +2,7 @@ import persisting_theory
from rest_framework import serializers
from django.db import models
from django.db import models, transaction
class ConfNotFound(KeyError):
......@@ -23,6 +23,7 @@ class Registry(persisting_theory.Registry):
return decorator
@transaction.atomic
def apply(self, type, obj, payload):
conf = self.get_conf(type, obj)
serializer = conf["serializer_class"](obj, data=payload)
......@@ -73,6 +74,9 @@ class MutationSerializer(serializers.Serializer):
def apply(self, obj, validated_data):
raise NotImplementedError()
def post_apply(self, obj, validated_data):
pass
def get_previous_state(self, obj, validated_data):
return
......@@ -88,8 +92,11 @@ class UpdateMutationSerializer(serializers.ModelSerializer, MutationSerializer):
kwargs.setdefault("partial", True)
super().__init__(*args, **kwargs)
@transaction.atomic
def apply(self, obj, validated_data):
return self.update(obj, validated_data)
r = self.update(obj, validated_data)
self.post_apply(r, validated_data)
return r
def validate(self, validated_data):
if not validated_data:
......@@ -114,7 +121,14 @@ class UpdateMutationSerializer(serializers.ModelSerializer, MutationSerializer):
# to ensure we store ids instead of model instances in our json
# payload
for field, attr in self.serialized_relations.items():
data[field] = getattr(data[field], attr)
try:
obj = data[field]
except KeyError:
continue
if obj is None:
data[field] = None
else:
data[field] = getattr(obj, attr)
return data
def create(self, validated_data):
......
from rest_framework.pagination import PageNumberPagination
from rest_framework.pagination import PageNumberPagination, _positive_int
class FunkwhalePagination(PageNumberPagination):
page_size_query_param = "page_size"
max_page_size = 50
default_max_page_size = 50
default_page_size = None
view = None
def paginate_queryset(self, queryset, request, view=None):
self.view = view
return super().paginate_queryset(queryset, request, view)
def get_page_size(self, request):
max_page_size = (
getattr(self.view, "max_page_size", 0) or self.default_max_page_size
)
page_size = getattr(self.view, "default_page_size", 0) or max_page_size
if self.page_size_query_param:
try:
return _positive_int(
request.query_params[self.page_size_query_param],
strict=True,
cutoff=max_page_size,
)
except (KeyError, ValueError):
pass
return page_size
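# Usage sketch (attribute names match the getattr() calls above; the view class
# is hypothetical): a view using this paginator can tune its limits with class
# attributes, e.g.
#
#   class TrackViewSet(viewsets.ModelViewSet):
#       default_page_size = 25   # used when no ?page_size= parameter is sent
#       max_page_size = 100      # ?page_size=500 is silently capped to 100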
......@@ -47,6 +47,6 @@ class OwnerPermission(BasePermission):
owner_field = getattr(view, "owner_field", "user")
owner = operator.attrgetter(owner_field)(obj)
if owner != request.user:
if not owner or not request.user.is_authenticated or owner != request.user:
raise Http404
return True
......@@ -65,6 +65,9 @@ def apply(qs, config_data):
q = config_data.get(k)
if q:
qs = qs.filter(q)
distinct = config_data.get("distinct", False)
if distinct:
qs = qs.distinct()
return qs
......@@ -77,13 +80,28 @@ class SearchConfig:
def clean(self, query):
tokens = parse_query(query)
cleaned_data = {}
cleaned_data["types"] = self.clean_types(filter_tokens(tokens, ["is"]))
cleaned_data["search_query"] = self.clean_search_query(
filter_tokens(tokens, [None, "in"])
filter_tokens(tokens, [None, "in"] + list(self.search_fields.keys()))
)
unhandled_tokens = [
t
for t in tokens
if t["key"] not in [None, "is", "in"] + list(self.search_fields.keys())
]
cleaned_data["filter_query"], matching_filters = self.clean_filter_query(
unhandled_tokens
)
unhandled_tokens = [t for t in tokens if t["key"] not in [None, "is", "in"]]
cleaned_data["filter_query"] = self.clean_filter_query(unhandled_tokens)
if matching_filters:
cleaned_data["distinct"] = any(
[
self.filter_fields[k].get("distinct", False)
for k in matching_filters
if k in self.filter_fields
]
)
else:
cleaned_data["distinct"] = False
return cleaned_data
def clean_search_query(self, tokens):
......@@ -95,12 +113,37 @@ class SearchConfig:
} or set(self.search_fields.keys())
fields_subset = set(self.search_fields.keys()) & fields_subset
to_fields = [self.search_fields[k]["to"] for k in fields_subset]
specific_field_query = None
for token in tokens:
if token["key"] not in self.search_fields:
continue
to = self.search_fields[token["key"]]["to"]
try:
field = token["field"]
value = field.clean(token["value"])
except KeyError:
# no cleaning to apply
value = token["value"]
q = Q(**{"{}__icontains".format(to): value})
if not specific_field_query:
specific_field_query = q
else:
specific_field_query &= q
query_string = " ".join([t["value"] for t in filter_tokens(tokens, [None])])
return get_query(query_string, sorted(to_fields))
unhandled_tokens_query = get_query(query_string, sorted(to_fields))
if specific_field_query and unhandled_tokens_query:
return unhandled_tokens_query & specific_field_query
elif specific_field_query:
return specific_field_query
elif unhandled_tokens_query:
return unhandled_tokens_query
return None
def clean_filter_query(self, tokens):
if not self.filter_fields or not tokens:
return
return None, []
matching = [t for t in tokens if t["key"] in self.filter_fields]
queries = [self.get_filter_query(token) for token in matching]
......@@ -110,7 +153,7 @@ class SearchConfig:
query = q
else:
query = query & q
return query
return query, [m["key"] for m in matching]
def get_filter_query(self, token):
raw_value = token["value"]
......
......@@ -201,3 +201,30 @@ def concat_dicts(*dicts):
n.update(d)
return n
def get_updated_fields(conf, data, obj):
"""
Given a list of fields, a dict and an object, return the dict keys/values
that differ from the corresponding fields on the object.
"""
final_conf = []
for c in conf:
if isinstance(c, str):
final_conf.append((c, c))
else:
final_conf.append(c)
final_data = {}
for data_field, obj_field in final_conf:
try:
data_value = data[data_field]
except KeyError:
continue
obj_value = getattr(obj, obj_field)
if obj_value != data_value:
final_data[obj_field] = data_value
return final_data
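# Illustration (hypothetical values): entries in ``conf`` may be plain field
# names or (data_field, obj_field) pairs, and only changed values are returned:
#
#   obj = types.SimpleNamespace(name="Nirvana", mbid=None)
#   get_updated_fields(["name", ("mbid", "mbid")], {"name": "Nirvana", "mbid": "abc"}, obj)
#   # -> {"mbid": "abc"}   ("name" is unchanged, so it is omitted)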
......@@ -36,6 +36,7 @@ class MutationViewSet(
lookup_field = "uuid"
queryset = (
models.Mutation.objects.all()
.exclude(target_id=None)
.order_by("-creation_date")
.select_related("created_by", "approved_by")
.prefetch_related("target")
......
......@@ -2,6 +2,8 @@ import uuid
import factory
import persisting_theory
from django.conf import settings
from faker.providers import internet as internet_provider
......@@ -50,11 +52,11 @@ class FunkwhaleProvider(internet_provider.Provider):
not random enough
"""
def federation_url(self, prefix=""):
def federation_url(self, prefix="", local=False):
def path_generator():
return "{}/{}".format(prefix, uuid.uuid4())
domain = self.domain_name()
domain = settings.FEDERATION_HOSTNAME if local else self.domain_name()
protocol = "https"
path = path_generator()
return "{}://{}/{}".format(protocol, domain, path)
......
from rest_framework import mixins, status, viewsets
from rest_framework.decorators import action
from rest_framework.permissions import IsAuthenticatedOrReadOnly
from rest_framework.response import Response
from django.db.models import Prefetch
......@@ -9,6 +8,7 @@ from funkwhale_api.activity import record
from funkwhale_api.common import fields, permissions
from funkwhale_api.music.models import Track
from funkwhale_api.music import utils as music_utils
from funkwhale_api.users.oauth import permissions as oauth_permissions
from . import filters, models, serializers
......@@ -24,10 +24,11 @@ class TrackFavoriteViewSet(
serializer_class = serializers.UserTrackFavoriteSerializer
queryset = models.TrackFavorite.objects.all().select_related("user")
permission_classes = [
permissions.ConditionalAuthentication,
oauth_permissions.ScopePermission,
permissions.OwnerPermission,
IsAuthenticatedOrReadOnly,
]
required_scope = "favorites"
anonymous_policy = "setting"
owner_checks = ["write"]
def get_serializer_class(self):
......
......@@ -121,6 +121,7 @@ def receive(activity, on_behalf_of):
from . import models
from . import serializers
from . import tasks
from .routes import inbox
# we ensure the activity has the bare minimum structure before storing
# it in our database
......@@ -128,6 +129,10 @@ def receive(activity, on_behalf_of):
data=activity, context={"actor": on_behalf_of, "local_recipients": True}
)
serializer.is_valid(raise_exception=True)
if not inbox.get_matching_handlers(activity):
# discard unhandlable activity
return
if should_reject(
fid=serializer.validated_data.get("id"),
actor_id=serializer.validated_data["actor"].fid,
......@@ -360,30 +365,9 @@ class OutboxRouter(Router):
return activities
def recursive_getattr(obj, key, permissive=False):
"""
Given a dictionary such as {'user': {'name': 'Bob'}} and
a dotted string such as user.name, returns 'Bob'.
If the value is not present, returns None
"""
v = obj
for k in key.split("."):
try:
v = v.get(k)
except (TypeError, AttributeError):
if not permissive:
raise
return
if v is None:
return
return v
def match_route(route, payload):
for key, value in route.items():
payload_value = recursive_getattr(payload, key)
payload_value = recursive_getattr(payload, key, permissive=True)
if payload_value != value:
return False
......@@ -427,6 +411,27 @@ def prepare_deliveries_and_inbox_items(recipient_list, type):
remote_inbox_urls.add(actor.shared_inbox_url or actor.inbox_url)
urls.append(r["target"].followers_url)
elif isinstance(r, dict) and r["type"] == "instances_with_followers":
# we want to broadcast the activity to other instances' service actors
# when we have at least one follower from that instance
follows = (
models.LibraryFollow.objects.filter(approved=True)
.exclude(actor__domain_id=settings.FEDERATION_HOSTNAME)
.exclude(actor__domain=None)
.union(
models.Follow.objects.filter(approved=True)
.exclude(actor__domain_id=settings.FEDERATION_HOSTNAME)
.exclude(actor__domain=None)
)
)
actors = models.Actor.objects.filter(
managed_domains__name__in=follows.values_list(
"actor__domain_id", flat=True
)
)
values = actors.values("shared_inbox_url", "inbox_url")
for v in values:
remote_inbox_urls.add(v["shared_inbox_url"] or v["inbox_url"])
deliveries = [models.Delivery(inbox_url=url) for url in remote_inbox_urls]
inbox_items = [
models.InboxItem(actor=actor, type=type) for actor in local_recipients
......
......@@ -30,11 +30,19 @@ class DomainAdmin(admin.ModelAdmin):
search_fields = ["name"]
@admin.register(models.Fetch)
class FetchAdmin(admin.ModelAdmin):
list_display = ["url", "actor", "status", "creation_date", "fetch_date", "detail"]
search_fields = ["url", "actor__username"]
list_filter = ["status"]
list_select_related = True
@admin.register(models.Activity)
class ActivityAdmin(admin.ModelAdmin):
list_display = ["type", "fid", "url", "actor", "creation_date"]
search_fields = ["payload", "fid", "url", "actor__domain"]
list_filter = ["type", "actor__domain"]
search_fields = ["payload", "fid", "url", "actor__domain__name"]
list_filter = ["type", "actor__domain__name"]
actions = [redeliver_activities]
list_select_related = True
......@@ -49,7 +57,7 @@ class ActorAdmin(admin.ModelAdmin):
"creation_date",
"last_fetch_date",
]
search_fields = ["fid", "domain", "preferred_username"]
search_fields = ["fid", "domain__name", "preferred_username"]
list_filter = ["type"]
......
......@@ -144,3 +144,19 @@ class InboxItemActionSerializer(common_serializers.ActionSerializer):
def handle_read(self, objects):
return objects.update(is_read=True)
class FetchSerializer(serializers.ModelSerializer):
actor = federation_serializers.APIActorSerializer()
class Meta:
model = models.Fetch
fields = [
"id",
"url",
"actor",
"status",
"detail",
"creation_date",
"fetch_date",
]
......@@ -3,6 +3,7 @@ from rest_framework import routers
from . import api_views
router = routers.SimpleRouter()
router.register(r"fetches", api_views.FetchViewSet, "fetches")
router.register(r"follows/library", api_views.LibraryFollowViewSet, "library-follows")
router.register(r"inbox", api_views.InboxItemViewSet, "inbox")
router.register(r"libraries", api_views.LibraryViewSet, "libraries")
......
......@@ -10,6 +10,7 @@ from rest_framework import response
from rest_framework import viewsets
from funkwhale_api.music import models as music_models
from funkwhale_api.users.oauth import permissions as oauth_permissions
from . import activity
from . import api_serializers
......@@ -43,7 +44,8 @@ class LibraryFollowViewSet(
.select_related("actor", "target__actor")
)
serializer_class = api_serializers.LibraryFollowSerializer
permission_classes = [permissions.IsAuthenticated]
permission_classes = [oauth_permissions.ScopePermission]
required_scope = "follows"
filterset_class = filters.LibraryFollowFilter
ordering_fields = ("creation_date",)
......@@ -100,7 +102,8 @@ class LibraryViewSet(mixins.RetrieveModelMixin, viewsets.GenericViewSet):
.annotate(_uploads_count=Count("uploads"))
)
serializer_class = api_serializers.LibrarySerializer
permission_classes = [permissions.IsAuthenticated]
permission_classes = [oauth_permissions.ScopePermission]
required_scope = "libraries"
def get_queryset(self):
qs = super().get_queryset()
......@@ -132,6 +135,7 @@ class LibraryViewSet(mixins.RetrieveModelMixin, viewsets.GenericViewSet):
try:
library = utils.retrieve_ap_object(
fid,
actor=request.user.actor,
queryset=self.queryset,
serializer_class=serializers.LibrarySerializer,
)
......@@ -168,7 +172,8 @@ class InboxItemViewSet(
.order_by("-activity__creation_date")
)
serializer_class = api_serializers.InboxItemSerializer
permission_classes = [permissions.IsAuthenticated]
permission_classes = [oauth_permissions.ScopePermission]
required_scope = "notifications"
filterset_class = filters.InboxItemFilter
ordering_fields = ("activity__creation_date",)
......@@ -185,3 +190,10 @@ class InboxItemViewSet(
serializer.is_valid(raise_exception=True)
result = serializer.save()
return response.Response(result, status=200)
class FetchViewSet(mixins.RetrieveModelMixin, viewsets.GenericViewSet):
queryset = models.Fetch.objects.select_related("actor")
serializer_class = api_serializers.FetchSerializer
permission_classes = [permissions.IsAuthenticated]
from django.db import transaction
from rest_framework import decorators
from rest_framework import permissions
from rest_framework import response
from rest_framework import status
from funkwhale_api.common import utils as common_utils
from . import api_serializers
from . import filters
from . import models
from . import tasks
from . import utils
def fetches_route():
@transaction.atomic
def fetches(self, request, *args, **kwargs):
obj = self.get_object()
if request.method == "GET":
queryset = models.Fetch.objects.get_for_object(obj).select_related("actor")
queryset = queryset.order_by("-creation_date")
filterset = filters.FetchFilter(request.GET, queryset=queryset)
page = self.paginate_queryset(filterset.qs)
if page is not None:
serializer = api_serializers.FetchSerializer(page, many=True)
return self.get_paginated_response(serializer.data)
serializer = api_serializers.FetchSerializer(queryset, many=True)
return response.Response(serializer.data)
if request.method == "POST":
if utils.is_local(obj.fid):
return response.Response(
{"detail": "Cannot fetch a local object"}, status=400
)
fetch = models.Fetch.objects.create(
url=obj.fid, actor=request.user.actor, object=obj
)
common_utils.on_commit(tasks.fetch.delay, fetch_id=fetch.pk)
serializer = api_serializers.FetchSerializer(fetch)
return response.Response(serializer.data, status=status.HTTP_201_CREATED)
return decorators.action(
methods=["get", "post"],
detail=True,
permission_classes=[permissions.IsAuthenticated],
)(fetches)
......@@ -75,6 +75,15 @@ class DomainFactory(NoUpdateOnCreate, factory.django.DjangoModelFactory):
model = "federation.Domain"
django_get_or_create = ("name",)
@factory.post_generation
def with_service_actor(self, create, extracted, **kwargs):
if not create or not extracted:
return
self.service_actor = ActorFactory(domain=self)
self.save(update_fields=["service_actor"])
return self.service_actor
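# Usage sketch: DomainFactory(with_service_actor=True) creates the domain and
# attaches a freshly created service actor; the hook is skipped when the flag
# is omitted or when the factory only builds (create=False).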
@registry.register
class ActorFactory(NoUpdateOnCreate, factory.DjangoModelFactory):
......@@ -157,7 +166,7 @@ class MusicLibraryFactory(NoUpdateOnCreate, factory.django.DjangoModelFactory):
@registry.register
class LibraryScan(NoUpdateOnCreate, factory.django.DjangoModelFactory):
class LibraryScanFactory(NoUpdateOnCreate, factory.django.DjangoModelFactory):
library = factory.SubFactory(MusicLibraryFactory)
actor = factory.SubFactory(ActorFactory)
total_files = factory.LazyAttribute(lambda o: o.library.uploads_count)
......@@ -166,6 +175,14 @@ class LibraryScan(NoUpdateOnCreate, factory.django.DjangoModelFactory):
model = "music.LibraryScan"
@registry.register
class FetchFactory(NoUpdateOnCreate, factory.django.DjangoModelFactory):
actor = factory.SubFactory(ActorFactory)
class Meta:
model = "federation.Fetch"
@registry.register
class ActivityFactory(NoUpdateOnCreate, factory.django.DjangoModelFactory):
actor = factory.SubFactory(ActorFactory)
......
import django_filters
from rest_framework import serializers
from . import models
from . import utils
class ActorRelatedField(serializers.EmailField):
......@@ -16,3 +19,15 @@ class ActorRelatedField(serializers.EmailField):
)
except models.Actor.DoesNotExist:
raise serializers.ValidationError("Invalid actor name")
class DomainFromURLFilter(django_filters.CharFilter):
def __init__(self, *args, **kwargs):
self.url_field = kwargs.pop("url_field", "fid")
super().__init__(*args, **kwargs)
def filter(self, qs, value):
if not value:
return qs
query = utils.get_domain_query_from_url(value, self.url_field)
return qs.filter(query)
......@@ -46,3 +46,14 @@ class InboxItemFilter(django_filters.FilterSet):
def filter_before(self, queryset, field_name, value):
return queryset.filter(pk__lte=value)
class FetchFilter(django_filters.FilterSet):
ordering = django_filters.OrderingFilter(
# tuple-mapping retains order
fields=(("creation_date", "creation_date"), ("fetch_date", "fetch_date"))
)
class Meta:
model = models.Fetch
fields = ["status", "object_id", "url"]
......@@ -57,7 +57,9 @@ def insert_context(ctx, doc):
existing = doc["@context"]
if isinstance(existing, list):
if ctx not in existing:
existing = existing[:]
existing.append(ctx)
doc["@context"] = existing
else:
doc["@context"] = [existing, ctx]
return doc
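# Illustration: insert_context() normalises the JSON-LD @context of a document,
# e.g.
#
#   doc = {"@context": "https://www.w3.org/ns/activitystreams"}
#   insert_context("https://w3id.org/security/v1", doc)
#   # doc["@context"] == ["https://www.w3.org/ns/activitystreams",
#   #                     "https://w3id.org/security/v1"]
#
# When @context is already a list it is copied before appending, so other
# documents sharing the same list object are left untouched.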
......@@ -215,6 +217,15 @@ def get_default_context():
return ["https://www.w3.org/ns/activitystreams", "https://w3id.org/security/v1", {}]