Verified commit b206c3cf authored by Eliot Berriot

Merge branch 'release/0.15'

parents 544a60b8 b6ac2dc3
Showing 625 additions and 56 deletions
......@@ -90,3 +90,4 @@ data/
po/*.po
docs/swagger
_build
......@@ -7,11 +7,93 @@ variables:
stages:
- review
- lint
- test
- build
- deploy
review_front:
stage: review
image: node:9
when: manual
allow_failure: true
before_script:
- cd front
script:
- yarn install
# this is to ensure we don't have any errors in the output,
# cf https://code.eliotberriot.com/funkwhale/funkwhale/issues/169
- INSTANCE_URL=$REVIEW_INSTANCE_URL yarn run build | tee /dev/stderr | (! grep -i 'ERROR in')
- mkdir -p /static/front/$CI_BUILD_REF_SLUG
- cp -r dist/* /static/front/$CI_BUILD_REF_SLUG
cache:
key: "$CI_PROJECT_ID__front_dependencies"
paths:
- front/node_modules
- front/yarn.lock
environment:
name: review/front-$CI_BUILD_REF_NAME
url: http://front-$CI_BUILD_REF_SLUG.$REVIEW_DOMAIN
on_stop: stop_front_review
only:
- branches@funkwhale/funkwhale
tags:
- funkwhale-review
stop_front_review:
stage: review
script:
- rm -rf /static/front/$CI_BUILD_REF_SLUG/
variables:
GIT_STRATEGY: none
when: manual
environment:
name: review/front-$CI_BUILD_REF_NAME
action: stop
tags:
- funkwhale-review
review_docs:
stage: review
image: python:3.6
when: manual
allow_failure: true
variables:
BUILD_PATH: "../public"
before_script:
- cd docs
cache:
key: "$CI_PROJECT_ID__sphinx"
paths:
- "$PIP_CACHE_DIR"
script:
- pip install sphinx
- ./build_docs.sh
- mkdir -p /static/docs/$CI_BUILD_REF_SLUG
- cp -r $CI_PROJECT_DIR/public/* /static/docs/$CI_BUILD_REF_SLUG
environment:
name: review/docs-$CI_BUILD_REF_NAME
url: http://docs-$CI_BUILD_REF_SLUG.$REVIEW_DOMAIN
on_stop: stop_docs_review
only:
- branches@funkwhale/funkwhale
tags:
- funkwhale-review
stop_docs_review:
stage: review
script:
- rm -rf /static/docs/$CI_BUILD_REF_SLUG/
variables:
GIT_STRATEGY: none
when: manual
environment:
name: review/docs-$CI_BUILD_REF_NAME
action: stop
tags:
- funkwhale-review
black:
image: python:3.6
stage: lint
......@@ -20,7 +102,7 @@ black:
before_script:
- pip install black
script:
- black --check --diff api/
- black --exclude "/(\.git|\.hg|\.mypy_cache|\.tox|\.venv|_build|buck-out|build|dist|migrations)/" --check --diff api/
flake8:
image: python:3.6
......@@ -126,6 +208,10 @@ pages:
script:
- pip install sphinx
- ./build_docs.sh
cache:
key: "$CI_PROJECT_ID__sphinx"
paths:
- "$PIP_CACHE_DIR"
artifacts:
paths:
- public
......
......@@ -10,6 +10,83 @@ This changelog is viewable on the web at https://docs.funkwhale.audio/changelog.
.. towncrier
0.15 (2018-06-24)
-----------------
Upgrade instructions are available at
https://docs.funkwhale.audio/upgrading.html
Features:
- Added admin interface to manage import requests (#190)
- Added replace flag during import to replace already present tracks with a new
version of their track file (#222)
- Funkwhale's front-end can now point to any instance (#327); see the "Removed front-end and back-end coupling" section below
- Management interface for users (#212)
- New invite system (#248); see the "Invite system" section below
Enhancements:
- Added "TV" to the list of highlighted words during YouTube import (#154)
- Command line import now accepts unlimited args (#242)
Bugfixes:
- Expose track files date in manage API (#307)
- Fixed current track restart/hiccup when shuffling queue, deleting track from
queue or reordering (#310)
- Include user's current private playlists on playlist list (#302)
- Remove link to generic radios, since they don't have detail pages (#324)
Documentation:
- Document that Funkwhale may be installed with YunoHost (#325)
- Documented a saner layout with symlinks for in-place imports (#254)
- Upgrade documentation now uses the correct user on non-docker setups (#265)
Invite system
^^^^^^^^^^^^^
On closed instances, it has always been a little bit painful to create accounts
by hand for new users. This release solves that by adding invitations.
You can generate invitation codes via the "users" admin interface (you'll find a
link in the sidebar). Those codes are valid for 14 days, and can be used once
to create a new account on the instance, even if registrations are closed.
By default, we generate a random code for invitations, but you can also use custom codes
if you need to print them or make them fancier ;)
Generating and managing invitations requires the "settings" permission.
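As a quick illustration, here is a hedged sketch of creating an invitation with a
custom code through the new management API (the endpoint prefix is an assumption,
it is not documented in this release note)::

    import requests

    resp = requests.post(
        "https://your.instance/api/v1/manage/users/invitations/",
        headers={"Authorization": "JWT <your token>"},  # placeholder credentials
        json={"code": "WELCOME-FRIENDS"},  # omit "code" to get a random one
    )
    # the response includes id, owner, code, expiration_date, creation_date, users
    print(resp.json())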
Removed front-end and back-end coupling
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Even though Funkwhale's front-end has always been a Single Page Application
talking to an API, it was only able to talk to an API on the same domain.
There was no real technical justification behind this (only laziness), and it was
also blocking interesting use cases:
- Use multiple customized versions of the front-end with the same instance
- Use a customized version of the front-end with multiple instances
- Use a locally hosted front-end with a remote API, which is especially useful in development
From now on, Funkwhale's front-end can connect to any Funkwhale server. You can
change the server you are connecting to in the footer.
Fixing this also unlocked a really interesting feature in our development/review workflow:
by leveraging GitLab CI and review apps, we can now automatically deploy a live version of
each merge request, making it possible for anyone to review front-end changes easily, without
the need to install a local environment.
0.14.2 (2018-06-16)
-------------------
......
Contibute to Funkwhale development
Contribute to Funkwhale development
==================================
First of all, thank you for your interest in the project! We really
......@@ -12,6 +12,42 @@ This document will guide you through common operations such as:
- Writing unit tests to validate your work
- Submitting your work
A quick path to contribute on the front-end
-------------------------------------------
The next sections of this document include a full installation guide to help
you set up a local development version of Funkwhale. If you only want to fix small things
on the front-end, and don't want to manage a full development environment, there is another way.
As the front-end can work with any Funkwhale server, you can work with the front-end only,
and make it talk to an existing instance (like the demo one, or your own instance, if you have one).
If even that is too much for you, you can also make your changes without any development environment,
and open a merge request. We will be able to review your work easily by automatically
spawning a live version of your changes, thanks to GitLab Review Apps.
Setup front-end only development environment
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
1. Clone the repository::
git clone ssh://git@code.eliotberriot.com:2222/funkwhale/funkwhale.git
cd funkwhale
cd front
2. Install `Node.js <https://nodejs.org/en/download/package-manager/>`_ and `yarn <https://yarnpkg.com/lang/en/docs/install/#debian-stable>`_
3. Install the dependencies::
yarn install
4. Launch the development server::
# this will serve the front-end on http://localhost:8000
WEBPACK_DEVSERVER_PORT=8000 yarn dev
5. Make the front-end talk to an existing server (like https://demo.funkwhale.audio),
by clicking on the corresponding link in the footer
6. Start hacking!
Setup your development environment
----------------------------------
......
......@@ -146,6 +146,7 @@ MIDDLEWARE = (
"django.contrib.auth.middleware.AuthenticationMiddleware",
"django.contrib.messages.middleware.MessageMiddleware",
"django.middleware.clickjacking.XFrameOptionsMiddleware",
"funkwhale_api.users.middleware.RecordActivityMiddleware",
)
# MIGRATIONS CONFIGURATION
......@@ -460,3 +461,7 @@ MUSIC_DIRECTORY_PATH = env("MUSIC_DIRECTORY_PATH", default=None)
MUSIC_DIRECTORY_SERVE_PATH = env(
"MUSIC_DIRECTORY_SERVE_PATH", default=MUSIC_DIRECTORY_PATH
)
USERS_INVITATION_EXPIRATION_DAYS = env.int(
"USERS_INVITATION_EXPIRATION_DAYS", default=14
)
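# A minimal usage sketch, for illustration only (the invitation model code is
# not part of this diff), of how this setting can feed an expiration date:
#
#   import datetime
#   from django.utils import timezone
#
#   expiration_date = timezone.now() + datetime.timedelta(
#       days=USERS_INVITATION_EXPIRATION_DAYS
#   )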
# -*- coding: utf-8 -*-
__version__ = "0.14.2"
__version__ = "0.15"
__version_info__ = tuple(
[
int(num) if num.isdigit() else num
......
......@@ -17,13 +17,13 @@ def get_privacy_field():
)
def privacy_level_query(user, lookup_field="privacy_level"):
def privacy_level_query(user, lookup_field="privacy_level", user_field="user"):
if user.is_anonymous:
return models.Q(**{lookup_field: "everyone"})
return models.Q(
**{"{}__in".format(lookup_field): ["followers", "instance", "everyone"]}
)
**{"{}__in".format(lookup_field): ["instance", "everyone"]}
) | models.Q(**{lookup_field: "me", user_field: user})
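# Hedged usage sketch of the new user_field parameter (the model below is
# illustrative, not code from this module): objects with a "me" privacy level
# are now only matched when they belong to the requesting user.
#
#   visible_playlists = Playlist.objects.filter(
#       privacy_level_query(request.user, "privacy_level", user_field="user")
#   )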
class SearchFilter(django_filters.CharFilter):
......
from rest_framework import serializers
class Action(object):
def __init__(self, name, allow_all=False, qs_filter=None):
self.name = name
self.allow_all = allow_all
self.qs_filter = qs_filter
def __repr__(self):
return "<Action {}>".format(self.name)
class ActionSerializer(serializers.Serializer):
"""
A special serializer that can operate on a list of objects
......@@ -11,19 +21,16 @@ class ActionSerializer(serializers.Serializer):
objects = serializers.JSONField(required=True)
filters = serializers.DictField(required=False)
actions = None
filterset_class = None
# those are actions identifier where we don't want to allow the "all"
# selector because it's to dangerous. Like object deletion.
dangerous_actions = []
def __init__(self, *args, **kwargs):
self.actions_by_name = {a.name: a for a in self.actions}
self.queryset = kwargs.pop("queryset")
if self.actions is None:
raise ValueError(
"You must declare a list of actions on " "the serializer class"
)
for action in self.actions:
for action in self.actions_by_name.keys():
handler_name = "handle_{}".format(action)
assert hasattr(self, handler_name), "{} miss a {} method".format(
self.__class__.__name__, handler_name
......@@ -31,13 +38,14 @@ class ActionSerializer(serializers.Serializer):
super().__init__(self, *args, **kwargs)
def validate_action(self, value):
if value not in self.actions:
try:
return self.actions_by_name[value]
except KeyError:
raise serializers.ValidationError(
"{} is not a valid action. Pick one of {}.".format(
value, ", ".join(self.actions)
value, ", ".join(self.actions_by_name.keys())
)
)
return value
def validate_objects(self, value):
if value == "all":
......@@ -51,33 +59,35 @@ class ActionSerializer(serializers.Serializer):
)
def validate(self, data):
dangerous = data["action"] in self.dangerous_actions
if dangerous and self.initial_data["objects"] == "all":
allow_all = data["action"].allow_all
if not allow_all and self.initial_data["objects"] == "all":
raise serializers.ValidationError(
"This action is to dangerous to be applied to all objects"
)
if self.filterset_class and "filters" in data:
qs_filterset = self.filterset_class(
data["filters"], queryset=data["objects"]
"You cannot apply this action on all objects"
)
final_filters = data.get("filters", {}) or {}
if self.filterset_class and final_filters:
qs_filterset = self.filterset_class(final_filters, queryset=data["objects"])
try:
assert qs_filterset.form.is_valid()
except (AssertionError, TypeError):
raise serializers.ValidationError("Invalid filters")
data["objects"] = qs_filterset.qs
if data["action"].qs_filter:
data["objects"] = data["action"].qs_filter(data["objects"])
data["count"] = data["objects"].count()
if data["count"] < 1:
raise serializers.ValidationError("No object matching your request")
return data
def save(self):
handler_name = "handle_{}".format(self.validated_data["action"])
handler_name = "handle_{}".format(self.validated_data["action"].name)
handler = getattr(self, handler_name)
result = handler(self.validated_data["objects"])
payload = {
"updated": self.validated_data["count"],
"action": self.validated_data["action"],
"action": self.validated_data["action"].name,
"result": result,
}
return payload
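A hedged usage sketch of the refactored serializer (the subclass, model and ids below
are illustrative, not code from this merge request): actions are now declared as Action
objects, and callers pass the target queryset plus a payload naming one of them.

class NoteActionSerializer(ActionSerializer):
    actions = [Action("delete", allow_all=False)]

    def handle_delete(self, objects):
        # resolved automatically from the action name by ActionSerializer
        return objects.delete()

serializer = NoteActionSerializer(
    {"action": "delete", "objects": [1, 2, 3]},  # list of ids, or "all" if allow_all
    queryset=Note.objects.all(),
)
serializer.is_valid(raise_exception=True)
result = serializer.save()  # {"updated": 3, "action": "delete", "result": ...}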
......@@ -769,7 +769,7 @@ class CollectionSerializer(serializers.Serializer):
class LibraryTrackActionSerializer(common_serializers.ActionSerializer):
actions = ["import"]
actions = [common_serializers.Action("import", allow_all=True)]
filterset_class = filters.LibraryTrackFilter
@transaction.atomic
......
from django_filters import rest_framework as filters
from funkwhale_api.common import fields
from funkwhale_api.music import models as music_models
from funkwhale_api.requests import models as requests_models
from funkwhale_api.users import models as users_models
class ManageTrackFileFilterSet(filters.FilterSet):
......@@ -18,3 +19,45 @@ class ManageTrackFileFilterSet(filters.FilterSet):
class Meta:
model = music_models.TrackFile
fields = ["q", "track__album", "track__artist", "track", "library_track"]
class ManageUserFilterSet(filters.FilterSet):
q = fields.SearchFilter(search_fields=["username", "email", "name"])
class Meta:
model = users_models.User
fields = [
"q",
"is_active",
"privacy_level",
"is_staff",
"is_superuser",
"permission_upload",
"permission_library",
"permission_settings",
"permission_federation",
]
class ManageInvitationFilterSet(filters.FilterSet):
q = fields.SearchFilter(search_fields=["owner__username", "code", "owner__email"])
is_open = filters.BooleanFilter(method="filter_is_open")
class Meta:
model = users_models.Invitation
fields = ["q", "is_open"]
def filter_is_open(self, queryset, field_name, value):
if value is None:
return queryset
return queryset.open(value)
class ManageImportRequestFilterSet(filters.FilterSet):
q = fields.SearchFilter(
search_fields=["user__username", "albums", "artist_name", "comment"]
)
class Meta:
model = requests_models.ImportRequest
fields = ["q", "status"]
from django.db import transaction
from django.utils import timezone
from rest_framework import serializers
from funkwhale_api.common import serializers as common_serializers
from funkwhale_api.music import models as music_models
from funkwhale_api.requests import models as requests_models
from funkwhale_api.users import models as users_models
from . import filters
......@@ -52,6 +55,7 @@ class ManageTrackFileSerializer(serializers.ModelSerializer):
"track",
"duration",
"mimetype",
"creation_date",
"bitrate",
"size",
"path",
......@@ -60,10 +64,172 @@ class ManageTrackFileSerializer(serializers.ModelSerializer):
class ManageTrackFileActionSerializer(common_serializers.ActionSerializer):
actions = ["delete"]
dangerous_actions = ["delete"]
actions = [common_serializers.Action("delete", allow_all=False)]
filterset_class = filters.ManageTrackFileFilterSet
@transaction.atomic
def handle_delete(self, objects):
return objects.delete()
class PermissionsSerializer(serializers.Serializer):
def to_representation(self, o):
return o.get_permissions(defaults=self.context.get("default_permissions"))
def to_internal_value(self, o):
return {"permissions": o}
class ManageUserSimpleSerializer(serializers.ModelSerializer):
class Meta:
model = users_models.User
fields = (
"id",
"username",
"email",
"name",
"is_active",
"is_staff",
"is_superuser",
"date_joined",
"last_activity",
"privacy_level",
)
class ManageUserSerializer(serializers.ModelSerializer):
permissions = PermissionsSerializer(source="*")
class Meta:
model = users_models.User
fields = (
"id",
"username",
"email",
"name",
"is_active",
"is_staff",
"is_superuser",
"date_joined",
"last_activity",
"permissions",
"privacy_level",
)
read_only_fields = [
"id",
"email",
"privacy_level",
"username",
"date_joined",
"last_activity",
]
def update(self, instance, validated_data):
instance = super().update(instance, validated_data)
permissions = validated_data.pop("permissions", {})
if permissions:
for p, value in permissions.items():
setattr(instance, "permission_{}".format(p), value)
instance.save(
update_fields=["permission_{}".format(p) for p in permissions.keys()]
)
return instance
class ManageInvitationSerializer(serializers.ModelSerializer):
users = ManageUserSimpleSerializer(many=True, required=False)
owner = ManageUserSimpleSerializer(required=False)
code = serializers.CharField(required=False, allow_null=True)
class Meta:
model = users_models.Invitation
fields = ("id", "owner", "code", "expiration_date", "creation_date", "users")
read_only_fields = ["id", "expiration_date", "owner", "creation_date", "users"]
def validate_code(self, value):
if not value:
return value
if users_models.Invitation.objects.filter(code__iexact=value).exists():
raise serializers.ValidationError(
"An invitation with this code already exists"
)
return value
class ManageInvitationActionSerializer(common_serializers.ActionSerializer):
actions = [
common_serializers.Action(
"delete", allow_all=False, qs_filter=lambda qs: qs.open()
)
]
filterset_class = filters.ManageInvitationFilterSet
@transaction.atomic
def handle_delete(self, objects):
return objects.delete()
class ManageImportRequestSerializer(serializers.ModelSerializer):
user = ManageUserSimpleSerializer(required=False)
class Meta:
model = requests_models.ImportRequest
fields = [
"id",
"status",
"creation_date",
"imported_date",
"user",
"albums",
"artist_name",
"comment",
]
read_only_fields = [
"id",
"status",
"creation_date",
"imported_date",
"user",
"albums",
"artist_name",
"comment",
]
def validate_code(self, value):
if not value:
return value
if users_models.Invitation.objects.filter(code__iexact=value).exists():
raise serializers.ValidationError(
"An invitation with this code already exists"
)
return value
class ManageImportRequestActionSerializer(common_serializers.ActionSerializer):
actions = [
common_serializers.Action(
"mark_closed",
allow_all=True,
qs_filter=lambda qs: qs.filter(status__in=["pending", "accepted"]),
),
common_serializers.Action(
"mark_imported",
allow_all=True,
qs_filter=lambda qs: qs.filter(status__in=["pending", "accepted"]),
),
common_serializers.Action("delete", allow_all=False),
]
filterset_class = filters.ManageImportRequestFilterSet
@transaction.atomic
def handle_delete(self, objects):
return objects.delete()
@transaction.atomic
def handle_mark_closed(self, objects):
return objects.update(status="closed")
@transaction.atomic
def handle_mark_imported(self, objects):
now = timezone.now()
return objects.update(status="imported", imported_date=now)
......@@ -5,7 +5,18 @@ from . import views
library_router = routers.SimpleRouter()
library_router.register(r"track-files", views.ManageTrackFileViewSet, "track-files")
requests_router = routers.SimpleRouter()
requests_router.register(
r"import-requests", views.ManageImportRequestViewSet, "import-requests"
)
users_router = routers.SimpleRouter()
users_router.register(r"users", views.ManageUserViewSet, "users")
users_router.register(r"invitations", views.ManageInvitationViewSet, "invitations")
urlpatterns = [
url(r"^library/", include((library_router.urls, "instance"), namespace="library"))
url(r"^library/", include((library_router.urls, "instance"), namespace="library")),
url(r"^users/", include((users_router.urls, "instance"), namespace="users")),
url(
r"^requests/", include((requests_router.urls, "instance"), namespace="requests")
),
]
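# For orientation (the mount point of this urlconf, e.g. "/api/v1/manage/", is an
# assumption and is not shown in this diff), the routers above expose:
#   <prefix>/library/track-files/       -> ManageTrackFileViewSet
#   <prefix>/users/users/               -> ManageUserViewSet
#   <prefix>/users/invitations/         -> ManageInvitationViewSet
#   <prefix>/requests/import-requests/  -> ManageImportRequestViewSet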
from rest_framework import mixins, response, viewsets
from rest_framework.decorators import list_route
from funkwhale_api.common import preferences
from funkwhale_api.music import models as music_models
from funkwhale_api.requests import models as requests_models
from funkwhale_api.users import models as users_models
from funkwhale_api.users.permissions import HasUserPermission
from . import filters, serializers
class ManageTrackFileViewSet(
mixins.ListModelMixin,
mixins.RetrieveModelMixin,
mixins.DestroyModelMixin,
viewsets.GenericViewSet,
mixins.ListModelMixin, mixins.RetrieveModelMixin, viewsets.GenericViewSet
):
queryset = (
music_models.TrackFile.objects.all()
......@@ -41,3 +41,83 @@ class ManageTrackFileViewSet(
serializer.is_valid(raise_exception=True)
result = serializer.save()
return response.Response(result, status=200)
class ManageUserViewSet(
mixins.ListModelMixin,
mixins.RetrieveModelMixin,
mixins.UpdateModelMixin,
viewsets.GenericViewSet,
):
queryset = users_models.User.objects.all().order_by("-id")
serializer_class = serializers.ManageUserSerializer
filter_class = filters.ManageUserFilterSet
permission_classes = (HasUserPermission,)
required_permissions = ["settings"]
ordering_fields = ["date_joined", "last_activity", "username"]
def get_serializer_context(self):
context = super().get_serializer_context()
context["default_permissions"] = preferences.get("users__default_permissions")
return context
class ManageInvitationViewSet(
mixins.CreateModelMixin,
mixins.ListModelMixin,
mixins.RetrieveModelMixin,
mixins.UpdateModelMixin,
viewsets.GenericViewSet,
):
queryset = (
users_models.Invitation.objects.all()
.order_by("-id")
.prefetch_related("users")
.select_related("owner")
)
serializer_class = serializers.ManageInvitationSerializer
filter_class = filters.ManageInvitationFilterSet
permission_classes = (HasUserPermission,)
required_permissions = ["settings"]
ordering_fields = ["creation_date", "expiration_date"]
def perform_create(self, serializer):
serializer.save(owner=self.request.user)
@list_route(methods=["post"])
def action(self, request, *args, **kwargs):
queryset = self.get_queryset()
serializer = serializers.ManageInvitationActionSerializer(
request.data, queryset=queryset
)
serializer.is_valid(raise_exception=True)
result = serializer.save()
return response.Response(result, status=200)
class ManageImportRequestViewSet(
mixins.ListModelMixin,
mixins.RetrieveModelMixin,
mixins.UpdateModelMixin,
viewsets.GenericViewSet,
):
queryset = (
requests_models.ImportRequest.objects.all()
.order_by("-id")
.select_related("user")
)
serializer_class = serializers.ManageImportRequestSerializer
filter_class = filters.ManageImportRequestFilterSet
permission_classes = (HasUserPermission,)
required_permissions = ["library"]
ordering_fields = ["creation_date", "imported_date"]
@list_route(methods=["post"])
def action(self, request, *args, **kwargs):
queryset = self.get_queryset()
serializer = serializers.ManageImportRequestActionSerializer(
request.data, queryset=queryset
)
serializer.is_valid(raise_exception=True)
result = serializer.save()
return response.Response(result, status=200)
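A hedged sketch of how these new bulk action endpoints are called (the URL prefix is
an assumption; the payload shape follows ActionSerializer's action/objects/filters
fields):

import requests

resp = requests.post(
    "https://your.instance/api/v1/manage/requests/import-requests/action/",
    headers={"Authorization": "JWT <your token>"},  # placeholder credentials
    json={
        "action": "mark_imported",          # must match a declared Action name
        "objects": "all",                   # allowed because allow_all=True here
        "filters": {"status": "pending"},   # optional, validated by the filterset
    },
)
print(resp.json())  # {"updated": ..., "action": "mark_imported", "result": ...}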
......@@ -89,6 +89,7 @@ class ImportJobFactory(factory.django.DjangoModelFactory):
batch = factory.SubFactory(ImportBatchFactory)
source = factory.Faker("url")
mbid = factory.Faker("uuid4")
replace_if_duplicate = False
class Meta:
model = "music.ImportJob"
......
# Generated by Django 2.0.6 on 2018-06-22 13:36
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('music', '0027_auto_20180515_1808'),
]
operations = [
migrations.AddField(
model_name='importjob',
name='replace_if_duplicate',
field=models.BooleanField(default=False),
),
]
......@@ -539,7 +539,7 @@ class ImportBatch(models.Model):
related_name="import_batches",
null=True,
blank=True,
on_delete=models.CASCADE,
on_delete=models.SET_NULL,
)
class Meta:
......@@ -567,6 +567,7 @@ class ImportBatch(models.Model):
class ImportJob(models.Model):
uuid = models.UUIDField(unique=True, db_index=True, default=uuid.uuid4)
replace_if_duplicate = models.BooleanField(default=False)
batch = models.ForeignKey(
ImportBatch, related_name="jobs", on_delete=models.CASCADE
)
......
......@@ -80,10 +80,11 @@ def import_track_from_remote(library_track):
)[0]
def _do_import(import_job, replace=False, use_acoustid=False):
def _do_import(import_job, use_acoustid=False):
logger.info("[Import Job %s] starting job", import_job.pk)
from_file = bool(import_job.audio_file)
mbid = import_job.mbid
replace = import_job.replace_if_duplicate
acoustid_track_id = None
duration = None
track = None
......@@ -135,8 +136,8 @@ def _do_import(import_job, replace=False, use_acoustid=False):
track_file = None
if replace:
logger.info("[Import Job %s] replacing existing audio file", import_job.pk)
track_file = track.files.first()
logger.info("[Import Job %s] deleting existing audio file", import_job.pk)
track.files.all().delete()
elif track.files.count() > 0:
logger.info(
"[Import Job %s] skipping, we already have a file for this track",
......@@ -163,7 +164,7 @@ def _do_import(import_job, replace=False, use_acoustid=False):
# no downloading, we hotlink
pass
elif not import_job.audio_file and not import_job.source.startswith("file://"):
# not an implace import, and we have a source, so let's download it
# not an inplace import, and we have a source, so let's download it
logger.info("[Import Job %s] downloading audio file from remote", import_job.pk)
track_file.download_file()
elif not import_job.audio_file and import_job.source.startswith("file://"):
......@@ -243,14 +244,14 @@ def get_cover_from_fs(dir_path):
@celery.require_instance(
models.ImportJob.objects.filter(status__in=["pending", "errored"]), "import_job"
)
def import_job_run(self, import_job, replace=False, use_acoustid=False):
def import_job_run(self, import_job, use_acoustid=False):
def mark_errored(exc):
logger.error("[Import Job %s] Error during import: %s", import_job.pk, str(exc))
import_job.status = "errored"
import_job.save(update_fields=["status"])
try:
tf = _do_import(import_job, replace, use_acoustid=use_acoustid)
tf = _do_import(import_job, use_acoustid=use_acoustid)
return tf.pk if tf else None
except Exception as exc:
if not settings.DEBUG:
......
......@@ -110,7 +110,9 @@ class PlaylistTrackViewSet(
def get_queryset(self):
return self.queryset.filter(
fields.privacy_level_query(
self.request.user, lookup_field="playlist__privacy_level"
self.request.user,
lookup_field="playlist__privacy_level",
user_field="playlist__user",
)
)
......
......@@ -13,7 +13,7 @@ class Command(BaseCommand):
help = "Import audio files matching given glob pattern"
def add_arguments(self, parser):
parser.add_argument("path", type=str)
parser.add_argument("path", nargs="+", type=str)
parser.add_argument(
"--recursive",
action="store_true",
......@@ -55,6 +55,17 @@ class Command(BaseCommand):
"import and not much disk space available."
),
)
parser.add_argument(
"--replace",
action="store_true",
dest="replace",
default=False,
help=(
"Use this flag to replace duplicates (tracks with same "
"musicbrainz mbid, or same artist, album and title) on import "
"with their newest version."
),
)
parser.add_argument(
"--noinput",
"--no-input",
......@@ -65,10 +76,13 @@ class Command(BaseCommand):
def handle(self, *args, **options):
glob_kwargs = {}
matching = []
if options["recursive"]:
glob_kwargs["recursive"] = True
try:
matching = sorted(glob.glob(options["path"], **glob_kwargs))
for import_path in options["path"]:
matching += glob.glob(import_path, **glob_kwargs)
matching = sorted(list(set(matching)))
except TypeError:
raise Exception("You need Python 3.5 to use the --recursive flag")
......@@ -109,16 +123,23 @@ class Command(BaseCommand):
"No superuser available, please provide a --username"
)
filtered = self.filter_matching(matching, options)
if options["replace"]:
filtered = {"initial": matching, "skipped": [], "new": matching}
message = "- {} files to be replaced"
import_paths = matching
else:
filtered = self.filter_matching(matching)
message = "- {} files already found in database"
import_paths = filtered["new"]
self.stdout.write("Import summary:")
self.stdout.write(
"- {} files found matching this pattern: {}".format(
len(matching), options["path"]
)
)
self.stdout.write(
"- {} files already found in database".format(len(filtered["skipped"]))
)
self.stdout.write(message.format(len(filtered["skipped"])))
self.stdout.write("- {} new files".format(len(filtered["new"])))
self.stdout.write(
......@@ -138,12 +159,12 @@ class Command(BaseCommand):
if input("".join(message)) != "yes":
raise CommandError("Import cancelled.")
batch, errors = self.do_import(filtered["new"], user=user, options=options)
batch, errors = self.do_import(import_paths, user=user, options=options)
message = "Successfully imported {} tracks"
if options["async"]:
message = "Successfully launched import for {} tracks"
self.stdout.write(message.format(len(filtered["new"])))
self.stdout.write(message.format(len(import_paths)))
if len(errors) > 0:
self.stderr.write("{} tracks could not be imported:".format(len(errors)))
......@@ -153,7 +174,7 @@ class Command(BaseCommand):
"For details, please refer to import batch #{}".format(batch.pk)
)
def filter_matching(self, matching, options):
def filter_matching(self, matching):
sources = ["file://{}".format(p) for p in matching]
# we skip reimport for path that are already found
# as a TrackFile.source
......@@ -193,7 +214,9 @@ class Command(BaseCommand):
return batch, errors
def import_file(self, path, batch, import_handler, options):
job = batch.jobs.create(source="file://" + path)
job = batch.jobs.create(
source="file://" + path, replace_if_duplicate=options["replace"]
)
if not options["in_place"]:
name = os.path.basename(path)
with open(path, "rb") as f:
......
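A hedged usage sketch of the updated import command (the command name and file paths
are assumptions for illustration; the option names come from the diff above):

from django.core.management import call_command

call_command(
    "import_files",
    "/srv/music/rock/**/*.ogg",   # several glob patterns are now accepted (#242)
    "/srv/music/jazz/**/*.flac",
    recursive=True,
    replace=True,                 # replace already imported duplicates (#222)
    username="admin",             # user the import batch is attributed to
)
# when running from the CLI instead, pass the command's no-input option to skip
# the confirmation prompt shown above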
......@@ -7,12 +7,12 @@ from django.contrib.auth.admin import UserAdmin as AuthUserAdmin
from django.contrib.auth.forms import UserChangeForm, UserCreationForm
from django.utils.translation import ugettext_lazy as _
from .models import User
from . import models
class MyUserChangeForm(UserChangeForm):
class Meta(UserChangeForm.Meta):
model = User
model = models.User
class MyUserCreationForm(UserCreationForm):
......@@ -22,18 +22,18 @@ class MyUserCreationForm(UserCreationForm):
)
class Meta(UserCreationForm.Meta):
model = User
model = models.User
def clean_username(self):
username = self.cleaned_data["username"]
try:
User.objects.get(username=username)
except User.DoesNotExist:
models.User.objects.get(username=username)
except models.User.DoesNotExist:
return username
raise forms.ValidationError(self.error_messages["duplicate_username"])
@admin.register(User)
@admin.register(models.User)
class UserAdmin(AuthUserAdmin):
form = MyUserChangeForm
add_form = MyUserCreationForm
......@@ -74,3 +74,11 @@ class UserAdmin(AuthUserAdmin):
(_("Important dates"), {"fields": ("last_login", "date_joined")}),
(_("Useless fields"), {"fields": ("user_permissions", "groups")}),
)
@admin.register(models.Invitation)
class InvitationAdmin(admin.ModelAdmin):
list_select_related = True
list_display = ["owner", "code", "creation_date", "expiration_date"]
search_fields = ["owner__username", "code"]
readonly_fields = ["expiration_date", "code"]