Compare revisions

Changes are shown as if the source revision was being merged into the target revision.

Commits on Source (103)
Showing with 1821 additions and 706 deletions
dist
build
funkwhale
venv
.idea
funkwhale_cli.egg-info
stages:
  - test
  - build
  - publish

variables:
  PIP_CACHE_DIR: "$CI_PROJECT_DIR/.cache/pip"
  DOCKER_HOST: tcp://docker:2375/
  DOCKER_DRIVER: overlay2
  DOCKER_TLS_CERTDIR: ""

cache:
  paths:
    - .cache/pip

test:
  stage: test
  image: python:$PY_VER
  before_script:
    - pip install .[test]
  script:
    - pytest
  tags:
    - docker
  parallel:
    matrix:
      - PY_VER: ["3.6", "3.7", "3.8", "3.9", "3.10"]

build-linux:
  stage: build
  image: python:3.8
  before_script:
    - pip install .[build]
    - pip install --upgrade setuptools
  script:
    - pyinstaller --clean -y cli.spec --distpath .
    - echo "Testing the generated CLI works…" && (./funkwhale --help | grep uploads) && echo "funkwhale CLI working \o/"
  artifacts:
    name: "linux_${CI_COMMIT_REF_NAME}"
    paths:
      - funkwhale
  only:
    - tags@funkwhale/cli
    - master@funkwhale/cli
  tags:
    - docker

build-windows:
  # there is a weird Gitlab / windows interaction
  # cf https://github.com/cdrx/docker-pyinstaller/issues/38
  # so we cannot use the regular docker executor
  stage: build
  image: docker:stable
  services:
    - docker:20-dind
  script:
    - docker run --rm -v "$(pwd):/src/" cdrx/pyinstaller-windows:python3 "pip install .[build] && pyinstaller --clean -y cli.spec --distpath ."
    - docker run --rm -v "$(pwd):/src/" cdrx/pyinstaller-windows:python3 "echo 'Testing the generated CLI works…' && (wine ./funkwhale.exe --help | grep uploads) && echo 'funkwhale CLI working \o/'"
  after_script:
    - docker run --rm -v "$(pwd):/src/" python:3.8 bash -c "rm -rf /src/build && py3clean /src/"
  artifacts:
    name: "windows_${CI_COMMIT_REF_NAME}"
    paths:
      - funkwhale.exe
  only:
    - tags@funkwhale/cli
    - master@funkwhale/cli
  tags:
    - dind

build-pypi:
  stage: build
  image: python:3
  before_script:
    - apt-get update
    - pip install .[build-pypi]
  script:
    - python3 setup.py sdist bdist_wheel
  tags:
    - docker
  artifacts:
    paths:
      - dist/*
    expire_in: 1 week
  only:
    - tags@funkwhale/cli

publish:
  stage: publish
  image: python:3
  before_script:
    - apt-get update
    - pip install .[publish]
  script:
    - twine upload dist/*
  tags:
    - docker
  dependencies:
    - build-pypi
  only:
    - tags@funkwhale/cli
Changelog
=========
.. towncrier
# Contributing to the Funkwhale-Cli project
This guide needs to be expanded; it currently only documents changelog generation.
## Typical workflow for a contribution
0. Fork the project if you have not already done so, or if you do not have access to the main repository
1. Checkout the master branch and pull the most recent changes: `git checkout master && git pull`
2. If working on an issue, assign yourself to it. Otherwise, consider opening an issue before starting to work on something, especially for new features.
3. Create a dedicated branch for your work, e.g. `42-awesome-fix`. It is good practice to prefix your branch name with the ID of the issue you are solving.
4. Work on your stuff
5. Commit small, atomic changes to make it easier to review your contribution
6. Add a changelog fragment to summarize your changes: `echo "Implemented awesome stuff (#42)" > changes/changelog.d/42.feature`
7. Push your branch
8. Create your merge request
9. Take a step back and enjoy, we're really grateful you did all of this and took the time to contribute!
## Changelog management
To ensure we have an extensive and well-structured changelog, any significant work such as closing an issue must include a changelog fragment. Small changes may include a changelog fragment as well, but this is not mandatory. If you're not sure about what to do, do not panic: open your merge request normally and we'll figure everything out during the review ;)
Changelog fragments are text files that can contain one or multiple lines that describe the changes occurring in a bunch of commits. Those files reside in `changes/changelog.d`.
### Content
A typical fragment looks like this:
> Fixed broken audio player on Chrome 42 for ogg files (#567)
If the work fixes one or more issues, the issue number should be included at the end of the fragment; in the previous example, (#567) is the issue reference.
If your work is not related to a specific issue, use the merge request identifier instead, like this:
> Fixed a typo in landing page copy (!342)
### Naming
Fragment files should respect the following naming pattern: `changes/changelog.d/<name>.<category>`. Name can be anything describing your work, or simply the identifier of the issue number you are fixing. Category can be one of:
- `feature`: for new features
- `enhancement`: for enhancements on existing features
- `bugfix`: for bugfixes
- `doc`: for documentation
- `i18n`: for internationalization-related work
- `misc`: for anything else
### Shortcuts
Here is a shortcut you can use or adapt to easily create new fragments from the command line:
```bash
issue="42"
content="Fixed an overflowing issue on small resolutions (#$issue)"
category="bugfix"
echo "$content" > changes/changelog.d/$issue.$category
```
You can of course also create fragments by hand in your text editor, or from GitLab's interface.
## Making a release
To make a new release (3.4 in this example):
```bash
# setup
export NEXT_RELEASE=3.4 # replace with the next release number
export PREVIOUS_RELEASE=3.3 # replace with the previous release number
# ensure you have an up-to-date repo
git checkout master
git pull
# compile changelog
towncrier --version $NEXT_RELEASE --yes
# polish changelog
# - update the date
# - look for typos
# - add list of contributors via `python3 scripts/get-contributions-stats.py develop $PREVIOUS_RELEASE`
nano CHANGELOG
# Set the `version` variable to $NEXT_RELEASE
nano setup.cfg
# commit
git add .
git commit -m "Version bump and changelog for $NEXT_RELEASE"
# tag
git tag $NEXT_RELEASE
# publish
git push --tags && git push
```
@@ -2,12 +2,90 @@ A command line interface to interact with Funkwhale servers.
# Installation
This package can be installed via pip:
```
pip install funkwhale-cli
```
We also provide some prebuilt binaries for Windows and Linux.
On Linux:
```
curl -L "https://dev.funkwhale.audio/funkwhale/cli/-/jobs/artifacts/master/raw/funkwhale?job=build-linux" -o /usr/local/bin/funkwhale
chmod +x /usr/local/bin/funkwhale
```
On Windows:
```
curl -L "https://dev.funkwhale.audio/funkwhale/cli/-/jobs/artifacts/master/raw/funkwhale.exe?job=build-windows" -o funkwhale.exe
```
# Usage
```bash
# get help
funkwhale --help
# get help on a specific command
funkwhale tracks ls --help
# get login
funkwhale -H https://demo.funkwhale.audio login # credentials are demo and demo on this server
# Store the server URL to avoid specifying it on the CLI
echo "FUNKWHALE_SERVER_URL=https://demo.funkwhale.audio" >> .env
# Create a library
funkwhale libraries create --visibility=me
# Upload some content to the server
funkwhale uploads create <library_id> ~/Music/**/*.mp3
# Search some tracks
funkwhale tracks ls jekk
# Download a track to the ~/Music directory
funkwhale tracks download -d ~/Music <track_id>
# Download a track and customize the target directory
funkwhale tracks download -d ~/Music -t "{artist}/{album} ({year})/{title}.{extension}" <track_id>
# Download all tracks matching a search, in ogg format
funkwhale tracks ls jekk --ids --limit 0 | xargs funkwhale tracks download -f ogg -d ~/Music
# Download all favorite tracks
funkwhale tracks ls --filter "favorites=true" --ids --limit 0 | xargs funkwhale tracks download -d ~/Music
# Download a track and pipe the output directly to VLC
funkwhale tracks download <track_id> | cvlc -
# Generate a playlist-file from a funkwhale playlist in a custom directory
funkwhale playlist tracks --ids <playlist_id> | funkwhale tracks generate-playlist -d ~/Music -t "{artist}/{album} ({year})/{title}.{extension}"
# Delete your library
funkwhale libraries rm <library_id>
# Logout
funkwhale logout
```
# Installation (from source)
This CLI requires Python 3.6 or greater:
```
git clone https://dev.funkwhale.audio/funkwhale/cli.git
cd cli
pip install .
```
# Usage
`funkwhale --help`
# Build the binary
You can build the binary for your platform using the following commands:
```
pip install .[build]
pyinstaller cli.spec
```
This will output a binary in `./dist/funkwhale`.
Created the project and initial functionality.
Upgrade via pip or use our prebuilt binaries for Linux or Windows.
{% for section, _ in sections.items() %}
{% if sections[section] %}
{% for category, val in definitions.items() if category in sections[section]%}
{{ definitions[category]['name'] }}:
{% if definitions[category]['showcontent'] %}
{% for text in sections[section][category].keys()|sort() %}
- {{ text }}
{% endfor %}
{% else %}
- {{ sections[section][category]['']|join(', ') }}
{% endif %}
{% if sections[section][category]|length == 0 %}
No significant changes.
{% else %}
{% endif %}
{% endfor %}
{% else %}
No significant changes.
{% endif %}
{% endfor %}
# -*- mode: python -*-
block_cipher = None

a = Analysis(
    ["funkwhale_cli/__main__.py"],
    pathex=[],
    binaries=[],
    datas=[],
    hiddenimports=[],
    hookspath=[],
    runtime_hooks=[],
    excludes=[],
    win_no_prefer_redirects=False,
    win_private_assemblies=False,
    cipher=block_cipher,
    noarchive=False,
)
pyz = PYZ(a.pure, a.zipped_data, cipher=block_cipher)
exe = EXE(
    pyz,
    a.scripts,
    a.binaries,
    a.zipfiles,
    a.datas,
    [],
    name="funkwhale",
    debug=False,
    bootloader_ignore_signals=False,
    strip=False,
    upx=True,
    runtime_tmpdir=None,
    console=True,
)
from funkwhale_cli import cli
cli.cli()
import aiohttp
from . import exceptions
from . import logs
from . import schemas
from . import settings
@@ -79,7 +78,8 @@ class API(object):
         full_url = self.base_url + path
         headers = kwargs.setdefault("headers", {})
         if self.token:
-            headers["Authorization"] = "JWT {}".format(self.token)
+            scheme = "JWT" if len(self.token) > 50 else "Bearer"
+            headers["Authorization"] = " ".join([scheme, str(self.token)])
         handler = getattr(self._session, method)
         return handler(full_url, *args, **kwargs)
from . import auth
from . import albums
from . import artists
from . import channels
from . import favorites
from . import libraries
from . import playlists
from . import tracks
from . import uploads
from . import users
from . import server
from .base import cli
__all__ = [
    "auth",
    "albums",
    "artists",
    "channels",
    "favorites",
    "libraries",
    "playlists",
    "server",
    "tracks",
    "uploads",
    "users",
    "cli",
]
import click
from . import base
@base.cli.group()
@click.pass_context
def albums(ctx):
    pass
albums_ls = base.get_ls_command(
    albums,
    "api/v1/albums/",
    output_conf={
        "labels": ["ID", "Title", "Artist", "Tracks", "Created"],
        "type": "ALBUM",
        "id_field": "ID",
    },
)
import click
from . import base
@base.cli.group()
@click.pass_context
def artists(ctx):
    pass
artists_ls = base.get_ls_command(
    artists,
    "api/v1/artists/",
    output_conf={
        "labels": ["ID", "Name", "Albums", "Tracks", "Created"],
        "type": "ARTIST",
        "id_field": "ID",
    },
)
import click
import keyring
# importing the backends explicitly is required for PyInstaller to work
import keyring.backends.kwallet
import keyring.backends.Windows
import keyring.backends.OS_X
import keyring.backends.SecretService
import keyring.backends.chainer
from . import base
from .. import api
class lazy_credential:
    """
    A proxy object that defers the keyring lookup to the latest possible point,
    cf #4
    """

    def __init__(self, *args):
        self.args = args
        self._cached_value = None

    @property
    def value(self):
        if self._cached_value:
            return self._cached_value
        try:
            v = keyring.get_password(*self.args)
        except ValueError as e:
            raise click.ClickException(
                "Error while retrieving password from keyring: {}. Your password may be incorrect.".format(
                    e.args[0]
                )
            )
        except Exception as e:
            raise click.ClickException(
                "Error while retrieving password from keyring: {}".format(e.args[0])
            )
        self._cached_value = v
        return v

    def __str__(self):
        return str(self.value)

    def __eq__(self, other):
        return self.value == other

    def __repr__(self):
        return str(self.value)

    def __bool__(self):
        return bool(self.value)

    def __len__(self):
        return len(str(self))
def init_keyring():
    # small hack to fix some weird issues with pyinstaller and keyring
    # there seems to be a cache issue somewhere
    try:
        del keyring.backend.get_all_keyring.__wrapped__.always_returns
    except AttributeError:
        pass
    keyring.core.init_backend()
    # /end of hack
@base.cli.command()
@click.option("-u", "--username", envvar="FUNKWHALE_USERNAME", prompt=True)
@click.option(
    "-p", "--password", envvar="FUNKWHALE_PASSWORD", prompt=True, hide_input=True
)
@click.pass_context
@base.async_command
async def login(ctx, username, password):
    async with api.get_session() as session:
        token = await api.get_jwt_token(
            session, ctx.obj["SERVER_URL"], username=username, password=password
        )
    try:
        keyring.set_password(ctx.obj["SERVER_URL"], "_", token)
    except ValueError as e:
        raise click.ClickException(
            "Error while storing password in keyring: {}. Your password may be incorrect.".format(
                e.args[0]
            )
        )
    except Exception as e:
        raise click.ClickException(
            "Error while storing password in keyring: {}".format(e.args[0])
        )
    click.echo("Login successful!")
@base.cli.command()
@click.pass_context
@base.async_command
async def logout(ctx):
    keyring.delete_password(ctx.obj["SERVER_URL"], "_")
    click.echo("Logout successful!")
import asyncio
import aiohttp
import click
import click_log
import dotenv
import functools
import ssl
import sys
import urllib.parse
import math
import json
from funkwhale_cli import api
from funkwhale_cli import config
from funkwhale_cli import exceptions
from funkwhale_cli import logs
from funkwhale_cli import output
from funkwhale_cli import utils
click_log.basic_config(logs.logger)
NOOP = object()
SSL_PROTOCOLS = (asyncio.sslproto.SSLProtocol,)
try:
    import uvloop.loop
except ImportError:
    pass
else:
    SSL_PROTOCOLS = (*SSL_PROTOCOLS, uvloop.loop.SSLProtocol)
def ignore_aiohttp_ssl_eror(loop):
    """Ignore aiohttp #3535 / cpython #13548 issue with SSL data after close

    There is an issue in Python 3.7 up to 3.7.3 that over-reports a
    ssl.SSLError fatal error (ssl.SSLError: [SSL: KRB5_S_INIT] application data
    after close notify (_ssl.c:2609)) after we are already done with the
    connection. See GitHub issues aio-libs/aiohttp#3535 and
    python/cpython#13548.

    Given a loop, this sets up an exception handler that ignores this specific
    exception, but passes everything else on to the previous exception handler
    this one replaces.

    Checks for fixed Python versions, disabling itself when running on 3.7.4+
    or 3.8.
    """
    if sys.version_info >= (3, 7, 4):
        return

    orig_handler = loop.get_exception_handler()

    def ignore_ssl_error(loop, context):
        if context.get("message") in {
            "SSL error in data received",
            "Fatal error on transport",
        }:
            # validate we have the right exception, transport and protocol
            exception = context.get("exception")
            protocol = context.get("protocol")
            if (
                isinstance(exception, ssl.SSLError)
                and exception.reason == "KRB5_S_INIT"
                and isinstance(protocol, SSL_PROTOCOLS)
            ):
                if loop.get_debug():
                    asyncio.log.logger.debug("Ignoring asyncio SSL KRB5_S_INIT error")
                return
        if orig_handler is not None:
            orig_handler(loop, context)
        else:
            loop.default_exception_handler(context)

    loop.set_exception_handler(ignore_ssl_error)
def noop_decorator(f):
    return f
def URL(v):
    if v is NOOP:
        raise click.ClickException(
            "You need to specify a server, either via the -H flag or using the FUNKWHALE_SERVER_URL environment variable"
        )
    v = str(v) if v else None
    parsed = urllib.parse.urlparse(v)
    if parsed.scheme not in ["http", "https"] or not parsed.netloc:
        raise ValueError("{} is not a valid url".format(v))
    if not v.endswith("/"):
        v = v + "/"
    return v
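For reference, the same normalization can be exercised outside of click; a standalone sketch of the rules above (`normalize_server_url` is an illustrative name, not part of the CLI):

```python
import urllib.parse


def normalize_server_url(v: str) -> str:
    # Reject anything that is not an absolute http(s) URL, then
    # guarantee a trailing slash so API paths can be appended safely.
    parsed = urllib.parse.urlparse(v)
    if parsed.scheme not in ("http", "https") or not parsed.netloc:
        raise ValueError("{} is not a valid url".format(v))
    return v if v.endswith("/") else v + "/"
```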
def env_file(v):
    if v is NOOP:
        v = None
    if v is not None:
        v = click.Path(exists=True)(v)
    env_files = [v or ".env", config.get_env_file()]
    for p in env_files:
        logs.logger.debug("Loading env file at {}".format(p))
        dotenv.load_dotenv(p, override=False)
    return v
def async_command(f):
    def wrapper(*args, **kwargs):
        loop = asyncio.get_event_loop()
        ignore_aiohttp_ssl_eror(loop)
        _async_reraise = kwargs.pop("_async_reraise", False)
        try:
            return loop.run_until_complete(f(*args, **kwargs))
        except aiohttp.client_exceptions.ClientError as e:
            if _async_reraise:
                raise
            message = str(e)
            if hasattr(e, "status") and e.status == 401:
                message = "Remote answered with {}, ensure you are logged in first".format(
                    e.status
                )
            raise click.ClickException(message)
        except exceptions.FunkwhaleError as e:
            if _async_reraise:
                raise
            message = str(e)
            raise click.ClickException(message)
        else:
            raise

    return functools.update_wrapper(wrapper, f)
async def check_status(response):
    text = await response.text()
    try:
        response.raise_for_status()
    except aiohttp.client_exceptions.ClientError as e:
        raise click.ClickException(str(e) + ": {}".format(text))
SERVER_DECORATOR = click.option(
    "-H",
    "--url",
    envvar="FUNKWHALE_SERVER_URL",
    type=URL,
    default=NOOP,
    help="The URL of the Funkwhale server to query",
)
TOKEN_DECORATOR = click.option(
    "-t",
    "--token",
    envvar="FUNKWHALE_TOKEN",
    help="A Bearer token to use for authentication",
)
RAW_DECORATOR = click.option(
    "--raw", is_flag=True, help="Directly output the JSON returned by the server"
)
def set_server(ctx, url, token, use_auth=True):
    from . import auth

    ctx.ensure_object(dict)
    ctx.obj["SERVER_URL"] = url
    parsed = urllib.parse.urlparse(url)
    ctx.obj["SERVER_NETLOC"] = parsed.netloc
    ctx.obj["SERVER_PROTOCOL"] = parsed.scheme
    token = (token or auth.lazy_credential(url, "_")) if use_auth else None
    ctx.obj["remote"] = api.get_api(
        domain=ctx.obj["SERVER_NETLOC"],
        protocol=ctx.obj["SERVER_PROTOCOL"],
        token=token,
    )
@click.group()
@click.option(
    "-e",
    "--env-file",
    envvar="ENV_FILE",
    type=env_file,
    default=NOOP,
    help="Path to an env file to use. A .env file will be used automatically if present",
)
@click.option(
    "-q",
    "--quiet",
    envvar="FUNKWHALE_QUIET",
    is_flag=True,
    default=False,
    help="Disable logging",
)
@click.option(
    "--no-login",
    envvar="FUNKWHALE_NO_LOGIN",
    is_flag=True,
    default=False,
    help="Disable authentication/keyring",
)
@SERVER_DECORATOR
@TOKEN_DECORATOR
@click_log.simple_verbosity_option(logs.logger, expose_value=True)
@click.pass_context
def cli(ctx, env_file, url, verbosity, token, quiet, no_login):
    from . import auth

    auth.init_keyring()
    ctx.ensure_object(dict)
    logs.logger.disabled = quiet
    set_server(ctx, url, token, use_auth=not no_login)
def get_pagination_data(payload):
    data = {"next_page": None, "page_size": None}
    if payload.get("next"):
        next_page = utils.get_url_param(payload["next"], "page")
        data["next_page"] = int(next_page)
        data["total_pages"] = math.ceil(payload["count"] / len(payload["results"]))
        data["current_page"] = int(next_page) - 1
        data["page_size"] = len(payload["results"])
    if payload.get("previous"):
        previous_page = utils.get_url_param(payload["previous"], "page") or 0
        data.setdefault("current_page", int(previous_page) + 1)
        data.setdefault("total_pages", data["current_page"])
        if (
            not data["page_size"]
            and payload["count"] - len(payload["results"]) > 0
            and data["total_pages"] > 1
        ):
            data["page_size"] = int(payload["count"] - len(payload["results"])) / (
                data["total_pages"] - 1
            )
    data.setdefault("current_page", 1)
    data.setdefault("total_pages", 1)
    return data
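get_pagination_data relies on utils.get_url_param to pull the page number out of DRF-style next/previous links, plus a ceiling division for the page count. A self-contained sketch of those two pieces (function names here are illustrative, not the CLI's own):

```python
import math
import urllib.parse


def get_page_param(url):
    # Extract the ?page= query parameter from a pagination link, if present.
    query = urllib.parse.urlparse(url).query
    values = urllib.parse.parse_qs(query).get("page")
    return int(values[0]) if values else None


def total_pages(count, page_size):
    # Same rounding as get_pagination_data: ceil(count / page size).
    return math.ceil(count / page_size)
```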
def get_ls_command(
    group,
    endpoint,
    output_conf,
    pagination=True,
    filter=True,
    ordering=True,
    scope=False,
    with_id=False,
    owned_conf=None,
    name="ls",
    doc="",
    id_metavar="ID",
):
    available_fields = sorted(
        set(output_conf["labels"]) | set(output.FIELDS["*"].keys())
    )
    id_decorator = (
        click.argument("id", metavar=id_metavar) if with_id else noop_decorator
    )
    page_decorator = (
        click.option("--page", "-p", type=click.INT, default=1)
        if pagination
        else noop_decorator
    )
    page_size_decorator = (
        click.option("--page-size", "-s", type=click.INT, default=None)
        if pagination
        else noop_decorator
    )
    limit_decorator = (
        click.option("--limit", "-l", type=click.INT, default=1)
        if pagination
        else noop_decorator
    )
    ordering_decorator = (
        click.option("--ordering", "-o", default=None) if ordering else noop_decorator
    )
    scope_decorator = click.option("--scope", default=None) if scope else noop_decorator
    filter_decorator = (
        click.option("--filter", "-f", multiple=True) if filter else noop_decorator
    )
    owned_decorator = (
        click.option("--owned", is_flag=True, default=False)
        if owned_conf
        else noop_decorator
    )

    @id_decorator
    @click.argument("query", nargs=-1)
    @RAW_DECORATOR
    @click.option(
        "--format", "-t", type=click.Choice(output.TABLE_FORMATS), default="simple"
    )
    @click.option("--no-headers", "-h", is_flag=True, default=False)
    @click.option("--ids", "-i", is_flag=True)
    @page_decorator
    @page_size_decorator
    @ordering_decorator
    @scope_decorator
    @filter_decorator
    @limit_decorator
    @owned_decorator
    @click.option(
        "--column",
        "-c",
        multiple=True,
        help="Which column to display. Available: {}. \nDefault: {}".format(
            ", ".join(available_fields), ", ".join(output_conf["labels"])
        ),
    )
    @click.pass_context
    @async_command
    async def ls(ctx, raw, column, format, no_headers, ids, **kwargs):
        id = kwargs.get("id")
        limit = kwargs.get("limit")
        page = kwargs.get("page")
        page_size = kwargs.get("page_size")
        ordering = kwargs.get("ordering")
        scope = kwargs.get("scope")
        filter = kwargs.get("filter")
        query = kwargs.get("query")
        owned = kwargs.get("owned")
        if ids:
            no_headers = True
            column = [output_conf.get("id_field", "UUID")]
            format = "plain"
        base_url = endpoint
        if with_id:
            base_url = base_url.format(id)
        next_page_url = None
        page_count = 0
        while True:
            if limit and page_count >= limit:
                break
            async with ctx.obj["remote"]:
                if not pagination or page_count == 0:
                    url = base_url
                    params = {}
                    if page:
                        params["page"] = page
                    if page_size:
                        params["page_size"] = page_size
                    if ordering:
                        params["ordering"] = ordering
                    if scope:
                        params["scope"] = scope
                    if query:
                        params["q"] = " ".join(query)
                    if filter:
                        for f in filter:
                            query = urllib.parse.parse_qs(f)
                            for k, v in query.items():
                                params[k] = v[0]
                    if owned_conf and owned:
                        user_info = await get_user_info(ctx)
                        params[owned_conf["param"]] = utils.recursive_getattr(
                            user_info, owned_conf["field"]
                        )
                else:
                    params = {}
                    url = next_page_url
                if not url:
                    break
                result = await ctx.obj["remote"].request("get", url, params=params)
                result.raise_for_status()
                payload = await result.json()
                next_page_url = payload.get("next")
                page_count += 1
                if raw:
                    click.echo(json.dumps(payload, sort_keys=True, indent=4))
                else:
                    click.echo(
                        output.table(
                            payload["results"],
                            column or output_conf["labels"],
                            type=output_conf["type"],
                            format=format,
                            headers=not no_headers,
                        )
                    )
                if not pagination:
                    break
                pagination_data = get_pagination_data(payload)
                if pagination_data["page_size"]:
                    start = (
                        int(
                            (pagination_data["current_page"] - 1)
                            * pagination_data["page_size"]
                        )
                        + 1
                    )
                else:
                    start = 1
                end = min(start + len(payload["results"]) - 1, payload["count"])
                logs.logger.info(
                    "\nObjects {start}-{end} on {total} (page {current_page}/{total_pages})".format(
                        start=start,
                        end=end,
                        total=payload["count"],
                        current_page=pagination_data["current_page"],
                        total_pages=pagination_data["total_pages"] or 1,
                    )
                )

    ls.__doc__ = doc
    return group.command(name)(ls)
def get_show_command(
    group, url_template, output_conf, name="show", force_id=None, doc=""
):
    available_fields = sorted(
        set(output_conf["labels"]) | set(output.FIELDS["*"].keys())
    )
    if force_id:

        def id_decorator(f):
            @functools.wraps(f)
            def inner(raw, column, format):
                return f(raw, force_id, column, format)

            return inner

    else:
        id_decorator = click.argument("id")

    @id_decorator
    @RAW_DECORATOR
    @click.option(
        "--format", "-t", type=click.Choice(output.TABLE_FORMATS), default="simple"
    )
    @click.option(
        "--column",
        "-c",
        multiple=True,
        help="Which column to display. Available: {}. \nDefault: {}".format(
            ", ".join(available_fields), ", ".join(output_conf["labels"])
        ),
    )
    @click.pass_context
    @async_command
    async def show(ctx, raw, id, column, format):
        async with ctx.obj["remote"]:
            async with ctx.obj["remote"].request(
                "get", url_template.format(id)
            ) as result:
                result.raise_for_status()
                payload = await result.json()
        if raw:
            click.echo(json.dumps(payload, sort_keys=True, indent=4))
        else:
            click.echo(
                output.obj_table(
                    payload,
                    column or output_conf["labels"],
                    type=output_conf["type"],
                    format=format,
                )
            )

    show.__doc__ = ""
    return group.command(name)(show)
def get_delete_command(
    group,
    url_template,
    confirm="Do you want to delete {} objects? This action is irreversible.",
    doc="Delete the given objects",
    name="rm",
    id_metavar="ID",
):
    @click.argument("id", nargs=-1, metavar=id_metavar)
    @RAW_DECORATOR
    @click.option("--no-input", is_flag=True)
    @click.pass_context
    @async_command
    async def delete(ctx, raw, id, no_input):
        async with ctx.obj["remote"]:
            if not no_input and not click.confirm(confirm.format(len(id)), abort=True):
                return
            for i in id:
                result = await ctx.obj["remote"].request(
                    "delete", url_template.format(i)
                )
                if result.status == 404:
                    logs.logger.warn("Couldn't delete {}: object not found".format(i))
                else:
                    result.raise_for_status()
        click.echo("{} Objects deleted!".format(len(id)))

    delete.__doc__ = doc
    return group.command(name)(delete)
async def get_user_info(ctx):
    async with ctx.obj["remote"].request("get", "api/v1/users/users/me/") as result:
        result.raise_for_status()
        return await result.json()
import click
import json
from . import base
from .. import output
@base.cli.group()
@click.pass_context
def channels(ctx):
    """
    Manage channels
    """
channels_ls = base.get_ls_command(
    channels,
    "api/v1/channels/",
    scope=True,
    output_conf={
        "labels": ["UUID", "Name", "Category", "Username", "Tags"],
        "type": "CHANNEL",
    },
)
channels_rm = base.get_delete_command(channels, "api/v1/channels/{}/")
@channels.command("create")
@click.option("--name", prompt=True)
@click.option("--username", prompt=True)
@click.option("--cover")
@click.option("--description")
@click.option("--tags", default="")
@click.option("--itunes-category", default=None)
@click.option("--language", default=None)
@click.option(
    "--content-category",
    type=click.Choice(["music", "podcast", "other"]),
    default="podcast",
    prompt=True,
)
@click.option("--raw", is_flag=True)
@click.pass_context
@base.async_command
async def channels_create(
    ctx,
    raw,
    name,
    username,
    cover,
    description,
    tags,
    language,
    itunes_category,
    content_category,
):
    data = {
        "name": name,
        "username": username,
        "content_category": content_category,
    }
    if description:
        data["description"] = {
            "text": description,
            "content_type": "text/markdown",
        }
    else:
        data["description"] = None
    if cover:
        data["cover"] = cover
    missing_fields = []
    if not itunes_category:
        missing_fields.append(("--itunes-category", "itunes_category"))
    if not language:
        missing_fields.append(("--language", "language"))
    if content_category == "podcast" and missing_fields:
        click.secho(
            "You must provide adequate values for these fields: {}. Allowed values are listed below".format(
                ", ".join([f for f, _ in missing_fields])
            ),
            fg="red",
        )
        async with ctx.obj["remote"]:
            result = await ctx.obj["remote"].request(
                "get", "api/v1/channels/metadata-choices"
            )
            choices = await result.json()
            for flag, field in missing_fields:
                click.echo(
                    "{}: {}".format(flag, ", ".join([c["value"] for c in choices[field]]))
                )
        raise click.ClickException("Missing params")
    data["tags"] = [t.strip() for t in tags.replace(" ", ",").split(",") if t]
    if content_category == "podcast":
        data["metadata"] = {
            "itunes_category": itunes_category,
            "language": language,
        }
    async with ctx.obj["remote"]:
        result = await ctx.obj["remote"].request("post", "api/v1/channels/", json=data)
        await base.check_status(result)
        payload = await result.json()
    if raw:
        click.echo(json.dumps(payload, sort_keys=True, indent=4))
    else:
        click.echo("Channel created:")
        click.echo(
            output.table(
                [payload], ["UUID", "Name", "Category", "Username"], type="CHANNEL"
            )
        )