Compare commits: 3a299a040a...1.11

3 commits: d6cabd5f6c, 63a824f9d9, a12c2fb57e
Dockerfile (36 deletions)

@@ -1,36 +0,0 @@
-# Use a slim Python image
-FROM python:3.11-slim-bullseye
-
-ENV PYTHONUNBUFFERED 1
-
-WORKDIR /konova
-
-# Install system dependencies
-RUN apt-get update && apt-get install -y --no-install-recommends \
-    gdal-bin redis-server nginx \
-    && rm -rf /var/lib/apt/lists/*  # save space
-
-# Create required directories & set permissions
-RUN mkdir -p /var/log/nginx /var/log/gunicorn /var/lib/nginx /tmp/nginx_client_body \
-    && touch /var/log/nginx/access.log /var/log/nginx/error.log \
-    && chown -R root:root /var/log/nginx /var/lib/nginx /tmp/nginx_client_body
-
-# Copy and install Python dependencies
-COPY ./requirements.txt /konova/
-RUN pip install --upgrade pip && pip install --no-cache-dir -r requirements.txt
-
-# Remove the default nginx site and replace it with our own config
-RUN rm -rf /etc/nginx/sites-enabled/default
-COPY ./nginx.conf /etc/nginx/conf.d
-
-# Copy the remaining project files
-COPY . /konova/
-
-# Collect static files
-RUN python manage.py collectstatic --noinput
-
-# Expose ports
-#EXPOSE 80 6379 8000
-
-# Set entrypoint
-ENTRYPOINT ["/konova/docker-entrypoint.sh"]
README.md (56 changes)

@@ -4,7 +4,6 @@ the database postgresql and the css library bootstrap as well as the icon packag
 fontawesome for a modern look, following best practices from the industry.
 
 ## Background processes
-### !!! For non-docker run
 Konova uses celery for background processing. To start the worker you need to run
 ```shell
 $ celery -A konova worker -l INFO
@@ -19,58 +18,3 @@ Technical documentation is provided in the projects git wiki.
 A user documentation is not available (and not needed, yet).
 
 
-# Docker
-To run the docker-compose as expected, you need to take the following steps:
-
-1. Create a database-providing docker container, using an appropriate compose file, e.g. the following
-```
-version: '3.3'
-services:
-  postgis:
-    image: postgis/postgis
-    restart: always
-    container_name: postgis-docker
-    ports:
-      - 5433:5432
-    volumes:
-      - db-volume:/var/lib/postgresql/data
-    environment:
-      - POSTGRES_PASSWORD=postgres
-      - POSTGRES_USER=postgres
-    networks:
-      - db-network-bridge
-
-networks:
-  db-network-bridge:
-    driver: "bridge"
-
-volumes:
-  db-volume:
-```
-This compose file creates a Docker container running postgresql and postgis, creates the default superuser postgres,
-creates a named volume for persisting the database and creates a new network bridge, which **must be used by any other
-container, which wants to write/read on this database**.
-
-2. Make sure the name of the network bridge above matches the network in the konova docker-compose.yml
-3. Get into the running postgis container (`docker exec -it postgis-docker bash`) and create new databases, users and so on. Make sure the database `konova` exists now!
-4. Replace all `CHANGE_ME_xy` values inside of konova/docker-compose.yml for your installation. Make sure the `SSO_HOST` holds the proper SSO host, e.g. for the arnova project `arnova.example.org` (Arnova must be installed and the webserver configured as well, of course)
-5. Take a look at konova/settings.py and konova/sub_settings/django_settings.py. Again: Replace all occurrences of `CHANGE_ME` with proper values for your installation.
-   1. Make sure you have the proper host strings added to `ALLOWED_HOSTS` inside of django_settings.py.
-6. Build and run the docker setup using `docker-compose build` and `docker-compose start` from the main directory of this project (where the docker-compose.yml lives)
-7. Run migrations! To do so, get into the konova service container (`docker exec -it konova-docker bash`) and run the needed commands (`python manage.py makemigrations LIST_OF_ALL_MIGRATABLE_APPS`, then `python manage.py migrate`)
-8. Run the setup command `python manage.py setup` and follow the instructions on the CLI
-9. To enable **SMTP** mail support, make sure your host machine (the one where the docker containers run) has the postfix service configured properly. Make sure the `mynetworks` variable is extended with the docker network bridge IP, created in the postgis container and used by the konova services.
-   1. **Hint**: You can find this out easily by trying to send a test mail in the running konova web application (which will fail, of course). Then take a look at the latest entries in `/var/log/mail.log` on your host machine. The failing IP will be displayed there.
-   2. **Please note**: This installation guide is based on SMTP using postfix!
-   3. Restart the postfix service on your host machine to reload the new configuration (`service postfix restart`)
-10. Finally, make sure your host machine's webserver passes incoming requests properly to the docker nginx webserver of konova. A proper nginx config for the host machine may look like this:
-```
-server {
-    server_name konova.domain.org;
-
-    location / {
-        proxy_pass http://localhost:KONOVA_NGINX_DOCKER_PORT/;
-        proxy_set_header Host $host;
-    }
-}
-```
@@ -86,7 +86,7 @@ class EcoAccountWorkflowTestCase(BaseWorkflowTestCase):
         new_title = self.create_dummy_string()
         new_identifier = self.create_dummy_string()
         new_comment = self.create_dummy_string()
-        new_geometry = MultiPolygon(srid=4326)  # Create an empty geometry
+        new_geometry = self.create_dummy_geometry()
         test_conservation_office = self.get_conservation_office_code()
         test_deductable_surface = self.eco_account.deductable_surface + 100
 
@@ -103,7 +103,7 @@ class EcoAccountWorkflowTestCase(BaseWorkflowTestCase):
             "identifier": new_identifier,
             "title": new_title,
             "comment": new_comment,
-            "geom": new_geometry.geojson,
+            "geom": self.create_geojson(new_geometry),
             "surface": test_deductable_surface,
             "conservation_office": test_conservation_office.id
         }
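The tests now send geometries through a `create_geojson` helper instead of calling `.geojson` on the geometry directly. The project's helper is not shown in this diff; as a rough sketch of what such a serializer does, here is a hypothetical stand-alone version (not the konova implementation) that wraps polygon rings into a GeoJSON FeatureCollection string:

```python
import json

def polygon_to_geojson(rings, srid=4326):
    """Serialize polygon rings into a GeoJSON FeatureCollection string."""
    feature = {
        "type": "Feature",
        "geometry": {"type": "MultiPolygon", "coordinates": [rings]},
        "properties": {},
    }
    return json.dumps({
        "type": "FeatureCollection",
        "crs": {"type": "name", "properties": {"name": f"EPSG:{srid}"}},
        "features": [feature],
    })

geojson = polygon_to_geojson([[(0, 0), (1, 0), (1, 1), (0, 0)]])
```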
docker-compose.yml (23 deletions)

@@ -1,23 +0,0 @@
-version: '3.3'
-
-services:
-  konova:
-    external_links:
-      - postgis:db
-      - arnova-nginx-server:arnova
-    build: .
-    image: "ksp/konova:1.8"
-    container_name: "konova-docker"
-    command: ./docker-entrypoint.sh
-    restart: always
-    volumes:
-      - /data/apps/konova/uploaded_files:/konova_uploaded_files
-    ports:
-      - "1337:80"
-
-# Instead of an own, new network, we need to connect to the existing one, which is provided by the postgis container
-# NOTE: THIS NETWORK MUST EXIST
-networks:
-  default:
-    external:
-      name: postgis_nat_it_backend
docker-entrypoint.sh (27 deletions)

@@ -1,27 +0,0 @@
-#!/bin/bash
-
-set -e  # Exit the script on errors
-set -o pipefail  # Do not ignore errors in pipelines
-
-# Start Redis
-redis-server --daemonize yes
-
-# Start the Celery worker in the background
-celery -A konova worker --loglevel=info &
-
-# Start nginx as a background process
-nginx -g "daemon off;" &
-
-# Set the gunicorn worker count (default: (2*CPUs)+1)
-WORKERS=${GUNICORN_WORKERS:-$((2 * $(nproc) + 1))}
-
-# Make sure the log files exist
-mkdir -p /var/log/gunicorn
-touch /var/log/gunicorn/access.log /var/log/gunicorn/error.log
-
-# Start gunicorn as the main process
-exec gunicorn --workers="$WORKERS" konova.wsgi:application \
-    --bind=0.0.0.0:8000 \
-    --access-logfile /var/log/gunicorn/access.log \
-    --error-logfile /var/log/gunicorn/error.log \
-    --access-logformat '%({x-real-ip}i)s via %(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"'
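The entrypoint derives the gunicorn worker count as `(2*CPUs)+1` unless `GUNICORN_WORKERS` overrides it. The same fallback logic can be sketched in Python (the function name is illustrative, not part of the project):

```python
import os

def gunicorn_worker_count(env=os.environ):
    """Return GUNICORN_WORKERS if set, else the (2*CPUs)+1 default."""
    override = env.get("GUNICORN_WORKERS")
    if override is not None:
        return int(override)
    return 2 * (os.cpu_count() or 1) + 1
```

On a 4-core machine the default would be 9 workers; setting `GUNICORN_WORKERS=4` pins it to 4.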
@@ -84,7 +84,7 @@ class EmaWorkflowTestCase(BaseWorkflowTestCase):
         new_title = self.create_dummy_string()
         new_identifier = self.create_dummy_string()
         new_comment = self.create_dummy_string()
-        new_geometry = MultiPolygon(srid=4326)  # Create an empty geometry
+        new_geometry = self.create_dummy_geometry()
         test_conservation_office = self.get_conservation_office_code()
 
         check_on_elements = {
@@ -99,7 +99,7 @@ class EmaWorkflowTestCase(BaseWorkflowTestCase):
             "identifier": new_identifier,
             "title": new_title,
             "comment": new_comment,
-            "geom": new_geometry.geojson,
+            "geom": self.create_geojson(new_geometry),
             "conservation_office": test_conservation_office.id
         }
         self.client_user.post(url, post_data)
@@ -124,7 +124,7 @@ class EditInterventionFormTestCase(NewInterventionFormTestCase):
         self.assertIsNotNone(obj.responsible.handler)
         self.assertEqual(obj.title, data["title"])
         self.assertEqual(obj.comment, data["comment"])
-        self.assertTrue(test_geom.equals_exact(obj.geometry.geom, 0.000001))
+        self.assert_equal_geometries(test_geom, obj.geometry.geom)
 
         self.assertEqual(obj.legal.binding_date, today)
         self.assertEqual(obj.legal.registration_date, today)
@@ -72,9 +72,8 @@ class SimpleGeomForm(BaseForm):
             # will be rendered again on failed submit
             self.initialize_form_field("geom", self.data["geom"])
 
-        # Read geojson into gdal geometry
-        # HINT: This can be simplified if the geojson format holds data in epsg:4326 (GDAL provides direct creation for
-        # this case)
+        # Initialize features list with empty MultiPolygon, so that an empty input will result in a
+        # proper empty MultiPolygon object
        features = []
        features_json = geom.get("features", [])
        accepted_ogr_types = [
@@ -102,19 +101,22 @@ class SimpleGeomForm(BaseForm):
                 return is_valid
 
             is_valid &= self.__is_area_valid(g)
-            polygon = Polygon.from_ewkt(g.ewkt)
-            is_valid &= polygon.valid
-            if not polygon.valid:
-                self.add_error("geom", polygon.valid_reason)
+            g = Polygon.from_ewkt(g.ewkt)
+            is_valid &= g.valid
+            if not g.valid:
+                self.add_error("geom", g.valid_reason)
                 return is_valid
 
-            features.append(polygon)
+            if isinstance(g, Polygon):
+                features.append(g)
+            elif isinstance(g, MultiPolygon):
+                features.extend(list(g))
 
         # Unionize all geometry features into one new MultiPolygon
-        form_geom = MultiPolygon(srid=DEFAULT_SRID_RLP)
-        for feature in features:
-            form_geom = form_geom.union(feature)
+        if features:
+            form_geom = MultiPolygon(*features, srid=DEFAULT_SRID_RLP).unary_union
+        else:
+            form_geom = MultiPolygon(srid=DEFAULT_SRID_RLP)
 
         # Make sure to convert into a MultiPolygon. Relevant if a single Polygon is provided.
         form_geom = Geometry.cast_to_multipolygon(form_geom)
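The rewritten loop collects plain polygons and the members of multi-polygons into one flat feature list before unionizing. Outside GeoDjango the same isinstance dispatch can be sketched with minimal stub classes standing in for the GEOS types (the stubs are illustrative only):

```python
class Polygon:
    """Minimal stand-in for django.contrib.gis.geos.Polygon."""
    def __init__(self, name):
        self.name = name

class MultiPolygon:
    """Minimal stand-in; iterating yields the member polygons."""
    def __init__(self, *polys):
        self.polys = list(polys)
    def __iter__(self):
        return iter(self.polys)

def flatten(geometries):
    """Collect Polygons and the members of MultiPolygons into one flat list."""
    features = []
    for g in geometries:
        if isinstance(g, Polygon):
            features.append(g)
        elif isinstance(g, MultiPolygon):
            features.extend(list(g))
    return features
```

Flattening first means the final union sees only polygons, which is what makes the single `unary_union` call in the new code possible.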
@@ -395,7 +395,7 @@ class Geometry(BaseResource):
             output_geom
         """
         output_geom = input_geom
-        if input_geom.geom_type != "MultiPolygon":
+        if not isinstance(input_geom, MultiPolygon):
             output_geom = MultiPolygon(input_geom, srid=DEFAULT_SRID_RLP)
         return output_geom
 
@@ -192,10 +192,11 @@ STATICFILES_DIRS = [
 ]
 
 # EMAIL (see https://docs.djangoproject.com/en/dev/topics/email/)
 
+# CHANGE_ME !!! ONLY FOR DEVELOPMENT !!!
 if DEBUG:
-    # ONLY FOR DEVELOPMENT NEEDED
     EMAIL_BACKEND = 'django.core.mail.backends.filebased.EmailBackend'
-    EMAIL_FILE_PATH = '/tmp/app-messages'
+    EMAIL_FILE_PATH = '/tmp/app-messages'  # change this to a proper location
 
 DEFAULT_FROM_EMAIL = env("DEFAULT_FROM_EMAIL")  # The default email address for the 'from' element
 SERVER_EMAIL = DEFAULT_FROM_EMAIL  # The default email sender address, which is used by Django to send errors via mail
@@ -469,7 +469,7 @@ class BaseTestCase(TestCase):
         eco_account.save()
         return eco_account
 
-    def assert_equal_geometries(self, geom1: MultiPolygon, geom2: MultiPolygon, tolerance = 0.001):
+    def assert_equal_geometries(self, geom1: MultiPolygon, geom2: MultiPolygon, tolerance=0.001):
         """ Assert for geometries to be equal
 
         Transforms the geometries to matching srids before checking
@@ -491,7 +491,10 @@ class BaseTestCase(TestCase):
         # transformation from one coordinate system into the other, which is valid
         geom1.transform(geom2.srid)
         geom2.transform(geom1.srid)
-        self.assertTrue(geom1.equals_exact(geom2, tolerance) or geom2.equals_exact(geom1, tolerance))
+        self.assertTrue(
+            geom1.equals_exact(geom2, tolerance=tolerance),
+            msg=f"Difference is {abs(geom1.area - geom2.area)} with {geom1.area} and {geom2.area} in a tolerance of {tolerance}"
+        )
 
 
 class BaseViewTestCase(BaseTestCase):
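The new assertion passes the tolerance as a keyword and attaches a diagnostic message built from the two areas, so a failing test reports how far apart the geometries were. The message-building pattern can be sketched without GEOS, with plain floats standing in for geometry areas (names here are illustrative):

```python
def equal_within_tolerance(area1, area2, tolerance=0.001):
    """Compare two areas and report the difference when they diverge."""
    difference = abs(area1 - area2)
    ok = difference <= tolerance
    msg = (f"Difference is {difference} with {area1} and {area2} "
           f"in a tolerance of {tolerance}")
    return ok, msg
```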
nginx.conf (25 deletions)

@@ -1,25 +0,0 @@
-server {
-    listen 80;
-    client_max_body_size 25M;
-
-    location / {
-        proxy_pass http://127.0.0.1:8000;
-        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
-        proxy_set_header X-Real-IP $remote_addr;
-        proxy_set_header Host $host;
-        proxy_redirect off;
-        proxy_cache_bypass $http_upgrade;
-    }
-
-    location /static/ {
-        alias /konova/static/;
-        access_log /var/log/nginx/access.log;
-        autoindex off;
-        types {
-            text/css css;
-            application/javascript js;
-        }
-    }
-
-    error_log /var/log/nginx/error.log;
-}
@@ -48,7 +48,7 @@ pytz==2024.2
 PyYAML==6.0.2
 qrcode==7.3.1
 redis==5.1.0b6
-requests<2.32.0
+requests==2.32.3
 six==1.16.0
 soupsieve==2.5
 sqlparse==0.5.1
@@ -15,6 +15,7 @@ class UserNotificationAdmin(admin.ModelAdmin):
 class UserAdmin(admin.ModelAdmin):
     list_display = [
         "id",
+        "sso_identifier",
         "username",
         "first_name",
         "last_name",
user/migrations/0010_user_sso_identifier.py (new file, 18 additions)

@@ -0,0 +1,18 @@
+# Generated by Django 5.1.6 on 2025-09-12 06:10
+
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('user', '0009_user_oauth_token'),
+    ]
+
+    operations = [
+        migrations.AddField(
+            model_name='user',
+            name='sso_identifier',
+            field=models.CharField(blank=True, db_comment='Identifies the account based on an unique identifier from the SSO system', max_length=255, null=True),
+        ),
+    ]
@@ -6,6 +6,7 @@ Created on: 15.11.21
 
 """
 from django.contrib.auth.models import AbstractUser
+from django.core.exceptions import ObjectDoesNotExist
 
 from django.db import models
 
@@ -32,6 +33,12 @@ class User(AbstractUser):
         db_comment="OAuth token for the user",
         related_name="+"
     )
+    sso_identifier = models.CharField(
+        blank=True,
+        null=True,
+        db_comment="Identifies the account based on an unique identifier from the SSO system",
+        max_length=255,
+    )
 
     def is_notification_setting_set(self, notification_enum: UserNotificationEnum):
         return self.notifications.filter(
@@ -264,4 +271,48 @@ class User(AbstractUser):
             self.oauth_token.delete()
         self.oauth_token = token
         self.save()
         return self
+
+    @staticmethod
+    def resolve_user_using_propagation_data(data: dict):
+        """ Fetches user from db by the given data from propagation process
+
+        Args:
+            data (dict): json containing user information from the sso system
+
+        Returns:
+            user (User): The resolved user
+        """
+        username = data.get("username", None)
+        sso_identifier = data.get("sso_identifier", None)
+        if not username and not sso_identifier:
+            raise AssertionError("No username or sso identifier provided")
+
+        try:
+            user = User.objects.get(username=username)
+        except ObjectDoesNotExist:
+            try:
+                user = User.objects.get(sso_identifier=sso_identifier)
+            except ObjectDoesNotExist:
+                raise ObjectDoesNotExist("No user with this username or sso identifier was found")
+
+        return user
+
+    def update_user_using_propagation_data(self, data: dict):
+        """ Update user data based on propagation data from sso system
+
+        Args:
+            data (dict): json containing user information from the sso system
+
+        Returns:
+            user (User): The updated user
+        """
+        skipable_attrs = {
+            "is_staff",
+            "is_superuser",
+        }
+        for _attr, _val in data.items():
+            if _attr in skipable_attrs:
+                continue
+            setattr(self, _attr, _val)
+        return self
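The new `resolve_user_using_propagation_data` tries the username first and only falls back to the `sso_identifier` when no match exists. The lookup cascade can be sketched against a plain in-memory store, a hypothetical stand-in for the ORM lookups (none of these names are from the project):

```python
class NotFound(Exception):
    """Stand-in for django's ObjectDoesNotExist."""

def resolve(data, by_username, by_sso):
    """Look the user up by username first, then by sso_identifier."""
    username = data.get("username")
    sso_identifier = data.get("sso_identifier")
    if not username and not sso_identifier:
        raise AssertionError("No username or sso identifier provided")
    if username in by_username:
        return by_username[username]
    if sso_identifier in by_sso:
        return by_sso[sso_identifier]
    raise NotFound("No user with this username or sso identifier was found")
```

The fallback lets the SSO system rename an account without losing it, as long as the stable `sso_identifier` still matches.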
@@ -44,17 +44,8 @@ class PropagateUserView(View):
 
         try:
             status = "updated"
-            user = User.objects.get(username=body.get('username'))
-            # Update user data, excluding some changes
-            skipable_attrs = {
-                "username",
-                "is_staff",
-                "is_superuser",
-            }
-            for _attr, _val in body.items():
-                if _attr in skipable_attrs:
-                    continue
-                setattr(user, _attr, _val)
+            user = User.resolve_user_using_propagation_data(body)
+            user = user.update_user_using_propagation_data(body)
         except ObjectDoesNotExist:
             user = User(**body)
             status = "created"