feat: Issue pagination (#4109)

* dev: move issue queryset order by into a separate utility function

* dev: pagination for spreadsheet and gantt

* dev: group pagination

* dev: paginate single entities

* dev: refactor pagination

* dev: paginating issue apis

* dev: grouped pagination for empty groups

* dev: ungrouped list

* dev: fix paginator for single groups

* dev: fix paginating true list

* dev: state__group pagination

* fix: imports

* dev: fix grouping on target date and project_id

* dev: remove unused imports

* dev: add ruff in dependencies

* make store changes for pagination

* fix some build errors due to type changes

* dev: add total pages key

* chore: paginator changes

* implement pagination for spreadsheet, list, kanban and calendar

* fix: order by grouped pagination

* dev: sub group paginator

* dev: grouped paginator

* dev: sub grouping paginator

* restructure gantt layout charts

* dev: fix pagination count

* dev: date filtering for issues

* dev: group by counts

* implement new logic for pagination layouts

* fix: label id and assignee id interchange

* dev: fix priority ordering

* fix group by bugs

* dev: grouping for priority

* fix reordering while updating

* dev: fix order by for pagination

* fix: total results for sub group pagination

* dev: add comments and fix ordering

* fix order by priority for spreadsheet

* fix subGroupCount

* Fix logic for load more in Kanban

* fix issue quick add

* dev: fix issue creation

* dev: add sorting

* fix order by for modules and cycles

* fix non-rendering of issues

* fix subGroupKey generation when subGroupId is null

* dev: fix cycle and module issue

* dev: fix sub grouping

* fix: imports

* fix minor build errors

* fix major build errors

* fix priority order by

* grouped pagination cursor logic changes

* fix calendar pagination

* active cycle issues pagination

* dev: fix lint errors

* fix Kanban subgroup dnd

* fix empty subgroup kanbans

* fix updating from an empty field with groupBy

* fix issue count of groups

* fix issue sorting on first page fetch

* dev: remove pagination from list endpoint, add ordering for sub grouping, and handle error for empty issues

* refactor module and cycle issues

* fix quick add refactor

* refactor gantt roots

* fix empty states

* fix filter params

* fix group by module

* minor UX changes

* fix sub grouping in Kanban

* remove unnecessary sorting logic in backend (Nikhil's changes)

* dev: add error handling when used without on_results

* calendar layout loader improvement

* list per page count logic change

* spreadsheet loader improvement

* Added loader for issues load more pagination

* fix quick add in gantt

* dev: add profile issue pagination

* fix all issue and profile issues logic

* remove empty state from calendar layout

* use useEffect instead of SWR to fetch issues, for quick switching between views, cycles, etc.

* dev: add aggregation for multi fields

* fix priority sorting for workspace issues

* fix move from draft for draft issues

* fix pagination loader for spreadsheet

* fetch project, module and cycle stats on update, create and delete of issues

* increase horizontal margin

* change load more pagination to on scroll pagination for active cycle issues

* fix linting error

* dev: fix ordering when ordering by m2m fields

* dev: fix null paginations

* dev: commenting

* add comments to the issue store methods

* fix order by for array properties

* fix: priority ordering

* perform optimistic updates while adding or removing cycles or modules

* fix build errors

* dev: add default values when iterating through sub group

* Move code from EE to CE repo

* chore: folder structure updates

* Move sortable and radio input to packages/ui

* chore: updated empty and loading screens

* chore: delete an estimate point

* chore: estimate point response change

* chore: updated create estimate and handled the build error

* chore: migration fixes

* chore: updated create estimate

* [WEB-1322] dev: conflict-free pages collaboration (#4463)

* chore: pages realtime

* chore: empty binary response

* chore: added a ypy package

* feat: pages collaboration

* chore: update fetching logic

* chore: downgrade ypy version

* chore: replace useEffect fetch logic with useSWR

* chore: move all the update logic to the page store

* refactor: remove react-hook-form

* chore: save description_html as well

* chore: migrate old data logic

* fix: added description_binary as field name

* fix: code cleanup

* refactor: create separate hook to handle page description

* fix: build errors

* chore: combine updates instead of using the whole document

* chore: removed ypy package

* chore: added conflict resolving logic to the client side

* chore: add a save changes button

* chore: add read-only validation

* chore: remove saving state information

* chore: added permission class

* chore: removed the migration file

* chore: corrected the model field

* chore: rename pageStore to page

* chore: update collaboration provider

* chore: add try catch to handle error

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>

* chore: create estimate workflow update

* chore: editing and deleting the existing estimate updates

* chore: updating the new estimates in update modal

* chore: ui changed

* chore: response changes of get and post

* chore: new field added in estimates

* chore: individual endpoint for estimate points

* chore: typo changes

* chore: create estimate point

* chore: integrated new endpoints

* chore: update key value pair

* chore: update sorting in the estimates

* Add custom option in the estimate templates

* chore: handled current project active estimate

* chore: handle estimate update workflow

* chore: AIO docker images for preview deployments (#4605)

* fix: adding single docker base file

* action added

* fix action

* dockerfile.base modified

* action fix

* dockerfile

* fix: base aio dockerfile

* fix: dockerfile.base

* fix: dockerfile base

* fix: modified folder structure

* fix: action

* fix: dockerfile

* fix: dockerfile.base

* fix: supervisor file name changed

* fix: base dockerfile updated

* fix dockerfile base

* fix: base dockerfile

* fix: docker files

* fix: base dockerfile

* update base image

* modified docker aio base

* aio base modified to debian-12-slim

* fixes

* finalize the dockerfiles with volume exposure

* modified the aio build and dockerfile

* fix: codacy suggestions implemented

* fix: codacy fix

* update aio build action

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>

* chore: handled estimates switch

* chore: handled estimate edit

* chore: handled close button in estimate edit

* chore: updated create estimate workflow

* chore: updated switch estimate

* fix minor bugs in base issues store

* single column scroll pagination

* UI changes for load more button

* chore: UI and typos

* chore: resolved build error

* [WEB-1184] feat: issue bulk operations (#4530)

* chore: bulk operations

* chore: archive bulk issues

* chore: bulk ops keys changed

* chore: bulk delete and archive confirmation modals

* style: list layout spacing

* chore: create hoc for multi-select groups

* chore: update multiple select components

* chore: archive, target and start date error message

* chore: edge case handling

* chore: bulk ops in spreadsheet layout

* chore: update UI

* chore: scroll element into view

* fix: shift + arrow navigation

* chore: implement bulk ops in the gantt layout

* fix: ui bugs

* chore: move selection logic to store

* fix: group selection

* refactor: multiple select store

* style: dropdowns UI

* fix: bulk assignee and label update mutation

* chore: removed migrations

* refactor: entities grouping logic

* fix performance issue in selection of bulk ops

* fix: shift keyboard navigation

* fix: group click action

* chore: start and target date validation

* chore: remove optimistic updates, check archivability in frontend

* chore: code optimisation

* chore: add store comments

* refactor: component fragmentation

* style: issue active state

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
Co-authored-by: rahulramesha <rahulramesham@gmail.com>

* fix a performance issue when there are too many groups

* chore: updated delete dropdown and handled the repeated values while creating and updating the estimate point

* [WEB-1424] chore: page and view logo implementation, and emoji/icon picker improvement (#4583)

* chore: added logo_props

* chore: logo props in cycles, views and modules

* chore: emoji icon picker types updated

* chore: info icon added to plane ui package

* chore: icon color adjust helper function added

* style: icon picker ui improvement and default color options updated

* chore: update page logo action added in store

* chore: emoji code to unicode helper function added

* chore: common logo renderer component added

* chore: app header project logo updated

* chore: project logo updated across platform

* chore: page logo picker added

* chore: control link component improvement

* chore: list item improvement

* chore: emoji picker component updated

* chore: space app and package logo prop type updated

* chore: migration

* chore: logo added to project view

* chore: page logo picker added in create modal and breadcrumbs

* chore: view logo picker added in create modal and updated breadcrumbs

* fix: build error

* chore: AIO docker images for preview deployments (#4605)

* fix: adding single docker base file

* action added

* fix action

* dockerfile.base modified

* action fix

* dockerfile

* fix: base aio dockerfile

* fix: dockerfile.base

* fix: dockerfile base

* fix: modified folder structure

* fix: action

* fix: dockerfile

* fix: dockerfile.base

* fix: supervisor file name changed

* fix: base dockerfile updated

* fix dockerfile base

* fix: base dockerfile

* fix: docker files

* fix: base dockerfile

* update base image

* modified docker aio base

* aio base modified to debian-12-slim

* fixes

* finalize the dockerfiles with volume exposure

* modified the aio build and dockerfile

* fix: codacy suggestions implemented

* fix: codacy fix

* update aio build action

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>

* fix: merge conflict

* chore: lucide react added to plane ui package

* chore: new emoji picker component added with lucide icons and code refactor

* chore: logo component updated

* chore: emoji picker updated for pages and views

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
Co-authored-by: Manish Gupta <59428681+mguptahub@users.noreply.github.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>

* chore: handled inline errors in the estimate switch

* fix module and cycle drag and drop

* Fix issue count bug for accumulated actions

* chore: handled active and availability validation

* chore: handled create and update components in project estimates

* chore: added migration

* Add category-specific values for custom template

* chore: estimate dropdown handled in issues

* chore: estimate alerts

* fix bulk updates

* chore: updated alerts

* add optional chaining

* Extract the list row actions

* change color of load more to match new Issues

* list group collapsible

* fix: updated and handled the estimate points

* fix: upgrader ee banner

* Fix issues with sortable

* Fix sortable spacing issue in create estimate modal

* fix: updated the issue create sorting

* chore: removed radio button from ui and updated in the estimates

* chore: resolved import error in packages/ui

* chore: handled props in create modal

* chore: removed ee files

* chore: changed default analytics

* fix: pagination ordering for grouped and subgrouped

* chore: removed the migration file

* chore: estimate point value in graph

* chore: estimate point key change

* chore: squashed migration (#4634)

* chore: squashed migration

* chore: removed instance migration

* chore: key changes

* chore: issue activity back migration

* dev: replaced estimate key with estimate id and changed estimate type from number to string in issue

* chore: estimate point value field

* chore: estimate point activity

* chore: removed the unused function

* chore: resolved merge conflicts

* chore: deploy board keys changed

* chore: yarn lock file change

* chore: resolved frontend build

---------

Co-authored-by: guru_sainath <gurusainath007@gmail.com>

* [WEB-1516] refactor: space app routing and layouts (#4705)

* dev: change layout

* chore: replace workspace slug and project id with anchor

* chore: migration fixes

* chore: update filtering logic

* chore: endpoint changes

* chore: update endpoint

* chore: changed url patterns

* chore: use client side for layout and page

* chore: issue vote changes

* chore: project deploy board response change

* refactor: publish project store and components

* fix: update layout options after fetching settings

* chore: remove unnecessary types

* style: peek overview

* refactor: components folder structure

* fix: redirect from old path

* chore: make the whole issue block clickable

* chore: removed the migration file

* chore: add server side redirection for old routes

* chore: is enabled key change

* chore: update types

* chore: removed the migration file

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>

* Merge develop into revamp-estimates-ce

* chore: removed migration file and updated the estimate system order and removed ee banner

* chore: initial radio select in create estimate

* chore: space key changes

* Fix sortable component as the sort order was broken.

* fix: formatting and linting errors

* fix Alignment for load more

* add logic to approuter

* fix approuter changes and fix build

* chore: fixed the linting issue

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
Co-authored-by: Satish Gandham <satish.iitg@gmail.com>
Co-authored-by: guru_sainath <gurusainath007@gmail.com>
Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
Co-authored-by: Aaryan Khandelwal <65252264+aaryan610@users.noreply.github.com>
Co-authored-by: Manish Gupta <59428681+mguptahub@users.noreply.github.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
Co-authored-by: Anmol Singh Bhatia <121005188+anmolsinghbhatia@users.noreply.github.com>
Co-authored-by: Bavisetti Narayan <72156168+NarayanBavisetti@users.noreply.github.com>
Co-authored-by: pushya22 <130810100+pushya22@users.noreply.github.com>
Author: rahulramesha
Date: 2024-06-10 20:15:03 +05:30 (committed by GitHub)
Commit: 666d35afb9 (parent: 7ac07b7b73)
234 changed files with 9056 additions and 6188 deletions


@@ -2,10 +2,6 @@
from .base import BaseSerializer
from plane.db.models import Estimate, EstimatePoint
from plane.app.serializers import (
WorkspaceLiteSerializer,
ProjectLiteSerializer,
)
from rest_framework import serializers


@@ -19,6 +19,8 @@ from plane.app.views import (
IssueUserDisplayPropertyEndpoint,
IssueViewSet,
LabelViewSet,
BulkIssueOperationsEndpoint,
BulkArchiveIssuesEndpoint,
)
urlpatterns = [
@@ -81,6 +83,11 @@ urlpatterns = [
BulkDeleteIssuesEndpoint.as_view(),
name="project-issues-bulk",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/bulk-archive-issues/",
BulkArchiveIssuesEndpoint.as_view(),
name="bulk-archive-issues",
),
##
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/sub-issues/",
@@ -298,4 +305,9 @@ urlpatterns = [
),
name="project-issue-draft",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/bulk-operation-issues/",
BulkIssueOperationsEndpoint.as_view(),
name="bulk-operations-issues",
),
]
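
The two routes added above can be sanity-checked by reversing them by name — a minimal sketch assuming a configured Django environment; the workspace slug and UUIDs are hypothetical:

from uuid import uuid4

from django.urls import reverse

archive_url = reverse(
    "bulk-archive-issues",
    kwargs={"slug": "my-workspace", "project_id": uuid4()},
)
ops_url = reverse(
    "bulk-operations-issues",
    kwargs={"slug": "my-workspace", "project_id": uuid4()},
)
# Both resolve under the project scope, e.g.
# .../workspaces/my-workspace/projects/<project_id>/bulk-archive-issues/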


@@ -113,9 +113,7 @@ from .issue.activity import (
IssueActivityEndpoint,
)
from .issue.archive import (
IssueArchiveViewSet,
)
from .issue.archive import IssueArchiveViewSet, BulkArchiveIssuesEndpoint
from .issue.attachment import (
IssueAttachmentEndpoint,
@@ -154,6 +152,8 @@ from .issue.subscriber import (
)
from .issue.bulk_operations import BulkIssueOperationsEndpoint
from .module.base import (
ModuleViewSet,
ModuleLinkViewSet,


@@ -1,4 +1,6 @@
# Python imports
import traceback
import zoneinfo
from django.conf import settings
from django.core.exceptions import ObjectDoesNotExist, ValidationError
@@ -76,7 +78,11 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
response = super().handle_exception(exc)
return response
except Exception as e:
print(e) if settings.DEBUG else print("Server Error")
(
print(e, traceback.format_exc())
if settings.DEBUG
else print("Server Error")
)
if isinstance(e, IntegrityError):
return Response(
{"error": "The payload is not valid"},


@@ -2,43 +2,50 @@
import json
# Django imports
from django.db.models import (
Func,
F,
Q,
OuterRef,
Value,
UUIDField,
)
from django.core import serializers
from django.db.models import (
F,
Func,
OuterRef,
Q,
)
from django.utils import timezone
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.db.models.functions import Coalesce
# Third party imports
from rest_framework.response import Response
from rest_framework import status
from rest_framework.response import Response
from plane.app.permissions import (
ProjectEntityPermission,
)
# Module imports
from .. import BaseViewSet
from plane.app.serializers import (
IssueSerializer,
CycleIssueSerializer,
)
from plane.app.permissions import ProjectEntityPermission
from plane.bgtasks.issue_activites_task import issue_activity
from plane.db.models import (
Cycle,
CycleIssue,
Issue,
IssueLink,
IssueAttachment,
IssueLink,
)
from plane.utils.grouper import (
issue_group_values,
issue_on_results,
issue_queryset_grouper,
)
from plane.bgtasks.issue_activites_task import issue_activity
from plane.utils.issue_filters import issue_filters
from plane.utils.user_timezone_converter import user_timezone_converter
from plane.utils.order_queryset import order_issue_queryset
from plane.utils.paginator import (
GroupedOffsetPaginator,
SubGroupedOffsetPaginator,
)
# Module imports
class CycleIssueViewSet(BaseViewSet):
serializer_class = CycleIssueSerializer
@@ -86,14 +93,9 @@ class CycleIssueViewSet(BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug, project_id, cycle_id):
fields = [
field
for field in request.GET.get("fields", "").split(",")
if field
]
order_by = request.GET.get("order_by", "created_at")
order_by_param = request.GET.get("order_by", "created_at")
filters = issue_filters(request.query_params, "GET")
queryset = (
issue_queryset = (
Issue.issue_objects.filter(issue_cycle__cycle_id=cycle_id)
.filter(project_id=project_id)
.filter(workspace__slug=slug)
@@ -105,7 +107,6 @@ class CycleIssueViewSet(BaseViewSet):
"issue_module__module",
"issue_cycle__cycle",
)
.order_by(order_by)
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
@@ -130,73 +131,112 @@
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
module_ids=Coalesce(
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
.order_by(order_by)
)
if self.fields:
issues = IssueSerializer(
queryset, many=True, fields=fields if fields else None
).data
else:
issues = queryset.values(
"id",
"name",
"state_id",
"sort_order",
"completed_at",
"estimate_point",
"priority",
"start_date",
"target_date",
"sequence_id",
"project_id",
"parent_id",
"cycle_id",
"module_ids",
"label_ids",
"assignee_ids",
"sub_issues_count",
"created_at",
"updated_at",
"created_by",
"updated_by",
"attachment_count",
"link_count",
"is_draft",
"archived_at",
)
datetime_fields = ["created_at", "updated_at"]
issues = user_timezone_converter(
issues, datetime_fields, request.user.user_timezone
)
filters = issue_filters(request.query_params, "GET")
return Response(issues, status=status.HTTP_200_OK)
order_by_param = request.GET.get("order_by", "-created_at")
issue_queryset = issue_queryset.filter(**filters)
# Issue queryset
issue_queryset, order_by_param = order_issue_queryset(
issue_queryset=issue_queryset,
order_by_param=order_by_param,
)
# Group by
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
# issue queryset
issue_queryset = issue_queryset_grouper(
queryset=issue_queryset,
group_by=group_by,
sub_group_by=sub_group_by,
)
if group_by:
# Check group and sub group value paginate
if sub_group_by:
if group_by == sub_group_by:
return Response(
{
"error": "Group by and sub group by cannot have same parameters"
},
status=status.HTTP_400_BAD_REQUEST,
)
else:
# group and sub group pagination
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=SubGroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
sub_group_by_fields=issue_group_values(
field=sub_group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
# Group Paginate
else:
# Group paginate
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=GroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
group_by_field_name=group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
else:
# List Paginate
return self.paginate(
order_by=order_by_param,
request=request,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by, issues=issues, sub_group_by=sub_group_by
),
)
def create(self, request, slug, project_id, cycle_id):
issues = request.data.get("issues", [])
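
The reworked list flow above is driven entirely by query parameters. A hedged request sketch — the exact mount prefix, route path, auth header, and the cursor/per_page parameter names are assumptions about the deployment and BasePaginator; group_by, sub_group_by, and order_by come straight from the view, and group_by == sub_group_by is rejected with HTTP 400 as shown:

import requests

params = {
    "group_by": "state__group",   # one group per state group
    "sub_group_by": "priority",   # must differ from group_by
    "order_by": "-created_at",
    "per_page": 50,               # assumed paginator parameter
}
resp = requests.get(
    "http://localhost:8000/api/workspaces/my-workspace"
    "/projects/<project_uuid>/cycles/<cycle_uuid>/cycle-issues/",
    params=params,
    headers={"Authorization": "Bearer <token>"},  # hypothetical auth
)
resp.raise_for_status()
page = resp.json()  # paginated, grouped results produced by issue_on_results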


@@ -1,52 +1,53 @@
# Django imports
from django.db.models import (
Q,
Case,
When,
Value,
CharField,
Count,
F,
Exists,
OuterRef,
Subquery,
JSONField,
Func,
Prefetch,
IntegerField,
)
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.db.models import UUIDField
from django.db.models import (
Case,
CharField,
Count,
Exists,
F,
Func,
IntegerField,
JSONField,
OuterRef,
Prefetch,
Q,
Subquery,
UUIDField,
Value,
When,
)
from django.db.models.functions import Coalesce
from django.utils import timezone
from rest_framework import status
# Third Party imports
from rest_framework.response import Response
from rest_framework import status
from plane.app.serializers import (
DashboardSerializer,
IssueActivitySerializer,
IssueSerializer,
WidgetSerializer,
)
from plane.db.models import (
Dashboard,
DashboardWidget,
Issue,
IssueActivity,
IssueAttachment,
IssueLink,
IssueRelation,
Project,
ProjectMember,
User,
Widget,
)
from plane.utils.issue_filters import issue_filters
# Module imports
from .. import BaseAPIView
from plane.db.models import (
Issue,
IssueActivity,
ProjectMember,
Widget,
DashboardWidget,
Dashboard,
Project,
IssueLink,
IssueAttachment,
IssueRelation,
User,
)
from plane.app.serializers import (
IssueActivitySerializer,
IssueSerializer,
DashboardSerializer,
WidgetSerializer,
)
from plane.utils.issue_filters import issue_filters
def dashboard_overview_stats(self, request, slug):
@@ -569,6 +570,7 @@ def dashboard_recent_collaborators(self, request, slug):
)
return self.paginate(
order_by=request.GET.get("order_by", "-created_at"),
request=request,
queryset=project_members_with_activities,
controller=lambda qs: self.get_results_controller(qs, slug),


@@ -1,14 +1,14 @@
# Third Party imports
from rest_framework.response import Response
from rest_framework import status
from rest_framework.response import Response
from plane.app.permissions import WorkSpaceAdminPermission
from plane.app.serializers import ExporterHistorySerializer
from plane.bgtasks.export_task import issue_export_task
from plane.db.models import ExporterHistory, Project, Workspace
# Module imports
from .. import BaseAPIView
from plane.app.permissions import WorkSpaceAdminPermission
from plane.bgtasks.export_task import issue_export_task
from plane.db.models import Project, ExporterHistory, Workspace
from plane.app.serializers import ExporterHistorySerializer
class ExportIssuesEndpoint(BaseAPIView):
@@ -72,6 +72,7 @@ class ExportIssuesEndpoint(BaseAPIView):
"cursor", False
):
return self.paginate(
order_by=request.GET.get("order_by", "-created_at"),
request=request,
queryset=exporter_history,
on_results=lambda exporter_history: ExporterHistorySerializer(


@@ -2,52 +2,54 @@
import json
# Django imports
from django.utils import timezone
from django.db.models import (
Prefetch,
OuterRef,
Func,
F,
Q,
Case,
Value,
CharField,
When,
Exists,
Max,
UUIDField,
)
from django.core.serializers.json import DjangoJSONEncoder
from django.db.models import (
F,
Func,
OuterRef,
Q,
Prefetch,
Exists,
)
from django.utils import timezone
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.db.models.functions import Coalesce
# Third Party imports
from rest_framework.response import Response
from rest_framework import status
from rest_framework.response import Response
# Module imports
from .. import BaseViewSet
from plane.app.serializers import (
IssueSerializer,
IssueFlatSerializer,
IssueDetailSerializer,
)
from plane.app.permissions import (
ProjectEntityPermission,
)
from plane.app.serializers import (
IssueFlatSerializer,
IssueSerializer,
IssueDetailSerializer
)
from plane.bgtasks.issue_activites_task import issue_activity
from plane.db.models import (
Issue,
IssueLink,
IssueAttachment,
IssueLink,
IssueSubscriber,
IssueReaction,
)
from plane.bgtasks.issue_activites_task import issue_activity
from plane.utils.grouper import (
issue_group_values,
issue_on_results,
issue_queryset_grouper,
)
from plane.utils.issue_filters import issue_filters
from plane.utils.user_timezone_converter import user_timezone_converter
from plane.utils.order_queryset import order_issue_queryset
from plane.utils.paginator import (
GroupedOffsetPaginator,
SubGroupedOffsetPaginator,
)
# Module imports
from .. import BaseViewSet, BaseAPIView
class IssueArchiveViewSet(BaseViewSet):
permission_classes = [
@@ -92,33 +94,6 @@ class IssueArchiveViewSet(BaseViewSet):
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
module_ids=Coalesce(
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
)
@method_decorator(gzip_page)
@@ -126,125 +101,116 @@
filters = issue_filters(request.query_params, "GET")
show_sub_issues = request.GET.get("show_sub_issues", "true")
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
state_order = [
"backlog",
"unstarted",
"started",
"completed",
"cancelled",
]
order_by_param = request.GET.get("order_by", "-created_at")
issue_queryset = self.get_queryset().filter(**filters)
# Priority Ordering
if order_by_param == "priority" or order_by_param == "-priority":
priority_order = (
priority_order
if order_by_param == "priority"
else priority_order[::-1]
)
issue_queryset = issue_queryset.annotate(
priority_order=Case(
*[
When(priority=p, then=Value(i))
for i, p in enumerate(priority_order)
],
output_field=CharField(),
)
).order_by("priority_order")
# State Ordering
elif order_by_param in [
"state__name",
"state__group",
"-state__name",
"-state__group",
]:
state_order = (
state_order
if order_by_param in ["state__name", "state__group"]
else state_order[::-1]
)
issue_queryset = issue_queryset.annotate(
state_order=Case(
*[
When(state__group=state_group, then=Value(i))
for i, state_group in enumerate(state_order)
],
default=Value(len(state_order)),
output_field=CharField(),
)
).order_by("state_order")
# assignee and label ordering
elif order_by_param in [
"labels__name",
"-labels__name",
"assignees__first_name",
"-assignees__first_name",
]:
issue_queryset = issue_queryset.annotate(
max_values=Max(
order_by_param[1::]
if order_by_param.startswith("-")
else order_by_param
)
).order_by(
"-max_values"
if order_by_param.startswith("-")
else "max_values"
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
issue_queryset = (
issue_queryset
if show_sub_issues == "true"
else issue_queryset.filter(parent__isnull=True)
)
if self.expand or self.fields:
issues = IssueSerializer(
issue_queryset,
many=True,
fields=self.fields,
).data
else:
issues = issue_queryset.values(
"id",
"name",
"state_id",
"sort_order",
"completed_at",
"estimate_point",
"priority",
"start_date",
"target_date",
"sequence_id",
"project_id",
"parent_id",
"cycle_id",
"module_ids",
"label_ids",
"assignee_ids",
"sub_issues_count",
"created_at",
"updated_at",
"created_by",
"updated_by",
"attachment_count",
"link_count",
"is_draft",
"archived_at",
)
datetime_fields = ["created_at", "updated_at"]
issues = user_timezone_converter(
issues, datetime_fields, request.user.user_timezone
)
# Issue queryset
issue_queryset, order_by_param = order_issue_queryset(
issue_queryset=issue_queryset,
order_by_param=order_by_param,
)
return Response(issues, status=status.HTTP_200_OK)
# Group by
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
# issue queryset
issue_queryset = issue_queryset_grouper(
queryset=issue_queryset,
group_by=group_by,
sub_group_by=sub_group_by,
)
if group_by:
# Check group and sub group value paginate
if sub_group_by:
if group_by == sub_group_by:
return Response(
{
"error": "Group by and sub group by cannot have same parameters"
},
status=status.HTTP_400_BAD_REQUEST,
)
else:
# group and sub group pagination
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=SubGroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
sub_group_by_fields=issue_group_values(
field=sub_group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
# Group Paginate
else:
# Group paginate
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=GroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
group_by_field_name=group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
else:
# List Paginate
return self.paginate(
order_by=order_by_param,
request=request,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by, issues=issues, sub_group_by=sub_group_by
),
)
def retrieve(self, request, slug, project_id, pk=None):
issue = (
@@ -351,3 +317,58 @@ class IssueArchiveViewSet(BaseViewSet):
issue.save()
return Response(status=status.HTTP_204_NO_CONTENT)
class BulkArchiveIssuesEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def post(self, request, slug, project_id):
issue_ids = request.data.get("issue_ids", [])
if not len(issue_ids):
return Response(
{"error": "Issue IDs are required"},
status=status.HTTP_400_BAD_REQUEST,
)
issues = Issue.objects.filter(
workspace__slug=slug, project_id=project_id, pk__in=issue_ids
).select_related("state")
bulk_archive_issues = []
for issue in issues:
if issue.state.group not in ["completed", "cancelled"]:
return Response(
{
"error_code": 4091,
"error_message": "INVALID_ARCHIVE_STATE_GROUP"
},
status=status.HTTP_400_BAD_REQUEST,
)
issue_activity.delay(
type="issue.activity.updated",
requested_data=json.dumps(
{
"archived_at": str(timezone.now().date()),
"automation": False,
}
),
actor_id=str(request.user.id),
issue_id=str(issue.id),
project_id=str(project_id),
current_instance=json.dumps(
IssueSerializer(issue).data, cls=DjangoJSONEncoder
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
issue.archived_at = timezone.now().date()
bulk_archive_issues.append(issue)
Issue.objects.bulk_update(bulk_archive_issues, ["archived_at"])
return Response(
{"archived_at": str(timezone.now().date())},
status=status.HTTP_200_OK,
)
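
The endpoint above archives only issues whose state group is completed or cancelled; anything else aborts the whole batch. A hedged client sketch (host, URL prefix, and auth are assumptions; the payload key, the 4091 error code, and the response shape come from the code above):

import requests

resp = requests.post(
    "http://localhost:8000/api/workspaces/my-workspace"
    "/projects/<project_uuid>/bulk-archive-issues/",
    json={"issue_ids": ["<issue_uuid_1>", "<issue_uuid_2>"]},
    headers={"Authorization": "Bearer <token>"},  # hypothetical auth
)
if resp.status_code == 400 and resp.json().get("error_code") == 4091:
    print("batch rejected: an issue is not completed/cancelled")
else:
    print("archived at", resp.json()["archived_at"])  # e.g. "2024-06-10"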


@@ -1,34 +1,30 @@
# Python imports
import json
# Django imports
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.core.serializers.json import DjangoJSONEncoder
from django.db.models import (
Case,
CharField,
Exists,
F,
Func,
Max,
OuterRef,
Prefetch,
Q,
UUIDField,
Value,
When,
)
from django.db.models.functions import Coalesce
# Django imports
from django.utils import timezone
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
from rest_framework import status
# Third Party imports
from rest_framework import status
from rest_framework.response import Response
# Module imports
from plane.app.permissions import (
ProjectEntityPermission,
ProjectLitePermission,
@@ -49,11 +45,21 @@ from plane.db.models import (
IssueSubscriber,
Project,
)
from plane.utils.grouper import (
issue_group_values,
issue_on_results,
issue_queryset_grouper,
)
from plane.utils.issue_filters import issue_filters
from plane.utils.order_queryset import order_issue_queryset
from plane.utils.paginator import (
GroupedOffsetPaginator,
SubGroupedOffsetPaginator,
)
from .. import BaseAPIView, BaseViewSet
from plane.utils.user_timezone_converter import user_timezone_converter
# Module imports
from .. import BaseAPIView, BaseViewSet
class IssueListEndpoint(BaseAPIView):
@@ -105,110 +111,28 @@ class IssueListEndpoint(BaseAPIView):
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
module_ids=Coalesce(
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
).distinct()
filters = issue_filters(request.query_params, "GET")
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
state_order = [
"backlog",
"unstarted",
"started",
"completed",
"cancelled",
]
order_by_param = request.GET.get("order_by", "-created_at")
issue_queryset = queryset.filter(**filters)
# Issue queryset
issue_queryset, _ = order_issue_queryset(
issue_queryset=issue_queryset,
order_by_param=order_by_param,
)
# Priority Ordering
if order_by_param == "priority" or order_by_param == "-priority":
priority_order = (
priority_order
if order_by_param == "priority"
else priority_order[::-1]
)
issue_queryset = issue_queryset.annotate(
priority_order=Case(
*[
When(priority=p, then=Value(i))
for i, p in enumerate(priority_order)
],
output_field=CharField(),
)
).order_by("priority_order")
# Group by
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
# State Ordering
elif order_by_param in [
"state__name",
"state__group",
"-state__name",
"-state__group",
]:
state_order = (
state_order
if order_by_param in ["state__name", "state__group"]
else state_order[::-1]
)
issue_queryset = issue_queryset.annotate(
state_order=Case(
*[
When(state__group=state_group, then=Value(i))
for i, state_group in enumerate(state_order)
],
default=Value(len(state_order)),
output_field=CharField(),
)
).order_by("state_order")
# assignee and label ordering
elif order_by_param in [
"labels__name",
"-labels__name",
"assignees__first_name",
"-assignees__first_name",
]:
issue_queryset = issue_queryset.annotate(
max_values=Max(
order_by_param[1::]
if order_by_param.startswith("-")
else order_by_param
)
).order_by(
"-max_values"
if order_by_param.startswith("-")
else "max_values"
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
# issue queryset
issue_queryset = issue_queryset_grouper(
queryset=issue_queryset,
group_by=group_by,
sub_group_by=sub_group_by,
)
if self.fields or self.expand:
issues = IssueSerializer(
@@ -304,33 +228,6 @@ class IssueViewSet(BaseViewSet):
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
module_ids=Coalesce(
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
).distinct()
@method_decorator(gzip_page)
@@ -340,116 +237,104 @@
issue_queryset = self.get_queryset().filter(**filters)
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
state_order = [
"backlog",
"unstarted",
"started",
"completed",
"cancelled",
]
# Priority Ordering
if order_by_param == "priority" or order_by_param == "-priority":
priority_order = (
priority_order
if order_by_param == "priority"
else priority_order[::-1]
)
issue_queryset = issue_queryset.annotate(
priority_order=Case(
*[
When(priority=p, then=Value(i))
for i, p in enumerate(priority_order)
],
output_field=CharField(),
)
).order_by("priority_order")
# Issue queryset
issue_queryset, order_by_param = order_issue_queryset(
issue_queryset=issue_queryset,
order_by_param=order_by_param,
)
# State Ordering
elif order_by_param in [
"state__name",
"state__group",
"-state__name",
"-state__group",
]:
state_order = (
state_order
if order_by_param in ["state__name", "state__group"]
else state_order[::-1]
)
issue_queryset = issue_queryset.annotate(
state_order=Case(
*[
When(state__group=state_group, then=Value(i))
for i, state_group in enumerate(state_order)
],
default=Value(len(state_order)),
output_field=CharField(),
# Group by
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
# issue queryset
issue_queryset = issue_queryset_grouper(
queryset=issue_queryset,
group_by=group_by,
sub_group_by=sub_group_by,
)
if group_by:
if sub_group_by:
if group_by == sub_group_by:
return Response(
{
"error": "Group by and sub group by cannot have same parameters"
},
status=status.HTTP_400_BAD_REQUEST,
)
else:
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=SubGroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
sub_group_by_fields=issue_group_values(
field=sub_group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
else:
# Group paginate
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=GroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
group_by_field_name=group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
).order_by("state_order")
# assignee and label ordering
elif order_by_param in [
"labels__name",
"-labels__name",
"assignees__first_name",
"-assignees__first_name",
]:
issue_queryset = issue_queryset.annotate(
max_values=Max(
order_by_param[1::]
if order_by_param.startswith("-")
else order_by_param
)
).order_by(
"-max_values"
if order_by_param.startswith("-")
else "max_values"
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
# Only use serializer when expand or fields else return by values
if self.expand or self.fields:
issues = IssueSerializer(
issue_queryset,
many=True,
fields=self.fields,
expand=self.expand,
).data
else:
issues = issue_queryset.values(
"id",
"name",
"state_id",
"sort_order",
"completed_at",
"estimate_point",
"priority",
"start_date",
"target_date",
"sequence_id",
"project_id",
"parent_id",
"cycle_id",
"module_ids",
"label_ids",
"assignee_ids",
"sub_issues_count",
"created_at",
"updated_at",
"created_by",
"updated_by",
"attachment_count",
"link_count",
"is_draft",
"archived_at",
return self.paginate(
order_by=order_by_param,
request=request,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by, issues=issues, sub_group_by=sub_group_by
),
)
datetime_fields = ["created_at", "updated_at"]
issues = user_timezone_converter(
issues, datetime_fields, request.user.user_timezone
)
return Response(issues, status=status.HTTP_200_OK)
def create(self, request, slug, project_id):
project = Project.objects.get(pk=project_id)
@@ -481,8 +366,13 @@ class IssueViewSet(BaseViewSet):
origin=request.META.get("HTTP_ORIGIN"),
)
issue = (
self.get_queryset()
.filter(pk=serializer.data["id"])
issue_queryset_grouper(
queryset=self.get_queryset().filter(
pk=serializer.data["id"]
),
group_by=None,
sub_group_by=None,
)
.values(
"id",
"name",
@@ -523,6 +413,33 @@
issue = (
self.get_queryset()
.filter(pk=pk)
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
module_ids=Coalesce(
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
.prefetch_related(
Prefetch(
"issue_reactions",

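
The repeated priority/state/m2m ordering blocks deleted in the hunks above now live in plane/utils/order_queryset.py. A minimal sketch of that utility, reconstructed from the inlined logic this PR removes — the real implementation may differ, in particular in how it rewrites the order_by parameter it returns for the paginator:

from django.db.models import Case, CharField, Max, Value, When

PRIORITY_ORDER = ["urgent", "high", "medium", "low", "none"]
STATE_ORDER = ["backlog", "unstarted", "started", "completed", "cancelled"]

def order_issue_queryset(issue_queryset, order_by_param="-created_at"):
    # Priority: rank each value with Case/When, then sort by the annotation
    if order_by_param in ("priority", "-priority"):
        order = PRIORITY_ORDER if order_by_param == "priority" else PRIORITY_ORDER[::-1]
        issue_queryset = issue_queryset.annotate(
            priority_order=Case(
                *[When(priority=p, then=Value(i)) for i, p in enumerate(order)],
                output_field=CharField(),
            )
        ).order_by("priority_order")
        return issue_queryset, "priority_order"
    # State group: same ranking trick, with a default for unknown groups
    if order_by_param in ("state__name", "state__group", "-state__name", "-state__group"):
        order = (
            STATE_ORDER
            if order_by_param in ("state__name", "state__group")
            else STATE_ORDER[::-1]
        )
        issue_queryset = issue_queryset.annotate(
            state_order=Case(
                *[When(state__group=g, then=Value(i)) for i, g in enumerate(order)],
                default=Value(len(order)),
                output_field=CharField(),
            )
        ).order_by("state_order")
        return issue_queryset, "state_order"
    # Labels/assignees are m2m: collapse each issue to a single Max value first
    if order_by_param in (
        "labels__name",
        "-labels__name",
        "assignees__first_name",
        "-assignees__first_name",
    ):
        desc = order_by_param.startswith("-")
        field = "-max_values" if desc else "max_values"
        issue_queryset = issue_queryset.annotate(
            max_values=Max(order_by_param[1:] if desc else order_by_param)
        ).order_by(field)
        return issue_queryset, field
    # Plain model fields pass straight through
    return issue_queryset.order_by(order_by_param), order_by_param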

@@ -0,0 +1,288 @@
# Python imports
import json
from datetime import datetime
# Django imports
from django.utils import timezone
# Third Party imports
from rest_framework.response import Response
from rest_framework import status
# Module imports
from .. import BaseAPIView
from plane.app.permissions import (
ProjectEntityPermission,
)
from plane.db.models import (
Project,
Issue,
IssueLabel,
IssueAssignee,
)
from plane.bgtasks.issue_activites_task import issue_activity
class BulkIssueOperationsEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def post(self, request, slug, project_id):
issue_ids = request.data.get("issue_ids", [])
if not len(issue_ids):
return Response(
{"error": "Issue IDs are required"},
status=status.HTTP_400_BAD_REQUEST,
)
# Get all the issues
issues = (
Issue.objects.filter(
workspace__slug=slug, project_id=project_id, pk__in=issue_ids
)
.select_related("state")
.prefetch_related("labels", "assignees")
)
# Current epoch
epoch = int(timezone.now().timestamp())
# Project details
project = Project.objects.get(workspace__slug=slug, pk=project_id)
workspace_id = project.workspace_id
# Initialize arrays
bulk_update_issues = []
bulk_issue_activities = []
bulk_update_issue_labels = []
bulk_update_issue_assignees = []
properties = request.data.get("properties", {})
if properties.get("start_date", False) and properties.get("target_date", False):
if (
datetime.strptime(properties.get("start_date"), "%Y-%m-%d").date()
> datetime.strptime(properties.get("target_date"), "%Y-%m-%d").date()
):
return Response(
{
"error_code": 4100,
"error_message": "INVALID_ISSUE_DATES",
},
status=status.HTTP_400_BAD_REQUEST,
)
for issue in issues:
# Priority
if properties.get("priority", False):
bulk_issue_activities.append(
{
"type": "issue.activity.updated",
"requested_data": json.dumps(
{"priority": properties.get("priority")}
),
"current_instance": json.dumps(
{"priority": (issue.priority)}
),
"issue_id": str(issue.id),
"actor_id": str(request.user.id),
"project_id": str(project_id),
"epoch": epoch,
}
)
issue.priority = properties.get("priority")
# State
if properties.get("state_id", False):
bulk_issue_activities.append(
{
"type": "issue.activity.updated",
"requested_data": json.dumps(
{"state": properties.get("state")}
),
"current_instance": json.dumps(
{"state": str(issue.state_id)}
),
"issue_id": str(issue.id),
"actor_id": str(request.user.id),
"project_id": str(project_id),
"epoch": epoch,
}
)
issue.state_id = properties.get("state_id")
# Start date
if properties.get("start_date", False):
if (
issue.target_date
and not properties.get("target_date", False)
and issue.target_date
<= datetime.strptime(
properties.get("start_date"), "%Y-%m-%d"
).date()
):
return Response(
{
"error_code": 4101,
"error_message": "INVALID_ISSUE_START_DATE",
},
status=status.HTTP_400_BAD_REQUEST,
)
bulk_issue_activities.append(
{
"type": "issue.activity.updated",
"requested_data": json.dumps(
{"start_date": properties.get("start_date")}
),
"current_instance": json.dumps(
{"start_date": str(issue.start_date)}
),
"issue_id": str(issue.id),
"actor_id": str(request.user.id),
"project_id": str(project_id),
"epoch": epoch,
}
)
issue.start_date = properties.get("start_date")
# Target date
if properties.get("target_date", False):
if (
issue.start_date
and not properties.get("start_date", False)
and issue.start_date
>= datetime.strptime(
properties.get("target_date"), "%Y-%m-%d"
).date()
):
return Response(
{
"error_code": 4102,
"error_message": "INVALID_ISSUE_TARGET_DATE",
},
status=status.HTTP_400_BAD_REQUEST,
)
bulk_issue_activities.append(
{
"type": "issue.activity.updated",
"requested_data": json.dumps(
{"target_date": properties.get("target_date")}
),
"current_instance": json.dumps(
{"target_date": str(issue.target_date)}
),
"issue_id": str(issue.id),
"actor_id": str(request.user.id),
"project_id": str(project_id),
"epoch": epoch,
}
)
issue.target_date = properties.get("target_date")
bulk_update_issues.append(issue)
# Labels
if properties.get("label_ids", []):
for label_id in properties.get("label_ids", []):
bulk_update_issue_labels.append(
IssueLabel(
issue=issue,
label_id=label_id,
created_by=request.user,
project_id=project_id,
workspace_id=workspace_id,
)
)
bulk_issue_activities.append(
{
"type": "issue.activity.updated",
"requested_data": json.dumps(
{"label_ids": properties.get("label_ids", [])}
),
"current_instance": json.dumps(
{
"label_ids": [
str(label.id)
for label in issue.labels.all()
]
}
),
"issue_id": str(issue.id),
"actor_id": str(request.user.id),
"project_id": str(project_id),
"epoch": epoch,
}
)
# Assignees
if properties.get("assignee_ids", []):
for assignee_id in properties.get(
"assignee_ids", issue.assignees
):
bulk_update_issue_assignees.append(
IssueAssignee(
issue=issue,
assignee_id=assignee_id,
created_by=request.user,
project_id=project_id,
workspace_id=workspace_id,
)
)
bulk_issue_activities.append(
{
"type": "issue.activity.updated",
"requested_data": json.dumps(
{
"assignee_ids": properties.get(
"assignee_ids", []
)
}
),
"current_instance": json.dumps(
{
"assignee_ids": [
str(assignee.id)
for assignee in issue.assignees.all()
]
}
),
"issue_id": str(issue.id),
"actor_id": str(request.user.id),
"project_id": str(project_id),
"epoch": epoch,
}
)
# Bulk update all the objects
Issue.objects.bulk_update(
bulk_update_issues,
[
"priority",
"start_date",
"target_date",
"state",
],
batch_size=100,
)
# Create new labels
IssueLabel.objects.bulk_create(
bulk_update_issue_labels,
ignore_conflicts=True,
batch_size=100,
)
# Create new assignees
IssueAssignee.objects.bulk_create(
bulk_update_issue_assignees,
ignore_conflicts=True,
batch_size=100,
)
# update the issue activity
[
issue_activity.delay(**activity)
for activity in bulk_issue_activities
]
return Response(status=status.HTTP_204_NO_CONTENT)
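
Putting the endpoint above together, a hedged request sketch (host, URL prefix, and auth are assumptions; the payload shape and the 4100-4102 date error codes come from the code above):

import requests

payload = {
    "issue_ids": ["<issue_uuid_1>", "<issue_uuid_2>"],
    "properties": {
        "priority": "high",
        "state_id": "<state_uuid>",
        "start_date": "2024-06-01",
        "target_date": "2024-06-15",  # must fall after start_date
        "label_ids": ["<label_uuid>"],
        "assignee_ids": ["<member_uuid>"],
    },
}
resp = requests.post(
    "http://localhost:8000/api/workspaces/my-workspace"
    "/projects/<project_uuid>/bulk-operation-issues/",
    json=payload,
    headers={"Authorization": "Bearer <token>"},  # hypothetical auth
)
assert resp.status_code == 204  # success returns no body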


@@ -6,18 +6,14 @@ from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.core.serializers.json import DjangoJSONEncoder
from django.db.models import (
Case,
CharField,
Exists,
F,
Func,
Max,
OuterRef,
Prefetch,
Q,
UUIDField,
Value,
When,
)
from django.db.models.functions import Coalesce
from django.utils import timezone
@@ -28,6 +24,7 @@ from django.views.decorators.gzip import gzip_page
from rest_framework import status
from rest_framework.response import Response
# Module imports
from plane.app.permissions import ProjectEntityPermission
from plane.app.serializers import (
IssueCreateSerializer,
@@ -44,10 +41,17 @@ from plane.db.models import (
IssueSubscriber,
Project,
)
from plane.utils.grouper import (
issue_group_values,
issue_on_results,
issue_queryset_grouper,
)
from plane.utils.issue_filters import issue_filters
from plane.utils.user_timezone_converter import user_timezone_converter
# Module imports
from plane.utils.order_queryset import order_issue_queryset
from plane.utils.paginator import (
GroupedOffsetPaginator,
SubGroupedOffsetPaginator,
)
from .. import BaseViewSet
@@ -88,153 +92,116 @@ class IssueDraftViewSet(BaseViewSet):
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
module_ids=Coalesce(
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
).distinct()
@method_decorator(gzip_page)
def list(self, request, slug, project_id):
filters = issue_filters(request.query_params, "GET")
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
state_order = [
"backlog",
"unstarted",
"started",
"completed",
"cancelled",
]
order_by_param = request.GET.get("order_by", "-created_at")
issue_queryset = self.get_queryset().filter(**filters)
# Issue queryset
issue_queryset, order_by_param = order_issue_queryset(
issue_queryset=issue_queryset,
order_by_param=order_by_param,
)
# Priority Ordering
if order_by_param == "priority" or order_by_param == "-priority":
priority_order = (
priority_order
if order_by_param == "priority"
else priority_order[::-1]
)
issue_queryset = issue_queryset.annotate(
priority_order=Case(
*[
When(priority=p, then=Value(i))
for i, p in enumerate(priority_order)
],
output_field=CharField(),
)
).order_by("priority_order")
# Group by
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
# State Ordering
elif order_by_param in [
"state__name",
"state__group",
"-state__name",
"-state__group",
]:
state_order = (
state_order
if order_by_param in ["state__name", "state__group"]
else state_order[::-1]
)
issue_queryset = issue_queryset.annotate(
state_order=Case(
*[
When(state__group=state_group, then=Value(i))
for i, state_group in enumerate(state_order)
],
default=Value(len(state_order)),
output_field=CharField(),
# issue queryset
issue_queryset = issue_queryset_grouper(
queryset=issue_queryset,
group_by=group_by,
sub_group_by=sub_group_by,
)
if group_by:
# Check group and sub group value paginate
if sub_group_by:
if group_by == sub_group_by:
return Response(
{
"error": "Group by and sub group by cannot have same parameters"
},
status=status.HTTP_400_BAD_REQUEST,
)
else:
# group and sub group pagination
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=SubGroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
sub_group_by_fields=issue_group_values(
field=sub_group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
# Group Paginate
else:
# Group paginate
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=GroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
group_by_field_name=group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
).order_by("state_order")
# assignee and label ordering
elif order_by_param in [
"labels__name",
"-labels__name",
"assignees__first_name",
"-assignees__first_name",
]:
issue_queryset = issue_queryset.annotate(
max_values=Max(
order_by_param[1::]
if order_by_param.startswith("-")
else order_by_param
)
).order_by(
"-max_values"
if order_by_param.startswith("-")
else "max_values"
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
# Only use serializer when expand else return by values
if self.expand or self.fields:
issues = IssueSerializer(
issue_queryset,
many=True,
fields=self.fields,
expand=self.expand,
).data
else:
issues = issue_queryset.values(
"id",
"name",
"state_id",
"sort_order",
"completed_at",
"estimate_point",
"priority",
"start_date",
"target_date",
"sequence_id",
"project_id",
"parent_id",
"cycle_id",
"module_ids",
"label_ids",
"assignee_ids",
"sub_issues_count",
"created_at",
"updated_at",
"created_by",
"updated_by",
"attachment_count",
"link_count",
"is_draft",
"archived_at",
# List Paginate
return self.paginate(
order_by=order_by_param,
request=request,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by, issues=issues, sub_group_by=sub_group_by
),
)
def create(self, request, slug, project_id):
project = Project.objects.get(pk=project_id)
@ -265,12 +232,45 @@ class IssueDraftViewSet(BaseViewSet):
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
issue = (
issue_queryset_grouper(
queryset=self.get_queryset().filter(
pk=serializer.data["id"]
),
group_by=None,
sub_group_by=None,
)
.values(
"id",
"name",
"state_id",
"sort_order",
"completed_at",
"estimate_point",
"priority",
"start_date",
"target_date",
"sequence_id",
"project_id",
"parent_id",
"cycle_id",
"module_ids",
"label_ids",
"assignee_ids",
"sub_issues_count",
"created_at",
"updated_at",
"created_by",
"updated_by",
"attachment_count",
"link_count",
"is_draft",
"archived_at",
)
.first()
)
return Response(issue, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def partial_update(self, request, slug, project_id, pk):
@ -309,6 +309,33 @@ class IssueDraftViewSet(BaseViewSet):
issue = (
self.get_queryset()
.filter(pk=pk)
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
module_ids=Coalesce(
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
.prefetch_related(
Prefetch(
"issue_reactions",


@ -1,37 +1,50 @@
# Python imports
import json
from django.db.models import (
F,
Func,
OuterRef,
Q,
)
# Django Imports
from django.utils import timezone
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.db.models import Value, UUIDField
from django.db.models.functions import Coalesce
# Third party imports
from rest_framework import status
from rest_framework.response import Response
from plane.app.permissions import (
ProjectEntityPermission,
)
from plane.app.serializers import (
ModuleIssueSerializer,
)
from plane.bgtasks.issue_activites_task import issue_activity
from plane.db.models import (
Issue,
IssueAttachment,
IssueLink,
ModuleIssue,
Project,
)
from plane.utils.grouper import (
issue_group_values,
issue_on_results,
issue_queryset_grouper,
)
from plane.utils.issue_filters import issue_filters
from plane.utils.order_queryset import order_issue_queryset
from plane.utils.paginator import (
GroupedOffsetPaginator,
SubGroupedOffsetPaginator,
)
# Module imports
from .. import BaseViewSet
from plane.utils.user_timezone_converter import user_timezone_converter
class ModuleIssueViewSet(BaseViewSet):
serializer_class = ModuleIssueSerializer
@ -80,82 +93,115 @@ class ModuleIssueViewSet(BaseViewSet):
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
module_ids=Coalesce(
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
).distinct()
@method_decorator(gzip_page)
def list(self, request, slug, project_id, module_id):
filters = issue_filters(request.query_params, "GET")
issue_queryset = self.get_queryset().filter(**filters)
order_by_param = request.GET.get("order_by", "created_at")
# Issue queryset
issue_queryset, order_by_param = order_issue_queryset(
issue_queryset=issue_queryset,
order_by_param=order_by_param,
)
# Group by
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
# issue queryset
issue_queryset = issue_queryset_grouper(
queryset=issue_queryset,
group_by=group_by,
sub_group_by=sub_group_by,
)
if group_by:
# Check group and sub group value paginate
if sub_group_by:
if group_by == sub_group_by:
return Response(
{
"error": "Group by and sub group by cannot have same parameters"
},
status=status.HTTP_400_BAD_REQUEST,
)
else:
# group and sub group pagination
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=SubGroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
sub_group_by_fields=issue_group_values(
field=sub_group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
else:
# Group paginate
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=GroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
group_by_field_name=group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
else:
# List Paginate
return self.paginate(
order_by=order_by_param,
request=request,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by, issues=issues, sub_group_by=sub_group_by
),
)
# create multiple issues inside a module
def create_module_issues(self, request, slug, project_id, module_id):


@ -1,26 +1,27 @@
# Django imports
from django.db.models import Exists, OuterRef, Q
from django.utils import timezone
# Third party imports
from rest_framework import status
from rest_framework.response import Response
from plane.app.serializers import (
NotificationSerializer,
UserNotificationPreferenceSerializer,
)
from plane.db.models import (
Issue,
IssueAssignee,
IssueSubscriber,
Notification,
UserNotificationPreference,
WorkspaceMember,
)
from plane.utils.paginator import BasePaginator
# Module imports
from ..base import BaseAPIView, BaseViewSet
class NotificationViewSet(BaseViewSet, BasePaginator):
@ -131,6 +132,7 @@ class NotificationViewSet(BaseViewSet, BasePaginator):
"cursor", False
):
return self.paginate(
order_by=request.GET.get("order_by", "-created_at"),
request=request,
queryset=(notifications),
on_results=lambda notifications: NotificationSerializer(


@ -1,26 +1,25 @@
# Python imports
import boto3
import json
# Django imports
from django.db import IntegrityError
from django.db.models import (
Exists,
F,
Func,
OuterRef,
Prefetch,
Q,
Subquery,
)
from django.conf import settings
from django.utils import timezone
from django.core.serializers.json import DjangoJSONEncoder
# Third Party imports
from rest_framework.response import Response
from rest_framework import serializers, status
from rest_framework.permissions import AllowAny
# Module imports
@ -35,20 +34,19 @@ from plane.app.permissions import (
ProjectBasePermission,
ProjectMemberPermission,
)
from plane.db.models import (
Cycle,
DeployBoard,
Inbox,
Issue,
IssueProperty,
Module,
Project,
ProjectIdentifier,
ProjectMember,
State,
UserFavorite,
Workspace,
)
from plane.utils.cache import cache_response
from plane.bgtasks.webhook_task import model_activity
@ -168,6 +166,7 @@ class ProjectViewSet(BaseViewSet):
"cursor", False
):
return self.paginate(
order_by=request.GET.get("order_by", "-created_at"),
request=request,
queryset=(projects),
on_results=lambda projects: ProjectListSerializer(


@ -250,6 +250,7 @@ class UserActivityEndpoint(BaseAPIView, BasePaginator):
).select_related("actor", "workspace", "issue", "project")
return self.paginate(
order_by=request.GET.get("order_by", "-created_at"),
request=request,
queryset=queryset,
on_results=lambda issue_activities: IssueActivitySerializer(


@ -1,47 +1,56 @@
# Django imports
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.db.models import (
Exists,
F,
Func,
OuterRef,
Q,
UUIDField,
Value,
)
from django.db.models.functions import Coalesce
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
from rest_framework import status
# Third party imports
from rest_framework.response import Response
from plane.app.permissions import (
ProjectEntityPermission,
WorkspaceEntityPermission,
)
from plane.app.serializers import (
IssueViewSerializer,
)
from plane.db.models import (
Issue,
IssueAttachment,
IssueLink,
IssueView,
Workspace,
)
from plane.utils.grouper import (
issue_group_values,
issue_on_results,
issue_queryset_grouper,
)
from plane.utils.issue_filters import issue_filters
from plane.utils.order_queryset import order_issue_queryset
from plane.utils.paginator import (
GroupedOffsetPaginator,
SubGroupedOffsetPaginator,
)
# Module imports
from .. import BaseViewSet
from plane.utils.user_timezone_converter import user_timezone_converter
class GlobalViewViewSet(BaseViewSet):
serializer_class = IssueViewSerializer
@ -143,17 +152,6 @@ class GlobalViewIssuesViewSet(BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug):
filters = issue_filters(request.query_params, "GET")
order_by_param = request.GET.get("order_by", "-created_at")
issue_queryset = (
@ -162,103 +160,107 @@ class GlobalViewIssuesViewSet(BaseViewSet):
.annotate(cycle_id=F("issue_cycle__cycle_id"))
)
# Issue queryset
issue_queryset, order_by_param = order_issue_queryset(
issue_queryset=issue_queryset,
order_by_param=order_by_param,
)
# Group by
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
# issue queryset
issue_queryset = issue_queryset_grouper(
queryset=issue_queryset,
group_by=group_by,
sub_group_by=sub_group_by,
)
if group_by:
# Check group and sub group value paginate
if sub_group_by:
if group_by == sub_group_by:
return Response(
{
"error": "Group by and sub group by cannot have same parameters"
},
status=status.HTTP_400_BAD_REQUEST,
)
else:
# group and sub group pagination
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=SubGroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=None,
filters=filters,
),
sub_group_by_fields=issue_group_values(
field=sub_group_by,
slug=slug,
project_id=None,
filters=filters,
),
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
else:
# Group paginate
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=GroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=None,
filters=filters,
),
group_by_field_name=group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
else:
# List Paginate
return self.paginate(
order_by=order_by_param,
request=request,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by, issues=issues, sub_group_by=sub_group_by
),
)
class IssueViewViewSet(BaseViewSet):


@ -1,61 +1,66 @@
# Python imports
from datetime import date
from dateutil.relativedelta import relativedelta
# Django imports
from django.utils import timezone
from django.db.models import (
Count,
F,
Func,
IntegerField,
OuterRef,
Q,
UUIDField,
Value,
When,
)
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.db.models.functions import Coalesce
from django.db.models.functions import Cast, ExtractWeek
from django.utils import timezone
# Third party modules
from rest_framework import status
from rest_framework.response import Response
from plane.app.permissions import (
WorkspaceEntityPermission,
WorkspaceViewerPermission,
)
# Module imports
from plane.app.serializers import (
IssueActivitySerializer,
ProjectMemberSerializer,
WorkSpaceSerializer,
WorkspaceUserPropertiesSerializer,
)
from plane.app.views.base import BaseAPIView
from plane.db.models import (
CycleIssue,
Issue,
IssueActivity,
IssueAttachment,
IssueLink,
IssueSubscriber,
Project,
ProjectMember,
User,
Workspace,
WorkspaceMember,
WorkspaceUserProperties,
)
from plane.utils.grouper import (
issue_group_values,
issue_on_results,
issue_queryset_grouper,
)
from plane.utils.issue_filters import issue_filters
from plane.utils.order_queryset import order_issue_queryset
from plane.utils.paginator import (
GroupedOffsetPaginator,
SubGroupedOffsetPaginator,
)
class UserLastProjectWithWorkspaceEndpoint(BaseAPIView):
@ -99,22 +104,8 @@ class WorkspaceUserProfileIssuesEndpoint(BaseAPIView):
]
def get(self, request, slug, user_id):
filters = issue_filters(request.query_params, "GET")
order_by_param = request.GET.get("order_by", "-created_at")
issue_queryset = (
@ -152,100 +143,103 @@ class WorkspaceUserProfileIssuesEndpoint(BaseAPIView):
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
module_ids=Coalesce(
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
.order_by("created_at")
).distinct()
# Issue queryset
issue_queryset, order_by_param = order_issue_queryset(
issue_queryset=issue_queryset,
order_by_param=order_by_param,
)
# Group by
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
# issue queryset
issue_queryset = issue_queryset_grouper(
queryset=issue_queryset,
group_by=group_by,
sub_group_by=sub_group_by,
)
if group_by:
if sub_group_by:
if group_by == sub_group_by:
return Response(
{
"error": "Group by and sub group by cannot have same parameters"
},
status=status.HTTP_400_BAD_REQUEST,
)
else:
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=SubGroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
filters=filters,
),
sub_group_by_fields=issue_group_values(
field=sub_group_by,
slug=slug,
filters=filters,
),
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
else:
# Group paginate
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=GroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
filters=filters,
),
group_by_field_name=group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
).order_by("state_order")
# assignee and label ordering
elif order_by_param in [
"labels__name",
"-labels__name",
"assignees__first_name",
"-assignees__first_name",
]:
issue_queryset = issue_queryset.annotate(
max_values=Max(
order_by_param[1::]
if order_by_param.startswith("-")
else order_by_param
)
).order_by(
"-max_values"
if order_by_param.startswith("-")
else "max_values"
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
issues = IssueSerializer(
issue_queryset, many=True, fields=fields if fields else None
).data
return Response(issues, status=status.HTTP_200_OK)
return self.paginate(
order_by=order_by_param,
request=request,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by, issues=issues, sub_group_by=sub_group_by
),
)
class WorkspaceUserPropertiesEndpoint(BaseAPIView):
@ -397,6 +391,7 @@ class WorkspaceUserActivityEndpoint(BaseAPIView):
queryset = queryset.filter(project__in=projects)
return self.paginate(
order_by=request.GET.get("order_by", "-created_at"),
request=request,
queryset=queryset,
on_results=lambda issue_activities: IssueActivitySerializer(

View file

@ -5,6 +5,7 @@ import logging
from celery import shared_task
# Django imports
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
from django.utils.html import strip_tags


@ -1,46 +1,29 @@
# Python imports
import json
from django.core.serializers.json import DjangoJSONEncoder
from django.db.models import Exists, F, Func, OuterRef, Q, Prefetch
# Django imports
from django.utils import timezone
# Third Party imports
from rest_framework import status
from rest_framework.permissions import AllowAny, IsAuthenticated
from rest_framework.response import Response
from plane.app.serializers import (
CommentReactionSerializer,
IssueCommentSerializer,
IssuePublicSerializer,
IssueReactionSerializer,
IssueVoteSerializer,
)
from plane.db.models import (
Issue,
IssueComment,
Label,
IssueLink,
IssueAttachment,
State,
ProjectMember,
IssueReaction,
CommentReaction,
@ -49,8 +32,20 @@ from plane.db.models import (
ProjectPublicMember,
)
from plane.bgtasks.issue_activites_task import issue_activity
from plane.utils.grouper import (
issue_group_values,
issue_on_results,
issue_queryset_grouper,
)
from plane.utils.issue_filters import issue_filters
from plane.utils.order_queryset import order_issue_queryset
from plane.utils.paginator import (
GroupedOffsetPaginator,
SubGroupedOffsetPaginator,
)
# Module imports
from .base import BaseAPIView, BaseViewSet
class IssueCommentPublicViewSet(BaseViewSet):
@ -535,17 +530,10 @@ class ProjectIssuesPublicEndpoint(BaseAPIView):
anchor=anchor, entity_name="project"
)
project_id = project_deploy_board.entity_identifier
slug = project_deploy_board.workspace.slug
filters = issue_filters(request.query_params, "GET")
order_by_param = request.GET.get("order_by", "-created_at")
@ -576,7 +564,6 @@ class ProjectIssuesPublicEndpoint(BaseAPIView):
)
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@ -591,113 +578,118 @@ class ProjectIssuesPublicEndpoint(BaseAPIView):
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
).distinct()
issue_queryset = self.get_queryset().filter(**filters)
# Issue queryset
issue_queryset, order_by_param = order_issue_queryset(
issue_queryset=issue_queryset,
order_by_param=order_by_param,
)
# Group by
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
# issue queryset
issue_queryset = issue_queryset_grouper(
queryset=issue_queryset,
group_by=group_by,
sub_group_by=sub_group_by,
)
if group_by:
# Check group and sub group value paginate
if sub_group_by:
if group_by == sub_group_by:
return Response(
{
"error": "Group by and sub group by cannot have same parameters"
},
status=status.HTTP_400_BAD_REQUEST,
)
else:
# group and sub group pagination
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=SubGroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
sub_group_by_fields=issue_group_values(
field=sub_group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
else:
# Group paginate
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=GroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
group_by_field_name=group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
).order_by("state_order")
# assignee and label ordering
elif order_by_param in [
"labels__name",
"-labels__name",
"assignees__first_name",
"-assignees__first_name",
]:
issue_queryset = issue_queryset.annotate(
max_values=Max(
order_by_param[1::]
if order_by_param.startswith("-")
else order_by_param
)
).order_by(
"-max_values"
if order_by_param.startswith("-")
else "max_values"
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
issues = IssuePublicSerializer(issue_queryset, many=True).data
# List Paginate
return self.paginate(
order_by=order_by_param,
request=request,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by, issues=issues, sub_group_by=sub_group_by
),
)
.values("name", "group", "color", "id")
.order_by("custom_order", "sequence")
)
labels = Label.objects.filter(
workspace_id=project_deploy_board.workspace_id,
project_id=project_deploy_board.project_id,
).values("id", "name", "color", "parent")
## Grouping the results
group_by = request.GET.get("group_by", False)
if group_by:
issues = group_results(issues, group_by)
return Response(
{
"issues": issues,
"states": states,
"labels": labels,
},
status=status.HTTP_200_OK,
)


@ -1,5 +1,9 @@
# Python imports
import logging
import traceback
# Django imports
from django.conf import settings
# Third party imports
from sentry_sdk import capture_exception
@ -11,6 +15,10 @@ def log_exception(e):
logger = logging.getLogger("plane")
logger.error(e)
# Log traceback if running in Debug
if settings.DEBUG:
logger.error(traceback.format_exc())
# Capture in sentry if configured
capture_exception(e)
return


@ -1,240 +1,191 @@
# Django imports
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.db.models import Q, UUIDField, Value
from django.db.models.functions import Coalesce
# Module imports
from plane.db.models import (
Cycle,
Issue,
Label,
Module,
Project,
ProjectMember,
State,
WorkspaceMember,
)
def issue_queryset_grouper(queryset, group_by, sub_group_by):
FIELD_MAPPER = {
"label_ids": "labels__id",
"assignee_ids": "assignees__id",
"module_ids": "issue_module__module_id",
}
annotations_map = {
"assignee_ids": ("assignees__id", ~Q(assignees__id__isnull=True)),
"label_ids": ("labels__id", ~Q(labels__id__isnull=True)),
"module_ids": (
"issue_module__module_id",
~Q(issue_module__module_id__isnull=True),
),
}
default_annotations = {
key: Coalesce(
ArrayAgg(
field,
distinct=True,
filter=condition,
),
Value([], output_field=ArrayField(UUIDField())),
)
for key, (field, condition) in annotations_map.items()
if FIELD_MAPPER.get(key) != group_by
and FIELD_MAPPER.get(key) != sub_group_by
}
if sub_group_by == "priority":
main_responsive_dict = {
"urgent": {},
"high": {},
"medium": {},
"low": {},
"none": {},
}
return queryset.annotate(**default_annotations)
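A minimal sketch of what the grouper adds, assuming ungrouped usage (the workspace slug is a placeholder):

# With no grouping, all three ArrayAgg annotations are applied, so every
# row carries its m2m ids inline.
qs = issue_queryset_grouper(
    queryset=Issue.issue_objects.filter(workspace__slug="my-ws"),
    group_by=None,
    sub_group_by=None,
)
issue = qs.first()
# issue.label_ids -> [UUID(...), ...], or [] when the issue has no labels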
def issue_on_results(issues, group_by, sub_group_by):
FIELD_MAPPER = {
"labels__id": "label_ids",
"assignees__id": "assignee_ids",
"issue_module__module_id": "module_ids",
}
original_list = ["assignee_ids", "label_ids", "module_ids"]
required_fields = [
"id",
"name",
"state_id",
"sort_order",
"completed_at",
"estimate_point",
"priority",
"start_date",
"target_date",
"sequence_id",
"project_id",
"parent_id",
"cycle_id",
"sub_issues_count",
"created_at",
"updated_at",
"created_by",
"updated_by",
"attachment_count",
"link_count",
"is_draft",
"archived_at",
"state__group",
]
if group_by in FIELD_MAPPER:
original_list.remove(FIELD_MAPPER[group_by])
original_list.append(group_by)
if sub_group_by in FIELD_MAPPER:
original_list.remove(FIELD_MAPPER[sub_group_by])
original_list.append(sub_group_by)
required_fields.extend(original_list)
return issues.values(*required_fields)
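The effect of the FIELD_MAPPER swap, sketched for a label grouping: the aggregated `*_ids` column is dropped from the values() call and the raw join column is selected instead, so the paginator sees one row per (issue, label) pair.

rows = issue_on_results(issues=qs, group_by="labels__id", sub_group_by=None)
# rows[0] -> {"id": ..., "labels__id": UUID(...), "assignee_ids": [...], ...}
# the flat "label_ids" key is absent because group_by replaced it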
if group_by == "priority":
response_dict = {
"urgent": [],
"high": [],
"medium": [],
"low": [],
"none": [],
}
for value in results_data:
group_attribute = resolve_keys(group_by, value)
if isinstance(group_attribute, list):
if len(group_attribute):
for attrib in group_attribute:
if str(attrib) in response_dict:
response_dict[str(attrib)].append(value)
else:
response_dict[str(attrib)] = []
response_dict[str(attrib)].append(value)
else:
if str(None) in response_dict:
response_dict[str(None)].append(value)
else:
response_dict[str(None)] = []
response_dict[str(None)].append(value)
else:
if str(group_attribute) in response_dict:
response_dict[str(group_attribute)].append(value)
else:
response_dict[str(group_attribute)] = []
response_dict[str(group_attribute)].append(value)
return response_dict
def issue_group_values(field, slug, project_id=None, filters={}):
if field == "state_id":
queryset = State.objects.filter(
~Q(name="Triage"),
workspace__slug=slug,
).values_list("id", flat=True)
if project_id:
return list(queryset.filter(project_id=project_id))
else:
return list(queryset)
if field == "labels__id":
queryset = Label.objects.filter(workspace__slug=slug).values_list(
"id", flat=True
)
if project_id:
return list(queryset.filter(project_id=project_id)) + ["None"]
else:
return list(queryset) + ["None"]
if field == "assignees__id":
if project_id:
return ProjectMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
is_active=True,
).values_list("member_id", flat=True)
else:
return list(
WorkspaceMember.objects.filter(
workspace__slug=slug, is_active=True
).values_list("member_id", flat=True)
)
if field == "issue_module__module_id":
queryset = Module.objects.filter(
workspace__slug=slug,
).values_list("id", flat=True)
if project_id:
return list(queryset.filter(project_id=project_id)) + ["None"]
else:
return list(queryset) + ["None"]
if field == "cycle_id":
queryset = Cycle.objects.filter(
workspace__slug=slug,
).values_list("id", flat=True)
if project_id:
return list(queryset.filter(project_id=project_id)) + ["None"]
else:
return list(queryset) + ["None"]
if field == "project_id":
queryset = Project.objects.filter(workspace__slug=slug).values_list(
"id", flat=True
)
return list(queryset)
if field == "priority":
return [
"low",
"medium",
"high",
"urgent",
"none",
]
if field == "state__group":
return [
"backlog",
"unstarted",
"started",
"completed",
"cancelled",
]
if field == "target_date":
queryset = (
Issue.issue_objects.filter(workspace__slug=slug)
.filter(**filters)
.values_list("target_date", flat=True)
.distinct()
)
if project_id:
return list(queryset.filter(project_id=project_id))
else:
return list(queryset)
if field == "start_date":
queryset = (
Issue.issue_objects.filter(workspace__slug=slug)
.filter(**filters)
.values_list("start_date", flat=True)
.distinct()
)
if project_id:
return list(queryset.filter(project_id=project_id))
else:
return list(queryset)
return []
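A sketch of why the paginators ask for this key space up front: empty groups must still appear in the grouped payload.

keys = issue_group_values(
    field="state__group", slug="my-ws", project_id=None, filters={}
)
# keys -> ["backlog", "unstarted", "started", "completed", "cancelled"]
# m2m-backed fields also append "None" for issues with no value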


@ -1,6 +1,7 @@
import re
import uuid
from datetime import timedelta
from django.utils import timezone
# The date from pattern
@ -63,24 +64,27 @@ def date_filter(filter, date_term, queries):
"""
for query in queries:
date_query = query.split(";")
if date_query:
if len(date_query) >= 2:
match = pattern.match(date_query[0])
if match:
if len(date_query) == 3:
digit, term = date_query[0].split("_")
string_date_filter(
filter=filter,
duration=int(digit),
subsequent=date_query[1],
term=term,
date_filter=date_term,
offset=date_query[2],
)
else:
filter[f"{date_term}__lte"] = date_query[0]
if "after" in date_query:
filter[f"{date_term}__gte"] = date_query[0]
else:
filter[f"{date_term}__lte"] = date_query[0]
else:
filter[f"{date_term}__contains"] = date_query[0]
def filter_state(params, filter, method, prefix=""):


@ -0,0 +1,84 @@
from django.db.models import (
Case,
CharField,
Min,
Value,
When,
)
# Custom ordering for priority and state
PRIORITY_ORDER = ["urgent", "high", "medium", "low", "none"]
STATE_ORDER = [
"backlog",
"unstarted",
"started",
"completed",
"cancelled",
]
def order_issue_queryset(issue_queryset, order_by_param="-created_at"):
# Priority Ordering
if order_by_param == "priority" or order_by_param == "-priority":
issue_queryset = issue_queryset.annotate(
priority_order=Case(
*[
When(priority=p, then=Value(i))
for i, p in enumerate(PRIORITY_ORDER)
],
output_field=CharField(),
)
).order_by("priority_order")
order_by_param = (
"-priority_order"
if order_by_param.startswith("-")
else "priority_order"
)
# State Ordering
elif order_by_param in [
"state__group",
"-state__group",
]:
state_order = (
STATE_ORDER
if order_by_param in ["state__name", "state__group"]
else STATE_ORDER[::-1]
)
issue_queryset = issue_queryset.annotate(
state_order=Case(
*[
When(state__group=state_group, then=Value(i))
for i, state_group in enumerate(state_order)
],
default=Value(len(state_order)),
output_field=CharField(),
)
).order_by("state_order")
order_by_param = (
"-state_order" if order_by_param.startswith("-") else "state_order"
)
# assignee and label ordering
elif order_by_param in [
"labels__name",
"assignees__first_name",
"issue_module__module__name",
"-labels__name",
"-assignees__first_name",
"-issue_module__module__name",
]:
issue_queryset = issue_queryset.annotate(
min_values=Min(
order_by_param[1::]
if order_by_param.startswith("-")
else order_by_param
)
).order_by(
"-min_values" if order_by_param.startswith("-") else "min_values"
)
order_by_param = (
"-min_values" if order_by_param.startswith("-") else "min_values"
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
return issue_queryset, order_by_param
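A usage sketch: the helper returns both the annotated queryset and the rewritten key, because the paginators re-apply that key when ordering their row-number windows.

qs, key = order_issue_queryset(
    issue_queryset=Issue.issue_objects.all(),
    order_by_param="-priority",
)
# key -> "-priority_order": the paginator orders by the annotation,
# not by the raw "priority" text column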


@ -1,33 +1,49 @@
# Python imports
import math
from collections import defaultdict
from collections.abc import Sequence
# Django imports
from django.db.models import Count, F, Window
from django.db.models.functions import RowNumber
# Third party imports
from rest_framework.exceptions import ParseError
from rest_framework.response import Response
# Module imports
class Cursor:
# The cursor value
def __init__(self, value, offset=0, is_prev=False, has_results=None):
self.value = value
self.offset = int(offset)
self.is_prev = bool(is_prev)
self.has_results = has_results
# Return the cursor value in string format
def __str__(self):
return f"{self.value}:{self.offset}:{int(self.is_prev)}"
# Return the cursor value
def __eq__(self, other):
return all(
getattr(self, attr) == getattr(other, attr)
for attr in ("value", "offset", "is_prev", "has_results")
)
# Return the representation of the cursor
def __repr__(self):
return f"{type(self).__name__,}: value={self.value} offset={self.offset}, is_prev={int(self.is_prev)}"
# Return if the cursor is true
def __bool__(self):
return bool(self.has_results)
@classmethod
def from_string(cls, value):
"""Return the cursor value from string format"""
try:
bits = value.split(":")
if len(bits) != 3:
@ -50,15 +66,19 @@ class CursorResult(Sequence):
self.max_hits = max_hits
def __len__(self):
# Return the length of the results
return len(self.results)
def __iter__(self):
# Return the iterator of the results
return iter(self.results)
def __getitem__(self, key):
# Return the results based on the key
return self.results[key]
def __repr__(self):
# Return the representation of the results
return f"<{type(self).__name__}: results={len(self.results)}>"
@ -85,11 +105,14 @@ class OffsetPaginator:
max_offset=None,
on_results=None,
):
# Key tuple and remove `-` if descending order by
self.key = (
order_by
if order_by is None or isinstance(order_by, (list, tuple, set))
else (order_by,)
else (order_by[1::] if order_by.startswith("-") else order_by,)
)
# Set desc to true when `-` exists in the order by
self.desc = order_by.startswith("-") if isinstance(order_by, str) else False
self.queryset = queryset
self.max_limit = max_limit
self.max_offset = max_offset
@ -101,11 +124,101 @@ class OffsetPaginator:
if cursor is None:
cursor = Cursor(0, 0, 0)
# Get the min from limit and max limit
limit = min(limit, self.max_limit)
# queryset
queryset = self.queryset
if self.key:
queryset = queryset.order_by(*self.key)
queryset = queryset.order_by(
(
F(*self.key).desc(nulls_last=True)
if self.desc
else F(*self.key).asc(nulls_last=True)
),
"-created_at",
)
# The current page
page = cursor.offset
# The offset
offset = cursor.offset * cursor.value
stop = offset + (cursor.value or limit) + 1
if self.max_offset is not None and offset >= self.max_offset:
raise BadPaginationError("Pagination offset too large")
if offset < 0:
raise BadPaginationError("Pagination offset cannot be negative")
results = queryset[offset:stop]
if cursor.value != limit:
results = results[-(limit + 1) :]
# Adjust cursors based on the results for pagination
next_cursor = Cursor(limit, page + 1, False, results.count() > limit)
# If the page is greater than 0, then set the previous cursor
prev_cursor = Cursor(limit, page - 1, True, page > 0)
# Process the results
results = results[:limit]
# Process the results
if self.on_results:
results = self.on_results(results)
# Count the queryset
count = queryset.count()
# Optionally, calculate the total count and max_hits if needed
max_hits = math.ceil(count / limit)
# Return the cursor results
return CursorResult(
results=results,
next=next_cursor,
prev=prev_cursor,
hits=count,
max_hits=max_hits,
)
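The cursor arithmetic above, worked through: the offset is page number times page size, and one extra row is fetched so has_results can be decided without an extra query.

limit = 100
cursor = Cursor(100, 2, False)               # third page, 100 per page
offset = cursor.offset * cursor.value        # 200
stop = offset + (cursor.value or limit) + 1  # 301 -> rows 200..300 fetched
# the 101st row only signals "more pages"; results are cut back to `limit`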
def process_results(self, results):
raise NotImplementedError
class GroupedOffsetPaginator(OffsetPaginator):
# Field mappers
FIELD_MAPPER = {
"labels__id": "label_ids",
"assignees__id": "assignee_ids",
"modules__id": "module_ids",
}
def __init__(
self,
queryset,
group_by_field_name,
group_by_fields,
count_filter,
*args,
**kwargs,
):
# Initiate the parent class for all the parameters
super().__init__(queryset, *args, **kwargs)
self.group_by_field_name = group_by_field_name
self.group_by_fields = group_by_fields
self.count_filter = count_filter
def get_result(self, limit=50, cursor=None):
# offset is page #
# value is page limit
if cursor is None:
cursor = Cursor(0, 0, 0)
limit = min(limit, self.max_limit)
# Adjust the initial offset and stop based on the cursor and limit
queryset = self.queryset
page = cursor.offset
offset = cursor.offset * cursor.value
@ -116,20 +229,73 @@ class OffsetPaginator:
if offset < 0:
raise BadPaginationError("Pagination offset cannot be negative")
# Compute the results
results = {}
# Create window for all the groups
queryset = queryset.annotate(
row_number=Window(
expression=RowNumber(),
partition_by=[F(self.group_by_field_name)],
order_by=(
(
F(*self.key).desc(
nulls_last=True
) # order by desc if desc is set
if self.desc
else F(*self.key).asc(
nulls_last=True
) # Order by asc if set
),
F("created_at").desc(),
),
)
)
# Filter the results by row number
results = queryset.filter(
row_number__gt=offset, row_number__lt=stop
).order_by(
(
F(*self.key).desc(nulls_last=True)
if self.desc
else F(*self.key).asc(nulls_last=True)
),
F("created_at").desc(),
)
if self.on_results:
results = self.on_results(results)
# Adjust cursors based on the grouped results for pagination
next_cursor = Cursor(
limit,
page + 1,
False,
queryset.filter(row_number__gte=stop).exists(),
)
prev_cursor = Cursor(
limit,
page - 1,
True,
page > 0,
)
# Count the queryset
count = queryset.count()
# Optionally, calculate the total count and max_hits if needed
# This might require adjustments based on specific use cases
if results:
max_hits = math.ceil(
queryset.values(self.group_by_field_name)
.annotate(
count=Count(
"id",
filter=self.count_filter,
distinct=True,
)
)
.order_by("-count")[0]["count"]
/ limit
)
else:
max_hits = 0
return CursorResult(
results=results,
next=next_cursor,
@ -138,6 +304,393 @@ class OffsetPaginator:
max_hits=max_hits,
)
def __get_total_queryset(self):
# Get total queryset
return (
self.queryset.values(self.group_by_field_name)
.annotate(
count=Count(
"id",
filter=self.count_filter,
distinct=True,
)
)
.order_by()
)
def __get_total_dict(self):
# Convert the total into dictionary of keys as group name and value as the total
total_group_dict = {}
for group in self.__get_total_queryset():
total_group_dict[str(group.get(self.group_by_field_name))] = (
total_group_dict.get(
str(group.get(self.group_by_field_name)), 0
)
+ (1 if group.get("count") == 0 else group.get("count"))
)
return total_group_dict
def __get_field_dict(self):
# Create a field dictionary
total_group_dict = self.__get_total_dict()
return {
str(field): {
"results": [],
"total_results": total_group_dict.get(str(field), 0),
}
for field in self.group_by_fields
}
def __result_already_added(self, result, group):
# Check if the result is already added then add it
for existing_issue in group:
if existing_issue["id"] == result["id"]:
return True
return False
def __query_multi_grouper(self, results):
# Grouping for m2m values
total_group_dict = self.__get_total_dict()
# Preparing a dict to keep track of group IDs associated with each label ID
result_group_mapping = defaultdict(set)
# Preparing a dict to group result by group ID
grouped_by_field_name = defaultdict(list)
# Iterate over results to fill the above dictionaries
for result in results:
result_id = result["id"]
group_id = result[self.group_by_field_name]
result_group_mapping[str(result_id)].add(str(group_id))
# Adding group_ids key to each issue and grouping by group_name
for result in results:
result_id = result["id"]
group_ids = list(result_group_mapping[str(result_id)])
result[self.FIELD_MAPPER.get(self.group_by_field_name)] = (
[] if "None" in group_ids else group_ids
)
# If a result belongs to multiple groups, add it to each group
for group_id in group_ids:
if not self.__result_already_added(
result, grouped_by_field_name[group_id]
):
grouped_by_field_name[group_id].append(result)
# Convert grouped_by_field_name back to a list for each group
processed_results = {
str(group_id): {
"results": issues,
"total_results": total_group_dict.get(str(group_id)),
}
for group_id, issues in grouped_by_field_name.items()
}
return processed_results
def __query_grouper(self, results):
# Grouping for single values
processed_results = self.__get_field_dict()
for result in results:
group_value = str(result.get(self.group_by_field_name))
if group_value in processed_results:
processed_results[str(group_value)]["results"].append(result)
return processed_results
def process_results(self, results):
# Process results
if results:
if self.group_by_field_name in self.FIELD_MAPPER:
processed_results = self.__query_multi_grouper(results=results)
else:
processed_results = self.__query_grouper(results=results)
else:
processed_results = {}
return processed_results
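The shape process_results hands back, sketched with illustrative numbers: every group from group_by_fields is keyed up front, so empty groups still ship with their pre-computed totals.

example_processed = {
    "urgent": {"results": ["<issue dict>", "..."], "total_results": 14},
    "high": {"results": ["<issue dict>"], "total_results": 3},
    "none": {"results": [], "total_results": 0},
}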
class SubGroupedOffsetPaginator(OffsetPaginator):
FIELD_MAPPER = {
"labels__id": "label_ids",
"assignees__id": "assignee_ids",
"modules__id": "module_ids",
}
def __init__(
self,
queryset,
group_by_field_name,
sub_group_by_field_name,
group_by_fields,
sub_group_by_fields,
count_filter,
*args,
**kwargs,
):
super().__init__(queryset, *args, **kwargs)
self.group_by_field_name = group_by_field_name
self.group_by_fields = group_by_fields
self.sub_group_by_field_name = sub_group_by_field_name
self.sub_group_by_fields = sub_group_by_fields
self.count_filter = count_filter
def get_result(self, limit=30, cursor=None):
# offset is page #
# value is page limit
if cursor is None:
cursor = Cursor(0, 0, 0)
limit = min(limit, self.max_limit)
# Adjust the initial offset and stop based on the cursor and limit
queryset = self.queryset
page = cursor.offset
offset = cursor.offset * cursor.value
stop = offset + (cursor.value or limit) + 1
if self.max_offset is not None and offset >= self.max_offset:
raise BadPaginationError("Pagination offset too large")
if offset < 0:
raise BadPaginationError("Pagination offset cannot be negative")
# Compute the results
results = {}
# Create windows for group and sub group field name
queryset = queryset.annotate(
row_number=Window(
expression=RowNumber(),
partition_by=[
F(self.group_by_field_name),
F(self.sub_group_by_field_name),
],
order_by=(
(
F(*self.key).desc(nulls_last=True)
if self.desc
else F(*self.key).asc(nulls_last=True)
),
"-created_at",
),
)
)
# Filter the results
results = queryset.filter(
row_number__gt=offset, row_number__lt=stop
).order_by(
(
F(*self.key).desc(nulls_last=True)
if self.desc
else F(*self.key).asc(nulls_last=True)
),
F("created_at").desc(),
)
# Adjust cursors based on the grouped results for pagination
next_cursor = Cursor(
limit,
page + 1,
False,
queryset.filter(row_number__gte=stop).exists(),
)
prev_cursor = Cursor(
limit,
page - 1,
True,
page > 0,
)
# Count the queryset
count = queryset.count()
# Optionally, calculate the total count and max_hits if needed
# This might require adjustments based on specific use cases
if results:
max_hits = math.ceil(
queryset.values(self.group_by_field_name)
.annotate(
count=Count(
"id",
filter=self.count_filter,
distinct=True,
)
)
.order_by("-count")[0]["count"]
/ limit
)
else:
max_hits = 0
return CursorResult(
results=results,
next=next_cursor,
prev=prev_cursor,
hits=count,
max_hits=max_hits,
)
def __get_group_total_queryset(self):
# Get group totals
return (
self.queryset.order_by(self.group_by_field_name)
.values(self.group_by_field_name)
.annotate(
count=Count(
"id",
filter=self.count_filter,
distinct=True,
)
)
.distinct()
)
def __get_subgroup_total_queryset(self):
# Get subgroup totals
return (
self.queryset.values(
self.group_by_field_name, self.sub_group_by_field_name
)
.annotate(
count=Count("id", filter=self.count_filter, distinct=True)
)
.order_by()
.values(
self.group_by_field_name, self.sub_group_by_field_name, "count"
)
)
def __get_total_dict(self):
        # Flatten the totals into {group: count} and
        # {group: {sub_group: count}} lookup dicts
total_group_dict = {}
total_sub_group_dict = {}
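        # m2m grouping can return the same group several times, so counts are
        # accumulated; a zero count is floored at 1, presumably so that empty
        # groups still occupy a slot in the paginated response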
for group in self.__get_group_total_queryset():
total_group_dict[str(group.get(self.group_by_field_name))] = (
total_group_dict.get(
str(group.get(self.group_by_field_name)), 0
)
+ (1 if group.get("count") == 0 else group.get("count"))
)
# Sub group total values
for item in self.__get_subgroup_total_queryset():
group = str(item[self.group_by_field_name])
subgroup = str(item[self.sub_group_by_field_name])
count = item["count"]
            if group not in total_sub_group_dict:
                total_sub_group_dict[group] = {}
            total_sub_group_dict[group][subgroup] = count
return total_group_dict, total_sub_group_dict
def __get_field_dict(self):
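        # Build the empty response skeleton:
        # {group: {"results": {sub_group: {"results": [], "total_results": n}},
        #          "total_results": n}}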
total_group_dict, total_sub_group_dict = self.__get_total_dict()
return {
str(group): {
"results": {
str(sub_group): {
"results": [],
"total_results": total_sub_group_dict.get(
str(group)
).get(str(sub_group), 0),
}
for sub_group in total_sub_group_dict.get(str(group), [])
},
"total_results": total_group_dict.get(str(group), 0),
}
for group in self.group_by_fields
}
def __query_multi_grouper(self, results):
# Multi grouper
processed_results = self.__get_field_dict()
# Preparing a dict to keep track of group IDs associated with each label ID
result_group_mapping = defaultdict(set)
result_sub_group_mapping = defaultdict(set)
# Iterate over results to fill the above dictionaries
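        # e.g. grouping by "labels__id" duplicates each issue row once per
        # label; these mappings collapse the duplicates back to
        # issue id -> {all related ids}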
if self.group_by_field_name in self.FIELD_MAPPER:
for result in results:
result_id = result["id"]
group_id = result[self.group_by_field_name]
result_group_mapping[str(result_id)].add(str(group_id))
# Use the same calculation for the sub group
if self.sub_group_by_field_name in self.FIELD_MAPPER:
for result in results:
result_id = result["id"]
sub_group_id = result[self.sub_group_by_field_name]
result_sub_group_mapping[str(result_id)].add(str(sub_group_id))
# Iterate over results
for result in results:
# Get the group value
group_value = str(result.get(self.group_by_field_name))
# Get the sub group value
sub_group_value = str(result.get(self.sub_group_by_field_name))
if (
group_value in processed_results
and sub_group_value
in processed_results[str(group_value)]["results"]
):
                if self.group_by_field_name in self.FIELD_MAPPER:
                    # for multi grouper: attach every group id for this row
                    group_ids = list(result_group_mapping[str(result["id"])])
                    result[self.FIELD_MAPPER.get(self.group_by_field_name)] = (
                        [] if "None" in group_ids else group_ids
                    )
                if self.sub_group_by_field_name in self.FIELD_MAPPER:
                    # for multi sub groups: attach every sub group id
                    sub_group_ids = list(
                        result_sub_group_mapping[str(result["id"])]
                    )
                    result[
                        self.FIELD_MAPPER.get(self.sub_group_by_field_name)
                    ] = [] if "None" in sub_group_ids else sub_group_ids
processed_results[str(group_value)]["results"][
str(sub_group_value)
]["results"].append(result)
return processed_results
def __query_grouper(self, results):
# Single grouper
processed_results = self.__get_field_dict()
for result in results:
group_value = str(result.get(self.group_by_field_name))
sub_group_value = str(result.get(self.sub_group_by_field_name))
processed_results[group_value]["results"][sub_group_value][
"results"
].append(result)
return processed_results
def process_results(self, results):
if results:
if (
self.group_by_field_name in self.FIELD_MAPPER
or self.sub_group_by_field_name in self.FIELD_MAPPER
):
processed_results = self.__query_multi_grouper(results=results)
else:
processed_results = self.__query_grouper(results=results)
else:
processed_results = {}
return processed_results
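# Illustrative wiring only (a sketch; the queryset, field names and values
# below are assumptions, not part of this change): grouping issues by state
# and sub-grouping by priority would look roughly like
#
#   paginator = SubGroupedOffsetPaginator(
#       queryset=issue_queryset,
#       group_by_field_name="state_id",
#       sub_group_by_field_name="priority",
#       group_by_fields=state_ids,
#       sub_group_by_fields=["urgent", "high", "medium", "low", "none"],
#       count_filter=Q(pk__isnull=False),
#   )
#   page = paginator.get_result(limit=30)
#   grouped = paginator.process_results(page.results)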
class BasePaginator:
"""BasePaginator class can be inherited by any View to return a paginated view"""
@@ -171,6 +724,11 @@ class BasePaginator:
cursor_cls=Cursor,
extra_stats=None,
controller=None,
group_by_field_name=None,
group_by_fields=None,
sub_group_by_field_name=None,
sub_group_by_fields=None,
count_filter=None,
**paginator_kwargs,
):
"""Paginate the request"""
@@ -178,15 +736,27 @@ class BasePaginator:
# Convert the cursor value to integer and float from string
input_cursor = None
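        # Cursor strings are "<value>:<offset>:<is_prev>"; the default
        # f"{per_page}:0:0" requests the first page at the requested size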
try:
input_cursor = cursor_cls.from_string(
request.GET.get(self.cursor_name, f"{per_page}:0:0"),
)
except ValueError:
raise ParseError(detail="Invalid cursor parameter.")
if not paginator:
if group_by_field_name:
paginator_kwargs["group_by_field_name"] = group_by_field_name
paginator_kwargs["group_by_fields"] = group_by_fields
paginator_kwargs["count_filter"] = count_filter
if sub_group_by_field_name:
paginator_kwargs["sub_group_by_field_name"] = (
sub_group_by_field_name
)
paginator_kwargs["sub_group_by_fields"] = (
sub_group_by_fields
)
paginator = paginator_cls(**paginator_kwargs)
try:
@@ -196,12 +766,14 @@ class BasePaginator:
except BadPaginationError:
raise ParseError(detail="Error in parsing")
        # Serialize results according to the on_results function
if on_results:
results = on_results(cursor_result.results)
else:
results = cursor_result.results
if group_by_field_name:
results = paginator.process_results(results=results)
        # Apply the controller function, if any, to manipulate the results
if controller is not None:
results = controller(results)
@@ -211,6 +783,9 @@ class BasePaginator:
# Return the response
response = Response(
{
"grouped_by": group_by_field_name,
"sub_grouped_by": sub_group_by_field_name,
"total_count": (cursor_result.hits),
"next_cursor": str(cursor_result.next),
"prev_cursor": str(cursor_result.prev),
"next_page_results": cursor_result.next.has_results,


@@ -60,4 +60,5 @@ zxcvbn==4.4.28
# timezone
pytz==2024.1
# jwt
PyJWT==2.8.0