release: v0.27.0 #7296

sriram veeraghanta 2025-07-01 17:40:56 +05:30 committed by GitHub
commit 5a3f709e72
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
1615 changed files with 27874 additions and 19715 deletions

View file

@ -2,6 +2,7 @@
*.pyc
.env
venv
.venv
node_modules/
**/node_modules/
npm-debug.log
@ -14,4 +15,4 @@ build/
out/
**/out/
dist/
**/dist/

.gitignore vendored
View file

@ -1,5 +1,6 @@
node_modules
.next
.yarn
### NextJS ###
# Dependencies
@ -52,6 +53,8 @@ mediafiles
.env
.DS_Store
logs/
htmlcov/
.coverage
node_modules/
assets/dist/

.yarnrc.yml Normal file
View file

@ -0,0 +1 @@
nodeLinker: node-modules

View file

@ -35,11 +35,12 @@ This helps us triage and manage issues more efficiently.
### Requirements
- Node.js version v16.18.0
- Docker Engine installed and running
- Node.js version 20+ [LTS version](https://nodejs.org/en/about/previous-releases)
- Python version 3.8+
- Postgres version v14
- Redis version v6.2.7
- **Memory**: Minimum **12 GB RAM** recommended
> ⚠️ Running the project on a system with only 8 GB RAM may lead to setup failures or memory crashes (especially during Docker container build/start or dependency install). Use cloud environments like GitHub Codespaces or upgrade local RAM if possible.
### Setup the project
@ -68,6 +69,17 @@ chmod +x setup.sh
docker compose -f docker-compose-local.yml up
```
4. Start web apps:
```bash
yarn dev
```
5. Open your browser to http://localhost:3001/god-mode/ and register yourself as the instance admin
6. Open your browser to http://localhost:3000, then log in using the same credentials from the previous step
That's it! You're all set to begin coding. Remember to refresh your browser if changes don't auto-reload. Happy contributing! 🎉
## Missing a Feature?
If a feature is missing, you can directly _request_ a new one [here](https://github.com/makeplane/plane/issues/new?assignees=&labels=feature&template=feature_request.yml&title=%F0%9F%9A%80+Feature%3A+). You also can do the same by choosing "🚀 Feature" when raising a [New Issue](https://github.com/makeplane/plane/issues/new/choose) on our GitHub Repository.
@ -93,7 +105,7 @@ To ensure consistency throughout the source code, please keep these rules in min
- **Improve documentation** - fix incomplete or missing [docs](https://docs.plane.so/), bad wording, examples or explanations.
## Contributing to language support
This guide is designed to help contributors understand how to add or update translations in the application.
### Understanding translation structure
@ -108,7 +120,7 @@ packages/i18n/src/locales/
├── fr/
│ └── translations.json
└── [language]/
└── translations.json
```
#### Nested structure
To keep translations organized, we use a nested structure for keys. This makes it easier to manage and locate specific translations. For example:
@ -128,14 +140,14 @@ To keep translations organized, we use a nested structure for keys. This makes i
We use [IntlMessageFormat](https://formatjs.github.io/docs/intl-messageformat/) to handle dynamic content, such as variables and pluralization. Here's how to format your translations:
#### Examples
- **Simple variables**
```json
{
"greeting": "Hello, {name}!"
}
```
- **Pluralization**
```json
{
"items": "{count, plural, one {Work item} other {Work items}}"
@ -160,15 +172,15 @@ We use [IntlMessageFormat](https://formatjs.github.io/docs/intl-messageformat/)
### Adding new languages
Adding a new language involves several steps to ensure it integrates seamlessly with the project. Follow these instructions carefully:
1. **Update type definitions**
Add the new language to the TLanguage type in the language definitions file:
```typescript
// types/language.ts
export type TLanguage = "en" | "fr" | "your-lang";
```
2. **Add language configuration**
Include the new language in the list of supported languages:
```typescript
@ -179,14 +191,14 @@ Include the new language in the list of supported languages:
];
```
3. **Create translation files**
1. Create a new folder for your language under locales (e.g., `locales/your-lang/`).
2. Add a `translations.json` file inside the folder.
3. Copy the structure from an existing translation file and translate all keys.
4. **Update import logic**
Modify the language import logic to include your new language:
```typescript

View file

@ -47,10 +47,10 @@ Meet [Plane](https://plane.so/), an open-source project management tool to track
Getting started with Plane is simple. Choose the setup that works best for you:
- **Plane Cloud**
Sign up for a free account on [Plane Cloud](https://app.plane.so)—it's the fastest way to get up and running without worrying about infrastructure.
- **Self-host Plane**
Prefer full control over your data and infrastructure? Install and run Plane on your own servers. Follow our detailed [deployment guides](https://developers.plane.so/self-hosting/overview) to get started.
| Installation methods | Docs link |
@ -62,22 +62,22 @@ Prefer full control over your data and infrastructure? Install and run Plane on
## 🌟 Features
- **Issues**
Efficiently create and manage tasks with a robust rich text editor that supports file uploads. Enhance organization and tracking by adding sub-properties and referencing related issues.
- **Cycles**
Maintain your teams momentum with Cycles. Track progress effortlessly using burn-down charts and other insightful tools.
- **Modules**
Simplify complex projects by dividing them into smaller, manageable modules.
- **Views**
Customize your workflow by creating filters to display only the most relevant issues. Save and share these views with ease.
- **Pages**
Capture and organize ideas using Plane Pages, complete with AI capabilities and a rich text editor. Format text, insert images, add hyperlinks, or convert your notes into actionable items.
- **Analytics**
Access real-time insights across all your Plane data. Visualize trends, remove blockers, and keep your projects moving forward.
- **Drive** (_coming soon_): The drive helps you share documents, images, videos, or any other files that make sense to you or your team and align on the problem/solution.
@ -85,38 +85,7 @@ Access real-time insights across all your Plane data. Visualize trends, remove b
## 🛠️ Local development
### Pre-requisites
- Ensure Docker Engine is installed and running.
### Development setup
Setting up your local environment is simple and straightforward. Follow these steps to get started:
1. Clone the repository:
```
git clone https://github.com/makeplane/plane.git
```
2. Navigate to the project folder:
```
cd plane
```
3. Create a new branch for your feature or fix:
```
git checkout -b <feature-branch-name>
```
4. Run the setup script in the terminal:
```
./setup.sh
```
5. Open the project in an IDE such as VS Code.
6. Review the `.env` files in the relevant folders. Refer to [Environment Setup](./ENV_SETUP.md) for details on the environment variables used.
7. Start the services using Docker:
```
docker compose -f docker-compose-local.yml up -d
```
That's it! You're all set to begin coding. Remember to refresh your browser if changes don't auto-reload. Happy contributing! 🎉
See [CONTRIBUTING](./CONTRIBUTING.md)
## ⚙️ Built with
[![Next JS](https://img.shields.io/badge/next.js-000000?style=for-the-badge&logo=nextdotjs&logoColor=white)](https://nextjs.org/)
@ -194,7 +163,7 @@ Feel free to ask questions, report bugs, participate in discussions, share ideas
If you discover a security vulnerability in Plane, please report it responsibly instead of opening a public issue. We take all legitimate reports seriously and will investigate them promptly. See [Security policy](https://github.com/makeplane/plane/blob/master/SECURITY.md) for more info.
To disclose any security issues, please email us at security@plane.so.
## 🤝 Contributing
@ -219,4 +188,4 @@ Please read [CONTRIBUTING.md](https://github.com/makeplane/plane/blob/master/CON
## License
This project is licensed under the [GNU Affero General Public License v3.0](https://github.com/makeplane/plane/blob/master/LICENSE.txt).

View file

@ -1,3 +1,12 @@
NEXT_PUBLIC_API_BASE_URL=""
NEXT_PUBLIC_API_BASE_URL="http://localhost:8000"
NEXT_PUBLIC_WEB_BASE_URL="http://localhost:3000"
NEXT_PUBLIC_ADMIN_BASE_URL="http://localhost:3001"
NEXT_PUBLIC_ADMIN_BASE_PATH="/god-mode"
NEXT_PUBLIC_WEB_BASE_URL=""
NEXT_PUBLIC_SPACE_BASE_URL="http://localhost:3002"
NEXT_PUBLIC_SPACE_BASE_PATH="/spaces"
NEXT_PUBLIC_LIVE_BASE_URL="http://localhost:3100"
NEXT_PUBLIC_LIVE_BASE_PATH="/live"

View file

@ -26,16 +26,16 @@ export const InstanceAIForm: FC<IInstanceAIForm> = (props) => {
formState: { errors, isSubmitting },
} = useForm<AIFormValues>({
defaultValues: {
OPENAI_API_KEY: config["OPENAI_API_KEY"],
GPT_ENGINE: config["GPT_ENGINE"],
LLM_API_KEY: config["LLM_API_KEY"],
LLM_MODEL: config["LLM_MODEL"],
},
});
const aiFormFields: TControllerInputFormField[] = [
{
key: "GPT_ENGINE",
key: "LLM_MODEL",
type: "text",
label: "GPT_ENGINE",
label: "LLM Model",
description: (
<>
Choose an OpenAI engine.{" "}
@ -49,12 +49,12 @@ export const InstanceAIForm: FC<IInstanceAIForm> = (props) => {
</a>
</>
),
placeholder: "gpt-3.5-turbo",
error: Boolean(errors.GPT_ENGINE),
placeholder: "gpt-4o-mini",
error: Boolean(errors.LLM_MODEL),
required: false,
},
{
key: "OPENAI_API_KEY",
key: "LLM_API_KEY",
type: "password",
label: "API key",
description: (
@ -71,7 +71,7 @@ export const InstanceAIForm: FC<IInstanceAIForm> = (props) => {
</>
),
placeholder: "sk-asddassdfasdefqsdfasd23das3dasdcasd",
error: Boolean(errors.OPENAI_API_KEY),
error: Boolean(errors.LLM_API_KEY),
required: false,
},
];

View file

@ -98,11 +98,7 @@ export const InstanceGithubConfigForm: FC<Props> = (props) => {
key: "GITHUB_ORGANIZATION_ID",
type: "text",
label: "Organization ID",
description: (
<>
The organization github ID.
</>
),
description: <>The organization github ID.</>,
placeholder: "123456789",
error: Boolean(errors.GITHUB_ORGANIZATION_ID),
required: false,

View file

@ -3,18 +3,16 @@
import { ReactNode } from "react";
import { ThemeProvider, useTheme } from "next-themes";
import { SWRConfig } from "swr";
// ui
// plane imports
import { ADMIN_BASE_PATH, DEFAULT_SWR_CONFIG } from "@plane/constants";
import { Toast } from "@plane/ui";
import { resolveGeneralTheme } from "@plane/utils";
// constants
// helpers
// lib
import { InstanceProvider } from "@/lib/instance-provider";
import { StoreProvider } from "@/lib/store-provider";
import { UserProvider } from "@/lib/user-provider";
// styles
import "./globals.css";
import "@/styles/globals.css";
const ToastWithTheme = () => {
const { resolvedTheme } = useTheme();

View file

@ -7,7 +7,7 @@ import { LogOut, UserCog2, Palette } from "lucide-react";
import { Menu, Transition } from "@headlessui/react";
// plane internal packages
import { API_BASE_URL } from "@plane/constants";
import {AuthService } from "@plane/services";
import { AuthService } from "@plane/services";
import { Avatar } from "@plane/ui";
import { getFileURL, cn } from "@plane/utils";
// hooks

View file

@ -67,9 +67,8 @@ export const InstanceHeader: FC = observer(() => {
{breadcrumbItems.length >= 0 && (
<div>
<Breadcrumbs>
<Breadcrumbs.BreadcrumbItem
type="text"
link={
<Breadcrumbs.Item
component={
<BreadcrumbLink
href="/general/"
label="Settings"
@ -80,10 +79,9 @@ export const InstanceHeader: FC = observer(() => {
{breadcrumbItems.map(
(item) =>
item.title && (
<Breadcrumbs.BreadcrumbItem
<Breadcrumbs.Item
key={item.title}
type="text"
link={<BreadcrumbLink href={item.href} label={item.title} />}
component={<BreadcrumbLink href={item.href} label={item.title} />}
/>
)
)}

View file

@ -1,11 +1,11 @@
import { FC } from "react";
import { Info, X } from "lucide-react";
// plane constants
import { TAuthErrorInfo } from "@plane/constants";
import { TAdminAuthErrorInfo } from "@plane/constants";
type TAuthBanner = {
bannerData: TAuthErrorInfo | undefined;
handleBannerData?: (bannerData: TAuthErrorInfo | undefined) => void;
bannerData: TAdminAuthErrorInfo | undefined;
handleBannerData?: (bannerData: TAdminAuthErrorInfo | undefined) => void;
};
export const AuthBanner: FC<TAuthBanner> = (props) => {

View file

@ -4,7 +4,7 @@ import { FC, useEffect, useMemo, useState } from "react";
import { useSearchParams } from "next/navigation";
import { Eye, EyeOff } from "lucide-react";
// plane internal packages
import { API_BASE_URL, EAdminAuthErrorCodes, TAuthErrorInfo } from "@plane/constants";
import { API_BASE_URL, EAdminAuthErrorCodes, TAdminAuthErrorInfo } from "@plane/constants";
import { AuthService } from "@plane/services";
import { Button, Input, Spinner } from "@plane/ui";
// components
@ -54,7 +54,7 @@ export const InstanceSignInForm: FC = (props) => {
const [csrfToken, setCsrfToken] = useState<string | undefined>(undefined);
const [formData, setFormData] = useState<TFormData>(defaultFromData);
const [isSubmitting, setIsSubmitting] = useState(false);
const [errorInfo, setErrorInfo] = useState<TAuthErrorInfo | undefined>(undefined);
const [errorInfo, setErrorInfo] = useState<TAdminAuthErrorInfo | undefined>(undefined);
const handleFormChange = (key: keyof TFormData, value: string | boolean) =>
setFormData((prev) => ({ ...prev, [key]: value }));

View file

@ -3,7 +3,7 @@ import Image from "next/image";
import Link from "next/link";
import { KeyRound, Mails } from "lucide-react";
// plane packages
import { SUPPORT_EMAIL, EAdminAuthErrorCodes, TAuthErrorInfo } from "@plane/constants";
import { SUPPORT_EMAIL, EAdminAuthErrorCodes, TAdminAuthErrorInfo } from "@plane/constants";
import { TGetBaseAuthenticationModeProps, TInstanceAuthenticationModes } from "@plane/types";
import { resolveGeneralTheme } from "@plane/utils";
// components
@ -89,7 +89,7 @@ const errorCodeMessages: {
export const authErrorHandler = (
errorCode: EAdminAuthErrorCodes,
email?: string | undefined
): TAuthErrorInfo | undefined => {
): TAdminAuthErrorInfo | undefined => {
const bannerAlertErrorCodes = [
EAdminAuthErrorCodes.ADMIN_ALREADY_EXIST,
EAdminAuthErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD_FIRST_NAME,

View file

@ -2,7 +2,7 @@ import set from "lodash/set";
import { observable, action, computed, makeObservable, runInAction } from "mobx";
// plane internal packages
import { EInstanceStatus, TInstanceStatus } from "@plane/constants";
import {InstanceService} from "@plane/services";
import { InstanceService } from "@plane/services";
import {
IInstance,
IInstanceAdmin,

View file

@ -1 +1 @@
export * from "ce/components/authentication/authentication-modes";

View file

@ -1,7 +1,7 @@
{
"name": "admin",
"description": "Admin UI for Plane",
"version": "0.26.1",
"version": "0.27.0",
"license": "AGPL-3.0",
"private": true,
"scripts": {
@ -10,6 +10,7 @@
"build": "next build",
"preview": "next build && next start",
"start": "next start",
"format": "prettier --write .",
"lint": "eslint . --ext .ts,.tsx",
"lint:errors": "eslint . --ext .ts,.tsx --quiet"
},
@ -17,10 +18,11 @@
"@headlessui/react": "^1.7.19",
"@plane/constants": "*",
"@plane/hooks": "*",
"@plane/propel": "*",
"@plane/services": "*",
"@plane/types": "*",
"@plane/ui": "*",
"@plane/utils": "*",
"@plane/services": "*",
"@tailwindcss/typography": "^0.5.9",
"@types/lodash": "^4.17.0",
"autoprefixer": "10.4.14",
@ -29,7 +31,7 @@
"lucide-react": "^0.469.0",
"mobx": "^6.12.0",
"mobx-react": "^9.1.1",
"next": "^14.2.29",
"next": "14.2.30",
"next-themes": "^0.2.1",
"postcss": "^8.4.38",
"react": "^18.3.1",
@ -48,6 +50,6 @@
"@types/react-dom": "^18.2.18",
"@types/uuid": "^9.0.8",
"@types/zxcvbn": "^4.4.4",
"typescript": "5.3.3"
"typescript": "5.8.3"
}
}

View file

@ -1,8 +1,2 @@
module.exports = {
plugins: {
"postcss-import": {},
"tailwindcss/nesting": {},
tailwindcss: {},
autoprefixer: {},
},
};
// eslint-disable-next-line @typescript-eslint/no-require-imports
module.exports = require("@plane/tailwind-config/postcss.config.js");

View file

@ -1,5 +1,4 @@
@import url("https://fonts.googleapis.com/css2?family=Inter:wght@200;300;400;500;600;700;800&display=swap");
@import url("https://fonts.googleapis.com/css2?family=Material+Symbols+Rounded:opsz,wght,FILL,GRAD@48,400,0,0&display=swap");
@import "@plane/propel/styles/fonts";
@tailwind base;
@tailwind components;
@ -60,23 +59,31 @@
--color-border-300: 212, 212, 212; /* strong border- 1 */
--color-border-400: 185, 185, 185; /* strong border- 2 */
--color-shadow-2xs: 0px 0px 1px 0px rgba(23, 23, 23, 0.06), 0px 1px 2px 0px rgba(23, 23, 23, 0.06),
--color-shadow-2xs:
0px 0px 1px 0px rgba(23, 23, 23, 0.06), 0px 1px 2px 0px rgba(23, 23, 23, 0.06),
0px 1px 2px 0px rgba(23, 23, 23, 0.14);
--color-shadow-xs: 0px 1px 2px 0px rgba(0, 0, 0, 0.16), 0px 2px 4px 0px rgba(16, 24, 40, 0.12),
--color-shadow-xs:
0px 1px 2px 0px rgba(0, 0, 0, 0.16), 0px 2px 4px 0px rgba(16, 24, 40, 0.12),
0px 1px 8px -1px rgba(16, 24, 40, 0.1);
--color-shadow-sm: 0px 1px 4px 0px rgba(0, 0, 0, 0.01), 0px 4px 8px 0px rgba(0, 0, 0, 0.02),
0px 1px 12px 0px rgba(0, 0, 0, 0.12);
--color-shadow-rg: 0px 3px 6px 0px rgba(0, 0, 0, 0.1), 0px 4px 4px 0px rgba(16, 24, 40, 0.08),
--color-shadow-sm:
0px 1px 4px 0px rgba(0, 0, 0, 0.01), 0px 4px 8px 0px rgba(0, 0, 0, 0.02), 0px 1px 12px 0px rgba(0, 0, 0, 0.12);
--color-shadow-rg:
0px 3px 6px 0px rgba(0, 0, 0, 0.1), 0px 4px 4px 0px rgba(16, 24, 40, 0.08),
0px 1px 12px 0px rgba(16, 24, 40, 0.04);
--color-shadow-md: 0px 4px 8px 0px rgba(0, 0, 0, 0.12), 0px 6px 12px 0px rgba(16, 24, 40, 0.12),
--color-shadow-md:
0px 4px 8px 0px rgba(0, 0, 0, 0.12), 0px 6px 12px 0px rgba(16, 24, 40, 0.12),
0px 1px 16px 0px rgba(16, 24, 40, 0.12);
--color-shadow-lg: 0px 6px 12px 0px rgba(0, 0, 0, 0.12), 0px 8px 16px 0px rgba(0, 0, 0, 0.12),
--color-shadow-lg:
0px 6px 12px 0px rgba(0, 0, 0, 0.12), 0px 8px 16px 0px rgba(0, 0, 0, 0.12),
0px 1px 24px 0px rgba(16, 24, 40, 0.12);
--color-shadow-xl: 0px 0px 18px 0px rgba(0, 0, 0, 0.16), 0px 0px 24px 0px rgba(16, 24, 40, 0.16),
--color-shadow-xl:
0px 0px 18px 0px rgba(0, 0, 0, 0.16), 0px 0px 24px 0px rgba(16, 24, 40, 0.16),
0px 0px 52px 0px rgba(16, 24, 40, 0.16);
--color-shadow-2xl: 0px 8px 16px 0px rgba(0, 0, 0, 0.12), 0px 12px 24px 0px rgba(16, 24, 40, 0.12),
--color-shadow-2xl:
0px 8px 16px 0px rgba(0, 0, 0, 0.12), 0px 12px 24px 0px rgba(16, 24, 40, 0.12),
0px 1px 32px 0px rgba(16, 24, 40, 0.12);
--color-shadow-3xl: 0px 12px 24px 0px rgba(0, 0, 0, 0.12), 0px 16px 32px 0px rgba(0, 0, 0, 0.12),
--color-shadow-3xl:
0px 12px 24px 0px rgba(0, 0, 0, 0.12), 0px 16px 32px 0px rgba(0, 0, 0, 0.12),
0px 1px 48px 0px rgba(16, 24, 40, 0.12);
--color-shadow-4xl: 0px 8px 40px 0px rgba(0, 0, 61, 0.05), 0px 12px 32px -16px rgba(0, 0, 0, 0.05);

View file

@ -1,13 +1,19 @@
{
"extends": "@plane/typescript-config/nextjs.json",
"compilerOptions": {
"plugins": [{ "name": "next" }],
"plugins": [
{
"name": "next"
}
],
"baseUrl": ".",
"paths": {
"@/*": ["core/*"],
"@/public/*": ["public/*"],
"@/plane-admin/*": ["ce/*"]
}
"@/plane-admin/*": ["ce/*"],
"@/styles/*": ["styles/*"]
},
"strictNullChecks": true
},
"include": ["next-env.d.ts", "next.config.js", "**/*.ts", "**/*.tsx", ".next/types/**/*.ts"],
"exclude": ["node_modules"]

apiserver/.coveragerc Normal file
View file

@ -0,0 +1,25 @@
[run]
source = plane
omit =
*/tests/*
*/migrations/*
*/settings/*
*/wsgi.py
*/asgi.py
*/urls.py
manage.py
*/admin.py
*/apps.py
[report]
exclude_lines =
pragma: no cover
def __repr__
if self.debug:
raise NotImplementedError
if __name__ == .__main__.
pass
raise ImportError
[html]
directory = htmlcov
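This new coverage configuration limits measurement to the `plane` package, omits tests, migrations, settings, and framework scaffolding, and writes HTML reports into `htmlcov/` (which the updated `.gitignore` above now excludes). It is normally driven from the CLI with `coverage run -m pytest` followed by `coverage html`; the sketch below shows the equivalent programmatic flow with coverage.py's Python API and is an illustration, not part of this commit:

```python
# Illustrative only: drive coverage.py programmatically with the committed .coveragerc.
# Assumption: the test suite can be launched via pytest; the project may use
# `python manage.py test` or the coverage CLI instead.
import coverage
import pytest

cov = coverage.Coverage(config_file=".coveragerc")
cov.start()
pytest.main(["plane"])      # run the backend test suite (path is illustrative)
cov.stop()
cov.save()
cov.html_report()           # writes the report to htmlcov/ per the [html] section
```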

View file

@ -1,7 +1,7 @@
# Backend
# Debug value for api server use it as 0 for production use
DEBUG=0
CORS_ALLOWED_ORIGINS="http://localhost"
CORS_ALLOWED_ORIGINS="http://localhost:3000,http://localhost:3001,http://localhost:3002,http://localhost:3100"
# Database Settings
POSTGRES_USER="plane"
@ -27,7 +27,7 @@ RABBITMQ_VHOST="plane"
AWS_REGION=""
AWS_ACCESS_KEY_ID="access-key"
AWS_SECRET_ACCESS_KEY="secret-key"
AWS_S3_ENDPOINT_URL="http://plane-minio:9000"
AWS_S3_ENDPOINT_URL="http://localhost:9000"
# Changing this requires change in the nginx.conf for uploads if using minio setup
AWS_S3_BUCKET_NAME="uploads"
# Maximum file upload limit
@ -37,22 +37,31 @@ FILE_SIZE_LIMIT=5242880
DOCKERIZED=1 # deprecated
# set to 1 If using the pre-configured minio setup
USE_MINIO=1
USE_MINIO=0
# Nginx Configuration
NGINX_PORT=80
# Email redirections and minio domain settings
WEB_URL="http://localhost"
WEB_URL="http://localhost:8000"
# Gunicorn Workers
GUNICORN_WORKERS=2
# Base URLs
ADMIN_BASE_URL=
SPACE_BASE_URL=
APP_BASE_URL=
ADMIN_BASE_URL="http://localhost:3001"
ADMIN_BASE_PATH="/god-mode"
SPACE_BASE_URL="http://localhost:3002"
SPACE_BASE_PATH="/spaces"
APP_BASE_URL="http://localhost:3000"
APP_BASE_PATH=""
LIVE_BASE_URL="http://localhost:3100"
LIVE_BASE_PATH="/live"
LIVE_SERVER_SECRET_KEY="secret-key"
# Hard delete files after days
HARD_DELETE_AFTER_DAYS=60
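The backend `.env.example` now carries development-friendly defaults: `CORS_ALLOWED_ORIGINS` is a comma-separated list covering the web, admin, space, and live apps, MinIO is disabled by default, and each app's base URL and path are spelled out. A hedged sketch of how such values are commonly consumed in Django settings follows; the repository's actual settings module may read them differently:

```python
# Illustrative settings snippet, not the project's actual settings file.
import os

# The CORS value is now a comma-separated list, so split it into a Python list.
CORS_ALLOWED_ORIGINS = [
    origin.strip()
    for origin in os.environ.get("CORS_ALLOWED_ORIGINS", "").split(",")
    if origin.strip()
]

ADMIN_BASE_URL = os.environ.get("ADMIN_BASE_URL", "http://localhost:3001")
ADMIN_BASE_PATH = os.environ.get("ADMIN_BASE_PATH", "/god-mode")
SPACE_BASE_URL = os.environ.get("SPACE_BASE_URL", "http://localhost:3002")
SPACE_BASE_PATH = os.environ.get("SPACE_BASE_PATH", "/spaces")
APP_BASE_URL = os.environ.get("APP_BASE_URL", "http://localhost:3000")
LIVE_BASE_URL = os.environ.get("LIVE_BASE_URL", "http://localhost:3100")
LIVE_SERVER_SECRET_KEY = os.environ.get("LIVE_SERVER_SECRET_KEY", "secret-key")
```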

View file

@ -1,6 +1,6 @@
{
"name": "plane-api",
"version": "0.26.1",
"version": "0.27.0",
"license": "AGPL-3.0",
"private": true,
"description": "API server powering Plane's backend"

View file

@ -15,4 +15,4 @@ from .state import StateLiteSerializer, StateSerializer
from .cycle import CycleSerializer, CycleIssueSerializer, CycleLiteSerializer
from .module import ModuleSerializer, ModuleIssueSerializer, ModuleLiteSerializer
from .intake import IntakeIssueSerializer
from .estimate import EstimatePointSerializer

View file

@ -48,11 +48,6 @@ class CycleSerializer(BaseSerializer):
if not project_id:
raise serializers.ValidationError("Project ID is required")
is_start_date_end_date_equal = (
True
if str(data.get("start_date")) == str(data.get("end_date"))
else False
)
data["start_date"] = convert_to_utc(
date=str(data.get("start_date").date()),
project_id=project_id,
@ -61,7 +56,6 @@ class CycleSerializer(BaseSerializer):
data["end_date"] = convert_to_utc(
date=str(data.get("end_date", None).date()),
project_id=project_id,
is_start_date_end_date_equal=is_start_date_end_date_equal,
)
return data

View file

@ -160,12 +160,15 @@ class IssueSerializer(BaseSerializer):
else:
try:
# Then assign it to default assignee, if it is a valid assignee
if default_assignee_id is not None and ProjectMember.objects.filter(
member_id=default_assignee_id,
project_id=project_id,
role__gte=15,
is_active=True
).exists():
if (
default_assignee_id is not None
and ProjectMember.objects.filter(
member_id=default_assignee_id,
project_id=project_id,
role__gte=15,
is_active=True,
).exists()
):
IssueAssignee.objects.create(
assignee_id=default_assignee_id,
issue=issue,

View file

@ -788,6 +788,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@ -799,6 +800,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@ -847,6 +849,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
)
)
)
old_cycle = old_cycle.first()
estimate_type = Project.objects.filter(
workspace__slug=slug,
@ -966,7 +969,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
)
estimate_completion_chart = burndown_plot(
queryset=old_cycle.first(),
queryset=old_cycle,
slug=slug,
project_id=project_id,
plot_type="points",
@ -1114,7 +1117,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
# Pass the new_cycle queryset to burndown_plot
completion_chart = burndown_plot(
queryset=old_cycle.first(),
queryset=old_cycle,
slug=slug,
project_id=project_id,
plot_type="issues",
@ -1126,12 +1129,12 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
).first()
current_cycle.progress_snapshot = {
"total_issues": old_cycle.first().total_issues,
"completed_issues": old_cycle.first().completed_issues,
"cancelled_issues": old_cycle.first().cancelled_issues,
"started_issues": old_cycle.first().started_issues,
"unstarted_issues": old_cycle.first().unstarted_issues,
"backlog_issues": old_cycle.first().backlog_issues,
"total_issues": old_cycle.total_issues,
"completed_issues": old_cycle.completed_issues,
"cancelled_issues": old_cycle.cancelled_issues,
"started_issues": old_cycle.started_issues,
"unstarted_issues": old_cycle.unstarted_issues,
"backlog_issues": old_cycle.backlog_issues,
"distribution": {
"labels": label_distribution_data,
"assignees": assignee_distribution_data,

View file

@ -58,7 +58,7 @@ from plane.bgtasks.storage_metadata_task import get_asset_object_metadata
from .base import BaseAPIView
from plane.utils.host import base_host
from plane.bgtasks.webhook_task import model_activity
from plane.bgtasks.work_item_link_task import crawl_work_item_link_title
class WorkspaceIssueAPIEndpoint(BaseAPIView):
"""
@ -692,6 +692,9 @@ class IssueLinkAPIEndpoint(BaseAPIView):
serializer = IssueLinkSerializer(data=request.data)
if serializer.is_valid():
serializer.save(project_id=project_id, issue_id=issue_id)
crawl_work_item_link_title.delay(
serializer.data.get("id"), serializer.data.get("url")
)
link = IssueLink.objects.get(pk=serializer.data["id"])
link.created_by_id = request.data.get("created_by", request.user.id)
@ -719,6 +722,9 @@ class IssueLinkAPIEndpoint(BaseAPIView):
serializer = IssueLinkSerializer(issue_link, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()
crawl_work_item_link_title.delay(
serializer.data.get("id"), serializer.data.get("url")
)
issue_activity.delay(
type="link.activity.updated",
requested_data=requested_data,

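Both the create and update handlers for work item links now queue `crawl_work_item_link_title` after a successful save, passing the link id and URL. The task itself is defined in `plane.bgtasks.work_item_link_task` and is not shown in this hunk; the sketch below is only an assumption of what such a title crawler could look like, using Celery's `shared_task`, the `requests` library, and a `title` field on `IssueLink`:

```python
# Hypothetical sketch; the real task in plane.bgtasks.work_item_link_task may differ.
import re

import requests
from celery import shared_task

from plane.db.models import IssueLink


@shared_task
def crawl_work_item_link_title(link_id, url):
    """Fetch the linked page and store its <title> on the IssueLink (assumed field)."""
    try:
        response = requests.get(url, timeout=5)
        match = re.search(r"<title[^>]*>(.*?)</title>", response.text, re.IGNORECASE | re.DOTALL)
        title = match.group(1).strip() if match else url
    except requests.RequestException:
        title = url  # fall back to the raw URL when the page cannot be fetched
    IssueLink.objects.filter(pk=link_id).update(title=title)
```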
View file

@ -172,14 +172,14 @@ class ProjectAPIEndpoint(BaseAPIView):
states = [
{
"name": "Backlog",
"color": "#A3A3A3",
"color": "#60646C",
"sequence": 15000,
"group": "backlog",
"default": True,
},
{
"name": "Todo",
"color": "#3A3A3A",
"color": "#60646C",
"sequence": 25000,
"group": "unstarted",
},
@ -191,13 +191,13 @@ class ProjectAPIEndpoint(BaseAPIView):
},
{
"name": "Done",
"color": "#16A34A",
"color": "#46A758",
"sequence": 45000,
"group": "completed",
},
{
"name": "Cancelled",
"color": "#EF4444",
"color": "#9AA4BC",
"sequence": 55000,
"group": "cancelled",
},

View file

@ -39,7 +39,7 @@ from .project import (
ProjectMemberRoleSerializer,
)
from .state import StateSerializer, StateLiteSerializer
from .view import IssueViewSerializer
from .view import IssueViewSerializer, ViewIssueListSerializer
from .cycle import (
CycleSerializer,
CycleIssueSerializer,
@ -74,6 +74,7 @@ from .issue import (
IssueLinkLiteSerializer,
IssueVersionDetailSerializer,
IssueDescriptionVersionDetailSerializer,
IssueListDetailSerializer,
)
from .module import (

View file

@ -25,11 +25,6 @@ class CycleWriteSerializer(BaseSerializer):
or (self.instance and self.instance.project_id)
or self.context.get("project_id", None)
)
is_start_date_end_date_equal = (
True
if str(data.get("start_date")) == str(data.get("end_date"))
else False
)
data["start_date"] = convert_to_utc(
date=str(data.get("start_date").date()),
project_id=project_id,
@ -38,7 +33,6 @@ class CycleWriteSerializer(BaseSerializer):
data["end_date"] = convert_to_utc(
date=str(data.get("end_date", None).date()),
project_id=project_id,
is_start_date_end_date_equal=is_start_date_end_date_equal,
)
return data

View file

@ -53,6 +53,7 @@ def get_entity_model_and_serializer(entity_type):
}
return entity_map.get(entity_type, (None, None))
class UserFavoriteSerializer(serializers.ModelSerializer):
entity_data = serializers.SerializerMethodField()

View file

@ -725,6 +725,110 @@ class IssueSerializer(DynamicBaseSerializer):
read_only_fields = fields
class IssueListDetailSerializer(serializers.Serializer):
def __init__(self, *args, **kwargs):
# Extract expand parameter and store it as instance variable
self.expand = kwargs.pop("expand", []) or []
# Extract fields parameter and store it as instance variable
self.fields = kwargs.pop("fields", []) or []
super().__init__(*args, **kwargs)
def get_module_ids(self, obj):
return [module.module_id for module in obj.issue_module.all()]
def get_label_ids(self, obj):
return [label.label_id for label in obj.label_issue.all()]
def get_assignee_ids(self, obj):
return [assignee.assignee_id for assignee in obj.issue_assignee.all()]
def to_representation(self, instance):
data = {
# Basic fields
"id": instance.id,
"name": instance.name,
"state_id": instance.state_id,
"sort_order": instance.sort_order,
"completed_at": instance.completed_at,
"estimate_point": instance.estimate_point_id,
"priority": instance.priority,
"start_date": instance.start_date,
"target_date": instance.target_date,
"sequence_id": instance.sequence_id,
"project_id": instance.project_id,
"parent_id": instance.parent_id,
"created_at": instance.created_at,
"updated_at": instance.updated_at,
"created_by": instance.created_by_id,
"updated_by": instance.updated_by_id,
"is_draft": instance.is_draft,
"archived_at": instance.archived_at,
# Computed fields
"cycle_id": instance.cycle_id,
"module_ids": self.get_module_ids(instance),
"label_ids": self.get_label_ids(instance),
"assignee_ids": self.get_assignee_ids(instance),
"sub_issues_count": instance.sub_issues_count,
"attachment_count": instance.attachment_count,
"link_count": instance.link_count,
}
# Handle expanded fields only when requested - using direct field access
if self.expand:
if "issue_relation" in self.expand:
relations = []
for relation in instance.issue_relation.all():
related_issue = relation.related_issue
# If the related issue is deleted, skip it
if not related_issue:
continue
# Add the related issue to the relations list
relations.append(
{
"id": related_issue.id,
"project_id": related_issue.project_id,
"sequence_id": related_issue.sequence_id,
"name": related_issue.name,
"relation_type": relation.relation_type,
"state_id": related_issue.state_id,
"priority": related_issue.priority,
"created_by": related_issue.created_by_id,
"created_at": related_issue.created_at,
"updated_at": related_issue.updated_at,
"updated_by": related_issue.updated_by_id,
}
)
data["issue_relation"] = relations
if "issue_related" in self.expand:
related = []
for relation in instance.issue_related.all():
issue = relation.issue
# If the related issue is deleted, skip it
if not issue:
continue
# Add the related issue to the related list
related.append(
{
"id": issue.id,
"project_id": issue.project_id,
"sequence_id": issue.sequence_id,
"name": issue.name,
"relation_type": relation.relation_type,
"state_id": issue.state_id,
"priority": issue.priority,
"created_by": issue.created_by_id,
"created_at": issue.created_at,
"updated_at": issue.updated_at,
"updated_by": issue.updated_by_id,
}
)
data["issue_related"] = related
return data
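`IssueListDetailSerializer` assembles its payload by hand in `to_representation` and only walks relation tables when they are explicitly requested through `expand`. A hedged usage sketch, assuming the calling view pre-annotates the computed fields the serializer reads (`cycle_id`, `sub_issues_count`, `attachment_count`, `link_count`) and prefetches the relations; the related names and the actual call sites in this release may differ:

```python
# Illustrative only: annotation and related names are assumptions and must match
# what to_representation reads on each instance.
from django.db.models import Count, F
from rest_framework.response import Response

from plane.app.serializers import IssueListDetailSerializer
from plane.db.models import Issue


def list_issues(request, project_id):
    issues = (
        Issue.issue_objects.filter(project_id=project_id)
        .annotate(cycle_id=F("issue_cycle__cycle_id"))
        .annotate(sub_issues_count=Count("sub_issues", distinct=True))
        .annotate(attachment_count=Count("issue_attachment", distinct=True))
        .annotate(link_count=Count("issue_link", distinct=True))
        .prefetch_related("issue_module", "label_issue", "issue_assignee")
    )
    expand = [field for field in request.GET.get("expand", "").split(",") if field]
    serializer = IssueListDetailSerializer(issues, many=True, expand=expand)
    return Response(serializer.data)
```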
class IssueLiteSerializer(DynamicBaseSerializer):
class Meta:
model = Issue

View file

@ -148,10 +148,13 @@ class ProjectMemberAdminSerializer(BaseSerializer):
fields = "__all__"
class ProjectMemberRoleSerializer(DynamicBaseSerializer):
original_role = serializers.IntegerField(source='role', read_only=True)
class Meta:
model = ProjectMember
fields = ("id", "role", "member", "project")
fields = ("id", "role", "member", "project", "original_role", "created_at")
read_only_fields = ["original_role", "created_at"]
class ProjectMemberInviteSerializer(BaseSerializer):

View file

@ -1,11 +1,13 @@
# Module imports
from .base import BaseSerializer
from rest_framework import serializers
from plane.db.models import State
class StateSerializer(BaseSerializer):
order = serializers.FloatField(required=False)
class Meta:
model = State
fields = [
@ -18,6 +20,7 @@ class StateSerializer(BaseSerializer):
"default",
"description",
"sequence",
"order",
]
read_only_fields = ["workspace", "project"]

View file

@ -3,11 +3,22 @@ from rest_framework import serializers
# Module import
from plane.db.models import Account, Profile, User, Workspace, WorkspaceMemberInvite
from plane.utils.url import contains_url
from .base import BaseSerializer
class UserSerializer(BaseSerializer):
def validate_first_name(self, value):
if contains_url(value):
raise serializers.ValidationError("First name cannot contain a URL.")
return value
def validate_last_name(self, value):
if contains_url(value):
raise serializers.ValidationError("Last name cannot contain a URL.")
return value
class Meta:
model = User
# Exclude password field from the serializer
@ -99,11 +110,16 @@ class UserMeSettingsSerializer(BaseSerializer):
workspace_member__member=obj.id,
workspace_member__is_active=True,
).first()
logo_asset_url = workspace.logo_asset.asset_url if workspace.logo_asset is not None else ""
return {
"last_workspace_id": profile.last_workspace_id,
"last_workspace_slug": (
workspace.slug if workspace is not None else ""
),
"last_workspace_name": (
workspace.name if workspace is not None else ""
),
"last_workspace_logo": (logo_asset_url),
"fallback_workspace_id": profile.last_workspace_id,
"fallback_workspace_slug": (
workspace.slug if workspace is not None else ""

View file

@ -7,6 +7,49 @@ from plane.db.models import IssueView
from plane.utils.issue_filters import issue_filters
class ViewIssueListSerializer(serializers.Serializer):
def get_assignee_ids(self, instance):
return [assignee.assignee_id for assignee in instance.issue_assignee.all()]
def get_label_ids(self, instance):
return [label.label_id for label in instance.label_issue.all()]
def get_module_ids(self, instance):
return [module.module_id for module in instance.issue_module.all()]
def to_representation(self, instance):
data = {
"id": instance.id,
"name": instance.name,
"state_id": instance.state_id,
"sort_order": instance.sort_order,
"completed_at": instance.completed_at,
"estimate_point": instance.estimate_point_id,
"priority": instance.priority,
"start_date": instance.start_date,
"target_date": instance.target_date,
"sequence_id": instance.sequence_id,
"project_id": instance.project_id,
"parent_id": instance.parent_id,
"cycle_id": instance.cycle_id,
"sub_issues_count": instance.sub_issues_count,
"created_at": instance.created_at,
"updated_at": instance.updated_at,
"created_by": instance.created_by_id,
"updated_by": instance.updated_by_id,
"attachment_count": instance.attachment_count,
"link_count": instance.link_count,
"is_draft": instance.is_draft,
"archived_at": instance.archived_at,
"state__group": instance.state.group if instance.state else None,
"assignee_ids": self.get_assignee_ids(instance),
"label_ids": self.get_label_ids(instance),
"module_ids": self.get_module_ids(instance),
}
return data
class IssueViewSerializer(DynamicBaseSerializer):
is_favorite = serializers.BooleanField(read_only=True)

View file

@ -1,7 +1,5 @@
# Third party imports
from rest_framework import serializers
from rest_framework import status
from rest_framework.response import Response
# Module imports
from .base import BaseSerializer, DynamicBaseSerializer
@ -25,10 +23,12 @@ from plane.db.models import (
WorkspaceUserPreference,
)
from plane.utils.constants import RESTRICTED_WORKSPACE_SLUGS
from plane.utils.url import contains_url
# Django imports
from django.core.validators import URLValidator
from django.core.exceptions import ValidationError
import re
class WorkSpaceSerializer(DynamicBaseSerializer):
@ -36,10 +36,21 @@ class WorkSpaceSerializer(DynamicBaseSerializer):
logo_url = serializers.CharField(read_only=True)
role = serializers.IntegerField(read_only=True)
def validate_name(self, value):
# Check if the name contains a URL
if contains_url(value):
raise serializers.ValidationError("Name must not contain URLs")
return value
def validate_slug(self, value):
# Check if the slug is restricted
if value in RESTRICTED_WORKSPACE_SLUGS:
raise serializers.ValidationError("Slug is not valid")
# Slug should only contain alphanumeric characters, hyphens, and underscores
if not re.match(r"^[a-zA-Z0-9_-]+$", value):
raise serializers.ValidationError(
"Slug can only contain letters, numbers, hyphens (-), and underscores (_)"
)
return value
class Meta:
@ -148,7 +159,6 @@ class WorkspaceUserLinkSerializer(BaseSerializer):
return value
def create(self, validated_data):
# Filtering the WorkspaceUserLink with the given url to check if the link already exists.
@ -157,7 +167,7 @@ class WorkspaceUserLinkSerializer(BaseSerializer):
workspace_user_link = WorkspaceUserLink.objects.filter(
url=url,
workspace_id=validated_data.get("workspace_id"),
owner_id=validated_data.get("owner_id")
owner_id=validated_data.get("owner_id"),
)
if workspace_user_link.exists():
@ -173,10 +183,8 @@ class WorkspaceUserLinkSerializer(BaseSerializer):
url = validated_data.get("url")
workspace_user_link = WorkspaceUserLink.objects.filter(
url=url,
workspace_id=instance.workspace_id,
owner=instance.owner
)
url=url, workspace_id=instance.workspace_id, owner=instance.owner
)
if workspace_user_link.exclude(pk=instance.id).exists():
raise serializers.ValidationError(
@ -185,8 +193,10 @@ class WorkspaceUserLinkSerializer(BaseSerializer):
return super().update(instance, validated_data)
class IssueRecentVisitSerializer(serializers.ModelSerializer):
project_identifier = serializers.SerializerMethodField()
assignees = serializers.SerializerMethodField()
class Meta:
model = Issue
@ -204,9 +214,15 @@ class IssueRecentVisitSerializer(serializers.ModelSerializer):
def get_project_identifier(self, obj):
project = obj.project
return project.identifier if project else None
def get_assignees(self, obj):
return list(
obj.assignees.filter(issue_assignee__deleted_at__isnull=True).values_list(
"id", flat=True
)
)
class ProjectRecentVisitSerializer(serializers.ModelSerializer):
project_members = serializers.SerializerMethodField()

View file

@ -6,8 +6,14 @@ from plane.app.views import (
AnalyticViewViewset,
SavedAnalyticEndpoint,
ExportAnalyticsEndpoint,
AdvanceAnalyticsEndpoint,
AdvanceAnalyticsStatsEndpoint,
AdvanceAnalyticsChartEndpoint,
DefaultAnalyticsEndpoint,
ProjectStatsEndpoint,
ProjectAdvanceAnalyticsEndpoint,
ProjectAdvanceAnalyticsStatsEndpoint,
ProjectAdvanceAnalyticsChartEndpoint,
)
@ -49,4 +55,34 @@ urlpatterns = [
ProjectStatsEndpoint.as_view(),
name="project-analytics",
),
path(
"workspaces/<str:slug>/advance-analytics/",
AdvanceAnalyticsEndpoint.as_view(),
name="advance-analytics",
),
path(
"workspaces/<str:slug>/advance-analytics-stats/",
AdvanceAnalyticsStatsEndpoint.as_view(),
name="advance-analytics-stats",
),
path(
"workspaces/<str:slug>/advance-analytics-charts/",
AdvanceAnalyticsChartEndpoint.as_view(),
name="advance-analytics-chart",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/advance-analytics/",
ProjectAdvanceAnalyticsEndpoint.as_view(),
name="project-advance-analytics",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/advance-analytics-stats/",
ProjectAdvanceAnalyticsStatsEndpoint.as_view(),
name="project-advance-analytics-stats",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/advance-analytics-charts/",
ProjectAdvanceAnalyticsChartEndpoint.as_view(),
name="project-advance-analytics-chart",
),
]
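Six new workspace- and project-scoped routes expose the advance analytics views added in this release. A hedged example of exercising them over HTTP, assuming the API is mounted under `/api` on the default local host and using a placeholder authentication header; the accepted `tab` and `type` values come from the view code later in this commit:

```python
# Illustrative client calls; host, slug, ids, and the auth scheme are placeholders.
import requests

BASE = "http://localhost:8000/api"
session = requests.Session()
session.headers["Authorization"] = "Bearer <token>"  # placeholder auth

# Workspace-level counters ("tab" may be "overview" or "work-items")
overview = session.get(
    f"{BASE}/workspaces/<slug>/advance-analytics/", params={"tab": "overview"}
).json()

# Per-project breakdown of work items by state group ("type" supports "work-items")
stats = session.get(
    f"{BASE}/workspaces/<slug>/advance-analytics-stats/", params={"type": "work-items"}
).json()

# Project-level completion chart, optionally scoped with cycle_id or module_id
chart = session.get(
    f"{BASE}/workspaces/<slug>/projects/<project_id>/advance-analytics-charts/",
    params={"type": "work-items"},
).json()
```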

View file

@ -4,14 +4,14 @@ from plane.app.views import ApiTokenEndpoint, ServiceApiTokenEndpoint
urlpatterns = [
# API Tokens
path(
"workspaces/<str:slug>/api-tokens/",
"users/api-tokens/",
ApiTokenEndpoint.as_view(),
name="api-tokens",
),
path(
"workspaces/<str:slug>/api-tokens/<uuid:pk>/",
"users/api-tokens/<uuid:pk>/",
ApiTokenEndpoint.as_view(),
name="api-tokens",
name="api-tokens-details",
),
path(
"workspaces/<str:slug>/service-api-tokens/",

View file

@ -12,6 +12,9 @@ from plane.app.views import (
AssetRestoreEndpoint,
ProjectAssetEndpoint,
ProjectBulkAssetEndpoint,
AssetCheckEndpoint,
WorkspaceAssetDownloadEndpoint,
ProjectAssetDownloadEndpoint,
)
@ -81,5 +84,21 @@ urlpatterns = [
path(
"assets/v2/workspaces/<str:slug>/projects/<uuid:project_id>/<uuid:entity_id>/bulk/",
ProjectBulkAssetEndpoint.as_view(),
name="bulk-asset-update",
),
path(
"assets/v2/workspaces/<str:slug>/check/<uuid:asset_id>/",
AssetCheckEndpoint.as_view(),
name="asset-check",
),
path(
"assets/v2/workspaces/<str:slug>/download/<uuid:asset_id>/",
WorkspaceAssetDownloadEndpoint.as_view(),
name="workspace-asset-download",
),
path(
"assets/v2/workspaces/<str:slug>/projects/<uuid:project_id>/download/<uuid:asset_id>/",
ProjectAssetDownloadEndpoint.as_view(),
name="project-asset-download",
),
]
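Three asset routes are also added: an asset check plus workspace- and project-scoped downloads. The response shapes are not shown in this diff, so the sketch below only illustrates how the URLs are constructed, again assuming the `/api` mount point used above:

```python
# Illustrative URL construction only; slug and ids are placeholders.
slug, project_id, asset_id = "<slug>", "<project_id>", "<asset_id>"

check_url = f"/api/assets/v2/workspaces/{slug}/check/{asset_id}/"
workspace_download_url = f"/api/assets/v2/workspaces/{slug}/download/{asset_id}/"
project_download_url = (
    f"/api/assets/v2/workspaces/{slug}/projects/{project_id}/download/{asset_id}/"
)
```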

View file

@ -106,6 +106,9 @@ from .asset.v2 import (
AssetRestoreEndpoint,
ProjectAssetEndpoint,
ProjectBulkAssetEndpoint,
AssetCheckEndpoint,
WorkspaceAssetDownloadEndpoint,
ProjectAssetDownloadEndpoint,
)
from .issue.base import (
IssueListEndpoint,
@ -199,6 +202,18 @@ from .analytic.base import (
ProjectStatsEndpoint,
)
from .analytic.advance import (
AdvanceAnalyticsEndpoint,
AdvanceAnalyticsStatsEndpoint,
AdvanceAnalyticsChartEndpoint,
)
from .analytic.project_analytics import (
ProjectAdvanceAnalyticsEndpoint,
ProjectAdvanceAnalyticsStatsEndpoint,
ProjectAdvanceAnalyticsChartEndpoint,
)
from .notification.base import (
NotificationViewSet,
UnreadNotificationEndpoint,

View file

@ -0,0 +1,366 @@
from rest_framework.response import Response
from rest_framework import status
from typing import Dict, List, Any
from django.db.models import QuerySet, Q, Count
from django.http import HttpRequest
from django.db.models.functions import TruncMonth
from django.utils import timezone
from plane.app.views.base import BaseAPIView
from plane.app.permissions import ROLE, allow_permission
from plane.db.models import (
WorkspaceMember,
Project,
Issue,
Cycle,
Module,
IssueView,
ProjectPage,
Workspace,
CycleIssue,
ModuleIssue,
ProjectMember,
)
from plane.utils.build_chart import build_analytics_chart
from plane.utils.date_utils import (
get_analytics_filters,
)
class AdvanceAnalyticsBaseView(BaseAPIView):
def initialize_workspace(self, slug: str, type: str) -> None:
self._workspace_slug = slug
self.filters = get_analytics_filters(
slug=slug,
type=type,
user=self.request.user,
date_filter=self.request.GET.get("date_filter", None),
project_ids=self.request.GET.get("project_ids", None),
)
class AdvanceAnalyticsEndpoint(AdvanceAnalyticsBaseView):
def get_filtered_counts(self, queryset: QuerySet) -> Dict[str, int]:
def get_filtered_count() -> int:
if self.filters["analytics_date_range"]:
return queryset.filter(
created_at__gte=self.filters["analytics_date_range"]["current"][
"gte"
],
created_at__lte=self.filters["analytics_date_range"]["current"][
"lte"
],
).count()
return queryset.count()
def get_previous_count() -> int:
if self.filters["analytics_date_range"] and self.filters[
"analytics_date_range"
].get("previous"):
return queryset.filter(
created_at__gte=self.filters["analytics_date_range"]["previous"][
"gte"
],
created_at__lte=self.filters["analytics_date_range"]["previous"][
"lte"
],
).count()
return 0
return {
"count": get_filtered_count(),
# "filter_count": get_previous_count(),
}
def get_overview_data(self) -> Dict[str, Dict[str, int]]:
members_query = WorkspaceMember.objects.filter(
workspace__slug=self._workspace_slug, is_active=True
)
if self.request.GET.get("project_ids", None):
project_ids = self.request.GET.get("project_ids", None)
project_ids = [str(project_id) for project_id in project_ids.split(",")]
members_query = ProjectMember.objects.filter(
project_id__in=project_ids, is_active=True
)
return {
"total_users": self.get_filtered_counts(members_query),
"total_admins": self.get_filtered_counts(
members_query.filter(role=ROLE.ADMIN.value)
),
"total_members": self.get_filtered_counts(
members_query.filter(role=ROLE.MEMBER.value)
),
"total_guests": self.get_filtered_counts(
members_query.filter(role=ROLE.GUEST.value)
),
"total_projects": self.get_filtered_counts(
Project.objects.filter(**self.filters["project_filters"])
),
"total_work_items": self.get_filtered_counts(
Issue.issue_objects.filter(**self.filters["base_filters"])
),
"total_cycles": self.get_filtered_counts(
Cycle.objects.filter(**self.filters["base_filters"])
),
"total_intake": self.get_filtered_counts(
Issue.objects.filter(**self.filters["base_filters"]).filter(
issue_intake__status__in=["-2", "0"]
)
),
}
def get_work_items_stats(self) -> Dict[str, Dict[str, int]]:
base_queryset = Issue.issue_objects.filter(**self.filters["base_filters"])
return {
"total_work_items": self.get_filtered_counts(base_queryset),
"started_work_items": self.get_filtered_counts(
base_queryset.filter(state__group="started")
),
"backlog_work_items": self.get_filtered_counts(
base_queryset.filter(state__group="backlog")
),
"un_started_work_items": self.get_filtered_counts(
base_queryset.filter(state__group="unstarted")
),
"completed_work_items": self.get_filtered_counts(
base_queryset.filter(state__group="completed")
),
}
@allow_permission([ROLE.ADMIN, ROLE.MEMBER], level="WORKSPACE")
def get(self, request: HttpRequest, slug: str) -> Response:
self.initialize_workspace(slug, type="analytics")
tab = request.GET.get("tab", "overview")
if tab == "overview":
return Response(
self.get_overview_data(),
status=status.HTTP_200_OK,
)
elif tab == "work-items":
return Response(
self.get_work_items_stats(),
status=status.HTTP_200_OK,
)
return Response({"message": "Invalid tab"}, status=status.HTTP_400_BAD_REQUEST)
class AdvanceAnalyticsStatsEndpoint(AdvanceAnalyticsBaseView):
def get_project_issues_stats(self) -> QuerySet:
# Get the base queryset with workspace and project filters
base_queryset = Issue.issue_objects.filter(**self.filters["base_filters"])
# Apply date range filter if available
if self.filters["chart_period_range"]:
start_date, end_date = self.filters["chart_period_range"]
base_queryset = base_queryset.filter(
created_at__date__gte=start_date, created_at__date__lte=end_date
)
return (
base_queryset.values("project_id", "project__name").annotate(
cancelled_work_items=Count("id", filter=Q(state__group="cancelled")),
completed_work_items=Count("id", filter=Q(state__group="completed")),
backlog_work_items=Count("id", filter=Q(state__group="backlog")),
un_started_work_items=Count("id", filter=Q(state__group="unstarted")),
started_work_items=Count("id", filter=Q(state__group="started")),
)
.order_by("project_id")
)
def get_work_items_stats(self) -> Dict[str, Dict[str, int]]:
base_queryset = Issue.issue_objects.filter(**self.filters["base_filters"])
return (
base_queryset
.values("project_id", "project__name")
.annotate(
cancelled_work_items=Count("id", filter=Q(state__group="cancelled")),
completed_work_items=Count("id", filter=Q(state__group="completed")),
backlog_work_items=Count("id", filter=Q(state__group="backlog")),
un_started_work_items=Count("id", filter=Q(state__group="unstarted")),
started_work_items=Count("id", filter=Q(state__group="started")),
)
.order_by("project_id")
)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER], level="WORKSPACE")
def get(self, request: HttpRequest, slug: str) -> Response:
self.initialize_workspace(slug, type="chart")
type = request.GET.get("type", "work-items")
if type == "work-items":
return Response(
self.get_work_items_stats(),
status=status.HTTP_200_OK,
)
return Response({"message": "Invalid type"}, status=status.HTTP_400_BAD_REQUEST)
class AdvanceAnalyticsChartEndpoint(AdvanceAnalyticsBaseView):
def project_chart(self) -> List[Dict[str, Any]]:
# Get the base queryset with workspace and project filters
base_queryset = Issue.issue_objects.filter(**self.filters["base_filters"])
date_filter = {}
# Apply date range filter if available
if self.filters["chart_period_range"]:
start_date, end_date = self.filters["chart_period_range"]
date_filter = {
"created_at__date__gte": start_date,
"created_at__date__lte": end_date,
}
total_work_items = base_queryset.filter(**date_filter).count()
total_cycles = Cycle.objects.filter(
**self.filters["base_filters"], **date_filter
).count()
total_modules = Module.objects.filter(
**self.filters["base_filters"], **date_filter
).count()
total_intake = Issue.objects.filter(
issue_intake__isnull=False, **self.filters["base_filters"], **date_filter
).count()
total_members = WorkspaceMember.objects.filter(
workspace__slug=self._workspace_slug, is_active=True, **date_filter
).count()
total_pages = ProjectPage.objects.filter(
**self.filters["base_filters"], **date_filter
).count()
total_views = IssueView.objects.filter(
**self.filters["base_filters"], **date_filter
).count()
data = {
"work_items": total_work_items,
"cycles": total_cycles,
"modules": total_modules,
"intake": total_intake,
"members": total_members,
"pages": total_pages,
"views": total_views,
}
return [
{
"key": key,
"name": key.replace("_", " ").title(),
"count": value or 0,
}
for key, value in data.items()
]
def work_item_completion_chart(self) -> Dict[str, Any]:
# Get the base queryset
queryset = (
Issue.issue_objects.filter(**self.filters["base_filters"])
.select_related("workspace", "state", "parent")
.prefetch_related(
"assignees", "labels", "issue_module__module", "issue_cycle__cycle"
)
)
workspace = Workspace.objects.get(slug=self._workspace_slug)
start_date = workspace.created_at.date().replace(day=1)
# Apply date range filter if available
if self.filters["chart_period_range"]:
start_date, end_date = self.filters["chart_period_range"]
queryset = queryset.filter(
created_at__date__gte=start_date, created_at__date__lte=end_date
)
# Annotate by month and count
monthly_stats = (
queryset.annotate(month=TruncMonth("created_at"))
.values("month")
.annotate(
created_count=Count("id"),
completed_count=Count("id", filter=Q(state__group="completed")),
)
.order_by("month")
)
# Create dictionary of month -> counts
stats_dict = {
stat["month"].strftime("%Y-%m-%d"): {
"created_count": stat["created_count"],
"completed_count": stat["completed_count"],
}
for stat in monthly_stats
}
# Generate monthly data (ensure months with 0 count are included)
data = []
# include the current date at the end
end_date = timezone.now().date()
last_month = end_date.replace(day=1)
current_month = start_date
while current_month <= last_month:
date_str = current_month.strftime("%Y-%m-%d")
stats = stats_dict.get(date_str, {"created_count": 0, "completed_count": 0})
data.append(
{
"key": date_str,
"name": date_str,
"count": stats["created_count"],
"completed_issues": stats["completed_count"],
"created_issues": stats["created_count"],
}
)
# Move to next month
if current_month.month == 12:
current_month = current_month.replace(
year=current_month.year + 1, month=1
)
else:
current_month = current_month.replace(month=current_month.month + 1)
schema = {
"completed_issues": "completed_issues",
"created_issues": "created_issues",
}
return {"data": data, "schema": schema}
@allow_permission([ROLE.ADMIN, ROLE.MEMBER], level="WORKSPACE")
def get(self, request: HttpRequest, slug: str) -> Response:
self.initialize_workspace(slug, type="chart")
type = request.GET.get("type", "projects")
group_by = request.GET.get("group_by", None)
x_axis = request.GET.get("x_axis", "PRIORITY")
if type == "projects":
return Response(self.project_chart(), status=status.HTTP_200_OK)
elif type == "custom-work-items":
queryset = (
Issue.issue_objects.filter(**self.filters["base_filters"])
.select_related("workspace", "state", "parent")
.prefetch_related(
"assignees", "labels", "issue_module__module", "issue_cycle__cycle"
)
)
# Apply date range filter if available
if self.filters["chart_period_range"]:
start_date, end_date = self.filters["chart_period_range"]
queryset = queryset.filter(
created_at__date__gte=start_date, created_at__date__lte=end_date
)
return Response(
build_analytics_chart(queryset, x_axis, group_by),
status=status.HTTP_200_OK,
)
elif type == "work-items":
return Response(
self.work_item_completion_chart(),
status=status.HTTP_200_OK,
)
return Response({"message": "Invalid type"}, status=status.HTTP_400_BAD_REQUEST)

View file

@ -0,0 +1,421 @@
from rest_framework.response import Response
from rest_framework import status
from typing import Dict, Any
from django.db.models import QuerySet, Q, Count
from django.http import HttpRequest
from django.db.models.functions import TruncMonth
from django.utils import timezone
from datetime import timedelta
from plane.app.views.base import BaseAPIView
from plane.app.permissions import ROLE, allow_permission
from plane.db.models import (
Project,
Issue,
Cycle,
Module,
CycleIssue,
ModuleIssue,
)
from django.db import models
from django.db.models import F, Case, When, Value
from django.db.models.functions import Concat
from plane.utils.build_chart import build_analytics_chart
from plane.utils.date_utils import (
get_analytics_filters,
)
class ProjectAdvanceAnalyticsBaseView(BaseAPIView):
def initialize_workspace(self, slug: str, type: str) -> None:
self._workspace_slug = slug
self.filters = get_analytics_filters(
slug=slug,
type=type,
user=self.request.user,
date_filter=self.request.GET.get("date_filter", None),
project_ids=self.request.GET.get("project_ids", None),
)
class ProjectAdvanceAnalyticsEndpoint(ProjectAdvanceAnalyticsBaseView):
def get_filtered_counts(self, queryset: QuerySet) -> Dict[str, int]:
def get_filtered_count() -> int:
if self.filters["analytics_date_range"]:
return queryset.filter(
created_at__gte=self.filters["analytics_date_range"]["current"][
"gte"
],
created_at__lte=self.filters["analytics_date_range"]["current"][
"lte"
],
).count()
return queryset.count()
return {
"count": get_filtered_count(),
}
def get_work_items_stats(
self, project_id, cycle_id=None, module_id=None
) -> Dict[str, Dict[str, int]]:
"""
Returns work item stats for the workspace, or filtered by cycle_id or module_id if provided.
"""
base_queryset = None
if cycle_id is not None:
cycle_issues = CycleIssue.objects.filter(
**self.filters["base_filters"], cycle_id=cycle_id
).values_list("issue_id", flat=True)
base_queryset = Issue.issue_objects.filter(id__in=cycle_issues)
elif module_id is not None:
module_issues = ModuleIssue.objects.filter(
**self.filters["base_filters"], module_id=module_id
).values_list("issue_id", flat=True)
base_queryset = Issue.issue_objects.filter(id__in=module_issues)
else:
base_queryset = Issue.issue_objects.filter(
**self.filters["base_filters"], project_id=project_id
)
return {
"total_work_items": self.get_filtered_counts(base_queryset),
"started_work_items": self.get_filtered_counts(
base_queryset.filter(state__group="started")
),
"backlog_work_items": self.get_filtered_counts(
base_queryset.filter(state__group="backlog")
),
"un_started_work_items": self.get_filtered_counts(
base_queryset.filter(state__group="unstarted")
),
"completed_work_items": self.get_filtered_counts(
base_queryset.filter(state__group="completed")
),
}
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def get(self, request: HttpRequest, slug: str, project_id: str) -> Response:
self.initialize_workspace(slug, type="analytics")
# Optionally accept cycle_id or module_id as query params
cycle_id = request.GET.get("cycle_id", None)
module_id = request.GET.get("module_id", None)
return Response(
self.get_work_items_stats(
cycle_id=cycle_id, module_id=module_id, project_id=project_id
),
status=status.HTTP_200_OK,
)
class ProjectAdvanceAnalyticsStatsEndpoint(ProjectAdvanceAnalyticsBaseView):
def get_project_issues_stats(self) -> QuerySet:
# Get the base queryset with workspace and project filters
base_queryset = Issue.issue_objects.filter(**self.filters["base_filters"])
# Apply date range filter if available
if self.filters["chart_period_range"]:
start_date, end_date = self.filters["chart_period_range"]
base_queryset = base_queryset.filter(
created_at__date__gte=start_date, created_at__date__lte=end_date
)
return (
base_queryset.values("project_id", "project__name")
.annotate(
cancelled_work_items=Count("id", filter=Q(state__group="cancelled")),
completed_work_items=Count("id", filter=Q(state__group="completed")),
backlog_work_items=Count("id", filter=Q(state__group="backlog")),
un_started_work_items=Count("id", filter=Q(state__group="unstarted")),
started_work_items=Count("id", filter=Q(state__group="started")),
)
.order_by("project_id")
)
def get_work_items_stats(
self, project_id, cycle_id=None, module_id=None
) -> Dict[str, Dict[str, int]]:
base_queryset = None
if cycle_id is not None:
cycle_issues = CycleIssue.objects.filter(
**self.filters["base_filters"], cycle_id=cycle_id
).values_list("issue_id", flat=True)
base_queryset = Issue.issue_objects.filter(id__in=cycle_issues)
elif module_id is not None:
module_issues = ModuleIssue.objects.filter(
**self.filters["base_filters"], module_id=module_id
).values_list("issue_id", flat=True)
base_queryset = Issue.issue_objects.filter(id__in=module_issues)
else:
base_queryset = Issue.issue_objects.filter(
**self.filters["base_filters"], project_id=project_id
)
return (
base_queryset.annotate(display_name=F("assignees__display_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True, then="assignees__avatar"
),
default=Value(None),
output_field=models.CharField(),
)
)
.values("display_name", "assignee_id", "avatar_url")
.annotate(
cancelled_work_items=Count(
"id", filter=Q(state__group="cancelled"), distinct=True
),
completed_work_items=Count(
"id", filter=Q(state__group="completed"), distinct=True
),
backlog_work_items=Count(
"id", filter=Q(state__group="backlog"), distinct=True
),
un_started_work_items=Count(
"id", filter=Q(state__group="unstarted"), distinct=True
),
started_work_items=Count(
"id", filter=Q(state__group="started"), distinct=True
),
)
.order_by("display_name")
)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def get(self, request: HttpRequest, slug: str, project_id: str) -> Response:
self.initialize_workspace(slug, type="chart")
type = request.GET.get("type", "work-items")
if type == "work-items":
# Optionally accept cycle_id or module_id as query params
cycle_id = request.GET.get("cycle_id", None)
module_id = request.GET.get("module_id", None)
return Response(
self.get_work_items_stats(
project_id=project_id, cycle_id=cycle_id, module_id=module_id
),
status=status.HTTP_200_OK,
)
return Response({"message": "Invalid type"}, status=status.HTTP_400_BAD_REQUEST)
class ProjectAdvanceAnalyticsChartEndpoint(ProjectAdvanceAnalyticsBaseView):
def work_item_completion_chart(
self, project_id, cycle_id=None, module_id=None
) -> Dict[str, Any]:
# Get the base queryset
queryset = (
Issue.issue_objects.filter(**self.filters["base_filters"])
.filter(project_id=project_id)
.select_related("workspace", "state", "parent")
.prefetch_related(
"assignees", "labels", "issue_module__module", "issue_cycle__cycle"
)
)
if cycle_id is not None:
cycle_issues = CycleIssue.objects.filter(
**self.filters["base_filters"], cycle_id=cycle_id
).values_list("issue_id", flat=True)
cycle = Cycle.objects.filter(id=cycle_id).first()
if cycle and cycle.start_date:
start_date = cycle.start_date.date()
end_date = cycle.end_date.date()
else:
return {"data": [], "schema": {}}
queryset = cycle_issues
elif module_id is not None:
module_issues = ModuleIssue.objects.filter(
**self.filters["base_filters"], module_id=module_id
).values_list("issue_id", flat=True)
module = Module.objects.filter(id=module_id).first()
if module and module.start_date:
start_date = module.start_date
end_date = module.target_date
else:
return {"data": [], "schema": {}}
queryset = module_issues
else:
project = Project.objects.filter(id=project_id).first()
if project.created_at:
start_date = project.created_at.date().replace(day=1)
else:
return {"data": [], "schema": {}}
if cycle_id or module_id:
# Get daily stats with optimized query
daily_stats = (
queryset.values("created_at__date")
.annotate(
created_count=Count("id"),
completed_count=Count(
"id", filter=Q(issue__state__group="completed")
),
)
.order_by("created_at__date")
)
# Create a dictionary of existing stats keyed by date string
stats_dict = {
stat["created_at__date"].strftime("%Y-%m-%d"): {
"created_count": stat["created_count"],
"completed_count": stat["completed_count"],
}
for stat in daily_stats
}
# Generate data for all days in the range
data = []
current_date = start_date
while current_date <= end_date:
date_str = current_date.strftime("%Y-%m-%d")
stats = stats_dict.get(
date_str, {"created_count": 0, "completed_count": 0}
)
data.append(
{
"key": date_str,
"name": date_str,
"count": stats["created_count"] + stats["completed_count"],
"completed_issues": stats["completed_count"],
"created_issues": stats["created_count"],
}
)
current_date += timedelta(days=1)
else:
# Apply date range filter if available
if self.filters["chart_period_range"]:
start_date, end_date = self.filters["chart_period_range"]
queryset = queryset.filter(
created_at__date__gte=start_date, created_at__date__lte=end_date
)
# Annotate by month and count
monthly_stats = (
queryset.annotate(month=TruncMonth("created_at"))
.values("month")
.annotate(
created_count=Count("id"),
completed_count=Count("id", filter=Q(state__group="completed")),
)
.order_by("month")
)
# Create dictionary of month -> counts
stats_dict = {
stat["month"].strftime("%Y-%m-%d"): {
"created_count": stat["created_count"],
"completed_count": stat["completed_count"],
}
for stat in monthly_stats
}
# Generate monthly data (ensure months with 0 count are included)
data = []
# include the current date at the end
end_date = timezone.now().date()
last_month = end_date.replace(day=1)
current_month = start_date
while current_month <= last_month:
date_str = current_month.strftime("%Y-%m-%d")
stats = stats_dict.get(
date_str, {"created_count": 0, "completed_count": 0}
)
data.append(
{
"key": date_str,
"name": date_str,
"count": stats["created_count"],
"completed_issues": stats["completed_count"],
"created_issues": stats["created_count"],
}
)
# Move to next month
if current_month.month == 12:
current_month = current_month.replace(
year=current_month.year + 1, month=1
)
else:
current_month = current_month.replace(month=current_month.month + 1)
schema = {
"completed_issues": "completed_issues",
"created_issues": "created_issues",
}
return {"data": data, "schema": schema}
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def get(self, request: HttpRequest, slug: str, project_id: str) -> Response:
self.initialize_workspace(slug, type="chart")
type = request.GET.get("type", "projects")
group_by = request.GET.get("group_by", None)
x_axis = request.GET.get("x_axis", "PRIORITY")
cycle_id = request.GET.get("cycle_id", None)
module_id = request.GET.get("module_id", None)
if type == "custom-work-items":
queryset = (
Issue.issue_objects.filter(**self.filters["base_filters"])
.filter(project_id=project_id)
.select_related("workspace", "state", "parent")
.prefetch_related(
"assignees", "labels", "issue_module__module", "issue_cycle__cycle"
)
)
# Apply cycle/module filters if present
if cycle_id is not None:
cycle_issues = CycleIssue.objects.filter(
**self.filters["base_filters"], cycle_id=cycle_id
).values_list("issue_id", flat=True)
queryset = queryset.filter(id__in=cycle_issues)
elif module_id is not None:
module_issues = ModuleIssue.objects.filter(
**self.filters["base_filters"], module_id=module_id
).values_list("issue_id", flat=True)
queryset = queryset.filter(id__in=module_issues)
# Apply date range filter if available
if self.filters["chart_period_range"]:
start_date, end_date = self.filters["chart_period_range"]
queryset = queryset.filter(
created_at__date__gte=start_date, created_at__date__lte=end_date
)
return Response(
build_analytics_chart(queryset, x_axis, group_by),
status=status.HTTP_200_OK,
)
elif type == "work-items":
# Optionally accept cycle_id or module_id as query params
cycle_id = request.GET.get("cycle_id", None)
module_id = request.GET.get("module_id", None)
return Response(
self.work_item_completion_chart(
project_id=project_id, cycle_id=cycle_id, module_id=module_id
),
status=status.HTTP_200_OK,
)
return Response({"message": "Invalid type"}, status=status.HTTP_400_BAD_REQUEST)

View file

@ -1,24 +1,23 @@
# Python import
from uuid import uuid4
from typing import Optional
# Third party
from rest_framework.response import Response
from rest_framework.request import Request
from rest_framework import status
# Module import
from .base import BaseAPIView
from plane.db.models import APIToken, Workspace
from plane.app.serializers import APITokenSerializer, APITokenReadSerializer
from plane.app.permissions import WorkspaceOwnerPermission
from plane.app.permissions import WorkspaceEntityPermission
class ApiTokenEndpoint(BaseAPIView):
permission_classes = [WorkspaceOwnerPermission]
def post(self, request, slug):
def post(self, request: Request) -> Response:
label = request.data.get("label", str(uuid4().hex))
description = request.data.get("description", "")
workspace = Workspace.objects.get(slug=slug)
expired_at = request.data.get("expired_at", None)
# Check the user type
@ -28,7 +27,6 @@ class ApiTokenEndpoint(BaseAPIView):
label=label,
description=description,
user=request.user,
workspace=workspace,
user_type=user_type,
expired_at=expired_at,
)
@ -37,29 +35,23 @@ class ApiTokenEndpoint(BaseAPIView):
# The token is only visible at creation time
return Response(serializer.data, status=status.HTTP_201_CREATED)
def get(self, request, slug, pk=None):
def get(self, request: Request, pk: Optional[str] = None) -> Response:
if pk is None:
api_tokens = APIToken.objects.filter(
user=request.user, workspace__slug=slug, is_service=False
)
api_tokens = APIToken.objects.filter(user=request.user, is_service=False)
serializer = APITokenReadSerializer(api_tokens, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
else:
api_tokens = APIToken.objects.get(
user=request.user, workspace__slug=slug, pk=pk
)
api_tokens = APIToken.objects.get(user=request.user, pk=pk)
serializer = APITokenReadSerializer(api_tokens)
return Response(serializer.data, status=status.HTTP_200_OK)
def delete(self, request, slug, pk):
api_token = APIToken.objects.get(
workspace__slug=slug, user=request.user, pk=pk, is_service=False
)
def delete(self, request: Request, pk: str) -> Response:
api_token = APIToken.objects.get(user=request.user, pk=pk, is_service=False)
api_token.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
def patch(self, request, slug, pk):
api_token = APIToken.objects.get(workspace__slug=slug, user=request.user, pk=pk)
def patch(self, request: Request, pk: str) -> Response:
api_token = APIToken.objects.get(user=request.user, pk=pk)
serializer = APITokenSerializer(api_token, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()
@ -68,9 +60,9 @@ class ApiTokenEndpoint(BaseAPIView):
class ServiceApiTokenEndpoint(BaseAPIView):
permission_classes = [WorkspaceOwnerPermission]
permission_classes = [WorkspaceEntityPermission]
def post(self, request, slug):
def post(self, request: Request, slug: str) -> Response:
workspace = Workspace.objects.get(slug=slug)
api_token = APIToken.objects.filter(

View file

@ -707,3 +707,67 @@ class ProjectBulkAssetEndpoint(BaseAPIView):
pass
return Response(status=status.HTTP_204_NO_CONTENT)
class AssetCheckEndpoint(BaseAPIView):
"""Endpoint to check if an asset exists."""
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST], level="WORKSPACE")
def get(self, request, slug, asset_id):
asset = FileAsset.all_objects.filter(
id=asset_id, workspace__slug=slug, deleted_at__isnull=True
).exists()
return Response({"exists": asset}, status=status.HTTP_200_OK)
class WorkspaceAssetDownloadEndpoint(BaseAPIView):
"""Endpoint to generate a download link for an asset with content-disposition=attachment."""
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST], level="WORKSPACE")
def get(self, request, slug, asset_id):
try:
asset = FileAsset.objects.get(
id=asset_id,
workspace__slug=slug,
is_uploaded=True,
)
except FileAsset.DoesNotExist:
return Response(
{"error": "The requested asset could not be found."},
status=status.HTTP_404_NOT_FOUND,
)
storage = S3Storage(request=request)
signed_url = storage.generate_presigned_url(
object_name=asset.asset.name,
disposition=f"attachment; filename={asset.asset.name}",
)
return HttpResponseRedirect(signed_url)
class ProjectAssetDownloadEndpoint(BaseAPIView):
"""Endpoint to generate a download link for an asset with content-disposition=attachment."""
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST], level="PROJECT")
def get(self, request, slug, project_id, asset_id):
try:
asset = FileAsset.objects.get(
id=asset_id,
workspace__slug=slug,
project_id=project_id,
is_uploaded=True,
)
except FileAsset.DoesNotExist:
return Response(
{"error": "The requested asset could not be found."},
status=status.HTTP_404_NOT_FOUND,
)
storage = S3Storage(request=request)
signed_url = storage.generate_presigned_url(
object_name=asset.asset.name,
disposition=f"attachment; filename={asset.asset.name}",
)
return HttpResponseRedirect(signed_url)
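# Editorial sketch (not part of this commit): both download endpoints above answer with a
# 302 redirect to a presigned URL, so a client only needs to follow redirects; the file is
# then served with Content-Disposition: attachment. The URL path and the BASE_URL /
# API_TOKEN / asset_filename names below are assumptions used purely for illustration.
#
#   import requests
#   resp = requests.get(
#       f"{BASE_URL}/workspaces/{slug}/projects/{project_id}/assets/{asset_id}/download/",
#       headers={"Authorization": f"Bearer {API_TOKEN}"},
#       allow_redirects=True,
#   )
#   open(asset_filename, "wb").write(resp.content)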

View file

@ -117,6 +117,7 @@ class CycleViewSet(BaseViewSet):
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@ -129,6 +130,7 @@ class CycleViewSet(BaseViewSet):
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@ -141,6 +143,7 @@ class CycleViewSet(BaseViewSet):
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@ -266,9 +269,7 @@ class CycleViewSet(BaseViewSet):
"created_by",
)
datetime_fields = ["start_date", "end_date"]
data = user_timezone_converter(
data, datetime_fields, project_timezone
)
data = user_timezone_converter(data, datetime_fields, project_timezone)
return Response(data, status=status.HTTP_200_OK)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
@ -415,9 +416,7 @@ class CycleViewSet(BaseViewSet):
project_timezone = project.timezone
datetime_fields = ["start_date", "end_date"]
cycle = user_timezone_converter(
cycle, datetime_fields, project_timezone
)
cycle = user_timezone_converter(cycle, datetime_fields, project_timezone)
# Send the model activity
model_activity.delay(
@ -574,16 +573,12 @@ class CycleDateCheckEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
is_start_date_end_date_equal = (
True if str(start_date) == str(end_date) else False
)
start_date = convert_to_utc(
date=str(start_date), project_id=project_id, is_start_date=True
)
end_date = convert_to_utc(
date=str(end_date),
project_id=project_id,
is_start_date_end_date_equal=is_start_date_end_date_equal,
)
# Check if any cycle intersects in the given interval
@ -668,6 +663,7 @@ class TransferCycleIssueEndpoint(BaseAPIView):
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@ -732,6 +728,7 @@ class TransferCycleIssueEndpoint(BaseAPIView):
)
)
)
old_cycle = old_cycle.first()
estimate_type = Project.objects.filter(
workspace__slug=slug,
@ -850,7 +847,7 @@ class TransferCycleIssueEndpoint(BaseAPIView):
)
estimate_completion_chart = burndown_plot(
queryset=old_cycle.first(),
queryset=old_cycle,
slug=slug,
project_id=project_id,
plot_type="points",
@ -997,7 +994,7 @@ class TransferCycleIssueEndpoint(BaseAPIView):
# Pass the old_cycle queryset to burndown_plot
completion_chart = burndown_plot(
queryset=old_cycle.first(),
queryset=old_cycle,
slug=slug,
project_id=project_id,
plot_type="issues",
@ -1009,12 +1006,12 @@ class TransferCycleIssueEndpoint(BaseAPIView):
).first()
current_cycle.progress_snapshot = {
"total_issues": old_cycle.first().total_issues,
"completed_issues": old_cycle.first().completed_issues,
"cancelled_issues": old_cycle.first().cancelled_issues,
"started_issues": old_cycle.first().started_issues,
"unstarted_issues": old_cycle.first().unstarted_issues,
"backlog_issues": old_cycle.first().backlog_issues,
"total_issues": old_cycle.total_issues,
"completed_issues": old_cycle.completed_issues,
"cancelled_issues": old_cycle.cancelled_issues,
"started_issues": old_cycle.started_issues,
"unstarted_issues": old_cycle.unstarted_issues,
"backlog_issues": old_cycle.backlog_issues,
"distribution": {
"labels": label_distribution_data,
"assignees": assignee_distribution_data,
@ -1122,6 +1119,13 @@ class CycleUserPropertiesEndpoint(BaseAPIView):
class CycleProgressEndpoint(BaseAPIView):
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def get(self, request, slug, project_id, cycle_id):
cycle = Cycle.objects.filter(
workspace__slug=slug, project_id=project_id, id=cycle_id
).first()
if not cycle:
return Response(
{"error": "Cycle not found"}, status=status.HTTP_404_NOT_FOUND
)
aggregate_estimates = (
Issue.issue_objects.filter(
estimate_point__estimate__type="points",
@ -1172,53 +1176,60 @@ class CycleProgressEndpoint(BaseAPIView):
),
)
)
if cycle.progress_snapshot:
backlog_issues = cycle.progress_snapshot.get("backlog_issues", 0)
unstarted_issues = cycle.progress_snapshot.get("unstarted_issues", 0)
started_issues = cycle.progress_snapshot.get("started_issues", 0)
cancelled_issues = cycle.progress_snapshot.get("cancelled_issues", 0)
completed_issues = cycle.progress_snapshot.get("completed_issues", 0)
total_issues = cycle.progress_snapshot.get("total_issues", 0)
else:
backlog_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="backlog",
).count()
backlog_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="backlog",
).count()
unstarted_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="unstarted",
).count()
unstarted_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="unstarted",
).count()
started_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="started",
).count()
started_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="started",
).count()
cancelled_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="cancelled",
).count()
cancelled_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="cancelled",
).count()
completed_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="completed",
).count()
completed_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="completed",
).count()
total_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
).count()
total_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
).count()
return Response(
{
@ -1279,6 +1290,25 @@ class CycleAnalyticsEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
# this tells whether the issues were transferred to the new cycle
"""
If the issues were transferred to the new cycle, the progress_snapshot will be present;
return the progress_snapshot data in the analytics for each date.
Otherwise, generate the stats from the cycle issue bridge tables.
"""
if cycle.progress_snapshot:
distribution = cycle.progress_snapshot.get("distribution", {})
return Response(
{
"labels": distribution.get("labels", []),
"assignees": distribution.get("assignees", []),
"completion_chart": distribution.get("completion_chart", {}),
},
status=status.HTTP_200_OK,
)
estimate_type = Project.objects.filter(
workspace__slug=slug,
pk=project_id,

View file

@ -29,6 +29,7 @@ from plane.utils.paginator import GroupedOffsetPaginator, SubGroupedOffsetPagina
from plane.app.permissions import allow_permission, ROLE
from plane.utils.host import base_host
class CycleIssueViewSet(BaseViewSet):
serializer_class = CycleIssueSerializer
model = CycleIssue

View file

@ -11,8 +11,7 @@ from rest_framework.response import Response
# Module import
from plane.app.permissions import ROLE, allow_permission
from plane.app.serializers import (ProjectLiteSerializer,
WorkspaceLiteSerializer)
from plane.app.serializers import ProjectLiteSerializer, WorkspaceLiteSerializer
from plane.db.models import Project, Workspace
from plane.license.utils.instance_value import get_configuration_value
from plane.utils.exception_logger import log_exception
@ -22,6 +21,7 @@ from ..base import BaseAPIView
class LLMProvider:
"""Base class for LLM provider configurations"""
name: str = ""
models: List[str] = []
default_model: str = ""
@ -34,11 +34,13 @@ class LLMProvider:
"default_model": cls.default_model,
}
class OpenAIProvider(LLMProvider):
name = "OpenAI"
models = ["gpt-3.5-turbo", "gpt-4o-mini", "gpt-4o", "o1-mini", "o1-preview"]
default_model = "gpt-4o-mini"
class AnthropicProvider(LLMProvider):
name = "Anthropic"
models = [
@ -49,40 +51,45 @@ class AnthropicProvider(LLMProvider):
"claude-2.1",
"claude-2",
"claude-instant-1.2",
"claude-instant-1"
"claude-instant-1",
]
default_model = "claude-3-sonnet-20240229"
class GeminiProvider(LLMProvider):
name = "Gemini"
models = ["gemini-pro", "gemini-1.5-pro-latest", "gemini-pro-vision"]
default_model = "gemini-pro"
SUPPORTED_PROVIDERS = {
"openai": OpenAIProvider,
"anthropic": AnthropicProvider,
"gemini": GeminiProvider,
}
def get_llm_config() -> Tuple[str | None, str | None, str | None]:
"""
Helper to get LLM configuration values, returns:
- api_key, model, provider
"""
api_key, provider_key, model = get_configuration_value([
{
"key": "LLM_API_KEY",
"default": os.environ.get("LLM_API_KEY", None),
},
{
"key": "LLM_PROVIDER",
"default": os.environ.get("LLM_PROVIDER", "openai"),
},
{
"key": "LLM_MODEL",
"default": os.environ.get("LLM_MODEL", None),
},
])
api_key, provider_key, model = get_configuration_value(
[
{
"key": "LLM_API_KEY",
"default": os.environ.get("LLM_API_KEY", None),
},
{
"key": "LLM_PROVIDER",
"default": os.environ.get("LLM_PROVIDER", "openai"),
},
{
"key": "LLM_MODEL",
"default": os.environ.get("LLM_MODEL", None),
},
]
)
provider = SUPPORTED_PROVIDERS.get(provider_key.lower())
if not provider:
@ -99,16 +106,20 @@ def get_llm_config() -> Tuple[str | None, str | None, str | None]:
# Validate model is supported by provider
if model not in provider.models:
log_exception(ValueError(
f"Model {model} not supported by {provider.name}. "
f"Supported models: {', '.join(provider.models)}"
))
log_exception(
ValueError(
f"Model {model} not supported by {provider.name}. "
f"Supported models: {', '.join(provider.models)}"
)
)
return None, None, None
return api_key, model, provider_key
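# Editorial sketch (not part of this commit) of how the endpoints further down combine
# the two helpers above; the names mirror the calls visible in this diff:
#   api_key, model, provider = get_llm_config()
#   if not (api_key and model and provider):
#       ...  # configuration is missing or invalid, bail out with an error response
#   text, error = get_llm_response(task, prompt, api_key, model, provider)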
def get_llm_response(task, prompt, api_key: str, model: str, provider: str) -> Tuple[str | None, str | None]:
def get_llm_response(
task, prompt, api_key: str, model: str, provider: str
) -> Tuple[str | None, str | None]:
"""Helper to get LLM completion response"""
final_text = task + "\n" + prompt
try:
@ -118,10 +129,7 @@ def get_llm_response(task, prompt, api_key: str, model: str, provider: str) -> T
client = OpenAI(api_key=api_key)
chat_completion = client.chat.completions.create(
model=model,
messages=[
{"role": "user", "content": final_text}
]
model=model, messages=[{"role": "user", "content": final_text}]
)
text = chat_completion.choices[0].message.content
return text, None
@ -135,6 +143,7 @@ def get_llm_response(task, prompt, api_key: str, model: str, provider: str) -> T
else:
return None, f"Error occurred while generating response from {provider}"
class GPTIntegrationEndpoint(BaseAPIView):
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def post(self, request, slug, project_id):
@ -152,7 +161,9 @@ class GPTIntegrationEndpoint(BaseAPIView):
{"error": "Task is required"}, status=status.HTTP_400_BAD_REQUEST
)
text, error = get_llm_response(task, request.data.get("prompt", False), api_key, model, provider)
text, error = get_llm_response(
task, request.data.get("prompt", False), api_key, model, provider
)
if not text and error:
return Response(
{"error": "An internal error has occurred."},
@ -190,7 +201,9 @@ class WorkspaceGPTIntegrationEndpoint(BaseAPIView):
{"error": "Task is required"}, status=status.HTTP_400_BAD_REQUEST
)
text, error = get_llm_response(task, request.data.get("prompt", False), api_key, model, provider)
text, error = get_llm_response(
task, request.data.get("prompt", False), api_key, model, provider
)
if not text and error:
return Response(
{"error": "An internal error has occurred."},

View file

@ -38,6 +38,7 @@ from plane.utils.paginator import GroupedOffsetPaginator, SubGroupedOffsetPagina
from plane.app.permissions import allow_permission, ROLE
from plane.utils.error_codes import ERROR_CODES
from plane.utils.host import base_host
# Module imports
from .. import BaseViewSet, BaseAPIView

View file

@ -23,6 +23,7 @@ from plane.settings.storage import S3Storage
from plane.bgtasks.storage_metadata_task import get_asset_object_metadata
from plane.utils.host import base_host
class IssueAttachmentEndpoint(BaseAPIView):
serializer_class = IssueAttachmentSerializer
model = FileAsset

View file

@ -32,6 +32,7 @@ from plane.app.serializers import (
IssueDetailSerializer,
IssueUserPropertySerializer,
IssueSerializer,
IssueListDetailSerializer,
)
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
@ -46,6 +47,9 @@ from plane.db.models import (
CycleIssue,
UserRecentVisit,
ModuleIssue,
IssueRelation,
IssueAssignee,
IssueLabel,
)
from plane.utils.grouper import (
issue_group_values,
@ -944,10 +948,57 @@ class IssueDetailEndpoint(BaseAPIView):
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def get(self, request, slug, project_id):
filters = issue_filters(request.query_params, "GET")
# Check the project member role: if the role is guest (5), check guest_view_all_features;
# if it is true, show all the issues, otherwise show only the issues created by the user
permission_subquery = (
Issue.issue_objects.filter(
workspace__slug=slug, project_id=project_id, id=OuterRef("id")
)
.filter(
Q(
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
project__project_projectmember__role__gt=ROLE.GUEST.value,
)
| Q(
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
project__project_projectmember__role=ROLE.GUEST.value,
project__guest_view_all_features=True,
)
| Q(
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
project__project_projectmember__role=ROLE.GUEST.value,
project__guest_view_all_features=False,
created_by=self.request.user,
)
)
.values("id")
)
# Main issue query
issue = (
Issue.issue_objects.filter(workspace__slug=slug, project_id=project_id)
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.filter(Exists(permission_subquery))
.prefetch_related(
Prefetch(
"issue_assignee",
queryset=IssueAssignee.objects.all(),
)
)
.prefetch_related(
Prefetch(
"label_issue",
queryset=IssueLabel.objects.all(),
)
)
.prefetch_related(
Prefetch(
"issue_module",
queryset=ModuleIssue.objects.all(),
)
)
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
@ -955,43 +1006,6 @@ class IssueDetailEndpoint(BaseAPIView):
).values("cycle_id")[:1]
)
)
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
module_ids=Coalesce(
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=Q(
~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True)
& Q(issue_module__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@ -1014,6 +1028,24 @@ class IssueDetailEndpoint(BaseAPIView):
.values("count")
)
)
# Add additional prefetch based on expand parameter
if self.expand:
if "issue_relation" in self.expand:
issue = issue.prefetch_related(
Prefetch(
"issue_relation",
queryset=IssueRelation.objects.select_related("related_issue"),
)
)
if "issue_related" in self.expand:
issue = issue.prefetch_related(
Prefetch(
"issue_related",
queryset=IssueRelation.objects.select_related("issue"),
)
)
issue = issue.filter(**filters)
order_by_param = request.GET.get("order_by", "-created_at")
# Issue queryset
@ -1024,7 +1056,7 @@ class IssueDetailEndpoint(BaseAPIView):
request=request,
order_by=order_by_param,
queryset=(issue),
on_results=lambda issue: IssueSerializer(
on_results=lambda issue: IssueListDetailSerializer(
issue, many=True, fields=self.fields, expand=self.expand
).data,
)

View file

@ -19,6 +19,7 @@ from plane.db.models import IssueComment, ProjectMember, CommentReaction, Projec
from plane.bgtasks.issue_activities_task import issue_activity
from plane.utils.host import base_host
class IssueCommentViewSet(BaseViewSet):
serializer_class = IssueCommentSerializer
model = IssueComment

View file

@ -15,8 +15,10 @@ from plane.app.serializers import IssueLinkSerializer
from plane.app.permissions import ProjectEntityPermission
from plane.db.models import IssueLink
from plane.bgtasks.issue_activities_task import issue_activity
from plane.bgtasks.work_item_link_task import crawl_work_item_link_title
from plane.utils.host import base_host
class IssueLinkViewSet(BaseViewSet):
permission_classes = [ProjectEntityPermission]
@ -43,6 +45,9 @@ class IssueLinkViewSet(BaseViewSet):
serializer = IssueLinkSerializer(data=request.data)
if serializer.is_valid():
serializer.save(project_id=project_id, issue_id=issue_id)
crawl_work_item_link_title.delay(
serializer.data.get("id"), serializer.data.get("url")
)
issue_activity.delay(
type="link.activity.created",
requested_data=json.dumps(serializer.data, cls=DjangoJSONEncoder),
@ -54,6 +59,10 @@ class IssueLinkViewSet(BaseViewSet):
notification=True,
origin=base_host(request=request, is_app=True),
)
issue_link = self.get_queryset().get(id=serializer.data.get("id"))
serializer = IssueLinkSerializer(issue_link)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@ -65,9 +74,14 @@ class IssueLinkViewSet(BaseViewSet):
current_instance = json.dumps(
IssueLinkSerializer(issue_link).data, cls=DjangoJSONEncoder
)
serializer = IssueLinkSerializer(issue_link, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()
crawl_work_item_link_title.delay(
serializer.data.get("id"), serializer.data.get("url")
)
issue_activity.delay(
type="link.activity.updated",
requested_data=requested_data,
@ -79,6 +93,9 @@ class IssueLinkViewSet(BaseViewSet):
notification=True,
origin=base_host(request=request, is_app=True),
)
issue_link = self.get_queryset().get(id=serializer.data.get("id"))
serializer = IssueLinkSerializer(issue_link)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

View file

@ -17,6 +17,7 @@ from plane.db.models import IssueReaction
from plane.bgtasks.issue_activities_task import issue_activity
from plane.utils.host import base_host
class IssueReactionViewSet(BaseViewSet):
serializer_class = IssueReactionSerializer
model = IssueReaction

View file

@ -29,6 +29,7 @@ from plane.bgtasks.issue_activities_task import issue_activity
from plane.utils.issue_relation_mapper import get_actual_relation
from plane.utils.host import base_host
class IssueRelationViewSet(BaseViewSet):
serializer_class = IssueRelationSerializer
model = IssueRelation

View file

@ -23,6 +23,8 @@ from plane.bgtasks.issue_activities_task import issue_activity
from plane.utils.timezone_converter import user_timezone_converter
from collections import defaultdict
from plane.utils.host import base_host
from plane.utils.order_queryset import order_issue_queryset
class SubIssuesEndpoint(BaseAPIView):
permission_classes = [ProjectEntityPermission]
@ -102,6 +104,15 @@ class SubIssuesEndpoint(BaseAPIView):
.order_by("-created_at")
)
# Ordering
order_by_param = request.GET.get("order_by", "-created_at")
group_by = request.GET.get("group_by", False)
if order_by_param:
sub_issues, order_by_param = order_issue_queryset(
sub_issues, order_by_param
)
# creates a dict mapping state group names to their respective issue ids
result = defaultdict(list)
for sub_issue in sub_issues:
@ -138,6 +149,26 @@ class SubIssuesEndpoint(BaseAPIView):
sub_issues = user_timezone_converter(
sub_issues, datetime_fields, request.user.user_timezone
)
# Grouping
if group_by:
result_dict = defaultdict(list)
for issue in sub_issues:
if group_by == "assignees__ids":
if issue["assignee_ids"]:
assignee_ids = issue["assignee_ids"]
for assignee_id in assignee_ids:
result_dict[str(assignee_id)].append(issue)
elif issue["assignee_ids"] == []:
result_dict["None"].append(issue)
elif group_by:
result_dict[str(issue[group_by])].append(issue)
return Response(
{"sub_issues": result_dict, "state_distribution": result},
status=status.HTTP_200_OK,
)
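# Illustrative shape of the grouped payload returned above (ids and values are made up).
# When group_by == "assignees__ids" a sub-issue is listed under every assignee id it
# carries, and unassigned sub-issues land under the "None" key; state_distribution maps
# state group names to their matching issue ids:
#   {
#       "sub_issues": {"<assignee-uuid>": [{...}], "None": [{...}]},
#       "state_distribution": {"backlog": [...], "started": [...]},
#   }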
return Response(
{"sub_issues": sub_issues, "state_distribution": result},
status=status.HTTP_200_OK,

View file

@ -63,6 +63,7 @@ from .. import BaseAPIView, BaseViewSet
from plane.bgtasks.recent_visited_task import recent_visited_task
from plane.utils.host import base_host
class ModuleViewSet(BaseViewSet):
model = Module
webhook_event = "module"
@ -710,23 +711,31 @@ class ModuleViewSet(BaseViewSet):
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def partial_update(self, request, slug, project_id, pk):
module = self.get_queryset().filter(pk=pk)
module_queryset = self.get_queryset().filter(pk=pk)
if module.first().archived_at:
current_module = module_queryset.first()
if not current_module:
return Response(
{"error": "Module not found"},
status=status.HTTP_404_NOT_FOUND,
)
if current_module.archived_at:
return Response(
{"error": "Archived module cannot be updated"},
status=status.HTTP_400_BAD_REQUEST,
)
current_instance = json.dumps(
ModuleSerializer(module.first()).data, cls=DjangoJSONEncoder
ModuleSerializer(current_module).data, cls=DjangoJSONEncoder
)
serializer = ModuleWriteSerializer(
module.first(), data=request.data, partial=True
current_module, data=request.data, partial=True
)
if serializer.is_valid():
serializer.save()
module = module.values(
module = module_queryset.values(
# Required fields
"id",
"workspace_id",

View file

@ -36,6 +36,7 @@ from plane.utils.paginator import GroupedOffsetPaginator, SubGroupedOffsetPagina
from .. import BaseViewSet
from plane.utils.host import base_host
class ModuleIssueViewSet(BaseViewSet):
serializer_class = ModuleIssueSerializer
model = ModuleIssue
@ -280,7 +281,11 @@ class ModuleIssueViewSet(BaseViewSet):
issue_id=str(issue_id),
project_id=str(project_id),
current_instance=json.dumps(
{"module_name": module_issue.first().module.name if (module_issue.first() and module_issue.first().module) else None}
{
"module_name": module_issue.first().module.name
if (module_issue.first() and module_issue.first().module)
else None
}
),
epoch=int(timezone.now().timestamp()),
notification=True,

View file

@ -42,6 +42,7 @@ from plane.bgtasks.page_version_task import page_version
from plane.bgtasks.recent_visited_task import recent_visited_task
from plane.bgtasks.copy_s3_object import copy_s3_objects
def unarchive_archive_page_and_descendants(page_id, archived_at):
# Your SQL query
sql = """
@ -198,7 +199,7 @@ class PageViewSet(BaseViewSet):
project = Project.objects.get(pk=project_id)
"""
if the role is guest, guest_view_all_features is false, and the page is not owned by
the requesting user, then don't show the page
"""
@ -572,6 +573,12 @@ class PageDuplicateEndpoint(BaseAPIView):
pk=page_id, workspace__slug=slug, projects__id=project_id
).first()
# check for permission
if page.access == Page.PRIVATE_ACCESS and page.owned_by_id != request.user.id:
return Response(
{"error": "Permission denied"}, status=status.HTTP_403_FORBIDDEN
)
# get all the project ids where page is present
project_ids = ProjectPage.objects.filter(page_id=page_id).values_list(
"project_id", flat=True

View file

@ -275,14 +275,14 @@ class ProjectViewSet(BaseViewSet):
states = [
{
"name": "Backlog",
"color": "#A3A3A3",
"color": "#60646C",
"sequence": 15000,
"group": "backlog",
"default": True,
},
{
"name": "Todo",
"color": "#3A3A3A",
"color": "#60646C",
"sequence": 25000,
"group": "unstarted",
},
@ -294,13 +294,13 @@ class ProjectViewSet(BaseViewSet):
},
{
"name": "Done",
"color": "#16A34A",
"color": "#46A758",
"sequence": 45000,
"group": "completed",
},
{
"name": "Cancelled",
"color": "#EF4444",
"color": "#9AA4BC",
"sequence": 55000,
"group": "cancelled",
},
@ -341,7 +341,10 @@ class ProjectViewSet(BaseViewSet):
except IntegrityError as e:
if "already exists" in str(e):
return Response(
{"name": "The project name is already taken"},
{
"name": "The project name is already taken",
"code": "PROJECT_NAME_ALREADY_EXIST",
},
status=status.HTTP_409_CONFLICT,
)
except Workspace.DoesNotExist:
@ -350,7 +353,10 @@ class ProjectViewSet(BaseViewSet):
)
except serializers.ValidationError:
return Response(
{"identifier": "The project identifier is already taken"},
{
"identifier": "The project identifier is already taken",
"code": "PROJECT_IDENTIFIER_ALREADY_EXIST",
},
status=status.HTTP_409_CONFLICT,
)
@ -445,7 +451,7 @@ class ProjectViewSet(BaseViewSet):
is_active=True,
).exists()
):
project = Project.objects.get(pk=pk)
project = Project.objects.get(pk=pk, workspace__slug=slug)
project.delete()
webhook_activity.delay(
event="project",

View file

@ -29,6 +29,7 @@ from plane.db.models import (
from plane.db.models.project import ProjectNetwork
from plane.utils.host import base_host
class ProjectInvitationsViewset(BaseViewSet):
serializer_class = ProjectMemberInviteSerializer
model = ProjectMemberInvite

View file

@ -168,6 +168,8 @@ class ProjectMemberViewSet(BaseViewSet):
workspace__slug=slug,
member__is_bot=False,
is_active=True,
member__member_workspace__workspace__slug=slug,
member__member_workspace__is_active=True,
).select_related("project", "member", "workspace")
serializer = ProjectMemberRoleSerializer(
@ -313,7 +315,11 @@ class UserProjectRolesEndpoint(BaseAPIView):
def get(self, request, slug):
project_members = ProjectMember.objects.filter(
workspace__slug=slug, member_id=request.user.id, is_active=True
workspace__slug=slug,
member_id=request.user.id,
is_active=True,
member__member_workspace__workspace__slug=slug,
member__member_workspace__is_active=True,
).values("project_id", "role")
project_members = {

View file

@ -1,5 +1,5 @@
# Django imports
from django.db.models import Q
from django.db.models import Q, QuerySet
# Third party imports
from rest_framework import status
@ -12,6 +12,95 @@ from plane.utils.issue_search import search_issues
class IssueSearchEndpoint(BaseAPIView):
def filter_issues_by_project(self, project_id: int, issues: QuerySet) -> QuerySet:
"""
Filter issues by project
"""
issues = issues.filter(project_id=project_id)
return issues
def search_issues_by_query(self, query: str, issues: QuerySet) -> QuerySet:
"""
Search issues by query
"""
issues = search_issues(query, issues)
return issues
def search_issues_and_excluding_parent(
self, issues: QuerySet, issue_id: str
) -> QuerySet:
"""
Exclude the given work item, its parent, and its sub-issues from the results
"""
issue = Issue.issue_objects.filter(pk=issue_id).first()
if issue:
issues = issues.filter(
~Q(pk=issue_id), ~Q(pk=issue.parent_id), ~Q(parent_id=issue_id)
)
return issues
def filter_issues_excluding_related_issues(
self, issue_id: str, issues: QuerySet
) -> QuerySet:
"""
Filter issues excluding related issues
"""
issue = Issue.issue_objects.filter(pk=issue_id).first()
related_issue_ids = (
IssueRelation.objects.filter(Q(related_issue=issue) | Q(issue=issue))
.values_list("issue_id", "related_issue_id")
.distinct()
)
related_issue_ids = [item for sublist in related_issue_ids for item in sublist]
if issue:
issues = issues.filter(~Q(pk=issue_id), ~Q(pk__in=related_issue_ids))
return issues
def filter_root_issues_only(self, issue_id: str, issues: QuerySet) -> QuerySet:
"""
Filter root issues only
"""
issue = Issue.issue_objects.filter(pk=issue_id).first()
if issue:
issues = issues.filter(~Q(pk=issue_id), parent__isnull=True)
if issue.parent:
issues = issues.filter(~Q(pk=issue.parent_id))
return issues
def exclude_issues_in_cycles(self, issues: QuerySet) -> QuerySet:
"""
Exclude issues in cycles
"""
issues = issues.exclude(
Q(issue_cycle__isnull=False) & Q(issue_cycle__deleted_at__isnull=True)
)
return issues
def exclude_issues_in_module(self, issues: QuerySet, module: str) -> QuerySet:
"""
Exclude issues in a module
"""
issues = issues.exclude(
Q(issue_module__module=module) & Q(issue_module__deleted_at__isnull=True)
)
return issues
def filter_issues_without_target_date(self, issues: QuerySet) -> QuerySet:
"""
Filter issues without a target date
"""
issues = issues.filter(target_date__isnull=True)
return issues
def get(self, request, slug, project_id):
query = request.query_params.get("search", False)
workspace_search = request.query_params.get("workspace_search", "false")
@ -21,7 +110,6 @@ class IssueSearchEndpoint(BaseAPIView):
module = request.query_params.get("module", False)
sub_issue = request.query_params.get("sub_issue", "false")
target_date = request.query_params.get("target_date", True)
issue_id = request.query_params.get("issue_id", False)
issues = Issue.issue_objects.filter(
@ -32,52 +120,28 @@ class IssueSearchEndpoint(BaseAPIView):
)
if workspace_search == "false":
issues = issues.filter(project_id=project_id)
issues = self.filter_issues_by_project(project_id, issues)
if query:
issues = search_issues(query, issues)
issues = self.search_issues_by_query(query, issues)
if parent == "true" and issue_id:
issue = Issue.issue_objects.filter(pk=issue_id).first()
if issue:
issues = issues.filter(
~Q(pk=issue_id), ~Q(pk=issue.parent_id), ~Q(parent_id=issue_id)
)
issues = self.search_issues_and_excluding_parent(issues, issue_id)
if issue_relation == "true" and issue_id:
issue = Issue.issue_objects.filter(pk=issue_id).first()
related_issue_ids = IssueRelation.objects.filter(
Q(related_issue=issue) | Q(issue=issue)
).values_list(
"issue_id", "related_issue_id"
).distinct()
issues = self.filter_issues_excluding_related_issues(issue_id, issues)
related_issue_ids = [item for sublist in related_issue_ids for item in sublist]
if issue:
issues = issues.filter(
~Q(pk=issue_id),
~Q(pk__in=related_issue_ids),
)
if sub_issue == "true" and issue_id:
issue = Issue.issue_objects.filter(pk=issue_id).first()
if issue:
issues = issues.filter(~Q(pk=issue_id), parent__isnull=True)
if issue.parent:
issues = issues.filter(~Q(pk=issue.parent_id))
issues = self.filter_root_issues_only(issue_id, issues)
if cycle == "true":
issues = issues.exclude(
Q(issue_cycle__isnull=False) & Q(issue_cycle__deleted_at__isnull=True)
)
issues = self.exclude_issues_in_cycles(issues)
if module:
issues = issues.exclude(
Q(issue_module__module=module)
& Q(issue_module__deleted_at__isnull=True)
)
issues = self.exclude_issues_in_module(issues, module)
if target_date == "none":
issues = issues.filter(target_date__isnull=True)
issues = self.filter_issues_without_target_date(issues)
if ProjectMember.objects.filter(
project_id=project_id, member=self.request.user, is_active=True, role=5

View file

@ -1,5 +1,6 @@
# Python imports
from itertools import groupby
from collections import defaultdict
# Django imports
from django.db.utils import IntegrityError
@ -74,7 +75,19 @@ class StateViewSet(BaseViewSet):
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def list(self, request, slug, project_id):
states = StateSerializer(self.get_queryset(), many=True).data
grouped_states = defaultdict(list)
for state in states:
grouped_states[state["group"]].append(state)
for group, group_states in grouped_states.items():
count = len(group_states)
for index, state in enumerate(group_states, start=1):
state["order"] = index / count
grouped = request.GET.get("grouped", False)
if grouped == "true":
state_dict = {}
for key, value in groupby(
@ -83,6 +96,7 @@ class StateViewSet(BaseViewSet):
):
state_dict[str(key)] = list(value)
return Response(state_dict, status=status.HTTP_200_OK)
return Response(states, status=status.HTTP_200_OK)
@invalidate_cache(path="workspaces/:slug/states/", url_params=True, user=False)

View file

@ -24,125 +24,152 @@ class TimezoneEndpoint(APIView):
@method_decorator(cache_page(60 * 60 * 2))
def get(self, request):
timezone_locations = [
('Midway Island', 'Pacific/Midway'), # UTC-11:00
('American Samoa', 'Pacific/Pago_Pago'), # UTC-11:00
('Hawaii', 'Pacific/Honolulu'), # UTC-10:00
('Aleutian Islands', 'America/Adak'), # UTC-10:00 (DST: UTC-09:00)
('Marquesas Islands', 'Pacific/Marquesas'), # UTC-09:30
('Alaska', 'America/Anchorage'), # UTC-09:00 (DST: UTC-08:00)
('Gambier Islands', 'Pacific/Gambier'), # UTC-09:00
('Pacific Time (US and Canada)', 'America/Los_Angeles'), # UTC-08:00 (DST: UTC-07:00)
('Baja California', 'America/Tijuana'), # UTC-08:00 (DST: UTC-07:00)
('Mountain Time (US and Canada)', 'America/Denver'), # UTC-07:00 (DST: UTC-06:00)
('Arizona', 'America/Phoenix'), # UTC-07:00
('Chihuahua, Mazatlan', 'America/Chihuahua'), # UTC-07:00 (DST: UTC-06:00)
('Central Time (US and Canada)', 'America/Chicago'), # UTC-06:00 (DST: UTC-05:00)
('Saskatchewan', 'America/Regina'), # UTC-06:00
('Guadalajara, Mexico City, Monterrey', 'America/Mexico_City'), # UTC-06:00 (DST: UTC-05:00)
('Tegucigalpa, Honduras', 'America/Tegucigalpa'), # UTC-06:00
('Costa Rica', 'America/Costa_Rica'), # UTC-06:00
('Eastern Time (US and Canada)', 'America/New_York'), # UTC-05:00 (DST: UTC-04:00)
('Lima', 'America/Lima'), # UTC-05:00
('Bogota', 'America/Bogota'), # UTC-05:00
('Quito', 'America/Guayaquil'), # UTC-05:00
('Chetumal', 'America/Cancun'), # UTC-05:00 (DST: UTC-04:00)
('Caracas (Old Venezuela Time)', 'America/Caracas'), # UTC-04:30
('Atlantic Time (Canada)', 'America/Halifax'), # UTC-04:00 (DST: UTC-03:00)
('Caracas', 'America/Caracas'), # UTC-04:00
('Santiago', 'America/Santiago'), # UTC-04:00 (DST: UTC-03:00)
('La Paz', 'America/La_Paz'), # UTC-04:00
('Manaus', 'America/Manaus'), # UTC-04:00
('Georgetown', 'America/Guyana'), # UTC-04:00
('Bermuda', 'Atlantic/Bermuda'), # UTC-04:00 (DST: UTC-03:00)
('Newfoundland Time (Canada)', 'America/St_Johns'), # UTC-03:30 (DST: UTC-02:30)
('Buenos Aires', 'America/Argentina/Buenos_Aires'), # UTC-03:00
('Brasilia', 'America/Sao_Paulo'), # UTC-03:00
('Greenland', 'America/Godthab'), # UTC-03:00 (DST: UTC-02:00)
('Montevideo', 'America/Montevideo'), # UTC-03:00
('Falkland Islands', 'Atlantic/Stanley'), # UTC-03:00
('South Georgia and the South Sandwich Islands', 'Atlantic/South_Georgia'), # UTC-02:00
('Azores', 'Atlantic/Azores'), # UTC-01:00 (DST: UTC+00:00)
('Cape Verde Islands', 'Atlantic/Cape_Verde'), # UTC-01:00
('Dublin', 'Europe/Dublin'), # UTC+00:00 (DST: UTC+01:00)
('Reykjavik', 'Atlantic/Reykjavik'), # UTC+00:00
('Lisbon', 'Europe/Lisbon'), # UTC+00:00 (DST: UTC+01:00)
('Monrovia', 'Africa/Monrovia'), # UTC+00:00
('Casablanca', 'Africa/Casablanca'), # UTC+00:00 (DST: UTC+01:00)
('Central European Time (Berlin, Rome, Paris)', 'Europe/Paris'), # UTC+01:00 (DST: UTC+02:00)
('West Central Africa', 'Africa/Lagos'), # UTC+01:00
('Algiers', 'Africa/Algiers'), # UTC+01:00
('Lagos', 'Africa/Lagos'), # UTC+01:00
('Tunis', 'Africa/Tunis'), # UTC+01:00
('Eastern European Time (Cairo, Helsinki, Kyiv)', 'Europe/Kiev'), # UTC+02:00 (DST: UTC+03:00)
('Athens', 'Europe/Athens'), # UTC+02:00 (DST: UTC+03:00)
('Jerusalem', 'Asia/Jerusalem'), # UTC+02:00 (DST: UTC+03:00)
('Johannesburg', 'Africa/Johannesburg'), # UTC+02:00
('Harare, Pretoria', 'Africa/Harare'), # UTC+02:00
('Moscow Time', 'Europe/Moscow'), # UTC+03:00
('Baghdad', 'Asia/Baghdad'), # UTC+03:00
('Nairobi', 'Africa/Nairobi'), # UTC+03:00
('Kuwait, Riyadh', 'Asia/Riyadh'), # UTC+03:00
('Tehran', 'Asia/Tehran'), # UTC+03:30 (DST: UTC+04:30)
('Abu Dhabi', 'Asia/Dubai'), # UTC+04:00
('Baku', 'Asia/Baku'), # UTC+04:00 (DST: UTC+05:00)
('Yerevan', 'Asia/Yerevan'), # UTC+04:00 (DST: UTC+05:00)
('Astrakhan', 'Europe/Astrakhan'), # UTC+04:00
('Tbilisi', 'Asia/Tbilisi'), # UTC+04:00
('Mauritius', 'Indian/Mauritius'), # UTC+04:00
('Islamabad', 'Asia/Karachi'), # UTC+05:00
('Karachi', 'Asia/Karachi'), # UTC+05:00
('Tashkent', 'Asia/Tashkent'), # UTC+05:00
('Yekaterinburg', 'Asia/Yekaterinburg'), # UTC+05:00
('Maldives', 'Indian/Maldives'), # UTC+05:00
('Chagos', 'Indian/Chagos'), # UTC+05:00
('Chennai', 'Asia/Kolkata'), # UTC+05:30
('Kolkata', 'Asia/Kolkata'), # UTC+05:30
('Mumbai', 'Asia/Kolkata'), # UTC+05:30
('New Delhi', 'Asia/Kolkata'), # UTC+05:30
('Sri Jayawardenepura', 'Asia/Colombo'), # UTC+05:30
('Kathmandu', 'Asia/Kathmandu'), # UTC+05:45
('Dhaka', 'Asia/Dhaka'), # UTC+06:00
('Almaty', 'Asia/Almaty'), # UTC+06:00
('Bishkek', 'Asia/Bishkek'), # UTC+06:00
('Thimphu', 'Asia/Thimphu'), # UTC+06:00
('Yangon (Rangoon)', 'Asia/Yangon'), # UTC+06:30
('Cocos Islands', 'Indian/Cocos'), # UTC+06:30
('Bangkok', 'Asia/Bangkok'), # UTC+07:00
('Hanoi', 'Asia/Ho_Chi_Minh'), # UTC+07:00
('Jakarta', 'Asia/Jakarta'), # UTC+07:00
('Novosibirsk', 'Asia/Novosibirsk'), # UTC+07:00
('Krasnoyarsk', 'Asia/Krasnoyarsk'), # UTC+07:00
('Beijing', 'Asia/Shanghai'), # UTC+08:00
('Singapore', 'Asia/Singapore'), # UTC+08:00
('Perth', 'Australia/Perth'), # UTC+08:00
('Hong Kong', 'Asia/Hong_Kong'), # UTC+08:00
('Ulaanbaatar', 'Asia/Ulaanbaatar'), # UTC+08:00
('Palau', 'Pacific/Palau'), # UTC+08:00
('Eucla', 'Australia/Eucla'), # UTC+08:45
('Tokyo', 'Asia/Tokyo'), # UTC+09:00
('Seoul', 'Asia/Seoul'), # UTC+09:00
('Yakutsk', 'Asia/Yakutsk'), # UTC+09:00
('Adelaide', 'Australia/Adelaide'), # UTC+09:30 (DST: UTC+10:30)
('Darwin', 'Australia/Darwin'), # UTC+09:30
('Sydney', 'Australia/Sydney'), # UTC+10:00 (DST: UTC+11:00)
('Brisbane', 'Australia/Brisbane'), # UTC+10:00
('Guam', 'Pacific/Guam'), # UTC+10:00
('Vladivostok', 'Asia/Vladivostok'), # UTC+10:00
('Tahiti', 'Pacific/Tahiti'), # UTC+10:00
('Lord Howe Island', 'Australia/Lord_Howe'), # UTC+10:30 (DST: UTC+11:00)
('Solomon Islands', 'Pacific/Guadalcanal'), # UTC+11:00
('Magadan', 'Asia/Magadan'), # UTC+11:00
('Norfolk Island', 'Pacific/Norfolk'), # UTC+11:00
('Bougainville Island', 'Pacific/Bougainville'), # UTC+11:00
('Chokurdakh', 'Asia/Srednekolymsk'), # UTC+11:00
('Auckland', 'Pacific/Auckland'), # UTC+12:00 (DST: UTC+13:00)
('Wellington', 'Pacific/Auckland'), # UTC+12:00 (DST: UTC+13:00)
('Fiji Islands', 'Pacific/Fiji'), # UTC+12:00 (DST: UTC+13:00)
('Anadyr', 'Asia/Anadyr'), # UTC+12:00
('Chatham Islands', 'Pacific/Chatham'), # UTC+12:45 (DST: UTC+13:45)
("Nuku'alofa", 'Pacific/Tongatapu'), # UTC+13:00
('Samoa', 'Pacific/Apia'), # UTC+13:00 (DST: UTC+14:00)
('Kiritimati Island', 'Pacific/Kiritimati') # UTC+14:00
("Midway Island", "Pacific/Midway"), # UTC-11:00
("American Samoa", "Pacific/Pago_Pago"), # UTC-11:00
("Hawaii", "Pacific/Honolulu"), # UTC-10:00
("Aleutian Islands", "America/Adak"), # UTC-10:00 (DST: UTC-09:00)
("Marquesas Islands", "Pacific/Marquesas"), # UTC-09:30
("Alaska", "America/Anchorage"), # UTC-09:00 (DST: UTC-08:00)
("Gambier Islands", "Pacific/Gambier"), # UTC-09:00
(
"Pacific Time (US and Canada)",
"America/Los_Angeles",
), # UTC-08:00 (DST: UTC-07:00)
("Baja California", "America/Tijuana"), # UTC-08:00 (DST: UTC-07:00)
(
"Mountain Time (US and Canada)",
"America/Denver",
), # UTC-07:00 (DST: UTC-06:00)
("Arizona", "America/Phoenix"), # UTC-07:00
("Chihuahua, Mazatlan", "America/Chihuahua"), # UTC-07:00 (DST: UTC-06:00)
(
"Central Time (US and Canada)",
"America/Chicago",
), # UTC-06:00 (DST: UTC-05:00)
("Saskatchewan", "America/Regina"), # UTC-06:00
(
"Guadalajara, Mexico City, Monterrey",
"America/Mexico_City",
), # UTC-06:00 (DST: UTC-05:00)
("Tegucigalpa, Honduras", "America/Tegucigalpa"), # UTC-06:00
("Costa Rica", "America/Costa_Rica"), # UTC-06:00
(
"Eastern Time (US and Canada)",
"America/New_York",
), # UTC-05:00 (DST: UTC-04:00)
("Lima", "America/Lima"), # UTC-05:00
("Bogota", "America/Bogota"), # UTC-05:00
("Quito", "America/Guayaquil"), # UTC-05:00
("Chetumal", "America/Cancun"), # UTC-05:00 (DST: UTC-04:00)
("Caracas (Old Venezuela Time)", "America/Caracas"), # UTC-04:30
("Atlantic Time (Canada)", "America/Halifax"), # UTC-04:00 (DST: UTC-03:00)
("Caracas", "America/Caracas"), # UTC-04:00
("Santiago", "America/Santiago"), # UTC-04:00 (DST: UTC-03:00)
("La Paz", "America/La_Paz"), # UTC-04:00
("Manaus", "America/Manaus"), # UTC-04:00
("Georgetown", "America/Guyana"), # UTC-04:00
("Bermuda", "Atlantic/Bermuda"), # UTC-04:00 (DST: UTC-03:00)
(
"Newfoundland Time (Canada)",
"America/St_Johns",
), # UTC-03:30 (DST: UTC-02:30)
("Buenos Aires", "America/Argentina/Buenos_Aires"), # UTC-03:00
("Brasilia", "America/Sao_Paulo"), # UTC-03:00
("Greenland", "America/Godthab"), # UTC-03:00 (DST: UTC-02:00)
("Montevideo", "America/Montevideo"), # UTC-03:00
("Falkland Islands", "Atlantic/Stanley"), # UTC-03:00
(
"South Georgia and the South Sandwich Islands",
"Atlantic/South_Georgia",
), # UTC-02:00
("Azores", "Atlantic/Azores"), # UTC-01:00 (DST: UTC+00:00)
("Cape Verde Islands", "Atlantic/Cape_Verde"), # UTC-01:00
("Dublin", "Europe/Dublin"), # UTC+00:00 (DST: UTC+01:00)
("Reykjavik", "Atlantic/Reykjavik"), # UTC+00:00
("Lisbon", "Europe/Lisbon"), # UTC+00:00 (DST: UTC+01:00)
("Monrovia", "Africa/Monrovia"), # UTC+00:00
("Casablanca", "Africa/Casablanca"), # UTC+00:00 (DST: UTC+01:00)
(
"Central European Time (Berlin, Rome, Paris)",
"Europe/Paris",
), # UTC+01:00 (DST: UTC+02:00)
("West Central Africa", "Africa/Lagos"), # UTC+01:00
("Algiers", "Africa/Algiers"), # UTC+01:00
("Lagos", "Africa/Lagos"), # UTC+01:00
("Tunis", "Africa/Tunis"), # UTC+01:00
(
"Eastern European Time (Cairo, Helsinki, Kyiv)",
"Europe/Kiev",
), # UTC+02:00 (DST: UTC+03:00)
("Athens", "Europe/Athens"), # UTC+02:00 (DST: UTC+03:00)
("Jerusalem", "Asia/Jerusalem"), # UTC+02:00 (DST: UTC+03:00)
("Johannesburg", "Africa/Johannesburg"), # UTC+02:00
("Harare, Pretoria", "Africa/Harare"), # UTC+02:00
("Moscow Time", "Europe/Moscow"), # UTC+03:00
("Baghdad", "Asia/Baghdad"), # UTC+03:00
("Nairobi", "Africa/Nairobi"), # UTC+03:00
("Kuwait, Riyadh", "Asia/Riyadh"), # UTC+03:00
("Tehran", "Asia/Tehran"), # UTC+03:30 (DST: UTC+04:30)
("Abu Dhabi", "Asia/Dubai"), # UTC+04:00
("Baku", "Asia/Baku"), # UTC+04:00 (DST: UTC+05:00)
("Yerevan", "Asia/Yerevan"), # UTC+04:00 (DST: UTC+05:00)
("Astrakhan", "Europe/Astrakhan"), # UTC+04:00
("Tbilisi", "Asia/Tbilisi"), # UTC+04:00
("Mauritius", "Indian/Mauritius"), # UTC+04:00
("Islamabad", "Asia/Karachi"), # UTC+05:00
("Karachi", "Asia/Karachi"), # UTC+05:00
("Tashkent", "Asia/Tashkent"), # UTC+05:00
("Yekaterinburg", "Asia/Yekaterinburg"), # UTC+05:00
("Maldives", "Indian/Maldives"), # UTC+05:00
("Chagos", "Indian/Chagos"), # UTC+05:00
("Chennai", "Asia/Kolkata"), # UTC+05:30
("Kolkata", "Asia/Kolkata"), # UTC+05:30
("Mumbai", "Asia/Kolkata"), # UTC+05:30
("New Delhi", "Asia/Kolkata"), # UTC+05:30
("Sri Jayawardenepura", "Asia/Colombo"), # UTC+05:30
("Kathmandu", "Asia/Kathmandu"), # UTC+05:45
("Dhaka", "Asia/Dhaka"), # UTC+06:00
("Almaty", "Asia/Almaty"), # UTC+06:00
("Bishkek", "Asia/Bishkek"), # UTC+06:00
("Thimphu", "Asia/Thimphu"), # UTC+06:00
("Yangon (Rangoon)", "Asia/Yangon"), # UTC+06:30
("Cocos Islands", "Indian/Cocos"), # UTC+06:30
("Bangkok", "Asia/Bangkok"), # UTC+07:00
("Hanoi", "Asia/Ho_Chi_Minh"), # UTC+07:00
("Jakarta", "Asia/Jakarta"), # UTC+07:00
("Novosibirsk", "Asia/Novosibirsk"), # UTC+07:00
("Krasnoyarsk", "Asia/Krasnoyarsk"), # UTC+07:00
("Beijing", "Asia/Shanghai"), # UTC+08:00
("Singapore", "Asia/Singapore"), # UTC+08:00
("Perth", "Australia/Perth"), # UTC+08:00
("Hong Kong", "Asia/Hong_Kong"), # UTC+08:00
("Ulaanbaatar", "Asia/Ulaanbaatar"), # UTC+08:00
("Palau", "Pacific/Palau"), # UTC+08:00
("Eucla", "Australia/Eucla"), # UTC+08:45
("Tokyo", "Asia/Tokyo"), # UTC+09:00
("Seoul", "Asia/Seoul"), # UTC+09:00
("Yakutsk", "Asia/Yakutsk"), # UTC+09:00
("Adelaide", "Australia/Adelaide"), # UTC+09:30 (DST: UTC+10:30)
("Darwin", "Australia/Darwin"), # UTC+09:30
("Sydney", "Australia/Sydney"), # UTC+10:00 (DST: UTC+11:00)
("Brisbane", "Australia/Brisbane"), # UTC+10:00
("Guam", "Pacific/Guam"), # UTC+10:00
("Vladivostok", "Asia/Vladivostok"), # UTC+10:00
("Tahiti", "Pacific/Tahiti"), # UTC+10:00
("Lord Howe Island", "Australia/Lord_Howe"), # UTC+10:30 (DST: UTC+11:00)
("Solomon Islands", "Pacific/Guadalcanal"), # UTC+11:00
("Magadan", "Asia/Magadan"), # UTC+11:00
("Norfolk Island", "Pacific/Norfolk"), # UTC+11:00
("Bougainville Island", "Pacific/Bougainville"), # UTC+11:00
("Chokurdakh", "Asia/Srednekolymsk"), # UTC+11:00
("Auckland", "Pacific/Auckland"), # UTC+12:00 (DST: UTC+13:00)
("Wellington", "Pacific/Auckland"), # UTC+12:00 (DST: UTC+13:00)
("Fiji Islands", "Pacific/Fiji"), # UTC+12:00 (DST: UTC+13:00)
("Anadyr", "Asia/Anadyr"), # UTC+12:00
("Chatham Islands", "Pacific/Chatham"), # UTC+12:45 (DST: UTC+13:45)
("Nuku'alofa", "Pacific/Tongatapu"), # UTC+13:00
("Samoa", "Pacific/Apia"), # UTC+13:00 (DST: UTC+14:00)
("Kiritimati Island", "Pacific/Kiritimati"), # UTC+14:00
]
timezone_list = []
@ -150,7 +177,6 @@ class TimezoneEndpoint(APIView):
# Process timezone mapping
for friendly_name, tz_identifier in timezone_locations:
try:
tz = pytz.timezone(tz_identifier)
current_offset = now.astimezone(tz).strftime("%z")
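
The hunk above pairs each friendly location name with its IANA identifier and then derives the live UTC offset through pytz. A minimal standalone sketch of that pattern, with the helper name `offset_label` chosen for illustration only:

```python
# Sketch only: render a (friendly name, IANA identifier) pair with its
# current UTC offset, so DST-aware zones pick up the right value at runtime.
from datetime import datetime

import pytz

def offset_label(friendly_name: str, tz_identifier: str) -> str:
    tz = pytz.timezone(tz_identifier)
    raw = datetime.now(pytz.utc).astimezone(tz).strftime("%z")  # e.g. "-0500"
    return f"{friendly_name} (UTC{raw[:3]}:{raw[3:]})"

print(offset_label("Bogota", "America/Bogota"))     # Bogota (UTC-05:00)
print(offset_label("Kathmandu", "Asia/Kathmandu"))  # Kathmandu (UTC+05:45)
```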

View file

@ -1,8 +1,13 @@
# Django imports
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.db.models import Exists, F, Func, OuterRef, Q, UUIDField, Value, Subquery
from django.db.models.functions import Coalesce
from django.db.models import (
Exists,
F,
Func,
OuterRef,
Q,
Subquery,
Prefetch,
)
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
from django.db import transaction
@ -13,7 +18,7 @@ from rest_framework.response import Response
# Module imports
from plane.app.permissions import allow_permission, ROLE
from plane.app.serializers import IssueViewSerializer
from plane.app.serializers import IssueViewSerializer, ViewIssueListSerializer
from plane.db.models import (
Issue,
FileAsset,
@ -25,15 +30,12 @@ from plane.db.models import (
Project,
CycleIssue,
UserRecentVisit,
)
from plane.utils.grouper import (
issue_group_values,
issue_on_results,
issue_queryset_grouper,
IssueAssignee,
IssueLabel,
ModuleIssue,
)
from plane.utils.issue_filters import issue_filters
from plane.utils.order_queryset import order_issue_queryset
from plane.utils.paginator import GroupedOffsetPaginator, SubGroupedOffsetPaginator
from plane.bgtasks.recent_visited_task import recent_visited_task
from .. import BaseViewSet
from plane.db.models import UserFavorite
@ -143,6 +145,28 @@ class WorkspaceViewViewSet(BaseViewSet):
class WorkspaceViewIssuesViewSet(BaseViewSet):
def _get_project_permission_filters(self):
"""
Get common project permission filters for guest users and role-based access control.
Returns Q object for filtering issues based on user role and project settings.
"""
return Q(
Q(
project__project_projectmember__role=5,
project__guest_view_all_features=True,
)
| Q(
project__project_projectmember__role=5,
project__guest_view_all_features=False,
created_by=self.request.user,
)
|
# For other roles (role > 5), show all issues
Q(project__project_projectmember__role__gt=5),
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
)
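
The permission filter above combines three Q branches and ANDs them with an active-membership check. A plain-Python restatement of the same rule, using hypothetical data shapes rather than the Django models:

```python
# Sketch only: the role-based visibility rule expressed without the ORM.
# Active project membership is assumed here; the Q object enforces it
# separately via project__project_projectmember__is_active=True.
GUEST_ROLE = 5

def can_see_issue(member_role: int, guest_view_all: bool,
                  issue_creator_id: str, user_id: str) -> bool:
    if member_role > GUEST_ROLE:
        # Members and admins see every issue in the project
        return True
    if member_role == GUEST_ROLE:
        # Guests see everything only when the project opts in;
        # otherwise only the issues they created themselves
        return guest_view_all or issue_creator_id == user_id
    return False

assert can_see_issue(20, False, "u2", "u1") is True
assert can_see_issue(5, True, "u2", "u1") is True
assert can_see_issue(5, False, "u1", "u1") is True
assert can_see_issue(5, False, "u2", "u1") is False
```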
def get_queryset(self):
return (
Issue.issue_objects.annotate(
@ -152,12 +176,25 @@ class WorkspaceViewIssuesViewSet(BaseViewSet):
.values("count")
)
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
.select_related("state")
.prefetch_related(
Prefetch(
"issue_assignee",
queryset=IssueAssignee.objects.all(),
)
)
.prefetch_related(
Prefetch(
"label_issue",
queryset=IssueLabel.objects.all(),
)
)
.prefetch_related(
Prefetch(
"issue_module",
queryset=ModuleIssue.objects.all(),
)
)
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
@ -186,43 +223,6 @@ class WorkspaceViewIssuesViewSet(BaseViewSet):
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
module_ids=Coalesce(
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=Q(
~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True)
& Q(issue_module__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
)
@method_decorator(gzip_page)
@ -233,126 +233,36 @@ class WorkspaceViewIssuesViewSet(BaseViewSet):
filters = issue_filters(request.query_params, "GET")
order_by_param = request.GET.get("order_by", "-created_at")
issue_queryset = (
self.get_queryset()
.filter(**filters)
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
issue_queryset = self.get_queryset().filter(**filters)
# Get common project permission filters
permission_filters = self._get_project_permission_filters()
# Base query for the counts
total_issue_count = (
Issue.issue_objects.filter(**filters)
.filter(workspace__slug=slug)
.filter(permission_filters)
.only("id")
)
# check for the project member role, if the role is 5 then check for the guest_view_all_features if it is true then show all the issues else show only the issues created by the user
issue_queryset = issue_queryset.filter(
Q(
project__project_projectmember__role=5,
project__guest_view_all_features=True,
)
| Q(
project__project_projectmember__role=5,
project__guest_view_all_features=False,
created_by=self.request.user,
)
|
# For other roles (role < 5), show all issues
Q(project__project_projectmember__role__gt=5),
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
)
# Apply project permission filters to the issue queryset
issue_queryset = issue_queryset.filter(permission_filters)
# Issue queryset
issue_queryset, order_by_param = order_issue_queryset(
issue_queryset=issue_queryset, order_by_param=order_by_param
)
# Group by
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
# issue queryset
issue_queryset = issue_queryset_grouper(
queryset=issue_queryset, group_by=group_by, sub_group_by=sub_group_by
# List Paginate
return self.paginate(
order_by=order_by_param,
request=request,
queryset=issue_queryset,
on_results=lambda issues: ViewIssueListSerializer(issues, many=True).data,
total_count_queryset=total_issue_count,
)
if group_by:
# Check group and sub group value paginate
if sub_group_by:
if group_by == sub_group_by:
return Response(
{
"error": "Group by and sub group by cannot have same parameters"
},
status=status.HTTP_400_BAD_REQUEST,
)
else:
# group and sub group pagination
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by, issues=issues, sub_group_by=sub_group_by
),
paginator_cls=SubGroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by, slug=slug, project_id=None, filters=filters
),
sub_group_by_fields=issue_group_values(
field=sub_group_by,
slug=slug,
project_id=None,
filters=filters,
),
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
Q(issue_intake__status=1)
| Q(issue_intake__status=-1)
| Q(issue_intake__status=2)
| Q(issue_intake__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
# Group Paginate
else:
# Group paginate
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by, issues=issues, sub_group_by=sub_group_by
),
paginator_cls=GroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by, slug=slug, project_id=None, filters=filters
),
group_by_field_name=group_by,
count_filter=Q(
Q(issue_intake__status=1)
| Q(issue_intake__status=-1)
| Q(issue_intake__status=2)
| Q(issue_intake__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
else:
# List Paginate
return self.paginate(
order_by=order_by_param,
request=request,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by, issues=issues, sub_group_by=sub_group_by
),
)
class IssueViewViewSet(BaseViewSet):
serializer_class = IssueViewSerializer

View file

@ -3,6 +3,7 @@ import csv
import io
import os
from datetime import date
import uuid
from dateutil.relativedelta import relativedelta
from django.db import IntegrityError
@ -35,6 +36,7 @@ from plane.db.models import (
Workspace,
WorkspaceMember,
WorkspaceTheme,
Profile,
)
from plane.app.permissions import ROLE, allow_permission
from django.utils.decorators import method_decorator
@ -42,6 +44,8 @@ from django.views.decorators.cache import cache_control
from django.views.decorators.vary import vary_on_cookie
from plane.utils.constants import RESTRICTED_WORKSPACE_SLUGS
from plane.license.utils.instance_value import get_configuration_value
from plane.bgtasks.workspace_seed_task import workspace_seed
from plane.utils.url import contains_url
class WorkSpaceViewSet(BaseViewSet):
@ -108,6 +112,12 @@ class WorkSpaceViewSet(BaseViewSet):
status=status.HTTP_400_BAD_REQUEST,
)
if contains_url(name):
return Response(
{"error": "Name cannot contain a URL"},
status=status.HTTP_400_BAD_REQUEST,
)
if serializer.is_valid(raise_exception=True):
serializer.save(owner=request.user)
# Create Workspace member
@ -126,6 +136,8 @@ class WorkSpaceViewSet(BaseViewSet):
data["total_members"] = total_members
data["role"] = 20
workspace_seed.delay(serializer.data["id"])
return Response(data, status=status.HTTP_201_CREATED)
return Response(
[serializer.errors[error][0] for error in serializer.errors],
@ -147,8 +159,18 @@ class WorkSpaceViewSet(BaseViewSet):
def partial_update(self, request, *args, **kwargs):
return super().partial_update(request, *args, **kwargs)
def remove_last_workspace_ids_from_user_settings(self, id: uuid.UUID) -> None:
"""
Remove the last workspace id from the user settings
"""
Profile.objects.filter(last_workspace_id=id).update(last_workspace_id=None)
return
@allow_permission([ROLE.ADMIN], level="WORKSPACE")
def destroy(self, request, *args, **kwargs):
# Get the workspace
workspace = self.get_object()
self.remove_last_workspace_ids_from_user_settings(workspace.id)
return super().destroy(request, *args, **kwargs)
@ -156,8 +178,6 @@ class UserWorkSpacesEndpoint(BaseAPIView):
search_fields = ["name"]
filterset_fields = ["owner"]
@method_decorator(cache_control(private=True, max_age=12))
@method_decorator(vary_on_cookie)
def get(self, request):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
member_count = (

View file

@ -12,6 +12,7 @@ from plane.app.permissions import WorkspaceViewerPermission
from plane.app.serializers.cycle import CycleSerializer
from plane.utils.timezone_converter import user_timezone_converter
class WorkspaceCyclesEndpoint(BaseAPIView):
permission_classes = [WorkspaceViewerPermission]
@ -29,6 +30,7 @@ class WorkspaceCyclesEndpoint(BaseAPIView):
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)

View file

@ -38,6 +38,7 @@ from plane.bgtasks.issue_activities_task import issue_activity
from plane.utils.issue_filters import issue_filters
from plane.utils.host import base_host
class WorkspaceDraftIssueViewSet(BaseViewSet):
model = DraftIssue

View file

@ -1,5 +1,6 @@
# Django imports
from django.db.models import Count, Q, OuterRef, Subquery, IntegerField
from django.utils import timezone
from django.db.models.functions import Coalesce
# Third party modules
@ -133,7 +134,7 @@ class WorkSpaceMemberViewSet(BaseViewSet):
# Deactivate the users from the projects where the user is part of
_ = ProjectMember.objects.filter(
workspace__slug=slug, member_id=workspace_member.member_id, is_active=True
).update(is_active=False)
).update(is_active=False, updated_at=timezone.now())
workspace_member.is_active = False
workspace_member.save()
@ -194,7 +195,7 @@ class WorkSpaceMemberViewSet(BaseViewSet):
# # Deactivate the users from the projects where the user is part of
_ = ProjectMember.objects.filter(
workspace__slug=slug, member_id=workspace_member.member_id, is_active=True
).update(is_active=False)
).update(is_active=False, updated_at=timezone.now())
# # Deactivate the user
workspace_member.is_active = False

View file

@ -8,6 +8,7 @@ from plane.app.views.base import BaseAPIView
from plane.db.models import State
from plane.app.permissions import WorkspaceEntityPermission
from plane.utils.cache import cache_response
from collections import defaultdict
class WorkspaceStatesEndpoint(BaseAPIView):
@ -22,5 +23,16 @@ class WorkspaceStatesEndpoint(BaseAPIView):
project__archived_at__isnull=True,
is_triage=False,
)
grouped_states = defaultdict(list)
for state in states:
grouped_states[state.group].append(state)
for group, group_states in grouped_states.items():
count = len(group_states)
for index, state in enumerate(group_states, start=1):
state.order = index / count
serializer = StateSerializer(states, many=True).data
return Response(serializer, status=status.HTTP_200_OK)
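
The grouping added above assigns each state a fractional order within its group. A standalone sketch with sample data showing the resulting values:

```python
# Sketch only: the n-th of N states in a group gets order n / N, so orders
# inside each group are evenly spaced and always end at 1.0.
from collections import defaultdict

states = [
    {"name": "Todo", "group": "unstarted"},
    {"name": "In Progress", "group": "started"},
    {"name": "In Review", "group": "started"},
    {"name": "Done", "group": "completed"},
]

grouped_states = defaultdict(list)
for state in states:
    grouped_states[state["group"]].append(state)

for group, group_states in grouped_states.items():
    count = len(group_states)
    for index, state in enumerate(group_states, start=1):
        state["order"] = index / count

print({s["name"]: s["order"] for s in states})
# {'Todo': 1.0, 'In Progress': 0.5, 'In Review': 1.0, 'Done': 1.0}
```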

View file

@ -27,10 +27,7 @@ class WorkspaceUserPreferenceViewSet(BaseAPIView):
create_preference_keys = []
keys = [
key
for key, _ in WorkspaceUserPreference.UserPreferenceKeys.choices
]
keys = [key for key, _ in WorkspaceUserPreference.UserPreferenceKeys.choices]
for preference in keys:
if preference not in get_preference.values_list("key", flat=True):
@ -39,7 +36,10 @@ class WorkspaceUserPreferenceViewSet(BaseAPIView):
preference = WorkspaceUserPreference.objects.bulk_create(
[
WorkspaceUserPreference(
key=key, user=request.user, workspace=workspace, sort_order=(65535 + (i*10000))
key=key,
user=request.user,
workspace=workspace,
sort_order=(65535 + (i * 10000)),
)
for i, key in enumerate(create_preference_keys)
],
@ -47,10 +47,13 @@ class WorkspaceUserPreferenceViewSet(BaseAPIView):
ignore_conflicts=True,
)
preferences = WorkspaceUserPreference.objects.filter(
user=request.user, workspace_id=workspace.id
).order_by("sort_order").values("key", "is_pinned", "sort_order")
preferences = (
WorkspaceUserPreference.objects.filter(
user=request.user, workspace_id=workspace.id
)
.order_by("sort_order")
.values("key", "is_pinned", "sort_order")
)
user_preferences = {}
@ -58,7 +61,7 @@ class WorkspaceUserPreferenceViewSet(BaseAPIView):
user_preferences[(str(preference["key"]))] = {
"is_pinned": preference["is_pinned"],
"sort_order": preference["sort_order"],
}
}
return Response(
user_preferences,
status=status.HTTP_200_OK,

View file

@ -18,6 +18,7 @@ from plane.bgtasks.user_activation_email_task import user_activation_email
from plane.utils.host import base_host
from plane.utils.ip_address import get_client_ip
class Adapter:
"""Common interface for all auth providers"""

View file

@ -41,7 +41,6 @@ AUTHENTICATION_ERROR_CODES = {
"GOOGLE_OAUTH_PROVIDER_ERROR": 5115,
"GITHUB_OAUTH_PROVIDER_ERROR": 5120,
"GITLAB_OAUTH_PROVIDER_ERROR": 5121,
# Reset Password
"INVALID_PASSWORD_TOKEN": 5125,
"EXPIRED_PASSWORD_TOKEN": 5130,

View file

@ -25,23 +25,24 @@ class GitHubOAuthProvider(OauthAdapter):
organization_scope = "read:org"
def __init__(self, request, code=None, state=None, callback=None):
GITHUB_CLIENT_ID, GITHUB_CLIENT_SECRET, GITHUB_ORGANIZATION_ID = get_configuration_value(
[
{
"key": "GITHUB_CLIENT_ID",
"default": os.environ.get("GITHUB_CLIENT_ID"),
},
{
"key": "GITHUB_CLIENT_SECRET",
"default": os.environ.get("GITHUB_CLIENT_SECRET"),
},
{
"key": "GITHUB_ORGANIZATION_ID",
"default": os.environ.get("GITHUB_ORGANIZATION_ID"),
},
]
GITHUB_CLIENT_ID, GITHUB_CLIENT_SECRET, GITHUB_ORGANIZATION_ID = (
get_configuration_value(
[
{
"key": "GITHUB_CLIENT_ID",
"default": os.environ.get("GITHUB_CLIENT_ID"),
},
{
"key": "GITHUB_CLIENT_SECRET",
"default": os.environ.get("GITHUB_CLIENT_SECRET"),
},
{
"key": "GITHUB_ORGANIZATION_ID",
"default": os.environ.get("GITHUB_ORGANIZATION_ID"),
},
]
)
)
if not (GITHUB_CLIENT_ID and GITHUB_CLIENT_SECRET):
@ -128,7 +129,10 @@ class GitHubOAuthProvider(OauthAdapter):
def is_user_in_organization(self, github_username):
headers = {"Authorization": f"Bearer {self.token_data.get('access_token')}"}
response = requests.get(f"{self.org_membership_url}/{self.organization_id}/memberships/{github_username}", headers=headers)
response = requests.get(
f"{self.org_membership_url}/{self.organization_id}/memberships/{github_username}",
headers=headers,
)
return response.status_code == 200 # 200 means the user is a member
def set_user_data(self):
@ -145,7 +149,6 @@ class GitHubOAuthProvider(OauthAdapter):
error_message="GITHUB_USER_NOT_IN_ORG",
)
email = self.__get_email(headers=headers)
super().set_user_data(
{

View file

@ -42,11 +42,11 @@ urlpatterns = [
# credentials
path("sign-in/", SignInAuthEndpoint.as_view(), name="sign-in"),
path("sign-up/", SignUpAuthEndpoint.as_view(), name="sign-up"),
path("spaces/sign-in/", SignInAuthSpaceEndpoint.as_view(), name="sign-in"),
path("spaces/sign-up/", SignUpAuthSpaceEndpoint.as_view(), name="sign-in"),
path("spaces/sign-in/", SignInAuthSpaceEndpoint.as_view(), name="space-sign-in"),
path("spaces/sign-up/", SignUpAuthSpaceEndpoint.as_view(), name="space-sign-up"),
# signout
path("sign-out/", SignOutAuthEndpoint.as_view(), name="sign-out"),
path("spaces/sign-out/", SignOutAuthSpaceEndpoint.as_view(), name="sign-out"),
path("spaces/sign-out/", SignOutAuthSpaceEndpoint.as_view(), name="space-sign-out"),
# csrf token
path("get-csrf-token/", CSRFTokenEndpoint.as_view(), name="get_csrf_token"),
# Magic sign in
@ -56,17 +56,17 @@ urlpatterns = [
path(
"spaces/magic-generate/",
MagicGenerateSpaceEndpoint.as_view(),
name="magic-generate",
name="space-magic-generate",
),
path(
"spaces/magic-sign-in/",
MagicSignInSpaceEndpoint.as_view(),
name="magic-sign-in",
name="space-magic-sign-in",
),
path(
"spaces/magic-sign-up/",
MagicSignUpSpaceEndpoint.as_view(),
name="magic-sign-up",
name="space-magic-sign-up",
),
## Google Oauth
path("google/", GoogleOauthInitiateEndpoint.as_view(), name="google-initiate"),
@ -74,12 +74,12 @@ urlpatterns = [
path(
"spaces/google/",
GoogleOauthInitiateSpaceEndpoint.as_view(),
name="google-initiate",
name="space-google-initiate",
),
path(
"google/callback/",
"spaces/google/callback/",
GoogleCallbackSpaceEndpoint.as_view(),
name="google-callback",
name="space-google-callback",
),
## Github Oauth
path("github/", GitHubOauthInitiateEndpoint.as_view(), name="github-initiate"),
@ -87,12 +87,12 @@ urlpatterns = [
path(
"spaces/github/",
GitHubOauthInitiateSpaceEndpoint.as_view(),
name="github-initiate",
name="space-github-initiate",
),
path(
"spaces/github/callback/",
GitHubCallbackSpaceEndpoint.as_view(),
name="github-callback",
name="space-github-callback",
),
## Gitlab Oauth
path("gitlab/", GitLabOauthInitiateEndpoint.as_view(), name="gitlab-initiate"),
@ -100,12 +100,12 @@ urlpatterns = [
path(
"spaces/gitlab/",
GitLabOauthInitiateSpaceEndpoint.as_view(),
name="gitlab-initiate",
name="space-gitlab-initiate",
),
path(
"spaces/gitlab/callback/",
GitLabCallbackSpaceEndpoint.as_view(),
name="gitlab-callback",
name="space-gitlab-callback",
),
# Email Check
path("email-check/", EmailCheckEndpoint.as_view(), name="email-check"),
@ -120,12 +120,12 @@ urlpatterns = [
path(
"spaces/forgot-password/",
ForgotPasswordSpaceEndpoint.as_view(),
name="forgot-password",
name="space-forgot-password",
),
path(
"spaces/reset-password/<uidb64>/<token>/",
ResetPasswordSpaceEndpoint.as_view(),
name="forgot-password",
name="space-forgot-password",
),
path("change-password/", ChangePasswordEndpoint.as_view(), name="forgot-password"),
path("set-password/", SetUserPasswordEndpoint.as_view(), name="set-password"),

View file

@ -1,30 +1,53 @@
# Django imports
from django.conf import settings
from django.http import HttpRequest
# Third party imports
from rest_framework.request import Request
# Module imports
from plane.utils.ip_address import get_client_ip
def base_host(request: Request | HttpRequest, is_admin: bool = False, is_space: bool = False, is_app: bool = False) -> str:
def base_host(
request: Request | HttpRequest,
is_admin: bool = False,
is_space: bool = False,
is_app: bool = False,
) -> str:
"""Utility function to return host / origin from the request"""
# Calculate the base origin from request
base_origin = settings.WEB_URL or settings.APP_BASE_URL
# Admin redirections
# Admin redirection
if is_admin:
if settings.ADMIN_BASE_URL:
return settings.ADMIN_BASE_URL
else:
return base_origin + "/god-mode/"
admin_base_path = getattr(settings, "ADMIN_BASE_PATH", None)
if not isinstance(admin_base_path, str):
admin_base_path = "/god-mode/"
if not admin_base_path.startswith("/"):
admin_base_path = "/" + admin_base_path
if not admin_base_path.endswith("/"):
admin_base_path += "/"
# Space redirections
if is_space:
if settings.SPACE_BASE_URL:
return settings.SPACE_BASE_URL
if settings.ADMIN_BASE_URL:
return settings.ADMIN_BASE_URL + admin_base_path
else:
return base_origin + "/spaces/"
return base_origin + admin_base_path
# Space redirection
if is_space:
space_base_path = getattr(settings, "SPACE_BASE_PATH", None)
if not isinstance(space_base_path, str):
space_base_path = "/spaces/"
if not space_base_path.startswith("/"):
space_base_path = "/" + space_base_path
if not space_base_path.endswith("/"):
space_base_path += "/"
if settings.SPACE_BASE_URL:
return settings.SPACE_BASE_URL + space_base_path
else:
return base_origin + space_base_path
# App Redirection
if is_app:

View file

@ -6,6 +6,7 @@ from django.conf import settings
from plane.utils.host import base_host
from plane.utils.ip_address import get_client_ip
def user_login(request, user, is_app=False, is_admin=False, is_space=False):
login(request=request, user=user)

View file

@ -21,6 +21,7 @@ from plane.authentication.adapter.error import (
)
from plane.utils.path_validator import validate_next_path
class SignInAuthEndpoint(View):
def post(self, request):
next_path = request.POST.get("next_path")

View file

@ -18,6 +18,7 @@ from plane.authentication.adapter.error import (
)
from plane.utils.path_validator import validate_next_path
class GitHubOauthInitiateEndpoint(View):
def get(self, request):
# Get host and next path

View file

@ -18,6 +18,7 @@ from plane.authentication.adapter.error import (
)
from plane.utils.path_validator import validate_next_path
class GitLabOauthInitiateEndpoint(View):
def get(self, request):
# Get host and next path

View file

@ -20,6 +20,7 @@ from plane.authentication.adapter.error import (
)
from plane.utils.path_validator import validate_next_path
class GoogleOauthInitiateEndpoint(View):
def get(self, request):
request.session["host"] = base_host(request=request, is_app=True)
@ -95,7 +96,9 @@ class GoogleCallbackEndpoint(View):
# Get the redirection path
path = get_redirection_path(user=user)
# redirect to referer path
url = urljoin(base_host, str(validate_next_path(next_path)) if next_path else path)
url = urljoin(
base_host, str(validate_next_path(next_path)) if next_path else path
)
return HttpResponseRedirect(url)
except AuthenticationException as e:
params = e.get_error_dict()

View file

@ -53,12 +53,14 @@ class ChangePasswordEndpoint(APIView):
error_message="MISSING_PASSWORD",
payload={"error": "Old password is missing"},
)
return Response(exc.get_error_dict(), status=status.HTTP_400_BAD_REQUEST)
return Response(
exc.get_error_dict(), status=status.HTTP_400_BAD_REQUEST
)
# Get the new password
new_password = request.data.get("new_password", False)
if not new_password:
if not new_password:
exc = AuthenticationException(
error_code=AUTHENTICATION_ERROR_CODES["MISSING_PASSWORD"],
error_message="MISSING_PASSWORD",
@ -66,7 +68,6 @@ class ChangePasswordEndpoint(APIView):
)
return Response(exc.get_error_dict(), status=status.HTTP_400_BAD_REQUEST)
# If the user password is not autoset then we need to check the old passwords
if not user.is_password_autoset and not user.check_password(old_password):
exc = AuthenticationException(

View file

@ -25,6 +25,7 @@ from plane.authentication.adapter.error import (
)
from plane.utils.path_validator import validate_next_path
class MagicGenerateSpaceEndpoint(APIView):
permission_classes = [AllowAny]
@ -38,7 +39,6 @@ class MagicGenerateSpaceEndpoint(APIView):
)
return Response(exc.get_error_dict(), status=status.HTTP_400_BAD_REQUEST)
email = request.data.get("email", "").strip().lower()
try:
validate_email(email)

View file

@ -459,8 +459,37 @@ def analytic_export_task(email, data, slug):
csv_buffer = generate_csv_from_rows(rows)
send_export_email(email, slug, csv_buffer, rows)
logging.getLogger("plane").info("Email sent succesfully.")
logging.getLogger("plane.worker").info("Email sent successfully.")
except Exception as e:
log_exception(e)
return
@shared_task
def export_analytics_to_csv_email(data, headers, keys, email, slug):
try:
"""
Prepares a CSV from data and sends it as an email attachment.
Parameters:
- data: List of dictionaries (e.g. from .values())
- headers: List of CSV column headers
- keys: Keys to extract from each data item (dict)
- email: Email address to send to
- slug: Used for the filename
"""
# Prepare rows: header + data rows
rows = [headers]
for item in data:
row = [item.get(key, "") for key in keys]
rows.append(row)
# Generate CSV buffer
csv_buffer = generate_csv_from_rows(rows)
# Send email with CSV attachment
send_export_email(email=email, slug=slug, csv_buffer=csv_buffer, rows=rows)
except Exception as e:
log_exception(e)
return
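
The new `export_analytics_to_csv_email` task takes pre-flattened rows, so callers only supply the data, the column headers, and the keys to pull from each dict. An illustrative invocation; the sample values and email address are made up, and a running Celery worker with the task imported is assumed:

```python
# Sketch only: queue an analytics export for the "acme" workspace.
data = [
    {"state": "Backlog", "count": 12},
    {"state": "Done", "count": 7},
]
headers = ["State", "Work item count"]
keys = ["state", "count"]

export_analytics_to_csv_email.delay(
    data=data,
    headers=headers,
    keys=keys,
    email="reports@example.com",
    slug="acme",
)
```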

View file

@ -12,6 +12,7 @@ from plane.db.models import FileAsset, Page, Issue
from plane.utils.exception_logger import log_exception
from plane.settings.storage import S3Storage
from celery import shared_task
from plane.utils.url import normalize_url_path
def get_entity_id_field(entity_type, entity_id):
@ -67,11 +68,14 @@ def sync_with_external_service(entity_name, description_html):
"description_html": description_html,
"variant": "rich" if entity_name == "PAGE" else "document",
}
response = requests.post(
f"{settings.LIVE_BASE_URL}/convert-document/",
json=data,
headers=None,
)
live_url = settings.LIVE_URL
if not live_url:
return {}
url = normalize_url_path(f"{live_url}/convert-document/")
response = requests.post(url, json=data, headers=None)
if response.status_code == 200:
return response.json()
except requests.RequestException as e:

View file

@ -33,6 +33,7 @@ from plane.db.models import (
Intake,
IntakeIssue,
)
from plane.db.models.intake import SourceType
def create_project(workspace, user_id):
@ -388,7 +389,7 @@ def create_intake_issues(workspace, project, user_id, intake_issue_count):
if status == 0
else None
),
source="in-app",
source=SourceType.IN_APP,
workspace=workspace,
project=project,
)

View file

@ -284,6 +284,7 @@ def send_email_notification(
"project": str(issue.project.name),
"user_preference": f"{base_api}/profile/preferences/email",
"comments": comments,
"entity_type": "issue",
}
html_content = render_to_string(
"emails/notifications/issue-updates.html", context
@ -309,7 +310,7 @@ def send_email_notification(
)
msg.attach_alternative(html_content, "text/html")
msg.send()
logging.getLogger("plane").info("Email Sent Successfully")
logging.getLogger("plane.worker").info("Email Sent Successfully")
# Update the logs
EmailNotificationLog.objects.filter(
@ -325,7 +326,7 @@ def send_email_notification(
release_lock(lock_id=lock_id)
return
else:
logging.getLogger("plane").info("Duplicate email received skipping")
logging.getLogger("plane.worker").info("Duplicate email received skipping")
return
except (Issue.DoesNotExist, User.DoesNotExist):
release_lock(lock_id=lock_id)

View file

@ -3,34 +3,49 @@ import csv
import io
import json
import zipfile
from typing import List
import boto3
from botocore.client import Config
from uuid import UUID
from datetime import datetime, date
# Third party imports
from celery import shared_task
# Django imports
from django.conf import settings
from django.utils import timezone
from openpyxl import Workbook
from django.db.models import F, Prefetch
from collections import defaultdict
# Module imports
from plane.db.models import ExporterHistory, Issue
from plane.db.models import ExporterHistory, Issue, FileAsset, Label, User, IssueComment
from plane.utils.exception_logger import log_exception
def dateTimeConverter(time):
def dateTimeConverter(time: datetime) -> str | None:
"""
Convert a datetime object to a formatted string.
"""
if time:
return time.strftime("%a, %d %b %Y %I:%M:%S %Z%z")
def dateConverter(time):
def dateConverter(time: date) -> str | None:
"""
Convert a date object to a formatted string.
"""
if time:
return time.strftime("%a, %d %b %Y")
def create_csv_file(data):
def create_csv_file(data: List[List[str]]) -> str:
"""
Create a CSV file from the provided data.
"""
csv_buffer = io.StringIO()
csv_writer = csv.writer(csv_buffer, delimiter=",", quoting=csv.QUOTE_ALL)
@ -41,11 +56,17 @@ def create_csv_file(data):
return csv_buffer.getvalue()
def create_json_file(data):
def create_json_file(data: List[dict]) -> str:
"""
Create a JSON file from the provided data.
"""
return json.dumps(data)
def create_xlsx_file(data):
def create_xlsx_file(data: List[List[str]]) -> bytes:
"""
Create an XLSX file from the provided data.
"""
workbook = Workbook()
sheet = workbook.active
@ -58,7 +79,10 @@ def create_xlsx_file(data):
return xlsx_buffer.getvalue()
def create_zip_file(files):
def create_zip_file(files: List[tuple[str, str | bytes]]) -> io.BytesIO:
"""
Create a ZIP file from the provided files.
"""
zip_buffer = io.BytesIO()
with zipfile.ZipFile(zip_buffer, "w", zipfile.ZIP_DEFLATED) as zipf:
for filename, file_content in files:
@ -68,7 +92,13 @@ def create_zip_file(files):
return zip_buffer
def upload_to_s3(zip_file, workspace_id, token_id, slug):
# TODO: Change the upload_to_s3 function to use the new storage method with entry in file asset table
def upload_to_s3(
zip_file: io.BytesIO, workspace_id: UUID, token_id: str, slug: str
) -> None:
"""
Upload a ZIP file to S3 and generate a presigned URL.
"""
file_name = (
f"{workspace_id}/export-{slug}-{token_id[:6]}-{str(timezone.now().date())}.zip"
)
@ -150,75 +180,85 @@ def upload_to_s3(zip_file, workspace_id, token_id, slug):
exporter_instance.save(update_fields=["status", "url", "key"])
def generate_table_row(issue):
def generate_table_row(issue: dict) -> List[str]:
"""
Generate a table row from an issue dictionary.
"""
return [
f"""{issue["project__identifier"]}-{issue["sequence_id"]}""",
issue["project__name"],
f"""{issue["project_identifier"]}-{issue["sequence_id"]}""",
issue["project_name"],
issue["name"],
issue["description_stripped"],
issue["state__name"],
issue["description"],
issue["state_name"],
dateConverter(issue["start_date"]),
dateConverter(issue["target_date"]),
issue["priority"],
(
f"{issue['created_by__first_name']} {issue['created_by__last_name']}"
if issue["created_by__first_name"] and issue["created_by__last_name"]
else ""
),
(
f"{issue['assignees__first_name']} {issue['assignees__last_name']}"
if issue["assignees__first_name"] and issue["assignees__last_name"]
else ""
),
issue["labels__name"] if issue["labels__name"] else "",
issue["issue_cycle__cycle__name"],
dateConverter(issue["issue_cycle__cycle__start_date"]),
dateConverter(issue["issue_cycle__cycle__end_date"]),
issue["issue_module__module__name"],
dateConverter(issue["issue_module__module__start_date"]),
dateConverter(issue["issue_module__module__target_date"]),
issue["created_by"],
", ".join(issue["labels"]) if issue["labels"] else "",
issue["cycle_name"],
issue["cycle_start_date"],
issue["cycle_end_date"],
", ".join(issue.get("module_name", "")) if issue.get("module_name") else "",
dateTimeConverter(issue["created_at"]),
dateTimeConverter(issue["updated_at"]),
dateTimeConverter(issue["completed_at"]),
dateTimeConverter(issue["archived_at"]),
(
", ".join(
[
f"{comment['comment']} ({comment['created_at']} by {comment['created_by']})"
for comment in issue["comments"]
]
)
if issue["comments"]
else ""
),
issue["estimate"] if issue["estimate"] else "",
", ".join(issue["link"]) if issue["link"] else "",
", ".join(issue["assignees"]) if issue["assignees"] else "",
issue["subscribers_count"] if issue["subscribers_count"] else "",
issue["attachment_count"] if issue["attachment_count"] else "",
", ".join(issue["attachment_links"]) if issue["attachment_links"] else "",
]
def generate_json_row(issue):
def generate_json_row(issue: dict) -> dict:
"""
Generate a JSON row from an issue dictionary.
"""
return {
"ID": f"""{issue["project__identifier"]}-{issue["sequence_id"]}""",
"Project": issue["project__name"],
"ID": f"""{issue["project_identifier"]}-{issue["sequence_id"]}""",
"Project": issue["project_name"],
"Name": issue["name"],
"Description": issue["description_stripped"],
"State": issue["state__name"],
"Description": issue["description"],
"State": issue["state_name"],
"Start Date": dateConverter(issue["start_date"]),
"Target Date": dateConverter(issue["target_date"]),
"Priority": issue["priority"],
"Created By": (
f"{issue['created_by__first_name']} {issue['created_by__last_name']}"
if issue["created_by__first_name"] and issue["created_by__last_name"]
else ""
),
"Assignee": (
f"{issue['assignees__first_name']} {issue['assignees__last_name']}"
if issue["assignees__first_name"] and issue["assignees__last_name"]
else ""
),
"Labels": issue["labels__name"] if issue["labels__name"] else "",
"Cycle Name": issue["issue_cycle__cycle__name"],
"Cycle Start Date": dateConverter(issue["issue_cycle__cycle__start_date"]),
"Cycle End Date": dateConverter(issue["issue_cycle__cycle__end_date"]),
"Module Name": issue["issue_module__module__name"],
"Module Start Date": dateConverter(issue["issue_module__module__start_date"]),
"Module Target Date": dateConverter(issue["issue_module__module__target_date"]),
"Created By": (f"{issue['created_by']}" if issue["created_by"] else ""),
"Assignee": issue["assignees"],
"Labels": issue["labels"],
"Cycle Name": issue["cycle_name"],
"Cycle Start Date": issue["cycle_start_date"],
"Cycle End Date": issue["cycle_end_date"],
"Module Name": issue["module_name"],
"Created At": dateTimeConverter(issue["created_at"]),
"Updated At": dateTimeConverter(issue["updated_at"]),
"Completed At": dateTimeConverter(issue["completed_at"]),
"Archived At": dateTimeConverter(issue["archived_at"]),
"Comments": issue["comments"],
"Estimate": issue["estimate"],
"Link": issue["link"],
"Subscribers Count": issue["subscribers_count"],
"Attachment Count": issue["attachment_count"],
"Attachment Links": issue["attachment_links"],
}
def update_json_row(rows, row):
def update_json_row(rows: List[dict], row: dict) -> None:
"""
Update the json row with the new assignee and label.
"""
matched_index = next(
(
index
@ -247,7 +287,10 @@ def update_json_row(rows, row):
rows.append(row)
def update_table_row(rows, row):
def update_table_row(rows: List[List[str]], row: List[str]) -> None:
"""
Update the table row with the new assignee and label.
"""
matched_index = next(
(index for index, existing_row in enumerate(rows) if existing_row[0] == row[0]),
None,
@ -269,7 +312,12 @@ def update_table_row(rows, row):
rows.append(row)
def generate_csv(header, project_id, issues, files):
def generate_csv(
header: List[str],
project_id: str,
issues: List[dict],
files: List[tuple[str, str | bytes]],
) -> None:
"""
Generate CSV export for all the passed issues.
"""
@ -281,7 +329,15 @@ def generate_csv(header, project_id, issues, files):
files.append((f"{project_id}.csv", csv_file))
def generate_json(header, project_id, issues, files):
def generate_json(
header: List[str],
project_id: str,
issues: List[dict],
files: List[tuple[str, str | bytes]],
) -> None:
"""
Generate JSON export for all the passed issues.
"""
rows = []
for issue in issues:
row = generate_json_row(issue)
@ -290,68 +346,169 @@ def generate_json(header, project_id, issues, files):
files.append((f"{project_id}.json", json_file))
def generate_xlsx(header, project_id, issues, files):
def generate_xlsx(
header: List[str],
project_id: str,
issues: List[dict],
files: List[tuple[str, str | bytes]],
) -> None:
"""
Generate XLSX export for all the passed issues.
"""
rows = [header]
for issue in issues:
row = generate_table_row(issue)
update_table_row(rows, row)
xlsx_file = create_xlsx_file(rows)
files.append((f"{project_id}.xlsx", xlsx_file))
def get_created_by(obj: Issue | IssueComment) -> str:
"""
Get the created by user for the given object.
"""
if obj.created_by:
return f"{obj.created_by.first_name} {obj.created_by.last_name}"
return ""
@shared_task
def issue_export_task(provider, workspace_id, project_ids, token_id, multiple, slug):
def issue_export_task(
provider: str,
workspace_id: UUID,
project_ids: List[str],
token_id: str,
multiple: bool,
slug: str,
):
"""
Export issues from the workspace.
provider (str): The export format to use: csv | json | xlsx.
token_id (str): The export object token id.
multiple (bool): Whether to export the issues to multiple files per project.
"""
try:
exporter_instance = ExporterHistory.objects.get(token=token_id)
exporter_instance.status = "processing"
exporter_instance.save(update_fields=["status"])
# Base query to get the issues
workspace_issues = (
(
Issue.objects.filter(
workspace__id=workspace_id,
project_id__in=project_ids,
project__project_projectmember__member=exporter_instance.initiated_by_id,
project__project_projectmember__is_active=True,
project__archived_at__isnull=True,
)
.select_related("project", "workspace", "state", "parent", "created_by")
.prefetch_related(
"assignees", "labels", "issue_cycle__cycle", "issue_module__module"
)
.values(
"id",
"project__identifier",
"project__name",
"project__id",
"sequence_id",
"name",
"description_stripped",
"priority",
"start_date",
"target_date",
"state__name",
"created_at",
"updated_at",
"completed_at",
"archived_at",
"issue_cycle__cycle__name",
"issue_cycle__cycle__start_date",
"issue_cycle__cycle__end_date",
"issue_module__module__name",
"issue_module__module__start_date",
"issue_module__module__target_date",
"created_by__first_name",
"created_by__last_name",
"assignees__first_name",
"assignees__last_name",
"labels__name",
)
Issue.objects.filter(
workspace__id=workspace_id,
project_id__in=project_ids,
project__project_projectmember__member=exporter_instance.initiated_by_id,
project__project_projectmember__is_active=True,
project__archived_at__isnull=True,
)
.select_related(
"project",
"workspace",
"state",
"parent",
"created_by",
"estimate_point",
)
.prefetch_related(
"labels",
"issue_cycle__cycle",
"issue_module__module",
"issue_comments",
"assignees",
Prefetch(
"assignees",
queryset=User.objects.only("first_name", "last_name").distinct(),
to_attr="assignee_details",
),
Prefetch(
"labels",
queryset=Label.objects.only("name").distinct(),
to_attr="label_details",
),
"issue_subscribers",
"issue_link",
)
.order_by("project__identifier", "sequence_id")
.distinct()
)
# CSV header
# Get the attachments for the issues
file_assets = FileAsset.objects.filter(
issue_id__in=workspace_issues.values_list("id", flat=True),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
).annotate(work_item_id=F("issue_id"), asset_id=F("id"))
# Create a dictionary to store the attachments for the issues
attachment_dict = defaultdict(list)
for asset in file_assets:
attachment_dict[asset.work_item_id].append(asset.asset_id)
# Create a list to store the issues data
issues_data = []
# Iterate over the issues
for issue in workspace_issues:
attachments = attachment_dict.get(issue.id, [])
issue_data = {
"id": issue.id,
"project_identifier": issue.project.identifier,
"project_name": issue.project.name,
"project_id": issue.project.id,
"sequence_id": issue.sequence_id,
"name": issue.name,
"description": issue.description_stripped,
"priority": issue.priority,
"start_date": issue.start_date,
"target_date": issue.target_date,
"state_name": issue.state.name if issue.state else None,
"created_at": issue.created_at,
"updated_at": issue.updated_at,
"completed_at": issue.completed_at,
"archived_at": issue.archived_at,
"module_name": [
module.module.name for module in issue.issue_module.all()
],
"created_by": get_created_by(issue),
"labels": [label.name for label in issue.label_details],
"comments": [
{
"comment": comment.comment_stripped,
"created_at": dateConverter(comment.created_at),
"created_by": get_created_by(comment),
}
for comment in issue.issue_comments.all()
],
"estimate": issue.estimate_point.value
if issue.estimate_point and issue.estimate_point.value
else "",
"link": [link.url for link in issue.issue_link.all()],
"assignees": [
f"{assignee.first_name} {assignee.last_name}"
for assignee in issue.assignee_details
],
"subscribers_count": issue.issue_subscribers.count(),
"attachment_count": len(attachments),
"attachment_links": [
f"/api/assets/v2/workspaces/{issue.workspace.slug}/projects/{issue.project_id}/issues/{issue.id}/attachments/{asset}/"
for asset in attachments
],
}
# Get Cycles data for the issue
cycle = issue.issue_cycle.last()
if cycle:
# Update cycle data
issue_data["cycle_name"] = cycle.cycle.name
issue_data["cycle_start_date"] = dateConverter(cycle.cycle.start_date)
issue_data["cycle_end_date"] = dateConverter(cycle.cycle.end_date)
else:
issue_data["cycle_name"] = ""
issue_data["cycle_start_date"] = ""
issue_data["cycle_end_date"] = ""
issues_data.append(issue_data)
# CSV header
header = [
"ID",
"Project",
@ -362,20 +519,25 @@ def issue_export_task(provider, workspace_id, project_ids, token_id, multiple, s
"Target Date",
"Priority",
"Created By",
"Assignee",
"Labels",
"Cycle Name",
"Cycle Start Date",
"Cycle End Date",
"Module Name",
"Module Start Date",
"Module Target Date",
"Created At",
"Updated At",
"Completed At",
"Archived At",
"Comments",
"Estimate",
"Link",
"Assignees",
"Subscribers Count",
"Attachment Count",
"Attachment Links",
]
# Map the provider to the function
EXPORTER_MAPPER = {
"csv": generate_csv,
"json": generate_json,
@ -384,8 +546,13 @@ def issue_export_task(provider, workspace_id, project_ids, token_id, multiple, s
files = []
if multiple:
project_dict = defaultdict(list)
for issue in issues_data:
project_dict[str(issue["project_id"])].append(issue)
for project_id in project_ids:
issues = workspace_issues.filter(project__id=project_id)
issues = project_dict.get(str(project_id), [])
exporter = EXPORTER_MAPPER.get(provider)
if exporter is not None:
exporter(header, project_id, issues, files)
@ -393,7 +560,7 @@ def issue_export_task(provider, workspace_id, project_ids, token_id, multiple, s
else:
exporter = EXPORTER_MAPPER.get(provider)
if exporter is not None:
exporter(header, workspace_id, workspace_issues, files)
exporter(header, workspace_id, issues_data, files)
zip_buffer = create_zip_file(files)
upload_to_s3(zip_buffer, workspace_id, token_id, slug)
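
For the multi-file branch, the reworked export groups the flattened issue dicts by project before handing each slice to the provider-specific exporter. A standalone sketch of that grouping step, with invented sample issues:

```python
# Sketch only: bucket issue dicts by project id so each project can be
# written to its own CSV/JSON/XLSX file.
from collections import defaultdict

issues_data = [
    {"project_id": "p1", "name": "Fix login"},
    {"project_id": "p2", "name": "Add export"},
    {"project_id": "p1", "name": "Update docs"},
]

project_dict = defaultdict(list)
for issue in issues_data:
    project_dict[str(issue["project_id"])].append(issue)

print({pid: [i["name"] for i in rows] for pid, rows in project_dict.items()})
# {'p1': ['Fix login', 'Update docs'], 'p2': ['Add export']}
```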

View file

@ -63,7 +63,7 @@ def forgot_password(first_name, email, uidb64, token, current_site):
)
msg.attach_alternative(html_content, "text/html")
msg.send()
logging.getLogger("plane").info("Email sent successfully")
logging.getLogger("plane.worker").info("Email sent successfully")
return
except Exception as e:
log_exception(e)

View file

@ -1650,40 +1650,6 @@ def issue_activity(
# Save all the values to database
issue_activities_created = IssueActivity.objects.bulk_create(issue_activities)
# Post the updates to segway for integrations and webhooks
if len(issue_activities_created):
for activity in issue_activities_created:
webhook_activity.delay(
event=(
"issue_comment"
if activity.field == "comment"
else "intake_issue"
if intake
else "issue"
),
event_id=(
activity.issue_comment_id
if activity.field == "comment"
else intake
if intake
else activity.issue_id
),
verb=activity.verb,
field=(
"description" if activity.field == "comment" else activity.field
),
old_value=(
activity.old_value if activity.old_value != "" else None
),
new_value=(
activity.new_value if activity.new_value != "" else None
),
actor_id=activity.actor_id,
current_site=origin,
slug=activity.workspace.slug,
old_identifier=activity.old_identifier,
new_identifier=activity.new_identifier,
)
if notification:
notifications.delay(

View file

@ -53,7 +53,7 @@ def magic_link(email, key, token):
)
msg.attach_alternative(html_content, "text/html")
msg.send()
logging.getLogger("plane").info("Email sent successfully.")
logging.getLogger("plane.worker").info("Email sent successfully.")
return
except Exception as e:
log_exception(e)

View file

@ -80,7 +80,7 @@ def project_add_user_email(current_site, project_member_id, invitor_id):
# Send the email
msg.send()
# Log the success
logging.getLogger("plane").info("Email sent successfully.")
logging.getLogger("plane.worker").info("Email sent successfully.")
return
except Exception as e:
log_exception(e)

View file

@ -76,7 +76,7 @@ def project_invitation(email, project_id, token, current_site, invitor):
msg.attach_alternative(html_content, "text/html")
msg.send()
logging.getLogger("plane").info("Email sent successfully.")
logging.getLogger("plane.worker").info("Email sent successfully.")
return
except (Project.DoesNotExist, ProjectMemberInvite.DoesNotExist):
return

View file

@ -58,7 +58,7 @@ def user_activation_email(current_site, user_id):
msg.attach_alternative(html_content, "text/html")
msg.send()
logging.getLogger("plane").info("Email sent successfully.")
logging.getLogger("plane.worker").info("Email sent successfully.")
return
except Exception as e:
log_exception(e)

View file

@ -60,7 +60,7 @@ def user_deactivation_email(current_site, user_id):
# Attach HTML content
msg.attach_alternative(html_content, "text/html")
msg.send()
logging.getLogger("plane").info("Email sent successfully.")
logging.getLogger("plane.worker").info("Email sent successfully.")
return
except Exception as e:
log_exception(e)

View file

@ -5,6 +5,7 @@ import logging
import uuid
import requests
from typing import Any, Dict, List, Optional, Union
# Third party imports
from celery import shared_task
@ -70,150 +71,89 @@ MODEL_MAPPER = {
}
def get_model_data(event, event_id, many=False):
logger = logging.getLogger("plane.worker")
def get_model_data(
event: str, event_id: Union[str, List[str]], many: bool = False
) -> Dict[str, Any]:
"""
Retrieve and serialize model data based on the event type.
Args:
event (str): The type of event/model to retrieve data for
event_id (Union[str, List[str]]): The ID or list of IDs of the model instance(s)
many (bool): Whether to retrieve multiple instances
Returns:
Dict[str, Any]: Serialized model data
Raises:
ValueError: If serializer is not found for the event
ObjectDoesNotExist: If model instance is not found
"""
model = MODEL_MAPPER.get(event)
if many:
queryset = model.objects.filter(pk__in=event_id)
else:
queryset = model.objects.get(pk=event_id)
serializer = SERIALIZER_MAPPER.get(event)
return serializer(queryset, many=many).data
if model is None:
raise ValueError(f"Model not found for event: {event}")
@shared_task(
bind=True,
autoretry_for=(requests.RequestException,),
retry_backoff=600,
max_retries=5,
retry_jitter=True,
)
def webhook_task(self, webhook, slug, event, event_data, action, current_site):
try:
webhook = Webhook.objects.get(id=webhook, workspace__slug=slug)
if many:
queryset = model.objects.filter(pk__in=event_id)
else:
queryset = model.objects.get(pk=event_id)
headers = {
"Content-Type": "application/json",
"User-Agent": "Autopilot",
"X-Plane-Delivery": str(uuid.uuid4()),
"X-Plane-Event": event,
}
serializer = SERIALIZER_MAPPER.get(event)
if serializer is None:
raise ValueError(f"Serializer not found for event: {event}")
# # Your secret key
event_data = (
json.loads(json.dumps(event_data, cls=DjangoJSONEncoder))
if event_data is not None
else None
)
action = {
"POST": "create",
"PATCH": "update",
"PUT": "update",
"DELETE": "delete",
}.get(action, action)
payload = {
"event": event,
"action": action,
"webhook_id": str(webhook.id),
"workspace_id": str(webhook.workspace_id),
"data": event_data,
}
# Use HMAC for generating signature
if webhook.secret_key:
hmac_signature = hmac.new(
webhook.secret_key.encode("utf-8"),
json.dumps(payload).encode("utf-8"),
hashlib.sha256,
)
signature = hmac_signature.hexdigest()
headers["X-Plane-Signature"] = signature
# Send the webhook event
response = requests.post(webhook.url, headers=headers, json=payload, timeout=30)
# Log the webhook request
WebhookLog.objects.create(
workspace_id=str(webhook.workspace_id),
webhook=str(webhook.id),
event_type=str(event),
request_method=str(action),
request_headers=str(headers),
request_body=str(payload),
response_status=str(response.status_code),
response_headers=str(response.headers),
response_body=str(response.text),
retry_count=str(self.request.retries),
)
except Webhook.DoesNotExist:
return
except requests.RequestException as e:
# Log the failed webhook request
WebhookLog.objects.create(
workspace_id=str(webhook.workspace_id),
webhook=str(webhook.id),
event_type=str(event),
request_method=str(action),
request_headers=str(headers),
request_body=str(payload),
response_status=500,
response_headers="",
response_body=str(e),
retry_count=str(self.request.retries),
)
# Retry logic
if self.request.retries >= self.max_retries:
Webhook.objects.filter(pk=webhook.id).update(is_active=False)
if webhook:
# send email for the deactivation of the webhook
send_webhook_deactivation_email(
webhook_id=webhook.id,
receiver_id=webhook.created_by_id,
reason=str(e),
current_site=current_site,
)
return
raise requests.RequestException()
except Exception as e:
if settings.DEBUG:
print(e)
log_exception(e)
return
return serializer(queryset, many=many).data
except ObjectDoesNotExist:
raise ObjectDoesNotExist(f"No {event} found with id: {event_id}")
@shared_task
def send_webhook_deactivation_email(webhook_id, receiver_id, current_site, reason):
# Get email configurations
(
EMAIL_HOST,
EMAIL_HOST_USER,
EMAIL_HOST_PASSWORD,
EMAIL_PORT,
EMAIL_USE_TLS,
EMAIL_USE_SSL,
EMAIL_FROM,
) = get_email_configuration()
receiver = User.objects.get(pk=receiver_id)
webhook = Webhook.objects.get(pk=webhook_id)
subject = "Webhook Deactivated"
message = f"Webhook {webhook.url} has been deactivated due to failed requests."
# Send the mail
context = {
"email": receiver.email,
"message": message,
"webhook_url": f"{current_site}/{str(webhook.workspace.slug)}/settings/webhooks/{str(webhook.id)}",
}
html_content = render_to_string(
"emails/notifications/webhook-deactivate.html", context
)
text_content = strip_tags(html_content)
def send_webhook_deactivation_email(
webhook_id: str, receiver_id: str, current_site: str, reason: str
) -> None:
"""
Send an email notification when a webhook is deactivated.
Args:
webhook_id (str): ID of the deactivated webhook
receiver_id (str): ID of the user to receive the notification
current_site (str): Current site URL
reason (str): Reason for webhook deactivation
"""
try:
(
EMAIL_HOST,
EMAIL_HOST_USER,
EMAIL_HOST_PASSWORD,
EMAIL_PORT,
EMAIL_USE_TLS,
EMAIL_USE_SSL,
EMAIL_FROM,
) = get_email_configuration()
receiver = User.objects.get(pk=receiver_id)
webhook = Webhook.objects.get(pk=webhook_id)
# Get the webhook payload
subject = "Webhook Deactivated"
message = f"Webhook {webhook.url} has been deactivated due to failed requests."
# Send the mail
context = {
"email": receiver.email,
"message": message,
"webhook_url": f"{current_site}/{str(webhook.workspace.slug)}/settings/webhooks/{str(webhook.id)}",
}
html_content = render_to_string(
"emails/notifications/webhook-deactivate.html", context
)
text_content = strip_tags(html_content)
# Set the email connection
connection = get_connection(
host=EMAIL_HOST,
port=int(EMAIL_PORT),
@ -223,6 +163,7 @@ def send_webhook_deactivation_email(webhook_id, receiver_id, current_site, reaso
use_ssl=EMAIL_USE_SSL == "1",
)
# Create the email message
msg = EmailMultiAlternatives(
subject=subject,
body=text_content,
@ -232,11 +173,10 @@ def send_webhook_deactivation_email(webhook_id, receiver_id, current_site, reaso
)
msg.attach_alternative(html_content, "text/html")
msg.send()
logging.getLogger("plane").info("Email sent successfully.")
return
logger.info("Email sent successfully.")
except Exception as e:
log_exception(e)
return
logger.error(f"Failed to send email: {e}")
@shared_task(
@ -247,10 +187,29 @@ def send_webhook_deactivation_email(webhook_id, receiver_id, current_site, reaso
retry_jitter=True,
)
def webhook_send_task(
self, webhook, slug, event, event_data, action, current_site, activity
):
self,
webhook_id: str,
slug: str,
event: str,
event_data: Optional[Dict[str, Any]],
action: str,
current_site: str,
activity: Optional[Dict[str, Any]],
) -> None:
"""
Send webhook notifications to configured endpoints.
Args:
webhook (str): Webhook ID
slug (str): Workspace slug
event (str): Event type
event_data (Optional[Dict[str, Any]]): Event data to be sent
action (str): HTTP method/action
current_site (str): Current site URL
activity (Optional[Dict[str, Any]]): Activity data
"""
try:
webhook = Webhook.objects.get(id=webhook, workspace__slug=slug)
webhook = Webhook.objects.get(id=webhook_id, workspace__slug=slug)
headers = {
"Content-Type": "application/json",
@ -297,7 +256,12 @@ def webhook_send_task(
)
signature = hmac_signature.hexdigest()
headers["X-Plane-Signature"] = signature
except Exception as e:
log_exception(e)
logger.error(f"Failed to send webhook: {e}")
return
try:
# Send the webhook event
response = requests.post(webhook.url, headers=headers, json=payload, timeout=30)
@ -314,7 +278,7 @@ def webhook_send_task(
response_body=str(response.text),
retry_count=str(self.request.retries),
)
logger.info(f"Webhook {webhook.id} sent successfully")
except requests.RequestException as e:
# Log the failed webhook request
WebhookLog.objects.create(
@ -329,12 +293,13 @@ def webhook_send_task(
response_body=str(e),
retry_count=str(self.request.retries),
)
logger.error(f"Webhook {webhook.id} failed with error: {e}")
# Retry logic
if self.request.retries >= self.max_retries:
Webhook.objects.filter(pk=webhook.id).update(is_active=False)
if webhook:
# send email for the deactivation of the webhook
send_webhook_deactivation_email(
send_webhook_deactivation_email.delay(
webhook_id=webhook.id,
receiver_id=webhook.created_by_id,
reason=str(e),
@ -344,26 +309,50 @@ def webhook_send_task(
raise requests.RequestException()
except Exception as e:
if settings.DEBUG:
print(e)
log_exception(e)
return
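
The task signs each delivery with HMAC-SHA256 over the JSON payload and sends the hex digest in the `X-Plane-Signature` header. A receiver-side sketch of the matching verification; this is not part of the codebase and assumes the raw request body is byte-identical to the string the task signed:

```python
# Sketch only: recompute the HMAC over the raw body and compare in
# constant time against the received X-Plane-Signature header.
import hashlib
import hmac

def verify_plane_signature(secret_key: str, raw_body: bytes,
                           received_signature: str) -> bool:
    expected = hmac.new(
        secret_key.encode("utf-8"), raw_body, hashlib.sha256
    ).hexdigest()
    return hmac.compare_digest(expected, received_signature)
```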
@shared_task
def webhook_activity(
event,
verb,
field,
old_value,
new_value,
actor_id,
slug,
current_site,
event_id,
old_identifier,
new_identifier,
):
event: str,
verb: str,
field: Optional[str],
old_value: Any,
new_value: Any,
actor_id: str | uuid.UUID,
slug: str,
current_site: str,
event_id: str | uuid.UUID,
old_identifier: Optional[str],
new_identifier: Optional[str],
) -> None:
"""
Process and send webhook notifications for various activities in the system.
This task filters relevant webhooks based on the event type and sends notifications
to all active webhooks for the workspace.
Args:
event (str): Type of event (project, issue, module, cycle, issue_comment)
verb (str): Action performed (created, updated, deleted)
field (Optional[str]): Name of the field that was changed
old_value (Any): Previous value of the field
new_value (Any): New value of the field
actor_id (str | uuid.UUID): ID of the user who performed the action
slug (str): Workspace slug
current_site (str): Current site URL
event_id (str | uuid.UUID): ID of the event object
old_identifier (Optional[str]): Previous identifier if any
new_identifier (Optional[str]): New identifier if any
Returns:
None
Note:
The function silently returns on ObjectDoesNotExist exceptions to handle
race conditions where objects might have been deleted.
"""
try:
webhooks = Webhook.objects.filter(workspace__slug=slug, is_active=True)
@ -384,7 +373,7 @@ def webhook_activity(
for webhook in webhooks:
webhook_send_task.delay(
webhook=webhook.id,
webhook_id=webhook.id,
slug=slug,
event=event,
event_data=(
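The `X-Plane-Signature` header set above is an HMAC hex digest of the outgoing payload. A minimal sketch of how a receiving endpoint could verify it, assuming HMAC-SHA256 keyed with the webhook's secret (the variable names and digest algorithm here are illustrative, not confirmed by this hunk):

```python
import hashlib
import hmac


def is_valid_plane_signature(secret_key: str, raw_body: bytes, received_signature: str) -> bool:
    # Recompute the digest over the raw request body with the shared secret
    expected = hmac.new(secret_key.encode("utf-8"), raw_body, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking timing information during comparison
    return hmac.compare_digest(expected, received_signature)
```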

View file

@ -0,0 +1,177 @@
# Python imports
import logging
# Third party imports
from celery import shared_task
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse, urljoin
import base64
import ipaddress
from typing import Dict, Any
from typing import Optional
from plane.db.models import IssueLink
from plane.utils.exception_logger import log_exception
logger = logging.getLogger("plane.worker")
DEFAULT_FAVICON = "PHN2ZyB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciIHdpZHRoPSIyNCIgaGVpZ2h0PSIyNCIgdmlld0JveD0iMCAwIDI0IDI0IiBmaWxsPSJub25lIiBzdHJva2U9ImN1cnJlbnRDb2xvciIgc3Ryb2tlLXdpZHRoPSIyIiBzdHJva2UtbGluZWNhcD0icm91bmQiIHN0cm9rZS1saW5lam9pbj0icm91bmQiIGNsYXNzPSJsdWNpZGUgbHVjaWRlLWxpbmstaWNvbiBsdWNpZGUtbGluayI+PHBhdGggZD0iTTEwIDEzYTUgNSAwIDAgMCA3LjU0LjU0bDMtM2E1IDUgMCAwIDAtNy4wNy03LjA3bC0xLjcyIDEuNzEiLz48cGF0aCBkPSJNMTQgMTFhNSA1IDAgMCAwLTcuNTQtLjU0bC0zIDNhNSA1IDAgMCAwIDcuMDcgNy4wN2wxLjcxLTEuNzEiLz48L3N2Zz4=" # noqa: E501
def crawl_work_item_link_title_and_favicon(url: str) -> Dict[str, Any]:
"""
Crawls a URL to extract the title and favicon.
Args:
url (str): The URL to crawl
Returns:
        Dict[str, Any]: Dictionary containing the page title, a base64-encoded favicon data URI, the favicon URL, and the original URL
"""
try:
# Prevent access to private IP ranges
parsed = urlparse(url)
        try:
            ip = ipaddress.ip_address(parsed.hostname)
        except ValueError:
            # Not an IP address, continue with domain validation
            pass
        else:
            # Keep this check outside the try block so it is not swallowed
            # by the except ValueError above
            if ip.is_private or ip.is_loopback or ip.is_reserved:
                raise ValueError("Access to private/internal networks is not allowed")
# Set up headers to mimic a real browser
headers = {
"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36" # noqa: E501
}
soup = None
title = None
try:
response = requests.get(url, headers=headers, timeout=1)
soup = BeautifulSoup(response.content, "html.parser")
title_tag = soup.find("title")
title = title_tag.get_text().strip() if title_tag else None
except requests.RequestException as e:
logger.warning(f"Failed to fetch HTML for title: {str(e)}")
# Fetch and encode favicon
favicon_base64 = fetch_and_encode_favicon(headers, soup, url)
# Prepare result
result = {
"title": title,
"favicon": favicon_base64["favicon_base64"],
"url": url,
"favicon_url": favicon_base64["favicon_url"],
}
return result
except Exception as e:
log_exception(e)
return {
"error": f"Unexpected error: {str(e)}",
"title": None,
"favicon": None,
"url": url,
}
def find_favicon_url(soup: Optional[BeautifulSoup], base_url: str) -> Optional[str]:
"""
Find the favicon URL from HTML soup.
Args:
soup: BeautifulSoup object
base_url: Base URL for resolving relative paths
Returns:
        Optional[str]: Absolute URL to the favicon, or None if none is found
"""
if soup is not None:
# Look for various favicon link tags
favicon_selectors = [
'link[rel="icon"]',
'link[rel="shortcut icon"]',
'link[rel="apple-touch-icon"]',
'link[rel="apple-touch-icon-precomposed"]',
]
for selector in favicon_selectors:
favicon_tag = soup.select_one(selector)
if favicon_tag and favicon_tag.get("href"):
return urljoin(base_url, favicon_tag["href"])
# Fallback to /favicon.ico
parsed_url = urlparse(base_url)
fallback_url = f"{parsed_url.scheme}://{parsed_url.netloc}/favicon.ico"
# Check if fallback exists
try:
response = requests.head(fallback_url, timeout=2)
if response.status_code == 200:
return fallback_url
except requests.RequestException as e:
log_exception(e)
return None
return None
def fetch_and_encode_favicon(
headers: Dict[str, str], soup: Optional[BeautifulSoup], url: str
) -> Dict[str, Optional[str]]:
"""
Fetch favicon and encode it as base64.
    Args:
        headers: Request headers to send when fetching the favicon
        soup: Parsed HTML soup used to locate a favicon link, if available
        url: Page URL used to resolve the favicon location
    Returns:
        Dict[str, Optional[str]]: The favicon URL and a base64 data URI,
        falling back to a bundled default icon when no favicon can be fetched
"""
try:
favicon_url = find_favicon_url(soup, url)
if favicon_url is None:
return {
"favicon_url": None,
"favicon_base64": f"data:image/svg+xml;base64,{DEFAULT_FAVICON}",
}
response = requests.get(favicon_url, headers=headers, timeout=1)
# Get content type
content_type = response.headers.get("content-type", "image/x-icon")
# Convert to base64
favicon_base64 = base64.b64encode(response.content).decode("utf-8")
# Return as data URI
return {
"favicon_url": favicon_url,
"favicon_base64": f"data:{content_type};base64,{favicon_base64}",
}
except Exception as e:
logger.warning(f"Failed to fetch favicon: {e}")
return {
"favicon_url": None,
"favicon_base64": f"data:image/svg+xml;base64,{DEFAULT_FAVICON}",
}
@shared_task
def crawl_work_item_link_title(id: str, url: str) -> None:
meta_data = crawl_work_item_link_title_and_favicon(url)
issue_link = IssueLink.objects.get(id=id)
issue_link.metadata = meta_data
issue_link.save()
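For context, a sketch of how this task might be queued when a work item link is created; the call site is not part of this diff, so the import path and surrounding function are assumptions:

```python
# Illustrative only: the actual module path and call site are not shown in this diff.
from plane.bgtasks.issue_link_task import crawl_work_item_link_title


def on_issue_link_created(issue_link) -> None:
    # Queue the crawl asynchronously so the API response is not blocked
    # by the outbound requests for the page title and favicon.
    crawl_work_item_link_title.delay(str(issue_link.id), issue_link.url)
```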

View file

@ -78,7 +78,7 @@ def workspace_invitation(email, workspace_id, token, current_site, inviter):
)
msg.attach_alternative(html_content, "text/html")
msg.send()
logging.getLogger("plane").info("Email sent successfully")
logging.getLogger("plane.worker").info("Email sent successfully")
return
except (Workspace.DoesNotExist, WorkspaceMemberInvite.DoesNotExist):
return
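The success message now goes through the "plane.worker" logger. A minimal sketch of a Django LOGGING entry that would route it; the project's real logging settings are not shown in this diff, so the handler names are assumptions:

```python
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "handlers": {
        # Assumed handler; the real configuration may write to files or a log collector
        "console": {"class": "logging.StreamHandler"},
    },
    "loggers": {
        "plane.worker": {"handlers": ["console"], "level": "INFO", "propagate": False},
    },
}
```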

Some files were not shown because too many files have changed in this diff.