Add Package Registry (#16510)
* Added package store settings. * Added models. * Added generic package registry. * Added tests. * Added NuGet package registry. * Moved service index to api file. * Added NPM package registry. * Added Maven package registry. * Added PyPI package registry. * Summary is deprecated. * Changed npm name. * Sanitize project url. * Allow only scoped packages. * Added user interface. * Changed method name. * Added missing migration file. * Set page info. * Added documentation. * Added documentation links. * Fixed wrong error message. * Lint template files. * Fixed merge errors. * Fixed unit test storage path. * Switch to json module. * Added suggestions. * Added package webhook. * Add package api. * Fixed swagger file. * Fixed enum and comments. * Fixed NuGet pagination. * Print test names. * Added api tests. * Fixed access level. * Fix User unmarshal. * Added RubyGems package registry. * Fix lint. * Implemented io.Writer. * Added support for sha256/sha512 checksum files. * Improved maven-metadata.xml support. * Added support for symbol package uploads. * Added tests. * Added overview docs. * Added npm dependencies and keywords. * Added no-packages information. * Display file size. * Display asset count. * Fixed filter alignment. * Added package icons. * Formatted instructions. * Allow anonymous package downloads. * Fixed comments. * Fixed postgres test. * Moved file. * Moved models to models/packages. * Use correct error response format per client. * Use simpler search form. * Fixed IsProd. * Restructured data model. * Prevent empty filename. * Fix swagger. * Implemented user/org registry. * Implemented UI. * Use GetUserByIDCtx. * Use table for dependencies. * make svg * Added support for unscoped npm packages. * Add support for npm dist tags. * Added tests for npm tags. * Unlink packages if repository gets deleted. * Prevent user/org delete if a packages exist. * Use package unlink in repository service. * Added support for composer packages. * Restructured package docs. * Added missing tests. * Fixed generic content page. * Fixed docs. * Fixed swagger. * Added missing type. * Fixed ambiguous column. * Organize content store by sha256 hash. * Added admin package management. * Added support for sorting. * Add support for multiple identical versions/files. * Added missing repository unlink. * Added file properties. * make fmt * lint * Added Conan package registry. * Updated docs. * Unify package names. * Added swagger enum. * Use longer TEXT column type. * Removed version composite key. * Merged package and container registry. * Removed index. * Use dedicated package router. * Moved files to new location. * Updated docs. * Fixed JOIN order. * Fixed GROUP BY statement. * Fixed GROUP BY #2. * Added symbol server support. * Added more tests. * Set NOT NULL. * Added setting to disable package registries. * Moved auth into service. * refactor * Use ctx everywhere. * Added package cleanup task. * Changed packages path. * Added container registry. * Refactoring * Updated comparison. * Fix swagger. * Fixed table order. * Use token auth for npm routes. * Enabled ReverseProxy auth. * Added packages link for orgs. * Fixed anonymous org access. * Enable copy button for setup instructions. * Merge error * Added suggestions. * Fixed merge. * Handle "generic". * Added link for TODO. * Added suggestions. * Changed temporary buffer filename. * Added suggestions. 
* Apply suggestions from code review Co-authored-by: Thomas Boerger <thomas@webhippie.de> * Update docs/content/doc/packages/nuget.en-us.md Co-authored-by: wxiaoguang <wxiaoguang@gmail.com> Co-authored-by: Thomas Boerger <thomas@webhippie.de>
Parent: 2bce1ea986
Commit: 1d332342db

197 changed files with 18563 additions and 55 deletions

@@ -1900,6 +1900,24 @@ PATH =
;; If CLEANUP_TYPE is set to PerWebhook, this is number of hook_task records to keep for a webhook (i.e. keep the most recent x deliveries).
;NUMBER_TO_KEEP = 10

;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; Cleanup expired packages
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;[cron.cleanup_packages]
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; Whether to enable the job
;ENABLED = true
;; Whether to always run at least once at start up time (if ENABLED)
;RUN_AT_START = true
;; Whether to emit notice on successful execution too
;NOTICE_ON_SUCCESS = false
;; Time interval for job to run
;SCHEDULE = @midnight
;; Unreferenced blobs created more than OLDER_THAN ago are subject to deletion
;OLDER_THAN = 24h

;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;

@@ -2221,6 +2239,18 @@ PATH =
;; Enable/Disable federation capabilities
; ENABLED = true

;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;[packages]
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;
;; Enable/Disable package registry capabilities
;ENABLED = true
;;
;; Path for chunked uploads. Defaults to APP_DATA_PATH + `tmp/package-upload`
;CHUNKED_UPLOAD_PATH = tmp/package-upload

;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; default storage for attachments, lfs and avatars

@@ -2251,6 +2281,16 @@ PATH =
;; Where your lfs files reside, default is data/lfs.
;PATH = data/lfs

;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; settings for packages, will override storage setting
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;[storage.packages]
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; storage type
;STORAGE_TYPE = local

;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; customize storage

@@ -856,7 +856,7 @@ Default templates for project boards:
- `RUN_AT_START`: **true**: Run repository statistics check at start time.
- `SCHEDULE`: **@midnight**: Cron syntax for scheduling repository statistics check.

-### Cron - Cleanup hook_task Table (`cron.cleanup_hook_task_table`)
+#### Cron - Cleanup hook_task Table (`cron.cleanup_hook_task_table`)

- `ENABLED`: **true**: Enable cleanup hook_task job.
- `RUN_AT_START`: **false**: Run cleanup hook_task at start time (if ENABLED).

@@ -865,6 +865,14 @@ Default templates for project boards:
- `OLDER_THAN`: **168h**: If CLEANUP_TYPE is set to OlderThan, then any delivered hook_task records older than this expression will be deleted.
- `NUMBER_TO_KEEP`: **10**: If CLEANUP_TYPE is set to PerWebhook, this is number of hook_task records to keep for a webhook (i.e. keep the most recent x deliveries).

#### Cron - Cleanup expired packages (`cron.cleanup_packages`)

- `ENABLED`: **true**: Enable cleanup expired packages job.
- `RUN_AT_START`: **true**: Run job at start time (if ENABLED).
- `NOTICE_ON_SUCCESS`: **false**: Notify every time this job runs.
- `SCHEDULE`: **@midnight**: Cron syntax for the job.
- `OLDER_THAN`: **24h**: Unreferenced package data created more than OLDER_THAN ago is subject to deletion.

#### Cron - Update Migration Poster ID (`cron.update_migration_poster_id`)

- `SCHEDULE`: **@midnight** : Interval as a duration between each synchronization, it will always attempt synchronization when the instance starts.

@@ -1077,6 +1085,11 @@ Task queue configuration has been moved to `queue.task`. However, the below conf

- `ENABLED`: **true**: Enable/Disable federation capabilities

## Packages (`packages`)

- `ENABLED`: **true**: Enable/Disable package registry capabilities
- `CHUNKED_UPLOAD_PATH`: **tmp/package-upload**: Path for chunked uploads. Defaults to `APP_DATA_PATH` + `tmp/package-upload`

## Mirror (`mirror`)

- `ENABLED`: **true**: Enables the mirror functionality. Set to **false** to disable all mirrors.

@@ -8,6 +8,6 @@ draft: false
menu:
  sidebar:
    name: "Developers"
-    weight: 50
+    weight: 55
    identifier: "developers"
---

@@ -8,6 +8,6 @@ draft: false
menu:
  sidebar:
    name: "開發人員"
-    weight: 50
+    weight: 55
    identifier: "developers"
---

@@ -34,25 +34,25 @@ _Symbols used in table:_
## General Features

| Feature | Gitea | Gogs | GitHub EE | GitLab CE | GitLab EE | BitBucket | RhodeCode CE |
-| ----------------------------------- | -------------------------------------------------- | ---- | --------- | --------- | --------- | -------------- | ------------ |
+| ----------------------------------- | ---------------------------------------------------| ---- | --------- | --------- | --------- | -------------- | ------------ |
| Open source and free | ✓ | ✓ | ✘ | ✓ | ✘ | ✘ | ✓ |
| Low resource usage (RAM/CPU) | ✓ | ✓ | ✘ | ✘ | ✘ | ✘ | ✘ |
| Multiple database support | ✓ | ✓ | ✘ | ⁄ | ⁄ | ✓ | ✓ |
| Multiple OS support | ✓ | ✓ | ✘ | ✘ | ✘ | ✘ | ✓ |
| Easy upgrade process | ✓ | ✓ | ✘ | ✓ | ✓ | ✘ | ✓ |
| Markdown support | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
| Orgmode support | ✓ | ✘ | ✓ | ✘ | ✘ | ✘ | ? |
| CSV support | ✓ | ✘ | ✓ | ✘ | ✘ | ✓ | ? |
| Third-party render tool support | ✓ | ✘ | ✘ | ✘ | ✘ | ✓ | ? |
| Static Git-powered pages | [✘](https://github.com/go-gitea/gitea/issues/302) | ✘ | ✓ | ✓ | ✓ | ✘ | ✘ |
| Integrated Git-powered wiki | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ (cloud only) | ✘ |
| Deploy Tokens | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
| Repository Tokens with write rights | ✓ | ✘ | ✓ | ✓ | ✓ | ✓ | ✓ |
-| Built-in Container Registry | [✘](https://github.com/go-gitea/gitea/issues/2316) | ✘ | ✘ | ✓ | ✓ | ✘ | ✘ |
+| Built-in Package/Container Registry | ✓ | ✘ | ✓ | ✓ | ✓ | ✘ | ✘ |
| External git mirroring | ✓ | ✓ | ✘ | ✘ | ✓ | ✓ | ✓ |
| WebAuthn (2FA) | ✓ | ✘ | ✓ | ✓ | ✓ | ✓ | ? |
| Built-in CI/CD | ✘ | ✘ | ✓ | ✓ | ✓ | ✘ | ✘ |
| Subgroups: groups within groups | ✘ | ✘ | ✘ | ✓ | ✓ | ✘ | ✓ |

## Code management

docs/content/doc/packages.en-us.md (new file, 12 lines)
@@ -0,0 +1,12 @@
---
date: "2021-07-20T00:00:00+00:00"
title: "Package Registry"
slug: "packages"
toc: false
draft: false
menu:
  sidebar:
    name: "Package Registry"
    weight: 45
    identifier: "packages"
---

docs/content/doc/packages/composer.en-us.md (new file, 120 lines)
@@ -0,0 +1,120 @@
---
date: "2021-07-20T00:00:00+00:00"
title: "Composer Packages Repository"
slug: "packages/composer"
draft: false
toc: false
menu:
  sidebar:
    parent: "packages"
    name: "Composer"
    weight: 10
    identifier: "composer"
---

# Composer Packages Repository

Publish [Composer](https://getcomposer.org/) packages for your user or organization.

**Table of Contents**

{{< toc >}}

## Requirements

To work with the Composer package registry, you can use [Composer](https://getcomposer.org/download/) to consume and an HTTP upload client like `curl` to publish packages.

## Publish a package

To publish a Composer package perform an HTTP PUT operation with the package content in the request body.
The package content must be the zipped PHP project with the `composer.json` file.
You cannot publish a package if a package of the same name and version already exists. You must delete the existing package first.

```
PUT https://gitea.example.com/api/packages/{owner}/composer
```

| Parameter | Description |
| ---------- | ----------- |
| `owner` | The owner of the package. |

If the `composer.json` file does not contain a `version` property, you must provide it as a query parameter:

```
PUT https://gitea.example.com/api/packages/{owner}/composer?version={x.y.z}
```

Example request using HTTP Basic authentication:

```shell
curl --user your_username:your_password_or_token \
     --upload-file path/to/project.zip \
     https://gitea.example.com/api/packages/testuser/composer
```

Or specify the package version as a query parameter:

```shell
curl --user your_username:your_password_or_token \
     --upload-file path/to/project.zip \
     https://gitea.example.com/api/packages/testuser/composer?version=1.0.3
```

The server responds with the following HTTP Status codes.

| HTTP Status Code | Meaning |
| ----------------- | ------- |
| `201 Created` | The package has been published. |
| `400 Bad Request` | The package name and/or version are invalid or a package with the same name and version already exists. |

## Configuring the package registry

To register the package registry you need to add it to the Composer `config.json` file (which can usually be found under `<user-home-dir>/.composer/config.json`):

```json
{
  "repositories": [{
      "type": "composer",
      "url": "https://gitea.example.com/api/packages/{owner}/composer"
    }
  ]
}
```

To access the package registry using credentials, you must specify them in the `auth.json` file as follows:

```json
{
  "http-basic": {
    "gitea.example.com": {
      "username": "{username}",
      "password": "{password}"
    }
  }
}
```

| Parameter | Description |
| ---------- | ----------- |
| `owner` | The owner of the package. |
| `username` | Your Gitea username. |
| `password` | Your Gitea password or a personal access token. |
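
The same registration can usually also be done from the command line instead of editing the JSON files by hand; a sketch using standard `composer config` calls (the `gitea` repository name here is just an example, not something these docs define):

```shell
# add the Gitea repository to the global Composer configuration
composer config --global repositories.gitea composer https://gitea.example.com/api/packages/{owner}/composer
# store the HTTP Basic credentials for that host
composer config --global http-basic.gitea.example.com {username} {password}
```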

## Install a package

To install a package from the package registry, execute the following command:

```shell
composer require {package_name}
```

Optionally, you can specify the package version:

```shell
composer require {package_name}:{package_version}
```

| Parameter | Description |
| ----------------- | ----------- |
| `package_name` | The package name. |
| `package_version` | The package version. |

docs/content/doc/packages/conan.en-us.md (new file, 101 lines)
@@ -0,0 +1,101 @@
---
date: "2021-07-20T00:00:00+00:00"
title: "Conan Packages Repository"
slug: "packages/conan"
draft: false
toc: false
menu:
  sidebar:
    parent: "packages"
    name: "Conan"
    weight: 20
    identifier: "conan"
---

# Conan Packages Repository

Publish [Conan](https://conan.io/) packages for your user or organization.

**Table of Contents**

{{< toc >}}

## Requirements

To work with the Conan package registry, you need to use the [conan](https://conan.io/downloads.html) command line tool to consume and publish packages.

## Configuring the package registry

To register the package registry you need to configure a new Conan remote:

```shell
conan remote add {remote} https://gitea.example.com/api/packages/{owner}/conan
conan user --remote {remote} --password {password} {username}
```

| Parameter | Description |
| -----------| ----------- |
| `remote` | The remote name. |
| `username` | Your Gitea username. |
| `password` | Your Gitea password or a personal access token. |
| `owner` | The owner of the package. |

For example:

```shell
conan remote add gitea https://gitea.example.com/api/packages/testuser/conan
conan user --remote gitea --password password123 testuser
```

## Publish a package

Publish a Conan package by running the following command:

```shell
conan upload --remote={remote} {recipe}
```

| Parameter | Description |
| ----------| ----------- |
| `remote` | The remote name. |
| `recipe` | The recipe to upload. |

For example:

```shell
conan upload --remote=gitea ConanPackage/1.2@gitea/final
```

The Gitea Conan package registry has full [revision](https://docs.conan.io/en/latest/versioning/revisions.html) support.

## Install a package

To install a Conan package from the package registry, execute the following command:

```shell
conan install --remote={remote} {recipe}
```

| Parameter | Description |
| ----------| ----------- |
| `remote` | The remote name. |
| `recipe` | The recipe to download. |

For example:

```shell
conan install --remote=gitea ConanPackage/1.2@gitea/final
```

## Supported commands

```
conan install
conan get
conan info
conan search
conan upload
conan user
conan download
conan remove
```

docs/content/doc/packages/container.en-us.md (new file, 91 lines)
@@ -0,0 +1,91 @@
---
date: "2021-07-20T00:00:00+00:00"
title: "Container Registry"
slug: "packages/container"
draft: false
toc: false
menu:
  sidebar:
    parent: "packages"
    name: "Container Registry"
    weight: 30
    identifier: "container"
---

# Container Registry

Publish [Open Container Initiative](https://opencontainers.org/) compliant images for your user or organization.
The container registry follows the OCI specs and supports all compatible images like [Docker](https://www.docker.com/) and [Helm Charts](https://helm.sh/).

**Table of Contents**

{{< toc >}}

## Requirements

To work with the Container registry, you can use the tools for your specific image type.
The following examples use the `docker` client.

## Login to the container registry

To push an image or if the image is in a private registry, you have to authenticate:

```shell
docker login gitea.example.com
```

## Image naming convention

Images must follow this naming convention:

`{registry}/{owner}/{image}`

For example, these are all valid image names for the owner `testuser`:

`gitea.example.com/testuser/myimage`

`gitea.example.com/testuser/my-image`

`gitea.example.com/testuser/my/image`

**NOTE:** The registry only supports case-insensitive tag names. So `image:tag` and `image:Tag` get treated as the same image and tag.
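
If a local image was built under a different name, it has to be retagged to match this convention before it can be pushed; a minimal sketch (the local name `myimage:latest` is only an example):

```shell
# retag an existing local image into the {registry}/{owner}/{image} naming scheme
docker tag myimage:latest gitea.example.com/testuser/myimage:latest
```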

## Push an image

Push an image by executing the following command:

```shell
docker push gitea.example.com/{owner}/{image}:{tag}
```

| Parameter | Description |
| ----------| ----------- |
| `owner` | The owner of the image. |
| `image` | The name of the image. |
| `tag` | The tag of the image. |

For example:

```shell
docker push gitea.example.com/testuser/myimage:latest
```

## Pull an image

Pull an image by executing the following command:

```shell
docker pull gitea.example.com/{owner}/{image}:{tag}
```

| Parameter | Description |
| ----------| ----------- |
| `owner` | The owner of the image. |
| `image` | The name of the image. |
| `tag` | The tag of the image. |

For example:

```shell
docker pull gitea.example.com/testuser/myimage:latest
```

docs/content/doc/packages/generic.en-us.md (new file, 80 lines)
@@ -0,0 +1,80 @@
---
date: "2021-07-20T00:00:00+00:00"
title: "Generic Packages Repository"
slug: "packages/generic"
draft: false
toc: false
menu:
  sidebar:
    parent: "packages"
    name: "Generic"
    weight: 40
    identifier: "generic"
---

# Generic Packages Repository

Publish generic files, like release binaries or other output, for your user or organization.

**Table of Contents**

{{< toc >}}

## Authenticate to the package registry

To authenticate to the Package Registry, you need to provide [custom HTTP headers or use HTTP Basic authentication]({{< relref "doc/developers/api-usage.en-us.md#authentication" >}}).

## Publish a package

To publish a generic package perform an HTTP PUT operation with the package content in the request body.
You cannot publish a package if a package of the same name and version already exists. You must delete the existing package first.

```
PUT https://gitea.example.com/api/packages/{owner}/generic/{package_name}/{package_version}/{file_name}
```

| Parameter | Description |
| ----------------- | ----------- |
| `owner` | The owner of the package. |
| `package_name` | The package name. It can contain only lowercase letters (`a-z`), uppercase letters (`A-Z`), numbers (`0-9`), dots (`.`), hyphens (`-`), or underscores (`_`). |
| `package_version` | The package version as described in the [SemVer](https://semver.org/) spec. |
| `file_name` | The filename. It can contain only lowercase letters (`a-z`), uppercase letters (`A-Z`), numbers (`0-9`), dots (`.`), hyphens (`-`), or underscores (`_`). |

Example request using HTTP Basic authentication:

```shell
curl --user your_username:your_password_or_token \
     --upload-file path/to/file.bin \
     https://gitea.example.com/api/packages/testuser/generic/test_package/1.0.0/file.bin
```

The server responds with the following HTTP Status codes.

| HTTP Status Code | Meaning |
| ----------------- | ------- |
| `201 Created` | The package has been published. |
| `400 Bad Request` | The package name and/or version are invalid or a package with the same name and version already exists. |

## Download a package

To download a generic package perform an HTTP GET operation.

```
GET https://gitea.example.com/api/packages/{owner}/generic/{package_name}/{package_version}/{file_name}
```

| Parameter | Description |
| ----------------- | ----------- |
| `owner` | The owner of the package. |
| `package_name` | The package name. |
| `package_version` | The package version. |
| `file_name` | The filename. |

The file content is served in the response body. The response content type is `application/octet-stream`.

Example request using HTTP Basic authentication:

```shell
curl --user your_username:your_token_or_password \
     https://gitea.example.com/api/packages/testuser/generic/test_package/1.0.0/file.bin
```

docs/content/doc/packages/maven.en-us.md (new file, 110 lines)
@@ -0,0 +1,110 @@
---
date: "2021-07-20T00:00:00+00:00"
title: "Maven Packages Repository"
slug: "packages/maven"
draft: false
toc: false
menu:
  sidebar:
    parent: "packages"
    name: "Maven"
    weight: 50
    identifier: "maven"
---

# Maven Packages Repository

Publish [Maven](https://maven.apache.org) packages for your user or organization.

**Table of Contents**

{{< toc >}}

## Requirements

To work with the Maven package registry, you can use [Maven](https://maven.apache.org/install.html) or [Gradle](https://gradle.org/install/).
The following examples use `Maven`.

## Configuring the package registry

To register the package registry you first need to add your access token to the [`settings.xml`](https://maven.apache.org/settings.html) file:

```xml
<settings>
  <servers>
    <server>
      <id>gitea</id>
      <configuration>
        <httpHeaders>
          <property>
            <name>Authorization</name>
            <value>token {access_token}</value>
          </property>
        </httpHeaders>
      </configuration>
    </server>
  </servers>
</settings>
```

Afterwards add the following sections to your project `pom.xml` file:

```xml
<repositories>
  <repository>
    <id>gitea</id>
    <url>https://gitea.example.com/api/packages/{owner}/maven</url>
  </repository>
</repositories>
<distributionManagement>
  <repository>
    <id>gitea</id>
    <url>https://gitea.example.com/api/packages/{owner}/maven</url>
  </repository>
  <snapshotRepository>
    <id>gitea</id>
    <url>https://gitea.example.com/api/packages/{owner}/maven</url>
  </snapshotRepository>
</distributionManagement>
```

| Parameter | Description |
| -------------- | ----------- |
| `access_token` | Your [personal access token]({{< relref "doc/developers/api-usage.en-us.md#authentication" >}}). |
| `owner` | The owner of the package. |

## Publish a package

To publish a package simply run:

```shell
mvn deploy
```

You cannot publish a package if a package of the same name and version already exists. You must delete the existing package first.

## Install a package

To install a Maven package from the package registry, add a new dependency to your project `pom.xml` file:

```xml
<dependency>
  <groupId>com.test.package</groupId>
  <artifactId>test_project</artifactId>
  <version>1.0.0</version>
</dependency>
```

Afterwards run:

```shell
mvn install
```

## Supported commands

```
mvn install
mvn deploy
mvn dependency:get
```
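
As a hedged illustration of the `mvn dependency:get` entry above (not an example taken from these docs): a single artifact can be fetched without a project `pom.xml`, reusing the `gitea` server id and the example coordinates from this page:

```shell
# a sketch: resolve one artifact directly from the configured registry
mvn dependency:get \
    -DremoteRepositories=gitea::default::https://gitea.example.com/api/packages/testuser/maven \
    -Dartifact=com.test.package:test_project:1.0.0
```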

docs/content/doc/packages/npm.en-us.md (new file, 118 lines)
@@ -0,0 +1,118 @@
---
date: "2021-07-20T00:00:00+00:00"
title: "npm Packages Repository"
slug: "packages/npm"
draft: false
toc: false
menu:
  sidebar:
    parent: "packages"
    name: "npm"
    weight: 60
    identifier: "npm"
---

# npm Packages Repository

Publish [npm](https://www.npmjs.com/) packages for your user or organization.

**Table of Contents**

{{< toc >}}

## Requirements

To work with the npm package registry, you need [Node.js](https://nodejs.org/en/download/) coupled with a package manager such as [Yarn](https://classic.yarnpkg.com/en/docs/install) or [npm](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm/) itself.

The registry supports [scoped](https://docs.npmjs.com/misc/scope/) and unscoped packages.

The following examples use the `npm` tool with the scope `@test`.

## Configuring the package registry

To register the package registry you need to configure a new package source.

```shell
npm config set {scope}:registry https://gitea.example.com/api/packages/{owner}/npm/
npm config set -- '//gitea.example.com/api/packages/{owner}/npm/:_authToken' "{token}"
```

| Parameter | Description |
| ------------ | ----------- |
| `scope` | The scope of the packages. |
| `owner` | The owner of the package. |
| `token` | Your [personal access token]({{< relref "doc/developers/api-usage.en-us.md#authentication" >}}). |

For example:

```shell
npm config set @test:registry https://gitea.example.com/api/packages/testuser/npm/
npm config set -- '//gitea.example.com/api/packages/testuser/npm/:_authToken' "personal_access_token"
```

or without scope:

```shell
npm config set registry https://gitea.example.com/api/packages/testuser/npm/
npm config set -- '//gitea.example.com/api/packages/testuser/npm/:_authToken' "personal_access_token"
```
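
For reference (an equivalence assumed from standard npm behaviour, not stated in these docs): the `npm config set` calls above simply write lines to the user's `.npmrc`, so the scoped example corresponds to entries like these:

```shell
# the resulting ~/.npmrc entries for the scoped example above
cat >> ~/.npmrc <<'EOF'
@test:registry=https://gitea.example.com/api/packages/testuser/npm/
//gitea.example.com/api/packages/testuser/npm/:_authToken=personal_access_token
EOF
```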

## Publish a package

Publish a package by running the following command in your project:

```shell
npm publish
```

You cannot publish a package if a package of the same name and version already exists. You must delete the existing package first.

## Install a package

To install a package from the package registry, execute the following command:

```shell
npm install {package_name}
```

| Parameter | Description |
| -------------- | ----------- |
| `package_name` | The package name. |

For example:

```shell
npm install @test/test_package
```

## Tag a package

The registry supports [version tags](https://docs.npmjs.com/adding-dist-tags-to-packages/) which can be managed by `npm dist-tag`:

```shell
npm dist-tag add {package_name}@{version} {tag}
```

| Parameter | Description |
| -------------- | ----------- |
| `package_name` | The package name. |
| `version` | The version of the package. |
| `tag` | The tag name. |

For example:

```shell
npm dist-tag add test_package@1.0.2 release
```

The tag name must not be a valid version. All tag names which are parsable as a version are rejected.

## Supported commands

```
npm install
npm ci
npm publish
npm dist-tag
npm view
```

docs/content/doc/packages/nuget.en-us.md (new file, 116 lines)
@@ -0,0 +1,116 @@
---
date: "2021-07-20T00:00:00+00:00"
title: "NuGet Packages Repository"
slug: "packages/nuget"
draft: false
toc: false
menu:
  sidebar:
    parent: "packages"
    name: "NuGet"
    weight: 70
    identifier: "nuget"
---

# NuGet Packages Repository

Publish [NuGet](https://www.nuget.org/) packages for your user or organization. The package registry supports [NuGet Symbol Packages](https://docs.microsoft.com/en-us/nuget/create-packages/symbol-packages-snupkg) too.

**Table of Contents**

{{< toc >}}

## Requirements

To work with the NuGet package registry, you can use command-line interface tools as well as NuGet features in various IDEs like Visual Studio.
More information about NuGet clients can be found in [the official documentation](https://docs.microsoft.com/en-us/nuget/install-nuget-client-tools).
The following examples use the `dotnet nuget` tool.

## Configuring the package registry

To register the package registry you need to configure a new NuGet feed source:

```shell
dotnet nuget add source --name {source_name} --username {username} --password {password} https://gitea.example.com/api/packages/{owner}/nuget/index.json
```

| Parameter | Description |
| ------------- | ----------- |
| `source_name` | The desired source name. |
| `username` | Your Gitea username. |
| `password` | Your Gitea password or a personal access token. |
| `owner` | The owner of the package. |

For example:

```shell
dotnet nuget add source --name gitea --username testuser --password password123 https://gitea.example.com/api/packages/testuser/nuget/index.json
```
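
One practical caveat, assumed from the standard `dotnet nuget` client rather than from these docs: on Linux and macOS the client often cannot encrypt the stored password, so the source may need to be added with clear-text password storage:

```shell
# same registration as the example above, but with clear-text password storage enabled
dotnet nuget add source --name gitea --username testuser --password password123 \
  --store-password-in-clear-text \
  https://gitea.example.com/api/packages/testuser/nuget/index.json
```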

## Publish a package

Publish a package by running the following command:

```shell
dotnet nuget push --source {source_name} {package_file}
```

| Parameter | Description |
| -------------- | ----------- |
| `source_name` | The desired source name. |
| `package_file` | Path to the package `.nupkg` file. |

For example:

```shell
dotnet nuget push --source gitea test_package.1.0.0.nupkg
```

You cannot publish a package if a package of the same name and version already exists. You must delete the existing package first.

### Symbol Packages

The NuGet package registry has built-in support for a symbol server. The PDB files embedded in a symbol package (`.snupkg`) can be requested by clients.
To do so, register the NuGet package registry as symbol source:

```
https://gitea.example.com/api/packages/{owner}/nuget/symbols
```

| Parameter | Description |
| --------- | ----------- |
| `owner` | The owner of the package registry. |

For example:

```
https://gitea.example.com/api/packages/testuser/nuget/symbols
```

## Install a package

To install a NuGet package from the package registry, execute the following command:

```shell
dotnet add package --source {source_name} --version {package_version} {package_name}
```

| Parameter | Description |
| ----------------- | ----------- |
| `source_name` | The desired source name. |
| `package_name` | The package name. |
| `package_version` | The package version. |

For example:

```shell
dotnet add package --source gitea --version 1.0.0 test_package
```

## Supported commands

```
dotnet add
dotnet nuget push
dotnet nuget delete
```

docs/content/doc/packages/overview.en-us.md (new file, 76 lines)
@@ -0,0 +1,76 @@
---
date: "2021-07-20T00:00:00+00:00"
title: "Package Registry"
slug: "packages/overview"
draft: false
toc: false
menu:
  sidebar:
    parent: "packages"
    name: "Overview"
    weight: 1
    identifier: "overview"
---

# Package Registry

The Package Registry can be used as a public or private registry for common package managers.

**Table of Contents**

{{< toc >}}

## Supported package managers

The following package managers are currently supported:

| Name | Language | Package client |
| ---- | -------- | -------------- |
| [Composer]({{< relref "doc/packages/composer.en-us.md" >}}) | PHP | `composer` |
| [Conan]({{< relref "doc/packages/conan.en-us.md" >}}) | C++ | `conan` |
| [Container]({{< relref "doc/packages/container.en-us.md" >}}) | - | any OCI compliant client |
| [Generic]({{< relref "doc/packages/generic.en-us.md" >}}) | - | any HTTP client |
| [Maven]({{< relref "doc/packages/maven.en-us.md" >}}) | Java | `mvn`, `gradle` |
| [npm]({{< relref "doc/packages/npm.en-us.md" >}}) | JavaScript | `npm`, `yarn` |
| [NuGet]({{< relref "doc/packages/nuget.en-us.md" >}}) | .NET | `nuget` |
| [PyPI]({{< relref "doc/packages/pypi.en-us.md" >}}) | Python | `pip`, `twine` |
| [RubyGems]({{< relref "doc/packages/rubygems.en-us.md" >}}) | Ruby | `gem`, `Bundler` |

**The following paragraphs only apply if Packages are not globally disabled!**

## View packages

You can view the packages of a repository on the repository page.

1. Go to the repository.
1. Go to **Packages** in the navigation bar.

To view more details about a package, select the name of the package.

## Download a package

To download a package from your repository:

1. Go to **Packages** in the navigation bar.
1. Select the name of the package to view the details.
1. In the **Assets** section, select the name of the package file you want to download.

## Delete a package

You cannot edit a package after you have published it in the Package Registry. Instead, you
must delete and recreate it.

To delete a package from your repository:

1. Go to **Packages** in the navigation bar.
1. Select the name of the package to view the details.
1. Click **Delete package** to permanently delete the package.

## Disable the Package Registry

The Package Registry is automatically enabled. To disable it for a single repository:

1. Go to **Settings** in the navigation bar.
1. Disable **Enable Repository Packages Registry**.

Previously published packages are not deleted by disabling the Package Registry.

docs/content/doc/packages/pypi.en-us.md (new file, 85 lines)
@@ -0,0 +1,85 @@
---
date: "2021-07-20T00:00:00+00:00"
title: "PyPI Packages Repository"
slug: "packages/pypi"
draft: false
toc: false
menu:
  sidebar:
    parent: "packages"
    name: "PyPI"
    weight: 80
    identifier: "pypi"
---

# PyPI Packages Repository

Publish [PyPI](https://pypi.org/) packages for your user or organization.

**Table of Contents**

{{< toc >}}

## Requirements

To work with the PyPI package registry, you need to use [pip](https://pypi.org/project/pip/) to consume packages and [twine](https://pypi.org/project/twine/) to publish them.

## Configuring the package registry

To register the package registry you need to edit your local `~/.pypirc` file. Add

```ini
[distutils]
index-servers = gitea

[gitea]
repository = https://gitea.example.com/api/packages/{owner}/pypi
username = {username}
password = {password}
```

| Placeholder | Description |
| ------------ | ----------- |
| `owner` | The owner of the package. |
| `username` | Your Gitea username. |
| `password` | Your Gitea password or a [personal access token]({{< relref "doc/developers/api-usage.en-us.md#authentication" >}}). |

## Publish a package

Publish a package by running the following command:

```shell
python3 -m twine upload --repository gitea /path/to/files/*
```

The package files have the extensions `.tar.gz` and `.whl`.

You cannot publish a package if a package of the same name and version already exists. You must delete the existing package first.
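
As an alternative to editing `~/.pypirc` (a sketch based on twine's standard environment variables, using the example credentials from this page, not an example from these docs):

```shell
# publish without a ~/.pypirc by passing the repository URL and credentials through twine's env vars
TWINE_REPOSITORY_URL=https://gitea.example.com/api/packages/testuser/pypi \
TWINE_USERNAME=testuser \
TWINE_PASSWORD=password123 \
python3 -m twine upload /path/to/files/*
```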

## Install a package

To install a PyPI package from the package registry, execute the following command:

```shell
pip install --index-url https://{username}:{password}@gitea.example.com/api/packages/{owner}/pypi/simple --no-deps {package_name}
```

| Parameter | Description |
| ----------------- | ----------- |
| `username` | Your Gitea username. |
| `password` | Your Gitea password or a personal access token. |
| `owner` | The owner of the package. |
| `package_name` | The package name. |

For example:

```shell
pip install --index-url https://testuser:password123@gitea.example.com/api/packages/testuser/pypi/simple --no-deps test_package
```

## Supported commands

```
pip install
twine upload
```

docs/content/doc/packages/rubygems.en-us.md (new file, 127 lines)
@@ -0,0 +1,127 @@
---
date: "2021-07-20T00:00:00+00:00"
title: "RubyGems Packages Repository"
slug: "packages/rubygems"
draft: false
toc: false
menu:
  sidebar:
    parent: "packages"
    name: "RubyGems"
    weight: 90
    identifier: "rubygems"
---

# RubyGems Packages Repository

Publish [RubyGems](https://guides.rubygems.org/) packages for your user or organization.

**Table of Contents**

{{< toc >}}

## Requirements

To work with the RubyGems package registry, you need to use the [gem](https://guides.rubygems.org/command-reference/) command line tool to consume and publish packages.

## Configuring the package registry

To register the package registry edit the `~/.gem/credentials` file and add:

```ini
---
https://gitea.example.com/api/packages/{owner}/rubygems: Bearer {token}
```

| Parameter | Description |
| ------------- | ----------- |
| `owner` | The owner of the package. |
| `token` | Your personal access token. |

For example:

```
---
https://gitea.example.com/api/packages/testuser/rubygems: Bearer 3bd626f84b01cd26b873931eace1e430a5773cc4
```
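
A small practical sketch (an assumption about the `gem` client, not something these docs state): the credentials file can be created from the shell, and `gem` generally refuses to use it unless it is readable only by its owner:

```shell
# create ~/.gem/credentials with the example entry above and restrict its permissions
mkdir -p ~/.gem
cat >> ~/.gem/credentials <<'EOF'
---
https://gitea.example.com/api/packages/testuser/rubygems: Bearer 3bd626f84b01cd26b873931eace1e430a5773cc4
EOF
chmod 0600 ~/.gem/credentials
```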

## Publish a package

Publish a package by running the following command:

```shell
gem push --host {host} {package_file}
```

| Parameter | Description |
| -------------- | ----------- |
| `host` | URL to the package registry. |
| `package_file` | Path to the package `.gem` file. |

For example:

```shell
gem push --host https://gitea.example.com/api/packages/testuser/rubygems test_package-1.0.0.gem
```

You cannot publish a package if a package of the same name and version already exists. You must delete the existing package first.

## Install a package

To install a package from the package registry you can use [Bundler](https://bundler.io) or `gem`.

### Bundler

Add a new `source` block to your `Gemfile`:

```
source "https://gitea.example.com/api/packages/{owner}/rubygems" do
  gem "{package_name}"
end
```

| Parameter | Description |
| ----------------- | ----------- |
| `owner` | The owner of the package. |
| `package_name` | The package name. |

For example:

```
source "https://gitea.example.com/api/packages/testuser/rubygems" do
  gem "test_package"
end
```

Afterwards run the following command:

```shell
bundle install
```

### gem

Execute the following command:

```shell
gem install --host https://gitea.example.com/api/packages/{owner}/rubygems {package_name}
```

| Parameter | Description |
| ----------------- | ----------- |
| `owner` | The owner of the package. |
| `package_name` | The package name. |

For example:

```shell
gem install --host https://gitea.example.com/api/packages/testuser/rubygems test_package
```

## Supported commands

```
gem install
bundle install
gem push
```

@@ -8,6 +8,6 @@ draft: false
menu:
  sidebar:
    name: "Übersetzung"
-    weight: 45
+    weight: 50
    identifier: "translation"
---

@@ -8,6 +8,6 @@ draft: false
menu:
  sidebar:
    name: "Translation"
-    weight: 45
+    weight: 50
    identifier: "translation"
---

@@ -8,6 +8,6 @@ draft: false
menu:
  sidebar:
    name: "翻譯"
-    weight: 45
+    weight: 50
    identifier: "translation"
---
214
integrations/api_packages_composer_test.go
Normal file
214
integrations/api_packages_composer_test.go
Normal file
|
@ -0,0 +1,214 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package integrations

import (
	"archive/zip"
	"bytes"
	"fmt"
	"net/http"
	neturl "net/url"
	"testing"

	"code.gitea.io/gitea/models/db"
	"code.gitea.io/gitea/models/packages"
	"code.gitea.io/gitea/models/unittest"
	user_model "code.gitea.io/gitea/models/user"
	composer_module "code.gitea.io/gitea/modules/packages/composer"
	"code.gitea.io/gitea/modules/setting"
	"code.gitea.io/gitea/routers/api/packages/composer"

	"github.com/stretchr/testify/assert"
)

func TestPackageComposer(t *testing.T) {
	defer prepareTestEnv(t)()
	user := unittest.AssertExistsAndLoadBean(t, &user_model.User{ID: 2}).(*user_model.User)

	vendorName := "gitea"
	projectName := "composer-package"
	packageName := vendorName + "/" + projectName
	packageVersion := "1.0.3"
	packageDescription := "Package Description"
	packageType := "composer-plugin"
	packageAuthor := "Gitea Authors"
	packageLicense := "MIT"

	var buf bytes.Buffer
	archive := zip.NewWriter(&buf)
	w, _ := archive.Create("composer.json")
	w.Write([]byte(`{
		"name": "` + packageName + `",
		"description": "` + packageDescription + `",
		"type": "` + packageType + `",
		"license": "` + packageLicense + `",
		"authors": [
			{
				"name": "` + packageAuthor + `"
			}
		]
	}`))
	archive.Close()
	content := buf.Bytes()

	url := fmt.Sprintf("%sapi/packages/%s/composer", setting.AppURL, user.Name)

	t.Run("ServiceIndex", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "GET", fmt.Sprintf("%s/packages.json", url))
		req = AddBasicAuthHeader(req, user.Name)
		resp := MakeRequest(t, req, http.StatusOK)

		var result composer.ServiceIndexResponse
		DecodeJSON(t, resp, &result)

		assert.Equal(t, url+"/search.json?q=%query%&type=%type%", result.SearchTemplate)
		assert.Equal(t, url+"/p2/%package%.json", result.MetadataTemplate)
		assert.Equal(t, url+"/list.json", result.PackageList)
	})

	t.Run("Upload", func(t *testing.T) {
		t.Run("MissingVersion", func(t *testing.T) {
			defer PrintCurrentTest(t)()

			req := NewRequestWithBody(t, "PUT", url, bytes.NewReader(content))
			req = AddBasicAuthHeader(req, user.Name)
			MakeRequest(t, req, http.StatusBadRequest)
		})

		t.Run("Valid", func(t *testing.T) {
			defer PrintCurrentTest(t)()

			uploadURL := url + "?version=" + packageVersion

			req := NewRequestWithBody(t, "PUT", uploadURL, bytes.NewReader(content))
			req = AddBasicAuthHeader(req, user.Name)
			MakeRequest(t, req, http.StatusCreated)

			pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeComposer)
			assert.NoError(t, err)
			assert.Len(t, pvs, 1)

			pd, err := packages.GetPackageDescriptor(db.DefaultContext, pvs[0])
			assert.NoError(t, err)
			assert.NotNil(t, pd.SemVer)
			assert.IsType(t, &composer_module.Metadata{}, pd.Metadata)
			assert.Equal(t, packageName, pd.Package.Name)
			assert.Equal(t, packageVersion, pd.Version.Version)

			pfs, err := packages.GetFilesByVersionID(db.DefaultContext, pvs[0].ID)
			assert.NoError(t, err)
			assert.Len(t, pfs, 1)
			assert.Equal(t, fmt.Sprintf("%s-%s.%s.zip", vendorName, projectName, packageVersion), pfs[0].Name)
			assert.True(t, pfs[0].IsLead)

			pb, err := packages.GetBlobByID(db.DefaultContext, pfs[0].BlobID)
			assert.NoError(t, err)
			assert.Equal(t, int64(len(content)), pb.Size)

			req = NewRequestWithBody(t, "PUT", uploadURL, bytes.NewReader(content))
			req = AddBasicAuthHeader(req, user.Name)
			MakeRequest(t, req, http.StatusBadRequest)
		})
	})

	t.Run("Download", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeComposer)
		assert.NoError(t, err)
		assert.Len(t, pvs, 1)
		assert.Equal(t, int64(0), pvs[0].DownloadCount)

		pfs, err := packages.GetFilesByVersionID(db.DefaultContext, pvs[0].ID)
		assert.NoError(t, err)
		assert.Len(t, pfs, 1)

		req := NewRequest(t, "GET", fmt.Sprintf("%s/files/%s/%s/%s", url, neturl.PathEscape(packageName), neturl.PathEscape(pvs[0].LowerVersion), neturl.PathEscape(pfs[0].LowerName)))
		req = AddBasicAuthHeader(req, user.Name)
		resp := MakeRequest(t, req, http.StatusOK)

		assert.Equal(t, content, resp.Body.Bytes())

		pvs, err = packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeComposer)
		assert.NoError(t, err)
		assert.Len(t, pvs, 1)
		assert.Equal(t, int64(1), pvs[0].DownloadCount)
	})

	t.Run("SearchService", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		cases := []struct {
			Query           string
			Type            string
			Page            int
			PerPage         int
			ExpectedTotal   int64
			ExpectedResults int
		}{
			{"", "", 0, 0, 1, 1},
			{"", "", 1, 1, 1, 1},
			{"test", "", 1, 0, 0, 0},
			{"gitea", "", 1, 1, 1, 1},
			{"gitea", "", 2, 1, 1, 0},
			{"", packageType, 1, 1, 1, 1},
			{"gitea", packageType, 1, 1, 1, 1},
			{"gitea", "dummy", 1, 1, 0, 0},
		}

		for i, c := range cases {
			req := NewRequest(t, "GET", fmt.Sprintf("%s/search.json?q=%s&type=%s&page=%d&per_page=%d", url, c.Query, c.Type, c.Page, c.PerPage))
			req = AddBasicAuthHeader(req, user.Name)
			resp := MakeRequest(t, req, http.StatusOK)

			var result composer.SearchResultResponse
			DecodeJSON(t, resp, &result)

			assert.Equal(t, c.ExpectedTotal, result.Total, "case %d: unexpected total hits", i)
			assert.Len(t, result.Results, c.ExpectedResults, "case %d: unexpected result count", i)
		}
	})

	t.Run("EnumeratePackages", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "GET", url+"/list.json")
		req = AddBasicAuthHeader(req, user.Name)
		resp := MakeRequest(t, req, http.StatusOK)

		var result map[string][]string
		DecodeJSON(t, resp, &result)

		assert.Contains(t, result, "packageNames")
		names := result["packageNames"]
		assert.Len(t, names, 1)
		assert.Equal(t, packageName, names[0])
	})

	t.Run("PackageMetadata", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "GET", fmt.Sprintf("%s/p2/%s/%s.json", url, vendorName, projectName))
		req = AddBasicAuthHeader(req, user.Name)
		resp := MakeRequest(t, req, http.StatusOK)

		var result composer.PackageMetadataResponse
		DecodeJSON(t, resp, &result)

		assert.Contains(t, result.Packages, packageName)
		pkgs := result.Packages[packageName]
		assert.Len(t, pkgs, 1)
		assert.Equal(t, packageName, pkgs[0].Name)
		assert.Equal(t, packageVersion, pkgs[0].Version)
		assert.Equal(t, packageType, pkgs[0].Type)
		assert.Equal(t, packageDescription, pkgs[0].Description)
		assert.Len(t, pkgs[0].Authors, 1)
		assert.Equal(t, packageAuthor, pkgs[0].Authors[0].Name)
		assert.Equal(t, "zip", pkgs[0].Dist.Type)
		assert.Equal(t, "7b40bfd6da811b2b78deec1e944f156dbb2c747b", pkgs[0].Dist.Checksum)
	})
}
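For reference, the upload route exercised above can also be driven outside the test harness. The following is a minimal stand-alone sketch, not part of this change: giteaURL, owner, username and password are caller-supplied assumptions, while the route and the expected 201 status mirror the Valid subtest.

// composerUpload is a hypothetical client-side sketch of the endpoint used in
// TestPackageComposer. All parameters are assumptions supplied by the caller;
// the route matches the test:
//   PUT {giteaURL}/api/packages/{owner}/composer?version={version}
func composerUpload(giteaURL, owner, username, password, version string, zipData []byte) error {
	endpoint := fmt.Sprintf("%s/api/packages/%s/composer?version=%s", giteaURL, owner, neturl.QueryEscape(version))
	req, err := http.NewRequest("PUT", endpoint, bytes.NewReader(zipData))
	if err != nil {
		return err
	}
	req.SetBasicAuth(username, password)
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusCreated {
		return fmt.Errorf("upload failed with status %d", resp.StatusCode)
	}
	return nil
}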
integrations/api_packages_conan_test.go (new file, 724 lines)
@@ -0,0 +1,724 @@
|
||||||
|
// Copyright 2022 The Gitea Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a MIT-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package integrations
|
||||||
|
|
||||||
|
import (
|
||||||
|
"fmt"
|
||||||
|
"net/http"
|
||||||
|
stdurl "net/url"
|
||||||
|
"strings"
|
||||||
|
"testing"
|
||||||
|
"time"
|
||||||
|
|
||||||
|
"code.gitea.io/gitea/models/db"
|
||||||
|
"code.gitea.io/gitea/models/packages"
|
||||||
|
conan_model "code.gitea.io/gitea/models/packages/conan"
|
||||||
|
"code.gitea.io/gitea/models/unittest"
|
||||||
|
user_model "code.gitea.io/gitea/models/user"
|
||||||
|
conan_module "code.gitea.io/gitea/modules/packages/conan"
|
||||||
|
"code.gitea.io/gitea/modules/setting"
|
||||||
|
conan_router "code.gitea.io/gitea/routers/api/packages/conan"
|
||||||
|
|
||||||
|
"github.com/stretchr/testify/assert"
|
||||||
|
)
|
||||||
|
|
||||||
|
const (
|
||||||
|
conanfileName = "conanfile.py"
|
||||||
|
conaninfoName = "conaninfo.txt"
|
||||||
|
|
||||||
|
conanLicense = "MIT"
|
||||||
|
conanAuthor = "Gitea <info@gitea.io>"
|
||||||
|
conanHomepage = "https://gitea.io/"
|
||||||
|
conanURL = "https://gitea.com/"
|
||||||
|
conanDescription = "Description of ConanPackage"
|
||||||
|
conanTopic = "gitea"
|
||||||
|
|
||||||
|
conanPackageReference = "dummyreference"
|
||||||
|
|
||||||
|
contentConaninfo = `[settings]
|
||||||
|
arch=x84_64
|
||||||
|
|
||||||
|
[requires]
|
||||||
|
fmt/7.1.3
|
||||||
|
|
||||||
|
[options]
|
||||||
|
shared=False
|
||||||
|
|
||||||
|
[full_settings]
|
||||||
|
arch=x84_64
|
||||||
|
|
||||||
|
[full_requires]
|
||||||
|
fmt/7.1.3
|
||||||
|
|
||||||
|
[full_options]
|
||||||
|
shared=False
|
||||||
|
|
||||||
|
[recipe_hash]
|
||||||
|
74714915a51073acb548ca1ce29afbac
|
||||||
|
|
||||||
|
[env]
|
||||||
|
CC=gcc-10`
|
||||||
|
)
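// The constants above are the fixtures shared by the v1 and v2 subtests: the
// conanfile.py/conaninfo.txt file names, the recipe metadata the Conan parser
// is expected to extract, a dummy package reference, and a minimal
// conaninfo.txt body (settings/requires/options/recipe_hash/env sections)
// filled with placeholder values.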
|
||||||
|
|
||||||
|
func addTokenAuthHeader(request *http.Request, token string) *http.Request {
|
||||||
|
request.Header.Set("Authorization", token)
|
||||||
|
return request
|
||||||
|
}
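// conanAuthToken is a hypothetical convenience helper (a sketch, not used by
// the tests below). It performs the same authentication dance the v1 and v2
// subtests repeat inline: basic auth against /vX/users/authenticate, whose
// plain-text response body is then replayed as a bearer token.
func conanAuthToken(t *testing.T, baseURL, apiVersion, username string) string {
	req := NewRequest(t, "GET", fmt.Sprintf("%s/%s/users/authenticate", baseURL, apiVersion))
	req = AddBasicAuthHeader(req, username)
	resp := MakeRequest(t, req, http.StatusOK)
	return "Bearer " + resp.Body.String()
}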
|
||||||
|
|
||||||
|
func buildConanfileContent(name, version string) string {
|
||||||
|
return `from conans import ConanFile, CMake, tools
|
||||||
|
|
||||||
|
class ConanPackageConan(ConanFile):
|
||||||
|
name = "` + name + `"
|
||||||
|
version = "` + version + `"
|
||||||
|
license = "` + conanLicense + `"
|
||||||
|
author = "` + conanAuthor + `"
|
||||||
|
homepage = "` + conanHomepage + `"
|
||||||
|
url = "` + conanURL + `"
|
||||||
|
description = "` + conanDescription + `"
|
||||||
|
topics = ("` + conanTopic + `")
|
||||||
|
settings = "os", "compiler", "build_type", "arch"
|
||||||
|
options = {"shared": [True, False], "fPIC": [True, False]}
|
||||||
|
default_options = {"shared": False, "fPIC": True}
|
||||||
|
generators = "cmake"`
|
||||||
|
}
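// uploadConanPackageV1 walks the Conan v1 upload protocol as a client would:
// it first checks that the recipe and package do not exist yet (404s), asks
// the server for upload URLs via POST .../upload_urls (unauthenticated
// requests are rejected with 401), and finally PUTs conanfile.py and
// conaninfo.txt to the returned URLs, expecting 201 Created.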
|
||||||
|
|
||||||
|
func uploadConanPackageV1(t *testing.T, baseURL, token, name, version, user, channel string) {
|
||||||
|
contentConanfile := buildConanfileContent(name, version)
|
||||||
|
|
||||||
|
recipeURL := fmt.Sprintf("%s/v1/conans/%s/%s/%s/%s", baseURL, name, version, user, channel)
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", recipeURL)
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusNotFound)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%s/digest", recipeURL))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusNotFound)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%s/download_urls", recipeURL))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusNotFound)
|
||||||
|
|
||||||
|
req = NewRequest(t, "POST", fmt.Sprintf("%s/upload_urls", recipeURL))
|
||||||
|
MakeRequest(t, req, http.StatusUnauthorized)
|
||||||
|
|
||||||
|
req = NewRequestWithJSON(t, "POST", fmt.Sprintf("%s/upload_urls", recipeURL), map[string]int64{
|
||||||
|
conanfileName: int64(len(contentConanfile)),
|
||||||
|
"removed.txt": 0,
|
||||||
|
})
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
uploadURLs := make(map[string]string)
|
||||||
|
DecodeJSON(t, resp, &uploadURLs)
|
||||||
|
|
||||||
|
assert.Contains(t, uploadURLs, conanfileName)
|
||||||
|
assert.NotContains(t, uploadURLs, "removed.txt")
|
||||||
|
|
||||||
|
uploadURL := uploadURLs[conanfileName]
|
||||||
|
assert.NotEmpty(t, uploadURL)
|
||||||
|
|
||||||
|
req = NewRequestWithBody(t, "PUT", uploadURL, strings.NewReader(contentConanfile))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusCreated)
|
||||||
|
|
||||||
|
packageURL := fmt.Sprintf("%s/packages/%s", recipeURL, conanPackageReference)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", packageURL)
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusNotFound)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%s/digest", packageURL))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusNotFound)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%s/download_urls", packageURL))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusNotFound)
|
||||||
|
|
||||||
|
req = NewRequest(t, "POST", fmt.Sprintf("%s/upload_urls", packageURL))
|
||||||
|
MakeRequest(t, req, http.StatusUnauthorized)
|
||||||
|
|
||||||
|
req = NewRequestWithJSON(t, "POST", fmt.Sprintf("%s/upload_urls", packageURL), map[string]int64{
|
||||||
|
conaninfoName: int64(len(contentConaninfo)),
|
||||||
|
"removed.txt": 0,
|
||||||
|
})
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
resp = MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
uploadURLs = make(map[string]string)
|
||||||
|
DecodeJSON(t, resp, &uploadURLs)
|
||||||
|
|
||||||
|
assert.Contains(t, uploadURLs, conaninfoName)
|
||||||
|
assert.NotContains(t, uploadURLs, "removed.txt")
|
||||||
|
|
||||||
|
uploadURL = uploadURLs[conaninfoName]
|
||||||
|
assert.NotEmpty(t, uploadURL)
|
||||||
|
|
||||||
|
req = NewRequestWithBody(t, "PUT", uploadURL, strings.NewReader(contentConaninfo))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusCreated)
|
||||||
|
}
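// uploadConanPackageV2 covers the revision-based v2 layout: files are PUT
// directly below .../revisions/{revision}/files/{name} for both the recipe
// and the package reference, and the corresponding .../files listings are
// checked to contain exactly the uploaded file.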
|
||||||
|
|
||||||
|
func uploadConanPackageV2(t *testing.T, baseURL, token, name, version, user, channel, recipeRevision, packageRevision string) {
|
||||||
|
contentConanfile := buildConanfileContent(name, version)
|
||||||
|
|
||||||
|
recipeURL := fmt.Sprintf("%s/v2/conans/%s/%s/%s/%s/revisions/%s", baseURL, name, version, user, channel, recipeRevision)
|
||||||
|
|
||||||
|
req := NewRequestWithBody(t, "PUT", fmt.Sprintf("%s/files/%s", recipeURL, conanfileName), strings.NewReader(contentConanfile))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusCreated)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%s/files", recipeURL))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
var list *struct {
|
||||||
|
Files map[string]interface{} `json:"files"`
|
||||||
|
}
|
||||||
|
DecodeJSON(t, resp, &list)
|
||||||
|
assert.Len(t, list.Files, 1)
|
||||||
|
assert.Contains(t, list.Files, conanfileName)
|
||||||
|
|
||||||
|
packageURL := fmt.Sprintf("%s/packages/%s/revisions/%s", recipeURL, conanPackageReference, packageRevision)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%s/files", packageURL))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusNotFound)
|
||||||
|
|
||||||
|
req = NewRequestWithBody(t, "PUT", fmt.Sprintf("%s/files/%s", packageURL, conaninfoName), strings.NewReader(contentConaninfo))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusCreated)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%s/files", packageURL))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
resp = MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
list = nil
|
||||||
|
DecodeJSON(t, resp, &list)
|
||||||
|
assert.Len(t, list.Files, 1)
|
||||||
|
assert.Contains(t, list.Files, conaninfoName)
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestPackageConan(t *testing.T) {
|
||||||
|
defer prepareTestEnv(t)()
|
||||||
|
user := unittest.AssertExistsAndLoadBean(t, &user_model.User{ID: 2}).(*user_model.User)
|
||||||
|
|
||||||
|
name := "ConanPackage"
|
||||||
|
version1 := "1.2"
|
||||||
|
version2 := "1.3"
|
||||||
|
user1 := "dummy"
|
||||||
|
user2 := "gitea"
|
||||||
|
channel1 := "test"
|
||||||
|
channel2 := "final"
|
||||||
|
revision1 := "rev1"
|
||||||
|
revision2 := "rev2"
|
||||||
|
|
||||||
|
url := fmt.Sprintf("%sapi/packages/%s/conan", setting.AppURL, user.Name)
|
||||||
|
|
||||||
|
t.Run("v1", func(t *testing.T) {
|
||||||
|
t.Run("Ping", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%s/v1/ping", url))
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
assert.Equal(t, "revisions", resp.Header().Get("X-Conan-Server-Capabilities"))
|
||||||
|
})
|
||||||
|
|
||||||
|
token := ""
|
||||||
|
|
||||||
|
t.Run("Authenticate", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%s/v1/users/authenticate", url))
|
||||||
|
req = AddBasicAuthHeader(req, user.Name)
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
body := resp.Body.String()
|
||||||
|
assert.NotEmpty(t, body)
|
||||||
|
|
||||||
|
token = fmt.Sprintf("Bearer %s", body)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("CheckCredentials", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%s/v1/users/check_credentials", url))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusOK)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("Upload", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
uploadConanPackageV1(t, url, token, name, version1, user1, channel1)
|
||||||
|
|
||||||
|
t.Run("Validate", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeConan)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Len(t, pvs, 1)
|
||||||
|
|
||||||
|
pd, err := packages.GetPackageDescriptor(db.DefaultContext, pvs[0])
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.NotNil(t, pd.SemVer)
|
||||||
|
assert.Equal(t, name, pd.Package.Name)
|
||||||
|
assert.Equal(t, version1, pd.Version.Version)
|
||||||
|
assert.IsType(t, &conan_module.Metadata{}, pd.Metadata)
|
||||||
|
metadata := pd.Metadata.(*conan_module.Metadata)
|
||||||
|
assert.Equal(t, conanLicense, metadata.License)
|
||||||
|
assert.Equal(t, conanAuthor, metadata.Author)
|
||||||
|
assert.Equal(t, conanHomepage, metadata.ProjectURL)
|
||||||
|
assert.Equal(t, conanURL, metadata.RepositoryURL)
|
||||||
|
assert.Equal(t, conanDescription, metadata.Description)
|
||||||
|
assert.Equal(t, []string{conanTopic}, metadata.Keywords)
|
||||||
|
|
||||||
|
pfs, err := packages.GetFilesByVersionID(db.DefaultContext, pvs[0].ID)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Len(t, pfs, 2)
|
||||||
|
|
||||||
|
for _, pf := range pfs {
|
||||||
|
pb, err := packages.GetBlobByID(db.DefaultContext, pf.BlobID)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
|
||||||
|
if pf.Name == conanfileName {
|
||||||
|
assert.True(t, pf.IsLead)
|
||||||
|
|
||||||
|
assert.Equal(t, int64(len(buildConanfileContent(name, version1))), pb.Size)
|
||||||
|
} else if pf.Name == conaninfoName {
|
||||||
|
assert.False(t, pf.IsLead)
|
||||||
|
|
||||||
|
assert.Equal(t, int64(len(contentConaninfo)), pb.Size)
|
||||||
|
} else {
|
||||||
|
assert.Fail(t, "unknown file: %s", pf.Name)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("Download", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
recipeURL := fmt.Sprintf("%s/v1/conans/%s/%s/%s/%s", url, name, version1, user1, channel1)
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", recipeURL)
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
fileHashes := make(map[string]string)
|
||||||
|
DecodeJSON(t, resp, &fileHashes)
|
||||||
|
assert.Len(t, fileHashes, 1)
|
||||||
|
assert.Contains(t, fileHashes, conanfileName)
|
||||||
|
assert.Equal(t, "7abc52241c22090782c54731371847a8", fileHashes[conanfileName])
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%s/digest", recipeURL))
|
||||||
|
resp = MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
downloadURLs := make(map[string]string)
|
||||||
|
DecodeJSON(t, resp, &downloadURLs)
|
||||||
|
assert.Contains(t, downloadURLs, conanfileName)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%s/download_urls", recipeURL))
|
||||||
|
resp = MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
DecodeJSON(t, resp, &downloadURLs)
|
||||||
|
assert.Contains(t, downloadURLs, conanfileName)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", downloadURLs[conanfileName])
|
||||||
|
resp = MakeRequest(t, req, http.StatusOK)
|
||||||
|
assert.Equal(t, buildConanfileContent(name, version1), resp.Body.String())
|
||||||
|
|
||||||
|
packageURL := fmt.Sprintf("%s/packages/%s", recipeURL, conanPackageReference)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", packageURL)
|
||||||
|
resp = MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
fileHashes = make(map[string]string)
|
||||||
|
DecodeJSON(t, resp, &fileHashes)
|
||||||
|
assert.Len(t, fileHashes, 1)
|
||||||
|
assert.Contains(t, fileHashes, conaninfoName)
|
||||||
|
assert.Equal(t, "7628bfcc5b17f1470c468621a78df394", fileHashes[conaninfoName])
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%s/digest", packageURL))
|
||||||
|
resp = MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
downloadURLs = make(map[string]string)
|
||||||
|
DecodeJSON(t, resp, &downloadURLs)
|
||||||
|
assert.Contains(t, downloadURLs, conaninfoName)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%s/download_urls", packageURL))
|
||||||
|
resp = MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
DecodeJSON(t, resp, &downloadURLs)
|
||||||
|
assert.Contains(t, downloadURLs, conaninfoName)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", downloadURLs[conaninfoName])
|
||||||
|
resp = MakeRequest(t, req, http.StatusOK)
|
||||||
|
assert.Equal(t, contentConaninfo, resp.Body.String())
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("Search", func(t *testing.T) {
|
||||||
|
uploadConanPackageV1(t, url, token, name, version2, user1, channel1)
|
||||||
|
uploadConanPackageV1(t, url, token, name, version1, user1, channel2)
|
||||||
|
uploadConanPackageV1(t, url, token, name, version1, user2, channel1)
|
||||||
|
uploadConanPackageV1(t, url, token, name, version1, user2, channel2)
|
||||||
|
|
||||||
|
t.Run("Recipe", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
cases := []struct {
|
||||||
|
Query string
|
||||||
|
Expected []string
|
||||||
|
}{
|
||||||
|
{"ConanPackage", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.3@dummy/test", "ConanPackage/1.2@dummy/final", "ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"ConanPackage/1.2", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.2@dummy/final", "ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"ConanPackage/1.1", []string{}},
|
||||||
|
{"Conan*", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.3@dummy/test", "ConanPackage/1.2@dummy/final", "ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"ConanPackage/", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.3@dummy/test", "ConanPackage/1.2@dummy/final", "ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"ConanPackage/*", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.3@dummy/test", "ConanPackage/1.2@dummy/final", "ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"ConanPackage/1*", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.3@dummy/test", "ConanPackage/1.2@dummy/final", "ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"ConanPackage/*2", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.2@dummy/final", "ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"ConanPackage/1*2", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.2@dummy/final", "ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"ConanPackage/1.2@", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.2@dummy/final", "ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"ConanPackage/1.2@du*", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.2@dummy/final"}},
|
||||||
|
{"ConanPackage/1.2@du*/", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.2@dummy/final"}},
|
||||||
|
{"ConanPackage/1.2@du*/*test", []string{"ConanPackage/1.2@dummy/test"}},
|
||||||
|
{"ConanPackage/1.2@du*/*st", []string{"ConanPackage/1.2@dummy/test"}},
|
||||||
|
{"ConanPackage/1.2@gitea/*", []string{"ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"*/*@dummy", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.3@dummy/test", "ConanPackage/1.2@dummy/final"}},
|
||||||
|
{"*/*@*/final", []string{"ConanPackage/1.2@dummy/final", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
}
|
||||||
|
|
||||||
|
for i, c := range cases {
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%s/v1/conans/search?q=%s", url, stdurl.QueryEscape(c.Query)))
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
var result *conan_router.SearchResult
|
||||||
|
DecodeJSON(t, resp, &result)
|
||||||
|
|
||||||
|
assert.ElementsMatch(t, c.Expected, result.Results, "case %d: unexpected result", i)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("Package", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%s/v1/conans/%s/%s/%s/%s/search", url, name, version1, user1, channel2))
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
var result map[string]*conan_module.Conaninfo
|
||||||
|
DecodeJSON(t, resp, &result)
|
||||||
|
|
||||||
|
assert.Contains(t, result, conanPackageReference)
|
||||||
|
info := result[conanPackageReference]
|
||||||
|
assert.NotEmpty(t, info.Settings)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("Delete", func(t *testing.T) {
|
||||||
|
t.Run("Package", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
cases := []struct {
|
||||||
|
Channel string
|
||||||
|
References []string
|
||||||
|
}{
|
||||||
|
{channel1, []string{conanPackageReference}},
|
||||||
|
{channel2, []string{}},
|
||||||
|
}
|
||||||
|
|
||||||
|
for i, c := range cases {
|
||||||
|
rref, _ := conan_module.NewRecipeReference(name, version1, user1, c.Channel, conan_module.DefaultRevision)
|
||||||
|
references, err := conan_model.GetPackageReferences(db.DefaultContext, user.ID, rref)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.NotEmpty(t, references)
|
||||||
|
|
||||||
|
req := NewRequestWithJSON(t, "POST", fmt.Sprintf("%s/v1/conans/%s/%s/%s/%s/packages/delete", url, name, version1, user1, c.Channel), map[string][]string{
|
||||||
|
"package_ids": c.References,
|
||||||
|
})
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
references, err = conan_model.GetPackageReferences(db.DefaultContext, user.ID, rref)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Empty(t, references, "case %d: should be empty", i)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("Recipe", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
cases := []struct {
|
||||||
|
Channel string
|
||||||
|
}{
|
||||||
|
{channel1},
|
||||||
|
{channel2},
|
||||||
|
}
|
||||||
|
|
||||||
|
for i, c := range cases {
|
||||||
|
rref, _ := conan_module.NewRecipeReference(name, version1, user1, c.Channel, conan_module.DefaultRevision)
|
||||||
|
revisions, err := conan_model.GetRecipeRevisions(db.DefaultContext, user.ID, rref)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.NotEmpty(t, revisions)
|
||||||
|
|
||||||
|
req := NewRequest(t, "DELETE", fmt.Sprintf("%s/v1/conans/%s/%s/%s/%s", url, name, version1, user1, c.Channel))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
revisions, err = conan_model.GetRecipeRevisions(db.DefaultContext, user.ID, rref)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Empty(t, revisions, "case %d: should be empty", i)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("v2", func(t *testing.T) {
|
||||||
|
t.Run("Ping", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%s/v2/ping", url))
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
assert.Equal(t, "revisions", resp.Header().Get("X-Conan-Server-Capabilities"))
|
||||||
|
})
|
||||||
|
|
||||||
|
token := ""
|
||||||
|
|
||||||
|
t.Run("Authenticate", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%s/v2/users/authenticate", url))
|
||||||
|
req = AddBasicAuthHeader(req, user.Name)
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
body := resp.Body.String()
|
||||||
|
assert.NotEmpty(t, body)
|
||||||
|
|
||||||
|
token = fmt.Sprintf("Bearer %s", body)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("CheckCredentials", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%s/v2/users/check_credentials", url))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusOK)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("Upload", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
uploadConanPackageV2(t, url, token, name, version1, user1, channel1, revision1, revision1)
|
||||||
|
|
||||||
|
t.Run("Validate", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeConan)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Len(t, pvs, 2)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("Latest", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
recipeURL := fmt.Sprintf("%s/v2/conans/%s/%s/%s/%s", url, name, version1, user1, channel1)
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%s/latest", recipeURL))
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
obj := make(map[string]string)
|
||||||
|
DecodeJSON(t, resp, &obj)
|
||||||
|
assert.Contains(t, obj, "revision")
|
||||||
|
assert.Equal(t, revision1, obj["revision"])
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%s/revisions/%s/packages/%s/latest", recipeURL, revision1, conanPackageReference))
|
||||||
|
resp = MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
obj = make(map[string]string)
|
||||||
|
DecodeJSON(t, resp, &obj)
|
||||||
|
assert.Contains(t, obj, "revision")
|
||||||
|
assert.Equal(t, revision1, obj["revision"])
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("ListRevisions", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
uploadConanPackageV2(t, url, token, name, version1, user1, channel1, revision1, revision2)
|
||||||
|
uploadConanPackageV2(t, url, token, name, version1, user1, channel1, revision2, revision1)
|
||||||
|
uploadConanPackageV2(t, url, token, name, version1, user1, channel1, revision2, revision2)
|
||||||
|
|
||||||
|
recipeURL := fmt.Sprintf("%s/v2/conans/%s/%s/%s/%s/revisions", url, name, version1, user1, channel1)
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", recipeURL)
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
type RevisionInfo struct {
|
||||||
|
Revision string `json:"revision"`
|
||||||
|
Time time.Time `json:"time"`
|
||||||
|
}
|
||||||
|
|
||||||
|
type RevisionList struct {
|
||||||
|
Revisions []*RevisionInfo `json:"revisions"`
|
||||||
|
}
|
||||||
|
|
||||||
|
var list *RevisionList
|
||||||
|
DecodeJSON(t, resp, &list)
|
||||||
|
assert.Len(t, list.Revisions, 2)
|
||||||
|
revs := make([]string, 0, len(list.Revisions))
|
||||||
|
for _, rev := range list.Revisions {
|
||||||
|
revs = append(revs, rev.Revision)
|
||||||
|
}
|
||||||
|
assert.ElementsMatch(t, []string{revision1, revision2}, revs)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%s/%s/packages/%s/revisions", recipeURL, revision1, conanPackageReference))
|
||||||
|
resp = MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
DecodeJSON(t, resp, &list)
|
||||||
|
assert.Len(t, list.Revisions, 2)
|
||||||
|
revs = make([]string, 0, len(list.Revisions))
|
||||||
|
for _, rev := range list.Revisions {
|
||||||
|
revs = append(revs, rev.Revision)
|
||||||
|
}
|
||||||
|
assert.ElementsMatch(t, []string{revision1, revision2}, revs)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("Search", func(t *testing.T) {
|
||||||
|
t.Run("Recipe", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
cases := []struct {
|
||||||
|
Query string
|
||||||
|
Expected []string
|
||||||
|
}{
|
||||||
|
{"ConanPackage", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.3@dummy/test", "ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"ConanPackage/1.2", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"ConanPackage/1.1", []string{}},
|
||||||
|
{"Conan*", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.3@dummy/test", "ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"ConanPackage/", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.3@dummy/test", "ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"ConanPackage/*", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.3@dummy/test", "ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"ConanPackage/1*", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.3@dummy/test", "ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"ConanPackage/*2", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"ConanPackage/1*2", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"ConanPackage/1.2@", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"ConanPackage/1.2@du*", []string{"ConanPackage/1.2@dummy/test"}},
|
||||||
|
{"ConanPackage/1.2@du*/", []string{"ConanPackage/1.2@dummy/test"}},
|
||||||
|
{"ConanPackage/1.2@du*/*test", []string{"ConanPackage/1.2@dummy/test"}},
|
||||||
|
{"ConanPackage/1.2@du*/*st", []string{"ConanPackage/1.2@dummy/test"}},
|
||||||
|
{"ConanPackage/1.2@gitea/*", []string{"ConanPackage/1.2@gitea/test", "ConanPackage/1.2@gitea/final"}},
|
||||||
|
{"*/*@dummy", []string{"ConanPackage/1.2@dummy/test", "ConanPackage/1.3@dummy/test"}},
|
||||||
|
{"*/*@*/final", []string{"ConanPackage/1.2@gitea/final"}},
|
||||||
|
}
|
||||||
|
|
||||||
|
for i, c := range cases {
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%s/v2/conans/search?q=%s", url, stdurl.QueryEscape(c.Query)))
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
var result *conan_router.SearchResult
|
||||||
|
DecodeJSON(t, resp, &result)
|
||||||
|
|
||||||
|
assert.ElementsMatch(t, c.Expected, result.Results, "case %d: unexpected result", i)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("Package", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%s/v2/conans/%s/%s/%s/%s/search", url, name, version1, user1, channel1))
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
var result map[string]*conan_module.Conaninfo
|
||||||
|
DecodeJSON(t, resp, &result)
|
||||||
|
|
||||||
|
assert.Contains(t, result, conanPackageReference)
|
||||||
|
info := result[conanPackageReference]
|
||||||
|
assert.NotEmpty(t, info.Settings)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%s/v2/conans/%s/%s/%s/%s/revisions/%s/search", url, name, version1, user1, channel1, revision1))
|
||||||
|
resp = MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
result = make(map[string]*conan_module.Conaninfo)
|
||||||
|
DecodeJSON(t, resp, &result)
|
||||||
|
|
||||||
|
assert.Contains(t, result, conanPackageReference)
|
||||||
|
info = result[conanPackageReference]
|
||||||
|
assert.NotEmpty(t, info.Settings)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("Delete", func(t *testing.T) {
|
||||||
|
t.Run("Package", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
rref, _ := conan_module.NewRecipeReference(name, version1, user1, channel1, revision1)
|
||||||
|
pref, _ := conan_module.NewPackageReference(rref, conanPackageReference, conan_module.DefaultRevision)
|
||||||
|
|
||||||
|
checkPackageRevisionCount := func(count int) {
|
||||||
|
revisions, err := conan_model.GetPackageRevisions(db.DefaultContext, user.ID, pref)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Len(t, revisions, count)
|
||||||
|
}
|
||||||
|
checkPackageReferenceCount := func(count int) {
|
||||||
|
references, err := conan_model.GetPackageReferences(db.DefaultContext, user.ID, rref)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Len(t, references, count)
|
||||||
|
}
|
||||||
|
|
||||||
|
checkPackageRevisionCount(2)
|
||||||
|
|
||||||
|
req := NewRequest(t, "DELETE", fmt.Sprintf("%s/v2/conans/%s/%s/%s/%s/revisions/%s/packages/%s/revisions/%s", url, name, version1, user1, channel1, revision1, conanPackageReference, revision1))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
checkPackageRevisionCount(1)
|
||||||
|
|
||||||
|
req = NewRequest(t, "DELETE", fmt.Sprintf("%s/v2/conans/%s/%s/%s/%s/revisions/%s/packages/%s", url, name, version1, user1, channel1, revision1, conanPackageReference))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
checkPackageRevisionCount(0)
|
||||||
|
|
||||||
|
rref = rref.WithRevision(revision2)
|
||||||
|
|
||||||
|
checkPackageReferenceCount(1)
|
||||||
|
|
||||||
|
req = NewRequest(t, "DELETE", fmt.Sprintf("%s/v2/conans/%s/%s/%s/%s/revisions/%s/packages", url, name, version1, user1, channel1, revision2))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
checkPackageReferenceCount(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("Recipe", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
rref, _ := conan_module.NewRecipeReference(name, version1, user1, channel1, conan_module.DefaultRevision)
|
||||||
|
|
||||||
|
checkRecipeRevisionCount := func(count int) {
|
||||||
|
revisions, err := conan_model.GetRecipeRevisions(db.DefaultContext, user.ID, rref)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Len(t, revisions, count)
|
||||||
|
}
|
||||||
|
|
||||||
|
checkRecipeRevisionCount(2)
|
||||||
|
|
||||||
|
req := NewRequest(t, "DELETE", fmt.Sprintf("%s/v2/conans/%s/%s/%s/%s/revisions/%s", url, name, version1, user1, channel1, revision1))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
checkRecipeRevisionCount(1)
|
||||||
|
|
||||||
|
req = NewRequest(t, "DELETE", fmt.Sprintf("%s/v2/conans/%s/%s/%s/%s", url, name, version1, user1, channel1))
|
||||||
|
req = addTokenAuthHeader(req, token)
|
||||||
|
MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
checkRecipeRevisionCount(0)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
})
|
||||||
|
}
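The wildcard queries above are the main behaviour the recipe search endpoint has to get right. For illustration, the request the Recipe subtests issue inline can be factored into a helper along these lines; this is a sketch only, not part of the test file.

// conanRecipeSearch is a hypothetical helper mirroring the inline search
// requests in TestPackageConan: it queries the v1 search endpoint with an
// escaped pattern and returns the decoded result.
func conanRecipeSearch(t *testing.T, baseURL, pattern string) *conan_router.SearchResult {
	req := NewRequest(t, "GET", fmt.Sprintf("%s/v1/conans/search?q=%s", baseURL, stdurl.QueryEscape(pattern)))
	resp := MakeRequest(t, req, http.StatusOK)

	var result *conan_router.SearchResult
	DecodeJSON(t, resp, &result)
	return result
}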
integrations/api_packages_container_test.go (new file, 534 lines)
@@ -0,0 +1,534 @@
|
||||||
|
// Copyright 2022 The Gitea Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a MIT-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package integrations
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bytes"
|
||||||
|
"encoding/base64"
|
||||||
|
"fmt"
|
||||||
|
"net/http"
|
||||||
|
"strings"
|
||||||
|
"testing"
|
||||||
|
|
||||||
|
"code.gitea.io/gitea/models/db"
|
||||||
|
packages_model "code.gitea.io/gitea/models/packages"
|
||||||
|
container_model "code.gitea.io/gitea/models/packages/container"
|
||||||
|
"code.gitea.io/gitea/models/unittest"
|
||||||
|
user_model "code.gitea.io/gitea/models/user"
|
||||||
|
container_module "code.gitea.io/gitea/modules/packages/container"
|
||||||
|
"code.gitea.io/gitea/modules/packages/container/oci"
|
||||||
|
"code.gitea.io/gitea/modules/setting"
|
||||||
|
|
||||||
|
"github.com/stretchr/testify/assert"
|
||||||
|
)
|
||||||
|
|
||||||
|
func TestPackageContainer(t *testing.T) {
|
||||||
|
defer prepareTestEnv(t)()
|
||||||
|
user := unittest.AssertExistsAndLoadBean(t, &user_model.User{ID: 2}).(*user_model.User)
|
||||||
|
|
||||||
|
has := func(l packages_model.PackagePropertyList, name string) bool {
|
||||||
|
for _, pp := range l {
|
||||||
|
if pp.Name == name {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
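// has reports whether a package property with the given name is present; the
// subtests use it to check container_module.PropertyManifestTagged on tagged
// versus untagged manifest versions.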
|
||||||
|
|
||||||
|
images := []string{"test", "te/st"}
|
||||||
|
tags := []string{"latest", "main"}
|
||||||
|
multiTag := "multi"
|
||||||
|
|
||||||
|
unknownDigest := "sha256:0000000000000000000000000000000000000000000000000000000000000000"
|
||||||
|
|
||||||
|
blobDigest := "sha256:a3ed95caeb02ffe68cdd9fd84406680ae93d633cb16422d00e8a7c22955b46d4"
|
||||||
|
blobContent, _ := base64.StdEncoding.DecodeString(`H4sIAAAJbogA/2IYBaNgFIxYAAgAAP//Lq+17wAEAAA=`)
|
||||||
|
|
||||||
|
configDigest := "sha256:4607e093bec406eaadb6f3a340f63400c9d3a7038680744c406903766b938f0d"
|
||||||
|
configContent := `{"architecture":"amd64","config":{"Env":["PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"],"Cmd":["/true"],"ArgsEscaped":true,"Image":"sha256:9bd8b88dc68b80cffe126cc820e4b52c6e558eb3b37680bfee8e5f3ed7b8c257"},"container":"b89fe92a887d55c0961f02bdfbfd8ac3ddf66167db374770d2d9e9fab3311510","container_config":{"Hostname":"b89fe92a887d","Env":["PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"],"Cmd":["/bin/sh","-c","#(nop) ","CMD [\"/true\"]"],"ArgsEscaped":true,"Image":"sha256:9bd8b88dc68b80cffe126cc820e4b52c6e558eb3b37680bfee8e5f3ed7b8c257"},"created":"2022-01-01T00:00:00.000000000Z","docker_version":"20.10.12","history":[{"created":"2022-01-01T00:00:00.000000000Z","created_by":"/bin/sh -c #(nop) COPY file:0e7589b0c800daaf6fa460d2677101e4676dd9491980210cb345480e513f3602 in /true "},{"created":"2022-01-01T00:00:00.000000001Z","created_by":"/bin/sh -c #(nop) CMD [\"/true\"]","empty_layer":true}],"os":"linux","rootfs":{"type":"layers","diff_ids":["sha256:0ff3b91bdf21ecdf2f2f3d4372c2098a14dbe06cd678e8f0a85fd4902d00e2e2"]}}`
|
||||||
|
|
||||||
|
manifestDigest := "sha256:4f10484d1c1bb13e3956b4de1cd42db8e0f14a75be1617b60f2de3cd59c803c6"
|
||||||
|
manifestContent := `{"schemaVersion":2,"mediaType":"` + oci.MediaTypeDockerManifest + `","config":{"mediaType":"application/vnd.docker.container.image.v1+json","digest":"sha256:4607e093bec406eaadb6f3a340f63400c9d3a7038680744c406903766b938f0d","size":1069},"layers":[{"mediaType":"application/vnd.docker.image.rootfs.diff.tar.gzip","digest":"sha256:a3ed95caeb02ffe68cdd9fd84406680ae93d633cb16422d00e8a7c22955b46d4","size":32}]}`
|
||||||
|
|
||||||
|
untaggedManifestDigest := "sha256:4305f5f5572b9a426b88909b036e52ee3cf3d7b9c1b01fac840e90747f56623d"
|
||||||
|
untaggedManifestContent := `{"schemaVersion":2,"mediaType":"` + oci.MediaTypeImageManifest + `","config":{"mediaType":"application/vnd.docker.container.image.v1+json","digest":"sha256:4607e093bec406eaadb6f3a340f63400c9d3a7038680744c406903766b938f0d","size":1069},"layers":[{"mediaType":"application/vnd.docker.image.rootfs.diff.tar.gzip","digest":"sha256:a3ed95caeb02ffe68cdd9fd84406680ae93d633cb16422d00e8a7c22955b46d4","size":32}]}`
|
||||||
|
|
||||||
|
indexManifestDigest := "sha256:bab112d6efb9e7f221995caaaa880352feb5bd8b1faf52fae8d12c113aa123ec"
|
||||||
|
indexManifestContent := `{"schemaVersion":2,"mediaType":"` + oci.MediaTypeImageIndex + `","manifests":[{"mediaType":"` + oci.MediaTypeDockerManifest + `","digest":"` + manifestDigest + `","platform":{"os":"linux","architecture":"arm","variant":"v7"}},{"mediaType":"` + oci.MediaTypeImageManifest + `","digest":"` + untaggedManifestDigest + `","platform":{"os":"linux","architecture":"arm64","variant":"v8"}}]}`
|
||||||
|
|
||||||
|
anonymousToken := ""
|
||||||
|
userToken := ""
|
||||||
|
|
||||||
|
t.Run("Authenticate", func(t *testing.T) {
|
||||||
|
type TokenResponse struct {
|
||||||
|
Token string `json:"token"`
|
||||||
|
}
|
||||||
|
|
||||||
|
authenticate := []string{
|
||||||
|
`Bearer realm="` + setting.AppURL + `v2/token"`,
|
||||||
|
`Basic`,
|
||||||
|
}
|
||||||
|
|
||||||
|
t.Run("Anonymous", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%sv2", setting.AppURL))
|
||||||
|
resp := MakeRequest(t, req, http.StatusUnauthorized)
|
||||||
|
|
||||||
|
assert.ElementsMatch(t, authenticate, resp.Header().Values("WWW-Authenticate"))
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%sv2/token", setting.AppURL))
|
||||||
|
resp = MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
tokenResponse := &TokenResponse{}
|
||||||
|
DecodeJSON(t, resp, &tokenResponse)
|
||||||
|
|
||||||
|
assert.NotEmpty(t, tokenResponse.Token)
|
||||||
|
|
||||||
|
anonymousToken = fmt.Sprintf("Bearer %s", tokenResponse.Token)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%sv2", setting.AppURL))
|
||||||
|
addTokenAuthHeader(req, anonymousToken)
|
||||||
|
resp = MakeRequest(t, req, http.StatusOK)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("User", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%sv2", setting.AppURL))
|
||||||
|
resp := MakeRequest(t, req, http.StatusUnauthorized)
|
||||||
|
|
||||||
|
assert.ElementsMatch(t, authenticate, resp.Header().Values("WWW-Authenticate"))
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%sv2/token", setting.AppURL))
|
||||||
|
req = AddBasicAuthHeader(req, user.Name)
|
||||||
|
resp = MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
tokenResponse := &TokenResponse{}
|
||||||
|
DecodeJSON(t, resp, &tokenResponse)
|
||||||
|
|
||||||
|
assert.NotEmpty(t, tokenResponse.Token)
|
||||||
|
|
||||||
|
userToken = fmt.Sprintf("Bearer %s", tokenResponse.Token)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%sv2", setting.AppURL))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
resp = MakeRequest(t, req, http.StatusOK)
|
||||||
|
})
|
||||||
|
})
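// Both tokens follow the registry token flow: an unauthenticated request to
// /v2 answers 401 with WWW-Authenticate challenges, /v2/token returns
// {"token": "..."} (anonymously or via basic auth), and the token is replayed
// as "Authorization: Bearer ..." on subsequent /v2 requests.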
|
||||||
|
|
||||||
|
t.Run("DetermineSupport", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%sv2", setting.AppURL))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
assert.Equal(t, "registry/2.0", resp.Header().Get("Docker-Distribution-Api-Version"))
|
||||||
|
})
|
||||||
|
|
||||||
|
for _, image := range images {
|
||||||
|
t.Run(fmt.Sprintf("[Image:%s]", image), func(t *testing.T) {
|
||||||
|
url := fmt.Sprintf("%sv2/%s/%s", setting.AppURL, user.Name, image)
|
||||||
|
|
||||||
|
t.Run("UploadBlob/Monolithic", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "POST", fmt.Sprintf("%s/blobs/uploads", url))
|
||||||
|
addTokenAuthHeader(req, anonymousToken)
|
||||||
|
MakeRequest(t, req, http.StatusUnauthorized)
|
||||||
|
|
||||||
|
req = NewRequestWithBody(t, "POST", fmt.Sprintf("%s/blobs/uploads?digest=%s", url, unknownDigest), bytes.NewReader(blobContent))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
MakeRequest(t, req, http.StatusBadRequest)
|
||||||
|
|
||||||
|
req = NewRequestWithBody(t, "POST", fmt.Sprintf("%s/blobs/uploads?digest=%s", url, blobDigest), bytes.NewReader(blobContent))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
resp := MakeRequest(t, req, http.StatusCreated)
|
||||||
|
|
||||||
|
assert.Equal(t, fmt.Sprintf("/v2/%s/%s/blobs/%s", user.Name, image, blobDigest), resp.Header().Get("Location"))
|
||||||
|
assert.Equal(t, blobDigest, resp.Header().Get("Docker-Content-Digest"))
|
||||||
|
|
||||||
|
pv, err := packages_model.GetInternalVersionByNameAndVersion(db.DefaultContext, user.ID, packages_model.TypeContainer, image, container_model.UploadVersion)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
|
||||||
|
pfs, err := packages_model.GetFilesByVersionID(db.DefaultContext, pv.ID)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Len(t, pfs, 1)
|
||||||
|
|
||||||
|
pb, err := packages_model.GetBlobByID(db.DefaultContext, pfs[0].BlobID)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.EqualValues(t, len(blobContent), pb.Size)
|
||||||
|
})
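// Monolithic upload: a single POST to /blobs/uploads?digest=... with the blob
// as body stores the blob immediately (201 Created); a digest that does not
// match the body is rejected with 400, and a token without write access is
// rejected with 401.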
|
||||||
|
|
||||||
|
t.Run("UploadBlob/Chunked", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "POST", fmt.Sprintf("%s/blobs/uploads", url))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
resp := MakeRequest(t, req, http.StatusAccepted)
|
||||||
|
|
||||||
|
uuid := resp.Header().Get("Docker-Upload-Uuid")
|
||||||
|
assert.NotEmpty(t, uuid)
|
||||||
|
|
||||||
|
pbu, err := packages_model.GetBlobUploadByID(db.DefaultContext, uuid)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.EqualValues(t, 0, pbu.BytesReceived)
|
||||||
|
|
||||||
|
uploadURL := resp.Header().Get("Location")
|
||||||
|
assert.NotEmpty(t, uploadURL)
|
||||||
|
|
||||||
|
req = NewRequestWithBody(t, "PATCH", setting.AppURL+uploadURL[1:]+"000", bytes.NewReader(blobContent))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
MakeRequest(t, req, http.StatusNotFound)
|
||||||
|
|
||||||
|
req = NewRequestWithBody(t, "PATCH", setting.AppURL+uploadURL[1:], bytes.NewReader(blobContent))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
|
||||||
|
req.Header.Set("Content-Range", "1-10")
|
||||||
|
MakeRequest(t, req, http.StatusRequestedRangeNotSatisfiable)
|
||||||
|
|
||||||
|
contentRange := fmt.Sprintf("0-%d", len(blobContent)-1)
|
||||||
|
req.Header.Set("Content-Range", contentRange)
|
||||||
|
resp = MakeRequest(t, req, http.StatusAccepted)
|
||||||
|
|
||||||
|
assert.Equal(t, uuid, resp.Header().Get("Docker-Upload-Uuid"))
|
||||||
|
assert.Equal(t, contentRange, resp.Header().Get("Range"))
|
||||||
|
|
||||||
|
pbu, err = packages_model.GetBlobUploadByID(db.DefaultContext, uuid)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.EqualValues(t, len(blobContent), pbu.BytesReceived)
|
||||||
|
|
||||||
|
uploadURL = resp.Header().Get("Location")
|
||||||
|
|
||||||
|
req = NewRequest(t, "PUT", fmt.Sprintf("%s?digest=%s", setting.AppURL+uploadURL[1:], blobDigest))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
resp = MakeRequest(t, req, http.StatusCreated)
|
||||||
|
|
||||||
|
assert.Equal(t, fmt.Sprintf("/v2/%s/%s/blobs/%s", user.Name, image, blobDigest), resp.Header().Get("Location"))
|
||||||
|
assert.Equal(t, blobDigest, resp.Header().Get("Docker-Content-Digest"))
|
||||||
|
})
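// Chunked upload: POST /blobs/uploads opens a session (202 Accepted with a
// Docker-Upload-Uuid and a Location), each PATCH to that Location appends a
// chunk (an out-of-range Content-Range yields 416), and a final
// PUT ...?digest=... seals the blob (201 Created with Docker-Content-Digest).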
|
||||||
|
|
||||||
|
for _, tag := range tags {
|
||||||
|
t.Run(fmt.Sprintf("[Tag:%s]", tag), func(t *testing.T) {
|
||||||
|
t.Run("UploadManifest", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequestWithBody(t, "POST", fmt.Sprintf("%s/blobs/uploads?digest=%s", url, configDigest), strings.NewReader(configContent))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
MakeRequest(t, req, http.StatusCreated)
|
||||||
|
|
||||||
|
req = NewRequestWithBody(t, "PUT", fmt.Sprintf("%s/manifests/%s", url, tag), strings.NewReader(manifestContent))
|
||||||
|
addTokenAuthHeader(req, anonymousToken)
|
||||||
|
req.Header.Set("Content-Type", oci.MediaTypeDockerManifest)
|
||||||
|
MakeRequest(t, req, http.StatusUnauthorized)
|
||||||
|
|
||||||
|
req = NewRequestWithBody(t, "PUT", fmt.Sprintf("%s/manifests/%s", url, tag), strings.NewReader(manifestContent))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
req.Header.Set("Content-Type", oci.MediaTypeDockerManifest)
|
||||||
|
resp := MakeRequest(t, req, http.StatusCreated)
|
||||||
|
|
||||||
|
assert.Equal(t, manifestDigest, resp.Header().Get("Docker-Content-Digest"))
|
||||||
|
|
||||||
|
pv, err := packages_model.GetVersionByNameAndVersion(db.DefaultContext, user.ID, packages_model.TypeContainer, image, tag)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
|
||||||
|
pd, err := packages_model.GetPackageDescriptor(db.DefaultContext, pv)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Nil(t, pd.SemVer)
|
||||||
|
assert.Equal(t, image, pd.Package.Name)
|
||||||
|
assert.Equal(t, tag, pd.Version.Version)
|
||||||
|
assert.True(t, has(pd.Properties, container_module.PropertyManifestTagged))
|
||||||
|
|
||||||
|
assert.IsType(t, &container_module.Metadata{}, pd.Metadata)
|
||||||
|
metadata := pd.Metadata.(*container_module.Metadata)
|
||||||
|
assert.Equal(t, container_module.TypeOCI, metadata.Type)
|
||||||
|
assert.Len(t, metadata.ImageLayers, 2)
|
||||||
|
assert.Empty(t, metadata.MultiArch)
|
||||||
|
|
||||||
|
assert.Len(t, pd.Files, 3)
|
||||||
|
for _, pfd := range pd.Files {
|
||||||
|
switch pfd.File.Name {
|
||||||
|
case container_model.ManifestFilename:
|
||||||
|
assert.True(t, pfd.File.IsLead)
|
||||||
|
assert.Equal(t, oci.MediaTypeDockerManifest, pfd.Properties.GetByName(container_module.PropertyMediaType))
|
||||||
|
assert.Equal(t, manifestDigest, pfd.Properties.GetByName(container_module.PropertyDigest))
|
||||||
|
case strings.Replace(configDigest, ":", "_", 1):
|
||||||
|
assert.False(t, pfd.File.IsLead)
|
||||||
|
assert.Equal(t, "application/vnd.docker.container.image.v1+json", pfd.Properties.GetByName(container_module.PropertyMediaType))
|
||||||
|
assert.Equal(t, configDigest, pfd.Properties.GetByName(container_module.PropertyDigest))
|
||||||
|
case strings.Replace(blobDigest, ":", "_", 1):
|
||||||
|
assert.False(t, pfd.File.IsLead)
|
||||||
|
assert.Equal(t, "application/vnd.docker.image.rootfs.diff.tar.gzip", pfd.Properties.GetByName(container_module.PropertyMediaType))
|
||||||
|
assert.Equal(t, blobDigest, pfd.Properties.GetByName(container_module.PropertyDigest))
|
||||||
|
default:
|
||||||
|
assert.Fail(t, "unknown file: %s", pfd.File.Name)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Overwrite existing tag
|
||||||
|
req = NewRequestWithBody(t, "PUT", fmt.Sprintf("%s/manifests/%s", url, tag), strings.NewReader(manifestContent))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
req.Header.Set("Content-Type", oci.MediaTypeDockerManifest)
|
||||||
|
MakeRequest(t, req, http.StatusCreated)
|
||||||
|
})
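// Manifest upload: PUT /manifests/{tag} with the manifest media type as
// Content-Type creates the package version for that tag (201 Created with
// Docker-Content-Digest); anonymous tokens may not push (401), and pushing
// the same tag again simply overwrites it.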
|
||||||
|
|
||||||
|
t.Run("HeadManifest", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "HEAD", fmt.Sprintf("%s/manifests/unknown-tag", url))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
MakeRequest(t, req, http.StatusNotFound)
|
||||||
|
|
||||||
|
req = NewRequest(t, "HEAD", fmt.Sprintf("%s/manifests/%s", url, tag))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
assert.Equal(t, fmt.Sprintf("%d", len(manifestContent)), resp.Header().Get("Content-Length"))
|
||||||
|
assert.Equal(t, manifestDigest, resp.Header().Get("Docker-Content-Digest"))
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("GetManifest", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%s/manifests/unknown-tag", url))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
MakeRequest(t, req, http.StatusNotFound)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%s/manifests/%s", url, tag))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
assert.Equal(t, fmt.Sprintf("%d", len(manifestContent)), resp.Header().Get("Content-Length"))
|
||||||
|
assert.Equal(t, oci.MediaTypeDockerManifest, resp.Header().Get("Content-Type"))
|
||||||
|
assert.Equal(t, manifestDigest, resp.Header().Get("Docker-Content-Digest"))
|
||||||
|
assert.Equal(t, manifestContent, resp.Body.String())
|
||||||
|
})
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
t.Run("UploadUntaggedManifest", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequestWithBody(t, "PUT", fmt.Sprintf("%s/manifests/%s", url, untaggedManifestDigest), strings.NewReader(untaggedManifestContent))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
req.Header.Set("Content-Type", oci.MediaTypeImageManifest)
|
||||||
|
resp := MakeRequest(t, req, http.StatusCreated)
|
||||||
|
|
||||||
|
assert.Equal(t, untaggedManifestDigest, resp.Header().Get("Docker-Content-Digest"))
|
||||||
|
|
||||||
|
req = NewRequest(t, "HEAD", fmt.Sprintf("%s/manifests/%s", url, untaggedManifestDigest))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
resp = MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
assert.Equal(t, fmt.Sprintf("%d", len(untaggedManifestContent)), resp.Header().Get("Content-Length"))
|
||||||
|
assert.Equal(t, untaggedManifestDigest, resp.Header().Get("Docker-Content-Digest"))
|
||||||
|
|
||||||
|
pv, err := packages_model.GetVersionByNameAndVersion(db.DefaultContext, user.ID, packages_model.TypeContainer, image, untaggedManifestDigest)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
|
||||||
|
pd, err := packages_model.GetPackageDescriptor(db.DefaultContext, pv)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Nil(t, pd.SemVer)
|
||||||
|
assert.Equal(t, image, pd.Package.Name)
|
||||||
|
assert.Equal(t, untaggedManifestDigest, pd.Version.Version)
|
||||||
|
assert.False(t, has(pd.Properties, container_module.PropertyManifestTagged))
|
||||||
|
|
||||||
|
assert.IsType(t, &container_module.Metadata{}, pd.Metadata)
|
||||||
|
|
||||||
|
assert.Len(t, pd.Files, 3)
|
||||||
|
for _, pfd := range pd.Files {
|
||||||
|
if pfd.File.Name == container_model.ManifestFilename {
|
||||||
|
assert.True(t, pfd.File.IsLead)
|
||||||
|
assert.Equal(t, oci.MediaTypeImageManifest, pfd.Properties.GetByName(container_module.PropertyMediaType))
|
||||||
|
assert.Equal(t, untaggedManifestDigest, pfd.Properties.GetByName(container_module.PropertyDigest))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("UploadIndexManifest", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequestWithBody(t, "PUT", fmt.Sprintf("%s/manifests/%s", url, multiTag), strings.NewReader(indexManifestContent))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
req.Header.Set("Content-Type", oci.MediaTypeImageIndex)
|
||||||
|
resp := MakeRequest(t, req, http.StatusCreated)
|
||||||
|
|
||||||
|
assert.Equal(t, indexManifestDigest, resp.Header().Get("Docker-Content-Digest"))
|
||||||
|
|
||||||
|
pv, err := packages_model.GetVersionByNameAndVersion(db.DefaultContext, user.ID, packages_model.TypeContainer, image, multiTag)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
|
||||||
|
pd, err := packages_model.GetPackageDescriptor(db.DefaultContext, pv)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Nil(t, pd.SemVer)
|
||||||
|
assert.Equal(t, image, pd.Package.Name)
|
||||||
|
assert.Equal(t, multiTag, pd.Version.Version)
|
||||||
|
assert.True(t, has(pd.Properties, container_module.PropertyManifestTagged))
|
||||||
|
|
||||||
|
getAllByName := func(l packages_model.PackagePropertyList, name string) []string {
|
||||||
|
values := make([]string, 0, len(l))
|
||||||
|
for _, pp := range l {
|
||||||
|
if pp.Name == name {
|
||||||
|
values = append(values, pp.Value)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return values
|
||||||
|
}
|
||||||
|
assert.ElementsMatch(t, []string{manifestDigest, untaggedManifestDigest}, getAllByName(pd.Properties, container_module.PropertyManifestReference))
|
||||||
|
|
||||||
|
assert.IsType(t, &container_module.Metadata{}, pd.Metadata)
|
||||||
|
metadata := pd.Metadata.(*container_module.Metadata)
|
||||||
|
assert.Equal(t, container_module.TypeOCI, metadata.Type)
|
||||||
|
assert.Contains(t, metadata.MultiArch, "linux/arm/v7")
|
||||||
|
assert.Equal(t, manifestDigest, metadata.MultiArch["linux/arm/v7"])
|
||||||
|
assert.Contains(t, metadata.MultiArch, "linux/arm64/v8")
|
||||||
|
assert.Equal(t, untaggedManifestDigest, metadata.MultiArch["linux/arm64/v8"])
|
||||||
|
|
||||||
|
assert.Len(t, pd.Files, 1)
|
||||||
|
assert.True(t, pd.Files[0].File.IsLead)
|
||||||
|
assert.Equal(t, oci.MediaTypeImageIndex, pd.Files[0].Properties.GetByName(container_module.PropertyMediaType))
|
||||||
|
assert.Equal(t, indexManifestDigest, pd.Files[0].Properties.GetByName(container_module.PropertyDigest))
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("UploadBlob/Mount", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "POST", fmt.Sprintf("%s/blobs/uploads?mount=%s", url, unknownDigest))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
MakeRequest(t, req, http.StatusAccepted)
|
||||||
|
|
||||||
|
req = NewRequest(t, "POST", fmt.Sprintf("%s/blobs/uploads?mount=%s", url, blobDigest))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
resp := MakeRequest(t, req, http.StatusCreated)
|
||||||
|
|
||||||
|
assert.Equal(t, fmt.Sprintf("/v2/%s/%s/blobs/%s", user.Name, image, blobDigest), resp.Header().Get("Location"))
|
||||||
|
assert.Equal(t, blobDigest, resp.Header().Get("Docker-Content-Digest"))
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("HeadBlob", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "HEAD", fmt.Sprintf("%s/blobs/%s", url, unknownDigest))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
MakeRequest(t, req, http.StatusNotFound)
|
||||||
|
|
||||||
|
req = NewRequest(t, "HEAD", fmt.Sprintf("%s/blobs/%s", url, blobDigest))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
assert.Equal(t, fmt.Sprintf("%d", len(blobContent)), resp.Header().Get("Content-Length"))
|
||||||
|
assert.Equal(t, blobDigest, resp.Header().Get("Docker-Content-Digest"))
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("GetBlob", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%s/blobs/%s", url, unknownDigest))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
MakeRequest(t, req, http.StatusNotFound)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("%s/blobs/%s", url, blobDigest))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
assert.Equal(t, fmt.Sprintf("%d", len(blobContent)), resp.Header().Get("Content-Length"))
|
||||||
|
assert.Equal(t, blobDigest, resp.Header().Get("Docker-Content-Digest"))
|
||||||
|
assert.Equal(t, blobContent, resp.Body.Bytes())
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("GetTagList", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
cases := []struct {
|
||||||
|
URL string
|
||||||
|
ExpectedTags []string
|
||||||
|
ExpectedLink string
|
||||||
|
}{
|
||||||
|
{
|
||||||
|
URL: fmt.Sprintf("%s/tags/list", url),
|
||||||
|
ExpectedTags: []string{"latest", "main", "multi"},
|
||||||
|
ExpectedLink: fmt.Sprintf(`</v2/%s/%s/tags/list?last=multi>; rel="next"`, user.Name, image),
|
||||||
|
},
|
||||||
|
{
|
||||||
|
URL: fmt.Sprintf("%s/tags/list?n=0", url),
|
||||||
|
ExpectedTags: []string{},
|
||||||
|
ExpectedLink: "",
|
||||||
|
},
|
||||||
|
{
|
||||||
|
URL: fmt.Sprintf("%s/tags/list?n=2", url),
|
||||||
|
ExpectedTags: []string{"latest", "main"},
|
||||||
|
ExpectedLink: fmt.Sprintf(`</v2/%s/%s/tags/list?last=main&n=2>; rel="next"`, user.Name, image),
|
||||||
|
},
|
||||||
|
{
|
||||||
|
URL: fmt.Sprintf("%s/tags/list?last=main", url),
|
||||||
|
ExpectedTags: []string{"multi"},
|
||||||
|
ExpectedLink: fmt.Sprintf(`</v2/%s/%s/tags/list?last=multi>; rel="next"`, user.Name, image),
|
||||||
|
},
|
||||||
|
{
|
||||||
|
URL: fmt.Sprintf("%s/tags/list?n=1&last=latest", url),
|
||||||
|
ExpectedTags: []string{"main"},
|
||||||
|
ExpectedLink: fmt.Sprintf(`</v2/%s/%s/tags/list?last=main&n=1>; rel="next"`, user.Name, image),
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
|
for _, c := range cases {
|
||||||
|
req := NewRequest(t, "GET", c.URL)
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
type TagList struct {
|
||||||
|
Name string `json:"name"`
|
||||||
|
Tags []string `json:"tags"`
|
||||||
|
}
|
||||||
|
|
||||||
|
tagList := &TagList{}
|
||||||
|
DecodeJSON(t, resp, &tagList)
|
||||||
|
|
||||||
|
assert.Equal(t, user.Name+"/"+image, tagList.Name)
|
||||||
|
assert.Equal(t, c.ExpectedTags, tagList.Tags)
|
||||||
|
assert.Equal(t, c.ExpectedLink, resp.Header().Get("Link"))
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("Delete", func(t *testing.T) {
|
||||||
|
t.Run("Blob", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "DELETE", fmt.Sprintf("%s/blobs/%s", url, blobDigest))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
MakeRequest(t, req, http.StatusAccepted)
|
||||||
|
|
||||||
|
req = NewRequest(t, "HEAD", fmt.Sprintf("%s/blobs/%s", url, blobDigest))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
MakeRequest(t, req, http.StatusNotFound)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("ManifestByDigest", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "DELETE", fmt.Sprintf("%s/manifests/%s", url, untaggedManifestDigest))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
MakeRequest(t, req, http.StatusAccepted)
|
||||||
|
|
||||||
|
req = NewRequest(t, "HEAD", fmt.Sprintf("%s/manifests/%s", url, untaggedManifestDigest))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
MakeRequest(t, req, http.StatusNotFound)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("ManifestByTag", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "DELETE", fmt.Sprintf("%s/manifests/%s", url, multiTag))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
MakeRequest(t, req, http.StatusAccepted)
|
||||||
|
|
||||||
|
req = NewRequest(t, "HEAD", fmt.Sprintf("%s/manifests/%s", url, multiTag))
|
||||||
|
addTokenAuthHeader(req, userToken)
|
||||||
|
MakeRequest(t, req, http.StatusNotFound)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
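Note (reviewer): the has(...) assertions in this test rely on a small property-lookup helper declared earlier in the file, outside this hunk. A minimal sketch of its shape, inferred only from the call sites here and using the file's existing packages_model import; the actual helper in the PR may differ:

// Sketch only, inferred from the call sites above; not copied from the PR.
// Reports whether the property list contains an entry with the given name.
func has(l packages_model.PackagePropertyList, name string) bool {
	for _, pp := range l {
		if pp.Name == name {
			return true
		}
	}
	return false
}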

integrations/api_packages_generic_test.go (new file, 109 lines)
@@ -0,0 +1,109 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package integrations

import (
	"bytes"
	"fmt"
	"net/http"
	"testing"

	"code.gitea.io/gitea/models/db"
	"code.gitea.io/gitea/models/packages"
	"code.gitea.io/gitea/models/unittest"
	user_model "code.gitea.io/gitea/models/user"

	"github.com/stretchr/testify/assert"
)

func TestPackageGeneric(t *testing.T) {
	defer prepareTestEnv(t)()
	user := unittest.AssertExistsAndLoadBean(t, &user_model.User{ID: 2}).(*user_model.User)

	packageName := "te-st_pac.kage"
	packageVersion := "1.0.3"
	filename := "fi-le_na.me"
	content := []byte{1, 2, 3}

	url := fmt.Sprintf("/api/packages/%s/generic/%s/%s/%s", user.Name, packageName, packageVersion, filename)

	t.Run("Upload", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequestWithBody(t, "PUT", url, bytes.NewReader(content))
		AddBasicAuthHeader(req, user.Name)
		MakeRequest(t, req, http.StatusCreated)

		pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeGeneric)
		assert.NoError(t, err)
		assert.Len(t, pvs, 1)

		pd, err := packages.GetPackageDescriptor(db.DefaultContext, pvs[0])
		assert.NoError(t, err)
		assert.NotNil(t, pd.SemVer)
		assert.Nil(t, pd.Metadata)
		assert.Equal(t, packageName, pd.Package.Name)
		assert.Equal(t, packageVersion, pd.Version.Version)

		pfs, err := packages.GetFilesByVersionID(db.DefaultContext, pvs[0].ID)
		assert.NoError(t, err)
		assert.Len(t, pfs, 1)
		assert.Equal(t, filename, pfs[0].Name)
		assert.True(t, pfs[0].IsLead)

		pb, err := packages.GetBlobByID(db.DefaultContext, pfs[0].BlobID)
		assert.NoError(t, err)
		assert.Equal(t, int64(len(content)), pb.Size)
	})

	t.Run("UploadExists", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequestWithBody(t, "PUT", url, bytes.NewReader(content))
		AddBasicAuthHeader(req, user.Name)
		MakeRequest(t, req, http.StatusBadRequest)
	})

	t.Run("Download", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "GET", url)
		resp := MakeRequest(t, req, http.StatusOK)

		assert.Equal(t, content, resp.Body.Bytes())

		pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeGeneric)
		assert.NoError(t, err)
		assert.Len(t, pvs, 1)
		assert.Equal(t, int64(1), pvs[0].DownloadCount)
	})

	t.Run("Delete", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "DELETE", url)
		AddBasicAuthHeader(req, user.Name)
		MakeRequest(t, req, http.StatusOK)

		pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeGeneric)
		assert.NoError(t, err)
		assert.Empty(t, pvs)
	})

	t.Run("DownloadNotExists", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "GET", url)
		MakeRequest(t, req, http.StatusNotFound)
	})

	t.Run("DeleteNotExists", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "DELETE", url)
		AddBasicAuthHeader(req, user.Name)
		MakeRequest(t, req, http.StatusNotFound)
	})
}
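Note (reviewer): AddBasicAuthHeader and addTokenAuthHeader used throughout these tests are pre-existing helpers from the integration test harness and are not part of this diff. An approximate sketch of the shape these tests assume, inferred from the call sites; the constant name and password value below are placeholders, not the real harness code:

package integrations

import "net/http"

const fixturePassword = "password" // placeholder; the harness knows the real fixture password

// Sketch only: sets basic auth for the given fixture user.
func AddBasicAuthHeader(request *http.Request, username string) *http.Request {
	request.SetBasicAuth(username, fixturePassword)
	return request
}

// Sketch only: the token is passed as a complete header value, e.g. "Bearer <jwt>".
func addTokenAuthHeader(request *http.Request, token string) *http.Request {
	request.Header.Set("Authorization", token)
	return request
}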

integrations/api_packages_maven_test.go (new file, 205 lines)
@@ -0,0 +1,205 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package integrations

import (
	"fmt"
	"net/http"
	"strings"
	"testing"

	"code.gitea.io/gitea/models/db"
	"code.gitea.io/gitea/models/packages"
	"code.gitea.io/gitea/models/unittest"
	user_model "code.gitea.io/gitea/models/user"
	"code.gitea.io/gitea/modules/packages/maven"

	"github.com/stretchr/testify/assert"
)

func TestPackageMaven(t *testing.T) {
	defer prepareTestEnv(t)()
	user := unittest.AssertExistsAndLoadBean(t, &user_model.User{ID: 2}).(*user_model.User)

	groupID := "com.gitea"
	artifactID := "test-project"
	packageName := groupID + "-" + artifactID
	packageVersion := "1.0.1"
	packageDescription := "Test Description"

	root := fmt.Sprintf("/api/packages/%s/maven/%s/%s", user.Name, strings.ReplaceAll(groupID, ".", "/"), artifactID)
	filename := fmt.Sprintf("%s-%s.jar", packageName, packageVersion)

	putFile := func(t *testing.T, path, content string, expectedStatus int) {
		req := NewRequestWithBody(t, "PUT", root+path, strings.NewReader(content))
		req = AddBasicAuthHeader(req, user.Name)
		MakeRequest(t, req, expectedStatus)
	}

	t.Run("Upload", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		putFile(t, fmt.Sprintf("/%s/%s", packageVersion, filename), "test", http.StatusCreated)
		putFile(t, "/maven-metadata.xml", "test", http.StatusOK)

		pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeMaven)
		assert.NoError(t, err)
		assert.Len(t, pvs, 1)

		pd, err := packages.GetPackageDescriptor(db.DefaultContext, pvs[0])
		assert.NoError(t, err)
		assert.Nil(t, pd.SemVer)
		assert.Nil(t, pd.Metadata)
		assert.Equal(t, packageName, pd.Package.Name)
		assert.Equal(t, packageVersion, pd.Version.Version)

		pfs, err := packages.GetFilesByVersionID(db.DefaultContext, pvs[0].ID)
		assert.NoError(t, err)
		assert.Len(t, pfs, 1)
		assert.Equal(t, filename, pfs[0].Name)
		assert.False(t, pfs[0].IsLead)

		pb, err := packages.GetBlobByID(db.DefaultContext, pfs[0].BlobID)
		assert.NoError(t, err)
		assert.Equal(t, int64(4), pb.Size)
	})

	t.Run("UploadExists", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		putFile(t, fmt.Sprintf("/%s/%s", packageVersion, filename), "test", http.StatusBadRequest)
	})

	t.Run("Download", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "GET", fmt.Sprintf("%s/%s/%s", root, packageVersion, filename))
		req = AddBasicAuthHeader(req, user.Name)
		resp := MakeRequest(t, req, http.StatusOK)

		assert.Equal(t, []byte("test"), resp.Body.Bytes())

		pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeMaven)
		assert.NoError(t, err)
		assert.Len(t, pvs, 1)
		assert.Equal(t, int64(0), pvs[0].DownloadCount)
	})

	t.Run("UploadVerifySHA1", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		t.Run("Missmatch", func(t *testing.T) {
			defer PrintCurrentTest(t)()

			putFile(t, fmt.Sprintf("/%s/%s.sha1", packageVersion, filename), "test", http.StatusBadRequest)
		})
		t.Run("Valid", func(t *testing.T) {
			defer PrintCurrentTest(t)()

			putFile(t, fmt.Sprintf("/%s/%s.sha1", packageVersion, filename), "a94a8fe5ccb19ba61c4c0873d391e987982fbbd3", http.StatusOK)
		})
	})

	pomContent := `<?xml version="1.0"?>
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<groupId>` + groupID + `</groupId>
	<artifactId>` + artifactID + `</artifactId>
	<version>` + packageVersion + `</version>
	<description>` + packageDescription + `</description>
</project>`

	t.Run("UploadPOM", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeMaven)
		assert.NoError(t, err)
		assert.Len(t, pvs, 1)

		pd, err := packages.GetPackageDescriptor(db.DefaultContext, pvs[0])
		assert.NoError(t, err)
		assert.Nil(t, pd.Metadata)

		putFile(t, fmt.Sprintf("/%s/%s.pom", packageVersion, filename), pomContent, http.StatusCreated)

		pvs, err = packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeMaven)
		assert.NoError(t, err)
		assert.Len(t, pvs, 1)

		pd, err = packages.GetPackageDescriptor(db.DefaultContext, pvs[0])
		assert.NoError(t, err)
		assert.IsType(t, &maven.Metadata{}, pd.Metadata)
		assert.Equal(t, packageDescription, pd.Metadata.(*maven.Metadata).Description)

		pfs, err := packages.GetFilesByVersionID(db.DefaultContext, pvs[0].ID)
		assert.NoError(t, err)
		assert.Len(t, pfs, 2)
		i := 0
		if strings.HasSuffix(pfs[1].Name, ".pom") {
			i = 1
		}
		assert.Equal(t, filename+".pom", pfs[i].Name)
		assert.True(t, pfs[i].IsLead)
	})

	t.Run("DownloadPOM", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "GET", fmt.Sprintf("%s/%s/%s.pom", root, packageVersion, filename))
		req = AddBasicAuthHeader(req, user.Name)
		resp := MakeRequest(t, req, http.StatusOK)

		assert.Equal(t, []byte(pomContent), resp.Body.Bytes())

		pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeMaven)
		assert.NoError(t, err)
		assert.Len(t, pvs, 1)
		assert.Equal(t, int64(1), pvs[0].DownloadCount)
	})

	t.Run("DownloadChecksums", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "GET", fmt.Sprintf("%s/1.2.3/%s", root, filename))
		req = AddBasicAuthHeader(req, user.Name)
		MakeRequest(t, req, http.StatusNotFound)

		for key, checksum := range map[string]string{
			"md5":    "098f6bcd4621d373cade4e832627b4f6",
			"sha1":   "a94a8fe5ccb19ba61c4c0873d391e987982fbbd3",
			"sha256": "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
			"sha512": "ee26b0dd4af7e749aa1a8ee3c10ae9923f618980772e473f8819a5d4940e0db27ac185f8a0e1d5f84f88bc887fd67b143732c304cc5fa9ad8e6f57f50028a8ff",
		} {
			req := NewRequest(t, "GET", fmt.Sprintf("%s/%s/%s.%s", root, packageVersion, filename, key))
			req = AddBasicAuthHeader(req, user.Name)
			resp := MakeRequest(t, req, http.StatusOK)

			assert.Equal(t, checksum, resp.Body.String())
		}
	})

	t.Run("DownloadMetadata", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "GET", root+"/maven-metadata.xml")
		req = AddBasicAuthHeader(req, user.Name)
		resp := MakeRequest(t, req, http.StatusOK)

		expectedMetadata := `<?xml version="1.0" encoding="UTF-8"?>` + "\n<metadata><groupId>com.gitea</groupId><artifactId>test-project</artifactId><versioning><release>1.0.1</release><latest>1.0.1</latest><versions><version>1.0.1</version></versions></versioning></metadata>"
		assert.Equal(t, expectedMetadata, resp.Body.String())

		for key, checksum := range map[string]string{
			"md5":    "6bee0cebaaa686d658adf3e7e16371a0",
			"sha1":   "8696abce499fe84d9ea93e5492abe7147e195b6c",
			"sha256": "3f48322f81c4b2c3bb8649ae1e5c9801476162b520e1c2734ac06b2c06143208",
			"sha512": "cb075aa2e2ef1a83cdc14dd1e08c505b72d633399b39e73a21f00f0deecb39a3e2c79f157c1163f8a3854828750706e0dec3a0f5e4778e91f8ec2cf351a855f2",
		} {
			req := NewRequest(t, "GET", fmt.Sprintf("%s/maven-metadata.xml.%s", root, key))
			req = AddBasicAuthHeader(req, user.Name)
			resp := MakeRequest(t, req, http.StatusOK)

			assert.Equal(t, checksum, resp.Body.String())
		}
	})
}
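Note (reviewer): the UploadVerifySHA1 and DownloadChecksums cases above assert fixed digests for the four-byte upload body "test". The expected values can be reproduced with the standard library; a standalone sketch, not part of the PR:

// Standalone sketch: recomputes the checksum fixtures used by the Maven test above.
package main

import (
	"crypto/md5"
	"crypto/sha1"
	"crypto/sha256"
	"crypto/sha512"
	"encoding/hex"
	"fmt"
)

func main() {
	data := []byte("test")

	m := md5.Sum(data)
	s1 := sha1.Sum(data)
	s256 := sha256.Sum256(data)
	s512 := sha512.Sum512(data)

	fmt.Println("md5:   ", hex.EncodeToString(m[:]))    // 098f6bcd4621d373cade4e832627b4f6
	fmt.Println("sha1:  ", hex.EncodeToString(s1[:]))   // a94a8fe5ccb19ba61c4c0873d391e987982fbbd3
	fmt.Println("sha256:", hex.EncodeToString(s256[:])) // 9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08
	fmt.Println("sha512:", hex.EncodeToString(s512[:])) // matches the sha512 literal in the test above
}

The maven-metadata.xml checksums asserted in DownloadMetadata are derived the same way from the expectedMetadata document.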

integrations/api_packages_npm_test.go (new file, 222 lines)
@@ -0,0 +1,222 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package integrations

import (
	"encoding/base64"
	"fmt"
	"net/http"
	"net/url"
	"strings"
	"testing"

	"code.gitea.io/gitea/models/db"
	"code.gitea.io/gitea/models/packages"
	"code.gitea.io/gitea/models/unittest"
	user_model "code.gitea.io/gitea/models/user"
	"code.gitea.io/gitea/modules/packages/npm"
	"code.gitea.io/gitea/modules/setting"

	"github.com/stretchr/testify/assert"
)

func TestPackageNpm(t *testing.T) {
	defer prepareTestEnv(t)()
	user := unittest.AssertExistsAndLoadBean(t, &user_model.User{ID: 2}).(*user_model.User)

	token := fmt.Sprintf("Bearer %s", getTokenForLoggedInUser(t, loginUser(t, user.Name)))

	packageName := "@scope/test-package"
	packageVersion := "1.0.1-pre"
	packageTag := "latest"
	packageTag2 := "release"
	packageAuthor := "KN4CK3R"
	packageDescription := "Test Description"

data := "H4sIAAAAAAAA/ytITM5OTE/VL4DQelnF+XkMVAYGBgZmJiYK2MRBwNDcSIHB2NTMwNDQzMwAqA7IMDUxA9LUdgg2UFpcklgEdAql5kD8ogCnhwio5lJQUMpLzE1VslJQcihOzi9I1S9JLS7RhSYIJR2QgrLUouLM/DyQGkM9Az1D3YIiqExKanFyUWZBCVQ2BKhVwQVJDKwosbQkI78IJO/tZ+LsbRykxFXLNdA+HwWjYBSMgpENACgAbtAACAAA"
	upload := `{
		"_id": "` + packageName + `",
		"name": "` + packageName + `",
		"description": "` + packageDescription + `",
		"dist-tags": {
			"` + packageTag + `": "` + packageVersion + `"
		},
		"versions": {
			"` + packageVersion + `": {
				"name": "` + packageName + `",
				"version": "` + packageVersion + `",
				"description": "` + packageDescription + `",
				"author": {
					"name": "` + packageAuthor + `"
				},
				"dist": {
					"integrity": "sha512-yA4FJsVhetynGfOC1jFf79BuS+jrHbm0fhh+aHzCQkOaOBXKf9oBnC4a6DnLLnEsHQDRLYd00cwj8sCXpC+wIg==",
					"shasum": "aaa7eaf852a948b0aa05afeda35b1badca155d90"
				}
			}
		},
		"_attachments": {
			"` + packageName + `-` + packageVersion + `.tgz": {
				"data": "` + data + `"
			}
		}
	}`

	root := fmt.Sprintf("/api/packages/%s/npm/%s", user.Name, url.QueryEscape(packageName))
	tagsRoot := fmt.Sprintf("/api/packages/%s/npm/-/package/%s/dist-tags", user.Name, url.QueryEscape(packageName))
	filename := fmt.Sprintf("%s-%s.tgz", strings.Split(packageName, "/")[1], packageVersion)

	t.Run("Upload", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequestWithBody(t, "PUT", root, strings.NewReader(upload))
		req = addTokenAuthHeader(req, token)
		MakeRequest(t, req, http.StatusCreated)

		pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeNpm)
		assert.NoError(t, err)
		assert.Len(t, pvs, 1)

		pd, err := packages.GetPackageDescriptor(db.DefaultContext, pvs[0])
		assert.NoError(t, err)
		assert.NotNil(t, pd.SemVer)
		assert.IsType(t, &npm.Metadata{}, pd.Metadata)
		assert.Equal(t, packageName, pd.Package.Name)
		assert.Equal(t, packageVersion, pd.Version.Version)
		assert.Len(t, pd.Properties, 1)
		assert.Equal(t, npm.TagProperty, pd.Properties[0].Name)
		assert.Equal(t, packageTag, pd.Properties[0].Value)

		pfs, err := packages.GetFilesByVersionID(db.DefaultContext, pvs[0].ID)
		assert.NoError(t, err)
		assert.Len(t, pfs, 1)
		assert.Equal(t, filename, pfs[0].Name)
		assert.True(t, pfs[0].IsLead)

		pb, err := packages.GetBlobByID(db.DefaultContext, pfs[0].BlobID)
		assert.NoError(t, err)
		assert.Equal(t, int64(192), pb.Size)
	})

	t.Run("UploadExists", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequestWithBody(t, "PUT", root, strings.NewReader(upload))
		req = addTokenAuthHeader(req, token)
		MakeRequest(t, req, http.StatusBadRequest)
	})

	t.Run("Download", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "GET", fmt.Sprintf("%s/-/%s/%s", root, packageVersion, filename))
		req = addTokenAuthHeader(req, token)
		resp := MakeRequest(t, req, http.StatusOK)

		b, _ := base64.StdEncoding.DecodeString(data)
		assert.Equal(t, b, resp.Body.Bytes())

		pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeNpm)
		assert.NoError(t, err)
		assert.Len(t, pvs, 1)
		assert.Equal(t, int64(1), pvs[0].DownloadCount)
	})

	t.Run("PackageMetadata", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "GET", fmt.Sprintf("/api/packages/%s/npm/%s", user.Name, "does-not-exist"))
		req = addTokenAuthHeader(req, token)
		MakeRequest(t, req, http.StatusNotFound)

		req = NewRequest(t, "GET", root)
		req = addTokenAuthHeader(req, token)
		resp := MakeRequest(t, req, http.StatusOK)

		var result npm.PackageMetadata
		DecodeJSON(t, resp, &result)

		assert.Equal(t, packageName, result.ID)
		assert.Equal(t, packageName, result.Name)
		assert.Equal(t, packageDescription, result.Description)
		assert.Contains(t, result.DistTags, packageTag)
		assert.Equal(t, packageVersion, result.DistTags[packageTag])
		assert.Equal(t, packageAuthor, result.Author.Name)
		assert.Contains(t, result.Versions, packageVersion)
		pmv := result.Versions[packageVersion]
		assert.Equal(t, fmt.Sprintf("%s@%s", packageName, packageVersion), pmv.ID)
		assert.Equal(t, packageName, pmv.Name)
		assert.Equal(t, packageDescription, pmv.Description)
		assert.Equal(t, packageAuthor, pmv.Author.Name)
		assert.Equal(t, "sha512-yA4FJsVhetynGfOC1jFf79BuS+jrHbm0fhh+aHzCQkOaOBXKf9oBnC4a6DnLLnEsHQDRLYd00cwj8sCXpC+wIg==", pmv.Dist.Integrity)
		assert.Equal(t, "aaa7eaf852a948b0aa05afeda35b1badca155d90", pmv.Dist.Shasum)
		assert.Equal(t, fmt.Sprintf("%s%s/-/%s/%s", setting.AppURL, root[1:], packageVersion, filename), pmv.Dist.Tarball)
	})

	t.Run("AddTag", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		test := func(t *testing.T, status int, tag, version string) {
			req := NewRequestWithBody(t, "PUT", fmt.Sprintf("%s/%s", tagsRoot, tag), strings.NewReader(`"`+version+`"`))
			req = addTokenAuthHeader(req, token)
			MakeRequest(t, req, status)
		}

		test(t, http.StatusBadRequest, "1.0", packageVersion)
		test(t, http.StatusBadRequest, "v1.0", packageVersion)
		test(t, http.StatusNotFound, packageTag2, "1.2")
		test(t, http.StatusOK, packageTag, packageVersion)
		test(t, http.StatusOK, packageTag2, packageVersion)
	})

	t.Run("ListTags", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "GET", tagsRoot)
		req = addTokenAuthHeader(req, token)
		resp := MakeRequest(t, req, http.StatusOK)

		var result map[string]string
		DecodeJSON(t, resp, &result)

		assert.Len(t, result, 2)
		assert.Contains(t, result, packageTag)
		assert.Equal(t, packageVersion, result[packageTag])
		assert.Contains(t, result, packageTag2)
		assert.Equal(t, packageVersion, result[packageTag2])
	})

	t.Run("PackageMetadataDistTags", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "GET", root)
		req = addTokenAuthHeader(req, token)
		resp := MakeRequest(t, req, http.StatusOK)

		var result npm.PackageMetadata
		DecodeJSON(t, resp, &result)

		assert.Len(t, result.DistTags, 2)
		assert.Contains(t, result.DistTags, packageTag)
		assert.Equal(t, packageVersion, result.DistTags[packageTag])
		assert.Contains(t, result.DistTags, packageTag2)
		assert.Equal(t, packageVersion, result.DistTags[packageTag2])
	})

	t.Run("DeleteTag", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		test := func(t *testing.T, status int, tag string) {
			req := NewRequest(t, "DELETE", fmt.Sprintf("%s/%s", tagsRoot, tag))
			req = addTokenAuthHeader(req, token)
			MakeRequest(t, req, status)
		}

		test(t, http.StatusBadRequest, "v1.0")
		test(t, http.StatusBadRequest, "1.0")
		test(t, http.StatusOK, "dummy")
		test(t, http.StatusOK, packageTag2)
	})
}
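Note (reviewer): in the upload fixture, dist.integrity and dist.shasum describe the attached tarball: integrity uses the SRI format ("sha512-" plus the base64 encoded SHA-512 digest) and shasum is the hex SHA-1 digest, which is the usual npm convention. A standalone sketch, not part of the PR, that recomputes both from the base64 "_attachments" payload:

// Standalone sketch: pass the fixture's base64 tarball data as the only argument.
package main

import (
	"crypto/sha1"
	"crypto/sha512"
	"encoding/base64"
	"encoding/hex"
	"fmt"
	"log"
	"os"
)

func main() {
	if len(os.Args) != 2 {
		log.Fatal("usage: integrity-check <base64 tarball data>")
	}

	tarball, err := base64.StdEncoding.DecodeString(os.Args[1])
	if err != nil {
		log.Fatal(err)
	}

	s512 := sha512.Sum512(tarball)
	s1 := sha1.Sum(tarball)

	fmt.Println("integrity:", "sha512-"+base64.StdEncoding.EncodeToString(s512[:]))
	fmt.Println("shasum:   ", hex.EncodeToString(s1[:]))
}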

integrations/api_packages_nuget_test.go (new file, 381 lines)
@@ -0,0 +1,381 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package integrations

import (
	"archive/zip"
	"bytes"
	"encoding/base64"
	"fmt"
	"io"
	"net/http"
	"testing"

	"code.gitea.io/gitea/models/db"
	"code.gitea.io/gitea/models/packages"
	"code.gitea.io/gitea/models/unittest"
	user_model "code.gitea.io/gitea/models/user"
	nuget_module "code.gitea.io/gitea/modules/packages/nuget"
	"code.gitea.io/gitea/modules/setting"
	"code.gitea.io/gitea/routers/api/packages/nuget"

	"github.com/stretchr/testify/assert"
)

func TestPackageNuGet(t *testing.T) {
	defer prepareTestEnv(t)()
	user := unittest.AssertExistsAndLoadBean(t, &user_model.User{ID: 2}).(*user_model.User)

	packageName := "test.package"
	packageVersion := "1.0.3"
	packageAuthors := "KN4CK3R"
	packageDescription := "Gitea Test Package"
	symbolFilename := "test.pdb"
	symbolID := "d910bb6948bd4c6cb40155bcf52c3c94"

	var buf bytes.Buffer
	archive := zip.NewWriter(&buf)
	w, _ := archive.Create("package.nuspec")
	w.Write([]byte(`<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2013/05/nuspec.xsd">
	<metadata>
		<id>` + packageName + `</id>
		<version>` + packageVersion + `</version>
		<authors>` + packageAuthors + `</authors>
		<description>` + packageDescription + `</description>
		<group targetFramework=".NETStandard2.0">
			<dependency id="Microsoft.CSharp" version="4.5.0" />
		</group>
	</metadata>
</package>`))
	archive.Close()
	content := buf.Bytes()

	url := fmt.Sprintf("/api/packages/%s/nuget", user.Name)

	t.Run("ServiceIndex", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "GET", fmt.Sprintf("%s/index.json", url))
		req = AddBasicAuthHeader(req, user.Name)
		resp := MakeRequest(t, req, http.StatusOK)

		var result nuget.ServiceIndexResponse
		DecodeJSON(t, resp, &result)

		assert.Equal(t, "3.0.0", result.Version)
		assert.NotEmpty(t, result.Resources)

		root := setting.AppURL + url[1:]
		for _, r := range result.Resources {
			switch r.Type {
			case "SearchQueryService":
				fallthrough
			case "SearchQueryService/3.0.0-beta":
				fallthrough
			case "SearchQueryService/3.0.0-rc":
				assert.Equal(t, root+"/query", r.ID)
			case "RegistrationsBaseUrl":
				fallthrough
			case "RegistrationsBaseUrl/3.0.0-beta":
				fallthrough
			case "RegistrationsBaseUrl/3.0.0-rc":
				assert.Equal(t, root+"/registration", r.ID)
			case "PackageBaseAddress/3.0.0":
				assert.Equal(t, root+"/package", r.ID)
			case "PackagePublish/2.0.0":
				assert.Equal(t, root, r.ID)
			}
		}
	})

	t.Run("Upload", func(t *testing.T) {
		t.Run("DependencyPackage", func(t *testing.T) {
			defer PrintCurrentTest(t)()

			req := NewRequestWithBody(t, "PUT", url, bytes.NewReader(content))
			req = AddBasicAuthHeader(req, user.Name)
			MakeRequest(t, req, http.StatusCreated)

			pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeNuGet)
			assert.NoError(t, err)
			assert.Len(t, pvs, 1)

			pd, err := packages.GetPackageDescriptor(db.DefaultContext, pvs[0])
			assert.NoError(t, err)
			assert.NotNil(t, pd.SemVer)
			assert.IsType(t, &nuget_module.Metadata{}, pd.Metadata)
			assert.Equal(t, packageName, pd.Package.Name)
			assert.Equal(t, packageVersion, pd.Version.Version)

			pfs, err := packages.GetFilesByVersionID(db.DefaultContext, pvs[0].ID)
			assert.NoError(t, err)
			assert.Len(t, pfs, 1)
			assert.Equal(t, fmt.Sprintf("%s.%s.nupkg", packageName, packageVersion), pfs[0].Name)
			assert.True(t, pfs[0].IsLead)

			pb, err := packages.GetBlobByID(db.DefaultContext, pfs[0].BlobID)
			assert.NoError(t, err)
			assert.Equal(t, int64(len(content)), pb.Size)

			req = NewRequestWithBody(t, "PUT", url, bytes.NewReader(content))
			req = AddBasicAuthHeader(req, user.Name)
			MakeRequest(t, req, http.StatusBadRequest)
		})

		t.Run("SymbolPackage", func(t *testing.T) {
			defer PrintCurrentTest(t)()

			createPackage := func(id, packageType string) io.Reader {
				var buf bytes.Buffer
				archive := zip.NewWriter(&buf)

				w, _ := archive.Create("package.nuspec")
				w.Write([]byte(`<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2013/05/nuspec.xsd">
	<metadata>
		<id>` + id + `</id>
		<version>` + packageVersion + `</version>
		<authors>` + packageAuthors + `</authors>
		<description>` + packageDescription + `</description>
		<packageTypes><packageType name="` + packageType + `" /></packageTypes>
	</metadata>
</package>`))

				w, _ = archive.Create(symbolFilename)
b, _ := base64.StdEncoding.DecodeString(`QlNKQgEAAQAAAAAADAAAAFBEQiB2MS4wAAAAAAAABgB8AAAAWAAAACNQZGIAAAAA1AAAAAgBAAAj
fgAA3AEAAAQAAAAjU3RyaW5ncwAAAADgAQAABAAAACNVUwDkAQAAMAAAACNHVUlEAAAAFAIAACgB
AAAjQmxvYgAAAGm7ENm9SGxMtAFVvPUsPJTF6PbtAAAAAFcVogEJAAAAAQAAAA==`)
				w.Write(b)

				archive.Close()
				return &buf
			}

			req := NewRequestWithBody(t, "PUT", fmt.Sprintf("%s/symbolpackage", url), createPackage("unknown-package", "SymbolsPackage"))
			req = AddBasicAuthHeader(req, user.Name)
			MakeRequest(t, req, http.StatusNotFound)

			req = NewRequestWithBody(t, "PUT", fmt.Sprintf("%s/symbolpackage", url), createPackage(packageName, "DummyPackage"))
			req = AddBasicAuthHeader(req, user.Name)
			MakeRequest(t, req, http.StatusBadRequest)

			req = NewRequestWithBody(t, "PUT", fmt.Sprintf("%s/symbolpackage", url), createPackage(packageName, "SymbolsPackage"))
			req = AddBasicAuthHeader(req, user.Name)
			MakeRequest(t, req, http.StatusCreated)

			pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeNuGet)
			assert.NoError(t, err)
			assert.Len(t, pvs, 1)

			pd, err := packages.GetPackageDescriptor(db.DefaultContext, pvs[0])
			assert.NoError(t, err)
			assert.NotNil(t, pd.SemVer)
			assert.IsType(t, &nuget_module.Metadata{}, pd.Metadata)
			assert.Equal(t, packageName, pd.Package.Name)
			assert.Equal(t, packageVersion, pd.Version.Version)

			pfs, err := packages.GetFilesByVersionID(db.DefaultContext, pvs[0].ID)
			assert.NoError(t, err)
			assert.Len(t, pfs, 3)
			for _, pf := range pfs {
				switch pf.Name {
				case fmt.Sprintf("%s.%s.nupkg", packageName, packageVersion):
				case fmt.Sprintf("%s.%s.snupkg", packageName, packageVersion):
					assert.False(t, pf.IsLead)

					pb, err := packages.GetBlobByID(db.DefaultContext, pf.BlobID)
					assert.NoError(t, err)
					assert.Equal(t, int64(616), pb.Size)
				case symbolFilename:
					assert.False(t, pf.IsLead)

					pb, err := packages.GetBlobByID(db.DefaultContext, pf.BlobID)
					assert.NoError(t, err)
					assert.Equal(t, int64(160), pb.Size)

					pps, err := packages.GetProperties(db.DefaultContext, packages.PropertyTypeFile, pf.ID)
					assert.NoError(t, err)
					assert.Len(t, pps, 1)
					assert.Equal(t, nuget_module.PropertySymbolID, pps[0].Name)
					assert.Equal(t, symbolID, pps[0].Value)
				default:
					assert.Fail(t, "unexpected file: %v", pf.Name)
				}
			}

			req = NewRequestWithBody(t, "PUT", fmt.Sprintf("%s/symbolpackage", url), createPackage(packageName, "SymbolsPackage"))
			req = AddBasicAuthHeader(req, user.Name)
			MakeRequest(t, req, http.StatusBadRequest)
		})
	})

	t.Run("Download", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		checkDownloadCount := func(count int64) {
			pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeNuGet)
			assert.NoError(t, err)
			assert.Len(t, pvs, 1)
			assert.Equal(t, count, pvs[0].DownloadCount)
		}

		checkDownloadCount(0)

		req := NewRequest(t, "GET", fmt.Sprintf("%s/package/%s/%s/%s.%s.nupkg", url, packageName, packageVersion, packageName, packageVersion))
		req = AddBasicAuthHeader(req, user.Name)
		resp := MakeRequest(t, req, http.StatusOK)

		assert.Equal(t, content, resp.Body.Bytes())

		checkDownloadCount(1)

		req = NewRequest(t, "GET", fmt.Sprintf("%s/package/%s/%s/%s.%s.snupkg", url, packageName, packageVersion, packageName, packageVersion))
		req = AddBasicAuthHeader(req, user.Name)
		MakeRequest(t, req, http.StatusOK)

		checkDownloadCount(1)

		t.Run("Symbol", func(t *testing.T) {
			defer PrintCurrentTest(t)()

			req := NewRequest(t, "GET", fmt.Sprintf("%s/symbols/%s/%sFFFFFFFF/gitea.pdb", url, symbolFilename, symbolID))
			MakeRequest(t, req, http.StatusBadRequest)

			req = NewRequest(t, "GET", fmt.Sprintf("%s/symbols/%s/%sFFFFFFFF/%s", url, symbolFilename, "00000000000000000000000000000000", symbolFilename))
			req = AddBasicAuthHeader(req, user.Name)
			MakeRequest(t, req, http.StatusNotFound)

			req = NewRequest(t, "GET", fmt.Sprintf("%s/symbols/%s/%sFFFFFFFF/%s", url, symbolFilename, symbolID, symbolFilename))
			req = AddBasicAuthHeader(req, user.Name)
			MakeRequest(t, req, http.StatusOK)

			checkDownloadCount(1)
		})
	})

	t.Run("SearchService", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		cases := []struct {
			Query           string
			Skip            int
			Take            int
			ExpectedTotal   int64
			ExpectedResults int
		}{
			{"", 0, 0, 1, 1},
			{"", 0, 10, 1, 1},
			{"gitea", 0, 10, 0, 0},
			{"test", 0, 10, 1, 1},
			{"test", 1, 10, 1, 0},
		}

		for i, c := range cases {
			req := NewRequest(t, "GET", fmt.Sprintf("%s/query?q=%s&skip=%d&take=%d", url, c.Query, c.Skip, c.Take))
			req = AddBasicAuthHeader(req, user.Name)
			resp := MakeRequest(t, req, http.StatusOK)

			var result nuget.SearchResultResponse
			DecodeJSON(t, resp, &result)

			assert.Equal(t, c.ExpectedTotal, result.TotalHits, "case %d: unexpected total hits", i)
			assert.Len(t, result.Data, c.ExpectedResults, "case %d: unexpected result count", i)
		}
	})

	t.Run("RegistrationService", func(t *testing.T) {
		indexURL := fmt.Sprintf("%s%s/registration/%s/index.json", setting.AppURL, url[1:], packageName)
		leafURL := fmt.Sprintf("%s%s/registration/%s/%s.json", setting.AppURL, url[1:], packageName, packageVersion)
		contentURL := fmt.Sprintf("%s%s/package/%s/%s/%s.%s.nupkg", setting.AppURL, url[1:], packageName, packageVersion, packageName, packageVersion)

		t.Run("RegistrationIndex", func(t *testing.T) {
			defer PrintCurrentTest(t)()

			req := NewRequest(t, "GET", fmt.Sprintf("%s/registration/%s/index.json", url, packageName))
			req = AddBasicAuthHeader(req, user.Name)
			resp := MakeRequest(t, req, http.StatusOK)

			var result nuget.RegistrationIndexResponse
			DecodeJSON(t, resp, &result)

			assert.Equal(t, indexURL, result.RegistrationIndexURL)
			assert.Equal(t, 1, result.Count)
			assert.Len(t, result.Pages, 1)
			assert.Equal(t, indexURL, result.Pages[0].RegistrationPageURL)
			assert.Equal(t, packageVersion, result.Pages[0].Lower)
			assert.Equal(t, packageVersion, result.Pages[0].Upper)
			assert.Equal(t, 1, result.Pages[0].Count)
			assert.Len(t, result.Pages[0].Items, 1)
			assert.Equal(t, packageName, result.Pages[0].Items[0].CatalogEntry.ID)
			assert.Equal(t, packageVersion, result.Pages[0].Items[0].CatalogEntry.Version)
			assert.Equal(t, packageAuthors, result.Pages[0].Items[0].CatalogEntry.Authors)
			assert.Equal(t, packageDescription, result.Pages[0].Items[0].CatalogEntry.Description)
			assert.Equal(t, leafURL, result.Pages[0].Items[0].CatalogEntry.CatalogLeafURL)
			assert.Equal(t, contentURL, result.Pages[0].Items[0].CatalogEntry.PackageContentURL)
		})

		t.Run("RegistrationLeaf", func(t *testing.T) {
			defer PrintCurrentTest(t)()

			req := NewRequest(t, "GET", fmt.Sprintf("%s/registration/%s/%s.json", url, packageName, packageVersion))
			req = AddBasicAuthHeader(req, user.Name)
			resp := MakeRequest(t, req, http.StatusOK)

			var result nuget.RegistrationLeafResponse
			DecodeJSON(t, resp, &result)

			assert.Equal(t, leafURL, result.RegistrationLeafURL)
			assert.Equal(t, contentURL, result.PackageContentURL)
			assert.Equal(t, indexURL, result.RegistrationIndexURL)
		})
	})

	t.Run("PackageService", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "GET", fmt.Sprintf("%s/package/%s/index.json", url, packageName))
		req = AddBasicAuthHeader(req, user.Name)
		resp := MakeRequest(t, req, http.StatusOK)

		var result nuget.PackageVersionsResponse
		DecodeJSON(t, resp, &result)

		assert.Len(t, result.Versions, 1)
		assert.Equal(t, packageVersion, result.Versions[0])
	})

	t.Run("Delete", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "DELETE", fmt.Sprintf("%s/%s/%s", url, packageName, packageVersion))
		req = AddBasicAuthHeader(req, user.Name)
		MakeRequest(t, req, http.StatusOK)

		pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeNuGet)
		assert.NoError(t, err)
		assert.Empty(t, pvs)
	})

	t.Run("DownloadNotExists", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "GET", fmt.Sprintf("%s/package/%s/%s/%s.%s.nupkg", url, packageName, packageVersion, packageName, packageVersion))
		req = AddBasicAuthHeader(req, user.Name)
		MakeRequest(t, req, http.StatusNotFound)

		req = NewRequest(t, "GET", fmt.Sprintf("%s/package/%s/%s/%s.%s.snupkg", url, packageName, packageVersion, packageName, packageVersion))
		req = AddBasicAuthHeader(req, user.Name)
		MakeRequest(t, req, http.StatusNotFound)
	})

	t.Run("DeleteNotExists", func(t *testing.T) {
		defer PrintCurrentTest(t)()

		req := NewRequest(t, "DELETE", fmt.Sprintf("%s/package/%s/%s", url, packageName, packageVersion))
		req = AddBasicAuthHeader(req, user.Name)
		MakeRequest(t, req, http.StatusNotFound)
	})
}

integrations/api_packages_pypi_test.go (new file, 181 lines)
@@ -0,0 +1,181 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a MIT-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package integrations
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bytes"
|
||||||
|
"fmt"
|
||||||
|
"io"
|
||||||
|
"mime/multipart"
|
||||||
|
"net/http"
|
||||||
|
"regexp"
|
||||||
|
"strings"
|
||||||
|
"testing"
|
||||||
|
|
||||||
|
"code.gitea.io/gitea/models/db"
|
||||||
|
"code.gitea.io/gitea/models/packages"
|
||||||
|
"code.gitea.io/gitea/models/unittest"
|
||||||
|
user_model "code.gitea.io/gitea/models/user"
|
||||||
|
"code.gitea.io/gitea/modules/packages/pypi"
|
||||||
|
|
||||||
|
"github.com/stretchr/testify/assert"
|
||||||
|
)
|
||||||
|
|
||||||
|
func TestPackagePyPI(t *testing.T) {
|
||||||
|
defer prepareTestEnv(t)()
|
||||||
|
user := unittest.AssertExistsAndLoadBean(t, &user_model.User{ID: 2}).(*user_model.User)
|
||||||
|
|
||||||
|
packageName := "test-package"
|
||||||
|
packageVersion := "1.0.1"
|
||||||
|
packageAuthor := "KN4CK3R"
|
||||||
|
packageDescription := "Test Description"
|
||||||
|
|
||||||
|
content := "test"
|
||||||
|
hashSHA256 := "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
|
||||||
|
|
||||||
|
root := fmt.Sprintf("/api/packages/%s/pypi", user.Name)
|
||||||
|
|
||||||
|
uploadFile := func(t *testing.T, filename, content string, expectedStatus int) {
|
||||||
|
body := &bytes.Buffer{}
|
||||||
|
writer := multipart.NewWriter(body)
|
||||||
|
part, _ := writer.CreateFormFile("content", filename)
|
||||||
|
_, _ = io.Copy(part, strings.NewReader(content))
|
||||||
|
|
||||||
|
writer.WriteField("name", packageName)
|
||||||
|
writer.WriteField("version", packageVersion)
|
||||||
|
writer.WriteField("author", packageAuthor)
|
||||||
|
writer.WriteField("summary", packageDescription)
|
||||||
|
writer.WriteField("description", packageDescription)
|
||||||
|
writer.WriteField("sha256_digest", hashSHA256)
|
||||||
|
writer.WriteField("requires_python", "3.6")
|
||||||
|
|
||||||
|
_ = writer.Close()
|
||||||
|
|
||||||
|
req := NewRequestWithBody(t, "POST", root, body)
|
||||||
|
req.Header.Add("Content-Type", writer.FormDataContentType())
|
||||||
|
req = AddBasicAuthHeader(req, user.Name)
|
||||||
|
MakeRequest(t, req, expectedStatus)
|
||||||
|
}
|
||||||
|
|
||||||
|
t.Run("Upload", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
filename := "test.whl"
|
||||||
|
uploadFile(t, filename, content, http.StatusCreated)
|
||||||
|
|
||||||
|
pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypePyPI)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Len(t, pvs, 1)
|
||||||
|
|
||||||
|
pd, err := packages.GetPackageDescriptor(db.DefaultContext, pvs[0])
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.NotNil(t, pd.SemVer)
|
||||||
|
assert.IsType(t, &pypi.Metadata{}, pd.Metadata)
|
||||||
|
assert.Equal(t, packageName, pd.Package.Name)
|
||||||
|
assert.Equal(t, packageVersion, pd.Version.Version)
|
||||||
|
|
||||||
|
pfs, err := packages.GetFilesByVersionID(db.DefaultContext, pvs[0].ID)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Len(t, pfs, 1)
|
||||||
|
assert.Equal(t, filename, pfs[0].Name)
|
||||||
|
assert.True(t, pfs[0].IsLead)
|
||||||
|
|
||||||
|
pb, err := packages.GetBlobByID(db.DefaultContext, pfs[0].BlobID)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Equal(t, int64(4), pb.Size)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("UploadAddFile", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
filename := "test.tar.gz"
|
||||||
|
uploadFile(t, filename, content, http.StatusCreated)
|
||||||
|
|
||||||
|
pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypePyPI)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Len(t, pvs, 1)
|
||||||
|
|
||||||
|
pd, err := packages.GetPackageDescriptor(db.DefaultContext, pvs[0])
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.NotNil(t, pd.SemVer)
|
||||||
|
assert.IsType(t, &pypi.Metadata{}, pd.Metadata)
|
||||||
|
assert.Equal(t, packageName, pd.Package.Name)
|
||||||
|
assert.Equal(t, packageVersion, pd.Version.Version)
|
||||||
|
|
||||||
|
pfs, err := packages.GetFilesByVersionID(db.DefaultContext, pvs[0].ID)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Len(t, pfs, 2)
|
||||||
|
|
||||||
|
pf, err := packages.GetFileForVersionByName(db.DefaultContext, pvs[0].ID, filename, packages.EmptyFileKey)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Equal(t, filename, pf.Name)
|
||||||
|
assert.True(t, pf.IsLead)
|
||||||
|
|
||||||
|
pb, err := packages.GetBlobByID(db.DefaultContext, pf.BlobID)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Equal(t, int64(4), pb.Size)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("UploadHashMismatch", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
filename := "test2.whl"
|
||||||
|
uploadFile(t, filename, "dummy", http.StatusBadRequest)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("UploadExists", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
uploadFile(t, "test.whl", content, http.StatusBadRequest)
|
||||||
|
uploadFile(t, "test.tar.gz", content, http.StatusBadRequest)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("Download", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
downloadFile := func(filename string) {
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%s/files/%s/%s/%s", root, packageName, packageVersion, filename))
|
||||||
|
req = AddBasicAuthHeader(req, user.Name)
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
assert.Equal(t, []byte(content), resp.Body.Bytes())
|
||||||
|
}
|
||||||
|
|
||||||
|
downloadFile("test.whl")
|
||||||
|
downloadFile("test.tar.gz")
|
||||||
|
|
||||||
|
pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypePyPI)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Len(t, pvs, 1)
|
||||||
|
assert.Equal(t, int64(2), pvs[0].DownloadCount)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("PackageMetadata", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%s/simple/%s", root, packageName))
|
||||||
|
req = AddBasicAuthHeader(req, user.Name)
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
htmlDoc := NewHTMLParser(t, resp.Body)
|
||||||
|
nodes := htmlDoc.doc.Find("a").Nodes
|
||||||
|
assert.Len(t, nodes, 2)
|
||||||
|
|
||||||
|
hrefMatcher := regexp.MustCompile(fmt.Sprintf(`%s/files/%s/%s/test\..+#sha256-%s`, root, packageName, packageVersion, hashSHA256))
|
||||||
|
|
||||||
|
for _, a := range nodes {
|
||||||
|
for _, att := range a.Attr {
|
||||||
|
switch att.Key {
|
||||||
|
case "href":
|
||||||
|
assert.Regexp(t, hrefMatcher, att.Val)
|
||||||
|
case "data-requires-python":
|
||||||
|
assert.Equal(t, "3.6", att.Val)
|
||||||
|
default:
|
||||||
|
t.Fail()
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
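Reviewer note: the `#sha256-...` fragment matched by `hrefMatcher` above is just the hex-encoded SHA-256 digest of the uploaded file content (PEP 503 style hash fragment). A minimal sketch of how such a value is derived for a fixture; the `content := "test"` stand-in is an assumption, not a quote of the test constant:

package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

func main() {
	content := "test" // stand-in for the uploaded package payload
	sum := sha256.Sum256([]byte(content))
	hashSHA256 := hex.EncodeToString(sum[:])
	// the simple index links end in "#sha256-<digest>"
	fmt.Printf("#sha256-%s\n", hashSHA256)
}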
|
226
integrations/api_packages_rubygems_test.go
Normal file
@@ -0,0 +1,226 @@
|
||||||
|
// Copyright 2021 The Gitea Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a MIT-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package integrations
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bytes"
|
||||||
|
"encoding/base64"
|
||||||
|
"fmt"
|
||||||
|
"mime/multipart"
|
||||||
|
"net/http"
|
||||||
|
"testing"
|
||||||
|
|
||||||
|
"code.gitea.io/gitea/models/db"
|
||||||
|
"code.gitea.io/gitea/models/packages"
|
||||||
|
"code.gitea.io/gitea/models/unittest"
|
||||||
|
user_model "code.gitea.io/gitea/models/user"
|
||||||
|
"code.gitea.io/gitea/modules/packages/rubygems"
|
||||||
|
|
||||||
|
"github.com/stretchr/testify/assert"
|
||||||
|
)
|
||||||
|
|
||||||
|
func TestPackageRubyGems(t *testing.T) {
|
||||||
|
defer prepareTestEnv(t)()
|
||||||
|
user := unittest.AssertExistsAndLoadBean(t, &user_model.User{ID: 2}).(*user_model.User)
|
||||||
|
|
||||||
|
packageName := "gitea"
|
||||||
|
packageVersion := "1.0.5"
|
||||||
|
packageFilename := "gitea-1.0.5.gem"
|
||||||
|
|
||||||
|
gemContent, _ := base64.StdEncoding.DecodeString(`bWV0YWRhdGEuZ3oAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAADAwMDA0NDQAMDAwMDAw
|
||||||
|
MAAwMDAwMDAwADAwMDAwMDAxMDQxADE0MTEwNzcyMzY2ADAxMzQ0MQAgMAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAB1c3RhcgAwMHdoZWVsAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAd2hlZWwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAwMDAwMDAwADAwMDAw
|
||||||
|
MDAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAf
|
||||||
|
iwgA9vQjYQID1VVNb9QwEL37V5he9pRsmlJAFlQckCoOXAriQIUix5nNmsYf2JOqKwS/nYmz2d3Q
|
||||||
|
qqCCKpFdadfjmfdm5nmcLMv4k9DXm6Wrv4BCcQ5GiPcelF5pJVE7y6w0IHirESS7hhDJJu4I+jhu
|
||||||
|
Mc53Tsd5kZ8y30lcuWAEH2KY7HHtQhQs4+cJkwwuwNdeB6JhtbaNDoLTL1MQsFJrqQnr8jNrJJJH
|
||||||
|
WZTHWfEiK094UYj0zYvp4Z9YAx5sA1ZpSCS3M30zeWwo2bG60FvUBjIKJts2GwMW76r0Yr9NzjN3
|
||||||
|
YhwsGX2Ozl4dpcWwvK9d43PQtDIv9igvHwSyIIwFmXHjqTqxLY8MPkCADmQk80p2EfZ6VbM6/ue6
|
||||||
|
/1D0Bq7/qeA/zh6W82leHmhFWUHn/JbsEfT6q7QbiCpoj8l0QcEUFLmX6kq2wBEiMjBSd+Pwt7T5
|
||||||
|
Ot0kuXYMbkD1KOuOBnWYb7hBsAP4bhlkFRqnqpWefMZ/pHCn6+WIFGq2dgY8EQq+RvRRLJcTyZJ1
|
||||||
|
WhHqGPTu7QdmACXdJFLwb9+ZdxErbSPKrqsMxJhAWCJ1qaqRdtu6yktcT/STsamG0qp7rsa5EL/K
|
||||||
|
MBua30uw4ynzExqYWRJDfx8/kQWN3PwsDh2jYLr1W+pZcAmCs9splvnz/Flesqhbq21bXcGG/OLh
|
||||||
|
+2fv/JTF3hgZyCW9OaZjxoZjdnBGfgKpxZyJ1QYAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAZGF0
|
||||||
|
YS50YXIuZ3oAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAADAwMDA0NDQAMDAwMDAwMAAw
|
||||||
|
MDAwMDAwADAwMDAwMDAwMjQyADE0MTEwNzcyMzY2ADAxMzM2MQAgMAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAB1c3RhcgAwMHdoZWVsAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAd2hlZWwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAwMDAwMDAwADAwMDAwMDAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAfiwgA
|
||||||
|
9vQjYQID7M/NCsMgDABgz32KrA/QxersK/Q17ExXIcyhlr7+HLv1sJ02KPhBCPk5JOyn881nsl2c
|
||||||
|
xI+gRDRaC3zbZ8RBCamlxGHolTFlX11kLwDFH6wp21hO2RYi/rD3bb5/7iCubFOCMbBtABzNkIjn
|
||||||
|
bvGlAnisOUE7EnOALUR2p7b06e6aV4iqqqrquJ4AAAD//wMA+sA/NQAIAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAGNoZWNr
|
||||||
|
c3Vtcy55YW1sLmd6AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAwMDAwNDQ0ADAwMDAwMDAAMDAw
|
||||||
|
MDAwMAAwMDAwMDAwMDQ1MAAxNDExMDc3MjM2NgAwMTQ2MTIAIDAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAdXN0YXIAMDB3aGVlbAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAHdoZWVsAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAMDAwMDAwMAAwMDAwMDAwAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAH4sIAPb0
|
||||||
|
I2ECA2WQOa4UQAxE8znFXGCQ21vbPyMj5wRuL0Qk6EecnmZCyKyy9FSvXq/X4/u3ryj68Xg+f/Zn
|
||||||
|
VHzGlx+/P57qvU4XxWalBKftSXOgCjNYkdRycrC5Axem+W4HqS12PNEv7836jF9vnlHxwSyxKY+y
|
||||||
|
go0cPblyHzkrZ4HF1GSVhe7mOOoasXNk2fnbUxb+19Pp9tobD/QlJKMX7y204PREh6nQ5hG9Alw6
|
||||||
|
x4TnmtA+aekGfm6wAseog2LSgpR4Q7cYnAH3K4qAQa6A6JCC1gpuY7P+9YxE5SZ+j0eVGbaBTwBQ
|
||||||
|
iIqRUyyzLCoFCBdYNWxniapTavD97blXTzFvgoVoAsKBAtlU48cdaOmeZDpwV01OtcGwjscfeUrY
|
||||||
|
B9QBAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
|
||||||
|
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA`)
|
||||||
|
|
||||||
|
root := fmt.Sprintf("/api/packages/%s/rubygems", user.Name)
|
||||||
|
|
||||||
|
uploadFile := func(t *testing.T, expectedStatus int) {
|
||||||
|
req := NewRequestWithBody(t, "POST", fmt.Sprintf("%s/api/v1/gems", root), bytes.NewReader(gemContent))
|
||||||
|
req = AddBasicAuthHeader(req, user.Name)
|
||||||
|
MakeRequest(t, req, expectedStatus)
|
||||||
|
}
|
||||||
|
|
||||||
|
t.Run("Upload", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
uploadFile(t, http.StatusCreated)
|
||||||
|
|
||||||
|
pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeRubyGems)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Len(t, pvs, 1)
|
||||||
|
|
||||||
|
pd, err := packages.GetPackageDescriptor(db.DefaultContext, pvs[0])
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.NotNil(t, pd.SemVer)
|
||||||
|
assert.IsType(t, &rubygems.Metadata{}, pd.Metadata)
|
||||||
|
assert.Equal(t, packageName, pd.Package.Name)
|
||||||
|
assert.Equal(t, packageVersion, pd.Version.Version)
|
||||||
|
|
||||||
|
pfs, err := packages.GetFilesByVersionID(db.DefaultContext, pvs[0].ID)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Len(t, pfs, 1)
|
||||||
|
assert.Equal(t, packageFilename, pfs[0].Name)
|
||||||
|
assert.True(t, pfs[0].IsLead)
|
||||||
|
|
||||||
|
pb, err := packages.GetBlobByID(db.DefaultContext, pfs[0].BlobID)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Equal(t, int64(4608), pb.Size)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("UploadExists", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
uploadFile(t, http.StatusBadRequest)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("Download", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%s/gems/%s", root, packageFilename))
|
||||||
|
req = AddBasicAuthHeader(req, user.Name)
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
assert.Equal(t, gemContent, resp.Body.Bytes())
|
||||||
|
|
||||||
|
pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeRubyGems)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Len(t, pvs, 1)
|
||||||
|
assert.Equal(t, int64(1), pvs[0].DownloadCount)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("DownloadGemspec", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%s/quick/Marshal.4.8/%sspec.rz", root, packageFilename))
|
||||||
|
req = AddBasicAuthHeader(req, user.Name)
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
b, _ := base64.StdEncoding.DecodeString(`eJxi4Si1EndPzbWyCi5ITc5My0xOLMnMz2M8zMIRLeGpxGWsZ6RnzGbF5hqSyempxJWeWZKayGbN
|
||||||
|
EBJqJQjWFZZaVJyZnxfN5qnEZahnoGcKkjTwVBJyB6lUKEhMzk5MTwULGngqcRaVJlWCONEMBp5K
|
||||||
|
DGAWSKc7zFhPJamg0qRK99TcYphehZLU4hKInFhGSUlBsZW+PtgZepn5+iDxECRzDUDGcfh6hoA4
|
||||||
|
gAAAAP//MS06Gw==`)
|
||||||
|
assert.Equal(t, b, resp.Body.Bytes())
|
||||||
|
|
||||||
|
pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeRubyGems)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Len(t, pvs, 1)
|
||||||
|
assert.Equal(t, int64(1), pvs[0].DownloadCount)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("EnumeratePackages", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
enumeratePackages := func(t *testing.T, endpoint string, expectedContent []byte) {
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("%s/%s", root, endpoint))
|
||||||
|
req = AddBasicAuthHeader(req, user.Name)
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
assert.Equal(t, expectedContent, resp.Body.Bytes())
|
||||||
|
}
|
||||||
|
|
||||||
|
b, _ := base64.StdEncoding.DecodeString(`H4sICAAAAAAA/3NwZWNzLjQuOABi4Yhmi+bwVOJKzyxJTWSzYnMNCbUSdE/NtbIKSy0qzszPi2bzVOIy1DPQM2WzZgjxVOIsKk2qBDEBAQAA///xOEYKOwAAAA==`)
|
||||||
|
enumeratePackages(t, "specs.4.8.gz", b)
|
||||||
|
b, _ = base64.StdEncoding.DecodeString(`H4sICAAAAAAA/2xhdGVzdF9zcGVjcy40LjgAYuGIZovm8FTiSs8sSU1ks2JzDQm1EnRPzbWyCkstKs7Mz4tm81TiMtQz0DNls2YI8VTiLCpNqgQxAQEAAP//8ThGCjsAAAA=`)
|
||||||
|
enumeratePackages(t, "latest_specs.4.8.gz", b)
|
||||||
|
b, _ = base64.StdEncoding.DecodeString(`H4sICAAAAAAA/3ByZXJlbGVhc2Vfc3BlY3MuNC44AGLhiGYABAAA//9snXr5BAAAAA==`)
|
||||||
|
enumeratePackages(t, "prerelease_specs.4.8.gz", b)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("Delete", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
body := bytes.Buffer{}
|
||||||
|
writer := multipart.NewWriter(&body)
|
||||||
|
writer.WriteField("gem_name", packageName)
|
||||||
|
writer.WriteField("version", packageVersion)
|
||||||
|
writer.Close()
|
||||||
|
|
||||||
|
req := NewRequestWithBody(t, "DELETE", fmt.Sprintf("%s/api/v1/gems/yank", root), &body)
|
||||||
|
req.Header.Add("Content-Type", writer.FormDataContentType())
|
||||||
|
req = AddBasicAuthHeader(req, user.Name)
|
||||||
|
MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
pvs, err := packages.GetVersionsByPackageType(db.DefaultContext, user.ID, packages.TypeRubyGems)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
assert.Empty(t, pvs)
|
||||||
|
})
|
||||||
|
}
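Reviewer note: the base64 fixture above decodes to a plain tar archive, which is what a `.gem` file is: it bundles `metadata.gz`, `data.tar.gz` and `checksums.yaml.gz` (the entry names are visible inside the encoded data). A hedged sketch, independent of the test, for inspecting such an archive:

package main

import (
	"archive/tar"
	"bytes"
	"fmt"
	"io"
	"log"
)

// listGemEntries prints the top-level entries of a .gem file,
// which is an uncompressed tar archive.
func listGemEntries(gem []byte) error {
	tr := tar.NewReader(bytes.NewReader(gem))
	for {
		hdr, err := tr.Next()
		if err == io.EOF {
			return nil
		}
		if err != nil {
			return err
		}
		fmt.Println(hdr.Name) // e.g. metadata.gz, data.tar.gz, checksums.yaml.gz
	}
}

func main() {
	// gemContent would be the decoded fixture from the test above.
	var gemContent []byte
	if err := listGemEntries(gemContent); err != nil {
		log.Fatal(err)
	}
}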
|
102
integrations/api_packages_test.go
Normal file
@@ -0,0 +1,102 @@
|
||||||
|
// Copyright 2021 The Gitea Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a MIT-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package integrations
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bytes"
|
||||||
|
"fmt"
|
||||||
|
"net/http"
|
||||||
|
"testing"
|
||||||
|
|
||||||
|
"code.gitea.io/gitea/models/packages"
|
||||||
|
"code.gitea.io/gitea/models/unittest"
|
||||||
|
user_model "code.gitea.io/gitea/models/user"
|
||||||
|
api "code.gitea.io/gitea/modules/structs"
|
||||||
|
|
||||||
|
"github.com/stretchr/testify/assert"
|
||||||
|
)
|
||||||
|
|
||||||
|
func TestPackageAPI(t *testing.T) {
|
||||||
|
defer prepareTestEnv(t)()
|
||||||
|
user := unittest.AssertExistsAndLoadBean(t, &user_model.User{ID: 4}).(*user_model.User)
|
||||||
|
session := loginUser(t, user.Name)
|
||||||
|
token := getTokenForLoggedInUser(t, session)
|
||||||
|
|
||||||
|
packageName := "test-package"
|
||||||
|
packageVersion := "1.0.3"
|
||||||
|
filename := "file.bin"
|
||||||
|
|
||||||
|
url := fmt.Sprintf("/api/packages/%s/generic/%s/%s/%s", user.Name, packageName, packageVersion, filename)
|
||||||
|
req := NewRequestWithBody(t, "PUT", url, bytes.NewReader([]byte{}))
|
||||||
|
AddBasicAuthHeader(req, user.Name)
|
||||||
|
MakeRequest(t, req, http.StatusCreated)
|
||||||
|
|
||||||
|
t.Run("ListPackages", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("/api/v1/packages/%s?token=%s", user.Name, token))
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
var apiPackages []*api.Package
|
||||||
|
DecodeJSON(t, resp, &apiPackages)
|
||||||
|
|
||||||
|
assert.Len(t, apiPackages, 1)
|
||||||
|
assert.Equal(t, string(packages.TypeGeneric), apiPackages[0].Type)
|
||||||
|
assert.Equal(t, packageName, apiPackages[0].Name)
|
||||||
|
assert.Equal(t, packageVersion, apiPackages[0].Version)
|
||||||
|
assert.NotNil(t, apiPackages[0].Creator)
|
||||||
|
assert.Equal(t, user.Name, apiPackages[0].Creator.UserName)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("GetPackage", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("/api/v1/packages/%s/dummy/%s/%s?token=%s", user.Name, packageName, packageVersion, token))
|
||||||
|
MakeRequest(t, req, http.StatusNotFound)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("/api/v1/packages/%s/generic/%s/%s?token=%s", user.Name, packageName, packageVersion, token))
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
var p *api.Package
|
||||||
|
DecodeJSON(t, resp, &p)
|
||||||
|
|
||||||
|
assert.Equal(t, string(packages.TypeGeneric), p.Type)
|
||||||
|
assert.Equal(t, packageName, p.Name)
|
||||||
|
assert.Equal(t, packageVersion, p.Version)
|
||||||
|
assert.NotNil(t, p.Creator)
|
||||||
|
assert.Equal(t, user.Name, p.Creator.UserName)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("ListPackageFiles", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "GET", fmt.Sprintf("/api/v1/packages/%s/dummy/%s/%s/files?token=%s", user.Name, packageName, packageVersion, token))
|
||||||
|
MakeRequest(t, req, http.StatusNotFound)
|
||||||
|
|
||||||
|
req = NewRequest(t, "GET", fmt.Sprintf("/api/v1/packages/%s/generic/%s/%s/files?token=%s", user.Name, packageName, packageVersion, token))
|
||||||
|
resp := MakeRequest(t, req, http.StatusOK)
|
||||||
|
|
||||||
|
var files []*api.PackageFile
|
||||||
|
DecodeJSON(t, resp, &files)
|
||||||
|
|
||||||
|
assert.Len(t, files, 1)
|
||||||
|
assert.Equal(t, int64(0), files[0].Size)
|
||||||
|
assert.Equal(t, filename, files[0].Name)
|
||||||
|
assert.Equal(t, "d41d8cd98f00b204e9800998ecf8427e", files[0].HashMD5)
|
||||||
|
assert.Equal(t, "da39a3ee5e6b4b0d3255bfef95601890afd80709", files[0].HashSHA1)
|
||||||
|
assert.Equal(t, "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855", files[0].HashSHA256)
|
||||||
|
assert.Equal(t, "cf83e1357eefb8bdf1542850d66d8007d620e4050b5715dc83f4a921d36ce9ce47d0d13c5d85f2b0ff8318d2877eec2f63b931bd47417a81a538327af927da3e", files[0].HashSHA512)
|
||||||
|
})
|
||||||
|
|
||||||
|
t.Run("DeletePackage", func(t *testing.T) {
|
||||||
|
defer PrintCurrentTest(t)()
|
||||||
|
|
||||||
|
req := NewRequest(t, "DELETE", fmt.Sprintf("/api/v1/packages/%s/dummy/%s/%s?token=%s", user.Name, packageName, packageVersion, token))
|
||||||
|
MakeRequest(t, req, http.StatusNotFound)
|
||||||
|
|
||||||
|
req = NewRequest(t, "DELETE", fmt.Sprintf("/api/v1/packages/%s/generic/%s/%s?token=%s", user.Name, packageName, packageVersion, token))
|
||||||
|
MakeRequest(t, req, http.StatusNoContent)
|
||||||
|
})
|
||||||
|
}
|
|
@@ -58,6 +58,21 @@ func (err ErrUserHasOrgs) Error() string {
	return fmt.Sprintf("user still has membership of organizations [uid: %d]", err.UID)
}
||||||
|
|
||||||
|
// ErrUserOwnPackages notifies that the user (still) owns the packages.
|
||||||
|
type ErrUserOwnPackages struct {
|
||||||
|
UID int64
|
||||||
|
}
|
||||||
|
|
||||||
|
// IsErrUserOwnPackages checks if an error is an ErrUserOwnPackages.
|
||||||
|
func IsErrUserOwnPackages(err error) bool {
|
||||||
|
_, ok := err.(ErrUserOwnPackages)
|
||||||
|
return ok
|
||||||
|
}
|
||||||
|
|
||||||
|
func (err ErrUserOwnPackages) Error() string {
|
||||||
|
return fmt.Sprintf("user still has ownership of packages [uid: %d]", err.UID)
|
||||||
|
}
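ErrUserOwnPackages mirrors ErrUserHasOrgs: it is meant to abort a user/org deletion while packages still belong to that owner. A hypothetical guard inside the delete path; the `packages_model` alias and the surrounding function are assumptions based on the model code later in this diff, not a quote of the actual service code:

	// sketch only: refuse to delete a user who still owns packages
	has, err := packages_model.HasOwnerPackages(ctx, u.ID)
	if err != nil {
		return err
	}
	if has {
		return ErrUserOwnPackages{UID: u.ID}
	}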
|
||||||
|
|
||||||
// __ __.__ __ .__
// / \ / \__| | _|__|
// \ \/\/ / | |/ / |
|
||||||
|
|
|
@@ -378,6 +378,8 @@ var migrations = []Migration{
	// v211 -> v212
	NewMigration("Create ForeignReference table", createForeignReferenceTable),
	// v212 -> v213
	NewMigration("Add package tables", addPackageTables),
}

// GetCurrentDBVersion returns the current db version
94
models/migrations/v212.go
Normal file
@@ -0,0 +1,94 @@
|
||||||
|
// Copyright 2022 The Gitea Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a MIT-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package migrations
|
||||||
|
|
||||||
|
import (
|
||||||
|
"code.gitea.io/gitea/modules/timeutil"
|
||||||
|
|
||||||
|
"xorm.io/xorm"
|
||||||
|
)
|
||||||
|
|
||||||
|
func addPackageTables(x *xorm.Engine) error {
|
||||||
|
type Package struct {
|
||||||
|
ID int64 `xorm:"pk autoincr"`
|
||||||
|
OwnerID int64 `xorm:"UNIQUE(s) INDEX NOT NULL"`
|
||||||
|
RepoID int64 `xorm:"INDEX"`
|
||||||
|
Type string `xorm:"UNIQUE(s) INDEX NOT NULL"`
|
||||||
|
Name string `xorm:"NOT NULL"`
|
||||||
|
LowerName string `xorm:"UNIQUE(s) INDEX NOT NULL"`
|
||||||
|
SemverCompatible bool `xorm:"NOT NULL DEFAULT false"`
|
||||||
|
}
|
||||||
|
|
||||||
|
if err := x.Sync2(new(Package)); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
type PackageVersion struct {
|
||||||
|
ID int64 `xorm:"pk autoincr"`
|
||||||
|
PackageID int64 `xorm:"UNIQUE(s) INDEX NOT NULL"`
|
||||||
|
CreatorID int64 `xorm:"NOT NULL DEFAULT 0"`
|
||||||
|
Version string `xorm:"NOT NULL"`
|
||||||
|
LowerVersion string `xorm:"UNIQUE(s) INDEX NOT NULL"`
|
||||||
|
CreatedUnix timeutil.TimeStamp `xorm:"created INDEX NOT NULL"`
|
||||||
|
IsInternal bool `xorm:"INDEX NOT NULL DEFAULT false"`
|
||||||
|
MetadataJSON string `xorm:"metadata_json TEXT"`
|
||||||
|
DownloadCount int64 `xorm:"NOT NULL DEFAULT 0"`
|
||||||
|
}
|
||||||
|
|
||||||
|
if err := x.Sync2(new(PackageVersion)); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
type PackageProperty struct {
|
||||||
|
ID int64 `xorm:"pk autoincr"`
|
||||||
|
RefType int64 `xorm:"INDEX NOT NULL"`
|
||||||
|
RefID int64 `xorm:"INDEX NOT NULL"`
|
||||||
|
Name string `xorm:"INDEX NOT NULL"`
|
||||||
|
Value string `xorm:"TEXT NOT NULL"`
|
||||||
|
}
|
||||||
|
|
||||||
|
if err := x.Sync2(new(PackageProperty)); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
type PackageFile struct {
|
||||||
|
ID int64 `xorm:"pk autoincr"`
|
||||||
|
VersionID int64 `xorm:"UNIQUE(s) INDEX NOT NULL"`
|
||||||
|
BlobID int64 `xorm:"INDEX NOT NULL"`
|
||||||
|
Name string `xorm:"NOT NULL"`
|
||||||
|
LowerName string `xorm:"UNIQUE(s) INDEX NOT NULL"`
|
||||||
|
CompositeKey string `xorm:"UNIQUE(s) INDEX"`
|
||||||
|
IsLead bool `xorm:"NOT NULL DEFAULT false"`
|
||||||
|
CreatedUnix timeutil.TimeStamp `xorm:"created INDEX NOT NULL"`
|
||||||
|
}
|
||||||
|
|
||||||
|
if err := x.Sync2(new(PackageFile)); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
type PackageBlob struct {
|
||||||
|
ID int64 `xorm:"pk autoincr"`
|
||||||
|
Size int64 `xorm:"NOT NULL DEFAULT 0"`
|
||||||
|
HashMD5 string `xorm:"hash_md5 char(32) UNIQUE(md5) INDEX NOT NULL"`
|
||||||
|
HashSHA1 string `xorm:"hash_sha1 char(40) UNIQUE(sha1) INDEX NOT NULL"`
|
||||||
|
HashSHA256 string `xorm:"hash_sha256 char(64) UNIQUE(sha256) INDEX NOT NULL"`
|
||||||
|
HashSHA512 string `xorm:"hash_sha512 char(128) UNIQUE(sha512) INDEX NOT NULL"`
|
||||||
|
CreatedUnix timeutil.TimeStamp `xorm:"created INDEX NOT NULL"`
|
||||||
|
}
|
||||||
|
|
||||||
|
if err := x.Sync2(new(PackageBlob)); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
type PackageBlobUpload struct {
|
||||||
|
ID string `xorm:"pk"`
|
||||||
|
BytesReceived int64 `xorm:"NOT NULL DEFAULT 0"`
|
||||||
|
HashStateBytes []byte `xorm:"BLOB"`
|
||||||
|
CreatedUnix timeutil.TimeStamp `xorm:"created NOT NULL"`
|
||||||
|
UpdatedUnix timeutil.TimeStamp `xorm:"updated INDEX NOT NULL"`
|
||||||
|
}
|
||||||
|
|
||||||
|
return x.Sync2(new(PackageBlobUpload))
|
||||||
|
}
|
171
models/packages/conan/references.go
Normal file
@@ -0,0 +1,171 @@
|
||||||
|
// Copyright 2022 The Gitea Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a MIT-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package conan
|
||||||
|
|
||||||
|
import (
|
||||||
|
"context"
|
||||||
|
"errors"
|
||||||
|
"strconv"
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
"code.gitea.io/gitea/models/db"
|
||||||
|
"code.gitea.io/gitea/models/packages"
|
||||||
|
conan_module "code.gitea.io/gitea/modules/packages/conan"
|
||||||
|
"code.gitea.io/gitea/modules/timeutil"
|
||||||
|
|
||||||
|
"xorm.io/builder"
|
||||||
|
)
|
||||||
|
|
||||||
|
var (
|
||||||
|
ErrRecipeReferenceNotExist = errors.New("Recipe reference does not exist")
|
||||||
|
ErrPackageReferenceNotExist = errors.New("Package reference does not exist")
|
||||||
|
)
|
||||||
|
|
||||||
|
// RecipeExists checks if a recipe exists
|
||||||
|
func RecipeExists(ctx context.Context, ownerID int64, ref *conan_module.RecipeReference) (bool, error) {
|
||||||
|
revisions, err := GetRecipeRevisions(ctx, ownerID, ref)
|
||||||
|
if err != nil {
|
||||||
|
return false, err
|
||||||
|
}
|
||||||
|
|
||||||
|
return len(revisions) != 0, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
type PropertyValue struct {
|
||||||
|
Value string
|
||||||
|
CreatedUnix timeutil.TimeStamp
|
||||||
|
}
|
||||||
|
|
||||||
|
func findPropertyValues(ctx context.Context, propertyName string, ownerID int64, name, version string, propertyFilter map[string]string) ([]*PropertyValue, error) {
|
||||||
|
var propsCond builder.Cond = builder.Eq{
|
||||||
|
"package_property.ref_type": packages.PropertyTypeFile,
|
||||||
|
}
|
||||||
|
propsCond = propsCond.And(builder.Expr("package_property.ref_id = package_file.id"))
|
||||||
|
|
||||||
|
propsCondBlock := builder.NewCond()
|
||||||
|
for name, value := range propertyFilter {
|
||||||
|
propsCondBlock = propsCondBlock.Or(builder.Eq{
|
||||||
|
"package_property.name": name,
|
||||||
|
"package_property.value": value,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
propsCond = propsCond.And(propsCondBlock)
|
||||||
|
|
||||||
|
var cond builder.Cond = builder.Eq{
|
||||||
|
"package.type": packages.TypeConan,
|
||||||
|
"package.owner_id": ownerID,
|
||||||
|
"package.lower_name": strings.ToLower(name),
|
||||||
|
"package_version.lower_version": strings.ToLower(version),
|
||||||
|
"package_version.is_internal": false,
|
||||||
|
strconv.Itoa(len(propertyFilter)): builder.Select("COUNT(*)").Where(propsCond).From("package_property"),
|
||||||
|
}
|
||||||
|
|
||||||
|
in2 := builder.
|
||||||
|
Select("package_file.id").
|
||||||
|
From("package_file").
|
||||||
|
Join("INNER", "package_version", "package_version.id = package_file.version_id").
|
||||||
|
Join("INNER", "package", "package.id = package_version.package_id").
|
||||||
|
Where(cond)
|
||||||
|
|
||||||
|
query := builder.
|
||||||
|
Select("package_property.value, MAX(package_file.created_unix) AS created_unix").
|
||||||
|
From("package_property").
|
||||||
|
Join("INNER", "package_file", "package_file.id = package_property.ref_id").
|
||||||
|
Where(builder.Eq{"package_property.name": propertyName}.And(builder.In("package_property.ref_id", in2))).
|
||||||
|
GroupBy("package_property.value").
|
||||||
|
OrderBy("created_unix DESC")
|
||||||
|
|
||||||
|
var values []*PropertyValue
|
||||||
|
return values, db.GetEngine(ctx).SQL(query).Find(&values)
|
||||||
|
}
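The `strconv.Itoa(len(propertyFilter))` map key above is a small trick: it produces a condition of the form `<N> = (SELECT COUNT(*) ...)`, so a file only matches when every one of the N requested properties is present on it. A sketch of the shape of the generated condition, assuming xorm.io/builder's ToSQL renders a Cond to SQL (property name strings are illustrative, not the real constants):

package main

import (
	"fmt"
	"log"

	"xorm.io/builder"
)

func main() {
	// two properties requested -> the literal "2" becomes the left-hand side
	propsCond := builder.Eq{"package_property.name": "conan.recipe.user"}.
		Or(builder.Eq{"package_property.name": "conan.recipe.channel"})

	cond := builder.Eq{
		"2": builder.Select("COUNT(*)").From("package_property").Where(propsCond),
	}

	sql, args, err := builder.ToSQL(cond)
	if err != nil {
		log.Fatal(err)
	}
	// roughly: 2=(SELECT COUNT(*) FROM package_property WHERE ...)
	fmt.Println(sql, args)
}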
|
||||||
|
|
||||||
|
// GetRecipeRevisions gets all revisions of a recipe
|
||||||
|
func GetRecipeRevisions(ctx context.Context, ownerID int64, ref *conan_module.RecipeReference) ([]*PropertyValue, error) {
|
||||||
|
values, err := findPropertyValues(
|
||||||
|
ctx,
|
||||||
|
conan_module.PropertyRecipeRevision,
|
||||||
|
ownerID,
|
||||||
|
ref.Name,
|
||||||
|
ref.Version,
|
||||||
|
map[string]string{
|
||||||
|
conan_module.PropertyRecipeUser: ref.User,
|
||||||
|
conan_module.PropertyRecipeChannel: ref.Channel,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
return values, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetLastRecipeRevision gets the latest recipe revision
|
||||||
|
func GetLastRecipeRevision(ctx context.Context, ownerID int64, ref *conan_module.RecipeReference) (*PropertyValue, error) {
|
||||||
|
revisions, err := GetRecipeRevisions(ctx, ownerID, ref)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(revisions) == 0 {
|
||||||
|
return nil, ErrRecipeReferenceNotExist
|
||||||
|
}
|
||||||
|
return revisions[0], nil
|
||||||
|
}
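GetLastRecipeRevision relies on findPropertyValues ordering by created_unix DESC, so element 0 is the newest revision; callers are expected to translate the sentinel error into a "not found" response. A hypothetical caller fragment (ctx, owner and ref are illustrative):

	// sketch only: the newest revision is element 0 (ordered by created_unix DESC)
	rev, err := conan.GetLastRecipeRevision(ctx, owner.ID, ref)
	if err == conan.ErrRecipeReferenceNotExist {
		// typically translated into a 404 by the registry router
		return nil, err
	}
	if err != nil {
		return nil, err
	}
	return rev, nil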
|
||||||
|
|
||||||
|
// GetPackageReferences gets all package references of a recipe
|
||||||
|
func GetPackageReferences(ctx context.Context, ownerID int64, ref *conan_module.RecipeReference) ([]*PropertyValue, error) {
|
||||||
|
values, err := findPropertyValues(
|
||||||
|
ctx,
|
||||||
|
conan_module.PropertyPackageReference,
|
||||||
|
ownerID,
|
||||||
|
ref.Name,
|
||||||
|
ref.Version,
|
||||||
|
map[string]string{
|
||||||
|
conan_module.PropertyRecipeUser: ref.User,
|
||||||
|
conan_module.PropertyRecipeChannel: ref.Channel,
|
||||||
|
conan_module.PropertyRecipeRevision: ref.Revision,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
return values, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetPackageRevisions gets all revisions of a package
|
||||||
|
func GetPackageRevisions(ctx context.Context, ownerID int64, ref *conan_module.PackageReference) ([]*PropertyValue, error) {
|
||||||
|
values, err := findPropertyValues(
|
||||||
|
ctx,
|
||||||
|
conan_module.PropertyPackageRevision,
|
||||||
|
ownerID,
|
||||||
|
ref.Recipe.Name,
|
||||||
|
ref.Recipe.Version,
|
||||||
|
map[string]string{
|
||||||
|
conan_module.PropertyRecipeUser: ref.Recipe.User,
|
||||||
|
conan_module.PropertyRecipeChannel: ref.Recipe.Channel,
|
||||||
|
conan_module.PropertyRecipeRevision: ref.Recipe.Revision,
|
||||||
|
conan_module.PropertyPackageReference: ref.Reference,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
return values, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetLastPackageRevision gets the latest package revision
|
||||||
|
func GetLastPackageRevision(ctx context.Context, ownerID int64, ref *conan_module.PackageReference) (*PropertyValue, error) {
|
||||||
|
revisions, err := GetPackageRevisions(ctx, ownerID, ref)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(revisions) == 0 {
|
||||||
|
return nil, ErrPackageReferenceNotExist
|
||||||
|
}
|
||||||
|
return revisions[0], nil
|
||||||
|
}
|
149
models/packages/conan/search.go
Normal file
@@ -0,0 +1,149 @@
|
||||||
|
// Copyright 2022 The Gitea Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a MIT-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package conan
|
||||||
|
|
||||||
|
import (
|
||||||
|
"context"
|
||||||
|
"fmt"
|
||||||
|
"strconv"
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
"code.gitea.io/gitea/models/db"
|
||||||
|
"code.gitea.io/gitea/models/packages"
|
||||||
|
conan_module "code.gitea.io/gitea/modules/packages/conan"
|
||||||
|
|
||||||
|
"xorm.io/builder"
|
||||||
|
)
|
||||||
|
|
||||||
|
// buildCondition creates a Like condition if a wildcard is present. Otherwise Eq is used.
|
||||||
|
func buildCondition(name, value string) builder.Cond {
|
||||||
|
if strings.Contains(value, "*") {
|
||||||
|
return builder.Like{name, strings.ReplaceAll(strings.ReplaceAll(value, "_", "\\_"), "*", "%")}
|
||||||
|
}
|
||||||
|
return builder.Eq{name: value}
|
||||||
|
}
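A quick illustration of the wildcard handling above: `*` becomes SQL's `%` wildcard and literal underscores are escaped so they are not treated as single-character wildcards; values without `*` fall back to an equality condition. The string transformation alone, mirrored as a standalone sketch:

package main

import (
	"fmt"
	"strings"
)

// pattern mirrors the replacement done in buildCondition above:
// escape literal underscores, then turn the user-facing "*" into SQL's "%".
func pattern(value string) string {
	return strings.ReplaceAll(strings.ReplaceAll(value, "_", "\\_"), "*", "%")
}

func main() {
	fmt.Println(pattern("open*"))   // open%
	fmt.Println(pattern("my_lib*")) // my\_lib%
}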
|
||||||
|
|
||||||
|
type RecipeSearchOptions struct {
|
||||||
|
OwnerID int64
|
||||||
|
Name string
|
||||||
|
Version string
|
||||||
|
User string
|
||||||
|
Channel string
|
||||||
|
}
|
||||||
|
|
||||||
|
// SearchRecipes gets all recipes matching the search options
|
||||||
|
func SearchRecipes(ctx context.Context, opts *RecipeSearchOptions) ([]string, error) {
|
||||||
|
var cond builder.Cond = builder.Eq{
|
||||||
|
"package_file.is_lead": true,
|
||||||
|
"package.type": packages.TypeConan,
|
||||||
|
"package.owner_id": opts.OwnerID,
|
||||||
|
"package_version.is_internal": false,
|
||||||
|
}
|
||||||
|
|
||||||
|
if opts.Name != "" {
|
||||||
|
cond = cond.And(buildCondition("package.lower_name", strings.ToLower(opts.Name)))
|
||||||
|
}
|
||||||
|
if opts.Version != "" {
|
||||||
|
cond = cond.And(buildCondition("package_version.lower_version", strings.ToLower(opts.Version)))
|
||||||
|
}
|
||||||
|
if opts.User != "" || opts.Channel != "" {
|
||||||
|
var propsCond builder.Cond = builder.Eq{
|
||||||
|
"package_property.ref_type": packages.PropertyTypeFile,
|
||||||
|
}
|
||||||
|
propsCond = propsCond.And(builder.Expr("package_property.ref_id = package_file.id"))
|
||||||
|
|
||||||
|
count := 0
|
||||||
|
propsCondBlock := builder.NewCond()
|
||||||
|
if opts.User != "" {
|
||||||
|
count++
|
||||||
|
propsCondBlock = propsCondBlock.Or(builder.Eq{"package_property.name": conan_module.PropertyRecipeUser}.And(buildCondition("package_property.value", opts.User)))
|
||||||
|
}
|
||||||
|
if opts.Channel != "" {
|
||||||
|
count++
|
||||||
|
propsCondBlock = propsCondBlock.Or(builder.Eq{"package_property.name": conan_module.PropertyRecipeChannel}.And(buildCondition("package_property.value", opts.Channel)))
|
||||||
|
}
|
||||||
|
propsCond = propsCond.And(propsCondBlock)
|
||||||
|
|
||||||
|
cond = cond.And(builder.Eq{
|
||||||
|
strconv.Itoa(count): builder.Select("COUNT(*)").Where(propsCond).From("package_property"),
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
query := builder.
|
||||||
|
Select("package.name, package_version.version, package_file.id").
|
||||||
|
From("package_file").
|
||||||
|
Join("INNER", "package_version", "package_version.id = package_file.version_id").
|
||||||
|
Join("INNER", "package", "package.id = package_version.package_id").
|
||||||
|
Where(cond)
|
||||||
|
|
||||||
|
results := make([]struct {
|
||||||
|
Name string
|
||||||
|
Version string
|
||||||
|
ID int64
|
||||||
|
}, 0, 5)
|
||||||
|
err := db.GetEngine(ctx).SQL(query).Find(&results)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
unique := make(map[string]bool)
|
||||||
|
for _, info := range results {
|
||||||
|
recipe := fmt.Sprintf("%s/%s", info.Name, info.Version)
|
||||||
|
|
||||||
|
props, _ := packages.GetProperties(ctx, packages.PropertyTypeFile, info.ID)
|
||||||
|
if len(props) > 0 {
|
||||||
|
var (
|
||||||
|
user = ""
|
||||||
|
channel = ""
|
||||||
|
)
|
||||||
|
for _, prop := range props {
|
||||||
|
if prop.Name == conan_module.PropertyRecipeUser {
|
||||||
|
user = prop.Value
|
||||||
|
}
|
||||||
|
if prop.Name == conan_module.PropertyRecipeChannel {
|
||||||
|
channel = prop.Value
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if user != "" && channel != "" {
|
||||||
|
recipe = fmt.Sprintf("%s@%s/%s", recipe, user, channel)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
unique[recipe] = true
|
||||||
|
}
|
||||||
|
|
||||||
|
recipes := make([]string, 0, len(unique))
|
||||||
|
for recipe := range unique {
|
||||||
|
recipes = append(recipes, recipe)
|
||||||
|
}
|
||||||
|
return recipes, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetPackageInfo gets the Conaninfo for a package
|
||||||
|
func GetPackageInfo(ctx context.Context, ownerID int64, ref *conan_module.PackageReference) (string, error) {
|
||||||
|
values, err := findPropertyValues(
|
||||||
|
ctx,
|
||||||
|
conan_module.PropertyPackageInfo,
|
||||||
|
ownerID,
|
||||||
|
ref.Recipe.Name,
|
||||||
|
ref.Recipe.Version,
|
||||||
|
map[string]string{
|
||||||
|
conan_module.PropertyRecipeUser: ref.Recipe.User,
|
||||||
|
conan_module.PropertyRecipeChannel: ref.Recipe.Channel,
|
||||||
|
conan_module.PropertyRecipeRevision: ref.Recipe.Revision,
|
||||||
|
conan_module.PropertyPackageReference: ref.Reference,
|
||||||
|
conan_module.PropertyPackageRevision: ref.Revision,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
if err != nil {
|
||||||
|
return "", err
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(values) == 0 {
|
||||||
|
return "", ErrPackageReferenceNotExist
|
||||||
|
}
|
||||||
|
|
||||||
|
return values[0].Value, nil
|
||||||
|
}
|
10
models/packages/container/const.go
Normal file
@@ -0,0 +1,10 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package container

const (
	ManifestFilename = "manifest.json"
	UploadVersion    = "_upload"
)
227
models/packages/container/search.go
Normal file
@@ -0,0 +1,227 @@
|
||||||
|
// Copyright 2022 The Gitea Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a MIT-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package container
|
||||||
|
|
||||||
|
import (
|
||||||
|
"context"
|
||||||
|
"errors"
|
||||||
|
"strings"
|
||||||
|
"time"
|
||||||
|
|
||||||
|
"code.gitea.io/gitea/models/db"
|
||||||
|
"code.gitea.io/gitea/models/packages"
|
||||||
|
container_module "code.gitea.io/gitea/modules/packages/container"
|
||||||
|
|
||||||
|
"xorm.io/builder"
|
||||||
|
)
|
||||||
|
|
||||||
|
var ErrContainerBlobNotExist = errors.New("Container blob does not exist")
|
||||||
|
|
||||||
|
type BlobSearchOptions struct {
|
||||||
|
OwnerID int64
|
||||||
|
Image string
|
||||||
|
Digest string
|
||||||
|
Tag string
|
||||||
|
IsManifest bool
|
||||||
|
}
|
||||||
|
|
||||||
|
func (opts *BlobSearchOptions) toConds() builder.Cond {
|
||||||
|
var cond builder.Cond = builder.Eq{
|
||||||
|
"package.type": packages.TypeContainer,
|
||||||
|
}
|
||||||
|
|
||||||
|
if opts.OwnerID != 0 {
|
||||||
|
cond = cond.And(builder.Eq{"package.owner_id": opts.OwnerID})
|
||||||
|
}
|
||||||
|
if opts.Image != "" {
|
||||||
|
cond = cond.And(builder.Eq{"package.lower_name": strings.ToLower(opts.Image)})
|
||||||
|
}
|
||||||
|
if opts.Tag != "" {
|
||||||
|
cond = cond.And(builder.Eq{"package_version.lower_version": strings.ToLower(opts.Tag)})
|
||||||
|
}
|
||||||
|
if opts.IsManifest {
|
||||||
|
cond = cond.And(builder.Eq{"package_file.lower_name": ManifestFilename})
|
||||||
|
}
|
||||||
|
if opts.Digest != "" {
|
||||||
|
var propsCond builder.Cond = builder.Eq{
|
||||||
|
"package_property.ref_type": packages.PropertyTypeFile,
|
||||||
|
"package_property.name": container_module.PropertyDigest,
|
||||||
|
"package_property.value": opts.Digest,
|
||||||
|
}
|
||||||
|
|
||||||
|
cond = cond.And(builder.In("package_file.id", builder.Select("package_property.ref_id").Where(propsCond).From("package_property")))
|
||||||
|
}
|
||||||
|
|
||||||
|
return cond
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetContainerBlob gets the container blob matching the blob search options
|
||||||
|
// If multiple matching blobs are found (manifests with the same digest) the first (according to the database) is selected.
|
||||||
|
func GetContainerBlob(ctx context.Context, opts *BlobSearchOptions) (*packages.PackageFileDescriptor, error) {
|
||||||
|
pfds, err := getContainerBlobsLimit(ctx, opts, 1)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
if len(pfds) != 1 {
|
||||||
|
return nil, ErrContainerBlobNotExist
|
||||||
|
}
|
||||||
|
|
||||||
|
return pfds[0], nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetContainerBlobs gets the container blobs matching the blob search options
|
||||||
|
func GetContainerBlobs(ctx context.Context, opts *BlobSearchOptions) ([]*packages.PackageFileDescriptor, error) {
|
||||||
|
return getContainerBlobsLimit(ctx, opts, 0)
|
||||||
|
}
|
||||||
|
|
||||||
|
func getContainerBlobsLimit(ctx context.Context, opts *BlobSearchOptions, limit int) ([]*packages.PackageFileDescriptor, error) {
|
||||||
|
pfs := make([]*packages.PackageFile, 0, limit)
|
||||||
|
sess := db.GetEngine(ctx).
|
||||||
|
Join("INNER", "package_version", "package_version.id = package_file.version_id").
|
||||||
|
Join("INNER", "package", "package.id = package_version.package_id").
|
||||||
|
Where(opts.toConds())
|
||||||
|
|
||||||
|
if limit > 0 {
|
||||||
|
sess = sess.Limit(limit)
|
||||||
|
}
|
||||||
|
|
||||||
|
if err := sess.Find(&pfs); err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
pfds := make([]*packages.PackageFileDescriptor, 0, len(pfs))
|
||||||
|
for _, pf := range pfs {
|
||||||
|
pfd, err := packages.GetPackageFileDescriptor(ctx, pf)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
pfds = append(pfds, pfd)
|
||||||
|
}
|
||||||
|
|
||||||
|
return pfds, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetManifestVersions gets all package versions representing the matching manifest
|
||||||
|
func GetManifestVersions(ctx context.Context, opts *BlobSearchOptions) ([]*packages.PackageVersion, error) {
|
||||||
|
cond := opts.toConds().And(builder.Eq{"package_version.is_internal": false})
|
||||||
|
|
||||||
|
pvs := make([]*packages.PackageVersion, 0, 10)
|
||||||
|
return pvs, db.GetEngine(ctx).
|
||||||
|
Join("INNER", "package", "package.id = package_version.package_id").
|
||||||
|
Join("INNER", "package_file", "package_file.version_id = package_version.id").
|
||||||
|
Where(cond).
|
||||||
|
Find(&pvs)
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetImageTags gets a sorted list of the tags of an image
|
||||||
|
// The result is suitable for the api call.
|
||||||
|
func GetImageTags(ctx context.Context, ownerID int64, image string, n int, last string) ([]string, error) {
|
||||||
|
// Short circuit: n == 0 should return an empty list
|
||||||
|
if n == 0 {
|
||||||
|
return []string{}, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
var cond builder.Cond = builder.Eq{
|
||||||
|
"package.type": packages.TypeContainer,
|
||||||
|
"package.owner_id": ownerID,
|
||||||
|
"package.lower_name": strings.ToLower(image),
|
||||||
|
"package_version.is_internal": false,
|
||||||
|
}
|
||||||
|
|
||||||
|
var propsCond builder.Cond = builder.Eq{
|
||||||
|
"package_property.ref_type": packages.PropertyTypeVersion,
|
||||||
|
"package_property.name": container_module.PropertyManifestTagged,
|
||||||
|
}
|
||||||
|
|
||||||
|
cond = cond.And(builder.In("package_version.id", builder.Select("package_property.ref_id").Where(propsCond).From("package_property")))
|
||||||
|
|
||||||
|
if last != "" {
|
||||||
|
cond = cond.And(builder.Gt{"package_version.lower_version": strings.ToLower(last)})
|
||||||
|
}
|
||||||
|
|
||||||
|
sess := db.GetEngine(ctx).
|
||||||
|
Table("package_version").
|
||||||
|
Select("package_version.lower_version").
|
||||||
|
Join("INNER", "package", "package.id = package_version.package_id").
|
||||||
|
Where(cond).
|
||||||
|
Asc("package_version.lower_version")
|
||||||
|
|
||||||
|
var tags []string
|
||||||
|
if n > 0 {
|
||||||
|
sess = sess.Limit(n)
|
||||||
|
|
||||||
|
tags = make([]string, 0, n)
|
||||||
|
} else {
|
||||||
|
tags = make([]string, 0, 10)
|
||||||
|
}
|
||||||
|
|
||||||
|
return tags, sess.Find(&tags)
|
||||||
|
}
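GetImageTags follows the OCI "list tags" pagination contract: n limits the page size (n == 0 yields an empty list, a negative n means no limit) and last names the tag to start after. A hypothetical call site (ctx and owner are illustrative):

	// sketch only: fetch the next page of at most 50 tags after "v1.0"
	tags, err := container.GetImageTags(ctx, owner.ID, "my-image", 50, "v1.0")
	if err != nil {
		return err
	}
	for _, tag := range tags {
		fmt.Println(tag)
	}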
|
||||||
|
|
||||||
|
type ImageTagsSearchOptions struct {
|
||||||
|
PackageID int64
|
||||||
|
Query string
|
||||||
|
IsTagged bool
|
||||||
|
db.Paginator
|
||||||
|
}
|
||||||
|
|
||||||
|
func (opts *ImageTagsSearchOptions) toConds() builder.Cond {
|
||||||
|
var cond builder.Cond = builder.Eq{
|
||||||
|
"package.type": packages.TypeContainer,
|
||||||
|
"package.id": opts.PackageID,
|
||||||
|
"package_version.is_internal": false,
|
||||||
|
}
|
||||||
|
|
||||||
|
if opts.Query != "" {
|
||||||
|
cond = cond.And(builder.Like{"package_version.lower_version", strings.ToLower(opts.Query)})
|
||||||
|
}
|
||||||
|
|
||||||
|
var propsCond builder.Cond = builder.Eq{
|
||||||
|
"package_property.ref_type": packages.PropertyTypeVersion,
|
||||||
|
"package_property.name": container_module.PropertyManifestTagged,
|
||||||
|
}
|
||||||
|
|
||||||
|
in := builder.In("package_version.id", builder.Select("package_property.ref_id").Where(propsCond).From("package_property"))
|
||||||
|
|
||||||
|
if opts.IsTagged {
|
||||||
|
cond = cond.And(in)
|
||||||
|
} else {
|
||||||
|
cond = cond.And(builder.Not{in})
|
||||||
|
}
|
||||||
|
|
||||||
|
return cond
|
||||||
|
}
|
||||||
|
|
||||||
|
// SearchImageTags gets a sorted list of the tags of an image
|
||||||
|
func SearchImageTags(ctx context.Context, opts *ImageTagsSearchOptions) ([]*packages.PackageVersion, int64, error) {
|
||||||
|
sess := db.GetEngine(ctx).
|
||||||
|
Join("INNER", "package", "package.id = package_version.package_id").
|
||||||
|
Where(opts.toConds()).
|
||||||
|
Desc("package_version.created_unix")
|
||||||
|
|
||||||
|
if opts.Paginator != nil {
|
||||||
|
sess = db.SetSessionPagination(sess, opts)
|
||||||
|
}
|
||||||
|
|
||||||
|
pvs := make([]*packages.PackageVersion, 0, 10)
|
||||||
|
count, err := sess.FindAndCount(&pvs)
|
||||||
|
return pvs, count, err
|
||||||
|
}
|
||||||
|
|
||||||
|
func SearchExpiredUploadedBlobs(ctx context.Context, olderThan time.Duration) ([]*packages.PackageFile, error) {
|
||||||
|
var cond builder.Cond = builder.Eq{
|
||||||
|
"package_version.is_internal": true,
|
||||||
|
"package_version.lower_version": UploadVersion,
|
||||||
|
"package.type": packages.TypeContainer,
|
||||||
|
}
|
||||||
|
cond = cond.And(builder.Lt{"package_file.created_unix": time.Now().Add(-olderThan).Unix()})
|
||||||
|
|
||||||
|
var pfs []*packages.PackageFile
|
||||||
|
return pfs, db.GetEngine(ctx).
|
||||||
|
Join("INNER", "package_version", "package_version.id = package_file.version_id").
|
||||||
|
Join("INNER", "package", "package.id = package_version.package_id").
|
||||||
|
Where(cond).
|
||||||
|
Find(&pfs)
|
||||||
|
}
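SearchExpiredUploadedBlobs is presumably consumed by a cleanup job that removes stale upload sessions (internal versions named "_upload") once they are older than a cut-off. A hedged sketch of such a caller; the DeleteFileByID helper is a hypothetical name used only for illustration:

	// sketch only: find container upload blobs older than 24h and drop their files
	pfs, err := container.SearchExpiredUploadedBlobs(ctx, 24*time.Hour)
	if err != nil {
		return err
	}
	for _, pf := range pfs {
		// packages.DeleteFileByID is assumed to exist for illustration purposes
		if err := packages.DeleteFileByID(ctx, pf.ID); err != nil {
			return err
		}
	}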
|
192
models/packages/descriptor.go
Normal file
@@ -0,0 +1,192 @@
|
||||||
|
// Copyright 2021 The Gitea Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a MIT-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package packages
|
||||||
|
|
||||||
|
import (
|
||||||
|
"context"
|
||||||
|
"fmt"
|
||||||
|
"net/url"
|
||||||
|
|
||||||
|
repo_model "code.gitea.io/gitea/models/repo"
|
||||||
|
user_model "code.gitea.io/gitea/models/user"
|
||||||
|
"code.gitea.io/gitea/modules/json"
|
||||||
|
"code.gitea.io/gitea/modules/packages/composer"
|
||||||
|
"code.gitea.io/gitea/modules/packages/conan"
|
||||||
|
"code.gitea.io/gitea/modules/packages/container"
|
||||||
|
"code.gitea.io/gitea/modules/packages/maven"
|
||||||
|
"code.gitea.io/gitea/modules/packages/npm"
|
||||||
|
"code.gitea.io/gitea/modules/packages/nuget"
|
||||||
|
"code.gitea.io/gitea/modules/packages/pypi"
|
||||||
|
"code.gitea.io/gitea/modules/packages/rubygems"
|
||||||
|
|
||||||
|
"github.com/hashicorp/go-version"
|
||||||
|
)
|
||||||
|
|
||||||
|
// PackagePropertyList is a list of package properties
|
||||||
|
type PackagePropertyList []*PackageProperty
|
||||||
|
|
||||||
|
// GetByName gets the first property value with the specific name
|
||||||
|
func (l PackagePropertyList) GetByName(name string) string {
|
||||||
|
for _, pp := range l {
|
||||||
|
if pp.Name == name {
|
||||||
|
return pp.Value
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return ""
|
||||||
|
}
|
||||||
|
|
||||||
|
// PackageDescriptor describes a package
|
||||||
|
type PackageDescriptor struct {
|
||||||
|
Package *Package
|
||||||
|
Owner *user_model.User
|
||||||
|
Repository *repo_model.Repository
|
||||||
|
Version *PackageVersion
|
||||||
|
SemVer *version.Version
|
||||||
|
Creator *user_model.User
|
||||||
|
Properties PackagePropertyList
|
||||||
|
Metadata interface{}
|
||||||
|
Files []*PackageFileDescriptor
|
||||||
|
}
|
||||||
|
|
||||||
|
// PackageFileDescriptor describes a package file
|
||||||
|
type PackageFileDescriptor struct {
|
||||||
|
File *PackageFile
|
||||||
|
Blob *PackageBlob
|
||||||
|
Properties PackagePropertyList
|
||||||
|
}
|
||||||
|
|
||||||
|
// PackageWebLink returns the package web link
|
||||||
|
func (pd *PackageDescriptor) PackageWebLink() string {
|
||||||
|
return fmt.Sprintf("%s/-/packages/%s/%s", pd.Owner.HTMLURL(), string(pd.Package.Type), url.PathEscape(pd.Package.LowerName))
|
||||||
|
}
|
||||||
|
|
||||||
|
// FullWebLink returns the package version web link
|
||||||
|
func (pd *PackageDescriptor) FullWebLink() string {
|
||||||
|
return fmt.Sprintf("%s/%s", pd.PackageWebLink(), url.PathEscape(pd.Version.LowerVersion))
|
||||||
|
}
|
||||||
|
|
||||||
|
// CalculateBlobSize returns the total blobs size in bytes
|
||||||
|
func (pd *PackageDescriptor) CalculateBlobSize() int64 {
|
||||||
|
size := int64(0)
|
||||||
|
for _, f := range pd.Files {
|
||||||
|
size += f.Blob.Size
|
||||||
|
}
|
||||||
|
return size
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetPackageDescriptor gets the package description for a version
|
||||||
|
func GetPackageDescriptor(ctx context.Context, pv *PackageVersion) (*PackageDescriptor, error) {
|
||||||
|
p, err := GetPackageByID(ctx, pv.PackageID)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
o, err := user_model.GetUserByIDCtx(ctx, p.OwnerID)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
repository, err := repo_model.GetRepositoryByIDCtx(ctx, p.RepoID)
|
||||||
|
if err != nil && !repo_model.IsErrRepoNotExist(err) {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
creator, err := user_model.GetUserByIDCtx(ctx, pv.CreatorID)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
var semVer *version.Version
|
||||||
|
if p.SemverCompatible {
|
||||||
|
semVer, err = version.NewVersion(pv.Version)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
}
|
||||||
|
pvps, err := GetProperties(ctx, PropertyTypeVersion, pv.ID)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
pfs, err := GetFilesByVersionID(ctx, pv.ID)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
pfds := make([]*PackageFileDescriptor, 0, len(pfs))
|
||||||
|
for _, pf := range pfs {
|
||||||
|
pfd, err := GetPackageFileDescriptor(ctx, pf)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
pfds = append(pfds, pfd)
|
||||||
|
}
|
||||||
|
|
||||||
|
var metadata interface{}
|
||||||
|
switch p.Type {
|
||||||
|
case TypeComposer:
|
||||||
|
metadata = &composer.Metadata{}
|
||||||
|
case TypeConan:
|
||||||
|
metadata = &conan.Metadata{}
|
||||||
|
case TypeContainer:
|
||||||
|
metadata = &container.Metadata{}
|
||||||
|
case TypeGeneric:
|
||||||
|
// generic packages have no metadata
|
||||||
|
case TypeNuGet:
|
||||||
|
metadata = &nuget.Metadata{}
|
||||||
|
case TypeNpm:
|
||||||
|
metadata = &npm.Metadata{}
|
||||||
|
case TypeMaven:
|
||||||
|
metadata = &maven.Metadata{}
|
||||||
|
case TypePyPI:
|
||||||
|
metadata = &pypi.Metadata{}
|
||||||
|
case TypeRubyGems:
|
||||||
|
metadata = &rubygems.Metadata{}
|
||||||
|
default:
|
||||||
|
panic(fmt.Sprintf("unknown package type: %s", string(p.Type)))
|
||||||
|
}
|
||||||
|
if metadata != nil {
|
||||||
|
if err := json.Unmarshal([]byte(pv.MetadataJSON), &metadata); err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return &PackageDescriptor{
|
||||||
|
Package: p,
|
||||||
|
Owner: o,
|
||||||
|
Repository: repository,
|
||||||
|
Version: pv,
|
||||||
|
SemVer: semVer,
|
||||||
|
Creator: creator,
|
||||||
|
Properties: PackagePropertyList(pvps),
|
||||||
|
Metadata: metadata,
|
||||||
|
Files: pfds,
|
||||||
|
}, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetPackageFileDescriptor gets a package file descriptor for a package file
|
||||||
|
func GetPackageFileDescriptor(ctx context.Context, pf *PackageFile) (*PackageFileDescriptor, error) {
|
||||||
|
pb, err := GetBlobByID(ctx, pf.BlobID)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
pfps, err := GetProperties(ctx, PropertyTypeFile, pf.ID)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
return &PackageFileDescriptor{
|
||||||
|
pf,
|
||||||
|
pb,
|
||||||
|
PackagePropertyList(pfps),
|
||||||
|
}, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetPackageDescriptors gets the package descriptions for the versions
|
||||||
|
func GetPackageDescriptors(ctx context.Context, pvs []*PackageVersion) ([]*PackageDescriptor, error) {
|
||||||
|
pds := make([]*PackageDescriptor, 0, len(pvs))
|
||||||
|
for _, pv := range pvs {
|
||||||
|
pd, err := GetPackageDescriptor(ctx, pv)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
pds = append(pds, pd)
|
||||||
|
}
|
||||||
|
return pds, nil
|
||||||
|
}
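Because GetPackageDescriptor unmarshals MetadataJSON into a type chosen by the package type, callers typically type-assert the Metadata field. A small hypothetical call site, assuming the pypi.Metadata field backing the data-requires-python attribute is named RequiresPython:

	// sketch only: render PyPI-specific fields from a descriptor
	pd, err := packages.GetPackageDescriptor(ctx, pv)
	if err != nil {
		return err
	}
	if m, ok := pd.Metadata.(*pypi.Metadata); ok {
		fmt.Println(pd.Package.Name, pd.Version.Version, m.RequiresPython)
	}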
|
213
models/packages/package.go
Normal file
@@ -0,0 +1,213 @@
|
||||||
|
// Copyright 2021 The Gitea Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a MIT-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package packages
|
||||||
|
|
||||||
|
import (
|
||||||
|
"context"
|
||||||
|
"errors"
|
||||||
|
"fmt"
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
"code.gitea.io/gitea/models/db"
|
||||||
|
|
||||||
|
"xorm.io/builder"
|
||||||
|
)
|
||||||
|
|
||||||
|
func init() {
|
||||||
|
db.RegisterModel(new(Package))
|
||||||
|
}
|
||||||
|
|
||||||
|
var (
|
||||||
|
// ErrDuplicatePackage indicates a duplicated package error
|
||||||
|
ErrDuplicatePackage = errors.New("Package does exist already")
|
||||||
|
// ErrPackageNotExist indicates a package not exist error
|
||||||
|
ErrPackageNotExist = errors.New("Package does not exist")
|
||||||
|
)
|
||||||
|
|
||||||
|
// Type of a package
|
||||||
|
type Type string
|
||||||
|
|
||||||
|
// List of supported packages
|
||||||
|
const (
|
||||||
|
TypeComposer Type = "composer"
|
||||||
|
TypeConan Type = "conan"
|
||||||
|
TypeContainer Type = "container"
|
||||||
|
TypeGeneric Type = "generic"
|
||||||
|
TypeNuGet Type = "nuget"
|
||||||
|
TypeNpm Type = "npm"
|
||||||
|
TypeMaven Type = "maven"
|
||||||
|
TypePyPI Type = "pypi"
|
||||||
|
TypeRubyGems Type = "rubygems"
|
||||||
|
)
|
||||||
|
|
||||||
|
// Name gets the name of the package type
|
||||||
|
func (pt Type) Name() string {
|
||||||
|
switch pt {
|
||||||
|
case TypeComposer:
|
||||||
|
return "Composer"
|
||||||
|
case TypeConan:
|
||||||
|
return "Conan"
|
||||||
|
case TypeContainer:
|
||||||
|
return "Container"
|
||||||
|
case TypeGeneric:
|
||||||
|
return "Generic"
|
||||||
|
case TypeNuGet:
|
||||||
|
return "NuGet"
|
||||||
|
case TypeNpm:
|
||||||
|
return "npm"
|
||||||
|
case TypeMaven:
|
||||||
|
return "Maven"
|
||||||
|
case TypePyPI:
|
||||||
|
return "PyPI"
|
||||||
|
case TypeRubyGems:
|
||||||
|
return "RubyGems"
|
||||||
|
}
|
||||||
|
panic(fmt.Sprintf("unknown package type: %s", string(pt)))
|
||||||
|
}
|
||||||
|
|
||||||
|
// SVGName gets the name of the package type svg image
|
||||||
|
func (pt Type) SVGName() string {
|
||||||
|
switch pt {
|
||||||
|
case TypeComposer:
|
||||||
|
return "gitea-composer"
|
||||||
|
case TypeConan:
|
||||||
|
return "gitea-conan"
|
||||||
|
case TypeContainer:
|
||||||
|
return "octicon-container"
|
||||||
|
case TypeGeneric:
|
||||||
|
return "octicon-package"
|
||||||
|
case TypeNuGet:
|
||||||
|
return "gitea-nuget"
|
||||||
|
case TypeNpm:
|
||||||
|
return "gitea-npm"
|
||||||
|
case TypeMaven:
|
||||||
|
return "gitea-maven"
|
||||||
|
case TypePyPI:
|
||||||
|
return "gitea-python"
|
||||||
|
case TypeRubyGems:
|
||||||
|
return "gitea-rubygems"
|
||||||
|
}
|
||||||
|
panic(fmt.Sprintf("unknown package type: %s", string(pt)))
|
||||||
|
}
|
||||||
|
|
||||||
|
// Package represents a package
|
||||||
|
type Package struct {
|
||||||
|
ID int64 `xorm:"pk autoincr"`
|
||||||
|
OwnerID int64 `xorm:"UNIQUE(s) INDEX NOT NULL"`
|
||||||
|
RepoID int64 `xorm:"INDEX"`
|
||||||
|
Type Type `xorm:"UNIQUE(s) INDEX NOT NULL"`
|
||||||
|
Name string `xorm:"NOT NULL"`
|
||||||
|
LowerName string `xorm:"UNIQUE(s) INDEX NOT NULL"`
|
||||||
|
SemverCompatible bool `xorm:"NOT NULL DEFAULT false"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// TryInsertPackage inserts a package. If a package exists already, ErrDuplicatePackage is returned
|
||||||
|
func TryInsertPackage(ctx context.Context, p *Package) (*Package, error) {
|
||||||
|
e := db.GetEngine(ctx)
|
||||||
|
|
||||||
|
key := &Package{
|
||||||
|
OwnerID: p.OwnerID,
|
||||||
|
Type: p.Type,
|
||||||
|
LowerName: p.LowerName,
|
||||||
|
}
|
||||||
|
|
||||||
|
has, err := e.Get(key)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
if has {
|
||||||
|
return key, ErrDuplicatePackage
|
||||||
|
}
|
||||||
|
if _, err = e.Insert(p); err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
return p, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// SetRepositoryLink sets the linked repository
|
||||||
|
func SetRepositoryLink(ctx context.Context, packageID, repoID int64) error {
|
||||||
|
_, err := db.GetEngine(ctx).ID(packageID).Cols("repo_id").Update(&Package{RepoID: repoID})
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
// UnlinkRepositoryFromAllPackages unlinks every package from the repository
|
||||||
|
func UnlinkRepositoryFromAllPackages(ctx context.Context, repoID int64) error {
|
||||||
|
_, err := db.GetEngine(ctx).Where("repo_id = ?", repoID).Cols("repo_id").Update(&Package{})
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetPackageByID gets a package by id
|
||||||
|
func GetPackageByID(ctx context.Context, packageID int64) (*Package, error) {
|
||||||
|
p := &Package{}
|
||||||
|
|
||||||
|
has, err := db.GetEngine(ctx).ID(packageID).Get(p)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
if !has {
|
||||||
|
return nil, ErrPackageNotExist
|
||||||
|
}
|
||||||
|
return p, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetPackageByName gets a package by name
|
||||||
|
func GetPackageByName(ctx context.Context, ownerID int64, packageType Type, name string) (*Package, error) {
|
||||||
|
var cond builder.Cond = builder.Eq{
|
||||||
|
"package.owner_id": ownerID,
|
||||||
|
"package.type": packageType,
|
||||||
|
"package.lower_name": strings.ToLower(name),
|
||||||
|
}
|
||||||
|
|
||||||
|
p := &Package{}
|
||||||
|
|
||||||
|
has, err := db.GetEngine(ctx).
|
||||||
|
Where(cond).
|
||||||
|
Get(p)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
if !has {
|
||||||
|
return nil, ErrPackageNotExist
|
||||||
|
}
|
||||||
|
return p, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetPackagesByType gets all packages of a specific type
|
||||||
|
func GetPackagesByType(ctx context.Context, ownerID int64, packageType Type) ([]*Package, error) {
|
||||||
|
var cond builder.Cond = builder.Eq{
|
||||||
|
"package.owner_id": ownerID,
|
||||||
|
"package.type": packageType,
|
||||||
|
}
|
||||||
|
|
||||||
|
ps := make([]*Package, 0, 10)
|
||||||
|
return ps, db.GetEngine(ctx).
|
||||||
|
Where(cond).
|
||||||
|
Find(&ps)
|
||||||
|
}
|
||||||
|
|
||||||
|
// DeletePackagesIfUnreferenced deletes a package if there are no associated versions
|
||||||
|
func DeletePackagesIfUnreferenced(ctx context.Context) error {
|
||||||
|
in := builder.
|
||||||
|
Select("package_version.package_id").
|
||||||
|
From("package").
|
||||||
|
Join("LEFT", "package_version", "package_version.package_id = package.id").
|
||||||
|
Where(builder.Expr("package_version.id IS NULL"))
|
||||||
|
|
||||||
|
_, err := db.GetEngine(ctx).
|
||||||
|
Where(builder.In("package.id", in)).
|
||||||
|
Delete(&Package{})
|
||||||
|
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
// HasOwnerPackages tests if a user/org has packages
|
||||||
|
func HasOwnerPackages(ctx context.Context, ownerID int64) (bool, error) {
|
||||||
|
return db.GetEngine(ctx).Where("owner_id = ?", ownerID).Exist(&Package{})
|
||||||
|
}
|
||||||
|
|
||||||
|
// HasRepositoryPackages tests if a repository has packages
|
||||||
|
func HasRepositoryPackages(ctx context.Context, repositoryID int64) (bool, error) {
|
||||||
|
return db.GetEngine(ctx).Where("repo_id = ?", repositoryID).Exist(&Package{})
|
||||||
|
}
|
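The insert-or-reuse contract above is easiest to see from the caller's side. A minimal sketch, not part of the diff (helper name, owner ID and package name are made up); it assumes the usual Gitea context with an initialized database engine.

package example // illustrative snippet, not part of the PR

import (
	"context"
	"fmt"

	packages_model "code.gitea.io/gitea/models/packages"
)

// ensurePackage creates the package row on first upload and silently reuses
// the existing row on later uploads: TryInsertPackage returns the stored row
// together with ErrDuplicatePackage.
func ensurePackage(ctx context.Context, ownerID int64) (*packages_model.Package, error) {
	p, err := packages_model.TryInsertPackage(ctx, &packages_model.Package{
		OwnerID:   ownerID,
		Type:      packages_model.TypeGeneric,
		Name:      "MyTool",
		LowerName: "mytool",
	})
	if err != nil && err != packages_model.ErrDuplicatePackage {
		return nil, fmt.Errorf("TryInsertPackage: %w", err)
	}
	return p, nil
}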
models/packages/package_blob.go (new file)
@@ -0,0 +1,85 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package packages

import (
	"context"
	"errors"
	"time"

	"code.gitea.io/gitea/models/db"
	"code.gitea.io/gitea/modules/timeutil"
)

// ErrPackageBlobNotExist indicates a package blob not exist error
var ErrPackageBlobNotExist = errors.New("Package blob does not exist")

func init() {
	db.RegisterModel(new(PackageBlob))
}

// PackageBlob represents a package blob
type PackageBlob struct {
	ID          int64              `xorm:"pk autoincr"`
	Size        int64              `xorm:"NOT NULL DEFAULT 0"`
	HashMD5     string             `xorm:"hash_md5 char(32) UNIQUE(md5) INDEX NOT NULL"`
	HashSHA1    string             `xorm:"hash_sha1 char(40) UNIQUE(sha1) INDEX NOT NULL"`
	HashSHA256  string             `xorm:"hash_sha256 char(64) UNIQUE(sha256) INDEX NOT NULL"`
	HashSHA512  string             `xorm:"hash_sha512 char(128) UNIQUE(sha512) INDEX NOT NULL"`
	CreatedUnix timeutil.TimeStamp `xorm:"created INDEX NOT NULL"`
}

// GetOrInsertBlob inserts a blob. If the blob exists already the existing blob is returned
func GetOrInsertBlob(ctx context.Context, pb *PackageBlob) (*PackageBlob, bool, error) {
	e := db.GetEngine(ctx)

	has, err := e.Get(pb)
	if err != nil {
		return nil, false, err
	}
	if has {
		return pb, true, nil
	}
	if _, err = e.Insert(pb); err != nil {
		return nil, false, err
	}
	return pb, false, nil
}

// GetBlobByID gets a blob by id
func GetBlobByID(ctx context.Context, blobID int64) (*PackageBlob, error) {
	pb := &PackageBlob{}

	has, err := db.GetEngine(ctx).ID(blobID).Get(pb)
	if err != nil {
		return nil, err
	}
	if !has {
		return nil, ErrPackageBlobNotExist
	}
	return pb, nil
}

// FindExpiredUnreferencedBlobs gets all blobs without associated files older than the specific duration
func FindExpiredUnreferencedBlobs(ctx context.Context, olderThan time.Duration) ([]*PackageBlob, error) {
	pbs := make([]*PackageBlob, 0, 10)
	return pbs, db.GetEngine(ctx).
		Table("package_blob").
		Join("LEFT OUTER", "package_file", "package_file.blob_id = package_blob.id").
		Where("package_file.id IS NULL AND package_blob.created_unix < ?", time.Now().Add(-olderThan).Unix()).
		Find(&pbs)
}

// DeleteBlobByID deletes a blob by id
func DeleteBlobByID(ctx context.Context, blobID int64) error {
	_, err := db.GetEngine(ctx).ID(blobID).Delete(&PackageBlob{})
	return err
}

// GetTotalBlobSize returns the total blobs size in bytes
func GetTotalBlobSize() (int64, error) {
	return db.GetEngine(db.DefaultContext).
		SumInt(&PackageBlob{}, "size")
}
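GetOrInsertBlob is what makes uploads content-addressed: identical hashes map to one stored blob. A small caller sketch, not part of the diff (hash values and helper name are placeholders).

package example // illustrative snippet, not part of the PR

import (
	"context"

	packages_model "code.gitea.io/gitea/models/packages"
)

// deduplicateBlob returns the stored blob row; existed == true means identical
// content was uploaded before and the raw data does not need to be written again.
func deduplicateBlob(ctx context.Context, size int64, md5, sha1, sha256, sha512 string) (*packages_model.PackageBlob, bool, error) {
	return packages_model.GetOrInsertBlob(ctx, &packages_model.PackageBlob{
		Size:       size,
		HashMD5:    md5,
		HashSHA1:   sha1,
		HashSHA256: sha256,
		HashSHA512: sha512,
	})
}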
models/packages/package_blob_upload.go (new file)
@@ -0,0 +1,81 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package packages

import (
	"context"
	"errors"
	"strings"
	"time"

	"code.gitea.io/gitea/models/db"
	"code.gitea.io/gitea/modules/timeutil"
	"code.gitea.io/gitea/modules/util"
)

// ErrPackageBlobUploadNotExist indicates a package blob upload not exist error
var ErrPackageBlobUploadNotExist = errors.New("Package blob upload does not exist")

func init() {
	db.RegisterModel(new(PackageBlobUpload))
}

// PackageBlobUpload represents a package blob upload
type PackageBlobUpload struct {
	ID             string             `xorm:"pk"`
	BytesReceived  int64              `xorm:"NOT NULL DEFAULT 0"`
	HashStateBytes []byte             `xorm:"BLOB"`
	CreatedUnix    timeutil.TimeStamp `xorm:"created NOT NULL"`
	UpdatedUnix    timeutil.TimeStamp `xorm:"updated INDEX NOT NULL"`
}

// CreateBlobUpload inserts a blob upload
func CreateBlobUpload(ctx context.Context) (*PackageBlobUpload, error) {
	id, err := util.CryptoRandomString(25)
	if err != nil {
		return nil, err
	}

	pbu := &PackageBlobUpload{
		ID: strings.ToLower(id),
	}

	_, err = db.GetEngine(ctx).Insert(pbu)
	return pbu, err
}

// GetBlobUploadByID gets a blob upload by id
func GetBlobUploadByID(ctx context.Context, id string) (*PackageBlobUpload, error) {
	pbu := &PackageBlobUpload{}

	has, err := db.GetEngine(ctx).ID(id).Get(pbu)
	if err != nil {
		return nil, err
	}
	if !has {
		return nil, ErrPackageBlobUploadNotExist
	}
	return pbu, nil
}

// UpdateBlobUpload updates the blob upload
func UpdateBlobUpload(ctx context.Context, pbu *PackageBlobUpload) error {
	_, err := db.GetEngine(ctx).ID(pbu.ID).Update(pbu)
	return err
}

// DeleteBlobUploadByID deletes the blob upload
func DeleteBlobUploadByID(ctx context.Context, id string) error {
	_, err := db.GetEngine(ctx).ID(id).Delete(&PackageBlobUpload{})
	return err
}

// FindExpiredBlobUploads gets all expired blob uploads
func FindExpiredBlobUploads(ctx context.Context, olderThan time.Duration) ([]*PackageBlobUpload, error) {
	pbus := make([]*PackageBlobUpload, 0, 10)
	return pbus, db.GetEngine(ctx).
		Where("updated_unix < ?", time.Now().Add(-olderThan).Unix()).
		Find(&pbus)
}
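A sketch of the upload-session lifecycle these helpers imply (presumably driven by the container registry's chunked uploads); the function name and chunk handling are assumptions, not code from the PR.

package example // illustrative snippet, not part of the PR

import (
	"context"

	packages_model "code.gitea.io/gitea/models/packages"
)

// trackChunk records one received chunk against an upload session and removes
// the session once the upload is complete.
func trackChunk(ctx context.Context, uploadID string, chunk []byte, done bool) error {
	pbu, err := packages_model.GetBlobUploadByID(ctx, uploadID)
	if err != nil {
		return err
	}

	pbu.BytesReceived += int64(len(chunk))
	if err := packages_model.UpdateBlobUpload(ctx, pbu); err != nil {
		return err
	}

	if done {
		return packages_model.DeleteBlobUploadByID(ctx, pbu.ID)
	}
	return nil
}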
models/packages/package_file.go (new file)
@@ -0,0 +1,201 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package packages

import (
	"context"
	"errors"
	"strconv"
	"strings"
	"time"

	"code.gitea.io/gitea/models/db"
	"code.gitea.io/gitea/modules/timeutil"

	"xorm.io/builder"
)

func init() {
	db.RegisterModel(new(PackageFile))
}

var (
	// ErrDuplicatePackageFile indicates a duplicated package file error
	ErrDuplicatePackageFile = errors.New("Package file does exist already")
	// ErrPackageFileNotExist indicates a package file not exist error
	ErrPackageFileNotExist = errors.New("Package file does not exist")
)

// EmptyFileKey is a named constant for an empty file key
const EmptyFileKey = ""

// PackageFile represents a package file
type PackageFile struct {
	ID           int64              `xorm:"pk autoincr"`
	VersionID    int64              `xorm:"UNIQUE(s) INDEX NOT NULL"`
	BlobID       int64              `xorm:"INDEX NOT NULL"`
	Name         string             `xorm:"NOT NULL"`
	LowerName    string             `xorm:"UNIQUE(s) INDEX NOT NULL"`
	CompositeKey string             `xorm:"UNIQUE(s) INDEX"`
	IsLead       bool               `xorm:"NOT NULL DEFAULT false"`
	CreatedUnix  timeutil.TimeStamp `xorm:"created INDEX NOT NULL"`
}

// TryInsertFile inserts a file. If the file exists already ErrDuplicatePackageFile is returned
func TryInsertFile(ctx context.Context, pf *PackageFile) (*PackageFile, error) {
	e := db.GetEngine(ctx)

	key := &PackageFile{
		VersionID:    pf.VersionID,
		LowerName:    pf.LowerName,
		CompositeKey: pf.CompositeKey,
	}

	has, err := e.Get(key)
	if err != nil {
		return nil, err
	}
	if has {
		return pf, ErrDuplicatePackageFile
	}
	if _, err = e.Insert(pf); err != nil {
		return nil, err
	}
	return pf, nil
}

// GetFilesByVersionID gets all files of a version
func GetFilesByVersionID(ctx context.Context, versionID int64) ([]*PackageFile, error) {
	pfs := make([]*PackageFile, 0, 10)
	return pfs, db.GetEngine(ctx).Where("version_id = ?", versionID).Find(&pfs)
}

// GetFileForVersionByID gets a file of a version by id
func GetFileForVersionByID(ctx context.Context, versionID, fileID int64) (*PackageFile, error) {
	pf := &PackageFile{
		VersionID: versionID,
	}

	has, err := db.GetEngine(ctx).ID(fileID).Get(pf)
	if err != nil {
		return nil, err
	}
	if !has {
		return nil, ErrPackageFileNotExist
	}
	return pf, nil
}

// GetFileForVersionByName gets a file of a version by name
func GetFileForVersionByName(ctx context.Context, versionID int64, name, key string) (*PackageFile, error) {
	if name == "" {
		return nil, ErrPackageFileNotExist
	}

	pf := &PackageFile{
		VersionID:    versionID,
		LowerName:    strings.ToLower(name),
		CompositeKey: key,
	}

	has, err := db.GetEngine(ctx).Get(pf)
	if err != nil {
		return nil, err
	}
	if !has {
		return nil, ErrPackageFileNotExist
	}
	return pf, nil
}

// DeleteFileByID deletes a file
func DeleteFileByID(ctx context.Context, fileID int64) error {
	_, err := db.GetEngine(ctx).ID(fileID).Delete(&PackageFile{})
	return err
}

// PackageFileSearchOptions are options for SearchXXX methods
type PackageFileSearchOptions struct {
	OwnerID      int64
	PackageType  string
	VersionID    int64
	Query        string
	CompositeKey string
	Properties   map[string]string
	OlderThan    time.Duration
	db.Paginator
}

func (opts *PackageFileSearchOptions) toConds() builder.Cond {
	cond := builder.NewCond()

	if opts.VersionID != 0 {
		cond = cond.And(builder.Eq{"package_file.version_id": opts.VersionID})
	} else if opts.OwnerID != 0 || (opts.PackageType != "" && opts.PackageType != "all") {
		var versionCond builder.Cond = builder.Eq{
			"package_version.is_internal": false,
		}
		if opts.OwnerID != 0 {
			versionCond = versionCond.And(builder.Eq{"package.owner_id": opts.OwnerID})
		}
		if opts.PackageType != "" && opts.PackageType != "all" {
			versionCond = versionCond.And(builder.Eq{"package.type": opts.PackageType})
		}

		in := builder.
			Select("package_version.id").
			From("package_version").
			Join("INNER", "package", "package.id = package_version.package_id").
			Where(versionCond)

		cond = cond.And(builder.In("package_file.version_id", in))
	}
	if opts.CompositeKey != "" {
		cond = cond.And(builder.Eq{"package_file.composite_key": opts.CompositeKey})
	}
	if opts.Query != "" {
		cond = cond.And(builder.Like{"package_file.lower_name", strings.ToLower(opts.Query)})
	}

	if len(opts.Properties) != 0 {
		var propsCond builder.Cond = builder.Eq{
			"package_property.ref_type": PropertyTypeFile,
		}
		propsCond = propsCond.And(builder.Expr("package_property.ref_id = package_file.id"))

		propsCondBlock := builder.NewCond()
		for name, value := range opts.Properties {
			propsCondBlock = propsCondBlock.Or(builder.Eq{
				"package_property.name":  name,
				"package_property.value": value,
			})
		}
		propsCond = propsCond.And(propsCondBlock)

		cond = cond.And(builder.Eq{
			strconv.Itoa(len(opts.Properties)): builder.Select("COUNT(*)").Where(propsCond).From("package_property"),
		})
	}

	if opts.OlderThan != 0 {
		cond = cond.And(builder.Lt{"package_file.created_unix": time.Now().Add(-opts.OlderThan).Unix()})
	}

	return cond
}

// SearchFiles gets all files of packages matching the search options
func SearchFiles(ctx context.Context, opts *PackageFileSearchOptions) ([]*PackageFile, int64, error) {
	sess := db.GetEngine(ctx).
		Where(opts.toConds())

	if opts.Paginator != nil {
		sess = db.SetSessionPagination(sess, opts)
	}

	pfs := make([]*PackageFile, 0, 10)
	count, err := sess.FindAndCount(&pfs)
	return pfs, count, err
}
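How the search options above compose, shown as a caller sketch that is not part of the diff; db.ListOptions as the Paginator and the helper name are assumptions, not something this file defines.

package example // illustrative snippet, not part of the PR

import (
	"context"

	"code.gitea.io/gitea/models/db"
	packages_model "code.gitea.io/gitea/models/packages"
)

// latestNpmFiles lists the first page of files belonging to one owner's npm packages.
func latestNpmFiles(ctx context.Context, ownerID int64) ([]*packages_model.PackageFile, int64, error) {
	return packages_model.SearchFiles(ctx, &packages_model.PackageFileSearchOptions{
		OwnerID:     ownerID,
		PackageType: string(packages_model.TypeNpm),
		Paginator:   &db.ListOptions{Page: 1, PageSize: 10}, // assumed paginator type
	})
}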
models/packages/package_property.go (new file)
@@ -0,0 +1,70 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package packages

import (
	"context"

	"code.gitea.io/gitea/models/db"
)

func init() {
	db.RegisterModel(new(PackageProperty))
}

type PropertyType int64

const (
	// PropertyTypeVersion means the reference is a package version
	PropertyTypeVersion PropertyType = iota // 0
	// PropertyTypeFile means the reference is a package file
	PropertyTypeFile // 1
)

// PackageProperty represents a property of a package version or file
type PackageProperty struct {
	ID      int64        `xorm:"pk autoincr"`
	RefType PropertyType `xorm:"INDEX NOT NULL"`
	RefID   int64        `xorm:"INDEX NOT NULL"`
	Name    string       `xorm:"INDEX NOT NULL"`
	Value   string       `xorm:"TEXT NOT NULL"`
}

// InsertProperty creates a property
func InsertProperty(ctx context.Context, refType PropertyType, refID int64, name, value string) (*PackageProperty, error) {
	pp := &PackageProperty{
		RefType: refType,
		RefID:   refID,
		Name:    name,
		Value:   value,
	}

	_, err := db.GetEngine(ctx).Insert(pp)
	return pp, err
}

// GetProperties gets all properties
func GetProperties(ctx context.Context, refType PropertyType, refID int64) ([]*PackageProperty, error) {
	pps := make([]*PackageProperty, 0, 10)
	return pps, db.GetEngine(ctx).Where("ref_type = ? AND ref_id = ?", refType, refID).Find(&pps)
}

// GetPropertiesByName gets all properties with a specific name
func GetPropertiesByName(ctx context.Context, refType PropertyType, refID int64, name string) ([]*PackageProperty, error) {
	pps := make([]*PackageProperty, 0, 10)
	return pps, db.GetEngine(ctx).Where("ref_type = ? AND ref_id = ? AND name = ?", refType, refID, name).Find(&pps)
}

// DeleteAllProperties deletes all properties of a ref
func DeleteAllProperties(ctx context.Context, refType PropertyType, refID int64) error {
	_, err := db.GetEngine(ctx).Where("ref_type = ? AND ref_id = ?", refType, refID).Delete(&PackageProperty{})
	return err
}

// DeletePropertyByID deletes a property
func DeletePropertyByID(ctx context.Context, propertyID int64) error {
	_, err := db.GetEngine(ctx).ID(propertyID).Delete(&PackageProperty{})
	return err
}
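Properties are free-form name/value pairs hung off a version or file. A round-trip sketch, not part of the diff; the "npm.tag" key is only an example value.

package example // illustrative snippet, not part of the PR

import (
	"context"
	"fmt"

	packages_model "code.gitea.io/gitea/models/packages"
)

// tagVersion stores a property on a version and reads it back.
func tagVersion(ctx context.Context, versionID int64) error {
	if _, err := packages_model.InsertProperty(ctx, packages_model.PropertyTypeVersion, versionID, "npm.tag", "latest"); err != nil {
		return err
	}

	pps, err := packages_model.GetPropertiesByName(ctx, packages_model.PropertyTypeVersion, versionID, "npm.tag")
	if err != nil {
		return err
	}
	for _, pp := range pps {
		fmt.Printf("property %s=%s\n", pp.Name, pp.Value)
	}
	return nil
}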
models/packages/package_version.go (new file)
@@ -0,0 +1,316 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package packages

import (
	"context"
	"errors"
	"strconv"
	"strings"

	"code.gitea.io/gitea/models/db"
	"code.gitea.io/gitea/modules/timeutil"

	"xorm.io/builder"
)

var (
	// ErrDuplicatePackageVersion indicates a duplicated package version error
	ErrDuplicatePackageVersion = errors.New("Package version does exist already")
	// ErrPackageVersionNotExist indicates a package version not exist error
	ErrPackageVersionNotExist = errors.New("Package version does not exist")
)

func init() {
	db.RegisterModel(new(PackageVersion))
}

// PackageVersion represents a package version
type PackageVersion struct {
	ID            int64              `xorm:"pk autoincr"`
	PackageID     int64              `xorm:"UNIQUE(s) INDEX NOT NULL"`
	CreatorID     int64              `xorm:"NOT NULL DEFAULT 0"`
	Version       string             `xorm:"NOT NULL"`
	LowerVersion  string             `xorm:"UNIQUE(s) INDEX NOT NULL"`
	CreatedUnix   timeutil.TimeStamp `xorm:"created INDEX NOT NULL"`
	IsInternal    bool               `xorm:"INDEX NOT NULL DEFAULT false"`
	MetadataJSON  string             `xorm:"metadata_json TEXT"`
	DownloadCount int64              `xorm:"NOT NULL DEFAULT 0"`
}

// GetOrInsertVersion inserts a version. If the same version exists already ErrDuplicatePackageVersion is returned
func GetOrInsertVersion(ctx context.Context, pv *PackageVersion) (*PackageVersion, error) {
	e := db.GetEngine(ctx)

	key := &PackageVersion{
		PackageID:    pv.PackageID,
		LowerVersion: pv.LowerVersion,
	}

	has, err := e.Get(key)
	if err != nil {
		return nil, err
	}
	if has {
		return key, ErrDuplicatePackageVersion
	}
	if _, err = e.Insert(pv); err != nil {
		return nil, err
	}
	return pv, nil
}

// UpdateVersion updates a version
func UpdateVersion(ctx context.Context, pv *PackageVersion) error {
	_, err := db.GetEngine(ctx).ID(pv.ID).Update(pv)
	return err
}

// IncrementDownloadCounter increments the download counter of a version
func IncrementDownloadCounter(ctx context.Context, versionID int64) error {
	_, err := db.GetEngine(ctx).Exec("UPDATE `package_version` SET `download_count` = `download_count` + 1 WHERE `id` = ?", versionID)
	return err
}

// GetVersionByID gets a version by id
func GetVersionByID(ctx context.Context, versionID int64) (*PackageVersion, error) {
	pv := &PackageVersion{}

	has, err := db.GetEngine(ctx).ID(versionID).Get(pv)
	if err != nil {
		return nil, err
	}
	if !has {
		return nil, ErrPackageNotExist
	}
	return pv, nil
}

// GetVersionByNameAndVersion gets a version by name and version number
func GetVersionByNameAndVersion(ctx context.Context, ownerID int64, packageType Type, name, version string) (*PackageVersion, error) {
	return getVersionByNameAndVersion(ctx, ownerID, packageType, name, version, false)
}

// GetInternalVersionByNameAndVersion gets an internal version by name and version number
func GetInternalVersionByNameAndVersion(ctx context.Context, ownerID int64, packageType Type, name, version string) (*PackageVersion, error) {
	return getVersionByNameAndVersion(ctx, ownerID, packageType, name, version, true)
}

func getVersionByNameAndVersion(ctx context.Context, ownerID int64, packageType Type, name, version string, isInternal bool) (*PackageVersion, error) {
	var cond builder.Cond = builder.Eq{
		"package.owner_id":            ownerID,
		"package.type":                packageType,
		"package.lower_name":          strings.ToLower(name),
		"package_version.is_internal": isInternal,
	}
	pv := &PackageVersion{
		LowerVersion: strings.ToLower(version),
	}
	has, err := db.GetEngine(ctx).
		Join("INNER", "package", "package.id = package_version.package_id").
		Where(cond).
		Get(pv)
	if err != nil {
		return nil, err
	}
	if !has {
		return nil, ErrPackageNotExist
	}

	return pv, nil
}

// GetVersionsByPackageType gets all versions of a specific type
func GetVersionsByPackageType(ctx context.Context, ownerID int64, packageType Type) ([]*PackageVersion, error) {
	var cond builder.Cond = builder.Eq{
		"package.owner_id":            ownerID,
		"package.type":                packageType,
		"package_version.is_internal": false,
	}

	pvs := make([]*PackageVersion, 0, 10)
	return pvs, db.GetEngine(ctx).
		Where(cond).
		Join("INNER", "package", "package.id = package_version.package_id").
		Find(&pvs)
}

// GetVersionsByPackageName gets all versions of a specific package
func GetVersionsByPackageName(ctx context.Context, ownerID int64, packageType Type, name string) ([]*PackageVersion, error) {
	var cond builder.Cond = builder.Eq{
		"package.owner_id":            ownerID,
		"package.type":                packageType,
		"package.lower_name":          strings.ToLower(name),
		"package_version.is_internal": false,
	}

	pvs := make([]*PackageVersion, 0, 10)
	return pvs, db.GetEngine(ctx).
		Where(cond).
		Join("INNER", "package", "package.id = package_version.package_id").
		Find(&pvs)
}

// GetVersionsByFilename gets all versions which are linked to a filename
func GetVersionsByFilename(ctx context.Context, ownerID int64, packageType Type, filename string) ([]*PackageVersion, error) {
	var cond builder.Cond = builder.Eq{
		"package.owner_id":            ownerID,
		"package.type":                packageType,
		"package_file.lower_name":     strings.ToLower(filename),
		"package_version.is_internal": false,
	}

	pvs := make([]*PackageVersion, 0, 10)
	return pvs, db.GetEngine(ctx).
		Where(cond).
		Join("INNER", "package_file", "package_file.version_id = package_version.id").
		Join("INNER", "package", "package.id = package_version.package_id").
		Find(&pvs)
}

// DeleteVersionByID deletes a version by id
func DeleteVersionByID(ctx context.Context, versionID int64) error {
	_, err := db.GetEngine(ctx).ID(versionID).Delete(&PackageVersion{})
	return err
}

// HasVersionFileReferences checks if there are associated files
func HasVersionFileReferences(ctx context.Context, versionID int64) (bool, error) {
	return db.GetEngine(ctx).Get(&PackageFile{
		VersionID: versionID,
	})
}

// PackageSearchOptions are options for SearchXXX methods
type PackageSearchOptions struct {
	OwnerID      int64
	RepoID       int64
	Type         string
	PackageID    int64
	QueryName    string
	QueryVersion string
	Properties   map[string]string
	Sort         string
	db.Paginator
}

func (opts *PackageSearchOptions) toConds() builder.Cond {
	var cond builder.Cond = builder.Eq{"package_version.is_internal": false}

	if opts.OwnerID != 0 {
		cond = cond.And(builder.Eq{"package.owner_id": opts.OwnerID})
	}
	if opts.RepoID != 0 {
		cond = cond.And(builder.Eq{"package.repo_id": opts.RepoID})
	}
	if opts.Type != "" && opts.Type != "all" {
		cond = cond.And(builder.Eq{"package.type": opts.Type})
	}
	if opts.PackageID != 0 {
		cond = cond.And(builder.Eq{"package.id": opts.PackageID})
	}
	if opts.QueryName != "" {
		cond = cond.And(builder.Like{"package.lower_name", strings.ToLower(opts.QueryName)})
	}
	if opts.QueryVersion != "" {
		cond = cond.And(builder.Like{"package_version.lower_version", strings.ToLower(opts.QueryVersion)})
	}

	if len(opts.Properties) != 0 {
		var propsCond builder.Cond = builder.Eq{
			"package_property.ref_type": PropertyTypeVersion,
		}
		propsCond = propsCond.And(builder.Expr("package_property.ref_id = package_version.id"))

		propsCondBlock := builder.NewCond()
		for name, value := range opts.Properties {
			propsCondBlock = propsCondBlock.Or(builder.Eq{
				"package_property.name":  name,
				"package_property.value": value,
			})
		}
		propsCond = propsCond.And(propsCondBlock)

		cond = cond.And(builder.Eq{
			strconv.Itoa(len(opts.Properties)): builder.Select("COUNT(*)").Where(propsCond).From("package_property"),
		})
	}

	return cond
}

func (opts *PackageSearchOptions) configureOrderBy(e db.Engine) {
	switch opts.Sort {
	case "alphabetically":
		e.Asc("package.name")
	case "reversealphabetically":
		e.Desc("package.name")
	case "highestversion":
		e.Desc("package_version.version")
	case "lowestversion":
		e.Asc("package_version.version")
	case "oldest":
		e.Asc("package_version.created_unix")
	default:
		e.Desc("package_version.created_unix")
	}
}

// SearchVersions gets all versions of packages matching the search options
func SearchVersions(ctx context.Context, opts *PackageSearchOptions) ([]*PackageVersion, int64, error) {
	sess := db.GetEngine(ctx).
		Where(opts.toConds()).
		Table("package_version").
		Join("INNER", "package", "package.id = package_version.package_id")

	opts.configureOrderBy(sess)

	if opts.Paginator != nil {
		sess = db.SetSessionPagination(sess, opts)
	}

	pvs := make([]*PackageVersion, 0, 10)
	count, err := sess.FindAndCount(&pvs)
	return pvs, count, err
}

// SearchLatestVersions gets the latest version of every package matching the search options
func SearchLatestVersions(ctx context.Context, opts *PackageSearchOptions) ([]*PackageVersion, int64, error) {
	cond := opts.toConds().
		And(builder.Expr("pv2.id IS NULL"))

	sess := db.GetEngine(ctx).
		Table("package_version").
		Join("LEFT", "package_version pv2", "package_version.package_id = pv2.package_id AND (package_version.created_unix < pv2.created_unix OR (package_version.created_unix = pv2.created_unix AND package_version.id < pv2.id))").
		Join("INNER", "package", "package.id = package_version.package_id").
		Where(cond)

	opts.configureOrderBy(sess)

	if opts.Paginator != nil {
		sess = db.SetSessionPagination(sess, opts)
	}

	pvs := make([]*PackageVersion, 0, 10)
	count, err := sess.FindAndCount(&pvs)
	return pvs, count, err
}

// FindVersionsByPropertyNameAndValue gets all package versions which are associated with a specific property + value
func FindVersionsByPropertyNameAndValue(ctx context.Context, packageID int64, name, value string) ([]*PackageVersion, error) {
	var cond builder.Cond = builder.Eq{
		"package_property.ref_type":   PropertyTypeVersion,
		"package_property.name":       name,
		"package_property.value":      value,
		"package_version.package_id":  packageID,
		"package_version.is_internal": false,
	}

	pvs := make([]*PackageVersion, 0, 5)
	return pvs, db.GetEngine(ctx).
		Where(cond).
		Join("INNER", "package_property", "package_property.ref_id = package_version.id").
		Find(&pvs)
}
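SearchLatestVersions uses the self-join above to keep only the newest row per package. A caller sketch, not part of the diff; db.ListOptions as the Paginator is an assumption.

package example // illustrative snippet, not part of the PR

import (
	"context"

	"code.gitea.io/gitea/models/db"
	packages_model "code.gitea.io/gitea/models/packages"
)

// latestPackages lists the newest visible version of every package owned by ownerID,
// newest first (the default sort in configureOrderBy).
func latestPackages(ctx context.Context, ownerID int64) ([]*packages_model.PackageVersion, int64, error) {
	return packages_model.SearchLatestVersions(ctx, &packages_model.PackageSearchOptions{
		OwnerID:   ownerID,
		Type:      "all",
		Paginator: &db.ListOptions{Page: 1, PageSize: 20}, // assumed paginator type
	})
}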
@@ -26,7 +26,7 @@ import (
 )
 
 var (
-	reservedRepoNames    = []string{".", ".."}
+	reservedRepoNames    = []string{".", "..", "-"}
 	reservedRepoPatterns = []string{"*.git", "*.wiki", "*.rss", "*.atom"}
 )
 
@@ -95,6 +95,8 @@ func MainTest(m *testing.M, pathToGiteaRoot string, fixtureFiles ...string) {
 
 	setting.RepoArchive.Storage.Path = filepath.Join(setting.AppDataPath, "repo-archive")
 
+	setting.Packages.Storage.Path = filepath.Join(setting.AppDataPath, "packages")
+
 	if err = storage.Init(); err != nil {
 		fatalTestError("storage.Init: %v\n", err)
 	}
@@ -605,6 +605,7 @@ var (
 		"stars",
 		"template",
 		"user",
+		"v2",
 	}
 
 	reservedUserPatterns = []string{"*.keys", "*.gpg", "*.rss", "*.atom"}
@@ -49,6 +49,7 @@ const (
 	HookEventPullRequestSync HookEventType = "pull_request_sync"
 	HookEventRepository      HookEventType = "repository"
 	HookEventRelease         HookEventType = "release"
+	HookEventPackage         HookEventType = "package"
 )
 
 // Event returns the HookEventType as an event string
@@ -134,6 +134,7 @@ type HookEvents struct {
 	PullRequestSync bool `json:"pull_request_sync"`
 	Repository      bool `json:"repository"`
 	Release         bool `json:"release"`
+	Package         bool `json:"package"`
 }
 
 // HookEvent represents events that will delivery hook.
@@ -339,6 +340,12 @@ func (w *Webhook) HasRepositoryEvent() bool {
 		(w.ChooseEvents && w.HookEvents.Repository)
 }
 
+// HasPackageEvent returns if hook enabled package event.
+func (w *Webhook) HasPackageEvent() bool {
+	return w.SendEverything ||
+		(w.ChooseEvents && w.HookEvents.Package)
+}
+
 // EventCheckers returns event checkers
 func (w *Webhook) EventCheckers() []struct {
 	Has func() bool
@@ -368,6 +375,7 @@ func (w *Webhook) EventCheckers() []struct {
 		{w.HasPullRequestSyncEvent, HookEventPullRequestSync},
 		{w.HasRepositoryEvent, HookEventRepository},
 		{w.HasReleaseEvent, HookEventRelease},
+		{w.HasPackageEvent, HookEventPackage},
 	}
 }
 
@@ -72,6 +72,7 @@ func TestWebhook_EventsArray(t *testing.T) {
 		"pull_request", "pull_request_assign", "pull_request_label", "pull_request_milestone",
 		"pull_request_comment", "pull_request_review_approved", "pull_request_review_rejected",
 		"pull_request_review_comment", "pull_request_sync", "repository", "release",
+		"package",
 	},
 		(&Webhook{
 			HookEvent: &HookEvent{SendEverything: true},
@@ -70,6 +70,7 @@ type Context struct {
 	ContextUser *user_model.User
 	Repo        *Repository
 	Org         *Organization
+	Package     *Package
 }
 
 // TrHTMLEscapeArgs runs Tr but pre-escapes all arguments with html.EscapeString.
@@ -331,6 +332,18 @@ func (ctx *Context) RespHeader() http.Header {
 	return ctx.Resp.Header()
 }
 
+// SetServeHeaders sets necessary content serve headers
+func (ctx *Context) SetServeHeaders(filename string) {
+	ctx.Resp.Header().Set("Content-Description", "File Transfer")
+	ctx.Resp.Header().Set("Content-Type", "application/octet-stream")
+	ctx.Resp.Header().Set("Content-Disposition", "attachment; filename="+filename)
+	ctx.Resp.Header().Set("Content-Transfer-Encoding", "binary")
+	ctx.Resp.Header().Set("Expires", "0")
+	ctx.Resp.Header().Set("Cache-Control", "must-revalidate")
+	ctx.Resp.Header().Set("Pragma", "public")
+	ctx.Resp.Header().Set("Access-Control-Expose-Headers", "Content-Disposition")
+}
+
 // ServeContent serves content to http request
 func (ctx *Context) ServeContent(name string, r io.ReadSeeker, params ...interface{}) {
 	modTime := time.Now()
@@ -340,14 +353,7 @@ func (ctx *Context) ServeContent(name string, r io.ReadSeeker, params ...interface{}) {
 			modTime = v
 		}
 	}
-	ctx.Resp.Header().Set("Content-Description", "File Transfer")
-	ctx.Resp.Header().Set("Content-Type", "application/octet-stream")
-	ctx.Resp.Header().Set("Content-Disposition", "attachment; filename="+name)
-	ctx.Resp.Header().Set("Content-Transfer-Encoding", "binary")
-	ctx.Resp.Header().Set("Expires", "0")
-	ctx.Resp.Header().Set("Cache-Control", "must-revalidate")
-	ctx.Resp.Header().Set("Pragma", "public")
-	ctx.Resp.Header().Set("Access-Control-Expose-Headers", "Content-Disposition")
+	ctx.SetServeHeaders(name)
 	http.ServeContent(ctx.Resp, ctx.Req, name, modTime, r)
 }
 
@@ -359,31 +365,41 @@ func (ctx *Context) ServeFile(file string, names ...string) {
 	} else {
 		name = path.Base(file)
 	}
-	ctx.Resp.Header().Set("Content-Description", "File Transfer")
-	ctx.Resp.Header().Set("Content-Type", "application/octet-stream")
-	ctx.Resp.Header().Set("Content-Disposition", "attachment; filename="+name)
-	ctx.Resp.Header().Set("Content-Transfer-Encoding", "binary")
-	ctx.Resp.Header().Set("Expires", "0")
-	ctx.Resp.Header().Set("Cache-Control", "must-revalidate")
-	ctx.Resp.Header().Set("Pragma", "public")
+	ctx.SetServeHeaders(name)
 	http.ServeFile(ctx.Resp, ctx.Req, file)
 }
 
 // ServeStream serves file via io stream
 func (ctx *Context) ServeStream(rd io.Reader, name string) {
-	ctx.Resp.Header().Set("Content-Description", "File Transfer")
-	ctx.Resp.Header().Set("Content-Type", "application/octet-stream")
-	ctx.Resp.Header().Set("Content-Disposition", "attachment; filename="+name)
-	ctx.Resp.Header().Set("Content-Transfer-Encoding", "binary")
-	ctx.Resp.Header().Set("Expires", "0")
-	ctx.Resp.Header().Set("Cache-Control", "must-revalidate")
-	ctx.Resp.Header().Set("Pragma", "public")
+	ctx.SetServeHeaders(name)
 	_, err := io.Copy(ctx.Resp, rd)
 	if err != nil {
 		ctx.ServerError("Download file failed", err)
 	}
 }
 
+// UploadStream returns the request body or the first form file
+// Only form files need to get closed.
+func (ctx *Context) UploadStream() (rd io.ReadCloser, needToClose bool, err error) {
+	contentType := strings.ToLower(ctx.Req.Header.Get("Content-Type"))
+	if strings.HasPrefix(contentType, "application/x-www-form-urlencoded") || strings.HasPrefix(contentType, "multipart/form-data") {
+		if err := ctx.Req.ParseMultipartForm(32 << 20); err != nil {
+			return nil, false, err
+		}
+		if ctx.Req.MultipartForm.File == nil {
+			return nil, false, http.ErrMissingFile
+		}
+		for _, files := range ctx.Req.MultipartForm.File {
+			if len(files) > 0 {
+				r, err := files[0].Open()
+				return r, true, err
+			}
+		}
+		return nil, false, http.ErrMissingFile
+	}
+	return ctx.Req.Body, false, nil
+}
+
 // Error returned an error to web browser
 func (ctx *Context) Error(status int, contents ...string) {
 	v := http.StatusText(status)
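The UploadStream helper added above lets registry routes accept both raw request bodies and multipart form uploads through one code path. A hedged handler sketch, not part of the diff (handler name and response handling are assumptions).

package example // illustrative snippet, not part of the PR

import (
	"crypto/sha256"
	"fmt"
	"io"
	"net/http"

	"code.gitea.io/gitea/modules/context"
)

// uploadGenericFile drains the upload while hashing it; a real handler would
// stream the bytes into the package content store instead of discarding them.
func uploadGenericFile(ctx *context.Context) {
	upload, needToClose, err := ctx.UploadStream()
	if err != nil {
		ctx.Error(http.StatusBadRequest, err.Error())
		return
	}
	if needToClose {
		defer upload.Close()
	}

	h := sha256.New()
	if _, err := io.Copy(h, upload); err != nil {
		ctx.Error(http.StatusInternalServerError, err.Error())
		return
	}

	ctx.Resp.Header().Set("Content-Type", "text/plain")
	ctx.Resp.WriteHeader(http.StatusCreated)
	fmt.Fprintf(ctx.Resp, "%x\n", h.Sum(nil))
}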
modules/context/package.go (new file)
@@ -0,0 +1,109 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package context

import (
	"fmt"
	"net/http"

	"code.gitea.io/gitea/models/organization"
	packages_model "code.gitea.io/gitea/models/packages"
	"code.gitea.io/gitea/models/perm"
	user_model "code.gitea.io/gitea/models/user"
)

// Package contains owner, access mode and optionally the package descriptor
type Package struct {
	Owner      *user_model.User
	AccessMode perm.AccessMode
	Descriptor *packages_model.PackageDescriptor
}

// PackageAssignment returns a middleware to handle Context.Package assignment
func PackageAssignment() func(ctx *Context) {
	return func(ctx *Context) {
		packageAssignment(ctx, func(status int, title string, obj interface{}) {
			err, ok := obj.(error)
			if !ok {
				err = fmt.Errorf("%s", obj)
			}
			if status == http.StatusNotFound {
				ctx.NotFound(title, err)
			} else {
				ctx.ServerError(title, err)
			}
		})
	}
}

// PackageAssignmentAPI returns a middleware to handle Context.Package assignment
func PackageAssignmentAPI() func(ctx *APIContext) {
	return func(ctx *APIContext) {
		packageAssignment(ctx.Context, ctx.Error)
	}
}

func packageAssignment(ctx *Context, errCb func(int, string, interface{})) {
	ctx.Package = &Package{
		Owner: ctx.ContextUser,
	}

	if ctx.Doer != nil && ctx.Doer.ID == ctx.ContextUser.ID {
		ctx.Package.AccessMode = perm.AccessModeOwner
	} else {
		if ctx.Package.Owner.IsOrganization() {
			if organization.HasOrgOrUserVisible(ctx, ctx.Package.Owner, ctx.Doer) {
				ctx.Package.AccessMode = perm.AccessModeRead
				if ctx.Doer != nil {
					var err error
					ctx.Package.AccessMode, err = organization.OrgFromUser(ctx.Package.Owner).GetOrgUserMaxAuthorizeLevel(ctx.Doer.ID)
					if err != nil {
						errCb(http.StatusInternalServerError, "GetOrgUserMaxAuthorizeLevel", err)
						return
					}
				}
			}
		} else {
			ctx.Package.AccessMode = perm.AccessModeRead
		}
	}

	packageType := ctx.Params("type")
	name := ctx.Params("name")
	version := ctx.Params("version")
	if packageType != "" && name != "" && version != "" {
		pv, err := packages_model.GetVersionByNameAndVersion(ctx, ctx.Package.Owner.ID, packages_model.Type(packageType), name, version)
		if err != nil {
			if err == packages_model.ErrPackageNotExist {
				errCb(http.StatusNotFound, "GetVersionByNameAndVersion", err)
			} else {
				errCb(http.StatusInternalServerError, "GetVersionByNameAndVersion", err)
			}
			return
		}

		ctx.Package.Descriptor, err = packages_model.GetPackageDescriptor(ctx, pv)
		if err != nil {
			errCb(http.StatusInternalServerError, "GetPackageDescriptor", err)
			return
		}
	}
}

// PackageContexter initializes a package context for a request.
func PackageContexter() func(next http.Handler) http.Handler {
	return func(next http.Handler) http.Handler {
		return http.HandlerFunc(func(resp http.ResponseWriter, req *http.Request) {
			ctx := Context{
				Resp: NewResponse(resp),
				Data: map[string]interface{}{},
			}

			ctx.Req = WithContext(req, &ctx)

			next.ServeHTTP(ctx.Resp, ctx.Req)
		})
	}
}
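A handler running behind PackageAssignment() can rely on ctx.Package being populated. The permission guard and handler name below are assumptions about typical use, not code from this PR.

package example // illustrative snippet, not part of the PR

import (
	"fmt"
	"net/http"

	"code.gitea.io/gitea/models/perm"
	"code.gitea.io/gitea/modules/context"
)

// viewPackageVersion prints the resolved package; Descriptor is only set when
// the :type/:name/:version parameters were present on the route.
func viewPackageVersion(ctx *context.Context) {
	if ctx.Package.AccessMode < perm.AccessModeRead { // assumed ordering of access modes
		ctx.Error(http.StatusForbidden)
		return
	}
	pd := ctx.Package.Descriptor
	fmt.Fprintf(ctx.Resp, "%s %s\n", pd.Package.Name, pd.Version.Version)
}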
43
modules/convert/package.go
Normal file
43
modules/convert/package.go
Normal file
|
@ -0,0 +1,43 @@
|
||||||
|
// Copyright 2021 The Gitea Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a MIT-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package convert
|
||||||
|
|
||||||
|
import (
|
||||||
|
"code.gitea.io/gitea/models/packages"
|
||||||
|
"code.gitea.io/gitea/models/perm"
|
||||||
|
api "code.gitea.io/gitea/modules/structs"
|
||||||
|
)
|
||||||
|
|
||||||
|
// ToPackage convert a packages.PackageDescriptor to api.Package
|
||||||
|
func ToPackage(pd *packages.PackageDescriptor) *api.Package {
|
||||||
|
var repo *api.Repository
|
||||||
|
if pd.Repository != nil {
|
||||||
|
repo = ToRepo(pd.Repository, perm.AccessModeNone)
|
||||||
|
}
|
||||||
|
|
||||||
|
return &api.Package{
|
||||||
|
ID: pd.Version.ID,
|
||||||
|
Owner: ToUser(pd.Owner, nil),
|
||||||
|
Repository: repo,
|
||||||
|
Creator: ToUser(pd.Creator, nil),
|
||||||
|
Type: string(pd.Package.Type),
|
||||||
|
Name: pd.Package.Name,
|
||||||
|
Version: pd.Version.Version,
|
||||||
|
CreatedAt: pd.Version.CreatedUnix.AsTime(),
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// ToPackageFile converts packages.PackageFileDescriptor to api.PackageFile
|
||||||
|
func ToPackageFile(pfd *packages.PackageFileDescriptor) *api.PackageFile {
|
||||||
|
return &api.PackageFile{
|
||||||
|
ID: pfd.File.ID,
|
||||||
|
Size: pfd.Blob.Size,
|
||||||
|
Name: pfd.File.Name,
|
||||||
|
HashMD5: pfd.Blob.HashMD5,
|
||||||
|
HashSHA1: pfd.Blob.HashSHA1,
|
||||||
|
HashSHA256: pfd.Blob.HashSHA256,
|
||||||
|
HashSHA512: pfd.Blob.HashSHA512,
|
||||||
|
}
|
||||||
|
}
|
|
@ -6,6 +6,7 @@ package base
|
||||||
|
|
||||||
import (
|
import (
|
||||||
"code.gitea.io/gitea/models"
|
"code.gitea.io/gitea/models"
|
||||||
|
packages_model "code.gitea.io/gitea/models/packages"
|
||||||
repo_model "code.gitea.io/gitea/models/repo"
|
repo_model "code.gitea.io/gitea/models/repo"
|
||||||
user_model "code.gitea.io/gitea/models/user"
|
user_model "code.gitea.io/gitea/models/user"
|
||||||
"code.gitea.io/gitea/modules/repository"
|
"code.gitea.io/gitea/modules/repository"
|
||||||
|
@ -54,4 +55,6 @@ type Notifier interface {
|
||||||
NotifySyncCreateRef(doer *user_model.User, repo *repo_model.Repository, refType, refFullName, refID string)
|
NotifySyncCreateRef(doer *user_model.User, repo *repo_model.Repository, refType, refFullName, refID string)
|
||||||
NotifySyncDeleteRef(doer *user_model.User, repo *repo_model.Repository, refType, refFullName string)
|
NotifySyncDeleteRef(doer *user_model.User, repo *repo_model.Repository, refType, refFullName string)
|
||||||
NotifyRepoPendingTransfer(doer, newOwner *user_model.User, repo *repo_model.Repository)
|
NotifyRepoPendingTransfer(doer, newOwner *user_model.User, repo *repo_model.Repository)
|
||||||
|
NotifyPackageCreate(doer *user_model.User, pd *packages_model.PackageDescriptor)
|
||||||
|
NotifyPackageDelete(doer *user_model.User, pd *packages_model.PackageDescriptor)
|
||||||
}
|
}
|
||||||
|
|
|
@ -6,6 +6,7 @@ package base
|
||||||
|
|
||||||
import (
|
import (
|
||||||
"code.gitea.io/gitea/models"
|
"code.gitea.io/gitea/models"
|
||||||
|
packages_model "code.gitea.io/gitea/models/packages"
|
||||||
repo_model "code.gitea.io/gitea/models/repo"
|
repo_model "code.gitea.io/gitea/models/repo"
|
||||||
user_model "code.gitea.io/gitea/models/user"
|
user_model "code.gitea.io/gitea/models/user"
|
||||||
"code.gitea.io/gitea/modules/repository"
|
"code.gitea.io/gitea/modules/repository"
|
||||||
|
@ -173,3 +174,11 @@ func (*NullNotifier) NotifySyncDeleteRef(doer *user_model.User, repo *repo_model
|
||||||
// NotifyRepoPendingTransfer places a place holder function
|
// NotifyRepoPendingTransfer places a place holder function
|
||||||
func (*NullNotifier) NotifyRepoPendingTransfer(doer, newOwner *user_model.User, repo *repo_model.Repository) {
|
func (*NullNotifier) NotifyRepoPendingTransfer(doer, newOwner *user_model.User, repo *repo_model.Repository) {
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// NotifyPackageCreate places a place holder function
|
||||||
|
func (*NullNotifier) NotifyPackageCreate(doer *user_model.User, pd *packages_model.PackageDescriptor) {
|
||||||
|
}
|
||||||
|
|
||||||
|
// NotifyPackageDelete places a place holder function
|
||||||
|
func (*NullNotifier) NotifyPackageDelete(doer *user_model.User, pd *packages_model.PackageDescriptor) {
|
||||||
|
}
|
||||||
|
|
|
@ -6,6 +6,7 @@ package notification
|
||||||
|
|
||||||
import (
|
import (
|
||||||
"code.gitea.io/gitea/models"
|
"code.gitea.io/gitea/models"
|
||||||
|
packages_model "code.gitea.io/gitea/models/packages"
|
||||||
repo_model "code.gitea.io/gitea/models/repo"
|
repo_model "code.gitea.io/gitea/models/repo"
|
||||||
user_model "code.gitea.io/gitea/models/user"
|
user_model "code.gitea.io/gitea/models/user"
|
||||||
"code.gitea.io/gitea/modules/notification/action"
|
"code.gitea.io/gitea/modules/notification/action"
|
||||||
|
@ -306,3 +307,17 @@ func NotifyRepoPendingTransfer(doer, newOwner *user_model.User, repo *repo_model
|
||||||
notifier.NotifyRepoPendingTransfer(doer, newOwner, repo)
|
notifier.NotifyRepoPendingTransfer(doer, newOwner, repo)
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// NotifyPackageCreate notifies creation of a package to notifiers
|
||||||
|
func NotifyPackageCreate(doer *user_model.User, pd *packages_model.PackageDescriptor) {
|
||||||
|
for _, notifier := range notifiers {
|
||||||
|
notifier.NotifyPackageCreate(doer, pd)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// NotifyPackageDelete notifies deletion of a package to notifiers
|
||||||
|
func NotifyPackageDelete(doer *user_model.User, pd *packages_model.PackageDescriptor) {
|
||||||
|
for _, notifier := range notifiers {
|
||||||
|
notifier.NotifyPackageDelete(doer, pd)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
|
@@ -8,6 +8,7 @@ import (
	"fmt"

	"code.gitea.io/gitea/models"
	packages_model "code.gitea.io/gitea/models/packages"
	"code.gitea.io/gitea/models/perm"
	repo_model "code.gitea.io/gitea/models/repo"
	"code.gitea.io/gitea/models/unit"

@@ -855,3 +856,33 @@ func (m *webhookNotifier) NotifySyncCreateRef(pusher *user_model.User, repo *rep
func (m *webhookNotifier) NotifySyncDeleteRef(pusher *user_model.User, repo *repo_model.Repository, refType, refFullName string) {
	m.NotifyDeleteRef(pusher, repo, refType, refFullName)
}

func (m *webhookNotifier) NotifyPackageCreate(doer *user_model.User, pd *packages_model.PackageDescriptor) {
	notifyPackage(doer, pd, api.HookPackageCreated)
}

func (m *webhookNotifier) NotifyPackageDelete(doer *user_model.User, pd *packages_model.PackageDescriptor) {
	notifyPackage(doer, pd, api.HookPackageDeleted)
}

func notifyPackage(sender *user_model.User, pd *packages_model.PackageDescriptor, action api.HookPackageAction) {
	if pd.Repository == nil {
		// TODO https://github.com/go-gitea/gitea/pull/17940
		return
	}

	org := pd.Owner
	if !org.IsOrganization() {
		org = nil
	}

	if err := webhook_services.PrepareWebhooks(pd.Repository, webhook.HookEventPackage, &api.PackagePayload{
		Action:       action,
		Repository:   convert.ToRepo(pd.Repository, perm.AccessModeNone),
		Package:      convert.ToPackage(pd),
		Organization: convert.ToUser(org, nil),
		Sender:       convert.ToUser(sender, nil),
	}); err != nil {
		log.Error("PrepareWebhooks: %v", err)
	}
}

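Note (illustration, not part of the changeset): a minimal sketch of how a caller could fan a package event out through the new notification entry points. notifyAfterUpload/notifyAfterDelete are hypothetical helpers standing in for the real package service; doer and pd are assumed to be produced by whatever code created or removed the package version.

// Hypothetical wiring inside a package service.
package packages

import (
	packages_model "code.gitea.io/gitea/models/packages"
	user_model "code.gitea.io/gitea/models/user"
	"code.gitea.io/gitea/modules/notification"
)

// notifyAfterUpload fans the event out to all registered notifiers,
// including the webhook notifier added above (HookPackageCreated).
func notifyAfterUpload(doer *user_model.User, pd *packages_model.PackageDescriptor) {
	notification.NotifyPackageCreate(doer, pd)
}

// notifyAfterDelete mirrors the above for removals (HookPackageDeleted).
func notifyAfterDelete(doer *user_model.User, pd *packages_model.PackageDescriptor) {
	notification.NotifyPackageDelete(doer, pd)
}
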
147  modules/packages/composer/metadata.go  (new file)
@@ -0,0 +1,147 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package composer

import (
	"archive/zip"
	"errors"
	"io"
	"regexp"
	"strings"

	"code.gitea.io/gitea/modules/json"
	"code.gitea.io/gitea/modules/validation"

	"github.com/hashicorp/go-version"
)

// TypeProperty is the name of the property for Composer package types
const TypeProperty = "composer.type"

var (
	// ErrMissingComposerFile indicates a missing composer.json file
	ErrMissingComposerFile = errors.New("composer.json file is missing")
	// ErrInvalidName indicates an invalid package name
	ErrInvalidName = errors.New("package name is invalid")
	// ErrInvalidVersion indicates an invalid package version
	ErrInvalidVersion = errors.New("package version is invalid")
)

// Package represents a Composer package
type Package struct {
	Name     string
	Version  string
	Type     string
	Metadata *Metadata
}

// Metadata represents the metadata of a Composer package
type Metadata struct {
	Description string                 `json:"description,omitempty"`
	Keywords    []string               `json:"keywords,omitempty"`
	Homepage    string                 `json:"homepage,omitempty"`
	License     Licenses               `json:"license,omitempty"`
	Authors     []Author               `json:"authors,omitempty"`
	Autoload    map[string]interface{} `json:"autoload,omitempty"`
	AutoloadDev map[string]interface{} `json:"autoload-dev,omitempty"`
	Extra       map[string]interface{} `json:"extra,omitempty"`
	Require     map[string]string      `json:"require,omitempty"`
	RequireDev  map[string]string      `json:"require-dev,omitempty"`
	Suggest     map[string]string      `json:"suggest,omitempty"`
	Provide     map[string]string      `json:"provide,omitempty"`
}

// Licenses represents the licenses of a Composer package
type Licenses []string

// UnmarshalJSON reads from a string or array
func (l *Licenses) UnmarshalJSON(data []byte) error {
	switch data[0] {
	case '"':
		var value string
		if err := json.Unmarshal(data, &value); err != nil {
			return err
		}
		*l = Licenses{value}
	case '[':
		values := make([]string, 0, 5)
		if err := json.Unmarshal(data, &values); err != nil {
			return err
		}
		*l = Licenses(values)
	}
	return nil
}

// Author represents an author
type Author struct {
	Name     string `json:"name,omitempty"`
	Email    string `json:"email,omitempty"`
	Homepage string `json:"homepage,omitempty"`
}

var nameMatch = regexp.MustCompile(`\A[a-z0-9]([_\.-]?[a-z0-9]+)*/[a-z0-9](([_\.]?|-{0,2})[a-z0-9]+)*\z`)

// ParsePackage parses the metadata of a Composer package file
func ParsePackage(r io.ReaderAt, size int64) (*Package, error) {
	archive, err := zip.NewReader(r, size)
	if err != nil {
		return nil, err
	}

	for _, file := range archive.File {
		if strings.Count(file.Name, "/") > 1 {
			continue
		}
		if strings.HasSuffix(strings.ToLower(file.Name), "composer.json") {
			f, err := archive.Open(file.Name)
			if err != nil {
				return nil, err
			}
			defer f.Close()

			return ParseComposerFile(f)
		}
	}
	return nil, ErrMissingComposerFile
}

// ParseComposerFile parses a composer.json file to retrieve the metadata of a Composer package
func ParseComposerFile(r io.Reader) (*Package, error) {
	var cj struct {
		Name    string `json:"name"`
		Version string `json:"version"`
		Type    string `json:"type"`
		Metadata
	}
	if err := json.NewDecoder(r).Decode(&cj); err != nil {
		return nil, err
	}

	if !nameMatch.MatchString(cj.Name) {
		return nil, ErrInvalidName
	}

	if cj.Version != "" {
		if _, err := version.NewSemver(cj.Version); err != nil {
			return nil, ErrInvalidVersion
		}
	}

	if !validation.IsValidURL(cj.Homepage) {
		cj.Homepage = ""
	}

	if cj.Type == "" {
		cj.Type = "library"
	}

	return &Package{
		Name:     cj.Name,
		Version:  cj.Version,
		Type:     cj.Type,
		Metadata: &cj.Metadata,
	}, nil
}

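Note (illustration, not part of the changeset): a standalone sketch of calling ParseComposerFile directly, mainly to show the two "license" shapes that Licenses.UnmarshalJSON accepts; the composer.json snippets are invented.

package main

import (
	"fmt"
	"strings"

	"code.gitea.io/gitea/modules/packages/composer"
)

func main() {
	for _, doc := range []string{
		`{"name": "gitea/demo", "license": "MIT"}`,
		`{"name": "gitea/demo", "license": ["MIT", "GPL-2.0-only"]}`,
	} {
		p, err := composer.ParseComposerFile(strings.NewReader(doc))
		if err != nil {
			fmt.Println("error:", err)
			continue
		}
		// Type defaults to "library" when composer.json does not set it.
		fmt.Println(p.Name, p.Type, p.Metadata.License)
	}
}
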
130  modules/packages/composer/metadata_test.go  (new file)
@@ -0,0 +1,130 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package composer

import (
	"archive/zip"
	"bytes"
	"strings"
	"testing"

	"code.gitea.io/gitea/modules/json"

	"github.com/stretchr/testify/assert"
)

const (
	name        = "gitea/composer-package"
	description = "Package Description"
	packageType = "composer-plugin"
	author      = "Gitea Authors"
	email       = "no.reply@gitea.io"
	homepage    = "https://gitea.io"
	license     = "MIT"
)

const composerContent = `{
	"name": "` + name + `",
	"description": "` + description + `",
	"type": "` + packageType + `",
	"license": "` + license + `",
	"authors": [
		{
			"name": "` + author + `",
			"email": "` + email + `"
		}
	],
	"homepage": "` + homepage + `",
	"autoload": {
		"psr-4": {"Gitea\\ComposerPackage\\": "src/"}
	},
	"require": {
		"php": ">=7.2 || ^8.0"
	}
}`

func TestLicenseUnmarshal(t *testing.T) {
	var l Licenses
	assert.NoError(t, json.NewDecoder(strings.NewReader(`["MIT"]`)).Decode(&l))
	assert.Len(t, l, 1)
	assert.Equal(t, "MIT", l[0])
	assert.NoError(t, json.NewDecoder(strings.NewReader(`"MIT"`)).Decode(&l))
	assert.Len(t, l, 1)
	assert.Equal(t, "MIT", l[0])
}

func TestParsePackage(t *testing.T) {
	createArchive := func(name, content string) []byte {
		var buf bytes.Buffer
		archive := zip.NewWriter(&buf)
		w, _ := archive.Create(name)
		w.Write([]byte(content))
		archive.Close()
		return buf.Bytes()
	}

	t.Run("MissingComposerFile", func(t *testing.T) {
		data := createArchive("dummy.txt", "")

		cp, err := ParsePackage(bytes.NewReader(data), int64(len(data)))
		assert.Nil(t, cp)
		assert.ErrorIs(t, err, ErrMissingComposerFile)
	})

	t.Run("MissingComposerFileInRoot", func(t *testing.T) {
		data := createArchive("sub/sub/composer.json", "")

		cp, err := ParsePackage(bytes.NewReader(data), int64(len(data)))
		assert.Nil(t, cp)
		assert.ErrorIs(t, err, ErrMissingComposerFile)
	})

	t.Run("InvalidComposerFile", func(t *testing.T) {
		data := createArchive("composer.json", "")

		cp, err := ParsePackage(bytes.NewReader(data), int64(len(data)))
		assert.Nil(t, cp)
		assert.Error(t, err)
	})

	t.Run("Valid", func(t *testing.T) {
		data := createArchive("composer.json", composerContent)

		cp, err := ParsePackage(bytes.NewReader(data), int64(len(data)))
		assert.NoError(t, err)
		assert.NotNil(t, cp)
	})
}

func TestParseComposerFile(t *testing.T) {
	t.Run("InvalidPackageName", func(t *testing.T) {
		cp, err := ParseComposerFile(strings.NewReader(`{}`))
		assert.Nil(t, cp)
		assert.ErrorIs(t, err, ErrInvalidName)
	})

	t.Run("InvalidPackageVersion", func(t *testing.T) {
		cp, err := ParseComposerFile(strings.NewReader(`{"name": "gitea/composer-package", "version": "1.a.3"}`))
		assert.Nil(t, cp)
		assert.ErrorIs(t, err, ErrInvalidVersion)
	})

	t.Run("Valid", func(t *testing.T) {
		cp, err := ParseComposerFile(strings.NewReader(composerContent))
		assert.NoError(t, err)
		assert.NotNil(t, cp)

		assert.Equal(t, name, cp.Name)
		assert.Empty(t, cp.Version)
		assert.Equal(t, description, cp.Metadata.Description)
		assert.Len(t, cp.Metadata.Authors, 1)
		assert.Equal(t, author, cp.Metadata.Authors[0].Name)
		assert.Equal(t, email, cp.Metadata.Authors[0].Email)
		assert.Equal(t, homepage, cp.Metadata.Homepage)
		assert.Equal(t, packageType, cp.Type)
		assert.Len(t, cp.Metadata.License, 1)
		assert.Equal(t, license, cp.Metadata.License[0])
	})
}

68  modules/packages/conan/conanfile_parser.go  (new file)
@@ -0,0 +1,68 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package conan

import (
	"io"
	"regexp"
	"strings"
)

var (
	patternAuthor      = compilePattern("author")
	patternHomepage    = compilePattern("homepage")
	patternURL         = compilePattern("url")
	patternLicense     = compilePattern("license")
	patternDescription = compilePattern("description")
	patternTopics      = regexp.MustCompile(`(?im)^\s*topics\s*=\s*\((.+)\)`)
	patternTopicList   = regexp.MustCompile(`\s*['"](.+?)['"]\s*,?`)
)

func compilePattern(name string) *regexp.Regexp {
	return regexp.MustCompile(`(?im)^\s*` + name + `\s*=\s*['"\(](.+)['"\)]`)
}

// ParseConanfile extracts the recipe metadata (author, URLs, license, description, topics) from a conanfile.py
func ParseConanfile(r io.Reader) (*Metadata, error) {
	buf, err := io.ReadAll(io.LimitReader(r, 1<<20))
	if err != nil {
		return nil, err
	}

	metadata := &Metadata{}

	m := patternAuthor.FindSubmatch(buf)
	if len(m) > 1 && len(m[1]) > 0 {
		metadata.Author = string(m[1])
	}
	m = patternHomepage.FindSubmatch(buf)
	if len(m) > 1 && len(m[1]) > 0 {
		metadata.ProjectURL = string(m[1])
	}
	m = patternURL.FindSubmatch(buf)
	if len(m) > 1 && len(m[1]) > 0 {
		metadata.RepositoryURL = string(m[1])
	}
	m = patternLicense.FindSubmatch(buf)
	if len(m) > 1 && len(m[1]) > 0 {
		metadata.License = strings.ReplaceAll(strings.ReplaceAll(string(m[1]), "'", ""), "\"", "")
	}
	m = patternDescription.FindSubmatch(buf)
	if len(m) > 1 && len(m[1]) > 0 {
		metadata.Description = string(m[1])
	}
	m = patternTopics.FindSubmatch(buf)
	if len(m) > 1 && len(m[1]) > 0 {
		m2 := patternTopicList.FindAllSubmatch(m[1], -1)
		if len(m2) > 0 {
			metadata.Keywords = make([]string, 0, len(m2))
			for _, g := range m2 {
				if len(g) > 1 {
					metadata.Keywords = append(metadata.Keywords, string(g[1]))
				}
			}
		}
	}
	return metadata, nil
}

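Note (illustration, not part of the changeset): a small sketch of running ParseConanfile over an invented recipe. The parser is purely regex based, so it scans the attribute assignments without executing the Python file.

package main

import (
	"fmt"
	"strings"

	"code.gitea.io/gitea/modules/packages/conan"
)

const recipe = `from conans import ConanFile

class DemoConan(ConanFile):
    name = "demo"
    version = "1.0.0"
    license = "MIT"
    description = "Demo recipe"
    topics = ("cpp", "demo")
`

func main() {
	md, err := conan.ParseConanfile(strings.NewReader(recipe))
	if err != nil {
		panic(err)
	}
	fmt.Println(md.License)     // MIT
	fmt.Println(md.Description) // Demo recipe
	fmt.Println(md.Keywords)    // [cpp demo]
}
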
51  modules/packages/conan/conanfile_parser_test.go  (new file)
@@ -0,0 +1,51 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package conan

import (
	"strings"
	"testing"

	"github.com/stretchr/testify/assert"
)

const (
	name        = "ConanPackage"
	version     = "1.2"
	license     = "MIT"
	author      = "Gitea <info@gitea.io>"
	homepage    = "https://gitea.io/"
	url         = "https://gitea.com/"
	description = "Description of ConanPackage"
	topic1      = "gitea"
	topic2      = "conan"

	contentConanfile = `from conans import ConanFile, CMake, tools

class ConanPackageConan(ConanFile):
    name = "` + name + `"
    version = "` + version + `"
    license = "` + license + `"
    author = "` + author + `"
    homepage = "` + homepage + `"
    url = "` + url + `"
    description = "` + description + `"
    topics = ("` + topic1 + `", "` + topic2 + `")
    settings = "os", "compiler", "build_type", "arch"
    options = {"shared": [True, False], "fPIC": [True, False]}
    default_options = {"shared": False, "fPIC": True}
    generators = "cmake"
`
)

func TestParseConanfile(t *testing.T) {
	metadata, err := ParseConanfile(strings.NewReader(contentConanfile))
	assert.Nil(t, err)
	assert.Equal(t, license, metadata.License)
	assert.Equal(t, author, metadata.Author)
	assert.Equal(t, homepage, metadata.ProjectURL)
	assert.Equal(t, url, metadata.RepositoryURL)
	assert.Equal(t, description, metadata.Description)
	assert.Equal(t, []string{topic1, topic2}, metadata.Keywords)
}

123  modules/packages/conan/conaninfo_parser.go  (new file)
@@ -0,0 +1,123 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package conan

import (
	"bufio"
	"errors"
	"io"
	"strings"
)

// Conaninfo represents the information of a Conan package
type Conaninfo struct {
	Settings     map[string]string   `json:"settings"`
	FullSettings map[string]string   `json:"full_settings"`
	Requires     []string            `json:"requires"`
	FullRequires []string            `json:"full_requires"`
	Options      map[string]string   `json:"options"`
	FullOptions  map[string]string   `json:"full_options"`
	RecipeHash   string              `json:"recipe_hash"`
	Environment  map[string][]string `json:"environment"`
}

// ParseConaninfo parses the sections of a conaninfo.txt file
func ParseConaninfo(r io.Reader) (*Conaninfo, error) {
	sections, err := readSections(io.LimitReader(r, 1<<20))
	if err != nil {
		return nil, err
	}

	info := &Conaninfo{}
	for section, lines := range sections {
		if len(lines) == 0 {
			continue
		}
		switch section {
		case "settings":
			info.Settings = toMap(lines)
		case "full_settings":
			info.FullSettings = toMap(lines)
		case "options":
			info.Options = toMap(lines)
		case "full_options":
			info.FullOptions = toMap(lines)
		case "requires":
			info.Requires = lines
		case "full_requires":
			info.FullRequires = lines
		case "recipe_hash":
			info.RecipeHash = lines[0]
		case "env":
			info.Environment = toMapArray(lines)
		}
	}
	return info, nil
}

func readSections(r io.Reader) (map[string][]string, error) {
	sections := make(map[string][]string)

	section := ""
	lines := make([]string, 0, 5)

	scanner := bufio.NewScanner(r)
	for scanner.Scan() {
		line := strings.TrimSpace(scanner.Text())
		if strings.HasPrefix(line, "[") && strings.HasSuffix(line, "]") {
			if section != "" {
				sections[section] = lines
			}
			section = line[1 : len(line)-1]
			lines = make([]string, 0, 5)
			continue
		}
		if section != "" {
			if line != "" {
				lines = append(lines, line)
			}
			continue
		}
		if line != "" {
			return nil, errors.New("Invalid conaninfo.txt")
		}
	}
	if err := scanner.Err(); err != nil {
		return nil, err
	}
	if section != "" {
		sections[section] = lines
	}
	return sections, nil
}

func toMap(lines []string) map[string]string {
	result := make(map[string]string)
	for _, line := range lines {
		parts := strings.SplitN(line, "=", 2)
		if len(parts) != 2 || len(parts[0]) == 0 || len(parts[1]) == 0 {
			continue
		}
		result[parts[0]] = parts[1]
	}
	return result
}

func toMapArray(lines []string) map[string][]string {
	result := make(map[string][]string)
	for _, line := range lines {
		parts := strings.SplitN(line, "=", 2)
		if len(parts) != 2 || len(parts[0]) == 0 || len(parts[1]) == 0 {
			continue
		}
		var items []string
		if strings.HasPrefix(parts[1], "[") && strings.HasSuffix(parts[1], "]") {
			items = strings.Split(parts[1], ",")
		} else {
			items = []string{parts[1]}
		}
		result[parts[0]] = items
	}
	return result
}

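Note (illustration, not part of the changeset): a short sketch of the INI-like conaninfo.txt format that ParseConaninfo expects; the content is invented.

package main

import (
	"fmt"
	"strings"

	"code.gitea.io/gitea/modules/packages/conan"
)

const info = `[settings]
os=Linux
arch=x86_64

[recipe_hash]
0000000000000000000000000000000a

[env]
CC=gcc-10
`

func main() {
	ci, err := conan.ParseConaninfo(strings.NewReader(info))
	if err != nil {
		panic(err)
	}
	fmt.Println(ci.Settings["arch"])  // x86_64
	fmt.Println(ci.RecipeHash)        // 0000000000000000000000000000000a
	fmt.Println(ci.Environment["CC"]) // [gcc-10]
}
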
85  modules/packages/conan/conaninfo_parser_test.go  (new file)
@@ -0,0 +1,85 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package conan

import (
	"strings"
	"testing"

	"github.com/stretchr/testify/assert"
)

const (
	settingsKey   = "arch"
	settingsValue = "x84_64"
	optionsKey    = "shared"
	optionsValue  = "False"
	requires      = "fmt/7.1.3"
	hash          = "74714915a51073acb548ca1ce29afbac"
	envKey        = "CC"
	envValue      = "gcc-10"

	contentConaninfo = `[settings]
` + settingsKey + `=` + settingsValue + `

[requires]
` + requires + `

[options]
` + optionsKey + `=` + optionsValue + `

[full_settings]
` + settingsKey + `=` + settingsValue + `

[full_requires]
` + requires + `

[full_options]
` + optionsKey + `=` + optionsValue + `

[recipe_hash]
` + hash + `

[env]
` + envKey + `=` + envValue + `

`
)

func TestParseConaninfo(t *testing.T) {
	info, err := ParseConaninfo(strings.NewReader(contentConaninfo))
	assert.NotNil(t, info)
	assert.Nil(t, err)
	assert.Equal(
		t,
		map[string]string{
			settingsKey: settingsValue,
		},
		info.Settings,
	)
	assert.Equal(t, info.Settings, info.FullSettings)
	assert.Equal(
		t,
		map[string]string{
			optionsKey: optionsValue,
		},
		info.Options,
	)
	assert.Equal(t, info.Options, info.FullOptions)
	assert.Equal(
		t,
		[]string{requires},
		info.Requires,
	)
	assert.Equal(t, info.Requires, info.FullRequires)
	assert.Equal(t, hash, info.RecipeHash)
	assert.Equal(
		t,
		map[string][]string{
			envKey: {envValue},
		},
		info.Environment,
	)
}

24  modules/packages/conan/metadata.go  (new file)
@@ -0,0 +1,24 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package conan

const (
	PropertyRecipeUser       = "conan.recipe.user"
	PropertyRecipeChannel    = "conan.recipe.channel"
	PropertyRecipeRevision   = "conan.recipe.revision"
	PropertyPackageReference = "conan.package.reference"
	PropertyPackageRevision  = "conan.package.revision"
	PropertyPackageInfo      = "conan.package.info"
)

// Metadata represents the metadata of a Conan package
type Metadata struct {
	Author        string   `json:"author,omitempty"`
	License       string   `json:"license,omitempty"`
	ProjectURL    string   `json:"project_url,omitempty"`
	RepositoryURL string   `json:"repository_url,omitempty"`
	Description   string   `json:"description,omitempty"`
	Keywords      []string `json:"keywords,omitempty"`
}

155  modules/packages/conan/reference.go  (new file)
@@ -0,0 +1,155 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package conan

import (
	"errors"
	"fmt"
	"regexp"

	"code.gitea.io/gitea/modules/log"

	goversion "github.com/hashicorp/go-version"
)

const (
	// taken from https://github.com/conan-io/conan/blob/develop/conans/model/ref.py
	minChars = 2
	maxChars = 51

	// DefaultRevision if no revision is specified
	DefaultRevision = "0"
)

var (
	namePattern     = regexp.MustCompile(fmt.Sprintf(`^[a-zA-Z0-9_][a-zA-Z0-9_\+\.-]{%d,%d}$`, minChars-1, maxChars-1))
	revisionPattern = regexp.MustCompile(fmt.Sprintf(`^[a-zA-Z0-9]{1,%d}$`, maxChars))

	ErrValidation = errors.New("Could not validate one or more reference fields")
)

// RecipeReference represents a recipe <Name>/<Version>@<User>/<Channel>#<Revision>
type RecipeReference struct {
	Name     string
	Version  string
	User     string
	Channel  string
	Revision string
}

// NewRecipeReference validates the given fields and creates a new recipe reference
func NewRecipeReference(name, version, user, channel, revision string) (*RecipeReference, error) {
	log.Trace("Conan Recipe: %s/%s(@%s/%s(#%s))", name, version, user, channel, revision)

	if user == "_" {
		user = ""
	}
	if channel == "_" {
		channel = ""
	}

	if (user != "" && channel == "") || (user == "" && channel != "") {
		return nil, ErrValidation
	}

	if !namePattern.MatchString(name) {
		return nil, ErrValidation
	}
	if _, err := goversion.NewSemver(version); err != nil {
		return nil, ErrValidation
	}
	if user != "" && !namePattern.MatchString(user) {
		return nil, ErrValidation
	}
	if channel != "" && !namePattern.MatchString(channel) {
		return nil, ErrValidation
	}
	if revision != "" && !revisionPattern.MatchString(revision) {
		return nil, ErrValidation
	}

	return &RecipeReference{name, version, user, channel, revision}, nil
}

// RevisionOrDefault returns the revision or DefaultRevision if none is set
func (r *RecipeReference) RevisionOrDefault() string {
	if r.Revision == "" {
		return DefaultRevision
	}
	return r.Revision
}

func (r *RecipeReference) String() string {
	rev := ""
	if r.Revision != "" {
		rev = "#" + r.Revision
	}
	if r.User == "" || r.Channel == "" {
		return fmt.Sprintf("%s/%s%s", r.Name, r.Version, rev)
	}
	return fmt.Sprintf("%s/%s@%s/%s%s", r.Name, r.Version, r.User, r.Channel, rev)
}

// LinkName returns the path segments used to address the recipe, with "_" for an empty user/channel
func (r *RecipeReference) LinkName() string {
	user := r.User
	if user == "" {
		user = "_"
	}
	channel := r.Channel
	if channel == "" {
		channel = "_"
	}
	return fmt.Sprintf("%s/%s/%s/%s/%s", r.Name, r.Version, user, channel, r.RevisionOrDefault())
}

// WithRevision returns a copy of the reference with the given revision
func (r *RecipeReference) WithRevision(revision string) *RecipeReference {
	return &RecipeReference{r.Name, r.Version, r.User, r.Channel, revision}
}

// AsKey builds the additional key for the package file
func (r *RecipeReference) AsKey() string {
	return fmt.Sprintf("%s|%s|%s", r.User, r.Channel, r.RevisionOrDefault())
}

// PackageReference represents a package of a recipe <Name>/<Version>@<User>/<Channel>#<Revision> <Reference>#<Revision>
type PackageReference struct {
	Recipe    *RecipeReference
	Reference string
	Revision  string
}

// NewPackageReference validates the given fields and creates a new package reference
func NewPackageReference(recipe *RecipeReference, reference, revision string) (*PackageReference, error) {
	log.Trace("Conan Package: %v %s(#%s)", recipe, reference, revision)

	if recipe == nil {
		return nil, ErrValidation
	}
	if reference == "" || !revisionPattern.MatchString(reference) {
		return nil, ErrValidation
	}
	if revision != "" && !revisionPattern.MatchString(revision) {
		return nil, ErrValidation
	}

	return &PackageReference{recipe, reference, revision}, nil
}

// RevisionOrDefault returns the revision or DefaultRevision if none is set
func (r *PackageReference) RevisionOrDefault() string {
	if r.Revision == "" {
		return DefaultRevision
	}
	return r.Revision
}

// LinkName returns the path segments used to address the package
func (r *PackageReference) LinkName() string {
	return fmt.Sprintf("%s/%s", r.Reference, r.RevisionOrDefault())
}

// WithRevision returns a copy of the reference with the given revision
func (r *PackageReference) WithRevision(revision string) *PackageReference {
	return &PackageReference{r.Recipe, r.Reference, revision}
}

// AsKey builds the additional key for the package file
func (r *PackageReference) AsKey() string {
	return fmt.Sprintf("%s|%s|%s|%s|%s", r.Recipe.User, r.Recipe.Channel, r.Recipe.RevisionOrDefault(), r.Reference, r.RevisionOrDefault())
}

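Note (illustration, not part of the changeset): a sketch of building recipe and package references. The values are invented; it mainly shows that "_" is the URL placeholder for an empty user/channel and that LinkName renders it back.

package main

import (
	"fmt"

	"code.gitea.io/gitea/modules/packages/conan"
)

func main() {
	rref, err := conan.NewRecipeReference("zlib", "1.2.11", "_", "_", "")
	if err != nil {
		panic(err)
	}
	fmt.Println(rref.String())   // zlib/1.2.11
	fmt.Println(rref.LinkName()) // zlib/1.2.11/_/_/0

	pref, err := conan.NewPackageReference(rref, "d0599452a426a161e02a297c6e0c5070f99b4909", "")
	if err != nil {
		panic(err)
	}
	fmt.Println(pref.LinkName()) // d0599452a426a161e02a297c6e0c5070f99b4909/0
}
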
147  modules/packages/conan/reference_test.go  (new file)
@@ -0,0 +1,147 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package conan

import (
	"testing"

	"github.com/stretchr/testify/assert"
)

func TestNewRecipeReference(t *testing.T) {
	cases := []struct {
		Name     string
		Version  string
		User     string
		Channel  string
		Revision string
		IsValid  bool
	}{
		{"", "", "", "", "", false},
		{"name", "", "", "", "", false},
		{"", "1.0", "", "", "", false},
		{"", "", "user", "", "", false},
		{"", "", "", "channel", "", false},
		{"", "", "", "", "0", false},
		{"name", "1.0", "", "", "", true},
		{"name", "1.0", "user", "", "", false},
		{"name", "1.0", "", "channel", "", false},
		{"name", "1.0", "user", "channel", "", true},
		{"name", "1.0", "_", "", "", true},
		{"name", "1.0", "", "_", "", true},
		{"name", "1.0", "_", "_", "", true},
		{"name", "1.0", "_", "_", "0", true},
		{"name", "1.0", "", "", "0", true},
		{"name", "1.0", "", "", "000000000000000000000000000000000000000000000000000000000000", false},
	}

	for i, c := range cases {
		rref, err := NewRecipeReference(c.Name, c.Version, c.User, c.Channel, c.Revision)
		if c.IsValid {
			assert.NoError(t, err, "case %d, should be valid", i)
			assert.NotNil(t, rref, "case %d, should not be nil", i)
		} else {
			assert.Error(t, err, "case %d, should be invalid", i)
		}
	}
}

func TestRecipeReferenceRevisionOrDefault(t *testing.T) {
	rref, err := NewRecipeReference("name", "1.0", "", "", "")
	assert.NoError(t, err)
	assert.Equal(t, DefaultRevision, rref.RevisionOrDefault())

	rref, err = NewRecipeReference("name", "1.0", "", "", DefaultRevision)
	assert.NoError(t, err)
	assert.Equal(t, DefaultRevision, rref.RevisionOrDefault())

	rref, err = NewRecipeReference("name", "1.0", "", "", "Az09")
	assert.NoError(t, err)
	assert.Equal(t, "Az09", rref.RevisionOrDefault())
}

func TestRecipeReferenceString(t *testing.T) {
	rref, err := NewRecipeReference("name", "1.0", "", "", "")
	assert.NoError(t, err)
	assert.Equal(t, "name/1.0", rref.String())

	rref, err = NewRecipeReference("name", "1.0", "user", "channel", "")
	assert.NoError(t, err)
	assert.Equal(t, "name/1.0@user/channel", rref.String())

	rref, err = NewRecipeReference("name", "1.0", "user", "channel", "Az09")
	assert.NoError(t, err)
	assert.Equal(t, "name/1.0@user/channel#Az09", rref.String())
}

func TestRecipeReferenceLinkName(t *testing.T) {
	rref, err := NewRecipeReference("name", "1.0", "", "", "")
	assert.NoError(t, err)
	assert.Equal(t, "name/1.0/_/_/0", rref.LinkName())

	rref, err = NewRecipeReference("name", "1.0", "user", "channel", "")
	assert.NoError(t, err)
	assert.Equal(t, "name/1.0/user/channel/0", rref.LinkName())

	rref, err = NewRecipeReference("name", "1.0", "user", "channel", "Az09")
	assert.NoError(t, err)
	assert.Equal(t, "name/1.0/user/channel/Az09", rref.LinkName())
}

func TestNewPackageReference(t *testing.T) {
	rref, _ := NewRecipeReference("name", "1.0", "", "", "")

	cases := []struct {
		Recipe    *RecipeReference
		Reference string
		Revision  string
		IsValid   bool
	}{
		{nil, "", "", false},
		{rref, "", "", false},
		{nil, "aZ09", "", false},
		{rref, "aZ09", "", true},
		{rref, "", "Az09", false},
		{rref, "aZ09", "Az09", true},
	}

	for i, c := range cases {
		pref, err := NewPackageReference(c.Recipe, c.Reference, c.Revision)
		if c.IsValid {
			assert.NoError(t, err, "case %d, should be valid", i)
			assert.NotNil(t, pref, "case %d, should not be nil", i)
		} else {
			assert.Error(t, err, "case %d, should be invalid", i)
		}
	}
}

func TestPackageReferenceRevisionOrDefault(t *testing.T) {
	rref, _ := NewRecipeReference("name", "1.0", "", "", "")

	pref, err := NewPackageReference(rref, "ref", "")
	assert.NoError(t, err)
	assert.Equal(t, DefaultRevision, pref.RevisionOrDefault())

	pref, err = NewPackageReference(rref, "ref", DefaultRevision)
	assert.NoError(t, err)
	assert.Equal(t, DefaultRevision, pref.RevisionOrDefault())

	pref, err = NewPackageReference(rref, "ref", "Az09")
	assert.NoError(t, err)
	assert.Equal(t, "Az09", pref.RevisionOrDefault())
}

func TestPackageReferenceLinkName(t *testing.T) {
	rref, _ := NewRecipeReference("name", "1.0", "", "", "")

	pref, err := NewPackageReference(rref, "ref", "")
	assert.NoError(t, err)
	assert.Equal(t, "ref/0", pref.LinkName())

	pref, err = NewPackageReference(rref, "ref", "Az09")
	assert.NoError(t, err)
	assert.Equal(t, "ref/Az09", pref.LinkName())
}

56  modules/packages/container/helm/helm.go  (new file)
@@ -0,0 +1,56 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package helm

// https://github.com/helm/helm/blob/main/pkg/chart/

const ConfigMediaType = "application/vnd.cncf.helm.config.v1+json"

// Maintainer describes a Chart maintainer.
type Maintainer struct {
	// Name is a user name or organization name
	Name string `json:"name,omitempty"`
	// Email is an optional email address to contact the named maintainer
	Email string `json:"email,omitempty"`
	// URL is an optional URL to an address for the named maintainer
	URL string `json:"url,omitempty"`
}

// Metadata for a Chart file. This models the structure of a Chart.yaml file.
type Metadata struct {
	// The name of the chart. Required.
	Name string `json:"name,omitempty"`
	// The URL to a relevant project page, git repo, or contact person
	Home string `json:"home,omitempty"`
	// Source is the URL to the source code of this chart
	Sources []string `json:"sources,omitempty"`
	// A SemVer 2 conformant version string of the chart. Required.
	Version string `json:"version,omitempty"`
	// A one-sentence description of the chart
	Description string `json:"description,omitempty"`
	// A list of string keywords
	Keywords []string `json:"keywords,omitempty"`
	// A list of name and URL/email address combinations for the maintainer(s)
	Maintainers []*Maintainer `json:"maintainers,omitempty"`
	// The URL to an icon file.
	Icon string `json:"icon,omitempty"`
	// The API Version of this chart. Required.
	APIVersion string `json:"apiVersion,omitempty"`
	// The condition to check to enable chart
	Condition string `json:"condition,omitempty"`
	// The tags to check to enable chart
	Tags string `json:"tags,omitempty"`
	// The version of the application enclosed inside of this chart.
	AppVersion string `json:"appVersion,omitempty"`
	// Whether or not this chart is deprecated
	Deprecated bool `json:"deprecated,omitempty"`
	// Annotations are additional mappings uninterpreted by Helm,
	// made available for inspection by other applications.
	Annotations map[string]string `json:"annotations,omitempty"`
	// KubeVersion is a SemVer constraint specifying the version of Kubernetes required.
	KubeVersion string `json:"kubeVersion,omitempty"`
	// Specifies the chart type: application or library
	Type string `json:"type,omitempty"`
}

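Note (illustration, not part of the changeset): a sketch of decoding a chart config blob into helm.Metadata with the standard library JSON decoder. It assumes the blob stored under ConfigMediaType is the Chart.yaml content serialized as JSON, which is what the container metadata parser later in this changeset relies on as well; the JSON is invented.

package main

import (
	"encoding/json"
	"fmt"
	"strings"

	"code.gitea.io/gitea/modules/packages/container/helm"
)

func main() {
	blob := `{"name":"demo","version":"0.1.0","apiVersion":"v2","description":"Demo chart","maintainers":[{"name":"Gitea"}]}`

	var md helm.Metadata
	if err := json.NewDecoder(strings.NewReader(blob)).Decode(&md); err != nil {
		panic(err)
	}
	fmt.Println(md.Name, md.Version, md.Maintainers[0].Name) // demo 0.1.0 Gitea
}
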
157  modules/packages/container/metadata.go  (new file)
@@ -0,0 +1,157 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package container

import (
	"fmt"
	"io"
	"strings"

	"code.gitea.io/gitea/modules/json"
	"code.gitea.io/gitea/modules/packages/container/helm"
	"code.gitea.io/gitea/modules/packages/container/oci"
	"code.gitea.io/gitea/modules/validation"
)

const (
	PropertyDigest            = "container.digest"
	PropertyMediaType         = "container.mediatype"
	PropertyManifestTagged    = "container.manifest.tagged"
	PropertyManifestReference = "container.manifest.reference"

	DefaultPlatform = "linux/amd64"

	labelLicenses      = "org.opencontainers.image.licenses"
	labelURL           = "org.opencontainers.image.url"
	labelSource        = "org.opencontainers.image.source"
	labelDocumentation = "org.opencontainers.image.documentation"
	labelDescription   = "org.opencontainers.image.description"
	labelAuthors       = "org.opencontainers.image.authors"
)

type ImageType string

const (
	TypeOCI  ImageType = "oci"
	TypeHelm ImageType = "helm"
)

// Name gets the name of the image type
func (it ImageType) Name() string {
	switch it {
	case TypeHelm:
		return "Helm Chart"
	default:
		return "OCI / Docker"
	}
}

// Metadata represents the metadata of a Container package
type Metadata struct {
	Type             ImageType         `json:"type"`
	IsTagged         bool              `json:"is_tagged"`
	Platform         string            `json:"platform,omitempty"`
	Description      string            `json:"description,omitempty"`
	Authors          []string          `json:"authors,omitempty"`
	Licenses         string            `json:"license,omitempty"`
	ProjectURL       string            `json:"project_url,omitempty"`
	RepositoryURL    string            `json:"repository_url,omitempty"`
	DocumentationURL string            `json:"documentation_url,omitempty"`
	Labels           map[string]string `json:"labels,omitempty"`
	ImageLayers      []string          `json:"layer_creation,omitempty"`
	MultiArch        map[string]string `json:"multiarch,omitempty"`
}

// ParseImageConfig parses the metadata of an image config
func ParseImageConfig(mediaType oci.MediaType, r io.Reader) (*Metadata, error) {
	if strings.EqualFold(string(mediaType), helm.ConfigMediaType) {
		return parseHelmConfig(r)
	}

	// fallback to OCI Image Config
	return parseOCIImageConfig(r)
}

func parseOCIImageConfig(r io.Reader) (*Metadata, error) {
	var image oci.Image
	if err := json.NewDecoder(r).Decode(&image); err != nil {
		return nil, err
	}

	platform := DefaultPlatform
	if image.OS != "" && image.Architecture != "" {
		platform = fmt.Sprintf("%s/%s", image.OS, image.Architecture)
		if image.Variant != "" {
			platform = fmt.Sprintf("%s/%s", platform, image.Variant)
		}
	}

	imageLayers := make([]string, 0, len(image.History))
	for _, history := range image.History {
		cmd := history.CreatedBy
		if i := strings.Index(cmd, "#(nop) "); i != -1 {
			cmd = strings.TrimSpace(cmd[i+7:])
		}
		imageLayers = append(imageLayers, cmd)
	}

	metadata := &Metadata{
		Type:             TypeOCI,
		Platform:         platform,
		Licenses:         image.Config.Labels[labelLicenses],
		ProjectURL:       image.Config.Labels[labelURL],
		RepositoryURL:    image.Config.Labels[labelSource],
		DocumentationURL: image.Config.Labels[labelDocumentation],
		Description:      image.Config.Labels[labelDescription],
		Labels:           image.Config.Labels,
		ImageLayers:      imageLayers,
	}

	if authors, ok := image.Config.Labels[labelAuthors]; ok {
		metadata.Authors = []string{authors}
	}

	if !validation.IsValidURL(metadata.ProjectURL) {
		metadata.ProjectURL = ""
	}
	if !validation.IsValidURL(metadata.RepositoryURL) {
		metadata.RepositoryURL = ""
	}
	if !validation.IsValidURL(metadata.DocumentationURL) {
		metadata.DocumentationURL = ""
	}

	return metadata, nil
}

func parseHelmConfig(r io.Reader) (*Metadata, error) {
	var config helm.Metadata
	if err := json.NewDecoder(r).Decode(&config); err != nil {
		return nil, err
	}

	metadata := &Metadata{
		Type:        TypeHelm,
		Description: config.Description,
		ProjectURL:  config.Home,
	}

	if len(config.Maintainers) > 0 {
		authors := make([]string, 0, len(config.Maintainers))
		for _, maintainer := range config.Maintainers {
			authors = append(authors, maintainer.Name)
		}
		metadata.Authors = authors
	}

	if len(config.Sources) > 0 && validation.IsValidURL(config.Sources[0]) {
		metadata.RepositoryURL = config.Sources[0]
	}
	if !validation.IsValidURL(metadata.ProjectURL) {
		metadata.ProjectURL = ""
	}

	return metadata, nil
}

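Note (illustration, not part of the changeset): a sketch of ParseImageConfig on an invented OCI image config, showing the platform string assembly and the stripping of the Docker "#(nop)" prefix from history entries.

package main

import (
	"fmt"
	"strings"

	"code.gitea.io/gitea/modules/packages/container"
	"code.gitea.io/gitea/modules/packages/container/oci"
)

func main() {
	cfg := `{"os":"linux","architecture":"arm64","variant":"v8",
	  "config":{"labels":{"org.opencontainers.image.licenses":"MIT"}},
	  "history":[{"created_by":"/bin/sh -c #(nop) CMD [\"/bin/sh\"]"}]}`

	md, err := container.ParseImageConfig(oci.MediaType(oci.MediaTypeImageManifest), strings.NewReader(cfg))
	if err != nil {
		panic(err)
	}
	fmt.Println(md.Platform)       // linux/arm64/v8
	fmt.Println(md.Licenses)       // MIT
	fmt.Println(md.ImageLayers[0]) // CMD ["/bin/sh"]
}
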
62  modules/packages/container/metadata_test.go  (new file)
@@ -0,0 +1,62 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package container

import (
	"strings"
	"testing"

	"code.gitea.io/gitea/modules/packages/container/helm"
	"code.gitea.io/gitea/modules/packages/container/oci"

	"github.com/stretchr/testify/assert"
)

func TestParseImageConfig(t *testing.T) {
	description := "Image Description"
	author := "Gitea"
	license := "MIT"
	projectURL := "https://gitea.io"
	repositoryURL := "https://gitea.com/gitea"
	documentationURL := "https://docs.gitea.io"

	configOCI := `{"config": {"labels": {"` + labelAuthors + `": "` + author + `", "` + labelLicenses + `": "` + license + `", "` + labelURL + `": "` + projectURL + `", "` + labelSource + `": "` + repositoryURL + `", "` + labelDocumentation + `": "` + documentationURL + `", "` + labelDescription + `": "` + description + `"}}, "history": [{"created_by": "do it 1"}, {"created_by": "dummy #(nop) do it 2"}]}`

	metadata, err := ParseImageConfig(oci.MediaType(oci.MediaTypeImageManifest), strings.NewReader(configOCI))
	assert.NoError(t, err)

	assert.Equal(t, TypeOCI, metadata.Type)
	assert.Equal(t, description, metadata.Description)
	assert.ElementsMatch(t, []string{author}, metadata.Authors)
	assert.Equal(t, license, metadata.Licenses)
	assert.Equal(t, projectURL, metadata.ProjectURL)
	assert.Equal(t, repositoryURL, metadata.RepositoryURL)
	assert.Equal(t, documentationURL, metadata.DocumentationURL)
	assert.Equal(t, []string{"do it 1", "do it 2"}, metadata.ImageLayers)
	assert.Equal(
		t,
		map[string]string{
			labelAuthors:       author,
			labelLicenses:      license,
			labelURL:           projectURL,
			labelSource:        repositoryURL,
			labelDocumentation: documentationURL,
			labelDescription:   description,
		},
		metadata.Labels,
	)
	assert.Empty(t, metadata.MultiArch)

	configHelm := `{"description":"` + description + `", "home": "` + projectURL + `", "sources": ["` + repositoryURL + `"], "maintainers":[{"name":"` + author + `"}]}`

	metadata, err = ParseImageConfig(oci.MediaType(helm.ConfigMediaType), strings.NewReader(configHelm))
	assert.NoError(t, err)

	assert.Equal(t, TypeHelm, metadata.Type)
	assert.Equal(t, description, metadata.Description)
	assert.ElementsMatch(t, []string{author}, metadata.Authors)
	assert.Equal(t, projectURL, metadata.ProjectURL)
	assert.Equal(t, repositoryURL, metadata.RepositoryURL)
}

27  modules/packages/container/oci/digest.go  (new file)
@@ -0,0 +1,27 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package oci

import (
	"regexp"
	"strings"
)

var digestPattern = regexp.MustCompile(`\Asha256:[a-f0-9]{64}\z`)

type Digest string

// Validate checks if the digest has a valid SHA256 signature
func (d Digest) Validate() bool {
	return digestPattern.MatchString(string(d))
}

// Hash returns the hex part of the digest (everything after the algorithm prefix)
func (d Digest) Hash() string {
	p := strings.SplitN(string(d), ":", 2)
	if len(p) != 2 {
		return ""
	}
	return p[1]
}

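Note (illustration, not part of the changeset): a sketch of how a digest string is formed and checked; the blob content is invented.

package main

import (
	"crypto/sha256"
	"fmt"

	"code.gitea.io/gitea/modules/packages/container/oci"
)

func main() {
	// A registry digest is "sha256:" plus the lowercase hex hash of the blob.
	blob := []byte("layer content")
	d := oci.Digest(fmt.Sprintf("sha256:%x", sha256.Sum256(blob)))

	fmt.Println(d.Validate()) // true
	fmt.Println(d.Hash())     // the 64-character hex part
}
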
36  modules/packages/container/oci/mediatype.go  (new file)
@@ -0,0 +1,36 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package oci

import (
	"strings"
)

const (
	MediaTypeImageManifest      = "application/vnd.oci.image.manifest.v1+json"
	MediaTypeImageIndex         = "application/vnd.oci.image.index.v1+json"
	MediaTypeDockerManifest     = "application/vnd.docker.distribution.manifest.v2+json"
	MediaTypeDockerManifestList = "application/vnd.docker.distribution.manifest.list.v2+json"
)

type MediaType string

// IsValid tests if the media type is in the OCI or Docker namespace
func (m MediaType) IsValid() bool {
	s := string(m)
	return strings.HasPrefix(s, "application/vnd.docker.") || strings.HasPrefix(s, "application/vnd.oci.")
}

// IsImageManifest tests if the media type is an image manifest
func (m MediaType) IsImageManifest() bool {
	s := string(m)
	return strings.EqualFold(s, MediaTypeDockerManifest) || strings.EqualFold(s, MediaTypeImageManifest)
}

// IsImageIndex tests if the media type is an image index
func (m MediaType) IsImageIndex() bool {
	s := string(m)
	return strings.EqualFold(s, MediaTypeDockerManifestList) || strings.EqualFold(s, MediaTypeImageIndex)
}

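Note (illustration, not part of the changeset): a sketch of classifying media types with these helpers; presumably the registry endpoints use them to tell manifest uploads from index uploads.

package main

import (
	"fmt"

	"code.gitea.io/gitea/modules/packages/container/oci"
)

func main() {
	for _, mt := range []oci.MediaType{
		oci.MediaTypeImageManifest,
		oci.MediaTypeDockerManifestList,
		"text/plain",
	} {
		fmt.Println(mt, mt.IsValid(), mt.IsImageManifest(), mt.IsImageIndex())
	}
}
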
191
modules/packages/container/oci/oci.go
Normal file
191
modules/packages/container/oci/oci.go
Normal file
|
@ -0,0 +1,191 @@
|
||||||
|
// Copyright 2022 The Gitea Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a MIT-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package oci
|
||||||
|
|
||||||
|
import (
|
||||||
|
"time"
|
||||||
|
)
|
||||||
|
|
||||||
|
// https://github.com/opencontainers/image-spec/tree/main/specs-go/v1
|
||||||
|
|
||||||
|
// ImageConfig defines the execution parameters which should be used as a base when running a container using an image.
|
||||||
|
type ImageConfig struct {
|
||||||
|
// User defines the username or UID which the process in the container should run as.
|
||||||
|
User string `json:"User,omitempty"`
|
||||||
|
|
||||||
|
// ExposedPorts a set of ports to expose from a container running this image.
|
||||||
|
ExposedPorts map[string]struct{} `json:"ExposedPorts,omitempty"`
|
||||||
|
|
||||||
|
// Env is a list of environment variables to be used in a container.
|
||||||
|
Env []string `json:"Env,omitempty"`
|
||||||
|
|
||||||
|
// Entrypoint defines a list of arguments to use as the command to execute when the container starts.
|
||||||
|
Entrypoint []string `json:"Entrypoint,omitempty"`
|
||||||
|
|
||||||
|
// Cmd defines the default arguments to the entrypoint of the container.
|
||||||
|
Cmd []string `json:"Cmd,omitempty"`
|
||||||
|
|
||||||
|
// Volumes is a set of directories describing where the process is likely write data specific to a container instance.
|
||||||
|
Volumes map[string]struct{} `json:"Volumes,omitempty"`
|
||||||
|
|
||||||
|
// WorkingDir sets the current working directory of the entrypoint process in the container.
|
||||||
|
WorkingDir string `json:"WorkingDir,omitempty"`
|
||||||
|
|
||||||
|
// Labels contains arbitrary metadata for the container.
|
||||||
|
Labels map[string]string `json:"Labels,omitempty"`
|
||||||
|
|
||||||
|
// StopSignal contains the system call signal that will be sent to the container to exit.
|
||||||
|
StopSignal string `json:"StopSignal,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// RootFS describes a layer content addresses
|
||||||
|
type RootFS struct {
|
||||||
|
// Type is the type of the rootfs.
|
||||||
|
Type string `json:"type"`
|
||||||
|
|
||||||
|
// DiffIDs is an array of layer content hashes, in order from bottom-most to top-most.
|
||||||
|
DiffIDs []string `json:"diff_ids"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// History describes the history of a layer.
|
||||||
|
type History struct {
|
||||||
|
// Created is the combined date and time at which the layer was created, formatted as defined by RFC 3339, section 5.6.
|
||||||
|
Created *time.Time `json:"created,omitempty"`
|
||||||
|
|
||||||
|
// CreatedBy is the command which created the layer.
|
||||||
|
CreatedBy string `json:"created_by,omitempty"`
|
||||||
|
|
||||||
|
// Author is the author of the build point.
|
||||||
|
Author string `json:"author,omitempty"`
|
||||||
|
|
||||||
|
// Comment is a custom message set when creating the layer.
|
||||||
|
Comment string `json:"comment,omitempty"`
|
||||||
|
|
||||||
|
// EmptyLayer is used to mark if the history item created a filesystem diff.
|
||||||
|
    EmptyLayer bool `json:"empty_layer,omitempty"`
}

// Image is the JSON structure which describes some basic information about the image.
// This provides the `application/vnd.oci.image.config.v1+json` mediatype when marshalled to JSON.
type Image struct {
    // Created is the combined date and time at which the image was created, formatted as defined by RFC 3339, section 5.6.
    Created *time.Time `json:"created,omitempty"`

    // Author defines the name and/or email address of the person or entity which created and is responsible for maintaining the image.
    Author string `json:"author,omitempty"`

    // Architecture is the CPU architecture which the binaries in this image are built to run on.
    Architecture string `json:"architecture"`

    // Variant is the variant of the specified CPU architecture which image binaries are intended to run on.
    Variant string `json:"variant,omitempty"`

    // OS is the name of the operating system which the image is built to run on.
    OS string `json:"os"`

    // OSVersion is an optional field specifying the operating system
    // version, for example on Windows `10.0.14393.1066`.
    OSVersion string `json:"os.version,omitempty"`

    // OSFeatures is an optional field specifying an array of strings,
    // each listing a required OS feature (for example on Windows `win32k`).
    OSFeatures []string `json:"os.features,omitempty"`

    // Config defines the execution parameters which should be used as a base when running a container using the image.
    Config ImageConfig `json:"config,omitempty"`

    // RootFS references the layer content addresses used by the image.
    RootFS RootFS `json:"rootfs"`

    // History describes the history of each layer.
    History []History `json:"history,omitempty"`
}

// Descriptor describes the disposition of targeted content.
// This structure provides `application/vnd.oci.descriptor.v1+json` mediatype
// when marshalled to JSON.
type Descriptor struct {
    // MediaType is the media type of the object this schema refers to.
    MediaType MediaType `json:"mediaType,omitempty"`

    // Digest is the digest of the targeted content.
    Digest Digest `json:"digest"`

    // Size specifies the size in bytes of the blob.
    Size int64 `json:"size"`

    // URLs specifies a list of URLs from which this object MAY be downloaded
    URLs []string `json:"urls,omitempty"`

    // Annotations contains arbitrary metadata relating to the targeted content.
    Annotations map[string]string `json:"annotations,omitempty"`

    // Data is an embedding of the targeted content. This is encoded as a base64
    // string when marshalled to JSON (automatically, by encoding/json). If
    // present, Data can be used directly to avoid fetching the targeted content.
    Data []byte `json:"data,omitempty"`

    // Platform describes the platform which the image in the manifest runs on.
    //
    // This should only be used when referring to a manifest.
    Platform *Platform `json:"platform,omitempty"`
}

// Platform describes the platform which the image in the manifest runs on.
type Platform struct {
    // Architecture field specifies the CPU architecture, for example
    // `amd64` or `ppc64`.
    Architecture string `json:"architecture"`

    // OS specifies the operating system, for example `linux` or `windows`.
    OS string `json:"os"`

    // OSVersion is an optional field specifying the operating system
    // version, for example on Windows `10.0.14393.1066`.
    OSVersion string `json:"os.version,omitempty"`

    // OSFeatures is an optional field specifying an array of strings,
    // each listing a required OS feature (for example on Windows `win32k`).
    OSFeatures []string `json:"os.features,omitempty"`

    // Variant is an optional field specifying a variant of the CPU, for
    // example `v7` to specify ARMv7 when architecture is `arm`.
    Variant string `json:"variant,omitempty"`
}

type SchemaMediaBase struct {
    // SchemaVersion is the image manifest schema that this image follows
    SchemaVersion int `json:"schemaVersion"`

    // MediaType specifies the type of this document data structure e.g. `application/vnd.oci.image.manifest.v1+json`
    MediaType MediaType `json:"mediaType,omitempty"`
}

// Manifest provides `application/vnd.oci.image.manifest.v1+json` mediatype structure when marshalled to JSON.
type Manifest struct {
    SchemaMediaBase

    // Config references a configuration object for a container, by digest.
    // The referenced configuration object is a JSON blob that the runtime uses to set up the container.
    Config Descriptor `json:"config"`

    // Layers is an indexed list of layers referenced by the manifest.
    Layers []Descriptor `json:"layers"`

    // Annotations contains arbitrary metadata for the image manifest.
    Annotations map[string]string `json:"annotations,omitempty"`
}

// Index references manifests for various platforms.
// This structure provides `application/vnd.oci.image.index.v1+json` mediatype when marshalled to JSON.
type Index struct {
    SchemaMediaBase

    // Manifests references platform specific manifests.
    Manifests []Descriptor `json:"manifests"`

    // Annotations contains arbitrary metadata for the image index.
    Annotations map[string]string `json:"annotations,omitempty"`
}
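The structs above mirror the OCI image spec JSON documents the container registry produces and consumes. A minimal sketch, assuming these types live in the oci package used by the new files below (code.gitea.io/gitea/modules/packages/container/oci), marshalling a Platform to show the dotted os.version key these definitions serialize to:

package main

import (
    "encoding/json"
    "fmt"

    "code.gitea.io/gitea/modules/packages/container/oci"
)

func main() {
    // Empty optional fields (os.features, variant) are dropped via omitempty.
    p := oci.Platform{
        Architecture: "amd64",
        OS:           "windows",
        OSVersion:    "10.0.14393.1066",
    }
    b, err := json.Marshal(p)
    if err != nil {
        panic(err)
    }
    fmt.Println(string(b))
    // {"architecture":"amd64","os":"windows","os.version":"10.0.14393.1066"}
}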
17  modules/packages/container/oci/reference.go  Normal file
@@ -0,0 +1,17 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package oci

import (
    "regexp"
)

var referencePattern = regexp.MustCompile(`\A[a-zA-Z0-9_][a-zA-Z0-9._-]{0,127}\z`)

type Reference string

func (r Reference) Validate() bool {
    return referencePattern.MatchString(string(r))
}
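A quick sketch of how Reference.Validate behaves, assuming the same import path as above; the pattern accepts tag-like names that start with an alphanumeric character or underscore and contain up to 127 further characters from [a-zA-Z0-9._-]:

package main

import (
    "fmt"

    "code.gitea.io/gitea/modules/packages/container/oci"
)

func main() {
    for _, tag := range []string{"latest", "v1.0-rc.1", "_build", ".hidden", "with space"} {
        fmt.Printf("%-12q valid=%v\n", tag, oci.Reference(tag).Validate())
    }
    // Expected: the first three validate; ".hidden" (leading dot) and
    // "with space" (whitespace) do not.
}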
47  modules/packages/content_store.go  Normal file
@@ -0,0 +1,47 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package packages

import (
    "io"
    "path"

    "code.gitea.io/gitea/modules/storage"
)

// BlobHash256Key is the key to address a blob content
type BlobHash256Key string

// ContentStore is a wrapper around ObjectStorage
type ContentStore struct {
    store storage.ObjectStorage
}

// NewContentStore creates the default package store
func NewContentStore() *ContentStore {
    contentStore := &ContentStore{storage.Packages}
    return contentStore
}

// Get gets a package blob
func (s *ContentStore) Get(key BlobHash256Key) (storage.Object, error) {
    return s.store.Open(keyToRelativePath(key))
}

// Save stores a package blob
func (s *ContentStore) Save(key BlobHash256Key, r io.Reader, size int64) error {
    _, err := s.store.Save(keyToRelativePath(key), r, size)
    return err
}

// Delete deletes a package blob
func (s *ContentStore) Delete(key BlobHash256Key) error {
    return s.store.Delete(keyToRelativePath(key))
}

// keyToRelativePath converts the sha256 key aabb000000... to aa/bb/aabb000000...
func keyToRelativePath(key BlobHash256Key) string {
    return path.Join(string(key)[0:2], string(key)[2:4], string(key))
}
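The content store addresses every blob by its SHA-256 hex digest and fans the files out over two directory levels. A standalone sketch of the same layout using only the standard library; it mirrors the unexported keyToRelativePath helper rather than calling it:

package main

import (
    "crypto/sha256"
    "fmt"
    "path"
)

func main() {
    // A blob key is its SHA-256 hex digest; the on-disk path is derived
    // from the first four hex characters to avoid huge flat directories.
    key := fmt.Sprintf("%x", sha256.Sum256([]byte("gitea")))
    fmt.Println(path.Join(key[0:2], key[2:4], key))
    // 6c/cc/6ccce4863b70f258d691f59609d31b4502e1ba5199942d3bc5d35d17a4ce771d
}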
70  modules/packages/hashed_buffer.go  Normal file
@@ -0,0 +1,70 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package packages

import (
    "io"

    "code.gitea.io/gitea/modules/util/filebuffer"
)

// HashedSizeReader provides methods to read, sum hashes and a Size method
type HashedSizeReader interface {
    io.Reader
    HashSummer
    Size() int64
}

// HashedBuffer is a buffer which calculates multiple checksums
type HashedBuffer struct {
    *filebuffer.FileBackedBuffer

    hash *MultiHasher

    combinedWriter io.Writer
}

// NewHashedBuffer creates a hashed buffer with a specific maximum memory size
func NewHashedBuffer(maxMemorySize int) (*HashedBuffer, error) {
    b, err := filebuffer.New(maxMemorySize)
    if err != nil {
        return nil, err
    }

    hash := NewMultiHasher()

    combinedWriter := io.MultiWriter(b, hash)

    return &HashedBuffer{
        b,
        hash,
        combinedWriter,
    }, nil
}

// CreateHashedBufferFromReader creates a hashed buffer and copies the provided reader data into it.
func CreateHashedBufferFromReader(r io.Reader, maxMemorySize int) (*HashedBuffer, error) {
    b, err := NewHashedBuffer(maxMemorySize)
    if err != nil {
        return nil, err
    }

    _, err = io.Copy(b, r)
    if err != nil {
        return nil, err
    }

    return b, nil
}

// Write implements io.Writer
func (b *HashedBuffer) Write(p []byte) (int, error) {
    return b.combinedWriter.Write(p)
}

// Sums gets the MD5, SHA1, SHA256 and SHA512 checksums of the data
func (b *HashedBuffer) Sums() (hashMD5, hashSHA1, hashSHA256, hashSHA512 []byte) {
    return b.hash.Sums()
}
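A usage sketch for the hashed buffer, runnable from inside the Gitea module; it assumes Size and Close are promoted from the embedded filebuffer.FileBackedBuffer (the symbol extractor below relies on the same):

package main

import (
    "fmt"
    "strings"

    "code.gitea.io/gitea/modules/packages"
)

func main() {
    // Tee the upload into the buffer and all four hashers in one pass;
    // data larger than 32 KiB would spill to a temporary file.
    buf, err := packages.CreateHashedBufferFromReader(strings.NewReader("gitea"), 32*1024)
    if err != nil {
        panic(err)
    }
    defer buf.Close()

    _, _, sha256sum, _ := buf.Sums()
    fmt.Printf("size=%d sha256=%x\n", buf.Size(), sha256sum)
    // size=5 sha256=6ccce4863b70f258d691f59609d31b4502e1ba5199942d3bc5d35d17a4ce771d
}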
89  modules/packages/maven/metadata.go  Normal file
@@ -0,0 +1,89 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package maven

import (
    "encoding/xml"
    "io"

    "code.gitea.io/gitea/modules/validation"
)

// Metadata represents the metadata of a Maven package
type Metadata struct {
    GroupID string `json:"group_id,omitempty"`
    ArtifactID string `json:"artifact_id,omitempty"`
    Name string `json:"name,omitempty"`
    Description string `json:"description,omitempty"`
    ProjectURL string `json:"project_url,omitempty"`
    Licenses []string `json:"licenses,omitempty"`
    Dependencies []*Dependency `json:"dependencies,omitempty"`
}

// Dependency represents a dependency of a Maven package
type Dependency struct {
    GroupID string `json:"group_id,omitempty"`
    ArtifactID string `json:"artifact_id,omitempty"`
    Version string `json:"version,omitempty"`
}

type pomStruct struct {
    XMLName xml.Name `xml:"project"`
    GroupID string `xml:"groupId"`
    ArtifactID string `xml:"artifactId"`
    Version string `xml:"version"`
    Name string `xml:"name"`
    Description string `xml:"description"`
    URL string `xml:"url"`
    Licenses []struct {
        Name string `xml:"name"`
        URL string `xml:"url"`
        Distribution string `xml:"distribution"`
    } `xml:"licenses>license"`
    Dependencies []struct {
        GroupID string `xml:"groupId"`
        ArtifactID string `xml:"artifactId"`
        Version string `xml:"version"`
        Scope string `xml:"scope"`
    } `xml:"dependencies>dependency"`
}

// ParsePackageMetaData parses the metadata of a pom file
func ParsePackageMetaData(r io.Reader) (*Metadata, error) {
    var pom pomStruct
    if err := xml.NewDecoder(r).Decode(&pom); err != nil {
        return nil, err
    }

    if !validation.IsValidURL(pom.URL) {
        pom.URL = ""
    }

    licenses := make([]string, 0, len(pom.Licenses))
    for _, l := range pom.Licenses {
        if l.Name != "" {
            licenses = append(licenses, l.Name)
        }
    }

    dependencies := make([]*Dependency, 0, len(pom.Dependencies))
    for _, d := range pom.Dependencies {
        dependencies = append(dependencies, &Dependency{
            GroupID: d.GroupID,
            ArtifactID: d.ArtifactID,
            Version: d.Version,
        })
    }

    return &Metadata{
        GroupID: pom.GroupID,
        ArtifactID: pom.ArtifactID,
        Name: pom.Name,
        Description: pom.Description,
        ProjectURL: pom.URL,
        Licenses: licenses,
        Dependencies: dependencies,
    }, nil
}
73  modules/packages/maven/metadata_test.go  Normal file
@@ -0,0 +1,73 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package maven

import (
    "strings"
    "testing"

    "github.com/stretchr/testify/assert"
)

const (
    groupID = "org.gitea"
    artifactID = "my-project"
    version = "1.0.1"
    name = "My Gitea Project"
    description = "Package Description"
    projectURL = "https://gitea.io"
    license = "MIT"
    dependencyGroupID = "org.gitea.core"
    dependencyArtifactID = "git"
    dependencyVersion = "5.0.0"
)

const pomContent = `<?xml version="1.0"?>
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<groupId>` + groupID + `</groupId>
<artifactId>` + artifactID + `</artifactId>
<version>` + version + `</version>
<name>` + name + `</name>
<description>` + description + `</description>
<url>` + projectURL + `</url>
<licenses>
<license>
<name>` + license + `</name>
</license>
</licenses>
<dependencies>
<dependency>
<groupId>` + dependencyGroupID + `</groupId>
<artifactId>` + dependencyArtifactID + `</artifactId>
<version>` + dependencyVersion + `</version>
</dependency>
</dependencies>
</project>`

func TestParsePackageMetaData(t *testing.T) {
    t.Run("InvalidFile", func(t *testing.T) {
        m, err := ParsePackageMetaData(strings.NewReader(""))
        assert.Nil(t, m)
        assert.Error(t, err)
    })

    t.Run("Valid", func(t *testing.T) {
        m, err := ParsePackageMetaData(strings.NewReader(pomContent))
        assert.NoError(t, err)
        assert.NotNil(t, m)

        assert.Equal(t, groupID, m.GroupID)
        assert.Equal(t, artifactID, m.ArtifactID)
        assert.Equal(t, name, m.Name)
        assert.Equal(t, description, m.Description)
        assert.Equal(t, projectURL, m.ProjectURL)
        assert.Len(t, m.Licenses, 1)
        assert.Equal(t, license, m.Licenses[0])
        assert.Len(t, m.Dependencies, 1)
        assert.Equal(t, dependencyGroupID, m.Dependencies[0].GroupID)
        assert.Equal(t, dependencyArtifactID, m.Dependencies[0].ArtifactID)
        assert.Equal(t, dependencyVersion, m.Dependencies[0].Version)
    })
}
123  modules/packages/multi_hasher.go  Normal file
@@ -0,0 +1,123 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package packages

import (
    "crypto/md5"
    "crypto/sha1"
    "crypto/sha256"
    "crypto/sha512"
    "encoding"
    "errors"
    "hash"
    "io"
)

const (
    marshaledSizeMD5 = 92
    marshaledSizeSHA1 = 96
    marshaledSizeSHA256 = 108
    marshaledSizeSHA512 = 204

    marshaledSize = marshaledSizeMD5 + marshaledSizeSHA1 + marshaledSizeSHA256 + marshaledSizeSHA512
)

// HashSummer provides a Sums method
type HashSummer interface {
    Sums() (hashMD5, hashSHA1, hashSHA256, hashSHA512 []byte)
}

// MultiHasher calculates multiple checksums
type MultiHasher struct {
    md5 hash.Hash
    sha1 hash.Hash
    sha256 hash.Hash
    sha512 hash.Hash

    combinedWriter io.Writer
}

// NewMultiHasher creates a multi hasher
func NewMultiHasher() *MultiHasher {
    md5 := md5.New()
    sha1 := sha1.New()
    sha256 := sha256.New()
    sha512 := sha512.New()

    combinedWriter := io.MultiWriter(md5, sha1, sha256, sha512)

    return &MultiHasher{
        md5,
        sha1,
        sha256,
        sha512,
        combinedWriter,
    }
}

// MarshalBinary implements encoding.BinaryMarshaler
func (h *MultiHasher) MarshalBinary() ([]byte, error) {
    md5Bytes, err := h.md5.(encoding.BinaryMarshaler).MarshalBinary()
    if err != nil {
        return nil, err
    }
    sha1Bytes, err := h.sha1.(encoding.BinaryMarshaler).MarshalBinary()
    if err != nil {
        return nil, err
    }
    sha256Bytes, err := h.sha256.(encoding.BinaryMarshaler).MarshalBinary()
    if err != nil {
        return nil, err
    }
    sha512Bytes, err := h.sha512.(encoding.BinaryMarshaler).MarshalBinary()
    if err != nil {
        return nil, err
    }

    b := make([]byte, 0, marshaledSize)
    b = append(b, md5Bytes...)
    b = append(b, sha1Bytes...)
    b = append(b, sha256Bytes...)
    b = append(b, sha512Bytes...)
    return b, nil
}

// UnmarshalBinary implements encoding.BinaryUnmarshaler
func (h *MultiHasher) UnmarshalBinary(b []byte) error {
    if len(b) != marshaledSize {
        return errors.New("invalid hash state size")
    }

    if err := h.md5.(encoding.BinaryUnmarshaler).UnmarshalBinary(b[:marshaledSizeMD5]); err != nil {
        return err
    }

    b = b[marshaledSizeMD5:]
    if err := h.sha1.(encoding.BinaryUnmarshaler).UnmarshalBinary(b[:marshaledSizeSHA1]); err != nil {
        return err
    }

    b = b[marshaledSizeSHA1:]
    if err := h.sha256.(encoding.BinaryUnmarshaler).UnmarshalBinary(b[:marshaledSizeSHA256]); err != nil {
        return err
    }

    b = b[marshaledSizeSHA256:]
    return h.sha512.(encoding.BinaryUnmarshaler).UnmarshalBinary(b[:marshaledSizeSHA512])
}

// Write implements io.Writer
func (h *MultiHasher) Write(p []byte) (int, error) {
    return h.combinedWriter.Write(p)
}

// Sums gets the MD5, SHA1, SHA256 and SHA512 checksums of the data
func (h *MultiHasher) Sums() (hashMD5, hashSHA1, hashSHA256, hashSHA512 []byte) {
    hashMD5 = h.md5.Sum(nil)
    hashSHA1 = h.sha1.Sum(nil)
    hashSHA256 = h.sha256.Sum(nil)
    hashSHA512 = h.sha512.Sum(nil)
    return
}
54  modules/packages/multi_hasher_test.go  Normal file
@@ -0,0 +1,54 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package packages

import (
    "fmt"
    "testing"

    "github.com/stretchr/testify/assert"
)

const (
    expectedMD5 = "e3bef03c5f3b7f6b3ab3e3053ed71e9c"
    expectedSHA1 = "060b3b99f88e96085b4a68e095bc9e3d1d91e1bc"
    expectedSHA256 = "6ccce4863b70f258d691f59609d31b4502e1ba5199942d3bc5d35d17a4ce771d"
    expectedSHA512 = "7f70e439ba8c52025c1f06cdf6ae443c4b8ed2e90059cdb9bbbf8adf80846f185a24acca9245b128b226d61753b0d7ed46580a69c8999eeff3bc13a4d0bd816c"
)

func TestMultiHasherSums(t *testing.T) {
    t.Run("Sums", func(t *testing.T) {
        h := NewMultiHasher()
        h.Write([]byte("gitea"))

        hashMD5, hashSHA1, hashSHA256, hashSHA512 := h.Sums()

        assert.Equal(t, expectedMD5, fmt.Sprintf("%x", hashMD5))
        assert.Equal(t, expectedSHA1, fmt.Sprintf("%x", hashSHA1))
        assert.Equal(t, expectedSHA256, fmt.Sprintf("%x", hashSHA256))
        assert.Equal(t, expectedSHA512, fmt.Sprintf("%x", hashSHA512))
    })

    t.Run("State", func(t *testing.T) {
        h := NewMultiHasher()
        h.Write([]byte("git"))

        state, err := h.MarshalBinary()
        assert.NoError(t, err)

        h2 := NewMultiHasher()
        err = h2.UnmarshalBinary(state)
        assert.NoError(t, err)

        h2.Write([]byte("ea"))

        hashMD5, hashSHA1, hashSHA256, hashSHA512 := h2.Sums()

        assert.Equal(t, expectedMD5, fmt.Sprintf("%x", hashMD5))
        assert.Equal(t, expectedSHA1, fmt.Sprintf("%x", hashSHA1))
        assert.Equal(t, expectedSHA256, fmt.Sprintf("%x", hashSHA256))
        assert.Equal(t, expectedSHA512, fmt.Sprintf("%x", hashSHA512))
    })
}
256  modules/packages/npm/creator.go  Normal file
@@ -0,0 +1,256 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package npm

import (
    "bytes"
    "crypto/sha1"
    "crypto/sha512"
    "encoding/base64"
    "errors"
    "fmt"
    "io"
    "regexp"
    "strings"
    "time"

    "code.gitea.io/gitea/modules/json"
    "code.gitea.io/gitea/modules/validation"

    "github.com/hashicorp/go-version"
)

var (
    // ErrInvalidPackage indicates an invalid package
    ErrInvalidPackage = errors.New("The package is invalid")
    // ErrInvalidPackageName indicates an invalid name
    ErrInvalidPackageName = errors.New("The package name is invalid")
    // ErrInvalidPackageVersion indicates an invalid version
    ErrInvalidPackageVersion = errors.New("The package version is invalid")
    // ErrInvalidAttachment indicates an invalid attachment
    ErrInvalidAttachment = errors.New("The package attachment is invalid")
    // ErrInvalidIntegrity indicates an integrity validation error
    ErrInvalidIntegrity = errors.New("Failed to validate integrity")
)

var nameMatch = regexp.MustCompile(`\A((@[^\s\/~'!\(\)\*]+?)[\/])?([^_.][^\s\/~'!\(\)\*]+)\z`)

// Package represents an npm package
type Package struct {
    Name string
    Version string
    DistTags []string
    Metadata Metadata
    Filename string
    Data []byte
}

// PackageMetadata https://github.com/npm/registry/blob/master/docs/REGISTRY-API.md#package
type PackageMetadata struct {
    ID string `json:"_id"`
    Name string `json:"name"`
    Description string `json:"description"`
    DistTags map[string]string `json:"dist-tags,omitempty"`
    Versions map[string]*PackageMetadataVersion `json:"versions"`
    Readme string `json:"readme,omitempty"`
    Maintainers []User `json:"maintainers,omitempty"`
    Time map[string]time.Time `json:"time,omitempty"`
    Homepage string `json:"homepage,omitempty"`
    Keywords []string `json:"keywords,omitempty"`
    Repository Repository `json:"repository,omitempty"`
    Author User `json:"author"`
    ReadmeFilename string `json:"readmeFilename,omitempty"`
    Users map[string]bool `json:"users,omitempty"`
    License string `json:"license,omitempty"`
}

// PackageMetadataVersion https://github.com/npm/registry/blob/master/docs/REGISTRY-API.md#version
type PackageMetadataVersion struct {
    ID string `json:"_id"`
    Name string `json:"name"`
    Version string `json:"version"`
    Description string `json:"description"`
    Author User `json:"author"`
    Homepage string `json:"homepage,omitempty"`
    License string `json:"license,omitempty"`
    Repository Repository `json:"repository,omitempty"`
    Keywords []string `json:"keywords,omitempty"`
    Dependencies map[string]string `json:"dependencies,omitempty"`
    DevDependencies map[string]string `json:"devDependencies,omitempty"`
    PeerDependencies map[string]string `json:"peerDependencies,omitempty"`
    OptionalDependencies map[string]string `json:"optionalDependencies,omitempty"`
    Readme string `json:"readme,omitempty"`
    Dist PackageDistribution `json:"dist"`
    Maintainers []User `json:"maintainers,omitempty"`
}

// PackageDistribution https://github.com/npm/registry/blob/master/docs/REGISTRY-API.md#version
type PackageDistribution struct {
    Integrity string `json:"integrity"`
    Shasum string `json:"shasum"`
    Tarball string `json:"tarball"`
    FileCount int `json:"fileCount,omitempty"`
    UnpackedSize int `json:"unpackedSize,omitempty"`
    NpmSignature string `json:"npm-signature,omitempty"`
}

// User https://github.com/npm/registry/blob/master/docs/REGISTRY-API.md#package
type User struct {
    Username string `json:"username,omitempty"`
    Name string `json:"name"`
    Email string `json:"email,omitempty"`
    URL string `json:"url,omitempty"`
}

// UnmarshalJSON is needed because User objects can be strings or objects
func (u *User) UnmarshalJSON(data []byte) error {
    switch data[0] {
    case '"':
        if err := json.Unmarshal(data, &u.Name); err != nil {
            return err
        }
    case '{':
        var tmp struct {
            Username string `json:"username"`
            Name string `json:"name"`
            Email string `json:"email"`
            URL string `json:"url"`
        }
        if err := json.Unmarshal(data, &tmp); err != nil {
            return err
        }
        u.Username = tmp.Username
        u.Name = tmp.Name
        u.Email = tmp.Email
        u.URL = tmp.URL
    }
    return nil
}

// Repository https://github.com/npm/registry/blob/master/docs/REGISTRY-API.md#version
type Repository struct {
    Type string `json:"type"`
    URL string `json:"url"`
}

// PackageAttachment https://github.com/npm/registry/blob/master/docs/REGISTRY-API.md#package
type PackageAttachment struct {
    ContentType string `json:"content_type"`
    Data string `json:"data"`
    Length int `json:"length"`
}

type packageUpload struct {
    PackageMetadata
    Attachments map[string]*PackageAttachment `json:"_attachments"`
}

// ParsePackage parses the content into an npm package
func ParsePackage(r io.Reader) (*Package, error) {
    var upload packageUpload
    if err := json.NewDecoder(r).Decode(&upload); err != nil {
        return nil, err
    }

    for _, meta := range upload.Versions {
        if !validateName(meta.Name) {
            return nil, ErrInvalidPackageName
        }

        v, err := version.NewSemver(meta.Version)
        if err != nil {
            return nil, ErrInvalidPackageVersion
        }

        scope := ""
        name := meta.Name
        nameParts := strings.SplitN(meta.Name, "/", 2)
        if len(nameParts) == 2 {
            scope = nameParts[0]
            name = nameParts[1]
        }

        if !validation.IsValidURL(meta.Homepage) {
            meta.Homepage = ""
        }

        p := &Package{
            Name: meta.Name,
            Version: v.String(),
            DistTags: make([]string, 0, 1),
            Metadata: Metadata{
                Scope: scope,
                Name: name,
                Description: meta.Description,
                Author: meta.Author.Name,
                License: meta.License,
                ProjectURL: meta.Homepage,
                Keywords: meta.Keywords,
                Dependencies: meta.Dependencies,
                DevelopmentDependencies: meta.DevDependencies,
                PeerDependencies: meta.PeerDependencies,
                OptionalDependencies: meta.OptionalDependencies,
                Readme: meta.Readme,
            },
        }

        for tag := range upload.DistTags {
            p.DistTags = append(p.DistTags, tag)
        }

        p.Filename = strings.ToLower(fmt.Sprintf("%s-%s.tgz", name, p.Version))

        attachment := func() *PackageAttachment {
            for _, a := range upload.Attachments {
                return a
            }
            return nil
        }()
        if attachment == nil || len(attachment.Data) == 0 {
            return nil, ErrInvalidAttachment
        }

        data, err := base64.StdEncoding.DecodeString(attachment.Data)
        if err != nil {
            return nil, ErrInvalidAttachment
        }
        p.Data = data

        integrity := strings.SplitN(meta.Dist.Integrity, "-", 2)
        if len(integrity) != 2 {
            return nil, ErrInvalidIntegrity
        }
        integrityHash, err := base64.StdEncoding.DecodeString(integrity[1])
        if err != nil {
            return nil, ErrInvalidIntegrity
        }
        var hash []byte
        switch integrity[0] {
        case "sha1":
            tmp := sha1.Sum(data)
            hash = tmp[:]
        case "sha512":
            tmp := sha512.Sum512(data)
            hash = tmp[:]
        }
        if !bytes.Equal(integrityHash, hash) {
            return nil, ErrInvalidIntegrity
        }

        return p, nil
    }

    return nil, ErrInvalidPackage
}

func validateName(name string) bool {
    if strings.TrimSpace(name) != name {
        return false
    }
    if len(name) == 0 || len(name) > 214 {
        return false
    }
    return nameMatch.MatchString(name)
}
272  modules/packages/npm/creator_test.go  Normal file
@@ -0,0 +1,272 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package npm

import (
    "bytes"
    "encoding/base64"
    "fmt"
    "strings"
    "testing"

    "code.gitea.io/gitea/modules/json"

    "github.com/stretchr/testify/assert"
)

func TestParsePackage(t *testing.T) {
    packageScope := "@scope"
    packageName := "test-package"
    packageFullName := packageScope + "/" + packageName
    packageVersion := "1.0.1-pre"
    packageTag := "latest"
    packageAuthor := "KN4CK3R"
    packageDescription := "Test Description"
    data := "H4sIAAAAAAAA/ytITM5OTE/VL4DQelnF+XkMVAYGBgZmJiYK2MRBwNDcSIHB2NTMwNDQzMwAqA7IMDUxA9LUdgg2UFpcklgEdAql5kD8ogCnhwio5lJQUMpLzE1VslJQcihOzi9I1S9JLS7RhSYIJR2QgrLUouLM/DyQGkM9Az1D3YIiqExKanFyUWZBCVQ2BKhVwQVJDKwosbQkI78IJO/tZ+LsbRykxFXLNdA+HwWjYBSMgpENACgAbtAACAAA"
    integrity := "sha512-yA4FJsVhetynGfOC1jFf79BuS+jrHbm0fhh+aHzCQkOaOBXKf9oBnC4a6DnLLnEsHQDRLYd00cwj8sCXpC+wIg=="

    t.Run("InvalidUpload", func(t *testing.T) {
        p, err := ParsePackage(bytes.NewReader([]byte{0}))
        assert.Nil(t, p)
        assert.Error(t, err)
    })

    t.Run("InvalidUploadNoData", func(t *testing.T) {
        b, _ := json.Marshal(packageUpload{})
        p, err := ParsePackage(bytes.NewReader(b))
        assert.Nil(t, p)
        assert.ErrorIs(t, err, ErrInvalidPackage)
    })

    t.Run("InvalidPackageName", func(t *testing.T) {
        test := func(t *testing.T, name string) {
            b, _ := json.Marshal(packageUpload{
                PackageMetadata: PackageMetadata{
                    ID: name,
                    Name: name,
                    Versions: map[string]*PackageMetadataVersion{
                        packageVersion: {
                            Name: name,
                        },
                    },
                },
            })

            p, err := ParsePackage(bytes.NewReader(b))
            assert.Nil(t, p)
            assert.ErrorIs(t, err, ErrInvalidPackageName)
        }

        test(t, " test ")
        test(t, " test")
        test(t, "test ")
        test(t, "te st")
        test(t, "invalid/scope")
        test(t, "@invalid/_name")
        test(t, "@invalid/.name")
    })

    t.Run("ValidPackageName", func(t *testing.T) {
        test := func(t *testing.T, name string) {
            b, _ := json.Marshal(packageUpload{
                PackageMetadata: PackageMetadata{
                    ID: name,
                    Name: name,
                    Versions: map[string]*PackageMetadataVersion{
                        packageVersion: {
                            Name: name,
                        },
                    },
                },
            })

            p, err := ParsePackage(bytes.NewReader(b))
            assert.Nil(t, p)
            assert.ErrorIs(t, err, ErrInvalidPackageVersion)
        }

        test(t, "test")
        test(t, "@scope/name")
        test(t, packageFullName)
    })

    t.Run("InvalidPackageVersion", func(t *testing.T) {
        version := "first-version"
        b, _ := json.Marshal(packageUpload{
            PackageMetadata: PackageMetadata{
                ID: packageFullName,
                Name: packageFullName,
                Versions: map[string]*PackageMetadataVersion{
                    version: {
                        Name: packageFullName,
                        Version: version,
                    },
                },
            },
        })

        p, err := ParsePackage(bytes.NewReader(b))
        assert.Nil(t, p)
        assert.ErrorIs(t, err, ErrInvalidPackageVersion)
    })

    t.Run("InvalidAttachment", func(t *testing.T) {
        b, _ := json.Marshal(packageUpload{
            PackageMetadata: PackageMetadata{
                ID: packageFullName,
                Name: packageFullName,
                Versions: map[string]*PackageMetadataVersion{
                    packageVersion: {
                        Name: packageFullName,
                        Version: packageVersion,
                    },
                },
            },
            Attachments: map[string]*PackageAttachment{
                "dummy.tgz": {},
            },
        })

        p, err := ParsePackage(bytes.NewReader(b))
        assert.Nil(t, p)
        assert.ErrorIs(t, err, ErrInvalidAttachment)
    })

    t.Run("InvalidData", func(t *testing.T) {
        filename := fmt.Sprintf("%s-%s.tgz", packageFullName, packageVersion)
        b, _ := json.Marshal(packageUpload{
            PackageMetadata: PackageMetadata{
                ID: packageFullName,
                Name: packageFullName,
                Versions: map[string]*PackageMetadataVersion{
                    packageVersion: {
                        Name: packageFullName,
                        Version: packageVersion,
                    },
                },
            },
            Attachments: map[string]*PackageAttachment{
                filename: {
                    Data: "/",
                },
            },
        })

        p, err := ParsePackage(bytes.NewReader(b))
        assert.Nil(t, p)
        assert.ErrorIs(t, err, ErrInvalidAttachment)
    })

    t.Run("InvalidIntegrity", func(t *testing.T) {
        filename := fmt.Sprintf("%s-%s.tgz", packageFullName, packageVersion)
        b, _ := json.Marshal(packageUpload{
            PackageMetadata: PackageMetadata{
                ID: packageFullName,
                Name: packageFullName,
                Versions: map[string]*PackageMetadataVersion{
                    packageVersion: {
                        Name: packageFullName,
                        Version: packageVersion,
                        Dist: PackageDistribution{
                            Integrity: "sha512-test==",
                        },
                    },
                },
            },
            Attachments: map[string]*PackageAttachment{
                filename: {
                    Data: data,
                },
            },
        })

        p, err := ParsePackage(bytes.NewReader(b))
        assert.Nil(t, p)
        assert.ErrorIs(t, err, ErrInvalidIntegrity)
    })

    t.Run("InvalidIntegrity2", func(t *testing.T) {
        filename := fmt.Sprintf("%s-%s.tgz", packageFullName, packageVersion)
        b, _ := json.Marshal(packageUpload{
            PackageMetadata: PackageMetadata{
                ID: packageFullName,
                Name: packageFullName,
                Versions: map[string]*PackageMetadataVersion{
                    packageVersion: {
                        Name: packageFullName,
                        Version: packageVersion,
                        Dist: PackageDistribution{
                            Integrity: integrity,
                        },
                    },
                },
            },
            Attachments: map[string]*PackageAttachment{
                filename: {
                    Data: base64.StdEncoding.EncodeToString([]byte("data")),
                },
            },
        })

        p, err := ParsePackage(bytes.NewReader(b))
        assert.Nil(t, p)
        assert.ErrorIs(t, err, ErrInvalidIntegrity)
    })

    t.Run("Valid", func(t *testing.T) {
        filename := fmt.Sprintf("%s-%s.tgz", packageFullName, packageVersion)
        b, _ := json.Marshal(packageUpload{
            PackageMetadata: PackageMetadata{
                ID: packageFullName,
                Name: packageFullName,
                DistTags: map[string]string{
                    packageTag: packageVersion,
                },
                Versions: map[string]*PackageMetadataVersion{
                    packageVersion: {
                        Name: packageFullName,
                        Version: packageVersion,
                        Description: packageDescription,
                        Author: User{Name: packageAuthor},
                        License: "MIT",
                        Homepage: "https://gitea.io/",
                        Readme: packageDescription,
                        Dependencies: map[string]string{
                            "package": "1.2.0",
                        },
                        Dist: PackageDistribution{
                            Integrity: integrity,
                        },
                    },
                },
            },
            Attachments: map[string]*PackageAttachment{
                filename: {
                    Data: data,
                },
            },
        })

        p, err := ParsePackage(bytes.NewReader(b))
        assert.NotNil(t, p)
        assert.NoError(t, err)

        assert.Equal(t, packageFullName, p.Name)
        assert.Equal(t, packageVersion, p.Version)
        assert.Equal(t, []string{packageTag}, p.DistTags)
        assert.Equal(t, fmt.Sprintf("%s-%s.tgz", strings.Split(packageFullName, "/")[1], packageVersion), p.Filename)
        b, _ = base64.StdEncoding.DecodeString(data)
        assert.Equal(t, b, p.Data)
        assert.Equal(t, packageName, p.Metadata.Name)
        assert.Equal(t, packageScope, p.Metadata.Scope)
        assert.Equal(t, packageDescription, p.Metadata.Description)
        assert.Equal(t, packageDescription, p.Metadata.Readme)
        assert.Equal(t, packageAuthor, p.Metadata.Author)
        assert.Equal(t, "MIT", p.Metadata.License)
        assert.Equal(t, "https://gitea.io/", p.Metadata.ProjectURL)
        assert.Contains(t, p.Metadata.Dependencies, "package")
        assert.Equal(t, "1.2.0", p.Metadata.Dependencies["package"])
    })
}
24  modules/packages/npm/metadata.go  Normal file
@@ -0,0 +1,24 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package npm

// TagProperty is the name of the property for tag management
const TagProperty = "npm.tag"

// Metadata represents the metadata of an npm package
type Metadata struct {
    Scope string `json:"scope,omitempty"`
    Name string `json:"name,omitempty"`
    Description string `json:"description,omitempty"`
    Author string `json:"author,omitempty"`
    License string `json:"license,omitempty"`
    ProjectURL string `json:"project_url,omitempty"`
    Keywords []string `json:"keywords,omitempty"`
    Dependencies map[string]string `json:"dependencies,omitempty"`
    DevelopmentDependencies map[string]string `json:"development_dependencies,omitempty"`
    PeerDependencies map[string]string `json:"peer_dependencies,omitempty"`
    OptionalDependencies map[string]string `json:"optional_dependencies,omitempty"`
    Readme string `json:"readme,omitempty"`
}
187  modules/packages/nuget/metadata.go  Normal file
@@ -0,0 +1,187 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package nuget

import (
    "archive/zip"
    "encoding/xml"
    "errors"
    "io"
    "path/filepath"
    "regexp"
    "strings"

    "code.gitea.io/gitea/modules/validation"

    "github.com/hashicorp/go-version"
)

var (
    // ErrMissingNuspecFile indicates a missing Nuspec file
    ErrMissingNuspecFile = errors.New("Nuspec file is missing")
    // ErrNuspecFileTooLarge indicates a Nuspec file which is too large
    ErrNuspecFileTooLarge = errors.New("Nuspec file is too large")
    // ErrNuspecInvalidID indicates an invalid id in the Nuspec file
    ErrNuspecInvalidID = errors.New("Nuspec file contains an invalid id")
    // ErrNuspecInvalidVersion indicates an invalid version in the Nuspec file
    ErrNuspecInvalidVersion = errors.New("Nuspec file contains an invalid version")
)

// PackageType specifies the package type the metadata describes
type PackageType int

const (
    // DependencyPackage represents a package (*.nupkg)
    DependencyPackage PackageType = iota + 1
    // SymbolsPackage represents a symbol package (*.snupkg)
    SymbolsPackage

    PropertySymbolID = "nuget.symbol.id"
)

var idmatch = regexp.MustCompile(`\A\w+(?:[.-]\w+)*\z`)

const maxNuspecFileSize = 3 * 1024 * 1024

// Package represents a Nuget package
type Package struct {
    PackageType PackageType
    ID string
    Version string
    Metadata *Metadata
}

// Metadata represents the metadata of a Nuget package
type Metadata struct {
    Description string `json:"description,omitempty"`
    ReleaseNotes string `json:"release_notes,omitempty"`
    Authors string `json:"authors,omitempty"`
    ProjectURL string `json:"project_url,omitempty"`
    RepositoryURL string `json:"repository_url,omitempty"`
    Dependencies map[string][]Dependency `json:"dependencies,omitempty"`
}

// Dependency represents a dependency of a Nuget package
type Dependency struct {
    ID string `json:"id"`
    Version string `json:"version"`
}

type nuspecPackage struct {
    Metadata struct {
        ID string `xml:"id"`
        Version string `xml:"version"`
        Authors string `xml:"authors"`
        RequireLicenseAcceptance bool `xml:"requireLicenseAcceptance"`
        ProjectURL string `xml:"projectUrl"`
        Description string `xml:"description"`
        ReleaseNotes string `xml:"releaseNotes"`
        PackageTypes struct {
            PackageType []struct {
                Name string `xml:"name,attr"`
            } `xml:"packageType"`
        } `xml:"packageTypes"`
        Repository struct {
            URL string `xml:"url,attr"`
        } `xml:"repository"`
        Dependencies struct {
            Group []struct {
                TargetFramework string `xml:"targetFramework,attr"`
                Dependency []struct {
                    ID string `xml:"id,attr"`
                    Version string `xml:"version,attr"`
                    Exclude string `xml:"exclude,attr"`
                } `xml:"dependency"`
            } `xml:"group"`
        } `xml:"dependencies"`
    } `xml:"metadata"`
}

// ParsePackageMetaData parses the metadata of a Nuget package file
func ParsePackageMetaData(r io.ReaderAt, size int64) (*Package, error) {
    archive, err := zip.NewReader(r, size)
    if err != nil {
        return nil, err
    }

    for _, file := range archive.File {
        if filepath.Dir(file.Name) != "." {
            continue
        }
        if strings.HasSuffix(strings.ToLower(file.Name), ".nuspec") {
            if file.UncompressedSize64 > maxNuspecFileSize {
                return nil, ErrNuspecFileTooLarge
            }
            f, err := archive.Open(file.Name)
            if err != nil {
                return nil, err
            }
            defer f.Close()

            return ParseNuspecMetaData(f)
        }
    }
    return nil, ErrMissingNuspecFile
}

// ParseNuspecMetaData parses a Nuspec file to retrieve the metadata of a Nuget package
func ParseNuspecMetaData(r io.Reader) (*Package, error) {
    var p nuspecPackage
    if err := xml.NewDecoder(r).Decode(&p); err != nil {
        return nil, err
    }

    if !idmatch.MatchString(p.Metadata.ID) {
        return nil, ErrNuspecInvalidID
    }

    v, err := version.NewSemver(p.Metadata.Version)
    if err != nil {
        return nil, ErrNuspecInvalidVersion
    }

    if !validation.IsValidURL(p.Metadata.ProjectURL) {
        p.Metadata.ProjectURL = ""
    }

    packageType := DependencyPackage
    for _, pt := range p.Metadata.PackageTypes.PackageType {
        if pt.Name == "SymbolsPackage" {
            packageType = SymbolsPackage
            break
        }
    }

    m := &Metadata{
        Description: p.Metadata.Description,
        ReleaseNotes: p.Metadata.ReleaseNotes,
        Authors: p.Metadata.Authors,
        ProjectURL: p.Metadata.ProjectURL,
        RepositoryURL: p.Metadata.Repository.URL,
        Dependencies: make(map[string][]Dependency),
    }

    for _, group := range p.Metadata.Dependencies.Group {
        deps := make([]Dependency, 0, len(group.Dependency))
        for _, dep := range group.Dependency {
            if dep.ID == "" || dep.Version == "" {
                continue
            }
            deps = append(deps, Dependency{
                ID: dep.ID,
                Version: dep.Version,
            })
        }
        if len(deps) > 0 {
            m.Dependencies[group.TargetFramework] = deps
        }
    }
    return &Package{
        PackageType: packageType,
        ID: p.Metadata.ID,
        Version: v.String(),
        Metadata: m,
    }, nil
}
163  modules/packages/nuget/metadata_test.go  Normal file
@@ -0,0 +1,163 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package nuget

import (
    "archive/zip"
    "bytes"
    "strings"
    "testing"

    "github.com/stretchr/testify/assert"
)

const (
    id = "System.Gitea"
    semver = "1.0.1"
    authors = "Gitea Authors"
    projectURL = "https://gitea.io"
    description = "Package Description"
    releaseNotes = "Package Release Notes"
    repositoryURL = "https://gitea.io/gitea/gitea"
    targetFramework = ".NETStandard2.1"
    dependencyID = "System.Text.Json"
    dependencyVersion = "5.0.0"
)

const nuspecContent = `<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2013/05/nuspec.xsd">
<metadata>
<id>` + id + `</id>
<version>` + semver + `</version>
<authors>` + authors + `</authors>
<requireLicenseAcceptance>true</requireLicenseAcceptance>
<projectUrl>` + projectURL + `</projectUrl>
<description>` + description + `</description>
<releaseNotes>` + releaseNotes + `</releaseNotes>
<repository url="` + repositoryURL + `" />
<dependencies>
<group targetFramework="` + targetFramework + `">
<dependency id="` + dependencyID + `" version="` + dependencyVersion + `" exclude="Build,Analyzers" />
</group>
</dependencies>
</metadata>
</package>`

const symbolsNuspecContent = `<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2013/05/nuspec.xsd">
<metadata>
<id>` + id + `</id>
<version>` + semver + `</version>
<description>` + description + `</description>
<packageTypes>
<packageType name="SymbolsPackage" />
</packageTypes>
<dependencies>
<group targetFramework="` + targetFramework + `" />
</dependencies>
</metadata>
</package>`

func TestParsePackageMetaData(t *testing.T) {
    createArchive := func(name, content string) []byte {
        var buf bytes.Buffer
        archive := zip.NewWriter(&buf)
        w, _ := archive.Create(name)
        w.Write([]byte(content))
        archive.Close()
        return buf.Bytes()
    }

    t.Run("MissingNuspecFile", func(t *testing.T) {
        data := createArchive("dummy.txt", "")

        np, err := ParsePackageMetaData(bytes.NewReader(data), int64(len(data)))
        assert.Nil(t, np)
        assert.ErrorIs(t, err, ErrMissingNuspecFile)
    })

    t.Run("MissingNuspecFileInRoot", func(t *testing.T) {
        data := createArchive("sub/package.nuspec", "")

        np, err := ParsePackageMetaData(bytes.NewReader(data), int64(len(data)))
        assert.Nil(t, np)
        assert.ErrorIs(t, err, ErrMissingNuspecFile)
    })

    t.Run("InvalidNuspecFile", func(t *testing.T) {
        data := createArchive("package.nuspec", "")

        np, err := ParsePackageMetaData(bytes.NewReader(data), int64(len(data)))
        assert.Nil(t, np)
        assert.Error(t, err)
    })

    t.Run("InvalidPackageId", func(t *testing.T) {
        data := createArchive("package.nuspec", `<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2013/05/nuspec.xsd">
<metadata></metadata>
</package>`)

        np, err := ParsePackageMetaData(bytes.NewReader(data), int64(len(data)))
        assert.Nil(t, np)
        assert.ErrorIs(t, err, ErrNuspecInvalidID)
    })

    t.Run("InvalidPackageVersion", func(t *testing.T) {
        data := createArchive("package.nuspec", `<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2013/05/nuspec.xsd">
<metadata>
<id>`+id+`</id>
</metadata>
</package>`)

        np, err := ParsePackageMetaData(bytes.NewReader(data), int64(len(data)))
        assert.Nil(t, np)
        assert.ErrorIs(t, err, ErrNuspecInvalidVersion)
    })

    t.Run("Valid", func(t *testing.T) {
        data := createArchive("package.nuspec", nuspecContent)

        np, err := ParsePackageMetaData(bytes.NewReader(data), int64(len(data)))
        assert.NoError(t, err)
        assert.NotNil(t, np)
    })
}

func TestParseNuspecMetaData(t *testing.T) {
    t.Run("Dependency Package", func(t *testing.T) {
        np, err := ParseNuspecMetaData(strings.NewReader(nuspecContent))
        assert.NoError(t, err)
        assert.NotNil(t, np)
        assert.Equal(t, DependencyPackage, np.PackageType)

        assert.Equal(t, id, np.ID)
        assert.Equal(t, semver, np.Version)
        assert.Equal(t, authors, np.Metadata.Authors)
        assert.Equal(t, projectURL, np.Metadata.ProjectURL)
        assert.Equal(t, description, np.Metadata.Description)
        assert.Equal(t, releaseNotes, np.Metadata.ReleaseNotes)
        assert.Equal(t, repositoryURL, np.Metadata.RepositoryURL)
        assert.Len(t, np.Metadata.Dependencies, 1)
        assert.Contains(t, np.Metadata.Dependencies, targetFramework)
        deps := np.Metadata.Dependencies[targetFramework]
        assert.Len(t, deps, 1)
        assert.Equal(t, dependencyID, deps[0].ID)
        assert.Equal(t, dependencyVersion, deps[0].Version)
    })

    t.Run("Symbols Package", func(t *testing.T) {
        np, err := ParseNuspecMetaData(strings.NewReader(symbolsNuspecContent))
        assert.NoError(t, err)
        assert.NotNil(t, np)
        assert.Equal(t, SymbolsPackage, np.PackageType)

        assert.Equal(t, id, np.ID)
        assert.Equal(t, semver, np.Version)
        assert.Equal(t, description, np.Metadata.Description)
        assert.Empty(t, np.Metadata.Dependencies)
    })
}
modules/packages/nuget/symbol_extractor.go (new file)
@@ -0,0 +1,187 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package nuget

import (
	"archive/zip"
	"bytes"
	"encoding/binary"
	"errors"
	"fmt"
	"io"
	"path"
	"path/filepath"
	"strings"

	"code.gitea.io/gitea/modules/packages"
)

var (
	ErrMissingPdbFiles       = errors.New("Package does not contain PDB files")
	ErrInvalidFiles          = errors.New("Package contains invalid files")
	ErrInvalidPdbMagicNumber = errors.New("Invalid Portable PDB magic number")
	ErrMissingPdbStream      = errors.New("Missing PDB stream")
)

type PortablePdb struct {
	Name    string
	ID      string
	Content *packages.HashedBuffer
}

type PortablePdbList []*PortablePdb

func (l PortablePdbList) Close() {
	for _, pdb := range l {
		pdb.Content.Close()
	}
}

// ExtractPortablePdb extracts PDB files from a .snupkg file
func ExtractPortablePdb(r io.ReaderAt, size int64) (PortablePdbList, error) {
	archive, err := zip.NewReader(r, size)
	if err != nil {
		return nil, err
	}

	var pdbs PortablePdbList

	err = func() error {
		for _, file := range archive.File {
			if strings.HasSuffix(file.Name, "/") {
				continue
			}
			ext := strings.ToLower(filepath.Ext(file.Name))

			switch ext {
			case ".nuspec", ".xml", ".psmdcp", ".rels", ".p7s":
				continue
			case ".pdb":
				f, err := archive.Open(file.Name)
				if err != nil {
					return err
				}

				buf, err := packages.CreateHashedBufferFromReader(f, 32*1024*1024)

				f.Close()

				if err != nil {
					return err
				}

				id, err := ParseDebugHeaderID(buf)
				if err != nil {
					buf.Close()
					return fmt.Errorf("Invalid PDB file: %v", err)
				}

				if _, err := buf.Seek(0, io.SeekStart); err != nil {
					buf.Close()
					return err
				}

				pdbs = append(pdbs, &PortablePdb{
					Name:    path.Base(file.Name),
					ID:      id,
					Content: buf,
				})
			default:
				return ErrInvalidFiles
			}
		}
		return nil
	}()
	if err != nil {
		pdbs.Close()
		return nil, err
	}

	if len(pdbs) == 0 {
		return nil, ErrMissingPdbFiles
	}

	return pdbs, nil
}

// ParseDebugHeaderID reads the metadata header of a Portable PDB stream and returns the PDB ID (GUID) from its #Pdb stream as a hex string
func ParseDebugHeaderID(r io.ReadSeeker) (string, error) {
	var magic uint32
	if err := binary.Read(r, binary.LittleEndian, &magic); err != nil {
		return "", err
	}
	if magic != 0x424A5342 {
		return "", ErrInvalidPdbMagicNumber
	}

	if _, err := r.Seek(8, io.SeekCurrent); err != nil {
		return "", err
	}

	var versionStringSize int32
	if err := binary.Read(r, binary.LittleEndian, &versionStringSize); err != nil {
		return "", err
	}
	if _, err := r.Seek(int64(versionStringSize), io.SeekCurrent); err != nil {
		return "", err
	}
	if _, err := r.Seek(2, io.SeekCurrent); err != nil {
		return "", err
	}

	var streamCount int16
	if err := binary.Read(r, binary.LittleEndian, &streamCount); err != nil {
		return "", err
	}

	read4ByteAlignedString := func(r io.Reader) (string, error) {
		b := make([]byte, 4)
		var buf bytes.Buffer
		for {
			if _, err := r.Read(b); err != nil {
				return "", err
			}
			if i := bytes.IndexByte(b, 0); i != -1 {
				buf.Write(b[:i])
				return buf.String(), nil
			}
			buf.Write(b)
		}
	}

	for i := 0; i < int(streamCount); i++ {
		var offset uint32
		if err := binary.Read(r, binary.LittleEndian, &offset); err != nil {
			return "", err
		}
		if _, err := r.Seek(4, io.SeekCurrent); err != nil {
			return "", err
		}
		name, err := read4ByteAlignedString(r)
		if err != nil {
			return "", err
		}

		if name == "#Pdb" {
			if _, err := r.Seek(int64(offset), io.SeekStart); err != nil {
				return "", err
			}

			b := make([]byte, 16)
			if _, err := r.Read(b); err != nil {
				return "", err
			}

			data1 := binary.LittleEndian.Uint32(b[0:4])
			data2 := binary.LittleEndian.Uint16(b[4:6])
			data3 := binary.LittleEndian.Uint16(b[6:8])
			data4 := b[8:16]

			return fmt.Sprintf("%08x%04x%04x%04x%012x", data1, data2, data3, data4[:2], data4[2:]), nil
		}
	}

	return "", ErrMissingPdbStream
}
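Not part of the diff above: a minimal sketch of how ExtractPortablePdb could be driven from other Go code, assuming a .snupkg file on disk. The file name and the standalone main wrapper are illustrative only; the actual caller is presumably the NuGet symbol-package upload handler added elsewhere in this PR.

package main

import (
	"fmt"
	"log"
	"os"

	"code.gitea.io/gitea/modules/packages/nuget"
)

func main() {
	f, err := os.Open("mypackage.snupkg") // hypothetical input file
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	fi, err := f.Stat()
	if err != nil {
		log.Fatal(err)
	}

	// ExtractPortablePdb needs an io.ReaderAt plus the archive size; *os.File provides both.
	pdbs, err := nuget.ExtractPortablePdb(f, fi.Size())
	if err != nil {
		log.Fatal(err)
	}
	defer pdbs.Close()

	for _, pdb := range pdbs {
		fmt.Printf("%s -> PDB ID %s\n", pdb.Name, pdb.ID)
	}
}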
modules/packages/nuget/symbol_extractor_test.go (new file)
@@ -0,0 +1,82 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package nuget

import (
	"archive/zip"
	"bytes"
	"encoding/base64"
	"testing"

	"github.com/stretchr/testify/assert"
)

const pdbContent = `QlNKQgEAAQAAAAAADAAAAFBEQiB2MS4wAAAAAAAABgB8AAAAWAAAACNQZGIAAAAA1AAAAAgBAAAj
fgAA3AEAAAQAAAAjU3RyaW5ncwAAAADgAQAABAAAACNVUwDkAQAAMAAAACNHVUlEAAAAFAIAACgB
AAAjQmxvYgAAAGm7ENm9SGxMtAFVvPUsPJTF6PbtAAAAAFcVogEJAAAAAQAAAA==`

func TestExtractPortablePdb(t *testing.T) {
	createArchive := func(name string, content []byte) []byte {
		var buf bytes.Buffer
		archive := zip.NewWriter(&buf)
		w, _ := archive.Create(name)
		w.Write(content)
		archive.Close()
		return buf.Bytes()
	}

	t.Run("MissingPdbFiles", func(t *testing.T) {
		var buf bytes.Buffer
		zip.NewWriter(&buf).Close()

		pdbs, err := ExtractPortablePdb(bytes.NewReader(buf.Bytes()), int64(buf.Len()))
		assert.ErrorIs(t, err, ErrMissingPdbFiles)
		assert.Empty(t, pdbs)
	})

	t.Run("InvalidFiles", func(t *testing.T) {
		data := createArchive("sub/test.bin", []byte{})

		pdbs, err := ExtractPortablePdb(bytes.NewReader(data), int64(len(data)))
		assert.ErrorIs(t, err, ErrInvalidFiles)
		assert.Empty(t, pdbs)
	})

	t.Run("Valid", func(t *testing.T) {
		b, _ := base64.StdEncoding.DecodeString(pdbContent)
		data := createArchive("test.pdb", b)

		pdbs, err := ExtractPortablePdb(bytes.NewReader(data), int64(len(data)))
		assert.NoError(t, err)
		assert.Len(t, pdbs, 1)
		assert.Equal(t, "test.pdb", pdbs[0].Name)
		assert.Equal(t, "d910bb6948bd4c6cb40155bcf52c3c94", pdbs[0].ID)
		pdbs.Close()
	})
}

func TestParseDebugHeaderID(t *testing.T) {
	t.Run("InvalidPdbMagicNumber", func(t *testing.T) {
		id, err := ParseDebugHeaderID(bytes.NewReader([]byte{0, 0, 0, 0}))
		assert.ErrorIs(t, err, ErrInvalidPdbMagicNumber)
		assert.Empty(t, id)
	})

	t.Run("MissingPdbStream", func(t *testing.T) {
		b, _ := base64.StdEncoding.DecodeString(`QlNKQgEAAQAAAAAADAAAAFBEQiB2MS4wAAAAAAAAAQB8AAAAWAAAACNVUwA=`)

		id, err := ParseDebugHeaderID(bytes.NewReader(b))
		assert.ErrorIs(t, err, ErrMissingPdbStream)
		assert.Empty(t, id)
	})

	t.Run("Valid", func(t *testing.T) {
		b, _ := base64.StdEncoding.DecodeString(pdbContent)

		id, err := ParseDebugHeaderID(bytes.NewReader(b))
		assert.NoError(t, err)
		assert.Equal(t, "d910bb6948bd4c6cb40155bcf52c3c94", id)
	})
}
modules/packages/pypi/metadata.go (new file)
@@ -0,0 +1,16 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package pypi

// Metadata represents the metadata of a PyPI package
type Metadata struct {
	Author          string `json:"author,omitempty"`
	Description     string `json:"description,omitempty"`
	LongDescription string `json:"long_description,omitempty"`
	Summary         string `json:"summary,omitempty"`
	ProjectURL      string `json:"project_url,omitempty"`
	License         string `json:"license,omitempty"`
	RequiresPython  string `json:"requires_python,omitempty"`
}
modules/packages/rubygems/marshal.go (new file)
@@ -0,0 +1,311 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package rubygems

import (
	"bufio"
	"bytes"
	"errors"
	"io"
	"reflect"
)

const (
	majorVersion = 4
	minorVersion = 8

	typeNil         = '0'
	typeTrue        = 'T'
	typeFalse       = 'F'
	typeFixnum      = 'i'
	typeString      = '"'
	typeSymbol      = ':'
	typeSymbolLink  = ';'
	typeArray       = '['
	typeIVar        = 'I'
	typeUserMarshal = 'U'
	typeUserDef     = 'u'
	typeObject      = 'o'
)

var (
	// ErrUnsupportedType indicates an unsupported type
	ErrUnsupportedType = errors.New("Type is unsupported")
	// ErrInvalidIntRange indicates an invalid number range
	ErrInvalidIntRange = errors.New("Number is not in valid range")
)

// RubyUserMarshal is a Ruby object that has a marshal_load function.
type RubyUserMarshal struct {
	Name  string
	Value interface{}
}

// RubyUserDef is a Ruby object that has a _load function.
type RubyUserDef struct {
	Name  string
	Value interface{}
}

// RubyObject is a default Ruby object.
type RubyObject struct {
	Name   string
	Member map[string]interface{}
}

// MarshalEncoder mimics Rubys Marshal class.
// Note: Only supports types used by the RubyGems package registry.
type MarshalEncoder struct {
	w       *bufio.Writer
	symbols map[string]int
}

// NewMarshalEncoder creates a new MarshalEncoder
func NewMarshalEncoder(w io.Writer) *MarshalEncoder {
	return &MarshalEncoder{
		w:       bufio.NewWriter(w),
		symbols: map[string]int{},
	}
}

// Encode encodes the given type
func (e *MarshalEncoder) Encode(v interface{}) error {
	if _, err := e.w.Write([]byte{majorVersion, minorVersion}); err != nil {
		return err
	}

	if err := e.marshal(v); err != nil {
		return err
	}

	return e.w.Flush()
}

func (e *MarshalEncoder) marshal(v interface{}) error {
	if v == nil {
		return e.marshalNil()
	}

	val := reflect.ValueOf(v)
	typ := reflect.TypeOf(v)

	if typ.Kind() == reflect.Ptr {
		val = val.Elem()
		typ = typ.Elem()
	}

	switch typ.Kind() {
	case reflect.Bool:
		return e.marshalBool(val.Bool())
	case reflect.Int, reflect.Int8, reflect.Int16, reflect.Int32:
		return e.marshalInt(val.Int())
	case reflect.String:
		return e.marshalString(val.String())
	case reflect.Slice, reflect.Array:
		return e.marshalArray(val)
	}

	switch typ.Name() {
	case "RubyUserMarshal":
		return e.marshalUserMarshal(val.Interface().(RubyUserMarshal))
	case "RubyUserDef":
		return e.marshalUserDef(val.Interface().(RubyUserDef))
	case "RubyObject":
		return e.marshalObject(val.Interface().(RubyObject))
	}

	return ErrUnsupportedType
}

func (e *MarshalEncoder) marshalNil() error {
	return e.w.WriteByte(typeNil)
}

func (e *MarshalEncoder) marshalBool(b bool) error {
	if b {
		return e.w.WriteByte(typeTrue)
	}
	return e.w.WriteByte(typeFalse)
}

func (e *MarshalEncoder) marshalInt(i int64) error {
	if err := e.w.WriteByte(typeFixnum); err != nil {
		return err
	}

	return e.marshalIntInternal(i)
}

func (e *MarshalEncoder) marshalIntInternal(i int64) error {
	if i == 0 {
		return e.w.WriteByte(0)
	} else if 0 < i && i < 123 {
		return e.w.WriteByte(byte(i + 5))
	} else if -124 < i && i <= -1 {
		return e.w.WriteByte(byte(i - 5))
	}

	var len int
	if 122 < i && i <= 0xff {
		len = 1
	} else if 0xff < i && i <= 0xffff {
		len = 2
	} else if 0xffff < i && i <= 0xffffff {
		len = 3
	} else if 0xffffff < i && i <= 0x3fffffff {
		len = 4
	} else if -0x100 <= i && i < -123 {
		len = -1
	} else if -0x10000 <= i && i < -0x100 {
		len = -2
	} else if -0x1000000 <= i && i < -0x100000 {
		len = -3
	} else if -0x40000000 <= i && i < -0x1000000 {
		len = -4
	} else {
		return ErrInvalidIntRange
	}

	if err := e.w.WriteByte(byte(len)); err != nil {
		return err
	}
	if len < 0 {
		len = -len
	}

	for c := 0; c < len; c++ {
		if err := e.w.WriteByte(byte(i >> uint(8*c) & 0xff)); err != nil {
			return err
		}
	}

	return nil
}

func (e *MarshalEncoder) marshalString(str string) error {
	if err := e.w.WriteByte(typeIVar); err != nil {
		return err
	}

	if err := e.marshalRawString(str); err != nil {
		return err
	}

	if err := e.marshalIntInternal(1); err != nil {
		return err
	}

	if err := e.marshalSymbol("E"); err != nil {
		return err
	}

	return e.marshalBool(true)
}

func (e *MarshalEncoder) marshalRawString(str string) error {
	if err := e.w.WriteByte(typeString); err != nil {
		return err
	}

	if err := e.marshalIntInternal(int64(len(str))); err != nil {
		return err
	}

	_, err := e.w.WriteString(str)
	return err
}

func (e *MarshalEncoder) marshalSymbol(str string) error {
	if index, ok := e.symbols[str]; ok {
		if err := e.w.WriteByte(typeSymbolLink); err != nil {
			return err
		}
		return e.marshalIntInternal(int64(index))
	}

	e.symbols[str] = len(e.symbols)

	if err := e.w.WriteByte(typeSymbol); err != nil {
		return err
	}

	if err := e.marshalIntInternal(int64(len(str))); err != nil {
		return err
	}

	_, err := e.w.WriteString(str)
	return err
}

func (e *MarshalEncoder) marshalArray(arr reflect.Value) error {
	if err := e.w.WriteByte(typeArray); err != nil {
		return err
	}

	len := arr.Len()

	if err := e.marshalIntInternal(int64(len)); err != nil {
		return err
	}

	for i := 0; i < len; i++ {
		if err := e.marshal(arr.Index(i).Interface()); err != nil {
			return err
		}
	}
	return nil
}

func (e *MarshalEncoder) marshalUserMarshal(userMarshal RubyUserMarshal) error {
	if err := e.w.WriteByte(typeUserMarshal); err != nil {
		return err
	}

	if err := e.marshalSymbol(userMarshal.Name); err != nil {
		return err
	}

	return e.marshal(userMarshal.Value)
}

func (e *MarshalEncoder) marshalUserDef(userDef RubyUserDef) error {
	var buf bytes.Buffer
	if err := NewMarshalEncoder(&buf).Encode(userDef.Value); err != nil {
		return err
	}

	if err := e.w.WriteByte(typeUserDef); err != nil {
		return err
	}
	if err := e.marshalSymbol(userDef.Name); err != nil {
		return err
	}
	if err := e.marshalIntInternal(int64(buf.Len())); err != nil {
		return err
	}
	_, err := e.w.Write(buf.Bytes())
	return err
}

func (e *MarshalEncoder) marshalObject(obj RubyObject) error {
	if err := e.w.WriteByte(typeObject); err != nil {
		return err
	}
	if err := e.marshalSymbol(obj.Name); err != nil {
		return err
	}
	if err := e.marshalIntInternal(int64(len(obj.Member))); err != nil {
		return err
	}
	for k, v := range obj.Member {
		if err := e.marshalSymbol(k); err != nil {
			return err
		}
		if err := e.marshal(v); err != nil {
			return err
		}
	}
	return nil
}
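Not part of the diff: a small usage sketch of the encoder above. The example values, and the idea that a RubyGems registry endpoint answers with a marshalled Ruby array, are illustrative only; the only things taken from the code above are NewMarshalEncoder and Encode.

package main

import (
	"bytes"
	"fmt"

	"code.gitea.io/gitea/modules/packages/rubygems"
)

func main() {
	var buf bytes.Buffer

	// Encode a Ruby array of two strings into the Marshal 4.8 binary format.
	if err := rubygems.NewMarshalEncoder(&buf).Encode([]string{"1.0.0", "1.0.1"}); err != nil {
		panic(err)
	}

	// The stream always starts with the format version bytes 4 and 8.
	fmt.Printf("% x\n", buf.Bytes())
}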
modules/packages/rubygems/marshal_test.go (new file)
@@ -0,0 +1,99 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package rubygems

import (
	"bytes"
	"testing"

	"github.com/stretchr/testify/assert"
)

func TestMinimalEncoder(t *testing.T) {
	cases := []struct {
		Value    interface{}
		Expected []byte
		Error    error
	}{
		{
			Value:    nil,
			Expected: []byte{4, 8, 0x30},
		},
		{
			Value:    true,
			Expected: []byte{4, 8, 'T'},
		},
		{
			Value:    false,
			Expected: []byte{4, 8, 'F'},
		},
		{
			Value:    0,
			Expected: []byte{4, 8, 'i', 0},
		},
		{
			Value:    1,
			Expected: []byte{4, 8, 'i', 6},
		},
		{
			Value:    -1,
			Expected: []byte{4, 8, 'i', 0xfa},
		},
		{
			Value:    0x1fffffff,
			Expected: []byte{4, 8, 'i', 4, 0xff, 0xff, 0xff, 0x1f},
		},
		{
			Value: 0x41000000,
			Error: ErrInvalidIntRange,
		},
		{
			Value:    "test",
			Expected: []byte{4, 8, 'I', '"', 9, 't', 'e', 's', 't', 6, ':', 6, 'E', 'T'},
		},
		{
			Value:    []int{1, 2},
			Expected: []byte{4, 8, '[', 7, 'i', 6, 'i', 7},
		},
		{
			Value: &RubyUserMarshal{
				Name:  "Test",
				Value: 4,
			},
			Expected: []byte{4, 8, 'U', ':', 9, 'T', 'e', 's', 't', 'i', 9},
		},
		{
			Value: &RubyUserDef{
				Name:  "Test",
				Value: 4,
			},
			Expected: []byte{4, 8, 'u', ':', 9, 'T', 'e', 's', 't', 9, 4, 8, 'i', 9},
		},
		{
			Value: &RubyObject{
				Name: "Test",
				Member: map[string]interface{}{
					"test": 4,
				},
			},
			Expected: []byte{4, 8, 'o', ':', 9, 'T', 'e', 's', 't', 6, ':', 9, 't', 'e', 's', 't', 'i', 9},
		},
		{
			Value: &struct {
				Name string
			}{
				"test",
			},
			Error: ErrUnsupportedType,
		},
	}

	for i, c := range cases {
		var b bytes.Buffer
		err := NewMarshalEncoder(&b).Encode(c.Value)
		assert.ErrorIs(t, err, c.Error)
		assert.Equal(t, c.Expected, b.Bytes(), "case %d", i)
	}
}
modules/packages/rubygems/metadata.go (new file)
@@ -0,0 +1,222 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package rubygems

import (
	"archive/tar"
	"compress/gzip"
	"errors"
	"io"
	"regexp"
	"strings"

	"code.gitea.io/gitea/modules/validation"

	"gopkg.in/yaml.v2"
)

var (
	// ErrMissingMetadataFile indicates a missing metadata.gz file
	ErrMissingMetadataFile = errors.New("Metadata file is missing")
	// ErrInvalidName indicates an invalid id in the metadata.gz file
	ErrInvalidName = errors.New("Metadata file contains an invalid name")
	// ErrInvalidVersion indicates an invalid version in the metadata.gz file
	ErrInvalidVersion = errors.New("Metadata file contains an invalid version")
)

var versionMatcher = regexp.MustCompile(`\A[0-9]+(?:\.[0-9a-zA-Z]+)*(?:-[0-9A-Za-z-]+(?:\.[0-9A-Za-z-]+)*)?\z`)

// Package represents a RubyGems package
type Package struct {
	Name     string
	Version  string
	Metadata *Metadata
}

// Metadata represents the metadata of a RubyGems package
type Metadata struct {
	Platform                string               `json:"platform,omitempty"`
	Description             string               `json:"description,omitempty"`
	Summary                 string               `json:"summary,omitempty"`
	Authors                 []string             `json:"authors,omitempty"`
	Licenses                []string             `json:"licenses,omitempty"`
	RequiredRubyVersion     []VersionRequirement `json:"required_ruby_version,omitempty"`
	RequiredRubygemsVersion []VersionRequirement `json:"required_rubygems_version,omitempty"`
	ProjectURL              string               `json:"project_url,omitempty"`
	RuntimeDependencies     []Dependency         `json:"runtime_dependencies,omitempty"`
	DevelopmentDependencies []Dependency         `json:"development_dependencies,omitempty"`
}

// VersionRequirement represents a version restriction
type VersionRequirement struct {
	Restriction string `json:"restriction"`
	Version     string `json:"version"`
}

// Dependency represents a dependency of a RubyGems package
type Dependency struct {
	Name    string               `json:"name"`
	Version []VersionRequirement `json:"version"`
}

type gemspec struct {
	Name    string `yaml:"name"`
	Version struct {
		Version string `yaml:"version"`
	} `yaml:"version"`
	Platform     string        `yaml:"platform"`
	Authors      []string      `yaml:"authors"`
	Autorequire  interface{}   `yaml:"autorequire"`
	Bindir       string        `yaml:"bindir"`
	CertChain    []interface{} `yaml:"cert_chain"`
	Date         string        `yaml:"date"`
	Dependencies []struct {
		Name                string      `yaml:"name"`
		Requirement         requirement `yaml:"requirement"`
		Type                string      `yaml:"type"`
		Prerelease          bool        `yaml:"prerelease"`
		VersionRequirements requirement `yaml:"version_requirements"`
	} `yaml:"dependencies"`
	Description    string        `yaml:"description"`
	Email          string        `yaml:"email"`
	Executables    []string      `yaml:"executables"`
	Extensions     []interface{} `yaml:"extensions"`
	ExtraRdocFiles []string      `yaml:"extra_rdoc_files"`
	Files          []string      `yaml:"files"`
	Homepage       string        `yaml:"homepage"`
	Licenses       []string      `yaml:"licenses"`
	Metadata       struct {
		BugTrackerURI    string `yaml:"bug_tracker_uri"`
		ChangelogURI     string `yaml:"changelog_uri"`
		DocumentationURI string `yaml:"documentation_uri"`
		SourceCodeURI    string `yaml:"source_code_uri"`
	} `yaml:"metadata"`
	PostInstallMessage      interface{}   `yaml:"post_install_message"`
	RdocOptions             []interface{} `yaml:"rdoc_options"`
	RequirePaths            []string      `yaml:"require_paths"`
	RequiredRubyVersion     requirement   `yaml:"required_ruby_version"`
	RequiredRubygemsVersion requirement   `yaml:"required_rubygems_version"`
	Requirements            []interface{} `yaml:"requirements"`
	RubygemsVersion         string        `yaml:"rubygems_version"`
	SigningKey              interface{}   `yaml:"signing_key"`
	SpecificationVersion    int           `yaml:"specification_version"`
	Summary                 string        `yaml:"summary"`
	TestFiles               []interface{} `yaml:"test_files"`
}

type requirement struct {
	Requirements [][]interface{} `yaml:"requirements"`
}

// AsVersionRequirement converts into []VersionRequirement
func (r requirement) AsVersionRequirement() []VersionRequirement {
	requirements := make([]VersionRequirement, 0, len(r.Requirements))
	for _, req := range r.Requirements {
		if len(req) != 2 {
			continue
		}
		restriction, ok := req[0].(string)
		if !ok {
			continue
		}
		vm, ok := req[1].(map[interface{}]interface{})
		if !ok {
			continue
		}
		versionInt, ok := vm["version"]
		if !ok {
			continue
		}
		version, ok := versionInt.(string)
		if !ok || version == "0" {
			continue
		}

		requirements = append(requirements, VersionRequirement{
			Restriction: restriction,
			Version:     version,
		})
	}
	return requirements
}

// ParsePackageMetaData parses the metadata of a Gem package file
func ParsePackageMetaData(r io.Reader) (*Package, error) {
	archive := tar.NewReader(r)
	for {
		hdr, err := archive.Next()
		if err == io.EOF {
			break
		}
		if err != nil {
			return nil, err
		}

		if hdr.Name == "metadata.gz" {
			return parseMetadataFile(archive)
		}
	}

	return nil, ErrMissingMetadataFile
}

func parseMetadataFile(r io.Reader) (*Package, error) {
	zr, err := gzip.NewReader(r)
	if err != nil {
		return nil, err
	}
	defer zr.Close()

	var spec gemspec
	if err := yaml.NewDecoder(zr).Decode(&spec); err != nil {
		return nil, err
	}

	if len(spec.Name) == 0 || strings.Contains(spec.Name, "/") {
		return nil, ErrInvalidName
	}

	if !versionMatcher.MatchString(spec.Version.Version) {
		return nil, ErrInvalidVersion
	}

	if !validation.IsValidURL(spec.Homepage) {
		spec.Homepage = ""
	}
	if !validation.IsValidURL(spec.Metadata.SourceCodeURI) {
		spec.Metadata.SourceCodeURI = ""
	}

	m := &Metadata{
		Platform:                spec.Platform,
		Description:             spec.Description,
		Summary:                 spec.Summary,
		Authors:                 spec.Authors,
		Licenses:                spec.Licenses,
		ProjectURL:              spec.Homepage,
		RequiredRubyVersion:     spec.RequiredRubyVersion.AsVersionRequirement(),
		RequiredRubygemsVersion: spec.RequiredRubygemsVersion.AsVersionRequirement(),
		DevelopmentDependencies: make([]Dependency, 0, 5),
		RuntimeDependencies:     make([]Dependency, 0, 5),
	}

	for _, gemdep := range spec.Dependencies {
		dep := Dependency{
			Name:    gemdep.Name,
			Version: gemdep.Requirement.AsVersionRequirement(),
		}
		if gemdep.Type == ":runtime" {
			m.RuntimeDependencies = append(m.RuntimeDependencies, dep)
		} else {
			m.DevelopmentDependencies = append(m.DevelopmentDependencies, dep)
		}
	}

	return &Package{
		Name:     spec.Name,
		Version:  spec.Version.Version,
		Metadata: m,
	}, nil
}
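Not part of the diff: a minimal sketch of calling the parser above on a .gem file read from disk. The file name is made up; in the registry itself the reader presumably comes from the upload request body.

package main

import (
	"fmt"
	"os"

	"code.gitea.io/gitea/modules/packages/rubygems"
)

func main() {
	f, err := os.Open("example-1.0.0.gem") // hypothetical gem file
	if err != nil {
		panic(err)
	}
	defer f.Close()

	// A .gem file is a tar archive; ParsePackageMetaData looks for metadata.gz inside it.
	pkg, err := rubygems.ParsePackageMetaData(f)
	if err != nil {
		panic(err)
	}

	fmt.Println(pkg.Name, pkg.Version, pkg.Metadata.Summary)
}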
modules/packages/rubygems/metadata_test.go (new file)
@@ -0,0 +1,89 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package rubygems

import (
	"archive/tar"
	"bytes"
	"encoding/base64"
	"io"
	"testing"

	"github.com/stretchr/testify/assert"
)

func TestParsePackageMetaData(t *testing.T) {
	createArchive := func(filename string, content []byte) io.Reader {
		var buf bytes.Buffer
		tw := tar.NewWriter(&buf)
		hdr := &tar.Header{
			Name: filename,
			Mode: 0o600,
			Size: int64(len(content)),
		}
		tw.WriteHeader(hdr)
		tw.Write(content)
		tw.Close()
		return &buf
	}

	t.Run("MissingMetadataFile", func(t *testing.T) {
		data := createArchive("dummy.txt", []byte{0})

		rp, err := ParsePackageMetaData(data)
		assert.ErrorIs(t, err, ErrMissingMetadataFile)
		assert.Nil(t, rp)
	})

	t.Run("Valid", func(t *testing.T) {
		content, _ := base64.StdEncoding.DecodeString("H4sICHC/I2EEAG1ldGFkYXRhAAEeAOH/bmFtZTogZwp2ZXJzaW9uOgogIHZlcnNpb246IDEKWw35Tx4AAAA=")
		data := createArchive("metadata.gz", content)

		rp, err := ParsePackageMetaData(data)
		assert.NoError(t, err)
		assert.NotNil(t, rp)
	})
}

func TestParseMetadataFile(t *testing.T) {
	content, _ := base64.StdEncoding.DecodeString(`H4sIAMe7I2ECA9VVTW/UMBC9+1eYXvaUbJpSQBZUHJAqDlwK4kCFIseZzZrGH9iTqisEv52Js9nd
0KqggiqRXWnX45n3ZuZ5nCzL+JPQ15ulq7+AQnEORoj3HpReaSVRO8usNCB4qxEku4YQySbuCPo4
bjHOd07HeZGfMt9JXLlgBB9imOxx7UIULOPnCZMMLsDXXgeiYbW2jQ6C0y9TELBSa6kJ6/IzaySS
R1mUx1nxIitPeFGI9M2L6eGfWAMebANWaUgktzN9M3lsKNmxutBb1AYyCibbNhsDFu+q9GK/Tc4z
d2IcLBl9js5eHaXFsLyvXeNz0LQyL/YoLx8EsiCMBZlx46k6sS2PDD5AgA5kJPNKdhH2elWzOv7n
uv9Q9Aau/6ngP84elvNpXh5oRVlB5/yW7BH0+qu0G4gqaI/JdEHBFBS5l+pKtsARIjIwUnfj8Le0
+TrdJLl2DG5A9SjrjgZ1mG+4QbAD+G4ZZBUap6qVnnzGf6Rwp+vliBRqtnYGPBEKvkb0USyXE8mS
dVoR6hj07u0HZgAl3SRS8G/fmXcRK20jyq6rDMSYQFgidamqkXbbuspLXE/0k7GphtKqe67GuRC/
yjAbmt9LsOMp8xMamFkSQ38fP5EFjdz8LA4do2C69VvqWXAJgrPbKZb58/xZXrKoW6ttW13Bhvzi
4ftn7/yUxd4YGcglvTmmY8aGY3ZwRn4CqcWcidUGAAA=`)
	rp, err := parseMetadataFile(bytes.NewReader(content))
	assert.NoError(t, err)
	assert.NotNil(t, rp)

	assert.Equal(t, "gitea", rp.Name)
	assert.Equal(t, "1.0.5", rp.Version)
	assert.Equal(t, "ruby", rp.Metadata.Platform)
	assert.Equal(t, "Gitea package", rp.Metadata.Summary)
	assert.Equal(t, "RubyGems package test", rp.Metadata.Description)
	assert.Equal(t, []string{"Gitea"}, rp.Metadata.Authors)
	assert.Equal(t, "https://gitea.io/", rp.Metadata.ProjectURL)
	assert.Equal(t, []string{"MIT"}, rp.Metadata.Licenses)
	assert.Empty(t, rp.Metadata.RequiredRubygemsVersion)
	assert.Len(t, rp.Metadata.RequiredRubyVersion, 1)
	assert.Equal(t, ">=", rp.Metadata.RequiredRubyVersion[0].Restriction)
	assert.Equal(t, "2.3.0", rp.Metadata.RequiredRubyVersion[0].Version)
	assert.Len(t, rp.Metadata.RuntimeDependencies, 1)
	assert.Equal(t, "runtime-dep", rp.Metadata.RuntimeDependencies[0].Name)
	assert.Len(t, rp.Metadata.RuntimeDependencies[0].Version, 2)
	assert.Equal(t, ">=", rp.Metadata.RuntimeDependencies[0].Version[0].Restriction)
	assert.Equal(t, "1.2.0", rp.Metadata.RuntimeDependencies[0].Version[0].Version)
	assert.Equal(t, "<", rp.Metadata.RuntimeDependencies[0].Version[1].Restriction)
	assert.Equal(t, "2.0", rp.Metadata.RuntimeDependencies[0].Version[1].Version)
	assert.Len(t, rp.Metadata.DevelopmentDependencies, 1)
	assert.Equal(t, "dev-dep", rp.Metadata.DevelopmentDependencies[0].Name)
	assert.Len(t, rp.Metadata.DevelopmentDependencies[0].Version, 1)
	assert.Equal(t, "~>", rp.Metadata.DevelopmentDependencies[0].Version[0].Restriction)
	assert.Equal(t, "5.2", rp.Metadata.DevelopmentDependencies[0].Version[0].Version)
}
modules/setting/packages.go (new file)
@@ -0,0 +1,47 @@
// Copyright 2022 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package setting

import (
	"os"
	"path/filepath"

	"code.gitea.io/gitea/modules/log"
)

// Package registry settings
var (
	Packages = struct {
		Storage
		Enabled           bool
		ChunkedUploadPath string
		RegistryHost      string
	}{
		Enabled: true,
	}
)

func newPackages() {
	sec := Cfg.Section("packages")
	if err := sec.MapTo(&Packages); err != nil {
		log.Fatal("Failed to map Packages settings: %v", err)
	}

	Packages.Storage = getStorage("packages", "", nil)

	Packages.RegistryHost = Domain
	if (Protocol == HTTP && HTTPPort != "80") || (Protocol == HTTPS && HTTPPort != "443") {
		Packages.RegistryHost += ":" + HTTPPort
	}

	Packages.ChunkedUploadPath = filepath.ToSlash(sec.Key("CHUNKED_UPLOAD_PATH").MustString("tmp/package-upload"))
	if !filepath.IsAbs(Packages.ChunkedUploadPath) {
		Packages.ChunkedUploadPath = filepath.ToSlash(filepath.Join(AppDataPath, Packages.ChunkedUploadPath))
	}

	if err := os.MkdirAll(Packages.ChunkedUploadPath, os.ModePerm); err != nil {
		log.Error("Unable to create chunked upload directory: %s (%v)", Packages.ChunkedUploadPath, err)
	}
}
@@ -212,6 +212,7 @@ var (
 		MembersPagingNum   int
 		FeedMaxCommitNum   int
 		FeedPagingNum      int
+		PackagesPagingNum  int
 		GraphMaxCommitNum  int
 		CodeCommentLines   int
 		ReactionMaxUserNum int
@@ -264,6 +265,7 @@ var (
 		MembersPagingNum:   20,
 		FeedMaxCommitNum:   5,
 		FeedPagingNum:      20,
+		PackagesPagingNum:  20,
 		GraphMaxCommitNum:  100,
 		CodeCommentLines:   4,
 		ReactionMaxUserNum: 10,
@@ -1016,6 +1018,8 @@ func loadFromConf(allowEmpty bool, extraConfig string) {
 	newPictureService()
 
+	newPackages()
+
 	if err = Cfg.Section("ui").MapTo(&UI); err != nil {
 		log.Fatal("Failed to map UI settings: %v", err)
 	} else if err = Cfg.Section("markdown").MapTo(&Markdown); err != nil {
@@ -123,6 +123,9 @@ var (
 	// RepoArchives represents repository archives storage
 	RepoArchives ObjectStorage
+
+	// Packages represents packages storage
+	Packages ObjectStorage
 )
 
 // Init init the storage
@@ -143,7 +146,11 @@ func Init() error {
 		return err
 	}
 
-	return initRepoArchives()
+	if err := initRepoArchives(); err != nil {
+		return err
+	}
+
+	return initPackages()
 }
@@ -188,3 +195,9 @@ func initRepoArchives() (err error) {
 	RepoArchives, err = NewStorage(setting.RepoArchive.Storage.Type, &setting.RepoArchive.Storage)
 	return
 }
+
+func initPackages() (err error) {
+	log.Info("Initialising Packages storage with type: %s", setting.Packages.Storage.Type)
+	Packages, err = NewStorage(setting.Packages.Storage.Type, &setting.Packages.Storage)
+	return
+}
@@ -110,6 +110,7 @@ var (
 	_ Payloader = &PullRequestPayload{}
 	_ Payloader = &RepositoryPayload{}
 	_ Payloader = &ReleasePayload{}
+	_ Payloader = &PackagePayload{}
 )
 
 // _________ __
@@ -425,3 +426,27 @@ type RepositoryPayload struct {
 func (p *RepositoryPayload) JSONPayload() ([]byte, error) {
 	return json.MarshalIndent(p, "", "  ")
 }
+
+// HookPackageAction an action that happens to a package
+type HookPackageAction string
+
+const (
+	// HookPackageCreated created
+	HookPackageCreated HookPackageAction = "created"
+	// HookPackageDeleted deleted
+	HookPackageDeleted HookPackageAction = "deleted"
+)
+
+// PackagePayload represents a package payload
+type PackagePayload struct {
+	Action       HookPackageAction `json:"action"`
+	Repository   *Repository       `json:"repository"`
+	Package      *Package          `json:"package"`
+	Organization *User             `json:"organization"`
+	Sender       *User             `json:"sender"`
+}
+
+// JSONPayload implements Payload
+func (p *PackagePayload) JSONPayload() ([]byte, error) {
+	return json.MarshalIndent(p, "", "  ")
+}
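Not part of the diff: a short sketch of what the new webhook payload type serializes to. The field values are made up, and the `api` alias for code.gitea.io/gitea/modules/structs is just a common convention, not something introduced by this PR.

package main

import (
	"fmt"

	api "code.gitea.io/gitea/modules/structs"
)

func main() {
	p := &api.PackagePayload{
		Action: api.HookPackageCreated,
		Package: &api.Package{
			Type:    "nuget",
			Name:    "Example.Package", // made-up values
			Version: "1.0.0",
		},
	}

	// JSONPayload uses json.MarshalIndent, so the webhook body is pretty-printed.
	b, _ := p.JSONPayload()
	fmt.Println(string(b))
}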
modules/structs/package.go (new file)
@@ -0,0 +1,33 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package structs

import (
	"time"
)

// Package represents a package
type Package struct {
	ID         int64       `json:"id"`
	Owner      *User       `json:"owner"`
	Repository *Repository `json:"repository"`
	Creator    *User       `json:"creator"`
	Type       string      `json:"type"`
	Name       string      `json:"name"`
	Version    string      `json:"version"`
	// swagger:strfmt date-time
	CreatedAt time.Time `json:"created_at"`
}

// PackageFile represents a package file
type PackageFile struct {
	ID         int64 `json:"id"`
	Size       int64
	Name       string `json:"name"`
	HashMD5    string `json:"md5"`
	HashSHA1   string `json:"sha1"`
	HashSHA256 string `json:"sha256"`
	HashSHA512 string `json:"sha512"`
}
@@ -34,6 +34,7 @@ import (
 	"code.gitea.io/gitea/modules/json"
 	"code.gitea.io/gitea/modules/log"
 	"code.gitea.io/gitea/modules/markup"
+	"code.gitea.io/gitea/modules/markup/markdown"
 	"code.gitea.io/gitea/modules/repository"
 	"code.gitea.io/gitea/modules/setting"
 	"code.gitea.io/gitea/modules/svg"
@@ -161,7 +162,16 @@ func NewFuncMap() []template.FuncMap {
 		"RenderEmojiPlain": emoji.ReplaceAliases,
 		"ReactionToEmoji":  ReactionToEmoji,
 		"RenderNote":       RenderNote,
+		"RenderMarkdownToHtml": func(input string) template.HTML {
+			output, err := markdown.RenderString(&markup.RenderContext{
+				URLPrefix: setting.AppSubURL,
+			}, input)
+			if err != nil {
+				log.Error("RenderString: %v", err)
+			}
+			return template.HTML(output)
+		},
 		"IsMultilineCommitMessage": IsMultilineCommitMessage,
 		"ThemeColorMetaTag": func() string {
 			return setting.UI.ThemeColorMetaTag
 		},
modules/util/filebuffer/file_backed_buffer.go (new file)
@@ -0,0 +1,147 @@
// Copyright 2021 The Gitea Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package filebuffer

import (
	"bytes"
	"errors"
	"io"
	"os"
)

const maxInt = int(^uint(0) >> 1) // taken from bytes.Buffer

var (
	// ErrInvalidMemorySize occurs if the memory size is not in a valid range
	ErrInvalidMemorySize = errors.New("Memory size must be greater 0 and lower math.MaxInt32")
	// ErrWriteAfterRead occurs if Write is called after a read operation
	ErrWriteAfterRead = errors.New("Write is unsupported after a read operation")
)

type readAtSeeker interface {
	io.ReadSeeker
	io.ReaderAt
}

// FileBackedBuffer uses a memory buffer with a fixed size.
// If more data is written a temporary file is used instead.
// It implements io.ReadWriteCloser, io.ReadSeekCloser and io.ReaderAt
type FileBackedBuffer struct {
	maxMemorySize int64
	size          int64
	buffer        bytes.Buffer
	file          *os.File
	reader        readAtSeeker
}

// New creates a file backed buffer with a specific maximum memory size
func New(maxMemorySize int) (*FileBackedBuffer, error) {
	if maxMemorySize < 0 || maxMemorySize > maxInt {
		return nil, ErrInvalidMemorySize
	}

	return &FileBackedBuffer{
		maxMemorySize: int64(maxMemorySize),
	}, nil
}

// CreateFromReader creates a file backed buffer and copies the provided reader data into it.
func CreateFromReader(r io.Reader, maxMemorySize int) (*FileBackedBuffer, error) {
	b, err := New(maxMemorySize)
	if err != nil {
		return nil, err
	}

	_, err = io.Copy(b, r)
	if err != nil {
		return nil, err
	}

	return b, nil
}

// Write implements io.Writer
func (b *FileBackedBuffer) Write(p []byte) (int, error) {
	if b.reader != nil {
		return 0, ErrWriteAfterRead
	}

	var n int
	var err error

	if b.file != nil {
		n, err = b.file.Write(p)
	} else {
		if b.size+int64(len(p)) > b.maxMemorySize {
			b.file, err = os.CreateTemp("", "gitea-buffer-")
			if err != nil {
				return 0, err
			}

			_, err = io.Copy(b.file, &b.buffer)
			if err != nil {
				return 0, err
			}

			return b.Write(p)
		}

		n, err = b.buffer.Write(p)
	}

	if err != nil {
		return n, err
	}
	b.size += int64(n)
	return n, nil
}

// Size returns the byte size of the buffered data
func (b *FileBackedBuffer) Size() int64 {
	return b.size
}

func (b *FileBackedBuffer) switchToReader() {
	if b.reader != nil {
		return
	}

	if b.file != nil {
		b.reader = b.file
	} else {
		b.reader = bytes.NewReader(b.buffer.Bytes())
	}
}

// Read implements io.Reader
func (b *FileBackedBuffer) Read(p []byte) (int, error) {
	b.switchToReader()

	return b.reader.Read(p)
}

// ReadAt implements io.ReaderAt
func (b *FileBackedBuffer) ReadAt(p []byte, off int64) (int, error) {
	b.switchToReader()

	return b.reader.ReadAt(p, off)
}

// Seek implements io.Seeker
func (b *FileBackedBuffer) Seek(offset int64, whence int) (int64, error) {
	b.switchToReader()

	return b.reader.Seek(offset, whence)
}

// Close implements io.Closer
func (b *FileBackedBuffer) Close() error {
	if b.file != nil {
		err := b.file.Close()
		os.Remove(b.file.Name())
		return err
	}
	return nil
}
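Not part of the diff: a minimal usage sketch of FileBackedBuffer. The 1 KiB threshold and the input string are arbitrary; the sketch just shows the write-then-read lifecycle and the temporary-file cleanup performed by Close.

package main

import (
	"fmt"
	"io"
	"os"
	"strings"

	"code.gitea.io/gitea/modules/util/filebuffer"
)

func main() {
	// Keep at most 1 KiB in memory; anything larger spills to a temporary file.
	buf, err := filebuffer.CreateFromReader(strings.NewReader("package blob ..."), 1024)
	if err != nil {
		panic(err)
	}
	defer buf.Close() // removes the temporary file if one was created

	fmt.Println("buffered bytes:", buf.Size())

	// After the first read the buffer is read-only; further Write calls return ErrWriteAfterRead.
	if _, err := io.Copy(os.Stdout, buf); err != nil {
		panic(err)
	}
}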
@ -488,7 +488,9 @@ auth_failed = Authentication failed: %v
|
||||||
|
|
||||||
still_own_repo = "Your account owns one or more repositories; delete or transfer them first."
|
still_own_repo = "Your account owns one or more repositories; delete or transfer them first."
|
||||||
still_has_org = "Your account is a member of one or more organizations; leave them first."
|
still_has_org = "Your account is a member of one or more organizations; leave them first."
|
||||||
|
still_own_packages = "Your account owns one or more packages; delete them first."
|
||||||
org_still_own_repo = "This organization still owns one or more repositories; delete or transfer them first."
|
org_still_own_repo = "This organization still owns one or more repositories; delete or transfer them first."
|
||||||
|
org_still_own_packages = "This organization still owns one or more packages; delete them first."
|
||||||
|
|
||||||
target_branch_not_exist = Target branch does not exist.
|
target_branch_not_exist = Target branch does not exist.
|
||||||
|
|
||||||
|
@ -1793,6 +1795,7 @@ settings.pulls.allow_manual_merge = Enable Mark PR as manually merged
|
||||||
settings.pulls.enable_autodetect_manual_merge = Enable autodetect manual merge (Note: In some special cases, misjudgments can occur)
|
settings.pulls.enable_autodetect_manual_merge = Enable autodetect manual merge (Note: In some special cases, misjudgments can occur)
|
||||||
settings.pulls.allow_rebase_update = Enable updating pull request branch by rebase
|
settings.pulls.allow_rebase_update = Enable updating pull request branch by rebase
|
||||||
settings.pulls.default_delete_branch_after_merge = Delete pull request branch after merge by default
|
settings.pulls.default_delete_branch_after_merge = Delete pull request branch after merge by default
|
||||||
|
settings.packages_desc = Enable Repository Packages Registry
|
||||||
settings.projects_desc = Enable Repository Projects
|
settings.projects_desc = Enable Repository Projects
|
||||||
settings.admin_settings = Administrator Settings
|
settings.admin_settings = Administrator Settings
|
||||||
settings.admin_enable_health_check = Enable Repository Health Checks (git fsck)
|
settings.admin_enable_health_check = Enable Repository Health Checks (git fsck)
|
||||||
|
@ -1950,6 +1953,8 @@ settings.event_pull_request_review = Pull Request Reviewed
|
||||||
settings.event_pull_request_review_desc = Pull request approved, rejected, or review comment.
|
settings.event_pull_request_review_desc = Pull request approved, rejected, or review comment.
|
||||||
settings.event_pull_request_sync = Pull Request Synchronized
|
settings.event_pull_request_sync = Pull Request Synchronized
|
||||||
settings.event_pull_request_sync_desc = Pull request synchronized.
|
settings.event_pull_request_sync_desc = Pull request synchronized.
|
||||||
|
settings.event_package = Package
|
||||||
|
settings.event_package_desc = Package created or deleted in a repository.
|
||||||
settings.branch_filter = Branch filter
|
settings.branch_filter = Branch filter
|
||||||
settings.branch_filter_desc = Branch whitelist for push, branch creation and branch deletion events, specified as glob pattern. If empty or <code>*</code>, events for all branches are reported. See <a href="https://pkg.go.dev/github.com/gobwas/glob#Compile">github.com/gobwas/glob</a> documentation for syntax. Examples: <code>master</code>, <code>{master,release*}</code>.
|
settings.branch_filter_desc = Branch whitelist for push, branch creation and branch deletion events, specified as glob pattern. If empty or <code>*</code>, events for all branches are reported. See <a href="https://pkg.go.dev/github.com/gobwas/glob#Compile">github.com/gobwas/glob</a> documentation for syntax. Examples: <code>master</code>, <code>{master,release*}</code>.
|
||||||
settings.active = Active
|
settings.active = Active
|
||||||
|
@ -2431,6 +2436,7 @@ dashboard.resync_all_hooks = Resynchronize pre-receive, update and post-receive
|
||||||
dashboard.reinit_missing_repos = Reinitialize all missing Git repositories for which records exist
|
dashboard.reinit_missing_repos = Reinitialize all missing Git repositories for which records exist
|
||||||
dashboard.sync_external_users = Synchronize external user data
|
dashboard.sync_external_users = Synchronize external user data
|
||||||
dashboard.cleanup_hook_task_table = Cleanup hook_task table
|
dashboard.cleanup_hook_task_table = Cleanup hook_task table
|
||||||
|
dashboard.cleanup_packages = Cleanup expired packages
|
||||||
dashboard.server_uptime = Server Uptime
|
dashboard.server_uptime = Server Uptime
|
||||||
dashboard.current_goroutine = Current Goroutines
|
dashboard.current_goroutine = Current Goroutines
|
||||||
dashboard.current_memory_usage = Current Memory Usage
|
dashboard.current_memory_usage = Current Memory Usage
|
||||||
|
@ -2500,6 +2506,7 @@ users.update_profile = Update User Account
|
||||||
users.delete_account = Delete User Account
|
users.delete_account = Delete User Account
|
||||||
users.still_own_repo = This user still owns one or more repositories. Delete or transfer these repositories first.
|
users.still_own_repo = This user still owns one or more repositories. Delete or transfer these repositories first.
|
||||||
users.still_has_org = This user is a member of an organization. Remove the user from any organizations first.
|
users.still_has_org = This user is a member of an organization. Remove the user from any organizations first.
|
||||||
|
users.still_own_packages = This user still owns one or more packages. Delete these packages first.
|
||||||
users.deletion_success = The user account has been deleted.
|
users.deletion_success = The user account has been deleted.
|
||||||
users.reset_2fa = Reset 2FA
|
users.reset_2fa = Reset 2FA
|
||||||
users.list_status_filter.menu_text = Filter
|
users.list_status_filter.menu_text = Filter
|
||||||
|
@ -2546,6 +2553,17 @@ repos.forks = Forks
|
||||||
repos.issues = Issues
|
repos.issues = Issues
|
||||||
repos.size = Size
|
repos.size = Size
|
||||||
|
|
||||||
|
packages.package_manage_panel = Package Management
|
||||||
|
packages.total_size = Total Size: %s
|
||||||
|
packages.owner = Owner
|
||||||
|
packages.creator = Creator
|
||||||
|
packages.name = Name
|
||||||
|
packages.version = Version
|
||||||
|
packages.type = Type
|
||||||
|
packages.repository = Repository
|
||||||
|
packages.size = Size
|
||||||
|
packages.published = Published
|
||||||
|
|
||||||
defaulthooks = Default Webhooks
|
defaulthooks = Default Webhooks
|
||||||
defaulthooks.desc = Webhooks automatically make HTTP POST requests to a server when certain Gitea events trigger. Webhooks defined here are defaults and will be copied into all new repositories. Read more in the <a target="_blank" rel="noopener" href="https://docs.gitea.io/en-us/webhooks/">webhooks guide</a>.
|
defaulthooks.desc = Webhooks automatically make HTTP POST requests to a server when certain Gitea events trigger. Webhooks defined here are defaults and will be copied into all new repositories. Read more in the <a target="_blank" rel="noopener" href="https://docs.gitea.io/en-us/webhooks/">webhooks guide</a>.
|
||||||
defaulthooks.add_webhook = Add Default Webhook
|
defaulthooks.add_webhook = Add Default Webhook
|
||||||
|
@ -2982,3 +3000,92 @@ error.probable_bad_default_signature = "WARNING! Although the default key has th
|
||||||
unit = Unit
|
unit = Unit
|
||||||
error.no_unit_allowed_repo = You are not allowed to access any section of this repository.
|
error.no_unit_allowed_repo = You are not allowed to access any section of this repository.
|
||||||
error.unit_not_allowed = You are not allowed to access this repository section.
|
error.unit_not_allowed = You are not allowed to access this repository section.
|

[packages]
title = Packages
desc = Manage repository packages.
empty = There are no packages yet.
empty.documentation = For more information on the package registry, see <a target="_blank" rel="noopener noreferrer" href="https://docs.gitea.io/en-us/overview">the documentation</a>.
filter.type = Type
filter.type.all = All
filter.no_result = Your filter produced no results.
filter.container.tagged = Tagged
filter.container.untagged = Untagged
published_by = Published %[1]s by <a href="%[2]s">%[3]s</a>
published_by_in = Published %[1]s by <a href="%[2]s">%[3]s</a> in <a href="%[4]s"><strong>%[5]s</strong></a>
installation = Installation
about = About this package
requirements = Requirements
dependencies = Dependencies
keywords = Keywords
details = Details
details.author = Author
details.project_site = Project Site
details.license = License
assets = Assets
versions = Versions
versions.on = on
versions.view_all = View all
dependency.id = ID
dependency.version = Version
composer.registry = Set up this registry in your <code>~/.composer/config.json</code> file:
composer.install = To install the package using Composer, run the following command:
composer.documentation = For more information on the Composer registry, see <a target="_blank" rel="noopener noreferrer" href="https://docs.gitea.io/en-us/composer/">the documentation</a>.
composer.dependencies = Dependencies
composer.dependencies.development = Development Dependencies
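For reference, a minimal sketch of the config and command the composer.registry and composer.install strings introduce in the UI. gitea.example.com, OWNER, and vendor/package are placeholders, and the /api/packages/OWNER/composer route is assumed to follow the registry API pattern added in this PR:

    ~/.composer/config.json:
    {
      "repositories": [
        { "type": "composer", "url": "https://gitea.example.com/api/packages/OWNER/composer" }
      ]
    }

    composer require vendor/package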
conan.details.repository = Repository
conan.registry = Set up this registry from the command line:
conan.install = To install the package using Conan, run the following command:
conan.documentation = For more information on the Conan registry, see <a target="_blank" rel="noopener noreferrer" href="https://docs.gitea.io/en-us/conan/">the documentation</a>.
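A sketch of the kind of commands the conan.registry and conan.install strings label. The remote name, host, and package reference are placeholders, and Conan v1 client syntax is assumed:

    # add the Gitea Conan registry as a remote (conan.registry)
    conan remote add gitea https://gitea.example.com/api/packages/OWNER/conan
    # install a package reference from that remote (conan.install)
    conan install --remote=gitea ConanPackage/1.2@gitea/latest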
container.details.type = Image Type
container.details.platform = Platform
container.details.repository_site = Repository Site
container.details.documentation_site = Documentation Site
container.pull = Pull the image from the command line:
container.documentation = For more information on the Container registry, see <a target="_blank" rel="noopener noreferrer" href="https://docs.gitea.io/en-us/container/">the documentation</a>.
container.multi_arch = OS / Arch
container.layers = Image Layers
container.labels = Labels
container.labels.key = Key
container.labels.value = Value
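A sketch of the pull command the container.pull string labels; host, owner, image, and tag below are placeholders:

    docker pull gitea.example.com/OWNER/IMAGE:latest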
generic.download = Download the package from the command line:
generic.documentation = For more information on the generic registry, see <a target="_blank" rel="noopener noreferrer" href="https://docs.gitea.io/en-us/generic">the documentation</a>.
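A sketch of the download the generic.download string labels, assuming the generic registry serves files under /api/packages/OWNER/generic/NAME/VERSION/FILENAME (all path segments below are placeholders):

    curl -O https://gitea.example.com/api/packages/OWNER/generic/PACKAGE/1.0.0/file.bin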
maven.registry = Set up this registry in your project <code>pom.xml</code> file:
maven.install = To use the package, include the following in the <code>dependencies</code> block of the <code>pom.xml</code> file:
maven.install2 = Run via command line:
maven.download = To download the dependency, run via command line:
maven.documentation = For more information on the Maven registry, see <a target="_blank" rel="noopener noreferrer" href="https://docs.gitea.io/en-us/maven/">the documentation</a>.
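A sketch of the pom.xml snippets and commands the maven.* strings introduce. The host and Maven coordinates are placeholders, and the repository URL assumes the /api/packages/OWNER/maven endpoint:

    <repositories>
      <repository>
        <id>gitea</id>
        <url>https://gitea.example.com/api/packages/OWNER/maven</url>
      </repository>
    </repositories>

    <dependency>
      <groupId>com.example</groupId>
      <artifactId>demo</artifactId>
      <version>1.0.0</version>
    </dependency>

    mvn install
    mvn dependency:get -Dartifact=com.example:demo:1.0.0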
nuget.registry = Set up this registry from the command line:
nuget.install = To install the package using NuGet, run the following command:
nuget.documentation = For more information on the NuGet registry, see <a target="_blank" rel="noopener noreferrer" href="https://docs.gitea.io/en-us/nuget/">the documentation</a>.
nuget.dependency.framework = Target Framework
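A sketch of the commands the nuget.registry and nuget.install strings label. Source name, host, credentials, and package id are placeholders; the v3 index URL assumes the /api/packages/OWNER/nuget endpoint:

    # register the Gitea NuGet source (nuget.registry)
    dotnet nuget add source --name gitea --username USER --password TOKEN https://gitea.example.com/api/packages/OWNER/nuget/index.json
    # install a package from it (nuget.install)
    dotnet add package PACKAGE_ID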
npm.registry = Set up this registry in your project <code>.npmrc</code> file:
npm.install = To install the package using npm, run the following command:
npm.install2 = or add it to the <code>package.json</code> file:
npm.documentation = For more information on the npm registry, see <a target="_blank" rel="noopener noreferrer" href="https://docs.gitea.io/en-us/npm/">the documentation</a>.
npm.dependencies = Dependencies
npm.dependencies.development = Development Dependencies
npm.dependencies.peer = Peer Dependencies
npm.dependencies.optional = Optional Dependencies
npm.details.tag = Tag
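A sketch of the .npmrc line and commands the npm.* strings introduce; host, package name, and version are placeholders:

    .npmrc:
    registry=https://gitea.example.com/api/packages/OWNER/npm/

    npm install PACKAGE

    package.json (npm.install2):
    "dependencies": { "PACKAGE": "1.0.0" }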
pypi.requires = Requires Python
pypi.install = To install the package using pip, run the following command:
pypi.documentation = For more information on the PyPI registry, see <a target="_blank" rel="noopener noreferrer" href="https://docs.gitea.io/en-us/pypi/">the documentation</a>.
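A sketch of the command the pypi.install string labels, assuming the simple index lives under the /api/packages/OWNER/pypi endpoint (host and package name are placeholders):

    pip install --index-url https://gitea.example.com/api/packages/OWNER/pypi/simple PACKAGE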
rubygems.install = To install the package using gem, run the following command:
rubygems.install2 = or add it to the <code>Gemfile</code>:
rubygems.dependencies.runtime = Runtime Dependencies
rubygems.dependencies.development = Development Dependencies
rubygems.required.ruby = Requires Ruby version
rubygems.required.rubygems = Requires RubyGems version
rubygems.documentation = For more information on the RubyGems registry, see <a target="_blank" rel="noopener noreferrer" href="https://docs.gitea.io/en-us/rubygems/">the documentation</a>.
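A sketch of the command and Gemfile entry the rubygems.install and rubygems.install2 strings introduce; host and gem name are placeholders, and the source URL assumes the /api/packages/OWNER/rubygems endpoint:

    gem install PACKAGE --source "https://gitea.example.com/api/packages/OWNER/rubygems"

    # Gemfile
    source "https://gitea.example.com/api/packages/OWNER/rubygems" do
      gem "PACKAGE"
    end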
settings.link = Link this package to a repository
settings.link.description = If you link a package to a repository, the package is listed in the repository's package list.
settings.link.select = Select Repository
settings.link.button = Update Repository Link
settings.link.success = Repository link was successfully updated.
settings.link.error = Failed to update repository link.
settings.delete = Delete package
settings.delete.description = Deleting a package is permanent and cannot be undone.
settings.delete.notice = You are about to delete %s (%s). This operation is irreversible. Are you sure?
settings.delete.success = The package has been deleted.
settings.delete.error = Failed to delete the package.
public/img/svg/gitea-composer.svg (new file, 5.6 KiB): file diff suppressed because one or more lines are too long
public/img/svg/gitea-conan.svg (new file, 1.8 KiB)
@@ -0,0 +1 @@
<svg viewBox="147 6 105 106" xml:space="preserve" class="svg gitea-conan" width="16" height="16" aria-hidden="true"><path d="m198.7 59.75-51.08-29.62v47.49l51.08 33.65z" fill="#6699cb"/><clipPath id="gitea-conan__a"><path d="m147.49 30.14 51.21 29.61 51.08-27.24-52.39-25.78z"/></clipPath><path d="M147.49 6.73h102.3v53.01h-102.3z" clip-path="url(#gitea-conan__a)" fill="#afd5e6"/><path d="m198.7 59.75 51.08-27.24v47.48l-51.08 31.28z" clip-rule="evenodd" fill="#7ba7d3" fill-rule="evenodd"/><path d="m198.93 19.49-2.96.33-.43.18-.47.01-.42.18-2.31.55-.33.14-.31.01-.28.23-4.27 1.58-.22.17c-1.93.75-3.49 1.8-5.16 2.66l-.19.2c-1.5.84-2.03 1.28-3.08 2.32l-.25.17-1.06 1.42-.21.18-.35.71-.19.2c-1.2 2.75-1.18 3.19-.93 6.4l.21.32v.33l.15.29.4.99.17.23.18.51.21.18c.61 1.1 1.37 1.97 2.1 2.77.41.45 2.16 1.87 2.85 2.22l.19.21c1.4.67 2.44 1.51 4.22 2.13l.24.16 3.45 1.08.39.19c1.19.13 2.44.48 3.76.65 1.44.19 2.2-.5 3.4-1.02l.23-.17h.16l.23-.17 5.47-2.52.23-.17h.16l.23-.17 3.15-1.49-.28-.12c-1.85-.08-4.04.2-6.04.15-2.01-.05-3.87-.42-5.71-.5l-.39-.19c-1.33-.13-2.66-.69-3.81-1.08l-.25-.16c-1.85-.66-3.55-2.12-4.35-3.63-1.27-2.4-.48-4.18.48-6.21l.21-.18.17-.33.22-.18c.99-1.41 3.43-3.37 5.83-4.13l.25-.16 2.54-.72.37-.19.39.02.39-.19 1.69-.14c.41-.27.62-.23 1.2-.24h3.93c.62-.02 1.16-.02 1.6.23l2.29.31.28.22c1.39.2 2.55.97 3.72 1.4l.2.19.73.34.19.2c1.23.65 3.41 2.65 3.87 4.24l.16.26c.52 1.8.39 2.4-.01 4.17l-.16.33-.64 1.38.96-.39.21-.18 7.56-3.91.21-.18 1.81-.89.21-.18 1.81-.89.21-.2c.07-.39-2.27-2.32-2.77-2.79l-.18-.25c-.61-.52-1.49-1.28-2.21-1.73l-.18-.22c-.72-.41-1.33-1.05-2.03-1.39l-.19-.2-1.83-1.05-.19-.2-2.38-1.24-.23-.17-3.07-1.27-.26-.16-1.85-.52-.29-.22h-.32l-.36-.16h-.34l-.32-.21c-1.51-.14-3.17-.63-4.86-.79-2.03-.18-4.01.05-5.83-.11l-.72.22z" fill="#6699cb"/><path d="m225.14 45.65 1.91-1.02v49.28l-1.91 1.17z" clip-rule="evenodd" fill="#2f6799" fill-rule="evenodd"/></svg>
Some files were not shown because too many files have changed in this diff.