Compare commits

...

46 Commits

Author SHA1 Message Date
zeripath
b0819efaea Place wrapper around comment as diff to catch panics (#15085) (#15086)
* Place wrapper around comment as diff to prevent panics

* propagate the panic up

Signed-off-by: Andrew Thornton <art27@cantab.net>
2021-03-21 16:16:07 +01:00
6543
d7a3bcdd70 Changelog v1.13.5 (#15084) 2021-03-21 15:05:21 +01:00
zeripath
7a85e228d8 Update to goldmark 1.3.3 (#15059) (#15061)
Backport #15059

Signed-off-by: Andrew Thornton <art27@cantab.net>
2021-03-20 10:31:28 +00:00
6543
a461d90415 Fix bug when upload on web (#15042) (#15055)
* Fix bug when upload on web

* move into own function

Co-authored-by: 6543 <6543@obermui.de>
Co-authored-by: zeripath <art27@cantab.net>

Co-authored-by: Lunny Xiao <xiaolunwen@gmail.com>
Co-authored-by: zeripath <art27@cantab.net>
2021-03-20 09:37:53 +08:00
6543
70e4134130 Delete Labels & IssueLabels on Repo Delete too (#15039) (#15051)
* Doctor: find IssueLabels without existing label

* Repo Delete: delete labels & issue_labels too
2021-03-19 22:13:39 +01:00
zeripath
909f2be99d Fix postgres ID sequences broken by recreate-table (#15015) (#15029)
Backport #15015

Unfortunately there is a subtle problem with recreate-table on postgres which
leads to the sequences not being renamed and not being left at 0.

Fix #14725

Signed-off-by: Andrew Thornton <art27@cantab.net>
2021-03-19 04:23:58 +01:00
6543
645c0d8abd another clusterfuzz spotted issue (#15032) (#15034)
Signed-off-by: Andrew Thornton <art27@cantab.net>

Co-authored-by: zeripath <art27@cantab.net>
2021-03-19 00:21:33 +02:00
zeripath
8c461eb261 Fix several render issues (#14986) (#15013)
Backport #14986

* Fix an issue with panics related to attributes
* Wrap goldmark render in a recovery function
* Reduce memory use in render emoji
* Use a pipe for rendering goldmark - still needs more work and a limiter

Signed-off-by: Andrew Thornton <art27@cantab.net>
Co-authored-by: Lauris BH <lauris@nix.lv>
2021-03-17 10:58:58 +02:00
Norwin
fff66eb016 API: fix set milestone on PR creation (#14981) (#15001)
* API: fix set milestone on PR creation

PR creation via the API failed with a 404 because we searched
for milestone ID 0, due to use of an uninitialized variable.

* add tests

Co-authored-by: 6543 <6543@obermui.de>

Co-authored-by: 6543 <6543@obermui.de>
2021-03-15 11:01:04 -04:00
zeripath
c965ed6529 Make sure sibling images get a link too (#14979) (#14995)
Backport #14979

Due to a problem with the ast.Walker in our goldmark transformer,
an image with a sibling image will not be transformed to gain a parent
link. This PR fixes this.

Fix #12925

Signed-off-by: Andrew Thornton <art27@cantab.net>
2021-03-15 12:34:56 +08:00
zeripath
71a2adbf10 Fix Anchor jumping with escaped query components (#14969) (#14977)
Backport #14969

Fix #14968

Signed-off-by: Andrew Thornton <art27@cantab.net>
2021-03-13 09:54:53 +00:00
Norwin
3231b70043 check if original author is set (#14972)
Co-authored-by: 6543 <6543@obermui.de>
2021-03-13 11:05:56 +08:00
Norwin
e3c44923d7 fix release mail html template (#14976)
was missing an </a>
2021-03-12 20:39:05 +00:00
zeripath
3e7dccdf47 Fix excluding more than two labels on issues list (#14962) (#14973)
Backport #14962

Fix #14840

Signed-off-by: Andrew Thornton <art27@cantab.net>
Co-authored-by: Norwin Roosen <git@nroo.de>
Co-authored-by: jaqra <48099350+jaqra@users.noreply.github.com>

Co-authored-by: Norwin Roosen <git@nroo.de>
Co-authored-by: jaqra <48099350+jaqra@users.noreply.github.com>
2021-03-12 18:12:14 +01:00
6543
33c2c49627 Prevent panic when editing forked repos by API (#14960) (#14963)
When editing forked repos using the API, the BaseRepository needs to be loaded
in order to check its visibility, otherwise there will be an NPE panic.

Fix #14956

Signed-off-by: Andrew Thornton <art27@cantab.net>

Co-authored-by: zeripath <art27@cantab.net>
2021-03-12 08:54:18 +08:00
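
A minimal sketch of the defensive pattern behind this fix, using hypothetical names (Gitea's actual types and visibility rules differ): load the base repository before dereferencing it, so the visibility check cannot hit a nil pointer.

package main

import "fmt"

// Repository is a stand-in for Gitea's model; names are illustrative.
type Repository struct {
	IsFork   bool
	Private  bool
	BaseRepo *Repository // nil until explicitly loaded
}

// loadBaseRepo stands in for the database lookup that must happen
// before the visibility check; without it BaseRepo stays nil and the
// check below would panic.
func (r *Repository) loadBaseRepo() error {
	if r.IsFork && r.BaseRepo == nil {
		r.BaseRepo = &Repository{Private: true} // pretend DB fetch
	}
	return nil
}

func canMakePublic(r *Repository) (bool, error) {
	if err := r.loadBaseRepo(); err != nil {
		return false, err
	}
	// Assumed rule for illustration: a fork should not become more
	// visible than its base repository.
	if r.IsFork && r.BaseRepo.Private {
		return false, nil
	}
	return true, nil
}

func main() {
	fork := &Repository{IsFork: true, Private: true}
	ok, err := canMakePublic(fork)
	fmt.Println(ok, err) // false <nil>, and no nil-pointer panic
}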
fnetX (aka fralix)
05ac72cf33 Add "captcha" to list of reserved usernames (#14930)
Signed-off-by: Otto Richter <git@fralix.ovh>
2021-03-08 17:50:13 +01:00
zeripath
906ecfd173 Re-enable import local paths after reversion from #13610 (#14925) (#14927)
Backport #14925

PR #13610 unfortunately disabled importing repositories from local paths.
This PR restores this functionality.

Fix #14700

Signed-off-by: Andrew Thornton <art27@cantab.net>
2021-03-08 14:50:57 +01:00
6543
75496b9ff5 Changelog v1.13.4 (#14917)
* Changelog v1.13.4

* nit
2021-03-07 23:02:54 +08:00
zeripath
8dad47a94a Fix race in LFS ContentStore.Put(...) (#14895) (#14913)
Backport #14895

Continuing on from #14888

The previous implementation has a race whereby an incomplete upload or
hash-mismatched upload can end up in the ContentStore. This PR moves the
validation into the reader so that if there is a hash error or size
mismatch the reader will return with an error instead of an io.EOF,
causing the content store to abort the upload.

Signed-off-by: Andrew Thornton <art27@cantab.net>
2021-03-07 00:53:37 +02:00
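
The validation-in-the-reader approach can be illustrated with a small sketch (hypothetical names, not the actual Gitea LFS types): a reader that hashes as it streams and turns a size or hash mismatch into an error rather than a clean io.EOF.

package main

import (
	"crypto/sha256"
	"encoding/hex"
	"errors"
	"fmt"
	"hash"
	"io"
	"strings"
)

// hashingReader wraps an upload stream and verifies size and SHA-256
// as the bytes are consumed, so a mismatch surfaces as an error
// instead of a clean io.EOF.
type hashingReader struct {
	r          io.Reader
	h          hash.Hash
	read       int64
	wantSize   int64
	wantDigest string
}

func (hr *hashingReader) Read(p []byte) (int, error) {
	n, err := hr.r.Read(p)
	if n > 0 {
		hr.read += int64(n)
		hr.h.Write(p[:n])
	}
	if err == io.EOF {
		if hr.read != hr.wantSize {
			return n, errors.New("lfs: size mismatch")
		}
		if hex.EncodeToString(hr.h.Sum(nil)) != hr.wantDigest {
			return n, errors.New("lfs: hash mismatch")
		}
	}
	return n, err
}

func main() {
	body := "hello"
	sum := sha256.Sum256([]byte(body))
	hr := &hashingReader{
		r: strings.NewReader(body), h: sha256.New(),
		wantSize: int64(len(body)), wantDigest: hex.EncodeToString(sum[:]),
	}
	n, err := io.Copy(io.Discard, hr)
	fmt.Println(n, err) // 5 <nil>; a corrupt stream would return an error
}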
6543
8e792986bb Fix a couple of issues with feeds (#14897) (#14903)
Backport (#14897)

which fixes a couple of issues with feeds
2021-03-06 06:13:38 +01:00
6543
da80e90ac8 Fix race in local storage (#14888) (#14901)
LocalStorage should only put completed files in position

Signed-off-by: Andrew Thornton <art27@cantab.net>

Co-authored-by: zeripath <art27@cantab.net>
Co-authored-by: techknowlogick <techknowlogick@gitea.io>
2021-03-06 05:07:03 +01:00
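
"Only put completed files in position" is the classic write-to-temp-then-rename pattern; a minimal sketch under assumed names (not Gitea's actual LocalStorage code):

package main

import (
	"fmt"
	"io"
	"os"
	"path/filepath"
	"strings"
)

// save streams content to a temporary file first and only renames it
// into its final position once the copy completed without error, so
// readers can never observe a half-written file.
func save(dir, name string, r io.Reader) error {
	tmp, err := os.CreateTemp(dir, name+".tmp-*")
	if err != nil {
		return err
	}
	defer os.Remove(tmp.Name()) // no-op after a successful rename
	if _, err := io.Copy(tmp, r); err != nil {
		tmp.Close()
		return err
	}
	if err := tmp.Close(); err != nil {
		return err
	}
	return os.Rename(tmp.Name(), filepath.Join(dir, name))
}

func main() {
	dir, _ := os.MkdirTemp("", "store")
	defer os.RemoveAll(dir)
	fmt.Println(save(dir, "object", strings.NewReader("data"))) // <nil>
}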
6543
74dc22358b When transferring a repository and the database transaction failed, rollback the renames (#14864) (#14902)
Fix #14821

Co-authored-by: Andrew Thornton <art27@cantab.net>
Co-authored-by: 6543 <6543@obermui.de>

Co-authored-by: Lunny Xiao <xiaolunwen@gmail.com>
Co-authored-by: Andrew Thornton <art27@cantab.net>
2021-03-06 11:12:11 +08:00
John Olheiser
7d3e174906 Signed-off-by: jolheiser <john.olheiser@gmail.com> (#14898) (#14899) 2021-03-05 23:54:01 +02:00
6543
8456700411 [Docs] Fix how lfs data path is set (#14855) (#14884)
* fix docs: lfs data path

* DEPRECATED | deprecated

Co-authored-by: techknowlogick <techknowlogick@gitea.io>
Co-authored-by: Lunny Xiao <xiaolunwen@gmail.com>
2021-03-04 22:10:15 +01:00
6543
8a6acbbc12 IsUserAllowedToUpdate should ignore if user is nil (#14886) 2021-03-04 21:28:28 +01:00
Lunny Xiao
98b3d8d5e1 Add changelog for v1.13.3 (#14877)
Add changelog for v1.13.3

Co-authored-by: zeripath <art27@cantab.net>
Co-authored-by: 6543 <6543@obermui.de>
Co-authored-by: techknowlogick <matti@mdranta.net>
2021-03-04 15:42:57 +01:00
zeripath
e663f7459a Fix paging of file commit logs (#14831) (#14879)
Backport #14831

Unfortunately `git log revision ... --skip=x -- path` skips that number of commits
overall, not the number of commits relating to the path.

This PR changes the function to have a reader that reads and skips the
necessary number of commits by hand instead.

Fix #8716

Signed-off-by: Andrew Thornton <art27@cantab.net>
Co-authored-by: 6543 <6543@obermui.de>

Co-authored-by: 6543 <6543@obermui.de>
2021-03-04 19:53:28 +08:00
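
The skip-by-hand approach can be sketched as follows (illustrative, not the actual Gitea reader): stream the hashes that `git log --format=%H -- <path>` produces and apply skip/limit in the consumer rather than passing --skip to git.

package main

import (
	"bufio"
	"fmt"
	"io"
	"strings"
)

// pageCommits reads newline-separated commit hashes and applies
// skip/limit by hand, since (per the commit message) `--skip` would
// skip commits of the whole revision walk rather than commits
// touching the path.
func pageCommits(r io.Reader, skip, limit int) ([]string, error) {
	sc := bufio.NewScanner(r)
	var out []string
	for i := 0; sc.Scan(); i++ {
		if i < skip {
			continue // discard the first `skip` path-filtered commits
		}
		out = append(out, sc.Text())
		if len(out) == limit {
			break
		}
	}
	return out, sc.Err()
}

func main() {
	stream := strings.NewReader("aaa\nbbb\nccc\nddd\n")
	page, err := pageCommits(stream, 1, 2)
	fmt.Println(page, err) // [bbb ccc] <nil>
}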
6543
7e85cba3e5 Print useful error if SQLite is used in settings but not supported (#14476) (#14874)
* move log output to the points where it is relevant

* check explicitly for sqlite3 in settings
2021-03-03 21:54:32 +00:00
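
A sketch of the underlying pattern (names are illustrative, not Gitea's actual settings code): SQLite support is compiled in behind a build tag, and startup checks the flag so the operator gets a useful error instead of an obscure driver failure.

package main

import (
	"fmt"
	"os"
)

// enableSQLite would be flipped to true by an init() in a file guarded
// by a build tag (e.g. `//go:build sqlite`), so binaries built without
// the tag can detect the missing support at startup.
var enableSQLite = false

func main() {
	dbType := "sqlite3" // imagine this was read from app.ini
	if dbType == "sqlite3" && !enableSQLite {
		fmt.Fprintln(os.Stderr, "this binary was built without SQLite support; rebuild with the sqlite build tag")
		os.Exit(1)
	}
	fmt.Println("database:", dbType)
}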
zeripath
26628aa1d1 Fix display since time round (#14226) (#14873)
Backport #14226

* Fix display since time round

* Fix since time

* Fix tests

Co-authored-by: Lunny Xiao <xiaolunwen@gmail.com>
2021-03-03 21:17:34 +00:00
zeripath
d9d2e8f1e8 When Deleting Repository only explicitly close PRs whose base is not this repository (#14823) (#14842)
Backport #14823

When Deleting Repository only explicitly close PRs whose base is not this repository

Fix #14775

Signed-off-by: Andrew Thornton <art27@cantab.net>
2021-03-02 21:44:14 +08:00
zeripath
4558eeb21a Set HCaptchaSiteKey on Link Account pages (#14834) (#14839)
Backport #14834

When using HCaptcha on link account pages, the site key needs to be passed
in. This PR ensures that HCaptchaSiteKey is set in the data.

Fix #14766

Signed-off-by: Andrew Thornton <art27@cantab.net>
2021-03-01 16:12:48 +01:00
zeripath
be25afc6de Fix a couple of CommentAsPatch issues. (#14804) (#14820)
Backport #14804

* CutDiffAroundLine makes the incorrect assumption that `---` and `+++` always represent part of the header of a diff.

This PR adds a flag to its parsing to prevent this problem and adds a streaming parsing technique to CutDiffAroundLine using an io.Pipe instead of just sending data to an unbounded buffer.

Fix #14711

* Handle unquoted comment patch files

When making comment patches, unfortunately the patch does not always quote the filename.
This makes the diff --git header ambiguous again.

This PR finally adds handling for this ambiguity to the patch parser.

Fix #14812

* Add in testing for no error

There is currently no way for CutDiffAroundLine in this test to cause an
error; however, it should still be tested.

Signed-off-by: Andrew Thornton <art27@cantab.net>
2021-02-28 15:19:51 +02:00
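
The streaming idea can be sketched with a self-contained toy (illustrative, not CutDiffAroundLine itself): an io.Pipe lets the consumer stop reading early and keeps memory bounded, unlike accumulating the whole diff into an unbounded buffer.

package main

import (
	"bufio"
	"fmt"
	"io"
	"strings"
)

func main() {
	pr, pw := io.Pipe()
	// The producer goroutine feeds lines; each write blocks until the
	// consumer reads, so nothing piles up in memory.
	go func() {
		defer pw.Close()
		for i := 0; i < 1000; i++ {
			if _, err := fmt.Fprintf(pw, "line %d\n", i); err != nil {
				return // reader closed the pipe early
			}
		}
	}()
	sc := bufio.NewScanner(pr)
	var kept []string
	for sc.Scan() {
		kept = append(kept, sc.Text())
		if len(kept) == 3 { // pretend we found the window we wanted
			break
		}
	}
	pr.Close() // unblocks and stops the producer
	fmt.Println(strings.Join(kept, " | "))
}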
zeripath
90bf1e7961 Disable broken OAuth2 providers at startup (#14802) (#14811)
Backport #14802

Instead of causing a log.Fatal, we should handle broken OAuth2
providers by disabling them.

Fix #8930

Signed-off-by: Andrew Thornton <art27@cantab.net>
Co-authored-by: techknowlogick <techknowlogick@gitea.io>
2021-02-26 11:44:45 +01:00
6543
77ce08976d Re-enable transfer repo back from org to user account (#14807)
* re-enable transfer repo back from org to user account

* add test case
2021-02-26 11:08:09 +02:00
6543
8f389c5dfa Build for only available darwin target (#14771) (#14798)
Co-authored-by: techknowlogick <techknowlogick@gitea.io>
2021-02-25 15:29:03 +01:00
6543
edef62e69e Backport: Repo Transfer permission checks (#14792) (#14794)
* Backport: Repo Transfer permission checks (#14792)

* update tests
2021-02-25 15:49:27 +08:00
a1012112796
cdff144f76 Fix double alert in oauth2 application edit view (#14764) (#14768)
Signed-off-by: a1012112796 <1012112796@qq.com>
2021-02-23 00:22:49 +01:00
zeripath
ad6084a222 Fix broken spans in diffs (#14678) (#14683)
Backport #14678

Gitea runs diff on the highlighted code fragment for each line in order to
provide code highlight diffs. Unfortunately this diff algorithm is not
aware that span tags and entities are atomic and cannot be split.

The current fixup code makes some attempt to fix these broken tags;
however, it cannot handle situations where a tag is split over multiple
blocks.

This PR provides a more algorithmic fixup mechanism whereby spans and
entities are completely coalesced into their respective blocks.

This may result in an incompletely reduced diff, but it will definitely
prevent the broken entities and spans that are currently possible.

As a result of this fixup, several inconsistencies were discovered in our
test cases and these were also fixed.

Fix #14231

Signed-off-by: Andrew Thornton <art27@cantab.net>

Co-authored-by: 6543 <6543@obermui.de>
2021-02-15 00:30:07 +01:00
zeripath
d3200db041 HasPreviousCommit causes recursive load of commits unnecessarily (#14598) (#14649)
This PR improves HasPreviousCommit to prevent the automatic and recursive loading
of previous commits, by using git merge-base --is-ancestor and git rev-list instead

Fix #13684

Signed-off-by: Andrew Thornton <art27@cantab.net>

Co-authored-by: 6543 <6543@obermui.de>
2021-02-15 00:44:26 +02:00
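
A standalone sketch of the same git technique (hedged: this mirrors the approach, not Gitea's actual code, which also version-gates and falls back to rev-list): exit status 0 means "is an ancestor", exit status 1 means "is not", anything else is a real error.

package main

import (
	"errors"
	"fmt"
	"os/exec"
)

// isAncestor reports whether ancestor is reachable from descendant by
// delegating the graph walk to git instead of recursing in Go.
func isAncestor(repoPath, ancestor, descendant string) (bool, error) {
	cmd := exec.Command("git", "merge-base", "--is-ancestor", ancestor, descendant)
	cmd.Dir = repoPath
	err := cmd.Run()
	if err == nil {
		return true, nil
	}
	var exitErr *exec.ExitError
	if errors.As(err, &exitErr) && exitErr.ExitCode() == 1 {
		return false, nil // clean "no" from git
	}
	return false, err
}

func main() {
	ok, err := isAncestor(".", "HEAD~1", "HEAD")
	fmt.Println(ok, err)
}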
zeripath
f305cffcaf Prevent race in PersistableChannelUniqueQueue.Has (#14651) (#14676)
Backport #14651

There is potentially a race with a slow-starting internal
queue causing an NPE if Has is checked before the internal
queue has been set up.

This PR adds a lock on the Has() fn.

Fix #14311

Signed-off-by: Andrew Thornton <art27@cantab.net>

Co-authored-by: 6543 <6543@obermui.de>
2021-02-14 01:50:50 +01:00
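
The fix pattern in miniature (illustrative types, not the actual queue): Has takes the same lock the setup path holds, so a lookup before initialization sees a consistent "not yet started" state instead of dereferencing a nil internal queue.

package main

import (
	"fmt"
	"sync"
)

// queue sketches the race: Has() may run before the internal store is
// initialized, so both paths synchronize on the same mutex.
type queue struct {
	mu       sync.Mutex
	internal map[string]struct{} // nil until setup finishes
}

func (q *queue) setup() {
	q.mu.Lock()
	defer q.mu.Unlock()
	q.internal = make(map[string]struct{})
}

func (q *queue) Has(key string) bool {
	q.mu.Lock()
	defer q.mu.Unlock()
	if q.internal == nil {
		return false // queue still starting: nothing stored yet
	}
	_, ok := q.internal[key]
	return ok
}

func main() {
	q := &queue{}
	fmt.Println(q.Has("x")) // safe even before setup
	q.setup()
	fmt.Println(q.Has("x"))
}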
Lunny Xiao
c0320065b6 Turn default hash password algorithm back to pbkdf2 from argon2 until we find a better one (#14673) (#14675)
* Turn default hash password algorithm back to pbkdf2 from argon2 until we find a better one

* Add a warning on document

Co-authored-by: zeripath <art27@cantab.net>
2021-02-13 21:19:33 +01:00
zeripath
a1b74c5509 Allow org labels to be set with issue templates (#14593) (#14647)
Backport #14593

Fix #13688

Signed-off-by: Andrew Thornton <art27@cantab.net>

Co-authored-by: Lunny Xiao <xiaolunwen@gmail.com>
2021-02-13 19:34:47 +01:00
zeripath
101fb0d7e2 Do not assume all 40 char strings are SHA1s (#14624) (#14648)
Backport #14624

GetCommit() assumes that all 40 char strings are SHA1s. This leads to an
error if you try to do a PR on a branch which is 40 characters long.

This PR attempts the SHA first, and if that fails switches to using rev-parse.

Fix #14470

Signed-off-by: Andrew Thornton <art27@cantab.net>
2021-02-14 01:25:47 +08:00
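
A standalone sketch of the try-SHA-then-rev-parse idea (hypothetical helper, not Gitea's GetCommit): only treat the string as a literal object ID when it looks like one and the object actually exists, otherwise resolve it as a ref.

package main

import (
	"fmt"
	"os/exec"
	"strings"
)

func isHex(s string) bool {
	for _, c := range s {
		if !strings.ContainsRune("0123456789abcdef", c) {
			return false
		}
	}
	return true
}

// resolveCommitID tries a 40-char hex string as an object ID first; if
// no such object exists (for example a 40-character branch name), it
// falls back to rev-parse to resolve the ref.
func resolveCommitID(repoPath, name string) (string, error) {
	if len(name) == 40 && isHex(name) {
		if exec.Command("git", "-C", repoPath, "cat-file", "-e", name).Run() == nil {
			return name, nil // the string really is an object ID
		}
		// otherwise fall through: it merely looks like a SHA
	}
	out, err := exec.Command("git", "-C", repoPath, "rev-parse", name).Output()
	if err != nil {
		return "", err
	}
	return strings.TrimSpace(string(out)), nil
}

func main() {
	id, err := resolveCommitID(".", "HEAD")
	fmt.Println(id, err)
}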
zeripath
82637c240a Accept multiple SSH keys in single LDAP SSHPublicKey attribute (#13989) (#14607)
Backport #13989

Fix #13984

Fix #14566

Signed-off-by: Andrew Thornton <art27@cantab.net>
2021-02-08 09:25:30 +08:00
6543
d0174d45ed Fix bug about ListOptions and stars/watchers pagination (#14556) (#14573)
* Fix bug about ListOptions and stars/watchers pagination

* fix unit test
2021-02-05 21:11:15 +00:00
Anton Khimich
da7a525c5c Fix GPG key deletion during account deletion (#14561) (#14569)
Per #14531, deleting a user account will delete the user's GPG keys
from the `gpg_key` table but not from `gpg_key_import`, which causes
an error when creating an account with the same email and attempting
to re-add the same key. This commit deletes all entries from
`gpg_key_import` that match any GPG key IDs belonging to the user.

Co-authored-by: Anton Khimich <anton.khimicha@mail.utoronto.ca>
2021-02-04 21:28:48 +01:00
83 changed files with 2438 additions and 368 deletions

View File

@@ -4,6 +4,63 @@ This changelog goes through all the changes that have been made in each release
without substantial changes to our git log; to see the highlights of what has
been added to each release, please refer to the [blog](https://blog.gitea.io).
## [1.13.5](https://github.com/go-gitea/gitea/releases/tag/v1.13.5) - 2021-03-21
* SECURITY
* Update to goldmark 1.3.3 (#15059) (#15061)
* API
* Fix set milestone on PR creation (#14981) (#15001)
* Prevent panic when editing forked repos by API (#14960) (#14963)
* BUGFIXES
* Fix bug when upload on web (#15042) (#15055)
* Delete Labels & IssueLabels on Repo Delete too (#15039) (#15051)
* another clusterfuzz spotted issue (#15032) (#15034)
* Fix postgres ID sequences broken by recreate-table (#15015) (#15029)
* Fix several render issues (#14986) (#15013)
* Make sure sibling images get a link too (#14979) (#14995)
* Fix Anchor jumping with escaped query components (#14969) (#14977)
* fix release mail html template (#14976)
* Fix excluding more than two labels on issues list (#14962) (#14973)
* don't mark each comment poster as OP (#14971) (#14972)
* Add "captcha" to list of reserved usernames (#14930)
* Re-enable import local paths after reversion from #13610 (#14925) (#14927)
## [1.13.4](https://github.com/go-gitea/gitea/releases/tag/v1.13.4) - 2021-03-07
* SECURITY
* Fix issue popups (#14898) (#14899)
* BUGFIXES
* Fix race in LFS ContentStore.Put(...) (#14895) (#14913)
* Fix a couple of issues with feeds (#14897) (#14903)
* When transferring a repository and the database transaction failed, rollback the renames (#14864) (#14902)
* Fix race in local storage (#14888) (#14901)
* Fix 500 on pull view page if user is not logged in (#14885) (#14886)
* DOCS
* Fix how lfs data path is set (#14855) (#14884)
## [1.13.3](https://github.com/go-gitea/gitea/releases/tag/v1.13.3) - 2021-03-04
* BREAKING
* Turn default hash password algorithm back to pbkdf2 from argon2 until we find a better one (#14673) (#14675)
* BUGFIXES
* Fix paging of file commit logs (#14831) (#14879)
* Print useful error if SQLite is used in settings but not supported (#14476) (#14874)
* Fix display since time round (#14226) (#14873)
* When Deleting Repository only explicitly close PRs whose base is not this repository (#14823) (#14842)
* Set HCaptchaSiteKey on Link Account pages (#14834) (#14839)
* Fix a couple of CommentAsPatch issues. (#14804) (#14820)
* Disable broken OAuth2 providers at startup (#14802) (#14811)
* Repo Transfer permission checks (#14792) (#14794)
* Fix double alert in oauth2 application edit view (#14764) (#14768)
* Fix broken spans in diffs (#14678) (#14683)
* Prevent race in PersistableChannelUniqueQueue.Has (#14651) (#14676)
* HasPreviousCommit causes recursive load of commits unnecessarily (#14598) (#14649)
* Do not assume all 40 char strings are SHA1s (#14624) (#14648)
* Allow org labels to be set with issue templates (#14593) (#14647)
* Accept multiple SSH keys in single LDAP SSHPublicKey attribute (#13989) (#14607)
* Fix bug about ListOptions and stars/watchers pagination (#14556) (#14573)
* Fix GPG key deletion during account deletion (#14561) (#14569)
## [1.13.2](https://github.com/go-gitea/gitea/releases/tag/v1.13.2) - 2021-01-31
* SECURITY

View File

@@ -585,7 +585,7 @@ release-darwin: | $(DIST_DIRS)
@hash xgo > /dev/null 2>&1; if [ $$? -ne 0 ]; then \
GO111MODULE=off $(GO) get -u src.techknowlogick.com/xgo; \
fi
CGO_CFLAGS="$(CGO_CFLAGS)" GO111MODULE=off xgo -go $(XGO_VERSION) -dest $(DIST)/binaries -tags 'netgo osusergo $(TAGS)' -ldflags '$(LDFLAGS)' -targets 'darwin/*' -out gitea-$(VERSION) .
CGO_CFLAGS="$(CGO_CFLAGS)" GO111MODULE=off xgo -go $(XGO_VERSION) -dest $(DIST)/binaries -tags 'netgo osusergo $(TAGS)' -ldflags '$(LDFLAGS)' -targets 'darwin/amd64' -out gitea-$(VERSION) .
ifeq ($(CI),drone)
cp /build/* $(DIST)/binaries
endif

View File

@@ -606,6 +606,22 @@ func runDoctorCheckDBConsistency(ctx *cli.Context) ([]string, error) {
}
}
// find IssueLabels without existing label
count, err = models.CountOrphanedIssueLabels()
if err != nil {
return nil, err
}
if count > 0 {
if ctx.Bool("fix") {
if err = models.DeleteOrphanedIssueLabels(); err != nil {
return nil, err
}
results = append(results, fmt.Sprintf("%d issue_labels without existing label deleted", count))
} else {
results = append(results, fmt.Sprintf("%d issue_labels without existing label", count))
}
}
//find issues without existing repository
count, err = models.CountOrphanedIssues()
if err != nil {
@@ -670,6 +686,23 @@ func runDoctorCheckDBConsistency(ctx *cli.Context) ([]string, error) {
}
}
if setting.Database.UsePostgreSQL {
count, err = models.CountBadSequences()
if err != nil {
return nil, err
}
if count > 0 {
if ctx.Bool("fix") {
err := models.FixBadSequences()
if err != nil {
return nil, err
}
results = append(results, fmt.Sprintf("%d sequences updated", count))
} else {
results = append(results, fmt.Sprintf("%d sequences with incorrect values", count))
}
}
}
//ToDo: function to recalc all counters
return results, nil

View File

@@ -548,7 +548,7 @@ ONLY_ALLOW_PUSH_IF_GITEA_ENVIRONMENT_SET = true
;Classes include "lower,upper,digit,spec"
PASSWORD_COMPLEXITY = off
; Password Hash algorithm, either "argon2", "pbkdf2", "scrypt" or "bcrypt"
PASSWORD_HASH_ALGO = argon2
PASSWORD_HASH_ALGO = pbkdf2
; Set false to allow JavaScript to read CSRF cookie
CSRF_COOKIE_HTTP_ONLY = true
; Validate against https://haveibeenpwned.com/Passwords to see if a password has been exposed

View File

@@ -276,7 +276,7 @@ Values containing `#` or `;` must be quoted using `` ` `` or `"""`.
- `LANDING_PAGE`: **home**: Landing page for unauthenticated users \[home, explore, organizations, login\].
- `LFS_START_SERVER`: **false**: Enables git-lfs support.
- `LFS_CONTENT_PATH`: **%(APP_DATA_PATH)/lfs**: Default LFS content path. (if it is on local storage.)
- `LFS_CONTENT_PATH`: **%(APP_DATA_PATH)/lfs**: DEPRECATED: Default LFS content path. (if it is on local storage.)
- `LFS_JWT_SECRET`: **\<empty\>**: LFS authentication secret; change this to a unique string.
- `LFS_HTTP_AUTH_EXPIRY`: **20m**: LFS authentication validity period in time.Duration, pushes taking longer than this may fail.
- `LFS_MAX_FILE_SIZE`: **0**: Maximum allowed LFS file size in bytes (Set to 0 for no limit).
@@ -402,7 +402,7 @@ relation to port exhaustion.
- `IMPORT_LOCAL_PATHS`: **false**: Set to `false` to prevent all users (including admin) from importing local path on server.
- `INTERNAL_TOKEN`: **\<random at every install if no uri set\>**: Secret used to validate communication within Gitea binary.
- `INTERNAL_TOKEN_URI`: **<empty>**: Instead of defining internal token in the configuration, this configuration option can be used to give Gitea a path to a file that contains the internal token (example value: `file:/etc/gitea/internal_token`)
- `PASSWORD_HASH_ALGO`: **argon2**: The hash algorithm to use \[argon2, pbkdf2, scrypt, bcrypt\].
- `PASSWORD_HASH_ALGO`: **pbkdf2**: The hash algorithm to use \[argon2, pbkdf2, scrypt, bcrypt\]; argon2 will use more memory than the others.
- `CSRF_COOKIE_HTTP_ONLY`: **true**: Set false to allow JavaScript to read CSRF cookie.
- `MIN_PASSWORD_LENGTH`: **6**: Minimum password length for new users.
- `PASSWORD_COMPLEXITY`: **off**: Comma separated list of character classes required to pass minimum complexity. If left empty or no valid values are specified, checking is disabled (off):
@@ -828,7 +828,7 @@ is `data/lfs` and the default of `MINIO_BASE_PATH` is `lfs/`.
- `STORAGE_TYPE`: **local**: Storage type for lfs, `local` for local disk or `minio` for s3 compatible object storage service or other name defined with `[storage.xxx]`
- `SERVE_DIRECT`: **false**: Allows the storage driver to redirect to authenticated URLs to serve files directly. Currently, only Minio/S3 is supported via signed URLs, local does nothing.
- `CONTENT_PATH`: **./data/lfs**: Where to store LFS files, only available when `STORAGE_TYPE` is `local`.
- `PATH`: **./data/lfs**: Where to store LFS files, only available when `STORAGE_TYPE` is `local`. If not set, it falls back to the deprecated LFS_CONTENT_PATH value in the [server] section.
- `MINIO_ENDPOINT`: **localhost:9000**: Minio endpoint to connect to, only available when `STORAGE_TYPE` is `minio`
- `MINIO_ACCESS_KEY_ID`: Minio accessKeyID to connect with, only available when `STORAGE_TYPE` is `minio`
- `MINIO_SECRET_ACCESS_KEY`: Minio secretAccessKey to connect with, only available when `STORAGE_TYPE` is `minio`

View File

@@ -73,6 +73,7 @@ menu:
- `LFS_START_SERVER`: Whether to enable git-lfs support. Can be `true` or `false`; the default is `false`.
- `LFS_JWT_SECRET`: LFS authentication secret; change this to your own value.
- `LFS_CONTENT_PATH`: **deprecated**, where files uploaded via the lfs command are stored; the default is `data/lfs`.
## Database (`database`)
@@ -323,7 +324,7 @@ Storage configuration for LFS. If `STORAGE_TYPE` is empty, this configuration falls back to `[storage]`.
- `STORAGE_TYPE`: **local**: Storage type for LFS; `local` stores to local disk, `minio` stores to an s3-compatible object service.
- `SERVE_DIRECT`: **false**: Allows direct redirection to the storage system. Currently only Minio/S3 is supported.
- `CONTENT_PATH`: Where files uploaded via the lfs command are stored; the default is `data/lfs`
- `PATH`: Where files uploaded via the lfs command are stored; the default is `data/lfs`
- `MINIO_ENDPOINT`: **localhost:9000**: Minio endpoint; only effective when `LFS_STORAGE_TYPE` is `minio`.
- `MINIO_ACCESS_KEY_ID`: Minio accessKeyID; only effective when `LFS_STORAGE_TYPE` is `minio`.
- `MINIO_SECRET_ACCESS_KEY`: Minio secretAccessKey; only effective when `LFS_STORAGE_TYPE` is `minio`.

go.mod
View File

@@ -99,7 +99,7 @@ require (
github.com/urfave/cli v1.20.0
github.com/xanzy/go-gitlab v0.37.0
github.com/yohcop/openid-go v1.0.0
github.com/yuin/goldmark v1.2.1
github.com/yuin/goldmark v1.3.3
github.com/yuin/goldmark-highlighting v0.0.0-20200307114337-60d527fdb691
github.com/yuin/goldmark-meta v0.0.0-20191126180153-f0638e958b60
go.jolheiser.com/hcaptcha v0.0.4

go.sum
View File

@@ -887,6 +887,8 @@ github.com/yuin/goldmark v1.1.25/go.mod h1:3hX8gzYuyVAZsxl0MRgGTJEmQBFcNTphYh9de
github.com/yuin/goldmark v1.1.32/go.mod h1:3hX8gzYuyVAZsxl0MRgGTJEmQBFcNTphYh9decYSb74=
github.com/yuin/goldmark v1.2.1 h1:ruQGxdhGHe7FWOJPT0mKs5+pD2Xs1Bm/kdGlHO04FmM=
github.com/yuin/goldmark v1.2.1/go.mod h1:3hX8gzYuyVAZsxl0MRgGTJEmQBFcNTphYh9decYSb74=
github.com/yuin/goldmark v1.3.3 h1:37BdQwPx8VOSic8eDSWee6QL9mRpZRm9VJp/QugNrW0=
github.com/yuin/goldmark v1.3.3/go.mod h1:mwnBkeHKe2W/ZEtQ+71ViKU8L12m81fl3OWwC1Zlc8k=
github.com/yuin/goldmark-highlighting v0.0.0-20200307114337-60d527fdb691 h1:VWSxtAiQNh3zgHJpdpkpVYjTPqRE3P6UZCOPa1nRDio=
github.com/yuin/goldmark-highlighting v0.0.0-20200307114337-60d527fdb691/go.mod h1:YLF3kDffRfUH/bTxOxHhV6lxwIB3Vfj91rEwNMS9MXo=
github.com/yuin/goldmark-meta v0.0.0-20191126180153-f0638e958b60 h1:gZucqLjL1eDzVWrXj4uiWeMbAopJlBR2mKQAsTGdPwo=

View File

@@ -74,8 +74,79 @@ func TestAPICreatePullSuccess(t *testing.T) {
Base: "master",
Title: "create a failure pr",
})
session.MakeRequest(t, req, 201)
session.MakeRequest(t, req, http.StatusUnprocessableEntity) // second request should fail
}
func TestAPICreatePullWithFieldsSuccess(t *testing.T) {
defer prepareTestEnv(t)()
// repo10 has code and pulls units.
repo10 := models.AssertExistsAndLoadBean(t, &models.Repository{ID: 10}).(*models.Repository)
owner10 := models.AssertExistsAndLoadBean(t, &models.User{ID: repo10.OwnerID}).(*models.User)
// repo11 only has the code unit but should still create pulls
repo11 := models.AssertExistsAndLoadBean(t, &models.Repository{ID: 11}).(*models.Repository)
owner11 := models.AssertExistsAndLoadBean(t, &models.User{ID: repo11.OwnerID}).(*models.User)
session := loginUser(t, owner11.Name)
token := getTokenForLoggedInUser(t, session)
opts := &api.CreatePullRequestOption{
Head: fmt.Sprintf("%s:master", owner11.Name),
Base: "master",
Title: "create a failure pr",
Body: "foobaaar",
Milestone: 5,
Assignees: []string{owner10.Name},
Labels: []int64{5},
}
req := NewRequestWithJSON(t, http.MethodPost, fmt.Sprintf("/api/v1/repos/%s/%s/pulls?token=%s", owner10.Name, repo10.Name, token), opts)
res := session.MakeRequest(t, req, 201)
pull := new(api.PullRequest)
DecodeJSON(t, res, pull)
assert.NotNil(t, pull.Milestone)
assert.EqualValues(t, opts.Milestone, pull.Milestone.ID)
if assert.Len(t, pull.Assignees, 1) {
assert.EqualValues(t, opts.Assignees[0], owner10.Name)
}
assert.NotNil(t, pull.Labels)
assert.EqualValues(t, opts.Labels[0], pull.Labels[0].ID)
}
func TestAPICreatePullWithFieldsFailure(t *testing.T) {
defer prepareTestEnv(t)()
// repo10 has code and pulls units.
repo10 := models.AssertExistsAndLoadBean(t, &models.Repository{ID: 10}).(*models.Repository)
owner10 := models.AssertExistsAndLoadBean(t, &models.User{ID: repo10.OwnerID}).(*models.User)
// repo11 only has the code unit but should still create pulls
repo11 := models.AssertExistsAndLoadBean(t, &models.Repository{ID: 11}).(*models.Repository)
owner11 := models.AssertExistsAndLoadBean(t, &models.User{ID: repo11.OwnerID}).(*models.User)
session := loginUser(t, owner11.Name)
token := getTokenForLoggedInUser(t, session)
opts := &api.CreatePullRequestOption{
Head: fmt.Sprintf("%s:master", owner11.Name),
Base: "master",
}
req := NewRequestWithJSON(t, http.MethodPost, fmt.Sprintf("/api/v1/repos/%s/%s/pulls?token=%s", owner10.Name, repo10.Name, token), opts)
session.MakeRequest(t, req, http.StatusUnprocessableEntity)
opts.Title = "is required"
opts.Milestone = 666
session.MakeRequest(t, req, http.StatusUnprocessableEntity)
opts.Milestone = 5
opts.Assignees = []string{"qweruqweroiuyqweoiruywqer"}
session.MakeRequest(t, req, http.StatusUnprocessableEntity)
opts.Assignees = []string{owner10.LoginName}
opts.Labels = []int64{55555}
session.MakeRequest(t, req, http.StatusUnprocessableEntity)
opts.Labels = []int64{5}
}
func TestAPIEditPull(t *testing.T) {

View File

@@ -445,11 +445,12 @@ func TestAPIRepoTransfer(t *testing.T) {
expectedStatus int
}{
{ctxUserID: 1, newOwner: "user2", teams: nil, expectedStatus: http.StatusAccepted},
{ctxUserID: 2, newOwner: "user1", teams: nil, expectedStatus: http.StatusAccepted},
{ctxUserID: 2, newOwner: "user1", teams: nil, expectedStatus: http.StatusForbidden},
{ctxUserID: 2, newOwner: "user6", teams: nil, expectedStatus: http.StatusForbidden},
{ctxUserID: 1, newOwner: "user2", teams: &[]int64{2}, expectedStatus: http.StatusUnprocessableEntity},
{ctxUserID: 1, newOwner: "user3", teams: &[]int64{5}, expectedStatus: http.StatusForbidden},
{ctxUserID: 1, newOwner: "user3", teams: &[]int64{2}, expectedStatus: http.StatusAccepted},
{ctxUserID: 2, newOwner: "user2", teams: nil, expectedStatus: http.StatusAccepted},
}
defer prepareTestEnv(t)()

View File

@@ -237,6 +237,6 @@ func TestLDAPUserSSHKeySync(t *testing.T) {
syncedKeys[i] = strings.TrimSpace(divs.Eq(i).Text())
}
assert.ElementsMatch(t, u.SSHKeys, syncedKeys)
assert.ElementsMatch(t, u.SSHKeys, syncedKeys, "Unequal number of keys synchronized for user: %s", u.UserName)
}
}

View File

@@ -18,7 +18,7 @@ func TestGetCommitStatuses(t *testing.T) {
sha1 := "1234123412341234123412341234123412341234"
statuses, maxResults, err := GetCommitStatuses(repo1, sha1, &CommitStatusOptions{})
statuses, maxResults, err := GetCommitStatuses(repo1, sha1, &CommitStatusOptions{ListOptions: ListOptions{Page: 1, PageSize: 50}})
assert.NoError(t, err)
assert.Equal(t, int(maxResults), 5)
assert.Len(t, statuses, 5)

View File

@@ -5,10 +5,13 @@
package models
import (
"fmt"
"reflect"
"regexp"
"strings"
"testing"
"code.gitea.io/gitea/modules/setting"
"github.com/stretchr/testify/assert"
"xorm.io/builder"
)
@@ -221,6 +224,24 @@ func DeleteOrphanedLabels() error {
return nil
}
// CountOrphanedIssueLabels returns the count of IssueLabels which no longer have an existing label
func CountOrphanedIssueLabels() (int64, error) {
return x.Table("issue_label").
Join("LEFT", "label", "issue_label.label_id = label.id").
Where(builder.IsNull{"label.id"}).Count()
}
// DeleteOrphanedIssueLabels deletes IssueLabels which no longer have an existing label
func DeleteOrphanedIssueLabels() error {
_, err := x.In("id", builder.Select("issue_label.id").From("issue_label").
Join("LEFT", "label", "issue_label.label_id = label.id").
Where(builder.IsNull{"label.id"})).
Delete(IssueLabel{})
return err
}
// CountOrphanedIssues count issues without a repo
func CountOrphanedIssues() (int64, error) {
return x.Table("issue").
@@ -295,3 +316,61 @@ func FixNullArchivedRepository() (int64, error) {
IsArchived: false,
})
}
// CountBadSequences looks for broken sequences from recreate-table mistakes
func CountBadSequences() (int64, error) {
if !setting.Database.UsePostgreSQL {
return 0, nil
}
sess := x.NewSession()
defer sess.Close()
var sequences []string
schema := sess.Engine().Dialect().URI().Schema
sess.Engine().SetSchema("")
if err := sess.Table("information_schema.sequences").Cols("sequence_name").Where("sequence_name LIKE 'tmp_recreate__%_id_seq%' AND sequence_catalog = ?", setting.Database.Name).Find(&sequences); err != nil {
return 0, err
}
sess.Engine().SetSchema(schema)
return int64(len(sequences)), nil
}
// FixBadSequences fixes broken sequences from recreate-table mistakes
func FixBadSequences() error {
if !setting.Database.UsePostgreSQL {
return nil
}
sess := x.NewSession()
defer sess.Close()
if err := sess.Begin(); err != nil {
return err
}
var sequences []string
schema := sess.Engine().Dialect().URI().Schema
sess.Engine().SetSchema("")
if err := sess.Table("information_schema.sequences").Cols("sequence_name").Where("sequence_name LIKE 'tmp_recreate__%_id_seq%' AND sequence_catalog = ?", setting.Database.Name).Find(&sequences); err != nil {
return err
}
sess.Engine().SetSchema(schema)
sequenceRegexp := regexp.MustCompile(`tmp_recreate__(\w+)_id_seq.*`)
for _, sequence := range sequences {
tableName := sequenceRegexp.FindStringSubmatch(sequence)[1]
newSequenceName := tableName + "_id_seq"
if _, err := sess.Exec(fmt.Sprintf("ALTER SEQUENCE `%s` RENAME TO `%s`", sequence, newSequenceName)); err != nil {
return err
}
if _, err := sess.Exec(fmt.Sprintf("SELECT setval('%s', COALESCE((SELECT MAX(id)+1 FROM `%s`), 1), false)", newSequenceName, tableName)); err != nil {
return err
}
}
return sess.Commit()
}

View File

@@ -33,3 +33,11 @@
num_issues: 1
num_closed_issues: 0
-
id: 5
repo_id: 10
org_id: 0
name: pull-test-label
color: '#000000'
num_issues: 0
num_closed_issues: 0

View File

@@ -29,3 +29,11 @@
content: content random
is_closed: false
num_issues: 0
-
id: 5
repo_id: 10
name: milestone of repo 10
content: for testing with PRs
is_closed: false
num_issues: 0

View File

@@ -146,6 +146,7 @@
num_closed_issues: 0
num_pulls: 1
num_closed_pulls: 0
num_milestones: 1
is_mirror: false
num_forks: 1
status: 0

View File

@@ -65,7 +65,11 @@ func (key *GPGKey) AfterLoad(session *xorm.Session) {
// ListGPGKeys returns a list of public keys belonging to the given user.
func ListGPGKeys(uid int64, listOptions ListOptions) ([]*GPGKey, error) {
sess := x.Where("owner_id=? AND primary_key_id=''", uid)
return listGPGKeys(x, uid, listOptions)
}
func listGPGKeys(e Engine, uid int64, listOptions ListOptions) ([]*GPGKey, error) {
sess := e.Table(&GPGKey{}).Where("owner_id=? AND primary_key_id=''", uid)
if listOptions.Page != 0 {
sess = listOptions.setSessionPagination(sess)
}

View File

@@ -764,3 +764,15 @@ func DeleteIssueLabel(issue *Issue, label *Label, doer *User) (err error) {
return sess.Commit()
}
func deleteLabelsByRepoID(sess Engine, repoID int64) error {
deleteCond := builder.Select("id").From("label").Where(builder.Eq{"label.repo_id": repoID})
if _, err := sess.In("label_id", deleteCond).
Delete(&IssueLabel{}); err != nil {
return err
}
_, err := sess.Delete(&Label{RepoID: repoID})
return err
}

View File

@@ -16,13 +16,13 @@ type ListOptions struct {
Page int // start from 1
}
func (opts ListOptions) getPaginatedSession() *xorm.Session {
func (opts *ListOptions) getPaginatedSession() *xorm.Session {
opts.setDefaultValues()
return x.Limit(opts.PageSize, (opts.Page-1)*opts.PageSize)
}
func (opts ListOptions) setSessionPagination(sess *xorm.Session) *xorm.Session {
func (opts *ListOptions) setSessionPagination(sess *xorm.Session) *xorm.Session {
opts.setDefaultValues()
if opts.PageSize <= 0 {
@@ -31,21 +31,21 @@ func (opts ListOptions) setSessionPagination(sess *xorm.Session) *xorm.Session {
return sess.Limit(opts.PageSize, (opts.Page-1)*opts.PageSize)
}
func (opts ListOptions) setEnginePagination(e Engine) Engine {
func (opts *ListOptions) setEnginePagination(e Engine) Engine {
opts.setDefaultValues()
return e.Limit(opts.PageSize, (opts.Page-1)*opts.PageSize)
}
// GetStartEnd returns the start and end of the ListOptions
func (opts ListOptions) GetStartEnd() (start, end int) {
func (opts *ListOptions) GetStartEnd() (start, end int) {
opts.setDefaultValues()
start = (opts.Page - 1) * opts.PageSize
end = start + opts.Page
return
}
func (opts ListOptions) setDefaultValues() {
func (opts *ListOptions) setDefaultValues() {
if opts.PageSize <= 0 {
opts.PageSize = setting.API.DefaultPagingNum
}
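
The receiver change in this file is easy to motivate with a toy example: with a value receiver the defaulting mutation is applied to a copy and lost, with a pointer receiver it sticks (illustrative names, not the actual ListOptions).

package main

import "fmt"

type listOptions struct {
	PageSize int
}

// With a value receiver the method mutates a copy, so the caller never
// sees the default being applied.
func (o listOptions) setDefaultsByValue() {
	if o.PageSize <= 0 {
		o.PageSize = 10
	}
}

// With a pointer receiver the mutation is visible to the caller, which
// is why the methods above were switched to *ListOptions.
func (o *listOptions) setDefaultsByPointer() {
	if o.PageSize <= 0 {
		o.PageSize = 10
	}
}

func main() {
	opts := listOptions{}
	opts.setDefaultsByValue()
	fmt.Println(opts.PageSize) // 0: the default was lost
	opts.setDefaultsByPointer()
	fmt.Println(opts.PageSize) // 10
}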

View File

@@ -516,6 +516,31 @@ func recreateTable(sess *xorm.Session, bean interface{}) error {
return err
}
case setting.Database.UsePostgreSQL:
var originalSequences []string
type sequenceData struct {
LastValue int `xorm:"'last_value'"`
IsCalled bool `xorm:"'is_called'"`
}
sequenceMap := map[string]sequenceData{}
schema := sess.Engine().Dialect().URI().Schema
sess.Engine().SetSchema("")
if err := sess.Table("information_schema.sequences").Cols("sequence_name").Where("sequence_name LIKE ? || '_%' AND sequence_catalog = ?", tableName, setting.Database.Name).Find(&originalSequences); err != nil {
log.Error("Unable to rename %s to %s. Error: %v", tempTableName, tableName, err)
return err
}
sess.Engine().SetSchema(schema)
for _, sequence := range originalSequences {
sequenceData := sequenceData{}
if _, err := sess.Table(sequence).Cols("last_value", "is_called").Get(&sequenceData); err != nil {
log.Error("Unable to get last_value and is_called from %s. Error: %v", sequence, err)
return err
}
sequenceMap[sequence] = sequenceData
}
// CASCADE causes postgres to drop all the constraints on the old table
if _, err := sess.Exec(fmt.Sprintf("DROP TABLE `%s` CASCADE", tableName)); err != nil {
log.Error("Unable to drop old table %s. Error: %v", tableName, err)
@@ -529,7 +554,6 @@ func recreateTable(sess *xorm.Session, bean interface{}) error {
}
var indices []string
schema := sess.Engine().Dialect().URI().Schema
sess.Engine().SetSchema("")
if err := sess.Table("pg_indexes").Cols("indexname").Where("tablename = ? ", tableName).Find(&indices); err != nil {
log.Error("Unable to rename %s to %s. Error: %v", tempTableName, tableName, err)
@@ -545,6 +569,43 @@ func recreateTable(sess *xorm.Session, bean interface{}) error {
}
}
var sequences []string
sess.Engine().SetSchema("")
if err := sess.Table("information_schema.sequences").Cols("sequence_name").Where("sequence_name LIKE 'tmp_recreate__' || ? || '_%' AND sequence_catalog = ?", tableName, setting.Database.Name).Find(&sequences); err != nil {
log.Error("Unable to rename %s to %s. Error: %v", tempTableName, tableName, err)
return err
}
sess.Engine().SetSchema(schema)
for _, sequence := range sequences {
newSequenceName := strings.Replace(sequence, "tmp_recreate__", "", 1)
if _, err := sess.Exec(fmt.Sprintf("ALTER SEQUENCE `%s` RENAME TO `%s`", sequence, newSequenceName)); err != nil {
log.Error("Unable to rename %s sequence to %s. Error: %v", sequence, newSequenceName, err)
return err
}
val, ok := sequenceMap[newSequenceName]
if newSequenceName == tableName+"_id_seq" {
if ok && val.LastValue != 0 {
if _, err := sess.Exec(fmt.Sprintf("SELECT setval('%s', %d, %t)", newSequenceName, val.LastValue, val.IsCalled)); err != nil {
log.Error("Unable to reset %s to %d. Error: %v", newSequenceName, val, err)
return err
}
} else {
// We're going to try to guess this
if _, err := sess.Exec(fmt.Sprintf("SELECT setval('%s', COALESCE((SELECT MAX(id)+1 FROM `%s`), 1), false)", newSequenceName, tableName)); err != nil {
log.Error("Unable to reset %s. Error: %v", newSequenceName, err)
return err
}
}
} else if ok {
if _, err := sess.Exec(fmt.Sprintf("SELECT setval('%s', %d, %t)", newSequenceName, val.LastValue, val.IsCalled)); err != nil {
log.Error("Unable to reset %s to %d. Error: %v", newSequenceName, val, err)
return err
}
}
}
case setting.Database.UseMSSQL:
// MSSQL will drop all the constraints on the old table
if _, err := sess.Exec(fmt.Sprintf("DROP TABLE `%s`", tableName)); err != nil {

View File

@@ -8,6 +8,7 @@ import (
"sort"
"code.gitea.io/gitea/modules/auth/oauth2"
"code.gitea.io/gitea/modules/log"
)
// OAuth2Provider describes the display values of a single OAuth2 provider
@@ -135,7 +136,12 @@ func initOAuth2LoginSources() error {
oAuth2Config := source.OAuth2()
err := oauth2.RegisterProvider(source.Name, oAuth2Config.Provider, oAuth2Config.ClientID, oAuth2Config.ClientSecret, oAuth2Config.OpenIDConnectAutoDiscoveryURL, oAuth2Config.CustomURLMapping)
if err != nil {
return err
log.Critical("Unable to register source: %s due to Error: %v. This source will be disabled.", source.Name, err)
source.IsActived = false
if err = UpdateSource(source); err != nil {
log.Critical("Unable to update source %s to disable it. Error: %v", err)
return err
}
}
}
return nil

View File

@@ -1290,11 +1290,44 @@ func IncrementRepoForkNum(ctx DBContext, repoID int64) error {
}
// TransferOwnership transfers all corresponding settings from the old user to the new one.
func TransferOwnership(doer *User, newOwnerName string, repo *Repository) error {
func TransferOwnership(doer *User, newOwnerName string, repo *Repository) (err error) {
repoRenamed := false
wikiRenamed := false
oldOwnerName := doer.Name
defer func() {
if !repoRenamed && !wikiRenamed {
return
}
recoverErr := recover()
if err == nil && recoverErr == nil {
return
}
if repoRenamed {
if err := os.Rename(RepoPath(newOwnerName, repo.Name), RepoPath(oldOwnerName, repo.Name)); err != nil {
log.Critical("Unable to move repository %s/%s directory from %s back to correct place %s: %v", oldOwnerName, repo.Name, RepoPath(newOwnerName, repo.Name), RepoPath(oldOwnerName, repo.Name), err)
}
}
if wikiRenamed {
if err := os.Rename(WikiPath(newOwnerName, repo.Name), WikiPath(oldOwnerName, repo.Name)); err != nil {
log.Critical("Unable to move wiki for repository %s/%s directory from %s back to correct place %s: %v", oldOwnerName, repo.Name, WikiPath(newOwnerName, repo.Name), WikiPath(oldOwnerName, repo.Name), err)
}
}
if recoverErr != nil {
log.Error("Panic within TransferOwnership: %v\n%s", recoverErr, log.Stack(2))
panic(recoverErr)
}
}()
newOwner, err := GetUserByName(newOwnerName)
if err != nil {
return fmt.Errorf("get new owner '%s': %v", newOwnerName, err)
}
newOwnerName = newOwner.Name // ensure capitalisation matches
// Check if new owner has repository with same name.
has, err := IsRepositoryExist(newOwner, repo.Name)
@@ -1311,6 +1344,7 @@ func TransferOwnership(doer *User, newOwnerName string, repo *Repository) error
}
oldOwner := repo.Owner
oldOwnerName = oldOwner.Name
// Note: we have to set value here to make sure recalculate accesses is based on
// new owner.
@@ -1370,9 +1404,9 @@ func TransferOwnership(doer *User, newOwnerName string, repo *Repository) error
}
// Update repository count.
if _, err = sess.Exec("UPDATE `user` SET num_repos=num_repos+1 WHERE id=?", newOwner.ID); err != nil {
if _, err := sess.Exec("UPDATE `user` SET num_repos=num_repos+1 WHERE id=?", newOwner.ID); err != nil {
return fmt.Errorf("increase new owner repository count: %v", err)
} else if _, err = sess.Exec("UPDATE `user` SET num_repos=num_repos-1 WHERE id=?", oldOwner.ID); err != nil {
} else if _, err := sess.Exec("UPDATE `user` SET num_repos=num_repos-1 WHERE id=?", oldOwner.ID); err != nil {
return fmt.Errorf("decrease old owner repository count: %v", err)
}
@@ -1382,7 +1416,7 @@ func TransferOwnership(doer *User, newOwnerName string, repo *Repository) error
// Remove watch for organization.
if oldOwner.IsOrganization() {
if err = watchRepo(sess, oldOwner.ID, repo.ID, false); err != nil {
if err := watchRepo(sess, oldOwner.ID, repo.ID, false); err != nil {
return fmt.Errorf("watchRepo [false]: %v", err)
}
}
@@ -1394,16 +1428,18 @@ func TransferOwnership(doer *User, newOwnerName string, repo *Repository) error
return fmt.Errorf("Failed to create dir %s: %v", dir, err)
}
if err = os.Rename(RepoPath(oldOwner.Name, repo.Name), RepoPath(newOwner.Name, repo.Name)); err != nil {
if err := os.Rename(RepoPath(oldOwner.Name, repo.Name), RepoPath(newOwner.Name, repo.Name)); err != nil {
return fmt.Errorf("rename repository directory: %v", err)
}
repoRenamed = true
// Rename remote wiki repository to new path and delete local copy.
wikiPath := WikiPath(oldOwner.Name, repo.Name)
if com.IsExist(wikiPath) {
if err = os.Rename(wikiPath, WikiPath(newOwner.Name, repo.Name)); err != nil {
if err := os.Rename(wikiPath, WikiPath(newOwner.Name, repo.Name)); err != nil {
return fmt.Errorf("rename repository wiki: %v", err)
}
wikiRenamed = true
}
// If there was previously a redirect at this location, remove it.
@@ -1693,6 +1729,10 @@ func DeleteRepository(doer *User, uid, repoID int64) error {
return fmt.Errorf("deleteBeans: %v", err)
}
if err := deleteLabelsByRepoID(sess, repoID); err != nil {
return err
}
// Delete Issues and related objects
var attachmentPaths []string
if attachmentPaths, err = deleteIssuesByRepoID(sess, repoID); err != nil {

View File

@@ -727,6 +727,7 @@ var (
"assets",
"attachments",
"avatars",
"captcha",
"commits",
"debug",
"error",
@@ -1122,6 +1123,16 @@ func deleteUser(e Engine, u *User) error {
// ***** END: PublicKey *****
// ***** START: GPGPublicKey *****
keys, err := listGPGKeys(e, u.ID, ListOptions{})
if err != nil {
return fmt.Errorf("ListGPGKeys: %v", err)
}
// Delete GPGKeyImport(s).
for _, key := range keys {
if _, err = e.Delete(&GPGKeyImport{KeyID: key.KeyID}); err != nil {
return fmt.Errorf("deleteGPGKeyImports: %v", err)
}
}
if _, err = e.Delete(&GPGKey{OwnerID: u.ID}); err != nil {
return fmt.Errorf("deleteGPGKeys: %v", err)
}
@@ -1612,20 +1623,34 @@ func deleteKeysMarkedForDeletion(keys []string) (bool, error) {
func addLdapSSHPublicKeys(usr *User, s *LoginSource, sshPublicKeys []string) bool {
var sshKeysNeedUpdate bool
for _, sshKey := range sshPublicKeys {
_, _, _, _, err := ssh.ParseAuthorizedKey([]byte(sshKey))
if err == nil {
sshKeyName := fmt.Sprintf("%s-%s", s.Name, sshKey[0:40])
if _, err := AddPublicKey(usr.ID, sshKeyName, sshKey, s.ID); err != nil {
var err error
found := false
keys := []byte(sshKey)
loop:
for len(keys) > 0 && err == nil {
var out ssh.PublicKey
// We ignore options as they are not relevant to Gitea
out, _, _, keys, err = ssh.ParseAuthorizedKey(keys)
if err != nil {
break loop
}
found = true
marshalled := string(ssh.MarshalAuthorizedKey(out))
marshalled = marshalled[:len(marshalled)-1]
sshKeyName := fmt.Sprintf("%s-%s", s.Name, ssh.FingerprintSHA256(out))
if _, err := AddPublicKey(usr.ID, sshKeyName, marshalled, s.ID); err != nil {
if IsErrKeyAlreadyExist(err) {
log.Trace("addLdapSSHPublicKeys[%s]: LDAP Public SSH Key %s already exists for user", s.Name, usr.Name)
log.Trace("addLdapSSHPublicKeys[%s]: LDAP Public SSH Key %s already exists for user", sshKeyName, usr.Name)
} else {
log.Error("addLdapSSHPublicKeys[%s]: Error adding LDAP Public SSH Key for user %s: %v", s.Name, usr.Name, err)
log.Error("addLdapSSHPublicKeys[%s]: Error adding LDAP Public SSH Key for user %s: %v", sshKeyName, usr.Name, err)
}
} else {
log.Trace("addLdapSSHPublicKeys[%s]: Added LDAP Public SSH Key for user %s", s.Name, usr.Name)
log.Trace("addLdapSSHPublicKeys[%s]: Added LDAP Public SSH Key for user %s", sshKeyName, usr.Name)
sshKeysNeedUpdate = true
}
} else {
}
if !found && err != nil {
log.Warn("addLdapSSHPublicKeys[%s]: Skipping invalid LDAP Public SSH Key for user %s: %v", s.Name, usr.Name, sshKey)
}
}

View File

@@ -421,3 +421,71 @@ func TestGetMaileableUsersByIDs(t *testing.T) {
assert.Equal(t, results[1].ID, 4)
}
}
func TestAddLdapSSHPublicKeys(t *testing.T) {
assert.NoError(t, PrepareTestDatabase())
user := AssertExistsAndLoadBean(t, &User{ID: 2}).(*User)
s := &LoginSource{ID: 1}
testCases := []struct {
keyString string
number int
keyContents []string
}{
{
keyString: "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC4cn+iXnA4KvcQYSV88vGn0Yi91vG47t1P7okprVmhNTkipNRIHWr6WdCO4VDr/cvsRkuVJAsLO2enwjGWWueOO6BodiBgyAOZ/5t5nJNMCNuLGT5UIo/RI1b0WRQwxEZTRjt6mFNw6lH14wRd8ulsr9toSWBPMOGWoYs1PDeDL0JuTjL+tr1SZi/EyxCngpYszKdXllJEHyI79KQgeD0Vt3pTrkbNVTOEcCNqZePSVmUH8X8Vhugz3bnE0/iE9Pb5fkWO9c4AnM1FgI/8Bvp27Fw2ShryIXuR6kKvUqhVMTuOSDHwu6A8jLE5Owt3GAYugDpDYuwTVNGrHLXKpPzrGGPE/jPmaLCMZcsdkec95dYeU3zKODEm8UQZFhmJmDeWVJ36nGrGZHL4J5aTTaeFUJmmXDaJYiJ+K2/ioKgXqnXvltu0A9R8/LGy4nrTJRr4JMLuJFoUXvGm1gXQ70w2LSpk6yl71RNC0hCtsBe8BP8IhYCM0EP5jh7eCMQZNvM= nocomment\n",
number: 1,
keyContents: []string{
"ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC4cn+iXnA4KvcQYSV88vGn0Yi91vG47t1P7okprVmhNTkipNRIHWr6WdCO4VDr/cvsRkuVJAsLO2enwjGWWueOO6BodiBgyAOZ/5t5nJNMCNuLGT5UIo/RI1b0WRQwxEZTRjt6mFNw6lH14wRd8ulsr9toSWBPMOGWoYs1PDeDL0JuTjL+tr1SZi/EyxCngpYszKdXllJEHyI79KQgeD0Vt3pTrkbNVTOEcCNqZePSVmUH8X8Vhugz3bnE0/iE9Pb5fkWO9c4AnM1FgI/8Bvp27Fw2ShryIXuR6kKvUqhVMTuOSDHwu6A8jLE5Owt3GAYugDpDYuwTVNGrHLXKpPzrGGPE/jPmaLCMZcsdkec95dYeU3zKODEm8UQZFhmJmDeWVJ36nGrGZHL4J5aTTaeFUJmmXDaJYiJ+K2/ioKgXqnXvltu0A9R8/LGy4nrTJRr4JMLuJFoUXvGm1gXQ70w2LSpk6yl71RNC0hCtsBe8BP8IhYCM0EP5jh7eCMQZNvM=",
},
},
{
keyString: `ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC4cn+iXnA4KvcQYSV88vGn0Yi91vG47t1P7okprVmhNTkipNRIHWr6WdCO4VDr/cvsRkuVJAsLO2enwjGWWueOO6BodiBgyAOZ/5t5nJNMCNuLGT5UIo/RI1b0WRQwxEZTRjt6mFNw6lH14wRd8ulsr9toSWBPMOGWoYs1PDeDL0JuTjL+tr1SZi/EyxCngpYszKdXllJEHyI79KQgeD0Vt3pTrkbNVTOEcCNqZePSVmUH8X8Vhugz3bnE0/iE9Pb5fkWO9c4AnM1FgI/8Bvp27Fw2ShryIXuR6kKvUqhVMTuOSDHwu6A8jLE5Owt3GAYugDpDYuwTVNGrHLXKpPzrGGPE/jPmaLCMZcsdkec95dYeU3zKODEm8UQZFhmJmDeWVJ36nGrGZHL4J5aTTaeFUJmmXDaJYiJ+K2/ioKgXqnXvltu0A9R8/LGy4nrTJRr4JMLuJFoUXvGm1gXQ70w2LSpk6yl71RNC0hCtsBe8BP8IhYCM0EP5jh7eCMQZNvM= nocomment
ssh-dss AAAAB3NzaC1kc3MAAACBAOChCC7lf6Uo9n7BmZ6M8St19PZf4Tn59NriyboW2x/DZuYAz3ibZ2OkQ3S0SqDIa0HXSEJ1zaExQdmbO+Ux/wsytWZmCczWOVsaszBZSl90q8UnWlSH6P+/YA+RWJm5SFtuV9PtGIhyZgoNuz5kBQ7K139wuQsecdKktISwTakzAAAAFQCzKsO2JhNKlL+wwwLGOcLffoAmkwAAAIBpK7/3xvduajLBD/9vASqBQIHrgK2J+wiQnIb/Wzy0UsVmvfn8A+udRbBo+csM8xrSnlnlJnjkJS3qiM5g+eTwsLIV1IdKPEwmwB+VcP53Cw6lSyWyJcvhFb0N6s08NZysLzvj0N+ZC/FnhKTLzIyMtkHf/IrPCwlM+pV/M/96YgAAAIEAqQcGn9CKgzgPaguIZooTAOQdvBLMI5y0bQjOW6734XOpqQGf/Kra90wpoasLKZjSYKNPjE+FRUOrStLrxcNs4BeVKhy2PYTRnybfYVk1/dmKgH6P1YSRONsGKvTsH6c5IyCRG0ncCgYeF8tXppyd642982daopE7zQ/NPAnJfag= nocomment`,
number: 2,
keyContents: []string{
"ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC4cn+iXnA4KvcQYSV88vGn0Yi91vG47t1P7okprVmhNTkipNRIHWr6WdCO4VDr/cvsRkuVJAsLO2enwjGWWueOO6BodiBgyAOZ/5t5nJNMCNuLGT5UIo/RI1b0WRQwxEZTRjt6mFNw6lH14wRd8ulsr9toSWBPMOGWoYs1PDeDL0JuTjL+tr1SZi/EyxCngpYszKdXllJEHyI79KQgeD0Vt3pTrkbNVTOEcCNqZePSVmUH8X8Vhugz3bnE0/iE9Pb5fkWO9c4AnM1FgI/8Bvp27Fw2ShryIXuR6kKvUqhVMTuOSDHwu6A8jLE5Owt3GAYugDpDYuwTVNGrHLXKpPzrGGPE/jPmaLCMZcsdkec95dYeU3zKODEm8UQZFhmJmDeWVJ36nGrGZHL4J5aTTaeFUJmmXDaJYiJ+K2/ioKgXqnXvltu0A9R8/LGy4nrTJRr4JMLuJFoUXvGm1gXQ70w2LSpk6yl71RNC0hCtsBe8BP8IhYCM0EP5jh7eCMQZNvM=",
"ssh-dss AAAAB3NzaC1kc3MAAACBAOChCC7lf6Uo9n7BmZ6M8St19PZf4Tn59NriyboW2x/DZuYAz3ibZ2OkQ3S0SqDIa0HXSEJ1zaExQdmbO+Ux/wsytWZmCczWOVsaszBZSl90q8UnWlSH6P+/YA+RWJm5SFtuV9PtGIhyZgoNuz5kBQ7K139wuQsecdKktISwTakzAAAAFQCzKsO2JhNKlL+wwwLGOcLffoAmkwAAAIBpK7/3xvduajLBD/9vASqBQIHrgK2J+wiQnIb/Wzy0UsVmvfn8A+udRbBo+csM8xrSnlnlJnjkJS3qiM5g+eTwsLIV1IdKPEwmwB+VcP53Cw6lSyWyJcvhFb0N6s08NZysLzvj0N+ZC/FnhKTLzIyMtkHf/IrPCwlM+pV/M/96YgAAAIEAqQcGn9CKgzgPaguIZooTAOQdvBLMI5y0bQjOW6734XOpqQGf/Kra90wpoasLKZjSYKNPjE+FRUOrStLrxcNs4BeVKhy2PYTRnybfYVk1/dmKgH6P1YSRONsGKvTsH6c5IyCRG0ncCgYeF8tXppyd642982daopE7zQ/NPAnJfag=",
},
},
{
keyString: `ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC4cn+iXnA4KvcQYSV88vGn0Yi91vG47t1P7okprVmhNTkipNRIHWr6WdCO4VDr/cvsRkuVJAsLO2enwjGWWueOO6BodiBgyAOZ/5t5nJNMCNuLGT5UIo/RI1b0WRQwxEZTRjt6mFNw6lH14wRd8ulsr9toSWBPMOGWoYs1PDeDL0JuTjL+tr1SZi/EyxCngpYszKdXllJEHyI79KQgeD0Vt3pTrkbNVTOEcCNqZePSVmUH8X8Vhugz3bnE0/iE9Pb5fkWO9c4AnM1FgI/8Bvp27Fw2ShryIXuR6kKvUqhVMTuOSDHwu6A8jLE5Owt3GAYugDpDYuwTVNGrHLXKpPzrGGPE/jPmaLCMZcsdkec95dYeU3zKODEm8UQZFhmJmDeWVJ36nGrGZHL4J5aTTaeFUJmmXDaJYiJ+K2/ioKgXqnXvltu0A9R8/LGy4nrTJRr4JMLuJFoUXvGm1gXQ70w2LSpk6yl71RNC0hCtsBe8BP8IhYCM0EP5jh7eCMQZNvM= nocomment
# comment asmdna,ndp
ssh-dss AAAAB3NzaC1kc3MAAACBAOChCC7lf6Uo9n7BmZ6M8St19PZf4Tn59NriyboW2x/DZuYAz3ibZ2OkQ3S0SqDIa0HXSEJ1zaExQdmbO+Ux/wsytWZmCczWOVsaszBZSl90q8UnWlSH6P+/YA+RWJm5SFtuV9PtGIhyZgoNuz5kBQ7K139wuQsecdKktISwTakzAAAAFQCzKsO2JhNKlL+wwwLGOcLffoAmkwAAAIBpK7/3xvduajLBD/9vASqBQIHrgK2J+wiQnIb/Wzy0UsVmvfn8A+udRbBo+csM8xrSnlnlJnjkJS3qiM5g+eTwsLIV1IdKPEwmwB+VcP53Cw6lSyWyJcvhFb0N6s08NZysLzvj0N+ZC/FnhKTLzIyMtkHf/IrPCwlM+pV/M/96YgAAAIEAqQcGn9CKgzgPaguIZooTAOQdvBLMI5y0bQjOW6734XOpqQGf/Kra90wpoasLKZjSYKNPjE+FRUOrStLrxcNs4BeVKhy2PYTRnybfYVk1/dmKgH6P1YSRONsGKvTsH6c5IyCRG0ncCgYeF8tXppyd642982daopE7zQ/NPAnJfag= nocomment`,
number: 2,
keyContents: []string{
"ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC4cn+iXnA4KvcQYSV88vGn0Yi91vG47t1P7okprVmhNTkipNRIHWr6WdCO4VDr/cvsRkuVJAsLO2enwjGWWueOO6BodiBgyAOZ/5t5nJNMCNuLGT5UIo/RI1b0WRQwxEZTRjt6mFNw6lH14wRd8ulsr9toSWBPMOGWoYs1PDeDL0JuTjL+tr1SZi/EyxCngpYszKdXllJEHyI79KQgeD0Vt3pTrkbNVTOEcCNqZePSVmUH8X8Vhugz3bnE0/iE9Pb5fkWO9c4AnM1FgI/8Bvp27Fw2ShryIXuR6kKvUqhVMTuOSDHwu6A8jLE5Owt3GAYugDpDYuwTVNGrHLXKpPzrGGPE/jPmaLCMZcsdkec95dYeU3zKODEm8UQZFhmJmDeWVJ36nGrGZHL4J5aTTaeFUJmmXDaJYiJ+K2/ioKgXqnXvltu0A9R8/LGy4nrTJRr4JMLuJFoUXvGm1gXQ70w2LSpk6yl71RNC0hCtsBe8BP8IhYCM0EP5jh7eCMQZNvM=",
"ssh-dss AAAAB3NzaC1kc3MAAACBAOChCC7lf6Uo9n7BmZ6M8St19PZf4Tn59NriyboW2x/DZuYAz3ibZ2OkQ3S0SqDIa0HXSEJ1zaExQdmbO+Ux/wsytWZmCczWOVsaszBZSl90q8UnWlSH6P+/YA+RWJm5SFtuV9PtGIhyZgoNuz5kBQ7K139wuQsecdKktISwTakzAAAAFQCzKsO2JhNKlL+wwwLGOcLffoAmkwAAAIBpK7/3xvduajLBD/9vASqBQIHrgK2J+wiQnIb/Wzy0UsVmvfn8A+udRbBo+csM8xrSnlnlJnjkJS3qiM5g+eTwsLIV1IdKPEwmwB+VcP53Cw6lSyWyJcvhFb0N6s08NZysLzvj0N+ZC/FnhKTLzIyMtkHf/IrPCwlM+pV/M/96YgAAAIEAqQcGn9CKgzgPaguIZooTAOQdvBLMI5y0bQjOW6734XOpqQGf/Kra90wpoasLKZjSYKNPjE+FRUOrStLrxcNs4BeVKhy2PYTRnybfYVk1/dmKgH6P1YSRONsGKvTsH6c5IyCRG0ncCgYeF8tXppyd642982daopE7zQ/NPAnJfag=",
},
},
{
keyString: `ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC4cn+iXnA4KvcQYSV88vGn0Yi91vG47t1P7okprVmhNTkipNRIHWr6WdCO4VDr/cvsRkuVJAsLO2enwjGWWueOO6BodiBgyAOZ/5t5nJNMCNuLGT5UIo/RI1b0WRQwxEZTRjt6mFNw6lH14wRd8ulsr9toSWBPMOGWoYs1PDeDL0JuTjL+tr1SZi/EyxCngpYszKdXllJEHyI79KQgeD0Vt3pTrkbNVTOEcCNqZePSVmUH8X8Vhugz3bnE0/iE9Pb5fkWO9c4AnM1FgI/8Bvp27Fw2ShryIXuR6kKvUqhVMTuOSDHwu6A8jLE5Owt3GAYugDpDYuwTVNGrHLXKpPzrGGPE/jPmaLCMZcsdkec95dYeU3zKODEm8UQZFhmJmDeWVJ36nGrGZHL4J5aTTaeFUJmmXDaJYiJ+K2/ioKgXqnXvltu0A9R8/LGy4nrTJRr4JMLuJFoUXvGm1gXQ70w2LSpk6yl71RNC0hCtsBe8BP8IhYCM0EP5jh7eCMQZNvM= nocomment
382488320jasdj1lasmva/vasodifipi4193-fksma.cm
ssh-dss AAAAB3NzaC1kc3MAAACBAOChCC7lf6Uo9n7BmZ6M8St19PZf4Tn59NriyboW2x/DZuYAz3ibZ2OkQ3S0SqDIa0HXSEJ1zaExQdmbO+Ux/wsytWZmCczWOVsaszBZSl90q8UnWlSH6P+/YA+RWJm5SFtuV9PtGIhyZgoNuz5kBQ7K139wuQsecdKktISwTakzAAAAFQCzKsO2JhNKlL+wwwLGOcLffoAmkwAAAIBpK7/3xvduajLBD/9vASqBQIHrgK2J+wiQnIb/Wzy0UsVmvfn8A+udRbBo+csM8xrSnlnlJnjkJS3qiM5g+eTwsLIV1IdKPEwmwB+VcP53Cw6lSyWyJcvhFb0N6s08NZysLzvj0N+ZC/FnhKTLzIyMtkHf/IrPCwlM+pV/M/96YgAAAIEAqQcGn9CKgzgPaguIZooTAOQdvBLMI5y0bQjOW6734XOpqQGf/Kra90wpoasLKZjSYKNPjE+FRUOrStLrxcNs4BeVKhy2PYTRnybfYVk1/dmKgH6P1YSRONsGKvTsH6c5IyCRG0ncCgYeF8tXppyd642982daopE7zQ/NPAnJfag= nocomment`,
number: 2,
keyContents: []string{
"ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC4cn+iXnA4KvcQYSV88vGn0Yi91vG47t1P7okprVmhNTkipNRIHWr6WdCO4VDr/cvsRkuVJAsLO2enwjGWWueOO6BodiBgyAOZ/5t5nJNMCNuLGT5UIo/RI1b0WRQwxEZTRjt6mFNw6lH14wRd8ulsr9toSWBPMOGWoYs1PDeDL0JuTjL+tr1SZi/EyxCngpYszKdXllJEHyI79KQgeD0Vt3pTrkbNVTOEcCNqZePSVmUH8X8Vhugz3bnE0/iE9Pb5fkWO9c4AnM1FgI/8Bvp27Fw2ShryIXuR6kKvUqhVMTuOSDHwu6A8jLE5Owt3GAYugDpDYuwTVNGrHLXKpPzrGGPE/jPmaLCMZcsdkec95dYeU3zKODEm8UQZFhmJmDeWVJ36nGrGZHL4J5aTTaeFUJmmXDaJYiJ+K2/ioKgXqnXvltu0A9R8/LGy4nrTJRr4JMLuJFoUXvGm1gXQ70w2LSpk6yl71RNC0hCtsBe8BP8IhYCM0EP5jh7eCMQZNvM=",
"ssh-dss AAAAB3NzaC1kc3MAAACBAOChCC7lf6Uo9n7BmZ6M8St19PZf4Tn59NriyboW2x/DZuYAz3ibZ2OkQ3S0SqDIa0HXSEJ1zaExQdmbO+Ux/wsytWZmCczWOVsaszBZSl90q8UnWlSH6P+/YA+RWJm5SFtuV9PtGIhyZgoNuz5kBQ7K139wuQsecdKktISwTakzAAAAFQCzKsO2JhNKlL+wwwLGOcLffoAmkwAAAIBpK7/3xvduajLBD/9vASqBQIHrgK2J+wiQnIb/Wzy0UsVmvfn8A+udRbBo+csM8xrSnlnlJnjkJS3qiM5g+eTwsLIV1IdKPEwmwB+VcP53Cw6lSyWyJcvhFb0N6s08NZysLzvj0N+ZC/FnhKTLzIyMtkHf/IrPCwlM+pV/M/96YgAAAIEAqQcGn9CKgzgPaguIZooTAOQdvBLMI5y0bQjOW6734XOpqQGf/Kra90wpoasLKZjSYKNPjE+FRUOrStLrxcNs4BeVKhy2PYTRnybfYVk1/dmKgH6P1YSRONsGKvTsH6c5IyCRG0ncCgYeF8tXppyd642982daopE7zQ/NPAnJfag=",
},
},
}
for i, kase := range testCases {
s.ID = int64(i) + 20
addLdapSSHPublicKeys(user, s, []string{kase.keyString})
keys, err := ListPublicLdapSSHKeys(user.ID, s.ID)
assert.NoError(t, err)
if err != nil {
continue
}
assert.Equal(t, kase.number, len(keys))
for _, key := range keys {
assert.Contains(t, kase.keyContents, key.Content)
}
for _, key := range keys {
DeletePublicKey(user, key.ID)
}
}
}

View File

@@ -30,6 +30,9 @@ var (
// aliasMap provides a map of the alias to its emoji data.
aliasMap map[string]int
// emptyReplacer is the string replacer that replaces each emoji with itself (used to detect whether any emoji is present).
emptyReplacer *strings.Replacer
// codeReplacer is the string replacer for emoji codes.
codeReplacer *strings.Replacer
@@ -49,6 +52,7 @@ func loadMap() {
// process emoji codes and aliases
codePairs := make([]string, 0)
emptyPairs := make([]string, 0)
aliasPairs := make([]string, 0)
// sort from largest to small so we match combined emoji first
@@ -64,6 +68,7 @@ func loadMap() {
// setup codes
codeMap[e.Emoji] = i
codePairs = append(codePairs, e.Emoji, ":"+e.Aliases[0]+":")
emptyPairs = append(emptyPairs, e.Emoji, e.Emoji)
// setup aliases
for _, a := range e.Aliases {
@@ -77,6 +82,7 @@ func loadMap() {
}
// create replacers
emptyReplacer = strings.NewReplacer(emptyPairs...)
codeReplacer = strings.NewReplacer(codePairs...)
aliasReplacer = strings.NewReplacer(aliasPairs...)
})
@@ -127,38 +133,53 @@ func ReplaceAliases(s string) string {
return aliasReplacer.Replace(s)
}
type rememberSecondWriteWriter struct {
pos int
idx int
end int
writecount int
}
func (n *rememberSecondWriteWriter) Write(p []byte) (int, error) {
n.writecount++
if n.writecount == 2 {
n.idx = n.pos
n.end = n.pos + len(p)
}
n.pos += len(p)
return len(p), nil
}
func (n *rememberSecondWriteWriter) WriteString(s string) (int, error) {
n.writecount++
if n.writecount == 2 {
n.idx = n.pos
n.end = n.pos + len(s)
}
n.pos += len(s)
return len(s), nil
}
// FindEmojiSubmatchIndex returns index pair of longest emoji in a string
func FindEmojiSubmatchIndex(s string) []int {
loadMap()
found := make(map[int]int)
keys := make([]int, 0)
secondWriteWriter := rememberSecondWriteWriter{}
//see if there are any emoji in string before looking for position of specific ones
//no performance difference when there is a match but 10x faster when there are not
if s == ReplaceCodes(s) {
// A faster and clean implementation would copy the trie tree formation in strings.NewReplacer but
// we can be lazy here.
//
// The implementation of strings.Replacer.WriteString is such that the first index of the emoji
// submatch is simply the second thing that is written to WriteString in the writer.
//
// Therefore we can simply take the index of the second write as our first emoji
//
// FIXME: just copy the trie implementation from strings.NewReplacer
_, _ = emptyReplacer.WriteString(&secondWriteWriter, s)
// if we wrote less than twice then we never "replaced"
if secondWriteWriter.writecount < 2 {
return nil
}
// get index of first emoji occurrence while also checking for longest combination
for j := range GemojiData {
i := strings.Index(s, GemojiData[j].Emoji)
if i != -1 {
if _, ok := found[i]; !ok {
if len(keys) == 0 || i < keys[0] {
found[i] = j
keys = []int{i}
}
if i == 0 {
break
}
}
}
}
if len(keys) > 0 {
index := keys[0]
return []int{index, index + len(GemojiData[found[index]].Emoji)}
}
return nil
return []int{secondWriteWriter.idx, secondWriteWriter.end}
}
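Editor's note: the second-write trick above leans on an implementation detail of strings.Replacer: for the first match, the replacer writes the (possibly empty) unmatched prefix first and the replacement second, so the second write always starts at the first match's byte offset. A minimal, self-contained sketch of that behaviour, using a single thumbs-up pair instead of the full emoji table:

package main

import (
	"fmt"
	"strings"
)

// secondWrite records the byte span of the second Write call, which for a
// strings.Replacer corresponds to the first replaced match.
type secondWrite struct {
	pos, idx, end, count int
}

func (w *secondWrite) Write(p []byte) (int, error) {
	w.count++
	if w.count == 2 {
		w.idx, w.end = w.pos, w.pos+len(p)
	}
	w.pos += len(p)
	return len(p), nil
}

func main() {
	// Replace the emoji with itself so byte offsets are preserved.
	r := strings.NewReplacer("\U0001f44d", "\U0001f44d")
	w := &secondWrite{}
	_, _ = r.WriteString(w, " \U0001f44d")
	fmt.Println(w.idx, w.end) // 1 5: the emoji spans bytes 1..5
}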

View File

@@ -8,6 +8,8 @@ package emoji
import (
"reflect"
"testing"
"github.com/stretchr/testify/assert"
)
func TestDumpInfo(t *testing.T) {
@@ -65,3 +67,34 @@ func TestReplacers(t *testing.T) {
}
}
}
func TestFindEmojiSubmatchIndex(t *testing.T) {
type testcase struct {
teststring string
expected []int
}
testcases := []testcase{
{
"\U0001f44d",
[]int{0, len("\U0001f44d")},
},
{
"\U0001f44d +1 \U0001f44d \U0001f37a",
[]int{0, 4},
},
{
" \U0001f44d",
[]int{1, 1 + len("\U0001f44d")},
},
{
string([]byte{'\u0001'}) + "\U0001f44d",
[]int{1, 1 + len("\U0001f44d")},
},
}
for _, kase := range testcases {
actual := FindEmojiSubmatchIndex(kase.teststring)
assert.Equal(t, kase.expected, actual)
}
}

View File

@@ -9,6 +9,7 @@ import (
"bufio"
"bytes"
"container/list"
"errors"
"fmt"
"image"
"image/color"
@@ -17,6 +18,7 @@ import (
_ "image/png" // for processing png images
"io"
"net/http"
"os/exec"
"strconv"
"strings"
@@ -309,23 +311,33 @@ func (c *Commit) CommitsBefore() (*list.List, error) {
// HasPreviousCommit returns true if a given commitHash is contained in commit's parents
func (c *Commit) HasPreviousCommit(commitHash SHA1) (bool, error) {
for i := 0; i < c.ParentCount(); i++ {
commit, err := c.Parent(i)
if err != nil {
return false, err
}
if commit.ID == commitHash {
return true, nil
}
commitInParentCommit, err := commit.HasPreviousCommit(commitHash)
if err != nil {
return false, err
}
if commitInParentCommit {
return true, nil
}
this := c.ID.String()
that := commitHash.String()
if this == that {
return false, nil
}
return false, nil
if err := CheckGitVersionConstraint(">= 1.8.0"); err == nil {
_, err := NewCommand("merge-base", "--is-ancestor", that, this).RunInDir(c.repo.Path)
if err == nil {
return true, nil
}
var exitError *exec.ExitError
if errors.As(err, &exitError) {
if exitError.ProcessState.ExitCode() == 1 && len(exitError.Stderr) == 0 {
return false, nil
}
}
return false, err
}
result, err := NewCommand("rev-list", "--ancestry-path", "-n1", that+".."+this, "--").RunInDir(c.repo.Path)
if err != nil {
return false, err
}
return len(strings.TrimSpace(result)) > 0, nil
}
// CommitsBeforeLimit returns num commits before current revision
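Editor's note: the rewrite above delegates ancestry checks to git itself. `git merge-base --is-ancestor` exits 0 when the commit is an ancestor and 1 (with empty stderr) when it is not, so only other exit states are real errors. A standalone sketch of that exit-code handling, assuming only a git binary on PATH; isAncestor is an illustrative name, not the real API:

package main

import (
	"errors"
	"fmt"
	"os/exec"
)

// isAncestor reports whether commit a is an ancestor of commit b in the
// repository at repoPath, using `git merge-base --is-ancestor` (git >= 1.8.0).
// Exit code 0 means "is ancestor"; exit code 1 means "is not".
// (The backport also requires empty stderr; with Run() stderr is not captured.)
func isAncestor(repoPath, a, b string) (bool, error) {
	cmd := exec.Command("git", "merge-base", "--is-ancestor", a, b)
	cmd.Dir = repoPath
	err := cmd.Run()
	if err == nil {
		return true, nil
	}
	var exitErr *exec.ExitError
	if errors.As(err, &exitErr) && exitErr.ExitCode() == 1 {
		return false, nil
	}
	return false, err
}

func main() {
	ok, err := isAncestor(".", "HEAD~1", "HEAD")
	fmt.Println(ok, err) // in most checkouts: true <nil>
}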

View File

@@ -125,30 +125,39 @@ var hunkRegex = regexp.MustCompile(`^@@ -(?P<beginOld>[0-9]+)(,(?P<endOld>[0-9]+
const cmdDiffHead = "diff --git "
func isHeader(lof string) bool {
return strings.HasPrefix(lof, cmdDiffHead) || strings.HasPrefix(lof, "---") || strings.HasPrefix(lof, "+++")
func isHeader(lof string, inHunk bool) bool {
return strings.HasPrefix(lof, cmdDiffHead) || (!inHunk && (strings.HasPrefix(lof, "---") || strings.HasPrefix(lof, "+++")))
}
// CutDiffAroundLine cuts the diff of a file so that only the given line and numbersOfLine lines above it are shown
// it also recalculates hunks and adds the appropriate headers to the new diff.
// Warning: Only one-file diffs are allowed.
func CutDiffAroundLine(originalDiff io.Reader, line int64, old bool, numbersOfLine int) string {
func CutDiffAroundLine(originalDiff io.Reader, line int64, old bool, numbersOfLine int) (string, error) {
if line == 0 || numbersOfLine == 0 {
// no line or num of lines => no diff
return ""
return "", nil
}
scanner := bufio.NewScanner(originalDiff)
hunk := make([]string, 0)
// begin is the start of the hunk containing searched line
// end is the end of the hunk ...
// currentLine is the line number on the side of the searched line (differentiated by old)
// otherLine is the line number on the opposite side of the searched line (differentiated by old)
var begin, end, currentLine, otherLine int64
var headerLines int
inHunk := false
for scanner.Scan() {
lof := scanner.Text()
// Add header to enable parsing
if isHeader(lof) {
if isHeader(lof, inHunk) {
if strings.HasPrefix(lof, cmdDiffHead) {
inHunk = false
}
hunk = append(hunk, lof)
headerLines++
}
@@ -157,6 +166,7 @@ func CutDiffAroundLine(originalDiff io.Reader, line int64, old bool, numbersOfLi
}
// Detect "hunk" with contains commented lof
if strings.HasPrefix(lof, "@@") {
inHunk = true
// Already got our hunk. End of hunk detected!
if len(hunk) > headerLines {
break
@@ -213,15 +223,19 @@ func CutDiffAroundLine(originalDiff io.Reader, line int64, old bool, numbersOfLi
}
}
}
err := scanner.Err()
if err != nil {
return "", err
}
// No hunk found
if currentLine == 0 {
return ""
return "", nil
}
// headerLines + hunkLine (1) = totalNonCodeLines
if len(hunk)-headerLines-1 <= numbersOfLine {
// No need to cut the hunk => return existing hunk
return strings.Join(hunk, "\n")
return strings.Join(hunk, "\n"), nil
}
var oldBegin, oldNumOfLines, newBegin, newNumOfLines int64
if old {
@@ -256,5 +270,5 @@ func CutDiffAroundLine(originalDiff io.Reader, line int64, old bool, numbersOfLi
// construct the new hunk header
newHunk[headerLines] = fmt.Sprintf("@@ -%d,%d +%d,%d @@",
oldBegin, oldNumOfLines, newBegin, newNumOfLines)
return strings.Join(newHunk, "\n")
return strings.Join(newHunk, "\n"), nil
}
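Editor's note: the new inHunk flag exists because "---" is ambiguous in a unified diff. Outside a hunk it is a file header; inside a hunk, a deleted line whose content starts with "--" (an SQL comment, say) also renders as "---…", which is exactly what the breakingDiff test below exercises. A small sketch of the corrected check:

package main

import (
	"fmt"
	"strings"
)

const cmdDiffHead = "diff --git "

// isHeader mirrors the fix: "---"/"+++" only count as file headers when we
// are not inside a hunk, because a deleted line whose content starts with
// "--" also begins with "---" once the diff marker is prepended.
func isHeader(lof string, inHunk bool) bool {
	return strings.HasPrefix(lof, cmdDiffHead) ||
		(!inHunk && (strings.HasPrefix(lof, "---") || strings.HasPrefix(lof, "+++")))
}

func main() {
	fmt.Println(isHeader("--- a/aaa.sql", false))     // true: real file header
	fmt.Println(isHeader("--- some comment 5", true)) // false: deleted SQL comment in a hunk
}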

View File

@@ -23,8 +23,28 @@ const exampleDiff = `diff --git a/README.md b/README.md
+ cut off
+ cut off`
const breakingDiff = `diff --git a/aaa.sql b/aaa.sql
index d8e4c92..19dc8ad 100644
--- a/aaa.sql
+++ b/aaa.sql
@@ -1,9 +1,10 @@
--some comment
--- some comment 5
+--some coment 2
+-- some comment 3
create or replace procedure test(p1 varchar2)
is
begin
---new comment
dbms_output.put_line(p1);
+--some other comment
end;
/
`
func TestCutDiffAroundLine(t *testing.T) {
result := CutDiffAroundLine(strings.NewReader(exampleDiff), 4, false, 3)
result, err := CutDiffAroundLine(strings.NewReader(exampleDiff), 4, false, 3)
assert.NoError(t, err)
resultByLine := strings.Split(result, "\n")
assert.Len(t, resultByLine, 7)
// Check if headers got transferred
@@ -37,18 +57,50 @@ func TestCutDiffAroundLine(t *testing.T) {
assert.Equal(t, "+ Build Status", resultByLine[4])
// Must be same result as before since old line 3 == new line 5
newResult := CutDiffAroundLine(strings.NewReader(exampleDiff), 3, true, 3)
newResult, err := CutDiffAroundLine(strings.NewReader(exampleDiff), 3, true, 3)
assert.NoError(t, err)
assert.Equal(t, result, newResult, "Must be same result as before since old line 3 == new line 5")
newResult = CutDiffAroundLine(strings.NewReader(exampleDiff), 6, false, 300)
newResult, err = CutDiffAroundLine(strings.NewReader(exampleDiff), 6, false, 300)
assert.NoError(t, err)
assert.Equal(t, exampleDiff, newResult)
emptyResult := CutDiffAroundLine(strings.NewReader(exampleDiff), 6, false, 0)
emptyResult, err := CutDiffAroundLine(strings.NewReader(exampleDiff), 6, false, 0)
assert.NoError(t, err)
assert.Empty(t, emptyResult)
// Line is out of scope
emptyResult = CutDiffAroundLine(strings.NewReader(exampleDiff), 434, false, 0)
emptyResult, err = CutDiffAroundLine(strings.NewReader(exampleDiff), 434, false, 0)
assert.NoError(t, err)
assert.Empty(t, emptyResult)
// Handle minus diffs properly
minusDiff, err := CutDiffAroundLine(strings.NewReader(breakingDiff), 2, false, 4)
assert.NoError(t, err)
expected := `diff --git a/aaa.sql b/aaa.sql
--- a/aaa.sql
+++ b/aaa.sql
@@ -1,9 +1,10 @@
--some comment
--- some comment 5
+--some coment 2`
assert.Equal(t, expected, minusDiff)
// Handle minus diffs properly
minusDiff, err = CutDiffAroundLine(strings.NewReader(breakingDiff), 3, false, 4)
assert.NoError(t, err)
expected = `diff --git a/aaa.sql b/aaa.sql
--- a/aaa.sql
+++ b/aaa.sql
@@ -1,9 +1,10 @@
--some comment
--- some comment 5
+--some coment 2
+-- some comment 3`
assert.Equal(t, expected, minusDiff)
}
func BenchmarkCutDiffAroundLine(b *testing.B) {
@@ -69,7 +121,7 @@ func ExampleCutDiffAroundLine() {
Docker Pulls
+ cut off
+ cut off`
result := CutDiffAroundLine(strings.NewReader(diff), 4, false, 3)
result, _ := CutDiffAroundLine(strings.NewReader(diff), 4, false, 3)
println(result)
}

View File

@@ -9,6 +9,8 @@ import (
"bytes"
"container/list"
"fmt"
"io"
"io/ioutil"
"strconv"
"strings"
@@ -129,19 +131,23 @@ func (repo *Repository) getCommit(id SHA1) (*Commit, error) {
// ConvertToSHA1 returns a Hash object from a potential ID string
func (repo *Repository) ConvertToSHA1(commitID string) (SHA1, error) {
if len(commitID) != 40 {
var err error
actualCommitID, err := NewCommand("rev-parse", "--verify", commitID).RunInDir(repo.Path)
if err != nil {
if strings.Contains(err.Error(), "unknown revision or path") ||
strings.Contains(err.Error(), "fatal: Needed a single revision") {
return SHA1{}, ErrNotExist{commitID, ""}
}
return SHA1{}, err
if len(commitID) == 40 {
sha1, err := NewIDFromString(commitID)
if err == nil {
return sha1, nil
}
commitID = actualCommitID
}
return NewIDFromString(commitID)
actualCommitID, err := NewCommand("rev-parse", "--verify", commitID).RunInDir(repo.Path)
if err != nil {
if strings.Contains(err.Error(), "unknown revision or path") ||
strings.Contains(err.Error(), "fatal: Needed a single revision") {
return SHA1{}, ErrNotExist{commitID, ""}
}
return SHA1{}, err
}
return NewIDFromString(actualCommitID)
}
// GetCommit returns the commit object for the given ID string.
@@ -323,8 +329,41 @@ func (repo *Repository) FileCommitsCount(revision, file string) (int64, error) {
// CommitsByFileAndRange returns the commits for the given revision, file, and page
func (repo *Repository) CommitsByFileAndRange(revision, file string, page int) (*list.List, error) {
stdout, err := NewCommand("log", revision, "--follow", "--skip="+strconv.Itoa((page-1)*50),
"--max-count="+strconv.Itoa(CommitsRangeSize), prettyLogFormat, "--", file).RunInDirBytes(repo.Path)
skip := (page - 1) * CommitsRangeSize
stdoutReader, stdoutWriter := io.Pipe()
defer func() {
_ = stdoutReader.Close()
_ = stdoutWriter.Close()
}()
go func() {
stderr := strings.Builder{}
err := NewCommand("log", revision, "--follow",
"--max-count="+strconv.Itoa(CommitsRangeSize*page),
prettyLogFormat, "--", file).
RunInDirPipeline(repo.Path, stdoutWriter, &stderr)
if err != nil {
if stderr.Len() > 0 {
err = fmt.Errorf("%v - %s", err, stderr.String())
}
_ = stdoutWriter.CloseWithError(err)
} else {
_ = stdoutWriter.Close()
}
}()
if skip > 0 {
_, err := io.CopyN(ioutil.Discard, stdoutReader, int64(skip*41))
if err != nil {
if err == io.EOF {
return list.New(), nil
}
_ = stdoutReader.CloseWithError(err)
return nil, err
}
}
stdout, err := ioutil.ReadAll(stdoutReader)
if err != nil {
return nil, err
}
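Editor's note: instead of passing --skip to git, the new CommitsByFileAndRange discards the first pages directly from the pipe. With the bare SHA pretty format, each commit is exactly one 40-character hash plus a newline, i.e. 41 bytes, which is where skip*41 comes from. A toy illustration of that arithmetic with a simulated stream:

package main

import (
	"fmt"
	"io"
	"io/ioutil"
	"strings"
)

func main() {
	// Five fake commits of 41 bytes each; skipping two pages-worth means
	// discarding skip*41 bytes before reading the remainder.
	const shaLine = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa\n"
	r := strings.NewReader(strings.Repeat(shaLine, 5))
	skip := 2
	if _, err := io.CopyN(ioutil.Discard, r, int64(skip*41)); err != nil {
		panic(err)
	}
	rest, _ := ioutil.ReadAll(r)
	fmt.Printf("%d commits left\n", len(rest)/41) // 3 commits left
}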

View File

@@ -9,6 +9,7 @@ import (
"encoding/hex"
"errors"
"fmt"
"hash"
"io"
"os"
@@ -66,15 +67,20 @@ func (s *ContentStore) Get(meta *models.LFSMetaObject, fromByte int64) (io.ReadC
// Put takes a Meta object and an io.Reader and writes the content to the store.
func (s *ContentStore) Put(meta *models.LFSMetaObject, r io.Reader) error {
hash := sha256.New()
rd := io.TeeReader(r, hash)
p := meta.RelativePath()
written, err := s.Save(p, rd)
// Wrap the provided reader with an inline hashing and size checker
wrappedRd := newHashingReader(meta.Size, meta.Oid, r)
// now pass the wrapped reader to Save - if there is a size mismatch or hash mismatch then
// the errors returned by the newHashingReader should percolate up to here
written, err := s.Save(p, wrappedRd)
if err != nil {
log.Error("Whilst putting LFS OID[%s]: Failed to copy to tmpPath: %s Error: %v", meta.Oid, p, err)
return err
}
// This shouldn't happen but it is sensible to test
if written != meta.Size {
if err := s.Delete(p); err != nil {
log.Error("Cleaning the LFS OID[%s] failed: %v", meta.Oid, err)
@@ -82,14 +88,6 @@ func (s *ContentStore) Put(meta *models.LFSMetaObject, r io.Reader) error {
return errSizeMismatch
}
shaStr := hex.EncodeToString(hash.Sum(nil))
if shaStr != meta.Oid {
if err := s.Delete(p); err != nil {
log.Error("Cleaning the LFS OID[%s] failed: %v", meta.Oid, err)
}
return errHashMismatch
}
return nil
}
@@ -118,3 +116,45 @@ func (s *ContentStore) Verify(meta *models.LFSMetaObject) (bool, error) {
return true, nil
}
type hashingReader struct {
internal io.Reader
currentSize int64
expectedSize int64
hash hash.Hash
expectedHash string
}
func (r *hashingReader) Read(b []byte) (int, error) {
n, err := r.internal.Read(b)
if n > 0 {
r.currentSize += int64(n)
wn, werr := r.hash.Write(b[:n])
if wn != n || werr != nil {
return n, werr
}
}
if err != nil && err == io.EOF {
if r.currentSize != r.expectedSize {
return n, errSizeMismatch
}
shaStr := hex.EncodeToString(r.hash.Sum(nil))
if shaStr != r.expectedHash {
return n, errHashMismatch
}
}
return n, err
}
func newHashingReader(expectedSize int64, expectedHash string, reader io.Reader) *hashingReader {
return &hashingReader{
internal: reader,
expectedSize: expectedSize,
expectedHash: expectedHash,
hash: sha256.New(),
}
}
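Editor's note: hashingReader folds the old post-hoc size and SHA-256 checks into the read path, so Save fails with errSizeMismatch or errHashMismatch the moment the stream ends. The same single-pass idea, sketched with the stock TeeReader the old code used, checking size and digest once the consumer has drained the stream:

package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"io"
	"io/ioutil"
	"strings"
)

func main() {
	content := "hello"
	sum := sha256.Sum256([]byte(content))
	expected := hex.EncodeToString(sum[:])

	// Tee the stream through a hash; verify size and digest after draining.
	hasher := sha256.New()
	rd := io.TeeReader(strings.NewReader(content), hasher)
	n, err := io.Copy(ioutil.Discard, rd)
	if err != nil {
		panic(err)
	}
	ok := n == int64(len(content)) && hex.EncodeToString(hasher.Sum(nil)) == expected
	fmt.Println(ok) // true
}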

View File

@@ -298,19 +298,27 @@ func RenderEmoji(
return ctx.postProcess(rawHTML)
}
var tagCleaner = regexp.MustCompile(`<((?:/?\w+/\w+)|(?:/[\w ]+/)|(/?[hH][tT][mM][lL][ />])|(/?[hH][eE][aA][dD][ />]))`)
var nulCleaner = strings.NewReplacer("\000", "")
func (ctx *postProcessCtx) postProcess(rawHTML []byte) ([]byte, error) {
if ctx.procs == nil {
ctx.procs = defaultProcessors
}
// give a generous extra 50 bytes
res := make([]byte, 0, len(rawHTML)+50)
res = append(res, "<html><body>"...)
res = append(res, rawHTML...)
res = append(res, "</body></html>"...)
res := bytes.NewBuffer(make([]byte, 0, len(rawHTML)+50))
// prepend "<html><body>"
_, _ = res.WriteString("<html><body>")
// Strip out nuls - they're always invalid
_, _ = nulCleaner.WriteString(res, string(tagCleaner.ReplaceAll(rawHTML, []byte("&lt;$1"))))
// close the tags
_, _ = res.WriteString("</body></html>")
// parse the HTML
nodes, err := html.ParseFragment(bytes.NewReader(res), nil)
nodes, err := html.ParseFragment(res, nil)
if err != nil {
return nil, &postProcessError{"invalid HTML", err}
}
@@ -347,17 +355,17 @@ func (ctx *postProcessCtx) postProcess(rawHTML []byte) ([]byte, error) {
// Create buffer in which the data will be placed again. We know that the
// length will be at least that of res; to spare a few alloc+copy, we
// reuse res, resetting its length to 0.
buf := bytes.NewBuffer(res[:0])
res.Reset()
// Render everything to buf.
for _, node := range nodes {
err = html.Render(buf, node)
err = html.Render(res, node)
if err != nil {
return nil, &postProcessError{"error rendering processed HTML", err}
}
}
// Everything done successfully, return parsed data.
return buf.Bytes(), nil
return res.Bytes(), nil
}
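Editor's note: the pre-pass above makes the input safe for html.ParseFragment: tagCleaner escapes pseudo-tags (slash-containing names, stray <html>/<head>) into entities, and nulCleaner drops NUL bytes, which are never valid. A quick standalone check of what those two replacers do:

package main

import (
	"fmt"
	"regexp"
	"strings"
)

var (
	tagCleaner = regexp.MustCompile(`<((?:/?\w+/\w+)|(?:/[\w ]+/)|(/?[hH][tT][mM][lL][ />])|(/?[hH][eE][aA][dD][ />]))`)
	nulCleaner = strings.NewReplacer("\000", "")
)

func main() {
	in := "a</head/>b\x00c"
	out := nulCleaner.Replace(string(tagCleaner.ReplaceAll([]byte(in), []byte("&lt;$1"))))
	fmt.Printf("%q\n", out) // "a&lt;/head/>bc": pseudo-tag escaped, NUL dropped
}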
func (ctx *postProcessCtx) visitNode(node *html.Node, visitText bool) {

View File

@@ -10,6 +10,7 @@ import (
"regexp"
"strings"
"code.gitea.io/gitea/modules/log"
"code.gitea.io/gitea/modules/markup"
"code.gitea.io/gitea/modules/markup/common"
"code.gitea.io/gitea/modules/setting"
@@ -76,6 +77,12 @@ func (g *ASTTransformer) Transform(node *ast.Document, reader text.Reader, pc pa
header.ID = util.BytesToReadOnlyString(id.([]byte))
}
toc = append(toc, header)
} else {
for _, attr := range v.Attributes() {
if _, ok := attr.Value.([]byte); !ok {
v.SetAttribute(attr.Name, []byte(fmt.Sprintf("%v", attr.Value)))
}
}
}
case *ast.Image:
// Images need two things:
@@ -101,11 +108,41 @@ func (g *ASTTransformer) Transform(node *ast.Document, reader text.Reader, pc pa
parent := n.Parent()
// Create a link around image only if parent is not already a link
if _, ok := parent.(*ast.Link); !ok && parent != nil {
next := n.NextSibling()
// Create a link wrapper
wrap := ast.NewLink()
wrap.Destination = link
wrap.Title = v.Title
// Duplicate the current image node
image := ast.NewImage(ast.NewLink())
image.Destination = link
image.Title = v.Title
for _, attr := range v.Attributes() {
image.SetAttribute(attr.Name, attr.Value)
}
for child := v.FirstChild(); child != nil; {
next := child.NextSibling()
image.AppendChild(image, child)
child = next
}
// Append our duplicate image to the wrapper link
wrap.AppendChild(wrap, image)
// Wire in the next sibling
wrap.SetNextSibling(next)
// Replace the current node with the wrapper link
parent.ReplaceChild(parent, n, wrap)
wrap.AppendChild(wrap, n)
// But most importantly ensure the next sibling is still on the old image too
v.SetNextSibling(next)
} else {
log.Debug("ast.Image: %s has parent: %v", link, parent)
}
case *ast.Link:
// Links need their href munged to be a real value

View File

@@ -6,7 +6,8 @@
package markdown
import (
"bytes"
"fmt"
"io"
"strings"
"sync"
@@ -18,7 +19,7 @@ import (
chromahtml "github.com/alecthomas/chroma/formatters/html"
"github.com/yuin/goldmark"
"github.com/yuin/goldmark-highlighting"
highlighting "github.com/yuin/goldmark-highlighting"
meta "github.com/yuin/goldmark-meta"
"github.com/yuin/goldmark/extension"
"github.com/yuin/goldmark/parser"
@@ -34,6 +35,44 @@ var urlPrefixKey = parser.NewContextKey()
var isWikiKey = parser.NewContextKey()
var renderMetasKey = parser.NewContextKey()
type closesWithError interface {
io.WriteCloser
CloseWithError(err error) error
}
type limitWriter struct {
w closesWithError
sum int64
limit int64
}
// Write implements the standard Write interface:
func (l *limitWriter) Write(data []byte) (int, error) {
leftToWrite := l.limit - l.sum
if leftToWrite < int64(len(data)) {
n, err := l.w.Write(data[:leftToWrite])
l.sum += int64(n)
if err != nil {
return n, err
}
_ = l.w.Close()
return n, fmt.Errorf("Rendered content too large - truncating render")
}
n, err := l.w.Write(data)
l.sum += int64(n)
return n, err
}
// Close closes the writer
func (l *limitWriter) Close() error {
return l.w.Close()
}
// CloseWithError closes the writer with the provided error
func (l *limitWriter) CloseWithError(err error) error {
return l.w.CloseWithError(err)
}
// NewGiteaParseContext creates a parser.Context with the gitea context set
func NewGiteaParseContext(urlPrefix string, metas map[string]string, isWiki bool) parser.Context {
pc := parser.NewContext(parser.WithIDs(newPrefixedIDs()))
@@ -43,8 +82,8 @@ func NewGiteaParseContext(urlPrefix string, metas map[string]string, isWiki bool
return pc
}
// render renders Markdown to HTML without handling special links.
func render(body []byte, urlPrefix string, metas map[string]string, wikiMarkdown bool) []byte {
// actualRender renders Markdown to HTML without handling special links.
func actualRender(body []byte, urlPrefix string, metas map[string]string, wikiMarkdown bool) []byte {
once.Do(func() {
converter = goldmark.New(
goldmark.WithExtensions(extension.Table,
@@ -119,12 +158,57 @@ func render(body []byte, urlPrefix string, metas map[string]string, wikiMarkdown
})
pc := NewGiteaParseContext(urlPrefix, metas, wikiMarkdown)
var buf bytes.Buffer
if err := converter.Convert(giteautil.NormalizeEOL(body), &buf, parser.WithContext(pc)); err != nil {
log.Error("Unable to render: %v", err)
rd, wr := io.Pipe()
defer func() {
_ = rd.Close()
_ = wr.Close()
}()
lw := &limitWriter{
w: wr,
limit: setting.UI.MaxDisplayFileSize * 3,
}
return markup.SanitizeReader(&buf).Bytes()
// FIXME: should we include a timeout that closes the pipe to abort the parser and sanitizer if it takes too long?
go func() {
defer func() {
err := recover()
if err == nil {
return
}
log.Warn("Unable to render markdown due to panic in goldmark: %v", err)
if log.IsDebug() {
log.Debug("Panic in markdown: %v\n%s", err, string(log.Stack(2)))
}
_ = lw.CloseWithError(fmt.Errorf("%v", err))
}()
pc := NewGiteaParseContext(urlPrefix, metas, wikiMarkdown)
if err := converter.Convert(giteautil.NormalizeEOL(body), lw, parser.WithContext(pc)); err != nil {
log.Error("Unable to render: %v", err)
_ = lw.CloseWithError(err)
return
}
_ = lw.Close()
}()
return markup.SanitizeReader(rd).Bytes()
}
func render(body []byte, urlPrefix string, metas map[string]string, wikiMarkdown bool) (ret []byte) {
defer func() {
err := recover()
if err == nil {
return
}
log.Warn("Unable to render markdown due to panic in goldmark - will return sanitized raw bytes")
if log.IsDebug() {
log.Debug("Panic in markdown: %v\n%s", err, string(log.Stack(2)))
}
ret = markup.SanitizeBytes(body)
}()
return actualRender(body, urlPrefix, metas, wikiMarkdown)
}
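Editor's note: the render path now streams through an io.Pipe with a limitWriter capping output at three times MaxDisplayFileSize; once the budget is spent, the pipe is closed with an error, aborting goldmark mid-render instead of buffering unbounded output. A reduced stand-in for that capping writer (capWriter is an illustrative name):

package main

import (
	"fmt"
	"io"
	"io/ioutil"
)

// capWriter forwards writes until a byte budget is exhausted, then closes the
// pipe with an error so the producer (goldmark, in the real code) is aborted.
type capWriter struct {
	w    *io.PipeWriter
	left int64
}

func (c *capWriter) Write(p []byte) (int, error) {
	if int64(len(p)) > c.left {
		n, _ := c.w.Write(p[:int(c.left)])
		c.left = 0
		err := fmt.Errorf("rendered content too large - truncating render")
		_ = c.w.CloseWithError(err)
		return n, err
	}
	n, err := c.w.Write(p)
	c.left -= int64(n)
	return n, err
}

func main() {
	rd, wr := io.Pipe()
	cw := &capWriter{w: wr, left: 8}
	go func() {
		_, _ = cw.Write([]byte("0123456789")) // last two bytes are dropped
	}()
	out, err := ioutil.ReadAll(rd)
	fmt.Printf("%q %v\n", out, err) // "01234567" rendered content too large - truncating render
}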
var (

View File

@@ -308,3 +308,34 @@ func TestRender_RenderParagraphs(t *testing.T) {
test(t, "A\n\nB\nC\n", 2)
test(t, "A\n\n\nB\nC\n", 2)
}
func TestMarkdownRenderRaw(t *testing.T) {
testcases := [][]byte{
{ // clusterfuzz_testcase_minimized_fuzz_markdown_render_raw_6267570554535936
0x2a, 0x20, 0x2d, 0x0a, 0x09, 0x20, 0x60, 0x5b, 0x0a, 0x09, 0x20, 0x60,
0x5b,
},
{ // clusterfuzz_testcase_minimized_fuzz_markdown_render_raw_6278827345051648
0x2d, 0x20, 0x2d, 0x0d, 0x09, 0x60, 0x0d, 0x09, 0x60,
},
{ // clusterfuzz_testcase_minimized_fuzz_markdown_render_raw_6016973788020736
0x7b, 0x63, 0x6c, 0x61, 0x73, 0x73, 0x3d, 0x35, 0x7d, 0x0a, 0x3d,
},
}
for _, testcase := range testcases {
_ = RenderRaw(testcase, "", false)
}
}
func TestRenderSiblingImages_Issue12925(t *testing.T) {
testcase := `![image1](/image1)
![image2](/image2)
`
expected := `<p><a href="/image1" rel="nofollow"><img src="/image1" alt="image1"></a><br>
<a href="/image2" rel="nofollow"><img src="/image2" alt="image2"></a></p>
`
res := string(RenderRaw([]byte(testcase), "", false))
assert.Equal(t, expected, res)
}

View File

@@ -6,7 +6,6 @@
package migrations
import (
"bytes"
"context"
"fmt"
"io"
@@ -811,13 +810,20 @@ func (g *GiteaLocalUploader) CreateReviews(reviews ...*base.Review) error {
}
var patch string
patchBuf := new(bytes.Buffer)
if err := git.GetRepoRawDiffForFile(g.gitRepo, pr.MergeBase, headCommitID, git.RawDiffNormal, comment.TreePath, patchBuf); err != nil {
// We should ignore the error since the commit may have been removed by a force push to the pull request
log.Warn("GetRepoRawDiffForFile failed when migrating [%s, %s, %s, %s]: %v", g.gitRepo.Path, pr.MergeBase, headCommitID, comment.TreePath, err)
} else {
patch = git.CutDiffAroundLine(patchBuf, int64((&models.Comment{Line: int64(line + comment.Position - 1)}).UnsignedLine()), line < 0, setting.UI.CodeCommentLines)
}
reader, writer := io.Pipe()
defer func() {
_ = reader.Close()
_ = writer.Close()
}()
go func() {
if err := git.GetRepoRawDiffForFile(g.gitRepo, pr.MergeBase, headCommitID, git.RawDiffNormal, comment.TreePath, writer); err != nil {
// We should ignore the error since the commit may have been removed by a force push to the pull request
log.Warn("GetRepoRawDiffForFile failed when migrating [%s, %s, %s, %s]: %v", g.gitRepo.Path, pr.MergeBase, headCommitID, comment.TreePath, err)
}
_ = writer.Close()
}()
patch, _ = git.CutDiffAroundLine(reader, int64((&models.Comment{Line: int64(line + comment.Position - 1)}).UnsignedLine()), line < 0, setting.UI.CodeCommentLines)
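Editor's note: this is the same pipe pattern as in the other backports; the producer goroutine writes the raw diff and always closes the writer, so CutDiffAroundLine on the reading side can never block forever, and an upstream failure surfaces as a read error. Stripped to its essentials (the payload here is fake):

package main

import (
	"fmt"
	"io"
	"io/ioutil"
)

func main() {
	reader, writer := io.Pipe()
	go func() {
		// The real code streams GetRepoRawDiffForFile output here.
		_, err := writer.Write([]byte("diff --git a/f b/f\n"))
		if err != nil {
			_ = writer.CloseWithError(err) // surfaces on the read side
			return
		}
		_ = writer.Close()
	}()
	patch, err := ioutil.ReadAll(reader)
	fmt.Printf("%q %v\n", patch, err)
}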
var c = models.Comment{
Type: models.CommentTypeCode,

View File

@@ -52,6 +52,13 @@ func isMigrateURLAllowed(remoteURL string) error {
}
}
if u.Host == "" {
if !setting.ImportLocalPaths {
return &models.ErrMigrationNotAllowed{Host: "<LOCAL_FILESYSTEM>"}
}
return nil
}
if !setting.Migrations.AllowLocalNetworks {
addrList, err := net.LookupIP(strings.Split(u.Host, ":")[0])
if err != nil {

View File

@@ -31,4 +31,16 @@ func TestMigrateWhiteBlocklist(t *testing.T) {
err = isMigrateURLAllowed("https://github.com/go-gitea/gitea.git")
assert.Error(t, err)
old := setting.ImportLocalPaths
setting.ImportLocalPaths = false
err = isMigrateURLAllowed("/home/foo/bar/goo")
assert.Error(t, err)
setting.ImportLocalPaths = true
err = isMigrateURLAllowed("/home/foo/bar/goo")
assert.NoError(t, err)
setting.ImportLocalPaths = old
}

View File

@@ -149,6 +149,11 @@ func (q *PersistableChannelUniqueQueue) Has(data Data) (bool, error) {
if err != nil || has {
return has, err
}
q.lock.Lock()
defer q.lock.Unlock()
if q.internal == nil {
return false, nil
}
return q.internal.(UniqueQueue).Has(data)
}
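Editor's note: the added guard covers the window before the persistable queue's internal queue exists; Has must take the lock and treat a nil internal queue as "not present" rather than dereferencing it. The shape of the fix, reduced to a toy type standing in for the real queue:

package main

import (
	"fmt"
	"sync"
)

// lazyQueue sketches the guard: the internal store may not exist yet (it is
// only created once the queue has started), so Has treats nil as "absent".
type lazyQueue struct {
	lock     sync.Mutex
	internal map[string]bool
}

func (q *lazyQueue) Has(key string) bool {
	q.lock.Lock()
	defer q.lock.Unlock()
	if q.internal == nil {
		return false
	}
	return q.internal[key]
}

func main() {
	q := &lazyQueue{}
	fmt.Println(q.Has("job-1")) // false, no panic before initialization
}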

View File

@@ -120,7 +120,6 @@ func UploadRepoFiles(repo *models.Repository, doer *models.User, opts *UploadRep
return err
}
infos[i] = uploadInfo
} else if objectHash, err = t.HashObject(file); err != nil {
return err
}
@@ -128,7 +127,6 @@ func UploadRepoFiles(repo *models.Repository, doer *models.User, opts *UploadRep
// Add the object to the index
if err := t.AddObjectToIndex("100644", objectHash, path.Join(opts.TreePath, uploadInfo.upload.Name)); err != nil {
return err
}
}
@@ -165,28 +163,10 @@ func UploadRepoFiles(repo *models.Repository, doer *models.User, opts *UploadRep
// OK now we can insert the data into the store - there's no way to clean up the store
// once it's in there, it's in there.
contentStore := &lfs.ContentStore{ObjectStorage: storage.LFS}
for _, uploadInfo := range infos {
if uploadInfo.lfsMetaObject == nil {
continue
}
exist, err := contentStore.Exists(uploadInfo.lfsMetaObject)
if err != nil {
for _, info := range infos {
if err := uploadToLFSContentStore(info, contentStore); err != nil {
return cleanUpAfterFailure(&infos, t, err)
}
if !exist {
file, err := os.Open(uploadInfo.upload.LocalPath())
if err != nil {
return cleanUpAfterFailure(&infos, t, err)
}
defer file.Close()
// FIXME: Put regenerates the hash and copies the file over.
// I guess this strictly ensures the soundness of the store but this is inefficient.
if err := contentStore.Put(uploadInfo.lfsMetaObject, file); err != nil {
// OK Now we need to cleanup
// Can't clean up the store, once uploaded there they're there.
return cleanUpAfterFailure(&infos, t, err)
}
}
}
// Then push this tree to NewBranch
@@ -196,3 +176,29 @@ func UploadRepoFiles(repo *models.Repository, doer *models.User, opts *UploadRep
return models.DeleteUploads(uploads...)
}
func uploadToLFSContentStore(info uploadInfo, contentStore *lfs.ContentStore) error {
if info.lfsMetaObject == nil {
return nil
}
exist, err := contentStore.Exists(info.lfsMetaObject)
if err != nil {
return err
}
if !exist {
file, err := os.Open(info.upload.LocalPath())
if err != nil {
return err
}
defer file.Close()
// FIXME: Put regenerates the hash and copies the file over.
// I guess this strictly ensures the soundness of the store but this is inefficient.
if err := contentStore.Put(info.lfsMetaObject, file); err != nil {
// OK Now we need to cleanup
// Can't clean up the store, once uploaded there they're there.
return err
}
}
return nil
}
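Editor's note: extracting uploadToLFSContentStore also fixes a classic defer-in-loop problem; the old inline `defer file.Close()` kept every descriptor open until UploadRepoFiles returned, whereas in a helper the defer fires per file. The general pattern, with a hypothetical handleOne helper:

package main

import (
	"fmt"
	"os"
)

// handleOne opens and processes a single file; because the defer lives in its
// own function, the descriptor is released as soon as this file is done
// instead of accumulating across the whole upload loop.
func handleOne(path string) error {
	f, err := os.Open(path)
	if err != nil {
		return err
	}
	defer f.Close()
	// ... hash/copy the content into the store here ...
	return nil
}

func main() {
	for _, p := range []string{os.Args[0], os.Args[0]} {
		if err := handleOne(p); err != nil {
			fmt.Println("upload failed:", err)
			return
		}
	}
	fmt.Println("all files processed")
}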

View File

@@ -771,7 +771,7 @@ func NewContext() {
ImportLocalPaths = sec.Key("IMPORT_LOCAL_PATHS").MustBool(false)
DisableGitHooks = sec.Key("DISABLE_GIT_HOOKS").MustBool(true)
OnlyAllowPushIfGiteaEnvironmentSet = sec.Key("ONLY_ALLOW_PUSH_IF_GITEA_ENVIRONMENT_SET").MustBool(true)
PasswordHashAlgo = sec.Key("PASSWORD_HASH_ALGO").MustString("argon2")
PasswordHashAlgo = sec.Key("PASSWORD_HASH_ALGO").MustString("pbkdf2")
CSRFCookieHTTPOnly = sec.Key("CSRF_COOKIE_HTTP_ONLY").MustBool(true)
PasswordCheckPwn = sec.Key("PASSWORD_CHECK_PWN").MustBool(false)

View File

@@ -7,6 +7,7 @@ package storage
import (
"context"
"io"
"io/ioutil"
"net/url"
"os"
"path/filepath"
@@ -24,13 +25,15 @@ const LocalStorageType Type = "local"
// LocalStorageConfig represents the configuration for a local storage
type LocalStorageConfig struct {
Path string `ini:"PATH"`
Path string `ini:"PATH"`
TemporaryPath string `ini:"TEMPORARY_PATH"`
}
// LocalStorage represents a local files storage
type LocalStorage struct {
ctx context.Context
dir string
ctx context.Context
dir string
tmpdir string
}
// NewLocalStorage returns a local files
@@ -45,9 +48,14 @@ func NewLocalStorage(ctx context.Context, cfg interface{}) (ObjectStorage, error
return nil, err
}
if config.TemporaryPath == "" {
config.TemporaryPath = config.Path + "/tmp"
}
return &LocalStorage{
ctx: ctx,
dir: config.Path,
ctx: ctx,
dir: config.Path,
tmpdir: config.TemporaryPath,
}, nil
}
@@ -63,17 +71,37 @@ func (l *LocalStorage) Save(path string, r io.Reader) (int64, error) {
return 0, err
}
// always override
if err := util.Remove(p); err != nil {
// Create a temporary file to save to
if err := os.MkdirAll(l.tmpdir, os.ModePerm); err != nil {
return 0, err
}
f, err := os.Create(p)
tmp, err := ioutil.TempFile(l.tmpdir, "upload-*")
if err != nil {
return 0, err
}
defer f.Close()
return io.Copy(f, r)
tmpRemoved := false
defer func() {
if !tmpRemoved {
_ = util.Remove(tmp.Name())
}
}()
n, err := io.Copy(tmp, r)
if err != nil {
return 0, err
}
if err := tmp.Close(); err != nil {
return 0, err
}
if err := os.Rename(tmp.Name(), p); err != nil {
return 0, err
}
tmpRemoved = true
return n, nil
}
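Editor's note: saving via a temporary file plus os.Rename means a crashed or failed upload can never leave a truncated object at the final path, and it removes the old remove-then-create window. A condensed sketch of the pattern (save, the temp prefix, and the file name are illustrative, not the real API):

package main

import (
	"fmt"
	"io"
	"io/ioutil"
	"os"
	"path/filepath"
	"strings"
)

// save writes to a temp file first and renames into place only on success.
func save(dir, name string, r io.Reader) (int64, error) {
	tmp, err := ioutil.TempFile(dir, "upload-*")
	if err != nil {
		return 0, err
	}
	defer os.Remove(tmp.Name()) // no-op once the rename has succeeded
	n, err := io.Copy(tmp, r)
	if err != nil {
		tmp.Close()
		return 0, err
	}
	if err := tmp.Close(); err != nil {
		return 0, err
	}
	return n, os.Rename(tmp.Name(), filepath.Join(dir, name))
}

func main() {
	dir, _ := ioutil.TempDir("", "store")
	defer os.RemoveAll(dir)
	n, err := save(dir, "avatar.png", strings.NewReader("not really a png"))
	fmt.Println(n, err) // 16 <nil>
}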
// Stat returns the info of the file

View File

@@ -689,6 +689,11 @@ func ActionIcon(opType models.ActionType) string {
// ActionContent2Commits converts action content to push commits
func ActionContent2Commits(act Actioner) *repository.PushCommits {
push := repository.NewPushCommits()
if act == nil || act.GetContent() == "" {
return push
}
if err := json.Unmarshal([]byte(act.GetContent()), push); err != nil {
log.Error("json.Unmarshal:\n%s\nERROR: %v", act.GetContent(), err)
}

View File

@@ -7,6 +7,7 @@ package timeutil
import (
"fmt"
"html/template"
"math"
"strings"
"time"
@@ -25,7 +26,11 @@ const (
Year = 12 * Month
)
func computeTimeDiff(diff int64, lang string) (int64, string) {
func round(s float64) int64 {
return int64(math.Round(s))
}
func computeTimeDiffFloor(diff int64, lang string) (int64, string) {
diffStr := ""
switch {
case diff <= 0:
@@ -83,6 +88,94 @@ func computeTimeDiff(diff int64, lang string) (int64, string) {
return diff, diffStr
}
func computeTimeDiff(diff int64, lang string) (int64, string) {
diffStr := ""
switch {
case diff <= 0:
diff = 0
diffStr = i18n.Tr(lang, "tool.now")
case diff < 2:
diff = 0
diffStr = i18n.Tr(lang, "tool.1s")
case diff < 1*Minute:
diffStr = i18n.Tr(lang, "tool.seconds", diff)
diff = 0
case diff < Minute+Minute/2:
diff -= 1 * Minute
diffStr = i18n.Tr(lang, "tool.1m")
case diff < 1*Hour:
minutes := round(float64(diff) / Minute)
if minutes > 1 {
diffStr = i18n.Tr(lang, "tool.minutes", minutes)
} else {
diffStr = i18n.Tr(lang, "tool.1m")
}
diff -= diff / Minute * Minute
case diff < Hour+Hour/2:
diff -= 1 * Hour
diffStr = i18n.Tr(lang, "tool.1h")
case diff < 1*Day:
hours := round(float64(diff) / Hour)
if hours > 1 {
diffStr = i18n.Tr(lang, "tool.hours", hours)
} else {
diffStr = i18n.Tr(lang, "tool.1h")
}
diff -= diff / Hour * Hour
case diff < Day+Day/2:
diff -= 1 * Day
diffStr = i18n.Tr(lang, "tool.1d")
case diff < 1*Week:
days := round(float64(diff) / Day)
if days > 1 {
diffStr = i18n.Tr(lang, "tool.days", days)
} else {
diffStr = i18n.Tr(lang, "tool.1d")
}
diff -= diff / Day * Day
case diff < Week+Week/2:
diff -= 1 * Week
diffStr = i18n.Tr(lang, "tool.1w")
case diff < 1*Month:
weeks := round(float64(diff) / Week)
if weeks > 1 {
diffStr = i18n.Tr(lang, "tool.weeks", weeks)
} else {
diffStr = i18n.Tr(lang, "tool.1w")
}
diff -= diff / Week * Week
case diff < 1*Month+Month/2:
diff -= 1 * Month
diffStr = i18n.Tr(lang, "tool.1mon")
case diff < 1*Year:
months := round(float64(diff) / Month)
if months > 1 {
diffStr = i18n.Tr(lang, "tool.months", months)
} else {
diffStr = i18n.Tr(lang, "tool.1mon")
}
diff -= diff / Month * Month
case diff < Year+Year/2:
diff -= 1 * Year
diffStr = i18n.Tr(lang, "tool.1y")
default:
years := round(float64(diff) / Year)
if years > 1 {
diffStr = i18n.Tr(lang, "tool.years", years)
} else {
diffStr = i18n.Tr(lang, "tool.1y")
}
diff -= (diff / Year) * Year
}
return diff, diffStr
}
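Editor's note: the new computeTimeDiff rounds to the nearest unit instead of flooring, so the displayed count switches at the half-unit boundary, which is exactly what the reworked tests below pin down. The effect at the hour boundary, worked through:

package main

import (
	"fmt"
	"math"
)

const (
	Minute = 60
	Hour   = 60 * Minute
)

func round(s float64) int64 { return int64(math.Round(s)) }

func main() {
	// 1h29m still reads as "1 hour", but 1h30m becomes "2 hours".
	fmt.Println(round(float64(Hour+29*Minute) / Hour)) // 1
	fmt.Println(round(float64(Hour+30*Minute) / Hour)) // 2
}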
// MinutesToFriendly returns a user friendly string with number of minutes
// converted to hours and minutes.
func MinutesToFriendly(minutes int, lang string) string {
@@ -111,7 +204,7 @@ func timeSincePro(then, now time.Time, lang string) string {
break
}
diff, diffStr = computeTimeDiff(diff, lang)
diff, diffStr = computeTimeDiffFloor(diff, lang)
timeStr += ", " + diffStr
}
return strings.TrimPrefix(timeStr, ", ")

View File

@@ -5,6 +5,7 @@
package timeutil
import (
"fmt"
"os"
"testing"
"time"
@@ -47,27 +48,39 @@ func TestTimeSince(t *testing.T) {
// test that each diff in `diffs` yields the expected string
test := func(expected string, diffs ...time.Duration) {
for _, diff := range diffs {
actual := timeSince(BaseDate, BaseDate.Add(diff), "en")
assert.Equal(t, i18n.Tr("en", "tool.ago", expected), actual)
actual = timeSince(BaseDate.Add(diff), BaseDate, "en")
assert.Equal(t, i18n.Tr("en", "tool.from_now", expected), actual)
}
t.Run(expected, func(t *testing.T) {
for _, diff := range diffs {
actual := timeSince(BaseDate, BaseDate.Add(diff), "en")
assert.Equal(t, i18n.Tr("en", "tool.ago", expected), actual)
actual = timeSince(BaseDate.Add(diff), BaseDate, "en")
assert.Equal(t, i18n.Tr("en", "tool.from_now", expected), actual)
}
})
}
test("1 second", time.Second, time.Second+50*time.Millisecond)
test("2 seconds", 2*time.Second, 2*time.Second+50*time.Millisecond)
test("1 minute", time.Minute, time.Minute+30*time.Second)
test("2 minutes", 2*time.Minute, 2*time.Minute+30*time.Second)
test("1 hour", time.Hour, time.Hour+30*time.Minute)
test("2 hours", 2*time.Hour, 2*time.Hour+30*time.Minute)
test("1 day", DayDur, DayDur+12*time.Hour)
test("2 days", 2*DayDur, 2*DayDur+12*time.Hour)
test("1 minute", time.Minute, time.Minute+29*time.Second)
test("2 minutes", 2*time.Minute, time.Minute+30*time.Second)
test("2 minutes", 2*time.Minute, 2*time.Minute+29*time.Second)
test("1 hour", time.Hour, time.Hour+29*time.Minute)
test("2 hours", 2*time.Hour, time.Hour+30*time.Minute)
test("2 hours", 2*time.Hour, 2*time.Hour+29*time.Minute)
test("3 hours", 3*time.Hour, 2*time.Hour+30*time.Minute)
test("1 day", DayDur, DayDur+11*time.Hour)
test("2 days", 2*DayDur, DayDur+12*time.Hour)
test("2 days", 2*DayDur, 2*DayDur+11*time.Hour)
test("3 days", 3*DayDur, 2*DayDur+12*time.Hour)
test("1 week", WeekDur, WeekDur+3*DayDur)
test("2 weeks", 2*WeekDur, WeekDur+4*DayDur)
test("2 weeks", 2*WeekDur, 2*WeekDur+3*DayDur)
test("1 month", MonthDur, MonthDur+15*DayDur)
test("2 months", 2*MonthDur, 2*MonthDur+15*DayDur)
test("1 year", YearDur, YearDur+6*MonthDur)
test("2 years", 2*YearDur, 2*YearDur+6*MonthDur)
test("3 weeks", 3*WeekDur, 2*WeekDur+4*DayDur)
test("1 month", MonthDur, MonthDur+14*DayDur)
test("2 months", 2*MonthDur, MonthDur+15*DayDur)
test("2 months", 2*MonthDur, 2*MonthDur+14*DayDur)
test("1 year", YearDur, YearDur+5*MonthDur)
test("2 years", 2*YearDur, YearDur+6*MonthDur)
test("2 years", 2*YearDur, 2*YearDur+5*MonthDur)
test("3 years", 3*YearDur, 2*YearDur+6*MonthDur)
}
func TestTimeSincePro(t *testing.T) {
@@ -114,11 +127,11 @@ func TestHtmlTimeSince(t *testing.T) {
}
test("1 second", time.Second)
test("3 minutes", 3*time.Minute+5*time.Second)
test("1 day", DayDur+18*time.Hour)
test("1 week", WeekDur+6*DayDur)
test("3 months", 3*MonthDur+3*WeekDur)
test("1 day", DayDur+11*time.Hour)
test("1 week", WeekDur+3*DayDur)
test("3 months", 3*MonthDur+2*WeekDur)
test("2 years", 2*YearDur)
test("3 years", 3*YearDur+11*MonthDur+4*WeekDur)
test("3 years", 2*YearDur+11*MonthDur+4*WeekDur)
}
func TestComputeTimeDiff(t *testing.T) {
@@ -126,26 +139,35 @@ func TestComputeTimeDiff(t *testing.T) {
// computeTimeDiff(base + offset) == (offset, str)
test := func(base int64, str string, offsets ...int64) {
for _, offset := range offsets {
diff, diffStr := computeTimeDiff(base+offset, "en")
assert.Equal(t, offset, diff)
assert.Equal(t, str, diffStr)
t.Run(fmt.Sprintf("%s:%d", str, offset), func(t *testing.T) {
diff, diffStr := computeTimeDiff(base+offset, "en")
assert.Equal(t, offset, diff)
assert.Equal(t, str, diffStr)
})
}
}
test(0, "now", 0)
test(1, "1 second", 0)
test(2, "2 seconds", 0)
test(Minute, "1 minute", 0, 1, 30, Minute-1)
test(2*Minute, "2 minutes", 0, Minute-1)
test(Hour, "1 hour", 0, 1, Hour-1)
test(5*Hour, "5 hours", 0, Hour-1)
test(Day, "1 day", 0, 1, Day-1)
test(5*Day, "5 days", 0, Day-1)
test(Week, "1 week", 0, 1, Week-1)
test(3*Week, "3 weeks", 0, 4*Day+25000)
test(Month, "1 month", 0, 1, Month-1)
test(10*Month, "10 months", 0, Month-1)
test(Year, "1 year", 0, Year-1)
test(3*Year, "3 years", 0, Year-1)
test(Minute, "1 minute", 0, 1, 29)
test(Minute, "2 minutes", 30, Minute-1)
test(2*Minute, "2 minutes", 0, 29)
test(2*Minute, "3 minutes", 30, Minute-1)
test(Hour, "1 hour", 0, 1, 29*Minute)
test(Hour, "2 hours", 30*Minute, Hour-1)
test(5*Hour, "5 hours", 0, 29*Minute)
test(Day, "1 day", 0, 1, 11*Hour)
test(Day, "2 days", 12*Hour, Day-1)
test(5*Day, "5 days", 0, 11*Hour)
test(Week, "1 week", 0, 1, 3*Day)
test(Week, "2 weeks", 4*Day, Week-1)
test(3*Week, "3 weeks", 0, 3*Day)
test(Month, "1 month", 0, 1)
test(Month, "2 months", 16*Day, Month-1)
test(10*Month, "10 months", 0, 13*Day)
test(Year, "1 year", 0, 179*Day)
test(Year, "2 years", 180*Day, Year-1)
test(3*Year, "3 years", 0, 179*Day)
}
func TestMinutesToFriendly(t *testing.T) {

View File

@@ -293,7 +293,6 @@ func CreatePullRequest(ctx *context.APIContext, form api.CreatePullRequestOption
var (
repo = ctx.Repo.Repository
labelIDs []int64
assigneeID int64
milestoneID int64
)
@@ -354,7 +353,7 @@ func CreatePullRequest(ctx *context.APIContext, form api.CreatePullRequestOption
}
if form.Milestone > 0 {
milestone, err := models.GetMilestoneByRepoID(ctx.Repo.Repository.ID, milestoneID)
milestone, err := models.GetMilestoneByRepoID(ctx.Repo.Repository.ID, form.Milestone)
if err != nil {
if models.IsErrMilestoneNotExist(err) {
ctx.NotFound()
@@ -378,7 +377,6 @@ func CreatePullRequest(ctx *context.APIContext, form api.CreatePullRequestOption
PosterID: ctx.User.ID,
Poster: ctx.User,
MilestoneID: milestoneID,
AssigneeID: assigneeID,
IsPull: true,
Content: form.Body,
DeadlineUnix: deadlineUnix,

View File

@@ -539,6 +539,10 @@ func updateBasicProperties(ctx *context.APIContext, opts api.EditRepoOption) err
if opts.Private != nil {
// Visibility of a forked repository is forced to stay in sync with the base repository.
if repo.IsFork {
if err := repo.GetBaseRepo(); err != nil {
ctx.Error(http.StatusInternalServerError, "Unable to load base repository", err)
return err
}
*opts.Private = repo.BaseRepo.IsPrivate
}

View File

@@ -93,7 +93,12 @@ func Transfer(ctx *context.APIContext, opts api.TransferRepoOption) {
}
}
if err = repo_service.TransferOwnership(ctx.User, newOwner, ctx.Repo.Repository, teams); err != nil {
if err = repo_service.StartRepositoryTransfer(ctx.User, newOwner, ctx.Repo.Repository, teams); err != nil {
if models.IsErrCancelled(err) {
ctx.Error(http.StatusForbidden, "transfer", "user has no right to create repo for new owner")
return
}
ctx.InternalServerError(err)
return
}

View File

@@ -133,12 +133,19 @@ func GlobalInit(ctx context.Context) {
log.Trace("AppWorkPath: %s", setting.AppWorkPath)
log.Trace("Custom path: %s", setting.CustomPath)
log.Trace("Log path: %s", setting.LogRootPath)
checkRunMode()
// Setup i18n
InitLocales()
NewServices()
if setting.EnableSQLite3 {
log.Info("SQLite3 Supported")
} else if setting.Database.UseSQLite3 {
log.Fatal("SQLite3 is set in settings but NOT Supported")
}
if setting.InstallLock {
highlight.NewContext()
external.RegisterParsers()
@@ -172,10 +179,6 @@ func GlobalInit(ctx context.Context) {
}
eventsource.GetManager().Init()
}
if setting.EnableSQLite3 {
log.Info("SQLite3 Supported")
}
checkRunMode()
if err := repo_migrations.Init(); err != nil {
log.Fatal("Failed to initialize repository migrations: %v", err)

View File

@@ -725,6 +725,14 @@ func setTemplateIfExists(ctx *context.Context, ctxDataKey string, possibleDirs [
ctx.Data[ctxDataKey] = templateBody
labelIDs := make([]string, 0, len(meta.Labels))
if repoLabels, err := models.GetLabelsByRepoID(ctx.Repo.Repository.ID, "", models.ListOptions{}); err == nil {
ctx.Data["Labels"] = repoLabels
if ctx.Repo.Owner.IsOrganization() {
if orgLabels, err := models.GetLabelsByOrgID(ctx.Repo.Owner.ID, ctx.Query("sort"), models.ListOptions{}); err == nil {
ctx.Data["OrgLabels"] = orgLabels
repoLabels = append(repoLabels, orgLabels...)
}
}
for _, metaLabel := range meta.Labels {
for _, repoLabel := range repoLabels {
if strings.EqualFold(repoLabel.Name, metaLabel) {
@@ -734,7 +742,6 @@ func setTemplateIfExists(ctx *context.Context, ctxDataKey string, possibleDirs [
}
}
}
ctx.Data["Labels"] = repoLabels
}
ctx.Data["HasSelectedLabel"] = len(labelIDs) > 0
ctx.Data["label_ids"] = strings.Join(labelIDs, ",")

View File

@@ -475,9 +475,12 @@ func SettingsPost(ctx *context.Context, form auth.RepoSettingForm) {
ctx.Repo.GitRepo.Close()
ctx.Repo.GitRepo = nil
}
if err = repo_service.TransferOwnership(ctx.User, newOwner, repo, nil); err != nil {
if err = repo_service.StartRepositoryTransfer(ctx.User, newOwner, repo, nil); err != nil {
if models.IsErrRepoAlreadyExist(err) {
ctx.RenderWithErr(ctx.Tr("repo.settings.new_owner_has_same_repo"), tplSettingsOptions, nil)
} else if models.IsErrCancelled(err) {
// this err msg is not translated, since it was introduced in a backport
ctx.RenderWithErr("user has no right to create repo for new owner", tplSettingsOptions, nil)
} else {
ctx.ServerError("TransferOwnership", err)
}

View File

@@ -692,7 +692,10 @@ func RenderUserCards(ctx *context.Context, total int, getter func(opts models.Li
pager := context.NewPagination(total, models.ItemsPerPage, page, 5)
ctx.Data["Page"] = pager
items, err := getter(models.ListOptions{Page: pager.Paginater.Current()})
items, err := getter(models.ListOptions{
Page: pager.Paginater.Current(),
PageSize: models.ItemsPerPage,
})
if err != nil {
ctx.ServerError("getter", err)
return
@@ -723,6 +726,7 @@ func Stars(ctx *context.Context) {
func Forks(ctx *context.Context) {
ctx.Data["Title"] = ctx.Tr("repos.forks")
// TODO: need pagination
forks, err := ctx.Repo.Repository.GetForks(models.ListOptions{})
if err != nil {
ctx.ServerError("GetForks", err)

View File

@@ -746,6 +746,7 @@ func LinkAccount(ctx *context.Context) {
ctx.Data["CaptchaType"] = setting.Service.CaptchaType
ctx.Data["RecaptchaURL"] = setting.Service.RecaptchaURL
ctx.Data["RecaptchaSitekey"] = setting.Service.RecaptchaSitekey
ctx.Data["HcaptchaSitekey"] = setting.Service.HcaptchaSitekey
ctx.Data["DisableRegistration"] = setting.Service.DisableRegistration
ctx.Data["ShowRegistrationButton"] = false
@@ -797,6 +798,7 @@ func LinkAccountPostSignIn(ctx *context.Context, signInForm auth.SignInForm) {
ctx.Data["RecaptchaURL"] = setting.Service.RecaptchaURL
ctx.Data["CaptchaType"] = setting.Service.CaptchaType
ctx.Data["RecaptchaSitekey"] = setting.Service.RecaptchaSitekey
ctx.Data["HcaptchaSitekey"] = setting.Service.HcaptchaSitekey
ctx.Data["DisableRegistration"] = setting.Service.DisableRegistration
ctx.Data["ShowRegistrationButton"] = false
@@ -881,6 +883,7 @@ func LinkAccountPostRegister(ctx *context.Context, cpt *captcha.Captcha, form au
ctx.Data["RecaptchaURL"] = setting.Service.RecaptchaURL
ctx.Data["CaptchaType"] = setting.Service.CaptchaType
ctx.Data["RecaptchaSitekey"] = setting.Service.RecaptchaSitekey
ctx.Data["HcaptchaSitekey"] = setting.Service.HcaptchaSitekey
ctx.Data["DisableRegistration"] = setting.Service.DisableRegistration
ctx.Data["ShowRegistrationButton"] = false

View File

@@ -182,6 +182,8 @@ var (
removedCodePrefix = []byte(`<span class="removed-code">`)
codeTagSuffix = []byte(`</span>`)
)
var unfinishedtagRegex = regexp.MustCompile(`<[^>]*$`)
var trailingSpanRegex = regexp.MustCompile(`<span\s*[[:alpha:]="]*?[>]?$`)
var entityRegex = regexp.MustCompile(`&[#]*?[0-9[:alpha:]]*$`)
@@ -196,10 +198,218 @@ func shouldWriteInline(diff diffmatchpatch.Diff, lineType DiffLineType) bool {
return false
}
func fixupBrokenSpans(diffs []diffmatchpatch.Diff) []diffmatchpatch.Diff {
// Create a new array to store our fixed up blocks
fixedup := make([]diffmatchpatch.Diff, 0, len(diffs))
// semantically label some numbers
const insert, delete, equal = 0, 1, 2
// record the position of the last block of each type in the fixedup blocks
last := []int{-1, -1, -1}
operation := []diffmatchpatch.Operation{diffmatchpatch.DiffInsert, diffmatchpatch.DiffDelete, diffmatchpatch.DiffEqual}
// create writers for inserts and deletes
toWrite := []strings.Builder{
{},
{},
}
// make some flags for insert and delete
unfinishedTag := []bool{false, false}
unfinishedEnt := []bool{false, false}
// store stores the provided text in the writer for the typ
store := func(text string, typ int) {
(&(toWrite[typ])).WriteString(text)
}
// hasStored returns true if there is stored content
hasStored := func(typ int) bool {
return (&toWrite[typ]).Len() > 0
}
// stored will return that content
stored := func(typ int) string {
return (&toWrite[typ]).String()
}
// empty will empty the stored content
empty := func(typ int) {
(&toWrite[typ]).Reset()
}
// pop will remove the stored content appending to a diff block for that typ
pop := func(typ int, fixedup []diffmatchpatch.Diff) []diffmatchpatch.Diff {
if hasStored(typ) {
if last[typ] > last[equal] {
fixedup[last[typ]].Text += stored(typ)
} else {
fixedup = append(fixedup, diffmatchpatch.Diff{
Type: operation[typ],
Text: stored(typ),
})
}
empty(typ)
}
return fixedup
}
// Now we walk the provided diffs and check the type of each block in turn
for _, diff := range diffs {
typ := delete // flag for handling insert or delete typs
switch diff.Type {
case diffmatchpatch.DiffEqual:
// First check if there is anything stored
if hasStored(insert) || hasStored(delete) {
// There are two reasons for storing content:
// 1. Unfinished Entity <- Could be more efficient here by not doing this if we're looking for a tag
if unfinishedEnt[insert] || unfinishedEnt[delete] {
// we look for a ';' to finish an entity
idx := strings.IndexRune(diff.Text, ';')
if idx >= 0 {
// if we find a ';' store the preceding content to both insert and delete
store(diff.Text[:idx+1], insert)
store(diff.Text[:idx+1], delete)
// and remove it from this block
diff.Text = diff.Text[idx+1:]
// reset the ent flags
unfinishedEnt[insert] = false
unfinishedEnt[delete] = false
} else {
// otherwise store it all on insert and delete
store(diff.Text, insert)
store(diff.Text, delete)
// and empty this block
diff.Text = ""
}
}
// 2. Unfinished Tag
if unfinishedTag[insert] || unfinishedTag[delete] {
// we look for a '>' to finish a tag
idx := strings.IndexRune(diff.Text, '>')
if idx >= 0 {
store(diff.Text[:idx+1], insert)
store(diff.Text[:idx+1], delete)
diff.Text = diff.Text[idx+1:]
unfinishedTag[insert] = false
unfinishedTag[delete] = false
} else {
store(diff.Text, insert)
store(diff.Text, delete)
diff.Text = ""
}
}
// If we've completed the required tag/entities
if !(unfinishedTag[insert] || unfinishedTag[delete] || unfinishedEnt[insert] || unfinishedEnt[delete]) {
// pop off the stack
fixedup = pop(insert, fixedup)
fixedup = pop(delete, fixedup)
}
// If that has left this diff block empty then shortcut
if len(diff.Text) == 0 {
continue
}
}
// check if this block ends in an unfinished tag
idx := unfinishedtagRegex.FindStringIndex(diff.Text)
if idx != nil {
unfinishedTag[insert] = true
unfinishedTag[delete] = true
} else {
// otherwise does it end in an unfinished entity?
idx = entityRegex.FindStringIndex(diff.Text)
if idx != nil {
unfinishedEnt[insert] = true
unfinishedEnt[delete] = true
}
}
// If there is an unfinished component
if idx != nil {
// Store the fragment
store(diff.Text[idx[0]:], insert)
store(diff.Text[idx[0]:], delete)
// and remove it from this block
diff.Text = diff.Text[:idx[0]]
}
// If that hasn't left the block empty
if len(diff.Text) > 0 {
// store the position of the last equal block and store it in our diffs
last[equal] = len(fixedup)
fixedup = append(fixedup, diff)
}
continue
case diffmatchpatch.DiffInsert:
typ = insert
fallthrough
case diffmatchpatch.DiffDelete:
// First check if there is anything stored for this type
if hasStored(typ) {
// if there is prepend it to this block, empty the storage and reset our flags
diff.Text = stored(typ) + diff.Text
empty(typ)
unfinishedEnt[typ] = false
unfinishedTag[typ] = false
}
// check if this block ends in an unfinished tag
idx := unfinishedtagRegex.FindStringIndex(diff.Text)
if idx != nil {
unfinishedTag[typ] = true
} else {
// otherwise does it end in an unfinished entity
idx = entityRegex.FindStringIndex(diff.Text)
if idx != nil {
unfinishedEnt[typ] = true
}
}
// If there is an unfinished component
if idx != nil {
// Store the fragment
store(diff.Text[idx[0]:], typ)
// and remove it from this block
diff.Text = diff.Text[:idx[0]]
}
// If that hasn't left the block empty
if len(diff.Text) > 0 {
// if the last block of this type was after the last equal block
if last[typ] > last[equal] {
// store this blocks content on that block
fixedup[last[typ]].Text += diff.Text
} else {
// otherwise store the position of the last block of this type and store the block
last[typ] = len(fixedup)
fixedup = append(fixedup, diff)
}
}
continue
}
}
// pop off any remaining stored content
fixedup = pop(insert, fixedup)
fixedup = pop(delete, fixedup)
return fixedup
}
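Editor's note: fixupBrokenSpans exists because diffmatchpatch diffs highlighted HTML as plain text and will happily split a block mid-tag or mid-entity; the two regexes spot those dangling fragments so they can be buffered until completed. Their behaviour on fragments like the ones exercised in the updated tests:

package main

import (
	"fmt"
	"regexp"
)

var (
	unfinishedtagRegex = regexp.MustCompile(`<[^>]*$`)
	entityRegex        = regexp.MustCompile(`&[#]*?[0-9[:alpha:]]*$`)
)

func main() {
	fmt.Println(unfinishedtagRegex.MatchString(`language</span><span `))    // true: unterminated tag
	fmt.Println(entityRegex.MatchString(`sh &#3`))                          // true: unterminated entity
	fmt.Println(unfinishedtagRegex.MatchString(`<span class="p">)</span>`)) // false: complete markup
}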
func diffToHTML(fileName string, diffs []diffmatchpatch.Diff, lineType DiffLineType) template.HTML {
buf := bytes.NewBuffer(nil)
match := ""
diffs = fixupBrokenSpans(diffs)
for _, diff := range diffs {
if shouldWriteInline(diff, lineType) {
if len(match) > 0 {
@@ -373,6 +583,7 @@ type DiffFile struct {
IsBin bool
IsLFSFile bool
IsRenamed bool
IsAmbiguous bool
IsSubmodule bool
Sections []*DiffSection
IsIncomplete bool
@@ -566,12 +777,32 @@ parsingLoop:
if strings.HasSuffix(line, " 160000\n") {
curFile.IsSubmodule = true
}
case strings.HasPrefix(line, "rename from "):
curFile.IsRenamed = true
curFile.Type = DiffFileRename
if curFile.IsAmbiguous {
curFile.OldName = line[len("rename from ") : len(line)-1]
}
case strings.HasPrefix(line, "rename to "):
curFile.IsRenamed = true
curFile.Type = DiffFileRename
if curFile.IsAmbiguous {
curFile.Name = line[len("rename to ") : len(line)-1]
curFile.IsAmbiguous = false
}
case strings.HasPrefix(line, "copy from "):
curFile.IsRenamed = true
curFile.Type = DiffFileCopy
if curFile.IsAmbiguous {
curFile.OldName = line[len("copy from ") : len(line)-1]
}
case strings.HasPrefix(line, "copy to "):
curFile.IsRenamed = true
curFile.Type = DiffFileCopy
if curFile.IsAmbiguous {
curFile.Name = line[len("copy to ") : len(line)-1]
curFile.IsAmbiguous = false
}
case strings.HasPrefix(line, "new file"):
curFile.Type = DiffFileAdd
curFile.IsCreated = true
@@ -593,9 +824,35 @@ parsingLoop:
case strings.HasPrefix(line, "Binary"):
curFile.IsBin = true
case strings.HasPrefix(line, "--- "):
// Do nothing with this line
// Handle ambiguous filenames
if curFile.IsAmbiguous {
if len(line) > 6 && line[4] == 'a' {
curFile.OldName = line[6 : len(line)-1]
if line[len(line)-2] == '\t' {
curFile.OldName = curFile.OldName[:len(curFile.OldName)-1]
}
} else {
curFile.OldName = ""
}
}
// Otherwise do nothing with this line
case strings.HasPrefix(line, "+++ "):
// Do nothing with this line
// Handle ambiguous filenames
if curFile.IsAmbiguous {
if len(line) > 6 && line[4] == 'b' {
curFile.Name = line[6 : len(line)-1]
if line[len(line)-2] == '\t' {
curFile.Name = curFile.Name[:len(curFile.Name)-1]
}
if curFile.OldName == "" {
curFile.OldName = curFile.Name
}
} else {
curFile.Name = curFile.OldName
}
curFile.IsAmbiguous = false
}
// Otherwise do nothing with this line, but now switch to parsing hunks
lineBytes, isFragment, err := parseHunks(curFile, maxLines, maxLineCharacters, input)
diff.TotalAddition += curFile.Addition
diff.TotalDeletion += curFile.Deletion
@@ -846,13 +1103,33 @@ func createDiffFile(diff *Diff, line string) *DiffFile {
rd := strings.NewReader(line[len(cmdDiffHead):] + " ")
curFile.Type = DiffFileChange
curFile.OldName = readFileName(rd)
curFile.Name = readFileName(rd)
oldNameAmbiguity := false
newNameAmbiguity := false
curFile.OldName, oldNameAmbiguity = readFileName(rd)
curFile.Name, newNameAmbiguity = readFileName(rd)
if oldNameAmbiguity && newNameAmbiguity {
curFile.IsAmbiguous = true
// OK, we should bet that the oldName and the newName are the same if they can be made to be the same
// So we need to start again ...
if (len(line)-len(cmdDiffHead)-1)%2 == 0 {
// diff --git a/b b/b b/b b/b b/b b/b
//
midpoint := (len(line) + len(cmdDiffHead) - 1) / 2
new, old := line[len(cmdDiffHead):midpoint], line[midpoint+1:]
if len(new) > 2 && len(old) > 2 && new[2:] == old[2:] {
curFile.OldName = old[2:]
curFile.Name = old[2:]
}
}
}
curFile.IsRenamed = curFile.Name != curFile.OldName
return curFile
}
func readFileName(rd *strings.Reader) string {
func readFileName(rd *strings.Reader) (string, bool) {
ambiguity := false
var name string
char, _ := rd.ReadByte()
_ = rd.UnreadByte()
@@ -862,9 +1139,24 @@ func readFileName(rd *strings.Reader) string {
name = name[1:]
}
} else {
// This technique is potentially ambiguous: it may not be possible to uniquely identify the filenames from the diff line alone
ambiguity = true
fmt.Fscanf(rd, "%s ", &name)
char, _ := rd.ReadByte()
_ = rd.UnreadByte()
for !(char == 0 || char == '"' || char == 'b') {
var suffix string
fmt.Fscanf(rd, "%s ", &suffix)
name += " " + suffix
char, _ = rd.ReadByte()
_ = rd.UnreadByte()
}
}
return name[2:]
if len(name) < 2 {
log.Error("Unable to determine name from reader: %v", rd)
return "", true
}
return name[2:], ambiguity
}
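Editor's note: when both filenames on the `diff --git` line contain spaces, the prefix-based scan above is ambiguous, so createDiffFile falls back to betting that the two names are identical and splitting the remainder at its midpoint. The heuristic in isolation (splitAmbiguous is an illustrative name):

package main

import "fmt"

const cmdDiffHead = "diff --git "

// splitAmbiguous shows the midpoint bet: if the name section has even length,
// split it in half and accept the result only when both halves agree after
// their "a/" and "b/" prefixes.
func splitAmbiguous(line string) (oldName, newName string, ok bool) {
	if (len(line)-len(cmdDiffHead)-1)%2 != 0 {
		return "", "", false
	}
	midpoint := (len(line) + len(cmdDiffHead) - 1) / 2
	a, b := line[len(cmdDiffHead):midpoint], line[midpoint+1:]
	if len(a) > 2 && len(b) > 2 && a[2:] == b[2:] {
		return b[2:], b[2:], true
	}
	return "", "", false
}

func main() {
	o, n, ok := splitAmbiguous("diff --git a/b b/b b/b b/b")
	fmt.Printf("%v %q %q\n", ok, o, n) // true "b b/b" "b b/b"
}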
// GetDiffRange builds a Diff between two commits of a repository.
@@ -975,6 +1267,7 @@ func CommentAsDiff(c *models.Comment) (*Diff, error) {
diff, err := ParsePatch(setting.Git.MaxGitDiffLines,
setting.Git.MaxGitDiffLineCharacters, setting.Git.MaxGitDiffFiles, strings.NewReader(c.Patch))
if err != nil {
log.Error("Unable to parse patch: %v", err)
return nil, err
}
if len(diff.Files) == 0 {
@@ -989,6 +1282,16 @@ func CommentAsDiff(c *models.Comment) (*Diff, error) {
// CommentMustAsDiff executes AsDiff and logs the error instead of returning
func CommentMustAsDiff(c *models.Comment) *Diff {
if c == nil {
return nil
}
defer func() {
if err := recover(); err != nil {
stack := log.Stack(2)
log.Error("PANIC whilst retrieving diff for comment[%d] Error: %v\nStack: %s", c.ID, err, stack)
panic(fmt.Errorf("PANIC whilst retrieving diff for comment[%d] Error: %v\nStack: %s", c.ID, err, stack))
}
}()
diff, err := CommentAsDiff(c)
if err != nil {
log.Warn("CommentMustAsDiff: %v", err)

View File

@@ -15,6 +15,7 @@ import (
"code.gitea.io/gitea/models"
"code.gitea.io/gitea/modules/git"
"code.gitea.io/gitea/modules/highlight"
"code.gitea.io/gitea/modules/setting"
dmp "github.com/sergi/go-diff/diffmatchpatch"
"github.com/stretchr/testify/assert"
@@ -23,7 +24,7 @@ import (
func assertEqual(t *testing.T, s1 string, s2 template.HTML) {
if s1 != string(s2) {
t.Errorf("%s should be equal %s", s2, s1)
t.Errorf("Did not receive expected results:\nExpected: %s\nActual: %s", s1, s2)
}
}
@@ -61,7 +62,7 @@ func TestDiffToHTML(t *testing.T) {
{Type: dmp.DiffEqual, Text: "</span><span class=\"p\">)</span>"},
}, DiffLineDel))
assertEqual(t, "<span class=\"nx\">r</span><span class=\"p\">.</span><span class=\"nf\">WrapperRenderer</span><span class=\"p\">(</span><span class=\"nx\">w</span><span class=\"p\">,</span> <span class=\"removed-code\"><span class=\"nx\">language</span></span><span class=\"removed-code\"><span class=\"p\">,</span> <span class=\"kc\">true</span><span class=\"p\">,</span> <span class=\"nx\">attrs</span></span><span class=\"p\">,</span> <span class=\"kc\">false</span><span class=\"p\">)</span>", diffToHTML("", []dmp.Diff{
assertEqual(t, "<span class=\"nx\">r</span><span class=\"p\">.</span><span class=\"nf\">WrapperRenderer</span><span class=\"p\">(</span><span class=\"nx\">w</span><span class=\"p\">,</span> <span class=\"removed-code\"><span class=\"nx\">language</span><span class=\"p\">,</span> <span class=\"kc\">true</span><span class=\"p\">,</span> <span class=\"nx\">attrs</span></span><span class=\"p\">,</span> <span class=\"kc\">false</span><span class=\"p\">)</span>", diffToHTML("", []dmp.Diff{
{Type: dmp.DiffEqual, Text: "<span class=\"nx\">r</span><span class=\"p\">.</span><span class=\"nf\">WrapperRenderer</span><span class=\"p\">(</span><span class=\"nx\">w</span><span class=\"p\">,</span> <span class=\"nx\">"},
{Type: dmp.DiffDelete, Text: "language</span><span "},
{Type: dmp.DiffEqual, Text: "c"},
@@ -69,14 +70,14 @@ func TestDiffToHTML(t *testing.T) {
{Type: dmp.DiffEqual, Text: "</span><span class=\"p\">,</span> <span class=\"kc\">false</span><span class=\"p\">)</span>"},
}, DiffLineDel))
assertEqual(t, "<span class=\"added-code\">language</span></span><span class=\"added-code\"><span class=\"p\">,</span> <span class=\"kc\">true</span><span class=\"p\">,</span> <span class=\"nx\">attrs</span></span><span class=\"p\">,</span> <span class=\"kc\">false</span><span class=\"p\">)</span>", diffToHTML("", []dmp.Diff{
assertEqual(t, "<span class=\"added-code\">language</span><span class=\"p\">,</span> <span class=\"kc\">true</span><span class=\"p\">,</span> <span class=\"nx\">attrs</span></span><span class=\"p\">,</span> <span class=\"kc\">false</span><span class=\"p\">)</span>", diffToHTML("", []dmp.Diff{
{Type: dmp.DiffInsert, Text: "language</span><span "},
{Type: dmp.DiffEqual, Text: "c"},
{Type: dmp.DiffInsert, Text: "lass=\"p\">,</span> <span class=\"kc\">true</span><span class=\"p\">,</span> <span class=\"nx\">attrs"},
{Type: dmp.DiffEqual, Text: "</span><span class=\"p\">,</span> <span class=\"kc\">false</span><span class=\"p\">)</span>"},
}, DiffLineAdd))
assertEqual(t, "<span class=\"k\">print</span><span class=\"added-code\"></span><span class=\"added-code\"><span class=\"p\">(</span></span><span class=\"sa\"></span><span class=\"s2\">&#34;</span><span class=\"s2\">// </span><span class=\"s2\">&#34;</span><span class=\"p\">,</span> <span class=\"n\">sys</span><span class=\"o\">.</span><span class=\"n\">argv</span><span class=\"added-code\"><span class=\"p\">)</span></span>", diffToHTML("", []dmp.Diff{
assertEqual(t, "<span class=\"k\">print</span><span class=\"added-code\"><span class=\"p\">(</span></span><span class=\"sa\"></span><span class=\"s2\">&#34;</span><span class=\"s2\">// </span><span class=\"s2\">&#34;</span><span class=\"p\">,</span> <span class=\"n\">sys</span><span class=\"o\">.</span><span class=\"n\">argv</span><span class=\"added-code\"><span class=\"p\">)</span></span>", diffToHTML("", []dmp.Diff{
{Type: dmp.DiffEqual, Text: "<span class=\"k\">print</span>"},
{Type: dmp.DiffInsert, Text: "<span"},
{Type: dmp.DiffEqual, Text: " "},
@@ -85,14 +86,14 @@ func TestDiffToHTML(t *testing.T) {
{Type: dmp.DiffInsert, Text: "<span class=\"p\">)</span>"},
}, DiffLineAdd))
assertEqual(t, "sh <span class=\"added-code\">&#39;useradd -u $(stat -c &#34;%u&#34; .gitignore) jenkins</span>&#39;", diffToHTML("", []dmp.Diff{
assertEqual(t, "sh <span class=\"added-code\">&#39;useradd -u $(stat -c &#34;%u&#34; .gitignore) jenkins&#39;</span>", diffToHTML("", []dmp.Diff{
{Type: dmp.DiffEqual, Text: "sh &#3"},
{Type: dmp.DiffDelete, Text: "4;useradd -u 111 jenkins&#34"},
{Type: dmp.DiffInsert, Text: "9;useradd -u $(stat -c &#34;%u&#34; .gitignore) jenkins&#39"},
{Type: dmp.DiffEqual, Text: ";"},
}, DiffLineAdd))
assertEqual(t, "<span class=\"x\"> &lt;h<span class=\"added-code\">4 class=</span><span class=\"added-code\">&#34;release-list-title df ac&#34;</span>&gt;</span>", diffToHTML("", []dmp.Diff{
assertEqual(t, "<span class=\"x\"> &lt;h<span class=\"added-code\">4 class=&#34;release-list-title df ac&#34;</span>&gt;</span>", diffToHTML("", []dmp.Diff{
{Type: dmp.DiffEqual, Text: "<span class=\"x\"> &lt;h"},
{Type: dmp.DiffInsert, Text: "4 class=&#"},
{Type: dmp.DiffEqual, Text: "3"},
@@ -207,6 +208,66 @@ rename to a b/a a/file b/b file
oldFilename: "a b/file b/a a/file",
filename: "a b/a a/file b/b file",
},
{
name: "ambiguous deleted",
gitdiff: `diff --git a/b b/b b/b b/b
deleted file mode 100644
index 92e798b..0000000
--- a/b b/b` + "\t" + `
+++ /dev/null
@@ -1 +0,0 @@
-b b/b
`,
oldFilename: "b b/b",
filename: "b b/b",
addition: 0,
deletion: 1,
},
{
name: "ambiguous addition",
gitdiff: `diff --git a/b b/b b/b b/b
new file mode 100644
index 0000000..92e798b
--- /dev/null
+++ b/b b/b` + "\t" + `
@@ -0,0 +1 @@
+b b/b
`,
oldFilename: "b b/b",
filename: "b b/b",
addition: 1,
deletion: 0,
},
{
name: "rename",
gitdiff: `diff --git a/b b/b b/b b/b b/b b/b
similarity index 100%
rename from b b/b b/b b/b b/b
rename to b
`,
oldFilename: "b b/b b/b b/b b/b",
filename: "b",
},
{
name: "ambiguous 1",
gitdiff: `diff --git a/b b/b b/b b/b b/b b/b
similarity index 100%
rename from b b/b b/b b/b b/b
rename to b
`,
oldFilename: "b b/b b/b b/b b/b",
filename: "b",
},
{
name: "ambiguous 2",
gitdiff: `diff --git a/b b/b b/b b/b b/b b/b
similarity index 100%
rename from b b/b b/b b/b
rename to b b/b
`,
oldFilename: "b b/b b/b b/b",
filename: "b b/b",
},
{
name: "minuses-and-pluses",
gitdiff: `diff --git a/minuses-and-pluses b/minuses-and-pluses
@@ -234,32 +295,32 @@ index 6961180..9ba1a00 100644
t.Run(testcase.name, func(t *testing.T) {
got, err := ParsePatch(setting.Git.MaxGitDiffLines, setting.Git.MaxGitDiffLineCharacters, setting.Git.MaxGitDiffFiles, strings.NewReader(testcase.gitdiff))
if (err != nil) != testcase.wantErr {
t.Errorf("ParsePatch() error = %v, wantErr %v", err, testcase.wantErr)
t.Errorf("ParsePatch(%q) error = %v, wantErr %v", testcase.name, err, testcase.wantErr)
return
}
gotMarshaled, _ := json.MarshalIndent(got, " ", " ")
if got.NumFiles != 1 {
t.Errorf("ParsePath() did not receive 1 file:\n%s", string(gotMarshaled))
t.Errorf("ParsePath(%q) did not receive 1 file:\n%s", testcase.name, string(gotMarshaled))
return
}
if got.TotalAddition != testcase.addition {
t.Errorf("ParsePath() does not have correct totalAddition %d, wanted %d", got.TotalAddition, testcase.addition)
t.Errorf("ParsePath(%q) does not have correct totalAddition %d, wanted %d", testcase.name, got.TotalAddition, testcase.addition)
}
if got.TotalDeletion != testcase.deletion {
t.Errorf("ParsePath() did not have correct totalDeletion %d, wanted %d", got.TotalDeletion, testcase.deletion)
t.Errorf("ParsePath(%q) did not have correct totalDeletion %d, wanted %d", testcase.name, got.TotalDeletion, testcase.deletion)
}
file := got.Files[0]
if file.Addition != testcase.addition {
t.Errorf("ParsePath() does not have correct file addition %d, wanted %d", file.Addition, testcase.addition)
t.Errorf("ParsePath(%q) does not have correct file addition %d, wanted %d", testcase.name, file.Addition, testcase.addition)
}
if file.Deletion != testcase.deletion {
t.Errorf("ParsePath() did not have correct file deletion %d, wanted %d", file.Deletion, testcase.deletion)
t.Errorf("ParsePath(%q) did not have correct file deletion %d, wanted %d", testcase.name, file.Deletion, testcase.deletion)
}
if file.OldName != testcase.oldFilename {
t.Errorf("ParsePath() did not have correct OldName %s, wanted %s", file.OldName, testcase.oldFilename)
t.Errorf("ParsePath(%q) did not have correct OldName %q, wanted %q", testcase.name, file.OldName, testcase.oldFilename)
}
if file.Name != testcase.filename {
t.Errorf("ParsePath() did not have correct Name %s, wanted %s", file.Name, testcase.filename)
t.Errorf("ParsePath(%q) did not have correct Name %q, wanted %q", testcase.name, file.Name, testcase.filename)
}
})
}
@@ -462,3 +523,14 @@ func TestGetDiffRangeWithWhitespaceBehavior(t *testing.T) {
}
}
}
func TestDiffToHTML_14231(t *testing.T) {
setting.Cfg = ini.Empty()
diffRecord := diffMatchPatch.DiffMain(highlight.Code("main.v", " run()\n"), highlight.Code("main.v", " run(db)\n"), true)
diffRecord = diffMatchPatch.DiffCleanupEfficiency(diffRecord)
expected := ` <span class="n">run</span><span class="added-code"><span class="o">(</span><span class="n">db</span></span><span class="o">)</span>`
output := diffToHTML("main.v", diffRecord, DiffLineAdd)
assertEqual(t, expected, output)
}

View File

@@ -475,7 +475,7 @@ func CloseBranchPulls(doer *models.User, repoID int64, branch string) error {
return nil
}
// CloseRepoBranchesPulls close all pull requests which head branches are in the given repository
// CloseRepoBranchesPulls closes all pull requests whose head branches are in the given repository, but only those whose base repo is not also in the given repository
func CloseRepoBranchesPulls(doer *models.User, repo *models.Repository) error {
branches, err := git.GetBranchesByPath(repo.RepoPath())
if err != nil {
@@ -494,6 +494,11 @@ func CloseRepoBranchesPulls(doer *models.User, repo *models.Repository) error {
}
for _, pr := range prs {
// If the base repository for this pr is this repository there is no need to close it
// as it is going to be deleted anyway
if pr.BaseRepoID == repo.ID {
continue
}
if err = issue_service.ChangeStatus(pr.Issue, doer, true); err != nil && !models.IsErrPullWasClosed(err) {
errs = append(errs, err)
}

View File

@@ -6,13 +6,14 @@
package pull
import (
"bytes"
"fmt"
"io"
"regexp"
"strings"
"code.gitea.io/gitea/models"
"code.gitea.io/gitea/modules/git"
"code.gitea.io/gitea/modules/log"
"code.gitea.io/gitea/modules/notification"
"code.gitea.io/gitea/modules/setting"
)
@@ -179,11 +180,24 @@ func createCodeComment(doer *models.User, repo *models.Repository, issue *models
if len(commitID) == 0 {
commitID = headCommitID
}
patchBuf := new(bytes.Buffer)
if err := git.GetRepoRawDiffForFile(gitRepo, pr.MergeBase, headCommitID, git.RawDiffNormal, treePath, patchBuf); err != nil {
return nil, fmt.Errorf("GetRawDiffForLine[%s, %s, %s, %s]: %v", gitRepo.Path, pr.MergeBase, headCommitID, treePath, err)
reader, writer := io.Pipe()
defer func() {
_ = reader.Close()
_ = writer.Close()
}()
go func() {
if err := git.GetRepoRawDiffForFile(gitRepo, pr.MergeBase, headCommitID, git.RawDiffNormal, treePath, writer); err != nil {
_ = writer.CloseWithError(fmt.Errorf("GetRawDiffForLine[%s, %s, %s, %s]: %v", gitRepo.Path, pr.MergeBase, headCommitID, treePath, err))
return
}
_ = writer.Close()
}()
patch, err = git.CutDiffAroundLine(reader, int64((&models.Comment{Line: line}).UnsignedLine()), line < 0, setting.UI.CodeCommentLines)
if err != nil {
log.Error("Error whilst generating patch: %v", err)
return nil, err
}
patch = git.CutDiffAroundLine(patchBuf, int64((&models.Comment{Line: line}).UnsignedLine()), line < 0, setting.UI.CodeCommentLines)
}
return models.CreateComment(&models.CreateCommentOptions{
Type: models.CommentTypeCode,
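The change above replaces a fully buffered diff with an io.Pipe so CutDiffAroundLine can consume the diff as it is produced. A minimal sketch of that producer/consumer pattern (illustrative names, not Gitea's API): the writer goroutine closes the pipe with an error on failure, which the reading side then observes.

```go
package main

import (
	"bufio"
	"fmt"
	"io"
)

func main() {
	reader, writer := io.Pipe()

	// Producer: stream lines into the pipe. CloseWithError surfaces a
	// generation failure to the reader; a plain Close signals EOF.
	go func() {
		for i := 1; i <= 3; i++ {
			if _, err := fmt.Fprintf(writer, "line %d\n", i); err != nil {
				_ = writer.CloseWithError(err)
				return
			}
		}
		_ = writer.Close()
	}()

	// Consumer: reads concurrently, so the diff never has to be held
	// in memory as one complete buffer.
	sc := bufio.NewScanner(reader)
	for sc.Scan() {
		fmt.Println("got:", sc.Text())
	}
	if err := sc.Err(); err != nil {
		fmt.Println("stream failed:", err)
	}
}
```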

View File

@@ -48,6 +48,9 @@ func Update(pull *models.PullRequest, doer *models.User, message string) error {
// IsUserAllowedToUpdate check if user is allowed to update PR with given permissions and branch protections
func IsUserAllowedToUpdate(pull *models.PullRequest, user *models.User) (bool, error) {
if user == nil {
return false, nil
}
headRepoPerm, err := models.GetUserRepoPermission(pull.HeadRepo, user)
if err != nil {
return false, err

View File

@@ -72,3 +72,31 @@ func ChangeRepositoryName(doer *models.User, repo *models.Repository, newRepoNam
return nil
}
// StartRepositoryTransfer transfers a repo from one owner to a new one.
// It puts the repository into a pending transfer state if the doer cannot create a repo for the new owner.
func StartRepositoryTransfer(doer, newOwner *models.User, repo *models.Repository, teams []*models.Team) error {
if repo.Status != models.RepositoryReady {
return fmt.Errorf("repository is not ready for transfer")
}
// Admins are always allowed to transfer; users may also transfer a repo back to their own account
if doer.IsAdmin || doer.ID == newOwner.ID {
return TransferOwnership(doer, newOwner, repo, teams)
}
// If the new owner is an org and the user can create repos in it, they can transfer directly too
if newOwner.IsOrganization() {
allowed, err := models.CanCreateOrgRepo(newOwner.ID, doer.ID)
if err != nil {
return err
}
if allowed {
return TransferOwnership(doer, newOwner, repo, teams)
}
}
// Block Transfer, new feature will come in v1.14.0
// https://github.com/go-gitea/gitea/pull/14792
return models.ErrCancelled{}
}
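The decision order above is: admins and self-transfers go through immediately, organization targets depend on the doer's create permission, and everything else is cancelled until pending transfers land in v1.14.0. A minimal sketch of that flow in isolation, with hypothetical types standing in for models.User and models.CanCreateOrgRepo:

```go
package main

import "fmt"

type user struct {
	ID      int64
	IsAdmin bool
	IsOrg   bool
}

// canCreateOrgRepo stands in for models.CanCreateOrgRepo; here user 7
// is assumed to have repo-creation rights in any org.
func canCreateOrgRepo(orgID, doerID int64) bool { return doerID == 7 }

// canTransferDirectly mirrors the checks above: admin or self-transfer
// first, then org targets gated on the doer's create permission.
func canTransferDirectly(doer, newOwner user) bool {
	if doer.IsAdmin || doer.ID == newOwner.ID {
		return true
	}
	return newOwner.IsOrg && canCreateOrgRepo(newOwner.ID, doer.ID)
}

func main() {
	org := user{ID: 42, IsOrg: true}
	fmt.Println(canTransferDirectly(user{ID: 7}, org)) // true: may create org repos
	fmt.Println(canTransferDirectly(user{ID: 9}, org)) // false: would be cancelled
}
```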

View File

@@ -12,8 +12,10 @@
</head>
<body>
<p><b>@{{.Release.Publisher.Name}}</b> released <a href="{{.Release.HTMLURL}}">{{.Release.TagName}}</a>
in <a href="{{AppUrl}}{{.Release.Repo.OwnerName}}/{{.Release.Repo.Name}}">{{.Release.Repo.FullName}} </p>
<p>
<b>@{{.Release.Publisher.Name}}</b> released <a href="{{.Release.HTMLURL}}">{{.Release.TagName}}</a>
in <a href="{{AppUrl}}{{.Release.Repo.OwnerName}}/{{.Release.Repo.Name}}">{{.Release.Repo.FullName}}</a>
</p>
<h4>Title: {{.Release.Title}}</h4>
<p>
Note: <br>

View File

@@ -63,7 +63,7 @@
<span class="info">{{.i18n.Tr "repo.issues.filter_label_exclude" | Safe}}</span>
<a class="item" href="{{$.Link}}?q={{$.Keyword}}&type={{$.ViewType}}&sort={{$.SortType}}&state={{$.State}}&assignee={{$.AssigneeID}}">{{.i18n.Tr "repo.issues.filter_label_no_select"}}</a>
{{range .Labels}}
<a class="item label-filter-item" href="{{$.Link}}?q={{$.Keyword}}&type={{$.ViewType}}&sort={{$.SortType}}&state={{$.State}}&labels={{.ID}}&assignee={{$.AssigneeID}}" data-label-id="{{.ID}}">{{if .IsExcluded}}{{svg "octicon-circle-slash"}}{{else if contain $.SelLabelIDs .ID}}{{svg "octicon-check"}}{{end}}<span class="label color" style="background-color: {{.Color}}"></span> {{.Name | RenderEmoji}}</a>
<a class="item label-filter-item" href="{{$.Link}}?q={{$.Keyword}}&type={{$.ViewType}}&sort={{$.SortType}}&state={{$.State}}&labels={{.QueryString}}&assignee={{$.AssigneeID}}" data-label-id="{{.ID}}">{{if .IsExcluded}}{{svg "octicon-circle-slash"}}{{else if contain $.SelLabelIDs .ID}}{{svg "octicon-check"}}{{end}}<span class="label color" style="background-color: {{.Color}}"></span> {{.Name | RenderEmoji}}</a>
{{end}}
</div>
</div>

View File

@@ -29,7 +29,7 @@
</div>
<div class="header-right actions df ac">
{{if not $.Repository.IsArchived}}
{{if or (and (eq .PosterID .Issue.PosterID) (eq .Issue.OriginalAuthorID 0)) (eq .Issue.OriginalAuthorID .OriginalAuthorID) }}
{{if or (and (eq .PosterID .Issue.PosterID) (eq .Issue.OriginalAuthorID 0)) (and (eq .Issue.OriginalAuthorID .OriginalAuthorID) (not (eq .OriginalAuthorID 0))) }}
<div class="item tag">
{{$.i18n.Tr "repo.issues.poster"}}
</div>

View File

@@ -96,7 +96,8 @@
<span class="text truncate issue title">{{index .GetIssueInfos 1 | RenderEmoji}}</span>
{{else if or (eq .GetOpType 10) (eq .GetOpType 21) (eq .GetOpType 22) (eq .GetOpType 23)}}
<a href="{{.GetCommentLink}}" class="text truncate issue title">{{.GetIssueTitle | RenderEmoji}}</a>
<p class="text light grey">{{index .GetIssueInfos 1 | RenderEmoji}}</p>
{{$comment := index .GetIssueInfos 1}}
{{if gt (len $comment) 0}}<p class="text light grey">{{$comment | RenderEmoji}}</p>{{end}}
{{else if eq .GetOpType 11}}
<p class="text light grey">{{index .GetIssueInfos 1}}</p>
{{else if or (eq .GetOpType 12) (eq .GetOpType 13) (eq .GetOpType 14) (eq .GetOpType 15)}}

View File

@@ -7,7 +7,6 @@
{{.i18n.Tr "settings.edit_oauth2_application"}}
</h4>
<div class="ui attached segment">
{{template "base/alert" .}}
<p>{{.i18n.Tr "settings.oauth2_application_create_description"}}</p>
</div>
<div class="ui attached segment form ignore-dirty">

View File

@@ -1,7 +1,7 @@
goldmark
==========================================
[![http://godoc.org/github.com/yuin/goldmark](https://godoc.org/github.com/yuin/goldmark?status.svg)](http://godoc.org/github.com/yuin/goldmark)
[![https://pkg.go.dev/github.com/yuin/goldmark](https://pkg.go.dev/badge/github.com/yuin/goldmark.svg)](https://pkg.go.dev/github.com/yuin/goldmark)
[![https://github.com/yuin/goldmark/actions?query=workflow:test](https://github.com/yuin/goldmark/workflows/test/badge.svg?branch=master&event=push)](https://github.com/yuin/goldmark/actions?query=workflow:test)
[![https://coveralls.io/github/yuin/goldmark](https://coveralls.io/repos/github/yuin/goldmark/badge.svg?branch=master)](https://coveralls.io/github/yuin/goldmark)
[![https://goreportcard.com/report/github.com/yuin/goldmark](https://goreportcard.com/badge/github.com/yuin/goldmark)](https://goreportcard.com/report/github.com/yuin/goldmark)
@@ -173,6 +173,7 @@ Parser and Renderer options
- This extension enables Table, Strikethrough, Linkify and TaskList.
- This extension does not filter tags defined in [6.11: Disallowed Raw HTML (extension)](https://github.github.com/gfm/#disallowed-raw-html-extension-).
If you need to filter HTML tags, see [Security](#security).
- If you need to parse GitHub emojis, you can use the [goldmark-emoji](https://github.com/yuin/goldmark-emoji) extension.
- `extension.DefinitionList`
- [PHP Markdown Extra: Definition lists](https://michelf.ca/projects/php-markdown/extra/#def-list)
- `extension.Footnote`
@@ -279,13 +280,96 @@ markdown := goldmark.New(
[]byte("https:"),
}),
extension.WithLinkifyURLRegexp(
xurls.Strict(),
xurls.Strict,
),
),
),
)
```
### Footnotes extension
The Footnote extension implements [PHP Markdown Extra: Footnotes](https://michelf.ca/projects/php-markdown/extra/#footnotes).
This extension has some options:
| Functional option | Type | Description |
| ----------------- | ---- | ----------- |
| `extension.WithFootnoteIDPrefix` | `[]byte` | a prefix for the id attributes.|
| `extension.WithFootnoteIDPrefixFunction` | `func(gast.Node) []byte` | a function that determines the id attribute for given Node.|
| `extension.WithFootnoteLinkTitle` | `[]byte` | an optional title attribute for footnote links.|
| `extension.WithFootnoteBacklinkTitle` | `[]byte` | an optional title attribute for footnote backlinks. |
| `extension.WithFootnoteLinkClass` | `[]byte` | a class for footnote links. This defaults to `footnote-ref`. |
| `extension.WithFootnoteBacklinkClass` | `[]byte` | a class for footnote backlinks. This defaults to `footnote-backref`. |
| `extension.WithFootnoteBacklinkHTML` | `[]byte` | HTML content for footnote backlinks. This defaults to `&#x21a9;&#xfe0e;`. |
Some options can have special substitutions. Occurrences of “^^” in the string will be replaced by the corresponding footnote number in the HTML output. Occurrences of “%%” will be replaced by a number for the reference (footnotes can have multiple references).
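As a quick illustration of these substitutions (a sketch built on the options this changeset adds, so treat the exact output as indicative):

```go
package main

import (
	"bytes"
	"fmt"

	"github.com/yuin/goldmark"
	"github.com/yuin/goldmark/extension"
)

func main() {
	md := goldmark.New(
		goldmark.WithExtensions(
			extension.NewFootnote(
				// "^^" becomes the footnote number and "%%" the
				// reference number in the rendered title attributes.
				extension.WithFootnoteLinkTitle([]byte("see footnote ^^ (reference %%)")),
				extension.WithFootnoteBacklinkTitle([]byte("back to reference %%")),
			),
		),
	)
	var buf bytes.Buffer
	src := []byte("Text with a note.[^1]\n\n[^1]: The note body.\n")
	if err := md.Convert(src, &buf); err != nil {
		panic(err)
	}
	fmt.Println(buf.String())
}
```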
`extension.WithFootnoteIDPrefix` and `extension.WithFootnoteIDPrefixFunction` are useful if you display multiple Markdown documents inside one HTML document, to avoid footnote id collisions between them.
`extension.WithFootnoteIDPrefix` sets a fixed id prefix, so you may write code like the following:
```go
for _, path := range files {
source := readAll(path)
prefix := getPrefix(path)
markdown := goldmark.New(
goldmark.WithExtensions(
NewFootnote(
WithFootnoteIDPrefix([]byte(path)),
),
),
)
var b bytes.Buffer
err := markdown.Convert(source, &b)
if err != nil {
t.Error(err.Error())
}
}
```
`extension.WithFootnoteIDPrefixFunction` determines an id prefix by calling the given function, so you may write code like the following:
```go
markdown := goldmark.New(
goldmark.WithExtensions(
NewFootnote(
WithFootnoteIDPrefixFunction(func(n gast.Node) []byte {
v, ok := n.OwnerDocument().Meta()["footnote-prefix"]
if ok {
return util.StringToReadOnlyBytes(v.(string))
}
return nil
}),
),
),
)
for _, path := range files {
source := readAll(path)
var b bytes.Buffer
doc := markdown.Parser().Parse(text.NewReader(source))
doc.Meta()["footnote-prefix"] = getPrefix(path)
err := markdown.Renderer().Render(&b, source, doc)
}
```
You can use [goldmark-meta](https://github.com/yuin/goldmark-meta) to define an id prefix in the markdown document:
```markdown
---
title: document title
slug: article1
footnote-prefix: article1
---
# My article
```
Security
--------------------
By default, goldmark does not render raw HTML or potentially-dangerous URLs.
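For completeness, opting back in goes through goldmark's documented renderer option; a minimal sketch:

```go
package main

import (
	"bytes"
	"fmt"

	"github.com/yuin/goldmark"
	"github.com/yuin/goldmark/renderer/html"
)

func main() {
	// WithUnsafe re-enables rendering of raw HTML and potentially
	// dangerous links; only use it for trusted input.
	md := goldmark.New(
		goldmark.WithRendererOptions(html.WithUnsafe()),
	)
	var buf bytes.Buffer
	if err := md.Convert([]byte("<b>raw</b> html\n"), &buf); err != nil {
		panic(err)
	}
	fmt.Print(buf.String())
}
```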
@@ -336,6 +420,8 @@ Extensions
extension for the goldmark Markdown parser.
- [goldmark-highlighting](https://github.com/yuin/goldmark-highlighting): A syntax-highlighting extension
for the goldmark markdown parser.
- [goldmark-emoji](https://github.com/yuin/goldmark-emoji): An emoji
extension for the goldmark Markdown parser.
- [goldmark-mathjax](https://github.com/litao91/goldmark-mathjax): Mathjax support for the goldmark markdown parser
goldmark internal(for extension developers)

View File

@@ -45,11 +45,6 @@ type Attribute struct {
Value interface{}
}
var attrNameIDS = []byte("#")
var attrNameID = []byte("id")
var attrNameClassS = []byte(".")
var attrNameClass = []byte("class")
// A Node interface defines basic AST node functionalities.
type Node interface {
// Type returns a type of this node.
@@ -116,6 +111,11 @@ type Node interface {
// tail of the children.
InsertAfter(self, v1, insertee Node)
// OwnerDocument returns this node's owner document.
// If this node is not a child of the Document node, OwnerDocument
// returns nil.
OwnerDocument() *Document
// Dump dumps an AST tree structure to stdout.
// This function is intended purely for debugging.
// level is an indent level. Implementers should indent information with
@@ -169,7 +169,7 @@ type Node interface {
RemoveAttributes()
}
// A BaseNode struct implements the Node interface.
// A BaseNode struct implements the Node interface partially.
type BaseNode struct {
firstChild Node
lastChild Node
@@ -358,6 +358,22 @@ func (n *BaseNode) InsertBefore(self, v1, insertee Node) {
}
}
// OwnerDocument implements Node.OwnerDocument
func (n *BaseNode) OwnerDocument() *Document {
d := n.Parent()
for {
p := d.Parent()
if p == nil {
if v, ok := d.(*Document); ok {
return v
}
break
}
d = p
}
return nil
}
// Text implements Node.Text .
func (n *BaseNode) Text(source []byte) []byte {
var buf bytes.Buffer

View File

@@ -7,7 +7,7 @@ import (
textm "github.com/yuin/goldmark/text"
)
// A BaseBlock struct implements the Node interface.
// A BaseBlock struct implements the Node interface partially.
type BaseBlock struct {
BaseNode
blankPreviousLines bool
@@ -50,6 +50,8 @@ func (b *BaseBlock) SetLines(v *textm.Segments) {
// A Document struct is a root node of Markdown text.
type Document struct {
BaseBlock
meta map[string]interface{}
}
// KindDocument is a NodeKind of the Document node.
@@ -70,10 +72,29 @@ func (n *Document) Kind() NodeKind {
return KindDocument
}
// OwnerDocument implements Node.OwnerDocument
func (n *Document) OwnerDocument() *Document {
return n
}
// Meta returns metadata of this document.
func (n *Document) Meta() map[string]interface{} {
if n.meta == nil {
n.meta = map[string]interface{}{}
}
return n.meta
}
// SetMeta sets given metadata to this document.
func (n *Document) SetMeta(meta map[string]interface{}) {
n.meta = meta
}
// NewDocument returns a new Document node.
func NewDocument() *Document {
return &Document{
BaseBlock: BaseBlock{},
meta: nil,
}
}

View File

@@ -8,7 +8,7 @@ import (
"github.com/yuin/goldmark/util"
)
// A BaseInline struct implements the Node interface.
// A BaseInline struct implements the Node interface partially.
type BaseInline struct {
BaseNode
}

View File

@@ -2,6 +2,7 @@ package ast
import (
"fmt"
gast "github.com/yuin/goldmark/ast"
)
@@ -9,13 +10,15 @@ import (
// (PHP Markdown Extra) text.
type FootnoteLink struct {
gast.BaseInline
Index int
Index int
RefCount int
}
// Dump implements Node.Dump.
func (n *FootnoteLink) Dump(source []byte, level int) {
m := map[string]string{}
m["Index"] = fmt.Sprintf("%v", n.Index)
m["RefCount"] = fmt.Sprintf("%v", n.RefCount)
gast.DumpHelper(n, source, level, m, nil)
}
@@ -30,36 +33,40 @@ func (n *FootnoteLink) Kind() gast.NodeKind {
// NewFootnoteLink returns a new FootnoteLink node.
func NewFootnoteLink(index int) *FootnoteLink {
return &FootnoteLink{
Index: index,
Index: index,
RefCount: 0,
}
}
// A FootnoteBackLink struct represents a link to a footnote of Markdown
// A FootnoteBacklink struct represents a link to a footnote of Markdown
// (PHP Markdown Extra) text.
type FootnoteBackLink struct {
type FootnoteBacklink struct {
gast.BaseInline
Index int
Index int
RefCount int
}
// Dump implements Node.Dump.
func (n *FootnoteBackLink) Dump(source []byte, level int) {
func (n *FootnoteBacklink) Dump(source []byte, level int) {
m := map[string]string{}
m["Index"] = fmt.Sprintf("%v", n.Index)
m["RefCount"] = fmt.Sprintf("%v", n.RefCount)
gast.DumpHelper(n, source, level, m, nil)
}
// KindFootnoteBackLink is a NodeKind of the FootnoteBackLink node.
var KindFootnoteBackLink = gast.NewNodeKind("FootnoteBackLink")
// KindFootnoteBacklink is a NodeKind of the FootnoteBacklink node.
var KindFootnoteBacklink = gast.NewNodeKind("FootnoteBacklink")
// Kind implements Node.Kind.
func (n *FootnoteBackLink) Kind() gast.NodeKind {
return KindFootnoteBackLink
func (n *FootnoteBacklink) Kind() gast.NodeKind {
return KindFootnoteBacklink
}
// NewFootnoteBackLink returns a new FootnoteBackLink node.
func NewFootnoteBackLink(index int) *FootnoteBackLink {
return &FootnoteBackLink{
Index: index,
// NewFootnoteBacklink returns a new FootnoteBacklink node.
func NewFootnoteBacklink(index int) *FootnoteBacklink {
return &FootnoteBacklink{
Index: index,
RefCount: 0,
}
}

View File

@@ -2,6 +2,8 @@ package extension
import (
"bytes"
"strconv"
"github.com/yuin/goldmark"
gast "github.com/yuin/goldmark/ast"
"github.com/yuin/goldmark/extension/ast"
@@ -10,10 +12,10 @@ import (
"github.com/yuin/goldmark/renderer/html"
"github.com/yuin/goldmark/text"
"github.com/yuin/goldmark/util"
"strconv"
)
var footnoteListKey = parser.NewContextKey()
var footnoteLinkListKey = parser.NewContextKey()
type footnoteBlockParser struct {
}
@@ -164,7 +166,20 @@ func (s *footnoteParser) Parse(parent gast.Node, block text.Reader, pc parser.Co
return nil
}
return ast.NewFootnoteLink(index)
fnlink := ast.NewFootnoteLink(index)
var fnlist []*ast.FootnoteLink
if tmp := pc.Get(footnoteLinkListKey); tmp != nil {
fnlist = tmp.([]*ast.FootnoteLink)
} else {
fnlist = []*ast.FootnoteLink{}
pc.Set(footnoteLinkListKey, fnlist)
}
pc.Set(footnoteLinkListKey, append(fnlist, fnlink))
if line[0] == '!' {
parent.AppendChild(parent, gast.NewTextSegment(text.NewSegment(segment.Start, segment.Start+1)))
}
return fnlink
}
type footnoteASTTransformer struct {
@@ -180,23 +195,46 @@ func NewFootnoteASTTransformer() parser.ASTTransformer {
func (a *footnoteASTTransformer) Transform(node *gast.Document, reader text.Reader, pc parser.Context) {
var list *ast.FootnoteList
if tlist := pc.Get(footnoteListKey); tlist != nil {
list = tlist.(*ast.FootnoteList)
} else {
var fnlist []*ast.FootnoteLink
if tmp := pc.Get(footnoteListKey); tmp != nil {
list = tmp.(*ast.FootnoteList)
}
if tmp := pc.Get(footnoteLinkListKey); tmp != nil {
fnlist = tmp.([]*ast.FootnoteLink)
}
pc.Set(footnoteListKey, nil)
pc.Set(footnoteLinkListKey, nil)
if list == nil {
return
}
pc.Set(footnoteListKey, nil)
counter := map[int]int{}
if fnlist != nil {
for _, fnlink := range fnlist {
if fnlink.Index >= 0 {
counter[fnlink.Index]++
}
}
for _, fnlink := range fnlist {
fnlink.RefCount = counter[fnlink.Index]
}
}
for footnote := list.FirstChild(); footnote != nil; {
var container gast.Node = footnote
next := footnote.NextSibling()
if fc := container.LastChild(); fc != nil && gast.IsParagraph(fc) {
container = fc
}
index := footnote.(*ast.Footnote).Index
fn := footnote.(*ast.Footnote)
index := fn.Index
if index < 0 {
list.RemoveChild(list, footnote)
} else {
container.AppendChild(container, ast.NewFootnoteBackLink(index))
backLink := ast.NewFootnoteBacklink(index)
backLink.RefCount = counter[index]
container.AppendChild(container, backLink)
}
footnote = next
}
@@ -214,19 +252,250 @@ func (a *footnoteASTTransformer) Transform(node *gast.Document, reader text.Read
node.AppendChild(node, list)
}
// FootnoteConfig holds configuration values for the footnote extension.
//
// Link* and Backlink* configurations have some variables:
// Occurrences of “^^” in the string will be replaced by the
// corresponding footnote number in the HTML output.
// Occurrences of “%%” will be replaced by a number for the
// reference (footnotes can have multiple references).
type FootnoteConfig struct {
html.Config
// IDPrefix is a prefix for the id attributes generated by footnotes.
IDPrefix []byte
// IDPrefixFunction is a function that determines the id attribute for a given Node.
IDPrefixFunction func(gast.Node) []byte
// LinkTitle is an optional title attribute for footnote links.
LinkTitle []byte
// BacklinkTitle is an optional title attribute for footnote backlinks.
BacklinkTitle []byte
// LinkClass is a class for footnote links.
LinkClass []byte
// BacklinkClass is a class for footnote backlinks.
BacklinkClass []byte
// BacklinkHTML is an HTML content for footnote backlinks.
BacklinkHTML []byte
}
// FootnoteOption interface is a functional option interface for the extension.
type FootnoteOption interface {
renderer.Option
// SetFootnoteOption sets given option to the extension.
SetFootnoteOption(*FootnoteConfig)
}
// NewFootnoteConfig returns a new Config with defaults.
func NewFootnoteConfig() FootnoteConfig {
return FootnoteConfig{
Config: html.NewConfig(),
LinkTitle: []byte(""),
BacklinkTitle: []byte(""),
LinkClass: []byte("footnote-ref"),
BacklinkClass: []byte("footnote-backref"),
BacklinkHTML: []byte("&#x21a9;&#xfe0e;"),
}
}
// SetOption implements renderer.SetOptioner.
func (c *FootnoteConfig) SetOption(name renderer.OptionName, value interface{}) {
switch name {
case optFootnoteIDPrefixFunction:
c.IDPrefixFunction = value.(func(gast.Node) []byte)
case optFootnoteIDPrefix:
c.IDPrefix = value.([]byte)
case optFootnoteLinkTitle:
c.LinkTitle = value.([]byte)
case optFootnoteBacklinkTitle:
c.BacklinkTitle = value.([]byte)
case optFootnoteLinkClass:
c.LinkClass = value.([]byte)
case optFootnoteBacklinkClass:
c.BacklinkClass = value.([]byte)
case optFootnoteBacklinkHTML:
c.BacklinkHTML = value.([]byte)
default:
c.Config.SetOption(name, value)
}
}
type withFootnoteHTMLOptions struct {
value []html.Option
}
func (o *withFootnoteHTMLOptions) SetConfig(c *renderer.Config) {
if o.value != nil {
for _, v := range o.value {
v.(renderer.Option).SetConfig(c)
}
}
}
func (o *withFootnoteHTMLOptions) SetFootnoteOption(c *FootnoteConfig) {
if o.value != nil {
for _, v := range o.value {
v.SetHTMLOption(&c.Config)
}
}
}
// WithFootnoteHTMLOptions is functional option that wraps goldmark HTMLRenderer options.
func WithFootnoteHTMLOptions(opts ...html.Option) FootnoteOption {
return &withFootnoteHTMLOptions{opts}
}
const optFootnoteIDPrefix renderer.OptionName = "FootnoteIDPrefix"
type withFootnoteIDPrefix struct {
value []byte
}
func (o *withFootnoteIDPrefix) SetConfig(c *renderer.Config) {
c.Options[optFootnoteIDPrefix] = o.value
}
func (o *withFootnoteIDPrefix) SetFootnoteOption(c *FootnoteConfig) {
c.IDPrefix = o.value
}
// WithFootnoteIDPrefix is a functional option that sets a prefix for the id attributes generated by footnotes.
func WithFootnoteIDPrefix(a []byte) FootnoteOption {
return &withFootnoteIDPrefix{a}
}
const optFootnoteIDPrefixFunction renderer.OptionName = "FootnoteIDPrefixFunction"
type withFootnoteIDPrefixFunction struct {
value func(gast.Node) []byte
}
func (o *withFootnoteIDPrefixFunction) SetConfig(c *renderer.Config) {
c.Options[optFootnoteIDPrefixFunction] = o.value
}
func (o *withFootnoteIDPrefixFunction) SetFootnoteOption(c *FootnoteConfig) {
c.IDPrefixFunction = o.value
}
// WithFootnoteIDPrefixFunction is a functional option that sets a function determining the prefix for the id attributes generated by footnotes.
func WithFootnoteIDPrefixFunction(a func(gast.Node) []byte) FootnoteOption {
return &withFootnoteIDPrefixFunction{a}
}
const optFootnoteLinkTitle renderer.OptionName = "FootnoteLinkTitle"
type withFootnoteLinkTitle struct {
value []byte
}
func (o *withFootnoteLinkTitle) SetConfig(c *renderer.Config) {
c.Options[optFootnoteLinkTitle] = o.value
}
func (o *withFootnoteLinkTitle) SetFootnoteOption(c *FootnoteConfig) {
c.LinkTitle = o.value
}
// WithFootnoteLinkTitle is a functional option that sets an optional title attribute for footnote links.
func WithFootnoteLinkTitle(a []byte) FootnoteOption {
return &withFootnoteLinkTitle{a}
}
const optFootnoteBacklinkTitle renderer.OptionName = "FootnoteBacklinkTitle"
type withFootnoteBacklinkTitle struct {
value []byte
}
func (o *withFootnoteBacklinkTitle) SetConfig(c *renderer.Config) {
c.Options[optFootnoteBacklinkTitle] = o.value
}
func (o *withFootnoteBacklinkTitle) SetFootnoteOption(c *FootnoteConfig) {
c.BacklinkTitle = o.value
}
// WithFootnoteBacklinkTitle is a functional option that sets an optional title attribute for footnote backlinks.
func WithFootnoteBacklinkTitle(a []byte) FootnoteOption {
return &withFootnoteBacklinkTitle{a}
}
const optFootnoteLinkClass renderer.OptionName = "FootnoteLinkClass"
type withFootnoteLinkClass struct {
value []byte
}
func (o *withFootnoteLinkClass) SetConfig(c *renderer.Config) {
c.Options[optFootnoteLinkClass] = o.value
}
func (o *withFootnoteLinkClass) SetFootnoteOption(c *FootnoteConfig) {
c.LinkClass = o.value
}
// WithFootnoteLinkClass is a functional option that sets a class for footnote links.
func WithFootnoteLinkClass(a []byte) FootnoteOption {
return &withFootnoteLinkClass{a}
}
const optFootnoteBacklinkClass renderer.OptionName = "FootnoteBacklinkClass"
type withFootnoteBacklinkClass struct {
value []byte
}
func (o *withFootnoteBacklinkClass) SetConfig(c *renderer.Config) {
c.Options[optFootnoteBacklinkClass] = o.value
}
func (o *withFootnoteBacklinkClass) SetFootnoteOption(c *FootnoteConfig) {
c.BacklinkClass = o.value
}
// WithFootnoteBacklinkClass is a functional option that sets a class for footnote backlinks.
func WithFootnoteBacklinkClass(a []byte) FootnoteOption {
return &withFootnoteBacklinkClass{a}
}
const optFootnoteBacklinkHTML renderer.OptionName = "FootnoteBacklinkHTML"
type withFootnoteBacklinkHTML struct {
value []byte
}
func (o *withFootnoteBacklinkHTML) SetConfig(c *renderer.Config) {
c.Options[optFootnoteBacklinkHTML] = o.value
}
func (o *withFootnoteBacklinkHTML) SetFootnoteOption(c *FootnoteConfig) {
c.BacklinkHTML = o.value
}
// WithFootnoteBacklinkHTML is a functional option that sets the HTML content for footnote backlinks.
func WithFootnoteBacklinkHTML(a []byte) FootnoteOption {
return &withFootnoteBacklinkHTML{a}
}
// FootnoteHTMLRenderer is a renderer.NodeRenderer implementation that
// renders FootnoteLink nodes.
type FootnoteHTMLRenderer struct {
html.Config
FootnoteConfig
}
// NewFootnoteHTMLRenderer returns a new FootnoteHTMLRenderer.
func NewFootnoteHTMLRenderer(opts ...html.Option) renderer.NodeRenderer {
func NewFootnoteHTMLRenderer(opts ...FootnoteOption) renderer.NodeRenderer {
r := &FootnoteHTMLRenderer{
Config: html.NewConfig(),
FootnoteConfig: NewFootnoteConfig(),
}
for _, opt := range opts {
opt.SetHTMLOption(&r.Config)
opt.SetFootnoteOption(&r.FootnoteConfig)
}
return r
}
@@ -234,7 +503,7 @@ func NewFootnoteHTMLRenderer(opts ...html.Option) renderer.NodeRenderer {
// RegisterFuncs implements renderer.NodeRenderer.RegisterFuncs.
func (r *FootnoteHTMLRenderer) RegisterFuncs(reg renderer.NodeRendererFuncRegisterer) {
reg.Register(ast.KindFootnoteLink, r.renderFootnoteLink)
reg.Register(ast.KindFootnoteBackLink, r.renderFootnoteBackLink)
reg.Register(ast.KindFootnoteBacklink, r.renderFootnoteBacklink)
reg.Register(ast.KindFootnote, r.renderFootnote)
reg.Register(ast.KindFootnoteList, r.renderFootnoteList)
}
@@ -243,25 +512,45 @@ func (r *FootnoteHTMLRenderer) renderFootnoteLink(w util.BufWriter, source []byt
if entering {
n := node.(*ast.FootnoteLink)
is := strconv.Itoa(n.Index)
_, _ = w.WriteString(`<sup id="fnref:`)
_, _ = w.WriteString(`<sup id="`)
_, _ = w.Write(r.idPrefix(node))
_, _ = w.WriteString(`fnref:`)
_, _ = w.WriteString(is)
_, _ = w.WriteString(`"><a href="#fn:`)
_, _ = w.WriteString(`"><a href="#`)
_, _ = w.Write(r.idPrefix(node))
_, _ = w.WriteString(`fn:`)
_, _ = w.WriteString(is)
_, _ = w.WriteString(`" class="footnote-ref" role="doc-noteref">`)
_, _ = w.WriteString(`" class="`)
_, _ = w.Write(applyFootnoteTemplate(r.FootnoteConfig.LinkClass,
n.Index, n.RefCount))
if len(r.FootnoteConfig.LinkTitle) > 0 {
_, _ = w.WriteString(`" title="`)
_, _ = w.Write(util.EscapeHTML(applyFootnoteTemplate(r.FootnoteConfig.LinkTitle, n.Index, n.RefCount)))
}
_, _ = w.WriteString(`" role="doc-noteref">`)
_, _ = w.WriteString(is)
_, _ = w.WriteString(`</a></sup>`)
}
return gast.WalkContinue, nil
}
func (r *FootnoteHTMLRenderer) renderFootnoteBackLink(w util.BufWriter, source []byte, node gast.Node, entering bool) (gast.WalkStatus, error) {
func (r *FootnoteHTMLRenderer) renderFootnoteBacklink(w util.BufWriter, source []byte, node gast.Node, entering bool) (gast.WalkStatus, error) {
if entering {
n := node.(*ast.FootnoteBackLink)
n := node.(*ast.FootnoteBacklink)
is := strconv.Itoa(n.Index)
_, _ = w.WriteString(` <a href="#fnref:`)
_, _ = w.WriteString(` <a href="#`)
_, _ = w.Write(r.idPrefix(node))
_, _ = w.WriteString(`fnref:`)
_, _ = w.WriteString(is)
_, _ = w.WriteString(`" class="footnote-backref" role="doc-backlink">`)
_, _ = w.WriteString("&#x21a9;&#xfe0e;")
_, _ = w.WriteString(`" class="`)
_, _ = w.Write(applyFootnoteTemplate(r.FootnoteConfig.BacklinkClass, n.Index, n.RefCount))
if len(r.FootnoteConfig.BacklinkTitle) > 0 {
_, _ = w.WriteString(`" title="`)
_, _ = w.Write(util.EscapeHTML(applyFootnoteTemplate(r.FootnoteConfig.BacklinkTitle, n.Index, n.RefCount)))
}
_, _ = w.WriteString(`" role="doc-backlink">`)
_, _ = w.Write(applyFootnoteTemplate(r.FootnoteConfig.BacklinkHTML, n.Index, n.RefCount))
_, _ = w.WriteString(`</a>`)
}
return gast.WalkContinue, nil
@@ -271,7 +560,9 @@ func (r *FootnoteHTMLRenderer) renderFootnote(w util.BufWriter, source []byte, n
n := node.(*ast.Footnote)
is := strconv.Itoa(n.Index)
if entering {
_, _ = w.WriteString(`<li id="fn:`)
_, _ = w.WriteString(`<li id="`)
_, _ = w.Write(r.idPrefix(node))
_, _ = w.WriteString(`fn:`)
_, _ = w.WriteString(is)
_, _ = w.WriteString(`" role="doc-endnote"`)
if node.Attributes() != nil {
@@ -312,11 +603,54 @@ func (r *FootnoteHTMLRenderer) renderFootnoteList(w util.BufWriter, source []byt
return gast.WalkContinue, nil
}
func (r *FootnoteHTMLRenderer) idPrefix(node gast.Node) []byte {
if r.FootnoteConfig.IDPrefix != nil {
return r.FootnoteConfig.IDPrefix
}
if r.FootnoteConfig.IDPrefixFunction != nil {
return r.FootnoteConfig.IDPrefixFunction(node)
}
return []byte("")
}
func applyFootnoteTemplate(b []byte, index, refCount int) []byte {
fast := true
for i, c := range b {
if i != 0 {
if b[i-1] == '^' && c == '^' {
fast = false
break
}
if b[i-1] == '%' && c == '%' {
fast = false
break
}
}
}
if fast {
return b
}
is := []byte(strconv.Itoa(index))
rs := []byte(strconv.Itoa(refCount))
ret := bytes.Replace(b, []byte("^^"), is, -1)
return bytes.Replace(ret, []byte("%%"), rs, -1)
}
type footnote struct {
options []FootnoteOption
}
// Footnote is an extension that allow you to use PHP Markdown Extra Footnotes.
var Footnote = &footnote{}
var Footnote = &footnote{
options: []FootnoteOption{},
}
// NewFootnote returns a new extension with given options.
func NewFootnote(opts ...FootnoteOption) goldmark.Extender {
return &footnote{
options: opts,
}
}
func (e *footnote) Extend(m goldmark.Markdown) {
m.Parser().AddOptions(
@@ -331,6 +665,6 @@ func (e *footnote) Extend(m goldmark.Markdown) {
),
)
m.Renderer().AddOptions(renderer.WithNodeRenderers(
util.Prioritized(NewFootnoteHTMLRenderer(), 500),
util.Prioritized(NewFootnoteHTMLRenderer(e.options...), 500),
))
}

View File

@@ -11,9 +11,9 @@ import (
"github.com/yuin/goldmark/util"
)
var wwwURLRegxp = regexp.MustCompile(`^www\.[-a-zA-Z0-9@:%._\+~#=]{2,256}\.[a-z]+(?:(?:/|[#?])[-a-zA-Z0-9@:%_\+.~#!?&//=\(\);,'">\^{}\[\]` + "`" + `]*)?`)
var wwwURLRegxp = regexp.MustCompile(`^www\.[-a-zA-Z0-9@:%._\+~#=]{1,256}\.[a-z]+(?:[/#?][-a-zA-Z0-9@:%_\+.~#!?&/=\(\);,'">\^{}\[\]` + "`" + `]*)?`)
var urlRegexp = regexp.MustCompile(`^(?:http|https|ftp):\/\/(?:www\.)?[-a-zA-Z0-9@:%._\+~#=]{2,256}\.[a-z]+(?:(?:/|[#?])[-a-zA-Z0-9@:%_+.~#$!?&//=\(\);,'">\^{}\[\]` + "`" + `]*)?`)
var urlRegexp = regexp.MustCompile(`^(?:http|https|ftp)://[-a-zA-Z0-9@:%._\+~#=]{1,256}\.[a-z]+(?::\d+)?(?:[/#?][-a-zA-Z0-9@:%_+.~#$!?&/=\(\);,'">\^{}\[\]` + "`" + `]*)?`)
// An LinkifyConfig struct is a data structure that holds configuration of the
// Linkify extension.
@@ -24,10 +24,12 @@ type LinkifyConfig struct {
EmailRegexp *regexp.Regexp
}
const optLinkifyAllowedProtocols parser.OptionName = "LinkifyAllowedProtocols"
const optLinkifyURLRegexp parser.OptionName = "LinkifyURLRegexp"
const optLinkifyWWWRegexp parser.OptionName = "LinkifyWWWRegexp"
const optLinkifyEmailRegexp parser.OptionName = "LinkifyEmailRegexp"
const (
optLinkifyAllowedProtocols parser.OptionName = "LinkifyAllowedProtocols"
optLinkifyURLRegexp parser.OptionName = "LinkifyURLRegexp"
optLinkifyWWWRegexp parser.OptionName = "LinkifyWWWRegexp"
optLinkifyEmailRegexp parser.OptionName = "LinkifyEmailRegexp"
)
// SetOption implements SetOptioner.
func (c *LinkifyConfig) SetOption(name parser.OptionName, value interface{}) {
@@ -156,10 +158,12 @@ func (s *linkifyParser) Trigger() []byte {
return []byte{' ', '*', '_', '~', '('}
}
var protoHTTP = []byte("http:")
var protoHTTPS = []byte("https:")
var protoFTP = []byte("ftp:")
var domainWWW = []byte("www.")
var (
protoHTTP = []byte("http:")
protoHTTPS = []byte("https:")
protoFTP = []byte("ftp:")
domainWWW = []byte("www.")
)
func (s *linkifyParser) Parse(parent ast.Node, block text.Reader, pc parser.Context) ast.Node {
if pc.IsInLinkLabel() {

View File

@@ -15,6 +15,14 @@ import (
"github.com/yuin/goldmark/util"
)
var escapedPipeCellListKey = parser.NewContextKey()
type escapedPipeCell struct {
Cell *ast.TableCell
Pos []int
Transformed bool
}
// TableCellAlignMethod indicates how table cells are aligned in HTML format.
type TableCellAlignMethod int
@@ -148,7 +156,7 @@ func (b *tableParagraphTransformer) Transform(node *gast.Paragraph, reader text.
if alignments == nil {
continue
}
header := b.parseRow(lines.At(i-1), alignments, true, reader)
header := b.parseRow(lines.At(i-1), alignments, true, reader, pc)
if header == nil || len(alignments) != header.ChildCount() {
return
}
@@ -156,7 +164,7 @@ func (b *tableParagraphTransformer) Transform(node *gast.Paragraph, reader text.
table.Alignments = alignments
table.AppendChild(table, ast.NewTableHeader(header))
for j := i + 1; j < lines.Len(); j++ {
table.AppendChild(table, b.parseRow(lines.At(j), alignments, false, reader))
table.AppendChild(table, b.parseRow(lines.At(j), alignments, false, reader, pc))
}
node.Lines().SetSliced(0, i-1)
node.Parent().InsertAfter(node.Parent(), node, table)
@@ -170,7 +178,7 @@ func (b *tableParagraphTransformer) Transform(node *gast.Paragraph, reader text.
}
}
func (b *tableParagraphTransformer) parseRow(segment text.Segment, alignments []ast.Alignment, isHeader bool, reader text.Reader) *ast.TableRow {
func (b *tableParagraphTransformer) parseRow(segment text.Segment, alignments []ast.Alignment, isHeader bool, reader text.Reader, pc parser.Context) *ast.TableRow {
source := reader.Source()
line := segment.Value(source)
pos := 0
@@ -194,18 +202,39 @@ func (b *tableParagraphTransformer) parseRow(segment text.Segment, alignments []
} else {
alignment = alignments[i]
}
closure := util.FindClosure(line[pos:], byte(0), '|', true, false)
if closure < 0 {
closure = len(line[pos:])
}
var escapedCell *escapedPipeCell
node := ast.NewTableCell()
seg := text.NewSegment(segment.Start+pos, segment.Start+pos+closure)
node.Alignment = alignment
hasBacktick := false
closure := pos
for ; closure < limit; closure++ {
if line[closure] == '`' {
hasBacktick = true
}
if line[closure] == '|' {
if closure == 0 || line[closure-1] != '\\' {
break
} else if hasBacktick {
if escapedCell == nil {
escapedCell = &escapedPipeCell{node, []int{}, false}
escapedList := pc.ComputeIfAbsent(escapedPipeCellListKey,
func() interface{} {
return []*escapedPipeCell{}
}).([]*escapedPipeCell)
escapedList = append(escapedList, escapedCell)
pc.Set(escapedPipeCellListKey, escapedList)
}
escapedCell.Pos = append(escapedCell.Pos, segment.Start+closure-1)
}
}
}
seg := text.NewSegment(segment.Start+pos, segment.Start+closure)
seg = seg.TrimLeftSpace(source)
seg = seg.TrimRightSpace(source)
node.Lines().Append(seg)
node.Alignment = alignment
row.AppendChild(row, node)
pos += closure + 1
pos = closure + 1
}
for ; i < len(alignments); i++ {
row.AppendChild(row, ast.NewTableCell())
@@ -243,6 +272,61 @@ func (b *tableParagraphTransformer) parseDelimiter(segment text.Segment, reader
return alignments
}
type tableASTTransformer struct {
}
var defaultTableASTTransformer = &tableASTTransformer{}
// NewTableASTTransformer returns a parser.ASTTransformer for tables.
func NewTableASTTransformer() parser.ASTTransformer {
return defaultTableASTTransformer
}
func (a *tableASTTransformer) Transform(node *gast.Document, reader text.Reader, pc parser.Context) {
lst := pc.Get(escapedPipeCellListKey)
if lst == nil {
return
}
pc.Set(escapedPipeCellListKey, nil)
for _, v := range lst.([]*escapedPipeCell) {
if v.Transformed {
continue
}
_ = gast.Walk(v.Cell, func(n gast.Node, entering bool) (gast.WalkStatus, error) {
if !entering || n.Kind() != gast.KindCodeSpan {
return gast.WalkContinue, nil
}
for c := n.FirstChild(); c != nil; {
next := c.NextSibling()
if c.Kind() != gast.KindText {
c = next
continue
}
parent := c.Parent()
ts := &c.(*gast.Text).Segment
n := c
for _, v := range lst.([]*escapedPipeCell) {
for _, pos := range v.Pos {
if ts.Start <= pos && pos < ts.Stop {
segment := n.(*gast.Text).Segment
n1 := gast.NewRawTextSegment(segment.WithStop(pos))
n2 := gast.NewRawTextSegment(segment.WithStart(pos + 1))
parent.InsertAfter(parent, n, n1)
parent.InsertAfter(parent, n1, n2)
parent.RemoveChild(parent, n)
n = n2
v.Transformed = true
}
}
}
c = next
}
return gast.WalkContinue, nil
})
}
}
// TableHTMLRenderer is a renderer.NodeRenderer implementation that
// renders Table nodes.
type TableHTMLRenderer struct {
@@ -419,7 +503,7 @@ func (r *TableHTMLRenderer) renderTableCell(w util.BufWriter, source []byte, nod
cob.AppendByte(';')
}
style := fmt.Sprintf("text-align:%s", n.Alignment.String())
cob.Append(util.StringToReadOnlyBytes(style))
cob.AppendString(style)
n.SetAttributeString("style", cob.Bytes())
}
}
@@ -454,9 +538,14 @@ func NewTable(opts ...TableOption) goldmark.Extender {
}
func (e *table) Extend(m goldmark.Markdown) {
m.Parser().AddOptions(parser.WithParagraphTransformers(
util.Prioritized(NewTableParagraphTransformer(), 200),
))
m.Parser().AddOptions(
parser.WithParagraphTransformers(
util.Prioritized(NewTableParagraphTransformer(), 200),
),
parser.WithASTTransformers(
util.Prioritized(defaultTableASTTransformer, 0),
),
)
m.Renderer().AddOptions(renderer.WithNodeRenderers(
util.Prioritized(NewTableHTMLRenderer(e.options...), 500),
))

View File

@@ -1,3 +1,3 @@
module github.com/yuin/goldmark
go 1.13
go 1.15

View File

@@ -49,6 +49,12 @@ func (b *codeBlockParser) Continue(node ast.Node, reader text.Reader, pc Context
}
reader.AdvanceAndSetPadding(pos, padding)
_, segment = reader.PeekLine()
// if code block line starts with a tab, keep a tab as it is.
if segment.Padding != 0 {
preserveLeadingTabInCodeBlock(&segment, reader)
}
node.Lines().Append(segment)
reader.Advance(segment.Len() - 1)
return Continue | NoChildren
@@ -77,3 +83,14 @@ func (b *codeBlockParser) CanInterruptParagraph() bool {
func (b *codeBlockParser) CanAcceptIndentedLine() bool {
return true
}
func preserveLeadingTabInCodeBlock(segment *text.Segment, reader text.Reader) {
offsetWithPadding := reader.LineOffset()
sl, ss := reader.Position()
reader.SetPosition(sl, text.NewSegment(ss.Start-1, ss.Stop))
if offsetWithPadding == reader.LineOffset() {
segment.Padding = 0
segment.Start--
}
reader.SetPosition(sl, ss)
}

View File

@@ -71,6 +71,10 @@ func (b *fencedCodeBlockParser) Open(parent ast.Node, reader text.Reader, pc Con
func (b *fencedCodeBlockParser) Continue(node ast.Node, reader text.Reader, pc Context) State {
line, segment := reader.PeekLine()
fdata := pc.Get(fencedCodeBlockInfoKey).(*fenceData)
// if code block line starts with a tab, keep a tab as it is.
if segment.Padding != 0 {
preserveLeadingTabInCodeBlock(&segment, reader)
}
w, pos := util.IndentWidth(line, reader.LineOffset())
if w < 4 {
i := pos

View File

@@ -2,7 +2,6 @@ package parser
import (
"fmt"
"regexp"
"strings"
"github.com/yuin/goldmark/ast"
@@ -113,8 +112,6 @@ func (s *linkParser) Trigger() []byte {
return []byte{'!', '[', ']'}
}
var linkDestinationRegexp = regexp.MustCompile(`\s*([^\s].+)`)
var linkTitleRegexp = regexp.MustCompile(`\s+(\)|["'\(].+)`)
var linkBottom = NewContextKey()
func (s *linkParser) Parse(parent ast.Node, block text.Reader, pc Context) ast.Node {
@@ -293,20 +290,17 @@ func (s *linkParser) parseLink(parent ast.Node, last *linkLabelState, block text
func parseLinkDestination(block text.Reader) ([]byte, bool) {
block.SkipSpaces()
line, _ := block.PeekLine()
buf := []byte{}
if block.Peek() == '<' {
i := 1
for i < len(line) {
c := line[i]
if c == '\\' && i < len(line)-1 && util.IsPunct(line[i+1]) {
buf = append(buf, '\\', line[i+1])
i += 2
continue
} else if c == '>' {
block.Advance(i + 1)
return line[1:i], true
}
buf = append(buf, c)
i++
}
return nil, false
@@ -316,7 +310,6 @@ func parseLinkDestination(block text.Reader) ([]byte, bool) {
for i < len(line) {
c := line[i]
if c == '\\' && i < len(line)-1 && util.IsPunct(line[i+1]) {
buf = append(buf, '\\', line[i+1])
i += 2
continue
} else if c == '(' {
@@ -329,7 +322,6 @@ func parseLinkDestination(block text.Reader) ([]byte, bool) {
} else if util.IsSpace(c) {
break
}
buf = append(buf, c)
i++
}
block.Advance(i)

View File

@@ -138,6 +138,9 @@ type Context interface {
// Get returns a value associated with the given key.
Get(ContextKey) interface{}
// ComputeIfAbsent computes and stores a value for the given key when one is absent, then returns it.
ComputeIfAbsent(ContextKey, func() interface{}) interface{}
// Set sets the given value to the context.
Set(ContextKey, interface{})
@@ -252,6 +255,15 @@ func (p *parseContext) Get(key ContextKey) interface{} {
return p.store[key]
}
func (p *parseContext) ComputeIfAbsent(key ContextKey, f func() interface{}) interface{} {
v := p.store[key]
if v == nil {
v = f()
p.store[key] = v
}
return v
}
func (p *parseContext) Set(key ContextKey, value interface{}) {
p.store[key] = value
}
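ComputeIfAbsent is a get-or-create helper; the table extension above uses it to lazily create the escaped-cell list before appending and re-storing it. The same semantics on a plain map, as a sketch:

```go
package main

import "fmt"

type contextKey int

// computeIfAbsent mirrors the method above on a plain map: return the
// stored value, or compute, cache, and return it when absent.
func computeIfAbsent(store map[contextKey]interface{}, key contextKey, f func() interface{}) interface{} {
	v := store[key]
	if v == nil {
		v = f()
		store[key] = v
	}
	return v
}

func main() {
	store := map[contextKey]interface{}{}
	const listKey contextKey = 1
	list := computeIfAbsent(store, listKey, func() interface{} {
		return []string{}
	}).([]string)
	// Appending may reallocate, so the result is written back, just as
	// the table parser follows ComputeIfAbsent with pc.Set.
	list = append(list, "escaped cell")
	store[listKey] = list
	fmt.Println(store[listKey])
}
```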

View File

@@ -2,10 +2,11 @@ package parser
import (
"bytes"
"regexp"
"github.com/yuin/goldmark/ast"
"github.com/yuin/goldmark/text"
"github.com/yuin/goldmark/util"
"regexp"
)
type rawHTMLParser struct {
@@ -67,8 +68,6 @@ func (s *rawHTMLParser) parseSingleLineRegexp(reg *regexp.Regexp, block text.Rea
return node
}
var dummyMatch = [][]byte{}
func (s *rawHTMLParser) parseMultiLineRegexp(reg *regexp.Regexp, block text.Reader, pc Context) ast.Node {
sline, ssegment := block.Position()
if block.Match(reg) {
@@ -102,7 +101,3 @@ func (s *rawHTMLParser) parseMultiLineRegexp(reg *regexp.Regexp, block text.Read
}
return nil
}
func (s *rawHTMLParser) CloseBlock(parent ast.Node, pc Context) {
// nothing to do
}

View File

@@ -37,6 +37,12 @@ func (b *CopyOnWriteBuffer) Write(value []byte) {
b.buffer = append(b.buffer, value...)
}
// WriteString writes the given string to the buffer.
// WriteString allocates a new buffer and clears it the first time.
func (b *CopyOnWriteBuffer) WriteString(value string) {
b.Write(StringToReadOnlyBytes(value))
}
// Append appends given bytes to the buffer.
// Append copy buffer at the first time.
func (b *CopyOnWriteBuffer) Append(value []byte) {
@@ -49,6 +55,12 @@ func (b *CopyOnWriteBuffer) Append(value []byte) {
b.buffer = append(b.buffer, value...)
}
// AppendString appends the given string to the buffer.
// AppendString copies the buffer the first time.
func (b *CopyOnWriteBuffer) AppendString(value string) {
b.Append(StringToReadOnlyBytes(value))
}
// WriteByte writes the given byte to the buffer.
// WriteByte allocates a new buffer and clears it the first time.
func (b *CopyOnWriteBuffer) WriteByte(c byte) {
@@ -804,7 +816,7 @@ func IsPunct(c byte) bool {
return punctTable[c] == 1
}
// IsPunct returns true if the given rune is a punctuation, otherwise false.
// IsPunctRune returns true if the given rune is a punctuation character, otherwise false.
func IsPunctRune(r rune) bool {
return int32(r) <= 256 && IsPunct(byte(r)) || unicode.IsPunct(r)
}
@@ -814,7 +826,7 @@ func IsSpace(c byte) bool {
return spaceTable[c] == 1
}
// IsSpace returns true if the given rune is a space, otherwise false.
// IsSpaceRune returns true if the given rune is a space, otherwise false.
func IsSpaceRune(r rune) bool {
return int32(r) <= 256 && IsSpace(byte(r)) || unicode.IsSpace(r)
}

vendor/modules.txt
View File

@@ -745,7 +745,7 @@ github.com/xi2/xz
# github.com/yohcop/openid-go v1.0.0
## explicit
github.com/yohcop/openid-go
# github.com/yuin/goldmark v1.2.1
# github.com/yuin/goldmark v1.3.3
## explicit
github.com/yuin/goldmark
github.com/yuin/goldmark/ast

View File

@@ -1,3 +1,4 @@
import {htmlEscape} from 'escape-goat';
import {svg} from '../svg.js';
const {AppSubUrl} = window.config;
@@ -31,7 +32,7 @@ function issuePopup(owner, repo, index, $element) {
if ((red * 0.299 + green * 0.587 + blue * 0.114) > 125) {
color = '#000000';
}
labels += `<div class="ui label" style="color: ${color}; background-color:#${label.color};">${label.name}</div>`;
labels += `<div class="ui label" style="color: ${color}; background-color:#${label.color};">${htmlEscape(label.name)}</div>`;
}
if (labels.length > 0) {
labels = `<p>${labels}</p>`;
@@ -64,9 +65,9 @@ function issuePopup(owner, repo, index, $element) {
},
html: `
<div>
<p><small>${issue.repository.full_name} on ${createdAt}</small></p>
<p><span class="${color}">${svg(octicon)}</span> <strong>${issue.title}</strong> #${index}</p>
<p>${body}</p>
<p><small>${htmlEscape(issue.repository.full_name)} on ${createdAt}</small></p>
<p><span class="${color}">${svg(octicon)}</span> <strong>${htmlEscape(issue.title)}</strong> #${index}</p>
<p>${htmlEscape(body)}</p>
${labels}
</div>
`

View File

@@ -3592,18 +3592,21 @@ function initIssueList() {
fullTextSearch: true
});
function excludeLabel (item) {
const href = $(item).attr('href');
const id = $(item).data('label-id');
const regStr = `labels=((?:-?[0-9]+%2c)*)(${id})((?:%2c-?[0-9]+)*)&`;
const newStr = 'labels=$1-$2$3&';
window.location = href.replace(new RegExp(regStr), newStr);
}
$('.menu a.label-filter-item').each(function () {
$(this).on('click', function (e) {
if (e.altKey) {
e.preventDefault();
const href = $(this).attr('href');
const id = $(this).data('label-id');
const regStr = `labels=(-?[0-9]+%2c)*(${id})(%2c-?[0-9]+)*&`;
const newStr = 'labels=$1-$2$3&';
window.location = href.replace(new RegExp(regStr), newStr);
excludeLabel(this);
}
});
});
@@ -3611,17 +3614,8 @@ function initIssueList() {
$('.menu .ui.dropdown.label-filter').on('keydown', (e) => {
if (e.altKey && e.keyCode === 13) {
const selectedItems = $('.menu .ui.dropdown.label-filter .menu .item.selected');
if (selectedItems.length > 0) {
const item = $(selectedItems[0]);
const href = item.attr('href');
const id = item.data('label-id');
const regStr = `labels=(-?[0-9]+%2c)*(${id})(%2c-?[0-9]+)*&`;
const newStr = 'labels=$1-$2$3&';
window.location = href.replace(new RegExp(regStr), newStr);
excludeLabel($(selectedItems[0]));
}
}
});

View File

@@ -5,7 +5,7 @@ const headingSelector = '.markdown h1, .markdown h2, .markdown h3, .markdown h4,
function scrollToAnchor() {
if (document.querySelector(':target')) return;
if (!window.location.hash || window.location.hash.length <= 1) return;
const id = window.location.hash.substring(1);
const id = decodeURIComponent(window.location.hash.substring(1));
const el = document.getElementById(`user-content-${id}`);
if (el) {
el.scrollIntoView();