Compare commits

...

20 Commits

Author SHA1 Message Date
6543
f4729e2418 Add Changelog v1.16.7 (#19575)
Co-authored-by: techknowlogick <matti@mdranta.net>
Co-authored-by: Gusted <williamzijl7@hotmail.com>
2022-05-02 05:41:09 +02:00
6543
f7330fd027 Dont overwrite err with nil (part #19572) (#19574)
* Dont overwrite err with nil (part #19572)


Co-authored-by: Gusted <williamzijl7@hotmail.com>
2022-05-02 01:54:20 +02:00
6543
755d8e21ad Migration: only write commit-graph if wiki clone was successfull (#19563) (#19568) 2022-05-01 00:22:42 +02:00
Jimmy Praet
7c0bf06d96 Respect DefaultUserIsRestricted system default when creating new user (#19310 ) (#19560) 2022-04-30 15:00:14 +02:00
Gusted
0d196e29e8 Don't error when branch's commit doesn't exist (#19547) (#19548)
- Backport #19547
  - If one of the branches no longer exists, don't throw an error; it's possible that the branch was deleted while the request was being processed. Simply skip and disregard it.
  - Resolves #19541
2022-04-29 12:25:19 +02:00
wxiaoguang
b86606fa38 Support hostname:port to pass host matcher's check (#19543) (#19544)
Backport #19543 
hostmatcher: split the hostname from the `hostname:port` string and use the bare hostname to do the match.
2022-04-29 01:41:58 +08:00
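
For illustration only (not code from this changeset): a minimal sketch of the hostname-splitting idea, using the standard library's net.SplitHostPort and hypothetical names.

```go
package main

import (
	"fmt"
	"net"
)

// matchable returns the bare hostname to feed into a host matcher.
// If the input has no port, net.SplitHostPort returns an error and we
// fall back to the original string, mirroring the fix described above.
func matchable(host string) string {
	hostname, _, err := net.SplitHostPort(host)
	if err != nil {
		return host
	}
	return hostname
}

func main() {
	fmt.Println(matchable("example.com:8080")) // example.com
	fmt.Println(matchable("example.com"))      // example.com
}
```
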
zeripath
74602bb487 Prevent intermittent race in attribute reader close (#19537) (#19539)
Backport #19537

There is a rare race whereby the c.running channel could be closed twice. Looking
at the code I do not see a need for this c.running channel, so I think we can
remove it. (The c.running channel may have been an attempt to prevent a hang,
but the use of os.Pipes should already prevent that.)

Signed-off-by: Andrew Thornton <art27@cantab.net>
2022-04-28 17:00:01 +02:00
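
As an aside, here is a self-contained sketch of the kind of double-close race described above. It is illustrative only (invented names, not the CheckAttributeReader code): two goroutines each do a "check then close" on the same channel, and if both observe it as open, the second close panics with "close of closed channel".

```go
package main

import "sync"

func main() {
	running := make(chan struct{})
	var wg sync.WaitGroup
	for i := 0; i < 2; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			select {
			case <-running: // already closed, nothing to do
			default:
				close(running) // both goroutines may reach this line
			}
		}()
	}
	wg.Wait()
}
```
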
Gusted
1465e0cbb2 Fix 64-bit atomic operations on 32-bit machines (#19531) (#19532)
- Backport #19531
  - Doing 64-bit atomic operations on 32-bit machines is tricky in Go, as they are only safe under a certain set of conditions (https://pkg.go.dev/sync/atomic#pkg-note-BUG).
  - This PR fixes such a case where those conditions weren't met: it moves the int64 to the first field of the struct, which guarantees the 64-bit alignment needed for atomic operations on this field on 32-bit machines.
  - Resolves #19518
2022-04-27 10:32:28 -05:00
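
A minimal sketch of the alignment rule the fix relies on, with made-up names rather than the actual WorkerPool struct: on 32-bit platforms only the first word of an allocated struct is guaranteed to be 64-bit aligned, so the int64 used with sync/atomic has to come first.

```go
package main

import (
	"fmt"
	"sync"
	"sync/atomic"
)

type counterPool struct {
	numInQueue int64 // must be the first field for atomic ops on 32-bit machines
	lock       sync.Mutex
	name       string
}

func main() {
	p := &counterPool{name: "demo"}
	atomic.AddInt64(&p.numInQueue, 1)
	fmt.Println(atomic.LoadInt64(&p.numInQueue))
}
```
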
Lunny Xiao
928b603d19 Fix migrate release from github (#19510) (#19523)
* Fix migrate release from github

* Fix bug
2022-04-27 14:46:00 +02:00
Lunny Xiao
8ff542c1a2 When view _Siderbar or _Footer, just display once (#19501) (#19522)
Co-authored-by: zeripath <art27@cantab.net>
2022-04-27 14:04:53 +02:00
zeripath
39a0db6ecf Prevent dangling archiver goroutine (#19516) (#19526)
Backport #19516

Within doArchive there is a service goroutine that performs the
archiving function. This goroutine reports its error using a `chan
error` called `done`. Prior to this PR the channel had zero capacity,
meaning that the goroutine would block until the value was read from
`done`; however, there are a couple of ways in which this channel might
never be read.

The simplest solution is to add a single slot of capacity to the
channel, which means the goroutine will always complete; even if the
`done` channel is never read, it will simply be garbage collected away.

(The PR also adjusts two other places, in the indexer setup, which do not
leak but where blocking the sending goroutine is equally unnecessary, so we
just add a small amount of capacity and let the sending goroutine complete
as soon as it can.)

Signed-off-by: Andrew Thornton <art27@cantab.net>

Co-authored-by: 6543 <6543@obermui.de>

Co-authored-by: 6543 <6543@obermui.de>
2022-04-27 16:05:52 +08:00
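
A small, self-contained illustration of the buffered-channel pattern described above; the function and names are invented for the example and are not the archiver code itself. With capacity 1 the worker can always deliver its result and exit, even if the caller has given up and never reads.

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

func doWork() error { return errors.New("archive failed") }

func main() {
	done := make(chan error, 1) // capacity 1: the send never blocks
	go func() {
		done <- doWork() // completes even if nobody is listening
	}()

	select {
	case err := <-done:
		fmt.Println("finished:", err)
	case <-time.After(10 * time.Millisecond):
		fmt.Println("caller timed out; worker still exits cleanly")
	}
}
```
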
techknowlogick
9cc93c05cd Unset git author/committer variables when running integration tests (#19512) (#19519)
TestAPIGitTag (and likely others) will fail if the running environment contains
GIT_AUTHOR_NAME and other env variables like it.

This PR simply unsets these when running the integration tests.

Fix #14247

Signed-off-by: Andrew Thornton <art27@cantab.net>

Co-authored-by: zeripath <art27@cantab.net>
2022-04-26 19:23:54 -04:00
Lunny Xiao
b31418edd9 Fix blame page select range error and some typos (#19503)
Partially backport #19500 and fix two typos.
2022-04-26 20:19:52 +01:00
6543
242f7f1a52 Add notags to fetch (#19487) (#19490)
* Add notags to fetch (#19487)

* gofumpt
2022-04-25 20:26:17 +02:00
6543
8d7f1e430a User specific repoID or xorm builder conditions for issue search (#19475) (#19476) 2022-04-25 15:28:47 +02:00
Pilou
a6b32adc45 [doctor] authorized-keys: fix displayed check name (backport #19464) (#19484)
The registered check name is authorized-keys, not authorized_keys.
2022-04-25 13:45:18 +02:00
Gusted
1f0dca4614 Mark TemplateLoading error as "UnprocessableEntity" (#19445) (#19446)
* Mark TemplateLoading error as "UnprocessableEntity" (#19445)

- Backport #19445
  - Don't return an Internal Server Error if the user provides an incorrect label template; return Unprocessable Entity instead.
  - Resolves #19399

- dep: upgrade: github.com/gogs/chardet
2022-04-22 21:07:57 +02:00
6543
1d665da32f Prevent dangling cat-file calls (goroutine alternative) (#19454) (#19466)
If an `os/exec.Command` is passed something other than an `*os.File` as an
input/output, Go will create `os.Pipe`s and wait for their closure in
`cmd.Wait()`. If the code that follows is responsible for closing `io.Pipe`s
or other handles, then on process death from context cancellation the `Wait`
can hang.

There are two possible solutions:

1. use `os.Pipe` as the input/output as `cmd.Wait` does not wait for these.
2. create a goroutine waiting on the context cancellation that will close the inputs.

This PR provides the second option - which is a simpler change that can
be more easily backported.

Closes #19448

Signed-off-by: Andrew Thornton <art27@cantab.net>

Co-authored-by: zeripath <art27@cantab.net>
2022-04-22 16:58:50 +01:00
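
A rough, self-contained sketch of "option 2" above, with invented names; it assumes a Unix-like environment where `cat` exists and is not the actual Gitea change. A goroutine watches the context and closes the pipe handed to the subprocess, so `cmd.Wait()` cannot hang after the process is killed by cancellation.

```go
package main

import (
	"context"
	"io"
	"os/exec"
	"time"
)

func runWithStdin(ctx context.Context, stdin io.ReadCloser) error {
	cmd := exec.CommandContext(ctx, "cat")
	cmd.Stdin = stdin // not an *os.File, so exec copies input via an internal pipe

	// Ensure the input is closed as soon as the context is cancelled,
	// unblocking the copy that Wait would otherwise wait on forever.
	go func() {
		<-ctx.Done()
		_ = stdin.Close()
	}()

	if err := cmd.Start(); err != nil {
		return err
	}
	return cmd.Wait()
}

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 50*time.Millisecond)
	defer cancel()
	r, _ := io.Pipe() // the writer is intentionally never closed, to mimic the hang
	_ = runWithStdin(ctx, r)
}
```
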
Gusted
09adc26eb6 Set correct PR status on 3way on conflict checking (#19457) (#19458)
- Backport #19457
  - When 3-way merge is enabled for conflict checking, it has an interesting new behavior: it doesn't return an error when it finds a conflict. Change the condition to not check for the error, but instead check whether ConflictedFiles is populated; this fixes an issue whereby the PR status wasn't set correctly on conflicted PRs.
  - Refactor the mergeable property (which was set incorrectly and led me to this bug) to be more maintainable.
  - Add a dedicated test for conflict checking, which should prevent future issues like this.
  - Ref: fixes the latest error for https://gitea.com/gitea/go-sdk/pulls/579

Co-authored-by: zeripath <art27@cantab.net>
2022-04-22 09:11:42 +08:00
6543
297346a762 RepoAssignment ensure to close before overwrite (#19449) (#19460)
* Check if a GitRepo is already open and close it if so

* Only run RepoAssignment once
2022-04-21 18:55:44 +02:00
51 changed files with 470 additions and 242 deletions

View File

@@ -4,6 +4,29 @@ This changelog goes through all the changes that have been made in each release
 without substantial changes to our git log; to see the highlights of what has
 been added to each release, please refer to the [blog](https://blog.gitea.io).
+
+## [1.16.7](https://github.com/go-gitea/gitea/releases/tag/v1.16.7) - 2022-05-02
+
+* SECURITY
+  * Escape git fetch remote (#19487) (#19490)
+* BUGFIXES
+  * Don't overwrite err with nil (#19572) (#19574)
+  * On Migrations, only write commit-graph if wiki clone was successful (#19563) (#19568)
+  * Respect DefaultUserIsRestricted system default when creating new user (#19310) (#19560)
+  * Don't error when branch's commit doesn't exist (#19547) (#19548)
+  * Support `hostname:port` to pass host matcher's check (#19543) (#19544)
+  * Prevent intermittent race in attribute reader close (#19537) (#19539)
+  * Fix 64-bit atomic operations on 32-bit machines (#19531) (#19532)
+  * Prevent dangling archiver goroutine (#19516) (#19526)
+  * Fix migrate release from github (#19510) (#19523)
+  * When view _Siderbar or _Footer, just display once (#19501) (#19522)
+  * Fix blame page select range error and some typos (#19503)
+  * Fix name of doctor fix "authorized-keys" in hints (#19464) (#19484)
+  * User specific repoID or xorm builder conditions for issue search (#19475) (#19476)
+  * Prevent dangling cat-file calls (goroutine alternative) (#19454) (#19466)
+  * RepoAssignment ensure to close before overwrite (#19449) (#19460)
+  * Set correct PR status on 3way on conflict checking (#19457) (#19458)
+  * Mark TemplateLoading error as "UnprocessableEntity" (#19445) (#19446)
+
 ## [1.16.6](https://github.com/go-gitea/gitea/releases/tag/v1.16.6) - 2022-04-20
 * ENHANCEMENTS

View File

@@ -25,6 +25,7 @@ import (
     repo_module "code.gitea.io/gitea/modules/repository"
     "code.gitea.io/gitea/modules/setting"
     "code.gitea.io/gitea/modules/storage"
+    "code.gitea.io/gitea/modules/util"
     auth_service "code.gitea.io/gitea/services/auth"
     "code.gitea.io/gitea/services/auth/source/oauth2"
     "code.gitea.io/gitea/services/auth/source/smtp"
@@ -113,6 +114,10 @@ var (
         Name:  "access-token",
         Usage: "Generate access token for the user",
     },
+    cli.BoolFlag{
+        Name:  "restricted",
+        Usage: "Make a restricted user account",
+    },
 },
 }
@@ -537,17 +542,26 @@ func runCreateUser(c *cli.Context) error {
     changePassword = c.Bool("must-change-password")
 }
+restricted := util.OptionalBoolNone
+if c.IsSet("restricted") {
+    restricted = util.OptionalBoolOf(c.Bool("restricted"))
+}
 u := &user_model.User{
     Name:               username,
     Email:              c.String("email"),
     Passwd:             password,
-    IsActive:           true,
     IsAdmin:            c.Bool("admin"),
     MustChangePassword: changePassword,
-    Theme:              setting.UI.DefaultTheme,
 }
-if err := user_model.CreateUser(u); err != nil {
+overwriteDefault := &user_model.CreateUserOverwriteOptions{
+    IsActive:     util.OptionalBoolTrue,
+    IsRestricted: restricted,
+}
+if err := user_model.CreateUser(u, overwriteDefault); err != nil {
     return fmt.Errorf("CreateUser: %v", err)
 }

View File

@@ -43,7 +43,7 @@ Vous devriez avoir une instance fonctionnelle de Gitea. Pour accèder à l'inter
 ## Named Volumes
-Ce guide aboutira à une installation avec les données Gita et PostgreSQL stockées dans des volumes nommés. Cela permet une sauvegarde, une restauration et des mises à niveau en toute simplicité.
+Ce guide aboutira à une installation avec les données Gitea et PostgreSQL stockées dans des volumes nommés. Cela permet une sauvegarde, une restauration et des mises à niveau en toute simplicité.
 ### The Database

go.mod (2 changed lines)
View File

@@ -37,7 +37,7 @@ require (
 github.com/go-swagger/go-swagger v0.27.0
 github.com/go-testfixtures/testfixtures/v3 v3.6.1
 github.com/gobwas/glob v0.2.3
-github.com/gogs/chardet v0.0.0-20191104214054-4b6791f73a28
+github.com/gogs/chardet v0.0.0-20211120154057-b7413eaefb8f
 github.com/gogs/cron v0.0.0-20171120032916-9f6c956d3e14
 github.com/gogs/go-gogs-client v0.0.0-20210131175652-1d7215cd8d85
 github.com/golang-jwt/jwt/v4 v4.3.0

go.sum (4 changed lines)
View File

@@ -649,8 +649,8 @@ github.com/gogo/protobuf v1.3.0/go.mod h1:SlYgWuQ5SjCEi6WLHjHCa1yvBfUnHcTbrrZtXP
 github.com/gogo/protobuf v1.3.1/go.mod h1:SlYgWuQ5SjCEi6WLHjHCa1yvBfUnHcTbrrZtXPKa29o=
 github.com/gogo/protobuf v1.3.2 h1:Ov1cvc58UF3b5XjBnZv7+opcTcQFZebYjWzi34vdm4Q=
 github.com/gogo/protobuf v1.3.2/go.mod h1:P1XiOD3dCwIKUDQYPy72D8LYyHL2YPYrpS2s69NZV8Q=
-github.com/gogs/chardet v0.0.0-20191104214054-4b6791f73a28 h1:gBeyun7mySAKWg7Fb0GOcv0upX9bdaZScs8QcRo8mEY=
-github.com/gogs/chardet v0.0.0-20191104214054-4b6791f73a28/go.mod h1:Pcatq5tYkCW2Q6yrR2VRHlbHpZ/R4/7qyL1TCF7vl14=
+github.com/gogs/chardet v0.0.0-20211120154057-b7413eaefb8f h1:3BSP1Tbs2djlpprl7wCLuiqMaUh5SJkkzI2gDs+FgLs=
+github.com/gogs/chardet v0.0.0-20211120154057-b7413eaefb8f/go.mod h1:Pcatq5tYkCW2Q6yrR2VRHlbHpZ/R4/7qyL1TCF7vl14=
 github.com/gogs/cron v0.0.0-20171120032916-9f6c956d3e14 h1:yXtpJr/LV6PFu4nTLgfjQdcMdzjbqqXMEnHfq0Or6p8=
 github.com/gogs/cron v0.0.0-20171120032916-9f6c956d3e14/go.mod h1:jPoNZLWDAqA5N3G5amEoiNbhVrmM+ZQEcnQvNQ2KaZk=
 github.com/gogs/go-gogs-client v0.0.0-20210131175652-1d7215cd8d85 h1:UjoPNDAQ5JPCjlxoJd6K8ALZqSDDhk2ymieAZOVaDg0=

View File

@@ -112,6 +112,13 @@ func TestMain(m *testing.M) {
     }
 }
+os.Unsetenv("GIT_AUTHOR_NAME")
+os.Unsetenv("GIT_AUTHOR_EMAIL")
+os.Unsetenv("GIT_AUTHOR_DATE")
+os.Unsetenv("GIT_COMMITTER_NAME")
+os.Unsetenv("GIT_COMMITTER_EMAIL")
+os.Unsetenv("GIT_COMMITTER_DATE")
 err := unittest.InitFixtures(
     unittest.FixturesOptions{
         Dir: filepath.Join(filepath.Dir(setting.AppPath), "models/fixtures/"),

View File

@@ -25,6 +25,8 @@ import (
     api "code.gitea.io/gitea/modules/structs"
     "code.gitea.io/gitea/modules/test"
     "code.gitea.io/gitea/services/pull"
+    repo_service "code.gitea.io/gitea/services/repository"
+    files_service "code.gitea.io/gitea/services/repository/files"
     "github.com/stretchr/testify/assert"
     "github.com/unknwon/i18n"
@@ -65,7 +67,7 @@ func testPullCleanUp(t *testing.T, session *TestSession, user, repo, pullnum str
 func TestPullMerge(t *testing.T) {
     onGiteaRun(t, func(t *testing.T, giteaURL *url.URL) {
-        hookTasks, err := webhook.HookTasks(1, 1) //Retrieve previous hook number
+        hookTasks, err := webhook.HookTasks(1, 1) // Retrieve previous hook number
         assert.NoError(t, err)
         hookTasksLenBefore := len(hookTasks)
@@ -87,7 +89,7 @@ func TestPullMerge(t *testing.T) {
 func TestPullRebase(t *testing.T) {
     onGiteaRun(t, func(t *testing.T, giteaURL *url.URL) {
-        hookTasks, err := webhook.HookTasks(1, 1) //Retrieve previous hook number
+        hookTasks, err := webhook.HookTasks(1, 1) // Retrieve previous hook number
         assert.NoError(t, err)
         hookTasksLenBefore := len(hookTasks)
@@ -109,7 +111,7 @@ func TestPullRebase(t *testing.T) {
 func TestPullRebaseMerge(t *testing.T) {
     onGiteaRun(t, func(t *testing.T, giteaURL *url.URL) {
-        hookTasks, err := webhook.HookTasks(1, 1) //Retrieve previous hook number
+        hookTasks, err := webhook.HookTasks(1, 1) // Retrieve previous hook number
         assert.NoError(t, err)
         hookTasksLenBefore := len(hookTasks)
@@ -131,7 +133,7 @@ func TestPullRebaseMerge(t *testing.T) {
 func TestPullSquash(t *testing.T) {
     onGiteaRun(t, func(t *testing.T, giteaURL *url.URL) {
-        hookTasks, err := webhook.HookTasks(1, 1) //Retrieve previous hook number
+        hookTasks, err := webhook.HookTasks(1, 1) // Retrieve previous hook number
         assert.NoError(t, err)
         hookTasksLenBefore := len(hookTasks)
@@ -335,3 +337,74 @@ func TestCantMergeUnrelated(t *testing.T) {
         gitRepo.Close()
     })
 }
func TestConflictChecking(t *testing.T) {
onGiteaRun(t, func(t *testing.T, giteaURL *url.URL) {
user := unittest.AssertExistsAndLoadBean(t, &user_model.User{ID: 2}).(*user_model.User)
// Create new clean repo to test conflict checking.
baseRepo, err := repo_service.CreateRepository(user, user, models.CreateRepoOptions{
Name: "conflict-checking",
Description: "Tempo repo",
AutoInit: true,
Readme: "Default",
DefaultBranch: "main",
})
assert.NoError(t, err)
assert.NotEmpty(t, baseRepo)
// create a commit on new branch.
_, err = files_service.CreateOrUpdateRepoFile(baseRepo, user, &files_service.UpdateRepoFileOptions{
TreePath: "important_file",
Message: "Add a important file",
Content: "Just a non-important file",
IsNewFile: true,
OldBranch: "main",
NewBranch: "important-secrets",
})
assert.NoError(t, err)
// create a commit on main branch.
_, err = files_service.CreateOrUpdateRepoFile(baseRepo, user, &files_service.UpdateRepoFileOptions{
TreePath: "important_file",
Message: "Add a important file",
Content: "Not the same content :P",
IsNewFile: true,
OldBranch: "main",
NewBranch: "main",
})
assert.NoError(t, err)
// create Pull to merge the important-secrets branch into main branch.
pullIssue := &models.Issue{
RepoID: baseRepo.ID,
Title: "PR with conflict!",
PosterID: user.ID,
Poster: user,
IsPull: true,
}
pullRequest := &models.PullRequest{
HeadRepoID: baseRepo.ID,
BaseRepoID: baseRepo.ID,
HeadBranch: "important-secrets",
BaseBranch: "main",
HeadRepo: baseRepo,
BaseRepo: baseRepo,
Type: models.PullRequestGitea,
}
err = pull.NewPullRequest(baseRepo, pullIssue, nil, nil, pullRequest, nil)
assert.NoError(t, err)
issue := unittest.AssertExistsAndLoadBean(t, &models.Issue{Title: "PR with conflict!"}).(*models.Issue)
conflictingPR, err := models.GetPullRequestByIssueID(issue.ID)
assert.NoError(t, err)
// Ensure conflictedFiles is populated.
assert.Equal(t, 1, len(conflictingPR.ConflictedFiles))
// Check if status is correct.
assert.Equal(t, models.PullRequestStatusConflict, conflictingPR.Status)
// Ensure that mergeable returns false
assert.False(t, conflictingPR.Mergeable())
})
}

View File

@@ -1165,7 +1165,8 @@ func GetIssuesByIDs(issueIDs []int64) ([]*Issue, error) {
 // IssuesOptions represents options of an issue.
 type IssuesOptions struct {
     db.ListOptions
-    RepoIDs     []int64 // include all repos if empty
+    RepoID      int64 // overwrites RepoCond if not 0
+    RepoCond    builder.Cond
     AssigneeID  int64
     PosterID    int64
     MentionedID int64
@@ -1256,15 +1257,15 @@ func (opts *IssuesOptions) setupSessionNoLimit(sess *xorm.Session) {
     sess.In("issue.id", opts.IssueIDs)
 }
-if len(opts.RepoIDs) > 0 {
-    applyReposCondition(sess, opts.RepoIDs)
+if opts.RepoID != 0 {
+    opts.RepoCond = builder.Eq{"issue.repo_id": opts.RepoID}
+}
+if opts.RepoCond != nil {
+    sess.And(opts.RepoCond)
 }
-switch opts.IsClosed {
-case util.OptionalBoolTrue:
-    sess.And("issue.is_closed=?", true)
-case util.OptionalBoolFalse:
-    sess.And("issue.is_closed=?", false)
+if !opts.IsClosed.IsNone() {
+    sess.And("issue.is_closed=?", opts.IsClosed.IsTrue())
 }
 if opts.AssigneeID > 0 {
@@ -1383,10 +1384,6 @@ func issuePullAccessibleRepoCond(repoIDstr string, userID int64, org *Organizati
     return cond
 }
-func applyReposCondition(sess *xorm.Session, repoIDs []int64) *xorm.Session {
-    return sess.In("issue.repo_id", repoIDs)
-}
 func applyAssigneeCondition(sess *xorm.Session, assigneeID int64) *xorm.Session {
     return sess.Join("INNER", "issue_assignees", "issue.id = issue_assignees.issue_id").
         And("issue_assignees.assignee_id = ?", assigneeID)

View File

@@ -101,12 +101,9 @@ func (label *Label) CalOpenIssues() {
 // CalOpenOrgIssues calculates the open issues of a label for a specific repo
 func (label *Label) CalOpenOrgIssues(repoID, labelID int64) {
-    repoIDs := []int64{repoID}
-    labelIDs := []int64{labelID}
     counts, _ := CountIssuesByRepo(&IssuesOptions{
-        RepoIDs:  repoIDs,
-        LabelIDs: labelIDs,
+        RepoID:   repoID,
+        LabelIDs: []int64{labelID},
     })
     for _, count := range counts {

View File

@@ -17,6 +17,7 @@ import (
     user_model "code.gitea.io/gitea/models/user"
     "github.com/stretchr/testify/assert"
+    "xorm.io/builder"
 )
 func TestIssue_ReplaceLabels(t *testing.T) {
@@ -153,7 +154,7 @@ func TestIssues(t *testing.T) {
     },
     {
         IssuesOptions{
-            RepoIDs:  []int64{1, 3},
+            RepoCond: builder.In("repo_id", 1, 3),
             SortType: "oldest",
             ListOptions: db.ListOptions{
                 Page: 1,
@@ -340,7 +341,7 @@ func TestGetRepoIDsForIssuesOptions(t *testing.T) {
     },
     {
         IssuesOptions{
-            RepoIDs: []int64{1, 2},
+            RepoCond: builder.In("repo_id", 1, 2),
         },
         []int64{1, 2},
     },

View File

@@ -453,7 +453,7 @@ Please try upgrading to a lower version first (suggested v1.6.4), then upgrade t
 // Downgrading Gitea's database version not supported
 if int(v-minDBVersion) > len(migrations) {
-    msg := fmt.Sprintf("Your database (migration version: %d) is for a newer Gita, you can not use the newer database for this old Gitea release (%d).", v, minDBVersion+len(migrations))
+    msg := fmt.Sprintf("Your database (migration version: %d) is for a newer Gitea, you can not use the newer database for this old Gitea release (%d).", v, minDBVersion+len(migrations))
     msg += "\nGitea will exit to keep your database safe and unchanged. Please use the correct Gitea release, do not change the migration version manually (incorrect manual operation may lose data)."
     if !setting.IsProd {
         msg += fmt.Sprintf("\nIf you are in development and really know what you're doing, you can force changing the migration version by executing: UPDATE version SET version=%d WHERE id=1;", minDBVersion+len(migrations))

View File

@@ -697,3 +697,14 @@ func (pr *PullRequest) GetHeadBranchHTMLURL() string {
 }
 return pr.HeadRepo.HTMLURL() + "/src/branch/" + util.PathEscapeSegments(pr.HeadBranch)
 }
+// Mergeable returns if the pullrequest is mergeable.
+func (pr *PullRequest) Mergeable() bool {
+    // If a pull request isn't mergable if it's:
+    // - Being conflict checked.
+    // - Has a conflict.
+    // - Received a error while being conflict checked.
+    // - Is a work-in-progress pull request.
+    return pr.Status != PullRequestStatusChecking && pr.Status != PullRequestStatusConflict &&
+        pr.Status != PullRequestStatusError && !pr.IsWorkInProgress()
+}

View File

@@ -622,7 +622,14 @@ func IsUsableUsername(name string) error {
 // CreateUserOverwriteOptions are an optional options who overwrite system defaults on user creation
 type CreateUserOverwriteOptions struct {
-    Visibility structs.VisibleType
+    KeepEmailPrivate             util.OptionalBool
+    Visibility                   *structs.VisibleType
+    AllowCreateOrganization      util.OptionalBool
+    EmailNotificationsPreference *string
+    MaxRepoCreation              *int
+    Theme                        *string
+    IsRestricted                 util.OptionalBool
+    IsActive                     util.OptionalBool
 }
 // CreateUser creates record of a new user.
@@ -638,10 +645,36 @@ func CreateUser(u *User, overwriteDefault ...*CreateUserOverwriteOptions) (err e
 u.EmailNotificationsPreference = setting.Admin.DefaultEmailNotification
 u.MaxRepoCreation = -1
 u.Theme = setting.UI.DefaultTheme
+u.IsRestricted = setting.Service.DefaultUserIsRestricted
+u.IsActive = !(setting.Service.RegisterEmailConfirm || setting.Service.RegisterManualConfirm)
 // overwrite defaults if set
 if len(overwriteDefault) != 0 && overwriteDefault[0] != nil {
-    u.Visibility = overwriteDefault[0].Visibility
+    overwrite := overwriteDefault[0]
+    if !overwrite.KeepEmailPrivate.IsNone() {
+        u.KeepEmailPrivate = overwrite.KeepEmailPrivate.IsTrue()
+    }
+    if overwrite.Visibility != nil {
+        u.Visibility = *overwrite.Visibility
+    }
+    if !overwrite.AllowCreateOrganization.IsNone() {
+        u.AllowCreateOrganization = overwrite.AllowCreateOrganization.IsTrue()
+    }
+    if overwrite.EmailNotificationsPreference != nil {
+        u.EmailNotificationsPreference = *overwrite.EmailNotificationsPreference
+    }
+    if overwrite.MaxRepoCreation != nil {
+        u.MaxRepoCreation = *overwrite.MaxRepoCreation
+    }
+    if overwrite.Theme != nil {
+        u.Theme = *overwrite.Theme
+    }
+    if !overwrite.IsRestricted.IsNone() {
+        u.IsRestricted = overwrite.IsRestricted.IsTrue()
+    }
+    if !overwrite.IsActive.IsNone() {
+        u.IsActive = overwrite.IsActive.IsTrue()
+    }
 }
 // validate data

View File

@@ -410,6 +410,12 @@ func RepoIDAssignment() func(ctx *Context) {
 // RepoAssignment returns a middleware to handle repository assignment
 func RepoAssignment(ctx *Context) (cancel context.CancelFunc) {
+    if _, repoAssignmentOnce := ctx.Data["repoAssignmentExecuted"]; repoAssignmentOnce {
+        log.Trace("RepoAssignment was exec already, skipping second call ...")
+        return
+    }
+    ctx.Data["repoAssignmentExecuted"] = true
     var (
         owner *user_model.User
         err   error
@@ -592,6 +598,9 @@ func RepoAssignment(ctx *Context) (cancel context.CancelFunc) {
     ctx.ServerError("RepoAssignment Invalid repo "+repo_model.RepoPath(userName, repoName), err)
     return
 }
+if ctx.Repo.GitRepo != nil {
+    ctx.Repo.GitRepo.Close()
+}
 ctx.Repo.GitRepo = gitRepo
 // We opened it, we should close it

View File

@@ -67,6 +67,7 @@ func ToAPIPullRequest(pr *models.PullRequest, doer *user_model.User) *api.PullRe
     PatchURL:  pr.Issue.PatchURL(),
     HasMerged: pr.HasMerged,
     MergeBase: pr.MergeBase,
+    Mergeable: pr.Mergeable(),
     Deadline:  apiIssue.Deadline,
     Created:   pr.Issue.CreatedUnix.AsTimePtr(),
     Updated:   pr.Issue.UpdatedUnix.AsTimePtr(),
@@ -190,10 +191,6 @@ func ToAPIPullRequest(pr *models.PullRequest, doer *user_model.User) *api.PullRe
     }
 }
-if pr.Status != models.PullRequestStatusChecking {
-    mergeable := !(pr.Status == models.PullRequestStatusConflict || pr.Status == models.PullRequestStatusError) && !pr.IsWorkInProgress()
-    apiPullRequest.Mergeable = mergeable
-}
 if pr.HasMerged {
     apiPullRequest.Merged = pr.MergedUnix.AsTimePtr()
     apiPullRequest.MergedCommitID = &pr.MergedCommitID

View File

@@ -71,8 +71,8 @@ func checkAuthorizedKeys(logger log.Logger, autofix bool) error {
         "authorized_keys file %q is out of date.\nRegenerate it with:\n\t\"%s\"\nor\n\t\"%s\"",
         fPath,
         "gitea admin regenerate keys",
-        "gitea doctor --run authorized_keys --fix")
+        "gitea doctor --run authorized-keys --fix")
-    return fmt.Errorf(`authorized_keys is out of date and should be regenerated with "gitea admin regenerate keys" or "gitea doctor --run authorized_keys --fix"`)
+    return fmt.Errorf(`authorized_keys is out of date and should be regenerated with "gitea admin regenerate keys" or "gitea doctor --run authorized-keys --fix"`)
 }
 logger.Warn("authorized_keys is out of date. Attempting rewrite...")
 err = asymkey_model.RewriteAllPublicKeys()

View File

@@ -54,6 +54,12 @@ func CatFileBatchCheck(ctx context.Context, repoPath string) (WriteCloserError,
     <-closed
 }
+// Ensure cancel is called as soon as the provided context is cancelled
+go func() {
+    <-ctx.Done()
+    cancel()
+}()
 _, filename, line, _ := runtime.Caller(2)
 filename = strings.TrimPrefix(filename, callerPrefix)
@@ -93,6 +99,12 @@ func CatFileBatch(ctx context.Context, repoPath string) (WriteCloserError, *bufi
     <-closed
 }
+// Ensure cancel is called as soon as the provided context is cancelled
+go func() {
+    <-ctx.Done()
+    cancel()
+}()
 _, filename, line, _ := runtime.Caller(2)
 filename = strings.TrimPrefix(filename, callerPrefix)

View File

@@ -119,12 +119,10 @@ type CheckAttributeReader struct {
     env    []string
     ctx    context.Context
     cancel context.CancelFunc
-    running chan struct{}
 }
 // Init initializes the cmd
 func (c *CheckAttributeReader) Init(ctx context.Context) error {
-    c.running = make(chan struct{})
     cmdArgs := []string{"check-attr", "--stdin", "-z"}
     if len(c.IndexFile) > 0 && CheckGitVersionAtLeast("1.7.8") == nil {
@@ -183,14 +181,7 @@ func (c *CheckAttributeReader) Run() error {
     _ = c.stdOut.Close()
 }()
 stdErr := new(bytes.Buffer)
-err := c.cmd.RunInDirTimeoutEnvFullPipelineFunc(c.env, -1, c.Repo.Path, c.stdOut, stdErr, c.stdinReader, func(_ context.Context, _ context.CancelFunc) error {
-    select {
-    case <-c.running:
-    default:
-        close(c.running)
-    }
-    return nil
-})
+err := c.cmd.RunInDirTimeoutEnvFullPipeline(c.env, -1, c.Repo.Path, c.stdOut, stdErr, c.stdinReader)
 if err != nil && // If there is an error we need to return but:
     c.ctx.Err() != err && // 1. Ignore the context error if the context is cancelled or exceeds the deadline (RunWithContext could return c.ctx.Err() which is Canceled or DeadlineExceeded)
     err.Error() != "signal: killed" { // 2. We should not pass up errors due to the program being killed
@@ -210,7 +201,7 @@ func (c *CheckAttributeReader) CheckPath(path string) (rs map[string]string, err
 select {
 case <-c.ctx.Done():
     return nil, c.ctx.Err()
-case <-c.running:
+default:
 }
 if _, err = c.stdinWriter.Write([]byte(path + "\x00")); err != nil {
@@ -237,11 +228,6 @@ func (c *CheckAttributeReader) Close() error {
 func (c *CheckAttributeReader) Close() error {
     c.cancel()
     err := c.stdinWriter.Close()
-    select {
-    case <-c.running:
-    default:
-        close(c.running)
-    }
     return err
 }

View File

@@ -127,13 +127,18 @@ func (hl *HostMatchList) checkIP(ip net.IP) bool {
 // MatchHostName checks if the host matches an allow/deny(block) list
 func (hl *HostMatchList) MatchHostName(host string) bool {
+    hostname, _, err := net.SplitHostPort(host)
+    if err != nil {
+        hostname = host
+    }
     if hl == nil {
         return false
     }
-    if hl.checkPattern(host) {
+    if hl.checkPattern(hostname) {
         return true
     }
-    if ip := net.ParseIP(host); ip != nil {
+    if ip := net.ParseIP(hostname); ip != nil {
         return hl.checkIP(ip)
     }
     return false

View File

@@ -38,6 +38,7 @@ func TestHostOrIPMatchesList(t *testing.T) {
     {"", net.ParseIP("10.0.1.1"), true},
     {"10.0.1.1", nil, true},
+    {"10.0.1.1:8080", nil, true},
     {"", net.ParseIP("192.168.1.1"), true},
     {"192.168.1.1", nil, true},
     {"", net.ParseIP("fd00::1"), true},
@@ -48,6 +49,7 @@ func TestHostOrIPMatchesList(t *testing.T) {
     {"mydomain.com", net.IPv4zero, false},
    {"sub.mydomain.com", net.IPv4zero, true},
+    {"sub.mydomain.com:8080", net.IPv4zero, true},
     {"", net.ParseIP("169.254.1.1"), true},
     {"169.254.1.1", nil, true},

View File

@@ -130,7 +130,7 @@ func Init() {
     log.Info("PID: %d Repository Indexer closed", os.Getpid())
 })
-waitChannel := make(chan time.Duration)
+waitChannel := make(chan time.Duration, 1)
 // Create the Queue
 switch setting.Indexer.RepoType {

View File

@@ -98,7 +98,7 @@ var (
 // InitIssueIndexer initialize issue indexer, syncReindex is true then reindex until
 // all issue index done.
 func InitIssueIndexer(syncReindex bool) {
-    waitChannel := make(chan time.Duration)
+    waitChannel := make(chan time.Duration, 1)
     // Create the Queue
     switch setting.Indexer.IssueType {
@@ -272,7 +272,7 @@ func populateIssueIndexer(ctx context.Context) {
 // UpdateRepoIndexer add/update all issues of the repositories
 func UpdateRepoIndexer(repo *repo_model.Repository) {
     is, err := models.Issues(&models.IssuesOptions{
-        RepoIDs:  []int64{repo.ID},
+        RepoID:   repo.ID,
         IsClosed: util.OptionalBoolNone,
         IsPull:   util.OptionalBoolNone,
     })

View File

@@ -19,6 +19,10 @@ import (
 // they use to detect if there is a block and will grow and shrink in
 // response to demand as per configuration.
 type WorkerPool struct {
+    // This field requires to be the first one in the struct.
+    // This is to allow 64 bit atomic operations on 32-bit machines.
+    // See: https://pkg.go.dev/sync/atomic#pkg-note-BUG & Gitea issue 19518
+    numInQueue    int64
     lock          sync.Mutex
     baseCtx       context.Context
     baseCtxCancel context.CancelFunc
@@ -32,7 +36,6 @@ type WorkerPool struct {
     blockTimeout  time.Duration
     boostTimeout  time.Duration
     boostWorkers  int
-    numInQueue    int64
 }
 // WorkerPoolConfiguration is the basic configuration for a WorkerPool

View File

@@ -92,7 +92,7 @@ func MigrateRepositoryGitData(ctx context.Context, u *user_model.User,
     return repo, fmt.Errorf("Failed to remove %s: %v", wikiPath, err)
 }
-if err = git.CloneWithContext(ctx, wikiRemotePath, wikiPath, git.CloneRepoOptions{
+if err := git.CloneWithContext(ctx, wikiRemotePath, wikiPath, git.CloneRepoOptions{
     Mirror:  true,
     Quiet:   true,
     Timeout: migrateTimeout,
@@ -103,11 +103,12 @@ func MigrateRepositoryGitData(ctx context.Context, u *user_model.User,
     if err := util.RemoveAll(wikiPath); err != nil {
         return repo, fmt.Errorf("Failed to remove %s: %v", wikiPath, err)
     }
+} else {
+    if err := git.WriteCommitGraph(ctx, wikiPath); err != nil {
+        return repo, err
+    }
 }
 }
-if err := git.WriteCommitGraph(ctx, wikiPath); err != nil {
-    return repo, err
-}
 }
 if repo.OwnerID == u.ID {

View File

@@ -19,6 +19,7 @@ type CreateUserOption struct {
     Password           string `json:"password" binding:"Required;MaxSize(255)"`
     MustChangePassword *bool  `json:"must_change_password"`
     SendNotify         bool   `json:"send_notify"`
+    Restricted         *bool  `json:"restricted"`
     Visibility         string `json:"visibility" binding:"In(,public,limited,private)"`
 }

View File

@@ -22,6 +22,7 @@ import (
     "code.gitea.io/gitea/modules/password"
     "code.gitea.io/gitea/modules/setting"
     api "code.gitea.io/gitea/modules/structs"
+    "code.gitea.io/gitea/modules/util"
     "code.gitea.io/gitea/modules/web"
     "code.gitea.io/gitea/routers/api/v1/user"
     "code.gitea.io/gitea/routers/api/v1/utils"
@@ -81,7 +82,6 @@ func CreateUser(ctx *context.APIContext) {
     Email:              form.Email,
     Passwd:             form.Password,
     MustChangePassword: true,
-    IsActive:           true,
     LoginType:          auth.Plain,
 }
 if form.MustChangePassword != nil {
@@ -107,11 +107,17 @@ func CreateUser(ctx *context.APIContext) {
     return
 }
-var overwriteDefault *user_model.CreateUserOverwriteOptions
+overwriteDefault := &user_model.CreateUserOverwriteOptions{
+    IsActive: util.OptionalBoolTrue,
+}
+if form.Restricted != nil {
+    overwriteDefault.IsRestricted = util.OptionalBoolOf(*form.Restricted)
+}
 if form.Visibility != "" {
-    overwriteDefault = &user_model.CreateUserOverwriteOptions{
-        Visibility: api.VisibilityModes[form.Visibility],
-    }
+    visibility := api.VisibilityModes[form.Visibility]
+    overwriteDefault.Visibility = &visibility
 }
 if err := user_model.CreateUser(u, overwriteDefault); err != nil {

View File

@@ -177,23 +177,18 @@ func CreateBranch(ctx *context.APIContext) {
 }
 err := repo_service.CreateNewBranch(ctx.User, ctx.Repo.Repository, opt.OldBranchName, opt.BranchName)
 if err != nil {
     if models.IsErrBranchDoesNotExist(err) {
         ctx.Error(http.StatusNotFound, "", "The old branch does not exist")
     }
     if models.IsErrTagAlreadyExists(err) {
         ctx.Error(http.StatusConflict, "", "The branch with the same tag already exists.")
     } else if models.IsErrBranchAlreadyExists(err) || git.IsErrPushOutOfDate(err) {
         ctx.Error(http.StatusConflict, "", "The branch already exists.")
     } else if models.IsErrBranchNameConflict(err) {
         ctx.Error(http.StatusConflict, "", "The branch with the same name already exists.")
     } else {
         ctx.Error(http.StatusInternalServerError, "CreateRepoBranch", err)
     }
     return
 }
@@ -263,10 +258,15 @@ func ListBranches(ctx *context.APIContext) {
     return
 }
-apiBranches := make([]*api.Branch, len(branches))
+apiBranches := make([]*api.Branch, 0, len(branches))
 for i := range branches {
     c, err := branches[i].GetCommit()
     if err != nil {
+        // Skip if this branch doesn't exist anymore.
+        if git.IsErrNotExist(err) {
+            totalNumOfBranches--
+            continue
+        }
         ctx.Error(http.StatusInternalServerError, "GetCommit", err)
         return
     }
@@ -275,11 +275,12 @@ func ListBranches(ctx *context.APIContext) {
     ctx.Error(http.StatusInternalServerError, "GetBranchProtection", err)
     return
 }
-apiBranches[i], err = convert.ToBranch(ctx.Repo.Repository, branches[i], c, branchProtection, ctx.User, ctx.Repo.IsAdmin())
+apiBranch, err := convert.ToBranch(ctx.Repo.Repository, branches[i], c, branchProtection, ctx.User, ctx.Repo.IsAdmin())
 if err != nil {
     ctx.Error(http.StatusInternalServerError, "convert.ToBranch", err)
     return
 }
+apiBranches = append(apiBranches, apiBranch)
 }
 ctx.SetLinkHeader(totalNumOfBranches, listOptions.PageSize)
@@ -532,7 +533,6 @@ func CreateBranchProtection(ctx *context.APIContext) {
 }
 ctx.JSON(http.StatusCreated, convert.ToBranchProtection(bp))
 }
 // EditBranchProtection edits a branch protection for a repo

View File

@@ -173,6 +173,7 @@ func SearchIssues(ctx *context.APIContext) {
     opts.TeamID = team.ID
 }
+repoCond := models.SearchRepositoryCondition(opts)
 repoIDs, _, err := models.SearchRepositoryIDs(opts)
 if err != nil {
     ctx.Error(http.StatusInternalServerError, "SearchRepositoryByName", err)
@@ -233,7 +234,7 @@ func SearchIssues(ctx *context.APIContext) {
     Page:     ctx.FormInt("page"),
     PageSize: limit,
 },
-RepoIDs:            repoIDs,
+RepoCond:           repoCond,
 IsClosed:           isClosed,
 IssueIDs:           issueIDs,
 IncludedLabelNames: includedLabelNames,
@@ -460,7 +461,7 @@ func ListIssues(ctx *context.APIContext) {
 if len(keyword) == 0 || len(issueIDs) > 0 || len(labelIDs) > 0 {
     issuesOpt := &models.IssuesOptions{
         ListOptions: listOptions,
-        RepoIDs:     []int64{ctx.Repo.Repository.ID},
+        RepoID:      ctx.Repo.Repository.ID,
         IsClosed:    isClosed,
         IssueIDs:    issueIDs,
         LabelIDs:    labelIDs,

View File

@@ -160,7 +160,7 @@ func Search(ctx *context.APIContext) {
     opts.Collaborate = util.OptionalBoolFalse
 }
-var mode = ctx.FormString("mode")
+mode := ctx.FormString("mode")
 switch mode {
 case "source":
     opts.Fork = util.OptionalBoolFalse
@@ -186,9 +186,9 @@ func Search(ctx *context.APIContext) {
     opts.IsPrivate = util.OptionalBoolOf(ctx.FormBool("is_private"))
 }
-var sortMode = ctx.FormString("sort")
+sortMode := ctx.FormString("sort")
 if len(sortMode) > 0 {
-    var sortOrder = ctx.FormString("order")
+    sortOrder := ctx.FormString("order")
     if len(sortOrder) == 0 {
         sortOrder = "asc"
     }
@@ -264,7 +264,8 @@ func CreateUserRepo(ctx *context.APIContext, owner *user_model.User, opt api.Cre
 if repo_model.IsErrRepoAlreadyExist(err) {
     ctx.Error(http.StatusConflict, "", "The repository with the same name already exists.")
 } else if db.IsErrNameReserved(err) ||
-    db.IsErrNamePatternNotAllowed(err) {
+    db.IsErrNamePatternNotAllowed(err) ||
+    models.IsErrIssueLabelTemplateLoad(err) {
     ctx.Error(http.StatusUnprocessableEntity, "", err)
 } else {
     ctx.Error(http.StatusInternalServerError, "CreateRepository", err)

View File

@@ -508,13 +508,17 @@ func SubmitInstall(ctx *context.Context) {
 // Create admin account
 if len(form.AdminName) > 0 {
     u := &user_model.User{
         Name:    form.AdminName,
         Email:   form.AdminEmail,
         Passwd:  form.AdminPasswd,
         IsAdmin: true,
-        IsActive: true,
     }
-    if err = user_model.CreateUser(u); err != nil {
+    overwriteDefault := &user_model.CreateUserOverwriteOptions{
+        IsRestricted: util.OptionalBoolFalse,
+        IsActive:     util.OptionalBoolTrue,
+    }
+    if err = user_model.CreateUser(u, overwriteDefault); err != nil {
        if !user_model.IsErrUserAlreadyExist(err) {
            setting.InstallLock = false
            ctx.Data["Err_AdminName"] = true

View File

@@ -125,10 +125,14 @@ func NewUserPost(ctx *context.Context) {
     Name:      form.UserName,
     Email:     form.Email,
     Passwd:    form.Password,
-    IsActive:  true,
     LoginType: auth.Plain,
 }
+overwriteDefault := &user_model.CreateUserOverwriteOptions{
+    IsActive:   util.OptionalBoolTrue,
+    Visibility: &form.Visibility,
+}
 if len(form.LoginType) > 0 {
     fields := strings.Split(form.LoginType, "-")
     if len(fields) == 2 {
@@ -163,7 +167,7 @@ func NewUserPost(ctx *context.Context) {
     u.MustChangePassword = form.MustChangePassword
 }
-if err := user_model.CreateUser(u, &user_model.CreateUserOverwriteOptions{Visibility: form.Visibility}); err != nil {
+if err := user_model.CreateUser(u, overwriteDefault); err != nil {
     switch {
     case user_model.IsErrUserAlreadyExist(err):
         ctx.Data["Err_UserName"] = true

View File

@@ -507,14 +507,12 @@ func SignUpPost(ctx *context.Context) {
 }
 u := &user_model.User{
     Name:   form.UserName,
     Email:  form.Email,
     Passwd: form.Password,
-    IsActive:     !(setting.Service.RegisterEmailConfirm || setting.Service.RegisterManualConfirm),
-    IsRestricted: setting.Service.DefaultUserIsRestricted,
 }
-if !createAndHandleCreatedUser(ctx, tplSignUp, form, u, nil, false) {
+if !createAndHandleCreatedUser(ctx, tplSignUp, form, u, nil, nil, false) {
     // error already handled
     return
 }
@@ -525,8 +523,8 @@ func SignUpPost(ctx *context.Context) {
 // createAndHandleCreatedUser calls createUserInContext and
 // then handleUserCreated.
-func createAndHandleCreatedUser(ctx *context.Context, tpl base.TplName, form interface{}, u *user_model.User, gothUser *goth.User, allowLink bool) bool {
-    if !createUserInContext(ctx, tpl, form, u, gothUser, allowLink) {
+func createAndHandleCreatedUser(ctx *context.Context, tpl base.TplName, form interface{}, u *user_model.User, overwrites *user_model.CreateUserOverwriteOptions, gothUser *goth.User, allowLink bool) bool {
+    if !createUserInContext(ctx, tpl, form, u, overwrites, gothUser, allowLink) {
         return false
     }
     return handleUserCreated(ctx, u, gothUser)
@@ -534,8 +532,8 @@ func createAndHandleCreatedUser(ctx *context.Context, tpl base.TplName, form int
 // createUserInContext creates a user and handles errors within a given context.
 // Optionally a template can be specified.
-func createUserInContext(ctx *context.Context, tpl base.TplName, form interface{}, u *user_model.User, gothUser *goth.User, allowLink bool) (ok bool) {
-    if err := user_model.CreateUser(u); err != nil {
+func createUserInContext(ctx *context.Context, tpl base.TplName, form interface{}, u *user_model.User, overwrites *user_model.CreateUserOverwriteOptions, gothUser *goth.User, allowLink bool) (ok bool) {
+    if err := user_model.CreateUser(u, overwrites); err != nil {
         if allowLink && (user_model.IsErrUserAlreadyExist(err) || user_model.IsErrEmailAlreadyUsed(err)) {
             if setting.OAuth2Client.AccountLinking == setting.OAuth2AccountLinkingAuto {
                 var user *user_model.User

View File

@@ -285,13 +285,12 @@ func LinkAccountPostRegister(ctx *context.Context) {
     Name:        form.UserName,
     Email:       form.Email,
     Passwd:      form.Password,
-    IsActive:    !(setting.Service.RegisterEmailConfirm || setting.Service.RegisterManualConfirm),
     LoginType:   auth.OAuth2,
     LoginSource: authSource.ID,
     LoginName:   gothUser.UserID,
 }
-if !createAndHandleCreatedUser(ctx, tplLinkAccount, form, u, &gothUser, false) {
+if !createAndHandleCreatedUser(ctx, tplLinkAccount, form, u, nil, &gothUser, false) {
     // error already handled
     return
 }

View File

@@ -25,6 +25,7 @@ import (
     "code.gitea.io/gitea/modules/session"
     "code.gitea.io/gitea/modules/setting"
     "code.gitea.io/gitea/modules/timeutil"
+    "code.gitea.io/gitea/modules/util"
     "code.gitea.io/gitea/modules/web"
     "code.gitea.io/gitea/modules/web/middleware"
     auth_service "code.gitea.io/gitea/services/auth"
@@ -872,19 +873,21 @@ func SignInOAuthCallback(ctx *context.Context) {
     return
 }
 u = &user_model.User{
     Name:        getUserName(&gothUser),
     FullName:    gothUser.Name,
     Email:       gothUser.Email,
-    IsActive:    !setting.OAuth2Client.RegisterEmailConfirm,
     LoginType:   auth.OAuth2,
     LoginSource: authSource.ID,
     LoginName:   gothUser.UserID,
-    IsRestricted: setting.Service.DefaultUserIsRestricted,
+}
+overwriteDefault := &user_model.CreateUserOverwriteOptions{
+    IsActive: util.OptionalBoolOf(!setting.OAuth2Client.RegisterEmailConfirm),
 }
 setUserGroupClaims(authSource, u, &gothUser)
-if !createAndHandleCreatedUser(ctx, base.TplName(""), nil, u, &gothUser, setting.OAuth2Client.AccountLinking != setting.OAuth2AccountLinkingDisabled) {
+if !createAndHandleCreatedUser(ctx, base.TplName(""), nil, u, overwriteDefault, &gothUser, setting.OAuth2Client.AccountLinking != setting.OAuth2AccountLinkingDisabled) {
     // error already handled
     return
 }

View File

@@ -425,12 +425,11 @@ func RegisterOpenIDPost(ctx *context.Context) {
 }
 u := &user_model.User{
     Name:   form.UserName,
     Email:  form.Email,
     Passwd: password,
-    IsActive: !(setting.Service.RegisterEmailConfirm || setting.Service.RegisterManualConfirm),
 }
-if !createUserInContext(ctx, tplSignUpOID, form, u, nil, false) {
+if !createUserInContext(ctx, tplSignUpOID, form, u, nil, nil, false) {
     // error already handled
     return
 }

View File

@@ -226,7 +226,7 @@ func issues(ctx *context.Context, milestoneID, projectID int64, isPullOption uti
     Page:     pager.Paginater.Current(),
     PageSize: setting.UI.IssuePagingNum,
 },
-RepoIDs:     []int64{repo.ID},
+RepoID:      repo.ID,
 AssigneeID:  assigneeID,
 PosterID:    posterID,
 MentionedID: mentionedID,

View File

@@ -191,7 +191,10 @@ func renderViewPage(ctx *context.Context) (*git.Repository, *git.TreeEntry) {
ctx.Data["title"] = pageName ctx.Data["title"] = pageName
ctx.Data["RequireHighlightJS"] = true ctx.Data["RequireHighlightJS"] = true
//lookup filename in wiki - get filecontent, gitTree entry , real filename isSideBar := pageName == "_Sidebar"
isFooter := pageName == "_Footer"
// lookup filename in wiki - get filecontent, gitTree entry , real filename
data, entry, pageFilename, noEntry := wikiContentsByName(ctx, commit, pageName) data, entry, pageFilename, noEntry := wikiContentsByName(ctx, commit, pageName)
if noEntry { if noEntry {
ctx.Redirect(ctx.Repo.RepoLink + "/wiki/?action=_pages") ctx.Redirect(ctx.Repo.RepoLink + "/wiki/?action=_pages")
@@ -203,23 +206,33 @@ func renderViewPage(ctx *context.Context) (*git.Repository, *git.TreeEntry) {
   return nil, nil
  }

- sidebarContent, _, _, _ := wikiContentsByName(ctx, commit, "_Sidebar")
- if ctx.Written() {
-  if wikiRepo != nil {
-   wikiRepo.Close()
+ var sidebarContent []byte
+ if !isSideBar {
+  sidebarContent, _, _, _ = wikiContentsByName(ctx, commit, "_Sidebar")
+  if ctx.Written() {
+   if wikiRepo != nil {
+    wikiRepo.Close()
+   }
+   return nil, nil
   }
-  return nil, nil
+ } else {
+  sidebarContent = data
  }

- footerContent, _, _, _ := wikiContentsByName(ctx, commit, "_Footer")
- if ctx.Written() {
-  if wikiRepo != nil {
-   wikiRepo.Close()
+ var footerContent []byte
+ if !isFooter {
+  footerContent, _, _, _ = wikiContentsByName(ctx, commit, "_Footer")
+  if ctx.Written() {
+   if wikiRepo != nil {
+    wikiRepo.Close()
+   }
+   return nil, nil
   }
-  return nil, nil
+ } else {
+  footerContent = data
  }

- var rctx = &markup.RenderContext{
+ rctx := &markup.RenderContext{
   URLPrefix: ctx.Repo.RepoLink,
   Metas:     ctx.Repo.Repository.ComposeDocumentMetas(),
   IsWiki:    true,
@@ -236,27 +249,35 @@ func renderViewPage(ctx *context.Context) (*git.Repository, *git.TreeEntry) {
  ctx.Data["EscapeStatus"], ctx.Data["content"] = charset.EscapeControlString(buf.String())

- buf.Reset()
- if err := markdown.Render(rctx, bytes.NewReader(sidebarContent), &buf); err != nil {
-  if wikiRepo != nil {
-   wikiRepo.Close()
+ if !isSideBar {
+  buf.Reset()
+  if err := markdown.Render(rctx, bytes.NewReader(sidebarContent), &buf); err != nil {
+   if wikiRepo != nil {
+    wikiRepo.Close()
+   }
+   ctx.ServerError("Render", err)
+   return nil, nil
   }
-  ctx.ServerError("Render", err)
-  return nil, nil
+  ctx.Data["sidebarPresent"] = sidebarContent != nil
+  ctx.Data["sidebarEscapeStatus"], ctx.Data["sidebarContent"] = charset.EscapeControlString(buf.String())
+ } else {
+  ctx.Data["sidebarPresent"] = false
  }
- ctx.Data["sidebarPresent"] = sidebarContent != nil
- ctx.Data["sidebarEscapeStatus"], ctx.Data["sidebarContent"] = charset.EscapeControlString(buf.String())

- buf.Reset()
- if err := markdown.Render(rctx, bytes.NewReader(footerContent), &buf); err != nil {
-  if wikiRepo != nil {
-   wikiRepo.Close()
+ if !isFooter {
+  buf.Reset()
+  if err := markdown.Render(rctx, bytes.NewReader(footerContent), &buf); err != nil {
+   if wikiRepo != nil {
+    wikiRepo.Close()
+   }
+   ctx.ServerError("Render", err)
+   return nil, nil
   }
-  ctx.ServerError("Render", err)
-  return nil, nil
+  ctx.Data["footerPresent"] = footerContent != nil
+  ctx.Data["footerEscapeStatus"], ctx.Data["footerContent"] = charset.EscapeControlString(buf.String())
+ } else {
+  ctx.Data["footerPresent"] = false
  }
- ctx.Data["footerPresent"] = footerContent != nil
- ctx.Data["footerEscapeStatus"], ctx.Data["footerContent"] = charset.EscapeControlString(buf.String())

  // get commit count - wiki revisions
  commitsCount, _ := wikiRepo.FileCommitsCount("master", pageFilename)
@@ -290,7 +311,7 @@ func renderRevisionPage(ctx *context.Context) (*git.Repository, *git.TreeEntry)
  ctx.Data["Username"] = ctx.Repo.Owner.Name
  ctx.Data["Reponame"] = ctx.Repo.Repository.Name

- //lookup filename in wiki - get filecontent, gitTree entry , real filename
+ // lookup filename in wiki - get filecontent, gitTree entry , real filename
  data, entry, pageFilename, noEntry := wikiContentsByName(ctx, commit, pageName)
  if noEntry {
   ctx.Redirect(ctx.Repo.RepoLink + "/wiki/?action=_pages")
@@ -364,7 +385,7 @@ func renderEditPage(ctx *context.Context) {
  ctx.Data["title"] = pageName
  ctx.Data["RequireHighlightJS"] = true

- //lookup filename in wiki - get filecontent, gitTree entry , real filename
+ // lookup filename in wiki - get filecontent, gitTree entry , real filename
  data, entry, _, noEntry := wikiContentsByName(ctx, commit, pageName)
  if noEntry {
   ctx.Redirect(ctx.Repo.RepoLink + "/wiki/?action=_pages")

View File

@@ -462,13 +462,7 @@ func buildIssueOverview(ctx *context.Context, unitType unit.Type) {
    // to check if it's in the team(which possible isn't the case).
    opts.User = nil
   }
-  userRepoIDs, _, err := models.SearchRepositoryIDs(repoOpts)
-  if err != nil {
-   ctx.ServerError("models.SearchRepositoryIDs: %v", err)
-   return
-  }
-  opts.RepoIDs = userRepoIDs
+  opts.RepoCond = models.SearchRepositoryCondition(repoOpts)
  }

  // keyword holds the search term entered into the search field.
@@ -532,7 +526,7 @@ func buildIssueOverview(ctx *context.Context, unitType unit.Type) {
  // Gets set when clicking filters on the issues overview page.
  repoIDs := getRepoIDs(ctx.FormString("repos"))
  if len(repoIDs) > 0 {
-  opts.RepoIDs = repoIDs
+  opts.RepoCond = builder.In("issue.repo_id", repoIDs)
  }

  // ------------------------------
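Both hunks swap a materialised list of repository IDs for a composable SQL condition, so the issue overview no longer has to load every accessible repository ID up front. Below is a hedged sketch of what builder.In from xorm's builder package produces for the column named in the hunk; the ToSQL rendering is only there to make the expansion visible and is not part of the change itself.

package main

import (
	"fmt"

	"xorm.io/builder"
)

func main() {
	repoIDs := []int64{1, 5, 9}

	// Instead of carrying the ID slice around, the options now hold a
	// composable condition that is merged into the final query.
	repoCond := builder.In("issue.repo_id", repoIDs)

	// Illustrative only: render the condition to SQL to show what it expands to.
	sql, args, err := builder.ToSQL(repoCond)
	if err != nil {
		panic(err)
	}
	fmt.Println(sql, args) // issue.repo_id IN (?,?,?) [1 5 9]
}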

View File

@@ -12,6 +12,7 @@ import (
  user_model "code.gitea.io/gitea/models/user"
  "code.gitea.io/gitea/modules/log"
  "code.gitea.io/gitea/modules/setting"
+ "code.gitea.io/gitea/modules/util"
  "code.gitea.io/gitea/modules/web/middleware"
  "code.gitea.io/gitea/services/mailer"
@@ -106,11 +107,15 @@ func (r *ReverseProxy) newUser(req *http.Request) *user_model.User {
  }

  user := &user_model.User{
   Name:  username,
   Email: email,
-  IsActive: true,
  }
- if err := user_model.CreateUser(user); err != nil {
+
+ overwriteDefault := user_model.CreateUserOverwriteOptions{
+  IsActive: util.OptionalBoolTrue,
+ }
+
+ if err := user_model.CreateUser(user, &overwriteDefault); err != nil {
   // FIXME: should I create a system notice?
   log.Error("CreateUser: %v", err)
   return nil

View File

@@ -12,6 +12,7 @@ import (
  "code.gitea.io/gitea/models/auth"
  "code.gitea.io/gitea/models/db"
  user_model "code.gitea.io/gitea/models/user"
+ "code.gitea.io/gitea/modules/util"
  "code.gitea.io/gitea/services/mailer"
  user_service "code.gitea.io/gitea/services/user"
 )
@@ -80,19 +81,21 @@ func (source *Source) Authenticate(user *user_model.User, userName, password str
   }
   user = &user_model.User{
    LowerName:   strings.ToLower(sr.Username),
    Name:        sr.Username,
    FullName:    composeFullName(sr.Name, sr.Surname, sr.Username),
    Email:       sr.Mail,
    LoginType:   source.authSource.Type,
    LoginSource: source.authSource.ID,
    LoginName:   userName,
-   IsActive:     true,
-   IsAdmin:      sr.IsAdmin,
-   IsRestricted: sr.IsRestricted,
+   IsAdmin:     sr.IsAdmin,
   }
+  overwriteDefault := &user_model.CreateUserOverwriteOptions{
+   IsRestricted: util.OptionalBoolOf(sr.IsRestricted),
+   IsActive:     util.OptionalBoolTrue,
+  }

-  err := user_model.CreateUser(user)
+  err := user_model.CreateUser(user, overwriteDefault)
   if err != nil {
    return user, err
   }

View File

@@ -14,6 +14,7 @@ import (
  "code.gitea.io/gitea/models/db"
  user_model "code.gitea.io/gitea/models/user"
  "code.gitea.io/gitea/modules/log"
+ "code.gitea.io/gitea/modules/util"
  user_service "code.gitea.io/gitea/services/user"
 )
@@ -99,19 +100,21 @@ func (source *Source) Sync(ctx context.Context, updateExisting bool) error {
    log.Trace("SyncExternalUsers[%s]: Creating user %s", source.authSource.Name, su.Username)

    usr = &user_model.User{
     LowerName:   su.LowerName,
     Name:        su.Username,
     FullName:    fullName,
     LoginType:   source.authSource.Type,
     LoginSource: source.authSource.ID,
     LoginName:   su.Username,
     Email:       su.Mail,
     IsAdmin:     su.IsAdmin,
-    IsRestricted: su.IsRestricted,
-    IsActive:     true,
    }
+   overwriteDefault := &user_model.CreateUserOverwriteOptions{
+    IsRestricted: util.OptionalBoolOf(su.IsRestricted),
+    IsActive:     util.OptionalBoolTrue,
+   }

-   err = user_model.CreateUser(usr)
+   err = user_model.CreateUser(usr, overwriteDefault)
    if err != nil {
     log.Error("SyncExternalUsers[%s]: Error creating user %s: %v", source.authSource.Name, su.Username, err)

View File

@@ -12,6 +12,7 @@ import (
  user_model "code.gitea.io/gitea/models/user"
  "code.gitea.io/gitea/modules/auth/pam"
  "code.gitea.io/gitea/modules/setting"
+ "code.gitea.io/gitea/modules/util"
  "code.gitea.io/gitea/services/mailer"

  "github.com/google/uuid"
@@ -58,10 +59,12 @@ func (source *Source) Authenticate(user *user_model.User, userName, password str
   LoginType:   auth.PAM,
   LoginSource: source.authSource.ID,
   LoginName:   userName, // This is what the user typed in
-  IsActive:    true,
  }
+ overwriteDefault := &user_model.CreateUserOverwriteOptions{
+  IsActive: util.OptionalBoolTrue,
+ }

- if err := user_model.CreateUser(user); err != nil {
+ if err := user_model.CreateUser(user, overwriteDefault); err != nil {
   return user, err
  }

View File

@@ -74,10 +74,12 @@ func (source *Source) Authenticate(user *user_model.User, userName, password str
   LoginType:   auth_model.SMTP,
   LoginSource: source.authSource.ID,
   LoginName:   userName,
-  IsActive:    true,
  }
+ overwriteDefault := &user_model.CreateUserOverwriteOptions{
+  IsActive: util.OptionalBoolTrue,
+ }

- if err := user_model.CreateUser(user); err != nil {
+ if err := user_model.CreateUser(user, overwriteDefault); err != nil {
   return user, err
  }

View File

@@ -16,6 +16,7 @@ import (
  "code.gitea.io/gitea/modules/log"
  "code.gitea.io/gitea/modules/setting"
  "code.gitea.io/gitea/modules/templates"
+ "code.gitea.io/gitea/modules/util"
  "code.gitea.io/gitea/modules/web/middleware"
  "code.gitea.io/gitea/services/auth/source/sspi"
  "code.gitea.io/gitea/services/mailer"
@@ -187,17 +188,20 @@ func (s *SSPI) shouldAuthenticate(req *http.Request) (shouldAuth bool) {
 func (s *SSPI) newUser(username string, cfg *sspi.Source) (*user_model.User, error) {
  email := gouuid.New().String() + "@localhost.localdomain"
  user := &user_model.User{
-  Name:                         username,
-  Email:                        email,
-  KeepEmailPrivate:             true,
-  Passwd:                       gouuid.New().String(),
-  IsActive:                     cfg.AutoActivateUsers,
-  Language:                     cfg.DefaultLanguage,
-  UseCustomAvatar:              true,
-  Avatar:                       avatars.DefaultAvatarLink(),
-  EmailNotificationsPreference: user_model.EmailNotificationsDisabled,
+  Name:            username,
+  Email:           email,
+  Passwd:          gouuid.New().String(),
+  Language:        cfg.DefaultLanguage,
+  UseCustomAvatar: true,
+  Avatar:          avatars.DefaultAvatarLink(),
  }
- if err := user_model.CreateUser(user); err != nil {
+ emailNotificationPreference := user_model.EmailNotificationsDisabled
+ overwriteDefault := &user_model.CreateUserOverwriteOptions{
+  IsActive:                     util.OptionalBoolOf(cfg.AutoActivateUsers),
+  KeepEmailPrivate:             util.OptionalBoolTrue,
+  EmailNotificationsPreference: &emailNotificationPreference,
+ }
+ if err := user_model.CreateUser(user, overwriteDefault); err != nil {
   return nil, err
  }

View File

@@ -7,6 +7,7 @@ package migrations

 import (
  "context"
+ "errors"
  "fmt"
  "io"
  "os"
@@ -33,9 +34,7 @@ import (
  gouuid "github.com/google/uuid"
 )

-var (
- _ base.Uploader = &GiteaLocalUploader{}
-)
+var _ base.Uploader = &GiteaLocalUploader{}

 // GiteaLocalUploader implements an Uploader to gitea sites
 type GiteaLocalUploader struct {
@@ -159,7 +158,7 @@ func (g *GiteaLocalUploader) CreateTopics(topics ...string) error {
 // CreateMilestones creates milestones
 func (g *GiteaLocalUploader) CreateMilestones(milestones ...*base.Milestone) error {
- var mss = make([]*models.Milestone, 0, len(milestones))
+ mss := make([]*models.Milestone, 0, len(milestones))
  for _, milestone := range milestones {
   var deadline timeutil.TimeStamp
   if milestone.Deadline != nil {
@@ -182,7 +181,7 @@ func (g *GiteaLocalUploader) CreateMilestones(milestones ...*base.Milestone) err
   milestone.Updated = &milestone.Created
  }

- var ms = models.Milestone{
+ ms := models.Milestone{
   RepoID:  g.repo.ID,
   Name:    milestone.Title,
   Content: milestone.Description,
@@ -210,7 +209,7 @@ func (g *GiteaLocalUploader) CreateMilestones(milestones ...*base.Milestone) err
 // CreateLabels creates labels
 func (g *GiteaLocalUploader) CreateLabels(labels ...*base.Label) error {
- var lbs = make([]*models.Label, 0, len(labels))
+ lbs := make([]*models.Label, 0, len(labels))
  for _, label := range labels {
   lbs = append(lbs, &models.Label{
    RepoID: g.repo.ID,
@@ -232,7 +231,7 @@ func (g *GiteaLocalUploader) CreateLabels(labels ...*base.Label) error {
 // CreateReleases creates releases
 func (g *GiteaLocalUploader) CreateReleases(releases ...*base.Release) error {
- var rels = make([]*models.Release, 0, len(releases))
+ rels := make([]*models.Release, 0, len(releases))
  for _, release := range releases {
   if release.Created.IsZero() {
    if !release.Published.IsZero() {
@@ -242,13 +241,12 @@ func (g *GiteaLocalUploader) CreateReleases(releases ...*base.Release) error {
   }
  }

- var rel = models.Release{
+ rel := models.Release{
   RepoID:       g.repo.ID,
   TagName:      release.TagName,
   LowerTagName: strings.ToLower(release.TagName),
   Target:       release.TargetCommitish,
   Title:        release.Name,
-  Sha1:         release.TargetCommitish,
   Note:         release.Body,
   IsDraft:      release.Draft,
   IsPrerelease: release.Prerelease,
@@ -277,15 +275,18 @@ func (g *GiteaLocalUploader) CreateReleases(releases ...*base.Release) error {
   rel.OriginalAuthorID = release.PublisherID
  }

- // calc NumCommits if no draft
- if !release.Draft {
+ // calc NumCommits if possible
+ if rel.TagName != "" {
   commit, err := g.gitRepo.GetTagCommit(rel.TagName)
-  if err != nil {
-   return fmt.Errorf("GetTagCommit[%v]: %v", rel.TagName, err)
-  }
-  rel.NumCommits, err = commit.CommitsCount()
-  if err != nil {
-   return fmt.Errorf("CommitsCount: %v", err)
+  if !errors.Is(err, git.ErrNotExist{}) {
+   if err != nil {
+    return fmt.Errorf("GetTagCommit[%v]: %v", rel.TagName, err)
+   }
+   rel.Sha1 = commit.ID.String()
+   rel.NumCommits, err = commit.CommitsCount()
+   if err != nil {
+    return fmt.Errorf("CommitsCount: %v", err)
+   }
   }
  }
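The reworked block tolerates tags that have disappeared on the source side: checking the error with errors.Is against git.ErrNotExist turns a missing tag into a skip instead of aborting the whole migration, and Sha1 is now taken from the resolved commit rather than the (possibly stale) TargetCommitish. Below is a small sketch of that "skip not-found, fail on anything else" shape; it uses a plain sentinel error and a stub getTagCommit in place of the git package, purely for the sake of a runnable example.

package main

import (
	"errors"
	"fmt"
)

// errNotExist stands in for git.ErrNotExist: "object not found" is expected
// during a migration and should be skipped, not treated as fatal.
var errNotExist = errors.New("object does not exist")

// getTagCommit is a stub: it fails with errNotExist for one tag and succeeds otherwise.
func getTagCommit(tag string) (string, error) {
	if tag == "missing" {
		return "", fmt.Errorf("resolving %q: %w", tag, errNotExist)
	}
	return "deadbeef", nil
}

func fillRelease(tag string) error {
	sha, err := getTagCommit(tag)
	if errors.Is(err, errNotExist) {
		return nil // tag vanished upstream: leave Sha1/NumCommits empty and move on
	}
	if err != nil {
		return fmt.Errorf("GetTagCommit[%v]: %v", tag, err) // anything else is fatal
	}
	fmt.Println("sha1 =", sha)
	return nil
}

func main() {
	fmt.Println(fillRelease("v1.0.0"))  // sha1 = deadbeef, then <nil>
	fmt.Println(fillRelease("missing")) // <nil> (silently skipped)
}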
@@ -297,7 +298,7 @@ func (g *GiteaLocalUploader) CreateReleases(releases ...*base.Release) error {
    asset.Created = release.Created
   }
  }
- var attach = repo_model.Attachment{
+ attach := repo_model.Attachment{
   UUID:          gouuid.New().String(),
   Name:          asset.Name,
   DownloadCount: int64(*asset.DownloadCount),
@@ -348,7 +349,7 @@ func (g *GiteaLocalUploader) SyncTags() error {
 // CreateIssues creates issues
 func (g *GiteaLocalUploader) CreateIssues(issues ...*base.Issue) error {
- var iss = make([]*models.Issue, 0, len(issues))
+ iss := make([]*models.Issue, 0, len(issues))
  for _, issue := range issues {
   var labels []*models.Label
   for _, label := range issue.Labels {
@@ -381,7 +382,7 @@ func (g *GiteaLocalUploader) CreateIssues(issues ...*base.Issue) error {
   }
  }

- var is = models.Issue{
+ is := models.Issue{
   RepoID: g.repo.ID,
   Repo:   g.repo,
   Index:  issue.Number,
@@ -433,7 +434,7 @@ func (g *GiteaLocalUploader) CreateIssues(issues ...*base.Issue) error {
    g.userMap[reaction.UserID] = userid
   }
  }
- var res = models.Reaction{
+ res := models.Reaction{
   Type:        reaction.Content,
   CreatedUnix: timeutil.TimeStampNow(),
  }
@@ -464,7 +465,7 @@ func (g *GiteaLocalUploader) CreateIssues(issues ...*base.Issue) error {
 // CreateComments creates comments of issues
 func (g *GiteaLocalUploader) CreateComments(comments ...*base.Comment) error {
- var cms = make([]*models.Comment, 0, len(comments))
+ cms := make([]*models.Comment, 0, len(comments))
  for _, comment := range comments {
   var issue *models.Issue
   issueInter, ok := g.issues.Load(comment.IssueIndex)
@@ -528,7 +529,7 @@ func (g *GiteaLocalUploader) CreateComments(comments ...*base.Comment) error {
    g.userMap[reaction.UserID] = userid
   }
  }
- var res = models.Reaction{
+ res := models.Reaction{
   Type:        reaction.Content,
   CreatedUnix: timeutil.TimeStampNow(),
  }
@@ -553,7 +554,7 @@ func (g *GiteaLocalUploader) CreateComments(comments ...*base.Comment) error {
 // CreatePullRequests creates pull requests
 func (g *GiteaLocalUploader) CreatePullRequests(prs ...*base.PullRequest) error {
- var gprs = make([]*models.PullRequest, 0, len(prs))
+ gprs := make([]*models.PullRequest, 0, len(prs))
  for _, pr := range prs {
   gpr, err := g.newPullRequest(pr)
   if err != nil {
@@ -652,7 +653,7 @@ func (g *GiteaLocalUploader) newPullRequest(pr *base.PullRequest) (*models.PullR
  return nil, err
 }

- var head = "unknown repository"
+ head := "unknown repository"
 if pr.IsForkPullRequest() && pr.State != "closed" {
  if pr.Head.OwnerName != "" {
   remote := pr.Head.OwnerName
@@ -669,7 +670,7 @@ func (g *GiteaLocalUploader) newPullRequest(pr *base.PullRequest) (*models.PullR
   }

   if ok {
-   _, err = git.NewCommand("fetch", remote, pr.Head.Ref).RunInDir(g.repo.RepoPath())
+   _, err = git.NewCommandContext(g.ctx, "fetch", "--no-tags", "--", remote, pr.Head.Ref).RunInDir(g.repo.RepoPath())
    if err != nil {
     log.Error("Fetch branch from %s failed: %v", pr.Head.CloneURL, err)
    } else {
@@ -723,7 +724,7 @@ func (g *GiteaLocalUploader) newPullRequest(pr *base.PullRequest) (*models.PullR
  pr.Updated = pr.Created
 }

- var issue = models.Issue{
+ issue := models.Issue{
  RepoID: g.repo.ID,
  Repo:   g.repo,
  Title:  pr.Title,
@@ -773,7 +774,7 @@ func (g *GiteaLocalUploader) newPullRequest(pr *base.PullRequest) (*models.PullR
    g.userMap[reaction.UserID] = userid
   }
  }
- var res = models.Reaction{
+ res := models.Reaction{
   Type:        reaction.Content,
   CreatedUnix: timeutil.TimeStampNow(),
  }
@@ -787,7 +788,7 @@ func (g *GiteaLocalUploader) newPullRequest(pr *base.PullRequest) (*models.PullR
  issue.Reactions = append(issue.Reactions, &res)
 }

- var pullRequest = models.PullRequest{
+ pullRequest := models.PullRequest{
  HeadRepoID: g.repo.ID,
  HeadBranch: head,
  BaseRepoID: g.repo.ID,
@@ -830,7 +831,7 @@ func convertReviewState(state string) models.ReviewType {
 // CreateReviews create pull request reviews
 func (g *GiteaLocalUploader) CreateReviews(reviews ...*base.Review) error {
- var cms = make([]*models.Review, 0, len(reviews))
+ cms := make([]*models.Review, 0, len(reviews))
  for _, review := range reviews {
   var issue *models.Issue
   issueInter, ok := g.issues.Load(review.IssueIndex)
@@ -862,7 +863,7 @@ func (g *GiteaLocalUploader) CreateReviews(reviews ...*base.Review) error {
   review.CreatedAt = time.Unix(int64(issue.CreatedUnix), 0)
  }

- var cm = models.Review{
+ cm := models.Review{
   Type:     convertReviewState(review.State),
   IssueID:  issue.ID,
   Content:  review.Content,
@@ -926,7 +927,7 @@ func (g *GiteaLocalUploader) CreateReviews(reviews ...*base.Review) error {
   comment.UpdatedAt = comment.CreatedAt
  }

- var c = models.Comment{
+ c := models.Comment{
   Type:      models.CommentTypeCode,
   PosterID:  comment.PosterID,
   IssueID:   issue.ID,

View File

@@ -97,7 +97,7 @@ func TestGiteaUploadRepo(t *testing.T) {
  assert.Len(t, releases, 1)

  issues, err := models.Issues(&models.IssuesOptions{
-  RepoIDs:  []int64{repo.ID},
+  RepoID:   repo.ID,
   IsPull:   util.OptionalBoolFalse,
   SortType: "oldest",
  })

View File

@@ -32,8 +32,8 @@ import (
 var prQueue queue.UniqueQueue

 var (
- ErrIsClosed              = errors.New("pull is cosed")
- ErrUserNotAllowedToMerge = errors.New("user not allowed to merge")
+ ErrIsClosed              = errors.New("pull is closed")
+ ErrUserNotAllowedToMerge = models.ErrNotAllowedToMerge{}
  ErrHasMerged             = errors.New("has already been merged")
  ErrIsWorkInProgress      = errors.New("work in progress PRs cannot be merged")
  ErrIsChecking            = errors.New("cannot merge while conflict checking is in progress")
@@ -96,10 +96,10 @@ func CheckPullMergable(ctx context.Context, doer *user_model.User, perm *models.
  if err := CheckPRReadyToMerge(pr, false); err != nil {
   if models.IsErrNotAllowedToMerge(err) {
    if force {
-    if isRepoAdmin, err := models.IsUserRepoAdmin(pr.BaseRepo, doer); err != nil {
-     return err
+    if isRepoAdmin, err2 := models.IsUserRepoAdmin(pr.BaseRepo, doer); err2 != nil {
+     return err2
     } else if !isRepoAdmin {
-     return ErrUserNotAllowedToMerge
+     return err
     }
    }
   } else {

View File

@@ -431,14 +431,16 @@ func checkConflicts(pr *models.PullRequest, gitRepo *git.Repository, tmpBasePath
  return nil
 })

- // 8. If there is a conflict the `git apply` command will return a non-zero error code - so there will be a positive error.
- if err != nil {
+ // 9. Check if the found conflictedfiles is non-zero, "err" could be non-nil, so we should ignore it if we found conflicts.
+ // Note: `"err" could be non-nil` is due that if enable 3-way merge, it doesn't return any error on found conflicts.
+ if len(pr.ConflictedFiles) > 0 {
  if conflict {
   pr.Status = models.PullRequestStatusConflict
   log.Trace("Found %d files conflicted: %v", len(pr.ConflictedFiles), pr.ConflictedFiles)
   return true, nil
  }
+ } else if err != nil {
  return false, fmt.Errorf("git apply --check: %v", err)
 }
 return false, nil
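The new ordering treats the conflicted-file list as the primary signal and only surfaces the git apply error when no conflicts were recorded, because with 3-way merge enabled the command can detect conflicts without exiting non-zero. A reduced sketch of just that decision order follows; decide and its callers are illustrative, not the actual checkConflicts function.

package main

import (
	"errors"
	"fmt"
)

// decide mirrors the decision order in the hunk above (simplified):
// recorded conflicted files win over the raw error, because with 3-way merge
// enabled `git apply --check` may report conflicts without a non-zero exit.
func decide(conflictedFiles []string, applyErr error) (conflict bool, err error) {
	if len(conflictedFiles) > 0 {
		return true, nil
	} else if applyErr != nil {
		return false, fmt.Errorf("git apply --check: %v", applyErr)
	}
	return false, nil
}

func main() {
	fmt.Println(decide([]string{"main.go"}, errors.New("exit status 1"))) // true <nil>
	fmt.Println(decide(nil, errors.New("exit status 1")))                 // false git apply --check: exit status 1
	fmt.Println(decide(nil, nil))                                         // false <nil>
}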

View File

@@ -169,7 +169,7 @@ func doArchive(r *ArchiveRequest) (*repo_model.RepoArchiver, error) {
   w.Close()
   rd.Close()
  }()
- var done = make(chan error)
+ done := make(chan error, 1) // Ensure that there is some capacity which will ensure that the goroutine below can always finish
  repo, err := repo_model.GetRepositoryByID(archiver.RepoID)
  if err != nil {
   return nil, fmt.Errorf("archiver.LoadRepo failed: %v", err)
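The single slot of buffer is what prevents the leak described in the backport: the archiving goroutine can deliver its result and exit even if every receiver has already given up, whereas an unbuffered channel would leave it blocked on the send forever. A standalone sketch of the difference, with hypothetical names (startWorker is not Gitea code):

package main

import (
	"fmt"
	"time"
)

func startWorker() chan error {
	done := make(chan error, 1) // capacity 1: the send below can never block
	go func() {
		// ... do the long-running work here ...
		done <- nil // completes even if nobody ever reads from done
	}()
	return done
}

func main() {
	done := startWorker()
	select {
	case err := <-done:
		fmt.Println("finished:", err)
	case <-time.After(10 * time.Millisecond):
		// The caller times out and walks away. With an unbuffered channel the
		// worker would now be stuck on `done <- nil` forever; with capacity 1
		// it finishes, and the channel is simply garbage collected later.
		fmt.Println("gave up waiting")
	}
}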

View File

@@ -14091,6 +14091,10 @@
         "type": "string",
         "x-go-name": "Password"
        },
+       "restricted": {
+        "type": "boolean",
+        "x-go-name": "Restricted"
+       },
        "send_notify": {
         "type": "boolean",
         "x-go-name": "SendNotify"

View File

@@ -15,10 +15,6 @@ function selectRange($list, $select, $from) {
   const $issue = $('a.ref-in-new-issue');
   const $copyPermalink = $('a.copy-line-permalink');

-  if ($copyPermalink.length === 0) {
-    return;
-  }
-
   const updateIssueHref = function (anchor) {
     if ($issue.length === 0) {
       return;
@@ -29,6 +25,9 @@ function selectRange($list, $select, $from) {
   };

   const updateCopyPermalinkHref = function(anchor) {
+    if ($copyPermalink.length === 0) {
+      return;
+    }
     let link = $copyPermalink.attr('data-clipboard-text');
     link = `${link.replace(/#L\d+$|#L\d+-L\d+$/, '')}#${anchor}`;
     $copyPermalink.attr('data-clipboard-text', link);