Compare commits

...

58 commits

Author SHA1 Message Date
lawwong1
d2fd7b9ba9
merge retry mechanism change from gorealis v1 to gorealis v2 (#21) 2023-01-26 13:36:40 -08:00
lawwong1
8db625730f
Support Australis API to get aurora master nodes and mesos master nodes (#20) 2022-08-24 08:51:12 -07:00
Tan N. Le
e33a2d99d8
release 2.28.0 (#19) 2022-08-02 09:55:36 -07:00
Tan N. Le
4258634ccf
Capacity report (#18)
- pull capacity report via /offers endpoint.
- calculate how many tasks (with resource and constraints) can be fit in the cluster.
examples of using the above 2 features are in aurora-scheduler/australis#33
2022-07-28 19:27:53 -07:00
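The capacity-report feature above is exercised in aurora-scheduler/australis#33; below is a minimal sketch of how a gorealis v2 client might pull the offers report and check how many copies of a task would fit. The method names Offers and FitTasks are assumptions drawn from the commit description, not confirmed signatures.

```go
package main

import (
	"fmt"
	"log"

	realis "github.com/aurora-scheduler/gorealis/v2"
	"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
)

func main() {
	// Hypothetical usage: Offers and FitTasks are assumed method names.
	client, err := realis.NewClient(realis.ZKUrl("192.168.33.7:2181"))
	if err != nil {
		log.Fatal(err)
	}

	// Pull the capacity report exposed through the /offers endpoint.
	offers, err := client.Offers()
	if err != nil {
		log.Fatal(err)
	}

	// Ask how many instances of a task (resources + constraints)
	// could be packed into the outstanding offers.
	task := aurora.NewTaskConfig()
	count, err := client.FitTasks(task, offers)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("instances that fit:", count)
}
```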
Tan N. Le
5d0998647a
default policy for slaDrainHosts (#17) 2021-11-01 18:15:51 -07:00
Renán I. Del Valle
907430768c
Misc. fixes for tests (#16)
* Bumping up CI to go1.17 and enabling CI for PRs.

* Adding go.sum now that issues seem to have gone away.

* Bump up aurora to 0.25.0 and mesos to 1.9.0

* Fixing Mac tests. Adding extra time for killing thermos jobs.

* Reduce the thermos overhead for unit tests

Co-authored-by: lenhattan86 <lenhattan86@users.noreply.github.com>
2021-10-25 12:39:13 -07:00
lenhattan86
fe664178ce
Add tier & production in task config (#14) 2021-10-15 12:18:26 -07:00
lenhattan86
a75b691d72
Merge pull request #15 from lenhattan86/fix_unit_test
Fix unit test for GetJobSummary
2021-10-07 13:14:25 -07:00
lenhattan86
306603795b fix unit test error for GetJobSummary 2021-10-06 22:37:54 -07:00
lenhattan86
045a4869a5
Merge branch 'aurora-scheduler:master' into master 2021-10-06 14:01:03 -07:00
lenhattan86
425faf28b8
Adds priority for aurora-scheduler (#13)
Adds priority for task config
2021-09-16 16:29:25 -07:00
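Together with the tier and production fields added in #14, the job builder now carries this scheduling metadata end to end. The snippet below mirrors the builder chains that appear later in this compare view (docs and examples/client.go); the field values are illustrative only.

```go
// Builder calls taken from the updated docs/examples in this compare view.
job := realis.NewJob().
	Environment("prod").
	Role("vagrant").
	Name("hello_world").
	CPU(0.1).
	RAM(64).
	Disk(100).
	IsService(true).
	Production(false).
	Tier("preemptible").
	Priority(0).
	InstanceCount(1).
	AddPorts(1)
```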
lenhattan86
2d81147aaa Merge branch 'aurora-scheduler:master' into master 2021-06-01 21:35:42 -07:00
Renán I. Del Valle
983bf44b9f
Update thrift to 0.14.0 (#9)
Generated thrift stubs using 0.14.0 compiler version.
Script now tells user to use version 0.14.0 of thrift compiler.
2021-03-01 16:52:25 -08:00
Renán I. Del Valle
d0be43b8ac
Dropping support for dep (#10)
Dep files are no longer necessary.
2021-03-01 15:36:28 -08:00
lenhattan86
b1661698c2
GetJobSummary API (#8)
* Adds GetJobSummary API
2021-01-12 16:18:09 -08:00
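A hedged sketch of calling the new API follows; the method name GetJobSummary matches the commit title, but the return shape shown here (the generated JobSummaryResult_ type) is an assumption and should be checked against the client source.

```go
// Sketch only: assumes an existing client and that GetJobSummary returns
// the generated *aurora.JobSummaryResult_.
summaries, err := client.GetJobSummary("vagrant")
if err != nil {
	log.Fatal(err)
}
for _, s := range summaries.GetSummaries() {
	fmt.Println(s.GetJob().GetKey().GetName())
}
```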
lenhattan86
364ee93202
Merge pull request #1 from aurora-scheduler/master
pull from upstream
2021-01-12 15:09:36 -08:00
Renan DelValle
755f99fb76
Fixes style issue with jobupdate file. 2020-11-16 21:51:02 -08:00
Renán I. Del Valle
caf1444250
Removes variables from github actions
Github Actions deprecated support for using env files without previously setting them. Adjusting CI scripts accordingly.
2020-11-16 21:45:00 -08:00
lenhattan86
c3dbeba2bd
Adds ability to fetch Mesos Master leader (#7)
* Adds ability to fetch Mesos Master leader from Zookeeper
2020-11-15 16:44:21 -08:00
Renan DelValle
6c639362c8
Bumping up go tests time to 30m. 2020-07-27 21:44:35 -07:00
Renán I. Del Valle
4cf60775f5
Bumping up thrift go library version to v0.13.2 (#6)
Thrift v0.13.2 is a forked version of v0.13.0 with a patch to not panic when trying to write to a closed buffer. Instead we get an error back and we can handle it appropriately.
2020-05-26 20:32:51 -07:00
Renán I. Del Valle
30f804bc53
Update using-the-sample-client.md
Fixing typo on doc.
2020-05-20 18:26:33 -07:00
Renán I. Del Valle
e5d63579e8
Update using-the-sample-client.md
Updating instructions for using the sample client.
2020-05-20 18:26:05 -07:00
Renán I. Del Valle
34a950306d
Update developing.md
Updating documentation for developing gorealis
2020-05-20 18:21:47 -07:00
Renan DelValle
851f9686b6
Bumping up version. 2020-05-07 11:35:22 -07:00
Renan DelValle
72c04220fe
Removing vendoring folder now that packages are cached in goproxy. 2020-05-07 11:33:47 -07:00
Renan DelValle
96384e6fdc
Renaming function in job update to reduce ambiguity about which underlying object it is modifying. Adding support for specifying ranges in updates as well as SLA Awareness. 2020-05-07 11:32:36 -07:00
Renán I. Del Valle
ed81bcb28d
Increasing test time out to 20 mins
Increasing test timeout to 20mins for CI
2020-05-05 23:02:44 -07:00
Renán I. Del Valle
69ced895e2
Upgrade to Aurora 0.22.0 (#5)
* Upgrading to Thrift 0.13.1. This version is a fork of 0.13.0 with a patch on top of it to fix an issue where trying a realis call after the connection has been closed results in a panic.

* Upgrading compose set up to Mesos 1.6.2 and Aurora 0.22.0.

* Adding support for using different update strategies.

* Adding a monitor that is friendly with auto pause.

* Adding tests for new update strategies.
2020-05-05 20:55:25 -07:00
Renán I. Del Valle
1d8afcd329
Adding github actions CI (#4)
* CI will run on pushes to the main branch.
2020-02-26 11:37:39 -08:00
Renán I. Del Valle
406640c7a9
Add a few items to gitignore. Change few missed dependencies to point to aurora-scheduler repository. (#3) 2020-02-19 12:01:02 -08:00
Renán I. Del Valle
02710e5434
Moving repository to aurora-scheduler organization. (#2)
gorealis v2 will now live in the aurora-scheduler organization
2020-02-19 11:40:40 -08:00
Renán I. Del Valle
3a6a93f946
Changing module address to be under aurora-scheduler (#1)
The v2 version of gorealis will now be housed under the aurora-scheduler organization.
2020-02-18 18:07:17 -08:00
Renán I. Del Valle
fc983fa096
Avoid panics using a forked Thrift version while we wait for Thrift to release 0.14.0 (#119)
* Changing README to point to the incarnation of the aurora scheduler project.

* Pointing to a forked patch version of thrift using mod while we wait for the fix that will land in 0.14.0.
2020-02-18 14:18:45 -08:00
Renan I. Del Valle
7b0c75450b
Removing go sum
Since go has launched a checksum database, it is no longer necessary to store go.sum file.
https://blog.golang.org/module-mirror-launch
2020-02-05 13:41:02 -08:00
Renan DelValle
235f854087 Changing calls on functions that use JobUpdateKey to reflect change made for memory safety. 2019-09-25 17:20:30 -07:00
Renan DelValle
4fc4953ec4 Change JobUpdateKey pointers to be literals, then we deep copy the JobKey pointer to a new JobKey in order to avoid side effects. 2019-09-25 17:20:30 -07:00
Renan DelValle
119d1c429b Moving client configuration options from realis to its own separate file to make code more digestible. 2019-09-25 17:20:30 -07:00
Renan DelValle
a8a7cf779f Splitting realis into regular API and admin API files. 2019-09-25 17:20:30 -07:00
Renan DelValle
98f2cab4a2 Renamed Aurora address validator to be less redundant. Added tests cribbed from version 1. 2019-09-25 17:20:30 -07:00
Renan DelValle
09628391cc Cleaning up error messages and some formatting. 2019-09-25 17:20:30 -07:00
Renan DelValle
f72fdacfb0 Changing the names of the protocol constants to be more descriptive. 2019-09-25 17:20:30 -07:00
Renan DelValle
55cf9bcb70 Adding more fine grained controls to retry mechanism. Retry mechanism may now be configured to not retry if an error is hit or to specifically stop retrying if a timeout error is encountered. 2019-09-25 17:20:30 -07:00
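A sketch of what the retry configuration might look like when building a client is shown below; the BackOff option and Backoff field names follow the gorealis v1 conventions, and carrying them over unchanged to v2 is an assumption.

```go
// Assumed option and struct names, following gorealis v1 conventions.
client, err := realis.NewClient(
	realis.ZKUrl("192.168.33.7:2181"),
	realis.BackOff(realis.Backoff{
		Steps:    3,                // give up after three attempts
		Duration: 10 * time.Second, // base delay between attempts
		Factor:   2.0,              // exponential growth of the delay
		Jitter:   0.1,              // randomize delays to avoid thundering herds
	}),
)
```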
Renan DelValle
fe4a0dc06e Minor error message clarification 2019-09-25 17:20:30 -07:00
Renan DelValle
d67b8ca1d7 Removing unnecessary functions which previously handled initializing the thrift protocol. Changed how the protocol is chosen based on the configuration. 2019-09-25 17:20:30 -07:00
Renan DelValle
ecd59f7a8d Removing unnecessary cookie jar from thrift protocol initialization. 2019-09-25 17:20:30 -07:00
Renan DelValle
5d75dcc15e Adding MonitorJobUpdateQuery which serves as the basis for other monitors. 2019-09-25 17:20:30 -07:00
Renan DelValle
9a70711537 Making JobUpdate synchronous. MonitorJobUpdateStatus creates a local copy of the job key in order to guard against side effects caused by mutations to the JobKey being performed externally. 2019-09-25 17:20:30 -07:00
Renan DelValle
203f178d68 Changing error messages to be lower case in realis API. 2019-09-25 17:20:30 -07:00
Renan DelValle
9584266b71 Changing MonitorJobUpdate to use MonitorJobUpdateStatus under the hood. 2019-09-25 17:20:30 -07:00
Renan DelValle
6f20f5b62f Adding JobUpdateStatus monitor as well as renaming all monitor functions to be Monitor + <subject> 2019-09-25 17:20:30 -07:00
Renan DelValle
04471c6918 Adding trace logging. 2019-09-25 17:20:30 -07:00
Renan DelValle
dbad078d95 Adding missing indirection for adding GPU requirements to task. 2019-09-25 17:20:30 -07:00
Renan DelValle
b9db36520c Adding realis test file which currently tests the get certs function. 2019-09-25 17:20:30 -07:00
Renan DelValle
2c795debfd Updating runtestMac to be the same as in the gorealis v1 repo. 2019-09-25 17:20:30 -07:00
Robert Allen
c553f67d4e Adding support for PartitionPolicy. 2019-09-25 17:20:30 -07:00
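The PartitionPolicy support surfaces as a two-argument builder helper, visible at the end of this diff in job.go; a brief sketch (other builder calls and values are illustrative):

```go
// Reschedule partitioned tasks after a 30 second delay.
job := realis.NewJob().
	Environment("prod").
	Role("vagrant").
	Name("hello_world").
	PartitionPolicy(true, 30)
```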
Renan DelValle
461b23400c
V2.0 thrift repository migration and cleanup (#98)
* Remove unnecessary files from the thrift repository that come along with the go library.

* Updating thrift generated code to be 0.12.0 final generated code.

* Remove git.apache.org dependency in vendor folder.

* Migrating from git.apache.org/thrift.git to github.com/apache/thrift

* Upgrading dep (although it will not work now that imports are using mod format, it allows for users to easily fix this with a replacement of the import path).

* Upgrading mod dependencies for Thrift to point to github.com location of the repository.

* Bug fix for Thermos Payload generation relating to the GPU being set.
2019-02-19 16:40:41 -08:00
Renan DelValle
9b3593e9d9
Fixing GPU resource to only be added if specified since Aurora scheduler by default will reject tasks containing GPU. 2019-01-08 17:47:13 -08:00
2688 changed files with 31574 additions and 524216 deletions

View file

@ -1 +1 @@
0.21.0
0.26.0

32
.github/workflows/main.yml vendored Normal file
View file

@ -0,0 +1,32 @@
name: CI
on:
push:
branches:
- master
pull_request:
branches:
- master
jobs:
build:
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v2
- name: Setup Go for use with actions
uses: actions/setup-go@v2
with:
go-version: 1.17
- name: Install goimports
run: go get golang.org/x/tools/cmd/goimports
- name: Set env with list of directories in repo containin go code
run: echo GO_USR_DIRS=$(go list -f {{.Dir}} ./... | grep -E -v "/gen-go/|/vendor/") >> $GITHUB_ENV
- name: Run goimports check
run: test -z "`for d in $GO_USR_DIRS; do goimports -d $d/*.go | tee /dev/stderr; done`"
- name: Create aurora/mesos docker cluster
run: docker-compose up -d
- name: Run tests
run: go test -timeout 35m -race -coverprofile=coverage.txt -covermode=atomic -v github.com/aurora-scheduler/gorealis/v2

15
.gitignore vendored
View file

@ -8,6 +8,17 @@ _obj
_test
.idea
# Thrift library comes with a lot of other files we don't need.
# Ignore everything but the files we do need
vendor/github.com/apache/thrift/*
!vendor/github.com/apache/thrift/lib/
vendor/github.com/apache/thrift/lib/*
!vendor/github.com/apache/thrift/lib/go/
vendor/github.com/apache/thrift/lib/go/*
!vendor/github.com/apache/thrift/lib/go/thrift/
# Architecture specific extensions/prefixes
*.[568vq]
[568vq].out
@ -26,3 +37,7 @@ _testmain.go
# Output of the go coverage tool, specifically when used with LiteIDE
*.out
# Example client build
examples/client
examples/jsonClient

View file

@ -20,7 +20,7 @@ install:
- docker-compose up -d
script:
- go test -race -coverprofile=coverage.txt -covermode=atomic -v github.com/paypal/gorealis
- go test -race -coverprofile=coverage.txt -covermode=atomic -v github.com/aurora-scheduler/gorealis
after_success:
- bash <(curl -s https://codecov.io/bash)

60
Gopkg.lock generated
View file

@ -1,60 +0,0 @@
# This file is autogenerated, do not edit; changes may be undone by the next 'dep ensure'.
[[projects]]
branch = "0.12.0"
digest = "1:89696c38cec777120b8b1bb5e2d363d655cf2e1e7d8c851919aaa0fd576d9b86"
name = "git.apache.org/thrift.git"
packages = ["lib/go/thrift"]
pruneopts = ""
revision = "384647d290e2e4a55a14b1b7ef1b7e66293a2c33"
[[projects]]
digest = "1:0deddd908b6b4b768cfc272c16ee61e7088a60f7fe2f06c547bd3d8e1f8b8e77"
name = "github.com/davecgh/go-spew"
packages = ["spew"]
pruneopts = ""
revision = "8991bc29aa16c548c550c7ff78260e27b9ab7c73"
version = "v1.1.1"
[[projects]]
digest = "1:df48fb76fb2a40edea0c9b3d960bc95e326660d82ff1114e1f88001f7a236b40"
name = "github.com/pkg/errors"
packages = ["."]
pruneopts = ""
revision = "e881fd58d78e04cf6d0de1217f8707c8cc2249bc"
[[projects]]
digest = "1:256484dbbcd271f9ecebc6795b2df8cad4c458dd0f5fd82a8c2fa0c29f233411"
name = "github.com/pmezard/go-difflib"
packages = ["difflib"]
pruneopts = ""
revision = "792786c7400a136282c1664665ae0a8db921c6c2"
version = "v1.0.0"
[[projects]]
digest = "1:78bea5e26e82826dacc5fd64a1013a6711b7075ec8072819b89e6ad76cb8196d"
name = "github.com/samuel/go-zookeeper"
packages = ["zk"]
pruneopts = ""
revision = "471cd4e61d7a78ece1791fa5faa0345dc8c7d5a5"
[[projects]]
digest = "1:c587772fb8ad29ad4db67575dad25ba17a51f072ff18a22b4f0257a4d9c24f75"
name = "github.com/stretchr/testify"
packages = ["assert"]
pruneopts = ""
revision = "f35b8ab0b5a2cef36673838d662e249dd9c94686"
version = "v1.2.2"
[solve-meta]
analyzer-name = "dep"
analyzer-version = 1
input-imports = [
"git.apache.org/thrift.git/lib/go/thrift",
"github.com/pkg/errors",
"github.com/samuel/go-zookeeper/zk",
"github.com/stretchr/testify/assert",
]
solver-name = "gps-cdcl"
solver-version = 1

View file

@ -1,16 +0,0 @@
[[constraint]]
name = "git.apache.org/thrift.git"
branch = "0.12.0"
[[constraint]]
name = "github.com/pkg/errors"
revision = "e881fd58d78e04cf6d0de1217f8707c8cc2249bc"
[[constraint]]
name = "github.com/samuel/go-zookeeper"
revision = "471cd4e61d7a78ece1791fa5faa0345dc8c7d5a5"
[[constraint]]
name = "github.com/stretchr/testify"
version = "1.2.0"

View file

@ -1,6 +1,6 @@
# gorealis [![GoDoc](https://godoc.org/github.com/paypal/gorealis?status.svg)](https://godoc.org/github.com/paypal/gorealis) [![Build Status](https://travis-ci.org/paypal/gorealis.svg?branch=master)](https://travis-ci.org/paypal/gorealis) [![codecov](https://codecov.io/gh/paypal/gorealis/branch/master-v2.0/graph/badge.svg)](https://codecov.io/gh/paypal/gorealis/branch/master-v2.0)
# gorealis [![GoDoc](https://godoc.org/github.com/aurora-scheduler/gorealis?status.svg)](https://godoc.org/github.com/aurora-scheduler/gorealis) [![codecov](https://codecov.io/gh/aurora-scheduler/gorealis/branch/master/graph/badge.svg)](https://codecov.io/gh/aurora-scheduler/gorealis/branch/master)
Go library for interacting with [Apache Aurora](https://github.com/apache/aurora).
Go library for interacting with [Aurora Scheduler](https://github.com/aurora-scheduler/aurora).
### Aurora version compatibility
Please see [.auroraversion](./.auroraversion) to see the latest Aurora version against which this
@ -14,7 +14,7 @@ library has been tested.
## Projects using gorealis
* [australis](https://github.com/rdelval/australis)
* [australis](https://github.com/aurora-scheduler/australis)
## Contributions
Contributions are always welcome. Please raise an issue to discuss a contribution before it is made.

View file

@ -716,9 +716,40 @@ struct JobUpdateKey {
2: string id
}
/** Limits the amount of active changes being made to instances to groupSize. */
struct QueueJobUpdateStrategy {
1: i32 groupSize
}
/** Similar to Queue strategy but will not start a new group until all instances in an active
* group have finished updating.
*/
struct BatchJobUpdateStrategy {
1: i32 groupSize
/* Update will pause automatically after each batch completes */
2: bool autopauseAfterBatch
}
/** Same as Batch strategy but each time an active group completes, the size of the next active
* group may change.
*/
struct VariableBatchJobUpdateStrategy {
1: list<i32> groupSizes
/* Update will pause automatically after each batch completes */
2: bool autopauseAfterBatch
}
union JobUpdateStrategy {
1: QueueJobUpdateStrategy queueStrategy
2: BatchJobUpdateStrategy batchStrategy
3: VariableBatchJobUpdateStrategy varBatchStrategy
}
/** Job update thresholds and limits. */
struct JobUpdateSettings {
/** Max number of instances being updated at any given moment. */
/** Deprecated, please set value inside of desired update strategy instead.
* Max number of instances being updated at any given moment.
*/
1: i32 updateGroupSize
/** Max number of instance failures to tolerate before marking instance as FAILED. */
@ -736,7 +767,7 @@ struct JobUpdateSettings {
/** Instance IDs to act on. All instances will be affected if this is not set. */
7: set<Range> updateOnlyTheseInstances
/**
/** Deprecated, please set updateStrategy to the Batch strategy instead.
* If true, use updateGroupSize as strict batching boundaries, and avoid proceeding to another
* batch until the preceding batch finishes updating.
*/
@ -755,6 +786,9 @@ struct JobUpdateSettings {
* differs between the old and new task configurations, updates will use the newest configuration.
*/
10: optional bool slaAware
/** Update strategy to be used for the update. See JobUpdateStrategy for choices. */
11: optional JobUpdateStrategy updateStrategy
}
/** Event marking a state transition in job update lifecycle. */
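With the union above, a job update picks exactly one strategy. A sketch of populating it through the generated Go types follows; the struct and field names shown are assumed from the usual Thrift Go code generation of these definitions.

```go
// Update two instances at a time and pause automatically after each batch.
settings := aurora.NewJobUpdateSettings()
settings.UpdateStrategy = &aurora.JobUpdateStrategy{
	BatchStrategy: &aurora.BatchJobUpdateStrategy{
		GroupSize:           2,
		AutopauseAfterBatch: true,
	},
}
```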

View file

@ -28,6 +28,7 @@ type Cluster struct {
ZK string `json:"zk"`
ZKPort int `json:"zk_port"`
SchedZKPath string `json:"scheduler_zk_path"`
MesosZKPath string `json:"mesos_zk_path"`
SchedURI string `json:"scheduler_uri"`
ProxyURL string `json:"proxy_url"`
AuthMechanism string `json:"auth_mechanism"`
@ -61,6 +62,7 @@ func GetDefaultClusterFromZKUrl(zkURL string) *Cluster {
AuthMechanism: "UNAUTHENTICATED",
ZK: zkURL,
SchedZKPath: "/aurora/scheduler",
MesosZKPath: "/mesos",
AgentRunDir: "latest",
AgentRoot: "/var/lib/mesos",
}
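Since GetDefaultClusterFromZKUrl now fills in MesosZKPath as well, callers that only have a ZooKeeper endpoint get both paths without extra configuration, for example:

```go
cluster := realis.GetDefaultClusterFromZKUrl("192.168.33.7:2181")
fmt.Println(cluster.SchedZKPath) // "/aurora/scheduler"
fmt.Println(cluster.MesosZKPath) // "/mesos"
```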

View file

@ -18,7 +18,7 @@ import (
"fmt"
"testing"
realis "github.com/paypal/gorealis/v2"
realis "github.com/aurora-scheduler/gorealis/v2"
"github.com/stretchr/testify/assert"
)
@ -32,6 +32,7 @@ func TestLoadClusters(t *testing.T) {
assert.Equal(t, clusters["devcluster"].Name, "devcluster")
assert.Equal(t, clusters["devcluster"].ZK, "192.168.33.7")
assert.Equal(t, clusters["devcluster"].SchedZKPath, "/aurora/scheduler")
assert.Equal(t, clusters["devcluster"].MesosZKPath, "/mesos")
assert.Equal(t, clusters["devcluster"].AuthMechanism, "UNAUTHENTICATED")
assert.Equal(t, clusters["devcluster"].AgentRunDir, "latest")
assert.Equal(t, clusters["devcluster"].AgentRoot, "/var/lib/mesos")

View file

@ -15,7 +15,7 @@
package realis
import (
"github.com/paypal/gorealis/v2/gen-go/apache/aurora"
"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
)
type Container interface {

View file

@ -14,7 +14,7 @@ services:
ipv4_address: 192.168.33.2
master:
image: rdelvalle/mesos-master:1.5.1
image: quay.io/aurorascheduler/mesos-master:1.9.0
restart: on-failure
ports:
- "5050:5050"
@ -32,7 +32,7 @@ services:
- zk
agent-one:
image: rdelvalle/mesos-agent:1.5.1
image: quay.io/aurorascheduler/mesos-agent:1.9.0
pid: host
restart: on-failure
ports:
@ -41,10 +41,11 @@ services:
MESOS_MASTER: zk://192.168.33.2:2181/mesos
MESOS_CONTAINERIZERS: docker,mesos
MESOS_PORT: 5051
MESOS_HOSTNAME: localhost
MESOS_HOSTNAME: agent-one
MESOS_RESOURCES: ports(*):[11000-11999]
MESOS_SYSTEMD_ENABLE_SUPPORT: 'false'
MESOS_WORK_DIR: /tmp/mesos
MESOS_ATTRIBUTES: 'host:agent-one;rack:1;zone:west'
networks:
aurora_cluster:
ipv4_address: 192.168.33.4
@ -55,8 +56,58 @@ services:
depends_on:
- zk
agent-two:
image: quay.io/aurorascheduler/mesos-agent:1.9.0
pid: host
restart: on-failure
ports:
- "5052:5051"
environment:
MESOS_MASTER: zk://192.168.33.2:2181/mesos
MESOS_CONTAINERIZERS: docker,mesos
MESOS_PORT: 5051
MESOS_HOSTNAME: agent-two
MESOS_RESOURCES: ports(*):[11000-11999]
MESOS_SYSTEMD_ENABLE_SUPPORT: 'false'
MESOS_WORK_DIR: /tmp/mesos
MESOS_ATTRIBUTES: 'host:agent-two;rack:2;zone:west'
networks:
aurora_cluster:
ipv4_address: 192.168.33.5
volumes:
- /sys/fs/cgroup:/sys/fs/cgroup
- /var/run/docker.sock:/var/run/docker.sock
depends_on:
- zk
agent-three:
image: quay.io/aurorascheduler/mesos-agent:1.9.0
pid: host
restart: on-failure
ports:
- "5053:5051"
environment:
MESOS_MASTER: zk://192.168.33.2:2181/mesos
MESOS_CONTAINERIZERS: docker,mesos
MESOS_PORT: 5051
MESOS_HOSTNAME: agent-three
MESOS_RESOURCES: ports(*):[11000-11999]
MESOS_SYSTEMD_ENABLE_SUPPORT: 'false'
MESOS_WORK_DIR: /tmp/mesos
MESOS_ATTRIBUTES: 'host:agent-three;rack:2;zone:west;dedicated:vagrant/bar'
networks:
aurora_cluster:
ipv4_address: 192.168.33.6
volumes:
- /sys/fs/cgroup:/sys/fs/cgroup
- /var/run/docker.sock:/var/run/docker.sock
depends_on:
- zk
aurora-one:
image: rdelvalle/aurora:0.21.0
image: quay.io/aurorascheduler/scheduler:0.25.0
pid: host
ports:
- "8081:8081"
@ -69,6 +120,8 @@ services:
-http_authentication_mechanism=BASIC
-shiro_realm_modules=INI_AUTHNZ
-shiro_ini_path=/etc/aurora/security.ini
-min_required_instances_for_sla_check=1
-thermos_executor_cpu=0.09
volumes:
- ./.aurora-config:/etc/aurora
networks:

View file

@ -19,25 +19,18 @@ This also allows us to delete and recreate our development cluster very quickly.
To install docker-compose please follow the instructions for your platform
[here](https://docs.docker.com/compose/install/).
### Getting the source code
As of go 1.10.x, GOPATH is still relevant. This may change in the future but
for the sake of making development less error prone, it is suggested that the following
directories be created:
`$ git clone https://github.com/aurora-scheduler/gorealis`
`$ mkdir -p $GOPATH/src/github.com/paypal`
Inside of the newly cloned repo you may download dependencies to the local cache using go mod
And then clone the master branch into the newly created folder:
`$ cd $GOPATH/src/github.com/paypal; git clone git@github.com:paypal/gorealis.git`
Since we check in our vendor folder, gorealis no further set up is needed.
`$ go mod download`
### Bringing up the cluster
To develop gorealis, you will need a fully functioning Mesos cluster along with
Apache Aurora.
To develop gorealis, you will need a fully functioning Mesos cluster along with
the Aurora Scheduler.
In order to bring up our docker-compose set up execute the following command from the root
of the git repository:
@ -62,14 +55,14 @@ environment but not when running under MacOS. To run code involving the ZK leade
For example, running the tests in a container can be done through the following command from
the root of the git repository:
`$ docker run -t -v $(pwd):/go/src/github.com/paypal/gorealis --network gorealis_aurora_cluster golang:1.10.3-alpine go test github.com/paypal/gorealis`
`$ docker run -t -v $(pwd):/go/src/github.com/aurora-scheduler/gorealis --network gorealis_aurora_cluster golang:1.14.3-alpine go test github.com/paypal/gorealis`
Or
`$ ./runTestsMac.sh`
Alternatively, if an interactive shell is necessary, the following command may be used:
`$ docker run -it -v $(pwd):/go/src/github.com/paypal/gorealis --network gorealis_aurora_cluster golang:1.10.3-alpine /bin/sh`
`$ docker run -it -v $(pwd):/go/src/github.com/paypal/gorealis --network gorealis_aurora_cluster golang:1.14.3-alpine /bin/sh`
### Cleaning up the cluster
@ -85,6 +78,3 @@ Once development is done, the environment may be torn down by executing (from th
git directory):
`$ docker-compose down`

View file

@ -247,6 +247,9 @@ job = realis.NewJob().
RAM(64).
Disk(100).
IsService(false).
Production(false).
Tier("preemptible").
Priority(0).
InstanceCount(1).
AddPorts(1).
AddLabel("fileName", "sample-app/docker-compose.yml").
@ -291,6 +294,9 @@ job = realis.NewJob().
RAM(64).
Disk(100).
IsService(true).
Production(false).
Tier("preemptible").
Priority(0).
InstanceCount(1).
AddPorts(1)
```

View file

@ -25,6 +25,9 @@ job = realis.NewJob().
RAM(64).
Disk(100).
IsService(false).
Production(false).
Tier("preemptible").
Priority(0).
InstanceCount(1).
AddPorts(1).
AddLabel("fileName", "sample-app/docker-compose.yml").

View file

@ -22,28 +22,25 @@ Usage of ./client:
```
## Sample commands:
These commands are set to run on a vagrant box. To be able to run the docker compose
executor examples, the vagrant box must be configured properly to use the docker compose executor.
### Thermos
#### Creating a Thermos job
```
$ cd $GOPATH/src/github.com/paypal/gorealis/examples
$ go run client.go -executor=thermos -url=http://192.168.33.7:8081 -cmd=create
$ go run examples/client.go -url=http://localhost:8081 -executor=thermos -cmd=create
```
#### Kill a Thermos job
```
$ go run $GOPATH/src/github.com/paypal/gorealis/examples/client.go -executor=thermos -url=http://192.168.33.7:8081 -cmd=kill
$ go run examples/client.go -url=http://localhost:8081 -executor=thermos -cmd=kill
```
### Docker Compose executor (custom executor)
#### Creating Docker Compose executor job
```
$ go run $GOPATH/src/github.com/paypal/gorealis/examples/client.go -executor=compose -url=http://192.168.33.7:8081 -cmd=create
$ go run examples/client.go -url=http://192.168.33.7:8081 -executor=compose -cmd=create
```
#### Kill a Docker Compose executor job
```
$ go run $GOPATH/src/github.com/paypal/gorealis/examples/client.go -executor=compose -url=http://192.168.33.7:8081 -cmd=kill
$ go run examples/client.go -url=http://192.168.33.7:8081 -executor=compose -cmd=kill
```

View file

@ -28,6 +28,19 @@ func IsTimeout(err error) bool {
return ok && temp.Timedout()
}
type timeoutErr struct {
error
timedout bool
}
func (r *timeoutErr) Timedout() bool {
return r.timedout
}
func newTimedoutError(err error) *timeoutErr {
return &timeoutErr{error: err, timedout: true}
}
// retryErr is a superset of timeout which includes extra context
// with regards to our retry mechanism. This is done in order to make sure
// that our retry mechanism works as expected through our tests and should
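The new timeoutErr type is what lets callers distinguish a monitor or retry timeout from other failures via the exported IsTimeout helper shown above; a short snippet assuming an existing client and update key:

```go
if ok, err := client.MonitorJobUpdate(updateKey, 5*time.Second, 10*time.Minute); !ok || err != nil {
	if realis.IsTimeout(err) {
		// The monitor gave up waiting; the update may still be in progress.
	} else {
		log.Fatal(err)
	}
}
```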

View file

@ -21,8 +21,8 @@ import (
"strings"
"time"
realis "github.com/paypal/gorealis/v2"
"github.com/paypal/gorealis/v2/gen-go/apache/aurora"
realis "github.com/aurora-scheduler/gorealis/v2"
"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
)
var cmd, executor, url, clustersConfig, clusterName, updateId, username, password, zkUrl, hostList, role string
@ -89,7 +89,6 @@ func main() {
realis.Debug(),
}
// Check if zkUrl is available.
if zkUrl != "" {
fmt.Println("zkUrl: ", zkUrl)
clientOptions = append(clientOptions, realis.ZKUrl(zkUrl))
@ -125,6 +124,9 @@ func main() {
RAM(64).
Disk(100).
IsService(true).
Production(false).
Tier("preemptible").
Priority(0).
InstanceCount(1).
AddPorts(1).
ThermosExecutor(thermosExec)
@ -139,6 +141,9 @@ func main() {
RAM(512).
Disk(100).
IsService(true).
Production(false).
Tier("preemptible").
Priority(0).
InstanceCount(1).
AddPorts(4).
AddLabel("fileName", "sample-app/docker-compose.yml").
@ -152,6 +157,9 @@ func main() {
RAM(64).
Disk(100).
IsService(true).
Production(false).
Tier("preemptible").
Priority(0).
InstanceCount(1).
AddPorts(1)
default:
@ -166,7 +174,7 @@ func main() {
log.Fatalln(err)
}
if ok, mErr := r.InstancesMonitor(job.JobKey(), job.GetInstanceCount(), 5*time.Second, 50*time.Second); !ok || mErr != nil {
if ok, mErr := r.MonitorInstances(job.JobKey(), job.GetInstanceCount(), 5*time.Second, 50*time.Second); !ok || mErr != nil {
err := r.KillJob(job.JobKey())
if err != nil {
log.Fatalln(err)
@ -185,7 +193,7 @@ func main() {
}
fmt.Println(result.String())
if ok, mErr := r.JobUpdateMonitor(*result.GetKey(), 5*time.Second, 180*time.Second); !ok || mErr != nil {
if ok, mErr := r.MonitorJobUpdate(*result.GetKey(), 5*time.Second, 180*time.Second); !ok || mErr != nil {
err := r.AbortJobUpdate(*result.GetKey(), "Monitor timed out")
err = r.KillJob(job.JobKey())
if err != nil {
@ -203,7 +211,7 @@ func main() {
log.Fatal(err)
}
if ok, err := r.InstancesMonitor(job.JobKey(), job.GetInstanceCount(), 10*time.Second, 300*time.Second); !ok || err != nil {
if ok, err := r.MonitorInstances(job.JobKey(), job.GetInstanceCount(), 10*time.Second, 300*time.Second); !ok || err != nil {
err := r.KillJob(job.JobKey())
if err != nil {
log.Fatal(err)
@ -219,7 +227,7 @@ func main() {
log.Fatal(err)
}
if ok, err := r.InstancesMonitor(job.JobKey(), job.GetInstanceCount(), 10*time.Second, 300*time.Second); !ok || err != nil {
if ok, err := r.MonitorInstances(job.JobKey(), job.GetInstanceCount(), 10*time.Second, 300*time.Second); !ok || err != nil {
err := r.KillJob(job.JobKey())
if err != nil {
log.Fatal(err)
@ -258,7 +266,7 @@ func main() {
log.Fatal(err)
}
if ok, err := r.InstancesMonitor(job.JobKey(), 0, 5*time.Second, 50*time.Second); !ok || err != nil {
if ok, err := r.MonitorInstances(job.JobKey(), 0, 5*time.Second, 50*time.Second); !ok || err != nil {
log.Fatal("Unable to kill all instances of job")
}
@ -312,7 +320,7 @@ func main() {
log.Fatal(err)
}
if ok, err := r.InstancesMonitor(job.JobKey(), int32(currInstances+numOfInstances), 5*time.Second, 50*time.Second); !ok || err != nil {
if ok, err := r.MonitorInstances(job.JobKey(), int32(currInstances+numOfInstances), 5*time.Second, 50*time.Second); !ok || err != nil {
fmt.Println("Flexing up failed")
}
@ -333,7 +341,7 @@ func main() {
log.Fatal(err)
}
if ok, err := r.InstancesMonitor(job.JobKey(), int32(currInstances-numOfInstances), 5*time.Second, 100*time.Second); !ok || err != nil {
if ok, err := r.MonitorInstances(job.JobKey(), int32(currInstances-numOfInstances), 5*time.Second, 100*time.Second); !ok || err != nil {
fmt.Println("flexDown failed")
}
@ -360,7 +368,7 @@ func main() {
}
jobUpdateKey := result.GetKey()
_, err = r.JobUpdateMonitor(*jobUpdateKey, 5*time.Second, 6*time.Minute)
_, err = r.MonitorJobUpdate(*jobUpdateKey, 5*time.Second, 6*time.Minute)
if err != nil {
log.Fatal(err)
}
@ -378,7 +386,7 @@ func main() {
case "resumeJobUpdate":
key := job.JobKey()
err := r.ResumeJobUpdate(&aurora.JobUpdateKey{
err := r.ResumeJobUpdate(aurora.JobUpdateKey{
Job: &key,
ID: updateId,
}, "")
@ -389,7 +397,7 @@ func main() {
case "pulseJobUpdate":
key := job.JobKey()
resp, err := r.PulseJobUpdate(&aurora.JobUpdateKey{
resp, err := r.PulseJobUpdate(aurora.JobUpdateKey{
Job: &key,
ID: updateId,
})
@ -518,7 +526,7 @@ func main() {
}
// Monitor change to DRAINING and DRAINED mode
hostResult, err := r.HostMaintenanceMonitor(
hostResult, err := r.MonitorHostMaintenance(
hosts,
[]aurora.MaintenanceMode{aurora.MaintenanceMode_DRAINED, aurora.MaintenanceMode_DRAINING},
5*time.Second,
@ -547,7 +555,7 @@ func main() {
}
// Monitor change to DRAINING and DRAINED mode
hostResult, err := r.HostMaintenanceMonitor(
hostResult, err := r.MonitorHostMaintenance(
hosts,
[]aurora.MaintenanceMode{aurora.MaintenanceMode_DRAINED, aurora.MaintenanceMode_DRAINING},
5*time.Second,
@ -573,7 +581,7 @@ func main() {
}
// Monitor change to DRAINING and DRAINED mode
hostResult, err := r.HostMaintenanceMonitor(
hostResult, err := r.MonitorHostMaintenance(
hosts,
[]aurora.MaintenanceMode{aurora.MaintenanceMode_NONE},
5*time.Second,

View file

@ -2,6 +2,7 @@
"name": "devcluster",
"zk": "192.168.33.7",
"scheduler_zk_path": "/aurora/scheduler",
"mesos_zk_path": "/mesos",
"auth_mechanism": "UNAUTHENTICATED",
"slave_run_directory": "latest",
"slave_root": "/var/lib/mesos"

View file

@ -23,8 +23,8 @@ import (
"os"
"time"
realis "github.com/paypal/gorealis/v2"
"github.com/paypal/gorealis/v2/gen-go/apache/aurora"
realis "github.com/aurora-scheduler/gorealis/v2"
"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
"github.com/pkg/errors"
)
@ -177,6 +177,8 @@ func main() {
RAM(job.RAM).
Disk(job.Disk).
IsService(job.Service).
Tier("preemptible").
Priority(0).
InstanceCount(job.Instances).
AddPorts(job.Ports)
@ -208,7 +210,7 @@ func main() {
fmt.Println("Error creating Aurora job: ", jobCreationErr)
os.Exit(1)
} else {
if ok, monitorErr := r.InstancesMonitor(auroraJob.JobKey(), auroraJob.GetInstanceCount(), 5, 50); !ok || monitorErr != nil {
if ok, monitorErr := r.MonitorInstances(auroraJob.JobKey(), auroraJob.GetInstanceCount(), 5, 50); !ok || monitorErr != nil {
if jobErr := r.KillJob(auroraJob.JobKey()); jobErr !=
nil {
fmt.Println(jobErr)

View file

@ -1,7 +1,6 @@
// Autogenerated by Thrift Compiler (1.0.0-dev)
// Autogenerated by Thrift Compiler (0.12.0)
// DO NOT EDIT UNLESS YOU ARE SURE THAT YOU KNOW WHAT YOU ARE DOING
// Code generated by Thrift Compiler (0.14.0). DO NOT EDIT.
package aurora
var GoUnusedProtection__ int
var GoUnusedProtection__ int;

View file

@ -1,58 +1,53 @@
// Autogenerated by Thrift Compiler (0.12.0)
// DO NOT EDIT UNLESS YOU ARE SURE THAT YOU KNOW WHAT YOU ARE DOING
// Code generated by Thrift Compiler (0.14.0). DO NOT EDIT.
package aurora
import (
import(
"bytes"
"context"
"fmt"
"reflect"
"git.apache.org/thrift.git/lib/go/thrift"
"time"
"github.com/apache/thrift/lib/go/thrift"
)
// (needed to ensure safety because of naive import list construction.)
var _ = thrift.ZERO
var _ = fmt.Printf
var _ = context.Background
var _ = reflect.DeepEqual
var _ = time.Now
var _ = bytes.Equal
const AURORA_EXECUTOR_NAME = "AuroraExecutor"
var ACTIVE_STATES []ScheduleStatus
var SLAVE_ASSIGNED_STATES []ScheduleStatus
var LIVE_STATES []ScheduleStatus
var TERMINAL_STATES []ScheduleStatus
const GOOD_IDENTIFIER_PATTERN = "^[\\w\\-\\.]+$"
const GOOD_IDENTIFIER_PATTERN_JVM = "^[\\w\\-\\.]+$"
const GOOD_IDENTIFIER_PATTERN_PYTHON = "^[\\w\\-\\.]+$"
var ACTIVE_JOB_UPDATE_STATES []JobUpdateStatus
var AWAITNG_PULSE_JOB_UPDATE_STATES []JobUpdateStatus
const BYPASS_LEADER_REDIRECT_HEADER_NAME = "Bypass-Leader-Redirect"
const TASK_FILESYSTEM_MOUNT_POINT = "taskfs"
func init() {
ACTIVE_STATES = []ScheduleStatus{
9, 17, 6, 0, 13, 12, 2, 1, 18, 16}
ACTIVE_STATES = []ScheduleStatus{
9, 17, 6, 0, 13, 12, 2, 1, 18, 16, }
SLAVE_ASSIGNED_STATES = []ScheduleStatus{
9, 17, 6, 13, 12, 2, 18, 1}
SLAVE_ASSIGNED_STATES = []ScheduleStatus{
9, 17, 6, 13, 12, 2, 18, 1, }
LIVE_STATES = []ScheduleStatus{
6, 13, 12, 17, 18, 2}
LIVE_STATES = []ScheduleStatus{
6, 13, 12, 17, 18, 2, }
TERMINAL_STATES = []ScheduleStatus{
4, 3, 5, 7}
TERMINAL_STATES = []ScheduleStatus{
4, 3, 5, 7, }
ACTIVE_JOB_UPDATE_STATES = []JobUpdateStatus{
0, 1, 2, 3, 9, 10}
ACTIVE_JOB_UPDATE_STATES = []JobUpdateStatus{
0, 1, 2, 3, 9, 10, }
AWAITNG_PULSE_JOB_UPDATE_STATES = []JobUpdateStatus{
9, 10}
AWAITNG_PULSE_JOB_UPDATE_STATES = []JobUpdateStatus{
9, 10, }
}

File diff suppressed because it is too large

File diff suppressed because it is too large

View file

@ -1,10 +1,8 @@
// Autogenerated by Thrift Compiler (0.12.0)
// DO NOT EDIT UNLESS YOU ARE SURE THAT YOU KNOW WHAT YOU ARE DOING
// Code generated by Thrift Compiler (0.14.0). DO NOT EDIT.
package main
import (
"apache/aurora"
"context"
"flag"
"fmt"
@ -14,398 +12,400 @@ import (
"os"
"strconv"
"strings"
"git.apache.org/thrift.git/lib/go/thrift"
"github.com/apache/thrift/lib/go/thrift"
"apache/aurora"
)
var _ = aurora.GoUnusedProtection__
func Usage() {
fmt.Fprintln(os.Stderr, "Usage of ", os.Args[0], " [-h host:port] [-u url] [-f[ramed]] function [arg1 [arg2...]]:")
flag.PrintDefaults()
fmt.Fprintln(os.Stderr, "\nFunctions:")
fmt.Fprintln(os.Stderr, " Response getRoleSummary()")
fmt.Fprintln(os.Stderr, " Response getJobSummary(string role)")
fmt.Fprintln(os.Stderr, " Response getTasksStatus(TaskQuery query)")
fmt.Fprintln(os.Stderr, " Response getTasksWithoutConfigs(TaskQuery query)")
fmt.Fprintln(os.Stderr, " Response getPendingReason(TaskQuery query)")
fmt.Fprintln(os.Stderr, " Response getConfigSummary(JobKey job)")
fmt.Fprintln(os.Stderr, " Response getJobs(string ownerRole)")
fmt.Fprintln(os.Stderr, " Response getQuota(string ownerRole)")
fmt.Fprintln(os.Stderr, " Response populateJobConfig(JobConfiguration description)")
fmt.Fprintln(os.Stderr, " Response getJobUpdateSummaries(JobUpdateQuery jobUpdateQuery)")
fmt.Fprintln(os.Stderr, " Response getJobUpdateDetails(JobUpdateQuery query)")
fmt.Fprintln(os.Stderr, " Response getJobUpdateDiff(JobUpdateRequest request)")
fmt.Fprintln(os.Stderr, " Response getTierConfigs()")
fmt.Fprintln(os.Stderr)
os.Exit(0)
fmt.Fprintln(os.Stderr, "Usage of ", os.Args[0], " [-h host:port] [-u url] [-f[ramed]] function [arg1 [arg2...]]:")
flag.PrintDefaults()
fmt.Fprintln(os.Stderr, "\nFunctions:")
fmt.Fprintln(os.Stderr, " Response getRoleSummary()")
fmt.Fprintln(os.Stderr, " Response getJobSummary(string role)")
fmt.Fprintln(os.Stderr, " Response getTasksStatus(TaskQuery query)")
fmt.Fprintln(os.Stderr, " Response getTasksWithoutConfigs(TaskQuery query)")
fmt.Fprintln(os.Stderr, " Response getPendingReason(TaskQuery query)")
fmt.Fprintln(os.Stderr, " Response getConfigSummary(JobKey job)")
fmt.Fprintln(os.Stderr, " Response getJobs(string ownerRole)")
fmt.Fprintln(os.Stderr, " Response getQuota(string ownerRole)")
fmt.Fprintln(os.Stderr, " Response populateJobConfig(JobConfiguration description)")
fmt.Fprintln(os.Stderr, " Response getJobUpdateSummaries(JobUpdateQuery jobUpdateQuery)")
fmt.Fprintln(os.Stderr, " Response getJobUpdateDetails(JobUpdateQuery query)")
fmt.Fprintln(os.Stderr, " Response getJobUpdateDiff(JobUpdateRequest request)")
fmt.Fprintln(os.Stderr, " Response getTierConfigs()")
fmt.Fprintln(os.Stderr)
os.Exit(0)
}
type httpHeaders map[string]string
func (h httpHeaders) String() string {
var m map[string]string = h
return fmt.Sprintf("%s", m)
var m map[string]string = h
return fmt.Sprintf("%s", m)
}
func (h httpHeaders) Set(value string) error {
parts := strings.Split(value, ": ")
if len(parts) != 2 {
return fmt.Errorf("header should be of format 'Key: Value'")
}
h[parts[0]] = parts[1]
return nil
parts := strings.Split(value, ": ")
if len(parts) != 2 {
return fmt.Errorf("header should be of format 'Key: Value'")
}
h[parts[0]] = parts[1]
return nil
}
func main() {
flag.Usage = Usage
var host string
var port int
var protocol string
var urlString string
var framed bool
var useHttp bool
headers := make(httpHeaders)
var parsedUrl *url.URL
var trans thrift.TTransport
_ = strconv.Atoi
_ = math.Abs
flag.Usage = Usage
flag.StringVar(&host, "h", "localhost", "Specify host and port")
flag.IntVar(&port, "p", 9090, "Specify port")
flag.StringVar(&protocol, "P", "binary", "Specify the protocol (binary, compact, simplejson, json)")
flag.StringVar(&urlString, "u", "", "Specify the url")
flag.BoolVar(&framed, "framed", false, "Use framed transport")
flag.BoolVar(&useHttp, "http", false, "Use http")
flag.Var(headers, "H", "Headers to set on the http(s) request (e.g. -H \"Key: Value\")")
flag.Parse()
if len(urlString) > 0 {
var err error
parsedUrl, err = url.Parse(urlString)
if err != nil {
fmt.Fprintln(os.Stderr, "Error parsing URL: ", err)
flag.Usage()
}
host = parsedUrl.Host
useHttp = len(parsedUrl.Scheme) <= 0 || parsedUrl.Scheme == "http" || parsedUrl.Scheme == "https"
} else if useHttp {
_, err := url.Parse(fmt.Sprint("http://", host, ":", port))
if err != nil {
fmt.Fprintln(os.Stderr, "Error parsing URL: ", err)
flag.Usage()
}
}
cmd := flag.Arg(0)
var err error
if useHttp {
trans, err = thrift.NewTHttpClient(parsedUrl.String())
if len(headers) > 0 {
httptrans := trans.(*thrift.THttpClient)
for key, value := range headers {
httptrans.SetHeader(key, value)
}
}
} else {
portStr := fmt.Sprint(port)
if strings.Contains(host, ":") {
host, portStr, err = net.SplitHostPort(host)
if err != nil {
fmt.Fprintln(os.Stderr, "error with host:", err)
os.Exit(1)
}
}
trans, err = thrift.NewTSocket(net.JoinHostPort(host, portStr))
if err != nil {
fmt.Fprintln(os.Stderr, "error resolving address:", err)
os.Exit(1)
}
if framed {
trans = thrift.NewTFramedTransport(trans)
}
}
if err != nil {
fmt.Fprintln(os.Stderr, "Error creating transport", err)
os.Exit(1)
}
defer trans.Close()
var protocolFactory thrift.TProtocolFactory
switch protocol {
case "compact":
protocolFactory = thrift.NewTCompactProtocolFactory()
break
case "simplejson":
protocolFactory = thrift.NewTSimpleJSONProtocolFactory()
break
case "json":
protocolFactory = thrift.NewTJSONProtocolFactory()
break
case "binary", "":
protocolFactory = thrift.NewTBinaryProtocolFactoryDefault()
break
default:
fmt.Fprintln(os.Stderr, "Invalid protocol specified: ", protocol)
Usage()
os.Exit(1)
}
iprot := protocolFactory.GetProtocol(trans)
oprot := protocolFactory.GetProtocol(trans)
client := aurora.NewReadOnlySchedulerClient(thrift.NewTStandardClient(iprot, oprot))
if err := trans.Open(); err != nil {
fmt.Fprintln(os.Stderr, "Error opening socket to ", host, ":", port, " ", err)
os.Exit(1)
}
switch cmd {
case "getRoleSummary":
if flag.NArg()-1 != 0 {
fmt.Fprintln(os.Stderr, "GetRoleSummary requires 0 args")
flag.Usage()
}
fmt.Print(client.GetRoleSummary(context.Background()))
fmt.Print("\n")
break
case "getJobSummary":
if flag.NArg()-1 != 1 {
fmt.Fprintln(os.Stderr, "GetJobSummary requires 1 args")
flag.Usage()
}
argvalue0 := flag.Arg(1)
value0 := argvalue0
fmt.Print(client.GetJobSummary(context.Background(), value0))
fmt.Print("\n")
break
case "getTasksStatus":
if flag.NArg()-1 != 1 {
fmt.Fprintln(os.Stderr, "GetTasksStatus requires 1 args")
flag.Usage()
}
arg81 := flag.Arg(1)
mbTrans82 := thrift.NewTMemoryBufferLen(len(arg81))
defer mbTrans82.Close()
_, err83 := mbTrans82.WriteString(arg81)
if err83 != nil {
Usage()
return
}
factory84 := thrift.NewTJSONProtocolFactory()
jsProt85 := factory84.GetProtocol(mbTrans82)
argvalue0 := aurora.NewTaskQuery()
err86 := argvalue0.Read(jsProt85)
if err86 != nil {
Usage()
return
}
value0 := argvalue0
fmt.Print(client.GetTasksStatus(context.Background(), value0))
fmt.Print("\n")
break
case "getTasksWithoutConfigs":
if flag.NArg()-1 != 1 {
fmt.Fprintln(os.Stderr, "GetTasksWithoutConfigs requires 1 args")
flag.Usage()
}
arg87 := flag.Arg(1)
mbTrans88 := thrift.NewTMemoryBufferLen(len(arg87))
defer mbTrans88.Close()
_, err89 := mbTrans88.WriteString(arg87)
if err89 != nil {
Usage()
return
}
factory90 := thrift.NewTJSONProtocolFactory()
jsProt91 := factory90.GetProtocol(mbTrans88)
argvalue0 := aurora.NewTaskQuery()
err92 := argvalue0.Read(jsProt91)
if err92 != nil {
Usage()
return
}
value0 := argvalue0
fmt.Print(client.GetTasksWithoutConfigs(context.Background(), value0))
fmt.Print("\n")
break
case "getPendingReason":
if flag.NArg()-1 != 1 {
fmt.Fprintln(os.Stderr, "GetPendingReason requires 1 args")
flag.Usage()
}
arg93 := flag.Arg(1)
mbTrans94 := thrift.NewTMemoryBufferLen(len(arg93))
defer mbTrans94.Close()
_, err95 := mbTrans94.WriteString(arg93)
if err95 != nil {
Usage()
return
}
factory96 := thrift.NewTJSONProtocolFactory()
jsProt97 := factory96.GetProtocol(mbTrans94)
argvalue0 := aurora.NewTaskQuery()
err98 := argvalue0.Read(jsProt97)
if err98 != nil {
Usage()
return
}
value0 := argvalue0
fmt.Print(client.GetPendingReason(context.Background(), value0))
fmt.Print("\n")
break
case "getConfigSummary":
if flag.NArg()-1 != 1 {
fmt.Fprintln(os.Stderr, "GetConfigSummary requires 1 args")
flag.Usage()
}
arg99 := flag.Arg(1)
mbTrans100 := thrift.NewTMemoryBufferLen(len(arg99))
defer mbTrans100.Close()
_, err101 := mbTrans100.WriteString(arg99)
if err101 != nil {
Usage()
return
}
factory102 := thrift.NewTJSONProtocolFactory()
jsProt103 := factory102.GetProtocol(mbTrans100)
argvalue0 := aurora.NewJobKey()
err104 := argvalue0.Read(jsProt103)
if err104 != nil {
Usage()
return
}
value0 := argvalue0
fmt.Print(client.GetConfigSummary(context.Background(), value0))
fmt.Print("\n")
break
case "getJobs":
if flag.NArg()-1 != 1 {
fmt.Fprintln(os.Stderr, "GetJobs requires 1 args")
flag.Usage()
}
argvalue0 := flag.Arg(1)
value0 := argvalue0
fmt.Print(client.GetJobs(context.Background(), value0))
fmt.Print("\n")
break
case "getQuota":
if flag.NArg()-1 != 1 {
fmt.Fprintln(os.Stderr, "GetQuota requires 1 args")
flag.Usage()
}
argvalue0 := flag.Arg(1)
value0 := argvalue0
fmt.Print(client.GetQuota(context.Background(), value0))
fmt.Print("\n")
break
case "populateJobConfig":
if flag.NArg()-1 != 1 {
fmt.Fprintln(os.Stderr, "PopulateJobConfig requires 1 args")
flag.Usage()
}
arg107 := flag.Arg(1)
mbTrans108 := thrift.NewTMemoryBufferLen(len(arg107))
defer mbTrans108.Close()
_, err109 := mbTrans108.WriteString(arg107)
if err109 != nil {
Usage()
return
}
factory110 := thrift.NewTJSONProtocolFactory()
jsProt111 := factory110.GetProtocol(mbTrans108)
argvalue0 := aurora.NewJobConfiguration()
err112 := argvalue0.Read(jsProt111)
if err112 != nil {
Usage()
return
}
value0 := argvalue0
fmt.Print(client.PopulateJobConfig(context.Background(), value0))
fmt.Print("\n")
break
case "getJobUpdateSummaries":
if flag.NArg()-1 != 1 {
fmt.Fprintln(os.Stderr, "GetJobUpdateSummaries requires 1 args")
flag.Usage()
}
arg113 := flag.Arg(1)
mbTrans114 := thrift.NewTMemoryBufferLen(len(arg113))
defer mbTrans114.Close()
_, err115 := mbTrans114.WriteString(arg113)
if err115 != nil {
Usage()
return
}
factory116 := thrift.NewTJSONProtocolFactory()
jsProt117 := factory116.GetProtocol(mbTrans114)
argvalue0 := aurora.NewJobUpdateQuery()
err118 := argvalue0.Read(jsProt117)
if err118 != nil {
Usage()
return
}
value0 := argvalue0
fmt.Print(client.GetJobUpdateSummaries(context.Background(), value0))
fmt.Print("\n")
break
case "getJobUpdateDetails":
if flag.NArg()-1 != 1 {
fmt.Fprintln(os.Stderr, "GetJobUpdateDetails requires 1 args")
flag.Usage()
}
arg119 := flag.Arg(1)
mbTrans120 := thrift.NewTMemoryBufferLen(len(arg119))
defer mbTrans120.Close()
_, err121 := mbTrans120.WriteString(arg119)
if err121 != nil {
Usage()
return
}
factory122 := thrift.NewTJSONProtocolFactory()
jsProt123 := factory122.GetProtocol(mbTrans120)
argvalue0 := aurora.NewJobUpdateQuery()
err124 := argvalue0.Read(jsProt123)
if err124 != nil {
Usage()
return
}
value0 := argvalue0
fmt.Print(client.GetJobUpdateDetails(context.Background(), value0))
fmt.Print("\n")
break
case "getJobUpdateDiff":
if flag.NArg()-1 != 1 {
fmt.Fprintln(os.Stderr, "GetJobUpdateDiff requires 1 args")
flag.Usage()
}
arg125 := flag.Arg(1)
mbTrans126 := thrift.NewTMemoryBufferLen(len(arg125))
defer mbTrans126.Close()
_, err127 := mbTrans126.WriteString(arg125)
if err127 != nil {
Usage()
return
}
factory128 := thrift.NewTJSONProtocolFactory()
jsProt129 := factory128.GetProtocol(mbTrans126)
argvalue0 := aurora.NewJobUpdateRequest()
err130 := argvalue0.Read(jsProt129)
if err130 != nil {
Usage()
return
}
value0 := argvalue0
fmt.Print(client.GetJobUpdateDiff(context.Background(), value0))
fmt.Print("\n")
break
case "getTierConfigs":
if flag.NArg()-1 != 0 {
fmt.Fprintln(os.Stderr, "GetTierConfigs requires 0 args")
flag.Usage()
}
fmt.Print(client.GetTierConfigs(context.Background()))
fmt.Print("\n")
break
case "":
Usage()
break
default:
fmt.Fprintln(os.Stderr, "Invalid function ", cmd)
}
flag.Usage = Usage
var host string
var port int
var protocol string
var urlString string
var framed bool
var useHttp bool
headers := make(httpHeaders)
var parsedUrl *url.URL
var trans thrift.TTransport
_ = strconv.Atoi
_ = math.Abs
flag.Usage = Usage
flag.StringVar(&host, "h", "localhost", "Specify host and port")
flag.IntVar(&port, "p", 9090, "Specify port")
flag.StringVar(&protocol, "P", "binary", "Specify the protocol (binary, compact, simplejson, json)")
flag.StringVar(&urlString, "u", "", "Specify the url")
flag.BoolVar(&framed, "framed", false, "Use framed transport")
flag.BoolVar(&useHttp, "http", false, "Use http")
flag.Var(headers, "H", "Headers to set on the http(s) request (e.g. -H \"Key: Value\")")
flag.Parse()
if len(urlString) > 0 {
var err error
parsedUrl, err = url.Parse(urlString)
if err != nil {
fmt.Fprintln(os.Stderr, "Error parsing URL: ", err)
flag.Usage()
}
host = parsedUrl.Host
useHttp = len(parsedUrl.Scheme) <= 0 || parsedUrl.Scheme == "http" || parsedUrl.Scheme == "https"
} else if useHttp {
_, err := url.Parse(fmt.Sprint("http://", host, ":", port))
if err != nil {
fmt.Fprintln(os.Stderr, "Error parsing URL: ", err)
flag.Usage()
}
}
cmd := flag.Arg(0)
var err error
if useHttp {
trans, err = thrift.NewTHttpClient(parsedUrl.String())
if len(headers) > 0 {
httptrans := trans.(*thrift.THttpClient)
for key, value := range headers {
httptrans.SetHeader(key, value)
}
}
} else {
portStr := fmt.Sprint(port)
if strings.Contains(host, ":") {
host, portStr, err = net.SplitHostPort(host)
if err != nil {
fmt.Fprintln(os.Stderr, "error with host:", err)
os.Exit(1)
}
}
trans, err = thrift.NewTSocket(net.JoinHostPort(host, portStr))
if err != nil {
fmt.Fprintln(os.Stderr, "error resolving address:", err)
os.Exit(1)
}
if framed {
trans = thrift.NewTFramedTransport(trans)
}
}
if err != nil {
fmt.Fprintln(os.Stderr, "Error creating transport", err)
os.Exit(1)
}
defer trans.Close()
var protocolFactory thrift.TProtocolFactory
switch protocol {
case "compact":
protocolFactory = thrift.NewTCompactProtocolFactory()
break
case "simplejson":
protocolFactory = thrift.NewTSimpleJSONProtocolFactory()
break
case "json":
protocolFactory = thrift.NewTJSONProtocolFactory()
break
case "binary", "":
protocolFactory = thrift.NewTBinaryProtocolFactoryDefault()
break
default:
fmt.Fprintln(os.Stderr, "Invalid protocol specified: ", protocol)
Usage()
os.Exit(1)
}
iprot := protocolFactory.GetProtocol(trans)
oprot := protocolFactory.GetProtocol(trans)
client := aurora.NewReadOnlySchedulerClient(thrift.NewTStandardClient(iprot, oprot))
if err := trans.Open(); err != nil {
fmt.Fprintln(os.Stderr, "Error opening socket to ", host, ":", port, " ", err)
os.Exit(1)
}
switch cmd {
case "getRoleSummary":
if flag.NArg() - 1 != 0 {
fmt.Fprintln(os.Stderr, "GetRoleSummary requires 0 args")
flag.Usage()
}
fmt.Print(client.GetRoleSummary(context.Background()))
fmt.Print("\n")
break
case "getJobSummary":
if flag.NArg() - 1 != 1 {
fmt.Fprintln(os.Stderr, "GetJobSummary requires 1 args")
flag.Usage()
}
argvalue0 := flag.Arg(1)
value0 := argvalue0
fmt.Print(client.GetJobSummary(context.Background(), value0))
fmt.Print("\n")
break
case "getTasksStatus":
if flag.NArg() - 1 != 1 {
fmt.Fprintln(os.Stderr, "GetTasksStatus requires 1 args")
flag.Usage()
}
arg132 := flag.Arg(1)
mbTrans133 := thrift.NewTMemoryBufferLen(len(arg132))
defer mbTrans133.Close()
_, err134 := mbTrans133.WriteString(arg132)
if err134 != nil {
Usage()
return
}
factory135 := thrift.NewTJSONProtocolFactory()
jsProt136 := factory135.GetProtocol(mbTrans133)
argvalue0 := aurora.NewTaskQuery()
err137 := argvalue0.Read(context.Background(), jsProt136)
if err137 != nil {
Usage()
return
}
value0 := argvalue0
fmt.Print(client.GetTasksStatus(context.Background(), value0))
fmt.Print("\n")
break
case "getTasksWithoutConfigs":
if flag.NArg() - 1 != 1 {
fmt.Fprintln(os.Stderr, "GetTasksWithoutConfigs requires 1 args")
flag.Usage()
}
arg138 := flag.Arg(1)
mbTrans139 := thrift.NewTMemoryBufferLen(len(arg138))
defer mbTrans139.Close()
_, err140 := mbTrans139.WriteString(arg138)
if err140 != nil {
Usage()
return
}
factory141 := thrift.NewTJSONProtocolFactory()
jsProt142 := factory141.GetProtocol(mbTrans139)
argvalue0 := aurora.NewTaskQuery()
err143 := argvalue0.Read(context.Background(), jsProt142)
if err143 != nil {
Usage()
return
}
value0 := argvalue0
fmt.Print(client.GetTasksWithoutConfigs(context.Background(), value0))
fmt.Print("\n")
break
case "getPendingReason":
if flag.NArg() - 1 != 1 {
fmt.Fprintln(os.Stderr, "GetPendingReason requires 1 args")
flag.Usage()
}
arg144 := flag.Arg(1)
mbTrans145 := thrift.NewTMemoryBufferLen(len(arg144))
defer mbTrans145.Close()
_, err146 := mbTrans145.WriteString(arg144)
if err146 != nil {
Usage()
return
}
factory147 := thrift.NewTJSONProtocolFactory()
jsProt148 := factory147.GetProtocol(mbTrans145)
argvalue0 := aurora.NewTaskQuery()
err149 := argvalue0.Read(context.Background(), jsProt148)
if err149 != nil {
Usage()
return
}
value0 := argvalue0
fmt.Print(client.GetPendingReason(context.Background(), value0))
fmt.Print("\n")
break
case "getConfigSummary":
if flag.NArg() - 1 != 1 {
fmt.Fprintln(os.Stderr, "GetConfigSummary requires 1 args")
flag.Usage()
}
arg150 := flag.Arg(1)
mbTrans151 := thrift.NewTMemoryBufferLen(len(arg150))
defer mbTrans151.Close()
_, err152 := mbTrans151.WriteString(arg150)
if err152 != nil {
Usage()
return
}
factory153 := thrift.NewTJSONProtocolFactory()
jsProt154 := factory153.GetProtocol(mbTrans151)
argvalue0 := aurora.NewJobKey()
err155 := argvalue0.Read(context.Background(), jsProt154)
if err155 != nil {
Usage()
return
}
value0 := argvalue0
fmt.Print(client.GetConfigSummary(context.Background(), value0))
fmt.Print("\n")
break
case "getJobs":
if flag.NArg() - 1 != 1 {
fmt.Fprintln(os.Stderr, "GetJobs requires 1 args")
flag.Usage()
}
argvalue0 := flag.Arg(1)
value0 := argvalue0
fmt.Print(client.GetJobs(context.Background(), value0))
fmt.Print("\n")
break
case "getQuota":
if flag.NArg() - 1 != 1 {
fmt.Fprintln(os.Stderr, "GetQuota requires 1 args")
flag.Usage()
}
argvalue0 := flag.Arg(1)
value0 := argvalue0
fmt.Print(client.GetQuota(context.Background(), value0))
fmt.Print("\n")
break
case "populateJobConfig":
if flag.NArg() - 1 != 1 {
fmt.Fprintln(os.Stderr, "PopulateJobConfig requires 1 args")
flag.Usage()
}
arg158 := flag.Arg(1)
mbTrans159 := thrift.NewTMemoryBufferLen(len(arg158))
defer mbTrans159.Close()
_, err160 := mbTrans159.WriteString(arg158)
if err160 != nil {
Usage()
return
}
factory161 := thrift.NewTJSONProtocolFactory()
jsProt162 := factory161.GetProtocol(mbTrans159)
argvalue0 := aurora.NewJobConfiguration()
err163 := argvalue0.Read(context.Background(), jsProt162)
if err163 != nil {
Usage()
return
}
value0 := argvalue0
fmt.Print(client.PopulateJobConfig(context.Background(), value0))
fmt.Print("\n")
break
case "getJobUpdateSummaries":
if flag.NArg() - 1 != 1 {
fmt.Fprintln(os.Stderr, "GetJobUpdateSummaries requires 1 args")
flag.Usage()
}
arg164 := flag.Arg(1)
mbTrans165 := thrift.NewTMemoryBufferLen(len(arg164))
defer mbTrans165.Close()
_, err166 := mbTrans165.WriteString(arg164)
if err166 != nil {
Usage()
return
}
factory167 := thrift.NewTJSONProtocolFactory()
jsProt168 := factory167.GetProtocol(mbTrans165)
argvalue0 := aurora.NewJobUpdateQuery()
err169 := argvalue0.Read(context.Background(), jsProt168)
if err169 != nil {
Usage()
return
}
value0 := argvalue0
fmt.Print(client.GetJobUpdateSummaries(context.Background(), value0))
fmt.Print("\n")
break
case "getJobUpdateDetails":
if flag.NArg() - 1 != 1 {
fmt.Fprintln(os.Stderr, "GetJobUpdateDetails requires 1 args")
flag.Usage()
}
arg170 := flag.Arg(1)
mbTrans171 := thrift.NewTMemoryBufferLen(len(arg170))
defer mbTrans171.Close()
_, err172 := mbTrans171.WriteString(arg170)
if err172 != nil {
Usage()
return
}
factory173 := thrift.NewTJSONProtocolFactory()
jsProt174 := factory173.GetProtocol(mbTrans171)
argvalue0 := aurora.NewJobUpdateQuery()
err175 := argvalue0.Read(context.Background(), jsProt174)
if err175 != nil {
Usage()
return
}
value0 := argvalue0
fmt.Print(client.GetJobUpdateDetails(context.Background(), value0))
fmt.Print("\n")
break
case "getJobUpdateDiff":
if flag.NArg() - 1 != 1 {
fmt.Fprintln(os.Stderr, "GetJobUpdateDiff requires 1 args")
flag.Usage()
}
arg176 := flag.Arg(1)
mbTrans177 := thrift.NewTMemoryBufferLen(len(arg176))
defer mbTrans177.Close()
_, err178 := mbTrans177.WriteString(arg176)
if err178 != nil {
Usage()
return
}
factory179 := thrift.NewTJSONProtocolFactory()
jsProt180 := factory179.GetProtocol(mbTrans177)
argvalue0 := aurora.NewJobUpdateRequest()
err181 := argvalue0.Read(context.Background(), jsProt180)
if err181 != nil {
Usage()
return
}
value0 := argvalue0
fmt.Print(client.GetJobUpdateDiff(context.Background(), value0))
fmt.Print("\n")
break
case "getTierConfigs":
if flag.NArg() - 1 != 0 {
fmt.Fprintln(os.Stderr, "GetTierConfigs requires 0 args")
flag.Usage()
}
fmt.Print(client.GetTierConfigs(context.Background()))
fmt.Print("\n")
break
case "":
Usage()
break
default:
fmt.Fprintln(os.Stderr, "Invalid function ", cmd)
}
}

View file

@ -1,6 +1,6 @@
#! /bin/bash
THRIFT_VER=0.12.0
THRIFT_VER=0.14.0
if [[ $(thrift -version | grep -e $THRIFT_VER -c) -ne 1 ]]; then
echo "Warning: This wrapper has only been tested with version" $THRIFT_VER;

12
go.mod
View file

@ -1,10 +1,10 @@
module github.com/paypal/gorealis/v2
module github.com/aurora-scheduler/gorealis/v2
require (
git.apache.org/thrift.git v0.12.0
github.com/davecgh/go-spew v1.1.0 // indirect
github.com/pkg/errors v0.0.0-20171216070316-e881fd58d78e
github.com/pmezard/go-difflib v1.0.0 // indirect
github.com/apache/thrift v0.14.0
github.com/pkg/errors v0.9.1
github.com/samuel/go-zookeeper v0.0.0-20171117190445-471cd4e61d7a
github.com/stretchr/testify v1.2.0
github.com/stretchr/testify v1.7.0
)
go 1.16

22
go.sum
View file

@ -1,12 +1,22 @@
git.apache.org/thrift.git v0.12.0 h1:692K1/SsOcQvkvMRMdt60FCq2AvKpuQNM6sIeH3mN4s=
git.apache.org/thrift.git v0.12.0/go.mod h1:fPE2ZNJGynbRyZ4dJvy6G277gSllfV2HJqblrnkyeyg=
github.com/apache/thrift v0.14.0 h1:vqZ2DP42i8th2OsgCcYZkirtbzvpZEFx53LiWDJXIAs=
github.com/apache/thrift v0.14.0/go.mod h1:cp2SuWMxlEZw2r+iP2GNCdIi4C1qmUzdZFSVb+bacwQ=
github.com/davecgh/go-spew v1.1.0 h1:ZDRjVQ15GmhC3fiQ8ni8+OwkZQO4DARzQgrnXU1Liz8=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/pkg/errors v0.0.0-20171216070316-e881fd58d78e h1:+RHxT/gm0O3UF7nLJbdNzAmULvCFt4XfXHWzh3XI/zs=
github.com/pkg/errors v0.0.0-20171216070316-e881fd58d78e/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pkg/errors v0.9.1 h1:FEBLx1zS214owpjy7qsBeixbURkuhQAwrK5UwLGTwt4=
github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/samuel/go-zookeeper v0.0.0-20171117190445-471cd4e61d7a h1:EYL2xz/Zdo0hyqdZMXR4lmT2O11jDLTPCEqIe/FR6W4=
github.com/samuel/go-zookeeper v0.0.0-20171117190445-471cd4e61d7a/go.mod h1:gi+0XIa01GRL2eRQVjQkKGqKF3SF9vZR/HnPullcV2E=
github.com/stretchr/testify v1.2.0 h1:LThGCOvhuJic9Gyd1VBCkhyUXmO8vKaBFvBsJ2k03rg=
github.com/stretchr/testify v1.2.0/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
github.com/stretchr/objx v0.1.0 h1:4G4v2dO3VZwixGIRoQ5Lfboy6nUhCyYzaqnIAPPhYs4=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/testify v1.5.0 h1:DMOzIV76tmoDNE9pX6RSN0aDtCYeCg5VueieJaAo1uw=
github.com/stretchr/testify v1.5.0/go.mod h1:5W2xD1RspED5o8YsWQXVCued0rvSQ+mT+I5cxcmMvtA=
github.com/stretchr/testify v1.7.0 h1:nwc3DEeHmmLAfoZucVR881uASk0Mfjw8xYJ99tb5CcY=
github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405 h1:yhCVgyC4o1eVCa2tZl7eS0r+SDo693bJlVdllGtEeKM=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/yaml.v2 v2.2.2 h1:ZCJp+EgiOT7lHqUV2J862kp8Qj64Jo6az82+3Td9dZw=
gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c h1:dUUwHk2QECo/6vqA44rthZ8ie2QXMNeKRTHCNY2nXvo=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=

helpers.go (new file, 23 lines)

@ -0,0 +1,23 @@
package realis
import (
"context"
"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
)
func (r *Client) JobExists(key aurora.JobKey) (bool, error) {
resp, err := r.client.GetConfigSummary(context.TODO(), &key)
if err != nil {
return false, err
}
return resp != nil &&
resp.GetResult_() != nil &&
resp.GetResult_().GetConfigSummaryResult_() != nil &&
resp.GetResult_().GetConfigSummaryResult_().GetSummary() != nil &&
resp.GetResult_().GetConfigSummaryResult_().GetSummary().GetGroups() != nil &&
len(resp.GetResult_().GetConfigSummaryResult_().GetSummary().GetGroups()) > 0 &&
resp.GetResponseCode() == aurora.ResponseCode_OK,
nil
}
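
As a quick illustration, a minimal sketch of calling the new JobExists helper from application code; the NewClient constructor and the SchedulerUrl option are assumed from parts of the library not shown in this hunk.

package main

import (
    "fmt"
    "log"

    realis "github.com/aurora-scheduler/gorealis/v2"
    "github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
)

func main() {
    // NewClient and SchedulerUrl are assumed from the unchanged parts of the library.
    client, err := realis.NewClient(realis.SchedulerUrl("http://localhost:8081"))
    if err != nil {
        log.Fatal(err)
    }

    // JobExists is the helper added in this change.
    exists, err := client.JobExists(aurora.JobKey{Role: "vagrant", Environment: "prod", Name: "hello_world"})
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println("job exists:", exists)
}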

job.go (25 changed lines)

@ -15,7 +15,7 @@
package realis
import (
"github.com/paypal/gorealis/v2/gen-go/apache/aurora"
"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
)
// Structure to collect all information pertaining to an Aurora job.
@ -136,6 +136,11 @@ func (j *AuroraJob) Disk(disk int64) *AuroraJob {
return j
}
func (j *AuroraJob) GPU(gpu int64) *AuroraJob {
j.task.GPU(gpu)
return j
}
func (j *AuroraJob) Tier(tier string) *AuroraJob {
j.task.Tier(tier)
return j
@ -151,6 +156,16 @@ func (j *AuroraJob) IsService(isService bool) *AuroraJob {
return j
}
func (j *AuroraJob) Priority(priority int32) *AuroraJob {
j.task.Priority(priority)
return j
}
func (j *AuroraJob) Production(production bool) *AuroraJob {
j.task.Production(production)
return j
}
func (j *AuroraJob) TaskConfig() *aurora.TaskConfig {
return j.task.TaskConfig()
}
@ -202,3 +217,11 @@ func (j *AuroraJob) ThermosExecutor(thermos ThermosExecutor) *AuroraJob {
func (j *AuroraJob) BuildThermosPayload() error {
return j.task.BuildThermosPayload()
}
func (j *AuroraJob) PartitionPolicy(reschedule bool, delay int64) *AuroraJob {
j.task.PartitionPolicy(aurora.PartitionPolicy{
Reschedule: reschedule,
DelaySecs: &delay,
})
return j
}
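
For orientation, a sketch chaining the setters added above (GPU, Priority, Production, PartitionPolicy) onto the fluent builder; NewJob and the Environment/Role/Name/CPU/RAM/Disk setters are assumed from the unchanged parts of job.go.

package main

import (
    "fmt"

    realis "github.com/aurora-scheduler/gorealis/v2"
)

func main() {
    // NewJob and the basic setters are assumed to exist elsewhere in job.go;
    // GPU, Priority, Production and PartitionPolicy are the setters added in this change.
    job := realis.NewJob().
        Environment("prod").
        Role("vagrant").
        Name("hello_world").
        CPU(1).
        RAM(64).
        Disk(100).
        GPU(1).
        Tier("preferred").
        Priority(10).
        Production(true).
        PartitionPolicy(true, 30) // reschedule partitioned tasks after a 30 second delay

    fmt.Printf("%+v\n", job.TaskConfig())
}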


@ -17,8 +17,8 @@ package realis
import (
"time"
"git.apache.org/thrift.git/lib/go/thrift"
"github.com/paypal/gorealis/v2/gen-go/apache/aurora"
"github.com/apache/thrift/lib/go/thrift"
"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
)
// Structure to collect all information required to create job update
@ -31,32 +31,36 @@ type JobUpdate struct {
func NewJobUpdate() *JobUpdate {
newTask := NewTask()
req := aurora.JobUpdateRequest{}
req.TaskConfig = newTask.TaskConfig()
req.Settings = newUpdateSettings()
return &JobUpdate{task: newTask, request: &req}
return &JobUpdate{
task: newTask,
request: &aurora.JobUpdateRequest{TaskConfig: newTask.TaskConfig(), Settings: newUpdateSettings()},
}
}
// Creates an update with default values using an AuroraTask as the underlying task configuration.
// This function has a high level understanding of Aurora Tasks and thus will support copying a task that is configured
// to use Thermos.
func JobUpdateFromAuroraTask(task *AuroraTask) *JobUpdate {
newTask := task.Clone()
req := aurora.JobUpdateRequest{}
req.TaskConfig = newTask.TaskConfig()
req.Settings = newUpdateSettings()
return &JobUpdate{task: newTask, request: &req}
return &JobUpdate{
task: newTask,
request: &aurora.JobUpdateRequest{TaskConfig: newTask.TaskConfig(), Settings: newUpdateSettings()},
}
}
// JobUpdateFromConfig creates an update with default values using an aurora.TaskConfig
// primitive as the underlying task configuration.
// This function should not be used unless the implications of using a primitive value are understood.
// For example, the primitive has no concept of Thermos.
func JobUpdateFromConfig(task *aurora.TaskConfig) *JobUpdate {
// Perform a deep copy to avoid unexpected behavior
newTask := TaskFromThrift(task)
req := aurora.JobUpdateRequest{}
req.TaskConfig = newTask.TaskConfig()
req.Settings = newUpdateSettings()
return &JobUpdate{task: newTask, request: &req}
return &JobUpdate{
task: newTask,
request: &aurora.JobUpdateRequest{TaskConfig: newTask.TaskConfig(), Settings: newUpdateSettings()},
}
}
// Set instance count the job will have after the update.
@ -73,7 +77,7 @@ func (j *JobUpdate) BatchSize(size int32) *JobUpdate {
// Minimum number of seconds a shard must remain in RUNNING state before considered a success.
func (j *JobUpdate) WatchTime(timeout time.Duration) *JobUpdate {
j.request.Settings.MinWaitInInstanceRunningMs = int32(timeout.Seconds() * 1000)
j.request.Settings.MinWaitInInstanceRunningMs = int32(timeout.Milliseconds())
return j
}
@ -106,9 +110,42 @@ func (j *JobUpdate) PulseIntervalTimeout(timeout time.Duration) *JobUpdate {
j.request.Settings.BlockIfNoPulsesAfterMs = thrift.Int32Ptr(int32(timeout.Seconds() * 1000))
return j
}
func (j *JobUpdate) BatchUpdateStrategy(autoPause bool, batchSize int32) *JobUpdate {
j.request.Settings.UpdateStrategy = &aurora.JobUpdateStrategy{
BatchStrategy: &aurora.BatchJobUpdateStrategy{GroupSize: batchSize, AutopauseAfterBatch: autoPause},
}
return j
}
func (j *JobUpdate) QueueUpdateStrategy(groupSize int32) *JobUpdate {
j.request.Settings.UpdateStrategy = &aurora.JobUpdateStrategy{
QueueStrategy: &aurora.QueueJobUpdateStrategy{GroupSize: groupSize},
}
return j
}
func (j *JobUpdate) VariableBatchStrategy(autoPause bool, batchSizes ...int32) *JobUpdate {
j.request.Settings.UpdateStrategy = &aurora.JobUpdateStrategy{
VarBatchStrategy: &aurora.VariableBatchJobUpdateStrategy{GroupSizes: batchSizes, AutopauseAfterBatch: autoPause},
}
return j
}
// SlaAware makes the scheduler enforce the SLA Aware policy if the job meets the SLA awareness criteria.
// By default, the scheduler will only apply SLA Awareness to jobs in the production tier with 20 or more instances.
func (j *JobUpdate) SlaAware(slaAware bool) *JobUpdate {
j.request.Settings.SlaAware = &slaAware
return j
}
// AddInstanceRange restricts the update to only touch a specific range of instances
func (j *JobUpdate) AddInstanceRange(first, last int32) *JobUpdate {
j.request.Settings.UpdateOnlyTheseInstances = append(j.request.Settings.UpdateOnlyTheseInstances,
&aurora.Range{First: first, Last: last})
return j
}
func newUpdateSettings() *aurora.JobUpdateSettings {
us := aurora.JobUpdateSettings{}
// Mirrors defaults set by Pystachio
us.UpdateOnlyTheseInstances = []*aurora.Range{}
@ -174,7 +211,7 @@ func (j *JobUpdate) Tier(tier string) *JobUpdate {
return j
}
func (j *JobUpdate) MaxFailure(maxFail int32) *JobUpdate {
func (j *JobUpdate) TaskMaxFailure(maxFail int32) *JobUpdate {
j.task.MaxFailure(maxFail)
return j
}
@ -184,6 +221,16 @@ func (j *JobUpdate) IsService(isService bool) *JobUpdate {
return j
}
func (j *JobUpdate) Priority(priority int32) *JobUpdate {
j.task.Priority(priority)
return j
}
func (j *JobUpdate) Production(production bool) *JobUpdate {
j.task.Production(production)
return j
}
func (j *JobUpdate) TaskConfig() *aurora.TaskConfig {
return j.task.TaskConfig()
}
@ -239,3 +286,11 @@ func (j *JobUpdate) ThermosExecutor(thermos ThermosExecutor) *JobUpdate {
func (j *JobUpdate) BuildThermosPayload() error {
return j.task.BuildThermosPayload()
}
func (j *JobUpdate) PartitionPolicy(reschedule bool, delay int64) *JobUpdate {
j.task.PartitionPolicy(aurora.PartitionPolicy{
Reschedule: reschedule,
DelaySecs: &delay,
})
return j
}
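
A sketch of the new update-strategy and SLA setters in use; WatchTime and AddInstanceRange appear in the hunks above, InstanceCount is assumed from the unchanged parts of the file, and the resulting JobUpdate would typically be handed to the client's update call (realis.go is not shown in this view).

package main

import (
    "fmt"
    "time"

    realis "github.com/aurora-scheduler/gorealis/v2"
)

func main() {
    // InstanceCount is assumed from the unchanged parts of jobupdate.go.
    update := realis.NewJobUpdate().
        InstanceCount(10).
        WatchTime(30 * time.Second).
        VariableBatchStrategy(true, 1, 2, 5). // roll out in batches of 1, 2 and 5, pausing after each batch
        SlaAware(true).                       // let the scheduler enforce SLA awareness where applicable
        AddInstanceRange(0, 9)                // only touch instances 0 through 9

    fmt.Printf("%+v\n", update.TaskConfig())
}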


@ -31,29 +31,49 @@ func (NoopLogger) Println(a ...interface{}) {}
type LevelLogger struct {
Logger
debug bool
trace bool
}
func (l *LevelLogger) EnableDebug(enable bool) {
l.debug = enable
}
func (l *LevelLogger) EnableTrace(enable bool) {
l.trace = enable
}
func (l LevelLogger) DebugPrintf(format string, a ...interface{}) {
if l.debug {
l.Print("[DEBUG] ")
l.Printf(format, a...)
l.Printf("[DEBUG] "+format, a...)
}
}
func (l LevelLogger) DebugPrint(a ...interface{}) {
if l.debug {
l.Print("[DEBUG] ")
l.Print(a...)
l.Print(append([]interface{}{"[DEBUG] "}, a...)...)
}
}
func (l LevelLogger) DebugPrintln(a ...interface{}) {
if l.debug {
l.Print("[DEBUG] ")
l.Println(a...)
l.Println(append([]interface{}{"[DEBUG] "}, a...)...)
}
}
func (l LevelLogger) TracePrintf(format string, a ...interface{}) {
if l.trace {
l.Printf("[TRACE] "+format, a...)
}
}
func (l LevelLogger) TracePrint(a ...interface{}) {
if l.trace {
l.Print(append([]interface{}{"[TRACE] "}, a...)...)
}
}
func (l LevelLogger) TracePrintln(a ...interface{}) {
if l.trace {
l.Println(append([]interface{}{"[TRACE] "}, a...)...)
}
}
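
A small sketch of the new trace level from outside the package, assuming the standard library logger satisfies the package's Logger interface (Print, Printf, Println).

package main

import (
    "log"
    "os"

    realis "github.com/aurora-scheduler/gorealis/v2"
)

func main() {
    // LevelLogger and its EnableDebug/EnableTrace methods are shown in the hunk above.
    l := realis.LevelLogger{Logger: log.New(os.Stdout, "realis: ", log.LstdFlags)}
    l.EnableDebug(true)
    l.EnableTrace(true)

    l.DebugPrintln("debug message")
    l.TracePrintf("thrift call took %v", "1.2s")
}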


@ -18,18 +18,13 @@ package realis
import (
"time"
"github.com/paypal/gorealis/v2/gen-go/apache/aurora"
"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
"github.com/pkg/errors"
)
const (
UpdateFailed = "update failed"
RolledBack = "update rolled back"
Timedout = "timeout"
)
// Polls the scheduler every certain amount of time to see if the update has succeeded
func (c *Client) JobUpdateMonitor(updateKey aurora.JobUpdateKey, interval, timeout time.Duration) (bool, error) {
// MonitorJobUpdate polls the scheduler at the given interval to see if the update has succeeded.
// If the update enters a terminal state other than ROLLED_FORWARD, this function returns an error.
func (c *Client) MonitorJobUpdate(updateKey aurora.JobUpdateKey, interval, timeout time.Duration) (bool, error) {
if interval < 1*time.Second {
interval = interval * time.Second
}
@ -37,65 +32,117 @@ func (c *Client) JobUpdateMonitor(updateKey aurora.JobUpdateKey, interval, timeo
if timeout < 1*time.Second {
timeout = timeout * time.Second
}
updateSummaries, err := c.MonitorJobUpdateQuery(
aurora.JobUpdateQuery{
Key: &updateKey,
Limit: 1,
UpdateStatuses: []aurora.JobUpdateStatus{
aurora.JobUpdateStatus_ROLLED_FORWARD,
aurora.JobUpdateStatus_ROLLED_BACK,
aurora.JobUpdateStatus_ABORTED,
aurora.JobUpdateStatus_ERROR,
aurora.JobUpdateStatus_FAILED,
},
},
interval,
timeout)
if err != nil {
return false, err
}
status := updateSummaries[0].State.Status
c.RealisConfig().logger.Printf("job update status: %v\n", status)
// Rolled forward is the only state in which an update has been successfully updated
// if we encounter an inactive state and it is not at rolled forward, update failed
switch status {
case aurora.JobUpdateStatus_ROLLED_FORWARD:
return true, nil
case aurora.JobUpdateStatus_ROLLED_BACK,
aurora.JobUpdateStatus_ABORTED,
aurora.JobUpdateStatus_ERROR,
aurora.JobUpdateStatus_FAILED:
return false, errors.Errorf("bad terminal state for update: %v", status)
default:
return false, errors.Errorf("unexpected update state: %v", status)
}
}
// MonitorJobUpdateStatus polls the scheduler for information about an update until the update enters one of the
// desired states or until the function times out.
func (c *Client) MonitorJobUpdateStatus(updateKey aurora.JobUpdateKey,
desiredStatuses []aurora.JobUpdateStatus,
interval, timeout time.Duration) (aurora.JobUpdateStatus, error) {
if len(desiredStatuses) == 0 {
return aurora.JobUpdateStatus(-1), errors.New("no desired statuses provided")
}
// Make deep local copy to avoid side effects from job key being manipulated externally.
updateKeyLocal := &aurora.JobUpdateKey{
Job: &aurora.JobKey{
Role: updateKey.Job.GetRole(),
Environment: updateKey.Job.GetEnvironment(),
Name: updateKey.Job.GetName(),
},
ID: updateKey.GetID(),
}
updateQ := aurora.JobUpdateQuery{
Key: &updateKey,
Limit: 1,
Key: updateKeyLocal,
Limit: 1,
UpdateStatuses: desiredStatuses,
}
summary, err := c.MonitorJobUpdateQuery(updateQ, interval, timeout)
if len(summary) > 0 {
return summary[0].State.Status, err
}
return aurora.JobUpdateStatus(-1), err
}
func (c *Client) MonitorJobUpdateQuery(
updateQuery aurora.JobUpdateQuery,
interval time.Duration,
timeout time.Duration) ([]*aurora.JobUpdateSummary, error) {
ticker := time.NewTicker(interval)
defer ticker.Stop()
timer := time.NewTimer(timeout)
defer timer.Stop()
for {
select {
case <-ticker.C:
updateDetail, cliErr := c.JobUpdateDetails(updateQ)
updateSummaryResults, cliErr := c.GetJobUpdateSummaries(&updateQuery)
if cliErr != nil {
return false, cliErr
return nil, cliErr
}
if len(updateDetail) == 0 {
c.RealisConfig().logger.Println("No update found")
return false, errors.New("No update found for " + updateKey.String())
if len(updateSummaryResults.GetUpdateSummaries()) >= 1 {
return updateSummaryResults.GetUpdateSummaries(), nil
}
status := updateDetail[0].Update.Summary.State.Status
// Convert Thrift Set to Golang map for quick lookup
if _, ok := ActiveJobUpdateStates[status]; !ok {
// Rolled forward is the only state in which an update has been successfully updated
// if we encounter an inactive state and it is not at rolled forward, update failed
switch status {
case aurora.JobUpdateStatus_ROLLED_FORWARD:
c.RealisConfig().logger.Println("Update succeeded")
return true, nil
case aurora.JobUpdateStatus_FAILED:
c.RealisConfig().logger.Println("Update failed")
return false, errors.New(UpdateFailed)
case aurora.JobUpdateStatus_ROLLED_BACK:
c.RealisConfig().logger.Println("rolled back")
return false, errors.New(RolledBack)
default:
return false, nil
}
}
case <-timer.C:
return false, errors.New(Timedout)
return nil, newTimedoutError(errors.New("job update monitor timed out"))
}
}
}
// Monitor an AuroraJob until all instances enter one of the LiveStates
func (c *Client) InstancesMonitor(key aurora.JobKey, instances int32, interval, timeout time.Duration) (bool, error) {
return c.ScheduleStatusMonitor(key, instances, aurora.LIVE_STATES, interval, timeout)
func (c *Client) MonitorInstances(key aurora.JobKey, instances int32, interval, timeout time.Duration) (bool, error) {
return c.MonitorScheduleStatus(key, instances, aurora.LIVE_STATES, interval, timeout)
}
// Monitor an AuroraJob until all instances enter a desired status.
// Default sets of desired statuses provided by the thrift API include:
// ActiveStates, SlaveAssignedStates, LiveStates, and TerminalStates
func (c *Client) ScheduleStatusMonitor(key aurora.JobKey, instanceCount int32, desiredStatuses []aurora.ScheduleStatus, interval, timeout time.Duration) (bool, error) {
func (c *Client) MonitorScheduleStatus(key aurora.JobKey,
instanceCount int32,
desiredStatuses []aurora.ScheduleStatus,
interval, timeout time.Duration) (bool, error) {
if interval < 1*time.Second {
interval = interval * time.Second
}
@ -124,14 +171,16 @@ func (c *Client) ScheduleStatusMonitor(key aurora.JobKey, instanceCount int32, d
case <-timer.C:
// If the timer runs out, return a timeout error to user
return false, errors.New(Timedout)
return false, newTimedoutError(errors.New("schedule status monitor timedout"))
}
}
}
// Monitor host status until all hosts match the status provided. Returns a map where the value is true if the host
// is in one of the desired mode(s) or false if it is not as of the time when the monitor exited.
func (c *Client) HostMaintenanceMonitor(hosts []string, modes []aurora.MaintenanceMode, interval, timeout time.Duration) (map[string]bool, error) {
func (c *Client) MonitorHostMaintenance(hosts []string,
modes []aurora.MaintenanceMode,
interval, timeout time.Duration) (map[string]bool, error) {
if interval < 1*time.Second {
interval = interval * time.Second
}
@ -191,7 +240,73 @@ func (c *Client) HostMaintenanceMonitor(hosts []string, modes []aurora.Maintenan
hostResult[host] = false
}
return hostResult, errors.New(Timedout)
return hostResult, newTimedoutError(errors.New("host maintenance monitor timedout"))
}
}
}
// MonitorAutoPausedUpdate is a special monitor for auto pause enabled batch updates. This monitor ensures that the update
// being monitored is capable of auto pausing and has auto pausing enabled. After verifying this information,
// the monitor watches for the job to enter the ROLL_FORWARD_PAUSED state and calculates the current batch
// the update is in using information from the update configuration.
func (c *Client) MonitorAutoPausedUpdate(key aurora.JobUpdateKey, interval, timeout time.Duration) (int, error) {
key.Job = &aurora.JobKey{
Role: key.Job.Role,
Environment: key.Job.Environment,
Name: key.Job.Name,
}
query := aurora.JobUpdateQuery{
UpdateStatuses: aurora.ACTIVE_JOB_UPDATE_STATES,
Limit: 1,
Key: &key,
}
updateDetails, err := c.JobUpdateDetails(query)
if err != nil {
return -1, errors.Wrap(err, "unable to get information about update")
}
if len(updateDetails) == 0 {
return -1, errors.Errorf("details for update could not be found")
}
updateStrategy := updateDetails[0].Update.Instructions.Settings.UpdateStrategy
var batchSizes []int32
switch {
case updateStrategy.IsSetVarBatchStrategy():
batchSizes = updateStrategy.VarBatchStrategy.GroupSizes
if !updateStrategy.VarBatchStrategy.AutopauseAfterBatch {
return -1, errors.Errorf("update does not have auto pause enabled")
}
case updateStrategy.IsSetBatchStrategy():
batchSizes = []int32{updateStrategy.BatchStrategy.GroupSize}
if !updateStrategy.BatchStrategy.AutopauseAfterBatch {
return -1, errors.Errorf("update does not have auto pause enabled")
}
default:
return -1, errors.Errorf("update is not using a batch update strategy")
}
query.UpdateStatuses = append(TerminalUpdateStates(), aurora.JobUpdateStatus_ROLL_FORWARD_PAUSED)
summary, err := c.MonitorJobUpdateQuery(query, interval, timeout)
if err != nil {
return -1, err
}
// Summary 0 is assumed to exist because MonitorJobUpdateQuery will return an error if there are no summaries
if !(summary[0].State.Status == aurora.JobUpdateStatus_ROLL_FORWARD_PAUSED ||
summary[0].State.Status == aurora.JobUpdateStatus_ROLLED_FORWARD) {
return -1, errors.Errorf("update is in a terminal state %v", summary[0].State.Status)
}
updatingInstances := make(map[int32]struct{})
for _, e := range updateDetails[0].InstanceEvents {
// We only care about INSTANCE_UPDATING actions because we only care that they've been attempted
if e != nil && e.GetAction() == aurora.JobUpdateAction_INSTANCE_UPDATING {
updatingInstances[e.GetInstanceId()] = struct{}{}
}
}
return calculateCurrentBatch(int32(len(updatingInstances)), batchSizes), nil
}
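
A usage sketch for the renamed monitors; NewClient and SchedulerUrl are assumed from parts of the library not shown here, and the update ID is a placeholder for the value returned when the update was started.

package main

import (
    "log"
    "time"

    realis "github.com/aurora-scheduler/gorealis/v2"
    "github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
)

func main() {
    // NewClient and SchedulerUrl are assumed from the unchanged parts of the library.
    client, err := realis.NewClient(realis.SchedulerUrl("http://localhost:8081"))
    if err != nil {
        log.Fatal(err)
    }

    key := aurora.JobKey{Role: "vagrant", Environment: "prod", Name: "hello_world"}

    // Wait for 10 live instances, polling every 5s and giving up after 5 minutes.
    live, err := client.MonitorInstances(key, 10, 5*time.Second, 5*time.Minute)
    log.Printf("instances live: %v err: %v", live, err)

    // Wait for a previously started update to reach a terminal state; the ID below is a placeholder.
    updateKey := aurora.JobUpdateKey{Job: &key, ID: "placeholder-update-id"}
    rolledForward, err := client.MonitorJobUpdate(updateKey, 5*time.Second, 10*time.Minute)
    log.Printf("update rolled forward: %v err: %v", rolledForward, err)
}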

offer.go (new file, 434 lines)

@ -0,0 +1,434 @@
/**
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package realis
import (
"bytes"
"crypto/tls"
"encoding/json"
"fmt"
"net/http"
"strings"
"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
)
// Offer represents a single offer returned by the [aurora-scheduler]/offers endpoint
type Offer struct {
ID struct {
Value string `json:"value"`
} `json:"id"`
FrameworkID struct {
Value string `json:"value"`
} `json:"framework_id"`
AgentID struct {
Value string `json:"value"`
} `json:"agent_id"`
Hostname string `json:"hostname"`
URL struct {
Scheme string `json:"scheme"`
Address struct {
Hostname string `json:"hostname"`
IP string `json:"ip"`
Port int `json:"port"`
} `json:"address"`
Path string `json:"path"`
Query []interface{} `json:"query"`
} `json:"url"`
Resources []struct {
Name string `json:"name"`
Type string `json:"type"`
Ranges struct {
Range []struct {
Begin int `json:"begin"`
End int `json:"end"`
} `json:"range"`
} `json:"ranges,omitempty"`
Role string `json:"role"`
Reservations []interface{} `json:"reservations"`
Scalar struct {
Value float64 `json:"value"`
} `json:"scalar,omitempty"`
} `json:"resources"`
Attributes []struct {
Name string `json:"name"`
Type string `json:"type"`
Text struct {
Value string `json:"value"`
} `json:"text"`
} `json:"attributes"`
ExecutorIds []struct {
Value string `json:"value"`
} `json:"executor_ids"`
}
// MaintenanceList represents the hosts returned by the [aurora-scheduler]/maintenance endpoint
type MaintenanceList struct {
Drained []string `json:"DRAINED"`
Scheduled []string `json:"SCHEDULED"`
Draining map[string][]string `json:"DRAINING"`
}
type OfferCount map[float64]int64
type OfferGroupReport map[string]OfferCount
type OfferReport map[string]OfferGroupReport
// MaintenanceHosts lists all the hosts under maintenance
func (c *Client) MaintenanceHosts() ([]string, error) {
tr := &http.Transport{
TLSClientConfig: &tls.Config{InsecureSkipVerify: c.config.insecureSkipVerify},
}
request := &http.Client{Transport: tr}
resp, err := request.Get(fmt.Sprintf("%s/maintenance", c.GetSchedulerURL()))
if err != nil {
return nil, err
}
defer resp.Body.Close()
buf := new(bytes.Buffer)
if _, err := buf.ReadFrom(resp.Body); err != nil {
return nil, err
}
var list MaintenanceList
if err := json.Unmarshal(buf.Bytes(), &list); err != nil {
return nil, err
}
hosts := append(list.Drained, list.Scheduled...)
for drainingHost := range list.Draining {
hosts = append(hosts, drainingHost)
}
return hosts, nil
}
// Offers pulls data from /offers endpoint
func (c *Client) Offers() ([]Offer, error) {
tr := &http.Transport{
TLSClientConfig: &tls.Config{InsecureSkipVerify: c.config.insecureSkipVerify},
}
request := &http.Client{Transport: tr}
resp, err := request.Get(fmt.Sprintf("%s/offers", c.GetSchedulerURL()))
if err != nil {
return []Offer{}, err
}
defer resp.Body.Close()
buf := new(bytes.Buffer)
if _, err := buf.ReadFrom(resp.Body); err != nil {
return nil, err
}
var offers []Offer
if err := json.Unmarshal(buf.Bytes(), &offers); err != nil {
return []Offer{}, err
}
return offers, nil
}
// AvailOfferReport returns a detailed summary of offers available for use.
// For example, 2 nodes offer 32 cpus each and 10 nodes offer 1 cpu each.
func (c *Client) AvailOfferReport() (OfferReport, error) {
maintHosts, err := c.MaintenanceHosts()
if err != nil {
return nil, err
}
maintHostSet := map[string]bool{}
for _, h := range maintHosts {
maintHostSet[h] = true
}
// Get a list of offers
offers, err := c.Offers()
if err != nil {
return nil, err
}
report := OfferReport{}
for _, o := range offers {
if maintHostSet[o.Hostname] {
continue
}
group := "non-dedicated"
for _, a := range o.Attributes {
if a.Name == "dedicated" {
group = a.Text.Value
break
}
}
if _, ok := report[group]; !ok {
report[group] = map[string]OfferCount{}
}
for _, r := range o.Resources {
if _, ok := report[group][r.Name]; !ok {
report[group][r.Name] = OfferCount{}
}
val := 0.0
switch r.Type {
case "SCALAR":
val = r.Scalar.Value
case "RANGES":
for _, pr := range r.Ranges.Range {
val += float64(pr.End - pr.Begin + 1)
}
default:
return nil, fmt.Errorf("%s is not supported", r.Type)
}
report[group][r.Name][val]++
}
}
return report, nil
}
// FitTasks computes the number of tasks that can fit into a list of offers
func (c *Client) FitTasks(taskConfig *aurora.TaskConfig, offers []Offer) (int64, error) {
// count the number of tasks per limit constraint: limit.name -> limit.value -> count
limitCounts := map[string]map[string]int64{}
for _, c := range taskConfig.Constraints {
if c.Constraint.Limit != nil {
limitCounts[c.Name] = map[string]int64{}
}
}
request := ResourcesToMap(taskConfig.Resources)
// validate resource request
if len(request) == 0 {
return -1, fmt.Errorf("Resource request %v must not be empty", request)
}
isValid := false
for _, resVal := range request {
if resVal > 0 {
isValid = true
break
}
}
if !isValid {
return -1, fmt.Errorf("Resource request %v is not valid", request)
}
// pull the list of hosts under maintenance
maintHosts, err := c.MaintenanceHosts()
if err != nil {
return -1, err
}
maintHostSet := map[string]bool{}
for _, h := range maintHosts {
maintHostSet[h] = true
}
numTasks := int64(0)
for _, o := range offers {
// skip the hosts under maintenance
if maintHostSet[o.Hostname] {
continue
}
numTasksPerOffer := int64(-1)
for resName, resVal := range request {
// skip as we can fit an infinite number of tasks with 0 demand.
if resVal == 0 {
continue
}
avail := 0.0
for _, r := range o.Resources {
if r.Name != resName {
continue
}
switch r.Type {
case "SCALAR":
avail = r.Scalar.Value
case "RANGES":
for _, pr := range r.Ranges.Range {
avail += float64(pr.End - pr.Begin + 1)
}
default:
return -1, fmt.Errorf("%s is not supported", r.Type)
}
}
numTasksPerResource := int64(avail / resVal)
if numTasksPerResource < numTasksPerOffer || numTasksPerOffer < 0 {
numTasksPerOffer = numTasksPerResource
}
}
numTasks += fitConstraints(taskConfig, &o, limitCounts, numTasksPerOffer)
}
return numTasks, nil
}
func fitConstraints(taskConfig *aurora.TaskConfig,
offer *Offer,
limitCounts map[string]map[string]int64,
numTasksPerOffer int64) int64 {
// check dedicated attributes vs. constraints
if !isDedicated(offer, taskConfig.Job.Role, taskConfig.Constraints) {
return 0
}
limitConstraints := []*aurora.Constraint{}
for _, c := range taskConfig.Constraints {
// look for corresponding attribute
attFound := false
for _, a := range offer.Attributes {
if a.Name == c.Name {
attFound = true
}
}
// constraint not found in offer's attributes
if !attFound {
return 0
}
if c.Constraint.Value != nil && !valueConstraint(offer, c) {
// value constraint is not satisfied
return 0
} else if c.Constraint.Limit != nil {
limitConstraints = append(limitConstraints, c)
limit := limitConstraint(offer, c, limitCounts)
if numTasksPerOffer > limit && limit >= 0 {
numTasksPerOffer = limit
}
}
}
// update limitCounts
for _, c := range limitConstraints {
for _, a := range offer.Attributes {
if a.Name == c.Name {
limitCounts[a.Name][a.Text.Value] += numTasksPerOffer
}
}
}
return numTasksPerOffer
}
func isDedicated(offer *Offer, role string, constraints []*aurora.Constraint) bool {
// get all dedicated attributes of an offer
dedicatedAtts := map[string]bool{}
for _, a := range offer.Attributes {
if a.Name == "dedicated" {
dedicatedAtts[a.Text.Value] = true
}
}
if len(dedicatedAtts) == 0 {
return true
}
// check if constraints are matching dedicated attributes
matched := false
for _, c := range constraints {
if c.Name == "dedicated" && c.Constraint.Value != nil {
found := false
for _, v := range c.Constraint.Value.Values {
if dedicatedAtts[v] && strings.HasPrefix(v, fmt.Sprintf("%s/", role)) {
found = true
break
}
}
if found {
matched = true
} else {
return false
}
}
}
return matched
}
// valueConstraint checks whether the task's Value Constraints are matched by the offer.
// more details can be found here https://aurora.apache.org/documentation/latest/features/constraints/
func valueConstraint(offer *Offer, constraint *aurora.Constraint) bool {
matched := false
for _, a := range offer.Attributes {
if a.Name == constraint.Name {
for _, v := range constraint.Constraint.Value.Values {
matched = (a.Text.Value == v && !constraint.Constraint.Value.Negated) ||
(a.Text.Value != v && constraint.Constraint.Value.Negated)
if matched {
break
}
}
if matched {
break
}
}
}
return matched
}
// limitConstraint limits the number of tasks that can be placed on a group of hosts sharing the same attribute value.
// more details can be found here https://aurora.apache.org/documentation/latest/features/constraints/
func limitConstraint(offer *Offer, constraint *aurora.Constraint, limitCounts map[string]map[string]int64) int64 {
limit := int64(-1)
for _, a := range offer.Attributes {
// limit constraint found
if a.Name == constraint.Name {
curr := limitCounts[a.Name][a.Text.Value]
currLimit := int64(constraint.Constraint.Limit.Limit)
if curr >= currLimit {
return 0
}
if currLimit-curr < limit || limit < 0 {
limit = currLimit - curr
}
}
}
return limit
}
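
A sketch exercising the new capacity helpers end to end; the client constructor and the NewJob builder used to produce a TaskConfig are assumed from elsewhere in the library.

package main

import (
    "fmt"
    "log"

    realis "github.com/aurora-scheduler/gorealis/v2"
)

func main() {
    // NewClient/SchedulerUrl and the NewJob builder are assumed from the unchanged parts of the library.
    client, err := realis.NewClient(realis.SchedulerUrl("http://localhost:8081"))
    if err != nil {
        log.Fatal(err)
    }

    // Summarize free capacity, grouped by dedicated attribute and resource name.
    report, err := client.AvailOfferReport()
    if err != nil {
        log.Fatal(err)
    }
    fmt.Printf("available offers: %+v\n", report)

    // Estimate how many copies of a task could be packed onto the outstanding offers.
    offers, err := client.Offers()
    if err != nil {
        log.Fatal(err)
    }
    task := realis.NewJob().
        Environment("prod").Role("vagrant").Name("capacity_probe").
        CPU(1).RAM(64).Disk(100).
        TaskConfig()
    fits, err := client.FitTasks(task, offers)
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println("tasks that fit:", fits)
}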

realis.go (953 changed lines)

File diff suppressed because it is too large

realis_admin.go (new file, 294 lines)

@ -0,0 +1,294 @@
/**
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package realis
import (
"context"
"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
"github.com/pkg/errors"
)
// Set a list of nodes to DRAINING. This means nothing will be able to be scheduled on them and any existing
// tasks will be killed and re-scheduled elsewhere in the cluster. Tasks from DRAINING nodes are not guaranteed
// to return to running unless there is enough capacity in the cluster to run them.
func (c *Client) DrainHosts(hosts ...string) ([]*aurora.HostStatus, error) {
if len(hosts) == 0 {
return nil, errors.New("no hosts provided to drain")
}
drainList := aurora.NewHosts()
drainList.HostNames = hosts
c.logger.DebugPrintf("DrainHosts Thrift Payload: %v\n", drainList)
resp, retryErr := c.thriftCallWithRetries(false, func() (*aurora.Response, error) {
return c.adminClient.DrainHosts(context.TODO(), drainList)
},
nil,
)
if retryErr != nil {
return nil, errors.Wrap(retryErr, "unable to recover connection")
}
if resp == nil || resp.GetResult_() == nil || resp.GetResult_().GetDrainHostsResult_() == nil {
return nil, errors.New("unexpected response from scheduler")
}
return resp.GetResult_().GetDrainHostsResult_().GetStatuses(), nil
}
// Start SLA Aware Drain.
// defaultSlaPolicy is the fallback SlaPolicy to use if a task does not have an SlaPolicy.
// After timeoutSecs, tasks will be forcefully drained without checking SLA.
func (c *Client) SLADrainHosts(policy *aurora.SlaPolicy, timeout int64, hosts ...string) ([]*aurora.HostStatus, error) {
if len(hosts) == 0 {
return nil, errors.New("no hosts provided to drain")
}
if policy == nil || policy.CountSetFieldsSlaPolicy() == 0 {
policy = &defaultSlaPolicy
c.logger.Printf("Warning: start draining with default sla policy %v", policy)
}
if timeout < 0 {
c.logger.Printf("Warning: timeout %d secs is invalid, draining with default timeout %d secs",
timeout,
defaultSlaDrainTimeoutSecs)
timeout = defaultSlaDrainTimeoutSecs
}
drainList := aurora.NewHosts()
drainList.HostNames = hosts
c.logger.DebugPrintf("SLADrainHosts Thrift Payload: %v\n", drainList)
resp, retryErr := c.thriftCallWithRetries(false, func() (*aurora.Response, error) {
return c.adminClient.SlaDrainHosts(context.TODO(), drainList, policy, timeout)
},
nil,
)
if retryErr != nil {
return nil, errors.Wrap(retryErr, "unable to recover connection")
}
if resp == nil || resp.GetResult_() == nil || resp.GetResult_().GetDrainHostsResult_() == nil {
return nil, errors.New("unexpected response from scheduler")
}
return resp.GetResult_().GetDrainHostsResult_().GetStatuses(), nil
}
func (c *Client) StartMaintenance(hosts ...string) ([]*aurora.HostStatus, error) {
if len(hosts) == 0 {
return nil, errors.New("no hosts provided to start maintenance on")
}
hostList := aurora.NewHosts()
hostList.HostNames = hosts
c.logger.DebugPrintf("StartMaintenance Thrift Payload: %v\n", hostList)
resp, retryErr := c.thriftCallWithRetries(false, func() (*aurora.Response, error) {
return c.adminClient.StartMaintenance(context.TODO(), hostList)
},
nil,
)
if retryErr != nil {
return nil, errors.Wrap(retryErr, "unable to recover connection")
}
if resp == nil || resp.GetResult_() == nil || resp.GetResult_().GetStartMaintenanceResult_() == nil {
return nil, errors.New("unexpected response from scheduler")
}
return resp.GetResult_().GetStartMaintenanceResult_().GetStatuses(), nil
}
func (c *Client) EndMaintenance(hosts ...string) ([]*aurora.HostStatus, error) {
if len(hosts) == 0 {
return nil, errors.New("no hosts provided to end maintenance on")
}
hostList := aurora.NewHosts()
hostList.HostNames = hosts
c.logger.DebugPrintf("EndMaintenance Thrift Payload: %v\n", hostList)
resp, retryErr := c.thriftCallWithRetries(false, func() (*aurora.Response, error) {
return c.adminClient.EndMaintenance(context.TODO(), hostList)
},
nil,
)
if retryErr != nil {
return nil, errors.Wrap(retryErr, "unable to recover connection")
}
if resp == nil || resp.GetResult_() == nil || resp.GetResult_().GetEndMaintenanceResult_() == nil {
return nil, errors.New("unexpected response from scheduler")
}
return resp.GetResult_().GetEndMaintenanceResult_().GetStatuses(), nil
}
func (c *Client) MaintenanceStatus(hosts ...string) (*aurora.MaintenanceStatusResult_, error) {
if len(hosts) == 0 {
return nil, errors.New("no hosts provided to get maintenance status from")
}
hostList := aurora.NewHosts()
hostList.HostNames = hosts
c.logger.DebugPrintf("MaintenanceStatus Thrift Payload: %v\n", hostList)
// Make thrift call. If we encounter an error sending the call, attempt to reconnect
// and continue trying to resend command until we run out of retries.
resp, retryErr := c.thriftCallWithRetries(false, func() (*aurora.Response, error) {
return c.adminClient.MaintenanceStatus(context.TODO(), hostList)
},
nil,
)
if retryErr != nil {
return nil, errors.Wrap(retryErr, "unable to recover connection")
}
if resp == nil || resp.GetResult_() == nil {
return nil, errors.New("unexpected response from scheduler")
}
return resp.GetResult_().GetMaintenanceStatusResult_(), nil
}
// SetQuota sets a quota aggregate for the given role
// TODO(zircote) Currently investigating an error that is returned from thrift calls that include resources for `NamedPort` and `NumGpu`
func (c *Client) SetQuota(role string, cpu *float64, ramMb *int64, diskMb *int64) error {
ramResource := aurora.NewResource()
ramResource.RamMb = ramMb
cpuResource := aurora.NewResource()
cpuResource.NumCpus = cpu
diskResource := aurora.NewResource()
diskResource.DiskMb = diskMb
quota := aurora.NewResourceAggregate()
quota.Resources = []*aurora.Resource{ramResource, cpuResource, diskResource}
_, retryErr := c.thriftCallWithRetries(false, func() (*aurora.Response, error) {
return c.adminClient.SetQuota(context.TODO(), role, quota)
},
nil,
)
if retryErr != nil {
return errors.Wrap(retryErr, "unable to set role quota")
}
return retryErr
}
// GetQuota returns the resource aggregate for the given role
func (c *Client) GetQuota(role string) (*aurora.GetQuotaResult_, error) {
resp, retryErr := c.thriftCallWithRetries(false, func() (*aurora.Response, error) {
return c.adminClient.GetQuota(context.TODO(), role)
},
nil,
)
if retryErr != nil {
return nil, errors.Wrap(retryErr, "unable to get role quota")
}
if resp == nil || resp.GetResult_() == nil {
return nil, errors.New("unexpected response from scheduler")
}
return resp.GetResult_().GetGetQuotaResult_(), nil
}
// Force Aurora Scheduler to perform a snapshot and write to Mesos log
func (c *Client) Snapshot() error {
_, retryErr := c.thriftCallWithRetries(false, func() (*aurora.Response, error) {
return c.adminClient.Snapshot(context.TODO())
},
nil,
)
if retryErr != nil {
return errors.Wrap(retryErr, "unable to recover connection")
}
return nil
}
// Force Aurora Scheduler to write backup file to a file in the backup directory
func (c *Client) PerformBackup() error {
_, retryErr := c.thriftCallWithRetries(false, func() (*aurora.Response, error) {
return c.adminClient.PerformBackup(context.TODO())
},
nil,
)
if retryErr != nil {
return errors.Wrap(retryErr, "unable to recover connection")
}
return nil
}
// Force an Implicit reconciliation between Mesos and Aurora
func (c *Client) ForceImplicitTaskReconciliation() error {
_, retryErr := c.thriftCallWithRetries(false, func() (*aurora.Response, error) {
return c.adminClient.TriggerImplicitTaskReconciliation(context.TODO())
},
nil,
)
if retryErr != nil {
return errors.Wrap(retryErr, "unable to recover connection")
}
return nil
}
// Force an Explicit reconciliation between Mesos and Aurora
func (c *Client) ForceExplicitTaskReconciliation(batchSize *int32) error {
if batchSize != nil && *batchSize < 1 {
return errors.New("invalid batch size.")
}
settings := aurora.NewExplicitReconciliationSettings()
settings.BatchSize = batchSize
_, retryErr := c.thriftCallWithRetries(false, func() (*aurora.Response, error) {
return c.adminClient.TriggerExplicitTaskReconciliation(context.TODO(), settings)
},
nil,
)
if retryErr != nil {
return errors.Wrap(retryErr, "unable to recover connection")
}
return nil
}
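
A sketch combining the SLA-aware drain above with the maintenance monitor from monitors.go; the host name is a placeholder and the client constructor is assumed.

package main

import (
    "log"
    "time"

    realis "github.com/aurora-scheduler/gorealis/v2"
    "github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
)

func main() {
    client, err := realis.NewClient(realis.SchedulerUrl("http://localhost:8081")) // constructor assumed
    if err != nil {
        log.Fatal(err)
    }

    // A nil policy falls back to the package's default SLA policy, as implemented above;
    // 900 is the timeout in seconds before hosts are drained forcefully.
    statuses, err := client.SLADrainHosts(nil, 900, "agent-1.example.com")
    if err != nil {
        log.Fatal(err)
    }
    log.Printf("drain started: %+v", statuses)

    // Block until the host reports DRAINED or the monitor times out.
    drained, err := client.MonitorHostMaintenance(
        []string{"agent-1.example.com"},
        []aurora.MaintenanceMode{aurora.MaintenanceMode_DRAINED},
        5*time.Second,
        10*time.Minute)
    log.Printf("hosts drained: %v err: %v", drained, err)
}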

realis_config.go (new file, 181 lines)

@ -0,0 +1,181 @@
/**
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package realis
import (
"strings"
"time"
"github.com/apache/thrift/lib/go/thrift"
"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
)
type clientConfig struct {
username, password string
url string
timeout time.Duration
transportProtocol TransportProtocol
cluster *Cluster
backoff Backoff
transport thrift.TTransport
protoFactory thrift.TProtocolFactory
logger *LevelLogger
insecureSkipVerify bool
certsPath string
clientKey, clientCert string
options []ClientOption
debug bool
trace bool
zkOptions []ZKOpt
failOnPermanentErrors bool
}
var defaultBackoff = Backoff{
Steps: 3,
Duration: 10 * time.Second,
Factor: 5.0,
Jitter: 0.1,
}
var defaultSlaPolicy = aurora.SlaPolicy{
PercentageSlaPolicy: &aurora.PercentageSlaPolicy{
Percentage: 66,
DurationSecs: 300,
},
}
const defaultSlaDrainTimeoutSecs = 900
type TransportProtocol int
const (
unsetProtocol TransportProtocol = iota
jsonProtocol
binaryProtocol
)
type ClientOption func(*clientConfig)
// BasicAuth sets the username and password used to authenticate against the scheduler.
func BasicAuth(username, password string) ClientOption {
return func(config *clientConfig) {
config.username = username
config.password = password
}
}
func SchedulerUrl(url string) ClientOption {
return func(config *clientConfig) {
config.url = url
}
}
func Timeout(timeout time.Duration) ClientOption {
return func(config *clientConfig) {
config.timeout = timeout
}
}
func ZKCluster(cluster *Cluster) ClientOption {
return func(config *clientConfig) {
config.cluster = cluster
}
}
func ZKUrl(url string) ClientOption {
opts := []ZKOpt{ZKEndpoints(strings.Split(url, ",")...), ZKPath("/aurora/scheduler")}
return func(config *clientConfig) {
if config.zkOptions == nil {
config.zkOptions = opts
} else {
config.zkOptions = append(config.zkOptions, opts...)
}
}
}
func ThriftJSON() ClientOption {
return func(config *clientConfig) {
config.transportProtocol = jsonProtocol
}
}
func ThriftBinary() ClientOption {
return func(config *clientConfig) {
config.transportProtocol = binaryProtocol
}
}
func BackOff(b Backoff) ClientOption {
return func(config *clientConfig) {
config.backoff = b
}
}
func InsecureSkipVerify(InsecureSkipVerify bool) ClientOption {
return func(config *clientConfig) {
config.insecureSkipVerify = InsecureSkipVerify
}
}
func CertsPath(certspath string) ClientOption {
return func(config *clientConfig) {
config.certsPath = certspath
}
}
func ClientCerts(clientKey, clientCert string) ClientOption {
return func(config *clientConfig) {
config.clientKey, config.clientCert = clientKey, clientCert
}
}
// Use this option if you'd like to override default settings for connecting to Zookeeper.
// See zk.go for what is possible to set as an option.
func ZookeeperOptions(opts ...ZKOpt) ClientOption {
return func(config *clientConfig) {
config.zkOptions = opts
}
}
// Using the word set to avoid name collision with Interface.
func SetLogger(l Logger) ClientOption {
return func(config *clientConfig) {
config.logger = &LevelLogger{Logger: l}
}
}
// Enable debug statements.
func Debug() ClientOption {
return func(config *clientConfig) {
config.debug = true
}
}
// Enable trace statements.
func Trace() ClientOption {
return func(config *clientConfig) {
config.trace = true
}
}
// FailOnPermanentErrors - If the client encounters a connection error the standard library
// considers permanent, stop retrying and return an error to the user.
func FailOnPermanentErrors() ClientOption {
return func(config *clientConfig) {
config.failOnPermanentErrors = true
}
}
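
The options above compose as variadic arguments to the client constructor; NewClient itself is assumed here since realis.go's diff is suppressed in this view.

package main

import (
    "log"
    "time"

    realis "github.com/aurora-scheduler/gorealis/v2"
)

func main() {
    // NewClient is assumed; every option shown here is defined in the hunk above or in retry.go (Backoff).
    client, err := realis.NewClient(
        realis.SchedulerUrl("https://aurora.example.com:8081"),
        realis.BasicAuth("aurora", "secret"),
        realis.ThriftJSON(),
        realis.Timeout(20*time.Second),
        realis.BackOff(realis.Backoff{Steps: 3, Duration: 10 * time.Second, Factor: 5.0, Jitter: 0.1}),
        realis.InsecureSkipVerify(true),
        realis.Debug(),
        realis.Trace(),
        realis.FailOnPermanentErrors(),
    )
    if err != nil {
        log.Fatal(err)
    }
    _ = client
}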

File diff suppressed because it is too large

realis_test.go (new file, 72 lines)

@ -0,0 +1,72 @@
/**
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package realis
import (
"testing"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
)
func TestGetCACerts(t *testing.T) {
certs, err := GetCerts("./examples/certs")
require.NoError(t, err)
assert.Equal(t, len(certs.Subjects()), 2)
}
func TestAuroraURLValidator(t *testing.T) {
t.Run("badURL", func(t *testing.T) {
url, err := validateAuroraAddress("http://badurl.com/badpath")
assert.Empty(t, url)
assert.Error(t, err)
})
t.Run("URLHttp", func(t *testing.T) {
url, err := validateAuroraAddress("http://goodurl.com:8081/api")
assert.Equal(t, "http://goodurl.com:8081/api", url)
assert.NoError(t, err)
})
t.Run("URLHttps", func(t *testing.T) {
url, err := validateAuroraAddress("https://goodurl.com:8081/api")
assert.Equal(t, "https://goodurl.com:8081/api", url)
assert.NoError(t, err)
})
t.Run("URLNoPath", func(t *testing.T) {
url, err := validateAuroraAddress("http://goodurl.com:8081")
assert.Equal(t, "http://goodurl.com:8081/api", url)
assert.NoError(t, err)
})
t.Run("ipAddrNoPath", func(t *testing.T) {
url, err := validateAuroraAddress("http://192.168.1.33:8081")
assert.Equal(t, "http://192.168.1.33:8081/api", url)
assert.NoError(t, err)
})
t.Run("URLNoProtocol", func(t *testing.T) {
url, err := validateAuroraAddress("goodurl.com:8081/api")
assert.Equal(t, "http://goodurl.com:8081/api", url)
assert.NoError(t, err)
})
t.Run("URLNoProtocolNoPathNoPort", func(t *testing.T) {
url, err := validateAuroraAddress("goodurl.com")
assert.Equal(t, "http://goodurl.com:8081/api", url)
assert.NoError(t, err)
})
}


@ -18,7 +18,7 @@ package response
import (
"bytes"
"github.com/paypal/gorealis/v2/gen-go/apache/aurora"
"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
)
// Get key from a response created by a StartJobUpdate call
@ -35,6 +35,10 @@ func ScheduleStatusResult(resp *aurora.Response) *aurora.ScheduleStatusResult_ {
}
func JobUpdateSummaries(resp *aurora.Response) []*aurora.JobUpdateSummary {
if resp == nil || resp.GetResult_() == nil || resp.GetResult_().GetGetJobUpdateSummariesResult_() == nil {
return nil
}
return resp.GetResult_().GetGetJobUpdateSummariesResult_().GetUpdateSummaries()
}

retry.go (213 changed lines)

@ -17,21 +17,20 @@ package realis
import (
"io"
"math/rand"
"net/http"
"net/url"
"strconv"
"strings"
"time"
"git.apache.org/thrift.git/lib/go/thrift"
"github.com/paypal/gorealis/v2/gen-go/apache/aurora"
"github.com/paypal/gorealis/v2/response"
"github.com/apache/thrift/lib/go/thrift"
"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
"github.com/aurora-scheduler/gorealis/v2/response"
"github.com/pkg/errors"
)
// Backoff determines how the retry mechanism should react after each failure and how many failures it should
// tolerate.
type Backoff struct {
Duration time.Duration // the base duration
Factor float64 // Duration is multipled by factor each iteration
Factor float64 // Duration is multiplied by a factor each iteration
Jitter float64 // The amount of jitter applied each iteration
Steps int // Exit with error after this many steps
}
@ -53,22 +52,20 @@ func Jitter(duration time.Duration, maxFactor float64) time.Duration {
// if the loop should be aborted.
type ConditionFunc func() (done bool, err error)
// Modified version of the Kubernetes exponential-backoff code.
// ExponentialBackoff repeats a condition check with exponential backoff.
//
// It checks the condition up to Steps times, increasing the wait by multiplying
// the previous duration by Factor.
// ExponentialBackoff is a modified version of the Kubernetes exponential-backoff code.
// It repeats a condition check with exponential backoff and checks the condition up to
// Steps times, increasing the wait by multiplying the previous duration by Factor.
//
// If Jitter is greater than zero, a random amount of each duration is added
// (between duration and duration*(1+jitter)).
//
// If the condition never returns true, ErrWaitTimeout is returned. Errors
// do not cause the function to return.
func ExponentialBackoff(backoff Backoff, logger Logger, condition ConditionFunc) error {
var err error
var ok bool
var curStep int
duration := backoff.Duration
for curStep = 0; curStep < backoff.Steps; curStep++ {
@ -80,7 +77,7 @@ func ExponentialBackoff(backoff Backoff, logger Logger, condition ConditionFunc)
adjusted = Jitter(duration, backoff.Jitter)
}
logger.Printf("A retriable error occurred during function call, backing off for %v before retrying\n", adjusted)
logger.Printf("A retryable error occurred during function call, backing off for %v before retrying\n", adjusted)
time.Sleep(adjusted)
duration = time.Duration(float64(duration) * backoff.Factor)
}
@ -97,10 +94,9 @@ func ExponentialBackoff(backoff Backoff, logger Logger, condition ConditionFunc)
// If the error is temporary, continue retrying.
if !IsTemporary(err) {
return err
} else {
// Print out the temporary error we experienced.
logger.Println(err)
}
// Print out the temporary error we experienced.
logger.Println(err)
}
}
@ -111,18 +107,28 @@ func ExponentialBackoff(backoff Backoff, logger Logger, condition ConditionFunc)
// Provide more information to the user wherever possible
if err != nil {
return newRetryError(errors.Wrap(err, "ran out of retries"), curStep)
} else {
return newRetryError(errors.New("ran out of retries"), curStep)
}
return newRetryError(errors.New("ran out of retries"), curStep)
}
type auroraThriftCall func() (resp *aurora.Response, err error)
// verifyOnTimeout defines the type of function that will be used to verify whether a Thrift call to the Scheduler
// made it to the scheduler or not. In general, these types of functions will have to interact with the scheduler
// through the very same Thrift API which previously encountered a time-out from the client.
// This means that the functions themselves should be kept to a minimum number of Thrift calls.
// It should also be noted that this is a best effort mechanism and
// is likely to fail for the same reasons that the original call failed.
type verifyOnTimeout func() (*aurora.Response, bool)
// Duplicates the functionality of ExponentialBackoff but is specifically targeted towards ThriftCalls.
func (c *Client) thriftCallWithRetries(thriftCall auroraThriftCall) (*aurora.Response, error) {
func (c *Client) thriftCallWithRetries(returnOnTimeout bool, thriftCall auroraThriftCall,
verifyOnTimeout verifyOnTimeout) (*aurora.Response, error) {
var resp *aurora.Response
var clientErr error
var curStep int
timeouts := 0
backoff := c.config.backoff
duration := backoff.Duration
@ -136,7 +142,10 @@ func (c *Client) thriftCallWithRetries(thriftCall auroraThriftCall) (*aurora.Res
adjusted = Jitter(duration, backoff.Jitter)
}
c.logger.Printf("A retriable error occurred during thrift call, backing off for %v before retry %v\n", adjusted, curStep)
c.logger.Printf(
"A retryable error occurred during thrift call, backing off for %v before retry %v",
adjusted,
curStep)
time.Sleep(adjusted)
duration = time.Duration(float64(duration) * backoff.Factor)
@ -151,90 +160,132 @@ func (c *Client) thriftCallWithRetries(thriftCall auroraThriftCall) (*aurora.Res
resp, clientErr = thriftCall()
c.logger.DebugPrintf("Aurora Thrift Call ended resp: %v clientErr: %v\n", resp, clientErr)
c.logger.TracePrintf("Aurora Thrift Call ended resp: %v clientErr: %v", resp, clientErr)
}()
// Check if our thrift call is returning an error. This is a retriable event as we don't know
// Check if our thrift call is returning an error. This is a retryable event as we don't know
// if it was caused by network issues.
if clientErr != nil {
// Print out the error to the user
c.logger.Printf("Client Error: %v\n", clientErr)
c.logger.Printf("Client Error: %v", clientErr)
// Determine if error is a temporary URL error by going up the stack
e, ok := clientErr.(thrift.TTransportException)
if ok {
c.logger.DebugPrint("Encountered a transport exception")
temporary, timedout := isConnectionError(clientErr)
if !temporary && c.RealisConfig().failOnPermanentErrors {
return nil, errors.Wrap(clientErr, "permanent connection error")
}
// TODO(rdelvalle): Figure out a better way to obtain the error code as this is a very brittle solution
// 401 Unauthorized means the wrong username and password were provided
if strings.Contains(e.Error(), strconv.Itoa(http.StatusUnauthorized)) {
return nil, errors.Wrap(clientErr, "wrong username or password provided")
}
e, ok := e.Err().(*url.Error)
if ok {
// EOF error occurs when the server closes the read buffer of the client. This is common
// when the server is overloaded and should be retried. All other errors that are permanent
// will not be retried.
if e.Err != io.EOF && !e.Temporary() {
return nil, errors.Wrap(clientErr, "Permanent connection error")
}
}
// There exists a corner case where thrift payload was received by Aurora but
// connection timed out before Aurora was able to reply.
// Users can take special action on a timeout by using IsTimedout and reacting accordingly
// if they have configured the client to return on a timeout.
if timedout && returnOnTimeout {
return resp, newTimedoutError(errors.New("client connection closed before server answer"))
}
// In the future, reestablish connection should be able to check if it is actually possible
// to make a thrift call to Aurora. For now, a reconnect should always lead to a retry.
c.ReestablishConn()
} else {
// If there was no client error, but the response is nil, something went wrong.
// Ideally, we'll never encounter this but we're placing a safeguard here.
if resp == nil {
return nil, errors.New("Response from aurora is nil")
// Ignoring error due to the fact that an error should be retried regardless
reestablishErr := c.ReestablishConn()
if reestablishErr != nil {
c.logger.DebugPrintf("error re-establishing connection ", reestablishErr)
}
// Check Response Code from thrift and make a decision to continue retrying or not.
switch responseCode := resp.GetResponseCode(); responseCode {
// If users did not opt for a return on timeout in order to react to a timedout error,
// attempt to verify that the call made it to the scheduler after the connection was re-established.
if timedout {
timeouts++
c.logger.DebugPrintf(
"Client closed connection %d times before server responded, "+
"consider increasing connection timeout",
timeouts)
// If the thrift call succeeded, stop retrying
case aurora.ResponseCode_OK:
return resp, nil
// If the response code is transient, continue retrying
case aurora.ResponseCode_ERROR_TRANSIENT:
c.logger.Println("Aurora replied with Transient error code, retrying")
continue
// Failure scenarios, these indicate a bad payload or a bad clientConfig. Stop retrying.
case aurora.ResponseCode_INVALID_REQUEST,
aurora.ResponseCode_ERROR,
aurora.ResponseCode_AUTH_FAILED,
aurora.ResponseCode_JOB_UPDATING_ERROR:
c.logger.Printf("Terminal Response Code %v from Aurora, won't retry\n", resp.GetResponseCode().String())
return resp, errors.New(response.CombineMessage(resp))
// The only case that should fall down to here is a WARNING response code.
// It is currently not used as a response in the scheduler so it is unknown how to handle it.
default:
c.logger.DebugPrintf("unhandled response code %v received from Aurora\n", responseCode)
return nil, errors.Errorf("unhandled response code from Aurora %v\n", responseCode.String())
// Allow caller to provide a function which checks if the original call was successful before
// it timed out.
if verifyOnTimeout != nil {
if verifyResp, ok := verifyOnTimeout(); ok {
c.logger.Print("verified that the call went through successfully after a client timeout")
// Response here might be different than the original as it is no longer constructed
// by the scheduler but mimicked.
// This is OK since the scheduler is very unlikely to change responses at this point in its
// development cycle but we must be careful to not return an incorrectly constructed response.
return verifyResp, nil
}
}
}
// Retry the thrift payload
continue
}
// If there was no client error, but the response is nil, something went wrong.
// Ideally, we'll never encounter this but we're placing a safeguard here.
if resp == nil {
return nil, errors.New("response from aurora is nil")
}
// Check Response Code from thrift and make a decision to continue retrying or not.
switch responseCode := resp.GetResponseCode(); responseCode {
// If the thrift call succeeded, stop retrying
case aurora.ResponseCode_OK:
return resp, nil
// If the response code is transient, continue retrying
case aurora.ResponseCode_ERROR_TRANSIENT:
c.logger.Println("Aurora replied with Transient error code, retrying")
continue
// Failure scenarios, these indicate a bad payload or a bad clientConfig. Stop retrying.
case aurora.ResponseCode_INVALID_REQUEST,
aurora.ResponseCode_ERROR,
aurora.ResponseCode_AUTH_FAILED,
aurora.ResponseCode_JOB_UPDATING_ERROR:
c.logger.Printf("Terminal Response Code %v from Aurora, won't retry\n", resp.GetResponseCode().String())
return resp, errors.New(response.CombineMessage(resp))
// The only case that should fall down to here is a WARNING response code.
// It is currently not used as a response in the scheduler so it is unknown how to handle it.
default:
c.logger.DebugPrintf("unhandled response code %v received from Aurora\n", responseCode)
return nil, errors.Errorf("unhandled response code from Aurora %v", responseCode.String())
}
}
c.logger.DebugPrintf("it took %v retries to complete this operation\n", curStep)
if curStep > 1 {
c.config.logger.Printf("retried this thrift call %d time(s)", curStep)
c.config.logger.Printf("this thrift call was retried %d time(s)", curStep)
}
// Provide more information to the user wherever possible.
if clientErr != nil {
return nil, newRetryError(errors.Wrap(clientErr, "ran out of retries, including latest error"), curStep)
} else {
return nil, newRetryError(errors.New("ran out of retries"), curStep)
}
return nil, newRetryError(errors.New("ran out of retries"), curStep)
}
// isConnectionError processes the error received by the client.
// The return values indicate whether this was determined to be a temporary error
// and whether it was determined to be a timeout error
func isConnectionError(err error) (bool, bool) {
// Determine if error is a temporary URL error by going up the stack
transportException, ok := err.(thrift.TTransportException)
if !ok {
return false, false
}
urlError, ok := transportException.Err().(*url.Error)
if !ok {
return false, false
}
// EOF error occurs when the server closes the read buffer of the client. This is common
// when the server is overloaded and we consider it temporary.
// All other errors that do not report themselves as temporary via the
// Temporary() method are treated as permanent.
if urlError.Err != io.EOF && !urlError.Temporary() {
return false, false
}
return true, urlError.Timeout()
}
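A small sketch of how isConnectionError classifies errors, written as a test in the realis package. It assumes the thrift library's NewTTransportExceptionFromError helper and the testify assertions already used elsewhere in this diff; the URL is a placeholder.

package realis

import (
	"io"
	"net/url"
	"testing"

	"github.com/apache/thrift/lib/go/thrift"
	"github.com/stretchr/testify/assert"
)

// Sketch test (not part of this diff): an EOF wrapped in a *url.Error inside a thrift
// transport exception should be reported as temporary (retryable) but not as a timeout.
func TestIsConnectionErrorSketch(t *testing.T) {
	urlErr := &url.Error{Op: "Post", URL: "http://aurora.example.org:8081/api", Err: io.EOF}
	temporary, timedOut := isConnectionError(thrift.NewTTransportExceptionFromError(urlErr))
	assert.True(t, temporary)
	assert.False(t, timedOut)
}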

2
runTestsMac.sh Executable file → Normal file

@ -1,4 +1,4 @@
#!/bin/bash
# Our docker-compose setup runs in bridge mode so that it works on macOS, so the tests must be launched from a Docker container attached to the same bridge network to avoid routing issues.
docker run -t -v $(pwd):/go/src/github.com/paypal/gorealis --network gorealis_aurora_cluster golang:1.11.3-stretch go test -v github.com/paypal/gorealis
docker run --rm -t -w /gorealis -v $GOPATH/pkg:/go/pkg -v $(pwd):/gorealis --network gorealis_aurora_cluster golang:1.17-buster go test -v github.com/aurora-scheduler/gorealis/v2 $@

70
task.go

@ -18,8 +18,8 @@ import (
"encoding/json"
"strconv"
"git.apache.org/thrift.git/lib/go/thrift"
"github.com/paypal/gorealis/v2/gen-go/apache/aurora"
"github.com/apache/thrift/lib/go/thrift"
"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
)
type ResourceType int
@ -47,17 +47,12 @@ func NewTask() *AuroraTask {
numCpus := &aurora.Resource{}
ramMb := &aurora.Resource{}
diskMb := &aurora.Resource{}
numGpus := &aurora.Resource{}
numCpus.NumCpus = new(float64)
ramMb.RamMb = new(int64)
diskMb.DiskMb = new(int64)
resources := make(map[ResourceType]*aurora.Resource)
resources[CPU] = numCpus
resources[RAM] = ramMb
resources[DISK] = diskMb
resources[GPU] = numGpus
resources := map[ResourceType]*aurora.Resource{CPU: numCpus, RAM: ramMb, DISK: diskMb}
return &AuroraTask{task: &aurora.TaskConfig{
Job: &aurora.JobKey{},
@ -83,18 +78,31 @@ func TaskFromThrift(config *aurora.TaskConfig) *AuroraTask {
Role(config.Job.Role).
Name(config.Job.Name).
MaxFailure(config.MaxTaskFailures).
IsService(config.IsService)
IsService(config.IsService).
Priority(config.Priority)
if config.Tier != nil {
newTask.Tier(*config.Tier)
}
if config.Production != nil {
newTask.Production(*config.Production)
}
if config.ExecutorConfig != nil {
newTask.
ExecutorName(config.ExecutorConfig.Name).
ExecutorData(config.ExecutorConfig.Data)
}
if config.PartitionPolicy != nil {
newTask.PartitionPolicy(
aurora.PartitionPolicy{
Reschedule: config.PartitionPolicy.Reschedule,
DelaySecs: thrift.Int64Ptr(*config.PartitionPolicy.DelaySecs),
})
}
// Make a deep copy of the task's container
if config.Container != nil {
if config.Container.Mesos != nil {
@ -130,7 +138,10 @@ func TaskFromThrift(config *aurora.TaskConfig) *AuroraTask {
// Copy only ports. Set CPU, RAM, DISK, and GPU
if resource != nil {
if resource.NamedPort != nil {
newTask.task.Resources = append(newTask.task.Resources, &aurora.Resource{NamedPort: thrift.StringPtr(*resource.NamedPort)})
newTask.task.Resources = append(
newTask.task.Resources,
&aurora.Resource{NamedPort: thrift.StringPtr(*resource.NamedPort)},
)
newTask.portCount++
}
@ -160,7 +171,9 @@ func TaskFromThrift(config *aurora.TaskConfig) *AuroraTask {
taskConstraint := constraint.Constraint
if taskConstraint.Limit != nil {
newConstraint.Constraint = &aurora.TaskConstraint{Limit: &aurora.LimitConstraint{Limit: taskConstraint.Limit.Limit}}
newConstraint.Constraint = &aurora.TaskConstraint{
Limit: &aurora.LimitConstraint{Limit: taskConstraint.Limit.Limit},
}
newTask.task.Constraints = append(newTask.task.Constraints, &newConstraint)
} else if taskConstraint.Value != nil {
@ -187,7 +200,11 @@ func TaskFromThrift(config *aurora.TaskConfig) *AuroraTask {
for _, uri := range config.MesosFetcherUris {
newTask.task.MesosFetcherUris = append(
newTask.task.MesosFetcherUris,
&aurora.MesosFetcherURI{Value: uri.Value, Extract: thrift.BoolPtr(*uri.Extract), Cache: thrift.BoolPtr(*uri.Cache)})
&aurora.MesosFetcherURI{
Value: uri.Value,
Extract: thrift.BoolPtr(*uri.Extract),
Cache: thrift.BoolPtr(*uri.Cache),
})
}
return newTask
@ -247,7 +264,14 @@ func (t *AuroraTask) Disk(disk int64) *AuroraTask {
}
func (t *AuroraTask) GPU(gpu int64) *AuroraTask {
*t.resources[GPU].NumGpus = gpu
// GPU resource must be set explicitly since the scheduler by default
// rejects jobs with GPU resources attached to it.
if _, ok := t.resources[GPU]; !ok {
t.resources[GPU] = &aurora.Resource{}
t.task.Resources = append(t.task.Resources, t.resources[GPU])
}
t.resources[GPU].NumGpus = &gpu
return t
}
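Because a GPU resource is only attached when explicitly requested (the comment above notes the scheduler rejects GPU-bearing jobs by default), a task that needs GPUs must call GPU() on the builder. A sketch using only builder methods shown in this diff; the role, name, and sizes are illustrative:

package main

import (
	"fmt"

	realis "github.com/aurora-scheduler/gorealis/v2"
)

func main() {
	// Illustrative values; only the GPU(1) call attaches a GPU resource to the task.
	task := realis.NewTask().
		Role("vagrant").
		Name("gpu_job").
		RAM(64).
		Disk(100).
		GPU(1).
		IsService(true)
	fmt.Printf("%+v\n", task)
}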
@ -268,6 +292,17 @@ func (t *AuroraTask) IsService(isService bool) *AuroraTask {
return t
}
// Set priority for preemption or priority-queueing.
func (t *AuroraTask) Priority(priority int32) *AuroraTask {
t.task.Priority = priority
return t
}
func (t *AuroraTask) Production(production bool) *AuroraTask {
t.task.Production = &production
return t
}
// Add a list of URIs with the same extract and cache configuration. Scheduler must have
// --enable_mesos_fetcher flag enabled. Currently there is no duplicate detection.
func (t *AuroraTask) AddURIs(extract bool, cache bool, values ...string) *AuroraTask {
@ -407,7 +442,7 @@ func (t *AuroraTask) BuildThermosPayload() error {
t.thermos.disk(*t.resources[DISK].DiskMb)
}
if t.resources[GPU].NumGpus != nil {
if t.resources[GPU] != nil && t.resources[GPU].NumGpus != nil {
t.thermos.gpu(*t.resources[GPU].NumGpus)
}
@ -421,3 +456,10 @@ func (t *AuroraTask) BuildThermosPayload() error {
}
return nil
}
// Set a partition policy for the job configuration to implement.
func (t *AuroraTask) PartitionPolicy(policy aurora.PartitionPolicy) *AuroraTask {
t.task.PartitionPolicy = &policy
return t
}
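The setters introduced in this diff (Priority, Production, Tier, PartitionPolicy) chain onto the same builder as the existing ones, and AddURIs still requires the scheduler's --enable_mesos_fetcher flag, per the comment above. A sketch with illustrative values, assuming the v2 import paths used elsewhere in this diff:

package main

import (
	"github.com/apache/thrift/lib/go/thrift"
	realis "github.com/aurora-scheduler/gorealis/v2"
	"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
)

func main() {
	// Illustrative values only; the scheduler must run with --enable_mesos_fetcher
	// for the AddURIs call to be honored.
	_ = realis.NewTask().
		Role("vagrant").
		Name("partitioned_job").
		RAM(128).
		Disk(100).
		Tier("preferred").
		Production(true).
		Priority(2).
		AddURIs(true, true, "http://example.org/archive.tar.gz").
		PartitionPolicy(aurora.PartitionPolicy{
			Reschedule: true,
			DelaySecs:  thrift.Int64Ptr(30),
		})
}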


@ -17,8 +17,8 @@ package realis_test
import (
"testing"
realis "github.com/paypal/gorealis/v2"
"github.com/paypal/gorealis/v2/gen-go/apache/aurora"
realis "github.com/aurora-scheduler/gorealis/v2"
"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
"github.com/stretchr/testify/assert"
)
@ -34,6 +34,8 @@ func TestAuroraTask_Clone(t *testing.T) {
RAM(643).
Disk(1000).
IsService(true).
Priority(1).
Production(false).
AddPorts(10).
Tier("preferred").
MaxFailure(23).


@ -4,7 +4,7 @@ import (
"encoding/json"
"testing"
"git.apache.org/thrift.git/lib/go/thrift"
"github.com/apache/thrift/lib/go/thrift"
"github.com/stretchr/testify/assert"
)

61
util.go

@ -4,7 +4,7 @@ import (
"net/url"
"strings"
"github.com/paypal/gorealis/v2/gen-go/apache/aurora"
"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
"github.com/pkg/errors"
)
@ -40,14 +40,26 @@ func init() {
}
}
func validateAndPopulateAuroraURL(urlStr string) (string, error) {
// TerminalUpdateStates returns a slice containing all the terminal states an update may be in.
// This is a function in order to avoid having a slice that can be accidentally mutated.
func TerminalUpdateStates() []aurora.JobUpdateStatus {
return []aurora.JobUpdateStatus{
aurora.JobUpdateStatus_ROLLED_FORWARD,
aurora.JobUpdateStatus_ROLLED_BACK,
aurora.JobUpdateStatus_ABORTED,
aurora.JobUpdateStatus_ERROR,
aurora.JobUpdateStatus_FAILED,
}
}
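TerminalUpdateStates is convenient when polling an update until it can no longer change state. A sketch helper, assuming the v2 import paths; it is not part of this diff:

package main

import (
	"fmt"

	realis "github.com/aurora-scheduler/gorealis/v2"
	"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
)

// isTerminal is a sketch helper reporting whether an update status can no longer change.
func isTerminal(status aurora.JobUpdateStatus) bool {
	for _, terminal := range realis.TerminalUpdateStates() {
		if status == terminal {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(isTerminal(aurora.JobUpdateStatus_ROLLED_FORWARD))  // true
	fmt.Println(isTerminal(aurora.JobUpdateStatus_ROLLING_FORWARD)) // false
}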
func validateAuroraAddress(address string) (string, error) {
// If no protocol defined, assume http
if !strings.Contains(urlStr, "://") {
urlStr = "http://" + urlStr
if !strings.Contains(address, "://") {
address = "http://" + address
}
u, err := url.Parse(urlStr)
u, err := url.Parse(address)
if err != nil {
return "", errors.Wrap(err, "error parsing url")
@ -73,3 +85,42 @@ func validateAndPopulateAuroraURL(urlStr string) (string, error) {
return u.String(), nil
}
func calculateCurrentBatch(updatingInstances int32, batchSizes []int32) int {
for i, size := range batchSizes {
updatingInstances -= size
if updatingInstances <= 0 {
return i
}
}
// Overflow batches
batchCount := len(batchSizes) - 1
lastBatchIndex := len(batchSizes) - 1
batchCount += int(updatingInstances / batchSizes[lastBatchIndex])
if updatingInstances%batchSizes[lastBatchIndex] != 0 {
batchCount++
}
return batchCount
}
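For example, with updatingInstances = 5 and batchSizes = [1, 2], the loop consumes 1 and then 2 instances, leaving 2; the overflow logic then starts at the last in-range index (1) and adds 2 / 2 = 1 further batch with no remainder, so the function returns batch index 2. This matches the moreInstancesThanBatches case in the new util_test.go below.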
func ResourcesToMap(resources []*aurora.Resource) map[string]float64 {
result := map[string]float64{}
for _, resource := range resources {
if resource.NumCpus != nil {
result["cpus"] += *resource.NumCpus
} else if resource.RamMb != nil {
result["mem"] += float64(*resource.RamMb)
} else if resource.DiskMb != nil {
result["disk"] += float64(*resource.DiskMb)
} else if resource.NamedPort != nil {
result["ports"]++
} else if resource.NumGpus != nil {
result["gpus"] += float64(*resource.NumGpus)
}
}
return result
}
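ResourcesToMap flattens a task's resource vector into per-type totals keyed by the Mesos resource names used above (cpus, mem, disk, ports, gpus). A sketch with illustrative values, assuming the v2 import paths and the thrift pointer helpers already used in this diff:

package main

import (
	"fmt"

	"github.com/apache/thrift/lib/go/thrift"
	realis "github.com/aurora-scheduler/gorealis/v2"
	"github.com/aurora-scheduler/gorealis/v2/gen-go/apache/aurora"
)

func main() {
	// Illustrative resource vector: 1.5 CPUs, 1024 MB of RAM, and two named ports.
	totals := realis.ResourcesToMap([]*aurora.Resource{
		{NumCpus: thrift.Float64Ptr(1.5)},
		{RamMb: thrift.Int64Ptr(1024)},
		{NamedPort: thrift.StringPtr("http")},
		{NamedPort: thrift.StringPtr("admin")},
	})
	fmt.Println(totals) // map[cpus:1.5 mem:1024 ports:2]
}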

58
util_test.go Normal file

@ -0,0 +1,58 @@
/**
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package realis
import (
"testing"
"github.com/stretchr/testify/assert"
)
func TestCurrentBatchCalculator(t *testing.T) {
t.Run("singleBatchOverflow", func(t *testing.T) {
curBatch := calculateCurrentBatch(10, []int32{2})
assert.Equal(t, 4, curBatch)
})
t.Run("noInstancesUpdating", func(t *testing.T) {
curBatch := calculateCurrentBatch(0, []int32{2})
assert.Equal(t, 0, curBatch)
})
t.Run("evenMatchSingleBatch", func(t *testing.T) {
curBatch := calculateCurrentBatch(2, []int32{2})
assert.Equal(t, 0, curBatch)
})
t.Run("moreInstancesThanBatches", func(t *testing.T) {
curBatch := calculateCurrentBatch(5, []int32{1, 2})
assert.Equal(t, 2, curBatch)
})
t.Run("moreInstancesThanBatchesDecreasing", func(t *testing.T) {
curBatch := calculateCurrentBatch(5, []int32{2, 1})
assert.Equal(t, 3, curBatch)
})
t.Run("unevenFit", func(t *testing.T) {
curBatch := calculateCurrentBatch(2, []int32{1, 2})
assert.Equal(t, 1, curBatch)
})
t.Run("halfWay", func(t *testing.T) {
curBatch := calculateCurrentBatch(1, []int32{1, 2})
assert.Equal(t, 0, curBatch)
})
}


@ -1,56 +0,0 @@
---
Language: Cpp
# BasedOnStyle: LLVM
AccessModifierOffset: -2
ConstructorInitializerIndentWidth: 2
AlignEscapedNewlinesLeft: false
AlignTrailingComments: true
AllowAllParametersOfDeclarationOnNextLine: false
AllowShortBlocksOnASingleLine: false
AllowShortIfStatementsOnASingleLine: false
AllowShortLoopsOnASingleLine: false
AllowShortFunctionsOnASingleLine: Inline
AlwaysBreakTemplateDeclarations: true
AlwaysBreakBeforeMultilineStrings: true
BreakBeforeBinaryOperators: true
BreakBeforeTernaryOperators: true
BreakConstructorInitializersBeforeComma: false
BinPackParameters: false
ColumnLimit: 100
ConstructorInitializerAllOnOneLineOrOnePerLine: true
DerivePointerAlignment: false
IndentCaseLabels: false
IndentWrappedFunctionNames: false
IndentFunctionDeclarationAfterType: false
MaxEmptyLinesToKeep: 1
KeepEmptyLinesAtTheStartOfBlocks: true
NamespaceIndentation: None
ObjCSpaceAfterProperty: false
ObjCSpaceBeforeProtocolList: true
PenaltyBreakBeforeFirstCallParameter: 190
PenaltyBreakComment: 300
PenaltyBreakString: 10000
PenaltyBreakFirstLessLess: 120
PenaltyExcessCharacter: 1000000
PenaltyReturnTypeOnItsOwnLine: 1200
PointerAlignment: Left
SpacesBeforeTrailingComments: 1
Cpp11BracedListStyle: true
Standard: Auto
IndentWidth: 2
TabWidth: 4
UseTab: Never
BreakBeforeBraces: Attach
SpacesInParentheses: false
SpacesInAngles: false
SpaceInEmptyParentheses: false
SpacesInCStyleCastParentheses: false
SpacesInContainerLiterals: true
SpaceBeforeAssignmentOperators: true
ContinuationIndentWidth: 4
CommentPragmas: '^ IWYU pragma:'
ForEachMacros: [ foreach, Q_FOREACH, BOOST_FOREACH ]
SpaceBeforeParens: ControlStatements
DisableFormat: false
...


@ -1 +0,0 @@
.git/


@ -1,112 +0,0 @@
#
## Licensed to the Apache Software Foundation (ASF) under one
## or more contributor license agreements. See the NOTICE file
## distributed with this work for additional information
## regarding copyright ownership. The ASF licenses this file
## to you under the Apache License, Version 2.0 (the
## "License"); you may not use this file except in compliance
## with the License. You may obtain a copy of the License at
##
## http://www.apache.org/licenses/LICENSE-2.0
##
## Unless required by applicable law or agreed to in writing,
## software distributed under the License is distributed on an
## "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
## KIND, either express or implied. See the License for the
## specific language governing permissions and limitations
## under the License.
##
#
# EditorConfig: http://editorconfig.org
# see doc/coding_standards.md
root = true
[*]
end_of_line = lf
charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true
# ActionScript
# [*.as]
# C
# [*.c]
# C++
[*.cpp]
indent_style = space
indent_size = 2
# C-Sharp
# [*.cs]
# D
# [*.d]
# Erlang
# [*.erl]
# Go-lang
[*.go]
indent_style = tab
indent_size = 8
# C header files
# [*.h]
# Haskell
# [*.hs]
# Haxe
# [*.hx]
# Java
# [*.java]
# Javascript
[*.js]
indent_style = space
indent_size = 2
# JSON
[*.json]
indent_style = space
indent_size = 2
# Lua
# [*.lua]
[*.markdown]
indent_style = space
trim_trailing_whitespace = false
[*.md]
indent_style = space
trim_trailing_whitespace = false
# OCaml
# [*.ml]
# Delphi Pascal
# [*.pas]
# PHP
# [*.php]
# Perl
# [*.pm]
# Python
# [*.py]
# Ruby
# [*.rb]
# Typescript
# [*.ts]
# XML
# [*.xml]


@ -1,9 +0,0 @@
# TODO: Use eslint on js lib and generated code
# Ignore lib/js for now, which uses jshint currently
lib/js/*
# Ignore all generated code for now
**/gen-*
# Don't lint nested node_modules
**/node_modules


@ -1,24 +0,0 @@
{
"env": {
"es6": true,
"node": true
},
"extends": [
"eslint:recommended",
"plugin:prettier/recommended"
],
"parserOptions": {
"ecmaVersion": 2017
},
"rules": {
"no-console": "off",
"no-var": "error",
"prefer-const": "error",
"no-constant-condition": [
"error",
{
"checkLoops": false
}
]
}
}


@ -1 +0,0 @@
* text=auto


@ -1,59 +0,0 @@
# Configuration for probot-stale - https://github.com/probot/stale
# Number of days of inactivity before an Issue or Pull Request becomes stale
daysUntilStale: 60
# Number of days of inactivity before an Issue or Pull Request with the stale label is closed.
# Set to false to disable. If disabled, issues still need to be closed manually, but will remain marked as stale.
daysUntilClose: 7
# Issues or Pull Requests with these labels will never be considered stale. Set to `[]` to disable
exemptLabels:
- Do Not Merge
- blocked
- pinned
- security
- "[Status] Maybe Later"
# Set to true to ignore issues in a project (defaults to false)
exemptProjects: false
# Set to true to ignore issues in a milestone (defaults to false)
exemptMilestones: false
# Label to use when marking as stale
staleLabel: wontfix
# Comment to post when marking as stale. Set to `false` to disable
markComment: >
This issue has been automatically marked as stale because it has not had
recent activity. It will be closed in 7 days if no further activity occurs.
Thank you for your contributions.
# Comment to post when removing the stale label.
unmarkComment: >
This issue is no longer stale.
Thank you for your contributions.
# Comment to post when closing a stale Issue or Pull Request.
closeComment: >
This issue has been automatically closed due to inactivity.
Thank you for your contributions.
# Limit the number of actions per hour, from 1-30. Default is 30
limitPerRun: 30
# Limit to only `issues` or `pulls`
# only: issues
# Optionally, specify configuration settings that are specific to just 'issues' or 'pulls':
# pulls:
# daysUntilStale: 30
# markComment: >
# This pull request has been automatically marked as stale because it has not had
# recent activity. It will be closed if no further activity occurs. Thank you
# for your contributions.
# issues:
# exemptLabels:
# - confirmed


@ -1,395 +0,0 @@
# generic ignores
*.la
*.lo
*.o
*.deps
*.dirstamp
*.libs
*.log
*.trs
*.suo
*.pyc
*.cache
*.user
*.ipch
*.sdf
*.jar
*.exe
*.dll
*_ReSharper*
*.opensdf
*.swp
*.hi
*~
tags
.*project
.classpath
.dub
.settings
.checkstyle
junit*.properties
.idea
*.iml
*.ipr
*.iws
gen-*
Makefile
Makefile.in
aclocal.m4
acinclude.m4
apache-thrift-test-library
autom4te.cache
cmake-*
dub.selections.json
libapache-thrift.a
node_modules
compile
test-driver
erl_crash.dump
project.lock.json
.sonar
.DS_Store
.svn
.vagrant
.vscode
.vs
/aclocal/libtool.m4
/aclocal/lt*.m4
/autoscan.log
/autoscan-*.log
/cmake_*
/compiler/cpp/compiler.VC.db
/compiler/cpp/compiler.VC.VC.opendb
/compiler/cpp/test/plugin/t_cpp_generator.cc
/compiler/cpp/src/thrift/plugin/plugin_constants.cpp
/compiler/cpp/src/thrift/plugin/plugin_constants.h
/compiler/cpp/src/thrift/plugin/plugin_types.cpp
/compiler/cpp/src/thrift/plugin/plugin_types.h
/compiler/cpp/test/*test
/compiler/cpp/test/thrift-gen-*
/compiler/cpp/src/thrift/thrift-bootstrap
/compiler/cpp/src/thrift/plugin/gen.stamp
/compiler/cpp/Debug
/compiler/cpp/Release
/compiler/cpp/src/thrift/libparse.a
/compiler/cpp/src/thrift/thriftl.cc
/compiler/cpp/src/thrift/thrifty.cc
/compiler/cpp/src/thrift/thrifty.hh
/compiler/cpp/src/thrift/windows/version.h
/compiler/cpp/thrift
/compiler/cpp/thriftl.cc
/compiler/cpp/thrifty.cc
/compiler/cpp/lex.yythriftl.cc
/compiler/cpp/thrifty.h
/compiler/cpp/thrifty.hh
/compiler/cpp/src/thrift/version.h
/config.*
/configure
/configure.lineno
/configure.scan
/contrib/.vagrant/
/contrib/fb303/config.cache
/contrib/fb303/config.log
/contrib/fb303/config.status
/contrib/fb303/configure
/contrib/fb303/cpp/libfb303.a
/contrib/fb303/java/build/
/contrib/fb303/py/build/
/contrib/fb303/py/fb303/FacebookService-remote
/contrib/fb303/py/fb303/FacebookService.py
/contrib/fb303/py/fb303/__init__.py
/contrib/fb303/py/fb303/constants.py
/contrib/fb303/py/fb303/ttypes.py
/depcomp
/install-sh
/lib/cl/backport-update.zip
/lib/cl/lib
/lib/cl/run-tests
/lib/cl/quicklisp.lisp
/lib/cl/externals/
/lib/cl/run-tests
/lib/cl/quicklisp/
/lib/cpp/Debug/
/lib/cpp/Debug-mt/
/lib/cpp/Release/
/lib/cpp/Release-mt/
/lib/cpp/src/thrift/qt/moc_TQTcpServer.cpp
/lib/cpp/src/thrift/qt/moc__TQTcpServer.cpp
/lib/cpp/src/thrift/config.h
/lib/cpp/src/thrift/stamp-h2
/lib/cpp/test/Benchmark
/lib/cpp/test/AllProtocolsTest
/lib/cpp/test/AnnotationTest
/lib/cpp/test/DebugProtoTest
/lib/cpp/test/DenseProtoTest
/lib/cpp/test/EnumTest
/lib/cpp/test/JSONProtoTest
/lib/cpp/test/OptionalRequiredTest
/lib/cpp/test/SecurityTest
/lib/cpp/test/SpecializationTest
/lib/cpp/test/RecursiveTest
/lib/cpp/test/ReflectionTest
/lib/cpp/test/RenderedDoubleConstantsTest
/lib/cpp/test/TFDTransportTest
/lib/cpp/test/TFileTransportTest
/lib/cpp/test/TInterruptTest
/lib/cpp/test/TNonblockingServerTest
/lib/cpp/test/TNonblockingSSLServerTest
/lib/cpp/test/TPipedTransportTest
/lib/cpp/test/TServerIntegrationTest
/lib/cpp/test/TSocketInterruptTest
/lib/cpp/test/TransportTest
/lib/cpp/test/UnitTests
/lib/cpp/test/ZlibTest
/lib/cpp/test/OpenSSLManualInitTest
/lib/cpp/test/concurrency_test
/lib/cpp/test/link_test
/lib/cpp/test/processor_test
/lib/cpp/test/tests.xml
/lib/cpp/concurrency_test
/lib/cpp/*.pc
/lib/cpp/x64/Debug/
/lib/cpp/x64/Debug-mt/
/lib/cpp/x64/Release
/lib/cpp/x64/Release-mt
/lib/c_glib/*.gcda
/lib/c_glib/*.gcno
/lib/c_glib/*.loT
/lib/c_glib/src/thrift/config.h
/lib/c_glib/src/thrift/stamp-h3
/lib/c_glib/test/*.gcno
/lib/c_glib/test/testwrapper.sh
/lib/c_glib/test/testwrapper-test*
/lib/c_glib/test/testapplicationexception
/lib/c_glib/test/testbinaryprotocol
/lib/c_glib/test/testcompactprotocol
/lib/c_glib/test/testbufferedtransport
/lib/c_glib/test/testcontainertest
/lib/c_glib/test/testdebugproto
/lib/c_glib/test/testfdtransport
/lib/c_glib/test/testframedtransport
/lib/c_glib/test/testmemorybuffer
/lib/c_glib/test/testoptionalrequired
/lib/c_glib/test/testtransportsslsocket
/lib/c_glib/test/testsimpleserver
/lib/c_glib/test/teststruct
/lib/c_glib/test/testthrifttest
/lib/c_glib/test/testthrifttestclient
/lib/c_glib/test/testtransportsocket
/lib/c_glib/test/testserialization
/lib/c_glib/thriftc.pc
/lib/c_glib/thrift_c_glib.pc
/lib/csharp/**/bin/
/lib/csharp/**/obj/
/lib/csharp/src/packages
/lib/d/test/*.pem
/lib/d/libthriftd*.a
/lib/d/test/async_test
/lib/d/test/client_pool_test
/lib/d/test/serialization_benchmark
/lib/d/test/stress_test_server
/lib/d/test/thrift_test_client
/lib/d/test/thrift_test_server
/lib/d/test/transport_test
/lib/d/unittest/
/lib/dart/coverage
/lib/dart/**/.packages
/lib/dart/**/packages
/lib/dart/**/.pub/
/lib/dart/**/pubspec.lock
/lib/delphi/test/skip/*.request
/lib/delphi/test/skip/*.response
/lib/delphi/**/*.identcache
/lib/delphi/**/*.local
/lib/delphi/**/*.dcu
/lib/delphi/**/*.2007
/lib/delphi/**/*.dproj
/lib/delphi/**/codegen/*.bat
/lib/erl/.eunit
/lib/erl/.generated
/lib/erl/.rebar/
/lib/erl/deps/
/lib/erl/ebin
/lib/erl/src/thrift.app.src
/lib/erl/test/*.beam
/lib/erl/test/*.hrl
/lib/erl/test/Thrift_omit_without.thrift
/lib/haxe/test/bin
/lib/haxe/test/data.tmp
/lib/hs/dist
/lib/java/.gradle
/lib/java/android/.gradle
/lib/java/build
/lib/java/target
/lib/js/dist
/lib/js/doc
/lib/js/test/build
/lib/netcore/**/bin
/lib/netcore/**/obj
/lib/nodejs/coverage
/lib/nodejs/node_modules/
/lib/perl/MANIFEST
/lib/perl/MYMETA.json
/lib/perl/MYMETA.yml
/lib/perl/Makefile-perl.mk
/lib/perl/blib
/lib/perl/pm_to_blib
/lib/py/build
/lib/py/thrift.egg-info/
/lib/rb/Gemfile.lock
/lib/rb/debug_proto_test
/lib/rb/.config
/lib/rb/ext/conftest.dSYM/
/lib/rb/ext/mkmf.log
/lib/rb/ext/thrift_native.bundle
/lib/rb/ext/thrift_native.so
/lib/rb/test/
/lib/rb/thrift-*.gem
/lib/php/src/ext/thrift_protocol/Makefile.*
/lib/php/src/ext/thrift_protocol/build/
/lib/php/src/ext/thrift_protocol/config.*
/lib/php/src/ext/thrift_protocol/configure
/lib/php/src/ext/thrift_protocol/configure.ac
/lib/php/src/ext/thrift_protocol/configure.in
/lib/php/src/ext/thrift_protocol/install-sh
/lib/php/src/ext/thrift_protocol/libtool
/lib/php/src/ext/thrift_protocol/ltmain.sh
/lib/php/src/ext/thrift_protocol/missing
/lib/php/src/ext/thrift_protocol/mkinstalldirs
/lib/php/src/ext/thrift_protocol/modules/
/lib/php/src/ext/thrift_protocol/php_thrift_protocol.lo
/lib/php/src/ext/thrift_protocol/run-tests.php
/lib/php/src/ext/thrift_protocol/thrift_protocol.la
/lib/php/src/ext/thrift_protocol/tmp-php.ini
/lib/php/src/packages/
/lib/php/test/TEST-*.xml
/lib/php/test/packages/
/lib/py/dist/
/lib/erl/logs/
/lib/go/pkg
/lib/go/src
/lib/go/test/gopath/
/lib/go/test/ThriftTest.thrift
/lib/rs/target/
/lib/rs/Cargo.lock
/lib/rs/test/Cargo.lock
/lib/rs/test/target/
/lib/rs/test/bin/
/lib/rs/test/src/base_one.rs
/lib/rs/test/src/base_two.rs
/lib/rs/test/src/midlayer.rs
/lib/rs/test/src/recursive.rs
/lib/rs/test/src/ultimate.rs
/lib/rs/*.iml
/lib/rs/**/*.iml
/libtool
/ltmain.sh
/missing
/node_modules/
/vendor/
/composer.lock
/stamp-h1
/test/features/results.json
/test/results.json
/test/c_glib/test_client
/test/c_glib/test_server
/test/cl/TestServer
/test/cl/TestClient
/test/cpp/StressTest
/test/cpp/StressTestNonBlocking
/test/cpp/TestClient
/test/cpp/TestServer
/test/csharp/obj
/test/csharp/bin
/test/dart/**/.packages
/test/dart/**/packages
/test/dart/**/.pub/
/test/dart/**/pubspec.lock
/test/log/
/test/test.log
/test/erl/.generated
/test/erl/.rebar
/test/erl/ebin
/test/go/bin/
/test/go/ThriftTest.thrift
/test/go/gopath
/test/go/pkg/
/test/go/src/code.google.com/
/test/go/src/common/mock_handler.go
/test/go/src/github.com/golang/
/test/go/src/golang.org/
/test/go/src/gen/
/test/go/src/thrift
/test/haxe/bin
/test/hs/TestClient
/test/hs/TestServer
/test/php/php_ext_dir/
/test/py.twisted/_trial_temp/
/test/rb/Gemfile.lock
/test/netcore/**/bin
/test/netcore/**/obj
/test/netcore/Thrift
/test/php/php_ext_dir/
/test/rs/Cargo.lock
/test/rs/src/thrift_test.rs
/test/rs/bin/
/test/rs/target/
/test/rs/*.iml
/test/rs/**/*.iml
/lib/cl/backport-update.zip
/lib/cl/lib
/tutorial/cl/quicklisp.lisp
/tutorial/cl/externals/
/tutorial/cl/quicklisp/
/tutorial/cl/TutorialClient
/tutorial/cl/TutorialServer
/tutorial/cl/backport-update.zip
/tutorial/cl/lib/
/tutorial/cl/shared-implementation.fasl
/tutorial/cl/tutorial-implementation.fasl
/tutorial/cpp/TutorialClient
/tutorial/cpp/TutorialServer
/tutorial/c_glib/tutorial_client
/tutorial/c_glib/tutorial_server
/tutorial/csharp/CsharpServer/obj
/tutorial/csharp/CsharpServer/bin
/tutorial/csharp/CsharpClient/obj
/tutorial/csharp/CsharpClient/bin
/tutorial/d/async_client
/tutorial/d/client
/tutorial/d/server
/tutorial/dart/**/.packages
/tutorial/dart/**/packages
/tutorial/dart/**/.pub/
/tutorial/dart/**/pubspec.lock
/tutorial/delphi/**/*.dsk
/tutorial/delphi/**/*.local
/tutorial/delphi/**/*.tvsconfig
/tutorial/delphi/**/dcu
/tutorial/delphi/**/*.local
/tutorial/delphi/**/*.identcache
/tutorial/go/gopath
/tutorial/go/go-tutorial
/tutorial/go/calculator-remote
/tutorial/go/src/shared
/tutorial/go/src/tutorial
/tutorial/go/src/git.apache.org
/tutorial/go/src/golang.org
/tutorial/haxe/bin
/tutorial/hs/dist/
/tutorial/java/build/
/tutorial/js/build/
/tutorial/netcore/**/bin
/tutorial/netcore/**/obj
/tutorial/netcore/Thrift
/tutorial/rs/*.iml
/tutorial/rs/src/shared.rs
/tutorial/rs/src/tutorial.rs
/tutorial/rs/bin
/tutorial/rs/target
/tutorial/rs/Cargo.lock
/ylwrap


@ -1,64 +0,0 @@
max_width = 100
hard_tabs = false
tab_spaces = 4
newline_style = "Auto"
use_small_heuristics = "Default"
indent_style = "Block"
wrap_comments = false
format_doc_comments = false
comment_width = 80
normalize_comments = false
normalize_doc_attributes = false
license_template_path = ""
format_strings = false
format_macro_matchers = false
format_macro_bodies = true
empty_item_single_line = true
struct_lit_single_line = true
fn_single_line = false
where_single_line = false
imports_indent = "Block"
imports_layout = "Mixed"
merge_imports = false
reorder_imports = true
reorder_modules = true
reorder_impl_items = false
type_punctuation_density = "Wide"
space_before_colon = false
space_after_colon = true
spaces_around_ranges = false
binop_separator = "Front"
remove_nested_parens = true
combine_control_expr = true
overflow_delimited_expr = false
struct_field_align_threshold = 0
enum_discrim_align_threshold = 0
match_arm_blocks = true
force_multiline_blocks = false
fn_args_density = "Tall"
brace_style = "SameLineWhere"
control_brace_style = "AlwaysSameLine"
trailing_semicolon = true
trailing_comma = "Vertical"
match_block_trailing_comma = false
blank_lines_upper_bound = 1
blank_lines_lower_bound = 0
edition = "2015"
merge_derives = true
use_try_shorthand = false
use_field_init_shorthand = false
force_explicit_abi = true
condense_wildcard_suffixes = false
color = "Auto"
required_version = "1.0.0"
unstable_features = false
disable_all_formatting = false
skip_children = false
hide_parse_errors = false
error_on_line_overflow = false
error_on_unformatted = false
report_todo = "Never"
report_fixme = "Never"
ignore = []
emit_mode = "Files"
make_backup = false


@ -1,181 +0,0 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
# build Apache Thrift on Travis CI - https://travis-ci.org/
#
# Docker Integration
# see: build/docker/README.md
#
sudo: required
dist: trusty
language: cpp
services:
- docker
install:
- if [[ `uname` == "Linux" ]]; then build/docker/refresh.sh; fi
stages:
- docker # docker images
- thrift # thrift build jobs
env:
global:
- SCRIPT="cmake.sh"
- BUILD_ARG=""
- BUILD_ENV="-e CC=gcc -e CXX=g++ -e THRIFT_CROSSTEST_CONCURRENCY=4"
- DISTRO=ubuntu-bionic
- BUILD_LIBS="CPP C_GLIB HASKELL JAVA PYTHON TESTING TUTORIALS" # only meaningful for CMake builds
- TRAVIS_BUILD_STAGE=test
# DOCKER_REPO (this works for all builds as a source for docker images - you can override for fork builds in your Travis settings)
- DOCKER_REPO="thrift/thrift-build"
# DOCKER_USER (provide in your Travis settings if you want to build and update docker images once, instead of on every job)
# DOCKER_PASS (same)
jobs:
include:
# ========================= stage: docker =========================
- stage: docker
script: true
env:
- JOB="Docker Build ubuntu-xenial 16.04 LTS"
- DISTRO=ubuntu-xenial
- TRAVIS_BUILD_STAGE=docker
- script: true
env:
- JOB="Docker Build ubuntu-bionic 18.04 LTS"
- DISTRO=ubuntu-bionic
- TRAVIS_BUILD_STAGE=docker
# ========================= stage: thrift =======================
# ------------------------- phase: cross ------------------------
# apache/thrift official PR builds can exceed 50 minutes per job so combine all cross tests
- stage: thrift
script: build/docker/run.sh
if: repo = apache/thrift
env:
- JOB="Cross Language Tests"
- SCRIPT="cross-test.sh"
# fork based PR builds cannot exceed 50 minutes per job
- stage: thrift
script: build/docker/run.sh
if: repo != apache/thrift
env:
- JOB="Cross Language Tests (Binary Protocol)"
- SCRIPT="cross-test.sh"
- BUILD_ARG="-'(binary)'"
- stage: thrift
script: build/docker/run.sh
if: repo != apache/thrift
env:
- JOB="Cross Language Tests (Header, JSON Protocols)"
- SCRIPT="cross-test.sh"
- BUILD_ARG="-'(header|json)'"
- stage: thrift
script: build/docker/run.sh
if: repo != apache/thrift
env:
- JOB="Cross Language Tests (Compact and Multiplexed Protocols)"
- SCRIPT="cross-test.sh"
- BUILD_ARG="-'(compact|multiplexed)'"
# ------------------------- phase: sca --------------------------
# QA jobs for code analytics and metrics
- stage: thrift
script: build/docker/run.sh
env:
- JOB="Static Code Analysis"
- SCRIPT="sca.sh"
# C and C++ undefined behavior.
# A binary crashes if undefined behavior occurs and produces a stack trace.
# python is disabled, see: THRIFT-4360
- script: build/docker/run.sh
env:
- JOB="UBSan"
- SCRIPT="ubsan.sh"
- BUILD_ARG="--without-python --without-py3"
# ------------------------- phase: autotools --------------------
# TODO: Remove them once migrated to CMake
- script: build/docker/run.sh
env:
- JOB="Autotools (Ubuntu Bionic)"
- SCRIPT="autotools.sh"
- script: build/docker/run.sh
env:
- JOB="Autotools (Ubuntu Xenial)"
- DISTRO=ubuntu-xenial
- SCRIPT="autotools.sh"
# ------------------------- phase: cmake ------------------------
- script: build/docker/run.sh
env:
- JOB="CMake"
# C++ specific options: compiler plug-in, threading model
- script: build/docker/run.sh
env:
- JOB="C++98 (Boost Thread)"
- SCRIPT="cmake.sh"
- BUILD_LIBS="CPP TESTING TUTORIALS"
- BUILD_ARG="-DCMAKE_CXX_STANDARD=98 -DCMAKE_CXX_STANDARD_REQUIRED=ON -DCMAKE_CXX_EXTENSIONS=OFF --DWITH_BOOSTTHREADS=ON -DWITH_PYTHON=OFF -DWITH_C_GLIB=OFF -DWITH_JAVA=OFF -DWITH_HASKELL=OFF"
- BUILD_ENV="-e CC=clang -e CXX=clang++"
- script: build/docker/run.sh
env:
- JOB="C++ (Std Thread) and Plugin"
- SCRIPT="cmake.sh"
- BUILD_LIBS="CPP TESTING TUTORIALS"
- BUILD_ARG="-DWITH_PLUGIN=ON -DWITH_STDTHREADS=ON -DWITH_PYTHON=OFF -DWITH_C_GLIB=OFF -DWITH_JAVA=OFF -DWITH_HASKELL=OFF"
- BUILD_ENV="-e CC=clang -e CXX=clang++"
# ------------------------- phase: dist -------------------------
- script: build/docker/run.sh
env:
- JOB="make dist"
- SCRIPT="make-dist.sh"
- script: build/docker/run.sh
env:
- JOB="Debian Packages"
- SCRIPT="dpkg.sh"
# ------------------------- phase: coverity ---------------------
# We build the coverity scan build once monthly using a travis cron job
- if: (env(COVERITY_SCAN_NOTIFICATION_EMAIL) IS present) AND (branch IN (master)) AND (type IN (cron))
script: build/docker/run.sh
env:
- JOB="Coverity Scan"
- SCRIPT="covscan.sh"
### ------------------------- phase: osx --------------------------
# disabled due to the time delays it imposes on build jobs
# - os: osx
# osx_image: xcode9
# script: build/docker/scripts/autotools.sh

File diff suppressed because it is too large


@ -1,124 +0,0 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
cmake_minimum_required(VERSION 3.1)
# CMake 3.1 supports C++ standards selection with CMAKE_CXX_STANDARD
# If you need CMake 3.1+ for Ubuntu 14.04, try
# https://launchpad.net/~george-edison55/+archive/ubuntu/cmake-3.x
# If you need CMake 3.1+ for debian "jessie", get it from jessie-backports
# Otherwise
# http://cmake.org
project("Apache Thrift")
set(CMAKE_MODULE_PATH "${CMAKE_MODULE_PATH}" "${CMAKE_CURRENT_SOURCE_DIR}/build/cmake")
# TODO: add `git rev-parse --short HEAD`
# Read the version information from the Autoconf file
file (STRINGS "${CMAKE_CURRENT_SOURCE_DIR}/configure.ac" CONFIGURE_AC REGEX "AC_INIT\\(.*\\)" )
# The following variable is used in the version.h.in file
string(REGEX REPLACE "AC_INIT\\(\\[.*\\], \\[([0-9]+\\.[0-9]+\\.[0-9]+(-dev)?)\\]\\)" "\\1" PACKAGE_VERSION ${CONFIGURE_AC})
message(STATUS "Parsed Thrift package version: ${PACKAGE_VERSION}")
# These are internal to CMake
string(REGEX REPLACE "([0-9]+\\.[0-9]+\\.[0-9]+)(-dev)?" "\\1" thrift_VERSION ${PACKAGE_VERSION})
string(REGEX REPLACE "([0-9]+)\\.[0-9]+\\.[0-9]+" "\\1" thrift_VERSION_MAJOR ${thrift_VERSION})
string(REGEX REPLACE "[0-9]+\\.([0-9])+\\.[0-9]+" "\\1" thrift_VERSION_MINOR ${thrift_VERSION})
string(REGEX REPLACE "[0-9]+\\.[0-9]+\\.([0-9]+)" "\\1" thrift_VERSION_PATCH ${thrift_VERSION})
message(STATUS "Parsed Thrift version: ${thrift_VERSION} (${thrift_VERSION_MAJOR}.${thrift_VERSION_MINOR}.${thrift_VERSION_PATCH})")
# Some default settings
include(DefineCMakeDefaults)
# Build time options are defined here
include(DefineOptions)
include(DefineInstallationPaths)
# Based on the options set some platform specifics
include(DefinePlatformSpecifc)
# Generate the config.h file
include(ConfigureChecks)
# Package it
include(CPackConfig)
find_package(Threads)
include(CTest)
if(BUILD_TESTING)
message(STATUS "Building with unittests")
enable_testing()
# Define "make check" as alias for "make test"
add_custom_target(check COMMAND ctest)
else ()
message(STATUS "Building without tests")
endif ()
if(BUILD_COMPILER)
if(NOT EXISTS ${THRIFT_COMPILER})
set(THRIFT_COMPILER $<TARGET_FILE:thrift-compiler>)
endif()
add_subdirectory(${CMAKE_CURRENT_SOURCE_DIR}/compiler/cpp)
elseif(EXISTS ${THRIFT_COMPILER})
add_executable(thrift-compiler IMPORTED)
set_property(TARGET thrift-compiler PROPERTY IMPORTED_LOCATION ${THRIFT_COMPILER})
endif()
if(BUILD_CPP)
add_subdirectory(${CMAKE_CURRENT_SOURCE_DIR}/lib/cpp)
if(BUILD_TUTORIALS)
add_subdirectory(${CMAKE_CURRENT_SOURCE_DIR}/tutorial/cpp)
endif()
if(BUILD_TESTING)
if(WITH_LIBEVENT AND WITH_ZLIB AND WITH_OPENSSL)
add_subdirectory(${CMAKE_CURRENT_SOURCE_DIR}/test/cpp)
else()
message(WARNING "libevent and/or ZLIB and/or OpenSSL not found or disabled; will not build some tests")
endif()
endif()
endif()
if(BUILD_C_GLIB)
add_subdirectory(${CMAKE_CURRENT_SOURCE_DIR}/lib/c_glib)
endif()
if(BUILD_JAVA)
add_subdirectory(${CMAKE_CURRENT_SOURCE_DIR}/lib/java)
endif()
if(BUILD_PYTHON)
add_subdirectory(${CMAKE_CURRENT_SOURCE_DIR}/lib/py)
if(BUILD_TESTING)
add_subdirectory(${CMAKE_CURRENT_SOURCE_DIR}/test/py)
endif()
endif()
if(BUILD_HASKELL)
add_subdirectory(${CMAKE_CURRENT_SOURCE_DIR}/lib/hs)
if(BUILD_TESTING)
add_subdirectory(${CMAKE_CURRENT_SOURCE_DIR}/test/hs)
endif()
endif()
PRINT_CONFIG_SUMMARY()


@ -1,116 +0,0 @@
# How to Contribute #
Thank you for your interest in contributing to the Apache Thrift project! Information on why and how to contribute is available on the Apache Software Foundation (ASF) web site. In particular, we recommend the following to become acquainted with Apache Contributions:
* [Contributors Tech Guide](http://www.apache.org/dev/contributors)
* [Get involved!](http://www.apache.org/foundation/getinvolved.html)
* [Legal aspects on Submission of Contributions (Patches)](http://www.apache.org/licenses/LICENSE-2.0.html#contributions)
## GitHub pull requests ##
This is the preferred method of submitting changes. When you submit a pull request through github,
it activates the continuous integration (CI) build systems at Appveyor and Travis to build your changes
on a variety of Linux and Windows configurations and run all the test suites. Follow these requirements
for a successful pull request:
1. All code changes require an [Apache Jira THRIFT Issue](http://issues.apache.org/jira/browse/THRIFT) ticket.
1. All pull requests should contain a single commit per issue, or we will ask you to squash it.
1. The pull request title must begin with the Jira THRIFT ticket identifier, for example:
THRIFT-9999: an example pull request title
1. Commit messages must follow this pattern for code changes (deviations will not be merged):
THRIFT-9999: [summary of fix, one line if possible]
Client: [language(s) affected, comma separated, use lib/ directory names please]
Instructions:
1. Create a fork in your GitHub account of http://github.com/apache/thrift
1. Clone the fork to your development system.
1. Create a branch for your changes (best practice is issue as branch name, e.g. THRIFT-9999).
1. Modify the source to include the improvement/bugfix, and:
* Remember to provide *tests* for all submitted changes!
* Use test-driven development (TDD): add a test that will isolate the bug *before* applying the change that fixes it.
* Verify that you follow [Thrift Coding Standards](/docs/coding_standards) (you can run 'make style', which ensures proper format for some languages).
* [*optional*] Verify that your change works on other platforms by adding a GitHub service hook to [Travis CI](http://docs.travis-ci.com/user/getting-started/#Step-one%3A-Sign-in) and [AppVeyor](http://www.appveyor.com/docs). You can use this technique to run the Thrift CI jobs in your account to check your changes before they are made public. Every GitHub pull request into Thrift will run the full CI build and test suite on your changes.
1. Squash your changes to a single commit. This maintains clean change history.
1. Commit and push changes to your branch (please use issue name and description as commit title, e.g. "THRIFT-9999: make it perfect"), with the affected languages on the next line of the description.
1. Use GitHub to create a pull request going from your branch to apache:master. Ensure that the Jira ticket number is at the beginning of the title of your pull request, same as the commit title.
1. Wait for other contributors or committers to review your new addition, and for a CI build to complete.
1. Wait for a committer to commit your patch. You can nudge the committers if necessary by sending a message to the [Apache Thrift mailing list](https://thrift.apache.org/mailing).
## If you want to build the project locally ##
For Windows systems, see our detailed instructions on the [CMake README](/build/cmake/README.md).
For Windows Native C++ builds, see our detailed instructions on the [WinCPP README](/build/wincpp/README.md).
For unix systems, see our detailed instructions on the [Docker README](/build/docker/README.md).
## If you want to review open issues... ##
1. Review the [GitHub Pull Request Backlog](https://github.com/apache/thrift/pulls). Code reviews are open to all.
2. Review the [Jira issue tracker](http://issues.apache.org/jira/browse/THRIFT). You can search for tickets relating to languages you are interested in or currently using with thrift, for example a Jira search (Issues -> Search For Issues) query of ``project = THRIFT AND component in ("Erlang - Library") and status not in (resolved, closed)`` will locate all the open Erlang Library issues.
## If you discovered a defect... ##
1. Check to see if the issue is already in the [Jira issue tracker](http://issues.apache.org/jira/browse/THRIFT).
1. If not, create a ticket describing the change you're proposing in the Jira issue tracker.
1. Contribute your code changes using the GitHub pull request method:
## Contributing via Patch ##
Some changes do not require a build, for example in documentation. For changes that are not code or build related, you can submit a patch on Jira for review. To create a patch from changes in your local directory:
git diff > ../THRIFT-NNNN.patch
then wait for contributors or committers to review your changes, and then for a committer to apply your patch.
## GitHub recipes for Pull Requests ##
Sometimes committers may ask you to take actions in your pull requests. Here are some recipes that will help you accomplish those requests. These examples assume you are working on Jira issue THRIFT-9999. You should also be familiar with the [upstream](https://help.github.com/articles/syncing-a-fork/) repository concept.
### Squash your changes ###
If you have not submitted a pull request yet, or if you have not yet rebased your existing pull request, you can squash all your commits down to a single commit. This makes life easier for the committers. If your pull request on GitHub has more than one commit, you should do this.
1. Use the command ``git log`` to identify how many commits you made since you began.
2. Use the command ``git rebase -i HEAD~N`` where N is the number of commits.
3. Leave "pick" in the first line.
4. Change all other lines from "pick" to "fixup".
5. All your changes are now in a single commit.
If you already have a pull request outstanding, you will need to do a "force push" to overwrite it since you changed your commit history:
git push -u origin THRIFT-9999 --force
A more detailed walkthrough of a squash can be found at [Git Ready](http://gitready.com/advanced/2009/02/10/squashing-commits-with-rebase.html).
### Rebase your pull request ###
If your pull request has a conflict with master, it needs to be rebased:
git checkout THRIFT-9999
git rebase upstream master
(resolve any conflicts, make sure it builds)
git push -u origin THRIFT-9999 --force
### Fix a bad merge ###
If your pull request contains commits that are not yours, then you should use the following technique to fix the bad merge in your branch:
git checkout master
git pull upstream master
git checkout -b THRIFT-9999-take-2
git cherry-pick ...
(pick only your commits from your original pull request in ascending chronological order)
squash your changes to a single commit if there is more than one (see above)
git push -u origin THRIFT-9999-take-2:THRIFT-9999
This procedure will apply only your commits in order to the current master, then you will squash them to a single commit, and then you force push your local THRIFT-9999-take-2 into remote THRIFT-9999 which represents your pull request, replacing all the commits with the new one.


@ -1,355 +0,0 @@
# Apache Thrift Language Support #
Last Modified: 2018-12-17
Guidance For: 0.12.0 or later
Thrift supports many programming languages and has an impressive test suite that exercises most of the languages, protocols, and transports, representing a matrix of thousands of possible combinations. Each language typically has a minimum required version as well as support libraries - some mandatory and some optional. All of this information is provided below to help you assess whether you can use Apache Thrift with your project. Obviously this is a complex matrix to maintain and may not be correct in all cases - if you spot an error, please inform the developers using the mailing list.
Apache Thrift has a choice of two build systems. The `autoconf` build system is the most complete build and is used to build all supported languages. The `cmake` build system has been designated by the project to replace `autoconf`; however, this transition will take quite some time to complete.
The Language/Library Levels indicate the minimum and maximum versions that are used in the [continuous integration environments](build/docker/README.md) (Appveyor, Travis) for Apache Thrift. Other language levels may be supported for each language, though they are tested less thoroughly; check the README file inside each lib directory for additional details. Note that while a language may contain support for protocols, transports, and servers, the extent to which each is tested as part of the overall build process varies. The definitive integration test for the project is called the "cross" test, which executes a test matrix with clients and servers communicating across languages.
<table style="font-size: 60%; padding: 1px;">
<thead>
<tr>
<th rowspan=2>Language</th>
<th rowspan=2 align=center>Since</th>
<th colspan=2 align=center>Build Systems</th>
<th colspan=2 align=center>Lang/Lib Levels (Tested)</th>
<th colspan=6 align=center>Low-Level Transports</th>
<th colspan=3 align=center>Transport Wrappers</th>
<th colspan=4 align=center>Protocols</th>
<th colspan=5 align=center>Servers</th>
<th rowspan=2>Open Issues</th>
</tr>
<tr>
<!-- Build Systems ---------><th>autoconf</th><th>cmake</th>
<!-- Lang/Lib Levels -------><th>Min</th><th>Max</th>
<!-- Low-Level Transports --><th><a href="https://en.wikipedia.org/wiki/Unix_domain_socket">Domain</a></th><th>&nbsp;File&nbsp;</th><th>Memory</th><th>&nbsp;Pipe&nbsp;</th><th>Socket</th><th>&nbsp;TLS&nbsp;</th>
<!-- Transport Wrappers ----><th>Framed</th><th>&nbsp;http&nbsp;</th><th>&nbsp;zlib&nbsp;</th>
<!-- Protocols -------------><th><a href="doc/specs/thrift-binary-protocol.md">Binary</a></th><th><a href="doc/specs/thrift-compact-protocol.md">Compact</a></th><th>&nbsp;JSON&nbsp;</th><th>Multiplex</th>
<!-- Servers ---------------><th>Forking</th><th>Nonblocking</th><th>Simple</th><th>Threaded</th><th>ThreadPool</th>
</tr>
</thead>
<tbody>
<tr align=center>
<td align=left><a href="lib/as3/README.md">ActionScript</a></td>
<!-- Since -----------------><td>0.3.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td colspan=2>ActionScript 3</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12313722">ActionScript</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/c_glib/README.md">C (glib)</a></td>
<!-- Since -----------------><td>0.6.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Language Levels -------><td>2.48.2</td><td>2.54.0</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12313854">C (glib)</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/cpp/README.md">C++</a></td>
<!-- Since -----------------><td>0.2.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Language Levels -------><td colspan=2>C++98</td>
<!-- Low-Level Transports --><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12312313">C++</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/csharp/README.md">C#</a></td>
<!-- Since -----------------><td>0.2.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td>.NET&nbsp;3.5 / mono&nbsp;3.2.8.0</td><td>.NET&nbsp;4.6.1 / mono&nbsp;4.6.2.7</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12312362">C# (.NET)</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/cocoa/README.md">Cocoa</a></td>
<!-- Since -----------------><td>0.2.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td colspan=2>unknown</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12312398">Cocoa</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/cl/README.md">Common Lisp</a></td>
<!-- Since -----------------><td>0.12.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td>SBCL 1.4.5</td><td>SBCL 1.4.9</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT-82">Common Lisp</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/d/README.md">Dlang</a></td>
<!-- Since -----------------><td>0.9.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td>2.075.1</td><td>2.081.0</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12317904">D</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/dart/README.md">Dart</a></td>
<!-- Since -----------------><td>0.10.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td>1.22.1</td><td>1.24.3</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12328006">Dart</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/delphi/README.md">Delphi</a></td>
<!-- Since -----------------><td>0.8.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td>2010</td><td>unknown</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12316601">Delphi</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/netcore/README.md">.NET Core</a></td>
<!-- Since -----------------><td>0.11.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td colspan=2>2.1.4</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12331176">.NET Core</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/erl/README.md">Erlang</a></td>
<!-- Since -----------------><td>0.3.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td>18.3</td><td>20.0.4</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12312390">Erlang</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/go/README.md">Go</a></td>
<!-- Since -----------------><td>0.7.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td>1.7.6</td><td>1.10.3</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12314307">Go</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/hs/README.md">Haskell</a></td>
<!-- Since -----------------><td>0.5.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Language Levels -------><td>7.10.3</td><td>8.0.2</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12312704">Haskell</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/haxe/README.md">Haxe</a></td>
<!-- Since -----------------><td>0.9.3</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td>3.2.1</td><td>3.4.4</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12324347">Haxe</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/java/README.md">Java (SE)</a></td>
<!-- Since -----------------><td>0.2.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Language Levels -------><td colspan=2>1.8.0_151</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12312314">Java SE</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/javame/README.md">Java (ME)</a></td>
<!-- Since -----------------><td>0.5.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td colspan=2>unknown</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12313759">Java ME</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/js/README.md">Javascript</a></td>
<!-- Since -----------------><td>0.3.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td colspan=2>unknown</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12313418">Javascript</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/lua/README.md">Lua</a></td>
<!-- Since -----------------><td>0.9.2</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td>5.1.5</td><td>5.2.4</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12322659">Lua</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/nodejs/README.md">node.js</a></td>
<!-- Since -----------------><td>0.6.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td>6.x</td><td>8.11.3</td>
<!-- Low-Level Transports --><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12314320">node.js</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/nodets/README.md">node.ts</a></td>
<!-- Since -----------------><td>0.12.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td>3.1.6</td><td></td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/issues/?jql=project%20%3D%20THRIFT%20AND%20resolution%20%3D%20Unresolved%20and%20Component%20in%20(%22TypeScript%20-%20Library%22)%20ORDER%20BY%20priority%20DESC%2C%20updated%20DESC">node.ts</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/ocaml/README.md">OCaml</a></td>
<!-- Since -----------------><td>0.2.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td colspan=2>4.04.0</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12313660">OCaml</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/perl/README.md">Perl</a></td>
<!-- Since -----------------><td>0.2.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td>5.22.1</td><td>5.26.0</td>
<!-- Low-Level Transports --><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12312312">Perl</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/php/README.md">PHP</a></td>
<!-- Since -----------------><td>0.2.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td>7.0.22</td><td>7.1.8</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12312431">PHP</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/py/README.md">Python</a></td>
<!-- Since -----------------><td>0.2.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Language Levels -------><td>2.7.12, 3.5.2</td><td>2.7.14, 3.6.3</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12312315">Python</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/rb/README.md">Ruby</a></td>
<!-- Since -----------------><td>0.2.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td>2.3.1p112</td><td>2.3.3p222</td>
<!-- Low-Level Transports --><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12312316">Ruby</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/rs/README.md">Rust</a></td>
<!-- Since -----------------><td>0.11.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td>1.17.0</td><td>1.21.0</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12331420">Rust</a></td>
</tr>
<tr align=center>
<td align=left><a href="lib/st/README.md">Smalltalk</a></td>
<!-- Since -----------------><td>0.2.0</td>
<!-- Build Systems ---------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Language Levels -------><td colspan=2>unknown</td>
<!-- Low-Level Transports --><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Transport Wrappers ----><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Protocols -------------><td><img src="doc/images/cgrn.png" alt="Yes"/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<!-- Servers ---------------><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td><td><img src="doc/images/cred.png" alt=""/></td>
<td align=left><a href="https://issues.apache.org/jira/browse/THRIFT/component/12313861">Smalltalk</a></td>
</tr>
</tbody>
<tfoot>
<tr>
<th rowspan=2>Language</th>
<th rowspan=2 align=center>Since</th>
<!-- Build Systems ---------><th>autoconf</th><th>cmake</th>
<!-- Lang/Lib Levels -------><th>Min</th><th>Max</th>
<!-- Low-Level Transports --><th><a href="https://en.wikipedia.org/wiki/Unix_domain_socket">Domain</a></th><th>&nbsp;File&nbsp;</th><th>Memory</th><th>&nbsp;Pipe&nbsp;</th><th>Socket</th><th>&nbsp;TLS&nbsp;</th>
<!-- Transport Wrappers ----><th>Framed</th><th>&nbsp;http&nbsp;</th><th>&nbsp;zlib&nbsp;</th>
<!-- Protocols -------------><th><a href="doc/specs/thrift-binary-protocol.md">Binary</a></th><th><a href="doc/specs/thrift-compact-protocol.md">Compact</a></th><th>&nbsp;JSON&nbsp;</th><th>Multiplex</th>
<!-- Servers ---------------><th>Forking</th><th>Nonblocking</th><th>Simple</th><th>Threaded</th><th>ThreadPool</th>
<th rowspan=2>Open Issues</th>
</tr>
<tr>
<th colspan=2 align=center>Build Systems</th>
<th colspan=2 align=center>Lang/Lib Levels (Tested)</th>
<th colspan=6 align=center>Low-Level Transports</th>
<th colspan=3 align=center>Transport Wrappers</th>
<th colspan=4 align=center>Protocols</th>
<th colspan=5 align=center>Servers</th>
</tr>
</tfoot>
</table>
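The column groups in the matrix above correspond to the pieces you compose in code: a low-level transport, optional transport wrappers, a protocol, and (on the server side) a server loop. As a rough illustration, the Go sketch below pairs a socket transport with the framed wrapper, the binary protocol, and the simple server. It assumes the pre-0.14 Go library API; the processor variable is a placeholder for whatever the Thrift compiler generates from your .thrift file (the `tutorial.NewCalculatorProcessor` name in the comment is hypothetical).

```go
package main

import (
	"log"

	"github.com/apache/thrift/lib/go/thrift"
)

func main() {
	// Low-level transport: listen on a plain TCP socket ("Socket" column).
	listener, err := thrift.NewTServerSocket("0.0.0.0:9090")
	if err != nil {
		log.Fatalf("creating server socket: %v", err)
	}

	// Transport wrapper ("Framed") and protocol ("Binary") factories.
	transportFactory := thrift.NewTFramedTransportFactory(thrift.NewTTransportFactory())
	protocolFactory := thrift.NewTBinaryProtocolFactoryDefault()

	// The processor comes from generated code, e.g. a hypothetical
	// tutorial.NewCalculatorProcessor(handler); it is left nil in this sketch.
	var processor thrift.TProcessor

	// "Simple" server from the Servers column; other variants (threaded,
	// thread pool, forking, nonblocking) exist depending on the language library.
	server := thrift.NewTSimpleServer4(processor, listener, transportFactory, protocolFactory)
	log.Fatal(server.Serve())
}
```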


@@ -1,239 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
--------------------------------------------------
SOFTWARE DISTRIBUTED WITH THRIFT:
The Apache Thrift software includes a number of subcomponents with
separate copyright notices and license terms. Your use of the source
code for the these subcomponents is subject to the terms and
conditions of the following licenses.
--------------------------------------------------
Portions of the following files are licensed under the MIT License:
lib/erl/src/Makefile.am
Please see doc/otp-base-license.txt for the full terms of this license.
--------------------------------------------------
For the aclocal/ax_boost_base.m4 and contrib/fb303/aclocal/ax_boost_base.m4 components:
# Copyright (c) 2007 Thomas Porschberg <thomas@randspringer.de>
#
# Copying and distribution of this file, with or without
# modification, are permitted in any medium without royalty provided
# the copyright notice and this notice are preserved.
--------------------------------------------------
For the lib/nodejs/lib/thrift/json_parse.js:
/*
json_parse.js
2015-05-02
Public Domain.
NO WARRANTY EXPRESSED OR IMPLIED. USE AT YOUR OWN RISK.
*/
(By Douglas Crockford <douglas@crockford.com>)
--------------------------------------------------


@@ -1,135 +0,0 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
ACLOCAL_AMFLAGS = -I ./aclocal
if WITH_PLUGIN
# To enable bootstrap, build order is lib/cpp -> compiler -> others
SUBDIRS = lib/cpp compiler/cpp lib
if WITH_TESTS
SUBDIRS += lib/cpp/test
endif
else
SUBDIRS = compiler/cpp lib
endif
if WITH_TESTS
SUBDIRS += test
endif
if WITH_TUTORIAL
SUBDIRS += tutorial
endif
dist-hook:
find $(distdir) -type f \( -iname ".DS_Store" -or -iname "._*" -or -iname ".gitignore" \) | xargs rm -rf
find $(distdir) -type d \( -iname ".deps" -or -iname ".libs" \) | xargs rm -rf
find $(distdir) -type d \( -iname ".svn" -or -iname ".git" \) | xargs rm -rf
print-version:
@echo $(PACKAGE_VERSION)
.PHONY: precross cross
precross-%: all
$(MAKE) -C $* precross
precross: all precross-test precross-lib
empty :=
space := $(empty) $(empty)
comma := ,
CROSS_LANGS = @MAYBE_CPP@ @MAYBE_C_GLIB@ @MAYBE_CL@ @MAYBE_D@ @MAYBE_JAVA@ @MAYBE_CSHARP@ @MAYBE_PYTHON@ @MAYBE_PY3@ @MAYBE_RUBY@ @MAYBE_HASKELL@ @MAYBE_PERL@ @MAYBE_PHP@ @MAYBE_GO@ @MAYBE_NODEJS@ @MAYBE_DART@ @MAYBE_ERLANG@ @MAYBE_LUA@ @MAYBE_RS@ @MAYBE_DOTNETCORE@ @MAYBE_NODETS@
CROSS_LANGS_COMMA_SEPARATED = $(subst $(space),$(comma),$(CROSS_LANGS))
if WITH_PY3
CROSS_PY=$(PYTHON3)
else
CROSS_PY=$(PYTHON)
endif
if WITH_PYTHON
crossfeature: precross
$(CROSS_PY) test/test.py --retry-count 5 --features .* --skip-known-failures --server $(CROSS_LANGS_COMMA_SEPARATED)
else
# feature test needs python build
crossfeature:
endif
cross-%: precross crossfeature
$(CROSS_PY) test/test.py --retry-count 5 --skip-known-failures --server $(CROSS_LANGS_COMMA_SEPARATED) --client $(CROSS_LANGS_COMMA_SEPARATED) --regex "$*"
cross: cross-.*
TIMES = 1 2 3
fail: precross
$(CROSS_PY) test/test.py || true
$(CROSS_PY) test/test.py --update-expected-failures=overwrite
$(foreach var,$(TIMES),test/test.py -s || true;test/test.py --update-expected-failures=merge;)
codespell_skip_files = \
*.jar \
*.class \
*.so \
*.a \
*.la \
*.o \
*.p12 \
*OCamlMakefile \
.keystore \
.truststore \
CHANGES \
config.sub \
configure \
depcomp \
libtool.m4 \
output.* \
rebar \
thrift
skipped_files = $(subst $(space),$(comma),$(codespell_skip_files))
style-local:
codespell --write-changes --skip=$(skipped_files) --disable-colors
EXTRA_DIST = \
.clang-format \
.editorconfig \
.travis.yml \
.rustfmt.toml \
.dockerignore \
appveyor.yml \
bower.json \
build \
bootstrap.sh \
cleanup.sh \
CMakeLists.txt \
composer.json \
contrib \
CONTRIBUTING.md \
debian \
doc \
doap.rdf \
package.json \
sonar-project.properties \
LANGUAGES.md \
LICENSE \
CHANGES \
NOTICE \
README.md \
Thrift.podspec


@@ -1,5 +0,0 @@
Apache Thrift
Copyright 2006-2017 The Apache Software Foundation.
This product includes software developed at
The Apache Software Foundation (http://www.apache.org/).


@@ -1,192 +0,0 @@
Apache Thrift
=============
Introduction
============
Thrift is a lightweight, language-independent software stack with an
associated code generation mechanism for point-to-point RPC. Thrift provides
clean abstractions for data transport, data serialization, and application
level processing. The code generation system takes a simple definition
language as input and generates code across programming languages that
uses the abstracted stack to build interoperable RPC clients and servers.
![Apache Thrift Layered Architecture](doc/images/thrift-layers.png)
Thrift makes it easy for programs written in different programming
languages to share data and call remote procedures. With support
for [25 programming languages](LANGUAGES.md), chances are Thrift
supports the languages that you currently use.
Thrift is specifically designed to support non-atomic version changes
across client and server code.
For more details on Thrift's design and implementation, see the Thrift
whitepaper included in this distribution, or the README.md file in your
particular subdirectory of interest.
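As a concrete illustration of that layered stack, here is a minimal Go sketch of wiring up a client: a socket transport, a framing wrapper, and the binary protocol, handed to a generic RPC client that a generated service stub would wrap. It assumes the pre-0.14 Go library API; the `tutorial.NewCalculatorClient` name mentioned in the comment is hypothetical and stands in for whatever the compiler generates from your .thrift definition.

```go
package main

import (
	"log"

	"github.com/apache/thrift/lib/go/thrift"
)

func main() {
	// Low-level transport: a plain TCP socket to the server.
	sock, err := thrift.NewTSocket("localhost:9090")
	if err != nil {
		log.Fatalf("creating socket: %v", err)
	}

	// Transport wrapper: frame each message (pairs with a framed/nonblocking server).
	trans := thrift.NewTFramedTransport(sock)

	// Protocol: binary serialization on top of the framed transport.
	proto := thrift.NewTBinaryProtocolTransport(trans)

	if err := trans.Open(); err != nil {
		log.Fatalf("opening transport: %v", err)
	}
	defer trans.Close()

	// Generic RPC client; a stub generated from your .thrift file
	// (e.g. a hypothetical tutorial.NewCalculatorClient(client)) wraps this
	// with strongly typed methods.
	client := thrift.NewTStandardClient(proto, proto)
	_ = client
}
```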
Status
======
| Branch | Travis | Appveyor | Coverity Scan | codecov.io | Website |
| :----- | :----- | :------- | :------------ | :--------- | :------ |
| [`master`](https://github.com/apache/thrift/tree/master) | [![Build Status](https://travis-ci.org/apache/thrift.svg?branch=master)](https://travis-ci.org/apache/thrift) | [![Build status](https://ci.appveyor.com/api/projects/status/github/apache/thrift?branch=master&svg=true)](https://ci.appveyor.com/project/ApacheSoftwareFoundation/thrift/history) | [![Coverity Scan Build Status](https://scan.coverity.com/projects/1345/badge.svg)](https://scan.coverity.com/projects/thrift) | | [![Website](https://img.shields.io/badge/official-website-brightgreen.svg)](https://thrift.apache.org/) |
Releases
========
Thrift does not maintain a specific release calendar at this time.
We strive to release twice yearly. Download the [current release](http://thrift.apache.org/download).
License
=======
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
Project Hierarchy
=================
thrift/
compiler/
Contains the Thrift compiler, implemented in C++.
lib/
Contains the Thrift software library implementation, subdivided by
language of implementation.
cpp/
go/
java/
php/
py/
rb/
...
test/
Contains sample Thrift files and test code across the target programming
languages.
tutorial/
Contains a basic tutorial that will teach you how to develop software
using Thrift.
Development
===========
To build the project the same way Travis CI builds it, you should use Docker.
We have [comprehensive building instructions for docker](build/docker/README.md).
Requirements
============
See http://thrift.apache.org/docs/install for a list of build requirements (may be stale). Alternatively see the docker build environments for a list of prerequisites.
Resources
=========
More information about Thrift can be obtained on the Thrift webpage at:
http://thrift.apache.org
Acknowledgments
===============
Thrift was inspired by pillar, a lightweight RPC tool written by Adam D'Angelo,
and also by Google's protocol buffers.
Installation
============
If you are building for the first time out of the source repository, you will
need to generate the configure scripts. (This is not necessary if you
downloaded a tarball.) From the top directory, do:
./bootstrap.sh
Once the configure scripts are generated, thrift can be configured.
From the top directory, do:
./configure
You may need to specify the location of the boost files explicitly.
If you installed boost in /usr/local, you would run configure as follows:
./configure --with-boost=/usr/local
Note that by default the Thrift C++ library is typically built with debugging
symbols included. If you want to customize these options, use the
CXXFLAGS option in configure, for example:
./configure CXXFLAGS='-g -O2'
./configure CFLAGS='-g -O2'
./configure CPPFLAGS='-DDEBUG_MY_FEATURE'
To enable gcov coverage, which requires the -fprofile-arcs and -ftest-coverage compiler options, run:
./configure --enable-coverage
Run ./configure --help to see other configuration options
Please be aware that the Python library will ignore the --prefix option
and just install wherever Python's distutils puts it (usually along
the lines of /usr/lib/pythonX.Y/site-packages/). If you need to control
where the Python modules are installed, set the PY_PREFIX variable.
(DESTDIR is respected for Python and C++.)
Make thrift:
make
From the top directory, become superuser and do:
make install
Note that some language packages must be installed manually using build tools
better suited to those languages (at the time of this writing, this applies
to Java, Ruby, PHP).
Look for the README.md file in the lib/<language>/ folder for more details on the
installation of each language library package.
Testing
=======
There are a large number of client library tests that can all be run
from the top-level directory.
make -k check
This will make all of the libraries (as necessary), and run through
the unit tests defined in each of the client libraries. If a single
language fails, the make check will continue on and provide a synopsis
at the end.
To run the cross-language test suite, please run:
make cross
This will run a set of tests that use different language clients and
servers.


@@ -1,16 +0,0 @@
Pod::Spec.new do |s|
s.name = "Thrift-swift3"
s.version = "0.12.0"
s.summary = "Apache Thrift is a lightweight, language-independent software stack with an associated code generation mechanism for RPC."
s.description = <<-DESC
The Apache Thrift software framework, for scalable cross-language services development, combines a software stack with a code generation engine to build services that work efficiently and seamlessly between C++, Java, Python, PHP, Ruby, Erlang, Perl, Haskell, C#, Cocoa, JavaScript, Node.js, Smalltalk, OCaml and Delphi and other languages.
DESC
s.homepage = "http://thrift.apache.org"
s.license = { :type => 'Apache License, Version 2.0', :url => 'https://www.apache.org/licenses/LICENSE-2.0' }
s.author = { "Apache Thrift Developers" => "dev@thrift.apache.org" }
s.ios.deployment_target = '9.0'
s.osx.deployment_target = '10.10'
s.requires_arc = true
s.source = { :git => "https://github.com/apache/thrift.git", :tag => "0.12.0" }
s.source_files = "lib/swift/Sources/*.swift"
end


@@ -1,18 +0,0 @@
Pod::Spec.new do |s|
s.name = "Thrift"
s.version = "0.12.0"
s.summary = "Apache Thrift is a lightweight, language-independent software stack with an associated code generation mechanism for RPC."
s.description = <<-DESC
The Apache Thrift software framework, for scalable cross-language services development, combines a software stack with a code generation engine to build services that work efficiently and seamlessly between C++, Java, Python, PHP, Ruby, Erlang, Perl, Haskell, C#, Cocoa, JavaScript, Node.js, Smalltalk, OCaml and Delphi and other languages.
DESC
s.homepage = "http://thrift.apache.org"
s.license = { :type => 'Apache License, Version 2.0', :url => 'https://www.apache.org/licenses/LICENSE-2.0' }
s.author = { "Apache Thrift Developers" => "dev@thrift.apache.org" }
s.requires_arc = true
s.ios.deployment_target = '7.0'
s.osx.deployment_target = '10.8'
s.ios.framework = 'CFNetwork'
s.osx.framework = 'CoreServices'
s.source = { :git => "https://github.com/apache/thrift.git", :tag => "0.12.0" }
s.source_files = 'lib/cocoa/src/**/*.{h,m,swift}'
end


@@ -1,54 +0,0 @@
dnl
dnl Check Bison version
dnl AC_PROG_BISON([MIN_VERSION=2.4])
dnl
dnl Will define BISON_USE_PARSER_H_EXTENSION if Automake is < 1.11
dnl for use with .h includes.
dnl
AC_DEFUN([AC_PROG_BISON], [
if test "x$1" = "x" ; then
bison_required_version="2.4"
else
bison_required_version="$1"
fi
AC_CHECK_PROG(have_prog_bison, [bison], [yes],[no])
AC_DEFINE_UNQUOTED([BISON_VERSION], [0.0], [Bison version if bison is not available])
#Do not use *.h extension for parser header files, use newer *.hh
bison_use_parser_h_extension=false
if test "$have_prog_bison" = "yes" ; then
AC_MSG_CHECKING([for bison version >= $bison_required_version])
bison_version=`bison --version | head -n 1 | cut '-d ' -f 4`
AC_DEFINE_UNQUOTED([BISON_VERSION], [$bison_version], [Defines bison version])
if test "$bison_version" \< "$bison_required_version" ; then
BISON=:
AC_MSG_RESULT([no])
AC_MSG_ERROR([Bison version $bison_required_version or higher must be installed on the system!])
else
AC_MSG_RESULT([yes])
BISON=bison
AC_SUBST(BISON)
#Verify automake version 1.11 headers for yy files are .h, > 1.12 uses .hh
automake_version=`automake --version | head -n 1 | cut '-d ' -f 4`
AC_DEFINE_UNQUOTED([AUTOMAKE_VERSION], [$automake_version], [Defines automake version])
if test "$automake_version" \< "1.12" ; then
#Use *.h extension for parser header file
bison_use_parser_h_extension=true
echo "Automake version < 1.12"
AC_DEFINE([BISON_USE_PARSER_H_EXTENSION], [1], [Use *.h extension for parser header file])
fi
fi
else
BISON=:
AC_MSG_RESULT([NO])
fi
AM_CONDITIONAL([BISON_USE_PARSER_H_EXTENSION], [test x$bison_use_parser_h_extension = xtrue])
AC_SUBST(BISON)
])


@@ -1,301 +0,0 @@
# ===========================================================================
# https://www.gnu.org/software/autoconf-archive/ax_boost_base.html
# ===========================================================================
#
# SYNOPSIS
#
# AX_BOOST_BASE([MINIMUM-VERSION], [ACTION-IF-FOUND], [ACTION-IF-NOT-FOUND])
#
# DESCRIPTION
#
# Test for the Boost C++ libraries of a particular version (or newer)
#
# If no path to the installed boost library is given the macro searches
# under /usr, /usr/local, /opt and /opt/local and evaluates the
# $BOOST_ROOT environment variable. Further documentation is available at
# <http://randspringer.de/boost/index.html>.
#
# This macro calls:
#
# AC_SUBST(BOOST_CPPFLAGS) / AC_SUBST(BOOST_LDFLAGS)
#
# And sets:
#
# HAVE_BOOST
#
# LICENSE
#
# Copyright (c) 2008 Thomas Porschberg <thomas@randspringer.de>
# Copyright (c) 2009 Peter Adolphs
#
# Copying and distribution of this file, with or without modification, are
# permitted in any medium without royalty provided the copyright notice
# and this notice are preserved. This file is offered as-is, without any
# warranty.
#serial 45
# example boost program (need to pass version)
m4_define([_AX_BOOST_BASE_PROGRAM],
[AC_LANG_PROGRAM([[
#include <boost/version.hpp>
]],[[
(void) ((void)sizeof(char[1 - 2*!!((BOOST_VERSION) < ($1))]));
]])])
AC_DEFUN([AX_BOOST_BASE],
[
AC_ARG_WITH([boost],
[AS_HELP_STRING([--with-boost@<:@=ARG@:>@],
[use Boost library from a standard location (ARG=yes),
from the specified location (ARG=<path>),
or disable it (ARG=no)
@<:@ARG=yes@:>@ ])],
[
AS_CASE([$withval],
[no],[want_boost="no";_AX_BOOST_BASE_boost_path=""],
[yes],[want_boost="yes";_AX_BOOST_BASE_boost_path=""],
[want_boost="yes";_AX_BOOST_BASE_boost_path="$withval"])
],
[want_boost="yes"])
AC_ARG_WITH([boost-libdir],
[AS_HELP_STRING([--with-boost-libdir=LIB_DIR],
[Force given directory for boost libraries.
Note that this will override library path detection,
so use this parameter only if default library detection fails
and you know exactly where your boost libraries are located.])],
[
AS_IF([test -d "$withval"],
[_AX_BOOST_BASE_boost_lib_path="$withval"],
[AC_MSG_ERROR([--with-boost-libdir expected directory name])])
],
[_AX_BOOST_BASE_boost_lib_path=""])
BOOST_LDFLAGS=""
BOOST_CPPFLAGS=""
AS_IF([test "x$want_boost" = "xyes"],
[_AX_BOOST_BASE_RUNDETECT([$1],[$2],[$3])])
AC_SUBST(BOOST_CPPFLAGS)
AC_SUBST(BOOST_LDFLAGS)
])
# convert a version string in $2 to numeric and affect to polymorphic var $1
AC_DEFUN([_AX_BOOST_BASE_TONUMERICVERSION],[
AS_IF([test "x$2" = "x"],[_AX_BOOST_BASE_TONUMERICVERSION_req="1.20.0"],[_AX_BOOST_BASE_TONUMERICVERSION_req="$2"])
_AX_BOOST_BASE_TONUMERICVERSION_req_shorten=`expr $_AX_BOOST_BASE_TONUMERICVERSION_req : '\([[0-9]]*\.[[0-9]]*\)'`
_AX_BOOST_BASE_TONUMERICVERSION_req_major=`expr $_AX_BOOST_BASE_TONUMERICVERSION_req : '\([[0-9]]*\)'`
AS_IF([test "x$_AX_BOOST_BASE_TONUMERICVERSION_req_major" = "x"],
[AC_MSG_ERROR([You should at least specify libboost major version])])
_AX_BOOST_BASE_TONUMERICVERSION_req_minor=`expr $_AX_BOOST_BASE_TONUMERICVERSION_req : '[[0-9]]*\.\([[0-9]]*\)'`
AS_IF([test "x$_AX_BOOST_BASE_TONUMERICVERSION_req_minor" = "x"],
[_AX_BOOST_BASE_TONUMERICVERSION_req_minor="0"])
_AX_BOOST_BASE_TONUMERICVERSION_req_sub_minor=`expr $_AX_BOOST_BASE_TONUMERICVERSION_req : '[[0-9]]*\.[[0-9]]*\.\([[0-9]]*\)'`
AS_IF([test "X$_AX_BOOST_BASE_TONUMERICVERSION_req_sub_minor" = "X"],
[_AX_BOOST_BASE_TONUMERICVERSION_req_sub_minor="0"])
_AX_BOOST_BASE_TONUMERICVERSION_RET=`expr $_AX_BOOST_BASE_TONUMERICVERSION_req_major \* 100000 \+ $_AX_BOOST_BASE_TONUMERICVERSION_req_minor \* 100 \+ $_AX_BOOST_BASE_TONUMERICVERSION_req_sub_minor`
AS_VAR_SET($1,$_AX_BOOST_BASE_TONUMERICVERSION_RET)
])
dnl The detection of boost should only be run if $want_boost is "yes"
AC_DEFUN([_AX_BOOST_BASE_RUNDETECT],[
_AX_BOOST_BASE_TONUMERICVERSION(WANT_BOOST_VERSION,[$1])
succeeded=no
AC_REQUIRE([AC_CANONICAL_HOST])
dnl On 64-bit systems check for system libraries in both lib64 and lib.
dnl The former is specified by FHS, but e.g. Debian does not adhere to
dnl this (as it raises problems for generic multi-arch support).
dnl The last entry in the list is chosen by default when no libraries
dnl are found, e.g. when only header-only libraries are installed!
AS_CASE([${host_cpu}],
[x86_64],[libsubdirs="lib64 libx32 lib lib64"],
[ppc64|powerpc64|s390x|sparc64|aarch64|ppc64le|powerpc64le|riscv64],[libsubdirs="lib64 lib lib64"],
[libsubdirs="lib"]
)
dnl allow for real multi-arch paths e.g. /usr/lib/x86_64-linux-gnu. Give
dnl them priority over the other paths since, if libs are found there, they
dnl are almost assuredly the ones desired.
AS_CASE([${host_cpu}],
[i?86],[multiarch_libsubdir="lib/i386-${host_os}"],
[multiarch_libsubdir="lib/${host_cpu}-${host_os}"]
)
dnl first we check the system location for boost libraries
dnl this location is chosen if boost libraries are installed with the --layout=system option
dnl or if you install boost with RPM
AS_IF([test "x$_AX_BOOST_BASE_boost_path" != "x"],[
AC_MSG_CHECKING([for boostlib >= $1 ($WANT_BOOST_VERSION) includes in "$_AX_BOOST_BASE_boost_path/include"])
AS_IF([test -d "$_AX_BOOST_BASE_boost_path/include" && test -r "$_AX_BOOST_BASE_boost_path/include"],[
AC_MSG_RESULT([yes])
BOOST_CPPFLAGS="-I$_AX_BOOST_BASE_boost_path/include"
for _AX_BOOST_BASE_boost_path_tmp in $multiarch_libsubdir $libsubdirs; do
AC_MSG_CHECKING([for boostlib >= $1 ($WANT_BOOST_VERSION) lib path in "$_AX_BOOST_BASE_boost_path/$_AX_BOOST_BASE_boost_path_tmp"])
AS_IF([test -d "$_AX_BOOST_BASE_boost_path/$_AX_BOOST_BASE_boost_path_tmp" && test -r "$_AX_BOOST_BASE_boost_path/$_AX_BOOST_BASE_boost_path_tmp" ],[
AC_MSG_RESULT([yes])
BOOST_LDFLAGS="-L$_AX_BOOST_BASE_boost_path/$_AX_BOOST_BASE_boost_path_tmp";
break;
],
[AC_MSG_RESULT([no])])
done],[
AC_MSG_RESULT([no])])
],[
if test X"$cross_compiling" = Xyes; then
search_libsubdirs=$multiarch_libsubdir
else
search_libsubdirs="$multiarch_libsubdir $libsubdirs"
fi
for _AX_BOOST_BASE_boost_path_tmp in /usr /usr/local /opt /opt/local ; do
if test -d "$_AX_BOOST_BASE_boost_path_tmp/include/boost" && test -r "$_AX_BOOST_BASE_boost_path_tmp/include/boost" ; then
for libsubdir in $search_libsubdirs ; do
if ls "$_AX_BOOST_BASE_boost_path_tmp/$libsubdir/libboost_"* >/dev/null 2>&1 ; then break; fi
done
BOOST_LDFLAGS="-L$_AX_BOOST_BASE_boost_path_tmp/$libsubdir"
BOOST_CPPFLAGS="-I$_AX_BOOST_BASE_boost_path_tmp/include"
break;
fi
done
])
dnl overwrite ld flags if we have required special directory with
dnl --with-boost-libdir parameter
AS_IF([test "x$_AX_BOOST_BASE_boost_lib_path" != "x"],
[BOOST_LDFLAGS="-L$_AX_BOOST_BASE_boost_lib_path"])
AC_MSG_CHECKING([for boostlib >= $1 ($WANT_BOOST_VERSION)])
CPPFLAGS_SAVED="$CPPFLAGS"
CPPFLAGS="$CPPFLAGS $BOOST_CPPFLAGS"
export CPPFLAGS
LDFLAGS_SAVED="$LDFLAGS"
LDFLAGS="$LDFLAGS $BOOST_LDFLAGS"
export LDFLAGS
AC_REQUIRE([AC_PROG_CXX])
AC_LANG_PUSH(C++)
AC_COMPILE_IFELSE([_AX_BOOST_BASE_PROGRAM($WANT_BOOST_VERSION)],[
AC_MSG_RESULT(yes)
succeeded=yes
found_system=yes
],[
])
AC_LANG_POP([C++])
dnl if we found no boost with system layout we search for boost libraries
dnl built and installed without the --layout=system option or for a staged (not installed) version
if test "x$succeeded" != "xyes" ; then
CPPFLAGS="$CPPFLAGS_SAVED"
LDFLAGS="$LDFLAGS_SAVED"
BOOST_CPPFLAGS=
if test -z "$_AX_BOOST_BASE_boost_lib_path" ; then
BOOST_LDFLAGS=
fi
_version=0
if test -n "$_AX_BOOST_BASE_boost_path" ; then
if test -d "$_AX_BOOST_BASE_boost_path" && test -r "$_AX_BOOST_BASE_boost_path"; then
for i in `ls -d $_AX_BOOST_BASE_boost_path/include/boost-* 2>/dev/null`; do
_version_tmp=`echo $i | sed "s#$_AX_BOOST_BASE_boost_path##" | sed 's/\/include\/boost-//' | sed 's/_/./'`
V_CHECK=`expr $_version_tmp \> $_version`
if test "x$V_CHECK" = "x1" ; then
_version=$_version_tmp
fi
VERSION_UNDERSCORE=`echo $_version | sed 's/\./_/'`
BOOST_CPPFLAGS="-I$_AX_BOOST_BASE_boost_path/include/boost-$VERSION_UNDERSCORE"
done
dnl if nothing found search for layout used in Windows distributions
if test -z "$BOOST_CPPFLAGS"; then
if test -d "$_AX_BOOST_BASE_boost_path/boost" && test -r "$_AX_BOOST_BASE_boost_path/boost"; then
BOOST_CPPFLAGS="-I$_AX_BOOST_BASE_boost_path"
fi
fi
dnl if we found something and BOOST_LDFLAGS was unset before
dnl (because "$_AX_BOOST_BASE_boost_lib_path" = ""), set it here.
if test -n "$BOOST_CPPFLAGS" && test -z "$BOOST_LDFLAGS"; then
for libsubdir in $libsubdirs ; do
if ls "$_AX_BOOST_BASE_boost_path/$libsubdir/libboost_"* >/dev/null 2>&1 ; then break; fi
done
BOOST_LDFLAGS="-L$_AX_BOOST_BASE_boost_path/$libsubdir"
fi
fi
else
if test "x$cross_compiling" != "xyes" ; then
for _AX_BOOST_BASE_boost_path in /usr /usr/local /opt /opt/local ; do
if test -d "$_AX_BOOST_BASE_boost_path" && test -r "$_AX_BOOST_BASE_boost_path" ; then
for i in `ls -d $_AX_BOOST_BASE_boost_path/include/boost-* 2>/dev/null`; do
_version_tmp=`echo $i | sed "s#$_AX_BOOST_BASE_boost_path##" | sed 's/\/include\/boost-//' | sed 's/_/./'`
V_CHECK=`expr $_version_tmp \> $_version`
if test "x$V_CHECK" = "x1" ; then
_version=$_version_tmp
best_path=$_AX_BOOST_BASE_boost_path
fi
done
fi
done
VERSION_UNDERSCORE=`echo $_version | sed 's/\./_/'`
BOOST_CPPFLAGS="-I$best_path/include/boost-$VERSION_UNDERSCORE"
if test -z "$_AX_BOOST_BASE_boost_lib_path" ; then
for libsubdir in $libsubdirs ; do
if ls "$best_path/$libsubdir/libboost_"* >/dev/null 2>&1 ; then break; fi
done
BOOST_LDFLAGS="-L$best_path/$libsubdir"
fi
fi
if test -n "$BOOST_ROOT" ; then
for libsubdir in $libsubdirs ; do
if ls "$BOOST_ROOT/stage/$libsubdir/libboost_"* >/dev/null 2>&1 ; then break; fi
done
if test -d "$BOOST_ROOT" && test -r "$BOOST_ROOT" && test -d "$BOOST_ROOT/stage/$libsubdir" && test -r "$BOOST_ROOT/stage/$libsubdir"; then
version_dir=`expr //$BOOST_ROOT : '.*/\(.*\)'`
stage_version=`echo $version_dir | sed 's/boost_//' | sed 's/_/./g'`
stage_version_shorten=`expr $stage_version : '\([[0-9]]*\.[[0-9]]*\)'`
V_CHECK=`expr $stage_version_shorten \>\= $_version`
if test "x$V_CHECK" = "x1" && test -z "$_AX_BOOST_BASE_boost_lib_path" ; then
AC_MSG_NOTICE(We will use a staged boost library from $BOOST_ROOT)
BOOST_CPPFLAGS="-I$BOOST_ROOT"
BOOST_LDFLAGS="-L$BOOST_ROOT/stage/$libsubdir"
fi
fi
fi
fi
CPPFLAGS="$CPPFLAGS $BOOST_CPPFLAGS"
export CPPFLAGS
LDFLAGS="$LDFLAGS $BOOST_LDFLAGS"
export LDFLAGS
AC_LANG_PUSH(C++)
AC_COMPILE_IFELSE([_AX_BOOST_BASE_PROGRAM($WANT_BOOST_VERSION)],[
AC_MSG_RESULT(yes)
succeeded=yes
found_system=yes
],[
])
AC_LANG_POP([C++])
fi
if test "x$succeeded" != "xyes" ; then
if test "x$_version" = "x0" ; then
AC_MSG_NOTICE([[We could not detect the boost libraries (version $1 or higher). If you have a staged boost library (still not installed) please specify \$BOOST_ROOT in your environment and do not give a PATH to --with-boost option. If you are sure you have boost installed, then check your version number looking in <boost/version.hpp>. See http://randspringer.de/boost for more documentation.]])
else
AC_MSG_NOTICE([Your boost libraries seem to be too old (version $_version).])
fi
# execute ACTION-IF-NOT-FOUND (if present):
ifelse([$3], , :, [$3])
else
AC_DEFINE(HAVE_BOOST,,[define if the Boost library is available])
# execute ACTION-IF-FOUND (if present):
ifelse([$2], , :, [$2])
fi
CPPFLAGS="$CPPFLAGS_SAVED"
LDFLAGS="$LDFLAGS_SAVED"
])

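For reference, a minimal configure.ac sketch of how the removed AX_BOOST_BASE macro is typically invoked; the 1.56.0 minimum version and the error text are illustrative assumptions, not taken from this project:

    dnl Require Boost headers and pick up the flags the macro substitutes.
    AX_BOOST_BASE([1.56.0], [],
                  [AC_MSG_ERROR([Boost >= 1.56.0 is required for the C++ library])])
    CPPFLAGS="$CPPFLAGS $BOOST_CPPFLAGS"
    LDFLAGS="$LDFLAGS $BOOST_LDFLAGS"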

@ -1,124 +0,0 @@
# ===========================================================================
# https://www.gnu.org/software/autoconf-archive/ax_check_openssl.html
# ===========================================================================
#
# SYNOPSIS
#
# AX_CHECK_OPENSSL([action-if-found[, action-if-not-found]])
#
# DESCRIPTION
#
# Look for OpenSSL in a number of default spots, or in a user-selected
# spot (via --with-openssl). Sets
#
# OPENSSL_INCLUDES to the include directives required
# OPENSSL_LIBS to the -l directives required
# OPENSSL_LDFLAGS to the -L or -R flags required
#
# and calls ACTION-IF-FOUND or ACTION-IF-NOT-FOUND appropriately
#
# This macro sets OPENSSL_INCLUDES such that source files should use the
# openssl/ directory in include directives:
#
# #include <openssl/hmac.h>
#
# LICENSE
#
# Copyright (c) 2009,2010 Zmanda Inc. <http://www.zmanda.com/>
# Copyright (c) 2009,2010 Dustin J. Mitchell <dustin@zmanda.com>
#
# Copying and distribution of this file, with or without modification, are
# permitted in any medium without royalty provided the copyright notice
# and this notice are preserved. This file is offered as-is, without any
# warranty.
#serial 10
AU_ALIAS([CHECK_SSL], [AX_CHECK_OPENSSL])
AC_DEFUN([AX_CHECK_OPENSSL], [
found=false
AC_ARG_WITH([openssl],
[AS_HELP_STRING([--with-openssl=DIR],
[root of the OpenSSL directory])],
[
case "$withval" in
"" | y | ye | yes | n | no)
AC_MSG_ERROR([Invalid --with-openssl value])
;;
*) ssldirs="$withval"
;;
esac
], [
# if pkg-config is installed and openssl has installed a .pc file,
# then use that information and don't search ssldirs
AC_CHECK_TOOL([PKG_CONFIG], [pkg-config])
if test x"$PKG_CONFIG" != x""; then
OPENSSL_LDFLAGS=`$PKG_CONFIG openssl --libs-only-L 2>/dev/null`
if test $? = 0; then
OPENSSL_LIBS=`$PKG_CONFIG openssl --libs-only-l 2>/dev/null`
OPENSSL_INCLUDES=`$PKG_CONFIG openssl --cflags-only-I 2>/dev/null`
found=true
fi
fi
# no such luck; use some default ssldirs
if ! $found; then
ssldirs="/usr/local/ssl /usr/lib/ssl /usr/ssl /usr/pkg /usr/local /usr"
fi
]
)
# note that we #include <openssl/foo.h>, so the OpenSSL headers have to be in
# an 'openssl' subdirectory
if ! $found; then
OPENSSL_INCLUDES=
for ssldir in $ssldirs; do
AC_MSG_CHECKING([for openssl/ssl.h in $ssldir])
if test -f "$ssldir/include/openssl/ssl.h"; then
OPENSSL_INCLUDES="-I$ssldir/include"
OPENSSL_LDFLAGS="-L$ssldir/lib"
OPENSSL_LIBS="-lssl -lcrypto"
found=true
AC_MSG_RESULT([yes])
break
else
AC_MSG_RESULT([no])
fi
done
# if the file wasn't found, well, go ahead and try the link anyway -- maybe
# it will just work!
fi
# try the preprocessor and linker with our new flags,
# being careful not to pollute the global LIBS, LDFLAGS, and CPPFLAGS
AC_MSG_CHECKING([whether compiling and linking against OpenSSL works])
echo "Trying link with OPENSSL_LDFLAGS=$OPENSSL_LDFLAGS;" \
"OPENSSL_LIBS=$OPENSSL_LIBS; OPENSSL_INCLUDES=$OPENSSL_INCLUDES" >&AS_MESSAGE_LOG_FD
save_LIBS="$LIBS"
save_LDFLAGS="$LDFLAGS"
save_CPPFLAGS="$CPPFLAGS"
LDFLAGS="$LDFLAGS $OPENSSL_LDFLAGS"
LIBS="$OPENSSL_LIBS $LIBS"
CPPFLAGS="$OPENSSL_INCLUDES $CPPFLAGS"
AC_LINK_IFELSE(
[AC_LANG_PROGRAM([#include <openssl/ssl.h>], [SSL_new(NULL)])],
[
AC_MSG_RESULT([yes])
$1
], [
AC_MSG_RESULT([no])
$2
])
CPPFLAGS="$save_CPPFLAGS"
LDFLAGS="$save_LDFLAGS"
LIBS="$save_LIBS"
AC_SUBST([OPENSSL_INCLUDES])
AC_SUBST([OPENSSL_LIBS])
AC_SUBST([OPENSSL_LDFLAGS])
])

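A hedged usage sketch for the removed AX_CHECK_OPENSSL macro; the have_openssl variable and the WITH_OPENSSL conditional are illustrative names, not part of this project:

    dnl Record whether OpenSSL was found and expose the result to the Makefiles.
    AX_CHECK_OPENSSL([have_openssl=yes],
                     [have_openssl=no; AC_MSG_WARN([OpenSSL not found; TLS support disabled])])
    AM_CONDITIONAL([WITH_OPENSSL], [test "x$have_openssl" = "xyes"])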

@ -1,177 +0,0 @@
# ===========================================================================
# https://www.gnu.org/software/autoconf-archive/ax_compare_version.html
# ===========================================================================
#
# SYNOPSIS
#
# AX_COMPARE_VERSION(VERSION_A, OP, VERSION_B, [ACTION-IF-TRUE], [ACTION-IF-FALSE])
#
# DESCRIPTION
#
# This macro compares two version strings. Due to the varying number of
# minor-version numbers that can exist, and the fact that string
# comparisons are not compatible with numeric comparisons, this is not
# necessarily trivial to do in an autoconf script. This macro makes doing
# these comparisons easy.
#
# The six basic comparisons are available, as well as checking equality
# limited to a certain number of minor-version levels.
#
# The operator OP determines what type of comparison to do, and can be one
# of:
#
# eq - equal (test A == B)
# ne - not equal (test A != B)
# le - less than or equal (test A <= B)
# ge - greater than or equal (test A >= B)
# lt - less than (test A < B)
# gt - greater than (test A > B)
#
# Additionally, the eq and ne operator can have a number after it to limit
# the test to that number of minor versions.
#
# eq0 - equal up to the length of the shorter version
# ne0 - not equal up to the length of the shorter version
# eqN - equal up to N sub-version levels
# neN - not equal up to N sub-version levels
#
# When the condition is true, shell commands ACTION-IF-TRUE are run,
# otherwise shell commands ACTION-IF-FALSE are run. The environment
# variable 'ax_compare_version' is always set to either 'true' or 'false'
# as well.
#
# Examples:
#
# AX_COMPARE_VERSION([3.15.7],[lt],[3.15.8])
# AX_COMPARE_VERSION([3.15],[lt],[3.15.8])
#
# would both be true.
#
# AX_COMPARE_VERSION([3.15.7],[eq],[3.15.8])
# AX_COMPARE_VERSION([3.15],[gt],[3.15.8])
#
# would both be false.
#
# AX_COMPARE_VERSION([3.15.7],[eq2],[3.15.8])
#
# would be true because it is only comparing two minor versions.
#
# AX_COMPARE_VERSION([3.15.7],[eq0],[3.15])
#
# would be true because it is only comparing the lesser number of minor
# versions of the two values.
#
# Note: The characters that separate the version numbers do not matter. An
# empty string is the same as version 0. OP is evaluated by autoconf, not
# configure, so must be a string, not a variable.
#
# The author would like to acknowledge Guido Draheim whose advice about
# the m4_case and m4_ifvaln functions make this macro only include the
# portions necessary to perform the specific comparison specified by the
# OP argument in the final configure script.
#
# LICENSE
#
# Copyright (c) 2008 Tim Toolan <toolan@ele.uri.edu>
#
# Copying and distribution of this file, with or without modification, are
# permitted in any medium without royalty provided the copyright notice
# and this notice are preserved. This file is offered as-is, without any
# warranty.
#serial 12
dnl #########################################################################
AC_DEFUN([AX_COMPARE_VERSION], [
AC_REQUIRE([AC_PROG_AWK])
# Used to indicate true or false condition
ax_compare_version=false
# Convert the two version strings to be compared into a format that
# allows a simple string comparison. The end result is that a version
# string of the form 1.12.5-r617 will be converted to the form
# 0001001200050617. In other words, each number is zero padded to four
# digits, and non digits are removed.
AS_VAR_PUSHDEF([A],[ax_compare_version_A])
A=`echo "$1" | sed -e 's/\([[0-9]]*\)/Z\1Z/g' \
-e 's/Z\([[0-9]]\)Z/Z0\1Z/g' \
-e 's/Z\([[0-9]][[0-9]]\)Z/Z0\1Z/g' \
-e 's/Z\([[0-9]][[0-9]][[0-9]]\)Z/Z0\1Z/g' \
-e 's/[[^0-9]]//g'`
AS_VAR_PUSHDEF([B],[ax_compare_version_B])
B=`echo "$3" | sed -e 's/\([[0-9]]*\)/Z\1Z/g' \
-e 's/Z\([[0-9]]\)Z/Z0\1Z/g' \
-e 's/Z\([[0-9]][[0-9]]\)Z/Z0\1Z/g' \
-e 's/Z\([[0-9]][[0-9]][[0-9]]\)Z/Z0\1Z/g' \
-e 's/[[^0-9]]//g'`
dnl # In the case of le, ge, lt, and gt, the strings are sorted as necessary
dnl # then the first line is used to determine if the condition is true.
dnl # The sed right after the echo is to remove any indented white space.
m4_case(m4_tolower($2),
[lt],[
ax_compare_version=`echo "x$A
x$B" | sed 's/^ *//' | sort -r | sed "s/x${A}/false/;s/x${B}/true/;1q"`
],
[gt],[
ax_compare_version=`echo "x$A
x$B" | sed 's/^ *//' | sort | sed "s/x${A}/false/;s/x${B}/true/;1q"`
],
[le],[
ax_compare_version=`echo "x$A
x$B" | sed 's/^ *//' | sort | sed "s/x${A}/true/;s/x${B}/false/;1q"`
],
[ge],[
ax_compare_version=`echo "x$A
x$B" | sed 's/^ *//' | sort -r | sed "s/x${A}/true/;s/x${B}/false/;1q"`
],[
dnl Split the operator from the subversion count if present.
m4_bmatch(m4_substr($2,2),
[0],[
# A count of zero means use the length of the shorter version.
# Determine the number of characters in A and B.
ax_compare_version_len_A=`echo "$A" | $AWK '{print(length)}'`
ax_compare_version_len_B=`echo "$B" | $AWK '{print(length)}'`
# Set A to no more than B's length and B to no more than A's length.
A=`echo "$A" | sed "s/\(.\{$ax_compare_version_len_B\}\).*/\1/"`
B=`echo "$B" | sed "s/\(.\{$ax_compare_version_len_A\}\).*/\1/"`
],
[[0-9]+],[
# A count greater than zero means use only that many subversions
A=`echo "$A" | sed "s/\(\([[0-9]]\{4\}\)\{m4_substr($2,2)\}\).*/\1/"`
B=`echo "$B" | sed "s/\(\([[0-9]]\{4\}\)\{m4_substr($2,2)\}\).*/\1/"`
],
[.+],[
AC_WARNING(
[illegal OP numeric parameter: $2])
],[])
# Pad zeros at end of numbers to make same length.
ax_compare_version_tmp_A="$A`echo $B | sed 's/./0/g'`"
B="$B`echo $A | sed 's/./0/g'`"
A="$ax_compare_version_tmp_A"
# Check for equality or inequality as necessary.
m4_case(m4_tolower(m4_substr($2,0,2)),
[eq],[
test "x$A" = "x$B" && ax_compare_version=true
],
[ne],[
test "x$A" != "x$B" && ax_compare_version=true
],[
AC_WARNING([illegal OP parameter: $2])
])
])
AS_VAR_POPDEF([A])dnl
AS_VAR_POPDEF([B])dnl
dnl # Execute ACTION-IF-TRUE / ACTION-IF-FALSE.
if test "$ax_compare_version" = "true" ; then
m4_ifvaln([$4],[$4],[:])dnl
m4_ifvaln([$5],[else $5])dnl
fi
]) dnl AX_COMPARE_VERSION

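For context, the bison/automake check earlier in this diff compares dotted version strings with test's lexicographic "\<" operator; AX_COMPARE_VERSION is the macro intended for such comparisons. A sketch that reuses those variable names purely for illustration:

    dnl Decide the parser header extension with a proper version comparison.
    AX_COMPARE_VERSION([$automake_version], [ge], [1.12],
                       [bison_use_parser_h_extension=false],
                       [bison_use_parser_h_extension=true])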

@ -1,948 +0,0 @@
# ===========================================================================
# https://www.gnu.org/software/autoconf-archive/ax_cxx_compile_stdcxx.html
# ===========================================================================
#
# SYNOPSIS
#
# AX_CXX_COMPILE_STDCXX(VERSION, [ext|noext], [mandatory|optional])
#
# DESCRIPTION
#
# Check for baseline language coverage in the compiler for the specified
# version of the C++ standard. If necessary, add switches to CXX and
# CXXCPP to enable support. VERSION may be '11' (for the C++11 standard)
# or '14' (for the C++14 standard).
#
# The second argument, if specified, indicates whether you insist on an
# extended mode (e.g. -std=gnu++11) or a strict conformance mode (e.g.
# -std=c++11). If neither is specified, you get whatever works, with
# preference for an extended mode.
#
# The third argument, if specified 'mandatory' or if left unspecified,
# indicates that baseline support for the specified C++ standard is
# required and that the macro should error out if no mode with that
# support is found. If specified 'optional', then configuration proceeds
# regardless, after defining HAVE_CXX${VERSION} if and only if a
# supporting mode is found.
#
# LICENSE
#
# Copyright (c) 2008 Benjamin Kosnik <bkoz@redhat.com>
# Copyright (c) 2012 Zack Weinberg <zackw@panix.com>
# Copyright (c) 2013 Roy Stogner <roystgnr@ices.utexas.edu>
# Copyright (c) 2014, 2015 Google Inc.; contributed by Alexey Sokolov <sokolov@google.com>
# Copyright (c) 2015 Paul Norman <penorman@mac.com>
# Copyright (c) 2015 Moritz Klammler <moritz@klammler.eu>
# Copyright (c) 2016, 2018 Krzesimir Nowak <qdlacz@gmail.com>
#
# Copying and distribution of this file, with or without modification, are
# permitted in any medium without royalty provided the copyright notice
# and this notice are preserved. This file is offered as-is, without any
# warranty.
#serial 10
dnl This macro is based on the code from the AX_CXX_COMPILE_STDCXX_11 macro
dnl (serial version number 13).
AC_DEFUN([AX_CXX_COMPILE_STDCXX], [dnl
m4_if([$1], [11], [ax_cxx_compile_alternatives="11 0x"],
[$1], [14], [ax_cxx_compile_alternatives="14 1y"],
[$1], [17], [ax_cxx_compile_alternatives="17 1z"],
[m4_fatal([invalid first argument `$1' to AX_CXX_COMPILE_STDCXX])])dnl
m4_if([$2], [], [],
[$2], [ext], [],
[$2], [noext], [],
[m4_fatal([invalid second argument `$2' to AX_CXX_COMPILE_STDCXX])])dnl
m4_if([$3], [], [ax_cxx_compile_cxx$1_required=true],
[$3], [mandatory], [ax_cxx_compile_cxx$1_required=true],
[$3], [optional], [ax_cxx_compile_cxx$1_required=false],
[m4_fatal([invalid third argument `$3' to AX_CXX_COMPILE_STDCXX])])
AC_LANG_PUSH([C++])dnl
ac_success=no
m4_if([$2], [noext], [], [dnl
if test x$ac_success = xno; then
for alternative in ${ax_cxx_compile_alternatives}; do
switch="-std=gnu++${alternative}"
cachevar=AS_TR_SH([ax_cv_cxx_compile_cxx$1_$switch])
AC_CACHE_CHECK(whether $CXX supports C++$1 features with $switch,
$cachevar,
[ac_save_CXX="$CXX"
CXX="$CXX $switch"
AC_COMPILE_IFELSE([AC_LANG_SOURCE([_AX_CXX_COMPILE_STDCXX_testbody_$1])],
[eval $cachevar=yes],
[eval $cachevar=no])
CXX="$ac_save_CXX"])
if eval test x\$$cachevar = xyes; then
CXX="$CXX $switch"
if test -n "$CXXCPP" ; then
CXXCPP="$CXXCPP $switch"
fi
ac_success=yes
break
fi
done
fi])
m4_if([$2], [ext], [], [dnl
if test x$ac_success = xno; then
dnl HP's aCC needs +std=c++11 according to:
dnl http://h21007.www2.hp.com/portal/download/files/unprot/aCxx/PDF_Release_Notes/769149-001.pdf
dnl Cray's crayCC needs "-h std=c++11"
for alternative in ${ax_cxx_compile_alternatives}; do
for switch in -std=c++${alternative} +std=c++${alternative} "-h std=c++${alternative}"; do
cachevar=AS_TR_SH([ax_cv_cxx_compile_cxx$1_$switch])
AC_CACHE_CHECK(whether $CXX supports C++$1 features with $switch,
$cachevar,
[ac_save_CXX="$CXX"
CXX="$CXX $switch"
AC_COMPILE_IFELSE([AC_LANG_SOURCE([_AX_CXX_COMPILE_STDCXX_testbody_$1])],
[eval $cachevar=yes],
[eval $cachevar=no])
CXX="$ac_save_CXX"])
if eval test x\$$cachevar = xyes; then
CXX="$CXX $switch"
if test -n "$CXXCPP" ; then
CXXCPP="$CXXCPP $switch"
fi
ac_success=yes
break
fi
done
if test x$ac_success = xyes; then
break
fi
done
fi])
AC_LANG_POP([C++])
if test x$ax_cxx_compile_cxx$1_required = xtrue; then
if test x$ac_success = xno; then
AC_MSG_ERROR([*** A compiler with support for C++$1 language features is required.])
fi
fi
if test x$ac_success = xno; then
HAVE_CXX$1=0
AC_MSG_NOTICE([No compiler with C++$1 support was found])
else
HAVE_CXX$1=1
AC_DEFINE(HAVE_CXX$1,1,
[define if the compiler supports basic C++$1 syntax])
fi
AC_SUBST(HAVE_CXX$1)
])
dnl Test body for checking C++11 support
m4_define([_AX_CXX_COMPILE_STDCXX_testbody_11],
_AX_CXX_COMPILE_STDCXX_testbody_new_in_11
)
dnl Test body for checking C++14 support
m4_define([_AX_CXX_COMPILE_STDCXX_testbody_14],
_AX_CXX_COMPILE_STDCXX_testbody_new_in_11
_AX_CXX_COMPILE_STDCXX_testbody_new_in_14
)
m4_define([_AX_CXX_COMPILE_STDCXX_testbody_17],
_AX_CXX_COMPILE_STDCXX_testbody_new_in_11
_AX_CXX_COMPILE_STDCXX_testbody_new_in_14
_AX_CXX_COMPILE_STDCXX_testbody_new_in_17
)
dnl Tests for new features in C++11
m4_define([_AX_CXX_COMPILE_STDCXX_testbody_new_in_11], [[
// If the compiler admits that it is not ready for C++11, why torture it?
// Hopefully, this will speed up the test.
#ifndef __cplusplus
#error "This is not a C++ compiler"
#elif __cplusplus < 201103L
#error "This is not a C++11 compiler"
#else
namespace cxx11
{
namespace test_static_assert
{
template <typename T>
struct check
{
static_assert(sizeof(int) <= sizeof(T), "not big enough");
};
}
namespace test_final_override
{
struct Base
{
virtual void f() {}
};
struct Derived : public Base
{
virtual void f() override {}
};
}
namespace test_double_right_angle_brackets
{
template < typename T >
struct check {};
typedef check<void> single_type;
typedef check<check<void>> double_type;
typedef check<check<check<void>>> triple_type;
typedef check<check<check<check<void>>>> quadruple_type;
}
namespace test_decltype
{
int
f()
{
int a = 1;
decltype(a) b = 2;
return a + b;
}
}
namespace test_type_deduction
{
template < typename T1, typename T2 >
struct is_same
{
static const bool value = false;
};
template < typename T >
struct is_same<T, T>
{
static const bool value = true;
};
template < typename T1, typename T2 >
auto
add(T1 a1, T2 a2) -> decltype(a1 + a2)
{
return a1 + a2;
}
int
test(const int c, volatile int v)
{
static_assert(is_same<int, decltype(0)>::value == true, "");
static_assert(is_same<int, decltype(c)>::value == false, "");
static_assert(is_same<int, decltype(v)>::value == false, "");
auto ac = c;
auto av = v;
auto sumi = ac + av + 'x';
auto sumf = ac + av + 1.0;
static_assert(is_same<int, decltype(ac)>::value == true, "");
static_assert(is_same<int, decltype(av)>::value == true, "");
static_assert(is_same<int, decltype(sumi)>::value == true, "");
static_assert(is_same<int, decltype(sumf)>::value == false, "");
static_assert(is_same<int, decltype(add(c, v))>::value == true, "");
return (sumf > 0.0) ? sumi : add(c, v);
}
}
namespace test_noexcept
{
int f() { return 0; }
int g() noexcept { return 0; }
static_assert(noexcept(f()) == false, "");
static_assert(noexcept(g()) == true, "");
}
namespace test_constexpr
{
template < typename CharT >
unsigned long constexpr
strlen_c_r(const CharT *const s, const unsigned long acc) noexcept
{
return *s ? strlen_c_r(s + 1, acc + 1) : acc;
}
template < typename CharT >
unsigned long constexpr
strlen_c(const CharT *const s) noexcept
{
return strlen_c_r(s, 0UL);
}
static_assert(strlen_c("") == 0UL, "");
static_assert(strlen_c("1") == 1UL, "");
static_assert(strlen_c("example") == 7UL, "");
static_assert(strlen_c("another\0example") == 7UL, "");
}
namespace test_rvalue_references
{
template < int N >
struct answer
{
static constexpr int value = N;
};
answer<1> f(int&) { return answer<1>(); }
answer<2> f(const int&) { return answer<2>(); }
answer<3> f(int&&) { return answer<3>(); }
void
test()
{
int i = 0;
const int c = 0;
static_assert(decltype(f(i))::value == 1, "");
static_assert(decltype(f(c))::value == 2, "");
static_assert(decltype(f(0))::value == 3, "");
}
}
namespace test_uniform_initialization
{
struct test
{
static const int zero {};
static const int one {1};
};
static_assert(test::zero == 0, "");
static_assert(test::one == 1, "");
}
namespace test_lambdas
{
void
test1()
{
auto lambda1 = [](){};
auto lambda2 = lambda1;
lambda1();
lambda2();
}
int
test2()
{
auto a = [](int i, int j){ return i + j; }(1, 2);
auto b = []() -> int { return '0'; }();
auto c = [=](){ return a + b; }();
auto d = [&](){ return c; }();
auto e = [a, &b](int x) mutable {
const auto identity = [](int y){ return y; };
for (auto i = 0; i < a; ++i)
a += b--;
return x + identity(a + b);
}(0);
return a + b + c + d + e;
}
int
test3()
{
const auto nullary = [](){ return 0; };
const auto unary = [](int x){ return x; };
using nullary_t = decltype(nullary);
using unary_t = decltype(unary);
const auto higher1st = [](nullary_t f){ return f(); };
const auto higher2nd = [unary](nullary_t f1){
return [unary, f1](unary_t f2){ return f2(unary(f1())); };
};
return higher1st(nullary) + higher2nd(nullary)(unary);
}
}
namespace test_variadic_templates
{
template <int...>
struct sum;
template <int N0, int... N1toN>
struct sum<N0, N1toN...>
{
static constexpr auto value = N0 + sum<N1toN...>::value;
};
template <>
struct sum<>
{
static constexpr auto value = 0;
};
static_assert(sum<>::value == 0, "");
static_assert(sum<1>::value == 1, "");
static_assert(sum<23>::value == 23, "");
static_assert(sum<1, 2>::value == 3, "");
static_assert(sum<5, 5, 11>::value == 21, "");
static_assert(sum<2, 3, 5, 7, 11, 13>::value == 41, "");
}
// http://stackoverflow.com/questions/13728184/template-aliases-and-sfinae
// Clang 3.1 fails with headers of libstd++ 4.8.3 when using std::function
// because of this.
namespace test_template_alias_sfinae
{
struct foo {};
template<typename T>
using member = typename T::member_type;
template<typename T>
void func(...) {}
template<typename T>
void func(member<T>*) {}
void test();
void test() { func<foo>(0); }
}
} // namespace cxx11
#endif // __cplusplus >= 201103L
]])
dnl Tests for new features in C++14
m4_define([_AX_CXX_COMPILE_STDCXX_testbody_new_in_14], [[
// If the compiler admits that it is not ready for C++14, why torture it?
// Hopefully, this will speed up the test.
#ifndef __cplusplus
#error "This is not a C++ compiler"
#elif __cplusplus < 201402L
#error "This is not a C++14 compiler"
#else
namespace cxx14
{
namespace test_polymorphic_lambdas
{
int
test()
{
const auto lambda = [](auto&&... args){
const auto istiny = [](auto x){
return (sizeof(x) == 1UL) ? 1 : 0;
};
const int aretiny[] = { istiny(args)... };
return aretiny[0];
};
return lambda(1, 1L, 1.0f, '1');
}
}
namespace test_binary_literals
{
constexpr auto ivii = 0b0000000000101010;
static_assert(ivii == 42, "wrong value");
}
namespace test_generalized_constexpr
{
template < typename CharT >
constexpr unsigned long
strlen_c(const CharT *const s) noexcept
{
auto length = 0UL;
for (auto p = s; *p; ++p)
++length;
return length;
}
static_assert(strlen_c("") == 0UL, "");
static_assert(strlen_c("x") == 1UL, "");
static_assert(strlen_c("test") == 4UL, "");
static_assert(strlen_c("another\0test") == 7UL, "");
}
namespace test_lambda_init_capture
{
int
test()
{
auto x = 0;
const auto lambda1 = [a = x](int b){ return a + b; };
const auto lambda2 = [a = lambda1(x)](){ return a; };
return lambda2();
}
}
namespace test_digit_separators
{
constexpr auto ten_million = 100'000'000;
static_assert(ten_million == 100000000, "");
}
namespace test_return_type_deduction
{
auto f(int& x) { return x; }
decltype(auto) g(int& x) { return x; }
template < typename T1, typename T2 >
struct is_same
{
static constexpr auto value = false;
};
template < typename T >
struct is_same<T, T>
{
static constexpr auto value = true;
};
int
test()
{
auto x = 0;
static_assert(is_same<int, decltype(f(x))>::value, "");
static_assert(is_same<int&, decltype(g(x))>::value, "");
return x;
}
}
} // namespace cxx14
#endif // __cplusplus >= 201402L
]])
dnl Tests for new features in C++17
m4_define([_AX_CXX_COMPILE_STDCXX_testbody_new_in_17], [[
// If the compiler admits that it is not ready for C++17, why torture it?
// Hopefully, this will speed up the test.
#ifndef __cplusplus
#error "This is not a C++ compiler"
#elif __cplusplus < 201703L
#error "This is not a C++17 compiler"
#else
#include <initializer_list>
#include <utility>
#include <type_traits>
namespace cxx17
{
namespace test_constexpr_lambdas
{
constexpr int foo = [](){return 42;}();
}
namespace test::nested_namespace::definitions
{
}
namespace test_fold_expression
{
template<typename... Args>
int multiply(Args... args)
{
return (args * ... * 1);
}
template<typename... Args>
bool all(Args... args)
{
return (args && ...);
}
}
namespace test_extended_static_assert
{
static_assert (true);
}
namespace test_auto_brace_init_list
{
auto foo = {5};
auto bar {5};
static_assert(std::is_same<std::initializer_list<int>, decltype(foo)>::value);
static_assert(std::is_same<int, decltype(bar)>::value);
}
namespace test_typename_in_template_template_parameter
{
template<template<typename> typename X> struct D;
}
namespace test_fallthrough_nodiscard_maybe_unused_attributes
{
int f1()
{
return 42;
}
[[nodiscard]] int f2()
{
[[maybe_unused]] auto unused = f1();
switch (f1())
{
case 17:
f1();
[[fallthrough]];
case 42:
f1();
}
return f1();
}
}
namespace test_extended_aggregate_initialization
{
struct base1
{
int b1, b2 = 42;
};
struct base2
{
base2() {
b3 = 42;
}
int b3;
};
struct derived : base1, base2
{
int d;
};
derived d1 {{1, 2}, {}, 4}; // full initialization
derived d2 {{}, {}, 4}; // value-initialized bases
}
namespace test_general_range_based_for_loop
{
struct iter
{
int i;
int& operator* ()
{
return i;
}
const int& operator* () const
{
return i;
}
iter& operator++()
{
++i;
return *this;
}
};
struct sentinel
{
int i;
};
bool operator== (const iter& i, const sentinel& s)
{
return i.i == s.i;
}
bool operator!= (const iter& i, const sentinel& s)
{
return !(i == s);
}
struct range
{
iter begin() const
{
return {0};
}
sentinel end() const
{
return {5};
}
};
void f()
{
range r {};
for (auto i : r)
{
[[maybe_unused]] auto v = i;
}
}
}
namespace test_lambda_capture_asterisk_this_by_value
{
struct t
{
int i;
int foo()
{
return [*this]()
{
return i;
}();
}
};
}
namespace test_enum_class_construction
{
enum class byte : unsigned char
{};
byte foo {42};
}
namespace test_constexpr_if
{
template <bool cond>
int f ()
{
if constexpr(cond)
{
return 13;
}
else
{
return 42;
}
}
}
namespace test_selection_statement_with_initializer
{
int f()
{
return 13;
}
int f2()
{
if (auto i = f(); i > 0)
{
return 3;
}
switch (auto i = f(); i + 4)
{
case 17:
return 2;
default:
return 1;
}
}
}
namespace test_template_argument_deduction_for_class_templates
{
template <typename T1, typename T2>
struct pair
{
pair (T1 p1, T2 p2)
: m1 {p1},
m2 {p2}
{}
T1 m1;
T2 m2;
};
void f()
{
[[maybe_unused]] auto p = pair{13, 42u};
}
}
namespace test_non_type_auto_template_parameters
{
template <auto n>
struct B
{};
B<5> b1;
B<'a'> b2;
}
namespace test_structured_bindings
{
int arr[2] = { 1, 2 };
std::pair<int, int> pr = { 1, 2 };
auto f1() -> int(&)[2]
{
return arr;
}
auto f2() -> std::pair<int, int>&
{
return pr;
}
struct S
{
int x1 : 2;
volatile double y1;
};
S f3()
{
return {};
}
auto [ x1, y1 ] = f1();
auto& [ xr1, yr1 ] = f1();
auto [ x2, y2 ] = f2();
auto& [ xr2, yr2 ] = f2();
const auto [ x3, y3 ] = f3();
}
namespace test_exception_spec_type_system
{
struct Good {};
struct Bad {};
void g1() noexcept;
void g2();
template<typename T>
Bad
f(T*, T*);
template<typename T1, typename T2>
Good
f(T1*, T2*);
static_assert (std::is_same_v<Good, decltype(f(g1, g2))>);
}
namespace test_inline_variables
{
template<class T> void f(T)
{}
template<class T> inline T g(T)
{
return T{};
}
template<> inline void f<>(int)
{}
template<> int g<>(int)
{
return 5;
}
}
} // namespace cxx17
#endif // __cplusplus < 201703L
]])

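A typical invocation of the removed AX_CXX_COMPILE_STDCXX macro, requesting strict C++11 support and aborting configure if no working -std switch is found; the argument choice here is illustrative:

    dnl Probe for a C++11 mode and append the required switch to CXX.
    AX_CXX_COMPILE_STDCXX([11], [noext], [mandatory])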

@ -1,39 +0,0 @@
# =============================================================================
# https://www.gnu.org/software/autoconf-archive/ax_cxx_compile_stdcxx_11.html
# =============================================================================
#
# SYNOPSIS
#
# AX_CXX_COMPILE_STDCXX_11([ext|noext], [mandatory|optional])
#
# DESCRIPTION
#
# Check for baseline language coverage in the compiler for the C++11
# standard; if necessary, add switches to CXX and CXXCPP to enable
# support.
#
# This macro is a convenience alias for calling the AX_CXX_COMPILE_STDCXX
# macro with the version set to C++11. The two optional arguments are
# forwarded literally as the second and third argument respectively.
# Please see the documentation for the AX_CXX_COMPILE_STDCXX macro for
# more information. If you want to use this macro, you also need to
# download the ax_cxx_compile_stdcxx.m4 file.
#
# LICENSE
#
# Copyright (c) 2008 Benjamin Kosnik <bkoz@redhat.com>
# Copyright (c) 2012 Zack Weinberg <zackw@panix.com>
# Copyright (c) 2013 Roy Stogner <roystgnr@ices.utexas.edu>
# Copyright (c) 2014, 2015 Google Inc.; contributed by Alexey Sokolov <sokolov@google.com>
# Copyright (c) 2015 Paul Norman <penorman@mac.com>
# Copyright (c) 2015 Moritz Klammler <moritz@klammler.eu>
#
# Copying and distribution of this file, with or without modification, are
# permitted in any medium without royalty provided the copyright notice
# and this notice are preserved. This file is offered as-is, without any
# warranty.
#serial 18
AX_REQUIRE_DEFINED([AX_CXX_COMPILE_STDCXX])
AC_DEFUN([AX_CXX_COMPILE_STDCXX_11], [AX_CXX_COMPILE_STDCXX([11], [$1], [$2])])

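A sketch of the convenience alias used in optional mode; the HAVE_CXX11_COMPILER conditional name is an illustrative assumption:

    dnl Detect C++11 support without failing configure, then expose the result.
    AX_CXX_COMPILE_STDCXX_11([noext], [optional])
    AM_CONDITIONAL([HAVE_CXX11_COMPILER], [test "x$HAVE_CXX11" = "x1"])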

@ -1,107 +0,0 @@
dnl @synopsis AX_DMD
dnl
dnl Test for the presence of a DMD-compatible D2 compiler, and (optionally)
dnl specified modules on the import path.
dnl
dnl If "DMD" is defined in the environment, that will be the only
dnl dmd command tested. Otherwise, a hard-coded list will be used.
dnl
dnl After AX_DMD runs, the shell variables "success" and "ax_dmd" are set to
dnl "yes" or "no", and "DMD" is set to the appropriate command. Furthermore,
dnl "dmd_optlink" will be set to "yes" or "no" depending on whether OPTLINK is
dnl used as the linker (DMD/Windows), and "dmd_of_dirsep" will be set to the
dnl directory separator to use when passing -of to DMD (OPTLINK requires a
dnl backslash).
dnl
dnl AX_CHECK_D_MODULE must be run after AX_DMD. It tests for the presence of a
dnl module in the import path of the chosen compiler, and sets the shell
dnl variable "success" to "yes" or "no".
dnl
dnl @category D
dnl @version 2011-05-31
dnl @license AllPermissive
dnl
dnl Copyright (C) 2009 David Reiss
dnl Copyright (C) 2011 David Nadlinger
dnl Copying and distribution of this file, with or without modification,
dnl are permitted in any medium without royalty provided the copyright
dnl notice and this notice are preserved.
AC_DEFUN([AX_DMD],
[
dnl Hard-coded default commands to test.
DMD_PROGS="dmd,gdmd,ldmd"
dnl Allow the user to specify an alternative.
if test -n "$DMD" ; then
DMD_PROGS="$DMD"
fi
AC_MSG_CHECKING(for DMD)
# std.algorithm as a quick way to check for D2/Phobos.
echo "import std.algorithm; void main() {}" > configtest_ax_dmd.d
success=no
oIFS="$IFS"
IFS=","
for DMD in $DMD_PROGS ; do
IFS="$oIFS"
echo "Running \"$DMD configtest_ax_dmd.d\"" >&AS_MESSAGE_LOG_FD
if $DMD configtest_ax_dmd.d >&AS_MESSAGE_LOG_FD 2>&1 ; then
success=yes
break
fi
done
if test "$success" != "yes" ; then
AC_MSG_RESULT(no)
DMD=""
else
AC_MSG_RESULT(yes)
fi
ax_dmd="$success"
# Test whether OPTLINK is used by trying if DMD accepts -L/? without
# erroring out.
if test "$success" == "yes" ; then
AC_MSG_CHECKING(whether DMD uses OPTLINK)
echo "Running \”$DMD -L/? configtest_ax_dmd.d\"" >&AS_MESSAGE_LOG_FD
if $DMD -L/? configtest_ax_dmd.d >&AS_MESSAGE_LOG_FD 2>&1 ; then
AC_MSG_RESULT(yes)
dmd_optlink="yes"
# This actually produces double slashes in the final configure
# output, but at least it works.
dmd_of_dirsep="\\\\"
else
AC_MSG_RESULT(no)
dmd_optlink="no"
dmd_of_dirsep="/"
fi
fi
rm -f configtest_ax_dmd*
])
AC_DEFUN([AX_CHECK_D_MODULE],
[
AC_MSG_CHECKING(for D module [$1])
echo "import $1; void main() {}" > configtest_ax_dmd.d
echo "Running \"$DMD configtest_ax_dmd.d\"" >&AS_MESSAGE_LOG_FD
if $DMD -c configtest_ax_dmd.d >&AS_MESSAGE_LOG_FD 2>&1 ; then
AC_MSG_RESULT(yes)
success=yes
else
AC_MSG_RESULT(no)
success=no
fi
rm -f configtest_ax_dmd*
])

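A minimal sketch of how the removed AX_DMD / AX_CHECK_D_MODULE pair is typically used together; the std.json module and the WITH_D_LIB conditional are illustrative choices:

    dnl Look for a D compiler, then for a module on its import path.
    AX_DMD
    AS_IF([test "$ax_dmd" = "yes"], [AX_CHECK_D_MODULE([std.json])])
    AM_CONDITIONAL([WITH_D_LIB], [test "$ax_dmd" = "yes" && test "$success" = "yes"])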

@ -1,129 +0,0 @@
dnl @synopsis AX_JAVAC_AND_JAVA
dnl @synopsis AX_CHECK_JAVA_CLASS(CLASSNAME)
dnl
dnl Test for the presence of a JDK, and (optionally) specific classes.
dnl
dnl If "JAVA" is defined in the environment, that will be the only
dnl java command tested. Otherwise, a hard-coded list will be used.
dnl Similarly for "JAVAC".
dnl
dnl AX_JAVAC_AND_JAVA does not currently support testing for a particular
dnl Java version, testing for only one of "java" and "javac", or
dnl compiling or running user-provided Java code.
dnl
dnl After AX_JAVAC_AND_JAVA runs, the shell variables "success" and
dnl "ax_javac_and_java" are set to "yes" or "no", and "JAVAC" and
dnl "JAVA" are set to the appropriate commands.
dnl
dnl AX_CHECK_JAVA_CLASS must be run after AX_JAVAC_AND_JAVA.
dnl It tests for the presence of a class based on a fully-qualified name.
dnl It sets the shell variable "success" to "yes" or "no".
dnl
dnl @category Java
dnl @version 2009-02-09
dnl @license AllPermissive
dnl
dnl Copyright (C) 2009 David Reiss
dnl Copying and distribution of this file, with or without modification,
dnl are permitted in any medium without royalty provided the copyright
dnl notice and this notice are preserved.
AC_DEFUN([AX_JAVAC_AND_JAVA],
[
dnl Hard-coded default commands to test.
JAVAC_PROGS="javac,jikes,gcj -C"
JAVA_PROGS="java,kaffe"
dnl Allow the user to specify an alternative.
if test -n "$JAVAC" ; then
JAVAC_PROGS="$JAVAC"
fi
if test -n "$JAVA" ; then
JAVA_PROGS="$JAVA"
fi
AC_MSG_CHECKING(for javac and java)
echo "public class configtest_ax_javac_and_java { public static void main(String args@<:@@:>@) { } }" > configtest_ax_javac_and_java.java
success=no
oIFS="$IFS"
IFS=","
for JAVAC in $JAVAC_PROGS ; do
IFS="$oIFS"
echo "Running \"$JAVAC configtest_ax_javac_and_java.java\"" >&AS_MESSAGE_LOG_FD
if $JAVAC configtest_ax_javac_and_java.java >&AS_MESSAGE_LOG_FD 2>&1 ; then
# prevent $JAVA VM issues with UTF-8 path names (THRIFT-3271)
oLC_ALL="$LC_ALL"
LC_ALL=""
IFS=","
for JAVA in $JAVA_PROGS ; do
IFS="$oIFS"
echo "Running \"$JAVA configtest_ax_javac_and_java\"" >&AS_MESSAGE_LOG_FD
if $JAVA configtest_ax_javac_and_java >&AS_MESSAGE_LOG_FD 2>&1 ; then
success=yes
break 2
fi
done
# restore LC_ALL
LC_ALL="$oLC_ALL"
oLC_ALL=""
fi
done
rm -f configtest_ax_javac_and_java.java configtest_ax_javac_and_java.class
if test "$success" != "yes" ; then
AC_MSG_RESULT(no)
JAVAC=""
JAVA=""
else
AC_MSG_RESULT(yes)
fi
ax_javac_and_java="$success"
])
AC_DEFUN([AX_CHECK_JAVA_CLASS],
[
AC_MSG_CHECKING(for Java class [$1])
echo "import $1; public class configtest_ax_javac_and_java { public static void main(String args@<:@@:>@) { } }" > configtest_ax_javac_and_java.java
echo "Running \"$JAVAC configtest_ax_javac_and_java.java\"" >&AS_MESSAGE_LOG_FD
if $JAVAC configtest_ax_javac_and_java.java >&AS_MESSAGE_LOG_FD 2>&1 ; then
AC_MSG_RESULT(yes)
success=yes
else
AC_MSG_RESULT(no)
success=no
fi
rm -f configtest_ax_javac_and_java.java configtest_ax_javac_and_java.class
])
AC_DEFUN([AX_CHECK_ANT_VERSION],
[
AC_MSG_CHECKING(for ant version > $2)
ANT_VALID=`expr "x$(printf "$2\n$($1 -version 2>/dev/null | sed -n 's/.*version \(@<:@0-9\.@:>@*\).*/\1/p')" | sort -t '.' -k 1,1 -k 2,2 -k 3,3 -g | sed -n 1p)" = "x$2"`
if test "x$ANT_VALID" = "x1" ; then
AC_MSG_RESULT(yes)
else
AC_MSG_RESULT(no)
ANT=""
fi
])

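A hedged configure.ac sketch for the removed Java macros; the org.slf4j.Logger class and the 1.7 ant version are illustrative values, not requirements of this project:

    dnl Find the JDK tools, a required class, and a recent enough ant.
    AC_PATH_PROG([ANT], [ant])
    AX_JAVAC_AND_JAVA
    AS_IF([test "$ax_javac_and_java" = "yes"],
          [AX_CHECK_JAVA_CLASS([org.slf4j.Logger])])
    AX_CHECK_ANT_VERSION($ANT, 1.7)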

@ -1,194 +0,0 @@
dnl @synopsis AX_LIB_EVENT([MINIMUM-VERSION])
dnl
dnl Test for the libevent library of a particular version (or newer).
dnl
dnl If no path to the installed libevent is given, the macro will first try
dnl using no -I or -L flags, then searches under /usr, /usr/local, /opt,
dnl and /opt/libevent.
dnl If these all fail, it will try the $LIBEVENT_ROOT environment variable.
dnl
dnl This macro requires that #include <sys/types.h> works and defines u_char.
dnl
dnl This macro calls:
dnl AC_SUBST(LIBEVENT_CPPFLAGS)
dnl AC_SUBST(LIBEVENT_LDFLAGS)
dnl AC_SUBST(LIBEVENT_LIBS)
dnl
dnl And (if libevent is found):
dnl AC_DEFINE(HAVE_LIBEVENT)
dnl
dnl It also leaves the shell variables "success" and "ax_have_libevent"
dnl set to "yes" or "no".
dnl
dnl NOTE: This macro does not currently work for cross-compiling,
dnl but it can be easily modified to allow it. (grep "cross").
dnl
dnl @category InstalledPackages
dnl @category C
dnl @version 2007-09-12
dnl @license AllPermissive
dnl
dnl Copyright (C) 2009 David Reiss
dnl Copying and distribution of this file, with or without modification,
dnl are permitted in any medium without royalty provided the copyright
dnl notice and this notice are preserved.
dnl Input: ax_libevent_path, WANT_LIBEVENT_VERSION
dnl Output: success=yes/no
AC_DEFUN([AX_LIB_EVENT_DO_CHECK],
[
# Save our flags.
CPPFLAGS_SAVED="$CPPFLAGS"
LDFLAGS_SAVED="$LDFLAGS"
LIBS_SAVED="$LIBS"
LD_LIBRARY_PATH_SAVED="$LD_LIBRARY_PATH"
# Set our flags if we are checking a specific directory.
if test -n "$ax_libevent_path" ; then
LIBEVENT_CPPFLAGS="-I$ax_libevent_path/include"
LIBEVENT_LDFLAGS="-L$ax_libevent_path/lib"
LD_LIBRARY_PATH="$ax_libevent_path/lib:$LD_LIBRARY_PATH"
else
LIBEVENT_CPPFLAGS=""
LIBEVENT_LDFLAGS=""
fi
# Required flag for libevent.
LIBEVENT_LIBS="-levent"
# Prepare the environment for compilation.
CPPFLAGS="$CPPFLAGS $LIBEVENT_CPPFLAGS"
LDFLAGS="$LDFLAGS $LIBEVENT_LDFLAGS"
LIBS="$LIBS $LIBEVENT_LIBS"
export CPPFLAGS
export LDFLAGS
export LIBS
export LD_LIBRARY_PATH
success=no
# Compile, link, and run the program. This checks:
# - event.h is available for including.
# - event_get_version() is available for linking.
# - The event version string is lexicographically greater
# than the required version.
AC_LANG_PUSH([C])
dnl This can be changed to AC_LINK_IFELSE if you are cross-compiling,
dnl but then the version cannot be checked.
AC_LINK_IFELSE([AC_LANG_PROGRAM([[
#include <sys/types.h>
#include <event.h>
]], [[
const char* lib_version = event_get_version();
const char* wnt_version = "$WANT_LIBEVENT_VERSION";
int lib_digits;
int wnt_digits;
for (;;) {
/* If we reached the end of the want version. We have it. */
if (*wnt_version == '\0' || *wnt_version == '-') {
return 0;
}
/* If the want version continues but the lib version does not, */
/* we are missing a letter. We don't have it. */
if (*lib_version == '\0' || *lib_version == '-') {
return 1;
}
/* In the 1.4 version numbering style, if there are more digits */
/* in one version than the other, that one is higher. */
for (lib_digits = 0;
lib_version[lib_digits] >= '0' &&
lib_version[lib_digits] <= '9';
lib_digits++)
;
for (wnt_digits = 0;
wnt_version[wnt_digits] >= '0' &&
wnt_version[wnt_digits] <= '9';
wnt_digits++)
;
if (lib_digits > wnt_digits) {
return 0;
}
if (lib_digits < wnt_digits) {
return 1;
}
/* If we have greater than what we want. We have it. */
if (*lib_version > *wnt_version) {
return 0;
}
/* If we have less, we don't. */
if (*lib_version < *wnt_version) {
return 1;
}
lib_version++;
wnt_version++;
}
return 0;
]])], [
success=yes
])
AC_LANG_POP([C])
# Restore flags.
CPPFLAGS="$CPPFLAGS_SAVED"
LDFLAGS="$LDFLAGS_SAVED"
LIBS="$LIBS_SAVED"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH_SAVED"
])
AC_DEFUN([AX_LIB_EVENT],
[
dnl Allow search path to be overridden on the command line.
AC_ARG_WITH([libevent],
AS_HELP_STRING([--with-libevent@<:@=DIR@:>@], [use libevent [default=yes]. Optionally specify the root prefix dir where libevent is installed]),
[
if test "x$withval" = "xno"; then
want_libevent="no"
elif test "x$withval" = "xyes"; then
want_libevent="yes"
ax_libevent_path=""
else
want_libevent="yes"
ax_libevent_path="$withval"
fi
],
[ want_libevent="yes" ; ax_libevent_path="" ])
if test "$want_libevent" = "yes"; then
WANT_LIBEVENT_VERSION=ifelse([$1], ,1.2,$1)
AC_MSG_CHECKING(for libevent >= $WANT_LIBEVENT_VERSION)
# Run tests.
if test -n "$ax_libevent_path"; then
AX_LIB_EVENT_DO_CHECK
else
for ax_libevent_path in "" $lt_sysroot/usr $lt_sysroot/usr/local $lt_sysroot/opt $lt_sysroot/opt/local $lt_sysroot/opt/libevent "$LIBEVENT_ROOT" ; do
AX_LIB_EVENT_DO_CHECK
if test "$success" = "yes"; then
break;
fi
done
fi
if test "$success" != "yes" ; then
AC_MSG_RESULT(no)
LIBEVENT_CPPFLAGS=""
LIBEVENT_LDFLAGS=""
LIBEVENT_LIBS=""
else
AC_MSG_RESULT(yes)
AC_DEFINE(HAVE_LIBEVENT,,[define if libevent is available])
ax_have_libevent_[]m4_translit([$1], [.], [_])="yes"
fi
ax_have_libevent="$success"
AC_SUBST(LIBEVENT_CPPFLAGS)
AC_SUBST(LIBEVENT_LDFLAGS)
AC_SUBST(LIBEVENT_LIBS)
fi
])

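A minimal sketch of invoking the removed AX_LIB_EVENT macro; the 1.0 minimum version and the WITH_LIBEVENT conditional are illustrative:

    dnl Check for libevent >= 1.0 and expose the result to the Makefiles.
    AX_LIB_EVENT([1.0])
    AM_CONDITIONAL([WITH_LIBEVENT], [test "$ax_have_libevent" = "yes"])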

@ -1,173 +0,0 @@
dnl @synopsis AX_LIB_ZLIB([MINIMUM-VERSION])
dnl
dnl Test for the libz library of a particular version (or newer).
dnl
dnl If no path to the installed zlib is given, the macro will first try
dnl using no -I or -L flags, then searches under /usr, /usr/local, /opt,
dnl and /opt/zlib.
dnl If these all fail, it will try the $ZLIB_ROOT environment variable.
dnl
dnl This macro calls:
dnl AC_SUBST(ZLIB_CPPFLAGS)
dnl AC_SUBST(ZLIB_LDFLAGS)
dnl AC_SUBST(ZLIB_LIBS)
dnl
dnl And (if zlib is found):
dnl AC_DEFINE(HAVE_ZLIB)
dnl
dnl It also leaves the shell variables "success" and "ax_have_zlib"
dnl set to "yes" or "no".
dnl
dnl NOTE: This macro does not currently work for cross-compiling,
dnl but it can be easily modified to allow it. (grep "cross").
dnl
dnl @category InstalledPackages
dnl @category C
dnl @version 2007-09-12
dnl @license AllPermissive
dnl
dnl Copyright (C) 2009 David Reiss
dnl Copying and distribution of this file, with or without modification,
dnl are permitted in any medium without royalty provided the copyright
dnl notice and this notice are preserved.
dnl Input: ax_zlib_path, WANT_ZLIB_VERSION
dnl Output: success=yes/no
AC_DEFUN([AX_LIB_ZLIB_DO_CHECK],
[
# Save our flags.
CPPFLAGS_SAVED="$CPPFLAGS"
LDFLAGS_SAVED="$LDFLAGS"
LIBS_SAVED="$LIBS"
LD_LIBRARY_PATH_SAVED="$LD_LIBRARY_PATH"
# Set our flags if we are checking a specific directory.
if test -n "$ax_zlib_path" ; then
ZLIB_CPPFLAGS="-I$ax_zlib_path/include"
ZLIB_LDFLAGS="-L$ax_zlib_path/lib"
LD_LIBRARY_PATH="$ax_zlib_path/lib:$LD_LIBRARY_PATH"
else
ZLIB_CPPFLAGS=""
ZLIB_LDFLAGS=""
fi
# Required flag for zlib.
ZLIB_LIBS="-lz"
# Prepare the environment for compilation.
CPPFLAGS="$CPPFLAGS $ZLIB_CPPFLAGS"
LDFLAGS="$LDFLAGS $ZLIB_LDFLAGS"
LIBS="$LIBS $ZLIB_LIBS"
export CPPFLAGS
export LDFLAGS
export LIBS
export LD_LIBRARY_PATH
success=no
# Compile, link, and run the program. This checks:
# - zlib.h is available for including.
# - zlibVersion() is available for linking.
# - ZLIB_VERNUM is greater than or equal to the desired version.
# - ZLIB_VERSION (defined in zlib.h) matches zlibVersion()
# (defined in the library).
AC_LANG_PUSH([C])
dnl This can be changed to AC_LINK_IFELSE if you are cross-compiling.
AC_LINK_IFELSE([AC_LANG_PROGRAM([[
#include <zlib.h>
#if ZLIB_VERNUM >= 0x$WANT_ZLIB_VERSION
#else
# error zlib is too old
#endif
]], [[
const char* lib_version = zlibVersion();
const char* hdr_version = ZLIB_VERSION;
for (;;) {
if (*lib_version != *hdr_version) {
/* If this happens, your zlib header doesn't match your zlib */
/* library. That is really bad. */
return 1;
}
if (*lib_version == '\0') {
break;
}
lib_version++;
hdr_version++;
}
return 0;
]])], [
success=yes
])
AC_LANG_POP([C])
# Restore flags.
CPPFLAGS="$CPPFLAGS_SAVED"
LDFLAGS="$LDFLAGS_SAVED"
LIBS="$LIBS_SAVED"
LD_LIBRARY_PATH="$LD_LIBRARY_PATH_SAVED"
])
AC_DEFUN([AX_LIB_ZLIB],
[
dnl Allow search path to be overridden on the command line.
AC_ARG_WITH([zlib],
AS_HELP_STRING([--with-zlib@<:@=DIR@:>@], [use zlib (default is yes) - it is possible to specify an alternate root directory for zlib]),
[
if test "x$withval" = "xno"; then
want_zlib="no"
elif test "x$withval" = "xyes"; then
want_zlib="yes"
ax_zlib_path=""
else
want_zlib="yes"
ax_zlib_path="$withval"
fi
],
[want_zlib="yes" ; ax_zlib_path="" ])
if test "$want_zlib" = "yes"; then
# Parse out the version.
zlib_version_req=ifelse([$1], ,1.2.3,$1)
zlib_version_req_major=`expr $zlib_version_req : '\([[0-9]]*\)'`
zlib_version_req_minor=`expr $zlib_version_req : '[[0-9]]*\.\([[0-9]]*\)'`
zlib_version_req_patch=`expr $zlib_version_req : '[[0-9]]*\.[[0-9]]*\.\([[0-9]]*\)'`
if test -z "$zlib_version_req_patch" ; then
zlib_version_req_patch="0"
fi
WANT_ZLIB_VERSION=`expr $zlib_version_req_major \* 1000 \+ $zlib_version_req_minor \* 100 \+ $zlib_version_req_patch \* 10`
AC_MSG_CHECKING(for zlib >= $zlib_version_req)
# Run tests.
if test -n "$ax_zlib_path"; then
AX_LIB_ZLIB_DO_CHECK
else
for ax_zlib_path in "" /usr /usr/local /opt /opt/zlib "$ZLIB_ROOT" ; do
AX_LIB_ZLIB_DO_CHECK
if test "$success" = "yes"; then
break;
fi
done
fi
if test "$success" != "yes" ; then
AC_MSG_RESULT(no)
ZLIB_CPPFLAGS=""
ZLIB_LDFLAGS=""
ZLIB_LIBS=""
else
AC_MSG_RESULT(yes)
AC_DEFINE(HAVE_ZLIB,,[define if zlib is available])
fi
ax_have_zlib="$success"
AC_SUBST(ZLIB_CPPFLAGS)
AC_SUBST(ZLIB_LDFLAGS)
AC_SUBST(ZLIB_LIBS)
fi
])

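A minimal sketch of invoking the removed AX_LIB_ZLIB macro; the 1.2.3 minimum version and the WITH_ZLIB conditional are illustrative:

    dnl Check for zlib >= 1.2.3 and expose the result to the Makefiles.
    AX_LIB_ZLIB([1.2.3])
    AM_CONDITIONAL([WITH_ZLIB], [test "$ax_have_zlib" = "yes"])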

@ -1,664 +0,0 @@
# ===========================================================================
# https://www.gnu.org/software/autoconf-archive/ax_lua.html
# ===========================================================================
#
# SYNOPSIS
#
# AX_PROG_LUA[([MINIMUM-VERSION], [TOO-BIG-VERSION], [ACTION-IF-FOUND], [ACTION-IF-NOT-FOUND])]
# AX_LUA_HEADERS[([ACTION-IF-FOUND], [ACTION-IF-NOT-FOUND])]
# AX_LUA_LIBS[([ACTION-IF-FOUND], [ACTION-IF-NOT-FOUND])]
# AX_LUA_READLINE[([ACTION-IF-FOUND], [ACTION-IF-NOT-FOUND])]
#
# DESCRIPTION
#
# Detect a Lua interpreter, optionally specifying a minimum and maximum
# version number. Set up important Lua paths, such as the directories in
# which to install scripts and modules (shared libraries).
#
# Also detect Lua headers and libraries. The Lua version contained in the
# header is checked to match the Lua interpreter version exactly. When
# searching for Lua libraries, the version number is used as a suffix.
# This is done with the goal of supporting multiple Lua installs (5.1,
# 5.2, and 5.3 side-by-side).
#
# A note on compatibility with previous versions: This file has been
# mostly rewritten for serial 18. Most developers should be able to use
# these macros without needing to modify configure.ac. Care has been taken
# to preserve each macro's behavior, but there are some differences:
#
# 1) AX_WITH_LUA is deprecated; it now expands to the exact same thing as
# AX_PROG_LUA with no arguments.
#
# 2) AX_LUA_HEADERS now checks that the version number defined in lua.h
# matches the interpreter version. AX_LUA_HEADERS_VERSION is therefore
# unnecessary, so it is deprecated and does not expand to anything.
#
# 3) The configure flag --with-lua-suffix no longer exists; the user
# should instead specify the LUA precious variable on the command line.
# See the AX_PROG_LUA description for details.
#
# Please read the macro descriptions below for more information.
#
# This file was inspired by Andrew Dalke's and James Henstridge's
# python.m4 and Tom Payne's, Matthieu Moy's, and Reuben Thomas's ax_lua.m4
# (serial 17). Basically, this file is a mash-up of those two files. I
# like to think it combines the best of the two!
#
# AX_PROG_LUA: Search for the Lua interpreter, and set up important Lua
# paths. Adds precious variable LUA, which may contain the path of the Lua
# interpreter. If LUA is blank, the user's path is searched for a
# suitable interpreter.
#
# If MINIMUM-VERSION is supplied, then only Lua interpreters with a
# version number greater or equal to MINIMUM-VERSION will be accepted. If
# TOO-BIG-VERSION is also supplied, then only Lua interpreters with a
# version number greater or equal to MINIMUM-VERSION and less than
# TOO-BIG-VERSION will be accepted.
#
# The Lua version number, LUA_VERSION, is found from the interpreter, and
# substituted. LUA_PLATFORM is also found, but not currently supported (no
# standard representation).
#
# Finally, the macro finds four paths:
#
# luadir Directory to install Lua scripts.
# pkgluadir $luadir/$PACKAGE
# luaexecdir Directory to install Lua modules.
# pkgluaexecdir $luaexecdir/$PACKAGE
#
# These paths are found based on $prefix, $exec_prefix, Lua's
# package.path, and package.cpath. The first path of package.path
# beginning with $prefix is selected as luadir. The first path of
# package.cpath beginning with $exec_prefix is used as luaexecdir. This
# should work on all reasonable Lua installations. If a path cannot be
# determined, a default path is used. Of course, the user can override
# these later when invoking make.
#
# luadir Default: $prefix/share/lua/$LUA_VERSION
# luaexecdir Default: $exec_prefix/lib/lua/$LUA_VERSION
#
# These directories can be used by Automake as install destinations. The
# variable name minus 'dir' needs to be used as a prefix to the
# appropriate Automake primary, e.g. lua_SCRIPTS or luaexec_LIBRARIES.
#
# If an acceptable Lua interpreter is found, then ACTION-IF-FOUND is
# performed, otherwise ACTION-IF-NOT-FOUND is performed. If ACTION-IF-NOT-
# FOUND is blank, then it will default to printing an error. To prevent
# the default behavior, give ':' as an action.
#
# AX_LUA_HEADERS: Search for Lua headers. Requires that AX_PROG_LUA be
# expanded before this macro. Adds precious variable LUA_INCLUDE, which
# may contain Lua specific include flags, e.g. -I/usr/include/lua5.1. If
# LUA_INCLUDE is blank, then this macro will attempt to find suitable
# flags.
#
# LUA_INCLUDE can be used by Automake to compile Lua modules or
# executables with embedded interpreters. The *_CPPFLAGS variables should
# be used for this purpose, e.g. myprog_CPPFLAGS = $(LUA_INCLUDE).
#
# This macro searches for the header lua.h (and others). The search is
# performed with a combination of CPPFLAGS, CPATH, etc, and LUA_INCLUDE.
# If the search is unsuccessful, then some common directories are tried.
# If the headers are then found, then LUA_INCLUDE is set accordingly.
#
# The paths automatically searched are:
#
# * /usr/include/luaX.Y
# * /usr/include/lua/X.Y
# * /usr/include/luaXY
# * /usr/local/include/luaX.Y
# * /usr/local/include/lua-X.Y
# * /usr/local/include/lua/X.Y
# * /usr/local/include/luaXY
#
# (Where X.Y is the Lua version number, e.g. 5.1.)
#
# The Lua version number found in the headers is always checked to match
# the Lua interpreter's version number. Lua headers with mismatched
# version numbers are not accepted.
#
# If headers are found, then ACTION-IF-FOUND is performed, otherwise
# ACTION-IF-NOT-FOUND is performed. If ACTION-IF-NOT-FOUND is blank, then
# it will default to printing an error. To prevent the default behavior,
# set the action to ':'.
#
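# Since both AX_LUA_HEADERS and AX_LUA_LIBS require AX_PROG_LUA to be expanded
# first, a typical configure.ac orders the calls like this (sketch only):
#
#   AX_PROG_LUA([5.1])
#   AX_LUA_HEADERS
#   AX_LUA_LIBS
#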
# AX_LUA_LIBS: Search for Lua libraries. Requires that AX_PROG_LUA be
# expanded before this macro. Adds precious variable LUA_LIB, which may
# contain Lua specific linker flags, e.g. -llua5.1. If LUA_LIB is blank,
# then this macro will attempt to find suitable flags.
#
# LUA_LIB can be used by Automake to link Lua modules or executables with
# embedded interpreters. The *_LIBADD and *_LDADD variables should be used
# for this purpose, e.g. mymod_LIBADD = $(LUA_LIB).
#
# This macro searches for the Lua library. More technically, it searches
# for a library containing the function lua_load. The search is performed
# with a combination of LIBS, LIBRARY_PATH, and LUA_LIB.
#
# If the search determines that some linker flags are missing, then those
# flags will be added to LUA_LIB.
#
# If libraries are found, then ACTION-IF-FOUND is performed, otherwise
# ACTION-IF-NOT-FOUND is performed. If ACTION-IF-NOT-FOUND is blank, then
# it will default to printing an error. To prevent the default behavior,
# set the action to ':'.
#
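# Putting the pieces together (the program and script names below are
# hypothetical), a Makefile.am for a program embedding Lua might look like:
#
#   bin_PROGRAMS = myprog
#   myprog_SOURCES = main.c
#   myprog_CPPFLAGS = $(LUA_INCLUDE)
#   myprog_LDADD = $(LUA_LIB)
#   lua_SCRIPTS = helpers.lua
#
# The luadir and luaexecdir install locations can still be overridden when
# invoking make, as noted above.
#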
# AX_LUA_READLINE: Search for readline headers and libraries. Requires the
# AX_LIB_READLINE macro, which is provided by ax_lib_readline.m4 from the
# Autoconf Archive.
#
# If a readline compatible library is found, then ACTION-IF-FOUND is
# performed, otherwise ACTION-IF-NOT-FOUND is performed.
#
# LICENSE
#
# Copyright (c) 2015 Reuben Thomas <rrt@sc3d.org>
# Copyright (c) 2014 Tim Perkins <tprk77@gmail.com>
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the
# Free Software Foundation, either version 3 of the License, or (at your
# option) any later version.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General
# Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program. If not, see <https://www.gnu.org/licenses/>.
#
# As a special exception, the respective Autoconf Macro's copyright owner
# gives unlimited permission to copy, distribute and modify the configure
# scripts that are the output of Autoconf when processing the Macro. You
# need not follow the terms of the GNU General Public License when using
# or distributing such scripts, even though portions of the text of the
# Macro appear in them. The GNU General Public License (GPL) does govern
# all other use of the material that constitutes the Autoconf Macro.
#
# This special exception to the GPL applies to versions of the Autoconf
# Macro released by the Autoconf Archive. When you make and distribute a
# modified version of the Autoconf Macro, you may extend this special
# exception to the GPL to apply to your modified version as well.
#serial 40
dnl =========================================================================
dnl AX_PROG_LUA([MINIMUM-VERSION], [TOO-BIG-VERSION],
dnl [ACTION-IF-FOUND], [ACTION-IF-NOT-FOUND])
dnl =========================================================================
AC_DEFUN([AX_PROG_LUA],
[
dnl Check for required tools.
AC_REQUIRE([AC_PROG_GREP])
AC_REQUIRE([AC_PROG_SED])
dnl Make LUA a precious variable.
AC_ARG_VAR([LUA], [The Lua interpreter, e.g. /usr/bin/lua5.1])
dnl Find a Lua interpreter.
m4_define_default([_AX_LUA_INTERPRETER_LIST],
[lua lua5.3 lua53 lua5.2 lua52 lua5.1 lua51 lua50])
m4_if([$1], [],
[ dnl No version check is needed. Find any Lua interpreter.
AS_IF([test "x$LUA" = 'x'],
[AC_PATH_PROGS([LUA], [_AX_LUA_INTERPRETER_LIST], [:])])
ax_display_LUA='lua'
AS_IF([test "x$LUA" != 'x:'],
[ dnl At least check if this is a Lua interpreter.
AC_MSG_CHECKING([if $LUA is a Lua interpreter])
_AX_LUA_CHK_IS_INTRP([$LUA],
[AC_MSG_RESULT([yes])],
[ AC_MSG_RESULT([no])
AC_MSG_ERROR([not a Lua interpreter])
])
])
],
[ dnl A version check is needed.
AS_IF([test "x$LUA" != 'x'],
[ dnl Check if this is a Lua interpreter.
AC_MSG_CHECKING([if $LUA is a Lua interpreter])
_AX_LUA_CHK_IS_INTRP([$LUA],
[AC_MSG_RESULT([yes])],
[ AC_MSG_RESULT([no])
AC_MSG_ERROR([not a Lua interpreter])
])
dnl Check the version.
m4_if([$2], [],
[_ax_check_text="whether $LUA version >= $1"],
[_ax_check_text="whether $LUA version >= $1, < $2"])
AC_MSG_CHECKING([$_ax_check_text])
_AX_LUA_CHK_VER([$LUA], [$1], [$2],
[AC_MSG_RESULT([yes])],
[ AC_MSG_RESULT([no])
AC_MSG_ERROR([version is out of range for specified LUA])])
ax_display_LUA=$LUA
],
[ dnl Try each interpreter until we find one that satisfies VERSION.
m4_if([$2], [],
[_ax_check_text="for a Lua interpreter with version >= $1"],
[_ax_check_text="for a Lua interpreter with version >= $1, < $2"])
AC_CACHE_CHECK([$_ax_check_text],
[ax_cv_pathless_LUA],
[ for ax_cv_pathless_LUA in _AX_LUA_INTERPRETER_LIST none; do
test "x$ax_cv_pathless_LUA" = 'xnone' && break
_AX_LUA_CHK_IS_INTRP([$ax_cv_pathless_LUA], [], [continue])
_AX_LUA_CHK_VER([$ax_cv_pathless_LUA], [$1], [$2], [break])
done
])
dnl Set $LUA to the absolute path of $ax_cv_pathless_LUA.
AS_IF([test "x$ax_cv_pathless_LUA" = 'xnone'],
[LUA=':'],
[AC_PATH_PROG([LUA], [$ax_cv_pathless_LUA])])
ax_display_LUA=$ax_cv_pathless_LUA
])
])
AS_IF([test "x$LUA" = 'x:'],
[ dnl Run any user-specified action, or abort.
m4_default([$4], [AC_MSG_ERROR([cannot find suitable Lua interpreter])])
],
[ dnl Query Lua for its version number.
AC_CACHE_CHECK([for $ax_display_LUA version],
[ax_cv_lua_version],
[ dnl Get the interpreter version in X.Y format. This should work for
dnl interpreters version 5.0 and beyond.
ax_cv_lua_version=[`$LUA -e '
-- return a version number in X.Y format
local _, _, ver = string.find(_VERSION, "^Lua (%d+%.%d+)")
print(ver)'`]
])
AS_IF([test "x$ax_cv_lua_version" = 'x'],
[AC_MSG_ERROR([invalid Lua version number])])
AC_SUBST([LUA_VERSION], [$ax_cv_lua_version])
AC_SUBST([LUA_SHORT_VERSION], [`echo "$LUA_VERSION" | $SED 's|\.||'`])
dnl The following check is not supported:
dnl At times (like when building shared libraries) you may want to know
dnl which OS platform Lua thinks this is.
AC_CACHE_CHECK([for $ax_display_LUA platform],
[ax_cv_lua_platform],
[ax_cv_lua_platform=[`$LUA -e 'print("unknown")'`]])
AC_SUBST([LUA_PLATFORM], [$ax_cv_lua_platform])
dnl Use the values of $prefix and $exec_prefix for the corresponding
dnl values of LUA_PREFIX and LUA_EXEC_PREFIX. These are made distinct
dnl variables so they can be overridden if need be. However, the general
dnl consensus is that you shouldn't need this ability.
AC_SUBST([LUA_PREFIX], ['${prefix}'])
AC_SUBST([LUA_EXEC_PREFIX], ['${exec_prefix}'])
dnl Lua provides no way to query the script directory, and instead
dnl provides LUA_PATH. However, we should be able to make a safe educated
dnl guess. If the built-in search path contains a directory which is
dnl prefixed by $prefix, then we can store scripts there. The first
dnl matching path will be used.
AC_CACHE_CHECK([for $ax_display_LUA script directory],
[ax_cv_lua_luadir],
[ AS_IF([test "x$prefix" = 'xNONE'],
[ax_lua_prefix=$ac_default_prefix],
[ax_lua_prefix=$prefix])
dnl Initialize to the default path.
ax_cv_lua_luadir="$LUA_PREFIX/share/lua/$LUA_VERSION"
dnl Try to find a path with the prefix.
_AX_LUA_FND_PRFX_PTH([$LUA], [$ax_lua_prefix], [script])
AS_IF([test "x$ax_lua_prefixed_path" != 'x'],
[ dnl Fix the prefix.
_ax_strip_prefix=`echo "$ax_lua_prefix" | $SED 's|.|.|g'`
ax_cv_lua_luadir=`echo "$ax_lua_prefixed_path" | \
$SED "s|^$_ax_strip_prefix|$LUA_PREFIX|"`
])
])
AC_SUBST([luadir], [$ax_cv_lua_luadir])
AC_SUBST([pkgluadir], [\${luadir}/$PACKAGE])
dnl Lua provides no way to query the module directory, and instead
dnl provides LUA_PATH. However, we should be able to make a safe educated
dnl guess. If the built-in search path contains a directory which is
dnl prefixed by $exec_prefix, then we can store modules there. The first
dnl matching path will be used.
AC_CACHE_CHECK([for $ax_display_LUA module directory],
[ax_cv_lua_luaexecdir],
[ AS_IF([test "x$exec_prefix" = 'xNONE'],
[ax_lua_exec_prefix=$ax_lua_prefix],
[ax_lua_exec_prefix=$exec_prefix])
dnl Initialize to the default path.
ax_cv_lua_luaexecdir="$LUA_EXEC_PREFIX/lib/lua/$LUA_VERSION"
dnl Try to find a path with the prefix.
_AX_LUA_FND_PRFX_PTH([$LUA],
[$ax_lua_exec_prefix], [module])
AS_IF([test "x$ax_lua_prefixed_path" != 'x'],
[ dnl Fix the prefix.
_ax_strip_prefix=`echo "$ax_lua_exec_prefix" | $SED 's|.|.|g'`
ax_cv_lua_luaexecdir=`echo "$ax_lua_prefixed_path" | \
$SED "s|^$_ax_strip_prefix|$LUA_EXEC_PREFIX|"`
])
])
AC_SUBST([luaexecdir], [$ax_cv_lua_luaexecdir])
AC_SUBST([pkgluaexecdir], [\${luaexecdir}/$PACKAGE])
dnl Run any user specified action.
$3
])
])
dnl AX_WITH_LUA is now the same thing as AX_PROG_LUA.
AC_DEFUN([AX_WITH_LUA],
[
AC_MSG_WARN([[$0 is deprecated, please use AX_PROG_LUA instead]])
AX_PROG_LUA
])
dnl =========================================================================
dnl _AX_LUA_CHK_IS_INTRP(PROG, [ACTION-IF-TRUE], [ACTION-IF-FALSE])
dnl =========================================================================
AC_DEFUN([_AX_LUA_CHK_IS_INTRP],
[
dnl A minimal Lua factorial to prove this is an interpreter. This should work
dnl for Lua interpreters version 5.0 and beyond.
_ax_lua_factorial=[`$1 2>/dev/null -e '
-- a simple factorial
function fact (n)
if n == 0 then
return 1
else
return n * fact(n-1)
end
end
print("fact(5) is " .. fact(5))'`]
AS_IF([test "$_ax_lua_factorial" = 'fact(5) is 120'],
[$2], [$3])
])
dnl =========================================================================
dnl _AX_LUA_CHK_VER(PROG, MINIMUM-VERSION, [TOO-BIG-VERSION],
dnl [ACTION-IF-TRUE], [ACTION-IF-FALSE])
dnl =========================================================================
AC_DEFUN([_AX_LUA_CHK_VER],
[
dnl Check that the Lua version is within the bounds. Only the major and minor
dnl version numbers are considered. This should work for Lua interpreters
dnl version 5.0 and beyond.
_ax_lua_good_version=[`$1 -e '
-- a script to compare versions
function verstr2num(verstr)
local _, _, majorver, minorver = string.find(verstr, "^(%d+)%.(%d+)")
if majorver and minorver then
return tonumber(majorver) * 100 + tonumber(minorver)
end
end
local minver = verstr2num("$2")
local _, _, trimver = string.find(_VERSION, "^Lua (.*)")
local ver = verstr2num(trimver)
local maxver = verstr2num("$3") or 1e9
if minver <= ver and ver < maxver then
print("yes")
else
print("no")
end'`]
AS_IF([test "x$_ax_lua_good_version" = "xyes"],
[$4], [$5])
])
dnl =========================================================================
dnl _AX_LUA_FND_PRFX_PTH(PROG, PREFIX, SCRIPT-OR-MODULE-DIR)
dnl =========================================================================
AC_DEFUN([_AX_LUA_FND_PRFX_PTH],
[
dnl Get the script or module directory by querying the Lua interpreter,
dnl filtering on the given prefix, and selecting the shallowest path. If no
dnl path is found matching the prefix, the result will be an empty string.
dnl The third argument determines the type of search, it can be 'script' or
dnl 'module'. Supplying 'script' will perform the search with package.path
dnl and LUA_PATH, and supplying 'module' will search with package.cpath and
dnl LUA_CPATH. This is done for compatibility with Lua 5.0.
ax_lua_prefixed_path=[`$1 -e '
-- get the path based on search type
local searchtype = "$3"
local paths = ""
if searchtype == "script" then
paths = (package and package.path) or LUA_PATH
elseif searchtype == "module" then
paths = (package and package.cpath) or LUA_CPATH
end
-- search for the prefix
local prefix = "'$2'"
local minpath = ""
local mindepth = 1e9
string.gsub(paths, "(@<:@^;@:>@+)",
function (path)
path = string.gsub(path, "%?.*$", "")
path = string.gsub(path, "/@<:@^/@:>@*$", "")
if string.find(path, prefix) then
local depth = string.len(string.gsub(path, "@<:@^/@:>@", ""))
if depth < mindepth then
minpath = path
mindepth = depth
end
end
end)
print(minpath)'`]
])
dnl =========================================================================
dnl AX_LUA_HEADERS([ACTION-IF-FOUND], [ACTION-IF-NOT-FOUND])
dnl =========================================================================
AC_DEFUN([AX_LUA_HEADERS],
[
dnl Check for LUA_VERSION.
AC_MSG_CHECKING([if LUA_VERSION is defined])
AS_IF([test "x$LUA_VERSION" != 'x'],
[AC_MSG_RESULT([yes])],
[ AC_MSG_RESULT([no])
AC_MSG_ERROR([cannot check Lua headers without knowing LUA_VERSION])
])
dnl Make LUA_INCLUDE a precious variable.
AC_ARG_VAR([LUA_INCLUDE], [The Lua includes, e.g. -I/usr/include/lua5.1])
dnl Some default directories to search.
LUA_SHORT_VERSION=`echo "$LUA_VERSION" | $SED 's|\.||'`
m4_define_default([_AX_LUA_INCLUDE_LIST],
[ /usr/include/lua$LUA_VERSION \
/usr/include/lua-$LUA_VERSION \
/usr/include/lua/$LUA_VERSION \
/usr/include/lua$LUA_SHORT_VERSION \
/usr/local/include/lua$LUA_VERSION \
/usr/local/include/lua-$LUA_VERSION \
/usr/local/include/lua/$LUA_VERSION \
/usr/local/include/lua$LUA_SHORT_VERSION \
])
dnl Try to find the headers.
_ax_lua_saved_cppflags=$CPPFLAGS
CPPFLAGS="$CPPFLAGS $LUA_INCLUDE"
AC_CHECK_HEADERS([lua.h lualib.h lauxlib.h luaconf.h])
CPPFLAGS=$_ax_lua_saved_cppflags
dnl Try some other directories if LUA_INCLUDE was not set.
AS_IF([test "x$LUA_INCLUDE" = 'x' &&
test "x$ac_cv_header_lua_h" != 'xyes'],
[ dnl Try some common include paths.
for _ax_include_path in _AX_LUA_INCLUDE_LIST; do
test ! -d "$_ax_include_path" && continue
AC_MSG_CHECKING([for Lua headers in])
AC_MSG_RESULT([$_ax_include_path])
AS_UNSET([ac_cv_header_lua_h])
AS_UNSET([ac_cv_header_lualib_h])
AS_UNSET([ac_cv_header_lauxlib_h])
AS_UNSET([ac_cv_header_luaconf_h])
_ax_lua_saved_cppflags=$CPPFLAGS
CPPFLAGS="$CPPFLAGS -I$_ax_include_path"
AC_CHECK_HEADERS([lua.h lualib.h lauxlib.h luaconf.h])
CPPFLAGS=$_ax_lua_saved_cppflags
AS_IF([test "x$ac_cv_header_lua_h" = 'xyes'],
[ LUA_INCLUDE="-I$_ax_include_path"
break
])
done
])
AS_IF([test "x$ac_cv_header_lua_h" = 'xyes'],
[ dnl Make a program to print LUA_VERSION defined in the header.
dnl TODO It would be really nice if we could do this without compiling a
dnl program, then it would work when cross compiling. But I'm not sure how
dnl to do this reliably. For now, assume versions match when cross compiling.
AS_IF([test "x$cross_compiling" != 'xyes'],
[ AC_CACHE_CHECK([for Lua header version],
[ax_cv_lua_header_version],
[ _ax_lua_saved_cppflags=$CPPFLAGS
CPPFLAGS="$CPPFLAGS $LUA_INCLUDE"
AC_RUN_IFELSE(
[ AC_LANG_SOURCE([[
#include <lua.h>
#include <stdlib.h>
#include <stdio.h>
int main(int argc, char ** argv)
{
if(argc > 1) printf("%s", LUA_VERSION);
exit(EXIT_SUCCESS);
}
]])
],
[ ax_cv_lua_header_version=`./conftest$EXEEXT p | \
$SED -n "s|^Lua \(@<:@0-9@:>@\{1,\}\.@<:@0-9@:>@\{1,\}\).\{0,\}|\1|p"`
],
[ax_cv_lua_header_version='unknown'])
CPPFLAGS=$_ax_lua_saved_cppflags
])
dnl Compare this to the previously found LUA_VERSION.
AC_MSG_CHECKING([if Lua header version matches $LUA_VERSION])
AS_IF([test "x$ax_cv_lua_header_version" = "x$LUA_VERSION"],
[ AC_MSG_RESULT([yes])
ax_header_version_match='yes'
],
[ AC_MSG_RESULT([no])
ax_header_version_match='no'
])
],
[ AC_MSG_WARN([cross compiling so assuming header version number matches])
ax_header_version_match='yes'
])
])
dnl Was LUA_INCLUDE specified?
AS_IF([test "x$ax_header_version_match" != 'xyes' &&
test "x$LUA_INCLUDE" != 'x'],
[AC_MSG_ERROR([cannot find headers for specified LUA_INCLUDE])])
dnl Test the final result and run user code.
AS_IF([test "x$ax_header_version_match" = 'xyes'], [$1],
[m4_default([$2], [AC_MSG_ERROR([cannot find Lua includes])])])
])
dnl AX_LUA_HEADERS_VERSION no longer exists, use AX_LUA_HEADERS.
AC_DEFUN([AX_LUA_HEADERS_VERSION],
[
AC_MSG_WARN([[$0 is deprecated, please use AX_LUA_HEADERS instead]])
])
dnl =========================================================================
dnl AX_LUA_LIBS([ACTION-IF-FOUND], [ACTION-IF-NOT-FOUND])
dnl =========================================================================
AC_DEFUN([AX_LUA_LIBS],
[
dnl TODO Should this macro also check various -L flags?
dnl Check for LUA_VERSION.
AC_MSG_CHECKING([if LUA_VERSION is defined])
AS_IF([test "x$LUA_VERSION" != 'x'],
[AC_MSG_RESULT([yes])],
[ AC_MSG_RESULT([no])
AC_MSG_ERROR([cannot check Lua libs without knowing LUA_VERSION])
])
dnl Make LUA_LIB a precious variable.
AC_ARG_VAR([LUA_LIB], [The Lua library, e.g. -llua5.1])
AS_IF([test "x$LUA_LIB" != 'x'],
[ dnl Check that LUA_LIB works.
_ax_lua_saved_libs=$LIBS
LIBS="$LIBS $LUA_LIB"
AC_SEARCH_LIBS([lua_load], [],
[_ax_found_lua_libs='yes'],
[_ax_found_lua_libs='no'])
LIBS=$_ax_lua_saved_libs
dnl Check the result.
AS_IF([test "x$_ax_found_lua_libs" != 'xyes'],
[AC_MSG_ERROR([cannot find libs for specified LUA_LIB])])
],
[ dnl First search for extra libs.
_ax_lua_extra_libs=''
_ax_lua_saved_libs=$LIBS
LIBS="$LIBS $LUA_LIB"
AC_SEARCH_LIBS([exp], [m])
AC_SEARCH_LIBS([dlopen], [dl])
LIBS=$_ax_lua_saved_libs
AS_IF([test "x$ac_cv_search_exp" != 'xno' &&
test "x$ac_cv_search_exp" != 'xnone required'],
[_ax_lua_extra_libs="$_ax_lua_extra_libs $ac_cv_search_exp"])
AS_IF([test "x$ac_cv_search_dlopen" != 'xno' &&
test "x$ac_cv_search_dlopen" != 'xnone required'],
[_ax_lua_extra_libs="$_ax_lua_extra_libs $ac_cv_search_dlopen"])
dnl Try to find the Lua libs.
_ax_lua_saved_libs=$LIBS
LIBS="$LIBS $LUA_LIB"
AC_SEARCH_LIBS([lua_load],
[ lua$LUA_VERSION \
lua$LUA_SHORT_VERSION \
lua-$LUA_VERSION \
lua-$LUA_SHORT_VERSION \
lua \
],
[_ax_found_lua_libs='yes'],
[_ax_found_lua_libs='no'],
[$_ax_lua_extra_libs])
LIBS=$_ax_lua_saved_libs
AS_IF([test "x$ac_cv_search_lua_load" != 'xno' &&
test "x$ac_cv_search_lua_load" != 'xnone required'],
[LUA_LIB="$ac_cv_search_lua_load $_ax_lua_extra_libs"])
])
dnl Test the result and run user code.
AS_IF([test "x$_ax_found_lua_libs" = 'xyes'], [$1],
[m4_default([$2], [AC_MSG_ERROR([cannot find Lua libs])])])
])
dnl =========================================================================
dnl AX_LUA_READLINE([ACTION-IF-FOUND], [ACTION-IF-NOT-FOUND])
dnl =========================================================================
AC_DEFUN([AX_LUA_READLINE],
[
AX_LIB_READLINE
AS_IF([test "x$ac_cv_header_readline_readline_h" != 'x' &&
test "x$ac_cv_header_readline_history_h" != 'x'],
[ LUA_LIBS_CFLAGS="-DLUA_USE_READLINE $LUA_LIBS_CFLAGS"
$1
],
[$2])
])


@ -1,61 +0,0 @@
# ===============================================================================
# https://www.gnu.org/software/autoconf-archive/ax_prog_dotnetcore_version.html
# ===============================================================================
#
# SYNOPSIS
#
# AX_PROG_DOTNETCORE_VERSION([VERSION],[ACTION-IF-TRUE],[ACTION-IF-FALSE])
#
# DESCRIPTION
#
# Makes sure that .NET Core supports the version indicated. If true the
# shell commands in ACTION-IF-TRUE are executed. If not the shell commands
# in ACTION-IF-FALSE are run. The $dotnetcore_version variable will be
# filled with the detected version.
#
# This macro uses the $DOTNETCORE variable to perform the check. If
# $DOTNETCORE is not set prior to calling this macro, the macro will fail.
#
# Example:
#
# AC_PATH_PROG([DOTNETCORE],[dotnet])
# AX_PROG_DOTNETCORE_VERSION([1.0.2],[ ... ],[ ... ])
#
# Searches for .NET Core, then checks if at least version 1.0.2 is
# present.
#
# LICENSE
#
# Copyright (c) 2016 Jens Geyer <jensg@apache.org>
#
# Copying and distribution of this file, with or without modification, are
# permitted in any medium without royalty provided the copyright notice
# and this notice are preserved. This file is offered as-is, without any
# warranty.
#serial 2
AC_DEFUN([AX_PROG_DOTNETCORE_VERSION],[
AC_REQUIRE([AC_PROG_SED])
AS_IF([test -n "$DOTNETCORE"],[
ax_dotnetcore_version="$1"
AC_MSG_CHECKING([for .NET Core version])
dotnetcore_version=`$DOTNETCORE --version 2>&1 | $SED -e 's/\(@<:@0-9@:>@*\.@<:@0-9@:>@*\.@<:@0-9@:>@*\)\(.*\)/\1/'`
AC_MSG_RESULT($dotnetcore_version)
AC_SUBST([DOTNETCORE_VERSION],[$dotnetcore_version])
AX_COMPARE_VERSION([$ax_dotnetcore_version],[le],[$dotnetcore_version],[
:
$2
],[
:
$3
])
],[
AC_MSG_WARN([could not find .NET Core])
$3
])
])


@ -1,60 +0,0 @@
# ===========================================================================
# https://www.gnu.org/software/autoconf-archive/ax_prog_haxe_version.html
# ===========================================================================
#
# SYNOPSIS
#
# AX_PROG_HAXE_VERSION([VERSION],[ACTION-IF-TRUE],[ACTION-IF-FALSE])
#
# DESCRIPTION
#
# Makes sure that haxe supports the version indicated. If true the shell
# commands in ACTION-IF-TRUE are executed. If not the shell commands in
# ACTION-IF-FALSE are run. The $HAXE_VERSION variable will be filled with
# the detected version.
#
# This macro uses the $HAXE variable to perform the check. If $HAXE is not
# set prior to calling this macro, the macro will fail.
#
# Example:
#
# AC_PATH_PROG([HAXE],[haxe])
# AX_PROG_HAXE_VERSION([3.1.3],[ ... ],[ ... ])
#
# Searches for Haxe, then checks if at least version 3.1.3 is present.
#
# LICENSE
#
# Copyright (c) 2015 Jens Geyer <jensg@apache.org>
#
# Copying and distribution of this file, with or without modification, are
# permitted in any medium without royalty provided the copyright notice
# and this notice are preserved. This file is offered as-is, without any
# warranty.
#serial 2
AC_DEFUN([AX_PROG_HAXE_VERSION],[
AC_REQUIRE([AC_PROG_SED])
AS_IF([test -n "$HAXE"],[
ax_haxe_version="$1"
AC_MSG_CHECKING([for haxe version])
haxe_version=`$HAXE -version 2>&1 | $SED -e 's/^.* \( @<:@0-9@:>@*\.@<:@0-9@:>@*\.@<:@0-9@:>@*\) .*/\1/'`
AC_MSG_RESULT($haxe_version)
AC_SUBST([HAXE_VERSION],[$haxe_version])
AX_COMPARE_VERSION([$ax_haxe_version],[le],[$haxe_version],[
:
$2
],[
:
$3
])
],[
AC_MSG_WARN([could not find Haxe])
$3
])
])


@ -1,77 +0,0 @@
# ===========================================================================
# https://www.gnu.org/software/autoconf-archive/ax_prog_perl_modules.html
# ===========================================================================
#
# SYNOPSIS
#
# AX_PROG_PERL_MODULES([MODULES], [ACTION-IF-TRUE], [ACTION-IF-FALSE])
#
# DESCRIPTION
#
# Checks to see if the given perl modules are available. If true the shell
# commands in ACTION-IF-TRUE are executed. If not the shell commands in
# ACTION-IF-FALSE are run. Note if $PERL is not set (for example by
# calling AC_CHECK_PROG, or AC_PATH_PROG), AC_CHECK_PROG(PERL, perl, perl)
# will be run.
#
# MODULES is a space separated list of module names. To check for a
# minimum version of a module, append the version number to the module
# name, separated by an equals sign.
#
# Example:
#
# AX_PROG_PERL_MODULES( Text::Wrap Net::LDAP=1.0.3, ,
# AC_MSG_WARN(Need some Perl modules))
#
# LICENSE
#
# Copyright (c) 2009 Dean Povey <povey@wedgetail.com>
#
# Copying and distribution of this file, with or without modification, are
# permitted in any medium without royalty provided the copyright notice
# and this notice are preserved. This file is offered as-is, without any
# warranty.
#serial 8
AU_ALIAS([AC_PROG_PERL_MODULES], [AX_PROG_PERL_MODULES])
AC_DEFUN([AX_PROG_PERL_MODULES],[dnl
m4_define([ax_perl_modules])
m4_foreach([ax_perl_module], m4_split(m4_normalize([$1])),
[
m4_append([ax_perl_modules],
[']m4_bpatsubst(ax_perl_module,=,[ ])[' ])
])
# Make sure we have perl
if test -z "$PERL"; then
AC_CHECK_PROG(PERL,perl,perl)
fi
if test "x$PERL" != x; then
ax_perl_modules_failed=0
for ax_perl_module in ax_perl_modules; do
AC_MSG_CHECKING(for perl module $ax_perl_module)
# Would be nice to log result here, but can't rely on autoconf internals
$PERL -e "use $ax_perl_module; exit" > /dev/null 2>&1
if test $? -ne 0; then
AC_MSG_RESULT(no);
ax_perl_modules_failed=1
else
AC_MSG_RESULT(ok);
fi
done
# Run optional shell commands
if test "$ax_perl_modules_failed" = 0; then
:
$2
else
:
$3
fi
else
AC_MSG_WARN(could not find perl)
fi])dnl


@ -1,127 +0,0 @@
dnl @synopsis AX_SIGNED_RIGHT_SHIFT
dnl
dnl Tests the behavior of a right shift on a negative signed int.
dnl
dnl This macro calls:
dnl AC_DEFINE(SIGNED_RIGHT_SHIFT_IS)
dnl AC_DEFINE(ARITHMETIC_RIGHT_SHIFT)
dnl AC_DEFINE(LOGICAL_RIGHT_SHIFT)
dnl AC_DEFINE(UNKNOWN_RIGHT_SHIFT)
dnl
dnl SIGNED_RIGHT_SHIFT_IS will be equal to one of the other macros.
dnl It also leaves the shell variable "ax_signed_right_shift"
dnl set to "arithmetic", "logical", or "unknown".
dnl
dnl NOTE: This macro does not work for cross-compiling.
dnl
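dnl A minimal usage sketch (the warning text is illustrative, not part of the
dnl macro): after the check, configure.ac can branch on that shell variable:
dnl
dnl   AX_SIGNED_RIGHT_SHIFT
dnl   AS_IF([test "x$ax_signed_right_shift" = 'xunknown'],
dnl     [AC_MSG_WARN([could not determine signed right shift behavior])])
dnl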
dnl @category C
dnl @version 2009-03-25
dnl @license AllPermissive
dnl
dnl Copyright (C) 2009 David Reiss
dnl Copying and distribution of this file, with or without modification,
dnl are permitted in any medium without royalty provided the copyright
dnl notice and this notice are preserved.
AC_DEFUN([AX_SIGNED_RIGHT_SHIFT],
[
AC_MSG_CHECKING(the behavior of a signed right shift)
success_arithmetic=no
AC_RUN_IFELSE([AC_LANG_PROGRAM([[]], [[
return
/* 0xffffffff */
-1 >> 1 != -1 ||
-1 >> 2 != -1 ||
-1 >> 3 != -1 ||
-1 >> 4 != -1 ||
-1 >> 8 != -1 ||
-1 >> 16 != -1 ||
-1 >> 24 != -1 ||
-1 >> 31 != -1 ||
/* 0x80000000 */
(-2147483647 - 1) >> 1 != -1073741824 ||
(-2147483647 - 1) >> 2 != -536870912 ||
(-2147483647 - 1) >> 3 != -268435456 ||
(-2147483647 - 1) >> 4 != -134217728 ||
(-2147483647 - 1) >> 8 != -8388608 ||
(-2147483647 - 1) >> 16 != -32768 ||
(-2147483647 - 1) >> 24 != -128 ||
(-2147483647 - 1) >> 31 != -1 ||
/* 0x90800000 */
-1870659584 >> 1 != -935329792 ||
-1870659584 >> 2 != -467664896 ||
-1870659584 >> 3 != -233832448 ||
-1870659584 >> 4 != -116916224 ||
-1870659584 >> 8 != -7307264 ||
-1870659584 >> 16 != -28544 ||
-1870659584 >> 24 != -112 ||
-1870659584 >> 31 != -1 ||
0;
]])], [
success_arithmetic=yes
])
success_logical=no
AC_RUN_IFELSE([AC_LANG_PROGRAM([[]], [[
return
/* 0xffffffff */
-1 >> 1 != (signed)((unsigned)-1 >> 1) ||
-1 >> 2 != (signed)((unsigned)-1 >> 2) ||
-1 >> 3 != (signed)((unsigned)-1 >> 3) ||
-1 >> 4 != (signed)((unsigned)-1 >> 4) ||
-1 >> 8 != (signed)((unsigned)-1 >> 8) ||
-1 >> 16 != (signed)((unsigned)-1 >> 16) ||
-1 >> 24 != (signed)((unsigned)-1 >> 24) ||
-1 >> 31 != (signed)((unsigned)-1 >> 31) ||
/* 0x80000000 */
(-2147483647 - 1) >> 1 != (signed)((unsigned)(-2147483647 - 1) >> 1) ||
(-2147483647 - 1) >> 2 != (signed)((unsigned)(-2147483647 - 1) >> 2) ||
(-2147483647 - 1) >> 3 != (signed)((unsigned)(-2147483647 - 1) >> 3) ||
(-2147483647 - 1) >> 4 != (signed)((unsigned)(-2147483647 - 1) >> 4) ||
(-2147483647 - 1) >> 8 != (signed)((unsigned)(-2147483647 - 1) >> 8) ||
(-2147483647 - 1) >> 16 != (signed)((unsigned)(-2147483647 - 1) >> 16) ||
(-2147483647 - 1) >> 24 != (signed)((unsigned)(-2147483647 - 1) >> 24) ||
(-2147483647 - 1) >> 31 != (signed)((unsigned)(-2147483647 - 1) >> 31) ||
/* 0x90800000 */
-1870659584 >> 1 != (signed)((unsigned)-1870659584 >> 1) ||
-1870659584 >> 2 != (signed)((unsigned)-1870659584 >> 2) ||
-1870659584 >> 3 != (signed)((unsigned)-1870659584 >> 3) ||
-1870659584 >> 4 != (signed)((unsigned)-1870659584 >> 4) ||
-1870659584 >> 8 != (signed)((unsigned)-1870659584 >> 8) ||
-1870659584 >> 16 != (signed)((unsigned)-1870659584 >> 16) ||
-1870659584 >> 24 != (signed)((unsigned)-1870659584 >> 24) ||
-1870659584 >> 31 != (signed)((unsigned)-1870659584 >> 31) ||
0;
]])], [
success_logical=yes
])
AC_DEFINE([ARITHMETIC_RIGHT_SHIFT], 1, [Possible value for SIGNED_RIGHT_SHIFT_IS])
AC_DEFINE([LOGICAL_RIGHT_SHIFT], 2, [Possible value for SIGNED_RIGHT_SHIFT_IS])
AC_DEFINE([UNKNOWN_RIGHT_SHIFT], 3, [Possible value for SIGNED_RIGHT_SHIFT_IS])
if test "$success_arithmetic" = "yes" && test "$success_logical" = "yes" ; then
AC_MSG_ERROR("Right shift appears to be both arithmetic and logical!")
elif test "$success_arithmetic" = "yes" ; then
ax_signed_right_shift=arithmetic
AC_DEFINE([SIGNED_RIGHT_SHIFT_IS], 1,
[Indicates the effect of the right shift operator
on negative signed integers])
elif test "$success_logical" = "yes" ; then
ax_signed_right_shift=logical
AC_DEFINE([SIGNED_RIGHT_SHIFT_IS], 2,
[Indicates the effect of the right shift operator
on negative signed integers])
else
ax_signed_right_shift=unknown
AC_DEFINE([SIGNED_RIGHT_SHIFT_IS], 3,
[Indicates the effect of the right shift operator
on negative signed integers])
fi
AC_MSG_RESULT($ax_signed_right_shift)
])


@ -1,28 +0,0 @@
dnl @synopsis AX_THRIFT_GEN(SHORT_LANGUAGE, LONG_LANGUAGE, DEFAULT)
dnl @synopsis AX_THRIFT_LIB(SHORT_LANGUAGE, LONG_LANGUAGE, DEFAULT)
dnl
dnl Allow a particular language generator to be disabled.
dnl Allow a particular language library to be disabled.
dnl
dnl These macros have poor error handling and are poorly documented.
dnl They are intended only for internal use by the Thrift compiler.
dnl
dnl @version 2008-02-20
dnl @license AllPermissive
dnl
dnl Copyright (C) 2009 David Reiss
dnl Copying and distribution of this file, with or without modification,
dnl are permitted in any medium without royalty provided the copyright
dnl notice and this notice are preserved.
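dnl
dnl For illustration only (the "foo" language name is hypothetical), a
dnl configure.ac would typically pair the macro with a conditional:
dnl
dnl   AX_THRIFT_LIB(foo, [Foo], yes)
dnl   AM_CONDITIONAL([WITH_FOO], [test "$with_foo" = "yes"])
dnl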
AC_DEFUN([AX_THRIFT_LIB],
[
AC_ARG_WITH($1,
AC_HELP_STRING([--with-$1], [build the $2 library @<:@default=$3@:>@]),
[with_$1="$withval"],
[with_$1=$3]
)
have_$1=no
dnl What we do here is going to vary from library to library,
dnl so we can't really generalize (yet!).
])


@ -1,110 +0,0 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
# build Apache Thrift on AppVeyor - https://ci.appveyor.com
version: '0.12.0.{build}'
shallow_clone: true
os:
- Visual Studio 2017
matrix:
allow_failures:
- PROFILE: CYGWIN
fast_finish: true
environment:
matrix:
- PROFILE: MSVC2017
PLATFORM: x64
CONFIGURATION: Release
BOOST_VERSION: 1.65.1
LIBEVENT_VERSION: 2.1.8
PYTHON_VERSION: 3.6
QT_VERSION: 5.10
ZLIB_VERSION: 1.2.11
DISABLED_TESTS: StressTestNonBlocking
- PROFILE: MSVC2013
PLATFORM: x86
CONFIGURATION: Release
BOOST_VERSION: 1.58.0
LIBEVENT_VERSION: 2.0.22
PYTHON_VERSION: 3.5
QT_VERSION: 5.8
ZLIB_VERSION: 1.2.8
DISABLED_TESTS: StressTestNonBlocking
APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015
- PROFILE: MINGW
PLATFORM: x64
CONFIGURATION: RelWithDebInfo
DISABLED_TESTS: StressTestNonBlocking
- PROFILE: CYGWIN
PLATFORM: x86
CONFIGURATION: RelWithDebInfo
DISABLED_TESTS: (ZlibTest|OpenSSLManualInitTest|TNonblockingServerTest|StressTestNonBlocking)
# - PROFILE: CYGWIN
# PLATFORM: x64
# CONFIGURATION: RelWithDebInfo
# DISABLED_TESTS: (ZlibTest|OpenSSLManualInitTest|TNonblockingServerTest|StressTestNonBlocking)
install:
- cd %APPVEYOR_BUILD_FOLDER%
- call build\appveyor\%PROFILE:~0,4%-appveyor-install.bat
- refreshenv
build_script:
- cd %APPVEYOR_BUILD_FOLDER%
- call build\appveyor\%PROFILE:~0,4%-appveyor-build.bat
test_script:
- cd %APPVEYOR_BUILD_FOLDER%
- call build\appveyor\%PROFILE:~0,4%-appveyor-test.bat
# artifact capture disabled as it might increase service cost for little gain:
#
# artifacts:
# - path: local-thrift-inst
# name: cmake installed content
# type: zip
#
# - path: local-thrift-build\Testing
# name: ctest output
# type: zip
# RDP support: use one or the other...
#
# enables RDP for each build job so you can inspect the environment at the beginning of the job:
# init:
# - ps: iex ((new-object net.webclient).DownloadString('https://raw.githubusercontent.com/appveyor/ci/master/scripts/enable-rdp.ps1'))
#
# enables RDP at the end of the build job so you can login and re-run
# commands to see why something failed...
#on_finish:
# - ps: $blockRdp = $true; iex ((new-object net.webclient).DownloadString('https://raw.githubusercontent.com/appveyor/ci/master/scripts/enable-rdp.ps1'))
#
# also need:
# environment:
# APPVEYOR_RDP_PASSWORD: thr1FT2345$xyzZ


@ -1,61 +0,0 @@
#!/bin/sh
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
./cleanup.sh
if test -d lib/php/src/ext/thrift_protocol ; then
if phpize -v >/dev/null 2>/dev/null ; then
(cd lib/php/src/ext/thrift_protocol && phpize)
fi
fi
set -e
# libtoolize is called "glibtoolize" on OSX.
if libtoolize --version 1 >/dev/null 2>/dev/null; then
LIBTOOLIZE=libtoolize
elif glibtoolize --version 1 >/dev/null 2>/dev/null; then
LIBTOOLIZE=glibtoolize
else
echo >&2 "Couldn't find libtoolize!"
exit 1
fi
format_version () {
printf "%03d%03d%03d%03d" $(echo $1 | tr '.' ' ');
}
# we require automake 1.13 or later
# check must happen externally due to use of newer macro
AUTOMAKE_VERSION=`automake --version | grep automake | egrep -o '([0-9]{1,}\.)+[0-9]{1,}'`
if [ $(format_version $AUTOMAKE_VERSION) -lt $(format_version 1.13) ]; then
echo >&2 "automake version $AUTOMAKE_VERSION is too old (need 1.13 or later)"
exit 1
fi
set -e
autoscan
$LIBTOOLIZE --copy --automake
aclocal -I ./aclocal
autoheader
sed '/undef VERSION/d' config.hin > config.hin2
mv config.hin2 config.hin
autoconf
automake --copy --add-missing --foreign


@ -1,16 +0,0 @@
{
"name": "thrift",
"version": "0.12.0",
"homepage": "https://git-wip-us.apache.org/repos/asf/thrift.git",
"authors": [
"Apache Thrift <dev@thrift.apache.org>"
],
"description": "Apache Thrift",
"main": "lib/js/src/thrift.js",
"keywords": [
"thrift"
],
"license": "Apache v2",
"ignore": [
]
}


@ -1,36 +0,0 @@
::
:: Licensed under the Apache License, Version 2.0 (the "License");
:: you may not use this file except in compliance with the License.
:: You may obtain a copy of the License at
::
:: http://www.apache.org/licenses/LICENSE-2.0
::
:: Unless required by applicable law or agreed to in writing, software
:: distributed under the License is distributed on an "AS IS" BASIS,
:: WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
:: See the License for the specific language governing permissions and
:: limitations under the License.
::
@ECHO OFF
SETLOCAL EnableDelayedExpansion
CD build\appveyor || EXIT /B
CALL cl_banner_build.bat || EXIT /B
CALL cl_setenv.bat || EXIT /B
SET CMAKEARGS=^
-G'%GENERATOR%' ^
-DCMAKE_BUILD_TYPE=%CONFIGURATION% ^
-DCMAKE_INSTALL_PREFIX=%INSTDIR% ^
-DCMAKE_CXX_EXTENSIONS=ON ^
-DCMAKE_CXX_FLAGS="-D_GNU_SOURCE" ^
-DCMAKE_CXX_STANDARD=11 ^
-DWITH_PYTHON=OFF ^
-DWITH_SHARED_LIB=OFF ^
-DWITH_STATIC_LIB=ON ^
-DWITH_STDTHREADS=ON
@ECHO ON
%BASH% -lc "mkdir -p %BUILDDIR% && cd %BUILDDIR% && cmake.exe %SRCDIR% %CMAKEARGS% && cmake --build . --config %CONFIGURATION% --target install" || EXIT /B
@ECHO OFF


@ -1,34 +0,0 @@
::
:: Licensed under the Apache License, Version 2.0 (the "License");
:: you may not use this file except in compliance with the License.
:: You may obtain a copy of the License at
::
:: http://www.apache.org/licenses/LICENSE-2.0
::
:: Unless required by applicable law or agreed to in writing, software
:: distributed under the License is distributed on an "AS IS" BASIS,
:: WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
:: See the License for the specific language governing permissions and
:: limitations under the License.
::
::
:: Appveyor install script for CYGWIN
:: Installs third party packages we need for a cmake build
::
@ECHO OFF
SETLOCAL EnableDelayedExpansion
CD build\appveyor || EXIT /B
CALL cl_banner_install.bat || EXIT /B
CALL cl_setenv.bat || EXIT /B
CALL cl_showenv.bat || EXIT /B
::
:: Install apt-cyg for package management
::
%BASH% -lc "wget rawgit.com/transcode-open/apt-cyg/master/apt-cyg && install apt-cyg /bin && rm -f apt-cyg" || EXIT /B
%BASH% -lc "apt-cyg update" || EXIT /B
%BASH% -lc "apt-cyg install bison cmake flex gcc-g++ libboost-devel libevent-devel make openssl-devel zlib-devel"


@ -1,21 +0,0 @@
::
:: Licensed under the Apache License, Version 2.0 (the "License");
:: you may not use this file except in compliance with the License.
:: You may obtain a copy of the License at
::
:: http://www.apache.org/licenses/LICENSE-2.0
::
:: Unless required by applicable law or agreed to in writing, software
:: distributed under the License is distributed on an "AS IS" BASIS,
:: WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
:: See the License for the specific language governing permissions and
:: limitations under the License.
::
@ECHO OFF
SETLOCAL EnableDelayedExpansion
CD build\appveyor || EXIT /B
CALL cl_banner_test.bat || EXIT /B
CALL cl_setenv.bat || EXIT /B
%BASH% -lc "cd %BUILDDIR% && ctest.exe -C %CONFIGURATION% --timeout 300 -VV -E '%DISABLED_TESTS%'" || EXIT /B


@ -1,36 +0,0 @@
::
:: Licensed under the Apache License, Version 2.0 (the "License");
:: you may not use this file except in compliance with the License.
:: You may obtain a copy of the License at
::
:: http://www.apache.org/licenses/LICENSE-2.0
::
:: Unless required by applicable law or agreed to in writing, software
:: distributed under the License is distributed on an "AS IS" BASIS,
:: WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
:: See the License for the specific language governing permissions and
:: limitations under the License.
::
@ECHO OFF
SETLOCAL EnableDelayedExpansion
CD build\appveyor || EXIT /B
CALL cl_banner_build.bat || EXIT /B
CALL cl_setenv.bat || EXIT /B
SET CMAKEARGS=^
-G'%GENERATOR%' ^
-DCMAKE_BUILD_TYPE=%CONFIGURATION% ^
-DCMAKE_INSTALL_PREFIX=%INSTDIR% ^
-DCMAKE_MAKE_PROGRAM=/mingw%NORM_PLATFORM%/bin/mingw32-make ^
-DCMAKE_C_COMPILER=/mingw%NORM_PLATFORM%/bin/gcc.exe ^
-DCMAKE_CXX_COMPILER=/mingw%NORM_PLATFORM%/bin/g++.exe ^
-DOPENSSL_ROOT_DIR=/mingw%NORM_PLATFORM% ^
-DWITH_PYTHON=OFF ^
-DWITH_SHARED_LIB=OFF ^
-DWITH_STATIC_LIB=ON
@ECHO ON
%BASH% -lc "mkdir -p %BUILDDIR% && cd %BUILDDIR% && cmake.exe %SRCDIR% %CMAKEARGS% && cmake --build . --config %CONFIGURATION% --target install" || EXIT /B
@ECHO OFF


@ -1,45 +0,0 @@
::
:: Licensed under the Apache License, Version 2.0 (the "License");
:: you may not use this file except in compliance with the License.
:: You may obtain a copy of the License at
::
:: http://www.apache.org/licenses/LICENSE-2.0
::
:: Unless required by applicable law or agreed to in writing, software
:: distributed under the License is distributed on an "AS IS" BASIS,
:: WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
:: See the License for the specific language governing permissions and
:: limitations under the License.
::
::
:: Appveyor install script for MINGW on MSYS2
:: Installs third party packages we need for a cmake build
::
@ECHO OFF
SETLOCAL EnableDelayedExpansion
CD build\appveyor || EXIT /B
CALL cl_banner_install.bat || EXIT /B
CALL cl_setenv.bat || EXIT /B
CALL cl_showenv.bat || EXIT /B
SET PACKAGES=^
--needed -S bison flex make ^
mingw-w64-%MINGWPLAT%-boost ^
mingw-w64-%MINGWPLAT%-cmake ^
mingw-w64-%MINGWPLAT%-libevent ^
mingw-w64-%MINGWPLAT%-openssl ^
mingw-w64-%MINGWPLAT%-toolchain ^
mingw-w64-%MINGWPLAT%-zlib
::mingw-w64-%MINGWPLAT%-qt5 : WAY too large (1GB download!) - tested in cygwin builds anyway
:: Remove old packages that no longer exist to avoid an error
%BASH% -lc "pacman --noconfirm --remove libcatgets catgets || true" || EXIT /B
:: Upgrade things
%BASH% -lc "pacman --noconfirm -Syu %IGNORE%" || EXIT /B
%BASH% -lc "pacman --noconfirm -Su %IGNORE%" || EXIT /B
%BASH% -lc "pacman --noconfirm %PACKAGES%" || EXIT /B


@ -1,22 +0,0 @@
::
:: Licensed under the Apache License, Version 2.0 (the "License");
:: you may not use this file except in compliance with the License.
:: You may obtain a copy of the License at
::
:: http://www.apache.org/licenses/LICENSE-2.0
::
:: Unless required by applicable law or agreed to in writing, software
:: distributed under the License is distributed on an "AS IS" BASIS,
:: WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
:: See the License for the specific language governing permissions and
:: limitations under the License.
::
@ECHO OFF
SETLOCAL EnableDelayedExpansion
CD build\appveyor || EXIT /B
CALL cl_banner_test.bat || EXIT /B
CALL cl_setenv.bat || EXIT /B
%BASH% -lc "cd %BUILDDIR% && ctest.exe -C %CONFIGURATION% --timeout 300 -VV -E '%DISABLED_TESTS%'" || EXIT /B


@ -1,47 +0,0 @@
::
:: Licensed under the Apache License, Version 2.0 (the "License");
:: you may not use this file except in compliance with the License.
:: You may obtain a copy of the License at
::
:: http://www.apache.org/licenses/LICENSE-2.0
::
:: Unless required by applicable law or agreed to in writing, software
:: distributed under the License is distributed on an "AS IS" BASIS,
:: WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
:: See the License for the specific language governing permissions and
:: limitations under the License.
::
@ECHO OFF
SETLOCAL EnableDelayedExpansion
CD build\appveyor || EXIT /B
CALL cl_banner_build.bat || EXIT /B
CALL cl_setenv.bat || EXIT /B
MKDIR "%BUILDDIR%" || EXIT /B
CD "%BUILDDIR%" || EXIT /B
@ECHO ON
cmake "%SRCDIR%" ^
-G"%GENERATOR%" ^
-DBISON_EXECUTABLE=C:\ProgramData\chocolatey\lib\winflexbison3\tools\win_bison.exe ^
-DBOOST_ROOT="%BOOST_ROOT%" ^
-DBOOST_LIBRARYDIR="%BOOST_LIBRARYDIR%" ^
-DCMAKE_BUILD_TYPE="%CONFIGURATION%" ^
-DCMAKE_INSTALL_PREFIX="%INSTDIR%" ^
-DFLEX_EXECUTABLE=C:\ProgramData\chocolatey\lib\winflexbison3\tools\win_flex.exe ^
-DINTTYPES_ROOT="%WIN3P%\msinttypes" ^
-DLIBEVENT_ROOT="%WIN3P%\libevent-%LIBEVENT_VERSION%-stable" ^
-DOPENSSL_ROOT_DIR="%OPENSSL_ROOT%" ^
-DOPENSSL_USE_STATIC_LIBS=OFF ^
-DZLIB_LIBRARY="%WIN3P%\zlib-inst\lib\zlib%ZLIB_LIB_SUFFIX%.lib" ^
-DZLIB_ROOT="%WIN3P%\zlib-inst" ^
-DWITH_PYTHON=%WITH_PYTHON% ^
-DWITH_%THREADMODEL%THREADS=ON ^
-DWITH_SHARED_LIB=OFF ^
-DWITH_STATIC_LIB=ON || EXIT /B
@ECHO OFF
cmake --build . ^
--config "%CONFIGURATION%" ^
--target INSTALL || EXIT /B


@ -1,59 +0,0 @@
::
:: Licensed under the Apache License, Version 2.0 (the "License");
:: you may not use this file except in compliance with the License.
:: You may obtain a copy of the License at
::
:: http://www.apache.org/licenses/LICENSE-2.0
::
:: Unless required by applicable law or agreed to in writing, software
:: distributed under the License is distributed on an "AS IS" BASIS,
:: WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
:: See the License for the specific language governing permissions and
:: limitations under the License.
::
::
:: Appveyor install script for MSVC
:: Installs (or builds) third party packages we need
::
@ECHO OFF
SETLOCAL EnableDelayedExpansion
CD build\appveyor || EXIT /B
CALL cl_banner_install.bat || EXIT /B
CALL cl_setenv.bat || EXIT /B
CALL cl_showenv.bat || EXIT /B
MKDIR "%WIN3P%" || EXIT /B
choco feature enable -n allowGlobalConfirmation || EXIT /B
:: Things to install when NOT running in appveyor:
IF "%APPVEYOR_BUILD_ID%" == "" (
cup -y chocolatey || EXIT /B
cinst -y curl || EXIT /B
cinst -y 7zip || EXIT /B
cinst -y python3 || EXIT /B
cinst -y openssl.light || EXIT /B
)
cinst -y jdk8 || EXIT /B
cinst -y winflexbison3 || EXIT /B
:: zlib - not available through chocolatey
CD "%APPVEYOR_SCRIPTS%" || EXIT /B
call build-zlib.bat || EXIT /B
:: libevent - not available through chocolatey
CD "%APPVEYOR_SCRIPTS%" || EXIT /B
call build-libevent.bat || EXIT /B
:: python packages (correct path to pip set in cl_setenv.bat)
pip.exe ^
install backports.ssl_match_hostname ^
ipaddress ^
six ^
tornado ^
twisted || EXIT /B
cinst -y ghc || EXIT /B


@ -1,32 +0,0 @@
::
:: Licensed under the Apache License, Version 2.0 (the "License");
:: you may not use this file except in compliance with the License.
:: You may obtain a copy of the License at
::
:: http://www.apache.org/licenses/LICENSE-2.0
::
:: Unless required by applicable law or agreed to in writing, software
:: distributed under the License is distributed on an "AS IS" BASIS,
:: WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
:: See the License for the specific language governing permissions and
:: limitations under the License.
::
@ECHO ON
SETLOCAL EnableDelayedExpansion
CD build\appveyor || EXIT /B
CALL cl_banner_test.bat || EXIT /B
CALL cl_setenv.bat || EXIT /B
CD "%BUILDDIR%" || EXIT /B
DIR C:\libraries
DIR C:\libraries\boost_1_59_0
DIR C:\libraries\boost_1_60_0
DIR C:\libraries\boost_1_62_0
DIR C:\libraries\boost_1_63_0
DIR C:\libraries\boost_1_64_0
:: Add directories to the path to find DLLs of third party libraries so tests run properly!
SET PATH=%BOOST_LIBRARYDIR:/=\%;%OPENSSL_ROOT%\bin;%WIN3P%\zlib-inst\bin;%PATH%
ctest -C %CONFIGURATION% --timeout 300 -VV -E "(%DISABLED_TESTS%)" || EXIT /B


@ -1,48 +0,0 @@
::
:: Licensed under the Apache License, Version 2.0 (the "License");
:: you may not use this file except in compliance with the License.
:: You may obtain a copy of the License at
::
:: http://www.apache.org/licenses/LICENSE-2.0
::
:: Unless required by applicable law or agreed to in writing, software
:: distributed under the License is distributed on an "AS IS" BASIS,
:: WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
:: See the License for the specific language governing permissions and
:: limitations under the License.
::
@ECHO OFF
SETLOCAL EnableDelayedExpansion
CD build\appveyor || EXIT /B
CALL cl_banner_build.bat || EXIT /B
CALL cl_setenv.bat || EXIT /B
SET BASH=C:\msys64\usr\bin\bash
SET CMAKE=/c/msys64/mingw64/bin/cmake.exe
@ECHO ON
SET CMAKEARGS=-G\"%GENERATOR%\" ^
-DBoost_DEBUG=ON ^
-DBoost_NAMESPACE=libboost ^
-DBOOST_INCLUDEDIR=%BOOST_INCLUDEDIR% ^
-DBOOST_LIBRARYDIR=%BOOST_LIBRARYDIR% ^
-DCMAKE_BUILD_TYPE=%CONFIGURATION% ^
-DCMAKE_C_COMPILER=gcc.exe ^
-DCMAKE_CXX_COMPILER=g++.exe ^
-DCMAKE_MAKE_PROGRAM=make.exe ^
-DCMAKE_INSTALL_PREFIX=%INSTDIR_MSYS% ^
-DLIBEVENT_ROOT=%LIBEVENT_ROOT% ^
-DOPENSSL_LIBRARIES=%OPENSSL_LIBRARIES% ^
-DOPENSSL_ROOT_DIR=%OPENSSL_ROOT% ^
-DOPENSSL_USE_STATIC_LIBS=ON ^
-DWITH_BOOST_STATIC=ON ^
-DWITH_JAVA=OFF ^
-DWITH_LIBEVENT=ON ^
-DWITH_PYTHON=%WITH_PYTHON% ^
-DWITH_SHARED_LIB=OFF ^
-DWITH_STATIC_LIB=ON
%BASH% -lc "mkdir %BUILDDIR_MSYS% && cd %BUILDDIR_MSYS% && %CMAKE% %SRCDIR_MSYS% %CMAKEARGS% && %CMAKE% --build . --config %CONFIGURATION% --target install" || EXIT /B
@ECHO OFF


@ -1,48 +0,0 @@
::
:: Licensed under the Apache License, Version 2.0 (the "License");
:: you may not use this file except in compliance with the License.
:: You may obtain a copy of the License at
::
:: http://www.apache.org/licenses/LICENSE-2.0
::
:: Unless required by applicable law or agreed to in writing, software
:: distributed under the License is distributed on an "AS IS" BASIS,
:: WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
:: See the License for the specific language governing permissions and
:: limitations under the License.
::
::
:: Appveyor install script for MSYS
:: Installs (or builds) third party packages we need
::
@ECHO OFF
SETLOCAL EnableDelayedExpansion
CD build\appveyor || EXIT /B
CALL cl_banner_install.bat || EXIT /B
CALL cl_setenv.bat || EXIT /B
CALL cl_showenv.bat || EXIT /B
:: We're going to keep boost at a version cmake understands
SET BOOSTPKG=mingw-w64-x86_64-boost-1.64.0-3-any.pkg.tar.xz
SET IGNORE=--ignore mingw-w64-x86_64-boost
SET PACKAGES=^
--needed -S bison flex make ^
mingw-w64-x86_64-cmake ^
mingw-w64-x86_64-libevent ^
mingw-w64-x86_64-openssl ^
mingw-w64-x86_64-toolchain ^
mingw-w64-x86_64-zlib
%BASH% -lc "pacman --noconfirm -Syu %IGNORE%" || EXIT /B
%BASH% -lc "pacman --noconfirm -Su %IGNORE%" || EXIT /B
%BASH% -lc "pacman --noconfirm %PACKAGES%" || EXIT /B
:: Install a slightly older boost (1.64.0) as cmake 3.10
:: does not have built-in dependencies for boost 1.66.0 yet
:: -- this cuts down on build warning output --
%BASH% -lc "wget http://repo.msys2.org/mingw/x86_64/%BOOSTPKG% && pacman --noconfirm --needed -U %BOOSTPKG% && rm %BOOSTPKG%" || EXIT /B


@ -1,26 +0,0 @@
::
:: Licensed under the Apache License, Version 2.0 (the "License");
:: you may not use this file except in compliance with the License.
:: You may obtain a copy of the License at
::
:: http://www.apache.org/licenses/LICENSE-2.0
::
:: Unless required by applicable law or agreed to in writing, software
:: distributed under the License is distributed on an "AS IS" BASIS,
:: WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
:: See the License for the specific language governing permissions and
:: limitations under the License.
::
@ECHO OFF
SETLOCAL EnableDelayedExpansion
CD build\appveyor || EXIT /B
CALL cl_banner_test.bat || EXIT /B
CALL cl_setenv.bat || EXIT /B
CD "%BUILDDIR%" || EXIT /B
:: randomly fails on mingw; see Jira THRIFT-4106
SET DISABLED_TESTS=concurrency_test
%BASH% -lc "cd %BUILDDIR_MSYS% && ctest.exe -C %CONFIGURATION% --timeout 300 -VV -E '(%DISABLED_TESTS%)'" || EXIT /B


@ -1,34 +0,0 @@
<!---
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
# Appveyor Build
Appveyor is capable of building MSVC 2010 through 2015 as well as
having the latest MSYS2/MinGW 64-bit environment. It has many versions
of boost and python installed as well. See what appveyor has
[installed on build workers](https://www.appveyor.com/docs/installed-software/).
We run a matrix build on Appveyor and build the following combinations:
* MinGW x64 (gcc 6.3.0)
* MSVC 2010 x86, an older boost, an older python
* MSVC 2015 x86/x64, the latest boost, the latest python
* MSYS2 x64 (gcc 6.3.0) - this is a work in progress
The Appveyor script takes the first four letters from the PROFILE specified in
the environment stanza and runs these scripts in order:
????-appveyor-install.bat will install third party libraries and set up the environment
????-appveyor-build.bat will build with cmake
????-appveyor-test.bat will run ctest

Some files were not shown because too many files have changed in this diff.