[RFC] building postgres with meson
Hi,
For the last year or so I've tinkered on and off with $subject. I think
it's in a state worth sharing now. First, let's look at a little
comparison.
My workstation:
non-cached configure:
current: 11.80s
meson: 6.67s
non-cached build (world-bin):
current: 40.46s
ninja: 7.31s
no-change build:
current: 1.17s
ninja: 0.06s
test world:
current: 105s
meson: 63s
What actually started to motivate me, however, were the long times windows
builds took to come back with test results. On CI, with the same machine
config:
build:
current: 202s (doesn't include genbki etc)
meson+ninja: 140s
meson+msbuild: 206s
test:
current: 1323s (many commands)
meson: 903s (single command)
(note that the test comparison isn't quite fair - there are a few tests
missing, but they're just small contrib ones afaik)
The biggest difference to me however is not the speed, but how readable
the output is.
Running the tests with meson in a terminal shows the number of tests that
completed out of the total, how much time has passed, and how long the
currently running tests have already been running.
At the end of a test run a count of tests is shown:
188/189 postgresql:tap+pg_basebackup / pg_basebackup/t/010_pg_basebackup.pl OK 39.51s 110 subtests passed
189/189 postgresql:isolation+snapshot_too_old / snapshot_too_old/isolation OK 62.93s
Ok: 188
Expected Fail: 0
Fail: 1
Unexpected Pass: 0
Skipped: 0
Timeout: 0
Full log written to /tmp/meson/meson-logs/testlog.txt
The log has the output of the tests and ends with:
Summary of Failures:
120/189 postgresql:tap+recovery / recovery/t/007_sync_rep.pl ERROR 7.16s (exit status 255 or signal 127 SIGinvalid)
Quite the difference to make check-world -jnn output.
So, now that the teasing is done, let me explain a bit what led me down
this path:
Autoconf + make is not being actively developed. Autoconf especially is
*barely* in maintenance mode - despite many shortcomings and bugs. It's
also technology that very few want to use - autoconf m4 is scary, and
scarier still for people who started more recently than, for example, a
lot of us committers.
Recursive make as we use it is hard to get right. One reason the clean
make build is so slow compared to meson is that we had to resort to
.NOTPARALLEL to handle dependencies in a bunch of places. And despite
that, I quite regularly see incremental build failures that can be
resolved by retrying the build.
While we have incremental builds via --enable-depend, they don't work
that reliably (i.e. necessary rebuilds are sometimes missed) and yet are
often too aggressive. More modern build systems keep track of the precise
command used to build a target and rebuild it when that command changes.
We also don't just have the autoconf / make buildsystem; there's also
the msvc project generator - something most of us unix-y folks do not
like to touch. I think that, combined with there being no easy way to
run all tests, and it being just different, has really hurt our windows
developer appeal (and consequently the quality of postgres on
windows). I'm not saying this to ding the project generator - it was
written well before there were decent "meta" buildsystems out there (and
in some ways it is a small one itself).
The last big issue I have with the current situation is that there's no
good test integration. make check-world output is essentially unreadable
and not automatically parseable. That led to the buildfarm having a
separate list of things it needs to test, so that failures can be
pinpointed and paired with the appropriate logs. That approach
unfortunately doesn't scale well to multi-core CPUs, slowing down the
buildfarm by a fair bit.
This all led me to experiment with improvements. I tried a few
somewhat crazy but incremental things, like converting our buildsystem to
non-recursive make (I got it to build the backend, but it's too hard to
do manually I think), or not running tests during the recursive make
check-world but instead appending commands to a list of tests that is
then run by a helper (can kinda be made to work). In the end I
concluded that the amount of time we'd need to invest to maintain our
more-and-more custom buildsystem going forward doesn't make sense.
Which led me to look around and analyze which other buildsystems there
are that could make some sense for us. The halfway decent list includes,
I think:
1) cmake
2) bazel
3) meson
cmake would be a decent choice, I think. However, I just can't fully
warm up to it. Something about it just doesn't quite sit right with
me. That's not a good enough reason to prevent others from suggesting we
use it, but it's good enough to justify not investing a lot of time in
it myself.
Bazel has some nice architectural properties. But it requires a JVM to
run - I think that basically makes it unsuitable for us. And the build
information seems quite arduous to maintain too.
Which left me with meson. It is a meta-buildsystem that can do the
actual work of building via ninja (the most common backend, also targeted
by cmake), msbuild (visual studio project files, important for GUI work)
and xcode projects (I assume that's for a macos IDE, but I haven't tried
to use it). Meson roughly does what autoconf+automake did, in a
python-esque DSL, and outputs build instructions for ninja / msbuild /
xcode. One interesting bit is that meson itself is written in python (and
fairly easy to contribute to - I've got a few changes in now).
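To give a feel for the DSL, here is a minimal, purely illustrative
meson.build sketch (the target name, option handling and source list are
made up for this email, not taken from the attached patch):
  project('postgres', 'c', version: '15devel')
  cc = meson.get_compiler('c')
  # optional dependency, discovered via pkg-config / cmake / system paths
  zlib = dependency('zlib', required: false)
  executable('postgres',
    ['main.c'],          # real source list elided
    dependencies: [zlib],
    install: true)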
I don't think meson is perfect architecturally - e.g. its insistence on
not having functions ends up making it a bit harder to avoid duplicating
code. There are some user-interface oddities that are now hard to fix
fully, due to the fairly wide usage. But all-in-all it's pretty nice to
use.
It's worth calling out that a lot of large open source projects have
been / are migrating to meson: qemu/kvm, mesa (a core part of the
graphics stack on linux, also widely used on other platforms), a good
chunk of GNOME, and quite a few more. Given that, it seems unlikely to be
abandoned soon.
As far as I can tell the only OS that postgres currently supports that
meson doesn't support is HPUX. It'd likely be fairly easy to add
gcc-on-hpux support; adding support for the proprietary ones would take a
chunk more work.
The attached patch (meson support is 0016, the rest are prerequisites
that aren't that interesting at this stage) converts most of postgres to
meson. A few contrib modules are missing, only about half the optional
library dependencies are implemented, and I've only built on x64. It
builds on freebsd, linux, macos and windows (both ninja and msbuild) and
cross-builds from linux to windows. Thomas helped make the freebsd /
macos pieces a reality, thanks!
I took a number of shortcuts (although there used to be a *lot*
more). So this shouldn't be reviewed to the normal standard of the
community - it's a prototype. But I think it's in a complete enough shape
to allow a well-informed evaluation.
What doesn't yet work / build:
- plenty of optional libraries, contrib modules, NLS, the docs build
- PGXS - and I don't yet know what to best do about it. One
backward-compatible way would be to continue to use makefiles for pgxs,
but do the necessary replacement of Makefile.global.in via meson (and
not use that for postgres' own build); a rough sketch of that idea
follows after this list. But that doesn't really provide a nicer path
for building postgres extensions on windows, so it'd definitely not be a
long-term path.
- JIT bitcode generation for anything but src/backend.
- anything but modern-ish x86. That's probably a small amount of work,
but something that needs to be done.
- exporting all symbols for extension modules on windows (the stuff for
postgres is implemented). Instead I marked the relevant symbols as
declspec(dllexport). I think we should do that regardless of the
buildsystem change. Restricting symbol visibility via gcc's
-fvisibility=hidden for extensions results in a substantially reduced
number of exported symbols, and even reduces object size (and I think
improves the code too); a meson-side sketch follows after this
list. I'll send an email about that separately.
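To illustrate the PGXS idea above, the backward-compatible route could
look roughly like the following in meson - a hand-wavy sketch with
made-up substitution values, not something the patch actually does:
  # Fill in Makefile.global.in for PGXS only; postgres' own build would
  # not use the result.
  pgxs_conf = configuration_data()
  pgxs_conf.set('CC', ' '.join(cc.cmd_array()))
  pgxs_conf.set('prefix', get_option('prefix'))
  # ... many more substitutions would be needed ...
  configure_file(
    input: 'src/Makefile.global.in',
    output: 'Makefile.global',
    configuration: pgxs_conf)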
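And for the symbol visibility item, the meson side could look roughly
like this (again just a sketch, with contrib/ltree picked purely as an
example):
  # Hide symbols in extension modules when the compiler supports it. The
  # entry points that must stay visible would carry an explicit marker
  # (a PGDLLEXPORT-style macro expanding to visibility("default") or
  # declspec(dllexport)).
  mod_c_args = []
  if cc.has_argument('-fvisibility=hidden')
    mod_c_args += '-fvisibility=hidden'
  endif
  shared_module('ltree',
    ['ltree.c', 'lquery_op.c'],
    c_args: mod_c_args,
    name_prefix: '')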
There's a lot more stuff to talk about, but I'll stop with a small bit
of instructions below:
Demo / instructions:
# Get code
git remote add andres git@github.com:anarazel/postgres.git
git fetch andres
git checkout --track andres/meson
# setup build directory
meson setup build --buildtype debug
cd build
# build (automatically uses as many cores as available)
ninja
# change configuration, build again
meson configure -Dssl=openssl
ninja
# run all tests
meson test
# run just recovery tests
meson test --suite setup --suite recovery
# list tests
meson test --list
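In case you wonder how 'meson test' knows about all of these: each test
is registered in a meson.build file, roughly like the sketch below
(illustrative only - the patch's actual wiring differs in detail):
  perl = find_program('perl')
  test('recovery/007_sync_rep',
    perl,
    args: ['t/007_sync_rep.pl'],
    protocol: 'tap',
    suite: 'recovery',
    timeout: 1000)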
Greetings,
Andres Freund
Attachments:
v3-0001-ci-backend-windows-DONTMERGE-crash-reporting-back.patch (text/x-diff)
From a9b4a00a55d29fbfc96b81b0dd568a0cf9f61c20 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Thu, 9 Sep 2021 17:49:39 -0700
Subject: [PATCH v3 01/17] ci: backend: windows: DONTMERGE: crash reporting
(backend).
---
src/backend/main/main.c | 14 +++++++++++++-
1 file changed, 13 insertions(+), 1 deletion(-)
diff --git a/src/backend/main/main.c b/src/backend/main/main.c
index ad84a45e28c..65a325723fd 100644
--- a/src/backend/main/main.c
+++ b/src/backend/main/main.c
@@ -26,6 +26,10 @@
#include <sys/param.h>
#endif
+#if defined(WIN32)
+#include <crtdbg.h>
+#endif
+
#if defined(_M_AMD64) && _MSC_VER == 1800
#include <math.h>
#include <versionhelpers.h>
@@ -238,7 +242,15 @@ startup_hacks(const char *progname)
}
/* In case of general protection fault, don't show GUI popup box */
- SetErrorMode(SEM_FAILCRITICALERRORS | SEM_NOGPFAULTERRORBOX);
+ SetErrorMode(SEM_FAILCRITICALERRORS /* | SEM_NOGPFAULTERRORBOX */);
+
+ _CrtSetReportMode(_CRT_ASSERT, _CRTDBG_MODE_FILE | _CRTDBG_MODE_DEBUG);
+ _CrtSetReportMode(_CRT_ERROR, _CRTDBG_MODE_FILE | _CRTDBG_MODE_DEBUG);
+ _CrtSetReportFile(_CRT_ASSERT, _CRTDBG_FILE_STDERR);
+ _CrtSetReportFile(_CRT_ERROR, _CRTDBG_FILE_STDERR);
+#ifndef __MINGW64__
+ _set_abort_behavior(_CALL_REPORTFAULT | _WRITE_ABORT_MSG, _CALL_REPORTFAULT | _WRITE_ABORT_MSG);
+#endif
#if defined(_M_AMD64) && _MSC_VER == 1800
--
2.23.0.385.gbc12974a89
v3-0002-ci-Add-CI-for-FreeBSD-Linux-MacOS-and-Windows-uti.patch (text/x-diff)
From 7a364ae12c87f2523576f1a309ef688962f3d047 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Mon, 15 Mar 2021 09:25:15 -0700
Subject: [PATCH v3 02/17] ci: Add CI for FreeBSD, Linux, MacOS and Windows,
utilizing cirrus-ci.
---
.cirrus.yml | 395 ++++++++++++++++++++++++++++++++
.dockerignore | 3 +
ci/docker/linux_debian_bullseye | 13 ++
ci/docker/windows_vs_2019 | 105 +++++++++
ci/freebsd_gcp_repartition.sh | 28 +++
ci/pg_ci_base.conf | 12 +
ci/windows_build_config.pl | 10 +
7 files changed, 566 insertions(+)
create mode 100644 .cirrus.yml
create mode 100644 .dockerignore
create mode 100644 ci/docker/linux_debian_bullseye
create mode 100644 ci/docker/windows_vs_2019
create mode 100755 ci/freebsd_gcp_repartition.sh
create mode 100644 ci/pg_ci_base.conf
create mode 100644 ci/windows_build_config.pl
diff --git a/.cirrus.yml b/.cirrus.yml
new file mode 100644
index 00000000000..f75bdce6dec
--- /dev/null
+++ b/.cirrus.yml
@@ -0,0 +1,395 @@
+env:
+ # accelerate initial clone, but a bit of depth so that concurrent tasks work
+ CIRRUS_CLONE_DEPTH: 100
+ # Useful to be able to analyse what in a script takes long
+ CIRRUS_LOG_TIMESTAMP: true
+ # target to test, for all but windows
+ CHECK: check-world
+ CHECKFLAGS: -Otarget
+ PGCTLTIMEOUT: 120
+ CCACHE_MAXSIZE: "500M"
+ TEMP_CONFIG: ${CIRRUS_WORKING_DIR}/ci/pg_ci_base.conf
+ PG_TEST_EXTRA: kerberos ldap ssl
+
+
+task:
+ name: FreeBSD
+ only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*freebsd.*'
+ compute_engine_instance:
+ image_project: pg-vm-images-aio
+ image: family/pg-aio-freebsd-13-0
+ platform: freebsd
+ cpu: 2
+ memory: 2G
+ disk: 50
+ env:
+ CCACHE_DIR: "/tmp/ccache_dir"
+
+ ccache_cache:
+ folder: "/tmp/ccache_dir"
+ sysinfo_script:
+ - export || true
+ sysconfig_script:
+ - sudo sysctl kern.corefile='/tmp/%N.%P.core'
+ repartition_script:
+ - ci/freebsd_gcp_repartition.sh
+ create_user_script:
+ - pw useradd postgres
+ - chown -R postgres:postgres .
+ - mkdir -p /tmp/ccache_dir
+ - chown -R postgres:postgres /tmp/ccache_dir
+
+ configure_script: |
+ su postgres -c './configure \
+ --enable-cassert --enable-debug --enable-tap-tests \
+ --enable-nls \
+ \
+ --with-icu \
+ --with-ldap \
+ --with-libxml \
+ --with-libxslt \
+ \
+ --with-lz4 \
+ --with-pam \
+ --with-perl \
+ --with-python \
+ --with-ssl=openssl \
+ --with-tcl --with-tclconfig=/usr/local/lib/tcl8.6/ \
+ --with-uuid=bsd \
+ \
+ --with-includes=/usr/local/include --with-libs=/usr/local/lib \
+ CC="ccache cc"'
+ build_script:
+ - su postgres -c 'gmake -s -j3 && gmake -s -j3 -C contrib'
+ upload_caches:
+ - ccache
+
+ tests_script:
+ - su postgres -c 'time gmake -s -j2 ${CHECK} ${CHECKFLAGS}'
+
+ on_failure:
+ cores_script: |
+ for corefile in $(find /tmp -name '*.core' 2>/dev/null) ; do
+ binary=$(gdb -quiet -core $corefile -batch -ex 'info auxv' | grep AT_EXECPATH | perl -pe "s/^.*\"(.*)\"\$/\$1/g") ;
+ echo dumping $corefile for $binary ;
+ gdb --batch --quiet -ex "thread apply all bt full" -ex "quit" $binary $corefile;
+ done
+ log_artifacts:
+ path: "**/**.log"
+ type: text/plain
+ regress_diffs_artifacts:
+ path: "**/**.diffs"
+ type: text/plain
+ tap_artifacts:
+ path: "**/regress_log_*"
+ type: text/plain
+
+
+task:
+ name: Linux
+ only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*linux.*'
+ compute_engine_instance:
+ image_project: pg-vm-images-aio
+ image: family/pg-aio-bullseye
+ platform: linux
+ cpu: 4
+ memory: 2G
+ nested_virtualization: false
+ env:
+ CCACHE_DIR: "/tmp/ccache_dir"
+ DEBUGINFOD_URLS: "https://debuginfod.debian.net"
+
+ ccache_cache:
+ folder: "/tmp/ccache_dir"
+
+ sysinfo_script:
+ - id
+ - uname -a
+ - cat /proc/cmdline
+ - lsblk
+ - ulimit -a -H
+ - ulimit -a -S
+ - export
+ sysconfig_script:
+ - useradd -m postgres
+ - chown -R postgres:postgres .
+ - mkdir -p /tmp/ccache_dir
+ - chown -R postgres:postgres /tmp/ccache_dir
+ - echo '* - memlock 134217728' > /etc/security/limits.d/postgres.conf
+ - su postgres -c 'ulimit -l -H'
+ - su postgres -c 'ulimit -l -S'
+ - echo '/tmp/%e-%s-%p.core' > /proc/sys/kernel/core_pattern
+
+ configure_script: |
+ su postgres -c './configure \
+ --enable-cassert --enable-debug --enable-tap-tests \
+ --enable-nls \
+ \
+ --with-gssapi \
+ --with-icu \
+ --with-ldap \
+ --with-libxml \
+ --with-libxslt \
+ --with-llvm \
+ --with-lz4 \
+ --with-pam \
+ --with-perl \
+ --with-python \
+ --with-ssl=openssl \
+ --with-systemd \
+ --with-tcl --with-tclconfig=/usr/lib/tcl8.6/ \
+ --with-uuid=e2fs \
+ \
+ CC="ccache gcc" CXX="ccache g++" CLANG="ccache clang" CFLAGS="-O0 -ggdb"'
+ build_script:
+ - su postgres -c 'make -s -j4 && make -s -j4 -C contrib'
+ upload_caches:
+ - ccache
+
+ tests_script: |
+ su postgres -c '\
+ ulimit -c unlimited; \
+ make -s ${CHECK} ${CHECKFLAGS} -j8 \
+ '
+
+ on_failure:
+ cores_script: |
+ for corefile in $(find /tmp -name '*.core' 2>/dev/null) ; do
+ binary=$(gdb -quiet -core $corefile -batch -ex 'info auxv' | grep AT_EXECFN | perl -pe "s/^.*\"(.*)\"\$/\$1/g") ;
+ echo dumping $corefile for $binary ;
+ gdb --batch --quiet -ex "thread apply all bt full" -ex "quit" $binary $corefile ;
+ done
+ log_artifacts:
+ path: "**/**.log"
+ type: text/plain
+ regress_diffs_artifacts:
+ path: "**/**.diffs"
+ type: text/plain
+ tap_artifacts:
+ path: "**/regress_log_*"
+ type: text/plain
+
+
+task:
+ name: macOS
+ only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*(macos|darwin|osx).*'
+ osx_instance:
+ image: big-sur-base
+ env:
+ CIRRUS_WORKING_DIR: ${HOME}/pgsql/
+ TEMP_CONFIG: ${CIRRUS_WORKING_DIR}/ci/pg_ci_base.conf
+ CCACHE_DIR: ${HOME}/ccache
+ HOMEBREW_CACHE: ${HOME}/homebrew-cache
+ PERL5LIB: ${HOME}/perl5/lib/perl5
+
+ sysinfo_script:
+ - id
+ - export
+ ccache_cache:
+ folder: ${CCACHE_DIR}
+ homebrew_cache:
+ folder: ${HOMEBREW_CACHE}
+ perl_cache:
+ folder: ~/perl5
+
+ cpan_install_script:
+ - perl -mIPC::Run -e 1 || cpan -T IPC::Run
+ - perl -mIO::Pty -e 1 || cpan -T IO::Pty
+ upload_caches:
+ - perl
+ core_install_script:
+ - sudo chmod 777 /cores
+ homebrew_install_script:
+ - brew install make coreutils ccache icu4c lz4 tcl-tk openldap
+ upload_caches:
+ - homebrew
+
+ configure_script: |
+ LIBS="/usr/local/lib:$LIBS"
+ INCLUDES="/usr/local/include:$INCLUDES"
+
+ INCLUDES="/usr/local/opt/openssl/include:$INCLUDES"
+ LIBS="/usr/local/opt/openssl/lib:$LIBS"
+
+ PKG_CONFIG_PATH="/usr/local/opt/icu4c/lib/pkgconfig:$PKG_CONFIG_PATH"
+ INCLUDES="/usr/local/opt/icu4c/include:$INCLUDES"
+ LIBS="/usr/local/opt/icu4c/lib:$LIBS"
+
+ LIBS="/usr/local/opt/openldap/lib:$LIBS"
+ INCLUDES="/usr/local/opt/openldap/include:$INCLUDES"
+
+ export PKG_CONFIG_PATH
+
+ ./configure \
+ --prefix=$HOME/install \
+ --with-includes="$INCLUDES" \
+ --with-libs="$LIBS" \
+ \
+ --enable-cassert --enable-debug --enable-tap-tests \
+ --enable-nls \
+ \
+ --with-icu \
+ --with-ldap \
+ --with-libxml \
+ --with-libxslt \
+ \
+ --with-lz4 \
+ --with-perl \
+ --with-python \
+ --with-ssl=openssl \
+ --with-tcl --with-tclconfig=/usr/local/opt/tcl-tk/lib/ \
+ --with-uuid=e2fs \
+ \
+ CC="ccache gcc" CFLAGS="-O0 -ggdb"
+ build_script:
+ - gmake -s -j12 && gmake -s -j12 -C contrib
+ upload_caches:
+ - ccache
+
+ tests_script:
+ - ulimit -c unlimited
+ - ulimit -n 1024
+ - gmake -s -j12 ${CHECK} ${CHECKFLAGS}
+
+ on_failure:
+ cores_script: |
+ for corefile in $(find /cores/ -name 'core.*' 2>/dev/null) ; do
+ lldb -c $corefile --batch -o 'thread backtrace all' -o 'quit' ;
+ done
+ log_artifacts:
+ path: "**/**.log"
+ type: text/plain
+ regress_diffs_artifacts:
+ path: "**/**.diffs"
+ type: text/plain
+ tap_artifacts:
+ path: "**/regress_log_*"
+ type: text/plain
+
+
+task:
+ name: Windows
+ only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*windows.*'
+ windows_container:
+ dockerfile: ci/docker/windows_vs_2019
+ cpu: 4
+ memory: 4G
+ env:
+ PROVE_FLAGS: -j10
+ # The default working dir is in a directory msbuild complains about
+ CIRRUS_WORKING_DIR: "c:/cirrus"
+ TEMP_CONFIG: ${CIRRUS_WORKING_DIR}/ci/pg_ci_base.conf
+ # Avoid re-installing over and over
+ NO_TEMP_INSTALL: 1
+
+ sysinfo_script:
+ - chcp
+ - systeminfo
+ - powershell -Command get-psdrive -psprovider filesystem
+ - ps: Get-Item -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AeDebug'
+ - set
+
+ configure_script:
+ - copy ci\windows_build_config.pl src\tools\msvc\config.pl
+ - vcvarsall x64
+ - perl src/tools/msvc/mkvcbuild.pl
+ build_script:
+ - vcvarsall x64
+ # Disable file tracker, we're never going to rebuild...
+ - msbuild -m /p:TrackFileAccess=false pgsql.sln
+ tempinstall_script:
+ # Installation on windows currently only completely works from src\tools\msvc
+ - cd src\tools\msvc && perl .\install.pl %CIRRUS_WORKING_DIR%\tmp_install
+
+ check_test_script:
+ - perl src/tools/msvc/vcregress.pl check parallel
+ startcreate_test_script:
+ - tmp_install\bin\pg_ctl.exe initdb -D tmp_check\db -l tmp_check\initdb.log
+ - echo include '%TEMP_CONFIG%' >> tmp_check\db\postgresql.conf
+ - tmp_install\bin\pg_ctl.exe start -D tmp_check\db -l tmp_check\postmaster.log
+ plcheck_test_script:
+ - perl src/tools/msvc/vcregress.pl plcheck
+ isolationcheck_test_script:
+ - perl src/tools/msvc/vcregress.pl isolationcheck
+ modulescheck_test_script:
+ - perl src/tools/msvc/vcregress.pl modulescheck
+ contribcheck_test_script:
+ - perl src/tools/msvc/vcregress.pl contribcheck
+ stop_test_script:
+ - tmp_install\bin\pg_ctl.exe stop -D tmp_check\db -l tmp_check\postmaster.log
+ ssl_test_script:
+ - set with_ssl=openssl
+ - perl src/tools/msvc/vcregress.pl taptest .\src\test\ssl\
+ subscriptioncheck_test_script:
+ - perl src/tools/msvc/vcregress.pl taptest .\src\test\subscription\
+ authentication_test_script:
+ - perl src/tools/msvc/vcregress.pl taptest .\src\test\authentication\
+ recoverycheck_test_script:
+ - perl src/tools/msvc/vcregress.pl recoverycheck
+ bincheck_test_script:
+ - perl src/tools/msvc/vcregress.pl bincheck
+ upgradecheck_test_script:
+ - perl src/tools/msvc/vcregress.pl upgradecheck
+ ecpgcheck_test_script:
+ # tries to build additional stuff
+ - vcvarsall x64
+ # References ecpg_regression.proj in the current dir
+ - cd src\tools\msvc
+ - perl vcregress.pl ecpgcheck
+
+ always:
+ cores_script:
+ - cat crashlog.txt || true
+ dump_artifacts:
+ path: "crashlog.txt"
+ type: text/plain
+
+ on_failure:
+ log_artifacts:
+ path: "**/**.log"
+ type: text/plain
+ regress_diffs_artifacts:
+ path: "**/**.diffs"
+ type: text/plain
+ tap_artifacts:
+ path: "**/regress_log_*"
+ type: text/plain
+
+
+task:
+ name: CompilerWarnings
+ depends_on:
+ - Linux
+ # task that did not run count as a success, so we need to recheck Linux' condition here :/
+ only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*linux.*'
+ container:
+ dockerfile: ci/docker/linux_debian_bullseye
+ env:
+ CCACHE_SIZE: "4GB"
+ CCACHE_DIR: "/tmp/ccache_dir"
+ ccache_cache:
+ folder: "/tmp/ccache_dir"
+ setup_script:
+ - echo "COPT=-Werror" > src/Makefile.custom
+ - gcc -v
+ - clang -v
+ # gcc with asserts disabled
+ always:
+ gcc_warning_script:
+ - ./configure --cache gcc.cache CC="ccache gcc"
+ - time make -s -j4 clean && time make -s -j4
+ # gcc with asserts enabled
+ always:
+ gcc_a_warning_script:
+ - ./configure --cache gcc.cache --enable-cassert CC="ccache gcc"
+ - time make -s -j4 clean && time make -s -j4
+ # clang with asserts disabled
+ always:
+ clang_warning_script:
+ - ./configure --cache clang.cache CC="ccache clang"
+ - time make -s -j4 clean && time make -s -j4
+ # clang with asserts enabled
+ always:
+ clang_a_warning_script:
+ - ./configure --cache clang.cache --enable-cassert CC="ccache clang"
+ - time make -s -j4 clean && time make -s -j4
diff --git a/.dockerignore b/.dockerignore
new file mode 100644
index 00000000000..3fceab2e97b
--- /dev/null
+++ b/.dockerignore
@@ -0,0 +1,3 @@
+# Ignore everything, except ci/
+*
+!ci/*
diff --git a/ci/docker/linux_debian_bullseye b/ci/docker/linux_debian_bullseye
new file mode 100644
index 00000000000..f6c1782f16b
--- /dev/null
+++ b/ci/docker/linux_debian_bullseye
@@ -0,0 +1,13 @@
+FROM debian:bullseye
+RUN \
+ apt-get -y update && \
+ apt-get -y upgrade && \
+ DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
+ git build-essential gcc g++ libreadline-dev flex bison make perl libipc-run-perl \
+ libio-pty-perl clang llvm-dev libperl-dev libpython3-dev tcl-dev libldap2-dev \
+ libicu-dev docbook-xml docbook-xsl fop libxml2-utils xsltproc krb5-admin-server \
+ krb5-kdc krb5-user slapd ldap-utils libssl-dev pkg-config locales-all liblz4-dev \
+ libsystemd-dev libxml2-dev libxslt1-dev python3-dev libkrb5-dev libpam-dev \
+ libkrb5-*-heimdal uuid-dev gettext \
+ liburing-dev python3-distutils ccache && \
+ apt-get clean
diff --git a/ci/docker/windows_vs_2019 b/ci/docker/windows_vs_2019
new file mode 100644
index 00000000000..e09ca0d5825
--- /dev/null
+++ b/ci/docker/windows_vs_2019
@@ -0,0 +1,105 @@
+# escape=`
+
+# We used to use the visual studio container, but it's too outdated now
+FROM cirrusci/windowsservercore:2019
+
+SHELL ["powershell", "-NoLogo", "-NoProfile", "-Command"]
+
+
+RUN `
+ New-Item -Path 'HKLM:\SOFTWARE\Policies\Microsoft\VisualStudio' ; `
+ New-Item -Path 'HKLM:\SOFTWARE\Policies\Microsoft\VisualStudio\Setup' ; `
+ New-ItemProperty -Path 'HKLM:\SOFTWARE\Policies\Microsoft\VisualStudio\Setup' -Name KeepDownloadedPayloads -Value 0 -PropertyType DWord
+
+
+# Install commandline debugger and log all crashes to c:\cirrus\crashlog.txt
+#
+# Done manually as doing this via chocolatey / the installer directly, ends up
+# with a lot of unnecessary chaff, making the layer unnecessarily large.
+RUN `
+ mkdir c:\t ; `
+ cd c:\t ; `
+ curl.exe -sSL -o 'windsdksetup.exe' https://download.microsoft.com/download/9/7/9/97982c1d-d687-41be-9dd3-6d01e52ceb68/windowssdk/winsdksetup.exe ; `
+ Start-Process -Wait -FilePath ".\windsdksetup.exe" `
+ -ArgumentList '/Features OptionId.WindowsDesktopDebuggers /layout c:\t\sdk /quiet /norestart /log c:\t\sdk.log' `
+ ; `
+ `
+ Start-Process -Wait -FilePath msiexec.exe `
+ -ArgumentList '/a \"C:\t\sdk\Installers\X64 Debuggers And Tools-x64_en-us.msi\" /qb /log install2.log' `
+ ; `
+ C:\Windows` Kits\10\Debuggers\x64\cdb.exe -version ; `
+ `
+ cd c:\ ; `
+ Remove-Item C:\t\* -Force -Recurse ; `
+ Remove-Item C:\t -Force -Recurse ; `
+ Set-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AeDebug' -Name 'Debugger' -Value '\"C:\Windows Kits\10\Debuggers\x64\cdb.exe\" -p %ld -e %ld -g -kqm -c \".lines -e; .symfix+ ;.logappend c:\cirrus\crashlog.txt ; !peb; ~*kP ; .logclose ; q \"' ; `
+ New-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AeDebug' -Name 'Auto' -Value 1 -PropertyType DWord ; `
+ Get-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AeDebug' -Name Debugger; `
+ setx PATH \"C:\Windows Kits\10\Debuggers\x64;$Env:PATH\" /m
+
+
+# Install perl, python, flex and bison.
+#
+# Done manually as choco takes a lot longer. I think it's download issues with
+# powershell's download stuff? That's wy curl.exe is directly used here at least...
+#
+# Using perl 5.26.3.1 for now, as newer versions don't currently work correctly
+RUN `
+ mkdir c:\t ; `
+ cd c:\t ; `
+ `
+ curl.exe -sSL -o perl.zip `
+ https://strawberryperl.com/download/5.26.3.1/strawberry-perl-5.26.3.1-64bit-portable.zip ; `
+ 7z.exe x .\perl.zip -xr!c -oc:\strawberry ; `
+ `
+ curl.exe -sSL -o python.exe https://www.python.org/ftp/python/3.10.0/python-3.10.0-amd64.exe ; `
+ Start-Process -Wait -FilePath ".\python.exe" `
+ -ArgumentList `
+ '/quiet', 'SimpleInstall=1', 'PrependPath=1', 'CompileAll=1', `
+ 'TargetDir=c:\python\', 'InstallAllUsers=1', 'Shortcuts=0', `
+ 'Include_docs=0', 'Include_tcltk=0', 'Include_tests=0' `
+ ; `
+ `
+ curl.exe -sSL -o winflexbison.zip `
+ https://github.com/lexxmark/winflexbison/releases/download/v2.5.24/win_flex_bison-2.5.24.zip ; `
+ 7z.exe x .\winflexbison.zip -oc:\winflexbison ; `
+ Rename-Item -Path c:\winflexbison\win_flex.exe c:\winflexbison\flex.exe ; `
+ Rename-Item -Path c:\winflexbison\win_bison.exe c:\winflexbison\bison.exe ; `
+ `
+ cd c:\ ; `
+ Remove-Item C:\t -Force -Recurse ; `
+ setx PATH \"C:\strawberry\perl\bin;C:\winflexbison;C:\Program Files\Git\usr\bin;$Env:PATH\" /m
+
+
+# Install visual studio
+#
+# Adding VS path to vcvarsall.bat so user of container doesn't need to know the full path
+RUN `
+ mkdir c:\t ; `
+ cd c:\t ; `
+ curl.exe -sSL -o c:\t\vs_buildtools.exe https://aka.ms/vs/16/release/vs_buildtools.exe ; `
+ Start-Process -Wait `
+ -FilePath c:\t\vs_buildtools.exe `
+ -ArgumentList `
+ '--quiet', '--wait', '--norestart', '--nocache', `
+ '--installPath', 'c:\BuildTools', `
+ '--add', 'Microsoft.VisualStudio.Component.VC.Tools.x86.x64', `
+ '--add', 'Microsoft.VisualStudio.Component.Windows10SDK.20348' ; `
+ cd c:\ ; `
+ Remove-Item C:\t -Force -Recurse ; `
+ Remove-Item -Force -Recurse ${Env:TEMP}\*; `
+ Remove-Item -Force -Recurse \"${Env:ProgramData}\Package Cache\" ; `
+ setx PATH \"c:\BuildTools\VC\Auxiliary\Build;$Env:PATH\" /m
+
+
+# Install openssl
+RUN `
+ mkdir c:\t ; `
+ cd c:\t ; `
+ `
+ curl.exe -o openssl-setup.exe -sSL https://slproweb.com/download/Win64OpenSSL-1_1_1L.exe ; `
+ Start-Process -Wait -FilePath ".\openssl-setup.exe" `
+ -ArgumentList '/DIR=c:\openssl\1.1.1l\ /VERYSILENT /SP- /SUPPRESSMSGBOXES' ; `
+ `
+ cd c:\ ; `
+ Remove-Item C:\t -Force -Recurse
diff --git a/ci/freebsd_gcp_repartition.sh b/ci/freebsd_gcp_repartition.sh
new file mode 100755
index 00000000000..2d5e1738998
--- /dev/null
+++ b/ci/freebsd_gcp_repartition.sh
@@ -0,0 +1,28 @@
+#!/bin/sh
+
+set -e
+set -x
+
+# The default filesystem on freebsd gcp images is very slow to run tests on,
+# due to its 32KB block size
+#
+# XXX: It'd probably better to fix this in the image, using something like
+# https://people.freebsd.org/~lidl/blog/re-root.html
+
+# fix backup partition table after resize
+gpart recover da0
+gpart show da0
+# kill swap, so we can delete a partition
+swapoff -a || true
+# (apparently we can only have 4!?)
+gpart delete -i 3 da0
+gpart add -t freebsd-ufs -l data8k -a 4096 da0
+gpart show da0
+newfs -U -b 8192 /dev/da0p3
+
+# Migrate working directory
+du -hs $CIRRUS_WORKING_DIR
+mv $CIRRUS_WORKING_DIR $CIRRUS_WORKING_DIR.orig
+mkdir $CIRRUS_WORKING_DIR
+mount -o noatime /dev/da0p3 $CIRRUS_WORKING_DIR
+cp -r $CIRRUS_WORKING_DIR.orig/* $CIRRUS_WORKING_DIR/
diff --git a/ci/pg_ci_base.conf b/ci/pg_ci_base.conf
new file mode 100644
index 00000000000..637e3cfb343
--- /dev/null
+++ b/ci/pg_ci_base.conf
@@ -0,0 +1,12 @@
+# Tends to produce too many core files, taking a long time
+restart_after_crash = false
+
+# So that tests using the "manually" started postgres on windows can use
+# prepared statements
+max_prepared_transactions=10
+
+# Settings that make logs more useful
+log_line_prefix='%m [%p][%b][%v:%x] '
+log_checkpoints = true
+log_connections = true
+log_disconnections = true
diff --git a/ci/windows_build_config.pl b/ci/windows_build_config.pl
new file mode 100644
index 00000000000..bf0660416fa
--- /dev/null
+++ b/ci/windows_build_config.pl
@@ -0,0 +1,10 @@
+use strict;
+use warnings;
+
+our $config;
+
+$config->{"tap_tests"} = 1;
+$config->{"asserts"} = 1;
+$config->{"openssl"} = "c:/openssl/1.1.1l/";
+
+1;
--
2.23.0.385.gbc12974a89
v3-0003-fixup-ci-Add-CI-for-FreeBSD-Linux-MacOS-and-Windo.patch (text/x-diff)
From dadd0b04a94be8bd1b3f3dba1ce378bcdadbd2f9 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Fri, 8 Oct 2021 23:29:08 -0700
Subject: [PATCH v3 03/17] fixup! ci: Add CI for FreeBSD, Linux, MacOS and
Windows, utilizing cirrus-ci.
---
ci/docker/windows_vs_2019 | 62 +++++++++++++++++++++------------------
1 file changed, 34 insertions(+), 28 deletions(-)
diff --git a/ci/docker/windows_vs_2019 b/ci/docker/windows_vs_2019
index e09ca0d5825..a4fcaceae96 100644
--- a/ci/docker/windows_vs_2019
+++ b/ci/docker/windows_vs_2019
@@ -6,12 +6,6 @@ FROM cirrusci/windowsservercore:2019
SHELL ["powershell", "-NoLogo", "-NoProfile", "-Command"]
-RUN `
- New-Item -Path 'HKLM:\SOFTWARE\Policies\Microsoft\VisualStudio' ; `
- New-Item -Path 'HKLM:\SOFTWARE\Policies\Microsoft\VisualStudio\Setup' ; `
- New-ItemProperty -Path 'HKLM:\SOFTWARE\Policies\Microsoft\VisualStudio\Setup' -Name KeepDownloadedPayloads -Value 0 -PropertyType DWord
-
-
# Install commandline debugger and log all crashes to c:\cirrus\crashlog.txt
#
# Done manually as doing this via chocolatey / the installer directly, ends up
@@ -19,7 +13,11 @@ RUN `
RUN `
mkdir c:\t ; `
cd c:\t ; `
+ `
+ setx PATH \"C:\Windows Kits\10\Debuggers\x64;$Env:PATH\" /m ; `
+ `
curl.exe -sSL -o 'windsdksetup.exe' https://download.microsoft.com/download/9/7/9/97982c1d-d687-41be-9dd3-6d01e52ceb68/windowssdk/winsdksetup.exe ; `
+ echo 'starting sdk installation (for debugger)' ; `
Start-Process -Wait -FilePath ".\windsdksetup.exe" `
-ArgumentList '/Features OptionId.WindowsDesktopDebuggers /layout c:\t\sdk /quiet /norestart /log c:\t\sdk.log' `
; `
@@ -29,13 +27,12 @@ RUN `
; `
C:\Windows` Kits\10\Debuggers\x64\cdb.exe -version ; `
`
- cd c:\ ; `
- Remove-Item C:\t\* -Force -Recurse ; `
- Remove-Item C:\t -Force -Recurse ; `
Set-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AeDebug' -Name 'Debugger' -Value '\"C:\Windows Kits\10\Debuggers\x64\cdb.exe\" -p %ld -e %ld -g -kqm -c \".lines -e; .symfix+ ;.logappend c:\cirrus\crashlog.txt ; !peb; ~*kP ; .logclose ; q \"' ; `
New-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AeDebug' -Name 'Auto' -Value 1 -PropertyType DWord ; `
- Get-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AeDebug' -Name Debugger; `
- setx PATH \"C:\Windows Kits\10\Debuggers\x64;$Env:PATH\" /m
+ Get-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AeDebug' -Name Debugger ; `
+ `
+ cd c:\ ; `
+ Remove-Item C:\t -Force -Recurse
# Install perl, python, flex and bison.
@@ -48,11 +45,16 @@ RUN `
mkdir c:\t ; `
cd c:\t ; `
`
+ echo 'adding to path, before setup below, so changes are not overwritten' ; `
+ setx PATH \"C:\strawberry\perl\bin;C:\winflexbison;C:\Program Files\Git\usr\bin;$Env:PATH\" /m ; `
+ `
curl.exe -sSL -o perl.zip `
https://strawberryperl.com/download/5.26.3.1/strawberry-perl-5.26.3.1-64bit-portable.zip ; `
+ echo 'installing perl' ; `
7z.exe x .\perl.zip -xr!c -oc:\strawberry ; `
`
curl.exe -sSL -o python.exe https://www.python.org/ftp/python/3.10.0/python-3.10.0-amd64.exe ; `
+ echo 'installing python' ; `
Start-Process -Wait -FilePath ".\python.exe" `
-ArgumentList `
'/quiet', 'SimpleInstall=1', 'PrependPath=1', 'CompileAll=1', `
@@ -62,13 +64,27 @@ RUN `
`
curl.exe -sSL -o winflexbison.zip `
https://github.com/lexxmark/winflexbison/releases/download/v2.5.24/win_flex_bison-2.5.24.zip ; `
+ echo 'installing winflexbison' ; `
7z.exe x .\winflexbison.zip -oc:\winflexbison ; `
Rename-Item -Path c:\winflexbison\win_flex.exe c:\winflexbison\flex.exe ; `
Rename-Item -Path c:\winflexbison\win_bison.exe c:\winflexbison\bison.exe ; `
`
cd c:\ ; `
- Remove-Item C:\t -Force -Recurse ; `
- setx PATH \"C:\strawberry\perl\bin;C:\winflexbison;C:\Program Files\Git\usr\bin;$Env:PATH\" /m
+ Remove-Item C:\t -Force -Recurse
+
+
+# Install openssl
+RUN `
+ mkdir c:\t ; `
+ cd c:\t ; `
+ `
+ curl.exe -o openssl-setup.exe -sSL https://slproweb.com/download/Win64OpenSSL-1_1_1L.exe ; `
+ echo 'staring openssl installation' ; `
+ Start-Process -Wait -FilePath ".\openssl-setup.exe" `
+ -ArgumentList '/DIR=c:\openssl\1.1.1l\ /VERYSILENT /SP- /SUPPRESSMSGBOXES' ; `
+ `
+ cd c:\ ; `
+ Remove-Item C:\t -Force -Recurse
# Install visual studio
@@ -77,7 +93,10 @@ RUN `
RUN `
mkdir c:\t ; `
cd c:\t ; `
+ setx PATH \"c:\BuildTools\VC\Auxiliary\Build;$Env:PATH\" /m ; `
+ `
curl.exe -sSL -o c:\t\vs_buildtools.exe https://aka.ms/vs/16/release/vs_buildtools.exe ; `
+ echo 'starting visual studio installation' ; `
Start-Process -Wait `
-FilePath c:\t\vs_buildtools.exe `
-ArgumentList `
@@ -85,21 +104,8 @@ RUN `
'--installPath', 'c:\BuildTools', `
'--add', 'Microsoft.VisualStudio.Component.VC.Tools.x86.x64', `
'--add', 'Microsoft.VisualStudio.Component.Windows10SDK.20348' ; `
+ `
cd c:\ ; `
Remove-Item C:\t -Force -Recurse ; `
Remove-Item -Force -Recurse ${Env:TEMP}\*; `
- Remove-Item -Force -Recurse \"${Env:ProgramData}\Package Cache\" ; `
- setx PATH \"c:\BuildTools\VC\Auxiliary\Build;$Env:PATH\" /m
-
-
-# Install openssl
-RUN `
- mkdir c:\t ; `
- cd c:\t ; `
- `
- curl.exe -o openssl-setup.exe -sSL https://slproweb.com/download/Win64OpenSSL-1_1_1L.exe ; `
- Start-Process -Wait -FilePath ".\openssl-setup.exe" `
- -ArgumentList '/DIR=c:\openssl\1.1.1l\ /VERYSILENT /SP- /SUPPRESSMSGBOXES' ; `
- `
- cd c:\ ; `
- Remove-Item C:\t -Force -Recurse
+ Remove-Item -Force -Recurse \"${Env:ProgramData}\Package Cache\"
--
2.23.0.385.gbc12974a89
v3-0004-meson-prereq-output-and-depencency-tracking-work.patch (text/x-diff)
From 3c0caccb8d9686538385cbb617a73a2d9b2476bd Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Mon, 8 Mar 2021 13:47:39 -0800
Subject: [PATCH v3 04/17] meson: prereq: output and depencency tracking work.
---
src/backend/utils/misc/Makefile | 5 ++++-
src/backend/utils/misc/guc.c | 2 +-
src/bin/initdb/initdb.c | 5 +++--
src/bin/psql/Makefile | 4 ++--
src/bin/psql/create_help.pl | 16 ++++++++++++----
src/tools/msvc/MSBuildProject.pm | 9 +++++++--
src/tools/msvc/Mkvcbuild.pm | 3 +++
src/tools/msvc/Solution.pm | 2 +-
src/tools/msvc/pgflex.pl | 4 ++--
9 files changed, 35 insertions(+), 15 deletions(-)
diff --git a/src/backend/utils/misc/Makefile b/src/backend/utils/misc/Makefile
index 1d5327cf644..14861fd96b2 100644
--- a/src/backend/utils/misc/Makefile
+++ b/src/backend/utils/misc/Makefile
@@ -37,8 +37,11 @@ endif
include $(top_srcdir)/src/backend/common.mk
+guc-file.c.h: guc-file.l
+ flex -o $@ $<
+
# guc-file is compiled as part of guc
-guc.o: guc-file.c
+guc.o: guc-file.c.h
# Note: guc-file.c is not deleted by 'make clean',
# since we want to ship it in distribution tarballs.
diff --git a/src/backend/utils/misc/guc.c b/src/backend/utils/misc/guc.c
index d2ce4a84506..3786ae11a08 100644
--- a/src/backend/utils/misc/guc.c
+++ b/src/backend/utils/misc/guc.c
@@ -12559,4 +12559,4 @@ check_default_with_oids(bool *newval, void **extra, GucSource source)
return true;
}
-#include "guc-file.c"
+#include "guc-file.c.h"
diff --git a/src/bin/initdb/initdb.c b/src/bin/initdb/initdb.c
index 1ed4808d53f..9067a06e58a 100644
--- a/src/bin/initdb/initdb.c
+++ b/src/bin/initdb/initdb.c
@@ -1368,8 +1368,9 @@ bootstrap_template1(void)
if (strcmp(headerline, *bki_lines) != 0)
{
- pg_log_error("input file \"%s\" does not belong to PostgreSQL %s",
- bki_file, PG_VERSION);
+ pg_log_error("input file \"%s\" does not belong to PostgreSQL %s (expect %s, is %s)",
+ bki_file, PG_VERSION,
+ headerline, *bki_lines);
fprintf(stderr,
_("Check your installation or specify the correct path "
"using the option -L.\n"));
diff --git a/src/bin/psql/Makefile b/src/bin/psql/Makefile
index d00881163c0..3851da1c8ef 100644
--- a/src/bin/psql/Makefile
+++ b/src/bin/psql/Makefile
@@ -56,7 +56,7 @@ sql_help.c: sql_help.h
touch $@
sql_help.h: create_help.pl $(wildcard $(REFDOCDIR)/*.sgml)
- $(PERL) $< $(REFDOCDIR) $*
+ $(PERL) $< $(REFDOCDIR) . $*
psqlscanslash.c: FLEXFLAGS = -Cfe -p -p
psqlscanslash.c: FLEX_NO_BACKUP=yes
@@ -81,7 +81,7 @@ clean distclean:
# files removed here are supposed to be in the distribution tarball,
# so do not clean them in the clean/distclean rules
maintainer-clean: distclean
- rm -f sql_help.h sql_help.c psqlscanslash.c
+ rm -f sql_help.h sql_help.c sql_help.dep psqlscanslash.c
check:
$(prove_check)
diff --git a/src/bin/psql/create_help.pl b/src/bin/psql/create_help.pl
index 83324239740..40eb6ac2d3f 100644
--- a/src/bin/psql/create_help.pl
+++ b/src/bin/psql/create_help.pl
@@ -23,9 +23,12 @@ use strict;
use warnings;
my $docdir = $ARGV[0] or die "$0: missing required argument: docdir\n";
-my $hfile = $ARGV[1] . '.h'
+my $outdir = $ARGV[1] or die "$0: missing required argument: outdir\n";
+
+my $hfile = $ARGV[2] . '.h'
or die "$0: missing required argument: output file\n";
-my $cfile = $ARGV[1] . '.c';
+my $cfile = $ARGV[2] . '.c';
+my $depfile = $ARGV[2] . '.dep';
my $hfilebasename;
if ($hfile =~ m!.*/([^/]+)$!)
@@ -43,10 +46,12 @@ $define =~ s/\W/_/g;
opendir(DIR, $docdir)
or die "$0: could not open documentation source dir '$docdir': $!\n";
-open(my $hfile_handle, '>', $hfile)
+open(my $hfile_handle, '>', $outdir . '/' . $hfile)
or die "$0: could not open output file '$hfile': $!\n";
-open(my $cfile_handle, '>', $cfile)
+open(my $cfile_handle, '>', $outdir . '/' . $cfile)
or die "$0: could not open output file '$cfile': $!\n";
+open(my $depfile_handle, '>', $outdir . '/' . $depfile)
+ or die "$0: could not open output file '$depfile': $!\n";
print $hfile_handle "/*
* *** Do not change this file by hand. It is automatically
@@ -98,6 +103,8 @@ foreach my $file (sort readdir DIR)
my ($cmdid, @cmdnames, $cmddesc, $cmdsynopsis);
$file =~ /\.sgml$/ or next;
+ print $depfile_handle "$cfile $hfile: $docdir/$file\n";
+
open(my $fh, '<', "$docdir/$file") or next;
my $filecontent = join('', <$fh>);
close $fh;
@@ -216,4 +223,5 @@ print $hfile_handle "
close $cfile_handle;
close $hfile_handle;
+close $depfile_handle;
closedir DIR;
diff --git a/src/tools/msvc/MSBuildProject.pm b/src/tools/msvc/MSBuildProject.pm
index fdd22e89eb2..036e44fcb83 100644
--- a/src/tools/msvc/MSBuildProject.pm
+++ b/src/tools/msvc/MSBuildProject.pm
@@ -211,14 +211,19 @@ EOF
}
else #if ($grammarFile =~ /\.l$/)
{
+ if ($outputFile eq 'src/backend/utils/misc/guc-file.c')
+ {
+ $outputFile = 'src/backend/utils/misc/guc-file.c.h';
+ }
+
print $f <<EOF;
<CustomBuild Include="$grammarFile">
<Message Condition="'\$(Configuration)|\$(Platform)'=='Debug|$self->{platform}'">Running flex on $grammarFile</Message>
- <Command Condition="'\$(Configuration)|\$(Platform)'=='Debug|$self->{platform}'">perl "src\\tools\\msvc\\pgflex.pl" "$grammarFile"</Command>
+ <Command Condition="'\$(Configuration)|\$(Platform)'=='Debug|$self->{platform}'">perl "src\\tools\\msvc\\pgflex.pl" "$grammarFile" "$outputFile"</Command>
<AdditionalInputs Condition="'\$(Configuration)|\$(Platform)'=='Debug|$self->{platform}'">%(AdditionalInputs)</AdditionalInputs>
<Outputs Condition="'\$(Configuration)|\$(Platform)'=='Debug|$self->{platform}'">$outputFile;%(Outputs)</Outputs>
<Message Condition="'\$(Configuration)|\$(Platform)'=='Release|$self->{platform}'">Running flex on $grammarFile</Message>
- <Command Condition="'\$(Configuration)|\$(Platform)'=='Release|$self->{platform}'">perl "src\\tools\\msvc\\pgflex.pl" "$grammarFile"</Command>
+ <Command Condition="'\$(Configuration)|\$(Platform)'=='Release|$self->{platform}'">perl "src\\tools\\msvc\\pgflex.pl" "$grammarFile" "$outputFile"</Command>
<AdditionalInputs Condition="'\$(Configuration)|\$(Platform)'=='Release|$self->{platform}'">%(AdditionalInputs)</AdditionalInputs>
<Outputs Condition="'\$(Configuration)|\$(Platform)'=='Release|$self->{platform}'">$outputFile;%(Outputs)</Outputs>
</CustomBuild>
diff --git a/src/tools/msvc/Mkvcbuild.pm b/src/tools/msvc/Mkvcbuild.pm
index 4362bd44fd1..b8e62c6d3f7 100644
--- a/src/tools/msvc/Mkvcbuild.pm
+++ b/src/tools/msvc/Mkvcbuild.pm
@@ -330,6 +330,7 @@ sub mkvcbuild
$pgregress_ecpg->AddFile('src/test/regress/pg_regress.c');
$pgregress_ecpg->AddIncludeDir('src/port');
$pgregress_ecpg->AddIncludeDir('src/test/regress');
+ $pgregress_ecpg->AddDefine('DLSUFFIX=".dll"');
$pgregress_ecpg->AddDefine('HOST_TUPLE="i686-pc-win32vc"');
$pgregress_ecpg->AddLibrary('ws2_32.lib');
$pgregress_ecpg->AddDirResourceFile('src/interfaces/ecpg/test');
@@ -345,6 +346,7 @@ sub mkvcbuild
$isolation_tester->AddIncludeDir('src/port');
$isolation_tester->AddIncludeDir('src/test/regress');
$isolation_tester->AddIncludeDir('src/interfaces/libpq');
+ $isolation_tester->AddDefine('DLSUFFIX=".dll"');
$isolation_tester->AddDefine('HOST_TUPLE="i686-pc-win32vc"');
$isolation_tester->AddLibrary('ws2_32.lib');
$isolation_tester->AddDirResourceFile('src/test/isolation');
@@ -356,6 +358,7 @@ sub mkvcbuild
$pgregress_isolation->AddFile('src/test/regress/pg_regress.c');
$pgregress_isolation->AddIncludeDir('src/port');
$pgregress_isolation->AddIncludeDir('src/test/regress');
+ $pgregress_isolation->AddDefine('DLSUFFIX=".dll"');
$pgregress_isolation->AddDefine('HOST_TUPLE="i686-pc-win32vc"');
$pgregress_isolation->AddLibrary('ws2_32.lib');
$pgregress_isolation->AddDirResourceFile('src/test/isolation');
diff --git a/src/tools/msvc/Solution.pm b/src/tools/msvc/Solution.pm
index 165a93987ac..721f690ae46 100644
--- a/src/tools/msvc/Solution.pm
+++ b/src/tools/msvc/Solution.pm
@@ -688,7 +688,7 @@ sub GenerateFiles
{
print "Generating sql_help.h...\n";
chdir('src/bin/psql');
- system("perl create_help.pl ../../../doc/src/sgml/ref sql_help");
+ system("perl create_help.pl ../../../doc/src/sgml/ref . sql_help");
chdir('../../..');
}
diff --git a/src/tools/msvc/pgflex.pl b/src/tools/msvc/pgflex.pl
index 0728b85d4de..19f26ff213f 100644
--- a/src/tools/msvc/pgflex.pl
+++ b/src/tools/msvc/pgflex.pl
@@ -29,6 +29,8 @@ unless ($verparts[0] == 2
}
my $input = shift;
+my $output = shift;
+
if ($input !~ /\.l$/)
{
print "Input must be a .l file\n";
@@ -40,8 +42,6 @@ elsif (!-e $input)
exit 1;
}
-(my $output = $input) =~ s/\.l$/.c/;
-
# get flex flags from make file
my $makefile = dirname($input) . "/Makefile";
my ($mf, $make);
--
2.23.0.385.gbc12974a89
v3-0005-meson-prereq-move-snowball_create.sql-creation-in.patch (text/x-diff)
From 3735dde29b192404e3e52c2989ec990a39d3e513 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Mon, 8 Mar 2021 14:59:22 -0800
Subject: [PATCH v3 05/17] meson: prereq: move snowball_create.sql creation
into perl file.
FIXME: deduplicate with Install.pm
---
src/backend/snowball/Makefile | 27 +-----
src/backend/snowball/snowball_create.pl | 110 ++++++++++++++++++++++++
2 files changed, 113 insertions(+), 24 deletions(-)
create mode 100644 src/backend/snowball/snowball_create.pl
diff --git a/src/backend/snowball/Makefile b/src/backend/snowball/Makefile
index 50b9199910c..259104f8eb3 100644
--- a/src/backend/snowball/Makefile
+++ b/src/backend/snowball/Makefile
@@ -119,29 +119,8 @@ all: all-shared-lib $(SQLSCRIPT)
include $(top_srcdir)/src/Makefile.shlib
-$(SQLSCRIPT): Makefile snowball_func.sql.in snowball.sql.in
- echo '-- Language-specific snowball dictionaries' > $@
- cat $(srcdir)/snowball_func.sql.in >> $@
- @set -e; \
- set $(LANGUAGES) ; \
- while [ "$$#" -gt 0 ] ; \
- do \
- lang=$$1; shift; \
- nonascdictname=$$lang; \
- ascdictname=$$1; shift; \
- if [ -s $(srcdir)/stopwords/$${lang}.stop ] ; then \
- stop=", StopWords=$${lang}" ; \
- else \
- stop=""; \
- fi; \
- cat $(srcdir)/snowball.sql.in | \
- sed -e "s#_LANGNAME_#$$lang#g" | \
- sed -e "s#_DICTNAME_#$${lang}_stem#g" | \
- sed -e "s#_CFGNAME_#$$lang#g" | \
- sed -e "s#_ASCDICTNAME_#$${ascdictname}_stem#g" | \
- sed -e "s#_NONASCDICTNAME_#$${nonascdictname}_stem#g" | \
- sed -e "s#_STOPWORDS_#$$stop#g" ; \
- done >> $@
+$(SQLSCRIPT): snowball_create.pl Makefile snowball_func.sql.in snowball.sql.in
+ $(PERL) $< --input ${srcdir} --output .
install: all installdirs install-lib
$(INSTALL_DATA) $(SQLSCRIPT) '$(DESTDIR)$(datadir)'
@@ -171,4 +150,4 @@ uninstall: uninstall-lib
done
clean distclean maintainer-clean: clean-lib
- rm -f $(OBJS) $(SQLSCRIPT)
+ rm -f $(OBJS) $(SQLSCRIPT) snowball_create.dep
diff --git a/src/backend/snowball/snowball_create.pl b/src/backend/snowball/snowball_create.pl
new file mode 100644
index 00000000000..d9d79f3668f
--- /dev/null
+++ b/src/backend/snowball/snowball_create.pl
@@ -0,0 +1,110 @@
+#!/usr/bin/perl
+
+use strict;
+use warnings;
+use Getopt::Long;
+
+my $output_path = '';
+my $makefile_path = '';
+my $input_path = '';
+
+GetOptions(
+ 'output:s' => \$output_path,
+ 'input:s' => \$input_path) || usage();
+
+# Make sure input_path ends in a slash if needed.
+if ($input_path ne '' && substr($input_path, -1) ne '/')
+{
+ $output_path .= '/';
+}
+
+# Make sure output_path ends in a slash if needed.
+if ($output_path ne '' && substr($output_path, -1) ne '/')
+{
+ $output_path .= '/';
+}
+
+GenerateTsearchFiles();
+
+sub usage
+{
+ die <<EOM;
+Usage: snowball_create.pl --input/-i <path> --input <path>
+ --output Output directory (default '.')
+ --input Input directory
+
+snowball_create.pl creates snowball.sql from snowball.sql.in
+EOM
+}
+
+sub GenerateTsearchFiles
+{
+ my $target = shift;
+ my $output_file = "$output_path/snowball_create.sql";
+
+ print "Generating tsearch script...";
+ my $F;
+ my $D;
+ my $tmpl = read_file("$input_path/snowball.sql.in");
+ my $mf = read_file("$input_path/Makefile");
+
+ open($D, '>', "$output_path/snowball_create.dep")
+ || die "Could not write snowball_create.dep";
+
+ print $D "$output_file: $input_path/Makefile\n";
+ print $D "$output_file: $input_path/snowball.sql.in\n";
+ print $D "$output_file: $input_path/snowball_func.sql.in\n";
+
+ $mf =~ s{\\\r?\n}{}g;
+ $mf =~ /^LANGUAGES\s*=\s*(.*)$/m
+ || die "Could not find LANGUAGES line in snowball Makefile\n";
+ my @pieces = split /\s+/, $1;
+ open($F, '>', $output_file)
+ || die "Could not write snowball_create.sql";
+
+ print $F "-- Language-specific snowball dictionaries\n";
+
+ print $F read_file("$input_path/snowball_func.sql.in");
+
+ while ($#pieces > 0)
+ {
+ my $lang = shift @pieces || last;
+ my $asclang = shift @pieces || last;
+ my $txt = $tmpl;
+ my $stop = '';
+ my $stopword_path = "$input_path/stopwords/$lang.stop";
+
+ if (-s "$stopword_path")
+ {
+ $stop = ", StopWords=$lang";
+
+ print $D "$output_file: $stopword_path\n";
+ }
+
+ $txt =~ s#_LANGNAME_#${lang}#gs;
+ $txt =~ s#_DICTNAME_#${lang}_stem#gs;
+ $txt =~ s#_CFGNAME_#${lang}#gs;
+ $txt =~ s#_ASCDICTNAME_#${asclang}_stem#gs;
+ $txt =~ s#_NONASCDICTNAME_#${lang}_stem#gs;
+ $txt =~ s#_STOPWORDS_#$stop#gs;
+ print $F $txt;
+ print ".";
+ }
+ close($F);
+ close($D);
+ print "\n";
+ return;
+}
+
+
+sub read_file
+{
+ my $filename = shift;
+ my $F;
+ local $/ = undef;
+ open($F, '<', $filename) || die "Could not open file $filename\n";
+ my $txt = <$F>;
+ close($F);
+
+ return $txt;
+}
--
2.23.0.385.gbc12974a89
v3-0006-meson-prereq-add-output-path-arg-in-generate-lwlo.patch (text/x-diff)
From ff46bc3dd368d11043d6089b2da2f82c1c28f631 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Wed, 10 Mar 2021 01:43:07 -0800
Subject: [PATCH v3 06/17] meson: prereq: add output path arg in
generate-lwlocknames.pl
---
src/backend/storage/lmgr/generate-lwlocknames.pl | 14 ++++++++++----
1 file changed, 10 insertions(+), 4 deletions(-)
diff --git a/src/backend/storage/lmgr/generate-lwlocknames.pl b/src/backend/storage/lmgr/generate-lwlocknames.pl
index 8a44946594d..315156b29f1 100644
--- a/src/backend/storage/lmgr/generate-lwlocknames.pl
+++ b/src/backend/storage/lmgr/generate-lwlocknames.pl
@@ -5,15 +5,21 @@
use strict;
use warnings;
+use Getopt::Long;
+
+my $output_path = '.';
my $lastlockidx = -1;
my $continue = "\n";
+GetOptions(
+ 'output:s' => \$output_path);
+
open my $lwlocknames, '<', $ARGV[0] or die;
# Include PID in suffix in case parallel make runs this multiple times.
-my $htmp = "lwlocknames.h.tmp$$";
-my $ctmp = "lwlocknames.c.tmp$$";
+my $htmp = "$output_path/lwlocknames.h.tmp$$";
+my $ctmp = "$output_path/lwlocknames.c.tmp$$";
open my $h, '>', $htmp or die "Could not open $htmp: $!";
open my $c, '>', $ctmp or die "Could not open $ctmp: $!";
@@ -65,7 +71,7 @@ printf $h "#define NUM_INDIVIDUAL_LWLOCKS %s\n", $lastlockidx + 1;
close $h;
close $c;
-rename($htmp, 'lwlocknames.h') || die "rename: $htmp: $!";
-rename($ctmp, 'lwlocknames.c') || die "rename: $ctmp: $!";
+rename($htmp, "$output_path/lwlocknames.h") || die "rename: $htmp to $output_path/lwlocknames.h: $!";
+rename($ctmp, "$output_path/lwlocknames.c") || die "rename: $ctmp: $!";
close $lwlocknames;
--
2.23.0.385.gbc12974a89
v3-0007-meson-prereq-add-src-tools-gen_versioning_script..patch (text/x-diff)
From 442cee8d3c2b32d59696e2114f97de062f798293 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Wed, 10 Mar 2021 15:11:13 -0800
Subject: [PATCH v3 07/17] meson: prereq: add
src/tools/gen_versioning_script.pl.
Currently the logic is all in src/Makefile.shlib. This adds a sketch
of a generation script that can be used from meson.
---
src/tools/gen_versioning_script.pl | 58 ++++++++++++++++++++++++++++++
1 file changed, 58 insertions(+)
create mode 100644 src/tools/gen_versioning_script.pl
diff --git a/src/tools/gen_versioning_script.pl b/src/tools/gen_versioning_script.pl
new file mode 100644
index 00000000000..862b5e14aad
--- /dev/null
+++ b/src/tools/gen_versioning_script.pl
@@ -0,0 +1,58 @@
+use strict;
+use warnings;
+
+my $format = $ARGV[0] or die "$0: missing required argument: format\n";
+my $input = $ARGV[1] or die "$0: missing required argument: input\n";
+my $output = $ARGV[2] or die "$0: missing required argument: output\n";
+
+#FIXME: handle format argument, so we can reuse the one script for several platforms
+if (not ($format eq 'gnu' or $format eq 'darwin'))
+{
+ die "$0: $format is not yet handled (only gnu is)\n";
+}
+
+open(my $input_handle, '<', $input)
+ or die "$0: could not open input file '$input': $!\n";
+
+open(my $output_handle, '>', $output)
+ or die "$0: could not open output file '$output': $!\n";
+
+
+if ($format eq 'gnu')
+{
+ print $output_handle "{
+ global:
+";
+}
+
+while (<$input_handle>)
+{
+ if (/^#/)
+ {
+ # don't do anything with a comment
+ }
+ elsif (/^([^\s]+)\s+([^\s]+)/)
+ {
+ if ($format eq 'gnu')
+ {
+ print $output_handle " $1;\n";
+ }
+ elsif ($format eq 'darwin')
+ {
+ print $output_handle " _$1\n";
+ }
+ }
+ else
+ {
+ die "$0: unexpected line $_\n";
+ }
+}
+
+if ($format eq 'gnu')
+{
+ print $output_handle " local: *;
+};
+";
+}
+
+exit(0);
--
2.23.0.385.gbc12974a89
v3-0008-meson-prereq-generate-errcodes.pl-accept-output-f.patch (text/x-diff)
From aa5915c8bab7ef119d34d350dbeaf17ec59c16e3 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Mon, 27 Sep 2021 00:14:09 -0700
Subject: [PATCH v3 08/17] meson: prereq: generate-errcodes.pl: accept output
file
---
src/backend/utils/Makefile | 2 +-
src/backend/utils/generate-errcodes.pl | 13 ++++++++-----
src/tools/msvc/Solution.pm | 2 +-
3 files changed, 10 insertions(+), 7 deletions(-)
diff --git a/src/backend/utils/Makefile b/src/backend/utils/Makefile
index ef8df254826..469caf0d704 100644
--- a/src/backend/utils/Makefile
+++ b/src/backend/utils/Makefile
@@ -52,7 +52,7 @@ fmgr-stamp: Gen_fmgrtab.pl $(catalogdir)/Catalog.pm $(top_srcdir)/src/include/ca
touch $@
errcodes.h: $(top_srcdir)/src/backend/utils/errcodes.txt generate-errcodes.pl
- $(PERL) $(srcdir)/generate-errcodes.pl $< > $@
+ $(PERL) $(srcdir)/generate-errcodes.pl $< $@
ifneq ($(enable_dtrace), yes)
probes.h: Gen_dummy_probes.sed
diff --git a/src/backend/utils/generate-errcodes.pl b/src/backend/utils/generate-errcodes.pl
index c5cdd388138..57ec2a5ca21 100644
--- a/src/backend/utils/generate-errcodes.pl
+++ b/src/backend/utils/generate-errcodes.pl
@@ -6,11 +6,13 @@
use strict;
use warnings;
-print
+open my $errcodes, '<', $ARGV[0] or die;
+open my $out, '>', $ARGV[1] or die;
+
+print $out
"/* autogenerated from src/backend/utils/errcodes.txt, do not edit */\n";
-print "/* there is deliberately not an #ifndef ERRCODES_H here */\n";
+print $out "/* there is deliberately not an #ifndef ERRCODES_H here */\n";
-open my $errcodes, '<', $ARGV[0] or die;
while (<$errcodes>)
{
@@ -25,7 +27,7 @@ while (<$errcodes>)
{
my $header = $1;
$header =~ s/^\s+//;
- print "\n/* $header */\n";
+ print $out "\n/* $header */\n";
next;
}
@@ -40,7 +42,8 @@ while (<$errcodes>)
# And quote them
$sqlstate =~ s/([^,])/'$1'/g;
- print "#define $errcode_macro MAKE_SQLSTATE($sqlstate)\n";
+ print $out "#define $errcode_macro MAKE_SQLSTATE($sqlstate)\n";
}
close $errcodes;
+close $out;
diff --git a/src/tools/msvc/Solution.pm b/src/tools/msvc/Solution.pm
index 721f690ae46..aba59a270b4 100644
--- a/src/tools/msvc/Solution.pm
+++ b/src/tools/msvc/Solution.pm
@@ -658,7 +658,7 @@ sub GenerateFiles
{
print "Generating errcodes.h...\n";
system(
- 'perl src/backend/utils/generate-errcodes.pl src/backend/utils/errcodes.txt > src/backend/utils/errcodes.h'
+ 'perl src/backend/utils/generate-errcodes.pl src/backend/utils/errcodes.txt src/backend/utils/errcodes.h'
);
copyFile('src/backend/utils/errcodes.h',
'src/include/utils/errcodes.h');
--
2.23.0.385.gbc12974a89
v3-0009-meson-prereq-remove-unhelpful-chattiness-in-snowb.patch (text/x-diff)
From f78a0f77bcb97fc1dc2fa6d9ee3143f8b95561d4 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Mon, 27 Sep 2021 15:41:24 -0700
Subject: [PATCH v3 09/17] meson: prereq: remove unhelpful chattiness in
snowball_create.pl.
---
src/backend/snowball/snowball_create.pl | 3 ---
1 file changed, 3 deletions(-)
diff --git a/src/backend/snowball/snowball_create.pl b/src/backend/snowball/snowball_create.pl
index d9d79f3668f..285cf4f5d90 100644
--- a/src/backend/snowball/snowball_create.pl
+++ b/src/backend/snowball/snowball_create.pl
@@ -42,7 +42,6 @@ sub GenerateTsearchFiles
my $target = shift;
my $output_file = "$output_path/snowball_create.sql";
- print "Generating tsearch script...";
my $F;
my $D;
my $tmpl = read_file("$input_path/snowball.sql.in");
@@ -88,11 +87,9 @@ sub GenerateTsearchFiles
$txt =~ s#_NONASCDICTNAME_#${lang}_stem#gs;
$txt =~ s#_STOPWORDS_#$stop#gs;
print $F $txt;
- print ".";
}
close($F);
close($D);
- print "\n";
return;
}
--
2.23.0.385.gbc12974a89
v3-0010-meson-prereq-Can-we-get-away-with-not-export-all-.patch (text/x-diff)
From 8d3ca1659baf70926fb034f4043d3d90669f242b Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Wed, 29 Sep 2021 00:29:10 -0700
Subject: [PATCH v3 10/17] meson: prereq: Can we get away with not
export-all'ing libraries?
---
configure | 49 ++++++++++++++++++++++
configure.ac | 10 +++++
contrib/hstore/hstore.h | 16 +++----
contrib/ltree/ltree.h | 40 +++++++++---------
src/Makefile.global.in | 1 +
src/Makefile.shlib | 12 ++++++
src/include/c.h | 15 +++++--
src/include/fmgr.h | 6 ++-
src/include/jit/jit.h | 2 +-
src/include/pg_config.h.in | 3 ++
src/include/replication/output_plugin.h | 2 +
src/pl/plpython/plpy_elog.h | 8 ++--
src/pl/plpython/plpy_typeio.h | 18 ++++----
src/pl/plpython/plpy_util.h | 8 ++--
src/test/modules/test_shm_mq/test_shm_mq.h | 2 +-
src/test/modules/worker_spi/worker_spi.c | 2 +-
src/tools/msvc/Solution.pm | 1 +
17 files changed, 142 insertions(+), 53 deletions(-)
diff --git a/configure b/configure
index 4ffefe46552..a62008e5ac5 100755
--- a/configure
+++ b/configure
@@ -735,6 +735,7 @@ CPP
CFLAGS_SL
BITCODE_CXXFLAGS
BITCODE_CFLAGS
+CFLAGS_SL_MOD
CFLAGS_VECTORIZE
CFLAGS_UNROLL_LOOPS
PERMIT_DECLARATION_AFTER_STATEMENT
@@ -6421,6 +6422,54 @@ fi
if test -n "$NOT_THE_CFLAGS"; then
CFLAGS="$CFLAGS -Wno-stringop-truncation"
fi
+
+ # If the compiler knows how to hide symbols, use that. But only for shared libraries,
+ # for postgres itself that'd be too verbose for now.
+ { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether ${CC} supports -fvisibility=hidden, for CFLAGS_SL_MOD" >&5
+$as_echo_n "checking whether ${CC} supports -fvisibility=hidden, for CFLAGS_SL_MOD... " >&6; }
+if ${pgac_cv_prog_CC_cflags__fvisibility_hidden+:} false; then :
+ $as_echo_n "(cached) " >&6
+else
+ pgac_save_CFLAGS=$CFLAGS
+pgac_save_CC=$CC
+CC=${CC}
+CFLAGS="${CFLAGS_SL_MOD} -fvisibility=hidden"
+ac_save_c_werror_flag=$ac_c_werror_flag
+ac_c_werror_flag=yes
+cat confdefs.h - <<_ACEOF >conftest.$ac_ext
+/* end confdefs.h. */
+
+int
+main ()
+{
+
+ ;
+ return 0;
+}
+_ACEOF
+if ac_fn_c_try_compile "$LINENO"; then :
+ pgac_cv_prog_CC_cflags__fvisibility_hidden=yes
+else
+ pgac_cv_prog_CC_cflags__fvisibility_hidden=no
+fi
+rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext
+ac_c_werror_flag=$ac_save_c_werror_flag
+CFLAGS="$pgac_save_CFLAGS"
+CC="$pgac_save_CC"
+fi
+{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $pgac_cv_prog_CC_cflags__fvisibility_hidden" >&5
+$as_echo "$pgac_cv_prog_CC_cflags__fvisibility_hidden" >&6; }
+if test x"$pgac_cv_prog_CC_cflags__fvisibility_hidden" = x"yes"; then
+ CFLAGS_SL_MOD="${CFLAGS_SL_MOD} -fvisibility=hidden"
+fi
+
+
+ if test "$pgac_cv_prog_CC_cflags__fvisibility_hidden" = yes; then
+
+$as_echo "#define HAVE_VISIBILITY_ATTRIBUTE 1" >>confdefs.h
+
+ fi
+
elif test "$ICC" = yes; then
# Intel's compiler has a bug/misoptimization in checking for
# division by NAN (NaN == 0), -mp1 fixes it, so add it to the CFLAGS.
diff --git a/configure.ac b/configure.ac
index 44ee3ebe2f1..973f83db52c 100644
--- a/configure.ac
+++ b/configure.ac
@@ -541,6 +541,15 @@ if test "$GCC" = yes -a "$ICC" = no; then
if test -n "$NOT_THE_CFLAGS"; then
CFLAGS="$CFLAGS -Wno-stringop-truncation"
fi
+
+ # If the compiler knows how to hide symbols, use that. But only for shared libraries,
+ # for postgres itself that'd be too verbose for now.
+ PGAC_PROG_CC_VAR_OPT(CFLAGS_SL_MOD, [-fvisibility=hidden])
+ if test "$pgac_cv_prog_CC_cflags__fvisibility_hidden" = yes; then
+ AC_DEFINE(HAVE_VISIBILITY_ATTRIBUTE, 1,
+ [Define to 1 if your compiler knows the visibility("hidden") attribute.])
+ fi
+
elif test "$ICC" = yes; then
# Intel's compiler has a bug/misoptimization in checking for
# division by NAN (NaN == 0), -mp1 fixes it, so add it to the CFLAGS.
@@ -564,6 +573,7 @@ fi
AC_SUBST(CFLAGS_UNROLL_LOOPS)
AC_SUBST(CFLAGS_VECTORIZE)
+AC_SUBST(CFLAGS_SL_MOD)
# Determine flags used to emit bitcode for JIT inlining. Need to test
# for behaviour changing compiler flags, to keep compatibility with
diff --git a/contrib/hstore/hstore.h b/contrib/hstore/hstore.h
index bf4a565ed9b..625134c9f69 100644
--- a/contrib/hstore/hstore.h
+++ b/contrib/hstore/hstore.h
@@ -147,7 +147,7 @@ typedef struct
} while (0)
/* DatumGetHStoreP includes support for reading old-format hstore values */
-extern HStore *hstoreUpgrade(Datum orig);
+extern PGDLLEXPORT HStore *hstoreUpgrade(Datum orig);
#define DatumGetHStoreP(d) hstoreUpgrade(d)
@@ -168,14 +168,14 @@ typedef struct
bool needfree; /* need to pfree the value? */
} Pairs;
-extern int hstoreUniquePairs(Pairs *a, int32 l, int32 *buflen);
-extern HStore *hstorePairs(Pairs *pairs, int32 pcount, int32 buflen);
+extern PGDLLEXPORT int hstoreUniquePairs(Pairs *a, int32 l, int32 *buflen);
+extern PGDLLEXPORT HStore *hstorePairs(Pairs *pairs, int32 pcount, int32 buflen);
-extern size_t hstoreCheckKeyLen(size_t len);
-extern size_t hstoreCheckValLen(size_t len);
+extern PGDLLEXPORT size_t hstoreCheckKeyLen(size_t len);
+extern PGDLLEXPORT size_t hstoreCheckValLen(size_t len);
-extern int hstoreFindKey(HStore *hs, int *lowbound, char *key, int keylen);
-extern Pairs *hstoreArrayToPairs(ArrayType *a, int *npairs);
+extern PGDLLEXPORT int hstoreFindKey(HStore *hs, int *lowbound, char *key, int keylen);
+extern PGDLLEXPORT Pairs *hstoreArrayToPairs(ArrayType *a, int *npairs);
#define HStoreContainsStrategyNumber 7
#define HStoreExistsStrategyNumber 9
@@ -194,7 +194,7 @@ extern Pairs *hstoreArrayToPairs(ArrayType *a, int *npairs);
#if HSTORE_POLLUTE_NAMESPACE
#define HSTORE_POLLUTE(newname_,oldname_) \
PG_FUNCTION_INFO_V1(oldname_); \
- Datum newname_(PG_FUNCTION_ARGS); \
+ extern PGDLLEXPORT Datum newname_(PG_FUNCTION_ARGS); \
Datum oldname_(PG_FUNCTION_ARGS) { return newname_(fcinfo); } \
extern int no_such_variable
#else
diff --git a/contrib/ltree/ltree.h b/contrib/ltree/ltree.h
index 5b4be5e680a..d8bcdedbdbe 100644
--- a/contrib/ltree/ltree.h
+++ b/contrib/ltree/ltree.h
@@ -176,30 +176,30 @@ typedef struct
/* use in array iterator */
-Datum ltree_isparent(PG_FUNCTION_ARGS);
-Datum ltree_risparent(PG_FUNCTION_ARGS);
-Datum ltq_regex(PG_FUNCTION_ARGS);
-Datum ltq_rregex(PG_FUNCTION_ARGS);
-Datum lt_q_regex(PG_FUNCTION_ARGS);
-Datum lt_q_rregex(PG_FUNCTION_ARGS);
-Datum ltxtq_exec(PG_FUNCTION_ARGS);
-Datum ltxtq_rexec(PG_FUNCTION_ARGS);
-Datum _ltq_regex(PG_FUNCTION_ARGS);
-Datum _ltq_rregex(PG_FUNCTION_ARGS);
-Datum _lt_q_regex(PG_FUNCTION_ARGS);
-Datum _lt_q_rregex(PG_FUNCTION_ARGS);
-Datum _ltxtq_exec(PG_FUNCTION_ARGS);
-Datum _ltxtq_rexec(PG_FUNCTION_ARGS);
-Datum _ltree_isparent(PG_FUNCTION_ARGS);
-Datum _ltree_risparent(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltree_isparent(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltree_risparent(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltq_regex(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltq_rregex(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum lt_q_regex(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum lt_q_rregex(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltxtq_exec(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltxtq_rexec(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum _ltq_regex(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum _ltq_rregex(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum _lt_q_regex(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum _lt_q_rregex(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum _ltxtq_exec(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum _ltxtq_rexec(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum _ltree_isparent(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum _ltree_risparent(PG_FUNCTION_ARGS);
/* Concatenation functions */
-Datum ltree_addltree(PG_FUNCTION_ARGS);
-Datum ltree_addtext(PG_FUNCTION_ARGS);
-Datum ltree_textadd(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltree_addltree(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltree_addtext(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltree_textadd(PG_FUNCTION_ARGS);
/* Util function */
-Datum ltree_in(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltree_in(PG_FUNCTION_ARGS);
bool ltree_execute(ITEM *curitem, void *checkval,
bool calcnot, bool (*chkcond) (void *checkval, ITEM *val));
diff --git a/src/Makefile.global.in b/src/Makefile.global.in
index a1da1ea4eeb..a4ab27ae145 100644
--- a/src/Makefile.global.in
+++ b/src/Makefile.global.in
@@ -259,6 +259,7 @@ SUN_STUDIO_CC = @SUN_STUDIO_CC@
CXX = @CXX@
CFLAGS = @CFLAGS@
CFLAGS_SL = @CFLAGS_SL@
+CFLAGS_SL_MOD = @CFLAGS_SL_MOD@
CFLAGS_UNROLL_LOOPS = @CFLAGS_UNROLL_LOOPS@
CFLAGS_VECTORIZE = @CFLAGS_VECTORIZE@
CFLAGS_SSE42 = @CFLAGS_SSE42@
diff --git a/src/Makefile.shlib b/src/Makefile.shlib
index 551023c6fb0..d36782aa942 100644
--- a/src/Makefile.shlib
+++ b/src/Makefile.shlib
@@ -253,6 +253,18 @@ ifeq ($(PORTNAME), win32)
endif
+# If the shared library doesn't have an export file, mark all symbols not
+# explicitly exported using PGDLLEXPORT as hidden. We can't pass these flags
+# when building a library with explicit exports, as the symbols would be
+# hidden before the linker script / exported symbol list takes effect.
+#
+# XXX: This probably isn't the best location, but it's not clear where it would go instead?
+ifeq ($(SHLIB_EXPORTS),)
+ LDFLAGS += $(CFLAGS_SL_MOD)
+ override CFLAGS += $(CFLAGS_SL_MOD)
+ override CXXFLAGS += $(CFLAGS_SL_MOD)
+endif
+
##
## BUILD
diff --git a/src/include/c.h b/src/include/c.h
index c8ede082739..9b539a2657b 100644
--- a/src/include/c.h
+++ b/src/include/c.h
@@ -1312,11 +1312,18 @@ extern long long strtoll(const char *str, char **endptr, int base);
extern unsigned long long strtoull(const char *str, char **endptr, int base);
#endif
-/* no special DLL markers on most ports */
-#ifndef PGDLLIMPORT
-#define PGDLLIMPORT
+/*
+ * If the platform knows __attribute__((visibility("*"))), i.e. gcc like
+ * compilers, we use that.
+ */
+#if !defined(PGDLLIMPORT) && defined(HAVE_VISIBILITY_ATTRIBUTE)
+#define PGDLLIMPORT __attribute__((visibility("default")))
+#define PGDLLEXPORT __attribute__((visibility("default")))
#endif
-#ifndef PGDLLEXPORT
+
+/* No special DLL markers on the remaining ports. */
+#if !defined(PGDLLIMPORT)
+#define PGDLLIMPORT
#define PGDLLEXPORT
#endif
diff --git a/src/include/fmgr.h b/src/include/fmgr.h
index ab7b85c86e1..679443cca19 100644
--- a/src/include/fmgr.h
+++ b/src/include/fmgr.h
@@ -413,7 +413,7 @@ typedef const Pg_finfo_record *(*PGFInfoFunction) (void);
* info function, since authors shouldn't need to be explicitly aware of it.
*/
#define PG_FUNCTION_INFO_V1(funcname) \
-extern Datum funcname(PG_FUNCTION_ARGS); \
+extern PGDLLEXPORT Datum funcname(PG_FUNCTION_ARGS); \
extern PGDLLEXPORT const Pg_finfo_record * CppConcat(pg_finfo_,funcname)(void); \
const Pg_finfo_record * \
CppConcat(pg_finfo_,funcname) (void) \
@@ -424,6 +424,10 @@ CppConcat(pg_finfo_,funcname) (void) \
extern int no_such_variable
+extern PGDLLEXPORT void _PG_init(void);
+extern PGDLLEXPORT void _PG_fini(void);
+
+
/*-------------------------------------------------------------------------
* Support for verifying backend compatibility of loaded modules
*
diff --git a/src/include/jit/jit.h b/src/include/jit/jit.h
index b634df30b98..74617ad1b64 100644
--- a/src/include/jit/jit.h
+++ b/src/include/jit/jit.h
@@ -63,7 +63,7 @@ typedef struct JitContext
typedef struct JitProviderCallbacks JitProviderCallbacks;
-extern void _PG_jit_provider_init(JitProviderCallbacks *cb);
+extern PGDLLEXPORT void _PG_jit_provider_init(JitProviderCallbacks *cb);
typedef void (*JitProviderInit) (JitProviderCallbacks *cb);
typedef void (*JitProviderResetAfterErrorCB) (void);
typedef void (*JitProviderReleaseContextCB) (JitContext *context);
diff --git a/src/include/pg_config.h.in b/src/include/pg_config.h.in
index 15ffdd895aa..e3ab1c7752f 100644
--- a/src/include/pg_config.h.in
+++ b/src/include/pg_config.h.in
@@ -710,6 +710,9 @@
/* Define to 1 if you have the <uuid/uuid.h> header file. */
#undef HAVE_UUID_UUID_H
+/* Define to 1 if your compiler knows the visibility("hidden") attribute. */
+#undef HAVE_VISIBILITY_ATTRIBUTE
+
/* Define to 1 if you have the `wcstombs_l' function. */
#undef HAVE_WCSTOMBS_L
diff --git a/src/include/replication/output_plugin.h b/src/include/replication/output_plugin.h
index 810495ed0e4..a087f14dadd 100644
--- a/src/include/replication/output_plugin.h
+++ b/src/include/replication/output_plugin.h
@@ -35,6 +35,8 @@ typedef struct OutputPluginOptions
*/
typedef void (*LogicalOutputPluginInit) (struct OutputPluginCallbacks *cb);
+extern PGDLLEXPORT void _PG_output_plugin_init(struct OutputPluginCallbacks *cb);
+
/*
* Callback that gets called in a user-defined plugin. ctx->private_data can
* be set to some private data.
diff --git a/src/pl/plpython/plpy_elog.h b/src/pl/plpython/plpy_elog.h
index e02ef4ffe9f..aeade82ce10 100644
--- a/src/pl/plpython/plpy_elog.h
+++ b/src/pl/plpython/plpy_elog.h
@@ -34,13 +34,13 @@ extern PyObject *PLy_exc_spi_error;
} while(0)
#endif /* HAVE__BUILTIN_CONSTANT_P */
-extern void PLy_elog_impl(int elevel, const char *fmt,...) pg_attribute_printf(2, 3);
+extern PGDLLEXPORT void PLy_elog_impl(int elevel, const char *fmt,...) pg_attribute_printf(2, 3);
-extern void PLy_exception_set(PyObject *exc, const char *fmt,...) pg_attribute_printf(2, 3);
+extern PGDLLEXPORT void PLy_exception_set(PyObject *exc, const char *fmt,...) pg_attribute_printf(2, 3);
-extern void PLy_exception_set_plural(PyObject *exc, const char *fmt_singular, const char *fmt_plural,
+extern PGDLLEXPORT void PLy_exception_set_plural(PyObject *exc, const char *fmt_singular, const char *fmt_plural,
unsigned long n,...) pg_attribute_printf(2, 5) pg_attribute_printf(3, 5);
-extern void PLy_exception_set_with_details(PyObject *excclass, ErrorData *edata);
+extern PGDLLEXPORT void PLy_exception_set_with_details(PyObject *excclass, ErrorData *edata);
#endif /* PLPY_ELOG_H */
diff --git a/src/pl/plpython/plpy_typeio.h b/src/pl/plpython/plpy_typeio.h
index d11e6ae1b89..87e3b2c464e 100644
--- a/src/pl/plpython/plpy_typeio.h
+++ b/src/pl/plpython/plpy_typeio.h
@@ -147,29 +147,29 @@ struct PLyObToDatum
};
-extern PyObject *PLy_input_convert(PLyDatumToOb *arg, Datum val);
-extern Datum PLy_output_convert(PLyObToDatum *arg, PyObject *val,
+extern PGDLLEXPORT PyObject *PLy_input_convert(PLyDatumToOb *arg, Datum val);
+extern PGDLLEXPORT Datum PLy_output_convert(PLyObToDatum *arg, PyObject *val,
bool *isnull);
-extern PyObject *PLy_input_from_tuple(PLyDatumToOb *arg, HeapTuple tuple,
+extern PGDLLEXPORT PyObject *PLy_input_from_tuple(PLyDatumToOb *arg, HeapTuple tuple,
TupleDesc desc, bool include_generated);
-extern void PLy_input_setup_func(PLyDatumToOb *arg, MemoryContext arg_mcxt,
+extern PGDLLEXPORT void PLy_input_setup_func(PLyDatumToOb *arg, MemoryContext arg_mcxt,
Oid typeOid, int32 typmod,
struct PLyProcedure *proc);
-extern void PLy_output_setup_func(PLyObToDatum *arg, MemoryContext arg_mcxt,
+extern PGDLLEXPORT void PLy_output_setup_func(PLyObToDatum *arg, MemoryContext arg_mcxt,
Oid typeOid, int32 typmod,
struct PLyProcedure *proc);
-extern void PLy_input_setup_tuple(PLyDatumToOb *arg, TupleDesc desc,
+extern PGDLLEXPORT void PLy_input_setup_tuple(PLyDatumToOb *arg, TupleDesc desc,
struct PLyProcedure *proc);
-extern void PLy_output_setup_tuple(PLyObToDatum *arg, TupleDesc desc,
+extern PGDLLEXPORT void PLy_output_setup_tuple(PLyObToDatum *arg, TupleDesc desc,
struct PLyProcedure *proc);
-extern void PLy_output_setup_record(PLyObToDatum *arg, TupleDesc desc,
+extern PGDLLEXPORT void PLy_output_setup_record(PLyObToDatum *arg, TupleDesc desc,
struct PLyProcedure *proc);
/* conversion from Python objects to C strings --- exported for transforms */
-extern char *PLyObject_AsString(PyObject *plrv);
+extern PGDLLEXPORT char *PLyObject_AsString(PyObject *plrv);
#endif /* PLPY_TYPEIO_H */
diff --git a/src/pl/plpython/plpy_util.h b/src/pl/plpython/plpy_util.h
index c9ba7edc0ec..6927601e0be 100644
--- a/src/pl/plpython/plpy_util.h
+++ b/src/pl/plpython/plpy_util.h
@@ -8,12 +8,12 @@
#include "plpython.h"
-extern PyObject *PLyUnicode_Bytes(PyObject *unicode);
-extern char *PLyUnicode_AsString(PyObject *unicode);
+extern PGDLLEXPORT PyObject *PLyUnicode_Bytes(PyObject *unicode);
+extern PGDLLEXPORT char *PLyUnicode_AsString(PyObject *unicode);
#if PY_MAJOR_VERSION >= 3
-extern PyObject *PLyUnicode_FromString(const char *s);
-extern PyObject *PLyUnicode_FromStringAndSize(const char *s, Py_ssize_t size);
+extern PGDLLEXPORT PyObject *PLyUnicode_FromString(const char *s);
+extern PGDLLEXPORT PyObject *PLyUnicode_FromStringAndSize(const char *s, Py_ssize_t size);
#endif
#endif /* PLPY_UTIL_H */
diff --git a/src/test/modules/test_shm_mq/test_shm_mq.h b/src/test/modules/test_shm_mq/test_shm_mq.h
index a6661218347..a7a36714a48 100644
--- a/src/test/modules/test_shm_mq/test_shm_mq.h
+++ b/src/test/modules/test_shm_mq/test_shm_mq.h
@@ -40,6 +40,6 @@ extern void test_shm_mq_setup(int64 queue_size, int32 nworkers,
shm_mq_handle **input);
/* Main entrypoint for a worker. */
-extern void test_shm_mq_main(Datum) pg_attribute_noreturn();
+extern PGDLLEXPORT void test_shm_mq_main(Datum) pg_attribute_noreturn();
#endif
diff --git a/src/test/modules/worker_spi/worker_spi.c b/src/test/modules/worker_spi/worker_spi.c
index 0b6246676b6..e267bc3cffa 100644
--- a/src/test/modules/worker_spi/worker_spi.c
+++ b/src/test/modules/worker_spi/worker_spi.c
@@ -47,7 +47,7 @@ PG_MODULE_MAGIC;
PG_FUNCTION_INFO_V1(worker_spi_launch);
void _PG_init(void);
-void worker_spi_main(Datum) pg_attribute_noreturn();
+PGDLLEXPORT void worker_spi_main(Datum) pg_attribute_noreturn();
/* GUC variables */
static int worker_spi_naptime = 10;
diff --git a/src/tools/msvc/Solution.pm b/src/tools/msvc/Solution.pm
index aba59a270b4..811d6fcc5b2 100644
--- a/src/tools/msvc/Solution.pm
+++ b/src/tools/msvc/Solution.pm
@@ -432,6 +432,7 @@ sub GenerateFiles
HAVE_WINLDAP_H => undef,
HAVE_WCSTOMBS_L => 1,
HAVE_WCTYPE_H => 1,
+ HAVE_VISIBILITY_ATTRIBUTE => undef,
HAVE_WRITEV => undef,
HAVE_X509_GET_SIGNATURE_NID => 1,
HAVE_X86_64_POPCNTQ => undef,
--
2.23.0.385.gbc12974a89
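(Side note, for illustration and not part of the patch: on an ELF toolchain, one
rough way to see the effect of building a module with -fvisibility=hidden and no
export file is to look at which defined symbols remain dynamically visible -
only symbols marked PGDLLEXPORT should still show up. The module path here is
just an example:

  # list dynamically visible defined symbols of a module built this way
  nm -D --defined-only contrib/hstore/hstore.so | awk '$2 ~ /[TDB]/ {print $3}'
)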
Attachment: v3-0011-meson-prereq-Handle-DLSUFFIX-in-msvc-builds-simil.patch (text/x-diff; charset=us-ascii)
From 9f1027a30568d7142ed3a57129b8c84494a8eb1f Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Thu, 30 Sep 2021 10:20:24 -0700
Subject: [PATCH v3 11/17] meson: prereq: Handle DLSUFFIX in msvc builds
similar to other build envs.
---
src/include/port/win32_port.h | 3 ---
src/tools/msvc/Mkvcbuild.pm | 3 +++
2 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/src/include/port/win32_port.h b/src/include/port/win32_port.h
index c1c4831595a..72b2d2b5a01 100644
--- a/src/include/port/win32_port.h
+++ b/src/include/port/win32_port.h
@@ -529,9 +529,6 @@ typedef unsigned short mode_t;
#define W_OK 2
#define R_OK 4
-/* Pulled from Makefile.port in MinGW */
-#define DLSUFFIX ".dll"
-
#endif /* _MSC_VER */
#if (defined(_MSC_VER) && (_MSC_VER < 1900)) || \
diff --git a/src/tools/msvc/Mkvcbuild.pm b/src/tools/msvc/Mkvcbuild.pm
index b8e62c6d3f7..47b5c43357a 100644
--- a/src/tools/msvc/Mkvcbuild.pm
+++ b/src/tools/msvc/Mkvcbuild.pm
@@ -195,6 +195,7 @@ sub mkvcbuild
'syncrep_gram.y');
$postgres->AddFiles('src/backend/utils/adt', 'jsonpath_scan.l',
'jsonpath_gram.y');
+ $postgres->AddDefine('DLSUFFIX=".dll"');
$postgres->AddDefine('BUILDING_DLL');
$postgres->AddLibrary('secur32.lib');
$postgres->AddLibrary('ws2_32.lib');
@@ -298,6 +299,7 @@ sub mkvcbuild
my $libecpg = $solution->AddProject('libecpg', 'dll', 'interfaces',
'src/interfaces/ecpg/ecpglib');
$libecpg->AddDefine('FRONTEND');
+ $libecpg->AddDefine('DLSUFFIX=".dll"');
$libecpg->AddIncludeDir('src/interfaces/ecpg/include');
$libecpg->AddIncludeDir('src/interfaces/libpq');
$libecpg->AddIncludeDir('src/port');
@@ -845,6 +847,7 @@ sub mkvcbuild
$pgregress->AddFile('src/test/regress/pg_regress.c');
$pgregress->AddFile('src/test/regress/pg_regress_main.c');
$pgregress->AddIncludeDir('src/port');
+ $pgregress->AddDefine('DLSUFFIX=".dll"');
$pgregress->AddDefine('HOST_TUPLE="i686-pc-win32vc"');
$pgregress->AddLibrary('ws2_32.lib');
$pgregress->AddDirResourceFile('src/test/regress');
--
2.23.0.385.gbc12974a89
Attachment: v3-0012-prereq-Move-sed-expression-from-regress-python3-m.patch (text/x-diff; charset=us-ascii)
From 69e6ddf1d840d0d39005d559774626982c974bd1 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Sun, 3 Oct 2021 10:56:21 -0700
Subject: [PATCH v3 12/17] prereq: Move sed expression from
regress-python3-mangle.mk into its own file.
---
src/pl/plpython/regress-python3-mangle.mk | 17 ++---------------
src/pl/plpython/regress-python3-mangle.sed | 13 +++++++++++++
2 files changed, 15 insertions(+), 15 deletions(-)
create mode 100644 src/pl/plpython/regress-python3-mangle.sed
diff --git a/src/pl/plpython/regress-python3-mangle.mk b/src/pl/plpython/regress-python3-mangle.mk
index a785818a172..a5c155e73de 100644
--- a/src/pl/plpython/regress-python3-mangle.mk
+++ b/src/pl/plpython/regress-python3-mangle.mk
@@ -14,21 +14,8 @@ REGRESS := $(foreach test,$(REGRESS),$(if $(filter $(test),$(REGRESS_PLPYTHON3_M
pgregress-python3-mangle:
$(MKDIR_P) sql/python3 expected/python3 results/python3
for file in $(patsubst %,$(srcdir)/sql/%.sql,$(REGRESS_PLPYTHON3_MANGLE)) $(patsubst %,$(srcdir)/expected/%*.out,$(REGRESS_PLPYTHON3_MANGLE)); do \
- sed \
- -e "s/<type 'exceptions\.\([[:alpha:]]*\)'>/<class '\1'>/g" \
- -e "s/<type 'long'>/<class 'int'>/g" \
- -e "s/\([0-9][0-9]*\)L/\1/g" \
- -e 's/\([ [{]\)u"/\1"/g' \
- -e "s/\([ [{]\)u'/\1'/g" \
- -e "s/def next/def __next__/g" \
- -e "s/LANGUAGE plpythonu/LANGUAGE plpython3u/g" \
- -e "s/LANGUAGE plpython2u/LANGUAGE plpython3u/g" \
- -e "s/EXTENSION plpythonu/EXTENSION plpython3u/g" \
- -e "s/EXTENSION plpython2u/EXTENSION plpython3u/g" \
- -e "s/EXTENSION \([^ ]*\)_plpythonu/EXTENSION \1_plpython3u/g" \
- -e "s/EXTENSION \([^ ]*\)_plpython2u/EXTENSION \1_plpython3u/g" \
- -e 's/installing required extension "plpython2u"/installing required extension "plpython3u"/g' \
- $$file >`echo $$file | sed 's,^.*/\([^/][^/]*/\)\([^/][^/]*\)$$,\1python3/\2,'` || exit; \
+ sed -f $(top_srcdir)/src/pl/plpython/regress-python3-mangle.sed $$file > \
+ `echo $$file | sed 's,^.*/\([^/][^/]*/\)\([^/][^/]*\)$$,\1python3/\2,'` || exit; \
done
check installcheck: pgregress-python3-mangle
diff --git a/src/pl/plpython/regress-python3-mangle.sed b/src/pl/plpython/regress-python3-mangle.sed
new file mode 100644
index 00000000000..d2fde24e0bf
--- /dev/null
+++ b/src/pl/plpython/regress-python3-mangle.sed
@@ -0,0 +1,13 @@
+s/<type 'exceptions\.\([[:alpha:]]*\)'>/<class '\1'>/g
+s/<type 'long'>/<class 'int'>/g
+s/\([0-9][0-9]*\)L/\1/g
+s/\([ [{]\)u"/\1"/g
+s/\([ [{]\)u'/\1'/g
+s/def next/def __next__/g
+s/LANGUAGE plpythonu/LANGUAGE plpython3u/g
+s/LANGUAGE plpython2u/LANGUAGE plpython3u/g
+s/EXTENSION plpythonu/EXTENSION plpython3u/g
+s/EXTENSION plpython2u/EXTENSION plpython3u/g
+s/EXTENSION \([^ ]*\)_plpythonu/EXTENSION \1_plpython3u/g
+s/EXTENSION \([^ ]*\)_plpython2u/EXTENSION \1_plpython3u/g
+s/installing required extension "plpython2u"/installing required extension "plpython3u"/g
--
2.23.0.385.gbc12974a89
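(For illustration: since the mangling rules now live in a standalone sed script,
the same transformation can also be run by hand on a single file, e.g.

  sed -f src/pl/plpython/regress-python3-mangle.sed \
      sql/plpython_test.sql > sql/python3/plpython_test.sql

with the file names only as an example - the makefile loop above does exactly
this for every matching test file.)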
Attachment: v3-0013-Adapt-src-test-ldap-t-001_auth.pl-to-work-with-op.patch (text/x-diff; charset=us-ascii)
From 79eaef1e6a8039119d9e7a13671c8f94fecee2c3 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Sat, 9 Oct 2021 16:42:22 -0700
Subject: [PATCH v3 13/17] Adapt src/test/ldap/t/001_auth.pl to work with
openldap 2.5.
ldapsearch's deprecated -h/-p arguments were removed, need to use -H now.
Discussion: https://postgr.es/m/20211009233850.wvr6apcrw2ai6cnj@alap3.anarazel.de
Backpatch: 11-, where the tests were added.
---
src/test/ldap/t/001_auth.pl | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/src/test/ldap/t/001_auth.pl b/src/test/ldap/t/001_auth.pl
index f670bc5e0d5..a025a641b02 100644
--- a/src/test/ldap/t/001_auth.pl
+++ b/src/test/ldap/t/001_auth.pl
@@ -130,8 +130,8 @@ while (1)
last
if (
system_log(
- "ldapsearch", "-h", $ldap_server, "-p",
- $ldap_port, "-s", "base", "-b",
+ "ldapsearch", "-H", "$ldap_url", "-s",
+ "base", "-b",
$ldap_basedn, "-D", $ldap_rootdn, "-y",
$ldap_pwfile, "-n", "'objectclass=*'") == 0);
die "cannot connect to slapd" if ++$retries >= 300;
--
2.23.0.385.gbc12974a89
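(For reference, the flag change amounts to replacing the removed host/port
options with an LDAP URL; roughly, with illustrative values for the variables
the test sets up:

  # old, removed in OpenLDAP 2.5:  ldapsearch -h "$ldap_server" -p "$ldap_port" ...
  ldapsearch -H "ldap://localhost:$ldap_port/" -s base -b "$ldap_basedn" \
      -D "$ldap_rootdn" -y "$ldap_pwfile" -n 'objectclass=*'
)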
Attachment: v3-0014-wip-don-t-run-ldap-tests-on-windows.patch (text/x-diff; charset=us-ascii)
From fddca5d028897e5a5f1d5affe3c426e9721a2655 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Sun, 10 Oct 2021 13:49:12 -0700
Subject: [PATCH v3 14/17] wip: don't run ldap tests on windows.
---
src/test/ldap/t/001_auth.pl | 7 +++++++
1 file changed, 7 insertions(+)
diff --git a/src/test/ldap/t/001_auth.pl b/src/test/ldap/t/001_auth.pl
index a025a641b02..c921a3d4872 100644
--- a/src/test/ldap/t/001_auth.pl
+++ b/src/test/ldap/t/001_auth.pl
@@ -6,6 +6,13 @@ use warnings;
use TestLib;
use PostgresNode;
use Test::More;
+use Config;
+
+if ($Config{osname} eq 'MSWin32')
+{
+	plan skip_all => 'ldap tests not yet supported on Windows';
+ exit;
+}
if ($ENV{with_ldap} eq 'yes')
{
--
2.23.0.385.gbc12974a89
Attachment: v3-0015-wip-split-TESTDIR-into-two.patch (text/x-diff; charset=us-ascii)
From 7eed3612b84eba13730c13eaf59cb0d7bdae9d81 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Fri, 1 Oct 2021 16:37:52 -0700
Subject: [PATCH v3 15/17] wip: split TESTDIR into two.
---
src/Makefile.global.in | 3 +++
src/bin/psql/t/010_tab_completion.pl | 34 ++++++++++++++--------------
src/test/perl/TestLib.pm | 2 +-
src/tools/msvc/vcregress.pl | 1 +
4 files changed, 22 insertions(+), 18 deletions(-)
diff --git a/src/Makefile.global.in b/src/Makefile.global.in
index a4ab27ae145..7505e3fa79d 100644
--- a/src/Makefile.global.in
+++ b/src/Makefile.global.in
@@ -450,6 +450,7 @@ define prove_installcheck
rm -rf '$(CURDIR)'/tmp_check
$(MKDIR_P) '$(CURDIR)'/tmp_check
cd $(srcdir) && \
+ TESTOUTDIR='$(CURDIR)/tmp_check' \
TESTDIR='$(CURDIR)' PATH="$(bindir):$$PATH" PGPORT='6$(DEF_PGPORT)' \
top_builddir='$(CURDIR)/$(top_builddir)' \
PG_REGRESS='$(CURDIR)/$(top_builddir)/src/test/regress/pg_regress' \
@@ -460,6 +461,7 @@ define prove_installcheck
rm -rf '$(CURDIR)'/tmp_check
$(MKDIR_P) '$(CURDIR)'/tmp_check
cd $(srcdir) && \
+ TESTOUTDIR='$(CURDIR)/tmp_check' \
TESTDIR='$(CURDIR)' PATH="$(bindir):$$PATH" PGPORT='6$(DEF_PGPORT)' \
top_builddir='$(top_builddir)' \
PG_REGRESS='$(top_builddir)/src/test/regress/pg_regress' \
@@ -471,6 +473,7 @@ define prove_check
rm -rf '$(CURDIR)'/tmp_check
$(MKDIR_P) '$(CURDIR)'/tmp_check
cd $(srcdir) && \
+ TESTOUTDIR='$(CURDIR)/tmp_check' \
TESTDIR='$(CURDIR)' $(with_temp_install) PGPORT='6$(DEF_PGPORT)' \
PG_REGRESS='$(CURDIR)/$(top_builddir)/src/test/regress/pg_regress' \
$(PROVE) $(PG_PROVE_FLAGS) $(PROVE_FLAGS) $(if $(PROVE_TESTS),$(PROVE_TESTS),t/*.pl)
diff --git a/src/bin/psql/t/010_tab_completion.pl b/src/bin/psql/t/010_tab_completion.pl
index 8695d225451..9123ac4c1f0 100644
--- a/src/bin/psql/t/010_tab_completion.pl
+++ b/src/bin/psql/t/010_tab_completion.pl
@@ -67,23 +67,23 @@ delete $ENV{LS_COLORS};
# to run in the build directory so that we can use relative paths to
# access the tmp_check subdirectory; otherwise the output from filename
# completion tests is too variable.
-if ($ENV{TESTDIR})
+if ($ENV{TESTOUTDIR})
{
- chdir $ENV{TESTDIR} or die "could not chdir to \"$ENV{TESTDIR}\": $!";
+ chdir "$ENV{TESTOUTDIR}" or die "could not chdir to \"$ENV{TESTOUTDIR}\": $!";
}
# Create some junk files for filename completion testing.
my $FH;
-open $FH, ">", "tmp_check/somefile"
- or die("could not create file \"tmp_check/somefile\": $!");
+open $FH, ">", "somefile"
+ or die("could not create file \"somefile\": $!");
print $FH "some stuff\n";
close $FH;
-open $FH, ">", "tmp_check/afile123"
- or die("could not create file \"tmp_check/afile123\": $!");
+open $FH, ">", "afile123"
+ or die("could not create file \"afile123\": $!");
print $FH "more stuff\n";
close $FH;
-open $FH, ">", "tmp_check/afile456"
- or die("could not create file \"tmp_check/afile456\": $!");
+open $FH, ">", "afile456"
+ or die("could not create file \"afile456\": $!");
print $FH "other stuff\n";
close $FH;
@@ -180,16 +180,16 @@ clear_query();
# check filename completion
check_completion(
- "\\lo_import tmp_check/some\t",
- qr|tmp_check/somefile |,
+ "\\lo_import some\t",
+ qr|somefile |,
"filename completion with one possibility");
clear_query();
# note: readline might print a bell before the completion
check_completion(
- "\\lo_import tmp_check/af\t",
- qr|tmp_check/af\a?ile|,
+ "\\lo_import af\t",
+ qr|af\a?ile|,
"filename completion with multiple possibilities");
clear_query();
@@ -198,15 +198,15 @@ clear_query();
# note: broken versions of libedit want to backslash the closing quote;
# not much we can do about that
check_completion(
- "COPY foo FROM tmp_check/some\t",
- qr|'tmp_check/somefile\\?' |,
+ "COPY foo FROM some\t",
+ qr|'somefile\\?' |,
"quoted filename completion with one possibility");
clear_line();
check_completion(
- "COPY foo FROM tmp_check/af\t",
- qr|'tmp_check/afile|,
+ "COPY foo FROM af\t",
+ qr|'afile|,
"quoted filename completion with multiple possibilities");
# some versions of readline/libedit require two tabs here, some only need one
@@ -214,7 +214,7 @@ check_completion(
# the quotes might appear, too
check_completion(
"\t\t",
- qr|afile123'? +'?(tmp_check/)?afile456|,
+ qr|afile123'? +'?afile456|,
"offer multiple file choices");
clear_line();
diff --git a/src/test/perl/TestLib.pm b/src/test/perl/TestLib.pm
index 06aae1760eb..b3ae976d85c 100644
--- a/src/test/perl/TestLib.pm
+++ b/src/test/perl/TestLib.pm
@@ -187,7 +187,7 @@ INIT
# Determine output directories, and create them. The base path is the
# TESTDIR environment variable, which is normally set by the invoking
# Makefile.
- $tmp_check = $ENV{TESTDIR} ? "$ENV{TESTDIR}/tmp_check" : "tmp_check";
+ $tmp_check = $ENV{TESTOUTDIR} ? "$ENV{TESTOUTDIR}" : "tmp_check";
$log_path = "$tmp_check/log";
mkdir $tmp_check;
diff --git a/src/tools/msvc/vcregress.pl b/src/tools/msvc/vcregress.pl
index 35e8f67f013..d2f5ef1118a 100644
--- a/src/tools/msvc/vcregress.pl
+++ b/src/tools/msvc/vcregress.pl
@@ -248,6 +248,7 @@ sub tap_check
$ENV{REGRESS_SHLIB} = "$topdir/src/test/regress/regress.dll";
$ENV{TESTDIR} = "$dir";
+ $ENV{TESTOUTDIR} = "$dir/tmp_check";
rmtree('tmp_check');
system(@args);
--
2.23.0.385.gbc12974a89
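(Sketch of the intent, with made-up values: TESTDIR keeps pointing at the test's
location as before, while the new TESTOUTDIR is where tmp_check/ and the logs
end up, so an invocation would look roughly like

  TESTDIR="$PWD" TESTOUTDIR="$PWD/tmp_check" \
      prove -I src/test/perl t/010_tab_completion.pl

which is what lets the tab-completion test chdir into the output directory and
complete on bare file names.)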
Attachment: v3-0016-meson-Add-draft-of-a-meson-based-buildsystem.patch (text/x-diff; charset=utf-8)
From b41aabee7aa54ec2f8e0c1fd770c0ee52b09ca12 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Fri, 10 Sep 2021 09:51:51 -0700
Subject: [PATCH v3 16/17] meson: Add draft of a meson based buildsystem.
Author: Andres Freund
Author: Thomas Munro
---
contrib/adminpack/meson.build | 20 +
contrib/amcheck/meson.build | 35 +
contrib/auth_delay/meson.build | 4 +
contrib/auto_explain/meson.build | 13 +
contrib/bloom/meson.build | 38 +
contrib/bool_plperl/meson.build | 37 +
contrib/btree_gin/meson.build | 51 +
contrib/btree_gist/meson.build | 79 +
contrib/citext/meson.build | 29 +
contrib/cube/meson.build | 42 +
contrib/dblink/meson.build | 29 +
contrib/dict_int/meson.build | 19 +
contrib/dict_xsyn/meson.build | 26 +
contrib/earthdistance/meson.build | 20 +
contrib/file_fdw/meson.build | 19 +
contrib/fuzzystrmatch/meson.build | 23 +
contrib/hstore/meson.build | 36 +
contrib/hstore_plperl/meson.build | 38 +
contrib/hstore_plpython/expected/meson.build | 15 +
contrib/hstore_plpython/meson.build | 44 +
contrib/hstore_plpython/sql/meson.build | 17 +
contrib/jsonb_plperl/meson.build | 37 +
contrib/jsonb_plpython/expected/meson.build | 15 +
contrib/jsonb_plpython/meson.build | 44 +
contrib/jsonb_plpython/sql/meson.build | 17 +
contrib/meson.build | 63 +
contrib/oid2name/meson.build | 14 +
contrib/pageinspect/meson.build | 45 +
contrib/pg_prewarm/meson.build | 16 +
contrib/pg_stat_statements/meson.build | 31 +
contrib/pg_trgm/meson.build | 33 +
contrib/pg_visibility/meson.build | 25 +
contrib/postgres_fdw/meson.build | 31 +
contrib/spi/meson.build | 43 +
contrib/test_decoding/meson.build | 69 +
contrib/tsm_system_rows/meson.build | 22 +
contrib/tsm_system_time/meson.build | 22 +
contrib/unaccent/meson.build | 30 +
contrib/vacuumlo/meson.build | 14 +
contrib/xml2/meson.build | 30 +
conversion_helpers.txt | 6 +
meson.build | 1901 +++++++++++++++++
meson_options.txt | 81 +
src/backend/access/brin/meson.build | 12 +
src/backend/access/common/meson.build | 18 +
src/backend/access/gin/meson.build | 17 +
src/backend/access/gist/meson.build | 13 +
src/backend/access/hash/meson.build | 12 +
src/backend/access/heap/meson.build | 11 +
src/backend/access/index/meson.build | 6 +
src/backend/access/meson.build | 13 +
src/backend/access/nbtree/meson.build | 13 +
src/backend/access/rmgrdesc/meson.build | 26 +
src/backend/access/spgist/meson.build | 13 +
src/backend/access/table/meson.build | 6 +
src/backend/access/tablesample/meson.build | 5 +
src/backend/access/transam/meson.build | 28 +
src/backend/bootstrap/meson.build | 12 +
src/backend/catalog/meson.build | 41 +
src/backend/commands/meson.build | 50 +
src/backend/executor/meson.build | 67 +
src/backend/foreign/meson.build | 3 +
src/backend/jit/llvm/meson.build | 41 +
src/backend/jit/meson.build | 3 +
src/backend/lib/meson.build | 12 +
src/backend/libpq/meson.build | 28 +
src/backend/main/meson.build | 2 +
src/backend/meson.build | 191 ++
src/backend/nodes/meson.build | 17 +
src/backend/optimizer/geqo/meson.build | 17 +
src/backend/optimizer/meson.build | 5 +
src/backend/optimizer/path/meson.build | 11 +
src/backend/optimizer/plan/meson.build | 10 +
src/backend/optimizer/prep/meson.build | 7 +
src/backend/optimizer/util/meson.build | 16 +
src/backend/parser/meson.build | 43 +
src/backend/partitioning/meson.build | 5 +
src/backend/port/meson.build | 28 +
src/backend/port/win32/meson.build | 6 +
src/backend/postmaster/meson.build | 15 +
src/backend/regex/meson.build | 15 +
.../replication/libpqwalreceiver/meson.build | 13 +
src/backend/replication/logical/meson.build | 14 +
src/backend/replication/meson.build | 42 +
src/backend/replication/pgoutput/meson.build | 11 +
src/backend/rewrite/meson.build | 9 +
src/backend/snowball/meson.build | 83 +
src/backend/statistics/meson.build | 6 +
src/backend/storage/buffer/meson.build | 7 +
src/backend/storage/file/meson.build | 8 +
src/backend/storage/freespace/meson.build | 5 +
src/backend/storage/ipc/meson.build | 20 +
src/backend/storage/large_object/meson.build | 3 +
src/backend/storage/lmgr/meson.build | 18 +
src/backend/storage/meson.build | 9 +
src/backend/storage/page/meson.build | 5 +
src/backend/storage/smgr/meson.build | 4 +
src/backend/storage/sync/meson.build | 4 +
src/backend/tcop/meson.build | 8 +
src/backend/tsearch/meson.build | 21 +
src/backend/utils/activity/meson.build | 5 +
src/backend/utils/adt/meson.build | 118 +
src/backend/utils/cache/meson.build | 16 +
src/backend/utils/error/meson.build | 4 +
src/backend/utils/fmgr/meson.build | 8 +
src/backend/utils/hash/meson.build | 4 +
src/backend/utils/init/meson.build | 4 +
.../utils/mb/conversion_procs/meson.build | 38 +
src/backend/utils/mb/meson.build | 9 +
src/backend/utils/meson.build | 13 +
src/backend/utils/misc/meson.build | 28 +
src/backend/utils/mmgr/meson.build | 10 +
src/backend/utils/mmgr/proxy.c | 217 ++
src/backend/utils/resowner/meson.build | 3 +
src/backend/utils/sort/meson.build | 7 +
src/backend/utils/time/meson.build | 4 +
src/bin/initdb/meson.build | 24 +
src/bin/meson.build | 20 +
src/bin/pg_amcheck/meson.build | 22 +
src/bin/pg_archivecleanup/meson.build | 14 +
src/bin/pg_basebackup/meson.build | 44 +
src/bin/pg_checksums/meson.build | 16 +
src/bin/pg_config/meson.build | 14 +
src/bin/pg_controldata/meson.build | 14 +
src/bin/pg_ctl/meson.build | 17 +
src/bin/pg_dump/meson.build | 69 +
src/bin/pg_resetwal/meson.build | 15 +
src/bin/pg_rewind/meson.build | 34 +
src/bin/pg_test_fsync/meson.build | 14 +
src/bin/pg_test_timing/meson.build | 14 +
src/bin/pg_upgrade/meson.build | 26 +
src/bin/pg_verifybackup/meson.build | 25 +
src/bin/pg_waldump/meson.build | 23 +
src/bin/pgbench/meson.build | 38 +
src/bin/pgevent/meson.build | 1 +
src/bin/psql/meson.build | 46 +
src/bin/scripts/meson.build | 46 +
src/common/meson.build | 140 ++
src/fe_utils/meson.build | 27 +
src/include/catalog/meson.build | 113 +
src/include/meson.build | 50 +
src/include/parser/meson.build | 10 +
src/include/pch/c_pch.h | 1 +
src/include/pch/postgres_pch.h | 1 +
src/include/pg_config_ext.h.meson | 7 +
src/include/storage/meson.build | 15 +
src/include/utils/meson.build | 22 +
src/interfaces/libpq/meson.build | 99 +
src/meson.build | 10 +
src/pl/meson.build | 4 +
src/pl/plperl/meson.build | 81 +
src/pl/plpgsql/meson.build | 1 +
src/pl/plpgsql/src/meson.build | 67 +
src/pl/plpython/expected/meson.build | 14 +
src/pl/plpython/meson.build | 100 +
src/pl/plpython/sql/meson.build | 15 +
src/port/meson.build | 146 ++
src/test/authentication/meson.build | 9 +
src/test/isolation/meson.build | 49 +
src/test/kerberos/meson.build | 12 +
src/test/ldap/meson.build | 9 +
src/test/meson.build | 19 +
src/test/modules/brin/meson.build | 19 +
src/test/modules/commit_ts/meson.build | 20 +
src/test/modules/delay_execution/meson.build | 15 +
src/test/modules/dummy_index_am/meson.build | 20 +
src/test/modules/dummy_seclabel/meson.build | 20 +
src/test/modules/libpq_pipeline/meson.build | 21 +
src/test/modules/meson.build | 25 +
src/test/modules/plsample/meson.build | 20 +
src/test/modules/snapshot_too_old/meson.build | 11 +
src/test/modules/spgist_name_ops/meson.build | 20 +
.../ssl_passphrase_callback/meson.build | 45 +
src/test/modules/test_bloomfilter/meson.build | 20 +
src/test/modules/test_ddl_deparse/meson.build | 40 +
src/test/modules/test_extensions/meson.build | 38 +
.../modules/test_ginpostinglist/meson.build | 20 +
src/test/modules/test_integerset/meson.build | 20 +
src/test/modules/test_misc/meson.build | 8 +
src/test/modules/test_parser/meson.build | 20 +
src/test/modules/test_pg_dump/meson.build | 24 +
src/test/modules/test_predtest/meson.build | 20 +
src/test/modules/test_rbtree/meson.build | 20 +
src/test/modules/test_regex/meson.build | 21 +
src/test/modules/test_rls_hooks/meson.build | 19 +
src/test/modules/test_shm_mq/meson.build | 24 +
src/test/modules/unsafe_tests/meson.build | 9 +
src/test/modules/worker_spi/meson.build | 23 +
src/test/recovery/meson.build | 33 +
src/test/regress/meson.build | 57 +
src/test/ssl/meson.build | 10 +
src/test/subscription/meson.build | 33 +
src/timezone/meson.build | 50 +
src/timezone/tznames/meson.build | 20 +
src/tools/find_meson | 20 +
src/tools/irlink | 28 +
src/tools/msvc/export2def.pl | 22 +
src/tools/msvc/gendef2.pl | 177 ++
src/tools/testwrap | 22 +
199 files changed, 7430 insertions(+)
create mode 100644 contrib/adminpack/meson.build
create mode 100644 contrib/amcheck/meson.build
create mode 100644 contrib/auth_delay/meson.build
create mode 100644 contrib/auto_explain/meson.build
create mode 100644 contrib/bloom/meson.build
create mode 100644 contrib/bool_plperl/meson.build
create mode 100644 contrib/btree_gin/meson.build
create mode 100644 contrib/btree_gist/meson.build
create mode 100644 contrib/citext/meson.build
create mode 100644 contrib/cube/meson.build
create mode 100644 contrib/dblink/meson.build
create mode 100644 contrib/dict_int/meson.build
create mode 100644 contrib/dict_xsyn/meson.build
create mode 100644 contrib/earthdistance/meson.build
create mode 100644 contrib/file_fdw/meson.build
create mode 100644 contrib/fuzzystrmatch/meson.build
create mode 100644 contrib/hstore/meson.build
create mode 100644 contrib/hstore_plperl/meson.build
create mode 100644 contrib/hstore_plpython/expected/meson.build
create mode 100644 contrib/hstore_plpython/meson.build
create mode 100644 contrib/hstore_plpython/sql/meson.build
create mode 100644 contrib/jsonb_plperl/meson.build
create mode 100644 contrib/jsonb_plpython/expected/meson.build
create mode 100644 contrib/jsonb_plpython/meson.build
create mode 100644 contrib/jsonb_plpython/sql/meson.build
create mode 100644 contrib/meson.build
create mode 100644 contrib/oid2name/meson.build
create mode 100644 contrib/pageinspect/meson.build
create mode 100644 contrib/pg_prewarm/meson.build
create mode 100644 contrib/pg_stat_statements/meson.build
create mode 100644 contrib/pg_trgm/meson.build
create mode 100644 contrib/pg_visibility/meson.build
create mode 100644 contrib/postgres_fdw/meson.build
create mode 100644 contrib/spi/meson.build
create mode 100644 contrib/test_decoding/meson.build
create mode 100644 contrib/tsm_system_rows/meson.build
create mode 100644 contrib/tsm_system_time/meson.build
create mode 100644 contrib/unaccent/meson.build
create mode 100644 contrib/vacuumlo/meson.build
create mode 100644 contrib/xml2/meson.build
create mode 100644 conversion_helpers.txt
create mode 100644 meson.build
create mode 100644 meson_options.txt
create mode 100644 src/backend/access/brin/meson.build
create mode 100644 src/backend/access/common/meson.build
create mode 100644 src/backend/access/gin/meson.build
create mode 100644 src/backend/access/gist/meson.build
create mode 100644 src/backend/access/hash/meson.build
create mode 100644 src/backend/access/heap/meson.build
create mode 100644 src/backend/access/index/meson.build
create mode 100644 src/backend/access/meson.build
create mode 100644 src/backend/access/nbtree/meson.build
create mode 100644 src/backend/access/rmgrdesc/meson.build
create mode 100644 src/backend/access/spgist/meson.build
create mode 100644 src/backend/access/table/meson.build
create mode 100644 src/backend/access/tablesample/meson.build
create mode 100644 src/backend/access/transam/meson.build
create mode 100644 src/backend/bootstrap/meson.build
create mode 100644 src/backend/catalog/meson.build
create mode 100644 src/backend/commands/meson.build
create mode 100644 src/backend/executor/meson.build
create mode 100644 src/backend/foreign/meson.build
create mode 100644 src/backend/jit/llvm/meson.build
create mode 100644 src/backend/jit/meson.build
create mode 100644 src/backend/lib/meson.build
create mode 100644 src/backend/libpq/meson.build
create mode 100644 src/backend/main/meson.build
create mode 100644 src/backend/meson.build
create mode 100644 src/backend/nodes/meson.build
create mode 100644 src/backend/optimizer/geqo/meson.build
create mode 100644 src/backend/optimizer/meson.build
create mode 100644 src/backend/optimizer/path/meson.build
create mode 100644 src/backend/optimizer/plan/meson.build
create mode 100644 src/backend/optimizer/prep/meson.build
create mode 100644 src/backend/optimizer/util/meson.build
create mode 100644 src/backend/parser/meson.build
create mode 100644 src/backend/partitioning/meson.build
create mode 100644 src/backend/port/meson.build
create mode 100644 src/backend/port/win32/meson.build
create mode 100644 src/backend/postmaster/meson.build
create mode 100644 src/backend/regex/meson.build
create mode 100644 src/backend/replication/libpqwalreceiver/meson.build
create mode 100644 src/backend/replication/logical/meson.build
create mode 100644 src/backend/replication/meson.build
create mode 100644 src/backend/replication/pgoutput/meson.build
create mode 100644 src/backend/rewrite/meson.build
create mode 100644 src/backend/snowball/meson.build
create mode 100644 src/backend/statistics/meson.build
create mode 100644 src/backend/storage/buffer/meson.build
create mode 100644 src/backend/storage/file/meson.build
create mode 100644 src/backend/storage/freespace/meson.build
create mode 100644 src/backend/storage/ipc/meson.build
create mode 100644 src/backend/storage/large_object/meson.build
create mode 100644 src/backend/storage/lmgr/meson.build
create mode 100644 src/backend/storage/meson.build
create mode 100644 src/backend/storage/page/meson.build
create mode 100644 src/backend/storage/smgr/meson.build
create mode 100644 src/backend/storage/sync/meson.build
create mode 100644 src/backend/tcop/meson.build
create mode 100644 src/backend/tsearch/meson.build
create mode 100644 src/backend/utils/activity/meson.build
create mode 100644 src/backend/utils/adt/meson.build
create mode 100644 src/backend/utils/cache/meson.build
create mode 100644 src/backend/utils/error/meson.build
create mode 100644 src/backend/utils/fmgr/meson.build
create mode 100644 src/backend/utils/hash/meson.build
create mode 100644 src/backend/utils/init/meson.build
create mode 100644 src/backend/utils/mb/conversion_procs/meson.build
create mode 100644 src/backend/utils/mb/meson.build
create mode 100644 src/backend/utils/meson.build
create mode 100644 src/backend/utils/misc/meson.build
create mode 100644 src/backend/utils/mmgr/meson.build
create mode 100644 src/backend/utils/mmgr/proxy.c
create mode 100644 src/backend/utils/resowner/meson.build
create mode 100644 src/backend/utils/sort/meson.build
create mode 100644 src/backend/utils/time/meson.build
create mode 100644 src/bin/initdb/meson.build
create mode 100644 src/bin/meson.build
create mode 100644 src/bin/pg_amcheck/meson.build
create mode 100644 src/bin/pg_archivecleanup/meson.build
create mode 100644 src/bin/pg_basebackup/meson.build
create mode 100644 src/bin/pg_checksums/meson.build
create mode 100644 src/bin/pg_config/meson.build
create mode 100644 src/bin/pg_controldata/meson.build
create mode 100644 src/bin/pg_ctl/meson.build
create mode 100644 src/bin/pg_dump/meson.build
create mode 100644 src/bin/pg_resetwal/meson.build
create mode 100644 src/bin/pg_rewind/meson.build
create mode 100644 src/bin/pg_test_fsync/meson.build
create mode 100644 src/bin/pg_test_timing/meson.build
create mode 100644 src/bin/pg_upgrade/meson.build
create mode 100644 src/bin/pg_verifybackup/meson.build
create mode 100644 src/bin/pg_waldump/meson.build
create mode 100644 src/bin/pgbench/meson.build
create mode 100644 src/bin/pgevent/meson.build
create mode 100644 src/bin/psql/meson.build
create mode 100644 src/bin/scripts/meson.build
create mode 100644 src/common/meson.build
create mode 100644 src/fe_utils/meson.build
create mode 100644 src/include/catalog/meson.build
create mode 100644 src/include/meson.build
create mode 100644 src/include/parser/meson.build
create mode 100644 src/include/pch/c_pch.h
create mode 100644 src/include/pch/postgres_pch.h
create mode 100644 src/include/pg_config_ext.h.meson
create mode 100644 src/include/storage/meson.build
create mode 100644 src/include/utils/meson.build
create mode 100644 src/interfaces/libpq/meson.build
create mode 100644 src/meson.build
create mode 100644 src/pl/meson.build
create mode 100644 src/pl/plperl/meson.build
create mode 100644 src/pl/plpgsql/meson.build
create mode 100644 src/pl/plpgsql/src/meson.build
create mode 100644 src/pl/plpython/expected/meson.build
create mode 100644 src/pl/plpython/meson.build
create mode 100644 src/pl/plpython/sql/meson.build
create mode 100644 src/port/meson.build
create mode 100644 src/test/authentication/meson.build
create mode 100644 src/test/isolation/meson.build
create mode 100644 src/test/kerberos/meson.build
create mode 100644 src/test/ldap/meson.build
create mode 100644 src/test/meson.build
create mode 100644 src/test/modules/brin/meson.build
create mode 100644 src/test/modules/commit_ts/meson.build
create mode 100644 src/test/modules/delay_execution/meson.build
create mode 100644 src/test/modules/dummy_index_am/meson.build
create mode 100644 src/test/modules/dummy_seclabel/meson.build
create mode 100644 src/test/modules/libpq_pipeline/meson.build
create mode 100644 src/test/modules/meson.build
create mode 100644 src/test/modules/plsample/meson.build
create mode 100644 src/test/modules/snapshot_too_old/meson.build
create mode 100644 src/test/modules/spgist_name_ops/meson.build
create mode 100644 src/test/modules/ssl_passphrase_callback/meson.build
create mode 100644 src/test/modules/test_bloomfilter/meson.build
create mode 100644 src/test/modules/test_ddl_deparse/meson.build
create mode 100644 src/test/modules/test_extensions/meson.build
create mode 100644 src/test/modules/test_ginpostinglist/meson.build
create mode 100644 src/test/modules/test_integerset/meson.build
create mode 100644 src/test/modules/test_misc/meson.build
create mode 100644 src/test/modules/test_parser/meson.build
create mode 100644 src/test/modules/test_pg_dump/meson.build
create mode 100644 src/test/modules/test_predtest/meson.build
create mode 100644 src/test/modules/test_rbtree/meson.build
create mode 100644 src/test/modules/test_regex/meson.build
create mode 100644 src/test/modules/test_rls_hooks/meson.build
create mode 100644 src/test/modules/test_shm_mq/meson.build
create mode 100644 src/test/modules/unsafe_tests/meson.build
create mode 100644 src/test/modules/worker_spi/meson.build
create mode 100644 src/test/recovery/meson.build
create mode 100644 src/test/regress/meson.build
create mode 100644 src/test/ssl/meson.build
create mode 100644 src/test/subscription/meson.build
create mode 100644 src/timezone/meson.build
create mode 100644 src/timezone/tznames/meson.build
create mode 100755 src/tools/find_meson
create mode 100644 src/tools/irlink
create mode 100644 src/tools/msvc/export2def.pl
create mode 100644 src/tools/msvc/gendef2.pl
create mode 100755 src/tools/testwrap
diff --git a/contrib/adminpack/meson.build b/contrib/adminpack/meson.build
new file mode 100644
index 00000000000..457a6089445
--- /dev/null
+++ b/contrib/adminpack/meson.build
@@ -0,0 +1,20 @@
+autoinc = shared_module('adminpack',
+ ['adminpack.c'],
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'adminpack.control',
+ 'adminpack--1.0.sql',
+ 'adminpack--1.0--1.1.sql',
+ 'adminpack--1.1--2.0.sql',
+ 'adminpack--2.0--2.1.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'adminpack',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': ['adminpack'],
+}
diff --git a/contrib/amcheck/meson.build b/contrib/amcheck/meson.build
new file mode 100644
index 00000000000..7d6a7bc8385
--- /dev/null
+++ b/contrib/amcheck/meson.build
@@ -0,0 +1,35 @@
+amcheck = shared_module('amcheck', [
+ 'verify_heapam.c',
+ 'verify_nbtree.c',
+ ],
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'amcheck.control',
+ 'amcheck--1.0.sql',
+ 'amcheck--1.0--1.1.sql',
+ 'amcheck--1.1--1.2.sql',
+ 'amcheck--1.2--1.3.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'amcheck',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'check',
+ 'check_btree',
+ 'check_heap'
+ ],
+}
+
+tap_tests += {
+ 'name': 'amcheck',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_verify_heapam.pl',
+ ],
+}
diff --git a/contrib/auth_delay/meson.build b/contrib/auth_delay/meson.build
new file mode 100644
index 00000000000..941bb6f39a4
--- /dev/null
+++ b/contrib/auth_delay/meson.build
@@ -0,0 +1,4 @@
+autoinc = shared_module('auth_delay',
+ ['auth_delay.c'],
+ kwargs: contrib_mod_args,
+)
diff --git a/contrib/auto_explain/meson.build b/contrib/auto_explain/meson.build
new file mode 100644
index 00000000000..321896efa2c
--- /dev/null
+++ b/contrib/auto_explain/meson.build
@@ -0,0 +1,13 @@
+auto_explain = shared_module('auto_explain',
+ files('auto_explain.c'),
+ kwargs: contrib_mod_args,
+)
+
+tap_tests += {
+ 'name': 'auto_explain',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_auto_explain.pl',
+ ]
+}
diff --git a/contrib/bloom/meson.build b/contrib/bloom/meson.build
new file mode 100644
index 00000000000..5c5d33c7f7a
--- /dev/null
+++ b/contrib/bloom/meson.build
@@ -0,0 +1,38 @@
+bloom_sources = files(
+ 'blcost.c',
+ 'blinsert.c',
+ 'blscan.c',
+ 'blutils.c',
+ 'blvacuum.c',
+ 'blvalidate.c',
+)
+
+bloom = shared_module('bloom',
+ bloom_sources,
+ c_pch: '../../src/include/pch/c_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'bloom.control',
+ 'bloom--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'bloom',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'bloom'
+ ],
+}
+
+tap_tests += {
+ 'name': 'bloom',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_wal.pl',
+ ],
+}
diff --git a/contrib/bool_plperl/meson.build b/contrib/bool_plperl/meson.build
new file mode 100644
index 00000000000..e15dc5285eb
--- /dev/null
+++ b/contrib/bool_plperl/meson.build
@@ -0,0 +1,37 @@
+if not perl_dep.found()
+ subdir_done()
+endif
+
+bool_plperl_sources = files(
+ 'bool_plperl.c',
+)
+
+bool_plperl = shared_module('bool_plperl',
+ bool_plperl_sources,
+ include_directories: [plperl_inc, include_directories('.')],
+ kwargs: pg_mod_args + {
+ 'dependencies': [perl_dep, contrib_mod_args['dependencies']],
+ },
+)
+
+install_data(
+ 'bool_plperl.control',
+ 'bool_plperl--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+install_data(
+ 'bool_plperlu.control',
+ 'bool_plperlu--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'bool_plperl',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'bool_plperl',
+ 'bool_plperlu',
+ ],
+}
diff --git a/contrib/btree_gin/meson.build b/contrib/btree_gin/meson.build
new file mode 100644
index 00000000000..d25ece7500e
--- /dev/null
+++ b/contrib/btree_gin/meson.build
@@ -0,0 +1,51 @@
+btree_gin = shared_module('btree_gin',
+ files('btree_gin.c'),
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'btree_gin.control',
+ 'btree_gin--1.0.sql',
+ 'btree_gin--1.0--1.1.sql',
+ 'btree_gin--1.1--1.2.sql',
+ 'btree_gin--1.2--1.3.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'btree_gin',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'install_btree_gin',
+ 'int2',
+ 'int4',
+ 'int8',
+ 'float4',
+ 'float8',
+ 'money',
+ 'oid',
+ 'timestamp',
+ 'timestamptz',
+ 'time',
+ 'timetz',
+ 'date',
+ 'interval',
+ 'macaddr',
+ 'macaddr8',
+ 'inet',
+ 'cidr',
+ 'text',
+ 'varchar',
+ 'char',
+ 'bytea',
+ 'bit',
+ 'varbit',
+ 'numeric',
+ 'enum',
+ 'uuid',
+ 'name',
+ 'bool',
+ 'bpchar',
+ ],
+}
diff --git a/contrib/btree_gist/meson.build b/contrib/btree_gist/meson.build
new file mode 100644
index 00000000000..8ee0faea401
--- /dev/null
+++ b/contrib/btree_gist/meson.build
@@ -0,0 +1,79 @@
+btree_gist_sources = files(
+ 'btree_bit.c',
+ 'btree_bytea.c',
+ 'btree_cash.c',
+ 'btree_date.c',
+ 'btree_enum.c',
+ 'btree_float4.c',
+ 'btree_float8.c',
+ 'btree_gist.c',
+ 'btree_inet.c',
+ 'btree_int2.c',
+ 'btree_int4.c',
+ 'btree_int8.c',
+ 'btree_interval.c',
+ 'btree_macaddr.c',
+ 'btree_macaddr8.c',
+ 'btree_numeric.c',
+ 'btree_oid.c',
+ 'btree_text.c',
+ 'btree_time.c',
+ 'btree_ts.c',
+ 'btree_utils_num.c',
+ 'btree_utils_var.c',
+ 'btree_uuid.c',
+)
+
+btree_gist = shared_module('btree_gist',
+ btree_gist_sources,
+ c_pch: '../../src/include/pch/c_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'btree_gist.control',
+ 'btree_gist--1.0--1.1.sql',
+ 'btree_gist--1.1--1.2.sql',
+ 'btree_gist--1.2.sql',
+ 'btree_gist--1.2--1.3.sql',
+ 'btree_gist--1.3--1.4.sql',
+ 'btree_gist--1.4--1.5.sql',
+ 'btree_gist--1.5--1.6.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'btree_gist',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'init',
+ 'int2',
+ 'int4',
+ 'int8',
+ 'float4',
+ 'float8',
+ 'cash',
+ 'oid',
+ 'timestamp',
+ 'timestamptz',
+ 'time',
+ 'timetz',
+ 'date',
+ 'interval',
+ 'macaddr',
+ 'macaddr8',
+ 'inet',
+ 'cidr',
+ 'text',
+ 'varchar',
+ 'char',
+ 'bytea',
+ 'bit',
+ 'varbit',
+ 'numeric',
+ 'uuid',
+ 'not_equal',
+ 'enum',
+ ],
+}
diff --git a/contrib/citext/meson.build b/contrib/citext/meson.build
new file mode 100644
index 00000000000..f2e9ff2117d
--- /dev/null
+++ b/contrib/citext/meson.build
@@ -0,0 +1,29 @@
+citext_sources = files(
+ 'citext.c',
+)
+
+citext = shared_module('citext',
+ citext_sources,
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'citext.control',
+ 'citext--1.0--1.1.sql',
+ 'citext--1.1--1.2.sql',
+ 'citext--1.2--1.3.sql',
+ 'citext--1.3--1.4.sql',
+ 'citext--1.4.sql',
+ 'citext--1.4--1.5.sql',
+ 'citext--1.5--1.6.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'citext',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'citext'
+ ],
+}
diff --git a/contrib/cube/meson.build b/contrib/cube/meson.build
new file mode 100644
index 00000000000..49276aed644
--- /dev/null
+++ b/contrib/cube/meson.build
@@ -0,0 +1,42 @@
+cube_sources = files(
+ 'cube.c',
+)
+
+# cubescan is compiled as part of cubeparse
+cubescan = custom_target('cubescan',
+ input: ['cubescan.l'],
+ output: ['cubescan.c'],
+ command: [flex, '-CFe', '-p', '-p', '-o', '@OUTPUT@', '@INPUT@'])
+
+cube_sources += custom_target('cubeparse',
+ input: 'cubeparse.y',
+ output: 'cubeparse.c',
+ depends: cubescan,
+ command: [bison, bisonflags, '-o', '@OUTPUT@', '@INPUT0@'])
+
+cube = shared_module('cube',
+ cube_sources,
+ include_directories: include_directories('.'),
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'cube.control',
+ 'cube--1.0--1.1.sql',
+ 'cube--1.1--1.2.sql',
+ 'cube--1.2.sql',
+ 'cube--1.2--1.3.sql',
+ 'cube--1.3--1.4.sql',
+ 'cube--1.4--1.5.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'cube',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'cube',
+ 'cube_sci',
+ ],
+}
diff --git a/contrib/dblink/meson.build b/contrib/dblink/meson.build
new file mode 100644
index 00000000000..7ac253700c9
--- /dev/null
+++ b/contrib/dblink/meson.build
@@ -0,0 +1,29 @@
+dblink_sources = files(
+ 'dblink.c',
+)
+
+dblink = shared_module('dblink',
+ dblink_sources,
+ kwargs: contrib_mod_args + {
+ 'dependencies': pg_mod_args['dependencies'] + [libpq],
+ },
+)
+
+install_data(
+ 'dblink.control',
+ 'dblink--1.0--1.1.sql',
+ 'dblink--1.1--1.2.sql',
+ 'dblink--1.2.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'dblink',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'paths',
+ 'dblink'
+ ],
+ 'regress_args': ['--dlpath', meson.build_root() / 'src/test/regress'],
+}
diff --git a/contrib/dict_int/meson.build b/contrib/dict_int/meson.build
new file mode 100644
index 00000000000..7c23b275c5a
--- /dev/null
+++ b/contrib/dict_int/meson.build
@@ -0,0 +1,19 @@
+dict_int = shared_module('dict_int',
+ files('dict_int.c'),
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'dict_int.control',
+ 'dict_int--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'dict_int',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'dict_int'
+ ],
+}
diff --git a/contrib/dict_xsyn/meson.build b/contrib/dict_xsyn/meson.build
new file mode 100644
index 00000000000..7cbabba02f1
--- /dev/null
+++ b/contrib/dict_xsyn/meson.build
@@ -0,0 +1,26 @@
+dict_xsyn = shared_module('dict_xsyn',
+ files('dict_xsyn.c'),
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'dict_xsyn.control',
+ 'dict_xsyn--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+install_data(
+ 'xsyn_sample.rules',
+ kwargs: contrib_data_args + {
+ 'install_dir': get_option('datadir') / 'tsearch_data'
+ }
+)
+
+regress_tests += {
+ 'name': 'dict_xsyn',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'dict_xsyn'
+ ],
+}
diff --git a/contrib/earthdistance/meson.build b/contrib/earthdistance/meson.build
new file mode 100644
index 00000000000..d56abf4f260
--- /dev/null
+++ b/contrib/earthdistance/meson.build
@@ -0,0 +1,20 @@
+earthdistance = shared_module('earthdistance',
+ files('earthdistance.c'),
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'earthdistance.control',
+ 'earthdistance--1.0--1.1.sql',
+ 'earthdistance--1.1.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'earthdistance',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'earthdistance'
+ ],
+}
diff --git a/contrib/file_fdw/meson.build b/contrib/file_fdw/meson.build
new file mode 100644
index 00000000000..0cd3348dfd0
--- /dev/null
+++ b/contrib/file_fdw/meson.build
@@ -0,0 +1,19 @@
+file_fdw = shared_module('file_fdw',
+ files('file_fdw.c'),
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'file_fdw.control',
+ 'file_fdw--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'file_fdw',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'file_fdw'
+ ],
+}
diff --git a/contrib/fuzzystrmatch/meson.build b/contrib/fuzzystrmatch/meson.build
new file mode 100644
index 00000000000..d1e75479668
--- /dev/null
+++ b/contrib/fuzzystrmatch/meson.build
@@ -0,0 +1,23 @@
+fuzzystrmatch = shared_module('fuzzystrmatch',
+ files(
+ 'fuzzystrmatch.c',
+ 'dmetaphone.c'
+ ),
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'fuzzystrmatch.control',
+ 'fuzzystrmatch--1.0--1.1.sql',
+ 'fuzzystrmatch--1.1.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'fuzzystrmatch',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'fuzzystrmatch'
+ ],
+}
diff --git a/contrib/hstore/meson.build b/contrib/hstore/meson.build
new file mode 100644
index 00000000000..661e61f9692
--- /dev/null
+++ b/contrib/hstore/meson.build
@@ -0,0 +1,36 @@
+# .. so that includes of hstore/hstore.h work
+hstore_inc = include_directories('.', '../')
+
+hstore = shared_module('hstore',
+ files(
+ 'hstore_compat.c',
+ 'hstore_gin.c',
+ 'hstore_gist.c',
+ 'hstore_io.c',
+ 'hstore_op.c',
+ 'hstore_subs.c',
+ ),
+ c_pch: '../../src/include/pch/c_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'hstore.control',
+ 'hstore--1.1--1.2.sql',
+ 'hstore--1.3--1.4.sql',
+ 'hstore--1.4.sql',
+ 'hstore--1.4--1.5.sql',
+ 'hstore--1.5--1.6.sql',
+ 'hstore--1.6--1.7.sql',
+ 'hstore--1.7--1.8.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'hstore',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'hstore'
+ ],
+}
diff --git a/contrib/hstore_plperl/meson.build b/contrib/hstore_plperl/meson.build
new file mode 100644
index 00000000000..48231cb1c9e
--- /dev/null
+++ b/contrib/hstore_plperl/meson.build
@@ -0,0 +1,38 @@
+if not perl_dep.found()
+ subdir_done()
+endif
+
+hstore_plperl_sources = files(
+ 'hstore_plperl.c',
+)
+
+hstore_plperl = shared_module('hstore_plperl',
+ hstore_plperl_sources,
+ include_directories: [plperl_inc, hstore_inc],
+ kwargs: pg_mod_args + {
+ 'dependencies': [perl_dep, contrib_mod_args['dependencies']],
+ },
+)
+
+install_data(
+ 'hstore_plperl.control',
+ 'hstore_plperl--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+install_data(
+ 'hstore_plperlu.control',
+ 'hstore_plperlu--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'hstore_plperl',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'hstore_plperl',
+ 'hstore_plperlu',
+ 'create_transform',
+ ],
+}
diff --git a/contrib/hstore_plpython/expected/meson.build b/contrib/hstore_plpython/expected/meson.build
new file mode 100644
index 00000000000..5a33c3752ef
--- /dev/null
+++ b/contrib/hstore_plpython/expected/meson.build
@@ -0,0 +1,15 @@
+foreach r2 : hstore_plpython_regress
+ # string.replace is only in meson 0.58
+ r3 = 'plpython3' + r2.split('plpython')[1]
+
+ s2 = '@0@.out'.format(r2)
+ s3 = '@0@.out'.format(r3)
+ hstore_plpython3_deps += custom_target(s3,
+ input: '@0@.out'.format(r2),
+ output: '@0@.out'.format(r3),
+ capture: true,
+ command: plpython_regress_cmd,
+ build_by_default: false,
+ install: false,
+ )
+endforeach
diff --git a/contrib/hstore_plpython/meson.build b/contrib/hstore_plpython/meson.build
new file mode 100644
index 00000000000..3c47a71e430
--- /dev/null
+++ b/contrib/hstore_plpython/meson.build
@@ -0,0 +1,44 @@
+if not python3.found()
+ subdir_done()
+endif
+
+hstore_plpython_sources = files(
+ 'hstore_plpython.c',
+)
+
+hstore_plpython = shared_module('hstore_plpython3',
+ hstore_plpython_sources,
+ include_directories: [plpython_inc, hstore_inc, ],
+ kwargs: pg_mod_args + {
+ 'c_args': ['-DPLPYTHON_LIBNAME="plpython3"'] + contrib_mod_args['c_args'],
+ 'dependencies': [python3, contrib_mod_args['dependencies']],
+ },
+)
+
+install_data(
+ 'hstore_plpythonu.control',
+ 'hstore_plpython2u.control',
+ 'hstore_plpython3u.control',
+ 'hstore_plpythonu--1.0.sql',
+ 'hstore_plpython2u--1.0.sql',
+ 'hstore_plpython3u--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+hstore_plpython_regress = ['hstore_plpython']
+
+hstore_plpython3_regress = []
+hstore_plpython3_deps = []
+
+# FIXME: this is an abysmal hack
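+# (sql/ and expected/ generate plpython3-flavored copies of the test and
+# expected files via custom_target; plpython_regress_cmd, defined elsewhere,
+# presumably does the plpython2 -> plpython3 conversion)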
+subdir('sql')
+subdir('expected')
+
+regress_tests += {
+ 'name': 'hstore_plpython',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': hstore_plpython3_regress,
+ 'deps': hstore_plpython3_deps,
+ 'regress_args': ['--inputdir', meson.current_build_dir(), '--load-extension=hstore'],
+}
diff --git a/contrib/hstore_plpython/sql/meson.build b/contrib/hstore_plpython/sql/meson.build
new file mode 100644
index 00000000000..612980fd37c
--- /dev/null
+++ b/contrib/hstore_plpython/sql/meson.build
@@ -0,0 +1,17 @@
+# Convert plpython2 regression tests to plpython3 ones
+foreach r2 : hstore_plpython_regress
+ # string.replace is only in meson 0.58
+ r3 = 'plpython3' + r2.split('plpython')[1]
+ hstore_plpython3_regress += r3
+
+ s2 = '@0@.sql'.format(r2)
+ s3 = '@0@.sql'.format(r3)
+ hstore_plpython3_deps += custom_target(s3,
+ input: '@0@.sql'.format(r2),
+ output: '@0@.sql'.format(r3),
+ capture: true,
+ command: plpython_regress_cmd,
+ build_by_default: false,
+ install: false,
+ )
+endforeach
diff --git a/contrib/jsonb_plperl/meson.build b/contrib/jsonb_plperl/meson.build
new file mode 100644
index 00000000000..c34090e5f5c
--- /dev/null
+++ b/contrib/jsonb_plperl/meson.build
@@ -0,0 +1,37 @@
+if not perl_dep.found()
+ subdir_done()
+endif
+
+jsonb_plperl_sources = files(
+ 'jsonb_plperl.c',
+)
+
+jsonb_plperl = shared_module('jsonb_plperl',
+ jsonb_plperl_sources,
+ include_directories: [plperl_inc],
+ kwargs: pg_mod_args + {
+ 'dependencies': [perl_dep, contrib_mod_args['dependencies']],
+ },
+)
+
+install_data(
+ 'jsonb_plperl.control',
+ 'jsonb_plperl--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+install_data(
+ 'jsonb_plperlu.control',
+ 'jsonb_plperlu--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'jsonb_plperl',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'jsonb_plperl',
+ 'jsonb_plperlu',
+ ],
+}
diff --git a/contrib/jsonb_plpython/expected/meson.build b/contrib/jsonb_plpython/expected/meson.build
new file mode 100644
index 00000000000..3840bdd92e5
--- /dev/null
+++ b/contrib/jsonb_plpython/expected/meson.build
@@ -0,0 +1,15 @@
+foreach r2 : jsonb_plpython_regress
+ # string.replace is only in meson 0.58
+ r3 = 'plpython3' + r2.split('plpython')[1]
+
+ s2 = '@0@.out'.format(r2)
+ s3 = '@0@.out'.format(r3)
+ jsonb_plpython3_deps += custom_target(s3,
+ input: '@0@.out'.format(r2),
+ output: '@0@.out'.format(r3),
+ capture: true,
+ command: plpython_regress_cmd,
+ build_by_default: false,
+ install: false,
+ )
+endforeach
diff --git a/contrib/jsonb_plpython/meson.build b/contrib/jsonb_plpython/meson.build
new file mode 100644
index 00000000000..21abebf41b7
--- /dev/null
+++ b/contrib/jsonb_plpython/meson.build
@@ -0,0 +1,44 @@
+if not python3.found()
+ subdir_done()
+endif
+
+jsonb_plpython_sources = files(
+ 'jsonb_plpython.c',
+)
+
+jsonb_plpython = shared_module('jsonb_plpython3',
+ jsonb_plpython_sources,
+ include_directories: [plpython_inc],
+ kwargs: pg_mod_args + {
+ 'c_args': ['-DPLPYTHON_LIBNAME="plpython3"'] + contrib_mod_args['c_args'],
+ 'dependencies': [python3, contrib_mod_args['dependencies']],
+ },
+)
+
+install_data(
+ 'jsonb_plpythonu.control',
+ 'jsonb_plpython2u.control',
+ 'jsonb_plpython3u.control',
+ 'jsonb_plpythonu--1.0.sql',
+ 'jsonb_plpython2u--1.0.sql',
+ 'jsonb_plpython3u--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+jsonb_plpython_regress = ['jsonb_plpython']
+
+jsonb_plpython3_regress = []
+jsonb_plpython3_deps = []
+
+# FIXME: this is an abysmal hack
+subdir('sql')
+subdir('expected')
+
+regress_tests += {
+ 'name': 'jsonb_plpython',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': jsonb_plpython3_regress,
+ 'deps': jsonb_plpython3_deps,
+ 'regress_args': ['--inputdir', meson.current_build_dir()],
+}
diff --git a/contrib/jsonb_plpython/sql/meson.build b/contrib/jsonb_plpython/sql/meson.build
new file mode 100644
index 00000000000..4b29029192a
--- /dev/null
+++ b/contrib/jsonb_plpython/sql/meson.build
@@ -0,0 +1,17 @@
+# Convert plpython2 regression tests to plpython3 ones
+foreach r2 : jsonb_plpython_regress
+ # string.replace is only in meson 0.58
+ r3 = 'plpython3' + r2.split('plpython')[1]
+ jsonb_plpython3_regress += r3
+
+ s2 = '@0@.sql'.format(r2)
+ s3 = '@0@.sql'.format(r3)
+ jsonb_plpython3_deps += custom_target(s3,
+ input: '@0@.sql'.format(r2),
+ output: '@0@.sql'.format(r3),
+ capture: true,
+ command: plpython_regress_cmd,
+ build_by_default: false,
+ install: false,
+ )
+endforeach
diff --git a/contrib/meson.build b/contrib/meson.build
new file mode 100644
index 00000000000..0d5f7315ebb
--- /dev/null
+++ b/contrib/meson.build
@@ -0,0 +1,63 @@
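+# Shared build infrastructure for contrib: modules are built with the same
+# arguments as other backend loadable modules, and their control/SQL files
+# install into <datadir>/extension.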
+contrib_mod_args = pg_mod_args
+
+contrib_data_dir = get_option('datadir') / 'extension'
+contrib_data_args = {
+ 'install_dir': contrib_data_dir
+}
+
+subdir('adminpack')
+subdir('amcheck')
+subdir('auth_delay')
+subdir('auto_explain')
+subdir('bloom')
+subdir('bool_plperl')
+subdir('btree_gin')
+subdir('btree_gist')
+subdir('citext')
+subdir('cube')
+subdir('dblink')
+subdir('dict_int')
+subdir('dict_xsyn')
+subdir('earthdistance')
+subdir('file_fdw')
+subdir('fuzzystrmatch')
+subdir('hstore')
+subdir('hstore_plperl')
+subdir('hstore_plpython')
+# TODO: intagg
+# TODO: intarray
+# TODO: isn
+subdir('jsonb_plperl')
+subdir('jsonb_plpython')
+# TODO: lo
+# TODO: ltree
+# TODO: ltree_plpython
+subdir('oid2name')
+# TODO: old_snapshot
+subdir('pageinspect')
+# TODO: passwordcheck
+# TODO: pg_buffercache
+# TODO: pgcrypto
+# TODO: pg_freespacemap
+subdir('pg_prewarm')
+# TODO: pgrowlocks
+subdir('pg_stat_statements')
+# TODO: pgstattuple
+# TODO: pg_surgery
+subdir('pg_trgm')
+subdir('pg_visibility')
+subdir('postgres_fdw')
+# TODO: seg
+# TODO: sepgsql
+subdir('spi')
+# TODO: sslinfo
+# TODO: start-scripts
+# TODO: tablefunc
+# TODO: tcn
+subdir('test_decoding')
+subdir('tsm_system_rows')
+subdir('tsm_system_time')
+subdir('unaccent')
+# TODO: uuid-ossp
+subdir('vacuumlo')
+subdir('xml2')
diff --git a/contrib/oid2name/meson.build b/contrib/oid2name/meson.build
new file mode 100644
index 00000000000..bee34d2137c
--- /dev/null
+++ b/contrib/oid2name/meson.build
@@ -0,0 +1,14 @@
+executable('oid2name',
+ ['oid2name.c'],
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name' : 'oid2name',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests' :[
+ 't/001_basic.pl',
+ ]
+}
diff --git a/contrib/pageinspect/meson.build b/contrib/pageinspect/meson.build
new file mode 100644
index 00000000000..4bd5b1784e0
--- /dev/null
+++ b/contrib/pageinspect/meson.build
@@ -0,0 +1,45 @@
+pageinspect = shared_module('pageinspect',
+ files(
+ 'brinfuncs.c',
+ 'btreefuncs.c',
+ 'fsmfuncs.c',
+ 'ginfuncs.c',
+ 'gistfuncs.c',
+ 'hashfuncs.c',
+ 'heapfuncs.c',
+ 'rawpage.c',
+ ),
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'pageinspect--1.0--1.1.sql',
+ 'pageinspect--1.1--1.2.sql',
+ 'pageinspect--1.2--1.3.sql',
+ 'pageinspect--1.3--1.4.sql',
+ 'pageinspect--1.4--1.5.sql',
+ 'pageinspect--1.5--1.6.sql',
+ 'pageinspect--1.5.sql',
+ 'pageinspect--1.6--1.7.sql',
+ 'pageinspect--1.7--1.8.sql',
+ 'pageinspect--1.8--1.9.sql',
+ 'pageinspect--1.9--1.10.sql',
+ 'pageinspect.control',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'pageinspect',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'page',
+ 'btree',
+ 'brin',
+ 'gin',
+ 'gist',
+ 'hash',
+ 'checksum',
+ 'oldextversions',
+ ],
+}
diff --git a/contrib/pg_prewarm/meson.build b/contrib/pg_prewarm/meson.build
new file mode 100644
index 00000000000..c93ccc2db6d
--- /dev/null
+++ b/contrib/pg_prewarm/meson.build
@@ -0,0 +1,16 @@
+pg_prewarm = shared_module('pg_prewarm',
+ files(
+ 'autoprewarm.c',
+ 'pg_prewarm.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'pg_prewarm--1.0--1.1.sql',
+ 'pg_prewarm--1.1--1.2.sql',
+ 'pg_prewarm--1.1.sql',
+ 'pg_prewarm.control',
+ kwargs: contrib_data_args,
+)
diff --git a/contrib/pg_stat_statements/meson.build b/contrib/pg_stat_statements/meson.build
new file mode 100644
index 00000000000..6ed70ac0f18
--- /dev/null
+++ b/contrib/pg_stat_statements/meson.build
@@ -0,0 +1,31 @@
+pg_stat_statements = shared_module('pg_stat_statements',
+ files('pg_stat_statements.c'),
+ kwargs: contrib_mod_args + {
+ 'dependencies': contrib_mod_args['dependencies'],
+ },
+)
+
+install_data(
+ 'pg_stat_statements.control',
+ 'pg_stat_statements--1.4.sql',
+ 'pg_stat_statements--1.8--1.9.sql',
+ 'pg_stat_statements--1.7--1.8.sql',
+ 'pg_stat_statements--1.6--1.7.sql',
+ 'pg_stat_statements--1.5--1.6.sql',
+ 'pg_stat_statements--1.4--1.5.sql',
+ 'pg_stat_statements--1.3--1.4.sql',
+ 'pg_stat_statements--1.2--1.3.sql',
+ 'pg_stat_statements--1.1--1.2.sql',
+ 'pg_stat_statements--1.0--1.1.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'pg_stat_statements',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'pg_stat_statements'
+ ],
+ 'regress_args': ['--temp-config', files('pg_stat_statements.conf')],
+}
diff --git a/contrib/pg_trgm/meson.build b/contrib/pg_trgm/meson.build
new file mode 100644
index 00000000000..0a56926ad6b
--- /dev/null
+++ b/contrib/pg_trgm/meson.build
@@ -0,0 +1,33 @@
+pg_trgm = shared_module('pg_trgm',
+ files(
+ 'trgm_gin.c',
+ 'trgm_gist.c',
+ 'trgm_op.c',
+ 'trgm_regexp.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'pg_trgm--1.0--1.1.sql',
+ 'pg_trgm--1.1--1.2.sql',
+ 'pg_trgm--1.2--1.3.sql',
+ 'pg_trgm--1.3--1.4.sql',
+ 'pg_trgm--1.3.sql',
+ 'pg_trgm--1.4--1.5.sql',
+ 'pg_trgm--1.5--1.6.sql',
+ 'pg_trgm.control',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'pg_trgm',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'pg_trgm',
+ 'pg_word_trgm',
+ 'pg_strict_word_trgm',
+ ],
+}
diff --git a/contrib/pg_visibility/meson.build b/contrib/pg_visibility/meson.build
new file mode 100644
index 00000000000..68a7e1cf28c
--- /dev/null
+++ b/contrib/pg_visibility/meson.build
@@ -0,0 +1,25 @@
+pg_visibility = shared_module('pg_visibility',
+ files(
+ 'pg_visibility.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'pg_visibility--1.0--1.1.sql',
+ 'pg_visibility--1.1.sql',
+ 'pg_visibility--1.1--1.2.sql',
+ 'pg_visibility.control',
+ kwargs: contrib_data_args,
+)
+
+
+regress_tests += {
+ 'name': 'pg_visibility',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'pg_visibility'
+ ],
+}
diff --git a/contrib/postgres_fdw/meson.build b/contrib/postgres_fdw/meson.build
new file mode 100644
index 00000000000..507d01448b1
--- /dev/null
+++ b/contrib/postgres_fdw/meson.build
@@ -0,0 +1,31 @@
+postgres_fdw_sources = files(
+ 'connection.c',
+ 'deparse.c',
+ 'option.c',
+ 'postgres_fdw.c',
+ 'shippable.c',
+)
+
+postgres_fdw = shared_module('postgres_fdw',
+ postgres_fdw_sources,
+ kwargs: contrib_mod_args + {
+ 'dependencies': pg_mod_args['dependencies'] + [libpq],
+ },
+)
+
+install_data(
+ 'postgres_fdw.control',
+ 'postgres_fdw--1.0.sql',
+ 'postgres_fdw--1.0--1.1.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'postgres_fdw',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'postgres_fdw'
+ ],
+ 'regress_args': ['--dlpath', meson.build_root() / 'src/test/regress'],
+}
diff --git a/contrib/spi/meson.build b/contrib/spi/meson.build
new file mode 100644
index 00000000000..51bc96ea657
--- /dev/null
+++ b/contrib/spi/meson.build
@@ -0,0 +1,43 @@
+autoinc = shared_module('autoinc',
+ ['autoinc.c'],
+ kwargs: contrib_mod_args,
+)
+
+install_data('autoinc.control', 'autoinc--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+
+insert_username = shared_module('insert_username',
+ ['insert_username.c'],
+ kwargs: contrib_mod_args,
+)
+
+install_data('insert_username.control', 'insert_username--1.0.sql',
+ install_dir: get_option('datadir') / 'extension'
+)
+
+
+moddatetime = shared_module('moddatetime',
+ ['moddatetime.c'],
+ kwargs: contrib_mod_args,
+)
+
+install_data('moddatetime.control', 'moddatetime--1.0.sql',
+ install_dir: get_option('datadir') / 'extension'
+)
+
+# this is needed for the regression tests;
+# comment out if you want a quieter refint package for other uses
+refint_cflags = ['-DREFINT_VERBOSE']
+
+refint = shared_module('refint',
+ ['refint.c'],
+ kwargs: contrib_mod_args + {
+ 'c_args': refint_cflags + contrib_mod_args['c_args'],
+ },
+)
+
+install_data('refint.control', 'refint--1.0.sql',
+ kwargs: contrib_data_args,
+)
diff --git a/contrib/test_decoding/meson.build b/contrib/test_decoding/meson.build
new file mode 100644
index 00000000000..d26b43cbe79
--- /dev/null
+++ b/contrib/test_decoding/meson.build
@@ -0,0 +1,69 @@
+test_decoding_sources = files(
+ 'test_decoding.c',
+)
+
+test_decoding = shared_module('test_decoding',
+ test_decoding_sources,
+ kwargs: contrib_mod_args,
+)
+
+
+regress_tests += {
+ 'name': 'test_decoding',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'ddl',
+ 'xact',
+ 'rewrite',
+ 'toast',
+ 'permissions',
+ 'decoding_in_xact',
+ 'decoding_into_rel',
+ 'binary',
+ 'prepared',
+ 'replorigin',
+ 'time',
+ 'messages',
+ 'spill',
+ 'slot',
+ 'truncate',
+ 'stream',
+ 'stats',
+ 'twophase',
+ 'twophase_stream',
+ ],
+ 'regress_args': [
+ '--temp-config', files('logical.conf')
+ ]
+}
+
+isolation_tests += {
+ 'name': 'test_decoding',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'specs': [
+ 'mxact',
+ 'delayed_startup',
+ 'ondisk_startup',
+ 'concurrent_ddl_dml',
+ 'oldest_xmin',
+ 'snapshot_transfer',
+ 'subxact_without_top',
+ 'concurrent_stream',
+ 'twophase_snapshot',
+ ],
+ 'regress_args': [
+ '--temp-config', files('logical.conf')
+ ]
+}
+
+
+tap_tests += {
+ 'name': 'test_decoding',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_repl_stats.pl',
+ ],
+}
diff --git a/contrib/tsm_system_rows/meson.build b/contrib/tsm_system_rows/meson.build
new file mode 100644
index 00000000000..2c8f4487f8d
--- /dev/null
+++ b/contrib/tsm_system_rows/meson.build
@@ -0,0 +1,22 @@
+tsm_system_rows = shared_module('tsm_system_rows',
+ files(
+ 'tsm_system_rows.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'tsm_system_rows--1.0.sql',
+ 'tsm_system_rows.control',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'tsm_system_rows',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'tsm_system_rows',
+ ],
+}
diff --git a/contrib/tsm_system_time/meson.build b/contrib/tsm_system_time/meson.build
new file mode 100644
index 00000000000..df9c4aa4b51
--- /dev/null
+++ b/contrib/tsm_system_time/meson.build
@@ -0,0 +1,22 @@
+tsm_system_time = shared_module('tsm_system_time',
+ files(
+ 'tsm_system_time.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'tsm_system_time--1.0.sql',
+ 'tsm_system_time.control',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'tsm_system_time',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'tsm_system_time',
+ ],
+}
diff --git a/contrib/unaccent/meson.build b/contrib/unaccent/meson.build
new file mode 100644
index 00000000000..e77bf790d8c
--- /dev/null
+++ b/contrib/unaccent/meson.build
@@ -0,0 +1,30 @@
+unaccent = shared_module('unaccent',
+ files(
+ 'unaccent.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'unaccent--1.0--1.1.sql',
+ 'unaccent--1.1.sql',
+ 'unaccent.control',
+ kwargs: contrib_data_args,
+)
+
+install_data(
+ 'unaccent.rules',
+ install_dir: get_option('datadir') / 'tsearch_data'
+)
+
+# XXX: Implement downlo
+regress_tests += {
+ 'name': 'unaccent',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'unaccent',
+ ],
+ 'regress_args': ['--encoding=UTF8'],
+}
diff --git a/contrib/vacuumlo/meson.build b/contrib/vacuumlo/meson.build
new file mode 100644
index 00000000000..99e76daacf9
--- /dev/null
+++ b/contrib/vacuumlo/meson.build
@@ -0,0 +1,14 @@
+executable('vacuumlo',
+ ['vacuumlo.c'],
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name' : 'vacuumlo',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests' :[
+ 't/001_basic.pl',
+ ]
+}
diff --git a/contrib/xml2/meson.build b/contrib/xml2/meson.build
new file mode 100644
index 00000000000..6f8a26e4f0a
--- /dev/null
+++ b/contrib/xml2/meson.build
@@ -0,0 +1,30 @@
+if not libxml.found()
+ subdir_done()
+endif
+
+xml2 = shared_module('pgxml',
+ files(
+ 'xpath.c',
+ 'xslt_proc.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args + {
+ 'dependencies': [libxml, libxslt, contrib_mod_args['dependencies']],
+ },
+)
+
+install_data(
+ 'xml2--1.0--1.1.sql',
+ 'xml2--1.1.sql',
+ 'xml2.control',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'xml2',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'xml2',
+ ],
+}
diff --git a/conversion_helpers.txt b/conversion_helpers.txt
new file mode 100644
index 00000000000..e5879b4fe77
--- /dev/null
+++ b/conversion_helpers.txt
@@ -0,0 +1,6 @@
+convert list of files to quoted-one-per-line:
+
+ ?\b\(\(?:\w\|\d\|_\|-\)+\)\.o ?\(?:\\
+\)? → '\1.c',
+
+
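+hypothetical example: run as an Emacs replace-regexp over an OBJS line such as
+
+  btree_bit.o btree_bytea.o \
+  btree_cash.o
+
+which should yield
+
+  'btree_bit.c',
+  'btree_bytea.c',
+  'btree_cash.c',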
diff --git a/meson.build b/meson.build
new file mode 100644
index 00000000000..c9d123bdac3
--- /dev/null
+++ b/meson.build
@@ -0,0 +1,1901 @@
+project('postgresql',
+ ['c'],
+ version: '15devel',
+ license: 'PostgreSQL',
+ meson_version: '>=0.54',
+ default_options: [
+ 'warning_level=2',
+ 'b_pie=true',
+ 'b_pch=false',
+ 'buildtype=release',
+ ]
+)
+
+
+
+###############################################################
+# Basic prep
+###############################################################
+
+fs = import('fs')
+
+thread_dep = dependency('threads')
+
+
+
+###############################################################
+# Version and other metadata
+###############################################################
+
+pg_version = meson.project_version()
+
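+# Split the version into major/minor, e.g. '15devel' -> 15/0,
+# '15beta1' -> 15/1, '14.2' -> 14/2.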
+if pg_version.endswith('devel')
+ pg_version_arr = [pg_version.split('devel')[0], '0']
+elif pg_version.contains('beta')
+ pg_version_arr = pg_version.split('beta')
+elif pg_version.contains('rc')
+ pg_version_arr = pg_version.split('rc')
+else
+ pg_version_arr = pg_version.split('.')
+endif
+
+pg_version_major = pg_version_arr[0].to_int()
+pg_version_minor = pg_version_arr[1].to_int()
+
+cc = meson.get_compiler('c')
+
+cdata = configuration_data()
+
+
+cdata.set_quoted('PACKAGE_NAME', 'PostgreSQL')
+cdata.set_quoted('PACKAGE_BUGREPORT', 'pgsql-bugs@lists.postgresql.org')
+cdata.set_quoted('PACKAGE_URL', 'https://www.postgresql.org/')
+
+cdata.set_quoted('PG_VERSION', pg_version)
+cdata.set_quoted('PG_VERSION_STR', 'PostgreSQL @0@ on @1@, compiled by @2@-@3@'.format(
+ pg_version, target_machine.cpu_family(), cc.get_id(), cc.version()))
+cdata.set_quoted('PG_MAJORVERSION', pg_version_major.to_string())
+cdata.set('PG_MAJORVERSION_NUM', pg_version_major)
+cdata.set('PG_VERSION_NUM', (pg_version_major*10000)+pg_version_minor)
+cdata.set_quoted('CONFIGURE_ARGS', '')
+
+
+
+###############################################################
+# Search paths
+#
+# NB: Arguments added globally (via the below, or CFLAGS etc) are not taken
+# into account for configuration-time checks (so they are more
+# isolated). Flags that have to be taken into account for configure checks
+# have to be explicitly specified in configure tests.
+###############################################################
+
+g_inc = []
+g_c_args = []
+g_l_args = []
+
+if host_machine.system() == 'darwin'
+ # XXX, should this be required?
+ xcrun = find_program('xcrun', native: true, required: true)
+
+ sysroot = run_command(xcrun, '--show-sdk-path', check: true).stdout().strip()
+ message('sysroot is >@0@<'.format(sysroot))
+
+ g_c_args += ['-isysroot', sysroot]
+ g_l_args += ['-isysroot', sysroot]
+endif
+
+if host_machine.system() == 'linux' or host_machine.system() == 'cygwin'
+ g_c_args += '-D_GNU_SOURCE'
+endif
+
+g_c_inc = []
+
+g_c_inc += include_directories(get_option('extra_include_dirs'))
+g_c_lib = get_option('extra_lib_dirs')
+
+add_project_arguments(g_c_args, language: ['c', 'cpp'])
+add_project_link_arguments(g_l_args, language: ['c', 'cpp'])
+
+
+
+###############################################################
+# Program paths
+###############################################################
+
+# External programs
+perl = find_program(get_option('PERL'), required: true)
+flex = find_program(get_option('FLEX'), native: true)
+bison = find_program(get_option('BISON'), native: true, version: '>= 1.875')
+sed = find_program(get_option('SED'), 'sed', native: true)
+prove = find_program(get_option('PROVE'))
+tar = find_program(get_option('TAR'), native: true)
+gzip = find_program(get_option('GZIP'), native: true)
+touch = find_program('touch', native: true)
+
+# Internal programs
+find_meson = find_program('src/tools/find_meson', native: true)
+testwrap = find_program('src/tools/testwrap', native: true)
+
+bisonflags = []
+if bison.found()
+ bison_version_c = run_command(bison, '--version', check: true)
+ # bison version string helpfully is something like
+ # >>bison (GNU bison) 3.8.1<<
+ bison_version = bison_version_c.stdout().split(' ')[3].split('\n')[0]
+ if bison_version.version_compare('>=3.0')
+ bisonflags += ['-Wno-deprecated']
+ endif
+endif
+
+
+
+###############################################################
+# Path to meson (for tests etc)
+###############################################################
+
+# FIXME: this should really be part of meson, see
+# https://github.com/mesonbuild/meson/issues/8511
+meson_binpath_r = run_command(find_meson)
+
+if meson_binpath_r.returncode() != 0 or meson_binpath_r.stdout() == ''
+ error('huh, could not run find_meson.\nerrcode: @0@\nstdout: @1@\nstderr: @2@'.format(
+ meson_binpath_r.returncode(),
+ meson_binpath_r.stdout(),
+ meson_binpath_r.stderr()))
+endif
+
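+# find_meson prints one element per line: the first line is taken as the
+# binary (or interpreter) path, any remaining lines as arguments to pass along.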
+meson_binpath_s = meson_binpath_r.stdout().split('\n')
+meson_binpath_len = meson_binpath_s.length()
+
+if meson_binpath_len < 1
+ error('unexpected introspect line @0@'.format(meson_binpath_r.stdout()))
+endif
+
+i = 0
+meson_binpath = ''
+meson_args = []
+foreach e : meson_binpath_s
+ if i == 0
+ meson_binpath = e
+ else
+ meson_args += e
+ endif
+ i += 1
+endforeach
+
+meson_bin = find_program(meson_binpath, native: true)
+
+
+
+###############################################################
+# Option Handling
+###############################################################
+
+cdata.set('USE_ASSERT_CHECKING', get_option('cassert'))
+
+cdata.set('BLCKSZ', 8192, description: '''
+ Size of a disk block --- this also limits the size of a tuple. You
+ can set it bigger if you need bigger tuples (although TOAST should
+ reduce the need to have large tuples, since fields can be spread
+ across multiple tuples).
+
+ BLCKSZ must be a power of 2. The maximum possible value of BLCKSZ
+ is currently 2^15 (32768). This is determined by the 15-bit widths
+ of the lp_off and lp_len fields in ItemIdData (see
+ include/storage/itemid.h).
+
+ Changing BLCKSZ requires an initdb.
+''')
+
+cdata.set('XLOG_BLCKSZ', 8192)
+cdata.set('RELSEG_SIZE', 131072)
+cdata.set('DEF_PGPORT', 5432)
+cdata.set_quoted('DEF_PGPORT_STR', '5432')
+cdata.set_quoted('PG_KRB_SRVNAM', 'postgres')
+
+
+
+###############################################################
+# Library: GSSAPI
+###############################################################
+
+gssapiopt = get_option('gssapi')
+if not gssapiopt.disabled()
+ gssapi = dependency('krb5-gssapi', required: gssapiopt)
+
+ if gssapi.found() and \
+ cc.check_header('gssapi/gssapi.h', args: g_c_args, dependencies: gssapi, required: gssapiopt)
+
+ if not cc.has_function('gss_init_sec_context', args: g_c_args, dependencies: gssapi)
+ error('''could not find function 'gss_init_sec_context' required for GSSAPI''')
+ endif
+ cdata.set('ENABLE_GSS', 1)
+ endif
+
+else
+ gssapi = dependency('', required : false)
+endif
+
+
+
+###############################################################
+# Library: ldap
+###############################################################
+
+ldapopt = get_option('ldap')
+if not ldapopt.disabled()
+
+ if host_machine.system() == 'windows'
+ ldap = cc.find_library('wldap32')
+ ldap_r = ldap
+ else
+ ldap = dependency('ldap', required: false)
+
+ # Before 2.5 openldap didn't have a pkg-config file..
+ if ldap.found()
+ ldap_r = ldap
+ else
+ ldap = cc.find_library('ldap', required: ldapopt)
+ ldap_r = cc.find_library('ldap_r', required: ldapopt)
+
+ # Use ldap_r for FE if available, else assume ldap is thread-safe.
+ # On some platforms ldap_r fails to link without PTHREAD_LIBS.
+ if ldap.found() and not ldap_r.found()
+ ldap_r = ldap
+ endif
+ endif
+
+ if ldap.found() and cc.has_function('ldap_initialize', args: g_c_args, dependencies: [ldap, thread_dep])
+ cdata.set('HAVE_LDAP_INITIALIZE', 1)
+ endif
+ endif
+
+ if ldap.found()
+ cdata.set('USE_LDAP', 1)
+ endif
+
+else
+ ldap_r = ldap = dependency('', required : false)
+endif
+
+
+
+###############################################################
+# Library: LLVM
+###############################################################
+
+llvmopt = get_option('llvm')
+if not llvmopt.disabled()
+ llvm = dependency('llvm', version : '>=3.9', method: 'config-tool', required: llvmopt)
+
+ if llvm.found()
+
+ cdata.set('USE_LLVM', 1)
+
+ add_languages('cpp', required : true, native: false)
+ cpp = meson.get_compiler('cpp')
+
+ llvm_binpath = llvm.get_variable(configtool: 'bindir')
+
+ ccache = find_program('ccache', required: false)
+ clang = find_program(llvm_binpath / 'clang', required: true)
+ llvm_lto = find_program(llvm_binpath / 'llvm-lto', required: true)
+
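+    # The irgen/irlink commands emit and link LLVM bitcode (.bc) for backend
+    # sources so the JIT can inline functions at runtime, mirroring the
+    # bitcode emission of the make build (assumption: the kwargs below are
+    # consumed by custom_target rules defined elsewhere in the tree).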
+ # FIXME: the includes hardcoded here suck
+ llvm_irgen_args = [
+ '-c', '-o', '@OUTPUT@', '@INPUT@',
+ '-flto=thin', '-emit-llvm',
+ '-MD', '-MQ', '@OUTPUT@', '-MF', '@DEPFILE@',
+ '-I', '@SOURCE_ROOT@/src/include',
+ '-I', '@BUILD_ROOT@/src/include',
+ '-I', '@BUILD_ROOT@/src/backend/utils/misc',
+ '-I', '@CURRENT_SOURCE_DIR@',
+ '-O2',
+ '-Wno-ignored-attributes',
+ '-Wno-empty-body',
+ ]
+
+ if ccache.found()
+ llvm_irgen_command = ccache
+ llvm_irgen_args = [clang.path()] + llvm_irgen_args
+ else
+ llvm_irgen_command = clang
+ endif
+
+ llvm_irgen_kw = {
+ 'command': [llvm_irgen_command] + llvm_irgen_args,
+ 'depfile': '@BASENAME@.c.bc.d',
+ }
+
+ irlink = find_program('src/tools/irlink', native: true)
+
+ llvm_irlink_kw = {
+ 'command':[
+ irlink,
+ '@SOURCE_ROOT@',
+ '@BUILD_ROOT@',
+ llvm_lto,
+ '-o', '@OUTPUT0@',
+ '@PRIVATE_DIR@',
+ '@INPUT@',
+ ],
+ 'install': true,
+ 'install_dir': get_option('libdir'),
+ }
+
+ endif
+else
+ llvm = dependency('', required: false)
+endif
+
+
+
+###############################################################
+# Library: icu
+###############################################################
+
+if not get_option('icu').disabled()
+ icu = dependency('icu-uc', required: get_option('icu').enabled())
+ icu_i18n = dependency('icu-i18n', required: get_option('icu').enabled())
+
+ if icu.found()
+ cdata.set('USE_ICU', 1)
+ endif
+
+else
+ icu = dependency('', required : false)
+ icu_i18n = dependency('', required : false)
+endif
+
+
+
+###############################################################
+# Library: libxml
+###############################################################
+
+libxmlopt = get_option('libxml')
+if not libxmlopt.disabled()
+ libxml = dependency('libxml-2.0', required: libxmlopt, version: '>= 2.6.23')
+
+ if libxml.found()
+ cdata.set('USE_LIBXML', 1)
+ endif
+else
+ libxml = dependency('', required : false)
+endif
+
+
+
+###############################################################
+# Library: libxslt
+###############################################################
+
+libxsltopt = get_option('libxslt')
+if not libxsltopt.disabled()
+ libxslt = dependency('libxslt', required: libxsltopt)
+
+ if libxslt.found()
+ cdata.set('USE_LIBXSLT', 1)
+ endif
+else
+ libxslt = dependency('', required : false)
+endif
+
+
+
+###############################################################
+# Library: lz4
+###############################################################
+
+lz4opt = get_option('lz4')
+if not lz4opt.disabled()
+ lz4 = dependency('liblz4', required: lz4opt)
+
+ if lz4.found()
+ cdata.set('USE_LZ4', 1)
+ cdata.set('HAVE_LIBLZ4', 1)
+ endif
+
+else
+ lz4 = dependency('', required : false)
+endif
+
+
+
+###############################################################
+# Library: Perl (for plperl)
+###############################################################
+
+perlopt = get_option('perl')
+perl_dep = dependency('', required: false)
+
+if perlopt.disabled()
+ perl_may_work = false
+else
+ perl_may_work = true
+
+ perl_conf_cmd = [perl, '-MConfig', '-e', 'print $Config{$ARGV[0]}']
+
+ # FIXME: include copy-edited comments from perl.m4
+
+ perlversion = run_command(perl_conf_cmd, 'api_versionstring', check: true).stdout()
+ archlibexp = run_command(perl_conf_cmd, 'archlibexp', check: true).stdout()
+ privlibexp = run_command(perl_conf_cmd, 'privlibexp', check: true).stdout()
+ useshrplib = run_command(perl_conf_cmd, 'useshrplib', check: true).stdout()
+ libperl = run_command(perl_conf_cmd, 'libperl', check: true).stdout()
+
+ perl_inc = include_directories('@0@/CORE'.format(archlibexp))
+
+ perl_ccflags = []
+
+ if useshrplib != 'true'
+ if perlopt.enabled()
+      error('need a shared perl')
+    else
+      warning('need a shared perl')
+      perl_may_work = false
+ endif
+ endif
+
+ message('perl libperl: @0@'.format(libperl))
+
+ if host_machine.system() == 'darwin'
+ perl_ccflags += ['-iwithsysroot', '@0@/CORE'.format(archlibexp)]
+ endif
+
+ # FIXME macOS may need sysroot muckery
+
+ # XXX: On my system the cc.find_library() is actually enough to get a working
+ # plperl. Would be nice to get rid of the nasty stuff below.
+ if perl_may_work and \
+ cc.has_header('perl.h', args: g_c_args + perl_ccflags,
+ include_directories: perl_inc, required: perlopt)
+ foreach p : ['perl', 'libperl', libperl, libperl.strip('lib'), fs.stem(libperl), fs.stem(libperl).strip('lib')]
+ perl_dep_int = cc.find_library(p,
+ dirs: ['@0@/CORE'.format(archlibexp)],
+ required: false)
+ if perl_dep_int.found()
+ break
+ endif
+ endforeach
+
+ if not perl_dep_int.found()
+ perl_may_work = false
+ endif
+ else
+ perl_may_work = false
+ endif
+
+ if perl_may_work
+ perl_ccflags_r = run_command(perl_conf_cmd, 'ccflags', check: true).stdout()
+ message('CCFLAGS recommended by Perl: @0@'.format(perl_ccflags_r))
+
+ foreach flag : perl_ccflags_r.split(' ')
+ if flag.startswith('-D') and \
+        (not flag.startswith('-D_') or flag == '-D_USE_32BIT_TIME_T')
+ perl_ccflags += flag
+ endif
+ endforeach
+
+ if host_machine.system() == 'windows'
+ perl_ccflags += ['-DPLPERL_HAVE_UID_GID']
+ endif
+
+ message('CCFLAGS for embedding perl: @0@'.format(' '.join(perl_ccflags)))
+
+ # perl.m4 sayeth:
+ #
+ # We are after Embed's ldopts, but without the subset mentioned in
+ # Config's ccdlflags;
+ #
+ # FIXME: andres sayeth: But why?
+
+ ldopts = run_command(perl, '-MExtUtils::Embed', '-e', 'ldopts', check: true).stdout().strip()
+ ccdlflags = run_command(perl_conf_cmd, 'ccdlflags', check: true).stdout().strip()
+
+ ccdlflags_dict = {}
+
+ foreach ccdlflag : ccdlflags.split(' ')
+ ccdlflags_dict += {ccdlflag: 1}
+ endforeach
+
+ perl_ldopts = []
+ foreach ldopt : ldopts.split(' ')
+ if ldopt == ''
+ continue
+ elif ccdlflags_dict.has_key(ldopt)
+ continue
+ # strawberry perl unhelpfully has that in ldopts
+ elif ldopt == '-s'
+ continue
+ endif
+
+ perl_ldopts += ldopt.strip('"')
+ endforeach
+
+ # FIXME: check if windows handling is necessary
+
+ message('LDFLAGS for embedding perl: "@0@" (ccdlflags: "@1@", ldopts: "@2@")'.format(
+ ' '.join(perl_ldopts), ccdlflags, ldopts))
+
+ if perl_dep_int.found()
+ perl_dep = declare_dependency(
+ include_directories: perl_inc,
+ compile_args: perl_ccflags,
+ link_args: perl_ldopts,
+ version: perlversion,
+ )
+ endif
+ endif # perl_may_work
+
+ if perlopt.enabled() and not perl_may_work
+ error('could not find working perl')
+ endif
+endif
+
+
+
+###############################################################
+# Library: Python (for plpython)
+###############################################################
+
+pyopt = get_option('python')
+if not pyopt.disabled()
+ pm = import('python')
+ python3_inst = pm.find_installation(required: pyopt.enabled())
+ python3 = python3_inst.dependency(embed: true, required: pyopt.enabled())
+else
+ python3 = dependency('', required: false)
+endif
+
+
+
+###############################################################
+# Library: Readline
+#
+# FIXME: editline support
+###############################################################
+
+if not get_option('readline').disabled()
+ readline = dependency('readline', required: false)
+ if not readline.found()
+ readline = cc.find_library('readline',
+ required: get_option('readline').enabled())
+ endif
+
+ if readline.found()
+ cdata.set('HAVE_LIBREADLINE', 1)
+
+ if cc.has_header('readline/history.h', args: g_c_args, dependencies: [readline], required: false)
+ history_h = 'readline/history.h'
+ cdata.set('HAVE_READLINE_HISTORY_H', 1)
+ cdata.set('HAVE_READLINE_H', false)
+ elif cc.has_header('history.h', args: g_c_args, dependencies: [readline], required: false)
+ history_h = 'history.h'
+ cdata.set('HAVE_READLINE_HISTORY_H', false)
+ cdata.set('HAVE_HISTORY_H', 1)
+ else
+ error('''readline header not found
+If you have readline already installed, see meson-logs/meson-log.txt for details on the
+failure. It is possible the compiler isn't looking in the proper directory.
+Use -Dreadline=disabled to disable readline support.''')
+ endif
+
+ if cc.has_header('readline/readline.h', args: g_c_args, dependencies: [readline], required: false)
+ readline_h = 'readline/readline.h'
+ cdata.set('HAVE_READLINE_READLINE_H', 1)
+ elif cc.has_header('readline.h', args: g_c_args, dependencies: [readline], required: false)
+ readline_h = 'readline.h'
+ cdata.set('HAVE_READLINE_H', 1)
+ else
+ error('''readline header not found
+If you have readline already installed, see meson-logs/meson-log.txt for details on the
+failure. It is possible the compiler isn't looking in the proper directory.
+Use -Dreadline=disabled to disable readline support.''')
+ endif
+
+ check_funcs = [
+ 'rl_completion_matches',
+ 'rl_filename_completion_function',
+ 'rl_reset_screen_size',
+ 'append_history',
+ 'history_truncate_file',
+ ]
+
+ foreach func : check_funcs
+ cdata.set('HAVE_'+func.to_upper(),
+ cc.has_function(func, args: g_c_args, dependencies: [readline]) ? 1 : false)
+ endforeach
+
+ check_vars = [
+ 'rl_completion_append_character',
+ 'rl_completion_suppress_quote',
+ 'rl_filename_quote_characters',
+ 'rl_filename_quoting_function',
+ ]
+
+ foreach var : check_vars
+ cdata.set('HAVE_'+var.to_upper(),
+ cc.has_header_symbol(readline_h, var, args: g_c_args, dependencies: [readline]) ? 1 : false)
+ endforeach
+ endif
+else
+ readline = dependency('', required : false)
+endif
+
+
+
+###############################################################
+# Library: SSL
+###############################################################
+
+if get_option('ssl') == 'openssl'
+
+  # Try to find openssl via pkg-config et al; if that doesn't work, look for
+  # the library names that we know about.
+
+ # via pkg-config et al
+ ssl = dependency('openssl', required: false)
+
+ # via library + headers
+ if not ssl.found()
+ ssl_lib = cc.find_library('ssl',
+ dirs: g_c_lib,
+ header_include_directories: g_c_inc,
+ has_headers: ['openssl/ssl.h', 'openssl/err.h'])
+ crypto_lib = cc.find_library('crypto',
+ dirs: g_c_lib,
+ header_include_directories: g_c_inc)
+ ssl_int = [ssl_lib, crypto_lib]
+
+ ssl = declare_dependency(dependencies: ssl_int,
+ include_directories: g_c_inc)
+ else
+ cc.has_header('openssl/ssl.h', args: g_c_args, dependencies: ssl, required: true)
+ cc.has_header('openssl/err.h', args: g_c_args, dependencies: ssl, required: true)
+
+ ssl_int = [ssl]
+ endif
+
+ cdata.set_quoted('WITH_SSL', get_option('ssl'))
+
+ check_funcs = [
+ ['CRYPTO_new_ex_data', {'required': true}],
+ ['SSL_new', {'required': true}],
+
+ # Function introduced in OpenSSL 1.0.2.
+ ['X509_get_signature_nid'],
+
+ # Functions introduced in OpenSSL 1.1.0. We used to check for
+ # OPENSSL_VERSION_NUMBER, but that didn't work with 1.1.0, because LibreSSL
+ # defines OPENSSL_VERSION_NUMBER to claim version 2.0.0, even though it
+ # doesn't have these OpenSSL 1.1.0 functions. So check for individual
+ # functions.
+ ['OPENSSL_init_ssl'],
+ ['BIO_get_data'],
+ ['BIO_meth_new'],
+ ['ASN1_STRING_get0_data'],
+ ['HMAC_CTX_new'],
+ ['HMAC_CTX_free'],
+
+ # OpenSSL versions before 1.1.0 required setting callback functions, for
+ # thread-safety. In 1.1.0, it's no longer required, and CRYPTO_lock()
+ # function was removed.
+ ['CRYPTO_lock'],
+ ]
+
+ foreach c : check_funcs
+ func = c.get(0)
+ val = cc.has_function(func, args: g_c_args, dependencies: ssl_int)
+ if not val and c.get(1, {}).get('required', false)
+ error('openssl function @0@ is required'.format(func))
+ endif
+ cdata.set('HAVE_'+func.to_upper(), val ? 1 : false)
+ endforeach
+
+ cdata.set('USE_OPENSSL', 1,
+ description: 'Define to 1 to build with OpenSSL support. (-Dssl=openssl)')
+
+ cdata.set('OPENSSL_API_COMPAT', 0x10001000,
+ description: 'Define to the OpenSSL API version in use. This avoids deprecation warnings from newer OpenSSL versions.')
+else
+ ssl = dependency('', required : false)
+endif
+
+
+
+###############################################################
+# Library: zlib
+###############################################################
+
+zlibopt = get_option('zlib')
+zlib = dependency('', required : false)
+if not zlibopt.disabled()
+ zlib_t = dependency('zlib', required: zlibopt)
+
+ if zlib_t.type_name() == 'internal'
+ # if fallback was used, we don't need to test if headers are present (they
+ # aren't built yet, so we can't test)
+ zlib = zlib_t
+ elif not zlib_t.found()
+ warning('did not find zlib')
+ elif not cc.has_header('zlib.h', args: g_c_args, dependencies: [zlib_t], required: zlibopt.enabled())
+ warning('zlib header not found')
+ elif not cc.has_type('z_streamp', args: g_c_args, dependencies: [zlib_t], prefix: '#include <zlib.h>')
+ if zlibopt.enabled()
+ error('zlib version is too old')
+ else
+ warning('zlib version is too old')
+ endif
+ else
+ zlib = zlib_t
+ endif
+
+ if zlib.found()
+ cdata.set('HAVE_LIBZ', 1)
+ endif
+endif
+
+
+
+###############################################################
+# Compiler tests
+###############################################################
+
+sizeof_long = cc.sizeof('long', args: g_c_args)
+if sizeof_long == 8
+ cdata.set('HAVE_LONG_INT_64', 1)
+ cdata.set('PG_INT64_TYPE', 'long int')
+ cdata.set_quoted('INT64_MODIFIER', 'l')
+elif sizeof_long == 4 and cc.sizeof('long long', args: g_c_args) == 8
+ cdata.set('HAVE_LONG_LONG_INT_64', 1)
+ cdata.set('PG_INT64_TYPE', 'long long int')
+ cdata.set_quoted('INT64_MODIFIER', 'll')
+else
+ error('do not know how to get a 64bit int')
+endif
+
+
+cdata.set('MAXIMUM_ALIGNOF', 8)
+cdata.set('ALIGNOF_SHORT', cc.alignment('short', args: g_c_args))
+cdata.set('ALIGNOF_INT', cc.alignment('int', args: g_c_args))
+cdata.set('ALIGNOF_LONG', cc.alignment('long', args: g_c_args))
+cdata.set('ALIGNOF_DOUBLE', cc.alignment('double', args: g_c_args))
+cdata.set('SIZEOF_VOID_P', cc.sizeof('void *', args: g_c_args))
+
+# Check if the C compiler knows computed gotos (gcc extension, also
+# available in at least clang). If so, define HAVE_COMPUTED_GOTO.
+#
+# Checking whether computed gotos are supported syntax-wise ought to
+# be enough, as the syntax is otherwise illegal.
+if cc.compiles('''
+ static inline int foo(void)
+ {
+ void *labeladdrs[] = {&&my_label};
+ goto *labeladdrs[0];
+ my_label:
+ return 1;
+ }''',
+ args: g_c_args)
+ cdata.set('HAVE_COMPUTED_GOTO', 1)
+endif
+
+
+# XXX: for now just assume that compiler knows __func__ - it's C99 after all.
+cdata.set('HAVE_FUNCNAME__FUNC', 1)
+
+# Check if the C compiler understands _Static_assert(),
+# and define HAVE__STATIC_ASSERT if so.
+#
+# We actually check the syntax ({ _Static_assert(...) }), because we need
+# gcc-style compound expressions to be able to wrap the thing into macros.
+if cc.compiles('''
+ int main(int arg, char **argv)
+ {
+    ({ _Static_assert(1, "foo"); });
+ }
+ ''',
+ args: g_c_args)
+ cdata.set('HAVE__STATIC_ASSERT', 1)
+endif
+
+# We use <stdbool.h> if we have it and it declares type bool as having
+# size 1. Otherwise, c.h will fall back to declaring bool as unsigned char.
+if cc.has_type('_Bool', args: g_c_args) \
+ and cc.has_type('bool', prefix: '#include <stdbool.h>', args: g_c_args) \
+ and cc.sizeof('bool', prefix: '#include <stdbool.h>', args: g_c_args) == 1
+ cdata.set('HAVE__BOOL', 1)
+ cdata.set('PG_USE_STDBOOL', 1)
+endif
+
+
+printf_attributes = ['gnu_printf', '__syslog__', 'printf']
+testsrc = 'extern void pgac_write(int ignore, const char *fmt,...) __attribute__((format(@0@, 2,3)));'
+foreach a : printf_attributes
+ if cc.compiles(testsrc.format(a), args: g_c_args + ['-Werror'], name: 'format ' + a)
+ cdata.set('PG_PRINTF_ATTRIBUTE', a)
+ break
+ endif
+endforeach
+
+if cc.has_function_attribute('visibility:default') and \
+ cc.has_function_attribute('visibility:hidden')
+ cdata.set('HAVE_VISIBILITY_ATTRIBUTE', 1)
+endif
+
+
+if cc.has_function('__builtin_unreachable', args: g_c_args)
+ cdata.set('HAVE__BUILTIN_UNREACHABLE', 1)
+endif
+
+if cc.has_function('__builtin_constant_p', args: g_c_args)
+ cdata.set('HAVE__BUILTIN_CONSTANT_P', 1)
+
+ if host_machine.cpu_family() == 'ppc' or host_machine.cpu_family() == 'ppc64'
+ # Check if compiler accepts "i"(x) when __builtin_constant_p(x).
+ if cc.compiles('''
+ static inline int
+ addi(int ra, int si)
+ {
+ int res = 0;
+ if (__builtin_constant_p(si))
+ __asm__ __volatile__(
+ " addi %0,%1,%2\n" : "=r"(res) : "b"(ra), "i"(si));
+ return res;
+ }
+ int test_adds(int x) { return addi(3, x) + addi(x, 5); }
+ ''',
+ args: g_c_args)
+ cdata.set('HAVE_I_CONSTRAINT__BUILTIN_CONSTANT_P', 1)
+ endif
+ endif
+endif
+
+
+
+# XXX: The configure.ac check for __cpuid() is broken, we don't copy that
+# here. To prevent problems due to two detection methods working, stop
+# checking after one.
+if cc.links('''
+ #include <cpuid.h>
+ int main(int arg, char **argv)
+ {
+ unsigned int exx[4] = {0, 0, 0, 0};
+ __get_cpuid(1, &exx[0], &exx[1], &exx[2], &exx[3]);
+ }
+ ''', name: '__get_cpuid',
+ args: g_c_args)
+ cdata.set('HAVE__GET_CPUID', 1)
+elif cc.links('''
+ #include <intrin.h>
+ int main(int arg, char **argv)
+ {
+ unsigned int exx[4] = {0, 0, 0, 0};
+ __cpuid(exx, 1);
+ }
+ ''', name: '__cpuid',
+ args: g_c_args)
+ cdata.set('HAVE__CPUID', 1)
+endif
+
+
+
+###############################################################
+# Compiler flags
+###############################################################
+
+common_functional_flags = [
+ # Disable strict-aliasing rules; needed for gcc 3.3+
+ '-fno-strict-aliasing',
+ # Disable optimizations that assume no overflow; needed for gcc 4.3+
+ '-fwrapv',
+ '-fexcess-precision=standard'
+]
+
+add_project_arguments(cc.get_supported_arguments(common_functional_flags), language: 'c')
+
+vectorize_cflags = cc.get_supported_arguments(['-ftree-vectorize'])
+unroll_loops_cflags = cc.get_supported_arguments(['-funroll-loops'])
+
+
+common_warning_flags = [
+ '-Wmissing-prototypes',
+ '-Wpointer-arith',
+ '-Werror=vla',
+ '-Wendif-labels',
+ '-Wmissing-format-attribute',
+ '-Wimplicit-fallthrough=3',
+ '-Wcast-function-type',
+ '-Wformat-security',
+]
+
+add_project_arguments(cc.get_supported_arguments(common_warning_flags), language: 'c')
+
+if llvm.found()
+ add_project_arguments(cpp.get_supported_arguments(common_warning_flags), language: 'cpp')
+endif
+
+# A few places with imported code get a pass on -Wdeclaration-after-statement, remember
+# the result for them
+if cc.has_argument('-Wdeclaration-after-statement')
+ add_project_arguments('-Wdeclaration-after-statement', language: 'c')
+ using_declaration_after_statement_warning = true
+else
+ using_declaration_after_statement_warning = false
+endif
+
+
+# We want to suppress a few unhelpful warnings - but gcc won't
+# complain about unrecognized -Wno-foo switches, so we have to test
+# for the positive form and if that works, add the negative form
+
+negative_warning_flags = [
+ 'unused-command-line-argument',
+ 'format-truncation',
+ 'stringop-truncation',
+
+ # FIXME: from andres's local config
+ 'clobbered',
+ 'missing-field-initializers',
+ 'sign-compare',
+ 'unused-parameter',
+]
+
+foreach w : negative_warning_flags
+ if cc.has_argument('-W'+w)
+ add_project_arguments('-Wno-'+w, language: 'c')
+ endif
+
+ if llvm.found() and cpp.has_argument('-W'+w)
+ add_project_arguments('-Wno-'+w, language: 'cpp')
+ endif
+endforeach
+
+
+# From Project.pm
+if cc.get_id() == 'msvc'
+ add_project_arguments('/wd4018', '/wd4244', '/wd4273', '/wd4102', '/wd4090', '/wd4267',
+ language: 'c')
+ add_project_arguments('/DWIN32', '/DWINDOWS', '/D__WINDOWS__', '/D__WIN32__',
+ '/DWIN32_STACK_RLIMIT=4194304', '/D_CRT_SECURE_NO_DEPRECATE', '/D_CRT_NONSTDC_NO_DEPRECATE',
+ language: 'c')
+endif
+
+
+###############################################################
+# Atomics
+###############################################################
+
+cdata.set('HAVE_SPINLOCKS', 1)
+
+if get_option('atomics')
+ cdata.set('HAVE_ATOMICS', 1)
+
+ atomic_checks = [
+ {'name': 'HAVE_GCC__SYNC_CHAR_TAS',
+ 'desc': '__sync_lock_test_and_set(char)',
+ 'test': '''
+char lock = 0;
+__sync_lock_test_and_set(&lock, 1);
+__sync_lock_release(&lock);'''},
+
+ {'name': 'HAVE_GCC__SYNC_INT32_TAS',
+ 'desc': '__sync_lock_test_and_set(int32)',
+ 'test': '''
+int lock = 0;
+__sync_lock_test_and_set(&lock, 1);
+__sync_lock_release(&lock);'''},
+
+ {'name': 'HAVE_GCC__SYNC_INT32_CAS',
+ 'desc': '__sync_val_compare_and_swap(int32)',
+ 'test': '''
+int val = 0;
+__sync_val_compare_and_swap(&val, 0, 37);'''},
+
+# FIXME: int64 reference
+ {'name': 'HAVE_GCC__SYNC_INT64_CAS',
+ 'desc': '__sync_val_compare_and_swap(int64)',
+ 'test': '''
+long val = 0;
+__sync_val_compare_and_swap(&val, 0, 37);'''},
+
+ {'name': 'HAVE_GCC__ATOMIC_INT32_CAS',
+ 'desc': ' __atomic_compare_exchange_n(int32)',
+ 'test': '''
+int val = 0;
+int expect = 0;
+__atomic_compare_exchange_n(&val, &expect, 37, 0, __ATOMIC_SEQ_CST, __ATOMIC_RELAXED);'''},
+
+# FIXME: int64 reference
+ {'name': 'HAVE_GCC__ATOMIC_INT64_CAS',
+ 'desc': ' __atomic_compare_exchange_n(int64)',
+ 'test': '''
+long val = 0;
+int expect = 0;
+__atomic_compare_exchange_n(&val, &expect, 37, 0, __ATOMIC_SEQ_CST, __ATOMIC_RELAXED);'''},
+ ]
+
+ foreach check : atomic_checks
+ test = '''
+int main(void)
+{
+@0@
+}'''.format(check['test'])
+
+ cdata.set(check['name'],
+ cc.links(test, name: check['desc'], args: g_c_args))
+ endforeach
+
+endif
+
+
+
+###############################################################
+# Library / OS tests
+###############################################################
+
+header_checks = [
+ ['atomic.h'],
+ ['stdbool.h'],
+ ['copyfile.h'],
+ ['execinfo.h'],
+ ['getopt.h'],
+ ['ifaddrs.h'],
+ ['langinfo.h'],
+ ['mbarrier.h'],
+ ['poll.h'],
+ ['sys/epoll.h'],
+ ['sys/event.h'],
+ ['sys/ipc.h'],
+ ['sys/prctl.h'],
+ ['sys/procctl.h'],
+ ['sys/pstat.h'],
+ ['sys/resource.h'],
+ ['sys/select.h'],
+ ['sys/sem.h'],
+ ['sys/shm.h'],
+ ['sys/sockio.h'],
+ ['sys/tas.h'],
+ ['sys/uio.h'],
+ ['sys/un.h'],
+ ['termios.h'],
+ ['ucred.h'],
+ # FIXME: openbsd workaround
+ ['sys/ucred.h'],
+ ['wctype.h'],
+ ['netinet/tcp.h'],
+ ['net/if.h'],
+ ['crtdefs.h'],
+]
+
+foreach c : header_checks
+ varname = 'HAVE_'+c.get(0).underscorify().to_upper()
+
+ # Emulate autoconf behaviour of not-found->undef, found->1
+ found = cc.has_header(c.get(0), include_directories: g_inc, args: g_c_args)
+ cdata.set(varname, found ? 1 : false,
+    description: 'Define to 1 if you have the <@0@> header file.'.format(c.get(0)))
+endforeach
+
+
+
+decl_checks = [
+ ['F_FULLFSYNC', 'fcntl.h'],
+ ['RTLD_GLOBAL', 'dlfcn.h'],
+ ['RTLD_NOW', 'dlfcn.h'],
+ ['fdatasync', 'unistd.h'],
+ ['posix_fadvise', 'fcntl.h'],
+ ['sigwait', 'signal.h'],
+ ['strlcat', 'string.h'],
+ ['strlcpy', 'string.h'],
+ ['strnlen', 'string.h'],
+ ['strsignal', 'string.h'],
+ ['strtoll', 'stdlib.h'], ['strtoull', 'stdlib.h'], # strto[u]ll may exist but not be declared
+]
+
+# Need to check for function declarations for these functions, because
+# checking for library symbols wouldn't handle deployment target
+# restrictions on macOS
+decl_checks += [
+ ['preadv', 'sys/uio.h'],
+ ['pwritev', 'sys/uio.h'],
+]
+
+foreach c : decl_checks
+ varname = 'HAVE_DECL_'+c.get(0).underscorify().to_upper()
+
+ found = cc.has_header_symbol(c.get(1), c.get(0), args: g_c_args, kwargs: c.get(2, {}))
+ cdata.set10(varname, found, description:
+'''Define to 1 if you have the declaration of `@0@\', and to 0 if you
+   don't.'''.format(c.get(0)))
+endforeach
+
+
+
+# XXX: this is borked, HAVE_SYS_UCRED_H not defined
+if cc.has_type('struct cmsgcred',
+ include_directories: g_inc,
+ args: g_c_args + ['@0@'.format(cdata.get('HAVE_SYS_UCRED_H')) == 'false' ? '-DHAVE_SYS_UCRED_H' : ''],
+ prefix: '''
+#include <sys/socket.h>
+#include <sys/param.h>
+#ifdef HAVE_SYS_UCRED_H
+#include <sys/ucred.h>
+#endif''')
+ cdata.set('HAVE_STRUCT_CMSGCRED', 1)
+else
+ cdata.set('HAVE_STRUCT_CMSGCRED', false)
+endif
+
+if cc.has_function('getopt', args: g_c_args) and \
+ cc.has_function('getopt_long', args: g_c_args) and \
+ cc.has_type('struct option', args: g_c_args, prefix: '#include <getopt.h>')
+ cdata.set('HAVE_GETOPT', 1)
+ cdata.set('HAVE_GETOPT_LONG', 1)
+ cdata.set('HAVE_STRUCT_OPTION', 1)
+else
+ warning('not yet implemented')
+endif
+
+
+foreach c : ['opterr', 'optreset']
+ varname = 'HAVE_INT_'+c.underscorify().to_upper()
+
+ if cc.links('''
+#include <unistd.h>
+int main(void)
+{
+ extern int @0@;
+ @0@ = 1;
+}
+'''.format(c), name: c, args: g_c_args)
+ cdata.set(varname, 1)
+ else
+ cdata.set(varname, false)
+ endif
+endforeach
+
+
+if cc.has_type('struct sockaddr_storage', args: g_c_args, prefix: '''
+#include <sys/types.h>
+#include <sys/socket.h>''')
+ cdata.set('HAVE_STRUCT_SOCKADDR_STORAGE', 1)
+endif
+
+if cc.has_member('struct sockaddr_storage', 'ss_family', args: g_c_args,
+ prefix: '''#include <sys/types.h>
+#include <sys/socket.h>''')
+ cdata.set('HAVE_STRUCT_SOCKADDR_STORAGE_SS_FAMILY', 1)
+endif
+
+if cc.has_member('struct sockaddr_storage', '__ss_family', args: g_c_args,
+ prefix: '''
+#include <sys/types.h>
+#include <sys/socket.h>''')
+ cdata.set('HAVE_STRUCT_SOCKADDR_STORAGE___SS_FAMILY', 1)
+endif
+
+if cc.has_type('struct sockaddr_un', args: g_c_args, prefix: '''
+#include <sys/types.h>
+#include <sys/un.h>''')
+ cdata.set('HAVE_STRUCT_SOCKADDR_UN', 1)
+endif
+
+if cc.has_type('struct addrinfo', args: g_c_args, prefix: '''
+#include <sys/types.h>
+#include <sys/socket.h>
+#include <netdb.h>
+''')
+ cdata.set('HAVE_STRUCT_ADDRINFO', 1)
+endif
+
+if host_machine.system() == 'windows'
+ cdata.set('HAVE_STRUCT_SOCKADDR_STORAGE', 1)
+ cdata.set('HAVE_STRUCT_SOCKADDR_STORAGE_SS_FAMILY', 1)
+endif
+
+if cc.has_type('struct sockaddr_in6', args: g_c_args, prefix: '''
+#include <netinet/in.h>''')
+ cdata.set('HAVE_IPV6', 1)
+endif
+
+
+if cc.has_member('struct tm', 'tm_zone', args: g_c_args, prefix: '''
+#include <sys/types.h>
+#include <time.h>
+''')
+ cdata.set('HAVE_STRUCT_TM_TM_ZONE', 1)
+endif
+
+if cc.compiles('''
+#include <time.h>
+extern int foo(void);
+int foo(void)
+{
+ return timezone / 60;
+}
+''', name: 'Check if the global variable `timezone\' exists', args: g_c_args,)
+ cdata.set('HAVE_INT_TIMEZONE', 1)
+else
+ cdata.set('HAVE_INT_TIMEZONE', false)
+endif
+
+# FIXME: sys/ipc.h, sys/sem.h includes were conditional
+if cc.has_type('union semun', args: g_c_args, prefix: '''
+#include <sys/types.h>
+#include <sys/ipc.h>
+#include <sys/sem.h>
+''')
+ cdata.set('HAVE_UNION_SEMUN', 1)
+endif
+
+if cc.compiles('''
+#include <string.h>
+int main(void)
+{
+ char buf[100];
+ switch (strerror_r(1, buf, sizeof(buf)))
+ { case 0: break; default: break; }
+}''', args: g_c_args)
+ cdata.set('STRERROR_R_INT', 1)
+else
+ cdata.set('STRERROR_R_INT', false)
+endif
+
+# FIXME
+cdata.set('pg_restrict', '__restrict')
+
+# FIXME
+if host_machine.system() == 'windows'
+ cdata.set('ACCEPT_TYPE_ARG1', 'unsigned int')
+ cdata.set('ACCEPT_TYPE_ARG2', 'struct sockaddr *')
+ cdata.set('ACCEPT_TYPE_ARG3', 'int')
+ cdata.set('ACCEPT_TYPE_RETURN', 'unsigned int PASCAL')
+else
+ cdata.set('ACCEPT_TYPE_ARG1', 'int')
+ cdata.set('ACCEPT_TYPE_ARG2', 'struct sockaddr *')
+ cdata.set('ACCEPT_TYPE_ARG3', 'socklen_t')
+ cdata.set('ACCEPT_TYPE_RETURN', 'int')
+endif
+
+cdata.set('HAVE_STRUCT_ADDRINFO', 1)
+
+
+cdata.set('MEMSET_LOOP_LIMIT', 1024)
+
+
+if cc.links('''
+#include <machine/vmparam.h>
+#include <sys/exec.h>
+
+int main(void)
+{
+ PS_STRINGS->ps_nargvstr = 1;
+ PS_STRINGS->ps_argvstr = "foo";
+}
+''',
+ name: 'PS_STRINGS', args: g_c_args)
+ cdata.set('HAVE_PS_STRINGS', 1)
+else
+ cdata.set('HAVE_PS_STRINGS', false)
+endif
+
+
+# FIXME: this clearly needs to be cleaned up.
+if cc.get_id() != 'msvc'
+ # FIXME: Need to actually test
+ add_project_arguments('-msse4.2', language: 'c')
+ cdata.set('USE_SSE42_CRC32C', false)
+ cdata.set('USE_SSE42_CRC32C_WITH_RUNTIME_CHECK', 1)
+else
+ cdata.set('USE_SSE42_CRC32C', false)
+ cdata.set('USE_SSE42_CRC32C_WITH_RUNTIME_CHECK', 1)
+endif
+
+
+m_dep = cc.find_library('m', required : false)
+
+# Most libraries are included only if they demonstrably provide a function we
+# need, but libm is an exception: always include it, because there are too
+# many compilers that play cute optimization games that will break probes for
+# standard functions such as pow().
+os_deps = [m_dep]
+
+rt_dep = cc.find_library('rt', required : false)
+
+dl_dep = cc.find_library('dl', required : false)
+
+util_dep = cc.find_library('util', required : false)
+posix4_dep = cc.find_library('posix4', required : false)
+
+getopt_dep = cc.find_library('getopt', required : false)
+gnugetopt_dep = cc.find_library('gnugetopt', required : false)
+
+execinfo_dep = cc.find_library('execinfo', required : false)
+
+func_checks = [
+ ['_configthreadlocale'],
+ ['backtrace_symbols', {'dependencies': [execinfo_dep]}],
+ ['clock_gettime', {'dependencies': [rt_dep, posix4_dep]}],
+ ['copyfile'],
+ ['dlopen', {'dependencies': [dl_dep]}],
+ ['explicit_bzero'],
+ ['fdatasync', {'dependencies': [rt_dep, posix4_dep]}],
+ ['fls'],
+ ['getaddrinfo'],
+ ['gethostbyname_r', {'dependencies': [thread_dep]}],
+ ['getifaddrs'],
+ ['getopt', {'dependencies': [getopt_dep, gnugetopt_dep]}],
+ ['getopt_long',{'dependencies': [getopt_dep, gnugetopt_dep]}],
+ ['getpeereid'],
+ ['getpeerucred'],
+ ['getpwuid_r', {'dependencies': [thread_dep]}],
+ ['getrlimit'],
+ ['getrusage'],
+ ['gettimeofday'], # XXX: This seems to be in the autoconf case
+ ['inet_aton'],
+ ['kqueue'],
+ ['link'],
+ ['mbstowcs_l'],
+ ['memset_s'],
+ ['mkdtemp'],
+ ['poll'],
+ ['posix_fadvise'],
+ ['posix_fallocate'],
+ ['ppoll'],
+ ['pread'],
+ ['pstat'],
+ ['pthread_is_threaded_np'],
+ ['pwrite'],
+ ['random'],
+ ['readlink'],
+ ['readv'],
+ ['setenv'], # FIXME: windows handling
+ ['setproctitle', {'dependencies': [util_dep]}],
+ ['setproctitle_fast'],
+ ['setsid'],
+ ['shm_open', {'dependencies': [rt_dep]}],
+ ['shm_unlink', {'dependencies': [rt_dep]}],
+ ['srandom'],
+ ['strchrnul'],
+ ['strerror_r', {'dependencies': [thread_dep]}],
+ ['strlcat'],
+ ['strlcpy'],
+ ['strnlen'],
+ ['strsignal'],
+ ['strtof'], # strsignal is checked separately
+ ['strtoll'], ['__strtoll'], ['strtoq'],
+ ['strtoull'], ['__strtoull'], ['strtouq'],
+ ['symlink'],
+ ['sync_file_range'],
+ ['syncfs'],
+ ['unsetenv'],
+ ['uselocale'],
+ ['wcstombs_l'],
+ ['writev'],
+]
+
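+# Probe for each function without any helper library first; if that fails,
+# retry with each listed optional library and, when one provides the function,
+# add that library to os_deps so it gets linked in.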
+foreach c : func_checks
+ func = c.get(0)
+ kwargs = c.get(1, {})
+ deps = kwargs.get('dependencies', [])
+
+ varname = 'HAVE_'+func.underscorify().to_upper()
+
+ found = cc.has_function(func, args: g_c_args,
+ kwargs: kwargs + {'dependencies': []})
+
+ if not found
+ foreach dep : deps
+ if not dep.found()
+ continue
+ endif
+ found = cc.has_function(func, args: g_c_args,
+ kwargs: kwargs + {'dependencies': [dep]})
+ if found
+ os_deps += dep
+ break
+ endif
+ endforeach
+ endif
+
+ # Emulate autoconf behaviour of not-found->undef, found->1
+ cdata.set(varname, found ? 1 : false,
+ description: 'Define to 1 if you have the `@0@\' function.'.format(func))
+endforeach
+
+
+
+
+
+if host_machine.system() == 'linux' or host_machine.system() == 'freebsd'
+ dlsuffix = '.so'
+elif host_machine.system() == 'darwin'
+ dlsuffix = '.dylib'
+elif host_machine.system() == 'windows'
+ dlsuffix = '.dll'
+else
+ error('not yet')
+endif
+
+cdata.set_quoted('DLSUFFIX', dlsuffix)
+
+if host_machine.system() == 'windows'
+ cdata.set('USE_WIN32_SEMAPHORES', 1)
+ cdata.set('USE_WIN32_SHARED_MEMORY', 1)
+elif host_machine.system() == 'darwin'
+ cdata.set('USE_SYSV_SEMAPHORES', 1)
+ cdata.set('USE_SYSV_SHARED_MEMORY', 1)
+else
+ cdata.set('USE_UNNAMED_POSIX_SEMAPHORES', 1)
+ cdata.set('USE_SYSV_SHARED_MEMORY', 1)
+endif
+
+
+if host_machine.system() == 'windows'
+ cdata.set('HAVE_IPV6', 1)
+ cdata.set('HAVE_SYMLINK', 1)
+ cdata.set('WIN32_STACK_RLIMIT', 4194304)
+ cdata.set('HAVE__CONFIGTHREADLOCALE', 1)
+endif
+
+if cc.get_id() == 'msvc'
+ add_project_link_arguments(
+ '/fixed:no',
+ '/dynamicbase',
+ '/nxcompat',
+ language : ['c', 'cpp'],
+ )
+endif
+
+if host_machine.system() == 'windows'
+ os_deps += cc.find_library('ws2_32', required: true)
+endif
+
+
+###############################################################
+# Threading
+###############################################################
+
+# Probably not worth implementing other cases anymore
+cdata.set('ENABLE_THREAD_SAFETY', 1)
+
+if thread_dep.found()
+ if cc.has_function('pthread_is_threaded_np', args: g_c_args, dependencies: [thread_dep])
+ cdata.set('HAVE_PTHREAD_IS_THREADED_NP', 1)
+ endif
+ if cc.has_function('pthread_barrier_wait', args: g_c_args, dependencies: [thread_dep])
+ cdata.set('HAVE_PTHREAD_BARRIER_WAIT', 1)
+ endif
+endif
+
+
+###############################################################
+# Build
+###############################################################
+
+# Collect a number of lists of things while recursing through the source
+# tree. Later steps can then use those.
+
+test_deps = []
+backend_targets = []
+
+
+# List of tap tests we later generate test() invocations for. The main
+# reason for doing it that way instead of having test() invocations
+# everywhere is that they end up being too large. A second benefit is
+# that it'd make it easier to generate data for another runner.
+tap_tests = []
+isolation_tests = []
+regress_tests = []
+
+
+# Default options for targets
+
+default_target_args = {
+ 'implicit_include_directories': false,
+ 'install': true,
+}
+
+default_lib_args = default_target_args + {
+ 'name_prefix': 'lib',
+}
+
+internal_lib_args = default_lib_args + {
+ 'build_by_default': false,
+ 'install': false,
+}
+
+default_mod_args = default_lib_args + {
+ 'name_prefix': '',
+}
+
+default_bin_args = default_target_args + {
+}
+
+
+# First visit src/include - all targets creating headers are defined
+# within. That makes it easy to add the necessary dependencies for the
+# subsequent build steps.
+
+generated_headers = []
+generated_backend_headers = []
+
+postgres_inc = [include_directories('src/include')]
+
+if host_machine.system() == 'windows'
+ postgres_inc += include_directories('src/include/port/win32')
+
+ if cc.get_id() == 'msvc'
+ postgres_inc += include_directories('src/include/port/win32_msvc')
+ endif
+endif
+
+subdir('src/include')
+
+
+# Then through src/port and src/common, as most other things depend on them
+
+frontend_port_code = declare_dependency(
+ compile_args: ['-DFRONTEND'],
+ include_directories: [postgres_inc],
+ sources: [errcodes],
+ dependencies: os_deps,
+)
+
+backend_port_code = declare_dependency(
+ compile_args: ['-DBUILDING_DLL'],
+ include_directories: [postgres_inc],
+ sources: [errcodes],
+ dependencies: os_deps,
+)
+
+subdir('src/port')
+
+frontend_common_code = declare_dependency(
+ compile_args: ['-DFRONTEND'],
+ include_directories: [postgres_inc],
+ sources: generated_headers,
+ dependencies: os_deps,
+)
+
+backend_common_code = declare_dependency(
+ compile_args: ['-DBUILDING_DLL'],
+ include_directories: [postgres_inc],
+ sources: generated_headers,
+)
+
+subdir('src/common')
+
+frontend_shlib_code = declare_dependency(
+ compile_args: ['-DFRONTEND'],
+ include_directories: [postgres_inc],
+ link_with: [pgport_shlib, common_shlib],
+ sources: generated_headers,
+ dependencies: os_deps,
+)
+
+subdir('src/interfaces/libpq')
+subdir('src/fe_utils')
+
+frontend_code = declare_dependency(
+ compile_args: ['-DFRONTEND'],
+ include_directories: [postgres_inc],
+ link_with: [pgport_static, common_static, fe_utils],
+ sources: generated_headers,
+ dependencies: os_deps,
+)
+
+backend_code = declare_dependency(
+ compile_args: ['-DBUILDING_DLL'],
+ include_directories: [postgres_inc],
+ link_with: [],
+ sources: generated_headers + generated_backend_headers,
+ dependencies: [os_deps, ssl, lz4, icu, icu_i18n, ldap, gssapi, libxml],
+)
+
+# Note there's intentionally no dependency on pgport/common here - we want the
+# symbols from the main binary for extension modules, rather than the
+# extension linking separately to pgport/common.
+backend_mod_code = declare_dependency(
+ compile_args: [],
+ include_directories: [postgres_inc],
+ link_with: [],
+ sources: generated_headers + generated_backend_headers,
+ dependencies: [os_deps, ssl, lz4, icu, icu_i18n, ldap, gssapi, libxml],
+)
+
+# Then through the main sources. That way contrib can have dependencies on
+# main sources. Note that this explicitly doesn't enter src/test yet, as right
+# now a few regression tests depend on contrib files.
+
+subdir('src')
+
+subdir('contrib')
+
+subdir('src/test')
+
+
+###############################################################
+# Test prep
+###############################################################
+
+# The determination of where a DESTDIR install points to is ugly; it's
+# somewhat hard to combine two absolute paths portably...
+
+prefix = get_option('prefix')
+
+test_prefix = prefix
+
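+# For an absolute prefix, strip the leading '/' (or the drive letter on
+# Windows) so the prefix can be appended to the test install's DESTDIR.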
+if fs.is_absolute(get_option('prefix'))
+ if host_machine.system() == 'windows'
+ if prefix.split(':\\').length() == 1
+ # just a drive
+ test_prefix = ''
+ else
+ test_prefix = prefix.split(':\\')[1]
+ endif
+ else
+ test_prefix = prefix.substring(1)
+ endif
+endif
+
+# DESTDIR for the installation used to run tests in
+test_install_destdir = meson.build_root() / 'tmp_install/'
+# DESTDIR + prefix appropriately munged
+test_install_location = test_install_destdir / test_prefix
+
+
+test('tmp_install',
+ meson_bin, args: meson_args + ['install', '--quiet', '--only-changed', '--no-rebuild'],
+ env: {'DESTDIR':test_install_destdir},
+ priority: 100,
+ is_parallel: false,
+ suite: ['setup'])
+
+test_result_dir = meson.build_root() / 'testrun'
+
+
+# XXX: pg_regress doesn't assign unique ports on windows. To avoid the
+# inevitable conflicts from running tests in parallel, hackishly assign
+# different ports for different tests.
+
+testport = 40000
+
+test_env = environment()
+
+if host_machine.system() == 'darwin'
+ library_path_var = 'DYLD_LIBRARY_PATH'
+elif host_machine.system() == 'aix'
+ library_path_var = 'LIBPATH'
+else
+ library_path_var = 'LD_LIBRARY_PATH'
+endif
+
+test_env.prepend('PATH', test_install_location / get_option('bindir'))
+test_env.prepend(library_path_var, test_install_location / get_option('libdir'))
+test_env.set('PG_REGRESS', meson.build_root() / 'src/test/regress/pg_regress')
+test_env.set('REGRESS_SHLIB', regress_module.full_path())
+
+
+###############################################################
+# Test Generation
+###############################################################
+
+# Define all 'pg_regress' style tests
+foreach t : regress_tests
+ test_command = [
+ pg_regress,
+ '--temp-instance', test_result_dir / t['name'] / 'pg_regress' / 'tmp_check',
+ '--inputdir', t['sd'],
+ '--outputdir', test_result_dir / t['name'] / 'pg_regress',
+ '--bindir', '',
+ '--dlpath', t['bd'],
+ '--max-concurrent-tests=20',
+ '--port=@0@'.format(testport),
+ ]
+
+ if t.has_key('regress_args')
+ test_command += t['regress_args']
+ endif
+
+ if t.has_key('schedule')
+ test_command += ['--schedule', t['schedule'],]
+ else
+ test_command += t['sql']
+ endif
+
+ test_kwargs = {
+ 'suite': ['pg_regress', t['name']],
+ 'priority': 10,
+ 'timeout': 300,
+ 'depends': test_deps + t.get('deps', []),
+ 'env': test_env,
+ 'workdir': t['sd'],
+ 'args': [
+ meson.build_root(),
+ t['bd'],
+ t['name'],
+ 'pg_regress',
+ test_command,
+ ]
+ }
+
+ # Allow test definition to override arguments
+ if t.has_key('test_kwargs')
+ test_kwargs += t['test_kwargs']
+ endif
+
+ test(t['name'] / 'pg_regress',
+ testwrap,
+ kwargs: test_kwargs,
+ )
+
+ testport = testport + 1
+endforeach
+
+
+# Define all 'isolationtester' style tests
+foreach t : isolation_tests
+ test_command = [
+ pg_isolation_regress,
+ '--temp-instance', test_result_dir / t['name'] / 'isolation' / 'tmp_check',
+ '--inputdir', t['sd'],
+ '--outputdir', test_result_dir / t['name'] / 'isolation',
+ '--bindir', '',
+ '--dlpath', t['bd'],
+ '--max-concurrent-tests=20',
+ '--port=@0@'.format(testport),
+ ]
+
+ if t.has_key('regress_args')
+ test_command += t['regress_args']
+ endif
+
+ if t.has_key('schedule')
+ test_command += ['--schedule', t['schedule'],]
+ else
+ test_command += t['specs']
+ endif
+
+ test_kwargs = {
+ 'suite': ['isolation', t['name']],
+ 'priority': 20,
+ 'timeout': 300,
+ 'depends': test_deps + t.get('deps', []),
+ 'workdir': t['sd'],
+ 'env': test_env,
+ 'args': [
+ meson.build_root(),
+ t['bd'],
+ t['name'],
+ 'isolation',
+ test_command,
+ ]
+ }
+
+ # Allow test definition to override arguments
+ if t.has_key('test_kwargs')
+ test_kwargs += t['test_kwargs']
+ endif
+
+ test(t['name'] / 'isolation',
+ testwrap,
+ kwargs: test_kwargs,
+ )
+
+ testport = testport + 1
+endforeach
+
+
+# Define all 'tap' style tests
+# FIXME: dependencies for each test
+foreach t : tap_tests
+ env = test_env
+
+ foreach name, value : t.get('env', {})
+ if name == 'PATH'
+ # FIXME: manually setting PATH again, because repeated prepend didn't work
+ # before meson 0.58.
+ env.prepend('PATH', value, test_install_location / get_option('bindir'))
+ else
+ env.set(name, value)
+ endif
+ endforeach
+
+ foreach onetap : t['tests']
+ test(t['name'] / onetap,
+ testwrap,
+ workdir: t['sd'],
+ args: [
+ meson.build_root(),
+ t['bd'],
+ t['name'],
+ onetap,
+ 'perl',
+ '-I', meson.source_root() / 'src/test/perl',
+ '-I', t['sd'],
+ t['sd'] / onetap
+ ],
+ protocol: 'tap',
+ suite: ['tap', t['name']],
+ env: env,
+ depends: test_deps + t.get('deps', []),
+ timeout: 300,
+ )
+ endforeach
+endforeach
+
+
+
+###############################################################
+# Pseudo targets
+###############################################################
+
+alias_target('backend', backend_targets)
+
+
+
+###############################################################
+# The End, The End, My Friend
+###############################################################
+
+if meson.version().version_compare('>=0.57')
+
+ summary({
+ 'Data Block Size' : cdata.get('BLCKSZ'),
+ 'WAL Block Size' : cdata.get('XLOG_BLCKSZ')
+ }, section: 'Data Layout'
+ )
+
+ summary(
+ {
+ 'host system' : '@0@ @1@'.format(host_machine.system(), host_machine.cpu_family()),
+ 'build system' : '@0@ @1@'.format(build_machine.system(), build_machine.cpu_family()),
+ },
+ section: 'System'
+ )
+
+ summary(
+ {
+ 'linker': '@0@'.format(cc.get_linker_id()),
+ 'C compiler': '@0@ @1@'.format(cc.get_id(), cc.version()),
+ },
+ section: 'Compiler'
+ )
+
+ if llvm.found()
+ summary(
+ {
+ 'C++ compiler': '@0@ @1@'.format(cpp.get_id(), cpp.version())
+ },
+ section: 'Compiler')
+ endif
+
+ summary(
+ {
+ 'bison' : '@0@ @1@'.format(bison.full_path(), bison_version),
+ },
+ section: 'Programs'
+ )
+
+ summary(
+ {
+ 'GSS': gssapi,
+ 'LDAP': ldap,
+ 'LLVM': llvm,
+ 'icu': icu,
+ 'libxml': libxml,
+ 'libxslt': libxslt,
+ 'lz4': lz4,
+ 'perl': perl_dep,
+ 'python3': python3,
+ 'readline': readline,
+ 'ssl': ssl,
+ 'zlib': zlib,
+ },
+ section: 'External Libraries'
+ )
+
+endif
diff --git a/meson_options.txt b/meson_options.txt
new file mode 100644
index 00000000000..d4c7f717ff8
--- /dev/null
+++ b/meson_options.txt
@@ -0,0 +1,81 @@
+# Data layout influencing options
+option('BLCKSZ', type : 'combo', choices : ['1', '2', '4', '8', '16', '32'], value : '8',
+ description: 'set table block size in kB')
+
+
+# You get it
+option('cassert', type : 'boolean', value: false,
+ description: 'enable assertion checks (for debugging)')
+
+option('atomics', type : 'boolean', value: true,
+ description: 'whether to use atomic operations')
+
+
+# Compilation options
+
+option('extra_include_dirs', type : 'array',
+ description: 'non-default directories to be searched for headers')
+option('extra_lib_dirs', type : 'array',
+ description: 'non-default directories to be searched for libs')
+
+
+# External dependencies
+
+option('gssapi', type : 'feature', value: 'auto',
+ description: 'GSSAPI support')
+
+option('ldap', type : 'feature', value: 'auto',
+ description: 'LDAP support')
+
+option('llvm', type : 'feature', value: 'disabled',
+ description: 'whether to use llvm')
+
+option('icu', type : 'feature', value: 'auto',
+ description: 'ICU support')
+
+option('libxml', type : 'feature', value: 'auto',
+ description: 'XML support')
+
+option('libxslt', type : 'feature', value: 'auto',
+ description: 'XSLT support in contrib/xml2')
+
+option('lz4', type : 'feature', value: 'auto',
+ description: 'LZ4 support')
+
+option('perl', type : 'feature', value: 'auto',
+ description: 'build Perl modules (PL/Perl)')
+
+option('python', type : 'feature', value: 'auto',
+ description: 'build Python modules (PL/Python)')
+
+option('readline', type : 'feature', value : 'auto',
+ description: 'use GNU Readline or BSD Libedit for editing')
+
+option('ssl', type : 'combo', choices : ['none', 'openssl'], value : 'none',
+ description: 'use LIB for SSL/TLS support (openssl)')
+
+option('zlib', type : 'feature', value: 'auto',
+ description: 'whether to use zlib')
+
+
+# Programs
+option('BISON', type : 'string', value: 'bison',
+ description: 'path to bison binary')
+
+option('FLEX', type : 'string', value: 'flex',
+ description: 'path to flex binary')
+
+option('GZIP', type : 'string', value: 'gzip',
+ description: 'path to gzip binary')
+
+option('PERL', type : 'string', value: 'perl',
+ description: 'path to perl binary')
+
+option('PROVE', type : 'string', value: 'prove',
+ description: 'path to prove binary')
+
+option('SED', type : 'string', value: 'gsed',
+ description: 'path to sed binary')
+
+option('TAR', type : 'string', value: 'tar',
+ description: 'path to tar binary')
diff --git a/src/backend/access/brin/meson.build b/src/backend/access/brin/meson.build
new file mode 100644
index 00000000000..a54c7532927
--- /dev/null
+++ b/src/backend/access/brin/meson.build
@@ -0,0 +1,12 @@
+backend_sources += files(
+ 'brin.c',
+ 'brin_bloom.c',
+ 'brin_inclusion.c',
+ 'brin_minmax.c',
+ 'brin_minmax_multi.c',
+ 'brin_pageops.c',
+ 'brin_revmap.c',
+ 'brin_tuple.c',
+ 'brin_validate.c',
+ 'brin_xlog.c',
+)
diff --git a/src/backend/access/common/meson.build b/src/backend/access/common/meson.build
new file mode 100644
index 00000000000..857beaa32d3
--- /dev/null
+++ b/src/backend/access/common/meson.build
@@ -0,0 +1,18 @@
+backend_sources += files(
+ 'attmap.c',
+ 'bufmask.c',
+ 'detoast.c',
+ 'heaptuple.c',
+ 'indextuple.c',
+ 'printsimple.c',
+ 'printtup.c',
+ 'relation.c',
+ 'reloptions.c',
+ 'scankey.c',
+ 'session.c',
+ 'syncscan.c',
+ 'toast_compression.c',
+ 'toast_internals.c',
+ 'tupconvert.c',
+ 'tupdesc.c',
+)
diff --git a/src/backend/access/gin/meson.build b/src/backend/access/gin/meson.build
new file mode 100644
index 00000000000..56d6f343d54
--- /dev/null
+++ b/src/backend/access/gin/meson.build
@@ -0,0 +1,17 @@
+backend_sources += files(
+ 'ginarrayproc.c',
+ 'ginbtree.c',
+ 'ginbulk.c',
+ 'gindatapage.c',
+ 'ginentrypage.c',
+ 'ginfast.c',
+ 'ginget.c',
+ 'gininsert.c',
+ 'ginlogic.c',
+ 'ginpostinglist.c',
+ 'ginscan.c',
+ 'ginutil.c',
+ 'ginvacuum.c',
+ 'ginvalidate.c',
+ 'ginxlog.c',
+)
diff --git a/src/backend/access/gist/meson.build b/src/backend/access/gist/meson.build
new file mode 100644
index 00000000000..1a996b5e25d
--- /dev/null
+++ b/src/backend/access/gist/meson.build
@@ -0,0 +1,13 @@
+backend_sources += files(
+ 'gist.c',
+ 'gistbuild.c',
+ 'gistbuildbuffers.c',
+ 'gistget.c',
+ 'gistproc.c',
+ 'gistscan.c',
+ 'gistsplit.c',
+ 'gistutil.c',
+ 'gistvacuum.c',
+ 'gistvalidate.c',
+ 'gistxlog.c',
+)
diff --git a/src/backend/access/hash/meson.build b/src/backend/access/hash/meson.build
new file mode 100644
index 00000000000..22f2c691c34
--- /dev/null
+++ b/src/backend/access/hash/meson.build
@@ -0,0 +1,12 @@
+backend_sources += files(
+ 'hash.c',
+ 'hash_xlog.c',
+ 'hashfunc.c',
+ 'hashinsert.c',
+ 'hashovfl.c',
+ 'hashpage.c',
+ 'hashsearch.c',
+ 'hashsort.c',
+ 'hashutil.c',
+ 'hashvalidate.c',
+)
diff --git a/src/backend/access/heap/meson.build b/src/backend/access/heap/meson.build
new file mode 100644
index 00000000000..f1dca73743c
--- /dev/null
+++ b/src/backend/access/heap/meson.build
@@ -0,0 +1,11 @@
+backend_sources += files(
+ 'heapam.c',
+ 'heapam_handler.c',
+ 'heapam_visibility.c',
+ 'heaptoast.c',
+ 'hio.c',
+ 'pruneheap.c',
+ 'rewriteheap.c',
+ 'vacuumlazy.c',
+ 'visibilitymap.c',
+)
diff --git a/src/backend/access/index/meson.build b/src/backend/access/index/meson.build
new file mode 100644
index 00000000000..18af5533e65
--- /dev/null
+++ b/src/backend/access/index/meson.build
@@ -0,0 +1,6 @@
+backend_sources += files(
+ 'amapi.c',
+ 'amvalidate.c',
+ 'genam.c',
+ 'indexam.c',
+)
diff --git a/src/backend/access/meson.build b/src/backend/access/meson.build
new file mode 100644
index 00000000000..9874291fc0a
--- /dev/null
+++ b/src/backend/access/meson.build
@@ -0,0 +1,13 @@
+subdir('brin')
+subdir('common')
+subdir('gin')
+subdir('gist')
+subdir('hash')
+subdir('heap')
+subdir('index')
+subdir('nbtree')
+subdir('rmgrdesc')
+subdir('spgist')
+subdir('table')
+subdir('tablesample')
+subdir('transam')
diff --git a/src/backend/access/nbtree/meson.build b/src/backend/access/nbtree/meson.build
new file mode 100644
index 00000000000..07dc29e8190
--- /dev/null
+++ b/src/backend/access/nbtree/meson.build
@@ -0,0 +1,13 @@
+backend_sources += files(
+ 'nbtcompare.c',
+ 'nbtdedup.c',
+ 'nbtinsert.c',
+ 'nbtpage.c',
+ 'nbtree.c',
+ 'nbtsearch.c',
+ 'nbtsort.c',
+ 'nbtsplitloc.c',
+ 'nbtutils.c',
+ 'nbtvalidate.c',
+ 'nbtxlog.c',
+)
diff --git a/src/backend/access/rmgrdesc/meson.build b/src/backend/access/rmgrdesc/meson.build
new file mode 100644
index 00000000000..f3a6e0a571b
--- /dev/null
+++ b/src/backend/access/rmgrdesc/meson.build
@@ -0,0 +1,26 @@
+# used by frontend programs like pg_waldump
+rmgr_desc_sources = files(
+ 'brindesc.c',
+ 'clogdesc.c',
+ 'committsdesc.c',
+ 'dbasedesc.c',
+ 'genericdesc.c',
+ 'gindesc.c',
+ 'gistdesc.c',
+ 'hashdesc.c',
+ 'heapdesc.c',
+ 'logicalmsgdesc.c',
+ 'mxactdesc.c',
+ 'nbtdesc.c',
+ 'relmapdesc.c',
+ 'replorigindesc.c',
+ 'seqdesc.c',
+ 'smgrdesc.c',
+ 'spgdesc.c',
+ 'standbydesc.c',
+ 'tblspcdesc.c',
+ 'xactdesc.c',
+ 'xlogdesc.c',
+)
+
+backend_sources += rmgr_desc_sources
diff --git a/src/backend/access/spgist/meson.build b/src/backend/access/spgist/meson.build
new file mode 100644
index 00000000000..f18d0d2e53f
--- /dev/null
+++ b/src/backend/access/spgist/meson.build
@@ -0,0 +1,13 @@
+backend_sources += files(
+ 'spgdoinsert.c',
+ 'spginsert.c',
+ 'spgkdtreeproc.c',
+ 'spgproc.c',
+ 'spgquadtreeproc.c',
+ 'spgscan.c',
+ 'spgtextproc.c',
+ 'spgutils.c',
+ 'spgvacuum.c',
+ 'spgvalidate.c',
+ 'spgxlog.c',
+)
diff --git a/src/backend/access/table/meson.build b/src/backend/access/table/meson.build
new file mode 100644
index 00000000000..66c706d640e
--- /dev/null
+++ b/src/backend/access/table/meson.build
@@ -0,0 +1,6 @@
+backend_sources += files(
+ 'table.c',
+ 'tableam.c',
+ 'tableamapi.c',
+ 'toast_helper.c',
+)
diff --git a/src/backend/access/tablesample/meson.build b/src/backend/access/tablesample/meson.build
new file mode 100644
index 00000000000..63ee8203226
--- /dev/null
+++ b/src/backend/access/tablesample/meson.build
@@ -0,0 +1,5 @@
+backend_sources += files(
+ 'bernoulli.c',
+ 'system.c',
+ 'tablesample.c',
+)
diff --git a/src/backend/access/transam/meson.build b/src/backend/access/transam/meson.build
new file mode 100644
index 00000000000..fe3703e0f21
--- /dev/null
+++ b/src/backend/access/transam/meson.build
@@ -0,0 +1,28 @@
+backend_sources += files(
+ 'clog.c',
+ 'commit_ts.c',
+ 'generic_xlog.c',
+ 'multixact.c',
+ 'parallel.c',
+ 'rmgr.c',
+ 'slru.c',
+ 'subtrans.c',
+ 'timeline.c',
+ 'transam.c',
+ 'twophase.c',
+ 'twophase_rmgr.c',
+ 'varsup.c',
+ 'xact.c',
+ 'xlog.c',
+ 'xlogarchive.c',
+ 'xlogfuncs.c',
+ 'xloginsert.c',
+ 'xlogutils.c',
+)
+
+# used by frontend programs to build a frontend xlogreader
+xlogreader_sources = files(
+ 'xlogreader.c',
+)
+
+backend_sources += xlogreader_sources
diff --git a/src/backend/bootstrap/meson.build b/src/backend/bootstrap/meson.build
new file mode 100644
index 00000000000..55c0be68cc4
--- /dev/null
+++ b/src/backend/bootstrap/meson.build
@@ -0,0 +1,12 @@
+backend_sources += files(
+ 'bootstrap.c')
+
+bootscanner = custom_target('bootscanner',
+ input: ['bootscanner.l'],
+ output: ['bootscanner.c'],
+ command: [flex, '-o', '@OUTPUT@', '@INPUT@'])
+
+generated_backend_sources += custom_target('bootparse',
+ input: ['bootparse.y', bootscanner[0]],
+ output: ['bootparse.c'],
+ command: [bison, bisonflags, '-o', '@OUTPUT@', '@INPUT0@'])
diff --git a/src/backend/catalog/meson.build b/src/backend/catalog/meson.build
new file mode 100644
index 00000000000..2cc23582e35
--- /dev/null
+++ b/src/backend/catalog/meson.build
@@ -0,0 +1,41 @@
+backend_sources += files(
+ 'aclchk.c',
+ 'catalog.c',
+ 'dependency.c',
+ 'heap.c',
+ 'index.c',
+ 'indexing.c',
+ 'namespace.c',
+ 'objectaccess.c',
+ 'objectaddress.c',
+ 'partition.c',
+ 'pg_aggregate.c',
+ 'pg_cast.c',
+ 'pg_class.c',
+ 'pg_collation.c',
+ 'pg_constraint.c',
+ 'pg_conversion.c',
+ 'pg_db_role_setting.c',
+ 'pg_depend.c',
+ 'pg_enum.c',
+ 'pg_inherits.c',
+ 'pg_largeobject.c',
+ 'pg_namespace.c',
+ 'pg_operator.c',
+ 'pg_proc.c',
+ 'pg_publication.c',
+ 'pg_range.c',
+ 'pg_shdepend.c',
+ 'pg_subscription.c',
+ 'pg_type.c',
+ 'storage.c',
+ 'toasting.c',
+)
+
+
+install_data(
+ 'information_schema.sql',
+ 'sql_features.txt',
+ 'system_functions.sql',
+ 'system_views.sql',
+ install_dir: 'share/')
diff --git a/src/backend/commands/meson.build b/src/backend/commands/meson.build
new file mode 100644
index 00000000000..8e73b29a263
--- /dev/null
+++ b/src/backend/commands/meson.build
@@ -0,0 +1,50 @@
+backend_sources += files(
+ 'aggregatecmds.c',
+ 'alter.c',
+ 'amcmds.c',
+ 'analyze.c',
+ 'async.c',
+ 'cluster.c',
+ 'collationcmds.c',
+ 'comment.c',
+ 'constraint.c',
+ 'conversioncmds.c',
+ 'copy.c',
+ 'copyfrom.c',
+ 'copyfromparse.c',
+ 'copyto.c',
+ 'createas.c',
+ 'dbcommands.c',
+ 'define.c',
+ 'discard.c',
+ 'dropcmds.c',
+ 'event_trigger.c',
+ 'explain.c',
+ 'extension.c',
+ 'foreigncmds.c',
+ 'functioncmds.c',
+ 'indexcmds.c',
+ 'lockcmds.c',
+ 'matview.c',
+ 'opclasscmds.c',
+ 'operatorcmds.c',
+ 'policy.c',
+ 'portalcmds.c',
+ 'prepare.c',
+ 'proclang.c',
+ 'publicationcmds.c',
+ 'schemacmds.c',
+ 'seclabel.c',
+ 'sequence.c',
+ 'statscmds.c',
+ 'subscriptioncmds.c',
+ 'tablecmds.c',
+ 'tablespace.c',
+ 'trigger.c',
+ 'tsearchcmds.c',
+ 'typecmds.c',
+ 'user.c',
+ 'vacuum.c',
+ 'variable.c',
+ 'view.c',
+)
diff --git a/src/backend/executor/meson.build b/src/backend/executor/meson.build
new file mode 100644
index 00000000000..518674cfa28
--- /dev/null
+++ b/src/backend/executor/meson.build
@@ -0,0 +1,67 @@
+backend_sources += files(
+ 'execAmi.c',
+ 'execAsync.c',
+ 'execCurrent.c',
+ 'execExpr.c',
+ 'execExprInterp.c',
+ 'execGrouping.c',
+ 'execIndexing.c',
+ 'execJunk.c',
+ 'execMain.c',
+ 'execParallel.c',
+ 'execPartition.c',
+ 'execProcnode.c',
+ 'execReplication.c',
+ 'execSRF.c',
+ 'execScan.c',
+ 'execTuples.c',
+ 'execUtils.c',
+ 'functions.c',
+ 'instrument.c',
+ 'nodeAgg.c',
+ 'nodeAppend.c',
+ 'nodeBitmapAnd.c',
+ 'nodeBitmapHeapscan.c',
+ 'nodeBitmapIndexscan.c',
+ 'nodeBitmapOr.c',
+ 'nodeCtescan.c',
+ 'nodeCustom.c',
+ 'nodeForeignscan.c',
+ 'nodeFunctionscan.c',
+ 'nodeGather.c',
+ 'nodeGatherMerge.c',
+ 'nodeGroup.c',
+ 'nodeHash.c',
+ 'nodeHashjoin.c',
+ 'nodeIncrementalSort.c',
+ 'nodeIndexonlyscan.c',
+ 'nodeIndexscan.c',
+ 'nodeLimit.c',
+ 'nodeLockRows.c',
+ 'nodeMaterial.c',
+ 'nodeMemoize.c',
+ 'nodeMergeAppend.c',
+ 'nodeMergejoin.c',
+ 'nodeModifyTable.c',
+ 'nodeNamedtuplestorescan.c',
+ 'nodeNestloop.c',
+ 'nodeProjectSet.c',
+ 'nodeRecursiveunion.c',
+ 'nodeResult.c',
+ 'nodeSamplescan.c',
+ 'nodeSeqscan.c',
+ 'nodeSetOp.c',
+ 'nodeSort.c',
+ 'nodeSubplan.c',
+ 'nodeSubqueryscan.c',
+ 'nodeTableFuncscan.c',
+ 'nodeTidrangescan.c',
+ 'nodeTidscan.c',
+ 'nodeUnique.c',
+ 'nodeValuesscan.c',
+ 'nodeWindowAgg.c',
+ 'nodeWorktablescan.c',
+ 'spi.c',
+ 'tqueue.c',
+ 'tstoreReceiver.c',
+)
diff --git a/src/backend/foreign/meson.build b/src/backend/foreign/meson.build
new file mode 100644
index 00000000000..57463db92c1
--- /dev/null
+++ b/src/backend/foreign/meson.build
@@ -0,0 +1,3 @@
+backend_sources += files(
+ 'foreign.c'
+)
diff --git a/src/backend/jit/llvm/meson.build b/src/backend/jit/llvm/meson.build
new file mode 100644
index 00000000000..83a90770bca
--- /dev/null
+++ b/src/backend/jit/llvm/meson.build
@@ -0,0 +1,41 @@
+if llvm.found()
+
+ llvmjit_sources = []
+
+ # Infrastructure
+ llvmjit_sources += files(
+ 'llvmjit.c',
+ 'llvmjit_error.cpp',
+ 'llvmjit_inline.cpp',
+ 'llvmjit_wrap.cpp',
+ )
+
+ # Code generation
+ llvmjit_sources += files(
+ 'llvmjit_deform.c',
+ 'llvmjit_expr.c',
+ )
+
+ llvmjit = shared_module('llvmjit',
+ llvmjit_sources,
+ kwargs: pg_mod_args + {
+ 'dependencies': pg_mod_args['dependencies'] + [llvm],
+ }
+ )
+
+ backend_targets += llvmjit
+
+ # Note this intentionally is not installed to bitcodedir, as it's not
+ # for inlining
+ llvmjit_types = custom_target('llvmjit_types.bc',
+ kwargs: llvm_irgen_kw + {
+ 'input': 'llvmjit_types.c',
+ 'output': 'llvmjit_types.bc',
+ 'depends': [postgres],
+ 'install': true,
+ 'install_dir': get_option('libdir')
+ }
+ )
+ backend_targets += llvmjit_types
+
+endif
diff --git a/src/backend/jit/meson.build b/src/backend/jit/meson.build
new file mode 100644
index 00000000000..63cd33a4bed
--- /dev/null
+++ b/src/backend/jit/meson.build
@@ -0,0 +1,3 @@
+backend_sources += files(
+ 'jit.c'
+)
diff --git a/src/backend/lib/meson.build b/src/backend/lib/meson.build
new file mode 100644
index 00000000000..53292563d34
--- /dev/null
+++ b/src/backend/lib/meson.build
@@ -0,0 +1,12 @@
+backend_sources += files(
+ 'binaryheap.c',
+ 'bipartite_match.c',
+ 'bloomfilter.c',
+ 'dshash.c',
+ 'hyperloglog.c',
+ 'ilist.c',
+ 'integerset.c',
+ 'knapsack.c',
+ 'pairingheap.c',
+ 'rbtree.c'
+)
diff --git a/src/backend/libpq/meson.build b/src/backend/libpq/meson.build
new file mode 100644
index 00000000000..49867647155
--- /dev/null
+++ b/src/backend/libpq/meson.build
@@ -0,0 +1,28 @@
+backend_sources += files(
+ 'auth-sasl.c',
+ 'auth-scram.c',
+ 'auth.c',
+ 'be-fsstubs.c',
+ 'be-secure-common.c',
+ 'be-secure.c',
+ 'crypt.c',
+ 'hba.c',
+ 'ifaddr.c',
+ 'pqcomm.c',
+ 'pqformat.c',
+ 'pqmq.c',
+ 'pqsignal.c',
+)
+
+if ssl.found()
+ backend_sources += files('be-secure-openssl.c')
+endif
+
+if gssapi.found()
+ backend_sources += files(
+ 'be-secure-gssapi.c',
+ 'be-gssapi-common.c'
+ )
+endif
+
+install_data('pg_hba.conf.sample', 'pg_ident.conf.sample', install_dir: 'share/')
diff --git a/src/backend/main/meson.build b/src/backend/main/meson.build
new file mode 100644
index 00000000000..241e125f089
--- /dev/null
+++ b/src/backend/main/meson.build
@@ -0,0 +1,2 @@
+main_file = files('main.c')
+backend_sources += main_file
diff --git a/src/backend/meson.build b/src/backend/meson.build
new file mode 100644
index 00000000000..b84f947bc7e
--- /dev/null
+++ b/src/backend/meson.build
@@ -0,0 +1,191 @@
+backend_build_deps = [backend_code]
+backend_deps = [dl_dep, thread_dep]
+backend_sources = []
+backend_link_with = [pgport_srv, common_srv]
+backend_c_args = []
+
+generated_backend_sources = []
+
+subdir('access')
+subdir('bootstrap')
+subdir('catalog')
+subdir('commands')
+subdir('executor')
+subdir('foreign')
+subdir('jit')
+subdir('lib')
+subdir('libpq')
+subdir('main')
+subdir('nodes')
+subdir('optimizer')
+subdir('parser')
+subdir('partitioning')
+subdir('port')
+subdir('postmaster')
+subdir('regex')
+subdir('replication')
+subdir('rewrite')
+subdir('statistics')
+subdir('storage')
+subdir('tcop')
+subdir('tsearch')
+subdir('utils')
+
+
+if host_machine.system() == 'windows'
+ backend_deps += cc.find_library('secur32', required: true)
+endif
+
+
+postgres_link_args = []
+postgres_link_depends = []
+
+# On Windows, when compiling with MSVC, we need to make postgres export all
+# its symbols so that extension libraries can use them. For that we need to
+# scan the constituent objects and generate a file declaring all the functions
+# as exported (variables need an "import" declaration in the header, hence
+# PGDLLIMPORT, but functions work without that, due to import libraries
+# basically being trampolines).
+#
+#
+# On meson there's currently no easy way to do this that I found. So we build
+# a static library with all the input objects, run our script to generate
+# exports, and build the final executable using that static library.
+#
+#
+# XXX: This needs to be improved.
+#
+
+# NB: There's an outer and an inner layer of == windows checks, so that most
+# of this can be exercised on !windows by widening the outer "layer".
+
+if cc.get_id() == 'msvc' # or true
+
+ postgres_lib = static_library('postgres_lib',
+ backend_sources + timezone_sources + generated_backend_sources,
+ link_whole: backend_link_with,
+ c_pch: '../include/pch/postgres_pch.h',
+ c_args: backend_c_args,
+ implicit_include_directories: false,
+ dependencies: backend_build_deps,
+ build_by_default: false,
+ install: false,
+ )
+
+ postgres_def = custom_target('postgres.def',
+ command: [perl, files('../tools/msvc/gendef2.pl'), 'x64', '@OUTPUT@', '@PRIVATE_DIR@', '@INPUT@'],
+ input: [postgres_lib, common_srv, pgport_srv],
+ output: 'postgres.def',
+ depends: [postgres_lib, common_srv, pgport_srv],
+ install: false,
+ )
+
+ if cc.get_id() == 'msvc'
+ postgres_link_args += '/DEF:@0@'.format(postgres_def.full_path())
+ postgres_link_args += '/STACK:@0@'.format(cdata.get('WIN32_STACK_RLIMIT'))
+ postgres_link_depends += postgres_def
+ endif
+
+ # Unfortunately the msvc linker whines when building an executable with just
+ # libraries, hence the reuse of the 'main' object directly.
+
+ postgres = executable('postgres',
+ objects: [postgres_lib.extract_objects(main_file)],
+ link_with: [postgres_lib],
+ link_args: postgres_link_args,
+ link_depends: postgres_link_depends,
+ dependencies: backend_deps,
+ export_dynamic: true,
+ implib: true,
+ kwargs: default_bin_args,
+ )
+
+else
+
+ postgres = executable('postgres',
+ backend_sources + generated_backend_sources + timezone_sources,
+ c_pch: '../include/pch/postgres_pch.h',
+ c_args: backend_c_args,
+ link_with: backend_link_with,
+ export_dynamic: true,
+ dependencies: [backend_build_deps, backend_deps],
+ kwargs: default_bin_args,
+ )
+
+endif
+
+backend_targets += postgres
+
+pg_mod_args = default_mod_args + {
+ 'dependencies': [backend_mod_code],
+ 'c_args': [],
+ 'cpp_args': [],
+ }
+
+if cdata.has('HAVE_VISIBILITY_ATTRIBUTE')
+ pg_mod_args = pg_mod_args + {
+ 'c_args': pg_mod_args['c_args'] + ['-fvisibility=hidden'],
+ 'cpp_args': pg_mod_args['cpp_args'] + ['-fvisibility=hidden', '-fvisibility-inlines-hidden'],
+ }
+endif
+
+# On Windows and macOS, shared modules link against the postgres binary. To
+# avoid unnecessary build-time dependencies on other operating systems, only
+# add that dependency where necessary.
+if host_machine.system() == 'windows' or host_machine.system() == 'darwin'
+ pg_mod_args = pg_mod_args + {'link_with': [postgres]}
+endif
+if host_machine.system() == 'darwin'
+ pg_mod_args = pg_mod_args + {'link_args': ['-bundle_loader', '@0@'.format(postgres.full_path())]}
+endif
+
+
+# Shared modules that, on some OSs, link against the server binary. Only enter
+# these after we defined the server build.
+
+subdir('jit/llvm')
+subdir('replication/libpqwalreceiver')
+subdir('replication/pgoutput')
+subdir('snowball')
+subdir('utils/mb/conversion_procs')
+
+
+if llvm.found()
+
+ # custom_target() insists on placing its output files in the current
+ # directory. But we have files with the same name in different
+ # subdirectories. generator()s don't have that problem, but their results
+ # are not installable. The irlink command copies the files for us.
+ #
+ # FIXME: this needs to be in a central place
+ #
+ # generator() and custom_target() don't accept a CustomTargetIndex as 'depends',
+ # nor do they like targets with more than one output. However, a custom target
+ # accepts them as input without a problem, hence the transitive target below :(
+
+ transitive_depend_target = custom_target('stamp',
+ input: generated_headers + generated_backend_headers + generated_backend_sources,
+ output: 'stamp',
+ command: [touch, '@OUTPUT@'],
+ install: false)
+
+ llvm_gen = generator(llvm_irgen_command,
+ arguments: llvm_irgen_args + g_c_args,
+ depends: transitive_depend_target,
+ depfile: '@BASENAME@.c.bc.d',
+ output: ['@PLAINNAME@.bc']
+ )
+
+ bc_backend_sources = llvm_gen.process(backend_sources,
+ preserve_path_from: meson.current_source_dir())
+
+ postgres_llvm = custom_target('postgres.index.bc',
+ kwargs: llvm_irlink_kw + {
+ 'input': bc_backend_sources,
+ 'output': ['bitcode'],
+ },
+ )
+
+ backend_targets += postgres_llvm
+
+endif
diff --git a/src/backend/nodes/meson.build b/src/backend/nodes/meson.build
new file mode 100644
index 00000000000..9fca83fba44
--- /dev/null
+++ b/src/backend/nodes/meson.build
@@ -0,0 +1,17 @@
+backend_sources += files(
+ 'bitmapset.c',
+ 'copyfuncs.c',
+ 'equalfuncs.c',
+ 'extensible.c',
+ 'list.c',
+ 'makefuncs.c',
+ 'nodeFuncs.c',
+ 'nodes.c',
+ 'outfuncs.c',
+ 'params.c',
+ 'print.c',
+ 'read.c',
+ 'readfuncs.c',
+ 'tidbitmap.c',
+ 'value.c',
+)
diff --git a/src/backend/optimizer/geqo/meson.build b/src/backend/optimizer/geqo/meson.build
new file mode 100644
index 00000000000..c04f1dc2dfd
--- /dev/null
+++ b/src/backend/optimizer/geqo/meson.build
@@ -0,0 +1,17 @@
+backend_sources += files(
+ 'geqo_copy.c',
+ 'geqo_cx.c',
+ 'geqo_erx.c',
+ 'geqo_eval.c',
+ 'geqo_main.c',
+ 'geqo_misc.c',
+ 'geqo_mutation.c',
+ 'geqo_ox1.c',
+ 'geqo_ox2.c',
+ 'geqo_pmx.c',
+ 'geqo_pool.c',
+ 'geqo_px.c',
+ 'geqo_random.c',
+ 'geqo_recombination.c',
+ 'geqo_selection.c',
+)
diff --git a/src/backend/optimizer/meson.build b/src/backend/optimizer/meson.build
new file mode 100644
index 00000000000..1ab1d9934ae
--- /dev/null
+++ b/src/backend/optimizer/meson.build
@@ -0,0 +1,5 @@
+subdir('geqo')
+subdir('path')
+subdir('plan')
+subdir('prep')
+subdir('util')
diff --git a/src/backend/optimizer/path/meson.build b/src/backend/optimizer/path/meson.build
new file mode 100644
index 00000000000..310042e7aee
--- /dev/null
+++ b/src/backend/optimizer/path/meson.build
@@ -0,0 +1,11 @@
+backend_sources += files(
+ 'allpaths.c',
+ 'clausesel.c',
+ 'costsize.c',
+ 'equivclass.c',
+ 'indxpath.c',
+ 'joinpath.c',
+ 'joinrels.c',
+ 'pathkeys.c',
+ 'tidpath.c',
+)
diff --git a/src/backend/optimizer/plan/meson.build b/src/backend/optimizer/plan/meson.build
new file mode 100644
index 00000000000..22ec65a3845
--- /dev/null
+++ b/src/backend/optimizer/plan/meson.build
@@ -0,0 +1,10 @@
+backend_sources += files(
+ 'analyzejoins.c',
+ 'createplan.c',
+ 'initsplan.c',
+ 'planagg.c',
+ 'planmain.c',
+ 'planner.c',
+ 'setrefs.c',
+ 'subselect.c',
+)
diff --git a/src/backend/optimizer/prep/meson.build b/src/backend/optimizer/prep/meson.build
new file mode 100644
index 00000000000..4549a5b0e79
--- /dev/null
+++ b/src/backend/optimizer/prep/meson.build
@@ -0,0 +1,7 @@
+backend_sources += files(
+ 'prepagg.c',
+ 'prepjointree.c',
+ 'prepqual.c',
+ 'preptlist.c',
+ 'prepunion.c',
+)
diff --git a/src/backend/optimizer/util/meson.build b/src/backend/optimizer/util/meson.build
new file mode 100644
index 00000000000..e7ceaf566b5
--- /dev/null
+++ b/src/backend/optimizer/util/meson.build
@@ -0,0 +1,16 @@
+backend_sources += files(
+ 'appendinfo.c',
+ 'clauses.c',
+ 'inherit.c',
+ 'joininfo.c',
+ 'orclauses.c',
+ 'paramassign.c',
+ 'pathnode.c',
+ 'placeholder.c',
+ 'plancat.c',
+ 'predtest.c',
+ 'relnode.c',
+ 'restrictinfo.c',
+ 'tlist.c',
+ 'var.c',
+)
diff --git a/src/backend/parser/meson.build b/src/backend/parser/meson.build
new file mode 100644
index 00000000000..491eacf20bb
--- /dev/null
+++ b/src/backend/parser/meson.build
@@ -0,0 +1,43 @@
+backend_sources += files(
+ 'analyze.c',
+ 'parse_agg.c',
+ 'parse_clause.c',
+ 'parse_coerce.c',
+ 'parse_collate.c',
+ 'parse_cte.c',
+ 'parse_enr.c',
+ 'parse_expr.c',
+ 'parse_func.c',
+ 'parse_node.c',
+ 'parse_oper.c',
+ 'parse_param.c',
+ 'parse_relation.c',
+ 'parse_target.c',
+ 'parse_type.c',
+ 'parse_utilcmd.c',
+ 'scansup.c',
+)
+
+# Build a small utility static lib for the parser. This makes it easier for
+# most of the other code (which depends on generated headers) to avoid
+# depending on gram.h already having been generated. The generation
+# of the parser is slow...
+
+parser_sources = [files('parser.c')]
+
+backend_scanner = custom_target('scan',
+ input: ['scan.l'],
+ output: ['scan.c'],
+ command: [flex, '-CF', '-p', '-p', '-o', '@OUTPUT@', '@INPUT0@'])
+parser_sources += backend_scanner[0]
+
+parser_sources += backend_parser_header[0]
+parser_sources += backend_parser_header[1]
+
+parser = static_library('parser',
+ parser_sources + generated_headers,
+ c_pch: '../../include/pch/c_pch.h',
+ dependencies: [backend_code],
+ kwargs: default_lib_args + {'install': false},
+)
+backend_link_with += parser
diff --git a/src/backend/partitioning/meson.build b/src/backend/partitioning/meson.build
new file mode 100644
index 00000000000..e5e3806a0cc
--- /dev/null
+++ b/src/backend/partitioning/meson.build
@@ -0,0 +1,5 @@
+backend_sources += files(
+ 'partbounds.c',
+ 'partdesc.c',
+ 'partprune.c',
+)
diff --git a/src/backend/port/meson.build b/src/backend/port/meson.build
new file mode 100644
index 00000000000..f1bf7f6d929
--- /dev/null
+++ b/src/backend/port/meson.build
@@ -0,0 +1,28 @@
+backend_sources += files(
+ 'atomics.c',
+)
+
+
+if cdata.has('USE_UNNAMED_POSIX_SEMAPHORES') or cdata.has('USE_NAMED_POSIX_SEMAPHORES')
+ backend_sources += files('posix_sema.c')
+endif
+
+if cdata.has('USE_SYSV_SEMAPHORES')
+ backend_sources += files('sysv_sema.c')
+endif
+
+if cdata.has('USE_WIN32_SEMAPHORES')
+ backend_sources += files('win32_sema.c')
+endif
+
+if cdata.has('USE_SYSV_SHARED_MEMORY')
+ backend_sources += files('sysv_shmem.c')
+endif
+
+if cdata.has('USE_WIN32_SHARED_MEMORY')
+ backend_sources += files('win32_shmem.c')
+endif
+
+if host_machine.system() == 'windows'
+ subdir('win32')
+endif
diff --git a/src/backend/port/win32/meson.build b/src/backend/port/win32/meson.build
new file mode 100644
index 00000000000..68fe4cc3cd0
--- /dev/null
+++ b/src/backend/port/win32/meson.build
@@ -0,0 +1,6 @@
+backend_sources += files(
+ 'crashdump.c',
+ 'signal.c',
+ 'socket.c',
+ 'timer.c',
+)
diff --git a/src/backend/postmaster/meson.build b/src/backend/postmaster/meson.build
new file mode 100644
index 00000000000..803405683e2
--- /dev/null
+++ b/src/backend/postmaster/meson.build
@@ -0,0 +1,15 @@
+backend_sources += files(
+ 'autovacuum.c',
+ 'auxprocess.c',
+ 'bgworker.c',
+ 'bgwriter.c',
+ 'checkpointer.c',
+ 'fork_process.c',
+ 'interrupt.c',
+ 'pgarch.c',
+ 'pgstat.c',
+ 'postmaster.c',
+ 'startup.c',
+ 'syslogger.c',
+ 'walwriter.c',
+)
diff --git a/src/backend/regex/meson.build b/src/backend/regex/meson.build
new file mode 100644
index 00000000000..d08e21cd6d6
--- /dev/null
+++ b/src/backend/regex/meson.build
@@ -0,0 +1,15 @@
+backend_sources += files(
+ 'regcomp.c',
+ 'regerror.c',
+ 'regexec.c',
+ 'regexport.c',
+ 'regfree.c',
+ 'regprefix.c'
+)
+
+#FIXME
+# mark inclusion dependencies between .c files explicitly
+#regcomp.o: regcomp.c regc_lex.c regc_color.c regc_nfa.c regc_cvec.c \
+# regc_locale.c regc_pg_locale.c
+#
+#regexec.o: regexec.c rege_dfa.c
diff --git a/src/backend/replication/libpqwalreceiver/meson.build b/src/backend/replication/libpqwalreceiver/meson.build
new file mode 100644
index 00000000000..3fc786c80a0
--- /dev/null
+++ b/src/backend/replication/libpqwalreceiver/meson.build
@@ -0,0 +1,13 @@
+libpqwalreceiver_sources = files(
+ 'libpqwalreceiver.c',
+)
+
+libpqwalreceiver = shared_module('pqwalreceiver',
+ libpqwalreceiver_sources,
+ kwargs: pg_mod_args + {
+ 'name_prefix': 'lib',
+ 'dependencies': pg_mod_args['dependencies'] + [libpq],
+ }
+)
+
+backend_targets += libpqwalreceiver
diff --git a/src/backend/replication/logical/meson.build b/src/backend/replication/logical/meson.build
new file mode 100644
index 00000000000..773583a12ba
--- /dev/null
+++ b/src/backend/replication/logical/meson.build
@@ -0,0 +1,14 @@
+backend_sources += files(
+ 'decode.c',
+ 'launcher.c',
+ 'logical.c',
+ 'logicalfuncs.c',
+ 'message.c',
+ 'origin.c',
+ 'proto.c',
+ 'relation.c',
+ 'reorderbuffer.c',
+ 'snapbuild.c',
+ 'tablesync.c',
+ 'worker.c',
+)
diff --git a/src/backend/replication/meson.build b/src/backend/replication/meson.build
new file mode 100644
index 00000000000..2573f166d79
--- /dev/null
+++ b/src/backend/replication/meson.build
@@ -0,0 +1,42 @@
+backend_sources += files(
+ 'backup_manifest.c',
+ 'basebackup.c',
+ 'slot.c',
+ 'slotfuncs.c',
+ 'syncrep.c',
+ 'walreceiver.c',
+ 'walreceiverfuncs.c',
+ 'walsender.c',
+)
+
+# [sync]repl_scanner is compiled as part of [sync]repl_gram. The
+# ordering is enforced by making the generation of the grammar depend on
+# the scanner generation. That's unnecessarily strict, but overall
+# harmless.
+
+repl_scanner = custom_target('repl_scanner',
+ input : files('repl_scanner.l'),
+ output : ['repl_scanner.c'],
+ command : [flex, '-CF', '-p', '-p', '-o', '@OUTPUT0@', '@INPUT@']
+)
+
+generated_backend_sources += custom_target('repl_gram',
+ input: 'repl_gram.y',
+ output: 'repl_gram.c',
+ depends: repl_scanner,
+ command: [bison, bisonflags, '-o', '@OUTPUT@', '@INPUT0@'])
+
+
+syncrep_scanner = custom_target('syncrep_scanner',
+ input: 'syncrep_scanner.l',
+ output: 'syncrep_scanner.c',
+ command: [flex, '-CF', '-p', '-p', '-o', '@OUTPUT0@', '@INPUT@'])
+
+generated_backend_sources += custom_target('syncrep_gram',
+ input: 'syncrep_gram.y',
+ output: 'syncrep_gram.c',
+ depends: syncrep_scanner,
+ command: [bison, bisonflags, '-o', '@OUTPUT@', '@INPUT0@'])
+
+
+subdir('logical')
diff --git a/src/backend/replication/pgoutput/meson.build b/src/backend/replication/pgoutput/meson.build
new file mode 100644
index 00000000000..8ff0a0c6133
--- /dev/null
+++ b/src/backend/replication/pgoutput/meson.build
@@ -0,0 +1,11 @@
+pgoutput_sources = files(
+ 'pgoutput.c',
+)
+
+pgoutput = shared_module('pgoutput',
+ pgoutput_sources,
+ kwargs: pg_mod_args + {
+ }
+)
+
+backend_targets += pgoutput
diff --git a/src/backend/rewrite/meson.build b/src/backend/rewrite/meson.build
new file mode 100644
index 00000000000..032e2e409b5
--- /dev/null
+++ b/src/backend/rewrite/meson.build
@@ -0,0 +1,9 @@
+backend_sources += files(
+ 'rewriteDefine.c',
+ 'rewriteHandler.c',
+ 'rewriteManip.c',
+ 'rewriteRemove.c',
+ 'rewriteSearchCycle.c',
+ 'rewriteSupport.c',
+ 'rowsecurity.c'
+)
diff --git a/src/backend/snowball/meson.build b/src/backend/snowball/meson.build
new file mode 100644
index 00000000000..b1e52e9a0c3
--- /dev/null
+++ b/src/backend/snowball/meson.build
@@ -0,0 +1,83 @@
+dict_snowball_sources = files(
+ 'dict_snowball.c',
+ 'libstemmer/api.c',
+ 'libstemmer/utilities.c',
+)
+
+dict_snowball_sources += files(
+ 'libstemmer/stem_ISO_8859_1_basque.c',
+ 'libstemmer/stem_ISO_8859_1_catalan.c',
+ 'libstemmer/stem_ISO_8859_1_danish.c',
+ 'libstemmer/stem_ISO_8859_1_dutch.c',
+ 'libstemmer/stem_ISO_8859_1_english.c',
+ 'libstemmer/stem_ISO_8859_1_finnish.c',
+ 'libstemmer/stem_ISO_8859_1_french.c',
+ 'libstemmer/stem_ISO_8859_1_german.c',
+ 'libstemmer/stem_ISO_8859_1_indonesian.c',
+ 'libstemmer/stem_ISO_8859_1_irish.c',
+ 'libstemmer/stem_ISO_8859_1_italian.c',
+ 'libstemmer/stem_ISO_8859_1_norwegian.c',
+ 'libstemmer/stem_ISO_8859_1_porter.c',
+ 'libstemmer/stem_ISO_8859_1_portuguese.c',
+ 'libstemmer/stem_ISO_8859_1_spanish.c',
+ 'libstemmer/stem_ISO_8859_1_swedish.c',
+ 'libstemmer/stem_ISO_8859_2_hungarian.c',
+ 'libstemmer/stem_ISO_8859_2_romanian.c',
+ 'libstemmer/stem_KOI8_R_russian.c',
+ 'libstemmer/stem_UTF_8_arabic.c',
+ 'libstemmer/stem_UTF_8_armenian.c',
+ 'libstemmer/stem_UTF_8_basque.c',
+ 'libstemmer/stem_UTF_8_catalan.c',
+ 'libstemmer/stem_UTF_8_danish.c',
+ 'libstemmer/stem_UTF_8_dutch.c',
+ 'libstemmer/stem_UTF_8_english.c',
+ 'libstemmer/stem_UTF_8_finnish.c',
+ 'libstemmer/stem_UTF_8_french.c',
+ 'libstemmer/stem_UTF_8_german.c',
+ 'libstemmer/stem_UTF_8_greek.c',
+ 'libstemmer/stem_UTF_8_hindi.c',
+ 'libstemmer/stem_UTF_8_hungarian.c',
+ 'libstemmer/stem_UTF_8_indonesian.c',
+ 'libstemmer/stem_UTF_8_irish.c',
+ 'libstemmer/stem_UTF_8_italian.c',
+ 'libstemmer/stem_UTF_8_lithuanian.c',
+ 'libstemmer/stem_UTF_8_nepali.c',
+ 'libstemmer/stem_UTF_8_norwegian.c',
+ 'libstemmer/stem_UTF_8_porter.c',
+ 'libstemmer/stem_UTF_8_portuguese.c',
+ 'libstemmer/stem_UTF_8_romanian.c',
+ 'libstemmer/stem_UTF_8_russian.c',
+ 'libstemmer/stem_UTF_8_serbian.c',
+ 'libstemmer/stem_UTF_8_spanish.c',
+ 'libstemmer/stem_UTF_8_swedish.c',
+ 'libstemmer/stem_UTF_8_tamil.c',
+ 'libstemmer/stem_UTF_8_turkish.c',
+ 'libstemmer/stem_UTF_8_yiddish.c',
+)
+
+# see comment in src/include/snowball/header.h
+stemmer_inc = include_directories('../../include/snowball')
+
+dict_snowball = shared_module('dict_snowball',
+ dict_snowball_sources,
+ c_pch: '../../include/pch/postgres_pch.h',
+ kwargs: pg_mod_args + {
+ 'include_directories': [stemmer_inc],
+ }
+)
+
+snowball_create = custom_target('snowball_create',
+ input: ['snowball_create.pl'],
+ output: ['snowball_create.sql'],
+ depfile: 'snowball_create.dep',
+ command: [perl, '@INPUT0@', '--input', '@CURRENT_SOURCE_DIR@', '--output', '@OUTDIR@'],
+ install: true,
+ install_dir: get_option('datadir'))
+
+# FIXME: check whether the language-selection logic currently in the Makefile is needed
+install_subdir('stopwords',
+ install_dir: get_option('datadir') / 'tsearch_data',
+ strip_directory: true)
+
+backend_targets += dict_snowball
+backend_targets += snowball_create
diff --git a/src/backend/statistics/meson.build b/src/backend/statistics/meson.build
new file mode 100644
index 00000000000..8530c55f73c
--- /dev/null
+++ b/src/backend/statistics/meson.build
@@ -0,0 +1,6 @@
+backend_sources += files(
+ 'dependencies.c',
+ 'extended_stats.c',
+ 'mcv.c',
+ 'mvdistinct.c',
+)
diff --git a/src/backend/storage/buffer/meson.build b/src/backend/storage/buffer/meson.build
new file mode 100644
index 00000000000..56a59b52484
--- /dev/null
+++ b/src/backend/storage/buffer/meson.build
@@ -0,0 +1,7 @@
+backend_sources += files(
+ 'buf_init.c',
+ 'buf_table.c',
+ 'bufmgr.c',
+ 'freelist.c',
+ 'localbuf.c',
+)
diff --git a/src/backend/storage/file/meson.build b/src/backend/storage/file/meson.build
new file mode 100644
index 00000000000..e1d5047d4aa
--- /dev/null
+++ b/src/backend/storage/file/meson.build
@@ -0,0 +1,8 @@
+backend_sources += files(
+ 'buffile.c',
+ 'copydir.c',
+ 'fd.c',
+ 'fileset.c',
+ 'reinit.c',
+ 'sharedfileset.c',
+)
diff --git a/src/backend/storage/freespace/meson.build b/src/backend/storage/freespace/meson.build
new file mode 100644
index 00000000000..e4200ea6527
--- /dev/null
+++ b/src/backend/storage/freespace/meson.build
@@ -0,0 +1,5 @@
+backend_sources += files(
+ 'freespace.c',
+ 'fsmpage.c',
+ 'indexfsm.c',
+)
diff --git a/src/backend/storage/ipc/meson.build b/src/backend/storage/ipc/meson.build
new file mode 100644
index 00000000000..516bc1d0193
--- /dev/null
+++ b/src/backend/storage/ipc/meson.build
@@ -0,0 +1,20 @@
+backend_sources += files(
+ 'barrier.c',
+ 'dsm.c',
+ 'dsm_impl.c',
+ 'ipc.c',
+ 'ipci.c',
+ 'latch.c',
+ 'pmsignal.c',
+ 'procarray.c',
+ 'procsignal.c',
+ 'shm_mq.c',
+ 'shm_toc.c',
+ 'shmem.c',
+ 'shmqueue.c',
+ 'signalfuncs.c',
+ 'sinval.c',
+ 'sinvaladt.c',
+ 'standby.c',
+
+)
diff --git a/src/backend/storage/large_object/meson.build b/src/backend/storage/large_object/meson.build
new file mode 100644
index 00000000000..8a181ab9b34
--- /dev/null
+++ b/src/backend/storage/large_object/meson.build
@@ -0,0 +1,3 @@
+backend_sources += files(
+ 'inv_api.c',
+)
diff --git a/src/backend/storage/lmgr/meson.build b/src/backend/storage/lmgr/meson.build
new file mode 100644
index 00000000000..938e7f89894
--- /dev/null
+++ b/src/backend/storage/lmgr/meson.build
@@ -0,0 +1,18 @@
+backend_sources += files(
+ 'condition_variable.c',
+ 'deadlock.c',
+ 'lmgr.c',
+ 'lock.c',
+ 'lwlock.c',
+ 'predicate.c',
+ 'proc.c',
+ 's_lock.c',
+ 'spin.c',
+)
+
+lwlocknames_backend = custom_target('lwlocknames',
+ input : files('lwlocknames.txt'),
+ output : ['lwlocknames.c', 'lwlocknames.h'],
+ command : [perl, files('generate-lwlocknames.pl'), '-o', '@OUTDIR@', '@INPUT@']
+)
+generated_backend_sources += lwlocknames_backend[0]
diff --git a/src/backend/storage/meson.build b/src/backend/storage/meson.build
new file mode 100644
index 00000000000..daad628d74c
--- /dev/null
+++ b/src/backend/storage/meson.build
@@ -0,0 +1,9 @@
+subdir('buffer')
+subdir('file')
+subdir('freespace')
+subdir('ipc')
+subdir('large_object')
+subdir('lmgr')
+subdir('page')
+subdir('smgr')
+subdir('sync')
diff --git a/src/backend/storage/page/meson.build b/src/backend/storage/page/meson.build
new file mode 100644
index 00000000000..2ecd16c952c
--- /dev/null
+++ b/src/backend/storage/page/meson.build
@@ -0,0 +1,5 @@
+backend_sources += files(
+ 'bufpage.c',
+ 'checksum.c',
+ 'itemptr.c',
+)
diff --git a/src/backend/storage/smgr/meson.build b/src/backend/storage/smgr/meson.build
new file mode 100644
index 00000000000..fdeb1223b32
--- /dev/null
+++ b/src/backend/storage/smgr/meson.build
@@ -0,0 +1,4 @@
+backend_sources += files(
+ 'md.c',
+ 'smgr.c',
+)
diff --git a/src/backend/storage/sync/meson.build b/src/backend/storage/sync/meson.build
new file mode 100644
index 00000000000..05148b91a8e
--- /dev/null
+++ b/src/backend/storage/sync/meson.build
@@ -0,0 +1,4 @@
+backend_sources += files(
+ 'sync.c',
+
+)
diff --git a/src/backend/tcop/meson.build b/src/backend/tcop/meson.build
new file mode 100644
index 00000000000..fb54aae8122
--- /dev/null
+++ b/src/backend/tcop/meson.build
@@ -0,0 +1,8 @@
+backend_sources += files(
+ 'cmdtag.c',
+ 'dest.c',
+ 'fastpath.c',
+ 'postgres.c',
+ 'pquery.c',
+ 'utility.c',
+)
diff --git a/src/backend/tsearch/meson.build b/src/backend/tsearch/meson.build
new file mode 100644
index 00000000000..460036b6d4c
--- /dev/null
+++ b/src/backend/tsearch/meson.build
@@ -0,0 +1,21 @@
+backend_sources += files(
+ 'dict.c',
+ 'dict_ispell.c',
+ 'dict_simple.c',
+ 'dict_synonym.c',
+ 'dict_thesaurus.c',
+ 'regis.c',
+ 'spell.c',
+ 'to_tsany.c',
+ 'ts_locale.c',
+ 'ts_parse.c',
+ 'ts_selfuncs.c',
+ 'ts_typanalyze.c',
+ 'ts_utils.c',
+ 'wparser.c',
+ 'wparser_def.c',
+)
+
+install_subdir('dicts',
+ install_dir: get_option('datadir') / 'tsearch_data',
+ strip_directory: true)
diff --git a/src/backend/utils/activity/meson.build b/src/backend/utils/activity/meson.build
new file mode 100644
index 00000000000..cef26eb564b
--- /dev/null
+++ b/src/backend/utils/activity/meson.build
@@ -0,0 +1,5 @@
+backend_sources += files(
+ 'backend_progress.c',
+ 'backend_status.c',
+ 'wait_event.c',
+)
diff --git a/src/backend/utils/adt/meson.build b/src/backend/utils/adt/meson.build
new file mode 100644
index 00000000000..086fde8ff09
--- /dev/null
+++ b/src/backend/utils/adt/meson.build
@@ -0,0 +1,118 @@
+backend_sources += files(
+ 'acl.c',
+ 'amutils.c',
+ 'array_expanded.c',
+ 'array_selfuncs.c',
+ 'array_typanalyze.c',
+ 'array_userfuncs.c',
+ 'arrayfuncs.c',
+ 'arraysubs.c',
+ 'arrayutils.c',
+ 'ascii.c',
+ 'bool.c',
+ 'cash.c',
+ 'char.c',
+ 'cryptohashfuncs.c',
+ 'date.c',
+ 'datetime.c',
+ 'datum.c',
+ 'dbsize.c',
+ 'domains.c',
+ 'encode.c',
+ 'enum.c',
+ 'expandeddatum.c',
+ 'expandedrecord.c',
+ 'float.c',
+ 'format_type.c',
+ 'formatting.c',
+ 'genfile.c',
+ 'geo_ops.c',
+ 'geo_selfuncs.c',
+ 'geo_spgist.c',
+ 'inet_cidr_ntop.c',
+ 'inet_net_pton.c',
+ 'int.c',
+ 'int8.c',
+ 'json.c',
+ 'jsonb.c',
+ 'jsonb_gin.c',
+ 'jsonb_op.c',
+ 'jsonb_util.c',
+ 'jsonfuncs.c',
+ 'jsonbsubs.c',
+ 'jsonpath.c',
+ 'jsonpath_exec.c',
+ 'like.c',
+ 'like_support.c',
+ 'lockfuncs.c',
+ 'mac.c',
+ 'mac8.c',
+ 'mcxtfuncs.c',
+ 'misc.c',
+ 'multirangetypes.c',
+ 'multirangetypes_selfuncs.c',
+ 'name.c',
+ 'network.c',
+ 'network_gist.c',
+ 'network_selfuncs.c',
+ 'network_spgist.c',
+ 'numeric.c',
+ 'numutils.c',
+ 'oid.c',
+ 'oracle_compat.c',
+ 'orderedsetaggs.c',
+ 'partitionfuncs.c',
+ 'pg_locale.c',
+ 'pg_lsn.c',
+ 'pg_upgrade_support.c',
+ 'pgstatfuncs.c',
+ 'pseudotypes.c',
+ 'quote.c',
+ 'rangetypes.c',
+ 'rangetypes_gist.c',
+ 'rangetypes_selfuncs.c',
+ 'rangetypes_spgist.c',
+ 'rangetypes_typanalyze.c',
+ 'regexp.c',
+ 'regproc.c',
+ 'ri_triggers.c',
+ 'rowtypes.c',
+ 'ruleutils.c',
+ 'selfuncs.c',
+ 'tid.c',
+ 'timestamp.c',
+ 'trigfuncs.c',
+ 'tsginidx.c',
+ 'tsgistidx.c',
+ 'tsquery.c',
+ 'tsquery_cleanup.c',
+ 'tsquery_gist.c',
+ 'tsquery_op.c',
+ 'tsquery_rewrite.c',
+ 'tsquery_util.c',
+ 'tsrank.c',
+ 'tsvector.c',
+ 'tsvector_op.c',
+ 'tsvector_parser.c',
+ 'uuid.c',
+ 'varbit.c',
+ 'varchar.c',
+ 'varlena.c',
+ 'version.c',
+ 'windowfuncs.c',
+ 'xid.c',
+ 'xid8funcs.c',
+ 'xml.c',
+)
+
+
+jsonpath_scan = custom_target('jsonpath_scan',
+ input: ['jsonpath_scan.l'],
+ output: ['jsonpath_scan.c'],
+ command: [flex, '-CF', '-p', '-p', '-o', '@OUTPUT@', '@INPUT@'])
+
+# jsonpath_scan is compiled as part of jsonpath_gram
+generated_backend_sources += custom_target('jsonpath_parse',
+ input: ['jsonpath_gram.y', jsonpath_scan[0]],
+ output: ['jsonpath_gram.c'],
+ command: [bison, bisonflags, '-o', '@OUTPUT@', '@INPUT0@'])
diff --git a/src/backend/utils/cache/meson.build b/src/backend/utils/cache/meson.build
new file mode 100644
index 00000000000..92972db52ad
--- /dev/null
+++ b/src/backend/utils/cache/meson.build
@@ -0,0 +1,16 @@
+backend_sources += files(
+ 'attoptcache.c',
+ 'catcache.c',
+ 'evtcache.c',
+ 'inval.c',
+ 'lsyscache.c',
+ 'partcache.c',
+ 'plancache.c',
+ 'relcache.c',
+ 'relfilenodemap.c',
+ 'relmapper.c',
+ 'spccache.c',
+ 'syscache.c',
+ 'ts_cache.c',
+ 'typcache.c',
+)
diff --git a/src/backend/utils/error/meson.build b/src/backend/utils/error/meson.build
new file mode 100644
index 00000000000..ff0ae388263
--- /dev/null
+++ b/src/backend/utils/error/meson.build
@@ -0,0 +1,4 @@
+backend_sources += files(
+ 'assert.c',
+ 'elog.c',
+)
diff --git a/src/backend/utils/fmgr/meson.build b/src/backend/utils/fmgr/meson.build
new file mode 100644
index 00000000000..e545b424fd2
--- /dev/null
+++ b/src/backend/utils/fmgr/meson.build
@@ -0,0 +1,8 @@
+backend_sources += files(
+ 'dfmgr.c',
+ 'fmgr.c',
+ 'funcapi.c',
+)
+
+# fmgrtab.c
+generated_backend_sources += fmgrtab_target[2]
diff --git a/src/backend/utils/hash/meson.build b/src/backend/utils/hash/meson.build
new file mode 100644
index 00000000000..242e2f0ecdf
--- /dev/null
+++ b/src/backend/utils/hash/meson.build
@@ -0,0 +1,4 @@
+backend_sources += files(
+ 'dynahash.c',
+ 'pg_crc.c'
+)
diff --git a/src/backend/utils/init/meson.build b/src/backend/utils/init/meson.build
new file mode 100644
index 00000000000..ec9d72c3df1
--- /dev/null
+++ b/src/backend/utils/init/meson.build
@@ -0,0 +1,4 @@
+backend_sources += files(
+ 'globals.c',
+ 'miscinit.c',
+ 'postinit.c')
diff --git a/src/backend/utils/mb/conversion_procs/meson.build b/src/backend/utils/mb/conversion_procs/meson.build
new file mode 100644
index 00000000000..b84a78b6318
--- /dev/null
+++ b/src/backend/utils/mb/conversion_procs/meson.build
@@ -0,0 +1,40 @@
+encodings = {
+ 'cyrillic_and_mic': ['cyrillic_and_mic/cyrillic_and_mic.c'],
+ 'euc2004_sjis2004': ['euc2004_sjis2004/euc2004_sjis2004.c'],
+ 'euc_cn_and_mic': ['euc_cn_and_mic/euc_cn_and_mic.c'],
+ 'euc_jp_and_sjis': ['euc_jp_and_sjis/euc_jp_and_sjis.c'],
+ 'euc_kr_and_mic': ['euc_kr_and_mic/euc_kr_and_mic.c'],
+ 'euc_tw_and_big5': [
+ 'euc_tw_and_big5/euc_tw_and_big5.c',
+ 'euc_tw_and_big5/big5.c',
+ ],
+ 'latin2_and_win1250': ['latin2_and_win1250/latin2_and_win1250.c'],
+ 'latin_and_mic': ['latin_and_mic/latin_and_mic.c'],
+ 'utf8_and_big5': ['utf8_and_big5/utf8_and_big5.c'],
+ 'utf8_and_cyrillic': ['utf8_and_cyrillic/utf8_and_cyrillic.c'],
+ 'utf8_and_euc2004': ['utf8_and_euc2004/utf8_and_euc2004.c'],
+ 'utf8_and_euc_cn': ['utf8_and_euc_cn/utf8_and_euc_cn.c'],
+ 'utf8_and_euc_jp': ['utf8_and_euc_jp/utf8_and_euc_jp.c'],
+ 'utf8_and_euc_kr': ['utf8_and_euc_kr/utf8_and_euc_kr.c'],
+ 'utf8_and_euc_tw': ['utf8_and_euc_tw/utf8_and_euc_tw.c'],
+ 'utf8_and_gb18030': ['utf8_and_gb18030/utf8_and_gb18030.c'],
+ 'utf8_and_gbk': ['utf8_and_gbk/utf8_and_gbk.c'],
+ 'utf8_and_iso8859': ['utf8_and_iso8859/utf8_and_iso8859.c'],
+ 'utf8_and_iso8859_1': ['utf8_and_iso8859_1/utf8_and_iso8859_1.c'],
+ 'utf8_and_johab': ['utf8_and_johab/utf8_and_johab.c'],
+ 'utf8_and_sjis': ['utf8_and_sjis/utf8_and_sjis.c'],
+ 'utf8_and_sjis2004': ['utf8_and_sjis2004/utf8_and_sjis2004.c'],
+ 'utf8_and_uhc': ['utf8_and_uhc/utf8_and_uhc.c'],
+ 'utf8_and_win': ['utf8_and_win/utf8_and_win.c'],
+}
+
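+# Each conversion library is built as its own separately loadable shared
+# module.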
+foreach encoding, sources : encodings
+ backend_targets += shared_module(encoding,
+ sources,
+ kwargs: pg_mod_args + {
+ }
+ )
+
+endforeach
diff --git a/src/backend/utils/mb/meson.build b/src/backend/utils/mb/meson.build
new file mode 100644
index 00000000000..39e45638db0
--- /dev/null
+++ b/src/backend/utils/mb/meson.build
@@ -0,0 +1,9 @@
+backend_sources += files(
+ 'conv.c',
+ 'mbutils.c',
+ 'stringinfo_mb.c',
+ 'wstrcmp.c',
+ 'wstrncmp.c',
+)
+
+# Note we only enter conversion_procs once the backend build is defined
diff --git a/src/backend/utils/meson.build b/src/backend/utils/meson.build
new file mode 100644
index 00000000000..afb1c0346ba
--- /dev/null
+++ b/src/backend/utils/meson.build
@@ -0,0 +1,13 @@
+subdir('activity')
+subdir('adt')
+subdir('cache')
+subdir('error')
+subdir('fmgr')
+subdir('hash')
+subdir('init')
+subdir('mb')
+subdir('misc')
+subdir('mmgr')
+subdir('resowner')
+subdir('sort')
+subdir('time')
diff --git a/src/backend/utils/misc/meson.build b/src/backend/utils/misc/meson.build
new file mode 100644
index 00000000000..5274c8aa1ae
--- /dev/null
+++ b/src/backend/utils/misc/meson.build
@@ -0,0 +1,28 @@
+backend_sources += files(
+ 'help_config.c',
+ 'pg_config.c',
+ 'pg_controldata.c',
+ 'pg_rusage.c',
+ 'ps_status.c',
+ 'queryenvironment.c',
+ 'queryjumble.c',
+ 'rls.c',
+ 'sampling.c',
+ 'superuser.c',
+ 'timeout.c',
+ 'tzparser.c',
+ 'guc.c',
+)
+
+# guc-file.c.h is compiled as part of guc.c
+guc_scan = custom_target('guc_scan',
+ input: ['guc-file.l'],
+ output: ['guc-file.c.h'],
+ command: [flex, '-CF', '-p', '-p', '-o', '@OUTPUT@', '@INPUT@'])
+
+generated_backend_sources += guc_scan
+
+backend_build_deps += declare_dependency(sources: [guc_scan],
+ include_directories: include_directories('.'))
+
+install_data('postgresql.conf.sample', install_dir: get_option('datadir'))
diff --git a/src/backend/utils/mmgr/meson.build b/src/backend/utils/mmgr/meson.build
new file mode 100644
index 00000000000..641bb181ba1
--- /dev/null
+++ b/src/backend/utils/mmgr/meson.build
@@ -0,0 +1,10 @@
+backend_sources += files(
+ 'aset.c',
+ 'dsa.c',
+ 'freepage.c',
+ 'generation.c',
+ 'mcxt.c',
+ 'memdebug.c',
+ 'portalmem.c',
+ 'slab.c',
+)
diff --git a/src/backend/utils/mmgr/proxy.c b/src/backend/utils/mmgr/proxy.c
new file mode 100644
index 00000000000..6e68fa9a557
--- /dev/null
+++ b/src/backend/utils/mmgr/proxy.c
@@ -0,0 +1,230 @@
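+/*
+ * proxy.c
+ *    A memory context implementation that owns no memory itself: every
+ *    allocation is forwarded to the nearest non-proxy ancestor context,
+ *    but kept on a list so the proxy can still be reset and deleted
+ *    independently of that ancestor.
+ */
+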
+#include "postgres.h"
+
+#include "utils/memdebug.h"
+#include "utils/memutils.h"
+#include "lib/ilist.h"
+
+typedef struct ProxyContext
+{
+ MemoryContextData header; /* Standard memory-context fields */
+ MemoryContext alloc_parent;
+ dlist_head allocations;
+} ProxyContext;
+
+typedef struct ProxyChunk
+{
+ dlist_node node;
+ void *context;
+} ProxyChunk;
+
+static void *ProxyContextAlloc(MemoryContext context, Size size);
+static void ProxyContextFree(MemoryContext context, void *pointer);
+static void *ProxyContextRealloc(MemoryContext context, void *pointer, Size size);
+static void ProxyContextReset(MemoryContext context);
+static void ProxyContextDelete(MemoryContext context);
+static Size ProxyContextGetChunkSpace(MemoryContext context, void *pointer);
+static bool ProxyContextIsEmpty(MemoryContext context);
+static void ProxyContextStats(MemoryContext context,
+ MemoryStatsPrintFunc printfunc, void *passthru,
+ MemoryContextCounters *totals);
+
+#ifdef MEMORY_CONTEXT_CHECKING
+static void ProxyContextCheck(MemoryContext context);
+#endif
+
+/*
+ * This is the virtual function table for ProxyContext contexts.
+ */
+static const MemoryContextMethods ProxyContextMethods = {
+ ProxyContextAlloc,
+ ProxyContextFree,
+ ProxyContextRealloc,
+ ProxyContextReset,
+ ProxyContextDelete,
+ ProxyContextGetChunkSpace,
+ ProxyContextIsEmpty,
+ ProxyContextStats
+#ifdef MEMORY_CONTEXT_CHECKING
+ ,ProxyContextCheck
+#endif
+};
+
+MemoryContext
+ProxyContextCreate(MemoryContext parent,
+ const char *name)
+{
+ ProxyContext *proxy;
+ MemoryContext alloc_parent;
+
+ proxy = (ProxyContext *) malloc(sizeof(ProxyContext));
+ if (proxy == NULL)
+ {
+ MemoryContextStats(TopMemoryContext);
+ ereport(ERROR,
+ (errcode(ERRCODE_OUT_OF_MEMORY),
+ errmsg("out of memory"),
+ errdetail("Failed while creating memory context \"%s\".",
+ name)));
+ }
+
+ /* find node we can actually allocate in */
+ alloc_parent = parent;
+ while (alloc_parent != NULL && IsA(alloc_parent, ProxyContext))
+ alloc_parent = alloc_parent->parent;
+ if (alloc_parent == NULL)
+ elog(ERROR, "can't proxy forever");
+ proxy->alloc_parent = alloc_parent;
+
+ dlist_init(&proxy->allocations);
+
+ /* Finally, do the type-independent part of context creation */
+ MemoryContextCreate((MemoryContext) proxy,
+ T_ProxyContext,
+ &ProxyContextMethods,
+ parent,
+ name);
+
+ return (MemoryContext) proxy;
+}
+
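+/*
+ * Allocate from the underlying context, prefixing the allocation with a
+ * MAXALIGN'd ProxyChunk header that links it into the proxy's list; the
+ * caller gets the address just past that header.
+ */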
+static void *
+ProxyContextAlloc(MemoryContext context, Size size)
+{
+ ProxyContext *proxy = castNode(ProxyContext, context);
+ ProxyChunk *chunk;
+ Size alloc_size;
+
+ alloc_size = size + MAXALIGN(sizeof(ProxyChunk));
+
+ chunk = MemoryContextAlloc(proxy->alloc_parent, alloc_size);
+
+ dlist_push_tail(&proxy->allocations, &chunk->node);
+ chunk->context = context;
+
+ return (char *) chunk + MAXALIGN(sizeof(ProxyChunk));
+}
+
+static void
+ProxyContextFree(MemoryContext context, void *pointer)
+{
+ ProxyChunk *chunk = (ProxyChunk *) ((char *) pointer - MAXALIGN(sizeof(ProxyChunk)));
+
+ dlist_delete(&chunk->node);
+
+ pfree(chunk);
+}
+
+static void *
+ProxyContextRealloc(MemoryContext context, void *pointer, Size size)
+{
+ ProxyContext *proxy = castNode(ProxyContext, context);
+ ProxyChunk *chunk = (ProxyChunk *) ((char *) pointer - MAXALIGN(sizeof(ProxyChunk)));
+
+ dlist_delete(&chunk->node);
+
+ chunk = repalloc((char *) chunk, size + MAXALIGN(sizeof(ProxyChunk)));
+
+ dlist_push_tail(&proxy->allocations, &chunk->node);
+ chunk->context = context;
+
+ return (char *) chunk + MAXALIGN(sizeof(ProxyChunk));
+}
+
+static void
+ProxyContextReset(MemoryContext context)
+{
+ ProxyContext *proxy = castNode(ProxyContext, context);
+ dlist_mutable_iter iter;
+
+ dlist_foreach_modify(iter, &proxy->allocations)
+ {
+ ProxyChunk *ptr = dlist_container(ProxyChunk, node, iter.cur);
+
+ dlist_delete(iter.cur);
+ pfree(ptr);
+ }
+}
+
+static void
+ProxyContextDelete(MemoryContext context)
+{
+ ProxyContextReset(context);
+ free(context);
+}
+
+static Size
+ProxyContextGetChunkSpace(MemoryContext context, void *pointer)
+{
+ ProxyChunk *chunk = (ProxyChunk *) ((char *) pointer - MAXALIGN(sizeof(ProxyChunk)));
+
+ return GetMemoryChunkSpace(chunk);
+}
+
+static bool
+ProxyContextIsEmpty(MemoryContext context)
+{
+ ProxyContext *proxy = castNode(ProxyContext, context);
+
+ return dlist_is_empty(&proxy->allocations);
+}
+
+static void
+ProxyContextStats(MemoryContext context,
+ MemoryStatsPrintFunc printfunc, void *passthru,
+ MemoryContextCounters *totals)
+{
+ ProxyContext *proxy = castNode(ProxyContext, context);
+ Size nblocks = 0;
+ Size freechunks = 0;
+ Size totalspace = 0;
+ Size freespace = 0;
+ Size chunks = 0;
+ dlist_iter iter;
+
+ dlist_foreach(iter, &proxy->allocations)
+ {
+ ProxyChunk *chunk = dlist_container(ProxyChunk, node, iter.cur);
+
+ totalspace += GetMemoryChunkSpace((void *) chunk);
+ chunks++;
+ }
+
+ if (printfunc)
+ {
+ char stats_string[200];
+
+ snprintf(stats_string, sizeof(stats_string),
+ "%zu chunks proxied to parent, totaling %zu bytes",
+ chunks, totalspace);
+ printfunc(context, passthru, stats_string);
+ }
+
+ if (totals)
+ {
+ totals->nblocks += nblocks;
+ totals->freechunks += freechunks;
+ totals->totalspace += totalspace;
+ totals->freespace += freespace;
+ }
+}
+
+#ifdef MEMORY_CONTEXT_CHECKING
+static void
+ProxyContextCheck(MemoryContext context)
+{
+ /* FIXME */
+}
+#endif
diff --git a/src/backend/utils/resowner/meson.build b/src/backend/utils/resowner/meson.build
new file mode 100644
index 00000000000..d30891ca027
--- /dev/null
+++ b/src/backend/utils/resowner/meson.build
@@ -0,0 +1,3 @@
+backend_sources += files(
+ 'resowner.c'
+)
diff --git a/src/backend/utils/sort/meson.build b/src/backend/utils/sort/meson.build
new file mode 100644
index 00000000000..b626bdc9d96
--- /dev/null
+++ b/src/backend/utils/sort/meson.build
@@ -0,0 +1,7 @@
+backend_sources += files(
+ 'logtape.c',
+ 'sharedtuplestore.c',
+ 'sortsupport.c',
+ 'tuplesort.c',
+ 'tuplestore.c',
+)
diff --git a/src/backend/utils/time/meson.build b/src/backend/utils/time/meson.build
new file mode 100644
index 00000000000..6fff8792bb0
--- /dev/null
+++ b/src/backend/utils/time/meson.build
@@ -0,0 +1,4 @@
+backend_sources += files(
+ 'combocid.c',
+ 'snapmgr.c',
+)
diff --git a/src/bin/initdb/meson.build b/src/bin/initdb/meson.build
new file mode 100644
index 00000000000..52f679e3116
--- /dev/null
+++ b/src/bin/initdb/meson.build
@@ -0,0 +1,24 @@
+initdb_sources = files(
+ 'findtimezone.c',
+ 'initdb.c'
+)
+
+initdb_sources += timezone_localtime_source
+
+# FIXME: reimplement libpq_pgport logic
+
+executable('initdb',
+ initdb_sources,
+ include_directories: [timezone_inc],
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'initdb',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_initdb.pl'
+ ]
+}
diff --git a/src/bin/meson.build b/src/bin/meson.build
new file mode 100644
index 00000000000..3718fd0759a
--- /dev/null
+++ b/src/bin/meson.build
@@ -0,0 +1,20 @@
+subdir('initdb')
+subdir('pg_amcheck')
+subdir('pg_archivecleanup')
+subdir('pg_basebackup')
+subdir('pg_checksums')
+subdir('pg_config')
+subdir('pg_controldata')
+subdir('pg_ctl')
+subdir('pg_dump')
+subdir('pg_resetwal')
+subdir('pg_rewind')
+subdir('pg_test_fsync')
+subdir('pg_test_timing')
+subdir('pg_upgrade')
+subdir('pg_verifybackup')
+subdir('pg_waldump')
+subdir('pgbench')
+# TODO: subdir('pgevent')
+subdir('psql')
+subdir('scripts')
diff --git a/src/bin/pg_amcheck/meson.build b/src/bin/pg_amcheck/meson.build
new file mode 100644
index 00000000000..69eaef8f141
--- /dev/null
+++ b/src/bin/pg_amcheck/meson.build
@@ -0,0 +1,22 @@
+pg_amcheck_sources = files(
+ 'pg_amcheck.c'
+)
+
+pg_amcheck = executable('pg_amcheck',
+ pg_amcheck_sources,
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_amcheck',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_basic.pl',
+ 't/002_nonesuch.pl',
+ 't/003_check.pl',
+ 't/004_verify_heapam.pl',
+ 't/005_opclass_damage.pl',
+ ]
+}
diff --git a/src/bin/pg_archivecleanup/meson.build b/src/bin/pg_archivecleanup/meson.build
new file mode 100644
index 00000000000..27742fafab7
--- /dev/null
+++ b/src/bin/pg_archivecleanup/meson.build
@@ -0,0 +1,14 @@
+pg_archivecleanup = executable('pg_archivecleanup',
+ ['pg_archivecleanup.c'],
+ dependencies: [frontend_code],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_archivecleanup',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/010_pg_archivecleanup.pl'
+ ]
+}
diff --git a/src/bin/pg_basebackup/meson.build b/src/bin/pg_basebackup/meson.build
new file mode 100644
index 00000000000..a629b8b02f5
--- /dev/null
+++ b/src/bin/pg_basebackup/meson.build
@@ -0,0 +1,44 @@
+common_sources = files(
+ 'receivelog.c',
+ 'streamutil.c',
+ 'walmethods.c',
+)
+
+pg_basebackup_common = static_library('pg_basebackup_common',
+ common_sources,
+ dependencies: [frontend_code, libpq, zlib],
+ kwargs: internal_lib_args,
+)
+
+executable('pg_basebackup',
+ 'pg_basebackup.c',
+ link_with: [pg_basebackup_common],
+ dependencies: [frontend_code, libpq, zlib],
+ kwargs: default_bin_args,
+)
+
+executable('pg_receivewal',
+ 'pg_receivewal.c',
+ link_with: [pg_basebackup_common],
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+)
+
+executable('pg_recvlogical',
+ 'pg_recvlogical.c',
+ link_with: [pg_basebackup_common],
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_basebackup',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'env': {'GZIP_PROGRAM': gzip.path(), 'TAR': tar.path()},
+ 'tests': [
+ 't/010_pg_basebackup.pl',
+ 't/020_pg_receivewal.pl',
+ 't/030_pg_recvlogical.pl',
+ ]
+}
diff --git a/src/bin/pg_checksums/meson.build b/src/bin/pg_checksums/meson.build
new file mode 100644
index 00000000000..bbf9582b904
--- /dev/null
+++ b/src/bin/pg_checksums/meson.build
@@ -0,0 +1,16 @@
+executable('pg_checksums',
+ ['pg_checksums.c'],
+ include_directories: [timezone_inc],
+ dependencies: [frontend_code],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_checksums',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_basic.pl',
+ 't/002_actions.pl'
+ ]
+}
diff --git a/src/bin/pg_config/meson.build b/src/bin/pg_config/meson.build
new file mode 100644
index 00000000000..df0eb13f636
--- /dev/null
+++ b/src/bin/pg_config/meson.build
@@ -0,0 +1,14 @@
+executable('pg_config',
+ ['pg_config.c'],
+ dependencies: [frontend_code],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_config',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_pg_config.pl',
+ ]
+}
diff --git a/src/bin/pg_controldata/meson.build b/src/bin/pg_controldata/meson.build
new file mode 100644
index 00000000000..fa6057afa54
--- /dev/null
+++ b/src/bin/pg_controldata/meson.build
@@ -0,0 +1,14 @@
+executable('pg_controldata',
+ ['pg_controldata.c'],
+ dependencies: [frontend_code],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_controldata',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_pg_controldata.pl'
+ ]
+}
diff --git a/src/bin/pg_ctl/meson.build b/src/bin/pg_ctl/meson.build
new file mode 100644
index 00000000000..ac0d4f18192
--- /dev/null
+++ b/src/bin/pg_ctl/meson.build
@@ -0,0 +1,17 @@
+executable('pg_ctl',
+ ['pg_ctl.c'],
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_ctl',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_start_stop.pl',
+ 't/002_status.pl',
+ 't/003_promote.pl',
+ 't/004_logrotate.pl'
+ ]
+}
diff --git a/src/bin/pg_dump/meson.build b/src/bin/pg_dump/meson.build
new file mode 100644
index 00000000000..ce5ef11eaeb
--- /dev/null
+++ b/src/bin/pg_dump/meson.build
@@ -0,0 +1,69 @@
+pg_dump_common_sources = files(
+ 'compress_io.c',
+ 'dumputils.c',
+ 'parallel.c',
+ 'pg_backup_archiver.c',
+ 'pg_backup_custom.c',
+ 'pg_backup_db.c',
+ 'pg_backup_directory.c',
+ 'pg_backup_null.c',
+ 'pg_backup_tar.c',
+ 'pg_backup_utils.c',
+)
+
+pg_dump_common = static_library('pg_dump_common',
+ pg_dump_common_sources,
+ c_pch: '../../include/pch/c_pch.h',
+ dependencies: [frontend_code, libpq, zlib],
+ kwargs: internal_lib_args,
+)
+
+pg_dump_sources = files(
+ 'pg_dump.c',
+ 'common.c',
+ 'pg_dump_sort.c',
+)
+
+executable('pg_dump',
+ pg_dump_sources,
+ link_with: [pg_dump_common],
+ dependencies: [frontend_code, libpq, zlib],
+ kwargs: default_bin_args,
+)
+
+
+pg_dumpall_sources = files(
+ 'pg_dumpall.c',
+)
+
+executable('pg_dumpall',
+ pg_dumpall_sources,
+ link_with: [pg_dump_common],
+ dependencies: [frontend_code, libpq, zlib],
+ kwargs: default_bin_args,
+)
+
+
+pg_restore_sources = files(
+ 'pg_restore.c',
+)
+
+executable('pg_restore',
+ pg_restore_sources,
+ link_with: [pg_dump_common],
+ dependencies: [frontend_code, libpq, zlib],
+ kwargs: default_bin_args,
+)
+
+
+tap_tests += {
+ 'name': 'pg_dump',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_basic.pl',
+ 't/002_pg_dump.pl',
+ 't/003_pg_dump_with_server.pl',
+ 't/010_dump_connstr.pl',
+ ]
+}
diff --git a/src/bin/pg_resetwal/meson.build b/src/bin/pg_resetwal/meson.build
new file mode 100644
index 00000000000..7450c0f6432
--- /dev/null
+++ b/src/bin/pg_resetwal/meson.build
@@ -0,0 +1,15 @@
+executable('pg_resetwal',
+ files('pg_resetwal.c'),
+ dependencies: [frontend_code],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_resetwal',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_basic.pl',
+ 't/002_corrupted.pl'
+ ]
+}
diff --git a/src/bin/pg_rewind/meson.build b/src/bin/pg_rewind/meson.build
new file mode 100644
index 00000000000..c7c59e9e523
--- /dev/null
+++ b/src/bin/pg_rewind/meson.build
@@ -0,0 +1,34 @@
+pg_rewind_sources = files(
+ 'datapagemap.c',
+ 'file_ops.c',
+ 'filemap.c',
+ 'libpq_source.c',
+ 'local_source.c',
+ 'parsexlog.c',
+ 'pg_rewind.c',
+ 'timeline.c',
+ '../../backend/access/transam/xlogreader.c',
+)
+
+pg_rewind = executable('pg_rewind',
+ pg_rewind_sources,
+ dependencies: [frontend_code, libpq, lz4],
+ kwargs: default_bin_args,
+)
+
+
+tap_tests += {
+ 'name': 'pg_rewind',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_basic.pl',
+ 't/002_databases.pl',
+ 't/003_extrafiles.pl',
+ 't/004_pg_xlog_symlink.pl',
+ 't/005_same_timeline.pl',
+ 't/006_options.pl',
+ 't/007_standby_source.pl',
+ 't/008_min_recovery_point.pl',
+ ]
+}
diff --git a/src/bin/pg_test_fsync/meson.build b/src/bin/pg_test_fsync/meson.build
new file mode 100644
index 00000000000..527be88d125
--- /dev/null
+++ b/src/bin/pg_test_fsync/meson.build
@@ -0,0 +1,14 @@
+executable('pg_test_fsync',
+ ['pg_test_fsync.c'],
+ dependencies: [frontend_code],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_test_fsync',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_basic.pl',
+ ]
+}
diff --git a/src/bin/pg_test_timing/meson.build b/src/bin/pg_test_timing/meson.build
new file mode 100644
index 00000000000..c74577df493
--- /dev/null
+++ b/src/bin/pg_test_timing/meson.build
@@ -0,0 +1,14 @@
+pg_test_timing = executable('pg_test_timing',
+ ['pg_test_timing.c'],
+ dependencies: [frontend_code],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_test_timing',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_basic.pl'
+ ]
+}
diff --git a/src/bin/pg_upgrade/meson.build b/src/bin/pg_upgrade/meson.build
new file mode 100644
index 00000000000..88d0e03446b
--- /dev/null
+++ b/src/bin/pg_upgrade/meson.build
@@ -0,0 +1,26 @@
+pg_upgrade_sources = files(
+ 'check.c',
+ 'controldata.c',
+ 'dump.c',
+ 'exec.c',
+ 'file.c',
+ 'function.c',
+ 'info.c',
+ 'option.c',
+ 'parallel.c',
+ 'pg_upgrade.c',
+ 'relfilenode.c',
+ 'server.c',
+ 'tablespace.c',
+ 'util.c',
+ 'version.c',
+)
+
+pg_upgrade = executable('pg_upgrade',
+ pg_upgrade_sources,
+ c_pch: '../../include/pch/c_pch.h',
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+)
+
+# FIXME: add test
diff --git a/src/bin/pg_verifybackup/meson.build b/src/bin/pg_verifybackup/meson.build
new file mode 100644
index 00000000000..c7039ddcc49
--- /dev/null
+++ b/src/bin/pg_verifybackup/meson.build
@@ -0,0 +1,25 @@
+pg_verifybackup_sources = files(
+ 'parse_manifest.c',
+ 'pg_verifybackup.c'
+)
+
+pg_verifybackup = executable('pg_verifybackup',
+ pg_verifybackup_sources,
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_verifybackup',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_basic.pl',
+ 't/002_algorithm.pl',
+ 't/003_corruption.pl',
+ 't/004_options.pl',
+ 't/005_bad_manifest.pl',
+ 't/006_encoding.pl',
+ 't/007_wal.pl',
+ ]
+}
diff --git a/src/bin/pg_waldump/meson.build b/src/bin/pg_waldump/meson.build
new file mode 100644
index 00000000000..f89139f89f5
--- /dev/null
+++ b/src/bin/pg_waldump/meson.build
@@ -0,0 +1,23 @@
+pg_waldump_sources = files(
+ 'compat.c',
+ 'pg_waldump.c',
+ 'rmgrdesc.c',
+)
+
+pg_waldump_sources += rmgr_desc_sources
+pg_waldump_sources += xlogreader_sources
+
+pg_waldump = executable('pg_waldump',
+ pg_waldump_sources,
+ dependencies: [frontend_code, lz4],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_waldump',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_basic.pl',
+ ]
+}
diff --git a/src/bin/pgbench/meson.build b/src/bin/pgbench/meson.build
new file mode 100644
index 00000000000..5c4a778ff32
--- /dev/null
+++ b/src/bin/pgbench/meson.build
@@ -0,0 +1,38 @@
+pgbench_sources = files(
+ 'pgbench.c',
+)
+
+# exprscan is compiled as part of exprparse. The ordering is enforced by
+# making the generation of the grammar depend on the scanner generation.
+# That's unnecessarily strict, but overall harmless.
+
+exprscan = custom_target('exprscan',
+ input : files('exprscan.l'),
+ output : ['exprscan.c'],
+ command : [flex, '-CF', '-p', '-p', '-o', '@OUTPUT0@', '@INPUT@']
+)
+
+exprparse = custom_target('exprparse',
+ input: 'exprparse.y',
+ output: 'exprparse.c',
+ depends: exprscan,
+ command: [bison, bisonflags, '-o', '@OUTPUT@', '@INPUT0@'])
+pgbench_sources += exprparse
+
+executable('pgbench',
+ pgbench_sources,
+ dependencies: [frontend_code, libpq, thread_dep],
+ include_directories: include_directories('.'),
+ kwargs: default_bin_args,
+)
+
+
+tap_tests += {
+ 'name': 'pgbench',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_pgbench_with_server.pl',
+ 't/002_pgbench_no_server.pl'
+ ]
+}
diff --git a/src/bin/pgevent/meson.build b/src/bin/pgevent/meson.build
new file mode 100644
index 00000000000..9b4642a69d0
--- /dev/null
+++ b/src/bin/pgevent/meson.build
@@ -0,0 +1 @@
+# FIXME: implement when I can test
diff --git a/src/bin/psql/meson.build b/src/bin/psql/meson.build
new file mode 100644
index 00000000000..98921f801d7
--- /dev/null
+++ b/src/bin/psql/meson.build
@@ -0,0 +1,48 @@
+psql_sources = files(
+ 'command.c',
+ 'common.c',
+ 'copy.c',
+ 'crosstabview.c',
+ 'describe.c',
+ 'help.c',
+ 'input.c',
+ 'large_obj.c',
+ 'mainloop.c',
+ 'prompt.c',
+ 'startup.c',
+ 'stringutils.c',
+ 'tab-complete.c',
+ 'variables.c',
+)
+
+psql_sources += custom_target('psqlscanslash',
+ input: ['psqlscanslash.l'],
+ output: ['psqlscanslash.c'],
+ command: [flex, '-CFe', '-p', '-p', '-o', '@OUTPUT@', '@INPUT@'])
+
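+# sql_help.c and sql_help.h are generated from the SQL command reference
+# pages in doc/src/sgml/ref.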
+psql_sources += custom_target('psql_help',
+ input: ['create_help.pl'],
+ output: ['sql_help.c', 'sql_help.h'],
+ depfile: 'sql_help.dep',
+ command: [perl, '@INPUT0@', '@SOURCE_ROOT@/doc/src/sgml/ref', '@OUTDIR@', 'sql_help'])
+
+executable('psql',
+ psql_sources,
+ c_pch: '../../include/pch/c_pch.h',
+ include_directories: include_directories('.'),
+ dependencies : [frontend_code, libpq, readline],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'psql',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'env': {'with_readline': readline.found() ? 'yes' : 'no'},
+ 'tests': [
+ 't/010_tab_completion.pl',
+ 't/020_cancel.pl',
+ ],
+}
diff --git a/src/bin/scripts/meson.build b/src/bin/scripts/meson.build
new file mode 100644
index 00000000000..547a53500a4
--- /dev/null
+++ b/src/bin/scripts/meson.build
@@ -0,0 +1,46 @@
+scripts_common = static_library('scripts_common',
+ files('common.c'),
+ dependencies: [frontend_code, libpq],
+ kwargs: internal_lib_args,
+)
+
+binaries = [
+ 'createdb',
+ 'dropdb',
+ 'createuser',
+ 'dropuser',
+ 'clusterdb',
+ 'vacuumdb',
+ 'reindexdb',
+ 'pg_isready',
+]
+
+foreach binary : binaries
+ executable(binary,
+ files(binary + '.c'),
+ link_with: [scripts_common],
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+ )
+endforeach
+
+tap_tests += {
+ 'name': 'scripts',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/010_clusterdb.pl',
+ 't/011_clusterdb_all.pl',
+ 't/020_createdb.pl',
+ 't/040_createuser.pl',
+ 't/050_dropdb.pl',
+ 't/070_dropuser.pl',
+ 't/080_pg_isready.pl',
+ 't/090_reindexdb.pl',
+ 't/091_reindexdb_all.pl',
+ 't/100_vacuumdb.pl',
+ 't/101_vacuumdb_all.pl',
+ 't/102_vacuumdb_stages.pl',
+ 't/200_connstr.pl',
+ ]
+}
diff --git a/src/common/meson.build b/src/common/meson.build
new file mode 100644
index 00000000000..6be70b0c5b3
--- /dev/null
+++ b/src/common/meson.build
@@ -0,0 +1,140 @@
+common_sources = files(
+ 'archive.c',
+ 'base64.c',
+ 'checksum_helper.c',
+ 'config_info.c',
+ 'controldata_utils.c',
+ 'encnames.c',
+ 'exec.c',
+ 'file_perm.c',
+ 'file_utils.c',
+ 'hashfn.c',
+ 'ip.c',
+ 'jsonapi.c',
+ 'keywords.c',
+ 'kwlookup.c',
+ 'link-canary.c',
+ 'md5_common.c',
+ 'pg_get_line.c',
+ 'pg_lzcompress.c',
+ 'pgfnames.c',
+ 'psprintf.c',
+ 'relpath.c',
+ 'rmtree.c',
+ 'saslprep.c',
+ 'scram-common.c',
+ 'string.c',
+ 'stringinfo.c',
+ 'unicode_norm.c',
+ 'username.c',
+ 'wait_error.c',
+ 'wchar.c',
+)
+
+# FIXME: implement openssl
+if ssl.found()
+ common_sources += files(
+ 'cryptohash_openssl.c',
+ 'hmac_openssl.c',
+ 'protocol_openssl.c',
+ )
+else
+ common_sources += files(
+ 'cryptohash.c',
+ 'hmac.c',
+ 'md5.c',
+ 'sha1.c',
+ 'sha2.c',
+ )
+endif
+
+common_sources += custom_target('kwlist',
+ input: files('../include/parser/kwlist.h'),
+ output: 'kwlist_d.h',
+ command: [perl, '-I', '@SOURCE_ROOT@/src/tools', files('../tools/gen_keywordlist.pl'),
+ '--extern', '--output', '@OUTDIR@', '@INPUT@'])
+
+
+# The code imported from Ryu gets a pass on declaration-after-statement,
+# in order to keep it more closely aligned with its upstream.
+ryu_sources = files(
+ 'd2s.c',
+ 'f2s.c',
+)
+ryu_cflags = []
+
+if using_declaration_after_statement_warning
+ ryu_cflags += ['-Wno-declaration-after-statement']
+endif
+
+ryu_srv = static_library('ryu_srv',
+ ryu_sources,
+ dependencies: [backend_port_code],
+ c_args: ryu_cflags,
+ kwargs: internal_lib_args
+)
+
+ryu_shlib = static_library('ryu_shlib',
+ ryu_sources,
+ dependencies: [frontend_port_code],
+ c_args: ryu_cflags,
+ pic: true,
+ kwargs: internal_lib_args
+)
+
+ryu_static = static_library('ryu_static',
+ ryu_sources,
+ dependencies: [frontend_port_code],
+ c_args: ryu_cflags,
+ kwargs: internal_lib_args
+)
+
+
+# A few files are currently only built for frontend, not server
+# (Mkvcbuild.pm has a copy of this list, too). logging.c is excluded
+# from OBJS_FRONTEND_SHLIB (shared library) as a matter of policy,
+# because it is not appropriate for general purpose libraries such
+# as libpq to report errors directly.
+
+common_sources_frontend_shlib = common_sources
+common_sources_frontend_shlib += files(
+ 'fe_memutils.c',
+ 'restricted_token.c',
+ 'sprompt.c',
+)
+
+common_sources_frontend_static = common_sources_frontend_shlib
+common_sources_frontend_static += files(
+ 'logging.c',
+)
+
+# XXX: in most environments we could probably link_whole pgcommon_shlib
+# against pgcommon_static, instead of compiling twice.
+
+common_srv = static_library('pgcommon_srv',
+ common_sources,
+ c_pch: '../include/pch/c_pch.h',
+ link_with: [ryu_srv],
+ dependencies: [backend_common_code, ssl],
+ include_directories: include_directories('.'),
+ kwargs: internal_lib_args,
+)
+
+common_shlib = static_library('pgcommon_shlib',
+ common_sources_frontend_shlib,
+ c_pch: '../include/pch/c_pch.h',
+ pic: true,
+ link_with: [ryu_shlib],
+ dependencies: [frontend_common_code, ssl],
+ include_directories: include_directories('.'),
+ kwargs: default_lib_args,
+)
+
+common_static = static_library('pgcommon_static',
+ common_sources_frontend_static,
+ c_pch: '../include/pch/c_pch.h',
+ link_with: [ryu_static],
+ dependencies: [frontend_common_code, ssl],
+ include_directories: include_directories('.'),
+ kwargs: default_lib_args,
+)
diff --git a/src/fe_utils/meson.build b/src/fe_utils/meson.build
new file mode 100644
index 00000000000..b305727d967
--- /dev/null
+++ b/src/fe_utils/meson.build
@@ -0,0 +1,27 @@
+fe_utils_sources = files(
+ 'archive.c',
+ 'cancel.c',
+ 'conditional.c',
+ 'connect_utils.c',
+ 'mbprint.c',
+ 'option_utils.c',
+ 'parallel_slot.c',
+ 'print.c',
+ 'query_utils.c',
+ 'recovery_gen.c',
+ 'simple_list.c',
+ 'string_utils.c',
+)
+
+fe_utils_sources += custom_target('psqlscan',
+ input: ['psqlscan.l'],
+ output: ['psqlscan.c'],
+ command: [flex, '-Cfe', '-p', '-p', '-o', '@OUTPUT@', '@INPUT@'])
+
+fe_utils = static_library('fe_utils',
+ fe_utils_sources + generated_headers,
+ c_pch: '../include/pch/c_pch.h',
+ include_directories : [postgres_inc, libpq_inc],
+ c_args: ['-DFRONTEND'],
+ kwargs: default_lib_args,
+)
diff --git a/src/include/catalog/meson.build b/src/include/catalog/meson.build
new file mode 100644
index 00000000000..a86ac67eada
--- /dev/null
+++ b/src/include/catalog/meson.build
@@ -0,0 +1,116 @@
+catalog_headers = [
+ 'pg_proc.h',
+ 'pg_type.h',
+ 'pg_attribute.h',
+ 'pg_class.h',
+ 'pg_attrdef.h',
+ 'pg_constraint.h',
+ 'pg_inherits.h',
+ 'pg_index.h',
+ 'pg_operator.h',
+ 'pg_opfamily.h',
+ 'pg_opclass.h',
+ 'pg_am.h',
+ 'pg_amop.h',
+ 'pg_amproc.h',
+ 'pg_language.h',
+ 'pg_largeobject_metadata.h',
+ 'pg_largeobject.h',
+ 'pg_aggregate.h',
+ 'pg_statistic.h',
+ 'pg_statistic_ext.h',
+ 'pg_statistic_ext_data.h',
+ 'pg_rewrite.h',
+ 'pg_trigger.h',
+ 'pg_event_trigger.h',
+ 'pg_description.h',
+ 'pg_cast.h',
+ 'pg_enum.h',
+ 'pg_namespace.h',
+ 'pg_conversion.h',
+ 'pg_depend.h',
+ 'pg_database.h',
+ 'pg_db_role_setting.h',
+ 'pg_tablespace.h',
+ 'pg_authid.h',
+ 'pg_auth_members.h',
+ 'pg_shdepend.h',
+ 'pg_shdescription.h',
+ 'pg_ts_config.h',
+ 'pg_ts_config_map.h',
+ 'pg_ts_dict.h',
+ 'pg_ts_parser.h',
+ 'pg_ts_template.h',
+ 'pg_extension.h',
+ 'pg_foreign_data_wrapper.h',
+ 'pg_foreign_server.h',
+ 'pg_user_mapping.h',
+ 'pg_foreign_table.h',
+ 'pg_policy.h',
+ 'pg_replication_origin.h',
+ 'pg_default_acl.h',
+ 'pg_init_privs.h',
+ 'pg_seclabel.h',
+ 'pg_shseclabel.h',
+ 'pg_collation.h',
+ 'pg_partitioned_table.h',
+ 'pg_range.h',
+ 'pg_transform.h',
+ 'pg_sequence.h',
+ 'pg_publication.h',
+ 'pg_publication_rel.h',
+ 'pg_subscription.h',
+ 'pg_subscription_rel.h',
+]
+
+bki_data = files(
+ 'pg_aggregate.dat',
+ 'pg_am.dat',
+ 'pg_amop.dat',
+ 'pg_amproc.dat',
+ 'pg_authid.dat',
+ 'pg_cast.dat',
+ 'pg_class.dat',
+ 'pg_collation.dat',
+ 'pg_conversion.dat',
+ 'pg_database.dat',
+ 'pg_language.dat',
+ 'pg_namespace.dat',
+ 'pg_opclass.dat',
+ 'pg_operator.dat',
+ 'pg_opfamily.dat',
+ 'pg_proc.dat',
+ 'pg_range.dat',
+ 'pg_tablespace.dat',
+ 'pg_ts_config.dat',
+ 'pg_ts_config_map.dat',
+ 'pg_ts_dict.dat',
+ 'pg_ts_parser.dat',
+ 'pg_ts_template.dat',
+ 'pg_type.dat',
+ )
+
+
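+# genbki.pl produces postgres.bki, schemapg.h, system_fk_info.h and
+# system_constraints.sql, plus one foo_d.h header per catalog header; build
+# the matching input / output / install-dir lists for one custom_target.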
+input = []
+output_files = ['postgres.bki', 'schemapg.h', 'system_fk_info.h', 'system_constraints.sql']
+output_install = [get_option('datadir'), get_option('includedir'), get_option('includedir'), get_option('datadir')]
+
+foreach h : catalog_headers
+ fname = h.split('.h')[0]+'_d.h'
+ input += files(h)
+ output_files += fname
+ output_install += get_option('includedir')
+endforeach
+
+generated_headers += custom_target('generated_catalog_headers',
+ input: input,
+ depend_files: bki_data,
+ build_by_default: true,
+ install: true,
+ output: output_files,
+ install_dir: output_install,
+ command: [perl, files('../../backend/catalog/genbki.pl'), '--include-path=@SOURCE_ROOT@/src/include', '--set-version='+pg_version_major.to_string(), '--output=@OUTDIR@', '@INPUT@']
+ )
diff --git a/src/include/meson.build b/src/include/meson.build
new file mode 100644
index 00000000000..c3af4a2574f
--- /dev/null
+++ b/src/include/meson.build
@@ -0,0 +1,50 @@
+configure_file(input : 'pg_config_ext.h.meson',
+ output : 'pg_config_ext.h',
+ configuration : cdata)
+
+system = host_machine.system()
+if system == 'windows'
+ system = 'win32'
+endif
+
+configure_file(
+ output : 'pg_config_os.h',
+ input: files('port/@0@.h'.format(system)),
+ install: true,
+ install_dir : get_option('includedir'),
+ copy : true)
+
+configure_file(
+ output : 'pg_config.h',
+ install : true,
+ install_dir : get_option('includedir'),
+ configuration : cdata)
+
+
+config_paths_data = configuration_data()
+config_paths_data.set_quoted('PGBINDIR', get_option('prefix') / get_option('bindir'))
+config_paths_data.set_quoted('PGSHAREDIR', get_option('prefix') / get_option('datadir'))
+config_paths_data.set_quoted('SYSCONFDIR', get_option('prefix') / get_option('sysconfdir'))
+config_paths_data.set_quoted('INCLUDEDIR', get_option('prefix') / get_option('includedir'))
+# FIXME: shouldn't be the same
+config_paths_data.set_quoted('PKGINCLUDEDIR', get_option('prefix') / get_option('includedir'))
+config_paths_data.set_quoted('INCLUDEDIRSERVER', get_option('prefix') / get_option('includedir'))
+config_paths_data.set_quoted('LIBDIR', get_option('prefix') / get_option('libdir'))
+# FIXME: figure out logic for pkglibdir
+config_paths_data.set_quoted('PKGLIBDIR', get_option('prefix') / get_option('libdir'))
+config_paths_data.set_quoted('LOCALEDIR', get_option('prefix') / get_option('localedir'))
+config_paths_data.set_quoted('DOCDIR', get_option('prefix') / get_option('datadir') / 'doc/postgresql')
+config_paths_data.set_quoted('HTMLDIR', get_option('prefix') / get_option('datadir') / 'doc/postgresql')
+config_paths_data.set_quoted('MANDIR', get_option('prefix') / get_option('datadir') / 'doc/postgresql')
+
+configure_file(
+ output: 'pg_config_paths.h',
+ configuration: config_paths_data,
+ install: false
+)
+
+
+subdir('utils')
+subdir('storage')
+subdir('catalog')
+subdir('parser')
diff --git a/src/include/parser/meson.build b/src/include/parser/meson.build
new file mode 100644
index 00000000000..caf4c092909
--- /dev/null
+++ b/src/include/parser/meson.build
@@ -0,0 +1,10 @@
+backend_parser_header = custom_target('gram',
+ input: [files('../../backend/parser/gram.y')],
+ output: ['gram.c', 'gram.h'],
+ command: [bison, bisonflags, '-d', '-o', '@OUTPUT0@', '@INPUT0@'],
+ install: true,
+ # Only install gram.h, not gram.c
+ install_dir: [false, get_option('includedir')]
+)
+
+#generated_backend_headers += backend_parser_header[1]
diff --git a/src/include/pch/c_pch.h b/src/include/pch/c_pch.h
new file mode 100644
index 00000000000..f40c757ca62
--- /dev/null
+++ b/src/include/pch/c_pch.h
@@ -0,0 +1 @@
+#include "c.h"
diff --git a/src/include/pch/postgres_pch.h b/src/include/pch/postgres_pch.h
new file mode 100644
index 00000000000..71b2f35f76b
--- /dev/null
+++ b/src/include/pch/postgres_pch.h
@@ -0,0 +1 @@
+#include "postgres.h"
diff --git a/src/include/pg_config_ext.h.meson b/src/include/pg_config_ext.h.meson
new file mode 100644
index 00000000000..57cdfca0cfd
--- /dev/null
+++ b/src/include/pg_config_ext.h.meson
@@ -0,0 +1,7 @@
+/*
+ * src/include/pg_config_ext.h.in. This is generated manually, not by
+ * autoheader, since we want to limit which symbols get defined here.
+ */
+
+/* Define to the name of a signed 64-bit integer type. */
+#mesondefine PG_INT64_TYPE
diff --git a/src/include/storage/meson.build b/src/include/storage/meson.build
new file mode 100644
index 00000000000..ce749169442
--- /dev/null
+++ b/src/include/storage/meson.build
@@ -0,0 +1,15 @@
+# FIXME: this creates an unnecessary lwlocknames.c - but it's not
+# obvious how to avoid that: meson insists on output files being in the
+# current dir.
+
+# FIXME: this leads to lwlocknames.c being installed. Bad.
+lwlocknames = custom_target('lwlocknames',
+ input : files('../../backend/storage/lmgr/lwlocknames.txt'),
+ output : ['lwlocknames.h', 'lwlocknames.c'],
+ command : [perl, files('../../backend/storage/lmgr/generate-lwlocknames.pl'), '-o', '@OUTDIR@', '@INPUT@'],
+ build_by_default: true,
+ install: true,
+ install_dir: get_option('includedir')
+)
+
+generated_backend_headers += lwlocknames[0]
diff --git a/src/include/utils/meson.build b/src/include/utils/meson.build
new file mode 100644
index 00000000000..b9c959b474d
--- /dev/null
+++ b/src/include/utils/meson.build
@@ -0,0 +1,22 @@
+errcodes = custom_target('errcodes',
+ input : files('../../backend/utils/errcodes.txt'),
+ output : ['errcodes.h'],
+ command : [perl, files('../../backend/utils/generate-errcodes.pl'), '@INPUT@', '@OUTPUT@']
+)
+generated_headers += errcodes
+
+generated_backend_headers += custom_target('probes.d',
+ input: files('../../backend/utils/probes.d'),
+ output : 'probes.h',
+ capture: true,
+ command : [sed, '-f', files('../../backend/utils/Gen_dummy_probes.sed'), '@INPUT@']
+)
+
+fmgrtab_target = custom_target('fmgrtab',
+ input: '../catalog/pg_proc.dat',
+ output : ['fmgroids.h', 'fmgrprotos.h', 'fmgrtab.c'],
+ command: [perl, '-I', '@SOURCE_ROOT@/src/backend/catalog/', files('../../backend/utils/Gen_fmgrtab.pl'), '--include-path=@SOURCE_ROOT@/src/include', '--output=@OUTDIR@', '@INPUT@']
+)
+
+generated_backend_headers += fmgrtab_target[0]
+generated_backend_headers += fmgrtab_target[1]
diff --git a/src/interfaces/libpq/meson.build b/src/interfaces/libpq/meson.build
new file mode 100644
index 00000000000..4b716f0e89d
--- /dev/null
+++ b/src/interfaces/libpq/meson.build
@@ -0,0 +1,102 @@
+libpq_sources = files(
+ 'fe-auth-scram.c',
+ 'fe-connect.c',
+ 'fe-exec.c',
+ 'fe-lobj.c',
+ 'fe-misc.c',
+ 'fe-print.c',
+ 'fe-protocol3.c',
+ 'fe-secure.c',
+ 'fe-trace.c',
+ 'legacy-pqsignal.c',
+ 'libpq-events.c',
+ 'pqexpbuffer.c',
+ 'fe-auth.c',
+)
+
+if host_machine.system() == 'windows'
+ libpq_sources += files('win32.c', 'pthread-win32.c')
+endif
+
+if ssl.found()
+ libpq_sources += files('fe-secure-common.c')
+ libpq_sources += files('fe-secure-openssl.c')
+endif
+
+if gssapi.found()
+ libpq_sources += files(
+ 'fe-secure-gssapi.c',
+ 'fe-gssapi-common.c'
+ )
+endif
+
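+# exports.txt is turned into either an export / version script (darwin,
+# gnu) or a .def file (windows), limiting which symbols the shared library
+# exports.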
+export_file = custom_target('libpq_exports.list',
+ input: [files('exports.txt')],
+ output: ['@BASENAME@.list'],
+ command: [perl, files('../../tools/gen_versioning_script.pl'),
+ host_machine.system() == 'darwin' ? 'darwin' : 'gnu',
+ '@INPUT0@', '@OUTPUT0@'],
+ build_by_default: false,
+ install: false,
+)
+
+libpq_def = custom_target('libpq.def',
+ command: [perl, files('../../tools/msvc/export2def.pl'), '@OUTPUT@', '@INPUT0@', 'libpq'],
+ input: files('exports.txt'),
+ output: 'libpq.def',
+ build_by_default: false,
+ install: false,
+)
+
+# port needs to be in include path due to pthread-win32.h
+libpq_inc = include_directories('.', '../../port')
+libpq_deps = [frontend_shlib_code, thread_dep, ssl, ldap, gssapi]
+libpq_link_depends = []
+
+libpq_kwargs = default_lib_args + {
+ 'version': '5.'+pg_version_major.to_string(),
+}
+
+
+if host_machine.system() == 'darwin'
+ libpq_kwargs = libpq_kwargs + {
+ 'link_args': ['-exported_symbols_list', export_file.full_path()],
+ 'link_depends': export_file,
+ 'soversion': '5',
+ }
+elif host_machine.system() == 'windows'
+ libpq_deps += cc.find_library('secur32', required: true)
+
+ libpq_kwargs = libpq_kwargs + {
+ 'vs_module_defs': libpq_def,
+ 'soversion': '',
+ }
+else
+ libpq_kwargs = libpq_kwargs + {
+ 'link_args': '-Wl,--version-script=' + export_file.full_path(),
+ 'link_depends': export_file,
+ 'soversion': '5',
+ }
+endif
+
+libpq_so = shared_library('pq',
+ libpq_sources,
+ include_directories : [libpq_inc, postgres_inc],
+ c_args: ['-DFRONTEND'],
+ c_pch: '../../include/pch/c_pch.h',
+ dependencies: libpq_deps,
+ kwargs: libpq_kwargs,
+)
+
+libpq = declare_dependency(
+ link_with: [libpq_so],
+ include_directories: [include_directories('.')]
+)
+
+install_headers('libpq-fe.h', 'libpq-events.h')
+# FIXME: adjust path
+install_headers('libpq-int.h', 'pqexpbuffer.h')
+install_data('pg_service.conf.sample', install_dir: get_option('datadir'))
diff --git a/src/meson.build b/src/meson.build
new file mode 100644
index 00000000000..414be1db419
--- /dev/null
+++ b/src/meson.build
@@ -0,0 +1,10 @@
+# Build the libraries that other subsystems might depend upon first, in
+# their respective dependency order.
+
+subdir('timezone')
+
+subdir('backend')
+
+subdir('bin')
+
+subdir('pl')
diff --git a/src/pl/meson.build b/src/pl/meson.build
new file mode 100644
index 00000000000..b720e922093
--- /dev/null
+++ b/src/pl/meson.build
@@ -0,0 +1,4 @@
+subdir('plpgsql')
+
+subdir('plperl')
+subdir('plpython')
diff --git a/src/pl/plperl/meson.build b/src/pl/plperl/meson.build
new file mode 100644
index 00000000000..a5a994e845f
--- /dev/null
+++ b/src/pl/plperl/meson.build
@@ -0,0 +1,81 @@
+if not perl_dep.found()
+ subdir_done()
+endif
+
+plperl_sources = files(
+ 'plperl.c',
+)
+
+subppdir = run_command(perl, '-e', 'use List::Util qw(first); print first { -r "$_/ExtUtils/xsubpp" } @INC',
+ check: true).stdout()
+xsubpp = '@0@/ExtUtils/xsubpp'.format(subppdir)
+typemap = '@0@/ExtUtils/typemap'.format(subppdir)
+
+plperl_sources += custom_target('perlchunks.h',
+ input: files('plc_perlboot.pl', 'plc_trusted.pl'),
+ output: 'perlchunks.h',
+ capture: true,
+ command: [perl, files('text2macro.pl'), '--strip=^(\#.*|\s*)$', '@INPUT@']
+)
+
+plperl_sources += custom_target('plperl_opmask.h',
+ input: files('plperl_opmask.pl'),
+ output: 'plperl_opmask.h',
+ command: [perl, '@INPUT@', '@OUTPUT@']
+)
+
+foreach n : ['SPI', 'Util']
+ xs = files(n+'.xs')
+ xs_c_name = n+'.c'
+
+ # FIXME: the -output option is only available since perl 5.9.3 - but
+ # that's probably a fine minimum requirement?
+ xs_c = custom_target(xs_c_name,
+ input: xs,
+ output: xs_c_name,
+ command: [perl, xsubpp, '-typemap', typemap, '-output', '@OUTPUT@', '@INPUT@']
+ )
+ plperl_sources += xs_c
+endforeach
+
+plperl_inc = include_directories('.')
+shared_module('plperl',
+ plperl_sources,
+ c_pch: '../../include/pch/postgres_pch.h',
+ include_directories: [plperl_inc, postgres_inc],
+ kwargs: pg_mod_args + {
+ 'dependencies': [perl_dep, pg_mod_args['dependencies']],
+ },
+)
+
+install_data(
+ 'plperl.control',
+ 'plperl--1.0.sql',
+ install_dir: get_option('datadir') / 'extension'
+)
+
+install_data(
+ 'plperlu.control',
+ 'plperlu--1.0.sql',
+ install_dir: get_option('datadir') / 'extension'
+)
+
+regress_tests += {
+ 'name': 'plperl',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'plperl_setup',
+ 'plperl',
+ 'plperl_lc',
+ 'plperl_trigger',
+ 'plperl_shared',
+ 'plperl_elog',
+ 'plperl_util',
+ 'plperl_init',
+ 'plperlu',
+ 'plperl_array',
+ 'plperl_call',
+ 'plperl_transaction',
+ ],
+}
diff --git a/src/pl/plpgsql/meson.build b/src/pl/plpgsql/meson.build
new file mode 100644
index 00000000000..9537275d67c
--- /dev/null
+++ b/src/pl/plpgsql/meson.build
@@ -0,0 +1 @@
+subdir('src')
diff --git a/src/pl/plpgsql/src/meson.build b/src/pl/plpgsql/src/meson.build
new file mode 100644
index 00000000000..b040e5e8507
--- /dev/null
+++ b/src/pl/plpgsql/src/meson.build
@@ -0,0 +1,67 @@
+plpgsql_sources = files(
+ 'pl_comp.c',
+ 'pl_exec.c',
+ 'pl_funcs.c',
+ 'pl_handler.c',
+ 'pl_scanner.c',
+)
+
+plpgsql_sources += custom_target('gram',
+ input: ['pl_gram.y'],
+ output: ['pl_gram.c', 'pl_gram.h'],
+ command: [bison, bisonflags, '-d', '-o', '@OUTPUT0@', '@INPUT0@'])
+
+gen_plerrcodes = files('generate-plerrcodes.pl')
+plpgsql_sources += custom_target('plerrcodes',
+ input: ['../../../../src/backend/utils/errcodes.txt'],
+ output: ['plerrcodes.h'],
+ command: [perl, gen_plerrcodes, '@INPUT0@'],
+ capture: true)
+
+gen_keywordlist = files('../../../../src/tools/gen_keywordlist.pl')
+plpgsql_sources += custom_target('pl_reserved_kwlist',
+ input: ['pl_reserved_kwlist.h'],
+ output: ['pl_reserved_kwlist_d.h'],
+ command: [perl, '-I', '@SOURCE_ROOT@/src/tools', gen_keywordlist, '--output', '@OUTDIR@', '--varname', 'ReservedPLKeywords', '@INPUT@']
+)
+
+plpgsql_sources += custom_target('pl_unreserved_kwlist',
+ input: ['pl_unreserved_kwlist.h'],
+ output: ['pl_unreserved_kwlist_d.h'],
+ command: [perl, '-I', '@SOURCE_ROOT@/src/tools', gen_keywordlist, '--output', '@OUTDIR@', '--varname', 'UnreservedPLKeywords', '@INPUT@']
+)
+
+shared_module('plpgsql',
+ plpgsql_sources,
+ c_pch: '../../../include/pch/postgres_pch.h',
+ include_directories: include_directories('.'),
+ kwargs: pg_mod_args,
+)
+
+install_data('plpgsql.control', 'plpgsql--1.0.sql',
+ install_dir: get_option('datadir') / 'extension'
+)
+
+install_headers('plpgsql.h',
+ install_dir: get_option('includedir') / 'server')
+
+
+regress_tests += {
+ 'name': 'plpgsql',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'plpgsql_array',
+ 'plpgsql_call',
+ 'plpgsql_control',
+ 'plpgsql_copy',
+ 'plpgsql_domain',
+ 'plpgsql_record',
+ 'plpgsql_cache',
+ 'plpgsql_simple',
+ 'plpgsql_transaction',
+ 'plpgsql_trap',
+ 'plpgsql_trigger',
+ 'plpgsql_varprops',
+ ],
+}
diff --git a/src/pl/plpython/expected/meson.build b/src/pl/plpython/expected/meson.build
new file mode 100644
index 00000000000..4172ced2208
--- /dev/null
+++ b/src/pl/plpython/expected/meson.build
@@ -0,0 +1,14 @@
+# FIXME: adding the variant files like this is an abysmal hack on top of an abysmal hack
+foreach r2 : plpython_regress + ['plpython_error_5', 'plpython_types_3']
+ # string.replace() is only available in meson >= 0.58
+ r3 = 'plpython3' + r2.split('plpython')[1]
+
+ s2 = '@0@.out'.format(r2)
+ s3 = '@0@.out'.format(r3)
+ plpython3_test_deps += custom_target(s3,
+ input: s2,
+ output: s3,
+ capture: true,
+ command: plpython_regress_cmd,
+ )
+endforeach
diff --git a/src/pl/plpython/meson.build b/src/pl/plpython/meson.build
new file mode 100644
index 00000000000..14fe8bd7c7f
--- /dev/null
+++ b/src/pl/plpython/meson.build
@@ -0,0 +1,100 @@
+if not python3.found()
+ subdir_done()
+endif
+
+plpython_sources = files(
+ 'plpy_cursorobject.c',
+ 'plpy_elog.c',
+ 'plpy_exec.c',
+ 'plpy_main.c',
+ 'plpy_planobject.c',
+ 'plpy_plpymodule.c',
+ 'plpy_procedure.c',
+ 'plpy_resultobject.c',
+ 'plpy_spi.c',
+ 'plpy_subxactobject.c',
+ 'plpy_typeio.c',
+ 'plpy_util.c',
+)
+
+plpython_sources += custom_target('spiexceptions.h',
+ input: files('../../backend/utils/errcodes.txt'),
+ output: 'spiexceptions.h',
+ command: [perl, files('generate-spiexceptions.pl'), '@INPUT@'],
+ capture: true
+ )
+
+
+# FIXME: need to duplicate import library ugliness?
+plpython_inc = include_directories('.')
+
+shared_module('plpython3',
+ plpython_sources,
+ c_pch: '../../include/pch/postgres_pch.h',
+ include_directories: [plpython_inc, postgres_inc],
+ kwargs: pg_mod_args + {
+ 'dependencies': [python3, pg_mod_args['dependencies']],
+ },
+)
+
+# FIXME: Only install the relevant versions
+install_data(
+ 'plpythonu.control',
+ 'plpython2u.control',
+ 'plpython3u.control',
+ 'plpythonu--1.0.sql',
+ 'plpython2u--1.0.sql',
+ 'plpython3u--1.0.sql',
+ install_dir: get_option('datadir') / 'extension'
+)
+
+# FIXME: Think about python2
+
+
+plpython_regress_cmd = [
+ sed,
+ '-f', files('regress-python3-mangle.sed'),
+ '@INPUT0@'
+]
+
+plpython_regress = [
+ 'plpython_schema',
+ 'plpython_populate',
+ 'plpython_test',
+ 'plpython_do',
+ 'plpython_global',
+ 'plpython_import',
+ 'plpython_spi',
+ 'plpython_newline',
+ 'plpython_void',
+ 'plpython_call',
+ 'plpython_params',
+ 'plpython_setof',
+ 'plpython_record',
+ 'plpython_trigger',
+ 'plpython_types',
+ 'plpython_error',
+ 'plpython_ereport',
+ 'plpython_unicode',
+ 'plpython_quote',
+ 'plpython_composite',
+ 'plpython_subtransaction',
+ 'plpython_transaction',
+ 'plpython_drop',
+]
+
+plpython3_regress = []
+plpython3_test_deps = []
+
+# FIXME: this is an abysmal hack
+subdir('sql')
+subdir('expected')
+
+regress_tests += {
+ 'name': 'plpython',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': plpython3_regress,
+ 'deps': plpython3_test_deps,
+ 'regress_args': ['--inputdir', meson.current_build_dir()],
+}
diff --git a/src/pl/plpython/sql/meson.build b/src/pl/plpython/sql/meson.build
new file mode 100644
index 00000000000..9923425e5ff
--- /dev/null
+++ b/src/pl/plpython/sql/meson.build
@@ -0,0 +1,15 @@
+# Convert plpython2 regression tests to plpython3 ones
+foreach r2 : plpython_regress
+ # string.replace() is only available in meson >= 0.58
+ r3 = 'plpython3' + r2.split('plpython')[1]
+ plpython3_regress += r3
+
+ s2 = '@0@.sql'.format(r2)
+ s3 = '@0@.sql'.format(r3)
+ plpython3_test_deps += custom_target(s3,
+ input: s2,
+ output: s3,
+ capture: true,
+ command: plpython_regress_cmd,
+ )
+endforeach
diff --git a/src/port/meson.build b/src/port/meson.build
new file mode 100644
index 00000000000..c6e4c788052
--- /dev/null
+++ b/src/port/meson.build
@@ -0,0 +1,145 @@
+pgport_sources = [
+ 'bsearch_arg.c',
+ 'chklocale.c',
+ 'erand48.c',
+ 'inet_net_ntop.c',
+ 'noblock.c',
+ 'path.c',
+ 'pg_bitutils.c',
+ 'pg_strong_random.c',
+ 'pgcheckdir.c',
+ 'pgmkdirp.c',
+ 'pgsleep.c',
+ 'pgstrcasecmp.c',
+ 'pgstrsignal.c',
+ 'pqsignal.c',
+ 'qsort.c',
+ 'qsort_arg.c',
+ 'quotes.c',
+ 'snprintf.c',
+ 'strerror.c',
+ 'tar.c',
+ 'thread.c',
+]
+
+if host_machine.system() == 'windows'
+ pgport_sources += files(
+ 'dirmod.c',
+ 'kill.c',
+ 'open.c',
+ 'system.c',
+ 'win32env.c',
+ 'win32error.c',
+ 'win32security.c',
+ 'win32setlocale.c',
+ 'win32stat.c',
+ )
+endif
+
+if cc.get_id() == 'msvc'
+ pgport_sources += files(
+ 'dirent.c',
+ )
+endif
+
+# Replacement functionality to be built if corresponding configure symbol
+# is false
+replace_funcs_neg = [
+ ['dlopen'],
+ ['explicit_bzero'],
+ ['fls'],
+ ['getaddrinfo'],
+ ['getopt'],
+ ['getopt_long'],
+ ['getpeereid'],
+ ['getrusage'],
+ ['gettimeofday'],
+ ['inet_aton'],
+ ['link'],
+ ['mkdtemp'],
+ ['pread'],
+ ['preadv', 'HAVE_DECL_PREADV'],
+ ['pwrite'],
+ ['pwritev', 'HAVE_DECL_PWRITEV'],
+ ['random'],
+ ['srandom'],
+ ['strlcat'],
+ ['strlcpy'],
+ ['strnlen'],
+]
+
+# Replacement functionality to be built if corresponding configure symbol
+# is true
+replace_funcs_pos = [
+ ['pg_crc32c_sse42', 'USE_SSE42_CRC32C'],
+ ['pg_crc32c_sse42', 'USE_SSE42_CRC32C_WITH_RUNTIME_CHECK'],
+ ['pg_crc32c_sse42_choose', 'USE_SSE42_CRC32C_WITH_RUNTIME_CHECK'],
+ ['pg_crc32c_sb8', 'USE_SSE42_CRC32C_WITH_RUNTIME_CHECK'],
+ ['pg_crc32c_armv8', 'USE_ARMV8_CRC32C'],
+ ['pg_crc32c_sb8', 'USE_ARMV8_CRC32C_WITH_RUNTIME_CHECK'],
+ ['pg_crc32c_armv8_choose', 'USE_ARMV8_CRC32C_WITH_RUNTIME_CHECK'],
+]
+
+foreach f : replace_funcs_neg
+ func = f.get(0)
+ varname = f.get(1, 'HAVE_@0@'.format(func.to_upper()))
+ filename = '@0@.c'.format(func)
+
+ val = '@0@'.format(cdata.get(varname, 'false'))
+ if val == 'false' or val == '0'
+ pgport_sources += files(filename)
+ endif
+endforeach
+
+foreach f : replace_funcs_pos
+ func = f.get(0)
+ varname = f.get(1, 'HAVE_@0@'.format(func.to_upper()))
+ filename = '@0@.c'.format(func)
+
+ val = '@0@'.format(cdata.get(varname, 'false'))
+ if val == 'true' or val == '1'
+ pgport_sources += files(filename)
+ endif
+endforeach
+
+
+if (host_machine.system() == 'windows' or host_machine.system() == 'cygwin') and \
+ (cc.get_id() != 'msvc' or cc.version().version_compare('<14.0'))
+
+ # Cygwin and (apparently, based on test results) Mingw both
+ # have a broken strtof(), so substitute the same replacement
+ # code we use with VS2013. That's not a perfect fix, since
+ # (unlike with VS2013) it doesn't avoid double-rounding, but
+ # we have no better options. To get that, though, we have to
+ # force the file to be compiled despite HAVE_STRTOF.
+ pgport_sources += files('strtof.c')
+ message('On @0@ with compiler @1@ @2@ we will use our strtof wrapper.'.format(
+ host_machine.system(), cc.get_id(), cc.version()))
+endif
+
+if not cdata.has('HAVE_PTHREAD_BARRIER_WAIT') and host_machine.system() != 'windows'
+ pgport_sources += files('pthread_barrier_wait.c')
+endif
+
+
+pgport_srv = static_library('pgport_srv',
+ pgport_sources,
+ c_pch: '../include/pch/c_pch.h',
+ dependencies: [ssl, backend_port_code],
+ kwargs: internal_lib_args,
+)
+
+pgport_static = static_library('pgport',
+ pgport_sources,
+ c_pch: '../include/pch/c_pch.h',
+ dependencies: [ssl, frontend_port_code],
+ kwargs: default_lib_args,
+)
+
+pgport_shlib = static_library('pgport_shlib',
+ pgport_sources,
+ c_pch: '../include/pch/c_pch.h',
+ pic: true,
+ dependencies: [ssl, frontend_port_code],
+ kwargs: default_lib_args,
+)
diff --git a/src/test/authentication/meson.build b/src/test/authentication/meson.build
new file mode 100644
index 00000000000..be41fb314a5
--- /dev/null
+++ b/src/test/authentication/meson.build
@@ -0,0 +1,9 @@
+tap_tests += {
+ 'name': 'authentication',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_password.pl',
+ 't/002_saslprep.pl',
+ ],
+}
diff --git a/src/test/isolation/meson.build b/src/test/isolation/meson.build
new file mode 100644
index 00000000000..637b4807550
--- /dev/null
+++ b/src/test/isolation/meson.build
@@ -0,0 +1,49 @@
+# pg_regress_c helpfully provided by regress/meson.build
+
+isolation_sources = pg_regress_c + files(
+ 'isolation_main.c',
+)
+
+# see src/backend/replication/meson.build for depend logic
+spec_scanner = custom_target('specscanner',
+ input : files('specscanner.l'),
+ output : ['specscanner.c'],
+ command : [flex, '-CF', '-p', '-p', '-o', '@OUTPUT0@', '@INPUT@']
+)
+
+isolationtester_sources = files('isolationtester.c')
+isolationtester_sources += custom_target('specparse',
+ input: 'specparse.y',
+ output: 'specparse.c',
+ depends: spec_scanner,
+ command: [bison, bisonflags, '-o', '@OUTPUT@', '@INPUT0@'])
+
+pg_isolation_regress = executable('pg_isolation_regress',
+ isolation_sources,
+ c_args: pg_regress_cflags,
+ include_directories: [pg_regress_inc],
+ dependencies: [frontend_code],
+ kwargs: default_bin_args + {
+ 'install': false
+ },
+)
+
+isolationtester = executable('isolationtester',
+ isolationtester_sources,
+ include_directories: include_directories('.'),
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args + {
+ 'install': false
+ },
+)
+
+isolation_tests += {
+ 'name': 'main',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'schedule': files('isolation_schedule'),
+ 'test_kwargs': {
+ 'priority': 40,
+ 'timeout': 1000,
+ },
+}
diff --git a/src/test/kerberos/meson.build b/src/test/kerberos/meson.build
new file mode 100644
index 00000000000..9f9957a3b4c
--- /dev/null
+++ b/src/test/kerberos/meson.build
@@ -0,0 +1,12 @@
+tap_tests += {
+ 'name': 'kerberos',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_auth.pl',
+ ],
+ 'env' : {
+ 'with_gssapi': gssapi.found() ? 'yes' : 'no',
+ 'with_krb_srvnam': 'postgres',
+ },
+}
diff --git a/src/test/ldap/meson.build b/src/test/ldap/meson.build
new file mode 100644
index 00000000000..58eb9adc6f5
--- /dev/null
+++ b/src/test/ldap/meson.build
@@ -0,0 +1,9 @@
+tap_tests += {
+ 'name': 'ldap',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_auth.pl',
+ ],
+ 'env' : {'with_ldap': ldap.found() ? 'yes' : 'no'},
+}
diff --git a/src/test/meson.build b/src/test/meson.build
new file mode 100644
index 00000000000..f0b0d3d3b5e
--- /dev/null
+++ b/src/test/meson.build
@@ -0,0 +1,19 @@
+subdir('regress')
+subdir('isolation')
+
+subdir('authentication')
+subdir('recovery')
+subdir('subscription')
+subdir('modules')
+
+if ssl.found()
+ subdir('ssl')
+endif
+
+if ldap.found()
+ subdir('ldap')
+endif
+
+if gssapi.found()
+ subdir('kerberos')
+endif
diff --git a/src/test/modules/brin/meson.build b/src/test/modules/brin/meson.build
new file mode 100644
index 00000000000..99ccaac5b38
--- /dev/null
+++ b/src/test/modules/brin/meson.build
@@ -0,0 +1,19 @@
+isolation_tests += {
+ 'name': 'brin',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'specs': [
+ 'summarization-and-inprogress-insertion',
+ ]
+}
+
+
+tap_tests += {
+ 'name': 'brin',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/01_workitems.pl',
+ ],
+}
+
diff --git a/src/test/modules/commit_ts/meson.build b/src/test/modules/commit_ts/meson.build
new file mode 100644
index 00000000000..2794d837c35
--- /dev/null
+++ b/src/test/modules/commit_ts/meson.build
@@ -0,0 +1,20 @@
+regress_tests += {
+ 'name': 'commit_ts',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'commit_timestamp',
+ ]
+}
+
+tap_tests += {
+ 'name': 'commit_ts',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_base.pl',
+ 't/002_standby.pl',
+ 't/003_standby_2.pl',
+ 't/004_restart.pl',
+ ],
+}
diff --git a/src/test/modules/delay_execution/meson.build b/src/test/modules/delay_execution/meson.build
new file mode 100644
index 00000000000..58fe5a1a21d
--- /dev/null
+++ b/src/test/modules/delay_execution/meson.build
@@ -0,0 +1,15 @@
+# FIXME: prevent install during main install, but not during test :/
+delay_execution = shared_module('delay_execution',
+ ['delay_execution.c'],
+ kwargs: pg_mod_args,
+)
+
+isolation_tests += {
+ 'name': 'delay_execution',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'specs': [
+ 'partition-addition',
+ 'partition-removal-1',
+ ]
+}
diff --git a/src/test/modules/dummy_index_am/meson.build b/src/test/modules/dummy_index_am/meson.build
new file mode 100644
index 00000000000..a9c49bd9554
--- /dev/null
+++ b/src/test/modules/dummy_index_am/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+dummy_index_am = shared_module('dummy_index_am',
+ ['dummy_index_am.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'dummy_index_am.control',
+ 'dummy_index_am--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'dummy_index_am',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'reloptions',
+ ]
+}
diff --git a/src/test/modules/dummy_seclabel/meson.build b/src/test/modules/dummy_seclabel/meson.build
new file mode 100644
index 00000000000..ed31d8f9530
--- /dev/null
+++ b/src/test/modules/dummy_seclabel/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+dummy_seclabel = shared_module('dummy_seclabel',
+ ['dummy_seclabel.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'dummy_seclabel.control',
+ 'dummy_seclabel--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'dummy_seclabel',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'dummy_seclabel',
+ ]
+}
diff --git a/src/test/modules/libpq_pipeline/meson.build b/src/test/modules/libpq_pipeline/meson.build
new file mode 100644
index 00000000000..2f850215a6f
--- /dev/null
+++ b/src/test/modules/libpq_pipeline/meson.build
@@ -0,0 +1,21 @@
+libpq_pipeline = executable('libpq_pipeline',
+ files(
+ 'libpq_pipeline.c',
+ ),
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args + {
+ 'install': false,
+ },
+)
+
+tap_tests += {
+ 'name': 'libpq_pipeline',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'env': {
+ 'PATH': meson.current_build_dir(),
+ },
+ 'tests': [
+ 't/001_libpq_pipeline.pl',
+ ]
+}
diff --git a/src/test/modules/meson.build b/src/test/modules/meson.build
new file mode 100644
index 00000000000..c98225c6e7b
--- /dev/null
+++ b/src/test/modules/meson.build
@@ -0,0 +1,25 @@
+subdir('brin')
+subdir('commit_ts')
+subdir('delay_execution')
+subdir('dummy_index_am')
+subdir('dummy_seclabel')
+subdir('libpq_pipeline')
+subdir('plsample')
+subdir('snapshot_too_old')
+subdir('spgist_name_ops')
+subdir('ssl_passphrase_callback')
+subdir('test_bloomfilter')
+subdir('test_ddl_deparse')
+subdir('test_extensions')
+subdir('test_ginpostinglist')
+subdir('test_integerset')
+subdir('test_misc')
+subdir('test_parser')
+subdir('test_pg_dump')
+subdir('test_predtest')
+subdir('test_rbtree')
+subdir('test_regex')
+subdir('test_rls_hooks')
+subdir('test_shm_mq')
+subdir('unsafe_tests')
+subdir('worker_spi')
diff --git a/src/test/modules/plsample/meson.build b/src/test/modules/plsample/meson.build
new file mode 100644
index 00000000000..3f70688fb89
--- /dev/null
+++ b/src/test/modules/plsample/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+plsample = shared_module('plsample',
+ ['plsample.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'plsample.control',
+ 'plsample--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'plsample',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'plsample',
+ ]
+}
diff --git a/src/test/modules/snapshot_too_old/meson.build b/src/test/modules/snapshot_too_old/meson.build
new file mode 100644
index 00000000000..cdf4afd18b8
--- /dev/null
+++ b/src/test/modules/snapshot_too_old/meson.build
@@ -0,0 +1,11 @@
+isolation_tests += {
+ 'name': 'snapshot_too_old',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'specs': [
+ 'sto_using_cursor',
+ 'sto_using_select',
+ 'sto_using_hash_index',
+ ],
+ 'regress_args': ['--temp-config', files('sto.conf')],
+}
diff --git a/src/test/modules/spgist_name_ops/meson.build b/src/test/modules/spgist_name_ops/meson.build
new file mode 100644
index 00000000000..19aa00892f1
--- /dev/null
+++ b/src/test/modules/spgist_name_ops/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+spgist_name_ops = shared_module('spgist_name_ops',
+ ['spgist_name_ops.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'spgist_name_ops.control',
+ 'spgist_name_ops--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'spgist_name_ops',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'spgist_name_ops',
+ ]
+}
diff --git a/src/test/modules/ssl_passphrase_callback/meson.build b/src/test/modules/ssl_passphrase_callback/meson.build
new file mode 100644
index 00000000000..b9fa5ee1cdc
--- /dev/null
+++ b/src/test/modules/ssl_passphrase_callback/meson.build
@@ -0,0 +1,45 @@
+if not ssl.found()
+ subdir_done()
+endif
+
+# FIXME: prevent install during main install, but not during test :/
+ssl_passphrase_callback = shared_module('ssl_passphrase_func',
+ ['ssl_passphrase_func.c'],
+ kwargs: pg_mod_args + {
+ 'dependencies': [ssl, pg_mod_args['dependencies']],
+ }
+)
+
+# Targets to generate or remove the ssl certificate and key. Need to be copied
+# to the source afterwards. Normally not needed.
+
+openssl = find_program('openssl', native: true, required: false)
+
+if openssl.found()
+ cert = custom_target('server.crt',
+ output: ['server.crt', 'server.ckey'],
+ command: [openssl, 'req', '-new', '-x509', '-days', '10000', '-nodes', '-out', '@OUTPUT0@',
+ '-keyout', '@OUTPUT1@', '-subj', '/CN=localhost'],
+ build_by_default: false,
+ install: false,
+ )
+
+ # needs to agree with what's in the test script
+ pass = 'FooBaR1'
+
+ enccert = custom_target('server.key',
+ input: [cert[1]],
+ output: ['server.key'],
+ command: [openssl, 'rsa', '-aes256', '-in', '@INPUT0@', '-out', '@OUTPUT0@', '-passout', 'pass:@0@'.format(pass)]
+ )
+endif
+
+tap_tests += {
+ 'name': 'ssl_passphrase_callback',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_testfunc.pl',
+ ],
+ 'env': {'with_ssl': 'openssl'},
+}
diff --git a/src/test/modules/test_bloomfilter/meson.build b/src/test/modules/test_bloomfilter/meson.build
new file mode 100644
index 00000000000..2e995310876
--- /dev/null
+++ b/src/test/modules/test_bloomfilter/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+test_bloomfilter = shared_module('test_bloomfilter',
+ ['test_bloomfilter.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_bloomfilter.control',
+ 'test_bloomfilter--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_bloomfilter',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_bloomfilter',
+ ]
+}
diff --git a/src/test/modules/test_ddl_deparse/meson.build b/src/test/modules/test_ddl_deparse/meson.build
new file mode 100644
index 00000000000..3618229594d
--- /dev/null
+++ b/src/test/modules/test_ddl_deparse/meson.build
@@ -0,0 +1,40 @@
+# FIXME: prevent install during main install, but not during test :/
+test_ddl_deparse = shared_module('test_ddl_deparse',
+ ['test_ddl_deparse.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_ddl_deparse.control',
+ 'test_ddl_deparse--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_ddl_deparse',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_ddl_deparse',
+ 'create_extension',
+ 'create_schema',
+ 'create_type',
+ 'create_conversion',
+ 'create_domain',
+ 'create_sequence_1',
+ 'create_table',
+ 'create_transform',
+ 'alter_table',
+ 'create_view',
+ 'create_trigger',
+ 'create_rule',
+ 'comment_on',
+ 'alter_function',
+ 'alter_sequence',
+ 'alter_ts_config',
+ 'alter_type_enum',
+ 'opfamily',
+ 'defprivs',
+ 'matviews',
+ ]
+}
diff --git a/src/test/modules/test_extensions/meson.build b/src/test/modules/test_extensions/meson.build
new file mode 100644
index 00000000000..2ca504f8588
--- /dev/null
+++ b/src/test/modules/test_extensions/meson.build
@@ -0,0 +1,38 @@
+# FIXME: prevent install during main install, but not during test :/
+install_data(
+ 'test_ext1--1.0.sql',
+ 'test_ext1.control',
+ 'test_ext2--1.0.sql',
+ 'test_ext2.control',
+ 'test_ext3--1.0.sql',
+ 'test_ext3.control',
+ 'test_ext4--1.0.sql',
+ 'test_ext4.control',
+ 'test_ext5--1.0.sql',
+ 'test_ext5.control',
+ 'test_ext6--1.0.sql',
+ 'test_ext6.control',
+ 'test_ext7--1.0--2.0.sql',
+ 'test_ext7--1.0.sql',
+ 'test_ext7.control',
+ 'test_ext8--1.0.sql',
+ 'test_ext8.control',
+ 'test_ext_cyclic1--1.0.sql',
+ 'test_ext_cyclic1.control',
+ 'test_ext_cyclic2--1.0.sql',
+ 'test_ext_cyclic2.control',
+ 'test_ext_evttrig--1.0--2.0.sql',
+ 'test_ext_evttrig--1.0.sql',
+ 'test_ext_evttrig.control',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_extensions',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_extensions',
+ 'test_extdepend',
+ ]
+}
diff --git a/src/test/modules/test_ginpostinglist/meson.build b/src/test/modules/test_ginpostinglist/meson.build
new file mode 100644
index 00000000000..e177e90019f
--- /dev/null
+++ b/src/test/modules/test_ginpostinglist/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+test_ginpostinglist = shared_module('test_ginpostinglist',
+ ['test_ginpostinglist.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_ginpostinglist.control',
+ 'test_ginpostinglist--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_ginpostinglist',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_ginpostinglist',
+ ]
+}
diff --git a/src/test/modules/test_integerset/meson.build b/src/test/modules/test_integerset/meson.build
new file mode 100644
index 00000000000..ccb8db725e5
--- /dev/null
+++ b/src/test/modules/test_integerset/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+test_integerset = shared_module('test_integerset',
+ ['test_integerset.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_integerset.control',
+ 'test_integerset--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_integerset',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_integerset',
+ ]
+}
diff --git a/src/test/modules/test_misc/meson.build b/src/test/modules/test_misc/meson.build
new file mode 100644
index 00000000000..4ee8c562ac0
--- /dev/null
+++ b/src/test/modules/test_misc/meson.build
@@ -0,0 +1,8 @@
+tap_tests += {
+ 'name': 'misc',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_constraint_validation.pl',
+ ],
+}
diff --git a/src/test/modules/test_parser/meson.build b/src/test/modules/test_parser/meson.build
new file mode 100644
index 00000000000..c43ae95cf2c
--- /dev/null
+++ b/src/test/modules/test_parser/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+test_parser = shared_module('test_parser',
+ ['test_parser.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_parser.control',
+ 'test_parser--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_parser',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_parser',
+ ]
+}
diff --git a/src/test/modules/test_pg_dump/meson.build b/src/test/modules/test_pg_dump/meson.build
new file mode 100644
index 00000000000..110b3876832
--- /dev/null
+++ b/src/test/modules/test_pg_dump/meson.build
@@ -0,0 +1,24 @@
+# FIXME: prevent install during main install, but not during test :/
+install_data(
+ 'test_pg_dump.control',
+ 'test_pg_dump--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_pg_dump',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_pg_dump',
+ ]
+}
+
+tap_tests += {
+ 'name': 'test_pg_dump',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_base.pl',
+ ]
+}
diff --git a/src/test/modules/test_predtest/meson.build b/src/test/modules/test_predtest/meson.build
new file mode 100644
index 00000000000..9f9a9475c8b
--- /dev/null
+++ b/src/test/modules/test_predtest/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+test_predtest = shared_module('test_predtest',
+ ['test_predtest.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_predtest.control',
+ 'test_predtest--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_predtest',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_predtest',
+ ]
+}
diff --git a/src/test/modules/test_rbtree/meson.build b/src/test/modules/test_rbtree/meson.build
new file mode 100644
index 00000000000..6bbeca39ec9
--- /dev/null
+++ b/src/test/modules/test_rbtree/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+test_rbtree = shared_module('test_rbtree',
+ ['test_rbtree.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_rbtree.control',
+ 'test_rbtree--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_rbtree',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_rbtree',
+ ]
+}
diff --git a/src/test/modules/test_regex/meson.build b/src/test/modules/test_regex/meson.build
new file mode 100644
index 00000000000..c5fd92ee1c6
--- /dev/null
+++ b/src/test/modules/test_regex/meson.build
@@ -0,0 +1,21 @@
+# FIXME: prevent install during main install, but not during test :/
+test_regex = shared_module('test_regex',
+ ['test_regex.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_regex.control',
+ 'test_regex--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_regex',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_regex',
+ 'test_regex_utf8',
+ ]
+}
diff --git a/src/test/modules/test_rls_hooks/meson.build b/src/test/modules/test_rls_hooks/meson.build
new file mode 100644
index 00000000000..fb8b697e160
--- /dev/null
+++ b/src/test/modules/test_rls_hooks/meson.build
@@ -0,0 +1,19 @@
+# FIXME: prevent install during main install, but not during test :/
+test_rls_hooks = shared_module('test_rls_hooks',
+ ['test_rls_hooks.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_rls_hooks.control',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_rls_hooks',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_rls_hooks',
+ ]
+}
diff --git a/src/test/modules/test_shm_mq/meson.build b/src/test/modules/test_shm_mq/meson.build
new file mode 100644
index 00000000000..159943f861e
--- /dev/null
+++ b/src/test/modules/test_shm_mq/meson.build
@@ -0,0 +1,24 @@
+# FIXME: prevent install during main install, but not during test :/
+test_shm_mq = shared_module('test_shm_mq',
+ files(
+ 'setup.c',
+ 'test.c',
+ 'worker.c',
+ ),
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_shm_mq.control',
+ 'test_shm_mq--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_shm_mq',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_shm_mq',
+ ]
+}
diff --git a/src/test/modules/unsafe_tests/meson.build b/src/test/modules/unsafe_tests/meson.build
new file mode 100644
index 00000000000..9ed4d587721
--- /dev/null
+++ b/src/test/modules/unsafe_tests/meson.build
@@ -0,0 +1,9 @@
+regress_tests += {
+ 'name': 'unsafe_tests',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'rolenames',
+ 'alter_system_table',
+ ],
+}
diff --git a/src/test/modules/worker_spi/meson.build b/src/test/modules/worker_spi/meson.build
new file mode 100644
index 00000000000..a80bd493ea7
--- /dev/null
+++ b/src/test/modules/worker_spi/meson.build
@@ -0,0 +1,23 @@
+# FIXME: prevent install during main install, but not during test :/
+test_worker_spi = shared_module('worker_spi',
+ files(
+ 'worker_spi.c',
+ ),
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'worker_spi.control',
+ 'worker_spi--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'worker_spi',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'worker_spi',
+ ],
+ 'regress_args': ['--temp-config', files('dynamic.conf'), '--dbname=contrib_regression'],
+}
diff --git a/src/test/recovery/meson.build b/src/test/recovery/meson.build
new file mode 100644
index 00000000000..5678e1d27ae
--- /dev/null
+++ b/src/test/recovery/meson.build
@@ -0,0 +1,33 @@
+tap_tests += {
+ 'name': 'recovery',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests' : [
+ 't/001_stream_rep.pl',
+ 't/002_archiving.pl',
+ 't/003_recovery_targets.pl',
+ 't/004_timeline_switch.pl',
+ 't/005_replay_delay.pl',
+ 't/006_logical_decoding.pl',
+ 't/007_sync_rep.pl',
+ 't/008_fsm_truncation.pl',
+ 't/009_twophase.pl',
+ 't/010_logical_decoding_timelines.pl',
+ 't/011_crash_recovery.pl',
+ 't/012_subtransactions.pl',
+ 't/013_crash_restart.pl',
+ 't/014_unlogged_reinit.pl',
+ 't/015_promotion_pages.pl',
+ 't/016_min_consistency.pl',
+ 't/017_shm.pl',
+ 't/018_wal_optimize.pl',
+ 't/019_replslot_limit.pl',
+ 't/020_archive_status.pl',
+ 't/021_row_visibility.pl',
+ 't/022_crash_temp_files.pl',
+ 't/023_pitr_prepared_xact.pl',
+ 't/024_archive_recovery.pl',
+ 't/025_stuck_on_old_timeline.pl',
+ 't/026_overwrite_contrecord.pl',
+ ]
+}
diff --git a/src/test/regress/meson.build b/src/test/regress/meson.build
new file mode 100644
index 00000000000..1a2f7675e87
--- /dev/null
+++ b/src/test/regress/meson.build
@@ -0,0 +1,57 @@
+# also used by isolationtester
+pg_regress_c = files('pg_regress.c')
+pg_regress_inc = include_directories('.')
+
+regress_sources = pg_regress_c + files(
+ 'pg_regress_main.c'
+)
+
+pg_regress_cflags = ['-DHOST_TUPLE="frak"', '-DSHELLPROG="/bin/sh"']
+
+pg_regress = executable('pg_regress',
+ regress_sources,
+ c_args: pg_regress_cflags,
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args + {
+ 'install': false
+ },
+)
+
+regress_module = shared_module('regress',
+ ['regress.c'],
+ kwargs: pg_mod_args + {
+ 'install': false,
+ },
+)
+
+# Get some extra C modules from contrib/spi but mark them as not to be
+# installed.
+# FIXME: avoid the duplication.
+
+shared_module('autoinc',
+ ['../../../contrib/spi/autoinc.c'],
+ kwargs: pg_mod_args + {
+ 'install': false,
+ },
+)
+
+shared_module('refint',
+ ['../../../contrib/spi/refint.c'],
+ kwargs: pg_mod_args + {
+ 'c_args': refint_cflags + contrib_mod_args['c_args'],
+ 'install': false,
+ },
+)
+
+
+regress_tests += {
+ 'name': 'main',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'schedule': files('parallel_schedule'),
+ 'regress_args': ['--make-testtablespace-dir'],
+ 'test_kwargs': {
+ 'priority': 50,
+ 'timeout': 1000,
+ },
+}
diff --git a/src/test/ssl/meson.build b/src/test/ssl/meson.build
new file mode 100644
index 00000000000..42e34c9f632
--- /dev/null
+++ b/src/test/ssl/meson.build
@@ -0,0 +1,10 @@
+tap_tests += {
+ 'name': 'ssl',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'env' : {'with_ssl': get_option('ssl')},
+ 'tests': [
+ 't/001_ssltests.pl',
+ 't/002_scram.pl'
+ ],
+}
diff --git a/src/test/subscription/meson.build b/src/test/subscription/meson.build
new file mode 100644
index 00000000000..31580edd3d3
--- /dev/null
+++ b/src/test/subscription/meson.build
@@ -0,0 +1,33 @@
+tap_tests += {
+ 'name': 'subscription',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'env' : {'with_icu': icu.found() ? 'yes' : 'no'},
+ 'tests': [
+ 't/001_rep_changes.pl',
+ 't/002_types.pl',
+ 't/003_constraints.pl',
+ 't/004_sync.pl',
+ 't/005_encoding.pl',
+ 't/006_rewrite.pl',
+ 't/007_ddl.pl',
+ 't/008_diff_schema.pl',
+ 't/009_matviews.pl',
+ 't/010_truncate.pl',
+ 't/011_generated.pl',
+ 't/012_collation.pl',
+ 't/013_partition.pl',
+ 't/014_binary.pl',
+ 't/015_stream.pl',
+ 't/016_stream_subxact.pl',
+ 't/017_stream_ddl.pl',
+ 't/018_stream_subxact_abort.pl',
+ 't/019_stream_subxact_ddl_abort.pl',
+ 't/020_messages.pl',
+ 't/021_twophase.pl',
+ 't/022_twophase_cascade.pl',
+ 't/023_twophase_stream.pl',
+ 't/024_add_drop_pub.pl',
+ 't/100_bugs.pl',
+ ],
+}
diff --git a/src/timezone/meson.build b/src/timezone/meson.build
new file mode 100644
index 00000000000..c3703a5ec7d
--- /dev/null
+++ b/src/timezone/meson.build
@@ -0,0 +1,50 @@
+# files to build into backend
+timezone_sources = files(
+ 'localtime.c',
+ 'pgtz.c',
+ 'strftime.c',
+)
+
+
+timezone_inc = include_directories('.')
+
+timezone_localtime_source = files('localtime.c')
+
+# files needed to build zic utility program
+zic_sources = files(
+ 'zic.c'
+)
+
+# we now distribute the timezone data as a single file
+tzdata = files(
+ 'data/tzdata.zi'
+)
+
+
+# FIXME: For cross builds, it would need a native built libpgport/pgcommon to
+# build our zic. But for that we'd need to run a good chunk of the configure
+# tests both natively and cross. Unclear if it's worth it.
+if meson.is_cross_build()
+ zic = find_program('zic', native: true, required: false)
+else
+ zic = executable('zic', zic_sources,
+ dependencies: [frontend_code],
+ kwargs: default_bin_args + {'install': false}
+ )
+endif
+
+# FIXME: this used to be sorted - but also isn't actually used
+abbrevs_txt = custom_target('abbrevs.txt',
+ input: tzdata,
+ output: ['abbrevs.txt'],
+ command: [zic, '-P', '-b', 'fat', 'junkdir', '@INPUT@'],
+ capture: true)
+
+tzdata = custom_target('tzdata',
+ input: tzdata,
+ output: ['timezone'],
+ command: [zic, '-d', '@OUTPUT@', '@INPUT@'],
+ install: true,
+ install_dir: get_option('datadir'))
+
+subdir('tznames')
diff --git a/src/timezone/tznames/meson.build b/src/timezone/tznames/meson.build
new file mode 100644
index 00000000000..effd2880ce7
--- /dev/null
+++ b/src/timezone/tznames/meson.build
@@ -0,0 +1,20 @@
+tznames = files(
+ 'Africa.txt',
+ 'America.txt',
+ 'Antarctica.txt',
+ 'Asia.txt',
+ 'Atlantic.txt',
+ 'Australia.txt',
+ 'Etc.txt',
+ 'Europe.txt',
+ 'Indian.txt',
+ 'Pacific.txt',
+)
+
+tznames_sets = files(
+ 'Default',
+ 'Australia',
+ 'India')
+
+install_data(tznames, install_dir: get_option('datadir') / 'timezonesets')
+install_data(tznames_sets, install_dir: get_option('datadir') / 'timezonesets')
diff --git a/src/tools/find_meson b/src/tools/find_meson
new file mode 100755
index 00000000000..2d75537374e
--- /dev/null
+++ b/src/tools/find_meson
@@ -0,0 +1,20 @@
+#!/usr/bin/env python3
+
+import os
+import shlex
+import sys
+
+mesonintrospect = os.environ['MESONINTROSPECT']
+components = shlex.split(mesonintrospect)
+
+if len(components) < 2:
+ print(f'expected more, got: {components}')
+ sys.exit(1)
+
+if components[-1] != 'introspect':
+ print('expected introspection at the end')
+ sys.exit(1)
+
+print('\n'.join(components[:-1]), end='')
+
+sys.exit(0)
diff --git a/src/tools/irlink b/src/tools/irlink
new file mode 100644
index 00000000000..efc2c700277
--- /dev/null
+++ b/src/tools/irlink
@@ -0,0 +1,28 @@
+#!/bin/bash
+
+set -e
+
+srcdir="$1"
+builddir="$2"
+llvm_lto="$3"
+outputdir=$(realpath "$5")
+index="$outputdir/postgres.index.bc"
+priv="$6"
+shift 6
+numinput=$#
+
+if [ ! -d "$outputdir" ];then
+ mkdir -p "$outputdir/postgres"
+fi
+
+cd "$priv"
+
+# FIXME: remove old contents
+cp -r . "$outputdir/postgres"
+
+cd "$outputdir"
+
+filenames=$(for f in "$@";do echo "postgres/${f#$priv/}";done)
+"$llvm_lto" -thinlto -thinlto-action=thinlink -o "$index" $filenames
+
+exit 0
diff --git a/src/tools/msvc/export2def.pl b/src/tools/msvc/export2def.pl
new file mode 100644
index 00000000000..fb88e8b8ab9
--- /dev/null
+++ b/src/tools/msvc/export2def.pl
@@ -0,0 +1,22 @@
+# Copyright (c) 2021, PostgreSQL Global Development Group
+
+use strict;
+use warnings;
+use 5.8.0;
+use List::Util qw(max);
+
+my ($deffile, $txtfile, $libname) = @ARGV;
+
+print STDERR "Generating $deffile...\n";
+open(my $if, '<', $txtfile) || die("Could not open $txtfile\n");
+open(my $of, '>', $deffile) || die("Could not open $deffile for writing\n");
+print $of "LIBRARY $libname\nEXPORTS\n";
+while (<$if>)
+{
+ next if (/^#/);
+ next if (/^\s*$/);
+ my ($f, $o) = split;
+ print $of " $f @ $o\n";
+}
+close($of);
+close($if);
diff --git a/src/tools/msvc/gendef2.pl b/src/tools/msvc/gendef2.pl
new file mode 100644
index 00000000000..3b905d6f5da
--- /dev/null
+++ b/src/tools/msvc/gendef2.pl
@@ -0,0 +1,177 @@
+
+# Copyright (c) 2021, PostgreSQL Global Development Group
+
+use strict;
+use warnings;
+use 5.8.0;
+use List::Util qw(max);
+
+my @def;
+
+#
+# Script that generates a .DEF file for the given object files
+#
+# src/tools/msvc/gendef2.pl
+#
+
+# Given a symbol file path, loops over its contents
+# and returns a list of symbols of interest as a dictionary
+# of 'symbolname' -> symtype, where symtype is:
+#
+# 0 a CODE symbol, left undecorated in the .DEF
+# 1 A DATA symbol, i.e. global var export
+#
+sub extract_syms
+{
+ my ($symfile, $def) = @_;
+ open(my $f, '<', $symfile) || die "Could not open $symfile: $!\n";
+ while (<$f>)
+ {
+
+ # Expected symbol lines look like:
+ #
+ # 0 1 2 3 4 5 6
+ # IDX SYMBOL SECT SYMTYPE SYMSTATIC SYMNAME
+ # ------------------------------------------------------------------------
+ # 02E 00000130 SECTA notype External | _standbyState
+ # 02F 00000009 SECT9 notype Static | _LocalRecoveryInProgress
+ # 064 00000020 SECTC notype () Static | _XLogCheckBuffer
+ # 065 00000000 UNDEF notype () External | _BufferGetTag
+ #
+ # See http://msdn.microsoft.com/en-us/library/b842y285.aspx
+ #
+ # We're not interested in the symbol index or offset.
+ #
+ # SECT[ION] is only examined to see whether the symbol is defined in a
+ # COFF section of the local object file; if UNDEF, it's a symbol to be
+ # resolved at link time from another object so we can't export it.
+ #
+ # SYMTYPE is always notype for C symbols as there's no typeinfo and no
+ # way to get the symbol type from name (de)mangling. However, we care
+ # if "notype" is suffixed by "()" or not. The presence of () means the
+ # symbol is a function, the absence means it isn't.
+ #
+ # SYMSTATIC indicates whether it's a compilation-unit local "static"
+ # symbol ("Static"), or whether it's available for use from other
+ # compilation units ("External"). We export all symbols that aren't
+ # static as part of the whole program DLL interface to produce UNIX-like
+ # default linkage.
+ #
+ # SYMNAME is, obviously, the symbol name. The leading underscore
+ # indicates that the _cdecl calling convention is used. See
+ # http://www.unixwiz.net/techtips/win32-callconv.html
+ # http://www.codeproject.com/Articles/1388/Calling-Conventions-Demystified
+ #
+ s/notype \(\)/func/g;
+ s/notype/data/g;
+
+ my @pieces = split;
+
+ # Skip file and section headers and other non-symbol entries
+ next unless defined($pieces[0]) and $pieces[0] =~ /^[A-F0-9]{3,}$/;
+
+ # Skip blank symbol names
+ next unless $pieces[6];
+
+ # Skip externs used from another compilation unit
+ next if ($pieces[2] eq "UNDEF");
+
+ # Skip static symbols
+ next unless ($pieces[4] eq "External");
+
+ # Skip some more MSVC-generated crud
+ next if $pieces[6] =~ /^@/;
+ next if $pieces[6] =~ /^\(/;
+
+ # __real and __xmm are out-of-line floating point literals and
+ # (for __xmm) their SIMD equivalents. They shouldn't be part
+ # of the DLL interface.
+ next if $pieces[6] =~ /^__real/;
+ next if $pieces[6] =~ /^__xmm/;
+
+ # __imp entries are imports from other DLLs, eg __imp__malloc .
+ # (We should never have one of these that hasn't already been skipped
+ # by the UNDEF test above, though).
+ next if $pieces[6] =~ /^__imp/;
+
+ # More under-documented internal crud
+ next if $pieces[6] =~ /NULL_THUNK_DATA$/;
+ next if $pieces[6] =~ /^__IMPORT_DESCRIPTOR/;
+ next if $pieces[6] =~ /^__NULL_IMPORT/;
+
+ # Skip string literals
+ next if $pieces[6] =~ /^\?\?_C/;
+
+ # We assume that if a symbol is defined as data, then as a function,
+ # the linker will reject the binary anyway. So it's OK to just pick
+ # whatever came last.
+ $def->{ $pieces[6] } = $pieces[3];
+ }
+ close($f);
+ return;
+}
+
+sub writedef
+{
+ my ($deffile, $platform, $def) = @_;
+ open(my $fh, '>', $deffile) || die "Could not write to $deffile\n";
+ print $fh "EXPORTS\n";
+ foreach my $f (sort keys %{$def})
+ {
+ my $isdata = $def->{$f} eq 'data';
+
+ # Strip the leading underscore for win32, but not x64
+ $f =~ s/^_//
+ unless ($platform eq "x64");
+
+ # Emit just the name if it's a function symbol, or emit the name
+ # decorated with the DATA option for variables.
+ if ($isdata)
+ {
+ print $fh " $f DATA\n";
+ }
+ else
+ {
+ print $fh " $f\n";
+ }
+ }
+ close($fh);
+ return;
+}
+
+
+sub usage
+{
+ die( "Usage: gendef.pl platform outputfile tempdir sourcelib\n"
+ . " modulepath: path to dir with obj files, no trailing slash"
+ . " platform: Win32 | x64");
+}
+
+usage()
+ unless scalar(@ARGV) >= 4;
+
+my $platform = $ARGV[0];
+shift;
+my $deffile = $ARGV[0];
+shift;
+my $tempdir = $ARGV[0];
+shift;
+
+print STDERR "Generating $deffile in tmp dir $tempdir from ".join(' ', @ARGV)."\n";
+
+my %def = ();
+
+my $symfile = "$tempdir/all.sym";
+my $tmpfile = "$tempdir/tmp.sym";
+mkdir($tempdir);
+print STDERR "dumpbin /symbols /out:$tmpfile ".join(' ', @ARGV)."\n";
+system("dumpbin /symbols /out:$tmpfile ".join(' ', @ARGV))
+ && die "Could not call dumpbin";
+rename($tmpfile, $symfile);
+print "generated symfile to $symfile (via $tmpfile)\n";
+extract_syms($symfile, \%def);
+print "\n";
+
+writedef($deffile, $platform, \%def);
+
+print "Generated " . scalar(keys(%def)) . " symbols\n";
diff --git a/src/tools/testwrap b/src/tools/testwrap
new file mode 100755
index 00000000000..aeb2019b099
--- /dev/null
+++ b/src/tools/testwrap
@@ -0,0 +1,22 @@
+#!/bin/sh
+#
+# FIXME: I should probably be a perl or python script
+#
+
+# FIXME: argument parsing
+
+basedir=$1
+builddir=$2
+testgroup=$3
+testname=$(basename -s .pl $4)
+shift 4
+
+testdir="$basedir/testrun/$testgroup/$testname"
+echo "# executing test in $testdir group $testgroup test $testname, builddir $builddir"
+rm -rf "$testdir/"
+mkdir -p "$testdir"
+
+export TESTOUTDIR="$testdir"
+export TESTDIR="$builddir"
+
+exec "$@"
--
2.23.0.385.gbc12974a89
Attachment: v3-0017-ci-Build-both-with-meson-and-as-before.patch (text/x-diff; charset=us-ascii)
From 6567a246b7c98bf769b15f4544fb039d162dbe38 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Fri, 8 Oct 2021 17:29:10 -0700
Subject: [PATCH v3 17/17] ci: Build both with meson and as before.
---
.cirrus.yml | 464 ++++++++++++++++++++++++++++++++++------------------
1 file changed, 308 insertions(+), 156 deletions(-)
diff --git a/.cirrus.yml b/.cirrus.yml
index f75bdce6dec..eace4602ea3 100644
--- a/.cirrus.yml
+++ b/.cirrus.yml
@@ -13,14 +13,13 @@ env:
task:
- name: FreeBSD
only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*freebsd.*'
compute_engine_instance:
image_project: pg-vm-images-aio
image: family/pg-aio-freebsd-13-0
platform: freebsd
- cpu: 2
- memory: 2G
+ cpu: 4
+ memory: 4G
disk: 50
env:
CCACHE_DIR: "/tmp/ccache_dir"
@@ -39,33 +38,52 @@ task:
- mkdir -p /tmp/ccache_dir
- chown -R postgres:postgres /tmp/ccache_dir
- configure_script: |
- su postgres -c './configure \
- --enable-cassert --enable-debug --enable-tap-tests \
- --enable-nls \
- \
- --with-icu \
- --with-ldap \
- --with-libxml \
- --with-libxslt \
- \
- --with-lz4 \
- --with-pam \
- --with-perl \
- --with-python \
- --with-ssl=openssl \
- --with-tcl --with-tclconfig=/usr/local/lib/tcl8.6/ \
- --with-uuid=bsd \
- \
- --with-includes=/usr/local/include --with-libs=/usr/local/lib \
- CC="ccache cc"'
- build_script:
- - su postgres -c 'gmake -s -j3 && gmake -s -j3 -C contrib'
- upload_caches:
- - ccache
+ matrix:
+ - name: FreeBSD autoconf
+
+ configure_script: |
+ su postgres -c './configure \
+ --enable-cassert --enable-debug --enable-tap-tests \
+ --enable-nls \
+ \
+ --with-icu \
+ --with-ldap \
+ --with-libxml \
+ --with-libxslt \
+ \
+ --with-lz4 \
+ --with-pam \
+ --with-perl \
+ --with-python \
+ --with-ssl=openssl \
+ --with-tcl --with-tclconfig=/usr/local/lib/tcl8.6/ \
+ --with-uuid=bsd \
+ \
+ --with-includes=/usr/local/include --with-libs=/usr/local/lib \
+ CC="ccache cc"'
+ build_script:
+ - su postgres -c 'gmake -s -j4 && gmake -s -j4 -C contrib'
+ upload_caches:
+ - ccache
+
+ tests_script:
+ - su postgres -c 'time gmake -s -j4 ${CHECK} ${CHECKFLAGS}'
+
+ - name: FreeBSD meson
+
+ configure_script:
+ - su postgres -c 'meson setup --buildtype debug -Dcassert=true -Dssl=openssl build'
+ build_script:
+ - su postgres -c 'ninja -C build'
+ upload_caches:
+ - ccache
+ run_tests_script:
+ - su postgres -c 'meson test --no-rebuild -C build'
- tests_script:
- - su postgres -c 'time gmake -s -j2 ${CHECK} ${CHECKFLAGS}'
+ always:
+ meson_log_artifacts:
+ path: "build/meson-logs/*.txt"
+ type: text/plain
on_failure:
cores_script: |
@@ -83,14 +101,16 @@ task:
tap_artifacts:
path: "**/regress_log_*"
type: text/plain
+ meson_test_artifacts:
+ path: "build/meson-logs/testlog.junit.xml"
+ type: text/xml
+ format: junit
task:
- name: Linux
only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*linux.*'
compute_engine_instance:
image_project: pg-vm-images-aio
- image: family/pg-aio-bullseye
platform: linux
cpu: 4
memory: 2G
@@ -120,37 +140,78 @@ task:
- su postgres -c 'ulimit -l -S'
- echo '/tmp/%e-%s-%p.core' > /proc/sys/kernel/core_pattern
- configure_script: |
- su postgres -c './configure \
- --enable-cassert --enable-debug --enable-tap-tests \
- --enable-nls \
- \
- --with-gssapi \
- --with-icu \
- --with-ldap \
- --with-libxml \
- --with-libxslt \
- --with-llvm \
- --with-lz4 \
- --with-pam \
- --with-perl \
- --with-python \
- --with-ssl=openssl \
- --with-systemd \
- --with-tcl --with-tclconfig=/usr/lib/tcl8.6/ \
- --with-uuid=e2fs \
- \
- CC="ccache gcc" CXX="ccache g++" CLANG="ccache clang" CFLAGS="-O0 -ggdb"'
- build_script:
- - su postgres -c 'make -s -j4 && make -s -j4 -C contrib'
- upload_caches:
- - ccache
+ matrix:
+ - name: Linux Autoconf
+
+ compute_engine_instance:
+ image: family/pg-aio-bullseye
+
+ configure_script: |
+ su postgres -c './configure \
+ --enable-cassert --enable-debug --enable-tap-tests \
+ --enable-nls \
+ \
+ --with-gssapi \
+ --with-icu \
+ --with-ldap \
+ --with-libxml \
+ --with-libxslt \
+ --with-llvm \
+ --with-lz4 \
+ --with-pam \
+ --with-perl \
+ --with-python \
+ --with-ssl=openssl \
+ --with-systemd \
+ --with-tcl --with-tclconfig=/usr/lib/tcl8.6/ \
+ --with-uuid=e2fs \
+ \
+ CC="ccache gcc" CXX="ccache g++" CLANG="ccache clang" CFLAGS="-O0 -ggdb"'
+ build_script:
+ - su postgres -c 'make -s -j4 && make -s -j4 -C contrib'
+ upload_caches:
+ - ccache
+
+ tests_script: |
+ su postgres -c '\
+ ulimit -c unlimited; \
+ make -s ${CHECK} ${CHECKFLAGS} -j8 \
+ '
+
+ - name: Linux Meson
+
+ compute_engine_instance:
+ image: family/pg-aio-bullseye
+
+ configure_script:
+ - su postgres -c 'meson setup --buildtype debug -Dcassert=true -Dssl=openssl build'
+ build_script:
+ - su postgres -c 'ninja -C build'
+ upload_caches:
+ - ccache
+
+ tests_script:
+ - su postgres -c 'meson test --no-rebuild -C build'
+
+ - name: Linux Meson Sid
+
+ compute_engine_instance:
+ image: family/pg-aio-sid
+
+ configure_script:
+ - su postgres -c 'meson setup --buildtype debug -Dcassert=true -Dssl=openssl build'
+ build_script:
+ - su postgres -c 'ninja -C build'
+ upload_caches:
+ - ccache
+
+ tests_script:
+ - su postgres -c 'meson test --no-rebuild -C build'
- tests_script: |
- su postgres -c '\
- ulimit -c unlimited; \
- make -s ${CHECK} ${CHECKFLAGS} -j8 \
- '
+ always:
+ meson_log_artifacts:
+ path: "build/meson-logs/*.txt"
+ type: text/plain
on_failure:
cores_script: |
@@ -168,10 +229,13 @@ task:
tap_artifacts:
path: "**/regress_log_*"
type: text/plain
+ meson_test_artifacts:
+ path: "build/meson-logs/testlog.junit.xml"
+ type: text/xml
+ format: junit
task:
- name: macOS
only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*(macos|darwin|osx).*'
osx_instance:
image: big-sur-base
@@ -201,55 +265,86 @@ task:
- sudo chmod 777 /cores
homebrew_install_script:
- brew install make coreutils ccache icu4c lz4 tcl-tk openldap
+ - brew install meson ninja python@3.9
upload_caches:
- homebrew
- configure_script: |
- LIBS="/usr/local/lib:$LIBS"
- INCLUDES="/usr/local/include:$INCLUDES"
-
- INCLUDES="/usr/local/opt/openssl/include:$INCLUDES"
- LIBS="/usr/local/opt/openssl/lib:$LIBS"
-
- PKG_CONFIG_PATH="/usr/local/opt/icu4c/lib/pkgconfig:$PKG_CONFIG_PATH"
- INCLUDES="/usr/local/opt/icu4c/include:$INCLUDES"
- LIBS="/usr/local/opt/icu4c/lib:$LIBS"
-
- LIBS="/usr/local/opt/openldap/lib:$LIBS"
- INCLUDES="/usr/local/opt/openldap/include:$INCLUDES"
-
- export PKG_CONFIG_PATH
-
- ./configure \
- --prefix=$HOME/install \
- --with-includes="$INCLUDES" \
- --with-libs="$LIBS" \
- \
- --enable-cassert --enable-debug --enable-tap-tests \
- --enable-nls \
- \
- --with-icu \
- --with-ldap \
- --with-libxml \
- --with-libxslt \
- \
- --with-lz4 \
- --with-perl \
- --with-python \
- --with-ssl=openssl \
- --with-tcl --with-tclconfig=/usr/local/opt/tcl-tk/lib/ \
- --with-uuid=e2fs \
- \
- CC="ccache gcc" CFLAGS="-O0 -ggdb"
- build_script:
- - gmake -s -j12 && gmake -s -j12 -C contrib
- upload_caches:
- - ccache
+ matrix:
+ - name: macOS autoconf
+
+ configure_script: |
+ LIBS="/usr/local/lib:$LIBS"
+ INCLUDES="/usr/local/include:$INCLUDES"
+
+ PKG_CONFIG_PATH="/usr/local/opt/openssl/lib/pkgconfig:$PKG_CONFIG_PATH"
+ INCLUDES="/usr/local/opt/openssl/include:$INCLUDES"
+ LIBS="/usr/local/opt/openssl/lib:$LIBS"
+
+ PKG_CONFIG_PATH="/usr/local/opt/icu4c/lib/pkgconfig:$PKG_CONFIG_PATH"
+ INCLUDES="/usr/local/opt/icu4c/include:$INCLUDES"
+ LIBS="/usr/local/opt/icu4c/lib:$LIBS"
+
+ PKG_CONFIG_PATH="/usr/local/opt/ldap/lib/pkgconfig:$PKG_CONFIG_PATH"
+ LIBS="/usr/local/opt/openldap/lib:$LIBS"
+ INCLUDES="/usr/local/opt/openldap/include:$INCLUDES"
+
+ export PKG_CONFIG_PATH
+
+ ./configure \
+ --prefix=$HOME/install \
+ --with-includes="$INCLUDES" \
+ --with-libs="$LIBS" \
+ \
+ --enable-cassert --enable-debug --enable-tap-tests \
+ --enable-nls \
+ \
+ --with-icu \
+ --with-ldap \
+ --with-libxml \
+ --with-libxslt \
+ \
+ --with-lz4 \
+ --with-perl \
+ --with-python \
+ --with-ssl=openssl \
+ --with-tcl --with-tclconfig=/usr/local/opt/tcl-tk/lib/ \
+ --with-uuid=e2fs \
+ \
+ CC="ccache gcc" CFLAGS="-O0 -ggdb"
+ build_script:
+ - gmake -s -j12 && gmake -s -j12 -C contrib
+ upload_caches:
+ - ccache
+
+ tests_script:
+ - ulimit -c unlimited
+ - ulimit -n 1024
+ - gmake -s -j12 ${CHECK} ${CHECKFLAGS}
+
+ - name: macOS meson
+
+ configure_script: |
+ PKG_CONFIG_PATH="/usr/local/opt/openssl/lib/pkgconfig:$PKG_CONFIG_PATH"
+ PKG_CONFIG_PATH="/usr/local/opt/icu4c/lib/pkgconfig:$PKG_CONFIG_PATH"
+ PKG_CONFIG_PATH="/usr/local/opt/openldap/lib/pkgconfig:$PKG_CONFIG_PATH"
+
+ export PKG_CONFIG_PATH
+
+ meson setup --buildtype debug -Dcassert=true -Dssl=openssl build
+ build_script:
+ - ninja -C build
+ upload_caches:
+ - ccache
+
+ tests_script:
+ - ulimit -c unlimited
+ - ulimit -n 1024
+ - meson test --no-rebuild -C build
- tests_script:
- - ulimit -c unlimited
- - ulimit -n 1024
- - gmake -s -j12 ${CHECK} ${CHECKFLAGS}
+ always:
+ meson_log_artifacts:
+ path: "build/meson-logs/*.txt"
+ type: text/plain
on_failure:
cores_script: |
@@ -265,10 +360,13 @@ task:
tap_artifacts:
path: "**/regress_log_*"
type: text/plain
+ meson_test_artifacts:
+ path: "build/meson-logs/testlog.junit.xml"
+ type: text/xml
+ format: junit
task:
- name: Windows
only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*windows.*'
windows_container:
dockerfile: ci/docker/windows_vs_2019
@@ -281,6 +379,8 @@ task:
TEMP_CONFIG: ${CIRRUS_WORKING_DIR}/ci/pg_ci_base.conf
# Avoid re-installing over and over
NO_TEMP_INSTALL: 1
+ # Try to hide git's tar
+ PATH: c:\windows\system32;${PATH}
sysinfo_script:
- chcp
@@ -289,55 +389,103 @@ task:
- ps: Get-Item -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AeDebug'
- set
- configure_script:
- - copy ci\windows_build_config.pl src\tools\msvc\config.pl
- - vcvarsall x64
- - perl src/tools/msvc/mkvcbuild.pl
- build_script:
- - vcvarsall x64
- # Disable file tracker, we're never going to rebuild...
- - msbuild -m /p:TrackFileAccess=false pgsql.sln
- tempinstall_script:
- # Installation on windows currently only completely works from src\tools\msvc
- - cd src\tools\msvc && perl .\install.pl %CIRRUS_WORKING_DIR%\tmp_install
-
- check_test_script:
- - perl src/tools/msvc/vcregress.pl check parallel
- startcreate_test_script:
- - tmp_install\bin\pg_ctl.exe initdb -D tmp_check\db -l tmp_check\initdb.log
- - echo include '%TEMP_CONFIG%' >> tmp_check\db\postgresql.conf
- - tmp_install\bin\pg_ctl.exe start -D tmp_check\db -l tmp_check\postmaster.log
- plcheck_test_script:
- - perl src/tools/msvc/vcregress.pl plcheck
- isolationcheck_test_script:
- - perl src/tools/msvc/vcregress.pl isolationcheck
- modulescheck_test_script:
- - perl src/tools/msvc/vcregress.pl modulescheck
- contribcheck_test_script:
- - perl src/tools/msvc/vcregress.pl contribcheck
- stop_test_script:
- - tmp_install\bin\pg_ctl.exe stop -D tmp_check\db -l tmp_check\postmaster.log
- ssl_test_script:
- - set with_ssl=openssl
- - perl src/tools/msvc/vcregress.pl taptest .\src\test\ssl\
- subscriptioncheck_test_script:
- - perl src/tools/msvc/vcregress.pl taptest .\src\test\subscription\
- authentication_test_script:
- - perl src/tools/msvc/vcregress.pl taptest .\src\test\authentication\
- recoverycheck_test_script:
- - perl src/tools/msvc/vcregress.pl recoverycheck
- bincheck_test_script:
- - perl src/tools/msvc/vcregress.pl bincheck
- upgradecheck_test_script:
- - perl src/tools/msvc/vcregress.pl upgradecheck
- ecpgcheck_test_script:
- # tries to build additional stuff
- - vcvarsall x64
- # References ecpg_regression.proj in the current dir
- - cd src\tools\msvc
- - perl vcregress.pl ecpgcheck
+ matrix:
+ - name: Windows homegrowns
+
+ configure_script:
+ - copy ci\windows_build_config.pl src\tools\msvc\config.pl
+ - vcvarsall x64
+ - perl src/tools/msvc/mkvcbuild.pl
+ build_script:
+ - vcvarsall x64
+ # Disable file tracker, we're never going to rebuild...
+ - msbuild -m /p:TrackFileAccess=false pgsql.sln
+ tempinstall_script:
+ # Installation on windows currently only completely works from src\tools\msvc
+ - cd src\tools\msvc && perl .\install.pl %CIRRUS_WORKING_DIR%\tmp_install
+
+ check_test_script:
+ - perl src/tools/msvc/vcregress.pl check parallel
+ startcreate_test_script:
+ - tmp_install\bin\pg_ctl.exe initdb -D tmp_check\db -l tmp_check\initdb.log
+ - echo include '%TEMP_CONFIG%' >> tmp_check\db\postgresql.conf
+ - tmp_install\bin\pg_ctl.exe start -D tmp_check\db -l tmp_check\postmaster.log
+ plcheck_test_script:
+ - perl src/tools/msvc/vcregress.pl plcheck
+ isolationcheck_test_script:
+ - perl src/tools/msvc/vcregress.pl isolationcheck
+ modulescheck_test_script:
+ - perl src/tools/msvc/vcregress.pl modulescheck
+ contribcheck_test_script:
+ - perl src/tools/msvc/vcregress.pl contribcheck
+ stop_test_script:
+ - tmp_install\bin\pg_ctl.exe stop -D tmp_check\db -l tmp_check\postmaster.log
+ ssl_test_script:
+ - set with_ssl=openssl
+ - perl src/tools/msvc/vcregress.pl taptest .\src\test\ssl\
+ subscriptioncheck_test_script:
+ - perl src/tools/msvc/vcregress.pl taptest .\src\test\subscription\
+ authentication_test_script:
+ - perl src/tools/msvc/vcregress.pl taptest .\src\test\authentication\
+ recoverycheck_test_script:
+ - perl src/tools/msvc/vcregress.pl recoverycheck
+ bincheck_test_script:
+ - perl src/tools/msvc/vcregress.pl bincheck
+ upgradecheck_test_script:
+ - perl src/tools/msvc/vcregress.pl upgradecheck
+ ecpgcheck_test_script:
+ # tries to build additional stuff
+ - vcvarsall x64
+ # References ecpg_regression.proj in the current dir
+ - cd src\tools\msvc
+ - perl vcregress.pl ecpgcheck
+
+ - name: Windows Meson+vs+Ninja
+
+ meson_script:
+ - pip install meson
+ - pip install ninja
+ configure_script:
+ - vcvarsall x64
+ - mkdir subprojects
+ - meson wrap install lz4
+ - meson wrap install zlib
+ - meson setup --buildtype debug --backend ninja -Dcassert=true -Db_pch=true -Dssl=openssl -Dlz4=enabled -Dzlib=enabled -Dextra_lib_dirs=c:\openssl\1.1.1l\lib -Dextra_include_dirs=c:\openssl\1.1.1l\include build
+ build_script:
+ - vcvarsall x64
+ - ninja -C build
+
+ check_script:
+ - vcvarsall x64
+ - meson test --no-rebuild -C build
+
+ - name: Windows Meson+vs+msbuild
+
+ # Need a development version of meson for now
+ meson_dev_script:
+ - git clone https://github.com/mesonbuild/meson.git
+
+ configure_script:
+ - vcvarsall x64
+ - mkdir subprojects
+ - .\meson\meson.py wrap install lz4
+ - .\meson\meson.py wrap install zlib
+ - .\meson\meson.py setup --buildtype debug --backend vs -Dcassert=true -Db_pch=true -Dssl=openssl -Dlz4=enabled -Dzlib=enabled -Dextra_lib_dirs=c:\openssl\1.1.1l\lib -Dextra_include_dirs=c:\openssl\1.1.1l\include build
+
+ build_script:
+ - vcvarsall x64
+ - msbuild -m /p:UseMultiToolTask=true build\postgresql.sln
+
+ check_script:
+ - vcvarsall x64
+ - .\meson\meson.py test --no-rebuild -C build
always:
+ meson_log_artifacts:
+ path: "build/meson-logs/*.txt"
+ type: text/plain
+ cat_dumps_script:
+
cores_script:
- cat crashlog.txt || true
dump_artifacts:
@@ -354,12 +502,16 @@ task:
tap_artifacts:
path: "**/regress_log_*"
type: text/plain
+ meson_test_artifacts:
+ path: "build/meson-logs/testlog.junit.xml"
+ type: text/xml
+ format: junit
task:
name: CompilerWarnings
depends_on:
- - Linux
+ - Linux Autoconf
# task that did not run count as a success, so we need to recheck Linux' condition here :/
only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*linux.*'
container:
--
2.23.0.385.gbc12974a89
Hi,
On 2021-10-12 01:37:21 -0700, Andres Freund wrote:
non-cached build (world-bin):
current: 40.46s
ninja: 7.31s
Interestingly this is pretty close to the minimum achievable on my
machine from the buildsystem perspective.
A build with -fuse-ld=lld, which the above didn't use, takes 6.979s. The
critical path is
bison gram.y -> gram.c 4.13s
gcc gram.c -> gram.o 2.05s
gcc postgres .... 0.317s
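In case someone wants to reproduce the lld comparison, roughly (a sketch: CC_LD needs a reasonably new meson, and passing -Dc_link_args=-fuse-ld=lld should work as well):

  # assumes lld is installed; meson picks the linker at setup time
  CC_LD=lld meson setup build-lld
  ninja -C build-lld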
A very helpful visualization is to transform ninja's build logs into a
tracefile with https://github.com/nico/ninjatracing
I attached an example - the trace.json.gz can be uploaded as-is to
https://ui.perfetto.dev/
It's quite a bit of fun to look at imo.
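Roughly how such a trace gets generated, for anyone wanting to look at their own build (a sketch; the exact ninjatracing invocation is from memory and the checkout path is just illustrative):

  # convert ninja's build log into a chrome-trace style json
  ninja -C build
  ~/src/ninjatracing/ninjatracing build/.ninja_log > trace.json
  gzip trace.json
  # load the resulting trace.json.gz at https://ui.perfetto.dev/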
There's a few other things quickly apparent:
- genbki prevents build progress due to dependencies on the generated
headers.
- the absolutely stupid way I implemented the python2->python3
regression test output conversion uses up a fair bit of resources
- tablecmds.c, pg_dump.c, xlog.c and a few other files are starting to be
big enough to be problematic compile-time wise
Greetings,
Andres Freund
On 12.10.21 10:37, Andres Freund wrote:
For the last year or so I've on and off tinkered with $subject. I think
it's in a state worth sharing now. First, let's look at a little
comparison.
I played with $subject a few years ago and liked it. I think, like you
said, meson is the best way forward. I support this project.
One problem I noticed back then was that some choices that we currently
determine ourselves in configure or the makefiles are hardcoded in
meson. For example, at the time, gcc on macOS was not supported. Meson
thought, if you are on macOS, you are surely using the Apple compiler,
and it supports these options. Fixing that required patches deep in the
bowels of the meson source code (and, in practice, waiting for a new
release etc.). I strongly suspect this isn't the only such problem.
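For concreteness: pointing meson at a different compiler is itself straightforward (a sketch, with gcc-11 standing in for whatever Homebrew happens to install); the problem was what meson's platform logic then assumed about that compiler.

  # meson picks the compiler up from CC/CXX at setup time
  CC=gcc-11 CXX=g++-11 meson setup build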
For example, the shared library build behavior has been carefully tuned
in opinionated ways. With the autotools chain, one can override
anything with enough violence; so we have always felt free to do that.
I haven't followed it in a while, so I don't know what the situation is
now; but it is a concern, because we have always felt free to try new
and unusual build tools (Sun compiler, Intel compiler,
clang-when-it-was-new) early without waiting for anyone else.
On Tue, Oct 12, 2021 at 9:31 AM Peter Eisentraut
<peter.eisentraut@enterprisedb.com> wrote:
One problem I noticed back then was that some choices that we currently
determine ourselves in configure or the makefiles are hardcoded in
meson. For example, at the time, gcc on macOS was not supported. Meson
thought, if you are on macOS, you are surely using the Apple compiler,
and it supports these options. Fixing that required patches deep in the
bowels of the meson source code (and, in practice, waiting for a new
release etc.). I strongly suspect this isn't the only such problem.
For example, the shared library build behavior has been carefully tuned
in opinionated ways. With the autotools chain, one can override
anything with enough violence; so we have always felt free to do that.
I haven't followed it in a while, so I don't know what the situation is
now; but it is a concern, because we have always felt free to try new
and unusual build tools (Sun compiler, Intel compiler,
clang-when-it-was-new) early without waiting for anyone else.
I think we're going to need some solution to this problem. We have too
many people here with strong opinions about questions like this for me
to feel good about the idea that we're going to collectively be OK
with leaving these sorts of decisions up to some other project.
From my point of view, the time it takes to run configure is annoying,
but the build time is pretty fine. On my system, configure takes about
33 seconds, and a full rebuild with 'make -j8' takes 14.5 seconds (I
am using ccache). Moreover, most of the time when I run make, I'm only
doing a partial rebuild, so it's near-instantaneous.
--
Robert Haas
EDB: http://www.enterprisedb.com
On 10/12/21 4:37 AM, Andres Freund wrote:
git remote add andres git@github.com:anarazel/postgres.git
ITYM:
git remote add andres git://github.com/anarazel/postgres.git
cheers
andrew
--
Andrew Dunstan
EDB: https://www.enterprisedb.com
On Tue, Oct 12, 2021 at 10:37 Andres Freund <andres@anarazel.de> wrote:
[... original mail quoted in full; trimmed ...]
Demo / instructions:
# Get code
git remote add andres git@github.com:anarazel/postgres.git
git fetch andres
git checkout --track andres/meson
# setup build directory
meson setup build --buildtype debug
cd build
# build (uses automatically as many cores as available)
ninja
I'm getting errors at this step. You can find my output at
https://pastebin.com/Ar5VqfFG. Setup went well without errors. Is that
expected for now?
# change configuration, build again
meson configure -Dssl=openssl
ninja
# run all tests
meson test
# run just recovery tests
meson test --suite setup --suite recovery
# list tests
meson test --list
Greetings,
Andres Freund
On 10/12/21 4:37 AM, Andres Freund wrote:
# setup build directory
meson setup build --buildtype debug
I took this for an outing on msys2 and it just seems to hang. If it's not hanging it's unbelievably slow.
cheers
andrew
--
Andrew Dunstan
EDB: https://www.enterprisedb.com
Robert Haas <robertmhaas@gmail.com> writes:
I think we're going to need some solution to this problem. We have too
many people here with strong opinions about questions like this for me
to feel good about the idea that we're going to collectively be OK
with leaving these sorts of decisions up to some other project.
Agreed. I'm willing to put up with the costs of moving to some
other build system, but not if it dictates choices we don't want to
make about the end products.
From my point of view, the time it takes to run configure is annoying,
but the build time is pretty fine. On my system, configure takes about
33 seconds, and a full rebuild with 'make -j8' takes 14.5 seconds (I
am using ccache). Moreover, most of the time when I run make, I'm only
doing a partial rebuild, so it's near-instantaneous.
Read about Autoconf's --cache-file option. That and ccache are
absolutely essential tools IMO.
regards, tom lane
On 10/12/21 11:28 AM, Andrew Dunstan wrote:
On 10/12/21 4:37 AM, Andres Freund wrote:
# setup build directory
meson setup build --buildtype debug
I took this for an outing on msys2 and it just seems to hang. If it's not hanging it's unbelievably slow.
It hung because it expected the compiler to be 'ccache cc'. Hanging in
such a case is kinda unforgivable. I remedied that by setting 'CC=gcc'
but it then errored out looking for perl libs. I think msys2 is going to
be a bit difficult here :-(
cheers
andrew
--
Andrew Dunstan
EDB: https://www.enterprisedb.com
Hi,
On 2021-10-12 15:30:57 +0200, Peter Eisentraut wrote:
I played with $subject a few years ago and liked it. I think, like you
said, meson is the best way forward. I support this project.
Cool.
One problem I noticed back then was that some choices that we currently
determine ourselves in configure or the makefiles are hardcoded in meson.
Yea, there's some of that. I think some degree of reduction in flexibility is
needed to realistically target multiple "backend" build-systems like visual
studio project files etc. but I wish there were a bit less of that
nonetheless.
For example, at the time, gcc on macOS was not supported. Meson thought, if
you are on macOS, you are surely using the Apple compiler, and it supports
these options.
I'm pretty sure this one now can just be overridden with CC=gcc. It can on
linux and windows, but I don't have ready interactive access with a mac
(leaving cirrus aside, which now has a "start a terminal" option...).
For example, the shared library build behavior has been carefully tuned in
opinionated ways. With the autotools chain, one can override anything with
enough violence; so we have always felt free to do that. I haven't followed
it in a while, so I don't know what the situation is now; but it is a
concern, because we have always felt free to try new and unusual build tools
(Sun compiler, Intel compiler, clang-when-it-was-new) early without waiting
for anyone else.
It's possible to just take over building e.g. shared libraries ourselves with
custom targets. Although it'd be a bit annoying to do. The bigger problem is
that that e.g. wouldn't play that nicely with generating visual studio
projects, which require link steps to be generated in a certain way. It'd build,
but the GUI might lose some of its options. Etc.
Greetings,
Andres Freund
Hi,
On 2021-10-12 11:50:03 -0400, Andrew Dunstan wrote:
It hung because it expected the compiler to be 'ccache cc'. Hanging in
such a case is kinda unforgivable. I remedied that by setting 'CC=gcc'
but it then errored out looking for perl libs. I think msys2 is going to
be a bit difficult here :-(
Hm. Yea, the perl thing is my fault - you should be able to get past it with
-Dperl=disabled, and I'll take a look at fixing the perl detection. (*)
I can't reproduce the hanging though. I needed to install bison, flex and
ninja and disable perl as described above, but then it built just fine.
It does seem to crash somewhere in the main regression tests though, I think
I don't do the "set stack depth" dance correctly for msys.
If you repro the hanging, what's the last bit in meson-logs/meson-log.txt?
(*) I've for now made most dependencies autodetected, unless you pass
--auto-features disabled to collectively disable all the auto-detected
features. Initially I had mirrored the autoconf behaviour, but I got sick of
forgetting to turn off readline or zlib on windows. And then it was useful to
test on multiple operating systems...
For working on windows meson's wraps are quite useful. I've not added that to
the git branch, but if you manually do
mkdir subprojects
meson wrap install lz4
meson wrap install zlib
building with -Dzlib=enabled -Dlz4=enabled will fall back to building lz4,
zlib as-needed.
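I.e. roughly (assuming an existing build directory named "build"):
$ meson configure build -Dlz4=enabled -Dzlib=enabled
$ ninja -C build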
I was wondering about adding a binary wrap for e.g. bison, flex on windows, so
that the process of getting a build going isn't as arduous.
Greetings,
Andres Freund
Hi,
On 2021-10-12 17:21:50 +0200, Josef Šimánek wrote:
# build (uses automatically as many cores as available)
ninja
I'm getting errors at this step. You can find my output at
https://pastebin.com/Ar5VqfFG. Setup went well without errors. Is that
expected for now?
Thanks, that's helpful. And no, that's not expected (*), it should be fixed.
What OS / distribution / version is this?
Can you build postgres "normally" with --with-gss? Seems like we're ending up
with a version of gssapi that we're not compatible with.
You should be able to get past this by disabling gss using meson configure
-Dgssapi=disabled.
Greetings,
Andres Freund
* except kinda, in the sense that I'd expect it to be buggy, given that I've
run it only on a few machines and it's very, uh, bleeding edge
Hi,
On 2021-10-12 09:59:26 -0700, Andres Freund wrote:
On 2021-10-12 11:50:03 -0400, Andrew Dunstan wrote:
It hung because it expected the compiler to be 'ccache cc'. Hanging in
such a case is kinda unforgivable. I remedied that by setting 'CC=gcc'
but it then errored out looking for perl libs. I think msys2 is going to
be a bit difficult here :-(
Hm. Yea, the perl thing is my fault - you should be able to get past it with
-Dperl=disabled, and I'll take a look at fixing the perl detection. (*)
This is a weird one. I don't know much about msys, so it's probably related to
that. Perl spits out /usr/lib/perl5/core_perl/ as its archlibexp. According to
shell commands that exists, but not according to msys's own python
$ /mingw64/bin/python -c "import os; p = '/usr/lib/perl5/core_perl/CORE'; print(f'does {p} exist:', os.path.exists(p))"
does /usr/lib/perl5/core_perl/CORE exist: False
$ ls -ld /usr/lib/perl5/core_perl/CORE
drwxr-xr-x 1 anfreund anfreund 0 Oct 10 10:19 /usr/lib/perl5/core_perl/CORE
So it's not too surprising that that doesn't work out. It's easy enough to
work around, but still pretty weird.
I pushed a workaround for the config-time error, but it doesn't yet recognize
msys perl correctly. But at least it's not alone in that - configure doesn't
seem to either, so I'm probably doing something wrong :)
I can't reproduce the hanging though. I needed to install bison, flex and
ninja and disable perl as described above, but then it built just fine.
It does seem to crash somewhere in the main regression tests though, I think
I don't do the "set stack depth" dance correctly for msys.
That was it - just hadn't ported setting -Wl,--stack=... for !msvc
windows. Pushed the fix for that out.
I guess I should figure out how to commandline install msys and add it to CI.
Greetings,
Andres Freund
On 10/12/21 12:59 PM, Andres Freund wrote:
If you repro the hanging, what's the last bit in meson-logs/meson-log.txt?
Here's the entire thing
# cat C:/tools/msys64/home/Administrator/postgresql/build/meson-logs/meson-log.txt
Build started at 2021-10-12T18:08:34.387568
Main binary: C:/tools/msys64/mingw64/bin/python.exe
Build Options: -Dbuildtype=debug
Python system: Windows
The Meson build system
Version: 0.59.1
Source dir: C:/tools/msys64/home/Administrator/postgresql
Build dir: C:/tools/msys64/home/Administrator/postgresql/build
Build type: native build
Project name: postgresql
Project version: 15devel
Sanity testing C compiler: ccache cc
Is cross compiler: False.
Sanity check compiler command line: ccache cc sanitycheckc.c -o
sanitycheckc.exe -D_FILE_OFFSET_BITS=64
Sanity check compile stdout:
-----
Sanity check compile stderr:
-----
meson.build:1:0: ERROR: Compiler ccache cc can not compile programs.
cheers
andrew
--
Andrew Dunstan
EDB: https://www.enterprisedb.com
Hi,
On 2021-10-12 14:11:39 -0400, Andrew Dunstan wrote:
On 10/12/21 12:59 PM, Andres Freund wrote:
If you repro the hanging, what's the last bit in meson-logs/meson-log.txt?
Here's the entire thing
Sanity check compiler command line: ccache cc sanitycheckc.c -o
sanitycheckc.exe -D_FILE_OFFSET_BITS=64
Sanity check compile stdout:
-----
Sanity check compile stderr:
-----
meson.build:1:0: ERROR: Compiler ccache cc can not compile programs.
Huh, it's not a question of gcc vs cc, it's that meson automatically uses
ccache. And it looks like msys's ccache is broken at the moment (installed
yesterday):
$ ccache --version
ccache version 4.4.1
...
$ echo > test.c
$ ccache cc -c test.c
Segmentation fault (core dumped)
..
not sure how that leads to hanging, but it's not too surprising that things
don't work out after that...
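Until that's fixed, explicitly pointing meson at the compiler (as you did)
should sidestep the automatic ccache use, e.g.:
$ CC=gcc meson setup build --buildtype debug
(just a sketch, I haven't tested that on msys here.)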
Greetings,
Andres Freund
On 10/12/21 2:09 PM, Andres Freund wrote:
Hi,
On 2021-10-12 09:59:26 -0700, Andres Freund wrote:
On 2021-10-12 11:50:03 -0400, Andrew Dunstan wrote:
It hung because it expected the compiler to be 'ccache cc'. Hanging in
such a case is kinda unforgivable. I remedied that by setting 'CC=gcc'
but it then errored out looking for perl libs. I think msys2 is going to
be a bit difficult here :-(
Hm. Yea, the perl thing is my fault - you should be able to get past it with
-Dperl=disabled, and I'll take a look at fixing the perl detection. (*)
This is a weird one. I don't know much about msys, so it's probably related to
that. Perl spits out /usr/lib/perl5/core_perl/ as its archlibexp. According to
shell commands that exists, but not according to msys's own python
$ /mingw64/bin/python -c "import os; p = '/usr/lib/perl5/core_perl/CORE'; print(f'does {p} exist:', os.path.exists(p))"
does /usr/lib/perl5/core_perl/CORE exist: False
$ ls -ld /usr/lib/perl5/core_perl/CORE
drwxr-xr-x 1 anfreund anfreund 0 Oct 10 10:19 /usr/lib/perl5/core_perl/CORE
Looks to me like a python issue:
# perl -e 'my $p = "/usr/lib/perl5/core_perl/CORE"; print qq(does $p
exist: ), -e $p, qq{\n};'
does /usr/lib/perl5/core_perl/CORE exist: 1
# python -c "import os; p = '/usr/lib/perl5/core_perl/CORE';
print(f'does {p} exist:', os.path.exists(p))"
does /usr/lib/perl5/core_perl/CORE exist: False
# cygpath -m /usr/lib/perl5/core_perl/CORE
C:/tools/msys64/usr/lib/perl5/core_perl/CORE
# python -c "import os; p =
'C:/tools/msys64/usr/lib/perl5/core_perl/CORE'; print(f'does {p}
exist:', os.path.exists(p))"
does C:/tools/msys64/usr/lib/perl5/core_perl/CORE exist: True
Clearly python is not understanding msys virtualized paths.
I guess I should figure out how to commandline install msys and add it to CI.
here's what I do:
# msys2 outputs esc-[3J which clears the screen's scroll buffer. Nasty.
# so we redirect the output
# find the log in c:\Windows\System32 if needed
choco install -y --no-progress --limit-output msys2 > msys2inst.log
c:\tools\msys64\usr\bin\bash -l
'/c/vfiles/windows-uploads/msys2-packages.sh'
Here's what's in msys-packages.sh:
pacman -S --needed --noconfirm \
base-devel \
msys/git \
msys/ccache \
msys/vim \
msys/perl-Crypt-SSLeay \
mingw-w64-clang-x86_64-toolchain \
mingw-w64-x86_64-toolchain
# could do: pacman -S --needed --noconfirm development
# this is more economical. These should cover most of the things you might
# want to configure with
pacman -S --needed --noconfirm \
msys/gettext-devel \
msys/icu-devel \
msys/libiconv-devel \
msys/libreadline-devel \
msys/libxml2-devel \
msys/libxslt-devel \
msys/openssl-devel \
msys/zlib-devel
cheers
andrew
--
Andrew Dunstan
EDB: https://www.enterprisedb.com
On 10/12/21 2:23 PM, Andres Freund wrote:
Hi,
On 2021-10-12 14:11:39 -0400, Andrew Dunstan wrote:
On 10/12/21 12:59 PM, Andres Freund wrote:
If you repro the hanging, what's the last bit in meson-logs/meson-log.txt?
Here's the entire thing
Sanity check compiler command line: ccache cc sanitycheckc.c -o
sanitycheckc.exe -D_FILE_OFFSET_BITS=64
Sanity check compile stdout:
-----
Sanity check compile stderr:
-----
meson.build:1:0: ERROR: Compiler ccache cc can not compile programs.
Huh, it's not a question of gcc vs cc, it's that meson automatically uses
ccache. And it looks like msys's ccache is broken at the moment (installed
yesterday):
$ ccache --version
ccache version 4.4.1
...
$ echo > test.c
$ ccache cc -c test.c
Segmentation fault (core dumped)
..
not sure how that leads to hanging, but it's not too surprising that things
don't work out after that...
Yes, I've had to disable ccache on fairywren.
cheers
andrew
--
Andrew Dunstan
EDB: https://www.enterprisedb.com
Hi,
On 2021-10-12 09:15:41 -0700, Andres Freund wrote:
For example, at the time, gcc on macOS was not supported. Meson thought, if
you are on macOS, you are surely using the Apple compiler, and it supports
these options.
I'm pretty sure this one now can just be overridden with CC=gcc. It can on
linux and windows, but I don't have ready interactive access with a mac
(leaving cirrus aside, which now has a "start a terminal" option...).
It was a tad more complicated. But only because it took me a while to figure
out how to make gcc on macos actually work, independent of meson. Initially
gcc was always failing with errors about not finding the linker, and
installing binutils was a dead end.
Turns out just using a gcc at a specific path doesn't work, it ends up using
wrong internal binaries or something like that.
Once I got to that, the meson part was easy:
$ export PATH="/usr/local/opt/gcc/bin:$PATH"
$ CC=gcc-11 meson setup build-gcc
...
C compiler for the host machine: gcc-11 (gcc 11.2.0 "gcc-11 (Homebrew GCC 11.2.0) 11.2.0")
...
$ cd build-gcc
$ ninja test
...
181/181 postgresql:tap+subscription / subscription/t/100_bugs.pl OK 17.83s 5 subtests passed
Ok: 180
Expected Fail: 0
Fail: 0
Unexpected Pass: 0
Skipped: 1
Timeout: 0
One thing that is nice with meson's testrunner is that it can parse the output
of tap tests and recognizes the number of completed / failed subtests. I
wonder whether we could make pg_regress' output tap compliant without the
output quality suffering too much.
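Illustratively - not what pg_regress prints today - the kind of stream a TAP
parser understands would be something like:
1..3
ok 1 - boolean
not ok 2 - char
ok 3 - text # SKIP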
Greetings,
Andres Freund
On 10/12/21 2:37 PM, Andrew Dunstan wrote:
On 10/12/21 2:09 PM, Andres Freund wrote:
Hi,
On 2021-10-12 09:59:26 -0700, Andres Freund wrote:
On 2021-10-12 11:50:03 -0400, Andrew Dunstan wrote:
It hung because it expected the compiler to be 'ccache cc'. Hanging in
such a case is kinda unforgivable. I remedied that by setting 'CC=gcc'
but it then errored out looking for perl libs. I think msys2 is going to
be a bit difficult here :-(
Hm. Yea, the perl thing is my fault - you should be able to get past it with
-Dperl=disabled, and I'll take a look at fixing the perl detection. (*)
This is a weird one. I don't know much about msys, so it's probably related to
that. Perl spits out /usr/lib/perl5/core_perl/ as its archlibexp. According to
shell commands that exists, but not according to msys's own python
$ /mingw64/bin/python -c "import os; p = '/usr/lib/perl5/core_perl/CORE'; print(f'does {p} exist:', os.path.exists(p))"
does /usr/lib/perl5/core_perl/CORE exist: False
$ ls -ld /usr/lib/perl5/core_perl/CORE
drwxr-xr-x 1 anfreund anfreund 0 Oct 10 10:19 /usr/lib/perl5/core_perl/CORE
Looks to me like a python issue:
# perl -e 'my $p = "/usr/lib/perl5/core_perl/CORE"; print qq(does $p
exist: ), -e $p, qq{\n};'
does /usr/lib/perl5/core_perl/CORE exist: 1
# python -c "import os; p = '/usr/lib/perl5/core_perl/CORE';
print(f'does {p} exist:', os.path.exists(p))"
does /usr/lib/perl5/core_perl/CORE exist: False
# cygpath -m /usr/lib/perl5/core_perl/CORE
C:/tools/msys64/usr/lib/perl5/core_perl/CORE
# python -c "import os; p =
'C:/tools/msys64/usr/lib/perl5/core_perl/CORE'; print(f'does {p}
exist:', os.path.exists(p))"
does C:/tools/msys64/usr/lib/perl5/core_perl/CORE exist: True
Clearly python is not understanding msys virtualized paths.
It's a matter of which python you use. The one that understands msys
paths is msys/python. The mingw64 packages are normally pure native
windows and so don't understand msys paths. I know it's confusing :-(
# /usr/bin/python -c "import os; p = '/usr/lib/perl5/core_perl/CORE';
print(f'does {p} exist:', os.path.exists(p))"
does /usr/lib/perl5/core_perl/CORE exist: True
cheers
andrew
--
Andrew Dunstan
EDB: https://www.enterprisedb.com
Hi,
On 2021-10-12 14:37:04 -0400, Andrew Dunstan wrote:
On 10/12/21 2:09 PM, Andres Freund wrote:
Hm. Yea, the perl thing is my fault - you should be able to get past it with
-Dperl=disabled, and I'll take a look at fixing the perl detection. (*)
This is a weird one. I don't know much about msys, so it's probably related to
that. Perl spits out /usr/lib/perl5/core_perl/ as its archlibexp. According to
shell commands that exists, but not according to msys's own python
$ /mingw64/bin/python -c "import os; p = '/usr/lib/perl5/core_perl/CORE'; print(f'does {p} exist:', os.path.exists(p))"
does /usr/lib/perl5/core_perl/CORE exist: False
$ ls -ld /usr/lib/perl5/core_perl/CORE
drwxr-xr-x 1 anfreund anfreund 0 Oct 10 10:19 /usr/lib/perl5/core_perl/CORE
Looks to me like a python issue:
Clearly python is not understanding msys virtualized paths.
Ah, it's a question of the *wrong* python being used :/. I somehow ended up
with both a mingw and an msys python, with the mingw python taking precedence
over the msys one. The latter one does understand such paths.
I guess I should figure out how to commandline install msys and add it to CI.
here's what I do:
Thanks!
Does that recipe get you to a build where ./configure --with-perl succeeds?
I see this here:
checking for Perl archlibexp... /usr/lib/perl5/core_perl
checking for Perl privlibexp... /usr/share/perl5/core_perl
checking for Perl useshrplib... true
checking for CFLAGS recommended by Perl... -DPERL_USE_SAFE_PUTENV -U__STRICT_ANSI__ -D_GNU_SOURCE -march=x86-64 -mtune=generic -O2 -pipe -fwrapv -fno-strict-aliasing -fstack-protector-strong
checking for CFLAGS to compile embedded Perl... -DPERL_USE_SAFE_PUTENV
checking for flags to link embedded Perl... no
configure: error: could not determine flags for linking embedded Perl.
This probably means that ExtUtils::Embed or ExtUtils::MakeMaker is not
installed.
If I just include perl.h from a test file with gcc using the above flags it
fails to compile:
$ echo '#include <perl.h>' > test.c
$ gcc -DPERL_USE_SAFE_PUTENV -U__STRICT_ANSI__ -D_GNU_SOURCE -march=x86-64 -mtune=generic -O2 -pipe -fwrapv -fno-strict-aliasing -fstack-protector-strong test.c -c -I /c/dev/msys64/usr/lib/perl5/core_perl/CORE
In file included from test.c:1:
C:/dev/msys64/usr/lib/perl5/core_perl/CORE/perl.h:1003:13: fatal error: sys/wait.h: No such file or directory
1003 | # include <sys/wait.h>
and ldopts bleats
$ perl -MExtUtils::Embed -e ldopts
Warning (mostly harmless): No library found for -lpthread
Warning (mostly harmless): No library found for -ldl
-Wl,--enable-auto-import -Wl,--export-all-symbols -Wl,--enable-auto-image-base -fstack-protector-strong -L/usr/lib/perl5/core_perl/CORE -lperl -lcrypt
Greetings,
Andres Freund
On Tue, Oct 12, 2021 at 4:37 AM Andres Freund <andres@anarazel.de> wrote:
[Meson prototype]
The build code looks pretty approachable for someone with no prior
exposure, and feels pretty nice when running it (I couldn't get a build
working but I'll leave that aside for now).
As far as I can tell the only OS that postgres currently supports that
meson doesn't support is HPUX. It'd likely be fairly easy to add
gcc-on-hpux support, a chunk more to add support for the proprietary
ones.
That would also have to work for all the dependencies, which were displayed
to me as:
ninja, gdbm, ca-certificates, openssl@1.1, readline, sqlite and python@3.9
Also, could utility makefile targets be made to work? I'm thinking in
particular of update-unicode and reformat-dat-files, for example.
--
John Naylor
EDB: http://www.enterprisedb.com
On 10/12/21 3:29 PM, Andres Freund wrote:
Does that recipe get you to a build where ./configure --with-perl succeeds?
I see this here:
checking for Perl archlibexp... /usr/lib/perl5/core_perl
checking for Perl privlibexp... /usr/share/perl5/core_perl
checking for Perl useshrplib... true
checking for CFLAGS recommended by Perl... -DPERL_USE_SAFE_PUTENV -U__STRICT_ANSI__ -D_GNU_SOURCE -march=x86-64 -mtune=generic -O2 -pipe -fwrapv -fno-strict-aliasing -fstack-protector-strong
checking for CFLAGS to compile embedded Perl... -DPERL_USE_SAFE_PUTENV
checking for flags to link embedded Perl... no
configure: error: could not determine flags for linking embedded Perl.
This probably means that ExtUtils::Embed or ExtUtils::MakeMaker is not
installed.
If I just include perl.h from a test file with gcc using the above flags it
fails to compile:
You need to build against a native perl, like Strawberry or ActiveState.
(I have had mixed success with Strawberry) You do that by putting a path
to it at the start of the PATH. The wrinkle in this is that you need
prove to point to one that understands virtual paths. So you do
something like this:
PATH="/c/perl/bin:$PATH" PROVE=/bin/core_perl/prove configure ...
cheers
andrew
--
Andrew Dunstan
EDB: https://www.enterprisedb.com
Hi,
On 2021-10-12 16:02:14 -0400, Andrew Dunstan wrote:
You need to build against a native perl, like Strawberry or ActiveState.
(I have had mixed success with Strawberry)
Do you understand why that is needed?
You do that by putting a path to it at the start of the PATH. The wrinkle in
this is that you need prove to point to one that understands virtual
paths. So you do something like this:
PATH="/c/perl/bin:$PATH" PROVE=/bin/core_perl/prove configure ...
Oh my.
I'll try that later... I wonder if we could make this easier from our side?
This is a lot of magic to know.
Greetings,
Andres Freund
Hi,
On 2021-10-12 15:55:22 -0400, John Naylor wrote:
On Tue, Oct 12, 2021 at 4:37 AM Andres Freund <andres@anarazel.de> wrote:
The build code looks pretty approachable for someone with no prior
exposure, and feels pretty nice when running it
That's part of what attracted me...
(I couldn't get a build working but I'll leave that aside for now).
If you want to do that separately, I'll try to fix it.
As far as I can tell the only OS that postgres currently supports that
meson doesn't support is HPUX. It'd likely be fairly easy to add
gcc-on-hpux support, a chunk more to add support for the proprietary
ones.
That would also have to work for all the dependencies, which were displayed
to me as:
ninja, gdbm, ca-certificates, openssl@1.1, readline, sqlite and python@3.9
meson does depend on ninja (to execute the build) and of course python. But
the rest should be optional dependencies. ninja builds without any
dependencies as long as you don't change its parser sources. python builds on
aix, hpux etc.
Not sure what way gdbm openssl@1.1 and sqlite are pulled in? I assume readline
is for python...
Also, could utility makefile targets be made to work? I'm thinking in
particular of update-unicode and reformat-dat-files, for example.
Yes, that shouldn't be a problem. You can run arbitrary code in targets
(there's plenty of need for that already in what I have so far).
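E.g. once wired up as run targets, they'd presumably be invoked by name much
like any other target (target names here just mirror the existing make ones):
$ ninja -C build update-unicode
$ ninja -C build reformat-dat-files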
Greetings,
Andres Freund
On Tue, Oct 12, 2021 at 19:17 Andres Freund <andres@anarazel.de> wrote:
Hi,
On 2021-10-12 17:21:50 +0200, Josef Šimánek wrote:
# build (uses automatically as many cores as available)
ninja
I'm getting errors at this step. You can find my output at
https://pastebin.com/Ar5VqfFG. Setup went well without errors. Is that
expected for now?
Thanks, that's helpful. And no, that's not expected (*), it should be fixed.
What OS / distribution / version is this?
Fedora 34 (64 bit)
Can you build postgres "normally" with --with-gss? Seems like we're ending up
with a version of gssapi that we're not compatible with.
Yes, I can.
You should be able to get past this by disabling gss using meson configure
-Dgssapi=disabled.
I tried to clean and start from scratch, but I'm getting a different
error, probably related to a wrongly configured JIT (LLVM wasn't found
during meson setup). I'll debug on my side to provide more info.
Whole build error could be found at https://pastebin.com/hCFqcPvZ.
Setup log could be found at https://pastebin.com/wjbE1w56.
Greetings,
Andres Freund
* except kinda, in the sense that I'd expect it to be buggy, given that I've
run it only on a few machines and it's very, uh, bleeding edge
Hi,
On 2021-10-13 01:19:27 +0200, Josef Šimánek wrote:
I tried to clean and start from scratch, but I'm getting different
error probably related to wrongly configured JIT (LLVM wasn't found
during meson setup). I'll debug on my side to provide more info.
../src/backend/jit/jit.c:91:73: error: ‘DLSUFFIX’ undeclared (first use in this function)
91 | snprintf(path, MAXPGPATH, "%s/%s%s", pkglib_path, jit_provider, DLSUFFIX);
| ^~~~~~~~
This *very* likely is related to building in a source tree that also contains
a "non-meson" build "in place". The problem is that the meson build picks up
the pg_config.h generated by ./configure in the "normal" build, rather than
the one meson generated itself.
You'd need to execute make distclean or such, or use a separate git checkout.
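I.e. in the checkout that has the in-place ./configure build, something like:
$ make distclean
or, more aggressively (removes *all* untracked and ignored files, so be careful):
$ git clean -dfx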
I forgot about this issue because I only ever build postgres from outside the
source-tree (by invoking configure from a separate directory), so there's
never build products in it. I think at least I need to make the build emit a
warning / error if there's a pg_config.h in the source tree...
This is the part of the jit code that's built regardless of llvm availability
- you'd get the same error in a few other places unrelated to jit.
Greetings,
Andres Freund
Hi,
On 2021-10-12 13:42:56 -0700, Andres Freund wrote:
On 2021-10-12 16:02:14 -0400, Andrew Dunstan wrote:
You do that by putting a path to it at the start of the PATH. The wrinkle in
this is that you need prove to point to one that understands virtual
paths. So you do something like this:
PATH="/c/perl/bin:$PATH" PROVE=/bin/core_perl/prove configure ...
Oh my.
I'll try that later... I wonder if we could make this easier from our side?
This is a lot of magic to know.
I managed to get this working. At first it failed because I don't have
pexports - it's not available inside msys as far as I could tell. And seems to
be unmaintained. But replacing pexports with gendef fixed that.
There's this comment in src/pl/plperl/GNUmakefile
# Perl on win32 ships with import libraries only for Microsoft Visual C++,
# which are not compatible with mingw gcc. Therefore we need to build a
# new import library to link with.
but I seem to be able to link fine without going through that song-and-dance?
Greetings,
Andres Freund
On 12 Oct 2021, at 21:01, Andres Freund <andres@anarazel.de> wrote:
One thing that is nice with meson's testrunner is that it can parse the output
of tap tests and recognizes the number of completed / failed subtests. I
wonder whether we could make pg_regress' output tap compliant without the
output quality suffering too much.
I added a --tap option for TAP output to pg_regress together with Jinbao Chen
for giggles and killing some time a while back. It's not entirely done and
sort of PoC, but most of it works. Might not be of interest here, but in case
it is I've refreshed it slightly and rebased it. There might be better ways to
do it, but the aim was to make the diff against the guts of pg_regress small
and instead extract output functions for the different formats.
It omits the test timings, but that could be added either as a diagnostic line
following each status or as a YAML block in TAP 13 (the attached is standard
TAP, not version 13 but the change would be trivial).
If it's helpful and there's any interest for this I'm happy to finish it up now.
One thing that came out of this is that we don't really handle the ignored
tests in the way the code thinks it does for normal output; the attached treats
ignored tests as SKIP tests.
--
Daniel Gustafsson https://vmware.com/
Attachments:
pg_regress_tap.diff (application/octet-stream)
diff --git a/src/test/regress/pg_regress.c b/src/test/regress/pg_regress.c
index 05296f7ee1..dc60edf82e 100644
--- a/src/test/regress/pg_regress.c
+++ b/src/test/regress/pg_regress.c
@@ -95,6 +95,7 @@ static char *dlpath = PKGLIBDIR;
static char *user = NULL;
static _stringlist *extraroles = NULL;
static char *config_auth_datadir = NULL;
+static bool tap = false;
/* internal variables */
static const char *progname;
@@ -120,9 +121,71 @@ static int fail_ignore_count = 0;
static bool directory_exists(const char *dir);
static void make_directory(const char *dir);
-static void header(const char *fmt,...) pg_attribute_printf(1, 2);
+struct output_func
+{
+ void (*header)(const char *line);
+ void (*footer)(const char *difffilename, const char *logfilename);
+ void (*comment)(const char *comment);
+
+ void (*test_status_preamble)(const char *testname);
+
+ void (*test_status_ok)(const char *testname);
+ void (*test_status_failed)(const char *testname);
+ void (*test_status_ignored)(const char *testname);
+
+ void (*test_runtime)(const char *testname, double runtime);
+};
+
+
+void (*test_runtime)(const char *testname, double runtime);
+/* Text output format */
+static void header_text(const char *line);
+static void footer_text(const char *difffilename, const char *logfilename);
+static void comment_text(const char *comment);
+static void test_status_preamble_text(const char *testname);
+static void test_status_ok_text(const char *testname);
+static void test_status_failed_text(const char *testname);
+static void test_runtime_text(const char *testname, double runtime);
+
+struct output_func output_func_text =
+{
+ header_text,
+ footer_text,
+ comment_text,
+ test_status_preamble_text,
+ test_status_ok_text,
+ test_status_failed_text,
+ NULL,
+ test_runtime_text
+};
+
+/* TAP output format */
+static void header_tap(const char *line);
+static void footer_tap(const char *difffilename, const char *logfilename);
+static void comment_tap(const char *comment);
+static void test_status_ok_tap(const char *testname);
+static void test_status_failed_tap(const char *testname);
+static void test_status_ignored_tap(const char *testname);
+
+struct output_func output_func_tap =
+{
+ header_tap,
+ footer_tap,
+ comment_tap,
+ NULL,
+ test_status_ok_tap,
+ test_status_failed_tap,
+ test_status_ignored_tap,
+ NULL
+};
+
+struct output_func *output = &output_func_text;
+
+static void test_status_ok(const char *testname);
+
static void status(const char *fmt,...) pg_attribute_printf(1, 2);
static void psql_command(const char *database, const char *query,...) pg_attribute_printf(2, 3);
+static void status_end(void);
/*
* allow core files if possible.
@@ -206,18 +269,227 @@ split_to_stringlist(const char *s, const char *delim, _stringlist **listhead)
/*
* Print a progress banner on stdout.
*/
+static void
+header_text(const char *line)
+{
+ fprintf(stdout, "============== %-38s ==============\n", line);
+ fflush(stdout);
+}
+
+static void
+header_tap(const char *line)
+{
+ fprintf(stdout, "# %s\n", line);
+ fflush(stdout);
+}
+
static void
header(const char *fmt,...)
{
char tmp[64];
va_list ap;
+ if (!output->header)
+ return;
+
va_start(ap, fmt);
vsnprintf(tmp, sizeof(tmp), fmt, ap);
va_end(ap);
- fprintf(stdout, "============== %-38s ==============\n", tmp);
- fflush(stdout);
+ output->header(tmp);
+}
+
+static void
+footer_tap(const char *difffilename, const char *logfilename)
+{
+ status("1..%i\n", (fail_count + fail_ignore_count + success_count));
+ status_end();
+}
+
+static void
+footer(const char *difffilename, const char *logfilename)
+{
+ if (output->footer)
+ output->footer(difffilename, logfilename);
+}
+
+static void
+comment_text(const char *comment)
+{
+ status("%s", comment);
+}
+
+static void
+comment_tap(const char *comment)
+{
+ status("# %s", comment);
+}
+
+static void
+comment(const char *fmt,...)
+{
+ char tmp[256];
+ va_list ap;
+
+ if (!output->comment)
+ return;
+
+ va_start(ap, fmt);
+ vsnprintf(tmp, sizeof(tmp), fmt, ap);
+ va_end(ap);
+
+ output->comment(tmp);
+}
+
+static void
+test_status_preamble_text(const char *testname)
+{
+ status(_("test %-28s ... "), testname);
+}
+
+static void
+test_status_preamble(const char *testname)
+{
+ if (output->test_status_preamble)
+ output->test_status_preamble(testname);
+}
+
+static void
+test_status_ok_tap(const char *testname)
+{
+ /* There is no NLS translation here as "ok" is a protocol message */
+ status("ok %i - %s",
+ (fail_count + fail_ignore_count + success_count),
+ testname);
+}
+
+static void
+test_status_ok_text(const char *testname)
+{
+ (void) testname; /* unused */
+ status(_("ok ")); /* align with FAILED */
+}
+
+static void
+test_status_ok(const char *testname)
+{
+ success_count++;
+ if (output->test_status_ok)
+ output->test_status_ok(testname);
+}
+
+static void
+test_status_failed_tap(const char *testname)
+{
+ status("not ok %i - %s",
+ (fail_count + fail_ignore_count + success_count),
+ testname);
+}
+
+static void
+test_status_failed_text(const char *testname)
+{
+ status(_("FAILED"));
+}
+
+static void
+test_status_failed(const char *testname)
+{
+ fail_count++;
+ if (output->test_status_failed)
+ output->test_status_failed(testname);
+}
+
+static void
+test_status_ignored(const char *testname)
+{
+ fail_ignore_count++;
+ if (output->test_status_ignored)
+ output->test_status_ignored(testname);
+}
+
+static void
+test_status_ignored_tap(const char *testname)
+{
+ status("ok %i - %s # SKIP (ignored)",
+ (fail_count + fail_ignore_count + success_count),
+ testname);
+}
+
+static void
+test_runtime_text(const char *testname, double runtime)
+{
+ (void)testname;
+ status(_(" %8.0f ms"), runtime);
+}
+
+static void
+runtime(const char *testname, double runtime)
+{
+ if (output->test_runtime)
+ output->test_runtime(testname, runtime);
+}
+
+static void
+footer_text(const char *difffilename, const char *logfilename)
+{
+ char buf[256];
+
+ /*
+ * Emit nice-looking summary message
+ */
+ if (fail_count == 0 && fail_ignore_count == 0)
+ snprintf(buf, sizeof(buf),
+ _(" All %d tests passed. "),
+ success_count);
+ else if (fail_count == 0) /* fail_count=0, fail_ignore_count>0 */
+ snprintf(buf, sizeof(buf),
+ _(" %d of %d tests passed, %d failed test(s) ignored. "),
+ success_count,
+ success_count + fail_ignore_count,
+ fail_ignore_count);
+ else if (fail_ignore_count == 0) /* fail_count>0 && fail_ignore_count=0 */
+ snprintf(buf, sizeof(buf),
+ _(" %d of %d tests failed. "),
+ fail_count,
+ success_count + fail_count);
+ else
+ /* fail_count>0 && fail_ignore_count>0 */
+ snprintf(buf, sizeof(buf),
+ _(" %d of %d tests failed, %d of these failures ignored. "),
+ fail_count + fail_ignore_count,
+ success_count + fail_count + fail_ignore_count,
+ fail_ignore_count);
+
+ putchar('\n');
+ for (int i = strlen(buf); i > 0; i--)
+ putchar('=');
+ printf("\n%s\n", buf);
+ for (int i = strlen(buf); i > 0; i--)
+ putchar('=');
+ putchar('\n');
+ putchar('\n');
+
+ if (difffilename && logfilename)
+ {
+ printf(_("The differences that caused some tests to fail can be viewed in the\n"
+ "file \"%s\". A copy of the test summary that you see\n"
+ "above is saved in the file \"%s\".\n\n"),
+ difffilename, logfilename);
+ }
+}
+
+static void
+status_start(bool single, const char *testname)
+{
+ /* TAP only outputs after the test has finished */
+ if (tap)
+ return;
+
+ if (single)
+ status(_("test %-24s ... "), testname);
+ else
+ status(_(" %-24s ... "), testname);
}
/*
@@ -917,13 +1189,13 @@ initialize_environment(void)
#endif
if (pghost && pgport)
- printf(_("(using postmaster on %s, port %s)\n"), pghost, pgport);
+ comment(_("(using postmaster on %s, port %s)\n"), pghost, pgport);
if (pghost && !pgport)
- printf(_("(using postmaster on %s, default port)\n"), pghost);
+ comment(_("(using postmaster on %s, default port)\n"), pghost);
if (!pghost && pgport)
- printf(_("(using postmaster on Unix socket, port %s)\n"), pgport);
+ comment(_("(using postmaster on Unix socket, port %s)\n"), pgport);
if (!pghost && !pgport)
- printf(_("(using postmaster on Unix socket, default port)\n"));
+ comment(_("(using postmaster on Unix socket, default port)\n"));
}
convert_sourcefiles();
@@ -1167,9 +1439,10 @@ psql_command(const char *database, const char *query,...)
/* And now we can build and execute the shell command */
snprintf(psql_cmd, sizeof(psql_cmd),
- "\"%s%spsql\" -X -c \"%s\" \"%s\"",
+ "\"%s%spsql\" %s -X -c \"%s\" \"%s\"",
bindir ? bindir : "",
bindir ? "/" : "",
+ tap ? "-q" : "",
query_escaped,
database);
@@ -1704,6 +1977,9 @@ run_schedule(const char *schedule, test_start_function startfunc,
c++;
add_stringlist_item(&ignorelist, c);
+ test_status_ignored(c);
+ status_end();
+
/*
* Note: ignore: lines do not run the test, they just say that
* failure of this test when run later on is to be ignored. A bit
@@ -1762,7 +2038,7 @@ run_schedule(const char *schedule, test_start_function startfunc,
if (num_tests == 1)
{
- status(_("test %-28s ... "), tests[0]);
+ test_status_preamble(tests[0]);
pids[0] = (startfunc) (tests[0], &resultfiles[0], &expectfiles[0], &tags[0]);
INSTR_TIME_SET_CURRENT(starttimes[0]);
wait_for_tests(pids, statuses, stoptimes, NULL, 1);
@@ -1778,8 +2054,8 @@ run_schedule(const char *schedule, test_start_function startfunc,
{
int oldest = 0;
- status(_("parallel group (%d tests, in groups of %d): "),
- num_tests, max_connections);
+ comment(_("parallel group (%d tests, in groups of %d): "),
+ num_tests, max_connections);
for (i = 0; i < num_tests; i++)
{
if (i - oldest >= max_connections)
@@ -1799,7 +2075,7 @@ run_schedule(const char *schedule, test_start_function startfunc,
}
else
{
- status(_("parallel group (%d tests): "), num_tests);
+ comment(_("parallel group (%d tests): "), num_tests);
for (i = 0; i < num_tests; i++)
{
pids[i] = (startfunc) (tests[i], &resultfiles[i], &expectfiles[i], &tags[i]);
@@ -1818,7 +2094,7 @@ run_schedule(const char *schedule, test_start_function startfunc,
bool differ = false;
if (num_tests > 1)
- status(_(" %-28s ... "), tests[i]);
+ test_status_preamble(tests[i]);
/*
* Advance over all three lists simultaneously.
@@ -1858,27 +2134,18 @@ run_schedule(const char *schedule, test_start_function startfunc,
}
}
if (ignore)
- {
- status(_("failed (ignored)"));
- fail_ignore_count++;
- }
+ test_status_ignored(tests[i]);
else
- {
- status(_("FAILED"));
- fail_count++;
- }
+ test_status_failed(tests[i]);
}
else
- {
- status(_("ok ")); /* align with FAILED */
- success_count++;
- }
+ test_status_ok(tests[i]);
if (statuses[i] != 0)
log_child_failure(statuses[i]);
INSTR_TIME_SUBTRACT(stoptimes[i], starttimes[i]);
- status(_(" %8.0f ms"), INSTR_TIME_GET_MILLISEC(stoptimes[i]));
+ runtime(tests[i], INSTR_TIME_GET_MILLISEC(stoptimes[i]));
status_end();
}
@@ -1917,7 +2184,7 @@ run_single_test(const char *test, test_start_function startfunc,
*tl;
bool differ = false;
- status(_("test %-28s ... "), test);
+ test_status_preamble(test);
pid = (startfunc) (test, &resultfiles, &expectfiles, &tags);
INSTR_TIME_SET_CURRENT(starttime);
wait_for_tests(&pid, &exit_status, &stoptime, NULL, 1);
@@ -1947,15 +2214,9 @@ run_single_test(const char *test, test_start_function startfunc,
}
if (differ)
- {
- status(_("FAILED"));
- fail_count++;
- }
+ test_status_failed(test);
else
- {
- status(_("ok ")); /* align with FAILED */
- success_count++;
- }
+ test_status_ok(test);
if (exit_status != 0)
log_child_failure(exit_status);
@@ -2152,6 +2413,7 @@ regression_main(int argc, char *argv[],
{"config-auth", required_argument, NULL, 24},
{"max-concurrent-tests", required_argument, NULL, 25},
{"make-testtablespace-dir", no_argument, NULL, 26},
+ {"tap", no_argument, NULL, 27},
{NULL, 0, NULL, 0}
};
@@ -2285,6 +2547,9 @@ regression_main(int argc, char *argv[],
case 26:
make_testtablespace_dir = true;
break;
+ case 27:
+ tap = true;
+ break;
default:
/* getopt_long already emitted a complaint */
fprintf(stderr, _("\nTry \"%s -h\" for more information.\n"),
@@ -2311,6 +2576,9 @@ regression_main(int argc, char *argv[],
exit(0);
}
+ if (tap)
+ output = &output_func_tap;
+
if (temp_instance && !port_specified_by_user)
/*
@@ -2636,54 +2904,20 @@ regression_main(int argc, char *argv[],
fclose(logfile);
- /*
- * Emit nice-looking summary message
- */
- if (fail_count == 0 && fail_ignore_count == 0)
- snprintf(buf, sizeof(buf),
- _(" All %d tests passed. "),
- success_count);
- else if (fail_count == 0) /* fail_count=0, fail_ignore_count>0 */
- snprintf(buf, sizeof(buf),
- _(" %d of %d tests passed, %d failed test(s) ignored. "),
- success_count,
- success_count + fail_ignore_count,
- fail_ignore_count);
- else if (fail_ignore_count == 0) /* fail_count>0 && fail_ignore_count=0 */
- snprintf(buf, sizeof(buf),
- _(" %d of %d tests failed. "),
- fail_count,
- success_count + fail_count);
- else
- /* fail_count>0 && fail_ignore_count>0 */
- snprintf(buf, sizeof(buf),
- _(" %d of %d tests failed, %d of these failures ignored. "),
- fail_count + fail_ignore_count,
- success_count + fail_count + fail_ignore_count,
- fail_ignore_count);
-
- putchar('\n');
- for (i = strlen(buf); i > 0; i--)
- putchar('=');
- printf("\n%s\n", buf);
- for (i = strlen(buf); i > 0; i--)
- putchar('=');
- putchar('\n');
- putchar('\n');
-
- if (file_size(difffilename) > 0)
- {
- printf(_("The differences that caused some tests to fail can be viewed in the\n"
- "file \"%s\". A copy of the test summary that you see\n"
- "above is saved in the file \"%s\".\n\n"),
- difffilename, logfilename);
- }
- else
+ if (file_size(difffilename) <= 0)
{
unlink(difffilename);
unlink(logfilename);
+
+ free(difffilename);
+ difffilename = NULL;
+ free(logfilename);
+ logfilename = NULL;
}
+ footer(difffilename, logfilename);
+ status_end();
+
if (fail_count != 0)
exit(1);
On 10/12/21 9:03 PM, Andres Freund wrote:
Hi,
On 2021-10-12 13:42:56 -0700, Andres Freund wrote:
On 2021-10-12 16:02:14 -0400, Andrew Dunstan wrote:
You do that by putting a path to it at the start of the PATH. The wrinkle in
this is that you need prove to point to one that understands virtual
paths. So you do something like this:
PATH="/c/perl/bin:$PATH" PROVE=/bin/core_perl/prove configure ...
Oh my.
I'll try that later... I wonder if we could make this easier from our side?
This is a lot of magic to know.
I managed to get this working. At first it failed because I don't have
pexports - it's not available inside msys as far as I could tell. And seems to
be unmaintained. But replacing pexports with gendef fixed that.
There's this comment in src/pl/plperl/GNUmakefile
# Perl on win32 ships with import libraries only for Microsoft Visual C++,
# which are not compatible with mingw gcc. Therefore we need to build a
# new import library to link with.
but I seem to be able to link fine without going through that song-and-dance?
It looks like you're not building a native postgres, but rather one
targeted at msys. To build one that's native (i.e. runs without any
presence of msys) you need to do these things before building:
MSYSTEM=MINGW64
MSYSTEM_CHOST=x86_64-w64-mingw32
PATH="/mingw64/bin:$PATH"
pexports will be in the resulting path, and the build will use the
native compiler.
You can use fairywren's config as a guide.
cheers
andrew
--
Andrew Dunstan
EDB: https://www.enterprisedb.com
On Tue, Oct 12, 2021 at 4:59 PM Andres Freund <andres@anarazel.de> wrote:
On 2021-10-12 15:55:22 -0400, John Naylor wrote:
(I couldn't get a build working but I'll leave that aside for now).
If you want to do that separately, I'll try to fix it.
Okay, I pulled the latest commits and tried again:
[51/950] Compiling C object
src/interfaces/libpq/libpq.5.dylib.p/fe-connect.c.o
FAILED: src/interfaces/libpq/libpq.5.dylib.p/fe-connect.c.o
ccache cc -Isrc/interfaces/libpq/libpq.5.dylib.p -Isrc/interfaces/libpq
-I../src/interfaces/libpq -Isrc/port -I../src/port -Isrc/include
-I../src/include
-I/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/System/Library/Frameworks/LDAP.framework/Headers
-I/usr/local/opt/readline/include -I/usr/local/opt/gettext/include
-I/usr/local/opt/zlib/include -I/usr/local/opt/openssl/include
-fcolor-diagnostics -Wall -Winvalid-pch -Wextra -O0 -g -isysroot
/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk -fno-strict-aliasing
-fwrapv -Wmissing-prototypes -Wpointer-arith -Werror=vla -Wendif-labels
-Wmissing-format-attribute -Wformat-security -Wdeclaration-after-statement
-Wno-unused-command-line-argument -Wno-missing-field-initializers
-Wno-sign-compare -Wno-unused-parameter -msse4.2
-F/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/System/Library/Frameworks/LDAP.framework
-DFRONTEND -MD -MQ src/interfaces/libpq/libpq.5.dylib.p/fe-connect.c.o -MF
src/interfaces/libpq/libpq.5.dylib.p/fe-connect.c.o.d -o
src/interfaces/libpq/libpq.5.dylib.p/fe-connect.c.o -c
../src/interfaces/libpq/fe-connect.c
In file included from ../src/interfaces/libpq/fe-connect.c:72:
In file included from
/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/System/Library/Frameworks/LDAP.framework/Headers/ldap.h:1:
[the last line is repeated a bunch of times, then...]
/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/System/Library/Frameworks/LDAP.framework/Headers/ldap.h:1:10:
error: #include nested too deeply
#include <ldap.h>
^
Then the expected "undeclared identifier" errors that would arise from a
missing header. I tried compiling --with-ldap with the Make build, and only
got warnings about deprecated declarations -- that build completed.
I tried disabling ldap with the Meson build but I'll spare the details of
what went wrong there in case I did something wrong, so we can take things
one step at a time.
That would also have to work for all the dependencies, which were displayed
to me as:
ninja, gdbm, ca-certificates, openssl@1.1, readline, sqlite and python@3.9
meson does depend on ninja (to execute the build) and of course python. But
the rest should be optional dependencies. ninja builds without any
dependencies as long as you don't change its parser sources. python builds on
aix, hpux etc.
Not sure what way gdbm openssl@1.1 and sqlite are pulled in? I assume readline
is for python...
Hmm, weird.
--
John Naylor
EDB: http://www.enterprisedb.com
Hi,
On 2021-10-13 11:51:03 -0400, John Naylor wrote:
On Tue, Oct 12, 2021 at 4:59 PM Andres Freund <andres@anarazel.de> wrote:
On 2021-10-12 15:55:22 -0400, John Naylor wrote:
(I couldn't get a build working but I'll leave that aside for now).
If you want to do that separately, I'll try to fix it.
Okay, I pulled the latest commits and tried again:
/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/System/Library/Frameworks/LDAP.framework/Headers/ldap.h:1:
[the last line is repeated a bunch of times, then...]
Oh. I actually saw that on CI at some point... That one is definitely
odd. Currently CI for OSX builds like
- brew install make coreutils ccache icu4c lz4 tcl-tk openldap
- brew install meson ninja python@3.9
..
PKG_CONFIG_PATH="/usr/local/opt/openssl/lib/pkgconfig:$PKG_CONFIG_PATH"
PKG_CONFIG_PATH="/usr/local/opt/icu4c/lib/pkgconfig:$PKG_CONFIG_PATH"
PKG_CONFIG_PATH="/usr/local/opt/openldap/lib/pkgconfig:$PKG_CONFIG_PATH"
export PKG_CONFIG_PATH
meson setup --buildtype debug -Dcassert=true -Dssl=openssl build
but I set that up knowing little about macos.
For the autoconf build CI currently does something similar via
LIBS="/usr/local/lib:$LIBS"
INCLUDES="/usr/local/include:$INCLUDES"
...
LIBS="/usr/local/opt/openldap/lib:$LIBS"
INCLUDES="/usr/local/opt/openldap/include:$INCLUDES"
...
--with-includes="$INCLUDES" \
--with-libs="$LIBS" \
are you doing something like that? Or does it work for you without? I vaguely
recall hitting a similar problem as you report when not passing
/usr/local/... to configure.
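(Purely for illustration - not what CI runs verbatim - a stand-alone invocation along
those lines, using just the Homebrew prefixes named above, would look roughly like:

  ./configure \
    --with-includes="/usr/local/opt/openldap/include:/usr/local/include" \
    --with-libs="/usr/local/opt/openldap/lib:/usr/local/lib" \
    --with-ldap

with more keg-only prefixes appended as needed.)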
i tried disabling ldap with the meson build but i'll spare the details of
what went wrong there in case i did something wrong, so we can take things
one step at a time.
you can change it for an existing builddir with
meson configure -dldap=disabled or when setting up a new builddir by passing
-dldap=disabled at that time.
ninja, gdbm, ca-certificates, openssl@1.1, readline, sqlite and
python@3.9
meson does depend on ninja (to execute the build) and of course python.
but
the rest should be optional dependencies. ninja builds without any
dependencies as long as you don't change its parser sources. python builds on
aix, hpux etc.
not sure what way gdbm openssl@1.1 and sqlite are pulled in? i assume
readline
is for python...
Hmm, weird.
They're homebrew python deps: https://github.com/Homebrew/homebrew-core/blob/HEAD/Formula/python@3.9.rb#L28
which are optional things enabled explicitly:
https://github.com/Homebrew/homebrew-core/blob/HEAD/Formula/python@3.9.rb#L123
Greetings,
Andres Freund
On Wed, Oct 13, 2021 at 12:37 PM Andres Freund <andres@anarazel.de> wrote:
For the autoconf build CI currently does something similar via
LIBS="/usr/local/lib:$LIBS"
INCLUDES="/usr/local/include:$INCLUDES"
...
LIBS="/usr/local/opt/openldap/lib:$LIBS"
INCLUDES="/usr/local/opt/openldap/include:$INCLUDES"
...
--with-includes="$INCLUDES" \
--with-libs="$LIBS" \
are you doing something like that? Or does it work for you without? I
vaguely
recall hitting a similar problem as you report when not passing
/usr/local/... to configure.
I didn't do anything like that for the autoconf build. I have in the past
done things retail, like
--with-icu ICU_CFLAGS='-I/usr/local/opt/icu4c/include/'
ICU_LIBS='-L/usr/local/opt/icu4c/lib/ -licui18n -licuuc -licudata'
i tried disabling ldap with the meson build but i'll spare the details
of
what went wrong there in case i did something wrong, so we can take
things
one step at a time.
you can change it for an existing builddir with
meson configure -dldap=disabled or when setting up a new builddir by
passing
-dldap=disabled at that time.
Somehow our emails got lower-cased down here, but I tried it with capital D:
meson configure -Dldap=disabled
inside the build dir and got this:
../meson.build:278:2: ERROR: Tried to assign the invalid value "None" of
type NoneType to variable.
Line 278 is
ldap_r = ldap = dependency('', required : false)
--
John Naylor
EDB: http://www.enterprisedb.com
Hi,
On 2021-10-13 08:55:38 -0400, Andrew Dunstan wrote:
On 10/12/21 9:03 PM, Andres Freund wrote:
I managed to get this working. At first it failed because I don't have
pexports - it's not available inside msys as far as I could tell. And seems to
be unmaintained. But replacing pexports with gendef fixed that.
There's this comment in src/pl/plperl/GNUmakefile
# Perl on win32 ships with import libraries only for Microsoft Visual C++,
# which are not compatible with mingw gcc. Therefore we need to build a
# new import library to link with.
But I seem to be able to link fine without going through that song-and-dance?
It looks like you're not building a native postgres, but rather one
targeted at msys. To build one that's native (i.e. runs without any
presence of msys) you need to do these things before building:
MSYSTEM=MINGW64
MSYSTEM_CHOST=x86_64-w64-mingw32
PATH="/mingw64/bin:$PATH"
I had a config equivalent to this (slight difference in PATH, but the same gcc
being picked), and I just verified that it still works if I set up PATH like
that. I get a working plperl out of it. Without msys on PATH or such.
where perl526.dll
C:\perl\strawberry-5.26.3.1-64bit\perl\bin\perl526.dll
dumpbin /imports 'C:/Users/anfreund/src/pg-meson/build-mingw/tmp_install/lib/plperl.dll'|grep dll
Dump of file C:\Users\anfreund\src\pg-meson\build-mingw\tmp_install\lib\plperl.dll
KERNEL32.dll
msvcrt.dll
perl526.dll
dumpbin /imports .\build-mingw\tmp_install\bin\postgres.exe|grep dll
ADVAPI32.dll
KERNEL32.dll
msvcrt.dll
Secur32.dll
WLDAP32.dll
WS2_32.dll
do $$elog(NOTICE, "blob");$$ language plperl;
NOTICE: blob
DO
To me this looks like it's a plperl built without the import file recreation,
without being msys dependent?
pexports will be in the resulting path, and the build will use the
native compiler.
I don't see pexports anywhere in the msys installation. I can see it available
on sourceforge, and I see a few others asking where to get it from in the
context of msys, and being pointed to manually downloading it.
Seems like we should consider using gendef instead of pexports, given it's
available in msys?
$ pacman -Fy
$ pacman -F gendef.exe
...
mingw64/mingw-w64-x86_64-tools-git 9.0.0.6316.acdc7adc9-1 (mingw-w64-x86_64-toolchain) [installed]
mingw64/bin/gendef.exe
..
$ pacman -F pexports.exe
$ pacman -Fx pexports
<bunch of packages containing smtpexports.h>
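For what it's worth, a rough sketch of the gendef-based equivalent of the pexports
step (using the perl526.dll mentioned earlier purely as an example, not the exact
makefile rule):

  gendef perl526.dll          # writes perl526.def
  dlltool --dllname perl526.dll --def perl526.def --output-lib libperl526.a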
Greetings,
Andres Freund
Hi,
On 2021-10-13 13:19:36 -0400, John Naylor wrote:
On Wed, Oct 13, 2021 at 12:37 PM Andres Freund <andres@anarazel.de> wrote:
For the autoconf build CI currently does something similar via
LIBS="/usr/local/lib:$LIBS"
INCLUDES="/usr/local/include:$INCLUDES"
...
LIBS="/usr/local/opt/openldap/lib:$LIBS"
INCLUDES="/usr/local/opt/openldap/include:$INCLUDES"
...
--with-includes="$INCLUDES" \
--with-libs="$LIBS" \
are you doing something like that? Or does it work for you without? I
vaguely
recall hitting a similar problem as you report when not passing
/usr/local/... to configure.
I didn't do anything like that for the autoconf build. I have in the past
done things retail, like
I'll try to see how this works / what causes the breakage.
Somehow our emails got lower-cased down here, but I tried it with capital D:
:)
meson configure -Dldap=disabled
inside the build dir and got this:
../meson.build:278:2: ERROR: Tried to assign the invalid value "None" of
type NoneType to variable.
Line 278 is
ldap_r = ldap = dependency('', required : false)
Oops, I broke that when trying to clean things up. I guess I write too much C
;). It needs to be two lines.
I pushed the fix for that.
Greetings,
Andres Freund
On Wed, Oct 13, 2021 at 1:42 PM Andres Freund <andres@anarazel.de> wrote:
I pushed the fix for that.
Ok great, it builds now! :-) Now something's off with dynamic loading.
There are libraries in ./tmp_install/usr/local/lib/ but apparently initdb
doesn't know to look for them there:
$ cat /Users/john/pgdev/meson/build/testrun/main/pg_regress/log/initdb.log
dyld: Library not loaded: /usr/local/lib/libpq.5.dylib
Referenced from:
/Users/john/pgdev/meson/build/tmp_install/usr/local/bin/initdb
Reason: image not found
--
John Naylor
EDB: http://www.enterprisedb.com
On 10/13/21 1:26 PM, Andres Freund wrote:
pexports will be in the resulting path, and the build will use the
native compiler.
I don't see pexports anywhere in the msys installation. I can see it available
on sourceforge, and I see a few others asking where to get it from in the
context of msys, and being pointed to manually downloading it.
Weird. fairywren has it, which means that it must have been removed from
the packages at some stage, fairly recently as fairywren isn't that old.
I just confirmed the absence on a 100% fresh install.
It is in Strawberry's c/bin directory.
Seems like we should consider using gendef instead of pexports, given it's
available in msys?
Yeah. It's missing on my ancient msys animal (frogmouth), but it doesn't
build --with-perl.
jacana seems to have it.
If you prep a patch I'll test it.
cheers
andrew
--
Andrew Dunstan
EDB: https://www.enterprisedb.com
Hi,
On 2021-10-13 16:06:32 -0400, Andrew Dunstan wrote:
If you prep a patch I'll test it.
Well, right now I'm wondering if the better fix is to just remove the whole
win32 block. I don't know how far back, but afaict it's not needed. Seems to
have been needed for narwhal at some point, according to 02b61dd08f99. But
narwhal is long dead.
Greetings,
Andres Freund
st 13. 10. 2021 v 1:54 odesílatel Andres Freund <andres@anarazel.de> napsal:
Hi,
On 2021-10-13 01:19:27 +0200, Josef Šimánek wrote:
I tried to clean and start from scratch, but I'm getting different
error probably related to wrongly configured JIT (LLVM wasn't found
during meson setup). I'll debug on my side to provide more info.
../src/backend/jit/jit.c:91:73: error: ‘DLSUFFIX’ undeclared (first use in this function)
91 | snprintf(path, MAXPGPATH, "%s/%s%s", pkglib_path, jit_provider, DLSUFFIX);
| ^~~~~~~~
This *very* likely is related to building in a source tree that also contains
a "non-meson" build "in place". The problem is that the meson build picks up
the pg_config.h generated by ./configure in the "normal" build, rather than
the one meson generated itself.
You'd need to execute make distclean or such, or use a separate git checkout.
I forgot about this issue because I only ever build postgres from outside the
source-tree (by invoking configure from a separate directory), so there's
never build products in it. I think at least I need to make the build emit a
warning / error if there's a pg_config.h in the source tree...
Hello, thanks for the hint. I can finally build using meson and run
regress tests.
The only problem I do have currently is auto-detection of perl. I'm
getting error related to missing "Opcode.pm". PERL is autodetected and
enabled (https://pastebin.com/xfRRrDcU).
I do get the same error when I enforce perl for current master build
(./configure --with-perl). Using ./configure with perl autodetection
skips plperl extension on my system.
Disabling perl manually for meson build (meson setup build
--reconfigure --buildtype debug -Dperl=disabled) works for me.
This is the part of the jit code that's built regardless of llvm availability
- you'd get the same error in a few other places unrelated to jit.
Greetings,
Andres Freund
On 10/13/21 5:46 PM, Andres Freund wrote:
Hi,
On 2021-10-13 16:06:32 -0400, Andrew Dunstan wrote:
If you prep a patch I'll test it.
Well, right now I'm wondering if the better fix is to just remove the whole
win32 block. I don't know how far back, but afaict it's not needed. Seems to
have been needed for narwhal at some point, according to 02b61dd08f99. But
narwhal is long dead.
Ok, I'll test it out.
cheers
andrew
--
Andrew Dunstan
EDB: https://www.enterprisedb.com
On Thu, Oct 14, 2021 at 4:51 AM John Naylor
<john.naylor@enterprisedb.com> wrote:
/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/System/Library/Frameworks/LDAP.framework/Headers/ldap.h:1:10: error: #include nested too deeply
#include <ldap.h>
^
I vaguely recall that PostgreSQL should build OK against Apple's copy
of OpenLDAP. That recursive include loop is coming from a "framework"
header that contains just a couple of lines like #include <ldap.h> to
try to include the real header, which should also be in the include
path, somewhere like
/Library/Developer/CommandLineTools/SDKs/MacOSX11.3.sdk/usr/include/ldap.h.
I think we'd need to figure out where that
-I/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/System/Library/Frameworks/LDAP.framework/Headers
directive is coming from and get rid of it, so we can include the real
header directly.
Josef Šimánek <josef.simanek@gmail.com> writes:
The only problem I do have currently is auto-detection of perl. I'm
getting error related to missing "Opcode.pm". PERL is autodetected and
enabled (https://pastebin.com/xfRRrDcU).
Your Perl (not PERL) installation seems to be incomplete. Opcode.pm is a
core module, and should be in /usr/lib64/perl5, judging by the paths in
the error message.
Which OS is this? Some Linux distributions have separate packages for
the interpreter itself and the included modules, and the packages can be
named confusingly. E.g. on older Redhat/Fedora versions you have to
install the 'perl-core' package to get all the modules, 'perl' is just
the interpreter and the bare minimum set of strictly necessary modules.
They've fixed this in recent versions (Fedora 34 and Redhat 8, IIRC), so
that 'perl' gives you the whole bundle, and 'perl-interpreter' is the
minimal one.
- ilmari
čt 14. 10. 2021 v 15:14 odesílatel Dagfinn Ilmari Mannsåker
<ilmari@ilmari.org> napsal:
Josef Šimánek <josef.simanek@gmail.com> writes:
The only problem I do have currently is auto-detection of perl. I'm
getting error related to missing "Opcode.pm". PERL is autodetected and
enabled (https://pastebin.com/xfRRrDcU).
Your Perl (not PERL) installation seems to be incomplete. Opcode.pm is a
core module, and should be in /usr/lib64/perl5, judging by the paths in
the error message.
Which OS is this? Some Linux distributions have separate packages for
the interpreter itself and the included modules, and the packages can be
named confusingly. E.g. on older Redhat/Fedora versions you have to
install the 'perl-core' package to get all the modules, 'perl' is just
the interpreter and the bare minimum set of strictly necessary modules.
They've fixed this in recent versions (Fedora 34 and Redhat 8, IIRC), so
that 'perl' gives you the whole bundle, and 'perl-interpreter' is the
minimal one.
I'm using Fedora 34 and I still see perl-Opcode.x86_64 as a separate
package. Anyway it behaves differently with autoconf tools and the
meson build system. Is perl disabled by default in the current build
system?
- ilmari
On 2021-Oct-14, Josef Šimánek wrote:
I'm using Fedora 34 and I still see perl-Opcode.x86_64 as a separate
package. Anyway it behaves differently with autoconf tools and the
meson build system. Is perl disabled by default in the current build
system?
Yes, you have to use --with-perl in order to get it.
--
Álvaro Herrera Valdivia, Chile — https://www.EnterpriseDB.com/
"Puedes vivir sólo una vez, pero si lo haces bien, una vez es suficiente"
Josef Šimánek <josef.simanek@gmail.com> writes:
čt 14. 10. 2021 v 15:14 odesílatel Dagfinn Ilmari Mannsåker
<ilmari@ilmari.org> napsal:
Josef Šimánek <josef.simanek@gmail.com> writes:
The only problem I do have currently is auto-detection of perl. I'm
getting error related to missing "Opcode.pm". PERL is autodetected and
enabled (https://pastebin.com/xfRRrDcU).
Your Perl (not PERL) installation seems to be incomplete. Opcode.pm is a
core module, and should be in /usr/lib64/perl5, judging by the paths in
the error message.
Which OS is this? Some Linux distributions have separate packages for
the interpreter itself and the included modules, and the packages can be
named confusingly. E.g. on older Redhat/Fedora versions you have to
install the 'perl-core' package to get all the modules, 'perl' is just
the interpreter and the bare minimum set of strictly necessary modules.
They've fixed this in recent versions (Fedora 34 and Redhat 8, IIRC), so
that 'perl' gives you the whole bundle, and 'perl-interpreter' is the
minimal one.
I'm using Fedora 34 and I still see perl-Opcode.x86_64 as a separate
package.
Yes, it's a separate package, but the 'perl' package depends on all the
core module packages, so installing that should fix things. You appear
to only have 'perl-interpreter' installed.
Anyway it behaves differently with autoconf tools and the meson build
system. Is perl disabled by default in the current build system?
configure doesn't auto-detect any optional features, they have to be
explicitly enabled using --with-foo switches.
- ilmari
Hi,
On 2021-10-14 10:29:42 -0300, Alvaro Herrera wrote:
On 2021-Oct-14, Josef Šimánek wrote:
I'm using Fedora 34 and I still see perl-Opcode.x86_64 as a separate
package. Anyway it behaves differently with autoconf tools and the
meson build system. Is perl disabled by default in the current build
system?
Hm, so it seems we should make the test separately verify that perl -M{Opcode,
ExtUtils::Embed, ExtUtils::ParseXS} doesn't fail, so that we can fail perl
detection with a useful message?
Yes, you have to use --with-perl in order to get it.
With the meson prototype I set most optional features to "auto", except for
LLVM, as that increases compile times noticeably.
For configure we didn't/don't want to do much auto-detection, because that
makes life harder for distributors. But meson has one switch controlling all
features set to 'auto' and not explicitly enabled/disabled:
--auto-features {enabled,disabled,auto} Override value of all 'auto' features (default: auto).
so the argument doesn't apply to the same degree there. We could default
auto-features to something else too.
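E.g. the default could be overridden wholesale at setup time - a sketch, using the
documented values of that built-in option:

  meson setup build -Dauto_features=disabled -Dssl=openssl   # only explicitly requested features
  meson setup build -Dauto_features=enabled                  # error out if an optional dependency is missing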
There were two other reasons:
1) I got tired of needing to disable zlib, readline to be able to build on
windows.
2) Exercising all the dependency detection / checking seems important at this
stage
Greetings,
Andres Freund
Hi,
On 2021-10-13 23:58:12 +0200, Josef Šimánek wrote:
st 13. 10. 2021 v 1:54 odesílatel Andres Freund <andres@anarazel.de> napsal:
This *very* likely is related to building in a source tree that also contains
a "non-meson" build "in place". The problem is that the meson build picks up
the pg_config.h generated by ./configure in the "normal" build, rather than
the one meson generated itself.
You'd need to execute make distclean or such, or use a separate git checkout.
I forgot about this issue because I only ever build postgres from outside the
source-tree (by invoking configure from a separate directory), so there's
never build products in it. I think at least I need to make the build emit a
warning / error if there's a pg_config.h in the source tree...
Hello, thanks for the hint. I can finally build using meson and run
regress tests.
I yesterday pushed code that should detect this case (with an error). Should
now detect the situation both when you first run configure in tree, and then
meson, and the other way round (by the dirty hack of ./configure touch'ing
meson.build at the end for in-tree builds).
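The configure side of that hack is tiny - roughly the following at the end of the
script, sketched here with the in-tree test simplified:

  # only for in-tree builds
  if test x"$srcdir" = x. ; then
    touch meson.build
  fi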
The only problem I do have currently is auto-detection of perl. I'm
getting error related to missing "Opcode.pm". PERL is autodetected and
enabled (https://pastebin.com/xfRRrDcU).
I do get the same error when I enforce perl for current master build
(./configure --with-perl). Using ./configure with perl autodetection
skips plperl extension on my system.
Disabling perl manually for meson build (meson setup build
--reconfigure --buildtype debug -Dperl=disabled) works for me.
Yay, thanks for testing!
Greetings,
Andres Freund
I wrote:
Ok great, it builds now! :-) Now something's off with dynamic loading.
There are libraries in ./tmp_install/usr/local/lib/ but apparently initdb
doesn't know to look for them there:
$ cat /Users/john/pgdev/meson/build/testrun/main/pg_regress/log/initdb.log
dyld: Library not loaded: /usr/local/lib/libpq.5.dylib
Referenced from:
/Users/john/pgdev/meson/build/tmp_install/usr/local/bin/initdb
Reason: image not found
After poking a bit more, this only happens when trying to run the tests. If
I specify a prefix, I can install, init, and start the server just fine, so
that much works.
--
John Naylor
EDB: http://www.enterprisedb.com
čt 14. 10. 2021 v 15:32 odesílatel Dagfinn Ilmari Mannsåker
<ilmari@ilmari.org> napsal:
Josef Šimánek <josef.simanek@gmail.com> writes:
čt 14. 10. 2021 v 15:14 odesílatel Dagfinn Ilmari Mannsåker
<ilmari@ilmari.org> napsal:
Josef Šimánek <josef.simanek@gmail.com> writes:
The only problem I do have currently is auto-detection of perl. I'm
getting error related to missing "Opcode.pm". PERL is autodetected and
enabled (https://pastebin.com/xfRRrDcU).
Your Perl (not PERL) installation seems to be incomplete. Opcode.pm is a
core module, and should be in /usr/lib64/perl5, judging by the paths in
the error message.
Which OS is this? Some Linux distributions have separate packages for
the interpreter itself and the included modules, and the packages can be
named confusingly. E.g. on older Redhat/Fedora versions you have to
install the 'perl-core' package to get all the modules, 'perl' is just
the interpreter and the bare minimum set of strictly necessary modules.
They've fixed this in recent versions (Fedora 34 and Redhat 8, IIRC), so
that 'perl' gives you the whole bundle, and 'perl-interpreter' is the
minimal one.
I'm using Fedora 34 and I still see perl-Opcode.x86_64 as a separate
package.
Yes, it's a separate package, but the 'perl' package depends on all the
core module packages, so installing that should fix things. You appear
to only have 'perl-interpreter' installed.
You're right. Installing "perl" or "perl-Opcode" manually fixes this
problem. Currently I only have "perl-interpreter" installed.
Anyway it behaves differently with autoconf tools and the meson build
system. Is perl disabled by default in the current build system?
configure doesn't auto-detect any optional features, they have to be
explicitly enabled using --with-foo switches.
- ilmari
čt 14. 10. 2021 v 19:24 odesílatel Andres Freund <andres@anarazel.de> napsal:
Hi,
On 2021-10-14 10:29:42 -0300, Alvaro Herrera wrote:
On 2021-Oct-14, Josef Šimánek wrote:
I'm using Fedora 34 and I still see perl-Opcode.x86_64 as a separate
package. Anyway it behaves differently with autoconf tools and the
meson build system. Is perl disabled by default in the current build
system?
Hm, so it seems we should make the test separately verify that perl -M{Opcode,
ExtUtils::Embed, ExtUtils::ParseXS} doesn't fail, so that we can fail perl
detection with a useful message?
I can confirm "perl -MOpcode" fails. ExtUtils::Embed and
ExtUtils::ParseXS are present. Looking at the local system history of
perl-interpreter package, it seems to be installed by default on
Fedora 34. Friendly error message would be welcomed.
Yes, you have to use --with-perl in order to get it.
With the meson prototype I set most optional features to "auto", except for
LLVM, as that increases compile times noticeably.
For configure we didn't/don't want to do much auto-detection, because that
makes life harder for distributors. But meson has one switch controlling all
features set to 'auto' and not explicitly enabled/disabled:
--auto-features {enabled,disabled,auto} Override value of all 'auto' features (default: auto).
so the argument doesn't apply to the same degree there. We could default
auto-features to something else too.
There were two other reasons:
1) I got tired of needing to disable zlib, readline to be able to build on
windows.
2) Exercising all the dependency detection / checking seems important at this
stage
Clear, thanks for the info.
Greetings,
Andres Freund
Hi,
On October 14, 2021 12:14:16 PM PDT, John Naylor <john.naylor@enterprisedb.com> wrote:
I wrote:
Ok great, it builds now! :-) Now something's off with dynamic loading.
There are libraries in ./tmp_install/usr/local/lib/ but apparently initdb
doesn't know to look for them there:
$ cat /Users/john/pgdev/meson/build/testrun/main/pg_regress/log/initdb.log
dyld: Library not loaded: /usr/local/lib/libpq.5.dylib
Referenced from:
/Users/john/pgdev/meson/build/tmp_install/usr/local/bin/initdb
Reason: image not found
After poking a bit more, this only happens when trying to run the tests. If
I specify a prefix, I can install, init, and start the server just fine, so
that much works.
Is this a Mac with SIP enabled? The Mac CI presumably has that disabled, which is why I didn't see this issue there. Probably need to implement whatever Tom figured out to do about that for the current way of running tests.
Andres
--
Sent from my Android device with K-9 Mail. Please excuse my brevity.
On Thu, Oct 14, 2021 at 4:34 PM Andres Freund <andres@anarazel.de> wrote:
Is this a Mac with SIP enabled? The Mac CI presumably has that disabled,
which is why I didn't see this issue there. Probably need to implement
whatever Tom figured out to do about that for the current way of running
tests.
System Information says it's disabled. Running "csrutil status" complains
of an unsupported configuration, which doesn't sound good, so I should
probably go fix that independent of anything else. :-/
--
John Naylor
EDB: http://www.enterprisedb.com
I wrote:
Is this a Mac with SIP enabled? The Mac CI presumably has that
disabled, which is why I didn't see this issue there. Probably need to
implement whatever Tom figured out to do about that for the current way of
running tests.
System Information says it's disabled. Running "csrutil status" complains
of an unsupported configuration, which doesn't sound good, so I should
probably go fix that independent of anything else. :-/
Looking online, I wonder if the "unsupported" message might be overly
cautious. In any case, I do remember turning something off to allow a
debugger to run. Here are all the settings, in case it matters:
Apple Internal: disabled
Kext Signing: enabled
Filesystem Protections: enabled
Debugging Restrictions: disabled
DTrace Restrictions: enabled
NVRAM Protections: enabled
BaseSystem Verification: enabled
--
John Naylor
EDB: http://www.enterprisedb.com
Hi,
On 14.10.2021 23:54, John Naylor wrote:
On Thu, Oct 14, 2021 at 4:34 PM Andres Freund <andres@anarazel.de> wrote:
Is this a Mac with SIP enabled? The Mac CI presumably has that
disabled, which is why I didn't see this issue there. Probably need to
implement whatever Tom figured out to do about that for the current way
of running tests.
System Information says it's disabled. Running "csrutil status"
complains of an unsupported configuration, which doesn't sound good, so
I should probably go fix that independent of anything else. :-/
Maybe you could check that DYLD_LIBRARY_PATH is working for you?
% DYLD_FALLBACK_LIBRARY_PATH=
DYLD_LIBRARY_PATH=./tmp_install/usr/local/lib
./tmp_install/usr/local/bin/psql --version
psql (PostgreSQL) 15devel
Without DYLD_LIBRARY_PATH I get the error, as expected:
% DYLD_FALLBACK_LIBRARY_PATH= ./tmp_install/usr/local/bin/psql --version
dyld: Library not loaded: /usr/local/lib/libpq.5.dylib
Referenced from:
/Users/shinderuk/src/postgres-meson/build/./tmp_install/usr/local/bin/psql
Reason: image not found
I add "DYLD_FALLBACK_LIBRARY_PATH=" because otherwise dyld falls back to
/usr/lib/libpq.5.dylib provided by Apple (I am testing on Catalina).
% DYLD_PRINT_LIBRARIES=1 ./tmp_install/usr/local/bin/psql --version 2>&1
| grep libpq
dyld: loaded: <4EDF735E-2104-32AD-BE7B-B400ABFCF57C> /usr/lib/libpq.5.dylib
Regards,
--
Sergey Shinderuk https://postgrespro.com/
John Naylor <john.naylor@enterprisedb.com> writes:
System Information says it's disabled. Running "csrutil status" complains
of an unsupported configuration, which doesn't sound good, so I should
probably go fix that independent of anything else. :-/
Looking online, I wonder if the "unsupported" message might be overly
cautious. In any case, I do remember turning something off to allow a
debugger to run. Here are all the settings, in case it matters:
Apple Internal: disabled
Kext Signing: enabled
Filesystem Protections: enabled
Debugging Restrictions: disabled
DTrace Restrictions: enabled
NVRAM Protections: enabled
BaseSystem Verification: enabled
I remember having seen that report too, after some previous software
upgrade that had started from a "SIP disabled" status. I'm mostly
guessing here, but my guess is that
(a) csrutil only considers the all-enabled and all-disabled states
of these individual flags to be "supported" cases.
(b) some one or more of these flags came along in a macOS update,
and if you did the update starting from a "disabled" state, you
nonetheless ended up with the new flags enabled, leading to the
mixed state that csrutil complains about.
I've lost count of the number of times I've seen macOS updates
be sloppy about preserving non-default settings, so I don't find
theory (b) to be even slightly surprising.
Whether the mixed state is actually problematic in any way,
I dunno. I don't recall having had any problems before noticing
that that was what I had.
regards, tom lane
Andres Freund <andres@anarazel.de> writes:
Is this a Mac with SIP enabled? The Mac CI presumably has that disabled, which is why I didn't see this issue there. Probably need to implement whatever Tom figured out to do about that for the current way of running tests.
AFAIR the only cases we've made work are
(1) disable SIP
(2) avoid the need for (1) by always doing "make install" before
"make check".
Peter E. did some hacking towards another solution awhile ago,
but IIRC it involved changing the built binaries, and I think
we concluded that the benefits didn't justify that.
regards, tom lane
Andres Freund <andres@anarazel.de> writes:
Hm, so it seems we should make the test separately verify that perl -M{Opcode,
ExtUtils::Embed, ExtUtils::ParseXS} doesn't fail, so that we can fail perl
detection with a useful message?
Our existing policy is that we should check this at configure time,
not later. Since plperl won't work at all without Opcode, it seems
appropriate to add a check there if you say --with-perl. I wasn't
aware that Red Hat had unbundled that from the minimal perl
installation :-(.
OTOH, if they've not unbundled ExtUtils::Embed or ExtUtils::ParseXS,
I doubt it's worth the configure cycles to check for those separately.
regards, tom lane
On 10/13/21 7:11 PM, Andrew Dunstan wrote:
On 10/13/21 5:46 PM, Andres Freund wrote:
Hi,
On 2021-10-13 16:06:32 -0400, Andrew Dunstan wrote:
If you prep a patch I'll test it.
Well, right now I'm wondering if the better fix is to just remove the whole
win32 block. I don't know how far back, but afaict it's not needed. Seems to
have been needed for narwhal at some point, according to 02b61dd08f99. But
narwhal is long dead.
Ok, I'll test it out.
confirmed that jacana doesn't need this code to build or test plperl
(all I did was change the test from win32 to win32x). There would still
be work to do to fix the contrib bool_plperl, jsonb_plperl and
hstore_plperl modules.
cheers
andrew
--
Andrew Dunstan
EDB: https://www.enterprisedb.com
On Fri, Oct 15, 2021 at 11:00 AM Tom Lane <tgl@sss.pgh.pa.us> wrote:
Peter E. did some hacking towards another solution awhile ago,
but IIRC it involved changing the built binaries, and I think
we concluded that the benefits didn't justify that.
Yeah, by now there are lots of useful blogs from various projects
figuring out that you can use the install_name_tool to adjust the
paths it uses to be absolute or relative to certain magic words, like
@executable_path/../lib/blah.dylib, which is tempting, but...
realistically, for serious hacking on a Mac, SIP is so annoying that
it isn't the only reason you'll want to turn it off: it stops
dtrace/dtruss/... from working, and somehow prevents debuggers from
working when you've ssh'd in from a remote machine with a proper
keyboard, and probably more things that I'm forgetting.
I wish I could find the Xnu source that shows exactly how and when the
environment is suppressed in this way to understand better, but it
doesn't jump out of Apple's github; maybe it's hiding in closed source
machinery...
Thomas Munro <thomas.munro@gmail.com> writes:
I wish I could find the Xnu source that shows exactly how and when the
environment is suppressed in this way to understand better, but it
doesn't jump out of Apple's github; maybe it's hiding in closed source
machinery...
I recall that we figured out awhile ago that the environment gets trimmed
when make (or whatever) executes some command via the shell; seemingly,
Apple has decided that /bin/sh is a security-critical program that mustn't
be run with a non-default DYLD_LIBRARY_PATH. Dunno if that helps you
find where the damage is done exactly.
(The silliness of this policy, when you pair it with the fact that they
don't reset PATH at the same time, seems blindingly obvious to me. But
apparently not to Apple.)
regards, tom lane
Hi,
On 2021-10-14 18:00:49 -0400, Tom Lane wrote:
Andres Freund <andres@anarazel.de> writes:
Is this a Mac with SIP enabled? The Mac CI presumably has that disabled, which is why I didn't see this issue there. Probably need to implement whatever Tom figured out to do about that for the current way of running tests.
AFAIR the only cases we've made work are
(1) disable SIP
(2) avoid the need for (1) by always doing "make install" before
"make check".
Ah, I thought it was more than that. In that case, John, does meson's test
succeed after you did the "proper" install? Assuming it's in a path that's
allowed to provide shared libraries?
Greetings,
Andres Freund
I wrote:
I recall that we figured out awhile ago that the environment gets trimmed
when make (or whatever) executes some command via the shell; seemingly,
Apple has decided that /bin/sh is a security-critical program that mustn't
be run with a non-default DYLD_LIBRARY_PATH. Dunno if that helps you
find where the damage is done exactly.
BTW, here's the evidence for this theory:
[tgl@pro ~]$ cat checkenv.c
#include <stdio.h>
#include <stdlib.h>
int
main(int argc, char **argv)
{
    char *pth = getenv("DYLD_LIBRARY_PATH");
    if (pth)
        printf("DYLD_LIBRARY_PATH = %s\n", pth);
    else
        printf("DYLD_LIBRARY_PATH is unset\n");
    return 0;
}
[tgl@pro ~]$ gcc checkenv.c
[tgl@pro ~]$ ./a.out
DYLD_LIBRARY_PATH is unset
[tgl@pro ~]$ export DYLD_LIBRARY_PATH=/Users/tgl/pginstall/lib
[tgl@pro ~]$ ./a.out
DYLD_LIBRARY_PATH = /Users/tgl/pginstall/lib
[tgl@pro ~]$ sh -c ./a.out
DYLD_LIBRARY_PATH is unset
[tgl@pro ~]$ ./a.out
DYLD_LIBRARY_PATH = /Users/tgl/pginstall/lib
[tgl@pro ~]$ bash -c ./a.out
DYLD_LIBRARY_PATH is unset
You have to check the environment using an "unprivileged" program.
If you try to examine the environment using, say, "env", you will get
very misleading results. AFAICT, /usr/bin/env is *also* considered
security-critical, because I cannot get it to ever report that
DYLD_LIBRARY_PATH is set.
Hmm ... /usr/bin/perl seems to act the same way. It can see
ENV{'PATH'} but not ENV{'DYLD_LIBRARY_PATH'}.
This may indicate that they've applied this policy on a blanket
basis to everything in /bin and /usr/bin (and other system
directories, maybe), rather than singling out the shell.
regards, tom lane
Hi,
On 2021-10-15 11:23:00 +1300, Thomas Munro wrote:
On Fri, Oct 15, 2021 at 11:00 AM Tom Lane <tgl@sss.pgh.pa.us> wrote:
Peter E. did some hacking towards another solution awhile ago,
but IIRC it involved changing the built binaries, and I think
we concluded that the benefits didn't justify that.
Yeah, by now there are lots of useful blogs from various projects
figuring out that you can use the install_name_tool to adjust the
paths it uses to be absolute or relative to certain magic words, like
@executable_path/../lib/blah.dylib, which is tempting, but...
realistically, for serious hacking on a Mac, SIP is so annoying that
it isn't the only reason you'll want to turn it off: it stops
dtrace/dtruss/... from working, and somehow prevents debuggers from
working when you've ssh'd in from a remote machine with a proper
keyboard, and probably more things that I'm forgetting.
Meson has support for using install_name_tool to remove "build time" rpaths
and set "install time" rpaths during the installation process - which uses
install_name_tool on mac.
If, and perhaps that's too big an if, relative rpaths actually work despite
SIP, it might be worth setting a relative install_rpath, because afaict that
should then work both for a "real" installation and our temporary test one.
If absolute rpaths are required, it'd make the process a bit more expensive,
because we'd probably need to change a configure time option during the temporary
install. No actual rebuilds would be required, but still.
Greetings,
Andres Freund
Andres Freund <andres@anarazel.de> writes:
If, and perhaps that's too big an if, relative rpaths actually work despite
SIP, it might be worth setting a relative install_rpath, because afaict that
should then work both for a "real" installation and our temporary test one.
From what we know so far, it seems like SIP wouldn't interfere with
that (if it works at all). I think what SIP desires to prevent is
messing with a program's execution by setting DYLD_LIBRARY_PATH.
As long as the program executable itself is saying where to find
the library, I don't see why they should interfere with that.
(Again, it seems blindingly stupid to forbid this while not blocking
PATH or any of the other environment variables that have always affected
execution. But what do I know.)
If absolute rpaths are required, it'd make the process a bit more expensive,
It'd also put the kibosh on relocatable install trees, though I dunno how
much people really care about that.
regards, tom lane
On Thu, Oct 14, 2021 at 6:55 PM Andres Freund <andres@anarazel.de> wrote:
Ah, I thought it was more than that. In that case, John, does meson's test
succeed after you did the "proper" install? Assuming it's in a path that's
allowed to provide shared libraries?
Oh, it can act like installcheck? [checks] Yep, "meson test" ran fine (*).
It still ran the temp install first, but in any case it worked. The full
"configure step" was
meson setup build --buildtype debug -Dldap=disabled -Dcassert=true
-Dprefix=$(pwd)/inst
* (all passed but skipped subscription/t/012_collation.pl)
--
John Naylor
EDB: http://www.enterprisedb.com
Hi,
On 2021-10-14 18:08:58 -0400, Tom Lane wrote:
Andres Freund <andres@anarazel.de> writes:
Hm, so it seems we should make the test separately verify that perl -M{Opcode,
ExtUtils::Embed, ExtUtils::ParseXS} doesn't fail, so that we can fail perl
detection with a useful message?
Our existing policy is that we should check this at configure time,
not later.
Yea, I was thinking of configure (and meson's equivalent) as well.
Since plperl won't work at all without Opcode, it seems
appropriate to add a check there if you say --with-perl. I wasn't
aware that Red Hat had unbundled that from the minimal perl
installation :-(.
OTOH, if they've not unbundled ExtUtils::Embed or ExtUtils::ParseXS,
I doubt it's worth the configure cycles to check for those separately.
On debian the perl binary, with a sparse set of modules is in
perl-base. ExtUtils::Embed and ExtUtils::ParseXS are in
perl-modules-x.yy. Whereas Opcode is in libperlx.yy. But libperlx.yy depends
on perl-modules-x.yy so I guess an Opcode.pm check would suffice.
Seems we can just check all of them at once with something like
perl -MOpcode -MExtUtils::Embed -MExtUtils::ParseXSNotAvailable -e ''
Can't locate ExtUtils/ParseXSNotAvailable.pm in @INC (you may need to install the ExtUtils::ParseXS3 module) (@INC contains: /home/andres/bin/perl5/lib/perl5/x86_64-linux-gnu-thread-multi /home/andres/bin/perl5/lib/perl5 /etc/perl /usr/local/lib/x86_64-linux-gnu/perl/5.32.1 /usr/local/share/perl/5.32.1 /usr/lib/x86_64-linux-gnu/perl5/5.32 /usr/share/perl5 /usr/lib/x86_64-linux-gnu/perl-base /usr/lib/x86_64-linux-gnu/perl/5.32 /usr/share/perl/5.32 /usr/local/lib/site_perl).
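In other words the eventual check would just be the exit status of something like
(a sketch, with the real module names):

  perl -MOpcode -MExtUtils::Embed -MExtUtils::ParseXS -e '' \
    || echo "Perl installation lacks modules required for --with-perl" >&2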
Greetings,
Andres Freund
Andres Freund <andres@anarazel.de> writes:
On 2021-10-14 18:08:58 -0400, Tom Lane wrote:
Andres Freund <andres@anarazel.de> writes:
Hm, so it seems we should make the test separately verify that perl -M{Opcode,
ExtUtils::Embed, ExtUtils::ParseXS} doesn't fail, so that we can fail perl
detection with a useful message?
Our existing policy is that we should check this at configure time,
not later.
Yea, I was thinking of configure (and meson's equivalent) as well.
Ah, sorry, I misunderstood what you meant by "test".
regards, tom lane
Hi,
On 2021-10-14 19:27:17 -0400, John Naylor wrote:
On Thu, Oct 14, 2021 at 6:55 PM Andres Freund <andres@anarazel.de> wrote:
Ah, I thought it was more than that. In that case, John, does meson's test
succeed after you did the "proper" install? Assuming it's in a path that's
allowed to provide shared libraries?
Oh, it can act like installcheck? [checks] Yep, "meson test" ran fine (*).
It still ran the temp install first, but in any case it worked.
As far as I understand Tom, our normal make check only works on OSX if
previously you ran make install. Which will have installed libpq into the
"proper" install location. Because all our binaries will, by default, have an
rpath to the library directory embedded, that then allows binaries in the
temporary install to work. But using the wrong libpq - which most of the time
turns out to be harmless, because libpq doesn't change that rapidly.
* (all passed but skipped subscription/t/012_collation.pl)
That test requires ICU, so that's fine. I guess we could prevent the test from
being executed in the first place, but I don't think we've done that for cases
where it's one specific test in a t/ directory, where others in the same
directory do not have such dependencies.
Greetings,
Andres Freund
On Fri, Oct 15, 2021 at 12:04 PM Tom Lane <tgl@sss.pgh.pa.us> wrote:
[tgl@pro ~]$ cat checkenv.c
#include <stdio.h>
#include <stdlib.h>
int
main(int argc, char **argv)
{
char *pth = getenv("DYLD_LIBRARY_PATH");
if (pth)
printf("DYLD_LIBRARY_PATH = %s\n", pth);
else
printf("DYLD_LIBRARY_PATH is unset\n");return 0;
}
[tgl@pro ~]$ gcc checkenv.c
[tgl@pro ~]$ ./a.out
DYLD_LIBRARY_PATH is unset
[tgl@pro ~]$ export DYLD_LIBRARY_PATH=/Users/tgl/pginstall/lib
[tgl@pro ~]$ ./a.out
DYLD_LIBRARY_PATH = /Users/tgl/pginstall/lib
[tgl@pro ~]$ sh -c ./a.out
DYLD_LIBRARY_PATH is unset
[tgl@pro ~]$ ./a.out
DYLD_LIBRARY_PATH = /Users/tgl/pginstall/lib
[tgl@pro ~]$ bash -c ./a.out
DYLD_LIBRARY_PATH is unset
You have to check the environment using an "unprivileged" program.
If you try to examine the environment using, say, "env", you will get
very misleading results. AFAICT, /usr/bin/env is *also* considered
security-critical, because I cannot get it to ever report that
DYLD_LIBRARY_PATH is set.Hmm ... /usr/bin/perl seems to act the same way. It can see
ENV{'PATH'} but not ENV{'DYLD_LIBRARY_PATH'}.This may indicate that they've applied this policy on a blanket
basis to everything in /bin and /usr/bin (and other system
directories, maybe), rather than singling out the shell.
Looks like it. If I've found the right code here, it looks like where
any common-or-garden Unix runtime linker would ignore LD_LIBRARY_PATH
for a setuid binary, they've trained theirs to whack DYLD_*, and also
for code-signed and __RESTRICT-marked executables.
https://github.com/opensource-apple/dyld/blob/master/src/dyld.cpp#L1681
I suppose you could point SHELL at an unsigned copy of sh (codesign
--remove-signature, or something from brew/ports/x), which GNU make
should respect, but I don't know how many other exec("/bin/sh") calls
might be hiding around the place (I guess perl calls system()?) and
might require some kind of LD_PRELOAD hackery... not much fun.
Thomas Munro <thomas.munro@gmail.com> writes:
On Fri, Oct 15, 2021 at 12:04 PM Tom Lane <tgl@sss.pgh.pa.us> wrote:
This may indicate that they've applied this policy on a blanket
basis to everything in /bin and /usr/bin (and other system
directories, maybe), rather than singling out the shell.
Looks like it. If I've found the right code here, it looks like where
any common-or-garden Unix runtime linker would ignore LD_LIBRARY_PATH
for a setuid binary, they've trained theirs to whack DYLD_*, and also
for code-signed and __RESTRICT-marked executables.
https://github.com/opensource-apple/dyld/blob/master/src/dyld.cpp#L1681
Ugh. That explains it, all right.
I suppose you could point SHELL at an unsigned copy of sh (codesign
--remove-signature, or something from brew/ports/x) so that GNU make
should respect, but I don't know how many other exec("/bin/sh") calls
might be hiding around the place (I guess perl calls system()?) and
might require some kind of LD_PRELOAD hackery... not much fun.
Yeah. I thought about invoking everything via a small wrapper
that restores the correct setting of DYLD_LIBRARY_PATH. We could
perhaps make that work for the invocations of test postmasters
and psqls from "make" and TAP scripts, but hacking up our code's
sundry uses of system(3) like that seems like it'd be very messy,
if feasible at all.
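The wrapper itself would be trivial - roughly the sketch below, assuming the value
was stashed beforehand under a name of our own choosing (PG_DYLD_LIBRARY_PATH here
is made up) that SIP doesn't strip; the mess is in routing everything through it:

  #!/bin/sh
  # re-export the stripped variable, then run the real command
  export DYLD_LIBRARY_PATH="$PG_DYLD_LIBRARY_PATH"
  exec "$@"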
BTW, the POSIX spec explicitly discourages letting SHELL affect the
behavior of system(3), so I bet that wouldn't help.
regards, tom lane
Hi,
On 2021-10-14 22:46:07 -0400, Tom Lane wrote:
Thomas Munro <thomas.munro@gmail.com> writes:
I suppose you could point SHELL at an unsigned copy of sh (codesign
--remove-signature, or something from brew/ports/x) so that GNU make
should respect, but I don't know how many other exec("/bin/sh") calls
might be hiding around the place (I guess perl calls system()?) and
might require some kind of LD_PRELOAD hackery... not much fun.
Yeah. I thought about invoking everything via a small wrapper
that restores the correct setting of DYLD_LIBRARY_PATH. We could
perhaps make that work for the invocations of test postmasters
and psqls from "make" and TAP scripts, but hacking up our code's
sundry uses of system(3) like that seems like it'd be very messy,
if feasible at all.
It does sound like using relative rpaths might be the thing we want - and like
they've been available since 10.5 or something.
Is there a reason we're using absolute rpaths on a bunch of platforms, rather
than relative ones, which'd allow relocation?
Greetings,
Andres Freund
Hi,
On 2021-10-14 19:23:58 -0400, Tom Lane wrote:
Andres Freund <andres@anarazel.de> writes:
If, and perhaps that's too big an if, relative rpaths actually work despite
SIP, it might be worth setting a relative install_rpath, because afaict that
should then work both for a "real" installation and our temporary test one.
From what we know so far, it seems like SIP wouldn't interfere with
that (if it works at all). I think what SIP desires to prevent is
messing with a program's execution by setting DYLD_LIBRARY_PATH.
As long as the program executable itself is saying where to find
the library, I don't see why they should interfere with that.
Well, there's *some* danger with relative rpaths, because they might
accidentally be pointing somewhere non-existing and user-creatable. Not a huge
risk, but as you say:
(Again, it seems blindingly stupid to forbid this while not blocking
PATH or any of the other environment variables that have always affected
execution. But what do I know.)
these aren't necessarily carefuly weighed considerations :/
But it seems to work well from what I gather.
If absolute rpaths are required, it'd make the process a bit more expensive,
It'd also put the kibosh on relocatable install trees, though I dunno how
much people really care about that.
We currently use absolute rpaths, or something equivalent.
The reason that running tests on macos works is that we set the "install_name"
of shared libraries to the intended installed location, using an absolute
path:
LINK.shared = $(COMPILER) -dynamiclib -install_name '$(libdir)/lib$(NAME).$(SO_MAJOR_VERSION)$(DLSUFFIX)' $(version_link) $(exported_symbols_list) -multiply_defined suppress
which on macos means that all libraries linking to that dylib reference it
under that absolute path.
On most other platforms we set an absolute rpath to the installation
directory, which has an equivalent effect:
rpathdir = $(libdir)
It seems to work quite well to change our own references to libpq in binaries
/ shared libs to be relative, but to leave the install_name of the libraries
intact. In combination with adding an rpath of @loader_path/../lib/ to
binaries and @loader_path/ to shlibs, the install will be relocatable.
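As a concrete illustration of that scheme (paths and binary name below are merely
examples), the per-binary fix-up during installation would be along the lines of:

  # make the binary's libpq reference relative, then point an rpath at ../lib
  install_name_tool -change /usr/local/pgsql/lib/libpq.5.dylib \
      '@rpath/libpq.5.dylib' tmp_install/usr/local/bin/initdb
  install_name_tool -add_rpath '@loader_path/../lib' tmp_install/usr/local/bin/initdb
  otool -L tmp_install/usr/local/bin/initdb    # verify the reference changed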
It doesn't work as well to actually have a non-absolute install_name for
libraries (e.g. @rpath/libpq.dylib), because then external code linking to
libpq needs to add an rpath to the installation to make it work.
The advantage of this approach over Peter's is that it's not temp-install
specific - due to the relative paths, it makes installations relocatable
without relying on [DY]LD_LIBRARY_PATH.
On other unixoid systems this whole mess is simpler, because we can just add
$ORIGIN to shared libraries and $ORIGIN/../lib/ to binaries. We don't need to
leave some absolute path in the libraries themselves intact.
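On ELF platforms that's just the usual rpath linker flag, e.g. (a sketch; the quoting
keeps the literal $ORIGIN intact for the linker):

  cc -o initdb ... -Wl,-rpath,'$ORIGIN/../lib'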
Greetings,
Andres Freund
Hi,
On 2021-10-15 11:50:30 -0700, Andres Freund wrote:
It seems to work quite well to change our own references to libpq in binaries
/ shared libs to be relative, but to leave the install_name of the libraries
intact. In combination with adding an rpath of @loader_path/../lib/ to
binaries and @loader_path/ to shlibs, the install will be relocatable.
It doesn't work as well to actually have a non-absolute install_name for
libraries (e.g. @rpath/libpq.dylib), because then external code linking to
libpq needs to add an rpath to the installation to make it work.
The advantage of this approach over Peter's is that it's not temp-install
specific - due to the relative paths, it makes installations relocatable
without relying on [DY]LD_LIBRARY_PATH.
On other unixoid systems this whole mess is simpler, because we can just add
$ORIGIN to shared libraries and $ORIGIN/../lib/ to binaries. We don't need to
leave some absolute path in the libraries themselves intact.
I implemented this for the meson build, and it seems to work nicely. The macos
part was harder than I hoped due to the install_name stuff, which meson
doesn't solve.
https://github.com/anarazel/postgres/commit/a35379c28989469cc4b701a8d7a22422e6302e09
After that the build directory is relocatable.
I don't immediately see a way to do this reasonably for the autoconf
build. We'd need a list of our own shared libraries from somewhere, and then
replace the references after building the objects?
Greetings,
Andres Freund
Hi,
On 2021-10-15 15:36:16 -0700, Andres Freund wrote:
On 2021-10-15 11:50:30 -0700, Andres Freund wrote:
It seems to work quite well to change our own references to libpq in binaries
/ shared libs to be relative, but to leave the install_name of the libraries
intact. In combination with adding an rpath of @loader_path/../lib/ to
binaries and @loader_path/ to shlibs, the install will be relocatable.
It doesn't work as well to actually have a non-absolute install_name for
libraries (e.g. @rpath/libpq.dylib), because then external code linking to
libpq needs to add an rpath to the installation to make it work.
The advantage of this approach over Peter's is that it's not temp-install
specific - due to the relative paths, it makes installations relocatable
without relying on [DY]LD_LIBRARY_PATH.
On other unixoid systems this whole mess is simpler, because we can just add
$ORIGIN to shared libraries and $ORIGIN/../lib/ to binaries. We don't need to
leave some absolute path in the libraries themselves intact.
I implemented this for the meson build, and it seems to work nicely. The macos
part was harder than I hoped due to the install_name stuff, which meson
doesn't solve.
https://github.com/anarazel/postgres/commit/a35379c28989469cc4b701a8d7a22422e6302e09
After that the build directory is relocatable.
Well, now that I think about it, it's still only relocatable in the sense that
postgres itself will continue to work. Outside code linking to e.g. libpq will
get the wrong path after relocating the source tree, due to the absolute
install_name.
But that doesn't seem solvable, unless we make the installed install_name to
be '@rpath/libpq...dylib' and require code linking to libpq to pass
-Wl,-rpath,/path/to/libpq when linking to libpq.
Greetings,
Andres Freund
Hi Tom,
On 2021-10-12 01:37:21 -0700, Andres Freund wrote:
As far as I can tell the only OS that postgres currently supports that
meson doesn't support is HPUX. It'd likely be fairly easy to add
gcc-on-hpux support, a chunk more to add support for the proprietary
ones.
Tom, wrt HPUX on pa-risc, what are your thoughts there? IIRC we gave up
supporting HP's compiler on pa-risc a while ago.
As I said it'd probably not be too hard to add meson support for hpux on hppa,
it's probably just a few branches. But that'd require access somewhere. The
gcc compile farm does not have a hppa member anymore...
I did notice that gcc will declare hppa-hpux obsolete in gcc 12 and will
remove at some point:
"The hppa[12]*-*-hpux10* and hppa[12]*-*-hpux11* configurations targeting 32-bit PA-RISC with HP-UX have been obsoleted and will be removed in a future release."
https://gcc.gnu.org/gcc-12/changes.html
Greetings,
Andres Freund
Andres Freund <andres@anarazel.de> writes:
On 2021-10-12 01:37:21 -0700, Andres Freund wrote:
As far as I can tell the only OS that postgres currently supports that
meson doesn't support is HPUX. It'd likely be fairly easy to add
gcc-on-hpux support, a chunk more to add support for the proprietary
ones.
Tom, wrt HPUX on pa-risc, what are your thoughts there? IIRC we gave up
supporting HP's compiler on pa-risc a while ago.
Right. I am still testing with gcc on HP-PA. I'd kind of like to
keep it running just as an edge case for our spinlock support, but
I'm not sure that I want to do any huge amount of work on meson
to keep that going.
I do have a functioning OpenBSD installation on that machine, so
one alternative if the porting costs look too high is to replace
gaur with an OpenBSD animal. However, last I checked, OpenBSD
was about half the speed of HPUX on that hardware, so I'm not
real eager to go that way. gaur's already about the slowest
animal in the farm :-(
As I said it'd probably not be too hard to add meson support for hpux on hppa,
it's probably just a few branches. But that'd require access somewhere. The
gcc compile farm does not have a hppa member anymore...
If you've got an idea where to look, I could add that to my
to-do queue.
In any case, I don't think we need to consider HPUX as a blocker
for the meson approach. The value-add from keeping gaur going
probably isn't terribly much. I'm more concerned about the
effort involved in getting meson going on some other old animals,
such as prairiedog.
regards, tom lane
Hi,
I know this is still in the evaluation stage, but I did notice some
discrepancies in the Flex flags. With the attached patch, the read-only
data segment seems to match up pretty well now.
--
John Naylor
EDB: http://www.enterprisedb.com
Attachments:
sync-flex-flags-with-autoconf-build.patch
diff --git a/contrib/cube/meson.build b/contrib/cube/meson.build
index 49276aed64..3cf7ebdd8e 100644
--- a/contrib/cube/meson.build
+++ b/contrib/cube/meson.build
@@ -6,7 +6,7 @@ cube_sources = files(
cubescan = custom_target('cubescan',
input: ['cubescan.l'],
output: ['cubescan.c'],
- command: [flex, '-CFe', '-p', '-p', '-o', '@OUTPUT@', '@INPUT@'])
+ command: [flex, '-o', '@OUTPUT@', '@INPUT@'])
cube_sources += custom_target('cubeparse',
input: 'cubeparse.y',
diff --git a/src/backend/parser/meson.build b/src/backend/parser/meson.build
index 491eacf20b..5ce4d09f31 100644
--- a/src/backend/parser/meson.build
+++ b/src/backend/parser/meson.build
@@ -28,7 +28,7 @@ parser_sources = [files('parser.c')]
backend_scanner = custom_target('scan',
input: ['scan.l'],
output: ['scan.c'],
- command: [flex, '-CF', '-p', '-p', '-o', '@OUTPUT@', '@INPUT0@'])
+ command: [flex, '-b', '-CF', '-p', '-p', '-o', '@OUTPUT@', '@INPUT0@'])
parser_sources += backend_scanner[0]
parser_sources += backend_parser_header[0]
diff --git a/src/backend/replication/meson.build b/src/backend/replication/meson.build
index 2573f166d7..ee12c6d49d 100644
--- a/src/backend/replication/meson.build
+++ b/src/backend/replication/meson.build
@@ -17,7 +17,7 @@ backend_sources += files(
repl_scanner = custom_target('repl_scanner',
input : files('repl_scanner.l'),
output : ['repl_scanner.c'],
- command : [flex, '-CF', '-p', '-p', '-o', '@OUTPUT0@', '@INPUT@']
+ command: [flex, '-o', '@OUTPUT@', '@INPUT@']
)
generated_backend_sources += custom_target('repl_gram',
@@ -30,7 +30,7 @@ generated_backend_sources += custom_target('repl_gram',
syncrep_scanner = custom_target('syncrep_scanner',
input: 'syncrep_scanner.l',
output: 'syncrep_scanner.c',
- command: [flex, '-CF', '-p', '-p', '-o', '@OUTPUT0@', '@INPUT@'])
+ command: [flex, '-o', '@OUTPUT@', '@INPUT@'])
generated_backend_sources += custom_target('syncrep_gram',
input: 'syncrep_gram.y',
diff --git a/src/backend/utils/adt/meson.build b/src/backend/utils/adt/meson.build
index 086fde8ff0..e1cea1eb4e 100644
--- a/src/backend/utils/adt/meson.build
+++ b/src/backend/utils/adt/meson.build
@@ -109,7 +109,7 @@ backend_sources += files(
jsonpath_scan = custom_target('jsonpath_scan',
input: ['jsonpath_scan.l'],
output: ['jsonpath_scan.c'],
- command: [flex, '-CF', '-p', '-p', '-o', '@OUTPUT@', '@INPUT@'])
+ command: [flex, '-b', '-CF', '-p', '-p', '-o', '@OUTPUT@', '@INPUT@'])
# jsonpath_scan is compiled as part of jsonpath_gram
generated_backend_sources += custom_target('jsonpath_parse',
diff --git a/src/backend/utils/misc/meson.build b/src/backend/utils/misc/meson.build
index 5274c8aa1a..2c0090ad33 100644
--- a/src/backend/utils/misc/meson.build
+++ b/src/backend/utils/misc/meson.build
@@ -18,7 +18,7 @@ backend_sources += files(
guc_scan = custom_target('guc_scan',
input: ['guc-file.l'],
output: ['guc-file.c.h'],
- command: [flex, '-CF', '-p', '-p', '-o', '@OUTPUT@', '@INPUT@'])
+ command: [flex, '-o', '@OUTPUT@', '@INPUT@'])
generated_backend_sources += guc_scan
diff --git a/src/bin/pgbench/meson.build b/src/bin/pgbench/meson.build
index 5c4a778ff3..bc135abebf 100644
--- a/src/bin/pgbench/meson.build
+++ b/src/bin/pgbench/meson.build
@@ -9,7 +9,7 @@ pgbench_sources = files(
exprscan = custom_target('exprscan',
input : files('exprscan.l'),
output : ['exprscan.c'],
- command : [flex, '-CF', '-p', '-p', '-o', '@OUTPUT0@', '@INPUT@']
+ command : [flex, '-o', '@OUTPUT0@', '@INPUT@']
)
exprparse = custom_target('exprparse',
diff --git a/src/bin/psql/meson.build b/src/bin/psql/meson.build
index 98921f801d..e56beb28e1 100644
--- a/src/bin/psql/meson.build
+++ b/src/bin/psql/meson.build
@@ -18,7 +18,7 @@ psql_sources = files(
psql_sources += custom_target('psqlscanslash',
input: ['psqlscanslash.l'],
output: ['psqlscanslash.c'],
- command: [flex, '-CFe', '-p', '-p', '-o', '@OUTPUT@', '@INPUT@'])
+ command: [flex, '-b', '-Cfe', '-p', '-p', '-o', '@OUTPUT@', '@INPUT@'])
psql_sources += custom_target('psql_help',
input: ['create_help.pl'],
diff --git a/src/fe_utils/meson.build b/src/fe_utils/meson.build
index b305727d96..e3f0b34cf1 100644
--- a/src/fe_utils/meson.build
+++ b/src/fe_utils/meson.build
@@ -16,7 +16,7 @@ fe_utils_sources = files(
fe_utils_sources += custom_target('psqlscan',
input: ['psqlscan.l'],
output: ['psqlscan.c'],
- command: [flex, '-Cfe', '-p', '-p', '-o', '@OUTPUT@', '@INPUT@'])
+ command: [flex, '-b', '-Cfe', '-p', '-p', '-o', '@OUTPUT@', '@INPUT@'])
fe_utils = static_library('fe_utils',
fe_utils_sources + generated_headers,
diff --git a/src/test/isolation/meson.build b/src/test/isolation/meson.build
index 637b480755..ea8baa2063 100644
--- a/src/test/isolation/meson.build
+++ b/src/test/isolation/meson.build
@@ -8,7 +8,7 @@ isolation_sources = pg_regress_c + files(
spec_scanner = custom_target('specscanner',
input : files('specscanner.l'),
output : ['specscanner.c'],
- command : [flex, '-CF', '-p', '-p', '-o', '@OUTPUT0@', '@INPUT@']
+ command : [flex, '-o', '@OUTPUT0@', '@INPUT@']
)
isolationtester_sources = files('isolationtester.c')
Hi,
On 2021-10-19 17:57:31 -0400, John Naylor wrote:
I know this is still in the evaluation stage, but I did notice some
discrepancies in the Flex flags. With the attached patch, the read-only
data segment seems to match up pretty well now.
Good catch. I think I just copied them around...
I wish we had a bit more consistency in the flags, so we could centralize
them. Seems there's no reason to not use -p -p and -b everywhere?
I also need to make meson use our flex wrapper for the relevant versions... I
can see the warning that'd be fixed by it on macos CI. Will do that and push
it out to my github repo together with your changes.
Thanks!
Andres
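As a rough illustration of what such centralization could look like in a meson.build (a sketch only; the variable names are made up here, not taken from the patchset), the common flags would live in one list and each scanner rule would just append what it needs:
# Sketch: shared flags defined once, concatenated into each scanner rule.
flex = find_program('flex')
flex_common_flags = ['-p', '-p']   # report performance-degrading constructs

cubescan = custom_target('cubescan',
  input: 'cubescan.l',
  output: 'cubescan.c',
  command: [flex] + flex_common_flags + ['-o', '@OUTPUT@', '@INPUT@'],
)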
Hi,
On 2021-10-19 15:22:15 -0400, Tom Lane wrote:
Andres Freund <andres@anarazel.de> writes:
On 2021-10-12 01:37:21 -0700, Andres Freund wrote:
As far as I can tell the only OS that postgres currently supports that
meson doesn't support is HPUX. It'd likely be fairly easy to add
gcc-on-hpux support, a chunk more to add support for the proprietary
ones.
Tom, wrt HPUX on pa-risc, what are your thoughts there? IIRC we gave up
supporting HP's compiler on pa-risc a while ago.
Right. I am still testing with gcc on HP-PA. I'd kind of like to
keep it running just as an edge case for our spinlock support, but
I'm not sure that I want to do any huge amount of work on meson
to keep that going.
Makes sense. While that does test an odd special case for our spinlock
implementation, it's also the only supported platform with that edge case, and
it seems extremely unlikely that there ever will be a new platform with such
odd/limited atomic operations.
I do have a functioning OpenBSD installation on that machine, so
one alternative if the porting costs look too high is to replace
gaur with an OpenBSD animal. However, last I checked, OpenBSD
was about half the speed of HPUX on that hardware, so I'm not
real eager to go that way. gaur's already about the slowest
animal in the farm :-(
Yea, that doesn't sound enticing. Seems like we either should keep it running
on hp-ux or just drop parisc support?
As I said it'd probably not be too hard to add meson support for hpux on hppa,
it's probably just a few branches. But that'd require access somewhere. The
gcc compile farm does not have a hppa member anymore...
If you've got an idea where to look, I could add that to my to-do queue.
It might even just work. Looks like meson does have pa-risc detection. While
it doesn't have any specifically for hpux, it just falls back to python's
sys.platform in that case:
python3 -c 'import sys;print(sys.platform)'
meson generates output for ninja to execute (basically a faster make, partly
faster because it is much less flexible; it's intended to be generated by more
user-friendly build systems). Ninja can be built by a minimal python script,
or with cmake. The former doesn't seem to have hpux support, but the latter
does, I think.
https://github.com/ninja-build/ninja
So it could be interesting to see if ninja builds.
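For a sense of scale, the kind of branch that would be needed is just a check on meson's detected host system and CPU, along these lines (hypothetical sketch; the exact 'hp-ux' system string and the compiler flag are assumptions, not something verified on real hardware):
# Sketch: host_machine reports what meson detected; an HP-UX/PA-RISC special
# case would be one more branch here. The 'hp-ux' string and the flag are
# guesses, not verified.
host_system = host_machine.system()      # falls back to python's sys.platform
host_cpu = host_machine.cpu_family()     # e.g. 'x86_64', 'ppc', 'parisc'

if host_system == 'hp-ux' and host_cpu == 'parisc'
  add_project_arguments('-D_XOPEN_SOURCE_EXTENDED', language: 'c')
endif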
I've not taught the PG meson build the necessary stuff for a 32-bit build, so
there's not much point yet in trying whether meson works there. I'll try to do
that and let you know.
I'm more concerned about the effort involved in getting meson going on some
other old animals, such as prairiedog.
Yea, that's an *old* OS version. One version too old to have support for
@rpath, added in 10.5 :(. Is there a reason to run 10.4 specifically?
According to wikipedia 10.5 is the last version to support ppc.
Looks like python still supports building back to 10.4.
Greetings,
Andres Freund
Andres Freund <andres@anarazel.de> writes:
I wish we had a bit more consistency in the flags, so we could centralize
them. Seems there's no reason to not use -p -p and -b everywhere?
I don't think we care enough about performance of most of the scanners
to make them all backup-free, so -1 to that idea.
We could possibly replace the command line switches with %option
entries in the files themselves. But I think the reason we haven't
done so for -b is that the Makefile still needs to know about it
so as to know what to do with the lex.backup output file.
regards, tom lane
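In meson terms that means the build rule itself has to know about lex.backup too; one way to sketch it is a small wrapper that runs flex with -b and inspects the backup report (the wrapper script and its options below are purely hypothetical, for illustration):
# Sketch only: flex_wrapper.py is a made-up helper that runs flex -b, checks
# lex.backup, and fails the build if the scanner needs backing up.
flex = find_program('flex')
flex_wrapper = find_program('flex_wrapper.py')

scan_c = custom_target('scan',
  input: 'scan.l',
  output: 'scan.c',
  command: [flex_wrapper, '--flex', flex, '--backup-check',
            '-o', '@OUTPUT@', '@INPUT@'],
)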
Andres Freund <andres@anarazel.de> writes:
On 2021-10-19 15:22:15 -0400, Tom Lane wrote:
I'm more concerned about the effort involved in getting meson going on some
other old animals, such as prairiedog.
Yea, that's an *old* OS version. One version too old to have support for
@rpath, added in 10.5 :(. Is there a reason to run 10.4 specifically?
According to wikipedia 10.5 is the last version to support ppc.
My notes say
Currently running OSX 10.4.11 (last release of Tiger); although 10.5 Leopard
supports PPCs, it refuses to install if CPU speed < 867MHz, well beyond the
Cube's ability. Wikipedia does suggest it's possible to run Leopard, but...
https://en.wikipedia.org/wiki/Mac_OS_X_Leopard#Usage_on_unsupported_hardware
I'm not sure that I have install media for 10.5 anymore, either --- ISTR
some machine's CD drive failing and not letting me get the CD back out.
If I did have it, I don't think there'd be a way to update past 10.5.0
(surely Apple no longer has those updaters on-line?), so on the whole
I think that path is a nonstarter.
I do have 10.5 running on an old G4 PowerMac, but that machine is (a)
noisy (b) power-hungry and (c) getting flaky, so I'm uneager to spin up
a buildfarm animal on it.
As with the HPPA, a potential compromise is to spin up some newer
BSD-ish system on it. I agree that OSX 10.4 is uninteresting as a
software platform, but I'd like to keep 32-bit PPC represented in
the farm.
regards, tom lane
Hi,
On 2021-10-19 17:31:22 -0700, Andres Freund wrote:
I also need to make meson use our flex wrapper for the relevant versions... I
can see the warning that'd be fixed by it on macos CI. Will do that and push
it out to my github repo together with your changes.
That turned out to be more work than I anticipated, so I pushed your changes
out separately.
There's this bit in pgflex.pl that talks about adjusting yywrap() for msvc. I
didn't implement that and didn't see any compilation problems. Looks like that
originally hails from 2011, in 08a0c2dabc3b9d59d72d7a79ed867b8e37d275a7
Hm. Seems not worth carrying forward unless it actually causes trouble?
Greetings,
Andres Freund
Hi,
On 2021-10-19 21:26:53 -0400, Tom Lane wrote:
My notes say
Currently running OSX 10.4.11 (last release of Tiger); although 10.5 Leopard
supports PPCs, it refuses to install if CPU speed < 867MHz, well beyond the
Cube's ability. Wikipedia does suggest it's possible to run Leopard, but...
https://en.wikipedia.org/wiki/Mac_OS_X_Leopard#Usage_on_unsupported_hardware
I'm not sure that I have install media for 10.5 anymore, either --- ISTR
some machine's CD drive failing and not letting me get the CD back out.
If I did have it, I don't think there'd be a way to update past 10.5.0
(surely Apple no longer has those updaters on-line?), so on the whole
I think that path is a nonstarter.
That does indeed sound like a nonstarter.
I do have 10.5 running on an old G4 PowerMac, but that machine is (a)
noisy (b) power-hungry and (c) getting flaky, so I'm uneager to spin up
a buildfarm animal on it.
Understandable.
As with the HPPA, a potential compromise is to spin up some newer
BSD-ish system on it. I agree that OSX 10.4 is uninteresting as a
software platform, but I'd like to keep 32-bit PPC represented in
the farm.
I assume the reason 32-bit PPC is interesting is that it's commonly run big
endian?
I wonder when it'll be faster to run 32bit ppc via qemu than natively :)
Greetings,
Andres Freund
Andres Freund <andres@anarazel.de> writes:
On 2021-10-19 21:26:53 -0400, Tom Lane wrote:
As with the HPPA, a potential compromise is to spin up some newer
BSD-ish system on it. I agree that OSX 10.4 is uninteresting as a
software platform, but I'd like to keep 32-bit PPC represented in
the farm.
I assume the reason 32-bit PPC is interesting is that it's commonly run big
endian?
Aside from bit width and endianness, I believe it's a somewhat smaller
instruction set than the newer CPUs.
I wonder when it'll be faster to run 32bit ppc via qemu than natively :)
I think qemu would have a ways to go for that. More to the point,
I've found that its emulation is not as precise as one might wish...
regards, tom lane
Hi,
On 2021-10-19 18:49:43 -0700, Andres Freund wrote:
I wonder when it'll be faster to run 32bit ppc via qemu than natively :)
Freebsd didn't seem to want to boot, but surprisingly a debian buster image
started at least the installer without problems... Will probably take a while
to see if it actually works.
I assume to make it acceptable from a build-speed perspective one would have
to use distcc with the compiler running outside.
Greetings,
Andres Freund
Hi,
On 2021-10-19 19:41:56 -0700, Andres Freund wrote:
On 2021-10-19 18:49:43 -0700, Andres Freund wrote:
I wonder when it'll be faster to run 32bit ppc via qemu than natively :)
Freebsd didn't seem to want to boot, but surprisingly a debian buster image
started at least the installer without problems... Will probably take a while
to see if it actually works.
The build was quite slow (cold ccache cache, only 1 cpu):
real 106m33.418s
user 86m36.363s
sys 17m33.830s
But the actual test time wasn't *too* bad, compared to the 32bit ppc animals
real 12m14.944s
user 0m51.622s
sys 0m44.743s
Greetings,
Andres Freund
Hi,
On 2021-10-12 15:55:22 -0400, John Naylor wrote:
Also, could utility makefile targets be made to work? I'm thinking in
particular of update-unicode and reformat-dat-files, for example.
Implementing reformat-dat-files was trivial:
https://github.com/anarazel/postgres/commit/29c1ce1ad4731290714978da5ce81e99ef051bec
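(Roughly speaking, and not the exact contents of that commit, such utility targets boil down to exposing the existing perl script through run_target; the script arguments are elided here:)
# Sketch, assuming this lives in the top-level meson.build; the real rule
# also has to pass the .dat files and an output directory explicitly, since
# run_target executes from the build root rather than the catalog directory.
perl = find_program('perl')

run_target('reformat-dat-files',
  command: [perl, files('src/include/catalog/reformat_dat_file.pl')],
)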
However, update-unicode is a bit harder. Partially not directly because of
meson, but because update-unicode as-is afaict doesn't support VPATH builds,
and meson enforces those.
make update-unicode
...
make -C src/common/unicode update-unicode
'/usr/bin/perl' generate-unicode_norm_table.pl
Can't open perl script "generate-unicode_norm_table.pl": No such file or directory
It's not too hard to fix. See attached for the minimal stuff that I
immediately found to be needed. There's likely more,
e.g. src/backend/utils/mb/Unicode - but I didn't immediately see where that's
invoked from.
The slightly bigger issue making update-unicode work with meson is that meson
doesn't provide support for invoking build targets in specific directories
(because it doesn't map nicely to e.g. msbuild). But scripts like
src/common/unicode/generate-unicode_norm_table.pl rely on CWD. It's not hard
to work around that, but IMO it's better for such scripts to not rely on CWD.
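To make that concrete, a meson rule would pass every path explicitly rather than relying on the working directory, roughly like this (sketch only; the --outfile option is an assumption about how the script would be adjusted):
# Sketch: inputs and output handed to the script as explicit arguments, so it
# no longer reads/writes relative to its CWD. '--outfile' is a hypothetical
# option the script would have to grow.
perl = find_program('perl')

unicode_norm_table_h = custom_target('unicode_norm_table',
  input: ['generate-unicode_norm_table.pl',
          'UnicodeData.txt',
          'CompositionExclusions.txt'],
  output: 'unicode_norm_table.h',
  command: [perl, '@INPUT0@', '@INPUT1@', '@INPUT2@',
            '--outfile', '@OUTPUT@'],
)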
Greetings,
Andres Freund
Attachments:
update-unicode.diff (text/x-diff; charset=us-ascii)
diff --git i/src/common/unicode/Makefile w/src/common/unicode/Makefile
index a3683dd86b9..e69054d4671 100644
--- i/src/common/unicode/Makefile
+++ w/src/common/unicode/Makefile
@@ -12,14 +12,14 @@ subdir = src/common/unicode
top_builddir = ../../..
include $(top_builddir)/src/Makefile.global
-override CPPFLAGS := -DFRONTEND $(CPPFLAGS)
+override CPPFLAGS := -DFRONTEND -I$(abs_top_builddir)/src/common/unicode $(CPPFLAGS)
LIBS += $(PTHREAD_LIBS)
# By default, do nothing.
all:
update-unicode: unicode_norm_table.h unicode_combining_table.h unicode_east_asian_fw_table.h unicode_normprops_table.h unicode_norm_hashfunc.h
- mv $^ ../../../src/include/common/
+ mv $^ $(top_srcdir)/src/include/common/
$(MAKE) normalization-check
# These files are part of the Unicode Character Database. Download
@@ -33,7 +33,7 @@ UnicodeData.txt EastAsianWidth.txt DerivedNormalizationProps.txt CompositionExcl
unicode_norm_hashfunc.h: unicode_norm_table.h
unicode_norm_table.h: generate-unicode_norm_table.pl UnicodeData.txt CompositionExclusions.txt
- $(PERL) generate-unicode_norm_table.pl
+ $(PERL) $^
unicode_combining_table.h: generate-unicode_combining_table.pl UnicodeData.txt
$(PERL) $^ >$@
@@ -58,7 +58,7 @@ submake-common:
$(MAKE) -C .. all
norm_test_table.h: generate-norm_test_table.pl NormalizationTest.txt
- perl generate-norm_test_table.pl NormalizationTest.txt $@
+ perl $^ $@
.PHONY: normalization-check
diff --git i/contrib/unaccent/Makefile w/contrib/unaccent/Makefile
index b8307d1601e..d6c466e07ad 100644
--- i/contrib/unaccent/Makefile
+++ w/contrib/unaccent/Makefile
@@ -27,12 +27,12 @@ include $(top_builddir)/src/Makefile.global
include $(top_srcdir)/contrib/contrib-global.mk
endif
-update-unicode: unaccent.rules
+update-unicode: $(srcdir)/unaccent.rules
# Allow running this even without --with-python
PYTHON ?= python
-unaccent.rules: generate_unaccent_rules.py ../../src/common/unicode/UnicodeData.txt Latin-ASCII.xml
+$(srcdir)/unaccent.rules: generate_unaccent_rules.py ../../src/common/unicode/UnicodeData.txt Latin-ASCII.xml
$(PYTHON) $< --unicode-data-file $(word 2,$^) --latin-ascii-file $(word 3,$^) >$@
# Only download it once; dependencies must match src/common/unicode/
On Thu, Oct 21, 2021 at 5:48 PM Andres Freund <andres@anarazel.de> wrote:
However, update-unicode is a bit harder. Partially not directly because of
meson, but because update-unicode as-is afaict doesn't support VPATH builds,
and meson enforces those.
make update-unicode
...
make -C src/common/unicode update-unicode
'/usr/bin/perl' generate-unicode_norm_table.pl
Can't open perl script "generate-unicode_norm_table.pl": No such file or directory
It's not too hard to fix. See attached for the minimal stuff that I
immediately found to be needed.
Thanks for doing that, it works well enough for demonstration. With your
patch, and using an autoconf VPATH build, the unicode tables work fine, but
it complains of a permission error in generate_unaccent_rules.py. That
seems to be because the script is invoked directly rather than as an
argument to the python interpreter.
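That failure mode goes away once the rule runs the script through the interpreter explicitly; in meson terms it would look roughly like the following sketch (the option names are taken from the existing Makefile rule, the rest is assumed):
# Sketch: invoke the generator via the python interpreter instead of relying
# on the script being executable; stdout is captured into unaccent.rules,
# mirroring the Makefile's ">$@" redirection.
python = find_program('python3')

unaccent_rules = custom_target('unaccent.rules',
  input: ['generate_unaccent_rules.py',
          '../../src/common/unicode/UnicodeData.txt',
          'Latin-ASCII.xml'],
  output: 'unaccent.rules',
  command: [python, '@INPUT0@',
            '--unicode-data-file', '@INPUT1@',
            '--latin-ascii-file', '@INPUT2@'],
  capture: true,
)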
The slightly bigger issue making update-unicode work with meson is that meson
doesn't provide support for invoking build targets in specific directories
(because it doesn't map nicely to e.g. msbuild). But scripts like
src/common/unicode/generate-unicode_norm_table.pl rely on CWD. It's not hard
to work around that, but IMO it's better for such scripts to not rely on CWD.
Yeah. I encountered a further issue: With autoconf on HEAD, with a source
tree build executed in contrib/unaccent:
$ touch generate_unaccent_rules.py
$ make update-unicode
generate_unaccent_rules.py --unicode-data-file
../../src/common/unicode/UnicodeData.txt --latin-ascii-file Latin-ASCII.xml
unaccent.rules
/bin/sh: generate_unaccent_rules.py: command not found
make: *** [unaccent.rules] Error 127
make: *** Deleting file `unaccent.rules'
...so in this case it seems not to know to use CWD here.
Anyway, this can be put off until the very end, since it's not run often.
You've demonstrated how these targets would work, and that's good enough
for now.
--
John Naylor
EDB: http://www.enterprisedb.com
Hi,
Attached is an updated version of the meson patchset.
Changes:
- support for remaining binaries in src/bin, contrib modules
- nearly all tests, including src/test/modules etc, are integrated.
- quite a few more, but not yet all, optional dependencies (most are
exercised in the included CI)
- runs tests on SIP enabled macos without needing a prior installation /
installation is relocatable
- support for building docs.
I couldn't get dbtoepub to work in a vpath-style build, so I changed that
to also use pandoc. No idea if anybody uses the epub rules?
- 32bit x86 [1], 64bit aarch64 builds
- cross-building windows from linux works
- error when building with meson against a source tree with an in-tree
autoconf build (leads to problems with pg_config.h etc)
- update-unicode, reformat-dat-files, expand-dat-files
Bigger missing pieces:
- pgxs (that's a *hard* one)
- NLS
- test / add support for platforms besides freebsd, linux, macos, windows
- remaining hardcoded configure tests (e.g. ACCEPT_TYPE_ARG*)
- win32 resource files only handled for two binaries, needs to be made
more compact
- ecpg
- fixing up flex output
- truckloads of polishing
- some tests (e.g. pg_upgrade, because of the upcoming tap conversion,
other tests that are shell scripts). Some tests are now run
unconditionally that previously were opt-in.
- what exactly gets installed where
- a "dist" target
- fix "ldap" build on macos
Greetings,
Andres Freund
[1]: I had not defined SIZEOF_SIZE_T. Surprisingly that still results in a successful 64bit build, but not a successful 32bit build.
Attachments:
v5-0001-ci-backend-windows-DONTMERGE-crash-reporting-back.patch (text/x-diff; charset=us-ascii)
From 48d06672d142ffed46faaf0ab4d2fece67534dce Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Thu, 9 Sep 2021 17:49:39 -0700
Subject: [PATCH v5 01/16] ci: backend: windows: DONTMERGE: crash reporting
(backend).
---
src/backend/main/main.c | 14 +++++++++++++-
1 file changed, 13 insertions(+), 1 deletion(-)
diff --git a/src/backend/main/main.c b/src/backend/main/main.c
index ad84a45e28c..65a325723fd 100644
--- a/src/backend/main/main.c
+++ b/src/backend/main/main.c
@@ -26,6 +26,10 @@
#include <sys/param.h>
#endif
+#if defined(WIN32)
+#include <crtdbg.h>
+#endif
+
#if defined(_M_AMD64) && _MSC_VER == 1800
#include <math.h>
#include <versionhelpers.h>
@@ -238,7 +242,15 @@ startup_hacks(const char *progname)
}
/* In case of general protection fault, don't show GUI popup box */
- SetErrorMode(SEM_FAILCRITICALERRORS | SEM_NOGPFAULTERRORBOX);
+ SetErrorMode(SEM_FAILCRITICALERRORS /* | SEM_NOGPFAULTERRORBOX */);
+
+ _CrtSetReportMode(_CRT_ASSERT, _CRTDBG_MODE_FILE | _CRTDBG_MODE_DEBUG);
+ _CrtSetReportMode(_CRT_ERROR, _CRTDBG_MODE_FILE | _CRTDBG_MODE_DEBUG);
+ _CrtSetReportFile(_CRT_ASSERT, _CRTDBG_FILE_STDERR);
+ _CrtSetReportFile(_CRT_ERROR, _CRTDBG_FILE_STDERR);
+#ifndef __MINGW64__
+ _set_abort_behavior(_CALL_REPORTFAULT | _WRITE_ABORT_MSG, _CALL_REPORTFAULT | _WRITE_ABORT_MSG);
+#endif
#if defined(_M_AMD64) && _MSC_VER == 1800
--
2.23.0.385.gbc12974a89
v5-0002-ci-Add-CI-for-FreeBSD-Linux-MacOS-and-Windows-uti.patch (text/x-diff; charset=us-ascii)
From 2e4bba81f06ab55fad639e9657b71c5db98b6252 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Mon, 15 Mar 2021 09:25:15 -0700
Subject: [PATCH v5 02/16] ci: Add CI for FreeBSD, Linux, MacOS and Windows,
utilizing cirrus-ci.
---
.cirrus.yml | 395 ++++++++++++++++++++++++++++++++
.dockerignore | 3 +
ci/docker/linux_debian_bullseye | 13 ++
ci/docker/windows_vs_2019 | 111 +++++++++
ci/freebsd_gcp_repartition.sh | 28 +++
ci/pg_ci_base.conf | 12 +
ci/windows_build_config.pl | 13 ++
7 files changed, 575 insertions(+)
create mode 100644 .cirrus.yml
create mode 100644 .dockerignore
create mode 100644 ci/docker/linux_debian_bullseye
create mode 100644 ci/docker/windows_vs_2019
create mode 100755 ci/freebsd_gcp_repartition.sh
create mode 100644 ci/pg_ci_base.conf
create mode 100644 ci/windows_build_config.pl
diff --git a/.cirrus.yml b/.cirrus.yml
new file mode 100644
index 00000000000..f75bdce6dec
--- /dev/null
+++ b/.cirrus.yml
@@ -0,0 +1,395 @@
+env:
+ # accelerate initial clone, but a bit of depth so that concurrent tasks work
+ CIRRUS_CLONE_DEPTH: 100
+ # Useful to be able to analyse what in a script takes long
+ CIRRUS_LOG_TIMESTAMP: true
+ # target to test, for all but windows
+ CHECK: check-world
+ CHECKFLAGS: -Otarget
+ PGCTLTIMEOUT: 120
+ CCACHE_MAXSIZE: "500M"
+ TEMP_CONFIG: ${CIRRUS_WORKING_DIR}/ci/pg_ci_base.conf
+ PG_TEST_EXTRA: kerberos ldap ssl
+
+
+task:
+ name: FreeBSD
+ only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*freebsd.*'
+ compute_engine_instance:
+ image_project: pg-vm-images-aio
+ image: family/pg-aio-freebsd-13-0
+ platform: freebsd
+ cpu: 2
+ memory: 2G
+ disk: 50
+ env:
+ CCACHE_DIR: "/tmp/ccache_dir"
+
+ ccache_cache:
+ folder: "/tmp/ccache_dir"
+ sysinfo_script:
+ - export || true
+ sysconfig_script:
+ - sudo sysctl kern.corefile='/tmp/%N.%P.core'
+ repartition_script:
+ - ci/freebsd_gcp_repartition.sh
+ create_user_script:
+ - pw useradd postgres
+ - chown -R postgres:postgres .
+ - mkdir -p /tmp/ccache_dir
+ - chown -R postgres:postgres /tmp/ccache_dir
+
+ configure_script: |
+ su postgres -c './configure \
+ --enable-cassert --enable-debug --enable-tap-tests \
+ --enable-nls \
+ \
+ --with-icu \
+ --with-ldap \
+ --with-libxml \
+ --with-libxslt \
+ \
+ --with-lz4 \
+ --with-pam \
+ --with-perl \
+ --with-python \
+ --with-ssl=openssl \
+ --with-tcl --with-tclconfig=/usr/local/lib/tcl8.6/ \
+ --with-uuid=bsd \
+ \
+ --with-includes=/usr/local/include --with-libs=/usr/local/lib \
+ CC="ccache cc"'
+ build_script:
+ - su postgres -c 'gmake -s -j3 && gmake -s -j3 -C contrib'
+ upload_caches:
+ - ccache
+
+ tests_script:
+ - su postgres -c 'time gmake -s -j2 ${CHECK} ${CHECKFLAGS}'
+
+ on_failure:
+ cores_script: |
+ for corefile in $(find /tmp -name '*.core' 2>/dev/null) ; do
+ binary=$(gdb -quiet -core $corefile -batch -ex 'info auxv' | grep AT_EXECPATH | perl -pe "s/^.*\"(.*)\"\$/\$1/g") ;
+ echo dumping $corefile for $binary ;
+ gdb --batch --quiet -ex "thread apply all bt full" -ex "quit" $binary $corefile;
+ done
+ log_artifacts:
+ path: "**/**.log"
+ type: text/plain
+ regress_diffs_artifacts:
+ path: "**/**.diffs"
+ type: text/plain
+ tap_artifacts:
+ path: "**/regress_log_*"
+ type: text/plain
+
+
+task:
+ name: Linux
+ only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*linux.*'
+ compute_engine_instance:
+ image_project: pg-vm-images-aio
+ image: family/pg-aio-bullseye
+ platform: linux
+ cpu: 4
+ memory: 2G
+ nested_virtualization: false
+ env:
+ CCACHE_DIR: "/tmp/ccache_dir"
+ DEBUGINFOD_URLS: "https://debuginfod.debian.net"
+
+ ccache_cache:
+ folder: "/tmp/ccache_dir"
+
+ sysinfo_script:
+ - id
+ - uname -a
+ - cat /proc/cmdline
+ - lsblk
+ - ulimit -a -H
+ - ulimit -a -S
+ - export
+ sysconfig_script:
+ - useradd -m postgres
+ - chown -R postgres:postgres .
+ - mkdir -p /tmp/ccache_dir
+ - chown -R postgres:postgres /tmp/ccache_dir
+ - echo '* - memlock 134217728' > /etc/security/limits.d/postgres.conf
+ - su postgres -c 'ulimit -l -H'
+ - su postgres -c 'ulimit -l -S'
+ - echo '/tmp/%e-%s-%p.core' > /proc/sys/kernel/core_pattern
+
+ configure_script: |
+ su postgres -c './configure \
+ --enable-cassert --enable-debug --enable-tap-tests \
+ --enable-nls \
+ \
+ --with-gssapi \
+ --with-icu \
+ --with-ldap \
+ --with-libxml \
+ --with-libxslt \
+ --with-llvm \
+ --with-lz4 \
+ --with-pam \
+ --with-perl \
+ --with-python \
+ --with-ssl=openssl \
+ --with-systemd \
+ --with-tcl --with-tclconfig=/usr/lib/tcl8.6/ \
+ --with-uuid=e2fs \
+ \
+ CC="ccache gcc" CXX="ccache g++" CLANG="ccache clang" CFLAGS="-O0 -ggdb"'
+ build_script:
+ - su postgres -c 'make -s -j4 && make -s -j4 -C contrib'
+ upload_caches:
+ - ccache
+
+ tests_script: |
+ su postgres -c '\
+ ulimit -c unlimited; \
+ make -s ${CHECK} ${CHECKFLAGS} -j8 \
+ '
+
+ on_failure:
+ cores_script: |
+ for corefile in $(find /tmp -name '*.core' 2>/dev/null) ; do
+ binary=$(gdb -quiet -core $corefile -batch -ex 'info auxv' | grep AT_EXECFN | perl -pe "s/^.*\"(.*)\"\$/\$1/g") ;
+ echo dumping $corefile for $binary ;
+ gdb --batch --quiet -ex "thread apply all bt full" -ex "quit" $binary $corefile ;
+ done
+ log_artifacts:
+ path: "**/**.log"
+ type: text/plain
+ regress_diffs_artifacts:
+ path: "**/**.diffs"
+ type: text/plain
+ tap_artifacts:
+ path: "**/regress_log_*"
+ type: text/plain
+
+
+task:
+ name: macOS
+ only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*(macos|darwin|osx).*'
+ osx_instance:
+ image: big-sur-base
+ env:
+ CIRRUS_WORKING_DIR: ${HOME}/pgsql/
+ TEMP_CONFIG: ${CIRRUS_WORKING_DIR}/ci/pg_ci_base.conf
+ CCACHE_DIR: ${HOME}/ccache
+ HOMEBREW_CACHE: ${HOME}/homebrew-cache
+ PERL5LIB: ${HOME}/perl5/lib/perl5
+
+ sysinfo_script:
+ - id
+ - export
+ ccache_cache:
+ folder: ${CCACHE_DIR}
+ homebrew_cache:
+ folder: ${HOMEBREW_CACHE}
+ perl_cache:
+ folder: ~/perl5
+
+ cpan_install_script:
+ - perl -mIPC::Run -e 1 || cpan -T IPC::Run
+ - perl -mIO::Pty -e 1 || cpan -T IO::Pty
+ upload_caches:
+ - perl
+ core_install_script:
+ - sudo chmod 777 /cores
+ homebrew_install_script:
+ - brew install make coreutils ccache icu4c lz4 tcl-tk openldap
+ upload_caches:
+ - homebrew
+
+ configure_script: |
+ LIBS="/usr/local/lib:$LIBS"
+ INCLUDES="/usr/local/include:$INCLUDES"
+
+ INCLUDES="/usr/local/opt/openssl/include:$INCLUDES"
+ LIBS="/usr/local/opt/openssl/lib:$LIBS"
+
+ PKG_CONFIG_PATH="/usr/local/opt/icu4c/lib/pkgconfig:$PKG_CONFIG_PATH"
+ INCLUDES="/usr/local/opt/icu4c/include:$INCLUDES"
+ LIBS="/usr/local/opt/icu4c/lib:$LIBS"
+
+ LIBS="/usr/local/opt/openldap/lib:$LIBS"
+ INCLUDES="/usr/local/opt/openldap/include:$INCLUDES"
+
+ export PKG_CONFIG_PATH
+
+ ./configure \
+ --prefix=$HOME/install \
+ --with-includes="$INCLUDES" \
+ --with-libs="$LIBS" \
+ \
+ --enable-cassert --enable-debug --enable-tap-tests \
+ --enable-nls \
+ \
+ --with-icu \
+ --with-ldap \
+ --with-libxml \
+ --with-libxslt \
+ \
+ --with-lz4 \
+ --with-perl \
+ --with-python \
+ --with-ssl=openssl \
+ --with-tcl --with-tclconfig=/usr/local/opt/tcl-tk/lib/ \
+ --with-uuid=e2fs \
+ \
+ CC="ccache gcc" CFLAGS="-O0 -ggdb"
+ build_script:
+ - gmake -s -j12 && gmake -s -j12 -C contrib
+ upload_caches:
+ - ccache
+
+ tests_script:
+ - ulimit -c unlimited
+ - ulimit -n 1024
+ - gmake -s -j12 ${CHECK} ${CHECKFLAGS}
+
+ on_failure:
+ cores_script: |
+ for corefile in $(find /cores/ -name 'core.*' 2>/dev/null) ; do
+ lldb -c $corefile --batch -o 'thread backtrace all' -o 'quit' ;
+ done
+ log_artifacts:
+ path: "**/**.log"
+ type: text/plain
+ regress_diffs_artifacts:
+ path: "**/**.diffs"
+ type: text/plain
+ tap_artifacts:
+ path: "**/regress_log_*"
+ type: text/plain
+
+
+task:
+ name: Windows
+ only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*windows.*'
+ windows_container:
+ dockerfile: ci/docker/windows_vs_2019
+ cpu: 4
+ memory: 4G
+ env:
+ PROVE_FLAGS: -j10
+ # The default working dir is in a directory msbuild complains about
+ CIRRUS_WORKING_DIR: "c:/cirrus"
+ TEMP_CONFIG: ${CIRRUS_WORKING_DIR}/ci/pg_ci_base.conf
+ # Avoid re-installing over and over
+ NO_TEMP_INSTALL: 1
+
+ sysinfo_script:
+ - chcp
+ - systeminfo
+ - powershell -Command get-psdrive -psprovider filesystem
+ - ps: Get-Item -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AeDebug'
+ - set
+
+ configure_script:
+ - copy ci\windows_build_config.pl src\tools\msvc\config.pl
+ - vcvarsall x64
+ - perl src/tools/msvc/mkvcbuild.pl
+ build_script:
+ - vcvarsall x64
+ # Disable file tracker, we're never going to rebuild...
+ - msbuild -m /p:TrackFileAccess=false pgsql.sln
+ tempinstall_script:
+ # Installation on windows currently only completely works from src\tools\msvc
+ - cd src\tools\msvc && perl .\install.pl %CIRRUS_WORKING_DIR%\tmp_install
+
+ check_test_script:
+ - perl src/tools/msvc/vcregress.pl check parallel
+ startcreate_test_script:
+ - tmp_install\bin\pg_ctl.exe initdb -D tmp_check\db -l tmp_check\initdb.log
+ - echo include '%TEMP_CONFIG%' >> tmp_check\db\postgresql.conf
+ - tmp_install\bin\pg_ctl.exe start -D tmp_check\db -l tmp_check\postmaster.log
+ plcheck_test_script:
+ - perl src/tools/msvc/vcregress.pl plcheck
+ isolationcheck_test_script:
+ - perl src/tools/msvc/vcregress.pl isolationcheck
+ modulescheck_test_script:
+ - perl src/tools/msvc/vcregress.pl modulescheck
+ contribcheck_test_script:
+ - perl src/tools/msvc/vcregress.pl contribcheck
+ stop_test_script:
+ - tmp_install\bin\pg_ctl.exe stop -D tmp_check\db -l tmp_check\postmaster.log
+ ssl_test_script:
+ - set with_ssl=openssl
+ - perl src/tools/msvc/vcregress.pl taptest .\src\test\ssl\
+ subscriptioncheck_test_script:
+ - perl src/tools/msvc/vcregress.pl taptest .\src\test\subscription\
+ authentication_test_script:
+ - perl src/tools/msvc/vcregress.pl taptest .\src\test\authentication\
+ recoverycheck_test_script:
+ - perl src/tools/msvc/vcregress.pl recoverycheck
+ bincheck_test_script:
+ - perl src/tools/msvc/vcregress.pl bincheck
+ upgradecheck_test_script:
+ - perl src/tools/msvc/vcregress.pl upgradecheck
+ ecpgcheck_test_script:
+ # tries to build additional stuff
+ - vcvarsall x64
+ # References ecpg_regression.proj in the current dir
+ - cd src\tools\msvc
+ - perl vcregress.pl ecpgcheck
+
+ always:
+ cores_script:
+ - cat crashlog.txt || true
+ dump_artifacts:
+ path: "crashlog.txt"
+ type: text/plain
+
+ on_failure:
+ log_artifacts:
+ path: "**/**.log"
+ type: text/plain
+ regress_diffs_artifacts:
+ path: "**/**.diffs"
+ type: text/plain
+ tap_artifacts:
+ path: "**/regress_log_*"
+ type: text/plain
+
+
+task:
+ name: CompilerWarnings
+ depends_on:
+ - Linux
+ # task that did not run count as a success, so we need to recheck Linux' condition here :/
+ only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*linux.*'
+ container:
+ dockerfile: ci/docker/linux_debian_bullseye
+ env:
+ CCACHE_SIZE: "4GB"
+ CCACHE_DIR: "/tmp/ccache_dir"
+ ccache_cache:
+ folder: "/tmp/ccache_dir"
+ setup_script:
+ - echo "COPT=-Werror" > src/Makefile.custom
+ - gcc -v
+ - clang -v
+ # gcc with asserts disabled
+ always:
+ gcc_warning_script:
+ - ./configure --cache gcc.cache CC="ccache gcc"
+ - time make -s -j4 clean && time make -s -j4
+ # gcc with asserts enabled
+ always:
+ gcc_a_warning_script:
+ - ./configure --cache gcc.cache --enable-cassert CC="ccache gcc"
+ - time make -s -j4 clean && time make -s -j4
+ # clang with asserts disabled
+ always:
+ clang_warning_script:
+ - ./configure --cache clang.cache CC="ccache clang"
+ - time make -s -j4 clean && time make -s -j4
+ # clang with asserts enabled
+ always:
+ clang_a_warning_script:
+ - ./configure --cache clang.cache --enable-cassert CC="ccache clang"
+ - time make -s -j4 clean && time make -s -j4
diff --git a/.dockerignore b/.dockerignore
new file mode 100644
index 00000000000..3fceab2e97b
--- /dev/null
+++ b/.dockerignore
@@ -0,0 +1,3 @@
+# Ignore everything, except ci/
+*
+!ci/*
diff --git a/ci/docker/linux_debian_bullseye b/ci/docker/linux_debian_bullseye
new file mode 100644
index 00000000000..f6c1782f16b
--- /dev/null
+++ b/ci/docker/linux_debian_bullseye
@@ -0,0 +1,13 @@
+FROM debian:bullseye
+RUN \
+ apt-get -y update && \
+ apt-get -y upgrade && \
+ DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
+ git build-essential gcc g++ libreadline-dev flex bison make perl libipc-run-perl \
+ libio-pty-perl clang llvm-dev libperl-dev libpython3-dev tcl-dev libldap2-dev \
+ libicu-dev docbook-xml docbook-xsl fop libxml2-utils xsltproc krb5-admin-server \
+ krb5-kdc krb5-user slapd ldap-utils libssl-dev pkg-config locales-all liblz4-dev \
+ libsystemd-dev libxml2-dev libxslt1-dev python3-dev libkrb5-dev libpam-dev \
+ libkrb5-*-heimdal uuid-dev gettext \
+ liburing-dev python3-distutils ccache && \
+ apt-get clean
diff --git a/ci/docker/windows_vs_2019 b/ci/docker/windows_vs_2019
new file mode 100644
index 00000000000..a4fcaceae96
--- /dev/null
+++ b/ci/docker/windows_vs_2019
@@ -0,0 +1,111 @@
+# escape=`
+
+# We used to use the visual studio container, but it's too outdated now
+FROM cirrusci/windowsservercore:2019
+
+SHELL ["powershell", "-NoLogo", "-NoProfile", "-Command"]
+
+
+# Install commandline debugger and log all crashes to c:\cirrus\crashlog.txt
+#
+# Done manually as doing this via chocolatey / the installer directly, ends up
+# with a lot of unnecessary chaff, making the layer unnecessarily large.
+RUN `
+ mkdir c:\t ; `
+ cd c:\t ; `
+ `
+ setx PATH \"C:\Windows Kits\10\Debuggers\x64;$Env:PATH\" /m ; `
+ `
+ curl.exe -sSL -o 'windsdksetup.exe' https://download.microsoft.com/download/9/7/9/97982c1d-d687-41be-9dd3-6d01e52ceb68/windowssdk/winsdksetup.exe ; `
+ echo 'starting sdk installation (for debugger)' ; `
+ Start-Process -Wait -FilePath ".\windsdksetup.exe" `
+ -ArgumentList '/Features OptionId.WindowsDesktopDebuggers /layout c:\t\sdk /quiet /norestart /log c:\t\sdk.log' `
+ ; `
+ `
+ Start-Process -Wait -FilePath msiexec.exe `
+ -ArgumentList '/a \"C:\t\sdk\Installers\X64 Debuggers And Tools-x64_en-us.msi\" /qb /log install2.log' `
+ ; `
+ C:\Windows` Kits\10\Debuggers\x64\cdb.exe -version ; `
+ `
+ Set-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AeDebug' -Name 'Debugger' -Value '\"C:\Windows Kits\10\Debuggers\x64\cdb.exe\" -p %ld -e %ld -g -kqm -c \".lines -e; .symfix+ ;.logappend c:\cirrus\crashlog.txt ; !peb; ~*kP ; .logclose ; q \"' ; `
+ New-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AeDebug' -Name 'Auto' -Value 1 -PropertyType DWord ; `
+ Get-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AeDebug' -Name Debugger ; `
+ `
+ cd c:\ ; `
+ Remove-Item C:\t -Force -Recurse
+
+
+# Install perl, python, flex and bison.
+#
+# Done manually as choco takes a lot longer. I think it's download issues with
+# powershell's download stuff? That's wy curl.exe is directly used here at least...
+#
+# Using perl 5.26.3.1 for now, as newer versions don't currently work correctly
+RUN `
+ mkdir c:\t ; `
+ cd c:\t ; `
+ `
+ echo 'adding to path, before setup below, so changes are not overwritten' ; `
+ setx PATH \"C:\strawberry\perl\bin;C:\winflexbison;C:\Program Files\Git\usr\bin;$Env:PATH\" /m ; `
+ `
+ curl.exe -sSL -o perl.zip `
+ https://strawberryperl.com/download/5.26.3.1/strawberry-perl-5.26.3.1-64bit-portable.zip ; `
+ echo 'installing perl' ; `
+ 7z.exe x .\perl.zip -xr!c -oc:\strawberry ; `
+ `
+ curl.exe -sSL -o python.exe https://www.python.org/ftp/python/3.10.0/python-3.10.0-amd64.exe ; `
+ echo 'installing python' ; `
+ Start-Process -Wait -FilePath ".\python.exe" `
+ -ArgumentList `
+ '/quiet', 'SimpleInstall=1', 'PrependPath=1', 'CompileAll=1', `
+ 'TargetDir=c:\python\', 'InstallAllUsers=1', 'Shortcuts=0', `
+ 'Include_docs=0', 'Include_tcltk=0', 'Include_tests=0' `
+ ; `
+ `
+ curl.exe -sSL -o winflexbison.zip `
+ https://github.com/lexxmark/winflexbison/releases/download/v2.5.24/win_flex_bison-2.5.24.zip ; `
+ echo 'installing winflexbison' ; `
+ 7z.exe x .\winflexbison.zip -oc:\winflexbison ; `
+ Rename-Item -Path c:\winflexbison\win_flex.exe c:\winflexbison\flex.exe ; `
+ Rename-Item -Path c:\winflexbison\win_bison.exe c:\winflexbison\bison.exe ; `
+ `
+ cd c:\ ; `
+ Remove-Item C:\t -Force -Recurse
+
+
+# Install openssl
+RUN `
+ mkdir c:\t ; `
+ cd c:\t ; `
+ `
+ curl.exe -o openssl-setup.exe -sSL https://slproweb.com/download/Win64OpenSSL-1_1_1L.exe ; `
+ echo 'staring openssl installation' ; `
+ Start-Process -Wait -FilePath ".\openssl-setup.exe" `
+ -ArgumentList '/DIR=c:\openssl\1.1.1l\ /VERYSILENT /SP- /SUPPRESSMSGBOXES' ; `
+ `
+ cd c:\ ; `
+ Remove-Item C:\t -Force -Recurse
+
+
+# Install visual studio
+#
+# Adding VS path to vcvarsall.bat so user of container doesn't need to know the full path
+RUN `
+ mkdir c:\t ; `
+ cd c:\t ; `
+ setx PATH \"c:\BuildTools\VC\Auxiliary\Build;$Env:PATH\" /m ; `
+ `
+ curl.exe -sSL -o c:\t\vs_buildtools.exe https://aka.ms/vs/16/release/vs_buildtools.exe ; `
+ echo 'starting visual studio installation' ; `
+ Start-Process -Wait `
+ -FilePath c:\t\vs_buildtools.exe `
+ -ArgumentList `
+ '--quiet', '--wait', '--norestart', '--nocache', `
+ '--installPath', 'c:\BuildTools', `
+ '--add', 'Microsoft.VisualStudio.Component.VC.Tools.x86.x64', `
+ '--add', 'Microsoft.VisualStudio.Component.Windows10SDK.20348' ; `
+ `
+ cd c:\ ; `
+ Remove-Item C:\t -Force -Recurse ; `
+ Remove-Item -Force -Recurse ${Env:TEMP}\*; `
+ Remove-Item -Force -Recurse \"${Env:ProgramData}\Package Cache\"
diff --git a/ci/freebsd_gcp_repartition.sh b/ci/freebsd_gcp_repartition.sh
new file mode 100755
index 00000000000..2d5e1738998
--- /dev/null
+++ b/ci/freebsd_gcp_repartition.sh
@@ -0,0 +1,28 @@
+#!/bin/sh
+
+set -e
+set -x
+
+# The default filesystem on freebsd gcp images is very slow to run tests on,
+# due to its 32KB block size
+#
+# XXX: It'd probably better to fix this in the image, using something like
+# https://people.freebsd.org/~lidl/blog/re-root.html
+
+# fix backup partition table after resize
+gpart recover da0
+gpart show da0
+# kill swap, so we can delete a partition
+swapoff -a || true
+# (apparently we can only have 4!?)
+gpart delete -i 3 da0
+gpart add -t freebsd-ufs -l data8k -a 4096 da0
+gpart show da0
+newfs -U -b 8192 /dev/da0p3
+
+# Migrate working directory
+du -hs $CIRRUS_WORKING_DIR
+mv $CIRRUS_WORKING_DIR $CIRRUS_WORKING_DIR.orig
+mkdir $CIRRUS_WORKING_DIR
+mount -o noatime /dev/da0p3 $CIRRUS_WORKING_DIR
+cp -r $CIRRUS_WORKING_DIR.orig/* $CIRRUS_WORKING_DIR/
diff --git a/ci/pg_ci_base.conf b/ci/pg_ci_base.conf
new file mode 100644
index 00000000000..637e3cfb343
--- /dev/null
+++ b/ci/pg_ci_base.conf
@@ -0,0 +1,12 @@
+# Tends to produce too many core files, taking a long time
+restart_after_crash = false
+
+# So that tests using the "manually" started postgres on windows can use
+# prepared statements
+max_prepared_transactions=10
+
+# Settings that make logs more useful
+log_line_prefix='%m [%p][%b][%v:%x] '
+log_checkpoints = true
+log_connections = true
+log_disconnections = true
diff --git a/ci/windows_build_config.pl b/ci/windows_build_config.pl
new file mode 100644
index 00000000000..ba82b13d69a
--- /dev/null
+++ b/ci/windows_build_config.pl
@@ -0,0 +1,13 @@
+use strict;
+use warnings;
+
+our $config;
+
+$config->{"tap_tests"} = 1;
+$config->{"asserts"} = 1;
+
+$config->{"openssl"} = "c:/openssl/1.1.1l/";
+$config->{"perl"} = "c:/strawberry/perl/";
+$config->{"python"} = "c:/python/";
+
+1;
--
2.23.0.385.gbc12974a89
v5-0003-plpython-Drop-support-python2.patch (text/x-diff; charset=us-ascii)
From 96cb64239dcee91060a038bbad47047bd8b1b1c3 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Sun, 3 Oct 2021 10:56:21 -0700
Subject: [PATCH v5 03/16] plpython: Drop support python2.
Author:
Reviewed-By:
Discussion: https://postgr.es/m/20211031184548.g4sxfe47n2kyi55r@alap3.anarazel.de
Backpatch:
---
.cirrus.yml | 3 +-
config/python.m4 | 4 +-
configure | 4 +-
contrib/hstore_plpython/Makefile | 10 +-
.../expected/hstore_plpython.out | 22 +-
.../hstore_plpython2u--1.0.sql | 19 -
.../hstore_plpython/hstore_plpython2u.control | 6 -
.../hstore_plpython/hstore_plpythonu--1.0.sql | 19 -
.../hstore_plpython/hstore_plpythonu.control | 6 -
.../hstore_plpython/sql/hstore_plpython.sql | 18 +-
contrib/jsonb_plpython/Makefile | 11 +-
.../expected/jsonb_plpython.out | 32 +-
.../jsonb_plpython/jsonb_plpython2u--1.0.sql | 19 -
.../jsonb_plpython/jsonb_plpython2u.control | 6 -
.../jsonb_plpython/jsonb_plpythonu--1.0.sql | 19 -
.../jsonb_plpython/jsonb_plpythonu.control | 6 -
contrib/jsonb_plpython/sql/jsonb_plpython.sql | 30 +-
contrib/ltree_plpython/Makefile | 10 +-
.../expected/ltree_plpython.out | 10 +-
.../ltree_plpython/ltree_plpython2u--1.0.sql | 12 -
.../ltree_plpython/ltree_plpython2u.control | 6 -
.../ltree_plpython/ltree_plpythonu--1.0.sql | 12 -
.../ltree_plpython/ltree_plpythonu.control | 6 -
contrib/ltree_plpython/sql/ltree_plpython.sql | 8 +-
src/pl/plpython/Makefile | 14 -
src/pl/plpython/expected/plpython_call.out | 12 +-
.../plpython/expected/plpython_composite.out | 32 +-
src/pl/plpython/expected/plpython_do.out | 8 +-
src/pl/plpython/expected/plpython_drop.out | 3 +-
src/pl/plpython/expected/plpython_ereport.out | 22 +-
src/pl/plpython/expected/plpython_error.out | 52 +-
src/pl/plpython/expected/plpython_error_5.out | 447 --------
src/pl/plpython/expected/plpython_global.out | 6 +-
src/pl/plpython/expected/plpython_import.out | 8 +-
src/pl/plpython/expected/plpython_newline.out | 6 +-
src/pl/plpython/expected/plpython_params.out | 8 +-
src/pl/plpython/expected/plpython_quote.out | 2 +-
src/pl/plpython/expected/plpython_record.out | 18 +-
src/pl/plpython/expected/plpython_setof.out | 18 +-
src/pl/plpython/expected/plpython_spi.out | 48 +-
.../expected/plpython_subtransaction.out | 38 +-
src/pl/plpython/expected/plpython_test.out | 12 +-
.../expected/plpython_transaction.out | 24 +-
src/pl/plpython/expected/plpython_trigger.out | 46 +-
src/pl/plpython/expected/plpython_types.out | 230 ++--
src/pl/plpython/expected/plpython_types_3.out | 1009 -----------------
src/pl/plpython/expected/plpython_unicode.out | 16 +-
src/pl/plpython/expected/plpython_void.out | 6 +-
src/pl/plpython/plpy_cursorobject.c | 2 +-
src/pl/plpython/plpy_main.c | 55 +-
src/pl/plpython/plpy_plpymodule.c | 16 -
src/pl/plpython/plpy_plpymodule.h | 2 -
src/pl/plpython/plpy_resultobject.c | 8 -
src/pl/plpython/plpy_typeio.c | 8 -
src/pl/plpython/plpy_util.c | 3 -
src/pl/plpython/plpy_util.h | 2 -
src/pl/plpython/plpython.h | 13 +-
src/pl/plpython/plpython2u--1.0.sql | 17 -
src/pl/plpython/plpython2u.control | 7 -
src/pl/plpython/plpythonu--1.0.sql | 17 -
src/pl/plpython/plpythonu.control | 7 -
src/pl/plpython/regress-python3-mangle.mk | 38 -
src/pl/plpython/sql/plpython_call.sql | 12 +-
src/pl/plpython/sql/plpython_composite.sql | 32 +-
src/pl/plpython/sql/plpython_do.sql | 6 +-
src/pl/plpython/sql/plpython_drop.sql | 4 +-
src/pl/plpython/sql/plpython_ereport.sql | 22 +-
src/pl/plpython/sql/plpython_error.sql | 48 +-
src/pl/plpython/sql/plpython_global.sql | 6 +-
src/pl/plpython/sql/plpython_import.sql | 8 +-
src/pl/plpython/sql/plpython_newline.sql | 6 +-
src/pl/plpython/sql/plpython_params.sql | 8 +-
src/pl/plpython/sql/plpython_quote.sql | 2 +-
src/pl/plpython/sql/plpython_record.sql | 18 +-
src/pl/plpython/sql/plpython_setof.sql | 18 +-
src/pl/plpython/sql/plpython_spi.sql | 48 +-
.../plpython/sql/plpython_subtransaction.sql | 38 +-
src/pl/plpython/sql/plpython_test.sql | 12 +-
src/pl/plpython/sql/plpython_transaction.sql | 22 +-
src/pl/plpython/sql/plpython_trigger.sql | 46 +-
src/pl/plpython/sql/plpython_types.sql | 106 +-
src/pl/plpython/sql/plpython_unicode.sql | 16 +-
src/pl/plpython/sql/plpython_void.sql | 6 +-
83 files changed, 629 insertions(+), 2433 deletions(-)
delete mode 100644 contrib/hstore_plpython/hstore_plpython2u--1.0.sql
delete mode 100644 contrib/hstore_plpython/hstore_plpython2u.control
delete mode 100644 contrib/hstore_plpython/hstore_plpythonu--1.0.sql
delete mode 100644 contrib/hstore_plpython/hstore_plpythonu.control
delete mode 100644 contrib/jsonb_plpython/jsonb_plpython2u--1.0.sql
delete mode 100644 contrib/jsonb_plpython/jsonb_plpython2u.control
delete mode 100644 contrib/jsonb_plpython/jsonb_plpythonu--1.0.sql
delete mode 100644 contrib/jsonb_plpython/jsonb_plpythonu.control
delete mode 100644 contrib/ltree_plpython/ltree_plpython2u--1.0.sql
delete mode 100644 contrib/ltree_plpython/ltree_plpython2u.control
delete mode 100644 contrib/ltree_plpython/ltree_plpythonu--1.0.sql
delete mode 100644 contrib/ltree_plpython/ltree_plpythonu.control
delete mode 100644 src/pl/plpython/expected/plpython_error_5.out
delete mode 100644 src/pl/plpython/expected/plpython_types_3.out
delete mode 100644 src/pl/plpython/plpython2u--1.0.sql
delete mode 100644 src/pl/plpython/plpython2u.control
delete mode 100644 src/pl/plpython/plpythonu--1.0.sql
delete mode 100644 src/pl/plpython/plpythonu.control
delete mode 100644 src/pl/plpython/regress-python3-mangle.mk
diff --git a/.cirrus.yml b/.cirrus.yml
index f75bdce6dec..2bb6f4a14d7 100644
--- a/.cirrus.yml
+++ b/.cirrus.yml
@@ -240,7 +240,8 @@ task:
--with-tcl --with-tclconfig=/usr/local/opt/tcl-tk/lib/ \
--with-uuid=e2fs \
\
- CC="ccache gcc" CFLAGS="-O0 -ggdb"
+ CC="ccache gcc" CFLAGS="-O0 -ggdb" \
+ PYTHON=python3
build_script:
- gmake -s -j12 && gmake -s -j12 -C contrib
upload_caches:
diff --git a/config/python.m4 b/config/python.m4
index d41aeb2876a..f51d23c3d43 100644
--- a/config/python.m4
+++ b/config/python.m4
@@ -37,8 +37,8 @@ python_majorversion=`echo "$python_fullversion" | sed '[s/^\([0-9]*\).*/\1/]'`
python_minorversion=`echo "$python_fullversion" | sed '[s/^[0-9]*\.\([0-9]*\).*/\1/]'`
python_version=`echo "$python_fullversion" | sed '[s/^\([0-9]*\.[0-9]*\).*/\1/]'`
# Reject unsupported Python versions as soon as practical.
-if test "$python_majorversion" -lt 3 -a "$python_minorversion" -lt 6; then
- AC_MSG_ERROR([Python version $python_version is too old (version 2.6 or later is required)])
+if test "$python_majorversion" -lt 3; then
+ AC_MSG_ERROR([Python version $python_version is too old (version 3 or later is required)])
fi
AC_MSG_CHECKING([for Python distutils module])
diff --git a/configure b/configure
index 4ffefe46552..1b5fd12a432 100755
--- a/configure
+++ b/configure
@@ -10123,8 +10123,8 @@ python_majorversion=`echo "$python_fullversion" | sed 's/^\([0-9]*\).*/\1/'`
python_minorversion=`echo "$python_fullversion" | sed 's/^[0-9]*\.\([0-9]*\).*/\1/'`
python_version=`echo "$python_fullversion" | sed 's/^\([0-9]*\.[0-9]*\).*/\1/'`
# Reject unsupported Python versions as soon as practical.
-if test "$python_majorversion" -lt 3 -a "$python_minorversion" -lt 6; then
- as_fn_error $? "Python version $python_version is too old (version 2.6 or later is required)" "$LINENO" 5
+if test "$python_majorversion" -lt 3; then
+ as_fn_error $? "Python version $python_version is too old (version 3 or later is required)" "$LINENO" 5
fi
{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for Python distutils module" >&5
diff --git a/contrib/hstore_plpython/Makefile b/contrib/hstore_plpython/Makefile
index 6af097ae68b..9d88cda1d0f 100644
--- a/contrib/hstore_plpython/Makefile
+++ b/contrib/hstore_plpython/Makefile
@@ -6,11 +6,10 @@ OBJS = \
hstore_plpython.o
PGFILEDESC = "hstore_plpython - hstore transform for plpython"
-EXTENSION = hstore_plpythonu hstore_plpython2u hstore_plpython3u
-DATA = hstore_plpythonu--1.0.sql hstore_plpython2u--1.0.sql hstore_plpython3u--1.0.sql
+EXTENSION = hstore_plpython3u
+DATA = hstore_plpython3u--1.0.sql
REGRESS = hstore_plpython
-REGRESS_PLPYTHON3_MANGLE := $(REGRESS)
PG_CPPFLAGS = $(python_includespec) -DPLPYTHON_LIBNAME='"plpython$(python_majorversion)"'
@@ -37,9 +36,4 @@ SHLIB_LINK += $(python_libspec) $(python_additional_libs)
endif
REGRESS_OPTS += --load-extension=hstore
-ifeq ($(python_majorversion),2)
-REGRESS_OPTS += --load-extension=plpythonu --load-extension=hstore_plpythonu
-endif
EXTRA_INSTALL += contrib/hstore
-
-include $(top_srcdir)/src/pl/plpython/regress-python3-mangle.mk
diff --git a/contrib/hstore_plpython/expected/hstore_plpython.out b/contrib/hstore_plpython/expected/hstore_plpython.out
index ecf1dd61bc1..bf238701fec 100644
--- a/contrib/hstore_plpython/expected/hstore_plpython.out
+++ b/contrib/hstore_plpython/expected/hstore_plpython.out
@@ -1,8 +1,8 @@
-CREATE EXTENSION hstore_plpython2u CASCADE;
-NOTICE: installing required extension "plpython2u"
+CREATE EXTENSION hstore_plpython3u CASCADE;
+NOTICE: installing required extension "plpython3u"
-- test hstore -> python
CREATE FUNCTION test1(val hstore) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE hstore
AS $$
assert isinstance(val, dict)
@@ -18,7 +18,7 @@ INFO: [('aa', 'bb'), ('cc', None)]
-- the same with the versioned language name
CREATE FUNCTION test1n(val hstore) RETURNS int
-LANGUAGE plpython2u
+LANGUAGE plpython3u
TRANSFORM FOR TYPE hstore
AS $$
assert isinstance(val, dict)
@@ -34,7 +34,7 @@ INFO: [('aa', 'bb'), ('cc', None)]
-- test hstore[] -> python
CREATE FUNCTION test1arr(val hstore[]) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE hstore
AS $$
assert(val == [{'aa': 'bb', 'cc': None}, {'dd': 'ee'}])
@@ -48,7 +48,7 @@ SELECT test1arr(array['aa=>bb, cc=>NULL'::hstore, 'dd=>ee']);
-- test python -> hstore
CREATE FUNCTION test2(a int, b text) RETURNS hstore
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE hstore
AS $$
val = {'a': a, 'b': b, 'c': None}
@@ -65,14 +65,14 @@ SELECT test2(1, 'boo');
CREATE OR REPLACE FUNCTION public.test2(a integer, b text)
RETURNS hstore
TRANSFORM FOR TYPE hstore
- LANGUAGE plpythonu
+ LANGUAGE plpython3u
AS $function$
val = {'a': a, 'b': b, 'c': None}
return val
$function$
-- test python -> hstore[]
CREATE FUNCTION test2arr() RETURNS hstore[]
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE hstore
AS $$
val = [{'a': 1, 'b': 'boo', 'c': None}, {'d': 2}]
@@ -87,7 +87,7 @@ SELECT test2arr();
-- test python -> domain over hstore
CREATE DOMAIN hstore_foo AS hstore CHECK(VALUE ? 'foo');
CREATE FUNCTION test2dom(fn text) RETURNS hstore_foo
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE hstore
AS $$
return {'a': 1, fn: 'boo', 'c': None}
@@ -104,7 +104,7 @@ CONTEXT: while creating return value
PL/Python function "test2dom"
-- test as part of prepare/execute
CREATE FUNCTION test3() RETURNS void
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE hstore
AS $$
rv = plpy.execute("SELECT 'aa=>bb, cc=>NULL'::hstore AS col1")
@@ -131,7 +131,7 @@ SELECT * FROM test1;
(1 row)
CREATE FUNCTION test4() RETURNS trigger
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE hstore
AS $$
assert(TD["new"] == {'a': 1, 'b': {'aa': 'bb', 'cc': None}})
diff --git a/contrib/hstore_plpython/hstore_plpython2u--1.0.sql b/contrib/hstore_plpython/hstore_plpython2u--1.0.sql
deleted file mode 100644
index 800765f3f0c..00000000000
--- a/contrib/hstore_plpython/hstore_plpython2u--1.0.sql
+++ /dev/null
@@ -1,19 +0,0 @@
-/* contrib/hstore_plpython/hstore_plpython2u--1.0.sql */
-
--- complain if script is sourced in psql, rather than via CREATE EXTENSION
-\echo Use "CREATE EXTENSION hstore_plpython2u" to load this file. \quit
-
-CREATE FUNCTION hstore_to_plpython2(val internal) RETURNS internal
-LANGUAGE C STRICT IMMUTABLE
-AS 'MODULE_PATHNAME', 'hstore_to_plpython';
-
-CREATE FUNCTION plpython2_to_hstore(val internal) RETURNS hstore
-LANGUAGE C STRICT IMMUTABLE
-AS 'MODULE_PATHNAME', 'plpython_to_hstore';
-
-CREATE TRANSFORM FOR hstore LANGUAGE plpython2u (
- FROM SQL WITH FUNCTION hstore_to_plpython2(internal),
- TO SQL WITH FUNCTION plpython2_to_hstore(internal)
-);
-
-COMMENT ON TRANSFORM FOR hstore LANGUAGE plpython2u IS 'transform between hstore and Python dict';
diff --git a/contrib/hstore_plpython/hstore_plpython2u.control b/contrib/hstore_plpython/hstore_plpython2u.control
deleted file mode 100644
index ed905671123..00000000000
--- a/contrib/hstore_plpython/hstore_plpython2u.control
+++ /dev/null
@@ -1,6 +0,0 @@
-# hstore_plpython2u extension
-comment = 'transform between hstore and plpython2u'
-default_version = '1.0'
-module_pathname = '$libdir/hstore_plpython2'
-relocatable = true
-requires = 'hstore,plpython2u'
diff --git a/contrib/hstore_plpython/hstore_plpythonu--1.0.sql b/contrib/hstore_plpython/hstore_plpythonu--1.0.sql
deleted file mode 100644
index 52832912abc..00000000000
--- a/contrib/hstore_plpython/hstore_plpythonu--1.0.sql
+++ /dev/null
@@ -1,19 +0,0 @@
-/* contrib/hstore_plpython/hstore_plpythonu--1.0.sql */
-
--- complain if script is sourced in psql, rather than via CREATE EXTENSION
-\echo Use "CREATE EXTENSION hstore_plpythonu" to load this file. \quit
-
-CREATE FUNCTION hstore_to_plpython(val internal) RETURNS internal
-LANGUAGE C STRICT IMMUTABLE
-AS 'MODULE_PATHNAME';
-
-CREATE FUNCTION plpython_to_hstore(val internal) RETURNS hstore
-LANGUAGE C STRICT IMMUTABLE
-AS 'MODULE_PATHNAME';
-
-CREATE TRANSFORM FOR hstore LANGUAGE plpythonu (
- FROM SQL WITH FUNCTION hstore_to_plpython(internal),
- TO SQL WITH FUNCTION plpython_to_hstore(internal)
-);
-
-COMMENT ON TRANSFORM FOR hstore LANGUAGE plpythonu IS 'transform between hstore and Python dict';
diff --git a/contrib/hstore_plpython/hstore_plpythonu.control b/contrib/hstore_plpython/hstore_plpythonu.control
deleted file mode 100644
index 8e9b35e43bf..00000000000
--- a/contrib/hstore_plpython/hstore_plpythonu.control
+++ /dev/null
@@ -1,6 +0,0 @@
-# hstore_plpythonu extension
-comment = 'transform between hstore and plpythonu'
-default_version = '1.0'
-module_pathname = '$libdir/hstore_plpython2'
-relocatable = true
-requires = 'hstore,plpythonu'
diff --git a/contrib/hstore_plpython/sql/hstore_plpython.sql b/contrib/hstore_plpython/sql/hstore_plpython.sql
index b6d98b7dd53..a9cfbbe13e2 100644
--- a/contrib/hstore_plpython/sql/hstore_plpython.sql
+++ b/contrib/hstore_plpython/sql/hstore_plpython.sql
@@ -1,9 +1,9 @@
-CREATE EXTENSION hstore_plpython2u CASCADE;
+CREATE EXTENSION hstore_plpython3u CASCADE;
-- test hstore -> python
CREATE FUNCTION test1(val hstore) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE hstore
AS $$
assert isinstance(val, dict)
@@ -16,7 +16,7 @@ SELECT test1('aa=>bb, cc=>NULL'::hstore);
-- the same with the versioned language name
CREATE FUNCTION test1n(val hstore) RETURNS int
-LANGUAGE plpython2u
+LANGUAGE plpython3u
TRANSFORM FOR TYPE hstore
AS $$
assert isinstance(val, dict)
@@ -29,7 +29,7 @@ SELECT test1n('aa=>bb, cc=>NULL'::hstore);
-- test hstore[] -> python
CREATE FUNCTION test1arr(val hstore[]) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE hstore
AS $$
assert(val == [{'aa': 'bb', 'cc': None}, {'dd': 'ee'}])
@@ -41,7 +41,7 @@ SELECT test1arr(array['aa=>bb, cc=>NULL'::hstore, 'dd=>ee']);
-- test python -> hstore
CREATE FUNCTION test2(a int, b text) RETURNS hstore
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE hstore
AS $$
val = {'a': a, 'b': b, 'c': None}
@@ -56,7 +56,7 @@ SELECT test2(1, 'boo');
-- test python -> hstore[]
CREATE FUNCTION test2arr() RETURNS hstore[]
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE hstore
AS $$
val = [{'a': 1, 'b': 'boo', 'c': None}, {'d': 2}]
@@ -70,7 +70,7 @@ SELECT test2arr();
CREATE DOMAIN hstore_foo AS hstore CHECK(VALUE ? 'foo');
CREATE FUNCTION test2dom(fn text) RETURNS hstore_foo
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE hstore
AS $$
return {'a': 1, fn: 'boo', 'c': None}
@@ -82,7 +82,7 @@ SELECT test2dom('bar'); -- fail
-- test as part of prepare/execute
CREATE FUNCTION test3() RETURNS void
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE hstore
AS $$
rv = plpy.execute("SELECT 'aa=>bb, cc=>NULL'::hstore AS col1")
@@ -103,7 +103,7 @@ INSERT INTO test1 VALUES (1, 'aa=>bb, cc=>NULL');
SELECT * FROM test1;
CREATE FUNCTION test4() RETURNS trigger
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE hstore
AS $$
assert(TD["new"] == {'a': 1, 'b': {'aa': 'bb', 'cc': None}})
diff --git a/contrib/jsonb_plpython/Makefile b/contrib/jsonb_plpython/Makefile
index ca767418943..6333ea0bbaf 100644
--- a/contrib/jsonb_plpython/Makefile
+++ b/contrib/jsonb_plpython/Makefile
@@ -8,11 +8,10 @@ PGFILEDESC = "jsonb_plpython - transform between jsonb and plpythonu"
PG_CPPFLAGS = -I$(top_srcdir)/src/pl/plpython $(python_includespec) -DPLPYTHON_LIBNAME='"plpython$(python_majorversion)"'
-EXTENSION = jsonb_plpythonu jsonb_plpython2u jsonb_plpython3u
-DATA = jsonb_plpythonu--1.0.sql jsonb_plpython2u--1.0.sql jsonb_plpython3u--1.0.sql
+EXTENSION = jsonb_plpython3u
+DATA = jsonb_plpython3u--1.0.sql
REGRESS = jsonb_plpython
-REGRESS_PLPYTHON3_MANGLE := $(REGRESS)
ifdef USE_PGXS
PG_CONFIG = pg_config
@@ -33,9 +32,3 @@ else
rpathdir = $(python_libdir)
SHLIB_LINK += $(python_libspec) $(python_additional_libs)
endif
-
-ifeq ($(python_majorversion),2)
-REGRESS_OPTS += --load-extension=plpythonu --load-extension=jsonb_plpythonu
-endif
-
-include $(top_srcdir)/src/pl/plpython/regress-python3-mangle.mk
diff --git a/contrib/jsonb_plpython/expected/jsonb_plpython.out b/contrib/jsonb_plpython/expected/jsonb_plpython.out
index b491fe9cc68..cac963de69c 100644
--- a/contrib/jsonb_plpython/expected/jsonb_plpython.out
+++ b/contrib/jsonb_plpython/expected/jsonb_plpython.out
@@ -1,8 +1,8 @@
-CREATE EXTENSION jsonb_plpython2u CASCADE;
-NOTICE: installing required extension "plpython2u"
+CREATE EXTENSION jsonb_plpython3u CASCADE;
+NOTICE: installing required extension "plpython3u"
-- test jsonb -> python dict
CREATE FUNCTION test1(val jsonb) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
assert isinstance(val, dict)
@@ -18,7 +18,7 @@ SELECT test1('{"a": 1, "c": "NULL"}'::jsonb);
-- test jsonb -> python dict
-- complex dict with dicts as value
CREATE FUNCTION test1complex(val jsonb) RETURNS int
-LANGUAGE plpython2u
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
assert isinstance(val, dict)
@@ -34,7 +34,7 @@ SELECT test1complex('{"d": {"d": 1}}'::jsonb);
-- test jsonb[] -> python dict
-- dict with array as value
CREATE FUNCTION test1arr(val jsonb) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
assert isinstance(val, dict)
@@ -50,7 +50,7 @@ SELECT test1arr('{"d":[12, 1]}'::jsonb);
-- test jsonb[] -> python list
-- simple list
CREATE FUNCTION test2arr(val jsonb) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
assert isinstance(val, list)
@@ -66,7 +66,7 @@ SELECT test2arr('[12, 1]'::jsonb);
-- test jsonb[] -> python list
-- array of dicts
CREATE FUNCTION test3arr(val jsonb) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
assert isinstance(val, list)
@@ -81,7 +81,7 @@ SELECT test3arr('[{"a": 1, "b": 2}, {"c": 3,"d": 4}]'::jsonb);
-- test jsonb int -> python int
CREATE FUNCTION test1int(val jsonb) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
assert(val == 1)
@@ -95,7 +95,7 @@ SELECT test1int('1'::jsonb);
-- test jsonb string -> python string
CREATE FUNCTION test1string(val jsonb) RETURNS text
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
assert(val == "a")
@@ -109,7 +109,7 @@ SELECT test1string('"a"'::jsonb);
-- test jsonb null -> python None
CREATE FUNCTION test1null(val jsonb) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
assert(val == None)
@@ -123,7 +123,7 @@ SELECT test1null('null'::jsonb);
-- test python -> jsonb
CREATE FUNCTION roundtrip(val jsonb) RETURNS jsonb
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
as $$
return val
@@ -238,7 +238,7 @@ SELECT roundtrip('["string", "string2"]'::jsonb);
-- complex numbers -> jsonb
CREATE FUNCTION testComplexNumbers() RETURNS jsonb
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
x = 1 + 2j
@@ -250,7 +250,7 @@ CONTEXT: while creating return value
PL/Python function "testcomplexnumbers"
-- range -> jsonb
CREATE FUNCTION testRange() RETURNS jsonb
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
x = range(3)
@@ -264,7 +264,7 @@ SELECT testRange();
-- 0xff -> jsonb
CREATE FUNCTION testDecimal() RETURNS jsonb
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
x = 0xff
@@ -278,7 +278,7 @@ SELECT testDecimal();
-- tuple -> jsonb
CREATE FUNCTION testTuple() RETURNS jsonb
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
x = (1, 'String', None)
@@ -292,7 +292,7 @@ SELECT testTuple();
-- interesting dict -> jsonb
CREATE FUNCTION test_dict1() RETURNS jsonb
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
x = {"a": 1, None: 2, 33: 3}
diff --git a/contrib/jsonb_plpython/jsonb_plpython2u--1.0.sql b/contrib/jsonb_plpython/jsonb_plpython2u--1.0.sql
deleted file mode 100644
index 2526d14ee19..00000000000
--- a/contrib/jsonb_plpython/jsonb_plpython2u--1.0.sql
+++ /dev/null
@@ -1,19 +0,0 @@
-/* contrib/jsonb_plpython/jsonb_plpython2u--1.0.sql */
-
--- complain if script is sourced in psql, rather than via CREATE EXTENSION
-\echo Use "CREATE EXTENSION jsonb_plpython2u" to load this file. \quit
-
-CREATE FUNCTION jsonb_to_plpython2(val internal) RETURNS internal
-LANGUAGE C STRICT IMMUTABLE
-AS 'MODULE_PATHNAME', 'jsonb_to_plpython';
-
-CREATE FUNCTION plpython2_to_jsonb(val internal) RETURNS jsonb
-LANGUAGE C STRICT IMMUTABLE
-AS 'MODULE_PATHNAME', 'plpython_to_jsonb';
-
-CREATE TRANSFORM FOR jsonb LANGUAGE plpython2u (
- FROM SQL WITH FUNCTION jsonb_to_plpython2(internal),
- TO SQL WITH FUNCTION plpython2_to_jsonb(internal)
-);
-
-COMMENT ON TRANSFORM FOR jsonb LANGUAGE plpython2u IS 'transform between jsonb and Python';
diff --git a/contrib/jsonb_plpython/jsonb_plpython2u.control b/contrib/jsonb_plpython/jsonb_plpython2u.control
deleted file mode 100644
index d26368316b6..00000000000
--- a/contrib/jsonb_plpython/jsonb_plpython2u.control
+++ /dev/null
@@ -1,6 +0,0 @@
-# jsonb_plpython2u extension
-comment = 'transform between jsonb and plpython2u'
-default_version = '1.0'
-module_pathname = '$libdir/jsonb_plpython2'
-relocatable = true
-requires = 'plpython2u'
diff --git a/contrib/jsonb_plpython/jsonb_plpythonu--1.0.sql b/contrib/jsonb_plpython/jsonb_plpythonu--1.0.sql
deleted file mode 100644
index 3fa89885a63..00000000000
--- a/contrib/jsonb_plpython/jsonb_plpythonu--1.0.sql
+++ /dev/null
@@ -1,19 +0,0 @@
-/* contrib/jsonb_plpython/jsonb_plpythonu--1.0.sql */
-
--- complain if script is sourced in psql, rather than via CREATE EXTENSION
-\echo Use "CREATE EXTENSION jsonb_plpythonu" to load this file. \quit
-
-CREATE FUNCTION jsonb_to_plpython(val internal) RETURNS internal
-LANGUAGE C STRICT IMMUTABLE
-AS 'MODULE_PATHNAME';
-
-CREATE FUNCTION plpython_to_jsonb(val internal) RETURNS jsonb
-LANGUAGE C STRICT IMMUTABLE
-AS 'MODULE_PATHNAME';
-
-CREATE TRANSFORM FOR jsonb LANGUAGE plpythonu (
- FROM SQL WITH FUNCTION jsonb_to_plpython(internal),
- TO SQL WITH FUNCTION plpython_to_jsonb(internal)
-);
-
-COMMENT ON TRANSFORM FOR jsonb LANGUAGE plpythonu IS 'transform between jsonb and Python';
diff --git a/contrib/jsonb_plpython/jsonb_plpythonu.control b/contrib/jsonb_plpython/jsonb_plpythonu.control
deleted file mode 100644
index 6f8fa4f184b..00000000000
--- a/contrib/jsonb_plpython/jsonb_plpythonu.control
+++ /dev/null
@@ -1,6 +0,0 @@
-# jsonb_plpythonu extension
-comment = 'transform between jsonb and plpythonu'
-default_version = '1.0'
-module_pathname = '$libdir/jsonb_plpython2'
-relocatable = true
-requires = 'plpythonu'
diff --git a/contrib/jsonb_plpython/sql/jsonb_plpython.sql b/contrib/jsonb_plpython/sql/jsonb_plpython.sql
index 2ee1bca0a98..29dc33279a0 100644
--- a/contrib/jsonb_plpython/sql/jsonb_plpython.sql
+++ b/contrib/jsonb_plpython/sql/jsonb_plpython.sql
@@ -1,8 +1,8 @@
-CREATE EXTENSION jsonb_plpython2u CASCADE;
+CREATE EXTENSION jsonb_plpython3u CASCADE;
-- test jsonb -> python dict
CREATE FUNCTION test1(val jsonb) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
assert isinstance(val, dict)
@@ -15,7 +15,7 @@ SELECT test1('{"a": 1, "c": "NULL"}'::jsonb);
-- test jsonb -> python dict
-- complex dict with dicts as value
CREATE FUNCTION test1complex(val jsonb) RETURNS int
-LANGUAGE plpython2u
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
assert isinstance(val, dict)
@@ -29,7 +29,7 @@ SELECT test1complex('{"d": {"d": 1}}'::jsonb);
-- test jsonb[] -> python dict
-- dict with array as value
CREATE FUNCTION test1arr(val jsonb) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
assert isinstance(val, dict)
@@ -42,7 +42,7 @@ SELECT test1arr('{"d":[12, 1]}'::jsonb);
-- test jsonb[] -> python list
-- simple list
CREATE FUNCTION test2arr(val jsonb) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
assert isinstance(val, list)
@@ -55,7 +55,7 @@ SELECT test2arr('[12, 1]'::jsonb);
-- test jsonb[] -> python list
-- array of dicts
CREATE FUNCTION test3arr(val jsonb) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
assert isinstance(val, list)
@@ -67,7 +67,7 @@ SELECT test3arr('[{"a": 1, "b": 2}, {"c": 3,"d": 4}]'::jsonb);
-- test jsonb int -> python int
CREATE FUNCTION test1int(val jsonb) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
assert(val == 1)
@@ -78,7 +78,7 @@ SELECT test1int('1'::jsonb);
-- test jsonb string -> python string
CREATE FUNCTION test1string(val jsonb) RETURNS text
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
assert(val == "a")
@@ -89,7 +89,7 @@ SELECT test1string('"a"'::jsonb);
-- test jsonb null -> python None
CREATE FUNCTION test1null(val jsonb) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
assert(val == None)
@@ -100,7 +100,7 @@ SELECT test1null('null'::jsonb);
-- test python -> jsonb
CREATE FUNCTION roundtrip(val jsonb) RETURNS jsonb
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
as $$
return val
@@ -129,7 +129,7 @@ SELECT roundtrip('["string", "string2"]'::jsonb);
-- complex numbers -> jsonb
CREATE FUNCTION testComplexNumbers() RETURNS jsonb
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
x = 1 + 2j
@@ -140,7 +140,7 @@ SELECT testComplexNumbers();
-- range -> jsonb
CREATE FUNCTION testRange() RETURNS jsonb
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
x = range(3)
@@ -151,7 +151,7 @@ SELECT testRange();
-- 0xff -> jsonb
CREATE FUNCTION testDecimal() RETURNS jsonb
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
x = 0xff
@@ -162,7 +162,7 @@ SELECT testDecimal();
-- tuple -> jsonb
CREATE FUNCTION testTuple() RETURNS jsonb
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
x = (1, 'String', None)
@@ -173,7 +173,7 @@ SELECT testTuple();
-- interesting dict -> jsonb
CREATE FUNCTION test_dict1() RETURNS jsonb
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE jsonb
AS $$
x = {"a": 1, None: 2, 33: 3}
diff --git a/contrib/ltree_plpython/Makefile b/contrib/ltree_plpython/Makefile
index 12a01467721..406d2789c9c 100644
--- a/contrib/ltree_plpython/Makefile
+++ b/contrib/ltree_plpython/Makefile
@@ -6,11 +6,10 @@ OBJS = \
ltree_plpython.o
PGFILEDESC = "ltree_plpython - ltree transform for plpython"
-EXTENSION = ltree_plpythonu ltree_plpython2u ltree_plpython3u
-DATA = ltree_plpythonu--1.0.sql ltree_plpython2u--1.0.sql ltree_plpython3u--1.0.sql
+EXTENSION = ltree_plpython3u
+DATA = ltree_plpython3u--1.0.sql
REGRESS = ltree_plpython
-REGRESS_PLPYTHON3_MANGLE := $(REGRESS)
PG_CPPFLAGS = $(python_includespec) -DPLPYTHON_LIBNAME='"plpython$(python_majorversion)"'
@@ -37,9 +36,4 @@ SHLIB_LINK += $(python_libspec) $(python_additional_libs)
endif
REGRESS_OPTS += --load-extension=ltree
-ifeq ($(python_majorversion),2)
-REGRESS_OPTS += --load-extension=plpythonu --load-extension=ltree_plpythonu
-endif
EXTRA_INSTALL += contrib/ltree
-
-include $(top_srcdir)/src/pl/plpython/regress-python3-mangle.mk
diff --git a/contrib/ltree_plpython/expected/ltree_plpython.out b/contrib/ltree_plpython/expected/ltree_plpython.out
index f28897fee48..bd32541fdb3 100644
--- a/contrib/ltree_plpython/expected/ltree_plpython.out
+++ b/contrib/ltree_plpython/expected/ltree_plpython.out
@@ -1,7 +1,7 @@
-CREATE EXTENSION ltree_plpython2u CASCADE;
-NOTICE: installing required extension "plpython2u"
+CREATE EXTENSION ltree_plpython3u CASCADE;
+NOTICE: installing required extension "plpython3u"
CREATE FUNCTION test1(val ltree) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE ltree
AS $$
plpy.info(repr(val))
@@ -15,7 +15,7 @@ INFO: ['aa', 'bb', 'cc']
(1 row)
CREATE FUNCTION test1n(val ltree) RETURNS int
-LANGUAGE plpython2u
+LANGUAGE plpython3u
TRANSFORM FOR TYPE ltree
AS $$
plpy.info(repr(val))
@@ -29,7 +29,7 @@ INFO: ['aa', 'bb', 'cc']
(1 row)
CREATE FUNCTION test2() RETURNS ltree
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE ltree
AS $$
return ['foo', 'bar', 'baz']
diff --git a/contrib/ltree_plpython/ltree_plpython2u--1.0.sql b/contrib/ltree_plpython/ltree_plpython2u--1.0.sql
deleted file mode 100644
index 5c4a7037013..00000000000
--- a/contrib/ltree_plpython/ltree_plpython2u--1.0.sql
+++ /dev/null
@@ -1,12 +0,0 @@
-/* contrib/ltree_plpython/ltree_plpython2u--1.0.sql */
-
--- complain if script is sourced in psql, rather than via CREATE EXTENSION
-\echo Use "CREATE EXTENSION ltree_plpython2u" to load this file. \quit
-
-CREATE FUNCTION ltree_to_plpython2(val internal) RETURNS internal
-LANGUAGE C STRICT IMMUTABLE
-AS 'MODULE_PATHNAME', 'ltree_to_plpython';
-
-CREATE TRANSFORM FOR ltree LANGUAGE plpython2u (
- FROM SQL WITH FUNCTION ltree_to_plpython2(internal)
-);
diff --git a/contrib/ltree_plpython/ltree_plpython2u.control b/contrib/ltree_plpython/ltree_plpython2u.control
deleted file mode 100644
index bedfd0acbad..00000000000
--- a/contrib/ltree_plpython/ltree_plpython2u.control
+++ /dev/null
@@ -1,6 +0,0 @@
-# ltree_plpython2u extension
-comment = 'transform between ltree and plpython2u'
-default_version = '1.0'
-module_pathname = '$libdir/ltree_plpython2'
-relocatable = true
-requires = 'ltree,plpython2u'
diff --git a/contrib/ltree_plpython/ltree_plpythonu--1.0.sql b/contrib/ltree_plpython/ltree_plpythonu--1.0.sql
deleted file mode 100644
index ee93edf28b9..00000000000
--- a/contrib/ltree_plpython/ltree_plpythonu--1.0.sql
+++ /dev/null
@@ -1,12 +0,0 @@
-/* contrib/ltree_plpython/ltree_plpythonu--1.0.sql */
-
--- complain if script is sourced in psql, rather than via CREATE EXTENSION
-\echo Use "CREATE EXTENSION ltree_plpythonu" to load this file. \quit
-
-CREATE FUNCTION ltree_to_plpython(val internal) RETURNS internal
-LANGUAGE C STRICT IMMUTABLE
-AS 'MODULE_PATHNAME';
-
-CREATE TRANSFORM FOR ltree LANGUAGE plpythonu (
- FROM SQL WITH FUNCTION ltree_to_plpython(internal)
-);
diff --git a/contrib/ltree_plpython/ltree_plpythonu.control b/contrib/ltree_plpython/ltree_plpythonu.control
deleted file mode 100644
index b03c89a2e6e..00000000000
--- a/contrib/ltree_plpython/ltree_plpythonu.control
+++ /dev/null
@@ -1,6 +0,0 @@
-# ltree_plpythonu extension
-comment = 'transform between ltree and plpythonu'
-default_version = '1.0'
-module_pathname = '$libdir/ltree_plpython2'
-relocatable = true
-requires = 'ltree,plpythonu'
diff --git a/contrib/ltree_plpython/sql/ltree_plpython.sql b/contrib/ltree_plpython/sql/ltree_plpython.sql
index 210f5428a5a..0b8d28399a6 100644
--- a/contrib/ltree_plpython/sql/ltree_plpython.sql
+++ b/contrib/ltree_plpython/sql/ltree_plpython.sql
@@ -1,8 +1,8 @@
-CREATE EXTENSION ltree_plpython2u CASCADE;
+CREATE EXTENSION ltree_plpython3u CASCADE;
CREATE FUNCTION test1(val ltree) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE ltree
AS $$
plpy.info(repr(val))
@@ -13,7 +13,7 @@ SELECT test1('aa.bb.cc'::ltree);
CREATE FUNCTION test1n(val ltree) RETURNS int
-LANGUAGE plpython2u
+LANGUAGE plpython3u
TRANSFORM FOR TYPE ltree
AS $$
plpy.info(repr(val))
@@ -24,7 +24,7 @@ SELECT test1n('aa.bb.cc'::ltree);
CREATE FUNCTION test2() RETURNS ltree
-LANGUAGE plpythonu
+LANGUAGE plpython3u
TRANSFORM FOR TYPE ltree
AS $$
return ['foo', 'bar', 'baz']
diff --git a/src/pl/plpython/Makefile b/src/pl/plpython/Makefile
index 9e95285af89..a8feacdef06 100644
--- a/src/pl/plpython/Makefile
+++ b/src/pl/plpython/Makefile
@@ -35,9 +35,6 @@ OBJS = \
plpy_util.o
DATA = $(NAME)u.control $(NAME)u--1.0.sql
-ifeq ($(python_majorversion),2)
-DATA += plpythonu.control plpythonu--1.0.sql
-endif
# header files to install - it's not clear which of these might be needed
# so install them all.
@@ -77,11 +74,6 @@ endif # win32
SHLIB_LINK = $(python_libspec) $(python_additional_libs) $(filter -lintl,$(LIBS))
REGRESS_OPTS = --dbname=$(PL_TESTDB)
-# Only load plpythonu with Python 2. The test files themselves load
-# the versioned language plpython(2|3)u.
-ifeq ($(python_majorversion),2)
-REGRESS_OPTS += --load-extension=plpythonu
-endif
REGRESS = \
plpython_schema \
@@ -108,8 +100,6 @@ REGRESS = \
plpython_transaction \
plpython_drop
-REGRESS_PLPYTHON3_MANGLE := $(REGRESS)
-
include $(top_srcdir)/src/Makefile.shlib
all: all-lib
@@ -127,7 +117,6 @@ uninstall: uninstall-lib uninstall-data
install-data: installdirs
$(INSTALL_DATA) $(addprefix $(srcdir)/, $(DATA)) '$(DESTDIR)$(datadir)/extension/'
$(INSTALL_DATA) $(addprefix $(srcdir)/, $(INCS)) '$(DESTDIR)$(includedir_server)'
- $(INSTALL_DATA) $(srcdir)/regress-python3-mangle.mk '$(DESTDIR)$(pgxsdir)/src/pl/plpython'
uninstall-data:
rm -f $(addprefix '$(DESTDIR)$(datadir)/extension'/, $(notdir $(DATA)))
@@ -136,9 +125,6 @@ uninstall-data:
.PHONY: install-data uninstall-data
-include $(srcdir)/regress-python3-mangle.mk
-
-
check: submake-pg-regress
$(pg_regress_check) $(REGRESS_OPTS) $(REGRESS)
diff --git a/src/pl/plpython/expected/plpython_call.out b/src/pl/plpython/expected/plpython_call.out
index 55e1027246a..4c0690067a0 100644
--- a/src/pl/plpython/expected/plpython_call.out
+++ b/src/pl/plpython/expected/plpython_call.out
@@ -2,14 +2,14 @@
-- Tests for procedures / CALL syntax
--
CREATE PROCEDURE test_proc1()
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
pass
$$;
CALL test_proc1();
-- error: can't return non-None
CREATE PROCEDURE test_proc2()
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
return 5
$$;
@@ -18,7 +18,7 @@ ERROR: PL/Python procedure did not return None
CONTEXT: PL/Python procedure "test_proc2"
CREATE TABLE test1 (a int);
CREATE PROCEDURE test_proc3(x int)
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
plpy.execute("INSERT INTO test1 VALUES (%s)" % x)
$$;
@@ -31,7 +31,7 @@ SELECT * FROM test1;
-- output arguments
CREATE PROCEDURE test_proc5(INOUT a text)
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
return [a + '+' + a]
$$;
@@ -42,7 +42,7 @@ CALL test_proc5('abc');
(1 row)
CREATE PROCEDURE test_proc6(a int, INOUT b int, INOUT c int)
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
return (b * a, c * a)
$$;
@@ -54,7 +54,7 @@ CALL test_proc6(2, 3, 4);
-- OUT parameters
CREATE PROCEDURE test_proc9(IN a int, OUT b int)
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
plpy.notice("a: %s" % (a))
return (a * 2,)
diff --git a/src/pl/plpython/expected/plpython_composite.out b/src/pl/plpython/expected/plpython_composite.out
index af801923343..bb101e07d53 100644
--- a/src/pl/plpython/expected/plpython_composite.out
+++ b/src/pl/plpython/expected/plpython_composite.out
@@ -1,6 +1,6 @@
CREATE FUNCTION multiout_simple(OUT i integer, OUT j integer) AS $$
return (1, 2)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT multiout_simple();
multiout_simple
-----------------
@@ -27,7 +27,7 @@ SELECT (multiout_simple()).j + 3;
CREATE FUNCTION multiout_simple_setof(n integer = 1, OUT integer, OUT integer) RETURNS SETOF record AS $$
return [(1, 2)] * n
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT multiout_simple_setof();
multiout_simple_setof
-----------------------
@@ -67,7 +67,7 @@ elif typ == 'obj':
return type_record
elif typ == 'str':
return "('%s',%r)" % (first, second)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM multiout_record_as('dict', 'foo', 1, 'f');
first | second
-------+--------
@@ -237,7 +237,7 @@ for i in range(n):
power = 2 ** i
length = plpy.execute("select length('%d')" % power)[0]['length']
yield power, length
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM multiout_setof(3);
power_of_2 | length
------------+--------
@@ -260,7 +260,7 @@ CREATE FUNCTION multiout_return_table() RETURNS TABLE (x integer, y text) AS $$
return [{'x': 4, 'y' :'four'},
{'x': 7, 'y' :'seven'},
{'x': 0, 'y' :'zero'}]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM multiout_return_table();
x | y
---+-------
@@ -273,7 +273,7 @@ CREATE FUNCTION multiout_array(OUT integer[], OUT text) RETURNS SETOF record AS
yield [[1], 'a']
yield [[1,2], 'b']
yield [[1,2,3], None]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM multiout_array();
column1 | column2
---------+---------
@@ -284,11 +284,11 @@ SELECT * FROM multiout_array();
CREATE FUNCTION singleout_composite(OUT type_record) AS $$
return {'first': 1, 'second': 2}
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION multiout_composite(OUT type_record) RETURNS SETOF type_record AS $$
return [{'first': 1, 'second': 2},
{'first': 3, 'second': 4 }]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM singleout_composite();
first | second
-------+--------
@@ -305,7 +305,7 @@ SELECT * FROM multiout_composite();
-- composite OUT parameters in functions returning RECORD not supported yet
CREATE FUNCTION multiout_composite(INOUT n integer, OUT type_record) AS $$
return (n, (n * 2, n * 3))
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION multiout_table_type_setof(typ text, returnnull boolean, INOUT n integer, OUT table_record) RETURNS SETOF record AS $$
if returnnull:
d = None
@@ -323,7 +323,7 @@ elif typ == 'str':
d = "(%r,%r)" % (n * 2, n * 3)
for i in range(n):
yield (i, d)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM multiout_composite(2);
n | column2
---+---------
@@ -438,7 +438,7 @@ CREATE TABLE changing (
CREATE FUNCTION changing_test(OUT n integer, OUT changing) RETURNS SETOF record AS $$
return [(1, {'i': 1, 'j': 2}),
(1, (3, 4))]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM changing_test();
n | column2
---+---------
@@ -474,7 +474,7 @@ yield {'tab': [('first', 1), ('second', 2)],
yield {'tab': [('first', 1), ('second', 2)],
'typ': [{'first': 'third', 'second': 3},
{'first': 'fourth', 'second': 4}]}
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM composite_types_table();
tab | typ
----------------------------+----------------------------
@@ -486,7 +486,7 @@ SELECT * FROM composite_types_table();
-- check what happens if the output record descriptor changes
CREATE FUNCTION return_record(t text) RETURNS record AS $$
return {'t': t, 'val': 10}
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM return_record('abc') AS r(t text, val integer);
t | val
-----+-----
@@ -525,7 +525,7 @@ SELECT * FROM return_record('999') AS r(val text, t integer);
CREATE FUNCTION return_record_2(t text) RETURNS record AS $$
return {'v1':1,'v2':2,t:3}
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM return_record_2('v3') AS (v3 int, v2 int, v1 int);
v3 | v2 | v1
----+----+----
@@ -572,7 +572,7 @@ SELECT * FROM return_record_2('v3') AS (v1 int, v2 int, v3 int);
-- multi-dimensional array of composite types.
CREATE FUNCTION composite_type_as_list() RETURNS type_record[] AS $$
return [[('first', 1), ('second', 1)], [('first', 2), ('second', 2)], [('first', 3), ('second', 3)]];
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM composite_type_as_list();
composite_type_as_list
------------------------------------------------------------------------------------
@@ -585,7 +585,7 @@ SELECT * FROM composite_type_as_list();
-- on the issue.
CREATE FUNCTION composite_type_as_list_broken() RETURNS type_record[] AS $$
return [['first', 1]];
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM composite_type_as_list_broken();
ERROR: malformed record literal: "first"
DETAIL: Missing left parenthesis.
diff --git a/src/pl/plpython/expected/plpython_do.out b/src/pl/plpython/expected/plpython_do.out
index e300530e031..d131a4c0ed6 100644
--- a/src/pl/plpython/expected/plpython_do.out
+++ b/src/pl/plpython/expected/plpython_do.out
@@ -1,8 +1,6 @@
-DO $$ plpy.notice("This is plpythonu.") $$ LANGUAGE plpythonu;
-NOTICE: This is plpythonu.
-DO $$ plpy.notice("This is plpython2u.") $$ LANGUAGE plpython2u;
-NOTICE: This is plpython2u.
-DO $$ raise Exception("error test") $$ LANGUAGE plpythonu;
+DO $$ plpy.notice("This is plpython3u.") $$ LANGUAGE plpython3u;
+NOTICE: This is plpython3u.
+DO $$ raise Exception("error test") $$ LANGUAGE plpython3u;
ERROR: Exception: error test
CONTEXT: Traceback (most recent call last):
PL/Python anonymous code block, line 1, in <module>
diff --git a/src/pl/plpython/expected/plpython_drop.out b/src/pl/plpython/expected/plpython_drop.out
index a0e3b5c4ef6..97bb54a55e7 100644
--- a/src/pl/plpython/expected/plpython_drop.out
+++ b/src/pl/plpython/expected/plpython_drop.out
@@ -2,5 +2,4 @@
-- For paranoia's sake, don't leave an untrusted language sitting around
--
SET client_min_messages = WARNING;
-DROP EXTENSION plpythonu CASCADE;
-DROP EXTENSION IF EXISTS plpython2u CASCADE;
+DROP EXTENSION plpython3u CASCADE;
diff --git a/src/pl/plpython/expected/plpython_ereport.out b/src/pl/plpython/expected/plpython_ereport.out
index b73bfff5115..b38bb91e894 100644
--- a/src/pl/plpython/expected/plpython_ereport.out
+++ b/src/pl/plpython/expected/plpython_ereport.out
@@ -17,7 +17,7 @@ plpy.info('This is message text.',
plpy.notice('notice', detail='some detail')
plpy.warning('warning', detail='some detail')
plpy.error('stop on error', detail='some detail', hint='some hint')
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT elog_test();
INFO: info
DETAIL: some detail
@@ -38,42 +38,42 @@ CONTEXT: Traceback (most recent call last):
PL/Python function "elog_test", line 18, in <module>
plpy.error('stop on error', detail='some detail', hint='some hint')
PL/Python function "elog_test"
-DO $$ plpy.info('other types', detail=(10, 20)) $$ LANGUAGE plpythonu;
+DO $$ plpy.info('other types', detail=(10, 20)) $$ LANGUAGE plpython3u;
INFO: other types
DETAIL: (10, 20)
DO $$
import time;
from datetime import date
plpy.info('other types', detail=date(2016, 2, 26))
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
INFO: other types
DETAIL: 2016-02-26
DO $$
basket = ['apple', 'orange', 'apple', 'pear', 'orange', 'banana']
plpy.info('other types', detail=basket)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
INFO: other types
DETAIL: ['apple', 'orange', 'apple', 'pear', 'orange', 'banana']
-- should fail
-DO $$ plpy.info('wrong sqlstate', sqlstate='54444A') $$ LANGUAGE plpythonu;
+DO $$ plpy.info('wrong sqlstate', sqlstate='54444A') $$ LANGUAGE plpython3u;
ERROR: ValueError: invalid SQLSTATE code
CONTEXT: Traceback (most recent call last):
PL/Python anonymous code block, line 1, in <module>
plpy.info('wrong sqlstate', sqlstate='54444A')
PL/Python anonymous code block
-DO $$ plpy.info('unsupported argument', blabla='fooboo') $$ LANGUAGE plpythonu;
+DO $$ plpy.info('unsupported argument', blabla='fooboo') $$ LANGUAGE plpython3u;
ERROR: TypeError: 'blabla' is an invalid keyword argument for this function
CONTEXT: Traceback (most recent call last):
PL/Python anonymous code block, line 1, in <module>
plpy.info('unsupported argument', blabla='fooboo')
PL/Python anonymous code block
-DO $$ plpy.info('first message', message='second message') $$ LANGUAGE plpythonu;
+DO $$ plpy.info('first message', message='second message') $$ LANGUAGE plpython3u;
ERROR: TypeError: argument 'message' given by name and position
CONTEXT: Traceback (most recent call last):
PL/Python anonymous code block, line 1, in <module>
plpy.info('first message', message='second message')
PL/Python anonymous code block
-DO $$ plpy.info('first message', 'second message', message='third message') $$ LANGUAGE plpythonu;
+DO $$ plpy.info('first message', 'second message', message='third message') $$ LANGUAGE plpython3u;
ERROR: TypeError: argument 'message' given by name and position
CONTEXT: Traceback (most recent call last):
PL/Python anonymous code block, line 1, in <module>
@@ -96,7 +96,7 @@ kwargs = {
}
# ignore None values
plpy.error(**dict((k, v) for k, v in iter(kwargs.items()) if v))
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT raise_exception('hello', 'world');
ERROR: plpy.Error: hello
DETAIL: world
@@ -189,7 +189,7 @@ try:
except Exception as e:
plpy.info(e.spidata)
raise e
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
INFO: (119577128, None, 'some hint', None, 0, None, 'users_tab', None, 'user_type', None)
ERROR: plpy.SPIError: plpy.Error: my message
HINT: some hint
@@ -199,7 +199,7 @@ try:
except Exception as e:
plpy.info('sqlstate: %s, hint: %s, table_name: %s, datatype_name: %s' % (e.sqlstate, e.hint, e.table_name, e.datatype_name))
raise e
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
INFO: sqlstate: XX987, hint: some hint, table_name: users_tab, datatype_name: user_type
ERROR: plpy.Error: my message
HINT: some hint
diff --git a/src/pl/plpython/expected/plpython_error.out b/src/pl/plpython/expected/plpython_error.out
index b2f8fe83eb6..7fe864a1a57 100644
--- a/src/pl/plpython/expected/plpython_error.out
+++ b/src/pl/plpython/expected/plpython_error.out
@@ -6,7 +6,7 @@
CREATE FUNCTION python_syntax_error() RETURNS text
AS
'.syntaxerror'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
ERROR: could not compile PL/Python function "python_syntax_error"
DETAIL: SyntaxError: invalid syntax (<string>, line 2)
/* With check_function_bodies = false the function should get defined
@@ -16,7 +16,7 @@ SET check_function_bodies = false;
CREATE FUNCTION python_syntax_error() RETURNS text
AS
'.syntaxerror'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT python_syntax_error();
ERROR: could not compile PL/Python function "python_syntax_error"
DETAIL: SyntaxError: invalid syntax (<string>, line 2)
@@ -30,7 +30,7 @@ RESET check_function_bodies;
CREATE FUNCTION sql_syntax_error() RETURNS text
AS
'plpy.execute("syntax error")'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT sql_syntax_error();
ERROR: spiexceptions.SyntaxError: syntax error at or near "syntax"
LINE 1: syntax error
@@ -45,7 +45,7 @@ PL/Python function "sql_syntax_error"
CREATE FUNCTION exception_index_invalid(text) RETURNS text
AS
'return args[1]'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT exception_index_invalid('test');
ERROR: IndexError: list index out of range
CONTEXT: Traceback (most recent call last):
@@ -58,7 +58,7 @@ CREATE FUNCTION exception_index_invalid_nested() RETURNS text
AS
'rv = plpy.execute("SELECT test5(''foo'')")
return rv[0]'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT exception_index_invalid_nested();
ERROR: spiexceptions.UndefinedFunction: function test5(unknown) does not exist
LINE 1: SELECT test5('foo')
@@ -81,7 +81,7 @@ if len(rv):
return rv[0]["fname"]
return None
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT invalid_type_uncaught('rick');
ERROR: spiexceptions.UndefinedObject: type "test" does not exist
CONTEXT: Traceback (most recent call last):
@@ -105,7 +105,7 @@ if len(rv):
return rv[0]["fname"]
return None
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT invalid_type_caught('rick');
NOTICE: type "test" does not exist
invalid_type_caught
@@ -129,7 +129,7 @@ if len(rv):
return rv[0]["fname"]
return None
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT invalid_type_reraised('rick');
ERROR: plpy.Error: type "test" does not exist
CONTEXT: Traceback (most recent call last):
@@ -147,7 +147,7 @@ if len(rv):
return rv[0]["fname"]
return None
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT valid_type('rick');
valid_type
------------
@@ -170,7 +170,7 @@ def fun3():
fun3()
return "not reached"
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT nested_error();
ERROR: plpy.Error: boom
CONTEXT: Traceback (most recent call last):
@@ -199,7 +199,7 @@ def fun3():
fun3()
return "not reached"
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT nested_error_raise();
ERROR: plpy.Error: boom
CONTEXT: Traceback (most recent call last):
@@ -228,7 +228,7 @@ def fun3():
fun3()
return "you''ve been warned"
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT nested_warning();
WARNING: boom
nested_warning
@@ -241,9 +241,9 @@ WARNING: boom
CREATE FUNCTION toplevel_attribute_error() RETURNS void AS
$$
plpy.nonexistent
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT toplevel_attribute_error();
-ERROR: AttributeError: 'module' object has no attribute 'nonexistent'
+ERROR: AttributeError: module 'plpy' has no attribute 'nonexistent'
CONTEXT: Traceback (most recent call last):
PL/Python function "toplevel_attribute_error", line 2, in <module>
plpy.nonexistent
@@ -261,7 +261,7 @@ def third():
plpy.execute("select sql_error()")
first()
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE OR REPLACE FUNCTION sql_error() RETURNS void AS $$
begin
select 1/0;
@@ -274,7 +274,7 @@ end
$$ LANGUAGE plpgsql;
CREATE OR REPLACE FUNCTION sql_from_python_error() RETURNS void AS $$
plpy.execute("select sql_error()")
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT python_traceback();
ERROR: spiexceptions.DivisionByZero: division by zero
CONTEXT: Traceback (most recent call last):
@@ -325,7 +325,7 @@ except spiexceptions.NotNullViolation as e:
plpy.notice("Violated the NOT NULL constraint, sqlstate %s" % e.sqlstate)
except spiexceptions.UniqueViolation as e:
plpy.notice("Violated the UNIQUE constraint, sqlstate %s" % e.sqlstate)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT specific_exception(2);
specific_exception
--------------------
@@ -351,7 +351,7 @@ NOTICE: Violated the UNIQUE constraint, sqlstate 23505
CREATE FUNCTION python_unique_violation() RETURNS void AS $$
plpy.execute("insert into specific values (1)")
plpy.execute("insert into specific values (1)")
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION catch_python_unique_violation() RETURNS text AS $$
begin
begin
@@ -374,7 +374,7 @@ CREATE FUNCTION manual_subxact() RETURNS void AS $$
plpy.execute("savepoint save")
plpy.execute("create table foo(x integer)")
plpy.execute("rollback to save")
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT manual_subxact();
ERROR: plpy.SPIError: SPI_execute failed: SPI_ERROR_TRANSACTION
CONTEXT: Traceback (most recent call last):
@@ -389,7 +389,7 @@ rollback = plpy.prepare("rollback to save")
plpy.execute(save)
plpy.execute("create table foo(x integer)")
plpy.execute(rollback)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT manual_subxact_prepared();
ERROR: plpy.SPIError: SPI_execute_plan failed: SPI_ERROR_TRANSACTION
CONTEXT: Traceback (most recent call last):
@@ -400,7 +400,7 @@ PL/Python function "manual_subxact_prepared"
*/
CREATE FUNCTION plpy_raise_spiexception() RETURNS void AS $$
raise plpy.spiexceptions.DivisionByZero()
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
DO $$
BEGIN
SELECT plpy_raise_spiexception();
@@ -414,7 +414,7 @@ CREATE FUNCTION plpy_raise_spiexception_override() RETURNS void AS $$
exc = plpy.spiexceptions.DivisionByZero()
exc.sqlstate = 'SILLY'
raise exc
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
DO $$
BEGIN
SELECT plpy_raise_spiexception_override();
@@ -425,18 +425,18 @@ $$ LANGUAGE plpgsql;
/* test the context stack trace for nested execution levels
*/
CREATE FUNCTION notice_innerfunc() RETURNS int AS $$
-plpy.execute("DO LANGUAGE plpythonu $x$ plpy.notice('inside DO') $x$")
+plpy.execute("DO LANGUAGE plpython3u $x$ plpy.notice('inside DO') $x$")
return 1
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION notice_outerfunc() RETURNS int AS $$
plpy.execute("SELECT notice_innerfunc()")
return 1
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
\set SHOW_CONTEXT always
SELECT notice_outerfunc();
NOTICE: inside DO
CONTEXT: PL/Python anonymous code block
-SQL statement "DO LANGUAGE plpythonu $x$ plpy.notice('inside DO') $x$"
+SQL statement "DO LANGUAGE plpython3u $x$ plpy.notice('inside DO') $x$"
PL/Python function "notice_innerfunc"
SQL statement "SELECT notice_innerfunc()"
PL/Python function "notice_outerfunc"
diff --git a/src/pl/plpython/expected/plpython_error_5.out b/src/pl/plpython/expected/plpython_error_5.out
deleted file mode 100644
index bc66ab55340..00000000000
--- a/src/pl/plpython/expected/plpython_error_5.out
+++ /dev/null
@@ -1,447 +0,0 @@
--- test error handling, i forgot to restore Warn_restart in
--- the trigger handler once. the errors and subsequent core dump were
--- interesting.
-/* Flat out Python syntax error
- */
-CREATE FUNCTION python_syntax_error() RETURNS text
- AS
-'.syntaxerror'
- LANGUAGE plpython3u;
-ERROR: could not compile PL/Python function "python_syntax_error"
-DETAIL: SyntaxError: invalid syntax (<string>, line 2)
-/* With check_function_bodies = false the function should get defined
- * and the error reported when called
- */
-SET check_function_bodies = false;
-CREATE FUNCTION python_syntax_error() RETURNS text
- AS
-'.syntaxerror'
- LANGUAGE plpython3u;
-SELECT python_syntax_error();
-ERROR: could not compile PL/Python function "python_syntax_error"
-DETAIL: SyntaxError: invalid syntax (<string>, line 2)
-/* Run the function twice to check if the hashtable entry gets cleaned up */
-SELECT python_syntax_error();
-ERROR: could not compile PL/Python function "python_syntax_error"
-DETAIL: SyntaxError: invalid syntax (<string>, line 2)
-RESET check_function_bodies;
-/* Flat out syntax error
- */
-CREATE FUNCTION sql_syntax_error() RETURNS text
- AS
-'plpy.execute("syntax error")'
- LANGUAGE plpython3u;
-SELECT sql_syntax_error();
-ERROR: spiexceptions.SyntaxError: syntax error at or near "syntax"
-LINE 1: syntax error
- ^
-QUERY: syntax error
-CONTEXT: Traceback (most recent call last):
- PL/Python function "sql_syntax_error", line 1, in <module>
- plpy.execute("syntax error")
-PL/Python function "sql_syntax_error"
-/* check the handling of uncaught python exceptions
- */
-CREATE FUNCTION exception_index_invalid(text) RETURNS text
- AS
-'return args[1]'
- LANGUAGE plpython3u;
-SELECT exception_index_invalid('test');
-ERROR: IndexError: list index out of range
-CONTEXT: Traceback (most recent call last):
- PL/Python function "exception_index_invalid", line 1, in <module>
- return args[1]
-PL/Python function "exception_index_invalid"
-/* check handling of nested exceptions
- */
-CREATE FUNCTION exception_index_invalid_nested() RETURNS text
- AS
-'rv = plpy.execute("SELECT test5(''foo'')")
-return rv[0]'
- LANGUAGE plpython3u;
-SELECT exception_index_invalid_nested();
-ERROR: spiexceptions.UndefinedFunction: function test5(unknown) does not exist
-LINE 1: SELECT test5('foo')
- ^
-HINT: No function matches the given name and argument types. You might need to add explicit type casts.
-QUERY: SELECT test5('foo')
-CONTEXT: Traceback (most recent call last):
- PL/Python function "exception_index_invalid_nested", line 1, in <module>
- rv = plpy.execute("SELECT test5('foo')")
-PL/Python function "exception_index_invalid_nested"
-/* a typo
- */
-CREATE FUNCTION invalid_type_uncaught(a text) RETURNS text
- AS
-'if "plan" not in SD:
- q = "SELECT fname FROM users WHERE lname = $1"
- SD["plan"] = plpy.prepare(q, [ "test" ])
-rv = plpy.execute(SD["plan"], [ a ])
-if len(rv):
- return rv[0]["fname"]
-return None
-'
- LANGUAGE plpython3u;
-SELECT invalid_type_uncaught('rick');
-ERROR: spiexceptions.UndefinedObject: type "test" does not exist
-CONTEXT: Traceback (most recent call last):
- PL/Python function "invalid_type_uncaught", line 3, in <module>
- SD["plan"] = plpy.prepare(q, [ "test" ])
-PL/Python function "invalid_type_uncaught"
-/* for what it's worth catch the exception generated by
- * the typo, and return None
- */
-CREATE FUNCTION invalid_type_caught(a text) RETURNS text
- AS
-'if "plan" not in SD:
- q = "SELECT fname FROM users WHERE lname = $1"
- try:
- SD["plan"] = plpy.prepare(q, [ "test" ])
- except plpy.SPIError as ex:
- plpy.notice(str(ex))
- return None
-rv = plpy.execute(SD["plan"], [ a ])
-if len(rv):
- return rv[0]["fname"]
-return None
-'
- LANGUAGE plpython3u;
-SELECT invalid_type_caught('rick');
-NOTICE: type "test" does not exist
- invalid_type_caught
----------------------
-
-(1 row)
-
-/* for what it's worth catch the exception generated by
- * the typo, and reraise it as a plain error
- */
-CREATE FUNCTION invalid_type_reraised(a text) RETURNS text
- AS
-'if "plan" not in SD:
- q = "SELECT fname FROM users WHERE lname = $1"
- try:
- SD["plan"] = plpy.prepare(q, [ "test" ])
- except plpy.SPIError as ex:
- plpy.error(str(ex))
-rv = plpy.execute(SD["plan"], [ a ])
-if len(rv):
- return rv[0]["fname"]
-return None
-'
- LANGUAGE plpython3u;
-SELECT invalid_type_reraised('rick');
-ERROR: plpy.Error: type "test" does not exist
-CONTEXT: Traceback (most recent call last):
- PL/Python function "invalid_type_reraised", line 6, in <module>
- plpy.error(str(ex))
-PL/Python function "invalid_type_reraised"
-/* no typo no messing about
- */
-CREATE FUNCTION valid_type(a text) RETURNS text
- AS
-'if "plan" not in SD:
- SD["plan"] = plpy.prepare("SELECT fname FROM users WHERE lname = $1", [ "text" ])
-rv = plpy.execute(SD["plan"], [ a ])
-if len(rv):
- return rv[0]["fname"]
-return None
-'
- LANGUAGE plpython3u;
-SELECT valid_type('rick');
- valid_type
-------------
-
-(1 row)
-
-/* error in nested functions to get a traceback
-*/
-CREATE FUNCTION nested_error() RETURNS text
- AS
-'def fun1():
- plpy.error("boom")
-
-def fun2():
- fun1()
-
-def fun3():
- fun2()
-
-fun3()
-return "not reached"
-'
- LANGUAGE plpython3u;
-SELECT nested_error();
-ERROR: plpy.Error: boom
-CONTEXT: Traceback (most recent call last):
- PL/Python function "nested_error", line 10, in <module>
- fun3()
- PL/Python function "nested_error", line 8, in fun3
- fun2()
- PL/Python function "nested_error", line 5, in fun2
- fun1()
- PL/Python function "nested_error", line 2, in fun1
- plpy.error("boom")
-PL/Python function "nested_error"
-/* raising plpy.Error is just like calling plpy.error
-*/
-CREATE FUNCTION nested_error_raise() RETURNS text
- AS
-'def fun1():
- raise plpy.Error("boom")
-
-def fun2():
- fun1()
-
-def fun3():
- fun2()
-
-fun3()
-return "not reached"
-'
- LANGUAGE plpython3u;
-SELECT nested_error_raise();
-ERROR: plpy.Error: boom
-CONTEXT: Traceback (most recent call last):
- PL/Python function "nested_error_raise", line 10, in <module>
- fun3()
- PL/Python function "nested_error_raise", line 8, in fun3
- fun2()
- PL/Python function "nested_error_raise", line 5, in fun2
- fun1()
- PL/Python function "nested_error_raise", line 2, in fun1
- raise plpy.Error("boom")
-PL/Python function "nested_error_raise"
-/* using plpy.warning should not produce a traceback
-*/
-CREATE FUNCTION nested_warning() RETURNS text
- AS
-'def fun1():
- plpy.warning("boom")
-
-def fun2():
- fun1()
-
-def fun3():
- fun2()
-
-fun3()
-return "you''ve been warned"
-'
- LANGUAGE plpython3u;
-SELECT nested_warning();
-WARNING: boom
- nested_warning
---------------------
- you've been warned
-(1 row)
-
-/* AttributeError at toplevel used to give segfaults with the traceback
-*/
-CREATE FUNCTION toplevel_attribute_error() RETURNS void AS
-$$
-plpy.nonexistent
-$$ LANGUAGE plpython3u;
-SELECT toplevel_attribute_error();
-ERROR: AttributeError: module 'plpy' has no attribute 'nonexistent'
-CONTEXT: Traceback (most recent call last):
- PL/Python function "toplevel_attribute_error", line 2, in <module>
- plpy.nonexistent
-PL/Python function "toplevel_attribute_error"
-/* Calling PL/Python functions from SQL and vice versa should not lose context.
- */
-CREATE OR REPLACE FUNCTION python_traceback() RETURNS void AS $$
-def first():
- second()
-
-def second():
- third()
-
-def third():
- plpy.execute("select sql_error()")
-
-first()
-$$ LANGUAGE plpython3u;
-CREATE OR REPLACE FUNCTION sql_error() RETURNS void AS $$
-begin
- select 1/0;
-end
-$$ LANGUAGE plpgsql;
-CREATE OR REPLACE FUNCTION python_from_sql_error() RETURNS void AS $$
-begin
- select python_traceback();
-end
-$$ LANGUAGE plpgsql;
-CREATE OR REPLACE FUNCTION sql_from_python_error() RETURNS void AS $$
-plpy.execute("select sql_error()")
-$$ LANGUAGE plpython3u;
-SELECT python_traceback();
-ERROR: spiexceptions.DivisionByZero: division by zero
-CONTEXT: Traceback (most recent call last):
- PL/Python function "python_traceback", line 11, in <module>
- first()
- PL/Python function "python_traceback", line 3, in first
- second()
- PL/Python function "python_traceback", line 6, in second
- third()
- PL/Python function "python_traceback", line 9, in third
- plpy.execute("select sql_error()")
-PL/Python function "python_traceback"
-SELECT sql_error();
-ERROR: division by zero
-CONTEXT: SQL statement "select 1/0"
-PL/pgSQL function sql_error() line 3 at SQL statement
-SELECT python_from_sql_error();
-ERROR: spiexceptions.DivisionByZero: division by zero
-CONTEXT: Traceback (most recent call last):
- PL/Python function "python_traceback", line 11, in <module>
- first()
- PL/Python function "python_traceback", line 3, in first
- second()
- PL/Python function "python_traceback", line 6, in second
- third()
- PL/Python function "python_traceback", line 9, in third
- plpy.execute("select sql_error()")
-PL/Python function "python_traceback"
-SQL statement "select python_traceback()"
-PL/pgSQL function python_from_sql_error() line 3 at SQL statement
-SELECT sql_from_python_error();
-ERROR: spiexceptions.DivisionByZero: division by zero
-CONTEXT: Traceback (most recent call last):
- PL/Python function "sql_from_python_error", line 2, in <module>
- plpy.execute("select sql_error()")
-PL/Python function "sql_from_python_error"
-/* check catching specific types of exceptions
- */
-CREATE TABLE specific (
- i integer PRIMARY KEY
-);
-CREATE FUNCTION specific_exception(i integer) RETURNS void AS
-$$
-from plpy import spiexceptions
-try:
- plpy.execute("insert into specific values (%s)" % (i or "NULL"));
-except spiexceptions.NotNullViolation as e:
- plpy.notice("Violated the NOT NULL constraint, sqlstate %s" % e.sqlstate)
-except spiexceptions.UniqueViolation as e:
- plpy.notice("Violated the UNIQUE constraint, sqlstate %s" % e.sqlstate)
-$$ LANGUAGE plpython3u;
-SELECT specific_exception(2);
- specific_exception
---------------------
-
-(1 row)
-
-SELECT specific_exception(NULL);
-NOTICE: Violated the NOT NULL constraint, sqlstate 23502
- specific_exception
---------------------
-
-(1 row)
-
-SELECT specific_exception(2);
-NOTICE: Violated the UNIQUE constraint, sqlstate 23505
- specific_exception
---------------------
-
-(1 row)
-
-/* SPI errors in PL/Python functions should preserve the SQLSTATE value
- */
-CREATE FUNCTION python_unique_violation() RETURNS void AS $$
-plpy.execute("insert into specific values (1)")
-plpy.execute("insert into specific values (1)")
-$$ LANGUAGE plpython3u;
-CREATE FUNCTION catch_python_unique_violation() RETURNS text AS $$
-begin
- begin
- perform python_unique_violation();
- exception when unique_violation then
- return 'ok';
- end;
- return 'not reached';
-end;
-$$ language plpgsql;
-SELECT catch_python_unique_violation();
- catch_python_unique_violation
--------------------------------
- ok
-(1 row)
-
-/* manually starting subtransactions - a bad idea
- */
-CREATE FUNCTION manual_subxact() RETURNS void AS $$
-plpy.execute("savepoint save")
-plpy.execute("create table foo(x integer)")
-plpy.execute("rollback to save")
-$$ LANGUAGE plpython3u;
-SELECT manual_subxact();
-ERROR: plpy.SPIError: SPI_execute failed: SPI_ERROR_TRANSACTION
-CONTEXT: Traceback (most recent call last):
- PL/Python function "manual_subxact", line 2, in <module>
- plpy.execute("savepoint save")
-PL/Python function "manual_subxact"
-/* same for prepared plans
- */
-CREATE FUNCTION manual_subxact_prepared() RETURNS void AS $$
-save = plpy.prepare("savepoint save")
-rollback = plpy.prepare("rollback to save")
-plpy.execute(save)
-plpy.execute("create table foo(x integer)")
-plpy.execute(rollback)
-$$ LANGUAGE plpython3u;
-SELECT manual_subxact_prepared();
-ERROR: plpy.SPIError: SPI_execute_plan failed: SPI_ERROR_TRANSACTION
-CONTEXT: Traceback (most recent call last):
- PL/Python function "manual_subxact_prepared", line 4, in <module>
- plpy.execute(save)
-PL/Python function "manual_subxact_prepared"
-/* raising plpy.spiexception.* from python code should preserve sqlstate
- */
-CREATE FUNCTION plpy_raise_spiexception() RETURNS void AS $$
-raise plpy.spiexceptions.DivisionByZero()
-$$ LANGUAGE plpython3u;
-DO $$
-BEGIN
- SELECT plpy_raise_spiexception();
-EXCEPTION WHEN division_by_zero THEN
- -- NOOP
-END
-$$ LANGUAGE plpgsql;
-/* setting a custom sqlstate should be handled
- */
-CREATE FUNCTION plpy_raise_spiexception_override() RETURNS void AS $$
-exc = plpy.spiexceptions.DivisionByZero()
-exc.sqlstate = 'SILLY'
-raise exc
-$$ LANGUAGE plpython3u;
-DO $$
-BEGIN
- SELECT plpy_raise_spiexception_override();
-EXCEPTION WHEN SQLSTATE 'SILLY' THEN
- -- NOOP
-END
-$$ LANGUAGE plpgsql;
-/* test the context stack trace for nested execution levels
- */
-CREATE FUNCTION notice_innerfunc() RETURNS int AS $$
-plpy.execute("DO LANGUAGE plpythonu $x$ plpy.notice('inside DO') $x$")
-return 1
-$$ LANGUAGE plpythonu;
-CREATE FUNCTION notice_outerfunc() RETURNS int AS $$
-plpy.execute("SELECT notice_innerfunc()")
-return 1
-$$ LANGUAGE plpythonu;
-\set SHOW_CONTEXT always
-SELECT notice_outerfunc();
-NOTICE: inside DO
-CONTEXT: PL/Python anonymous code block
-SQL statement "DO LANGUAGE plpythonu $x$ plpy.notice('inside DO') $x$"
-PL/Python function "notice_innerfunc"
-SQL statement "SELECT notice_innerfunc()"
-PL/Python function "notice_outerfunc"
- notice_outerfunc
-------------------
- 1
-(1 row)
-
diff --git a/src/pl/plpython/expected/plpython_global.out b/src/pl/plpython/expected/plpython_global.out
index 192e3e48a72..a4cfb1483f9 100644
--- a/src/pl/plpython/expected/plpython_global.out
+++ b/src/pl/plpython/expected/plpython_global.out
@@ -8,7 +8,7 @@ CREATE FUNCTION global_test_one() returns text
if "global_test" not in GD:
GD["global_test"] = "set by global_test_one"
return "SD: " + SD["global_test"] + ", GD: " + GD["global_test"]'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION global_test_two() returns text
AS
'if "global_test" not in SD:
@@ -16,7 +16,7 @@ CREATE FUNCTION global_test_two() returns text
if "global_test" not in GD:
GD["global_test"] = "set by global_test_two"
return "SD: " + SD["global_test"] + ", GD: " + GD["global_test"]'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION static_test() returns int4
AS
'if "call" in SD:
@@ -25,7 +25,7 @@ else:
SD["call"] = 1
return SD["call"]
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT static_test();
static_test
-------------
diff --git a/src/pl/plpython/expected/plpython_import.out b/src/pl/plpython/expected/plpython_import.out
index b59e1821a79..854e989eaf9 100644
--- a/src/pl/plpython/expected/plpython_import.out
+++ b/src/pl/plpython/expected/plpython_import.out
@@ -6,7 +6,7 @@ CREATE FUNCTION import_fail() returns text
except ImportError:
return "failed as expected"
return "succeeded, that wasn''t supposed to happen"'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION import_succeed() returns text
AS
'try:
@@ -25,7 +25,7 @@ except Exception as ex:
plpy.notice("import failed -- %s" % str(ex))
return "failed, that wasn''t supposed to happen"
return "succeeded, as expected"'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION import_test_one(p text) RETURNS text
AS
'try:
@@ -35,7 +35,7 @@ except ImportError:
import sha
digest = sha.new(p)
return digest.hexdigest()'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION import_test_two(u users) RETURNS text
AS
'plain = u["fname"] + u["lname"]
@@ -46,7 +46,7 @@ except ImportError:
import sha
digest = sha.new(plain);
return "sha hash of " + plain + " is " + digest.hexdigest()'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
-- import python modules
--
SELECT import_fail();
diff --git a/src/pl/plpython/expected/plpython_newline.out b/src/pl/plpython/expected/plpython_newline.out
index 27dc2f8ab0c..2bc149257e7 100644
--- a/src/pl/plpython/expected/plpython_newline.out
+++ b/src/pl/plpython/expected/plpython_newline.out
@@ -3,13 +3,13 @@
--
CREATE OR REPLACE FUNCTION newline_lf() RETURNS integer AS
E'x = 100\ny = 23\nreturn x + y\n'
-LANGUAGE plpythonu;
+LANGUAGE plpython3u;
CREATE OR REPLACE FUNCTION newline_cr() RETURNS integer AS
E'x = 100\ry = 23\rreturn x + y\r'
-LANGUAGE plpythonu;
+LANGUAGE plpython3u;
CREATE OR REPLACE FUNCTION newline_crlf() RETURNS integer AS
E'x = 100\r\ny = 23\r\nreturn x + y\r\n'
-LANGUAGE plpythonu;
+LANGUAGE plpython3u;
SELECT newline_lf();
newline_lf
------------
diff --git a/src/pl/plpython/expected/plpython_params.out b/src/pl/plpython/expected/plpython_params.out
index 46ea7dfb90b..d1a36f36239 100644
--- a/src/pl/plpython/expected/plpython_params.out
+++ b/src/pl/plpython/expected/plpython_params.out
@@ -3,12 +3,12 @@
--
CREATE FUNCTION test_param_names0(integer, integer) RETURNS int AS $$
return args[0] + args[1]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_param_names1(a0 integer, a1 text) RETURNS boolean AS $$
assert a0 == args[0]
assert a1 == args[1]
return True
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_param_names2(u users) RETURNS text AS $$
assert u == args[0]
if isinstance(u, dict):
@@ -19,7 +19,7 @@ if isinstance(u, dict):
else:
s = str(u)
return s
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
-- use deliberately wrong parameter names
CREATE FUNCTION test_param_names3(a0 integer) RETURNS boolean AS $$
try:
@@ -28,7 +28,7 @@ try:
except NameError as e:
assert e.args[0].find("a1") > -1
return True
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT test_param_names0(2,7);
test_param_names0
-------------------
diff --git a/src/pl/plpython/expected/plpython_quote.out b/src/pl/plpython/expected/plpython_quote.out
index eed72923aec..1fbe93d5351 100644
--- a/src/pl/plpython/expected/plpython_quote.out
+++ b/src/pl/plpython/expected/plpython_quote.out
@@ -8,7 +8,7 @@ CREATE FUNCTION quote(t text, how text) RETURNS text AS $$
return plpy.quote_ident(t)
else:
raise plpy.Error("unrecognized quote type %s" % how)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT quote(t, 'literal') FROM (VALUES
('abc'),
('a''bc'),
diff --git a/src/pl/plpython/expected/plpython_record.out b/src/pl/plpython/expected/plpython_record.out
index 458330713a8..31de198582b 100644
--- a/src/pl/plpython/expected/plpython_record.out
+++ b/src/pl/plpython/expected/plpython_record.out
@@ -23,7 +23,7 @@ elif typ == 'obj':
type_record.first = first
type_record.second = second
return type_record
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_type_record_as(typ text, first text, second integer, retnull boolean) RETURNS type_record AS $$
if retnull:
return None
@@ -40,17 +40,17 @@ elif typ == 'obj':
return type_record
elif typ == 'str':
return "('%s',%r)" % (first, second)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_in_out_params(first in text, second out text) AS $$
return first + '_in_to_out';
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_in_out_params_multi(first in text,
second out text, third out text) AS $$
return (first + '_record_in_to_out_1', first + '_record_in_to_out_2');
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_inout_params(first inout text) AS $$
return first + '_inout';
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
-- Test tuple returning functions
SELECT * FROM test_table_record_as('dict', null, null, false);
first | second
@@ -340,7 +340,7 @@ SELECT * FROM test_type_record_as('obj', 'one', 1, false);
-- errors cases
CREATE FUNCTION test_type_record_error1() RETURNS type_record AS $$
return { 'first': 'first' }
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_record_error1();
ERROR: key "second" not found in mapping
HINT: To return null in a column, add the value None to the mapping with the key named after the column.
@@ -348,7 +348,7 @@ CONTEXT: while creating return value
PL/Python function "test_type_record_error1"
CREATE FUNCTION test_type_record_error2() RETURNS type_record AS $$
return [ 'first' ]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_record_error2();
ERROR: length of returned sequence did not match number of columns in row
CONTEXT: while creating return value
@@ -357,7 +357,7 @@ CREATE FUNCTION test_type_record_error3() RETURNS type_record AS $$
class type_record: pass
type_record.first = 'first'
return type_record
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_record_error3();
ERROR: attribute "second" does not exist in Python object
HINT: To return null in a column, let the returned object have an attribute named after column with value None.
@@ -365,7 +365,7 @@ CONTEXT: while creating return value
PL/Python function "test_type_record_error3"
CREATE FUNCTION test_type_record_error4() RETURNS type_record AS $$
return 'foo'
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_record_error4();
ERROR: malformed record literal: "foo"
DETAIL: Missing left parenthesis.
diff --git a/src/pl/plpython/expected/plpython_setof.out b/src/pl/plpython/expected/plpython_setof.out
index 170dbc394de..39409400290 100644
--- a/src/pl/plpython/expected/plpython_setof.out
+++ b/src/pl/plpython/expected/plpython_setof.out
@@ -3,20 +3,20 @@
--
CREATE FUNCTION test_setof_error() RETURNS SETOF text AS $$
return 37
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT test_setof_error();
ERROR: returned object cannot be iterated
DETAIL: PL/Python set-returning functions must return an iterable object.
CONTEXT: PL/Python function "test_setof_error"
CREATE FUNCTION test_setof_as_list(count integer, content text) RETURNS SETOF text AS $$
return [ content ]*count
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_setof_as_tuple(count integer, content text) RETURNS SETOF text AS $$
t = ()
for i in range(count):
t += ( content, )
return t
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_setof_as_iterator(count integer, content text) RETURNS SETOF text AS $$
class producer:
def __init__ (self, icount, icontent):
@@ -24,13 +24,13 @@ class producer:
self.icount = icount
def __iter__ (self):
return self
- def next (self):
+ def __next__ (self):
if self.icount == 0:
raise StopIteration
self.icount -= 1
return self.icontent
return producer(count, content)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_setof_spi_in_iterator() RETURNS SETOF text AS
$$
for s in ('Hello', 'Brave', 'New', 'World'):
@@ -38,7 +38,7 @@ $$
yield s
plpy.execute('select 2')
$$
-LANGUAGE plpythonu;
+LANGUAGE plpython3u;
-- Test set returning functions
SELECT test_setof_as_list(0, 'list');
test_setof_as_list
@@ -130,7 +130,7 @@ global x
while x <= lim:
yield x
x = x + 1
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT ugly(1, 5);
ugly
------
@@ -155,7 +155,7 @@ CREATE OR REPLACE FUNCTION get_user_records()
RETURNS SETOF users
AS $$
return plpy.execute("SELECT * FROM users ORDER BY username")
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT get_user_records();
get_user_records
----------------------
@@ -179,7 +179,7 @@ CREATE OR REPLACE FUNCTION get_user_records2()
RETURNS TABLE(fname text, lname text, username text, userid int)
AS $$
return plpy.execute("SELECT * FROM users ORDER BY username")
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT get_user_records2();
get_user_records2
----------------------
diff --git a/src/pl/plpython/expected/plpython_spi.out b/src/pl/plpython/expected/plpython_spi.out
index a09df68c7d1..391fdb0e645 100644
--- a/src/pl/plpython/expected/plpython_spi.out
+++ b/src/pl/plpython/expected/plpython_spi.out
@@ -6,17 +6,17 @@ CREATE FUNCTION nested_call_one(a text) RETURNS text
'q = "SELECT nested_call_two(''%s'')" % a
r = plpy.execute(q)
return r[0]'
- LANGUAGE plpythonu ;
+ LANGUAGE plpython3u ;
CREATE FUNCTION nested_call_two(a text) RETURNS text
AS
'q = "SELECT nested_call_three(''%s'')" % a
r = plpy.execute(q)
return r[0]'
- LANGUAGE plpythonu ;
+ LANGUAGE plpython3u ;
CREATE FUNCTION nested_call_three(a text) RETURNS text
AS
'return a'
- LANGUAGE plpythonu ;
+ LANGUAGE plpython3u ;
-- some spi stuff
CREATE FUNCTION spi_prepared_plan_test_one(a text) RETURNS text
AS
@@ -30,7 +30,7 @@ except Exception as ex:
plpy.error(str(ex))
return None
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION spi_prepared_plan_test_two(a text) RETURNS text
AS
'if "myplan" not in SD:
@@ -43,7 +43,7 @@ except Exception as ex:
plpy.error(str(ex))
return None
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION spi_prepared_plan_test_nested(a text) RETURNS text
AS
'if "myplan" not in SD:
@@ -57,7 +57,7 @@ except Exception as ex:
plpy.error(str(ex))
return None
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION join_sequences(s sequences) RETURNS text
AS
'if not s["multipart"]:
@@ -69,7 +69,7 @@ for r in rv:
seq = seq + r["sequence"]
return seq
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION spi_recursive_sum(a int) RETURNS int
AS
'r = 0
@@ -77,7 +77,7 @@ if a > 1:
r = plpy.execute("SELECT spi_recursive_sum(%d) as a" % (a-1))[0]["a"]
return a + r
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
--
-- spi and nested calls
--
@@ -155,7 +155,7 @@ if result.status() > 0:
return result.nrows()
else:
return None
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT result_metadata_test($$SELECT 1 AS foo, '11'::text AS bar UNION SELECT 2, '22'$$);
INFO: True
INFO: ['foo', 'bar']
@@ -177,7 +177,7 @@ CREATE FUNCTION result_nrows_test(cmd text) RETURNS int
AS $$
result = plpy.execute(cmd)
return result.nrows()
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT result_nrows_test($$SELECT 1$$);
result_nrows_test
-------------------
@@ -206,7 +206,7 @@ CREATE FUNCTION result_len_test(cmd text) RETURNS int
AS $$
result = plpy.execute(cmd)
return len(result)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT result_len_test($$SELECT 1$$);
result_len_test
-----------------
@@ -254,7 +254,7 @@ except TypeError:
else:
assert False, "TypeError not raised"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT result_subscript_test();
INFO: 2
INFO: 4
@@ -272,7 +272,7 @@ result = plpy.execute("select 1 where false")
plpy.info(result[:])
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT result_empty_test();
INFO: []
result_empty_test
@@ -285,7 +285,7 @@ AS $$
plan = plpy.prepare(cmd)
result = plpy.execute(plan)
return str(result)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT result_str_test($$SELECT 1 AS foo UNION SELECT 2$$);
result_str_test
------------------------------------------------------------
@@ -306,12 +306,12 @@ for row in res:
if row['lname'] == 'doe':
does += 1
return does
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION double_cursor_close() RETURNS int AS $$
res = plpy.cursor("select fname, lname from users")
res.close()
res.close()
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION cursor_fetch() RETURNS int AS $$
res = plpy.cursor("select fname, lname from users")
assert len(res.fetch(3)) == 3
@@ -329,7 +329,7 @@ except StopIteration:
pass
else:
assert False, "StopIteration not raised"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION cursor_mix_next_and_fetch() RETURNS int AS $$
res = plpy.cursor("select fname, lname from users order by fname")
assert len(res.fetch(2)) == 2
@@ -342,7 +342,7 @@ except AttributeError:
assert item['fname'] == 'rick'
assert len(res.fetch(2)) == 1
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION fetch_after_close() RETURNS int AS $$
res = plpy.cursor("select fname, lname from users")
res.close()
@@ -352,7 +352,7 @@ except ValueError:
pass
else:
assert False, "ValueError not raised"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION next_after_close() RETURNS int AS $$
res = plpy.cursor("select fname, lname from users")
res.close()
@@ -365,7 +365,7 @@ except ValueError:
pass
else:
assert False, "ValueError not raised"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION cursor_fetch_next_empty() RETURNS int AS $$
res = plpy.cursor("select fname, lname from users where false")
assert len(res.fetch(1)) == 0
@@ -378,7 +378,7 @@ except StopIteration:
pass
else:
assert False, "StopIteration not raised"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION cursor_plan() RETURNS SETOF text AS $$
plan = plpy.prepare(
"select fname, lname from users where fname like $1 || '%' order by fname",
@@ -387,12 +387,12 @@ for row in plpy.cursor(plan, ["w"]):
yield row['fname']
for row in plan.cursor(["j"]):
yield row['fname']
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION cursor_plan_wrong_args() RETURNS SETOF text AS $$
plan = plpy.prepare("select fname, lname from users where fname like $1 || '%'",
["text"])
c = plpy.cursor(plan, ["a", "b"])
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TYPE test_composite_type AS (
a1 int,
a2 varchar
@@ -401,7 +401,7 @@ CREATE OR REPLACE FUNCTION plan_composite_args() RETURNS test_composite_type AS
plan = plpy.prepare("select $1 as c1", ["test_composite_type"])
res = plpy.execute(plan, [{"a1": 3, "a2": "label"}])
return res[0]["c1"]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT simple_cursor_test();
simple_cursor_test
--------------------
diff --git a/src/pl/plpython/expected/plpython_subtransaction.out b/src/pl/plpython/expected/plpython_subtransaction.out
index 2a56541917d..43d9277a33b 100644
--- a/src/pl/plpython/expected/plpython_subtransaction.out
+++ b/src/pl/plpython/expected/plpython_subtransaction.out
@@ -14,7 +14,7 @@ with plpy.subtransaction():
plpy.execute("INSERT INTO subtransaction_tbl VALUES ('oops')")
elif what_error == "Python":
raise Exception("Python exception")
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT subtransaction_ctx_test();
subtransaction_ctx_test
-------------------------
@@ -71,7 +71,7 @@ with plpy.subtransaction():
raise
plpy.notice("Swallowed %s(%r)" % (e.__class__.__name__, e.args[0]))
return "ok"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT subtransaction_nested_test();
ERROR: spiexceptions.SyntaxError: syntax error at or near "error"
LINE 1: error
@@ -111,7 +111,7 @@ with plpy.subtransaction():
plpy.execute("INSERT INTO subtransaction_tbl VALUES (2)")
plpy.execute("SELECT subtransaction_nested_test('t')")
return "ok"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT subtransaction_deeply_nested_test();
NOTICE: Swallowed SyntaxError('syntax error at or near "error"')
subtransaction_deeply_nested_test
@@ -133,42 +133,42 @@ TRUNCATE subtransaction_tbl;
CREATE FUNCTION subtransaction_exit_without_enter() RETURNS void
AS $$
plpy.subtransaction().__exit__(None, None, None)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION subtransaction_enter_without_exit() RETURNS void
AS $$
plpy.subtransaction().__enter__()
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION subtransaction_exit_twice() RETURNS void
AS $$
plpy.subtransaction().__enter__()
plpy.subtransaction().__exit__(None, None, None)
plpy.subtransaction().__exit__(None, None, None)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION subtransaction_enter_twice() RETURNS void
AS $$
plpy.subtransaction().__enter__()
plpy.subtransaction().__enter__()
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION subtransaction_exit_same_subtransaction_twice() RETURNS void
AS $$
s = plpy.subtransaction()
s.__enter__()
s.__exit__(None, None, None)
s.__exit__(None, None, None)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION subtransaction_enter_same_subtransaction_twice() RETURNS void
AS $$
s = plpy.subtransaction()
s.__enter__()
s.__enter__()
s.__exit__(None, None, None)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
-- No warnings here, as the subtransaction gets indeed closed
CREATE FUNCTION subtransaction_enter_subtransaction_in_with() RETURNS void
AS $$
with plpy.subtransaction() as s:
s.__enter__()
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION subtransaction_exit_subtransaction_in_with() RETURNS void
AS $$
try:
@@ -176,7 +176,7 @@ try:
s.__exit__(None, None, None)
except ValueError as e:
raise ValueError(e)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT subtransaction_exit_without_enter();
ERROR: ValueError: this subtransaction has not been entered
CONTEXT: Traceback (most recent call last):
@@ -255,7 +255,7 @@ try:
plpy.execute(p, ["wrong"])
except plpy.SPIError:
plpy.warning("Caught a SPI error")
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT subtransaction_mix_explicit_and_implicit();
WARNING: Caught a SPI error from an explicit subtransaction
WARNING: Caught a SPI error
@@ -278,7 +278,7 @@ AS $$
s = plpy.subtransaction()
s.enter()
s.exit(None, None, None)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT subtransaction_alternative_names();
subtransaction_alternative_names
----------------------------------
@@ -294,7 +294,7 @@ with plpy.subtransaction():
plpy.execute("INSERT INTO subtransaction_tbl VALUES ('a')")
except plpy.SPIError:
plpy.notice("caught")
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT try_catch_inside_subtransaction();
NOTICE: caught
try_catch_inside_subtransaction
@@ -318,7 +318,7 @@ with plpy.subtransaction():
plpy.execute("INSERT INTO subtransaction_tbl VALUES (1)")
except plpy.SPIError:
plpy.notice("caught")
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT pk_violation_inside_subtransaction();
NOTICE: caught
pk_violation_inside_subtransaction
@@ -340,7 +340,7 @@ with plpy.subtransaction():
cur.fetch(10)
fetched = cur.fetch(10);
return int(fetched[5]["i"])
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION cursor_aborted_subxact() RETURNS int AS $$
try:
with plpy.subtransaction():
@@ -351,7 +351,7 @@ except plpy.SPIError:
fetched = cur.fetch(10)
return int(fetched[5]["i"])
return 0 # not reached
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION cursor_plan_aborted_subxact() RETURNS int AS $$
try:
with plpy.subtransaction():
@@ -364,7 +364,7 @@ except plpy.SPIError:
fetched = cur.fetch(5)
return fetched[2]["i"]
return 0 # not reached
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION cursor_close_aborted_subxact() RETURNS boolean AS $$
try:
with plpy.subtransaction():
@@ -374,7 +374,7 @@ except plpy.SPIError:
cur.close()
return True
return False # not reached
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT cursor_in_subxact();
cursor_in_subxact
-------------------
diff --git a/src/pl/plpython/expected/plpython_test.out b/src/pl/plpython/expected/plpython_test.out
index 39b994f4468..13c14119c08 100644
--- a/src/pl/plpython/expected/plpython_test.out
+++ b/src/pl/plpython/expected/plpython_test.out
@@ -1,7 +1,7 @@
-- first some tests of basic functionality
-CREATE EXTENSION plpython2u;
+CREATE EXTENSION plpython3u;
-- really stupid function just to get the module loaded
-CREATE FUNCTION stupid() RETURNS text AS 'return "zarkon"' LANGUAGE plpythonu;
+CREATE FUNCTION stupid() RETURNS text AS 'return "zarkon"' LANGUAGE plpython3u;
select stupid();
stupid
--------
@@ -9,7 +9,7 @@ select stupid();
(1 row)
-- check 2/3 versioning
-CREATE FUNCTION stupidn() RETURNS text AS 'return "zarkon"' LANGUAGE plpython2u;
+CREATE FUNCTION stupidn() RETURNS text AS 'return "zarkon"' LANGUAGE plpython3u;
select stupidn();
stupidn
---------
@@ -26,7 +26,7 @@ for key in keys:
out.append("%s: %s" % (key, u[key]))
words = a1 + " " + a2 + " => {" + ", ".join(out) + "}"
return words'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
select "Argument test #1"(users, fname, lname) from users where lname = 'doe' order by 1;
Argument test #1
-----------------------------------------------------------------------
@@ -41,7 +41,7 @@ $$
contents = list(filter(lambda x: not x.startswith("__"), dir(plpy)))
contents.sort()
return contents
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
select module_contents();
module_contents
-----------------
@@ -78,7 +78,7 @@ plpy.info('info', 37, [1, 2, 3])
plpy.notice('notice')
plpy.warning('warning')
plpy.error('error')
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT elog_test_basic();
INFO: info
INFO: 37
diff --git a/src/pl/plpython/expected/plpython_transaction.out b/src/pl/plpython/expected/plpython_transaction.out
index 14152993c75..393ea21eaad 100644
--- a/src/pl/plpython/expected/plpython_transaction.out
+++ b/src/pl/plpython/expected/plpython_transaction.out
@@ -1,6 +1,6 @@
CREATE TABLE test1 (a int, b text);
CREATE PROCEDURE transaction_test1()
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
for i in range(0, 10):
plpy.execute("INSERT INTO test1 (a) VALUES (%d)" % i)
@@ -22,7 +22,7 @@ SELECT * FROM test1;
TRUNCATE test1;
DO
-LANGUAGE plpythonu
+LANGUAGE plpython3u
$$
for i in range(0, 10):
plpy.execute("INSERT INTO test1 (a) VALUES (%d)" % i)
@@ -44,7 +44,7 @@ SELECT * FROM test1;
TRUNCATE test1;
-- not allowed in a function
CREATE FUNCTION transaction_test2() RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
for i in range(0, 10):
plpy.execute("INSERT INTO test1 (a) VALUES (%d)" % i)
@@ -64,7 +64,7 @@ SELECT * FROM test1;
-- also not allowed if procedure is called from a function
CREATE FUNCTION transaction_test3() RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
plpy.execute("CALL transaction_test1()")
return 1
@@ -82,19 +82,19 @@ SELECT * FROM test1;
-- DO block inside function
CREATE FUNCTION transaction_test4() RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
-plpy.execute("DO LANGUAGE plpythonu $x$ plpy.commit() $x$")
+plpy.execute("DO LANGUAGE plpython3u $x$ plpy.commit() $x$")
return 1
$$;
SELECT transaction_test4();
ERROR: spiexceptions.InvalidTransactionTermination: invalid transaction termination
CONTEXT: Traceback (most recent call last):
PL/Python function "transaction_test4", line 2, in <module>
- plpy.execute("DO LANGUAGE plpythonu $x$ plpy.commit() $x$")
+ plpy.execute("DO LANGUAGE plpython3u $x$ plpy.commit() $x$")
PL/Python function "transaction_test4"
-- commit inside subtransaction (prohibited)
-DO LANGUAGE plpythonu $$
+DO LANGUAGE plpython3u $$
s = plpy.subtransaction()
s.enter()
plpy.commit()
@@ -106,7 +106,7 @@ CONTEXT: PL/Python anonymous code block
CREATE TABLE test2 (x int);
INSERT INTO test2 VALUES (0), (1), (2), (3), (4);
TRUNCATE test1;
-DO LANGUAGE plpythonu $$
+DO LANGUAGE plpython3u $$
for row in plpy.cursor("SELECT * FROM test2 ORDER BY x"):
plpy.execute("INSERT INTO test1 (a) VALUES (%s)" % row['x'])
plpy.commit()
@@ -129,7 +129,7 @@ SELECT * FROM pg_cursors;
-- error in cursor loop with commit
TRUNCATE test1;
-DO LANGUAGE plpythonu $$
+DO LANGUAGE plpython3u $$
for row in plpy.cursor("SELECT * FROM test2 ORDER BY x"):
plpy.execute("INSERT INTO test1 (a) VALUES (12/(%s-2))" % row['x'])
plpy.commit()
@@ -153,7 +153,7 @@ SELECT * FROM pg_cursors;
-- rollback inside cursor loop
TRUNCATE test1;
-DO LANGUAGE plpythonu $$
+DO LANGUAGE plpython3u $$
for row in plpy.cursor("SELECT * FROM test2 ORDER BY x"):
plpy.execute("INSERT INTO test1 (a) VALUES (%s)" % row['x'])
plpy.rollback()
@@ -170,7 +170,7 @@ SELECT * FROM pg_cursors;
-- first commit then rollback inside cursor loop
TRUNCATE test1;
-DO LANGUAGE plpythonu $$
+DO LANGUAGE plpython3u $$
for row in plpy.cursor("SELECT * FROM test2 ORDER BY x"):
plpy.execute("INSERT INTO test1 (a) VALUES (%s)" % row['x'])
if row['x'] % 2 == 0:
diff --git a/src/pl/plpython/expected/plpython_trigger.out b/src/pl/plpython/expected/plpython_trigger.out
index 742988a5b59..dd1ca32fa49 100644
--- a/src/pl/plpython/expected/plpython_trigger.out
+++ b/src/pl/plpython/expected/plpython_trigger.out
@@ -15,20 +15,20 @@ if TD["new"]["fname"] == "william":
TD["new"]["fname"] = TD["args"][0]
rv = "MODIFY"
return rv'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION users_update() returns trigger
AS
'if TD["event"] == "UPDATE":
if TD["old"]["fname"] != TD["new"]["fname"] and TD["old"]["fname"] == TD["args"][0]:
return "SKIP"
return None'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION users_delete() RETURNS trigger
AS
'if TD["old"]["fname"] == TD["args"][0]:
return "SKIP"
return None'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE TRIGGER users_insert_trig BEFORE INSERT ON users FOR EACH ROW
EXECUTE PROCEDURE users_insert ('willem');
CREATE TRIGGER users_update_trig BEFORE UPDATE ON users FOR EACH ROW
@@ -71,7 +71,7 @@ CREATE TABLE trigger_test_generated (
i int,
j int GENERATED ALWAYS AS (i * 2) STORED
);
-CREATE FUNCTION trigger_data() RETURNS trigger LANGUAGE plpythonu AS $$
+CREATE FUNCTION trigger_data() RETURNS trigger LANGUAGE plpython3u AS $$
if 'relid' in TD:
TD['relid'] = "bogus:12345"
@@ -328,7 +328,7 @@ INSERT INTO trigger_test VALUES (0, 'zero');
CREATE FUNCTION stupid1() RETURNS trigger
AS $$
return 37
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER stupid_trigger1
BEFORE INSERT ON trigger_test
FOR EACH ROW EXECUTE PROCEDURE stupid1();
@@ -341,7 +341,7 @@ DROP TRIGGER stupid_trigger1 ON trigger_test;
CREATE FUNCTION stupid2() RETURNS trigger
AS $$
return "MODIFY"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER stupid_trigger2
BEFORE DELETE ON trigger_test
FOR EACH ROW EXECUTE PROCEDURE stupid2();
@@ -353,7 +353,7 @@ INSERT INTO trigger_test VALUES (0, 'zero');
CREATE FUNCTION stupid3() RETURNS trigger
AS $$
return "foo"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER stupid_trigger3
BEFORE UPDATE ON trigger_test
FOR EACH ROW EXECUTE PROCEDURE stupid3();
@@ -365,8 +365,8 @@ DROP TRIGGER stupid_trigger3 ON trigger_test;
-- Unicode variant
CREATE FUNCTION stupid3u() RETURNS trigger
AS $$
- return u"foo"
-$$ LANGUAGE plpythonu;
+ return "foo"
+$$ LANGUAGE plpython3u;
CREATE TRIGGER stupid_trigger3
BEFORE UPDATE ON trigger_test
FOR EACH ROW EXECUTE PROCEDURE stupid3u();
@@ -380,7 +380,7 @@ CREATE FUNCTION stupid4() RETURNS trigger
AS $$
del TD["new"]
return "MODIFY";
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER stupid_trigger4
BEFORE UPDATE ON trigger_test
FOR EACH ROW EXECUTE PROCEDURE stupid4();
@@ -394,7 +394,7 @@ CREATE FUNCTION stupid5() RETURNS trigger
AS $$
TD["new"] = ['foo', 'bar']
return "MODIFY";
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER stupid_trigger5
BEFORE UPDATE ON trigger_test
FOR EACH ROW EXECUTE PROCEDURE stupid5();
@@ -408,7 +408,7 @@ CREATE FUNCTION stupid6() RETURNS trigger
AS $$
TD["new"] = {1: 'foo', 2: 'bar'}
return "MODIFY";
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER stupid_trigger6
BEFORE UPDATE ON trigger_test
FOR EACH ROW EXECUTE PROCEDURE stupid6();
@@ -422,7 +422,7 @@ CREATE FUNCTION stupid7() RETURNS trigger
AS $$
TD["new"] = {'v': 'foo', 'a': 'bar'}
return "MODIFY";
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER stupid_trigger7
BEFORE UPDATE ON trigger_test
FOR EACH ROW EXECUTE PROCEDURE stupid7();
@@ -434,9 +434,9 @@ DROP TRIGGER stupid_trigger7 ON trigger_test;
-- Unicode variant
CREATE FUNCTION stupid7u() RETURNS trigger
AS $$
- TD["new"] = {u'v': 'foo', u'a': 'bar'}
+ TD["new"] = {'v': 'foo', 'a': 'bar'}
return "MODIFY"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER stupid_trigger7
BEFORE UPDATE ON trigger_test
FOR EACH ROW EXECUTE PROCEDURE stupid7u();
@@ -461,7 +461,7 @@ CREATE FUNCTION test_null() RETURNS trigger
AS $$
TD["new"]['v'] = None
return "MODIFY"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER test_null_trigger
BEFORE UPDATE ON trigger_test
FOR EACH ROW EXECUTE PROCEDURE test_null();
@@ -481,7 +481,7 @@ SET DateStyle = 'ISO';
CREATE FUNCTION set_modif_time() RETURNS trigger AS $$
TD['new']['modif_time'] = '2010-10-13 21:57:28.930486'
return 'MODIFY'
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TABLE pb (a TEXT, modif_time TIMESTAMP(0) WITHOUT TIME ZONE);
CREATE TRIGGER set_modif_time BEFORE UPDATE ON pb
FOR EACH ROW EXECUTE PROCEDURE set_modif_time();
@@ -507,7 +507,7 @@ CREATE FUNCTION composite_trigger_f() RETURNS trigger AS $$
TD['new']['f1'] = (3, False)
TD['new']['f2'] = {'k': 7, 'l': 'yes', 'ignored': 10}
return 'MODIFY'
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER composite_trigger BEFORE INSERT ON composite_trigger_test
FOR EACH ROW EXECUTE PROCEDURE composite_trigger_f();
INSERT INTO composite_trigger_test VALUES (NULL, NULL);
@@ -521,7 +521,7 @@ SELECT * FROM composite_trigger_test;
CREATE TABLE composite_trigger_noop_test (f1 comp1, f2 comp2);
CREATE FUNCTION composite_trigger_noop_f() RETURNS trigger AS $$
return 'MODIFY'
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER composite_trigger_noop BEFORE INSERT ON composite_trigger_noop_test
FOR EACH ROW EXECUTE PROCEDURE composite_trigger_noop_f();
INSERT INTO composite_trigger_noop_test VALUES (NULL, NULL);
@@ -540,7 +540,7 @@ CREATE TYPE comp3 AS (c1 comp1, c2 comp2, m integer);
CREATE TABLE composite_trigger_nested_test(c comp3);
CREATE FUNCTION composite_trigger_nested_f() RETURNS trigger AS $$
return 'MODIFY'
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER composite_trigger_nested BEFORE INSERT ON composite_trigger_nested_test
FOR EACH ROW EXECUTE PROCEDURE composite_trigger_nested_f();
INSERT INTO composite_trigger_nested_test VALUES (NULL);
@@ -555,7 +555,7 @@ SELECT * FROM composite_trigger_nested_test;
(3 rows)
-- check that using a function as a trigger over two tables works correctly
-CREATE FUNCTION trig1234() RETURNS trigger LANGUAGE plpythonu AS $$
+CREATE FUNCTION trig1234() RETURNS trigger LANGUAGE plpython3u AS $$
TD["new"]["data"] = '1234'
return 'MODIFY'
$$;
@@ -581,7 +581,7 @@ SELECT * FROM b;
-- check that SQL run in trigger code can see transition tables
CREATE TABLE transition_table_test (id int, name text);
INSERT INTO transition_table_test VALUES (1, 'a');
-CREATE FUNCTION transition_table_test_f() RETURNS trigger LANGUAGE plpythonu AS
+CREATE FUNCTION transition_table_test_f() RETURNS trigger LANGUAGE plpython3u AS
$$
rv = plpy.execute("SELECT * FROM old_table")
assert(rv.nrows() == 1)
@@ -601,7 +601,7 @@ DROP TABLE transition_table_test;
DROP FUNCTION transition_table_test_f();
-- dealing with generated columns
CREATE FUNCTION generated_test_func1() RETURNS trigger
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
TD['new']['j'] = 5 # not allowed
return 'MODIFY'
diff --git a/src/pl/plpython/expected/plpython_types.out b/src/pl/plpython/expected/plpython_types.out
index 0a2659fe292..a470911c2ec 100644
--- a/src/pl/plpython/expected/plpython_types.out
+++ b/src/pl/plpython/expected/plpython_types.out
@@ -7,23 +7,23 @@
CREATE FUNCTION test_type_conversion_bool(x bool) RETURNS bool AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_bool(true);
-INFO: (True, <type 'bool'>)
+INFO: (True, <class 'bool'>)
test_type_conversion_bool
---------------------------
t
(1 row)
SELECT * FROM test_type_conversion_bool(false);
-INFO: (False, <type 'bool'>)
+INFO: (False, <class 'bool'>)
test_type_conversion_bool
---------------------------
f
(1 row)
SELECT * FROM test_type_conversion_bool(null);
-INFO: (None, <type 'NoneType'>)
+INFO: (None, <class 'NoneType'>)
test_type_conversion_bool
---------------------------
@@ -48,7 +48,7 @@ elif n == 5:
ret = [0]
plpy.info(ret, not not ret)
return ret
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_bool_other(0);
INFO: (0, False)
test_type_conversion_bool_other
@@ -94,16 +94,16 @@ INFO: ([0], True)
CREATE FUNCTION test_type_conversion_char(x char) RETURNS char AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_char('a');
-INFO: ('a', <type 'str'>)
+INFO: ('a', <class 'str'>)
test_type_conversion_char
---------------------------
a
(1 row)
SELECT * FROM test_type_conversion_char(null);
-INFO: (None, <type 'NoneType'>)
+INFO: (None, <class 'NoneType'>)
test_type_conversion_char
---------------------------
@@ -112,23 +112,23 @@ INFO: (None, <type 'NoneType'>)
CREATE FUNCTION test_type_conversion_int2(x int2) RETURNS int2 AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_int2(100::int2);
-INFO: (100, <type 'int'>)
+INFO: (100, <class 'int'>)
test_type_conversion_int2
---------------------------
100
(1 row)
SELECT * FROM test_type_conversion_int2(-100::int2);
-INFO: (-100, <type 'int'>)
+INFO: (-100, <class 'int'>)
test_type_conversion_int2
---------------------------
-100
(1 row)
SELECT * FROM test_type_conversion_int2(null);
-INFO: (None, <type 'NoneType'>)
+INFO: (None, <class 'NoneType'>)
test_type_conversion_int2
---------------------------
@@ -137,23 +137,23 @@ INFO: (None, <type 'NoneType'>)
CREATE FUNCTION test_type_conversion_int4(x int4) RETURNS int4 AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_int4(100);
-INFO: (100, <type 'int'>)
+INFO: (100, <class 'int'>)
test_type_conversion_int4
---------------------------
100
(1 row)
SELECT * FROM test_type_conversion_int4(-100);
-INFO: (-100, <type 'int'>)
+INFO: (-100, <class 'int'>)
test_type_conversion_int4
---------------------------
-100
(1 row)
SELECT * FROM test_type_conversion_int4(null);
-INFO: (None, <type 'NoneType'>)
+INFO: (None, <class 'NoneType'>)
test_type_conversion_int4
---------------------------
@@ -162,30 +162,30 @@ INFO: (None, <type 'NoneType'>)
CREATE FUNCTION test_type_conversion_int8(x int8) RETURNS int8 AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_int8(100);
-INFO: (100L, <type 'long'>)
+INFO: (100, <class 'int'>)
test_type_conversion_int8
---------------------------
100
(1 row)
SELECT * FROM test_type_conversion_int8(-100);
-INFO: (-100L, <type 'long'>)
+INFO: (-100, <class 'int'>)
test_type_conversion_int8
---------------------------
-100
(1 row)
SELECT * FROM test_type_conversion_int8(5000000000);
-INFO: (5000000000L, <type 'long'>)
+INFO: (5000000000, <class 'int'>)
test_type_conversion_int8
---------------------------
5000000000
(1 row)
SELECT * FROM test_type_conversion_int8(null);
-INFO: (None, <type 'NoneType'>)
+INFO: (None, <class 'NoneType'>)
test_type_conversion_int8
---------------------------
@@ -196,7 +196,7 @@ CREATE FUNCTION test_type_conversion_numeric(x numeric) RETURNS numeric AS $$
# between decimal and cdecimal
plpy.info(str(x), x.__class__.__name__)
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_numeric(100);
INFO: ('100', 'Decimal')
test_type_conversion_numeric
@@ -256,30 +256,30 @@ INFO: ('None', 'NoneType')
CREATE FUNCTION test_type_conversion_float4(x float4) RETURNS float4 AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_float4(100);
-INFO: (100.0, <type 'float'>)
+INFO: (100.0, <class 'float'>)
test_type_conversion_float4
-----------------------------
100
(1 row)
SELECT * FROM test_type_conversion_float4(-100);
-INFO: (-100.0, <type 'float'>)
+INFO: (-100.0, <class 'float'>)
test_type_conversion_float4
-----------------------------
-100
(1 row)
SELECT * FROM test_type_conversion_float4(5000.5);
-INFO: (5000.5, <type 'float'>)
+INFO: (5000.5, <class 'float'>)
test_type_conversion_float4
-----------------------------
5000.5
(1 row)
SELECT * FROM test_type_conversion_float4(null);
-INFO: (None, <type 'NoneType'>)
+INFO: (None, <class 'NoneType'>)
test_type_conversion_float4
-----------------------------
@@ -288,37 +288,37 @@ INFO: (None, <type 'NoneType'>)
CREATE FUNCTION test_type_conversion_float8(x float8) RETURNS float8 AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_float8(100);
-INFO: (100.0, <type 'float'>)
+INFO: (100.0, <class 'float'>)
test_type_conversion_float8
-----------------------------
100
(1 row)
SELECT * FROM test_type_conversion_float8(-100);
-INFO: (-100.0, <type 'float'>)
+INFO: (-100.0, <class 'float'>)
test_type_conversion_float8
-----------------------------
-100
(1 row)
SELECT * FROM test_type_conversion_float8(5000000000.5);
-INFO: (5000000000.5, <type 'float'>)
+INFO: (5000000000.5, <class 'float'>)
test_type_conversion_float8
-----------------------------
5000000000.5
(1 row)
SELECT * FROM test_type_conversion_float8(null);
-INFO: (None, <type 'NoneType'>)
+INFO: (None, <class 'NoneType'>)
test_type_conversion_float8
-----------------------------
(1 row)
SELECT * FROM test_type_conversion_float8(100100100.654321);
-INFO: (100100100.654321, <type 'float'>)
+INFO: (100100100.654321, <class 'float'>)
test_type_conversion_float8
-----------------------------
100100100.654321
@@ -327,23 +327,23 @@ INFO: (100100100.654321, <type 'float'>)
CREATE FUNCTION test_type_conversion_oid(x oid) RETURNS oid AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_oid(100);
-INFO: (100L, <type 'long'>)
+INFO: (100, <class 'int'>)
test_type_conversion_oid
--------------------------
100
(1 row)
SELECT * FROM test_type_conversion_oid(2147483649);
-INFO: (2147483649L, <type 'long'>)
+INFO: (2147483649, <class 'int'>)
test_type_conversion_oid
--------------------------
2147483649
(1 row)
SELECT * FROM test_type_conversion_oid(null);
-INFO: (None, <type 'NoneType'>)
+INFO: (None, <class 'NoneType'>)
test_type_conversion_oid
--------------------------
@@ -352,16 +352,16 @@ INFO: (None, <type 'NoneType'>)
CREATE FUNCTION test_type_conversion_text(x text) RETURNS text AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_text('hello world');
-INFO: ('hello world', <type 'str'>)
+INFO: ('hello world', <class 'str'>)
test_type_conversion_text
---------------------------
hello world
(1 row)
SELECT * FROM test_type_conversion_text(null);
-INFO: (None, <type 'NoneType'>)
+INFO: (None, <class 'NoneType'>)
test_type_conversion_text
---------------------------
@@ -370,23 +370,23 @@ INFO: (None, <type 'NoneType'>)
CREATE FUNCTION test_type_conversion_bytea(x bytea) RETURNS bytea AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_bytea('hello world');
-INFO: ('hello world', <type 'str'>)
+INFO: (b'hello world', <class 'bytes'>)
test_type_conversion_bytea
----------------------------
\x68656c6c6f20776f726c64
(1 row)
SELECT * FROM test_type_conversion_bytea(E'null\\000byte');
-INFO: ('null\x00byte', <type 'str'>)
+INFO: (b'null\x00byte', <class 'bytes'>)
test_type_conversion_bytea
----------------------------
\x6e756c6c0062797465
(1 row)
SELECT * FROM test_type_conversion_bytea(null);
-INFO: (None, <type 'NoneType'>)
+INFO: (None, <class 'NoneType'>)
test_type_conversion_bytea
----------------------------
@@ -395,14 +395,14 @@ INFO: (None, <type 'NoneType'>)
CREATE FUNCTION test_type_marshal() RETURNS bytea AS $$
import marshal
return marshal.dumps('hello world')
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_type_unmarshal(x bytea) RETURNS text AS $$
import marshal
try:
return marshal.loads(x)
except ValueError as e:
return 'FAILED: ' + str(e)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT test_type_unmarshal(x) FROM test_type_marshal() x;
test_type_unmarshal
---------------------
@@ -415,7 +415,7 @@ SELECT test_type_unmarshal(x) FROM test_type_marshal() x;
CREATE DOMAIN booltrue AS bool CHECK (VALUE IS TRUE OR VALUE IS NULL);
CREATE FUNCTION test_type_conversion_booltrue(x booltrue, y bool) RETURNS booltrue AS $$
return y
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_booltrue(true, true);
test_type_conversion_booltrue
-------------------------------
@@ -432,21 +432,21 @@ CREATE DOMAIN uint2 AS int2 CHECK (VALUE >= 0);
CREATE FUNCTION test_type_conversion_uint2(x uint2, y int) RETURNS uint2 AS $$
plpy.info(x, type(x))
return y
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_uint2(100::uint2, 50);
-INFO: (100, <type 'int'>)
+INFO: (100, <class 'int'>)
test_type_conversion_uint2
----------------------------
50
(1 row)
SELECT * FROM test_type_conversion_uint2(100::uint2, -50);
-INFO: (100, <type 'int'>)
+INFO: (100, <class 'int'>)
ERROR: value for domain uint2 violates check constraint "uint2_check"
CONTEXT: while creating return value
PL/Python function "test_type_conversion_uint2"
SELECT * FROM test_type_conversion_uint2(null, 1);
-INFO: (None, <type 'NoneType'>)
+INFO: (None, <class 'NoneType'>)
test_type_conversion_uint2
----------------------------
1
@@ -455,7 +455,7 @@ INFO: (None, <type 'NoneType'>)
CREATE DOMAIN nnint AS int CHECK (VALUE IS NOT NULL);
CREATE FUNCTION test_type_conversion_nnint(x nnint, y int) RETURNS nnint AS $$
return y
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_nnint(10, 20);
test_type_conversion_nnint
----------------------------
@@ -472,9 +472,9 @@ CREATE DOMAIN bytea10 AS bytea CHECK (octet_length(VALUE) = 10 AND VALUE IS NOT
CREATE FUNCTION test_type_conversion_bytea10(x bytea10, y bytea) RETURNS bytea10 AS $$
plpy.info(x, type(x))
return y
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_bytea10('hello wold', 'hello wold');
-INFO: ('hello wold', <type 'str'>)
+INFO: (b'hello wold', <class 'bytes'>)
test_type_conversion_bytea10
------------------------------
\x68656c6c6f20776f6c64
@@ -483,14 +483,14 @@ INFO: ('hello wold', <type 'str'>)
SELECT * FROM test_type_conversion_bytea10('hello world', 'hello wold');
ERROR: value for domain bytea10 violates check constraint "bytea10_check"
SELECT * FROM test_type_conversion_bytea10('hello word', 'hello world');
-INFO: ('hello word', <type 'str'>)
+INFO: (b'hello word', <class 'bytes'>)
ERROR: value for domain bytea10 violates check constraint "bytea10_check"
CONTEXT: while creating return value
PL/Python function "test_type_conversion_bytea10"
SELECT * FROM test_type_conversion_bytea10(null, 'hello word');
ERROR: value for domain bytea10 violates check constraint "bytea10_check"
SELECT * FROM test_type_conversion_bytea10('hello word', null);
-INFO: ('hello word', <type 'str'>)
+INFO: (b'hello word', <class 'bytes'>)
ERROR: value for domain bytea10 violates check constraint "bytea10_check"
CONTEXT: while creating return value
PL/Python function "test_type_conversion_bytea10"
@@ -500,58 +500,58 @@ PL/Python function "test_type_conversion_bytea10"
CREATE FUNCTION test_type_conversion_array_int4(x int4[]) RETURNS int4[] AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_int4(ARRAY[0, 100]);
-INFO: ([0, 100], <type 'list'>)
+INFO: ([0, 100], <class 'list'>)
test_type_conversion_array_int4
---------------------------------
{0,100}
(1 row)
SELECT * FROM test_type_conversion_array_int4(ARRAY[0,-100,55]);
-INFO: ([0, -100, 55], <type 'list'>)
+INFO: ([0, -100, 55], <class 'list'>)
test_type_conversion_array_int4
---------------------------------
{0,-100,55}
(1 row)
SELECT * FROM test_type_conversion_array_int4(ARRAY[NULL,1]);
-INFO: ([None, 1], <type 'list'>)
+INFO: ([None, 1], <class 'list'>)
test_type_conversion_array_int4
---------------------------------
{NULL,1}
(1 row)
SELECT * FROM test_type_conversion_array_int4(ARRAY[]::integer[]);
-INFO: ([], <type 'list'>)
+INFO: ([], <class 'list'>)
test_type_conversion_array_int4
---------------------------------
{}
(1 row)
SELECT * FROM test_type_conversion_array_int4(NULL);
-INFO: (None, <type 'NoneType'>)
+INFO: (None, <class 'NoneType'>)
test_type_conversion_array_int4
---------------------------------
(1 row)
SELECT * FROM test_type_conversion_array_int4(ARRAY[[1,2,3],[4,5,6]]);
-INFO: ([[1, 2, 3], [4, 5, 6]], <type 'list'>)
+INFO: ([[1, 2, 3], [4, 5, 6]], <class 'list'>)
test_type_conversion_array_int4
---------------------------------
{{1,2,3},{4,5,6}}
(1 row)
SELECT * FROM test_type_conversion_array_int4(ARRAY[[[1,2,NULL],[NULL,5,6]],[[NULL,8,9],[10,11,12]]]);
-INFO: ([[[1, 2, None], [None, 5, 6]], [[None, 8, 9], [10, 11, 12]]], <type 'list'>)
+INFO: ([[[1, 2, None], [None, 5, 6]], [[None, 8, 9], [10, 11, 12]]], <class 'list'>)
test_type_conversion_array_int4
---------------------------------------------------
{{{1,2,NULL},{NULL,5,6}},{{NULL,8,9},{10,11,12}}}
(1 row)
SELECT * FROM test_type_conversion_array_int4('[2:4]={1,2,3}');
-INFO: ([1, 2, 3], <type 'list'>)
+INFO: ([1, 2, 3], <class 'list'>)
test_type_conversion_array_int4
---------------------------------
{1,2,3}
@@ -560,9 +560,9 @@ INFO: ([1, 2, 3], <type 'list'>)
CREATE FUNCTION test_type_conversion_array_int8(x int8[]) RETURNS int8[] AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_int8(ARRAY[[[1,2,NULL],[NULL,5,6]],[[NULL,8,9],[10,11,12]]]::int8[]);
-INFO: ([[[1L, 2L, None], [None, 5L, 6L]], [[None, 8L, 9L], [10L, 11L, 12L]]], <type 'list'>)
+INFO: ([[[1, 2, None], [None, 5, 6]], [[None, 8, 9], [10, 11, 12]]], <class 'list'>)
test_type_conversion_array_int8
---------------------------------------------------
{{{1,2,NULL},{NULL,5,6}},{{NULL,8,9},{10,11,12}}}
@@ -571,10 +571,10 @@ INFO: ([[[1L, 2L, None], [None, 5L, 6L]], [[None, 8L, 9L], [10L, 11L, 12L]]], <
CREATE FUNCTION test_type_conversion_array_date(x date[]) RETURNS date[] AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_date(ARRAY[[['2016-09-21','2016-09-22',NULL],[NULL,'2016-10-21','2016-10-22']],
[[NULL,'2016-11-21','2016-10-21'],['2015-09-21','2015-09-22','2014-09-21']]]::date[]);
-INFO: ([[['09-21-2016', '09-22-2016', None], [None, '10-21-2016', '10-22-2016']], [[None, '11-21-2016', '10-21-2016'], ['09-21-2015', '09-22-2015', '09-21-2014']]], <type 'list'>)
+INFO: ([[['09-21-2016', '09-22-2016', None], [None, '10-21-2016', '10-22-2016']], [[None, '11-21-2016', '10-21-2016'], ['09-21-2015', '09-22-2015', '09-21-2014']]], <class 'list'>)
test_type_conversion_array_date
---------------------------------------------------------------------------------------------------------------------------------
{{{09-21-2016,09-22-2016,NULL},{NULL,10-21-2016,10-22-2016}},{{NULL,11-21-2016,10-21-2016},{09-21-2015,09-22-2015,09-21-2014}}}
@@ -583,12 +583,12 @@ INFO: ([[['09-21-2016', '09-22-2016', None], [None, '10-21-2016', '10-22-2016']
CREATE FUNCTION test_type_conversion_array_timestamp(x timestamp[]) RETURNS timestamp[] AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_timestamp(ARRAY[[['2016-09-21 15:34:24.078792-04','2016-10-22 11:34:24.078795-04',NULL],
[NULL,'2016-10-21 11:34:25.078792-04','2016-10-21 11:34:24.098792-04']],
[[NULL,'2016-01-21 11:34:24.078792-04','2016-11-21 11:34:24.108792-04'],
['2015-09-21 11:34:24.079792-04','2014-09-21 11:34:24.078792-04','2013-09-21 11:34:24.078792-04']]]::timestamp[]);
-INFO: ([[['Wed Sep 21 15:34:24.078792 2016', 'Sat Oct 22 11:34:24.078795 2016', None], [None, 'Fri Oct 21 11:34:25.078792 2016', 'Fri Oct 21 11:34:24.098792 2016']], [[None, 'Thu Jan 21 11:34:24.078792 2016', 'Mon Nov 21 11:34:24.108792 2016'], ['Mon Sep 21 11:34:24.079792 2015', 'Sun Sep 21 11:34:24.078792 2014', 'Sat Sep 21 11:34:24.078792 2013']]], <type 'list'>)
+INFO: ([[['Wed Sep 21 15:34:24.078792 2016', 'Sat Oct 22 11:34:24.078795 2016', None], [None, 'Fri Oct 21 11:34:25.078792 2016', 'Fri Oct 21 11:34:24.098792 2016']], [[None, 'Thu Jan 21 11:34:24.078792 2016', 'Mon Nov 21 11:34:24.108792 2016'], ['Mon Sep 21 11:34:24.079792 2015', 'Sun Sep 21 11:34:24.078792 2014', 'Sat Sep 21 11:34:24.078792 2013']]], <class 'list'>)
test_type_conversion_array_timestamp
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
{{{"Wed Sep 21 15:34:24.078792 2016","Sat Oct 22 11:34:24.078795 2016",NULL},{NULL,"Fri Oct 21 11:34:25.078792 2016","Fri Oct 21 11:34:24.098792 2016"}},{{NULL,"Thu Jan 21 11:34:24.078792 2016","Mon Nov 21 11:34:24.108792 2016"},{"Mon Sep 21 11:34:24.079792 2015","Sun Sep 21 11:34:24.078792 2014","Sat Sep 21 11:34:24.078792 2013"}}}
@@ -598,9 +598,9 @@ CREATE OR REPLACE FUNCTION pyreturnmultidemint4(h int4, i int4, j int4, k int4 )
m = [[[[x for x in range(h)] for y in range(i)] for z in range(j)] for w in range(k)]
plpy.info(m, type(m))
return m
-$BODY$ LANGUAGE plpythonu;
+$BODY$ LANGUAGE plpython3u;
select pyreturnmultidemint4(8,5,3,2);
-INFO: ([[[[0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7]], [[0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7]], [[0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7]]], [[[0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7]], [[0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7]], [[0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7]]]], <type 'list'>)
+INFO: ([[[[0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7]], [[0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7]], [[0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7]]], [[[0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7]], [[0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7]], [[0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7]]]], <class 'list'>)
pyreturnmultidemint4
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
{{{{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7}},{{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7}},{{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7}}},{{{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7}},{{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7}},{{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7}}}}
@@ -610,9 +610,9 @@ CREATE OR REPLACE FUNCTION pyreturnmultidemint8(h int4, i int4, j int4, k int4 )
m = [[[[x for x in range(h)] for y in range(i)] for z in range(j)] for w in range(k)]
plpy.info(m, type(m))
return m
-$BODY$ LANGUAGE plpythonu;
+$BODY$ LANGUAGE plpython3u;
select pyreturnmultidemint8(5,5,3,2);
-INFO: ([[[[0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4]], [[0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4]], [[0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4]]], [[[0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4]], [[0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4]], [[0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4]]]], <type 'list'>)
+INFO: ([[[[0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4]], [[0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4]], [[0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4]]], [[[0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4]], [[0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4]], [[0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4]]]], <class 'list'>)
pyreturnmultidemint8
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
{{{{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4}},{{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4}},{{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4}}},{{{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4}},{{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4}},{{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4}}}}
@@ -622,9 +622,9 @@ CREATE OR REPLACE FUNCTION pyreturnmultidemfloat4(h int4, i int4, j int4, k int4
m = [[[[x for x in range(h)] for y in range(i)] for z in range(j)] for w in range(k)]
plpy.info(m, type(m))
return m
-$BODY$ LANGUAGE plpythonu;
+$BODY$ LANGUAGE plpython3u;
select pyreturnmultidemfloat4(6,5,3,2);
-INFO: ([[[[0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5]], [[0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5]], [[0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5]]], [[[0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5]], [[0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5]], [[0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5]]]], <type 'list'>)
+INFO: ([[[[0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5]], [[0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5]], [[0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5]]], [[[0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5]], [[0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5]], [[0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5]]]], <class 'list'>)
pyreturnmultidemfloat4
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
{{{{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5}},{{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5}},{{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5}}},{{{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5}},{{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5}},{{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5}}}}
@@ -634,9 +634,9 @@ CREATE OR REPLACE FUNCTION pyreturnmultidemfloat8(h int4, i int4, j int4, k int4
m = [[[[x for x in range(h)] for y in range(i)] for z in range(j)] for w in range(k)]
plpy.info(m, type(m))
return m
-$BODY$ LANGUAGE plpythonu;
+$BODY$ LANGUAGE plpython3u;
select pyreturnmultidemfloat8(7,5,3,2);
-INFO: ([[[[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6]], [[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6]], [[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6]]], [[[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6]], [[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6]], [[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6]]]], <type 'list'>)
+INFO: ([[[[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6]], [[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6]], [[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6]]], [[[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6]], [[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6]], [[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6]]]], <class 'list'>)
pyreturnmultidemfloat8
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
{{{{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6}},{{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6}},{{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6}}},{{{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6}},{{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6}},{{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6}}}}
@@ -645,16 +645,16 @@ INFO: ([[[[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6],
CREATE FUNCTION test_type_conversion_array_text(x text[]) RETURNS text[] AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_text(ARRAY['foo', 'bar']);
-INFO: (['foo', 'bar'], <type 'list'>)
+INFO: (['foo', 'bar'], <class 'list'>)
test_type_conversion_array_text
---------------------------------
{foo,bar}
(1 row)
SELECT * FROM test_type_conversion_array_text(ARRAY[['foo', 'bar'],['foo2', 'bar2']]);
-INFO: ([['foo', 'bar'], ['foo2', 'bar2']], <type 'list'>)
+INFO: ([['foo', 'bar'], ['foo2', 'bar2']], <class 'list'>)
test_type_conversion_array_text
---------------------------------
{{foo,bar},{foo2,bar2}}
@@ -663,9 +663,9 @@ INFO: ([['foo', 'bar'], ['foo2', 'bar2']], <type 'list'>)
CREATE FUNCTION test_type_conversion_array_bytea(x bytea[]) RETURNS bytea[] AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_bytea(ARRAY[E'\\xdeadbeef'::bytea, NULL]);
-INFO: (['\xde\xad\xbe\xef', None], <type 'list'>)
+INFO: ([b'\xde\xad\xbe\xef', None], <class 'list'>)
test_type_conversion_array_bytea
----------------------------------
{"\\xdeadbeef",NULL}
@@ -673,7 +673,7 @@ INFO: (['\xde\xad\xbe\xef', None], <type 'list'>)
CREATE FUNCTION test_type_conversion_array_mixed1() RETURNS text[] AS $$
return [123, 'abc']
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_mixed1();
test_type_conversion_array_mixed1
-----------------------------------
@@ -682,14 +682,14 @@ SELECT * FROM test_type_conversion_array_mixed1();
CREATE FUNCTION test_type_conversion_array_mixed2() RETURNS int[] AS $$
return [123, 'abc']
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_mixed2();
ERROR: invalid input syntax for type integer: "abc"
CONTEXT: while creating return value
PL/Python function "test_type_conversion_array_mixed2"
CREATE FUNCTION test_type_conversion_mdarray_malformed() RETURNS int[] AS $$
return [[1,2,3],[4,5]]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_mdarray_malformed();
ERROR: wrong length of inner sequence: has length 2, but 3 was expected
DETAIL: To construct a multidimensional array, the inner sequences must all have the same length.
@@ -697,14 +697,14 @@ CONTEXT: while creating return value
PL/Python function "test_type_conversion_mdarray_malformed"
CREATE FUNCTION test_type_conversion_mdarray_toodeep() RETURNS int[] AS $$
return [[[[[[[1]]]]]]]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_mdarray_toodeep();
ERROR: number of array dimensions exceeds the maximum allowed (6)
CONTEXT: while creating return value
PL/Python function "test_type_conversion_mdarray_toodeep"
CREATE FUNCTION test_type_conversion_array_record() RETURNS type_record[] AS $$
return [{'first': 'one', 'second': 42}, {'first': 'two', 'second': 11}]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_record();
test_type_conversion_array_record
-----------------------------------
@@ -713,7 +713,7 @@ SELECT * FROM test_type_conversion_array_record();
CREATE FUNCTION test_type_conversion_array_string() RETURNS text[] AS $$
return 'abc'
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_string();
test_type_conversion_array_string
-----------------------------------
@@ -722,7 +722,7 @@ SELECT * FROM test_type_conversion_array_string();
CREATE FUNCTION test_type_conversion_array_tuple() RETURNS text[] AS $$
return ('abc', 'def')
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_tuple();
test_type_conversion_array_tuple
----------------------------------
@@ -731,7 +731,7 @@ SELECT * FROM test_type_conversion_array_tuple();
CREATE FUNCTION test_type_conversion_array_error() RETURNS int[] AS $$
return 5
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_error();
ERROR: return value of function with array return type is not a Python sequence
CONTEXT: while creating return value
@@ -743,16 +743,16 @@ CREATE DOMAIN ordered_pair_domain AS integer[] CHECK (array_length(VALUE,1)=2 AN
CREATE FUNCTION test_type_conversion_array_domain(x ordered_pair_domain) RETURNS ordered_pair_domain AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_domain(ARRAY[0, 100]::ordered_pair_domain);
-INFO: ([0, 100], <type 'list'>)
+INFO: ([0, 100], <class 'list'>)
test_type_conversion_array_domain
-----------------------------------
{0,100}
(1 row)
SELECT * FROM test_type_conversion_array_domain(NULL::ordered_pair_domain);
-INFO: (None, <type 'NoneType'>)
+INFO: (None, <class 'NoneType'>)
test_type_conversion_array_domain
-----------------------------------
@@ -760,7 +760,7 @@ INFO: (None, <type 'NoneType'>)
CREATE FUNCTION test_type_conversion_array_domain_check_violation() RETURNS ordered_pair_domain AS $$
return [2,1]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_domain_check_violation();
ERROR: value for domain ordered_pair_domain violates check constraint "ordered_pair_domain_check"
CONTEXT: while creating return value
@@ -771,9 +771,9 @@ PL/Python function "test_type_conversion_array_domain_check_violation"
CREATE FUNCTION test_read_uint2_array(x uint2[]) RETURNS uint2 AS $$
plpy.info(x, type(x))
return x[0]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
select test_read_uint2_array(array[1::uint2]);
-INFO: ([1], <type 'list'>)
+INFO: ([1], <class 'list'>)
test_read_uint2_array
-----------------------
1
@@ -781,7 +781,7 @@ INFO: ([1], <type 'list'>)
CREATE FUNCTION test_build_uint2_array(x int2) RETURNS uint2[] AS $$
return [x, x]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
select test_build_uint2_array(1::int2);
test_build_uint2_array
------------------------
@@ -800,7 +800,7 @@ PL/Python function "test_build_uint2_array"
CREATE FUNCTION test_type_conversion_domain_array(x integer[])
RETURNS ordered_pair_domain[] AS $$
return [x, x]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
select test_type_conversion_domain_array(array[2,4]);
ERROR: return value of function with array return type is not a Python sequence
CONTEXT: while creating return value
@@ -813,9 +813,9 @@ CREATE FUNCTION test_type_conversion_domain_array2(x ordered_pair_domain)
RETURNS integer AS $$
plpy.info(x, type(x))
return x[1]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
select test_type_conversion_domain_array2(array[2,4]);
-INFO: ([2, 4], <type 'list'>)
+INFO: ([2, 4], <class 'list'>)
test_type_conversion_domain_array2
------------------------------------
4
@@ -827,9 +827,9 @@ CREATE FUNCTION test_type_conversion_array_domain_array(x ordered_pair_domain[])
RETURNS ordered_pair_domain AS $$
plpy.info(x, type(x))
return x[0]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
select test_type_conversion_array_domain_array(array[array[2,4]::ordered_pair_domain]);
-INFO: ([[2, 4]], <type 'list'>)
+INFO: ([[2, 4]], <class 'list'>)
test_type_conversion_array_domain_array
-----------------------------------------
{2,4}
@@ -846,7 +846,7 @@ CREATE TABLE employee (
INSERT INTO employee VALUES ('John', 100, 10), ('Mary', 200, 10);
CREATE OR REPLACE FUNCTION test_composite_table_input(e employee) RETURNS integer AS $$
return e['basesalary'] + e['bonus']
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT name, test_composite_table_input(employee.*) FROM employee;
name | test_composite_table_input
------+----------------------------
@@ -876,7 +876,7 @@ CREATE TYPE named_pair AS (
);
CREATE OR REPLACE FUNCTION test_composite_type_input(p named_pair) RETURNS integer AS $$
return sum(p.values())
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT test_composite_type_input(row(1, 2));
test_composite_type_input
---------------------------
@@ -896,7 +896,7 @@ SELECT test_composite_type_input(row(1, 2));
CREATE TYPE nnint_container AS (f1 int, f2 nnint);
CREATE FUNCTION nnint_test(x int, y int) RETURNS nnint_container AS $$
return {'f1': x, 'f2': y}
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT nnint_test(null, 3);
nnint_test
------------
@@ -913,7 +913,7 @@ PL/Python function "nnint_test"
CREATE DOMAIN ordered_named_pair AS named_pair_2 CHECK((VALUE).i <= (VALUE).j);
CREATE FUNCTION read_ordered_named_pair(p ordered_named_pair) RETURNS integer AS $$
return p['i'] + p['j']
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT read_ordered_named_pair(row(1, 2));
read_ordered_named_pair
-------------------------
@@ -924,7 +924,7 @@ SELECT read_ordered_named_pair(row(2, 1)); -- fail
ERROR: value for domain ordered_named_pair violates check constraint "ordered_named_pair_check"
CREATE FUNCTION build_ordered_named_pair(i int, j int) RETURNS ordered_named_pair AS $$
return {'i': i, 'j': j}
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT build_ordered_named_pair(1,2);
build_ordered_named_pair
--------------------------
@@ -937,7 +937,7 @@ CONTEXT: while creating return value
PL/Python function "build_ordered_named_pair"
CREATE FUNCTION build_ordered_named_pairs(i int, j int) RETURNS ordered_named_pair[] AS $$
return [{'i': i, 'j': j}, {'i': i, 'j': j+1}]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT build_ordered_named_pairs(1,2);
build_ordered_named_pairs
---------------------------
@@ -952,7 +952,7 @@ PL/Python function "build_ordered_named_pairs"
-- Prepared statements
--
CREATE OR REPLACE FUNCTION test_prep_bool_input() RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
plan = plpy.prepare("SELECT CASE WHEN $1 THEN 1 ELSE 0 END AS val", ['boolean'])
rv = plpy.execute(plan, ['fa'], 5) # 'fa' is true in Python
@@ -965,7 +965,7 @@ SELECT test_prep_bool_input(); -- 1
(1 row)
CREATE OR REPLACE FUNCTION test_prep_bool_output() RETURNS bool
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
plan = plpy.prepare("SELECT $1 = 1 AS val", ['int'])
rv = plpy.execute(plan, [0], 5)
@@ -980,7 +980,7 @@ INFO: {'val': False}
(1 row)
CREATE OR REPLACE FUNCTION test_prep_bytea_input(bb bytea) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
plan = plpy.prepare("SELECT octet_length($1) AS val", ['bytea'])
rv = plpy.execute(plan, [bb], 5)
@@ -993,7 +993,7 @@ SELECT test_prep_bytea_input(E'a\\000b'); -- 3 (embedded null formerly truncated
(1 row)
CREATE OR REPLACE FUNCTION test_prep_bytea_output() RETURNS bytea
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
plan = plpy.prepare("SELECT decode('aa00bb', 'hex') AS val")
rv = plpy.execute(plan, [], 5)
@@ -1001,7 +1001,7 @@ plpy.info(rv[0])
return rv[0]['val']
$$;
SELECT test_prep_bytea_output();
-INFO: {'val': '\xaa\x00\xbb'}
+INFO: {'val': b'\xaa\x00\xbb'}
test_prep_bytea_output
------------------------
\xaa00bb
diff --git a/src/pl/plpython/expected/plpython_types_3.out b/src/pl/plpython/expected/plpython_types_3.out
deleted file mode 100644
index a6ec10d5e18..00000000000
--- a/src/pl/plpython/expected/plpython_types_3.out
+++ /dev/null
@@ -1,1009 +0,0 @@
---
--- Test data type behavior
---
---
--- Base/common types
---
-CREATE FUNCTION test_type_conversion_bool(x bool) RETURNS bool AS $$
-plpy.info(x, type(x))
-return x
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_bool(true);
-INFO: (True, <class 'bool'>)
- test_type_conversion_bool
----------------------------
- t
-(1 row)
-
-SELECT * FROM test_type_conversion_bool(false);
-INFO: (False, <class 'bool'>)
- test_type_conversion_bool
----------------------------
- f
-(1 row)
-
-SELECT * FROM test_type_conversion_bool(null);
-INFO: (None, <class 'NoneType'>)
- test_type_conversion_bool
----------------------------
-
-(1 row)
-
--- test various other ways to express Booleans in Python
-CREATE FUNCTION test_type_conversion_bool_other(n int) RETURNS bool AS $$
-# numbers
-if n == 0:
- ret = 0
-elif n == 1:
- ret = 5
-# strings
-elif n == 2:
- ret = ''
-elif n == 3:
- ret = 'fa' # true in Python, false in PostgreSQL
-# containers
-elif n == 4:
- ret = []
-elif n == 5:
- ret = [0]
-plpy.info(ret, not not ret)
-return ret
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_bool_other(0);
-INFO: (0, False)
- test_type_conversion_bool_other
----------------------------------
- f
-(1 row)
-
-SELECT * FROM test_type_conversion_bool_other(1);
-INFO: (5, True)
- test_type_conversion_bool_other
----------------------------------
- t
-(1 row)
-
-SELECT * FROM test_type_conversion_bool_other(2);
-INFO: ('', False)
- test_type_conversion_bool_other
----------------------------------
- f
-(1 row)
-
-SELECT * FROM test_type_conversion_bool_other(3);
-INFO: ('fa', True)
- test_type_conversion_bool_other
----------------------------------
- t
-(1 row)
-
-SELECT * FROM test_type_conversion_bool_other(4);
-INFO: ([], False)
- test_type_conversion_bool_other
----------------------------------
- f
-(1 row)
-
-SELECT * FROM test_type_conversion_bool_other(5);
-INFO: ([0], True)
- test_type_conversion_bool_other
----------------------------------
- t
-(1 row)
-
-CREATE FUNCTION test_type_conversion_char(x char) RETURNS char AS $$
-plpy.info(x, type(x))
-return x
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_char('a');
-INFO: ('a', <class 'str'>)
- test_type_conversion_char
----------------------------
- a
-(1 row)
-
-SELECT * FROM test_type_conversion_char(null);
-INFO: (None, <class 'NoneType'>)
- test_type_conversion_char
----------------------------
-
-(1 row)
-
-CREATE FUNCTION test_type_conversion_int2(x int2) RETURNS int2 AS $$
-plpy.info(x, type(x))
-return x
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_int2(100::int2);
-INFO: (100, <class 'int'>)
- test_type_conversion_int2
----------------------------
- 100
-(1 row)
-
-SELECT * FROM test_type_conversion_int2(-100::int2);
-INFO: (-100, <class 'int'>)
- test_type_conversion_int2
----------------------------
- -100
-(1 row)
-
-SELECT * FROM test_type_conversion_int2(null);
-INFO: (None, <class 'NoneType'>)
- test_type_conversion_int2
----------------------------
-
-(1 row)
-
-CREATE FUNCTION test_type_conversion_int4(x int4) RETURNS int4 AS $$
-plpy.info(x, type(x))
-return x
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_int4(100);
-INFO: (100, <class 'int'>)
- test_type_conversion_int4
----------------------------
- 100
-(1 row)
-
-SELECT * FROM test_type_conversion_int4(-100);
-INFO: (-100, <class 'int'>)
- test_type_conversion_int4
----------------------------
- -100
-(1 row)
-
-SELECT * FROM test_type_conversion_int4(null);
-INFO: (None, <class 'NoneType'>)
- test_type_conversion_int4
----------------------------
-
-(1 row)
-
-CREATE FUNCTION test_type_conversion_int8(x int8) RETURNS int8 AS $$
-plpy.info(x, type(x))
-return x
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_int8(100);
-INFO: (100, <class 'int'>)
- test_type_conversion_int8
----------------------------
- 100
-(1 row)
-
-SELECT * FROM test_type_conversion_int8(-100);
-INFO: (-100, <class 'int'>)
- test_type_conversion_int8
----------------------------
- -100
-(1 row)
-
-SELECT * FROM test_type_conversion_int8(5000000000);
-INFO: (5000000000, <class 'int'>)
- test_type_conversion_int8
----------------------------
- 5000000000
-(1 row)
-
-SELECT * FROM test_type_conversion_int8(null);
-INFO: (None, <class 'NoneType'>)
- test_type_conversion_int8
----------------------------
-
-(1 row)
-
-CREATE FUNCTION test_type_conversion_numeric(x numeric) RETURNS numeric AS $$
-# print just the class name, not the type, to avoid differences
-# between decimal and cdecimal
-plpy.info(str(x), x.__class__.__name__)
-return x
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_numeric(100);
-INFO: ('100', 'Decimal')
- test_type_conversion_numeric
-------------------------------
- 100
-(1 row)
-
-SELECT * FROM test_type_conversion_numeric(-100);
-INFO: ('-100', 'Decimal')
- test_type_conversion_numeric
-------------------------------
- -100
-(1 row)
-
-SELECT * FROM test_type_conversion_numeric(100.0);
-INFO: ('100.0', 'Decimal')
- test_type_conversion_numeric
-------------------------------
- 100.0
-(1 row)
-
-SELECT * FROM test_type_conversion_numeric(100.00);
-INFO: ('100.00', 'Decimal')
- test_type_conversion_numeric
-------------------------------
- 100.00
-(1 row)
-
-SELECT * FROM test_type_conversion_numeric(5000000000.5);
-INFO: ('5000000000.5', 'Decimal')
- test_type_conversion_numeric
-------------------------------
- 5000000000.5
-(1 row)
-
-SELECT * FROM test_type_conversion_numeric(1234567890.0987654321);
-INFO: ('1234567890.0987654321', 'Decimal')
- test_type_conversion_numeric
-------------------------------
- 1234567890.0987654321
-(1 row)
-
-SELECT * FROM test_type_conversion_numeric(-1234567890.0987654321);
-INFO: ('-1234567890.0987654321', 'Decimal')
- test_type_conversion_numeric
-------------------------------
- -1234567890.0987654321
-(1 row)
-
-SELECT * FROM test_type_conversion_numeric(null);
-INFO: ('None', 'NoneType')
- test_type_conversion_numeric
-------------------------------
-
-(1 row)
-
-CREATE FUNCTION test_type_conversion_float4(x float4) RETURNS float4 AS $$
-plpy.info(x, type(x))
-return x
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_float4(100);
-INFO: (100.0, <class 'float'>)
- test_type_conversion_float4
------------------------------
- 100
-(1 row)
-
-SELECT * FROM test_type_conversion_float4(-100);
-INFO: (-100.0, <class 'float'>)
- test_type_conversion_float4
------------------------------
- -100
-(1 row)
-
-SELECT * FROM test_type_conversion_float4(5000.5);
-INFO: (5000.5, <class 'float'>)
- test_type_conversion_float4
------------------------------
- 5000.5
-(1 row)
-
-SELECT * FROM test_type_conversion_float4(null);
-INFO: (None, <class 'NoneType'>)
- test_type_conversion_float4
------------------------------
-
-(1 row)
-
-CREATE FUNCTION test_type_conversion_float8(x float8) RETURNS float8 AS $$
-plpy.info(x, type(x))
-return x
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_float8(100);
-INFO: (100.0, <class 'float'>)
- test_type_conversion_float8
------------------------------
- 100
-(1 row)
-
-SELECT * FROM test_type_conversion_float8(-100);
-INFO: (-100.0, <class 'float'>)
- test_type_conversion_float8
------------------------------
- -100
-(1 row)
-
-SELECT * FROM test_type_conversion_float8(5000000000.5);
-INFO: (5000000000.5, <class 'float'>)
- test_type_conversion_float8
------------------------------
- 5000000000.5
-(1 row)
-
-SELECT * FROM test_type_conversion_float8(null);
-INFO: (None, <class 'NoneType'>)
- test_type_conversion_float8
------------------------------
-
-(1 row)
-
-SELECT * FROM test_type_conversion_float8(100100100.654321);
-INFO: (100100100.654321, <class 'float'>)
- test_type_conversion_float8
------------------------------
- 100100100.654321
-(1 row)
-
-CREATE FUNCTION test_type_conversion_oid(x oid) RETURNS oid AS $$
-plpy.info(x, type(x))
-return x
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_oid(100);
-INFO: (100, <class 'int'>)
- test_type_conversion_oid
---------------------------
- 100
-(1 row)
-
-SELECT * FROM test_type_conversion_oid(2147483649);
-INFO: (2147483649, <class 'int'>)
- test_type_conversion_oid
---------------------------
- 2147483649
-(1 row)
-
-SELECT * FROM test_type_conversion_oid(null);
-INFO: (None, <class 'NoneType'>)
- test_type_conversion_oid
---------------------------
-
-(1 row)
-
-CREATE FUNCTION test_type_conversion_text(x text) RETURNS text AS $$
-plpy.info(x, type(x))
-return x
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_text('hello world');
-INFO: ('hello world', <class 'str'>)
- test_type_conversion_text
----------------------------
- hello world
-(1 row)
-
-SELECT * FROM test_type_conversion_text(null);
-INFO: (None, <class 'NoneType'>)
- test_type_conversion_text
----------------------------
-
-(1 row)
-
-CREATE FUNCTION test_type_conversion_bytea(x bytea) RETURNS bytea AS $$
-plpy.info(x, type(x))
-return x
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_bytea('hello world');
-INFO: (b'hello world', <class 'bytes'>)
- test_type_conversion_bytea
-----------------------------
- \x68656c6c6f20776f726c64
-(1 row)
-
-SELECT * FROM test_type_conversion_bytea(E'null\\000byte');
-INFO: (b'null\x00byte', <class 'bytes'>)
- test_type_conversion_bytea
-----------------------------
- \x6e756c6c0062797465
-(1 row)
-
-SELECT * FROM test_type_conversion_bytea(null);
-INFO: (None, <class 'NoneType'>)
- test_type_conversion_bytea
-----------------------------
-
-(1 row)
-
-CREATE FUNCTION test_type_marshal() RETURNS bytea AS $$
-import marshal
-return marshal.dumps('hello world')
-$$ LANGUAGE plpython3u;
-CREATE FUNCTION test_type_unmarshal(x bytea) RETURNS text AS $$
-import marshal
-try:
- return marshal.loads(x)
-except ValueError as e:
- return 'FAILED: ' + str(e)
-$$ LANGUAGE plpython3u;
-SELECT test_type_unmarshal(x) FROM test_type_marshal() x;
- test_type_unmarshal
----------------------
- hello world
-(1 row)
-
---
--- Domains
---
-CREATE DOMAIN booltrue AS bool CHECK (VALUE IS TRUE OR VALUE IS NULL);
-CREATE FUNCTION test_type_conversion_booltrue(x booltrue, y bool) RETURNS booltrue AS $$
-return y
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_booltrue(true, true);
- test_type_conversion_booltrue
--------------------------------
- t
-(1 row)
-
-SELECT * FROM test_type_conversion_booltrue(false, true);
-ERROR: value for domain booltrue violates check constraint "booltrue_check"
-SELECT * FROM test_type_conversion_booltrue(true, false);
-ERROR: value for domain booltrue violates check constraint "booltrue_check"
-CONTEXT: while creating return value
-PL/Python function "test_type_conversion_booltrue"
-CREATE DOMAIN uint2 AS int2 CHECK (VALUE >= 0);
-CREATE FUNCTION test_type_conversion_uint2(x uint2, y int) RETURNS uint2 AS $$
-plpy.info(x, type(x))
-return y
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_uint2(100::uint2, 50);
-INFO: (100, <class 'int'>)
- test_type_conversion_uint2
-----------------------------
- 50
-(1 row)
-
-SELECT * FROM test_type_conversion_uint2(100::uint2, -50);
-INFO: (100, <class 'int'>)
-ERROR: value for domain uint2 violates check constraint "uint2_check"
-CONTEXT: while creating return value
-PL/Python function "test_type_conversion_uint2"
-SELECT * FROM test_type_conversion_uint2(null, 1);
-INFO: (None, <class 'NoneType'>)
- test_type_conversion_uint2
-----------------------------
- 1
-(1 row)
-
-CREATE DOMAIN nnint AS int CHECK (VALUE IS NOT NULL);
-CREATE FUNCTION test_type_conversion_nnint(x nnint, y int) RETURNS nnint AS $$
-return y
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_nnint(10, 20);
- test_type_conversion_nnint
-----------------------------
- 20
-(1 row)
-
-SELECT * FROM test_type_conversion_nnint(null, 20);
-ERROR: value for domain nnint violates check constraint "nnint_check"
-SELECT * FROM test_type_conversion_nnint(10, null);
-ERROR: value for domain nnint violates check constraint "nnint_check"
-CONTEXT: while creating return value
-PL/Python function "test_type_conversion_nnint"
-CREATE DOMAIN bytea10 AS bytea CHECK (octet_length(VALUE) = 10 AND VALUE IS NOT NULL);
-CREATE FUNCTION test_type_conversion_bytea10(x bytea10, y bytea) RETURNS bytea10 AS $$
-plpy.info(x, type(x))
-return y
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_bytea10('hello wold', 'hello wold');
-INFO: (b'hello wold', <class 'bytes'>)
- test_type_conversion_bytea10
-------------------------------
- \x68656c6c6f20776f6c64
-(1 row)
-
-SELECT * FROM test_type_conversion_bytea10('hello world', 'hello wold');
-ERROR: value for domain bytea10 violates check constraint "bytea10_check"
-SELECT * FROM test_type_conversion_bytea10('hello word', 'hello world');
-INFO: (b'hello word', <class 'bytes'>)
-ERROR: value for domain bytea10 violates check constraint "bytea10_check"
-CONTEXT: while creating return value
-PL/Python function "test_type_conversion_bytea10"
-SELECT * FROM test_type_conversion_bytea10(null, 'hello word');
-ERROR: value for domain bytea10 violates check constraint "bytea10_check"
-SELECT * FROM test_type_conversion_bytea10('hello word', null);
-INFO: (b'hello word', <class 'bytes'>)
-ERROR: value for domain bytea10 violates check constraint "bytea10_check"
-CONTEXT: while creating return value
-PL/Python function "test_type_conversion_bytea10"
---
--- Arrays
---
-CREATE FUNCTION test_type_conversion_array_int4(x int4[]) RETURNS int4[] AS $$
-plpy.info(x, type(x))
-return x
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_array_int4(ARRAY[0, 100]);
-INFO: ([0, 100], <class 'list'>)
- test_type_conversion_array_int4
----------------------------------
- {0,100}
-(1 row)
-
-SELECT * FROM test_type_conversion_array_int4(ARRAY[0,-100,55]);
-INFO: ([0, -100, 55], <class 'list'>)
- test_type_conversion_array_int4
----------------------------------
- {0,-100,55}
-(1 row)
-
-SELECT * FROM test_type_conversion_array_int4(ARRAY[NULL,1]);
-INFO: ([None, 1], <class 'list'>)
- test_type_conversion_array_int4
----------------------------------
- {NULL,1}
-(1 row)
-
-SELECT * FROM test_type_conversion_array_int4(ARRAY[]::integer[]);
-INFO: ([], <class 'list'>)
- test_type_conversion_array_int4
----------------------------------
- {}
-(1 row)
-
-SELECT * FROM test_type_conversion_array_int4(NULL);
-INFO: (None, <class 'NoneType'>)
- test_type_conversion_array_int4
----------------------------------
-
-(1 row)
-
-SELECT * FROM test_type_conversion_array_int4(ARRAY[[1,2,3],[4,5,6]]);
-INFO: ([[1, 2, 3], [4, 5, 6]], <class 'list'>)
- test_type_conversion_array_int4
----------------------------------
- {{1,2,3},{4,5,6}}
-(1 row)
-
-SELECT * FROM test_type_conversion_array_int4(ARRAY[[[1,2,NULL],[NULL,5,6]],[[NULL,8,9],[10,11,12]]]);
-INFO: ([[[1, 2, None], [None, 5, 6]], [[None, 8, 9], [10, 11, 12]]], <class 'list'>)
- test_type_conversion_array_int4
----------------------------------------------------
- {{{1,2,NULL},{NULL,5,6}},{{NULL,8,9},{10,11,12}}}
-(1 row)
-
-SELECT * FROM test_type_conversion_array_int4('[2:4]={1,2,3}');
-INFO: ([1, 2, 3], <class 'list'>)
- test_type_conversion_array_int4
----------------------------------
- {1,2,3}
-(1 row)
-
-CREATE FUNCTION test_type_conversion_array_int8(x int8[]) RETURNS int8[] AS $$
-plpy.info(x, type(x))
-return x
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_array_int8(ARRAY[[[1,2,NULL],[NULL,5,6]],[[NULL,8,9],[10,11,12]]]::int8[]);
-INFO: ([[[1, 2, None], [None, 5, 6]], [[None, 8, 9], [10, 11, 12]]], <class 'list'>)
- test_type_conversion_array_int8
----------------------------------------------------
- {{{1,2,NULL},{NULL,5,6}},{{NULL,8,9},{10,11,12}}}
-(1 row)
-
-CREATE FUNCTION test_type_conversion_array_date(x date[]) RETURNS date[] AS $$
-plpy.info(x, type(x))
-return x
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_array_date(ARRAY[[['2016-09-21','2016-09-22',NULL],[NULL,'2016-10-21','2016-10-22']],
- [[NULL,'2016-11-21','2016-10-21'],['2015-09-21','2015-09-22','2014-09-21']]]::date[]);
-INFO: ([[['09-21-2016', '09-22-2016', None], [None, '10-21-2016', '10-22-2016']], [[None, '11-21-2016', '10-21-2016'], ['09-21-2015', '09-22-2015', '09-21-2014']]], <class 'list'>)
- test_type_conversion_array_date
----------------------------------------------------------------------------------------------------------------------------------
- {{{09-21-2016,09-22-2016,NULL},{NULL,10-21-2016,10-22-2016}},{{NULL,11-21-2016,10-21-2016},{09-21-2015,09-22-2015,09-21-2014}}}
-(1 row)
-
-CREATE FUNCTION test_type_conversion_array_timestamp(x timestamp[]) RETURNS timestamp[] AS $$
-plpy.info(x, type(x))
-return x
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_array_timestamp(ARRAY[[['2016-09-21 15:34:24.078792-04','2016-10-22 11:34:24.078795-04',NULL],
- [NULL,'2016-10-21 11:34:25.078792-04','2016-10-21 11:34:24.098792-04']],
- [[NULL,'2016-01-21 11:34:24.078792-04','2016-11-21 11:34:24.108792-04'],
- ['2015-09-21 11:34:24.079792-04','2014-09-21 11:34:24.078792-04','2013-09-21 11:34:24.078792-04']]]::timestamp[]);
-INFO: ([[['Wed Sep 21 15:34:24.078792 2016', 'Sat Oct 22 11:34:24.078795 2016', None], [None, 'Fri Oct 21 11:34:25.078792 2016', 'Fri Oct 21 11:34:24.098792 2016']], [[None, 'Thu Jan 21 11:34:24.078792 2016', 'Mon Nov 21 11:34:24.108792 2016'], ['Mon Sep 21 11:34:24.079792 2015', 'Sun Sep 21 11:34:24.078792 2014', 'Sat Sep 21 11:34:24.078792 2013']]], <class 'list'>)
- test_type_conversion_array_timestamp
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
- {{{"Wed Sep 21 15:34:24.078792 2016","Sat Oct 22 11:34:24.078795 2016",NULL},{NULL,"Fri Oct 21 11:34:25.078792 2016","Fri Oct 21 11:34:24.098792 2016"}},{{NULL,"Thu Jan 21 11:34:24.078792 2016","Mon Nov 21 11:34:24.108792 2016"},{"Mon Sep 21 11:34:24.079792 2015","Sun Sep 21 11:34:24.078792 2014","Sat Sep 21 11:34:24.078792 2013"}}}
-(1 row)
-
-CREATE OR REPLACE FUNCTION pyreturnmultidemint4(h int4, i int4, j int4, k int4 ) RETURNS int4[] AS $BODY$
-m = [[[[x for x in range(h)] for y in range(i)] for z in range(j)] for w in range(k)]
-plpy.info(m, type(m))
-return m
-$BODY$ LANGUAGE plpython3u;
-select pyreturnmultidemint4(8,5,3,2);
-INFO: ([[[[0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7]], [[0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7]], [[0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7]]], [[[0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7]], [[0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7]], [[0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7], [0, 1, 2, 3, 4, 5, 6, 7]]]], <class 'list'>)
- pyreturnmultidemint4
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
- {{{{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7}},{{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7}},{{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7}}},{{{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7}},{{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7}},{{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7},{0,1,2,3,4,5,6,7}}}}
-(1 row)
-
-CREATE OR REPLACE FUNCTION pyreturnmultidemint8(h int4, i int4, j int4, k int4 ) RETURNS int8[] AS $BODY$
-m = [[[[x for x in range(h)] for y in range(i)] for z in range(j)] for w in range(k)]
-plpy.info(m, type(m))
-return m
-$BODY$ LANGUAGE plpython3u;
-select pyreturnmultidemint8(5,5,3,2);
-INFO: ([[[[0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4]], [[0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4]], [[0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4]]], [[[0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4]], [[0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4]], [[0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4], [0, 1, 2, 3, 4]]]], <class 'list'>)
- pyreturnmultidemint8
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
- {{{{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4}},{{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4}},{{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4}}},{{{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4}},{{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4}},{{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4},{0,1,2,3,4}}}}
-(1 row)
-
-CREATE OR REPLACE FUNCTION pyreturnmultidemfloat4(h int4, i int4, j int4, k int4 ) RETURNS float4[] AS $BODY$
-m = [[[[x for x in range(h)] for y in range(i)] for z in range(j)] for w in range(k)]
-plpy.info(m, type(m))
-return m
-$BODY$ LANGUAGE plpython3u;
-select pyreturnmultidemfloat4(6,5,3,2);
-INFO: ([[[[0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5]], [[0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5]], [[0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5]]], [[[0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5]], [[0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5]], [[0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5]]]], <class 'list'>)
- pyreturnmultidemfloat4
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
- {{{{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5}},{{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5}},{{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5}}},{{{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5}},{{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5}},{{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5},{0,1,2,3,4,5}}}}
-(1 row)
-
-CREATE OR REPLACE FUNCTION pyreturnmultidemfloat8(h int4, i int4, j int4, k int4 ) RETURNS float8[] AS $BODY$
-m = [[[[x for x in range(h)] for y in range(i)] for z in range(j)] for w in range(k)]
-plpy.info(m, type(m))
-return m
-$BODY$ LANGUAGE plpython3u;
-select pyreturnmultidemfloat8(7,5,3,2);
-INFO: ([[[[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6]], [[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6]], [[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6]]], [[[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6]], [[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6]], [[0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6], [0, 1, 2, 3, 4, 5, 6]]]], <class 'list'>)
- pyreturnmultidemfloat8
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
- {{{{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6}},{{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6}},{{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6}}},{{{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6}},{{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6}},{{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6},{0,1,2,3,4,5,6}}}}
-(1 row)
-
-CREATE FUNCTION test_type_conversion_array_text(x text[]) RETURNS text[] AS $$
-plpy.info(x, type(x))
-return x
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_array_text(ARRAY['foo', 'bar']);
-INFO: (['foo', 'bar'], <class 'list'>)
- test_type_conversion_array_text
----------------------------------
- {foo,bar}
-(1 row)
-
-SELECT * FROM test_type_conversion_array_text(ARRAY[['foo', 'bar'],['foo2', 'bar2']]);
-INFO: ([['foo', 'bar'], ['foo2', 'bar2']], <class 'list'>)
- test_type_conversion_array_text
----------------------------------
- {{foo,bar},{foo2,bar2}}
-(1 row)
-
-CREATE FUNCTION test_type_conversion_array_bytea(x bytea[]) RETURNS bytea[] AS $$
-plpy.info(x, type(x))
-return x
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_array_bytea(ARRAY[E'\\xdeadbeef'::bytea, NULL]);
-INFO: ([b'\xde\xad\xbe\xef', None], <class 'list'>)
- test_type_conversion_array_bytea
-----------------------------------
- {"\\xdeadbeef",NULL}
-(1 row)
-
-CREATE FUNCTION test_type_conversion_array_mixed1() RETURNS text[] AS $$
-return [123, 'abc']
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_array_mixed1();
- test_type_conversion_array_mixed1
------------------------------------
- {123,abc}
-(1 row)
-
-CREATE FUNCTION test_type_conversion_array_mixed2() RETURNS int[] AS $$
-return [123, 'abc']
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_array_mixed2();
-ERROR: invalid input syntax for type integer: "abc"
-CONTEXT: while creating return value
-PL/Python function "test_type_conversion_array_mixed2"
-CREATE FUNCTION test_type_conversion_mdarray_malformed() RETURNS int[] AS $$
-return [[1,2,3],[4,5]]
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_mdarray_malformed();
-ERROR: wrong length of inner sequence: has length 2, but 3 was expected
-DETAIL: To construct a multidimensional array, the inner sequences must all have the same length.
-CONTEXT: while creating return value
-PL/Python function "test_type_conversion_mdarray_malformed"
-CREATE FUNCTION test_type_conversion_mdarray_toodeep() RETURNS int[] AS $$
-return [[[[[[[1]]]]]]]
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_mdarray_toodeep();
-ERROR: number of array dimensions exceeds the maximum allowed (6)
-CONTEXT: while creating return value
-PL/Python function "test_type_conversion_mdarray_toodeep"
-CREATE FUNCTION test_type_conversion_array_record() RETURNS type_record[] AS $$
-return [{'first': 'one', 'second': 42}, {'first': 'two', 'second': 11}]
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_array_record();
- test_type_conversion_array_record
------------------------------------
- {"(one,42)","(two,11)"}
-(1 row)
-
-CREATE FUNCTION test_type_conversion_array_string() RETURNS text[] AS $$
-return 'abc'
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_array_string();
- test_type_conversion_array_string
------------------------------------
- {a,b,c}
-(1 row)
-
-CREATE FUNCTION test_type_conversion_array_tuple() RETURNS text[] AS $$
-return ('abc', 'def')
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_array_tuple();
- test_type_conversion_array_tuple
-----------------------------------
- {abc,def}
-(1 row)
-
-CREATE FUNCTION test_type_conversion_array_error() RETURNS int[] AS $$
-return 5
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_array_error();
-ERROR: return value of function with array return type is not a Python sequence
-CONTEXT: while creating return value
-PL/Python function "test_type_conversion_array_error"
---
--- Domains over arrays
---
-CREATE DOMAIN ordered_pair_domain AS integer[] CHECK (array_length(VALUE,1)=2 AND VALUE[1] < VALUE[2]);
-CREATE FUNCTION test_type_conversion_array_domain(x ordered_pair_domain) RETURNS ordered_pair_domain AS $$
-plpy.info(x, type(x))
-return x
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_array_domain(ARRAY[0, 100]::ordered_pair_domain);
-INFO: ([0, 100], <class 'list'>)
- test_type_conversion_array_domain
------------------------------------
- {0,100}
-(1 row)
-
-SELECT * FROM test_type_conversion_array_domain(NULL::ordered_pair_domain);
-INFO: (None, <class 'NoneType'>)
- test_type_conversion_array_domain
------------------------------------
-
-(1 row)
-
-CREATE FUNCTION test_type_conversion_array_domain_check_violation() RETURNS ordered_pair_domain AS $$
-return [2,1]
-$$ LANGUAGE plpython3u;
-SELECT * FROM test_type_conversion_array_domain_check_violation();
-ERROR: value for domain ordered_pair_domain violates check constraint "ordered_pair_domain_check"
-CONTEXT: while creating return value
-PL/Python function "test_type_conversion_array_domain_check_violation"
---
--- Arrays of domains
---
-CREATE FUNCTION test_read_uint2_array(x uint2[]) RETURNS uint2 AS $$
-plpy.info(x, type(x))
-return x[0]
-$$ LANGUAGE plpythonu;
-select test_read_uint2_array(array[1::uint2]);
-INFO: ([1], <class 'list'>)
- test_read_uint2_array
------------------------
- 1
-(1 row)
-
-CREATE FUNCTION test_build_uint2_array(x int2) RETURNS uint2[] AS $$
-return [x, x]
-$$ LANGUAGE plpythonu;
-select test_build_uint2_array(1::int2);
- test_build_uint2_array
-------------------------
- {1,1}
-(1 row)
-
-select test_build_uint2_array(-1::int2); -- fail
-ERROR: value for domain uint2 violates check constraint "uint2_check"
-CONTEXT: while creating return value
-PL/Python function "test_build_uint2_array"
---
--- ideally this would work, but for now it doesn't, because the return value
--- is [[2,4], [2,4]] which our conversion code thinks should become a 2-D
--- integer array, not an array of arrays.
---
-CREATE FUNCTION test_type_conversion_domain_array(x integer[])
- RETURNS ordered_pair_domain[] AS $$
-return [x, x]
-$$ LANGUAGE plpythonu;
-select test_type_conversion_domain_array(array[2,4]);
-ERROR: return value of function with array return type is not a Python sequence
-CONTEXT: while creating return value
-PL/Python function "test_type_conversion_domain_array"
-select test_type_conversion_domain_array(array[4,2]); -- fail
-ERROR: return value of function with array return type is not a Python sequence
-CONTEXT: while creating return value
-PL/Python function "test_type_conversion_domain_array"
-CREATE FUNCTION test_type_conversion_domain_array2(x ordered_pair_domain)
- RETURNS integer AS $$
-plpy.info(x, type(x))
-return x[1]
-$$ LANGUAGE plpythonu;
-select test_type_conversion_domain_array2(array[2,4]);
-INFO: ([2, 4], <class 'list'>)
- test_type_conversion_domain_array2
-------------------------------------
- 4
-(1 row)
-
-select test_type_conversion_domain_array2(array[4,2]); -- fail
-ERROR: value for domain ordered_pair_domain violates check constraint "ordered_pair_domain_check"
-CREATE FUNCTION test_type_conversion_array_domain_array(x ordered_pair_domain[])
- RETURNS ordered_pair_domain AS $$
-plpy.info(x, type(x))
-return x[0]
-$$ LANGUAGE plpythonu;
-select test_type_conversion_array_domain_array(array[array[2,4]::ordered_pair_domain]);
-INFO: ([[2, 4]], <class 'list'>)
- test_type_conversion_array_domain_array
------------------------------------------
- {2,4}
-(1 row)
-
----
---- Composite types
----
-CREATE TABLE employee (
- name text,
- basesalary integer,
- bonus integer
-);
-INSERT INTO employee VALUES ('John', 100, 10), ('Mary', 200, 10);
-CREATE OR REPLACE FUNCTION test_composite_table_input(e employee) RETURNS integer AS $$
-return e['basesalary'] + e['bonus']
-$$ LANGUAGE plpython3u;
-SELECT name, test_composite_table_input(employee.*) FROM employee;
- name | test_composite_table_input
-------+----------------------------
- John | 110
- Mary | 210
-(2 rows)
-
-ALTER TABLE employee DROP bonus;
-SELECT name, test_composite_table_input(employee.*) FROM employee;
-ERROR: KeyError: 'bonus'
-CONTEXT: Traceback (most recent call last):
- PL/Python function "test_composite_table_input", line 2, in <module>
- return e['basesalary'] + e['bonus']
-PL/Python function "test_composite_table_input"
-ALTER TABLE employee ADD bonus integer;
-UPDATE employee SET bonus = 10;
-SELECT name, test_composite_table_input(employee.*) FROM employee;
- name | test_composite_table_input
-------+----------------------------
- John | 110
- Mary | 210
-(2 rows)
-
-CREATE TYPE named_pair AS (
- i integer,
- j integer
-);
-CREATE OR REPLACE FUNCTION test_composite_type_input(p named_pair) RETURNS integer AS $$
-return sum(p.values())
-$$ LANGUAGE plpython3u;
-SELECT test_composite_type_input(row(1, 2));
- test_composite_type_input
----------------------------
- 3
-(1 row)
-
-ALTER TYPE named_pair RENAME TO named_pair_2;
-SELECT test_composite_type_input(row(1, 2));
- test_composite_type_input
----------------------------
- 3
-(1 row)
-
---
--- Domains within composite
---
-CREATE TYPE nnint_container AS (f1 int, f2 nnint);
-CREATE FUNCTION nnint_test(x int, y int) RETURNS nnint_container AS $$
-return {'f1': x, 'f2': y}
-$$ LANGUAGE plpythonu;
-SELECT nnint_test(null, 3);
- nnint_test
-------------
- (,3)
-(1 row)
-
-SELECT nnint_test(3, null); -- fail
-ERROR: value for domain nnint violates check constraint "nnint_check"
-CONTEXT: while creating return value
-PL/Python function "nnint_test"
---
--- Domains of composite
---
-CREATE DOMAIN ordered_named_pair AS named_pair_2 CHECK((VALUE).i <= (VALUE).j);
-CREATE FUNCTION read_ordered_named_pair(p ordered_named_pair) RETURNS integer AS $$
-return p['i'] + p['j']
-$$ LANGUAGE plpythonu;
-SELECT read_ordered_named_pair(row(1, 2));
- read_ordered_named_pair
--------------------------
- 3
-(1 row)
-
-SELECT read_ordered_named_pair(row(2, 1)); -- fail
-ERROR: value for domain ordered_named_pair violates check constraint "ordered_named_pair_check"
-CREATE FUNCTION build_ordered_named_pair(i int, j int) RETURNS ordered_named_pair AS $$
-return {'i': i, 'j': j}
-$$ LANGUAGE plpythonu;
-SELECT build_ordered_named_pair(1,2);
- build_ordered_named_pair
---------------------------
- (1,2)
-(1 row)
-
-SELECT build_ordered_named_pair(2,1); -- fail
-ERROR: value for domain ordered_named_pair violates check constraint "ordered_named_pair_check"
-CONTEXT: while creating return value
-PL/Python function "build_ordered_named_pair"
-CREATE FUNCTION build_ordered_named_pairs(i int, j int) RETURNS ordered_named_pair[] AS $$
-return [{'i': i, 'j': j}, {'i': i, 'j': j+1}]
-$$ LANGUAGE plpythonu;
-SELECT build_ordered_named_pairs(1,2);
- build_ordered_named_pairs
----------------------------
- {"(1,2)","(1,3)"}
-(1 row)
-
-SELECT build_ordered_named_pairs(2,1); -- fail
-ERROR: value for domain ordered_named_pair violates check constraint "ordered_named_pair_check"
-CONTEXT: while creating return value
-PL/Python function "build_ordered_named_pairs"
---
--- Prepared statements
---
-CREATE OR REPLACE FUNCTION test_prep_bool_input() RETURNS int
-LANGUAGE plpython3u
-AS $$
-plan = plpy.prepare("SELECT CASE WHEN $1 THEN 1 ELSE 0 END AS val", ['boolean'])
-rv = plpy.execute(plan, ['fa'], 5) # 'fa' is true in Python
-return rv[0]['val']
-$$;
-SELECT test_prep_bool_input(); -- 1
- test_prep_bool_input
-----------------------
- 1
-(1 row)
-
-CREATE OR REPLACE FUNCTION test_prep_bool_output() RETURNS bool
-LANGUAGE plpython3u
-AS $$
-plan = plpy.prepare("SELECT $1 = 1 AS val", ['int'])
-rv = plpy.execute(plan, [0], 5)
-plpy.info(rv[0])
-return rv[0]['val']
-$$;
-SELECT test_prep_bool_output(); -- false
-INFO: {'val': False}
- test_prep_bool_output
------------------------
- f
-(1 row)
-
-CREATE OR REPLACE FUNCTION test_prep_bytea_input(bb bytea) RETURNS int
-LANGUAGE plpython3u
-AS $$
-plan = plpy.prepare("SELECT octet_length($1) AS val", ['bytea'])
-rv = plpy.execute(plan, [bb], 5)
-return rv[0]['val']
-$$;
-SELECT test_prep_bytea_input(E'a\\000b'); -- 3 (embedded null formerly truncated value)
- test_prep_bytea_input
------------------------
- 3
-(1 row)
-
-CREATE OR REPLACE FUNCTION test_prep_bytea_output() RETURNS bytea
-LANGUAGE plpython3u
-AS $$
-plan = plpy.prepare("SELECT decode('aa00bb', 'hex') AS val")
-rv = plpy.execute(plan, [], 5)
-plpy.info(rv[0])
-return rv[0]['val']
-$$;
-SELECT test_prep_bytea_output();
-INFO: {'val': b'\xaa\x00\xbb'}
- test_prep_bytea_output
-------------------------
- \xaa00bb
-(1 row)
-
diff --git a/src/pl/plpython/expected/plpython_unicode.out b/src/pl/plpython/expected/plpython_unicode.out
index c7546dd4587..fd54b0b88e8 100644
--- a/src/pl/plpython/expected/plpython_unicode.out
+++ b/src/pl/plpython/expected/plpython_unicode.out
@@ -11,24 +11,24 @@ CREATE TABLE unicode_test (
testvalue text NOT NULL
);
CREATE FUNCTION unicode_return() RETURNS text AS E'
-return u"\\xA0"
-' LANGUAGE plpythonu;
+return "\\xA0"
+' LANGUAGE plpython3u;
CREATE FUNCTION unicode_trigger() RETURNS trigger AS E'
-TD["new"]["testvalue"] = u"\\xA0"
+TD["new"]["testvalue"] = "\\xA0"
return "MODIFY"
-' LANGUAGE plpythonu;
+' LANGUAGE plpython3u;
CREATE TRIGGER unicode_test_bi BEFORE INSERT ON unicode_test
FOR EACH ROW EXECUTE PROCEDURE unicode_trigger();
CREATE FUNCTION unicode_plan1() RETURNS text AS E'
plan = plpy.prepare("SELECT $1 AS testvalue", ["text"])
-rv = plpy.execute(plan, [u"\\xA0"], 1)
+rv = plpy.execute(plan, ["\\xA0"], 1)
return rv[0]["testvalue"]
-' LANGUAGE plpythonu;
+' LANGUAGE plpython3u;
CREATE FUNCTION unicode_plan2() RETURNS text AS E'
-plan = plpy.prepare("SELECT $1 || $2 AS testvalue", ["text", u"text"])
+plan = plpy.prepare("SELECT $1 || $2 AS testvalue", ["text", "text"])
rv = plpy.execute(plan, ["foo", "bar"], 1)
return rv[0]["testvalue"]
-' LANGUAGE plpythonu;
+' LANGUAGE plpython3u;
SELECT unicode_return();
unicode_return
----------------
diff --git a/src/pl/plpython/expected/plpython_void.out b/src/pl/plpython/expected/plpython_void.out
index 1080d12d6b2..07d0760783e 100644
--- a/src/pl/plpython/expected/plpython_void.out
+++ b/src/pl/plpython/expected/plpython_void.out
@@ -3,14 +3,14 @@
--
CREATE FUNCTION test_void_func1() RETURNS void AS $$
x = 10
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
-- illegal: can't return non-None value in void-returning func
CREATE FUNCTION test_void_func2() RETURNS void AS $$
return 10
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_return_none() RETURNS int AS $$
None
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
-- Tests for functions returning void
SELECT test_void_func1(), test_void_func1() IS NULL AS "is null";
test_void_func1 | is null
diff --git a/src/pl/plpython/plpy_cursorobject.c b/src/pl/plpython/plpy_cursorobject.c
index 08d8b607e38..f8591358bc8 100644
--- a/src/pl/plpython/plpy_cursorobject.c
+++ b/src/pl/plpython/plpy_cursorobject.c
@@ -40,7 +40,7 @@ static PyTypeObject PLy_CursorType = {
.tp_name = "PLyCursor",
.tp_basicsize = sizeof(PLyCursorObject),
.tp_dealloc = PLy_cursor_dealloc,
- .tp_flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE | Py_TPFLAGS_HAVE_ITER,
+ .tp_flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE,
.tp_doc = PLy_cursor_doc,
.tp_iter = PyObject_SelfIter,
.tp_iternext = PLy_cursor_iternext,
diff --git a/src/pl/plpython/plpy_main.c b/src/pl/plpython/plpy_main.c
index 3eedaa80da7..21303c1586b 100644
--- a/src/pl/plpython/plpy_main.c
+++ b/src/pl/plpython/plpy_main.c
@@ -28,27 +28,13 @@
* exported functions
*/
-#if PY_MAJOR_VERSION >= 3
-/* Use separate names to reduce confusion */
-#define plpython_validator plpython3_validator
-#define plpython_call_handler plpython3_call_handler
-#define plpython_inline_handler plpython3_inline_handler
-#endif
-
extern void _PG_init(void);
PG_MODULE_MAGIC;
-PG_FUNCTION_INFO_V1(plpython_validator);
-PG_FUNCTION_INFO_V1(plpython_call_handler);
-PG_FUNCTION_INFO_V1(plpython_inline_handler);
-
-#if PY_MAJOR_VERSION < 3
-/* Define aliases plpython2_call_handler etc */
-PG_FUNCTION_INFO_V1(plpython2_validator);
-PG_FUNCTION_INFO_V1(plpython2_call_handler);
-PG_FUNCTION_INFO_V1(plpython2_inline_handler);
-#endif
+PG_FUNCTION_INFO_V1(plpython3_validator);
+PG_FUNCTION_INFO_V1(plpython3_call_handler);
+PG_FUNCTION_INFO_V1(plpython3_inline_handler);
static bool PLy_procedure_is_trigger(Form_pg_proc procStruct);
@@ -125,13 +111,9 @@ PLy_initialize(void)
if (inited)
return;
-#if PY_MAJOR_VERSION >= 3
PyImport_AppendInittab("plpy", PyInit_plpy);
-#endif
Py_Initialize();
-#if PY_MAJOR_VERSION >= 3
PyImport_ImportModule("plpy");
-#endif
PLy_init_interp();
PLy_init_plpy();
if (PyErr_Occurred())
@@ -171,7 +153,7 @@ PLy_init_interp(void)
}
Datum
-plpython_validator(PG_FUNCTION_ARGS)
+plpython3_validator(PG_FUNCTION_ARGS)
{
Oid funcoid = PG_GETARG_OID(0);
HeapTuple tuple;
@@ -203,17 +185,8 @@ plpython_validator(PG_FUNCTION_ARGS)
PG_RETURN_VOID();
}
-#if PY_MAJOR_VERSION < 3
Datum
-plpython2_validator(PG_FUNCTION_ARGS)
-{
- /* call plpython validator with our fcinfo so it gets our oid */
- return plpython_validator(fcinfo);
-}
-#endif /* PY_MAJOR_VERSION < 3 */
-
-Datum
-plpython_call_handler(PG_FUNCTION_ARGS)
+plpython3_call_handler(PG_FUNCTION_ARGS)
{
bool nonatomic;
Datum retval;
@@ -284,16 +257,8 @@ plpython_call_handler(PG_FUNCTION_ARGS)
return retval;
}
-#if PY_MAJOR_VERSION < 3
-Datum
-plpython2_call_handler(PG_FUNCTION_ARGS)
-{
- return plpython_call_handler(fcinfo);
-}
-#endif /* PY_MAJOR_VERSION < 3 */
-
Datum
-plpython_inline_handler(PG_FUNCTION_ARGS)
+plpython3_inline_handler(PG_FUNCTION_ARGS)
{
LOCAL_FCINFO(fake_fcinfo, 0);
InlineCodeBlock *codeblock = (InlineCodeBlock *) DatumGetPointer(PG_GETARG_DATUM(0));
@@ -368,14 +333,6 @@ plpython_inline_handler(PG_FUNCTION_ARGS)
PG_RETURN_VOID();
}
-#if PY_MAJOR_VERSION < 3
-Datum
-plpython2_inline_handler(PG_FUNCTION_ARGS)
-{
- return plpython_inline_handler(fcinfo);
-}
-#endif /* PY_MAJOR_VERSION < 3 */
-
static bool
PLy_procedure_is_trigger(Form_pg_proc procStruct)
{
diff --git a/src/pl/plpython/plpy_plpymodule.c b/src/pl/plpython/plpy_plpymodule.c
index 0365acc95b0..0eefd34afcd 100644
--- a/src/pl/plpython/plpy_plpymodule.c
+++ b/src/pl/plpython/plpy_plpymodule.c
@@ -109,7 +109,6 @@ static PyMethodDef PLy_exc_methods[] = {
{NULL, NULL, 0, NULL}
};
-#if PY_MAJOR_VERSION >= 3
static PyModuleDef PLy_module = {
PyModuleDef_HEAD_INIT,
.m_name = "plpy",
@@ -141,7 +140,6 @@ PyInit_plpy(void)
return m;
}
-#endif /* PY_MAJOR_VERSION >= 3 */
void
PLy_init_plpy(void)
@@ -150,10 +148,6 @@ PLy_init_plpy(void)
*main_dict,
*plpy_mod;
-#if PY_MAJOR_VERSION < 3
- PyObject *plpy;
-#endif
-
/*
* initialize plpy module
*/
@@ -162,13 +156,7 @@ PLy_init_plpy(void)
PLy_subtransaction_init_type();
PLy_cursor_init_type();
-#if PY_MAJOR_VERSION >= 3
PyModule_Create(&PLy_module);
- /* for Python 3 we initialized the exceptions in PyInit_plpy */
-#else
- plpy = Py_InitModule("plpy", PLy_methods);
- PLy_add_exceptions(plpy);
-#endif
/* PyDict_SetItemString(plpy, "PlanType", (PyObject *) &PLy_PlanType); */
@@ -191,11 +179,7 @@ PLy_add_exceptions(PyObject *plpy)
PyObject *excmod;
HASHCTL hash_ctl;
-#if PY_MAJOR_VERSION < 3
- excmod = Py_InitModule("spiexceptions", PLy_exc_methods);
-#else
excmod = PyModule_Create(&PLy_exc_module);
-#endif
if (excmod == NULL)
PLy_elog(ERROR, "could not create the spiexceptions module");
diff --git a/src/pl/plpython/plpy_plpymodule.h b/src/pl/plpython/plpy_plpymodule.h
index 54d78101ceb..ad6436aca78 100644
--- a/src/pl/plpython/plpy_plpymodule.h
+++ b/src/pl/plpython/plpy_plpymodule.h
@@ -11,9 +11,7 @@
extern HTAB *PLy_spi_exceptions;
-#if PY_MAJOR_VERSION >= 3
PyMODINIT_FUNC PyInit_plpy(void);
-#endif
extern void PLy_init_plpy(void);
#endif /* PLPY_PLPYMODULE_H */
diff --git a/src/pl/plpython/plpy_resultobject.c b/src/pl/plpython/plpy_resultobject.c
index 54f39419c84..f289617ba80 100644
--- a/src/pl/plpython/plpy_resultobject.c
+++ b/src/pl/plpython/plpy_resultobject.c
@@ -226,19 +226,11 @@ PLy_result_str(PyObject *arg)
{
PLyResultObject *ob = (PLyResultObject *) arg;
-#if PY_MAJOR_VERSION >= 3
return PyUnicode_FromFormat("<%s status=%S nrows=%S rows=%S>",
Py_TYPE(ob)->tp_name,
ob->status,
ob->nrows,
ob->rows);
-#else
- return PyString_FromFormat("<%s status=%ld nrows=%ld rows=%s>",
- ob->ob_type->tp_name,
- PyInt_AsLong(ob->status),
- PyInt_AsLong(ob->nrows),
- PyString_AsString(PyObject_Str(ob->rows)));
-#endif
}
static PyObject *
diff --git a/src/pl/plpython/plpy_typeio.c b/src/pl/plpython/plpy_typeio.c
index 5e807b139f1..adf37a9b882 100644
--- a/src/pl/plpython/plpy_typeio.c
+++ b/src/pl/plpython/plpy_typeio.c
@@ -1032,25 +1032,17 @@ PLyObject_AsString(PyObject *plrv)
else if (PyFloat_Check(plrv))
{
/* use repr() for floats, str() is lossy */
-#if PY_MAJOR_VERSION >= 3
PyObject *s = PyObject_Repr(plrv);
plrv_bo = PLyUnicode_Bytes(s);
Py_XDECREF(s);
-#else
- plrv_bo = PyObject_Repr(plrv);
-#endif
}
else
{
-#if PY_MAJOR_VERSION >= 3
PyObject *s = PyObject_Str(plrv);
plrv_bo = PLyUnicode_Bytes(s);
Py_XDECREF(s);
-#else
- plrv_bo = PyObject_Str(plrv);
-#endif
}
if (!plrv_bo)
PLy_elog(ERROR, "could not create string representation of Python object");
diff --git a/src/pl/plpython/plpy_util.c b/src/pl/plpython/plpy_util.c
index 4a7d7264d79..693d0396c48 100644
--- a/src/pl/plpython/plpy_util.c
+++ b/src/pl/plpython/plpy_util.c
@@ -95,7 +95,6 @@ PLyUnicode_AsString(PyObject *unicode)
return rv;
}
-#if PY_MAJOR_VERSION >= 3
/*
* Convert a C string in the PostgreSQL server encoding to a Python
* unicode object. Reference ownership is passed to the caller.
@@ -126,5 +125,3 @@ PLyUnicode_FromString(const char *s)
{
return PLyUnicode_FromStringAndSize(s, strlen(s));
}
-
-#endif /* PY_MAJOR_VERSION >= 3 */
diff --git a/src/pl/plpython/plpy_util.h b/src/pl/plpython/plpy_util.h
index c9ba7edc0ec..7c6577925ea 100644
--- a/src/pl/plpython/plpy_util.h
+++ b/src/pl/plpython/plpy_util.h
@@ -11,9 +11,7 @@
extern PyObject *PLyUnicode_Bytes(PyObject *unicode);
extern char *PLyUnicode_AsString(PyObject *unicode);
-#if PY_MAJOR_VERSION >= 3
extern PyObject *PLyUnicode_FromString(const char *s);
extern PyObject *PLyUnicode_FromStringAndSize(const char *s, Py_ssize_t size);
-#endif
#endif /* PLPY_UTIL_H */
diff --git a/src/pl/plpython/plpython.h b/src/pl/plpython/plpython.h
index 994457b37d6..68426b12f76 100644
--- a/src/pl/plpython/plpython.h
+++ b/src/pl/plpython/plpython.h
@@ -69,26 +69,21 @@
* compatibility layer for Python 3 that when asked to convert a C
* string to a Python string it converts the C string from the
* PostgreSQL server encoding to a Python Unicode object.
+ *
+ * FIXME
*/
-#if PY_MAJOR_VERSION >= 3
#define PyString_Check(x) 0
#define PyString_AsString(x) PLyUnicode_AsString(x)
#define PyString_FromString(x) PLyUnicode_FromString(x)
#define PyString_FromStringAndSize(x, size) PLyUnicode_FromStringAndSize(x, size)
-#endif
/*
* Python 3 only has long.
+ *
+ * FIXME
*/
-#if PY_MAJOR_VERSION >= 3
#define PyInt_FromLong(x) PyLong_FromLong(x)
#define PyInt_AsLong(x) PyLong_AsLong(x)
-#endif
-
-/* Python 3 removed the Py_TPFLAGS_HAVE_ITER flag */
-#if PY_MAJOR_VERSION >= 3
-#define Py_TPFLAGS_HAVE_ITER 0
-#endif
/* define our text domain for translations */
#undef TEXTDOMAIN
diff --git a/src/pl/plpython/plpython2u--1.0.sql b/src/pl/plpython/plpython2u--1.0.sql
deleted file mode 100644
index 69f74775678..00000000000
--- a/src/pl/plpython/plpython2u--1.0.sql
+++ /dev/null
@@ -1,17 +0,0 @@
-/* src/pl/plpython/plpython2u--1.0.sql */
-
-CREATE FUNCTION plpython2_call_handler() RETURNS language_handler
- LANGUAGE c AS 'MODULE_PATHNAME';
-
-CREATE FUNCTION plpython2_inline_handler(internal) RETURNS void
- STRICT LANGUAGE c AS 'MODULE_PATHNAME';
-
-CREATE FUNCTION plpython2_validator(oid) RETURNS void
- STRICT LANGUAGE c AS 'MODULE_PATHNAME';
-
-CREATE LANGUAGE plpython2u
- HANDLER plpython2_call_handler
- INLINE plpython2_inline_handler
- VALIDATOR plpython2_validator;
-
-COMMENT ON LANGUAGE plpython2u IS 'PL/Python2U untrusted procedural language';
diff --git a/src/pl/plpython/plpython2u.control b/src/pl/plpython/plpython2u.control
deleted file mode 100644
index 39c2b791efe..00000000000
--- a/src/pl/plpython/plpython2u.control
+++ /dev/null
@@ -1,7 +0,0 @@
-# plpython2u extension
-comment = 'PL/Python2U untrusted procedural language'
-default_version = '1.0'
-module_pathname = '$libdir/plpython2'
-relocatable = false
-schema = pg_catalog
-superuser = true
diff --git a/src/pl/plpython/plpythonu--1.0.sql b/src/pl/plpython/plpythonu--1.0.sql
deleted file mode 100644
index 4c6f7c3f140..00000000000
--- a/src/pl/plpython/plpythonu--1.0.sql
+++ /dev/null
@@ -1,17 +0,0 @@
-/* src/pl/plpython/plpythonu--1.0.sql */
-
-CREATE FUNCTION plpython_call_handler() RETURNS language_handler
- LANGUAGE c AS 'MODULE_PATHNAME';
-
-CREATE FUNCTION plpython_inline_handler(internal) RETURNS void
- STRICT LANGUAGE c AS 'MODULE_PATHNAME';
-
-CREATE FUNCTION plpython_validator(oid) RETURNS void
- STRICT LANGUAGE c AS 'MODULE_PATHNAME';
-
-CREATE LANGUAGE plpythonu
- HANDLER plpython_call_handler
- INLINE plpython_inline_handler
- VALIDATOR plpython_validator;
-
-COMMENT ON LANGUAGE plpythonu IS 'PL/PythonU untrusted procedural language';
diff --git a/src/pl/plpython/plpythonu.control b/src/pl/plpython/plpythonu.control
deleted file mode 100644
index ae91b1c255c..00000000000
--- a/src/pl/plpython/plpythonu.control
+++ /dev/null
@@ -1,7 +0,0 @@
-# plpythonu extension
-comment = 'PL/PythonU untrusted procedural language'
-default_version = '1.0'
-module_pathname = '$libdir/plpython2'
-relocatable = false
-schema = pg_catalog
-superuser = true
diff --git a/src/pl/plpython/regress-python3-mangle.mk b/src/pl/plpython/regress-python3-mangle.mk
deleted file mode 100644
index a785818a172..00000000000
--- a/src/pl/plpython/regress-python3-mangle.mk
+++ /dev/null
@@ -1,38 +0,0 @@
-ifeq ($(python_majorversion),3)
-# Adjust regression tests for Python 3 compatibility
-#
-# Mention those regression test files that need to be mangled in the
-# variable REGRESS_PLPYTHON3_MANGLE. They will be copied to a
-# subdirectory python3/ and have their Python syntax and other bits
-# adjusted to work with Python 3.
-
-# Note that the order of the tests needs to be preserved in this
-# expression.
-REGRESS := $(foreach test,$(REGRESS),$(if $(filter $(test),$(REGRESS_PLPYTHON3_MANGLE)),python3/$(test),$(test)))
-
-.PHONY: pgregress-python3-mangle
-pgregress-python3-mangle:
- $(MKDIR_P) sql/python3 expected/python3 results/python3
- for file in $(patsubst %,$(srcdir)/sql/%.sql,$(REGRESS_PLPYTHON3_MANGLE)) $(patsubst %,$(srcdir)/expected/%*.out,$(REGRESS_PLPYTHON3_MANGLE)); do \
- sed \
- -e "s/<type 'exceptions\.\([[:alpha:]]*\)'>/<class '\1'>/g" \
- -e "s/<type 'long'>/<class 'int'>/g" \
- -e "s/\([0-9][0-9]*\)L/\1/g" \
- -e 's/\([ [{]\)u"/\1"/g' \
- -e "s/\([ [{]\)u'/\1'/g" \
- -e "s/def next/def __next__/g" \
- -e "s/LANGUAGE plpythonu/LANGUAGE plpython3u/g" \
- -e "s/LANGUAGE plpython2u/LANGUAGE plpython3u/g" \
- -e "s/EXTENSION plpythonu/EXTENSION plpython3u/g" \
- -e "s/EXTENSION plpython2u/EXTENSION plpython3u/g" \
- -e "s/EXTENSION \([^ ]*\)_plpythonu/EXTENSION \1_plpython3u/g" \
- -e "s/EXTENSION \([^ ]*\)_plpython2u/EXTENSION \1_plpython3u/g" \
- -e 's/installing required extension "plpython2u"/installing required extension "plpython3u"/g' \
- $$file >`echo $$file | sed 's,^.*/\([^/][^/]*/\)\([^/][^/]*\)$$,\1python3/\2,'` || exit; \
- done
-
-check installcheck: pgregress-python3-mangle
-
-pg_regress_clean_files += sql/python3/ expected/python3/ results/python3/
-
-endif # Python 3
diff --git a/src/pl/plpython/sql/plpython_call.sql b/src/pl/plpython/sql/plpython_call.sql
index b0b3705ae3c..daa4bc377d7 100644
--- a/src/pl/plpython/sql/plpython_call.sql
+++ b/src/pl/plpython/sql/plpython_call.sql
@@ -3,7 +3,7 @@
--
CREATE PROCEDURE test_proc1()
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
pass
$$;
@@ -13,7 +13,7 @@ CALL test_proc1();
-- error: can't return non-None
CREATE PROCEDURE test_proc2()
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
return 5
$$;
@@ -24,7 +24,7 @@ CALL test_proc2();
CREATE TABLE test1 (a int);
CREATE PROCEDURE test_proc3(x int)
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
plpy.execute("INSERT INTO test1 VALUES (%s)" % x)
$$;
@@ -37,7 +37,7 @@ SELECT * FROM test1;
-- output arguments
CREATE PROCEDURE test_proc5(INOUT a text)
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
return [a + '+' + a]
$$;
@@ -46,7 +46,7 @@ CALL test_proc5('abc');
CREATE PROCEDURE test_proc6(a int, INOUT b int, INOUT c int)
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
return (b * a, c * a)
$$;
@@ -57,7 +57,7 @@ CALL test_proc6(2, 3, 4);
-- OUT parameters
CREATE PROCEDURE test_proc9(IN a int, OUT b int)
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
plpy.notice("a: %s" % (a))
return (a * 2,)
diff --git a/src/pl/plpython/sql/plpython_composite.sql b/src/pl/plpython/sql/plpython_composite.sql
index 0fd2f5d5e3b..21757701cc1 100644
--- a/src/pl/plpython/sql/plpython_composite.sql
+++ b/src/pl/plpython/sql/plpython_composite.sql
@@ -1,6 +1,6 @@
CREATE FUNCTION multiout_simple(OUT i integer, OUT j integer) AS $$
return (1, 2)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT multiout_simple();
SELECT * FROM multiout_simple();
@@ -9,7 +9,7 @@ SELECT (multiout_simple()).j + 3;
CREATE FUNCTION multiout_simple_setof(n integer = 1, OUT integer, OUT integer) RETURNS SETOF record AS $$
return [(1, 2)] * n
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT multiout_simple_setof();
SELECT * FROM multiout_simple_setof();
@@ -34,7 +34,7 @@ elif typ == 'obj':
return type_record
elif typ == 'str':
return "('%s',%r)" % (first, second)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM multiout_record_as('dict', 'foo', 1, 'f');
SELECT multiout_record_as('dict', 'foo', 1, 'f');
@@ -77,7 +77,7 @@ for i in range(n):
power = 2 ** i
length = plpy.execute("select length('%d')" % power)[0]['length']
yield power, length
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM multiout_setof(3);
SELECT multiout_setof(5);
@@ -86,7 +86,7 @@ CREATE FUNCTION multiout_return_table() RETURNS TABLE (x integer, y text) AS $$
return [{'x': 4, 'y' :'four'},
{'x': 7, 'y' :'seven'},
{'x': 0, 'y' :'zero'}]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM multiout_return_table();
@@ -94,18 +94,18 @@ CREATE FUNCTION multiout_array(OUT integer[], OUT text) RETURNS SETOF record AS
yield [[1], 'a']
yield [[1,2], 'b']
yield [[1,2,3], None]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM multiout_array();
CREATE FUNCTION singleout_composite(OUT type_record) AS $$
return {'first': 1, 'second': 2}
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION multiout_composite(OUT type_record) RETURNS SETOF type_record AS $$
return [{'first': 1, 'second': 2},
{'first': 3, 'second': 4 }]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM singleout_composite();
SELECT * FROM multiout_composite();
@@ -113,7 +113,7 @@ SELECT * FROM multiout_composite();
-- composite OUT parameters in functions returning RECORD not supported yet
CREATE FUNCTION multiout_composite(INOUT n integer, OUT type_record) AS $$
return (n, (n * 2, n * 3))
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION multiout_table_type_setof(typ text, returnnull boolean, INOUT n integer, OUT table_record) RETURNS SETOF record AS $$
if returnnull:
@@ -132,7 +132,7 @@ elif typ == 'str':
d = "(%r,%r)" % (n * 2, n * 3)
for i in range(n):
yield (i, d)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM multiout_composite(2);
SELECT * FROM multiout_table_type_setof('dict', 'f', 3);
@@ -157,7 +157,7 @@ CREATE TABLE changing (
CREATE FUNCTION changing_test(OUT n integer, OUT changing) RETURNS SETOF record AS $$
return [(1, {'i': 1, 'j': 2}),
(1, (3, 4))]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM changing_test();
ALTER TABLE changing DROP COLUMN j;
@@ -178,14 +178,14 @@ yield {'tab': [('first', 1), ('second', 2)],
yield {'tab': [('first', 1), ('second', 2)],
'typ': [{'first': 'third', 'second': 3},
{'first': 'fourth', 'second': 4}]}
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM composite_types_table();
-- check what happens if the output record descriptor changes
CREATE FUNCTION return_record(t text) RETURNS record AS $$
return {'t': t, 'val': 10}
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM return_record('abc') AS r(t text, val integer);
SELECT * FROM return_record('abc') AS r(t text, val bigint);
@@ -196,7 +196,7 @@ SELECT * FROM return_record('999') AS r(val text, t integer);
CREATE FUNCTION return_record_2(t text) RETURNS record AS $$
return {'v1':1,'v2':2,t:3}
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM return_record_2('v3') AS (v3 int, v2 int, v1 int);
SELECT * FROM return_record_2('v3') AS (v2 int, v3 int, v1 int);
@@ -211,7 +211,7 @@ SELECT * FROM return_record_2('v3') AS (v1 int, v2 int, v3 int);
-- multi-dimensional array of composite types.
CREATE FUNCTION composite_type_as_list() RETURNS type_record[] AS $$
return [[('first', 1), ('second', 1)], [('first', 2), ('second', 2)], [('first', 3), ('second', 3)]];
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM composite_type_as_list();
-- Starting with PostgreSQL 10, a composite type in an array cannot be
@@ -220,5 +220,5 @@ SELECT * FROM composite_type_as_list();
-- on the issue.
CREATE FUNCTION composite_type_as_list_broken() RETURNS type_record[] AS $$
return [['first', 1]];
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM composite_type_as_list_broken();
diff --git a/src/pl/plpython/sql/plpython_do.sql b/src/pl/plpython/sql/plpython_do.sql
index 0e281a08ee8..d49413268e9 100644
--- a/src/pl/plpython/sql/plpython_do.sql
+++ b/src/pl/plpython/sql/plpython_do.sql
@@ -1,5 +1,3 @@
-DO $$ plpy.notice("This is plpythonu.") $$ LANGUAGE plpythonu;
+DO $$ plpy.notice("This is plpython3u.") $$ LANGUAGE plpython3u;
-DO $$ plpy.notice("This is plpython2u.") $$ LANGUAGE plpython2u;
-
-DO $$ raise Exception("error test") $$ LANGUAGE plpythonu;
+DO $$ raise Exception("error test") $$ LANGUAGE plpython3u;
diff --git a/src/pl/plpython/sql/plpython_drop.sql b/src/pl/plpython/sql/plpython_drop.sql
index 72d5d657ec3..e4f373b2bc7 100644
--- a/src/pl/plpython/sql/plpython_drop.sql
+++ b/src/pl/plpython/sql/plpython_drop.sql
@@ -3,6 +3,4 @@
--
SET client_min_messages = WARNING;
-DROP EXTENSION plpythonu CASCADE;
-
-DROP EXTENSION IF EXISTS plpython2u CASCADE;
+DROP EXTENSION plpython3u CASCADE;
diff --git a/src/pl/plpython/sql/plpython_ereport.sql b/src/pl/plpython/sql/plpython_ereport.sql
index 58df2057ef5..3bcf8f5cde9 100644
--- a/src/pl/plpython/sql/plpython_ereport.sql
+++ b/src/pl/plpython/sql/plpython_ereport.sql
@@ -17,28 +17,28 @@ plpy.info('This is message text.',
plpy.notice('notice', detail='some detail')
plpy.warning('warning', detail='some detail')
plpy.error('stop on error', detail='some detail', hint='some hint')
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT elog_test();
-DO $$ plpy.info('other types', detail=(10, 20)) $$ LANGUAGE plpythonu;
+DO $$ plpy.info('other types', detail=(10, 20)) $$ LANGUAGE plpython3u;
DO $$
import time;
from datetime import date
plpy.info('other types', detail=date(2016, 2, 26))
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
DO $$
basket = ['apple', 'orange', 'apple', 'pear', 'orange', 'banana']
plpy.info('other types', detail=basket)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
-- should fail
-DO $$ plpy.info('wrong sqlstate', sqlstate='54444A') $$ LANGUAGE plpythonu;
-DO $$ plpy.info('unsupported argument', blabla='fooboo') $$ LANGUAGE plpythonu;
-DO $$ plpy.info('first message', message='second message') $$ LANGUAGE plpythonu;
-DO $$ plpy.info('first message', 'second message', message='third message') $$ LANGUAGE plpythonu;
+DO $$ plpy.info('wrong sqlstate', sqlstate='54444A') $$ LANGUAGE plpython3u;
+DO $$ plpy.info('unsupported argument', blabla='fooboo') $$ LANGUAGE plpython3u;
+DO $$ plpy.info('first message', message='second message') $$ LANGUAGE plpython3u;
+DO $$ plpy.info('first message', 'second message', message='third message') $$ LANGUAGE plpython3u;
-- raise exception in python, handle exception in plgsql
CREATE OR REPLACE FUNCTION raise_exception(_message text, _detail text DEFAULT NULL, _hint text DEFAULT NULL,
@@ -57,7 +57,7 @@ kwargs = {
}
# ignore None values
plpy.error(**dict((k, v) for k, v in iter(kwargs.items()) if v))
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT raise_exception('hello', 'world');
SELECT raise_exception('message text', 'detail text', _sqlstate => 'YY333');
@@ -128,7 +128,7 @@ try:
except Exception as e:
plpy.info(e.spidata)
raise e
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
DO $$
try:
@@ -136,4 +136,4 @@ try:
except Exception as e:
plpy.info('sqlstate: %s, hint: %s, table_name: %s, datatype_name: %s' % (e.sqlstate, e.hint, e.table_name, e.datatype_name))
raise e
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
diff --git a/src/pl/plpython/sql/plpython_error.sql b/src/pl/plpython/sql/plpython_error.sql
index 88d6936fd0d..11f14ec5a7c 100644
--- a/src/pl/plpython/sql/plpython_error.sql
+++ b/src/pl/plpython/sql/plpython_error.sql
@@ -7,7 +7,7 @@
CREATE FUNCTION python_syntax_error() RETURNS text
AS
'.syntaxerror'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
/* With check_function_bodies = false the function should get defined
* and the error reported when called
@@ -17,7 +17,7 @@ SET check_function_bodies = false;
CREATE FUNCTION python_syntax_error() RETURNS text
AS
'.syntaxerror'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT python_syntax_error();
/* Run the function twice to check if the hashtable entry gets cleaned up */
@@ -30,7 +30,7 @@ RESET check_function_bodies;
CREATE FUNCTION sql_syntax_error() RETURNS text
AS
'plpy.execute("syntax error")'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT sql_syntax_error();
@@ -40,7 +40,7 @@ SELECT sql_syntax_error();
CREATE FUNCTION exception_index_invalid(text) RETURNS text
AS
'return args[1]'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT exception_index_invalid('test');
@@ -51,7 +51,7 @@ CREATE FUNCTION exception_index_invalid_nested() RETURNS text
AS
'rv = plpy.execute("SELECT test5(''foo'')")
return rv[0]'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT exception_index_invalid_nested();
@@ -68,7 +68,7 @@ if len(rv):
return rv[0]["fname"]
return None
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT invalid_type_uncaught('rick');
@@ -90,7 +90,7 @@ if len(rv):
return rv[0]["fname"]
return None
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT invalid_type_caught('rick');
@@ -111,7 +111,7 @@ if len(rv):
return rv[0]["fname"]
return None
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT invalid_type_reraised('rick');
@@ -127,7 +127,7 @@ if len(rv):
return rv[0]["fname"]
return None
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT valid_type('rick');
@@ -147,7 +147,7 @@ def fun3():
fun3()
return "not reached"
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT nested_error();
@@ -167,7 +167,7 @@ def fun3():
fun3()
return "not reached"
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT nested_error_raise();
@@ -187,7 +187,7 @@ def fun3():
fun3()
return "you''ve been warned"
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT nested_warning();
@@ -196,7 +196,7 @@ SELECT nested_warning();
CREATE FUNCTION toplevel_attribute_error() RETURNS void AS
$$
plpy.nonexistent
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT toplevel_attribute_error();
@@ -213,7 +213,7 @@ def third():
plpy.execute("select sql_error()")
first()
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE OR REPLACE FUNCTION sql_error() RETURNS void AS $$
begin
@@ -229,7 +229,7 @@ $$ LANGUAGE plpgsql;
CREATE OR REPLACE FUNCTION sql_from_python_error() RETURNS void AS $$
plpy.execute("select sql_error()")
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT python_traceback();
SELECT sql_error();
@@ -251,7 +251,7 @@ except spiexceptions.NotNullViolation as e:
plpy.notice("Violated the NOT NULL constraint, sqlstate %s" % e.sqlstate)
except spiexceptions.UniqueViolation as e:
plpy.notice("Violated the UNIQUE constraint, sqlstate %s" % e.sqlstate)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT specific_exception(2);
SELECT specific_exception(NULL);
@@ -262,7 +262,7 @@ SELECT specific_exception(2);
CREATE FUNCTION python_unique_violation() RETURNS void AS $$
plpy.execute("insert into specific values (1)")
plpy.execute("insert into specific values (1)")
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION catch_python_unique_violation() RETURNS text AS $$
begin
@@ -283,7 +283,7 @@ CREATE FUNCTION manual_subxact() RETURNS void AS $$
plpy.execute("savepoint save")
plpy.execute("create table foo(x integer)")
plpy.execute("rollback to save")
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT manual_subxact();
@@ -295,7 +295,7 @@ rollback = plpy.prepare("rollback to save")
plpy.execute(save)
plpy.execute("create table foo(x integer)")
plpy.execute(rollback)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT manual_subxact_prepared();
@@ -303,7 +303,7 @@ SELECT manual_subxact_prepared();
*/
CREATE FUNCTION plpy_raise_spiexception() RETURNS void AS $$
raise plpy.spiexceptions.DivisionByZero()
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
DO $$
BEGIN
@@ -319,7 +319,7 @@ CREATE FUNCTION plpy_raise_spiexception_override() RETURNS void AS $$
exc = plpy.spiexceptions.DivisionByZero()
exc.sqlstate = 'SILLY'
raise exc
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
DO $$
BEGIN
@@ -332,14 +332,14 @@ $$ LANGUAGE plpgsql;
/* test the context stack trace for nested execution levels
*/
CREATE FUNCTION notice_innerfunc() RETURNS int AS $$
-plpy.execute("DO LANGUAGE plpythonu $x$ plpy.notice('inside DO') $x$")
+plpy.execute("DO LANGUAGE plpython3u $x$ plpy.notice('inside DO') $x$")
return 1
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION notice_outerfunc() RETURNS int AS $$
plpy.execute("SELECT notice_innerfunc()")
return 1
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
\set SHOW_CONTEXT always
diff --git a/src/pl/plpython/sql/plpython_global.sql b/src/pl/plpython/sql/plpython_global.sql
index 32502b41eee..96d20492861 100644
--- a/src/pl/plpython/sql/plpython_global.sql
+++ b/src/pl/plpython/sql/plpython_global.sql
@@ -9,7 +9,7 @@ CREATE FUNCTION global_test_one() returns text
if "global_test" not in GD:
GD["global_test"] = "set by global_test_one"
return "SD: " + SD["global_test"] + ", GD: " + GD["global_test"]'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION global_test_two() returns text
AS
@@ -18,7 +18,7 @@ CREATE FUNCTION global_test_two() returns text
if "global_test" not in GD:
GD["global_test"] = "set by global_test_two"
return "SD: " + SD["global_test"] + ", GD: " + GD["global_test"]'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION static_test() returns int4
@@ -29,7 +29,7 @@ else:
SD["call"] = 1
return SD["call"]
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
SELECT static_test();
diff --git a/src/pl/plpython/sql/plpython_import.sql b/src/pl/plpython/sql/plpython_import.sql
index ec887677e1e..3031eef2e69 100644
--- a/src/pl/plpython/sql/plpython_import.sql
+++ b/src/pl/plpython/sql/plpython_import.sql
@@ -7,7 +7,7 @@ CREATE FUNCTION import_fail() returns text
except ImportError:
return "failed as expected"
return "succeeded, that wasn''t supposed to happen"'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION import_succeed() returns text
@@ -28,7 +28,7 @@ except Exception as ex:
plpy.notice("import failed -- %s" % str(ex))
return "failed, that wasn''t supposed to happen"
return "succeeded, as expected"'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION import_test_one(p text) RETURNS text
AS
@@ -39,7 +39,7 @@ except ImportError:
import sha
digest = sha.new(p)
return digest.hexdigest()'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION import_test_two(u users) RETURNS text
AS
@@ -51,7 +51,7 @@ except ImportError:
import sha
digest = sha.new(plain);
return "sha hash of " + plain + " is " + digest.hexdigest()'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
-- import python modules
diff --git a/src/pl/plpython/sql/plpython_newline.sql b/src/pl/plpython/sql/plpython_newline.sql
index f9cee9491bb..cb22ba923f9 100644
--- a/src/pl/plpython/sql/plpython_newline.sql
+++ b/src/pl/plpython/sql/plpython_newline.sql
@@ -4,15 +4,15 @@
CREATE OR REPLACE FUNCTION newline_lf() RETURNS integer AS
E'x = 100\ny = 23\nreturn x + y\n'
-LANGUAGE plpythonu;
+LANGUAGE plpython3u;
CREATE OR REPLACE FUNCTION newline_cr() RETURNS integer AS
E'x = 100\ry = 23\rreturn x + y\r'
-LANGUAGE plpythonu;
+LANGUAGE plpython3u;
CREATE OR REPLACE FUNCTION newline_crlf() RETURNS integer AS
E'x = 100\r\ny = 23\r\nreturn x + y\r\n'
-LANGUAGE plpythonu;
+LANGUAGE plpython3u;
SELECT newline_lf();
diff --git a/src/pl/plpython/sql/plpython_params.sql b/src/pl/plpython/sql/plpython_params.sql
index ee75c4dc410..8bab4888592 100644
--- a/src/pl/plpython/sql/plpython_params.sql
+++ b/src/pl/plpython/sql/plpython_params.sql
@@ -4,13 +4,13 @@
CREATE FUNCTION test_param_names0(integer, integer) RETURNS int AS $$
return args[0] + args[1]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_param_names1(a0 integer, a1 text) RETURNS boolean AS $$
assert a0 == args[0]
assert a1 == args[1]
return True
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_param_names2(u users) RETURNS text AS $$
assert u == args[0]
@@ -22,7 +22,7 @@ if isinstance(u, dict):
else:
s = str(u)
return s
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
-- use deliberately wrong parameter names
CREATE FUNCTION test_param_names3(a0 integer) RETURNS boolean AS $$
@@ -32,7 +32,7 @@ try:
except NameError as e:
assert e.args[0].find("a1") > -1
return True
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT test_param_names0(2,7);
diff --git a/src/pl/plpython/sql/plpython_quote.sql b/src/pl/plpython/sql/plpython_quote.sql
index 346b5485daf..a1133e7e266 100644
--- a/src/pl/plpython/sql/plpython_quote.sql
+++ b/src/pl/plpython/sql/plpython_quote.sql
@@ -9,7 +9,7 @@ CREATE FUNCTION quote(t text, how text) RETURNS text AS $$
return plpy.quote_ident(t)
else:
raise plpy.Error("unrecognized quote type %s" % how)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT quote(t, 'literal') FROM (VALUES
('abc'),
diff --git a/src/pl/plpython/sql/plpython_record.sql b/src/pl/plpython/sql/plpython_record.sql
index 9bab4c9e82d..52bad8bccea 100644
--- a/src/pl/plpython/sql/plpython_record.sql
+++ b/src/pl/plpython/sql/plpython_record.sql
@@ -27,7 +27,7 @@ elif typ == 'obj':
type_record.first = first
type_record.second = second
return type_record
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_type_record_as(typ text, first text, second integer, retnull boolean) RETURNS type_record AS $$
if retnull:
@@ -45,20 +45,20 @@ elif typ == 'obj':
return type_record
elif typ == 'str':
return "('%s',%r)" % (first, second)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_in_out_params(first in text, second out text) AS $$
return first + '_in_to_out';
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_in_out_params_multi(first in text,
second out text, third out text) AS $$
return (first + '_record_in_to_out_1', first + '_record_in_to_out_2');
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_inout_params(first inout text) AS $$
return first + '_inout';
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
-- Test tuple returning functions
@@ -136,14 +136,14 @@ SELECT * FROM test_type_record_as('obj', 'one', 1, false);
CREATE FUNCTION test_type_record_error1() RETURNS type_record AS $$
return { 'first': 'first' }
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_record_error1();
CREATE FUNCTION test_type_record_error2() RETURNS type_record AS $$
return [ 'first' ]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_record_error2();
@@ -152,12 +152,12 @@ CREATE FUNCTION test_type_record_error3() RETURNS type_record AS $$
class type_record: pass
type_record.first = 'first'
return type_record
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_record_error3();
CREATE FUNCTION test_type_record_error4() RETURNS type_record AS $$
return 'foo'
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_record_error4();
diff --git a/src/pl/plpython/sql/plpython_setof.sql b/src/pl/plpython/sql/plpython_setof.sql
index 16c2eef0ad6..4cfb10192c0 100644
--- a/src/pl/plpython/sql/plpython_setof.sql
+++ b/src/pl/plpython/sql/plpython_setof.sql
@@ -4,21 +4,21 @@
CREATE FUNCTION test_setof_error() RETURNS SETOF text AS $$
return 37
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT test_setof_error();
CREATE FUNCTION test_setof_as_list(count integer, content text) RETURNS SETOF text AS $$
return [ content ]*count
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_setof_as_tuple(count integer, content text) RETURNS SETOF text AS $$
t = ()
for i in range(count):
t += ( content, )
return t
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_setof_as_iterator(count integer, content text) RETURNS SETOF text AS $$
class producer:
@@ -27,13 +27,13 @@ class producer:
self.icount = icount
def __iter__ (self):
return self
- def next (self):
+ def __next__ (self):
if self.icount == 0:
raise StopIteration
self.icount -= 1
return self.icontent
return producer(count, content)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_setof_spi_in_iterator() RETURNS SETOF text AS
$$
@@ -42,7 +42,7 @@ $$
yield s
plpy.execute('select 2')
$$
-LANGUAGE plpythonu;
+LANGUAGE plpython3u;
-- Test set returning functions
@@ -69,7 +69,7 @@ global x
while x <= lim:
yield x
x = x + 1
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT ugly(1, 5);
@@ -81,7 +81,7 @@ CREATE OR REPLACE FUNCTION get_user_records()
RETURNS SETOF users
AS $$
return plpy.execute("SELECT * FROM users ORDER BY username")
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT get_user_records();
SELECT * FROM get_user_records();
@@ -91,7 +91,7 @@ CREATE OR REPLACE FUNCTION get_user_records2()
RETURNS TABLE(fname text, lname text, username text, userid int)
AS $$
return plpy.execute("SELECT * FROM users ORDER BY username")
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT get_user_records2();
SELECT * FROM get_user_records2();
diff --git a/src/pl/plpython/sql/plpython_spi.sql b/src/pl/plpython/sql/plpython_spi.sql
index dd77833ed56..112add93fc9 100644
--- a/src/pl/plpython/sql/plpython_spi.sql
+++ b/src/pl/plpython/sql/plpython_spi.sql
@@ -7,19 +7,19 @@ CREATE FUNCTION nested_call_one(a text) RETURNS text
'q = "SELECT nested_call_two(''%s'')" % a
r = plpy.execute(q)
return r[0]'
- LANGUAGE plpythonu ;
+ LANGUAGE plpython3u ;
CREATE FUNCTION nested_call_two(a text) RETURNS text
AS
'q = "SELECT nested_call_three(''%s'')" % a
r = plpy.execute(q)
return r[0]'
- LANGUAGE plpythonu ;
+ LANGUAGE plpython3u ;
CREATE FUNCTION nested_call_three(a text) RETURNS text
AS
'return a'
- LANGUAGE plpythonu ;
+ LANGUAGE plpython3u ;
-- some spi stuff
@@ -35,7 +35,7 @@ except Exception as ex:
plpy.error(str(ex))
return None
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION spi_prepared_plan_test_two(a text) RETURNS text
AS
@@ -49,7 +49,7 @@ except Exception as ex:
plpy.error(str(ex))
return None
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION spi_prepared_plan_test_nested(a text) RETURNS text
AS
@@ -64,7 +64,7 @@ except Exception as ex:
plpy.error(str(ex))
return None
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION join_sequences(s sequences) RETURNS text
AS
@@ -77,7 +77,7 @@ for r in rv:
seq = seq + r["sequence"]
return seq
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION spi_recursive_sum(a int) RETURNS int
AS
@@ -86,7 +86,7 @@ if a > 1:
r = plpy.execute("SELECT spi_recursive_sum(%d) as a" % (a-1))[0]["a"]
return a + r
'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
--
-- spi and nested calls
@@ -120,7 +120,7 @@ if result.status() > 0:
return result.nrows()
else:
return None
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT result_metadata_test($$SELECT 1 AS foo, '11'::text AS bar UNION SELECT 2, '22'$$);
SELECT result_metadata_test($$CREATE TEMPORARY TABLE foo1 (a int, b text)$$);
@@ -129,7 +129,7 @@ CREATE FUNCTION result_nrows_test(cmd text) RETURNS int
AS $$
result = plpy.execute(cmd)
return result.nrows()
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT result_nrows_test($$SELECT 1$$);
SELECT result_nrows_test($$CREATE TEMPORARY TABLE foo2 (a int, b text)$$);
@@ -140,7 +140,7 @@ CREATE FUNCTION result_len_test(cmd text) RETURNS int
AS $$
result = plpy.execute(cmd)
return len(result)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT result_len_test($$SELECT 1$$);
SELECT result_len_test($$CREATE TEMPORARY TABLE foo3 (a int, b text)$$);
@@ -170,7 +170,7 @@ except TypeError:
else:
assert False, "TypeError not raised"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT result_subscript_test();
@@ -180,7 +180,7 @@ result = plpy.execute("select 1 where false")
plpy.info(result[:])
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT result_empty_test();
@@ -189,7 +189,7 @@ AS $$
plan = plpy.prepare(cmd)
result = plpy.execute(plan)
return str(result)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT result_str_test($$SELECT 1 AS foo UNION SELECT 2$$);
SELECT result_str_test($$CREATE TEMPORARY TABLE foo1 (a int, b text)$$);
@@ -203,13 +203,13 @@ for row in res:
if row['lname'] == 'doe':
does += 1
return does
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION double_cursor_close() RETURNS int AS $$
res = plpy.cursor("select fname, lname from users")
res.close()
res.close()
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION cursor_fetch() RETURNS int AS $$
res = plpy.cursor("select fname, lname from users")
@@ -228,7 +228,7 @@ except StopIteration:
pass
else:
assert False, "StopIteration not raised"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION cursor_mix_next_and_fetch() RETURNS int AS $$
res = plpy.cursor("select fname, lname from users order by fname")
@@ -242,7 +242,7 @@ except AttributeError:
assert item['fname'] == 'rick'
assert len(res.fetch(2)) == 1
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION fetch_after_close() RETURNS int AS $$
res = plpy.cursor("select fname, lname from users")
@@ -253,7 +253,7 @@ except ValueError:
pass
else:
assert False, "ValueError not raised"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION next_after_close() RETURNS int AS $$
res = plpy.cursor("select fname, lname from users")
@@ -267,7 +267,7 @@ except ValueError:
pass
else:
assert False, "ValueError not raised"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION cursor_fetch_next_empty() RETURNS int AS $$
res = plpy.cursor("select fname, lname from users where false")
@@ -281,7 +281,7 @@ except StopIteration:
pass
else:
assert False, "StopIteration not raised"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION cursor_plan() RETURNS SETOF text AS $$
plan = plpy.prepare(
@@ -291,13 +291,13 @@ for row in plpy.cursor(plan, ["w"]):
yield row['fname']
for row in plan.cursor(["j"]):
yield row['fname']
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION cursor_plan_wrong_args() RETURNS SETOF text AS $$
plan = plpy.prepare("select fname, lname from users where fname like $1 || '%'",
["text"])
c = plpy.cursor(plan, ["a", "b"])
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TYPE test_composite_type AS (
a1 int,
@@ -308,7 +308,7 @@ CREATE OR REPLACE FUNCTION plan_composite_args() RETURNS test_composite_type AS
plan = plpy.prepare("select $1 as c1", ["test_composite_type"])
res = plpy.execute(plan, [{"a1": 3, "a2": "label"}])
return res[0]["c1"]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT simple_cursor_test();
SELECT double_cursor_close();
diff --git a/src/pl/plpython/sql/plpython_subtransaction.sql b/src/pl/plpython/sql/plpython_subtransaction.sql
index cc4b1ae102b..c65c380f40c 100644
--- a/src/pl/plpython/sql/plpython_subtransaction.sql
+++ b/src/pl/plpython/sql/plpython_subtransaction.sql
@@ -17,7 +17,7 @@ with plpy.subtransaction():
plpy.execute("INSERT INTO subtransaction_tbl VALUES ('oops')")
elif what_error == "Python":
raise Exception("Python exception")
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT subtransaction_ctx_test();
SELECT * FROM subtransaction_tbl;
@@ -45,7 +45,7 @@ with plpy.subtransaction():
raise
plpy.notice("Swallowed %s(%r)" % (e.__class__.__name__, e.args[0]))
return "ok"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT subtransaction_nested_test();
SELECT * FROM subtransaction_tbl;
@@ -65,7 +65,7 @@ with plpy.subtransaction():
plpy.execute("INSERT INTO subtransaction_tbl VALUES (2)")
plpy.execute("SELECT subtransaction_nested_test('t')")
return "ok"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT subtransaction_deeply_nested_test();
SELECT * FROM subtransaction_tbl;
@@ -76,25 +76,25 @@ TRUNCATE subtransaction_tbl;
CREATE FUNCTION subtransaction_exit_without_enter() RETURNS void
AS $$
plpy.subtransaction().__exit__(None, None, None)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION subtransaction_enter_without_exit() RETURNS void
AS $$
plpy.subtransaction().__enter__()
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION subtransaction_exit_twice() RETURNS void
AS $$
plpy.subtransaction().__enter__()
plpy.subtransaction().__exit__(None, None, None)
plpy.subtransaction().__exit__(None, None, None)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION subtransaction_enter_twice() RETURNS void
AS $$
plpy.subtransaction().__enter__()
plpy.subtransaction().__enter__()
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION subtransaction_exit_same_subtransaction_twice() RETURNS void
AS $$
@@ -102,7 +102,7 @@ s = plpy.subtransaction()
s.__enter__()
s.__exit__(None, None, None)
s.__exit__(None, None, None)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION subtransaction_enter_same_subtransaction_twice() RETURNS void
AS $$
@@ -110,14 +110,14 @@ s = plpy.subtransaction()
s.__enter__()
s.__enter__()
s.__exit__(None, None, None)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
-- No warnings here, as the subtransaction gets indeed closed
CREATE FUNCTION subtransaction_enter_subtransaction_in_with() RETURNS void
AS $$
with plpy.subtransaction() as s:
s.__enter__()
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION subtransaction_exit_subtransaction_in_with() RETURNS void
AS $$
@@ -126,7 +126,7 @@ try:
s.__exit__(None, None, None)
except ValueError as e:
raise ValueError(e)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT subtransaction_exit_without_enter();
SELECT subtransaction_enter_without_exit();
@@ -159,7 +159,7 @@ try:
plpy.execute(p, ["wrong"])
except plpy.SPIError:
plpy.warning("Caught a SPI error")
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT subtransaction_mix_explicit_and_implicit();
SELECT * FROM subtransaction_tbl;
@@ -172,7 +172,7 @@ AS $$
s = plpy.subtransaction()
s.enter()
s.exit(None, None, None)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT subtransaction_alternative_names();
@@ -186,7 +186,7 @@ with plpy.subtransaction():
plpy.execute("INSERT INTO subtransaction_tbl VALUES ('a')")
except plpy.SPIError:
plpy.notice("caught")
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT try_catch_inside_subtransaction();
SELECT * FROM subtransaction_tbl;
@@ -202,7 +202,7 @@ with plpy.subtransaction():
plpy.execute("INSERT INTO subtransaction_tbl VALUES (1)")
except plpy.SPIError:
plpy.notice("caught")
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT pk_violation_inside_subtransaction();
SELECT * FROM subtransaction_tbl;
@@ -217,7 +217,7 @@ with plpy.subtransaction():
cur.fetch(10)
fetched = cur.fetch(10);
return int(fetched[5]["i"])
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION cursor_aborted_subxact() RETURNS int AS $$
try:
@@ -229,7 +229,7 @@ except plpy.SPIError:
fetched = cur.fetch(10)
return int(fetched[5]["i"])
return 0 # not reached
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION cursor_plan_aborted_subxact() RETURNS int AS $$
try:
@@ -243,7 +243,7 @@ except plpy.SPIError:
fetched = cur.fetch(5)
return fetched[2]["i"]
return 0 # not reached
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION cursor_close_aborted_subxact() RETURNS boolean AS $$
try:
@@ -254,7 +254,7 @@ except plpy.SPIError:
cur.close()
return True
return False # not reached
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT cursor_in_subxact();
SELECT cursor_aborted_subxact();
diff --git a/src/pl/plpython/sql/plpython_test.sql b/src/pl/plpython/sql/plpython_test.sql
index 5f1be9c94a8..aa22a274155 100644
--- a/src/pl/plpython/sql/plpython_test.sql
+++ b/src/pl/plpython/sql/plpython_test.sql
@@ -1,13 +1,13 @@
-- first some tests of basic functionality
-CREATE EXTENSION plpython2u;
+CREATE EXTENSION plpython3u;
-- really stupid function just to get the module loaded
-CREATE FUNCTION stupid() RETURNS text AS 'return "zarkon"' LANGUAGE plpythonu;
+CREATE FUNCTION stupid() RETURNS text AS 'return "zarkon"' LANGUAGE plpython3u;
select stupid();
-- check 2/3 versioning
-CREATE FUNCTION stupidn() RETURNS text AS 'return "zarkon"' LANGUAGE plpython2u;
+CREATE FUNCTION stupidn() RETURNS text AS 'return "zarkon"' LANGUAGE plpython3u;
select stupidn();
@@ -21,7 +21,7 @@ for key in keys:
out.append("%s: %s" % (key, u[key]))
words = a1 + " " + a2 + " => {" + ", ".join(out) + "}"
return words'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
select "Argument test #1"(users, fname, lname) from users where lname = 'doe' order by 1;
@@ -32,7 +32,7 @@ $$
contents = list(filter(lambda x: not x.startswith("__"), dir(plpy)))
contents.sort()
return contents
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
select module_contents();
@@ -47,6 +47,6 @@ plpy.info('info', 37, [1, 2, 3])
plpy.notice('notice')
plpy.warning('warning')
plpy.error('error')
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT elog_test_basic();
diff --git a/src/pl/plpython/sql/plpython_transaction.sql b/src/pl/plpython/sql/plpython_transaction.sql
index 33b37e5b7fc..f9062254572 100644
--- a/src/pl/plpython/sql/plpython_transaction.sql
+++ b/src/pl/plpython/sql/plpython_transaction.sql
@@ -2,7 +2,7 @@ CREATE TABLE test1 (a int, b text);
CREATE PROCEDURE transaction_test1()
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
for i in range(0, 10):
plpy.execute("INSERT INTO test1 (a) VALUES (%d)" % i)
@@ -20,7 +20,7 @@ SELECT * FROM test1;
TRUNCATE test1;
DO
-LANGUAGE plpythonu
+LANGUAGE plpython3u
$$
for i in range(0, 10):
plpy.execute("INSERT INTO test1 (a) VALUES (%d)" % i)
@@ -37,7 +37,7 @@ TRUNCATE test1;
-- not allowed in a function
CREATE FUNCTION transaction_test2() RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
for i in range(0, 10):
plpy.execute("INSERT INTO test1 (a) VALUES (%d)" % i)
@@ -55,7 +55,7 @@ SELECT * FROM test1;
-- also not allowed if procedure is called from a function
CREATE FUNCTION transaction_test3() RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
plpy.execute("CALL transaction_test1()")
return 1
@@ -68,9 +68,9 @@ SELECT * FROM test1;
-- DO block inside function
CREATE FUNCTION transaction_test4() RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
-plpy.execute("DO LANGUAGE plpythonu $x$ plpy.commit() $x$")
+plpy.execute("DO LANGUAGE plpython3u $x$ plpy.commit() $x$")
return 1
$$;
@@ -78,7 +78,7 @@ SELECT transaction_test4();
-- commit inside subtransaction (prohibited)
-DO LANGUAGE plpythonu $$
+DO LANGUAGE plpython3u $$
s = plpy.subtransaction()
s.enter()
plpy.commit()
@@ -91,7 +91,7 @@ INSERT INTO test2 VALUES (0), (1), (2), (3), (4);
TRUNCATE test1;
-DO LANGUAGE plpythonu $$
+DO LANGUAGE plpython3u $$
for row in plpy.cursor("SELECT * FROM test2 ORDER BY x"):
plpy.execute("INSERT INTO test1 (a) VALUES (%s)" % row['x'])
plpy.commit()
@@ -106,7 +106,7 @@ SELECT * FROM pg_cursors;
-- error in cursor loop with commit
TRUNCATE test1;
-DO LANGUAGE plpythonu $$
+DO LANGUAGE plpython3u $$
for row in plpy.cursor("SELECT * FROM test2 ORDER BY x"):
plpy.execute("INSERT INTO test1 (a) VALUES (12/(%s-2))" % row['x'])
plpy.commit()
@@ -120,7 +120,7 @@ SELECT * FROM pg_cursors;
-- rollback inside cursor loop
TRUNCATE test1;
-DO LANGUAGE plpythonu $$
+DO LANGUAGE plpython3u $$
for row in plpy.cursor("SELECT * FROM test2 ORDER BY x"):
plpy.execute("INSERT INTO test1 (a) VALUES (%s)" % row['x'])
plpy.rollback()
@@ -134,7 +134,7 @@ SELECT * FROM pg_cursors;
-- first commit then rollback inside cursor loop
TRUNCATE test1;
-DO LANGUAGE plpythonu $$
+DO LANGUAGE plpython3u $$
for row in plpy.cursor("SELECT * FROM test2 ORDER BY x"):
plpy.execute("INSERT INTO test1 (a) VALUES (%s)" % row['x'])
if row['x'] % 2 == 0:
diff --git a/src/pl/plpython/sql/plpython_trigger.sql b/src/pl/plpython/sql/plpython_trigger.sql
index 19852dc5851..e5504b9ab1d 100644
--- a/src/pl/plpython/sql/plpython_trigger.sql
+++ b/src/pl/plpython/sql/plpython_trigger.sql
@@ -16,7 +16,7 @@ if TD["new"]["fname"] == "william":
TD["new"]["fname"] = TD["args"][0]
rv = "MODIFY"
return rv'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION users_update() returns trigger
@@ -25,7 +25,7 @@ CREATE FUNCTION users_update() returns trigger
if TD["old"]["fname"] != TD["new"]["fname"] and TD["old"]["fname"] == TD["args"][0]:
return "SKIP"
return None'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE FUNCTION users_delete() RETURNS trigger
@@ -33,7 +33,7 @@ CREATE FUNCTION users_delete() RETURNS trigger
'if TD["old"]["fname"] == TD["args"][0]:
return "SKIP"
return None'
- LANGUAGE plpythonu;
+ LANGUAGE plpython3u;
CREATE TRIGGER users_insert_trig BEFORE INSERT ON users FOR EACH ROW
@@ -72,7 +72,7 @@ CREATE TABLE trigger_test_generated (
j int GENERATED ALWAYS AS (i * 2) STORED
);
-CREATE FUNCTION trigger_data() RETURNS trigger LANGUAGE plpythonu AS $$
+CREATE FUNCTION trigger_data() RETURNS trigger LANGUAGE plpython3u AS $$
if 'relid' in TD:
TD['relid'] = "bogus:12345"
@@ -157,7 +157,7 @@ INSERT INTO trigger_test VALUES (0, 'zero');
CREATE FUNCTION stupid1() RETURNS trigger
AS $$
return 37
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER stupid_trigger1
BEFORE INSERT ON trigger_test
@@ -173,7 +173,7 @@ DROP TRIGGER stupid_trigger1 ON trigger_test;
CREATE FUNCTION stupid2() RETURNS trigger
AS $$
return "MODIFY"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER stupid_trigger2
BEFORE DELETE ON trigger_test
@@ -191,7 +191,7 @@ INSERT INTO trigger_test VALUES (0, 'zero');
CREATE FUNCTION stupid3() RETURNS trigger
AS $$
return "foo"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER stupid_trigger3
BEFORE UPDATE ON trigger_test
@@ -206,8 +206,8 @@ DROP TRIGGER stupid_trigger3 ON trigger_test;
CREATE FUNCTION stupid3u() RETURNS trigger
AS $$
- return u"foo"
-$$ LANGUAGE plpythonu;
+ return "foo"
+$$ LANGUAGE plpython3u;
CREATE TRIGGER stupid_trigger3
BEFORE UPDATE ON trigger_test
@@ -224,7 +224,7 @@ CREATE FUNCTION stupid4() RETURNS trigger
AS $$
del TD["new"]
return "MODIFY";
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER stupid_trigger4
BEFORE UPDATE ON trigger_test
@@ -241,7 +241,7 @@ CREATE FUNCTION stupid5() RETURNS trigger
AS $$
TD["new"] = ['foo', 'bar']
return "MODIFY";
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER stupid_trigger5
BEFORE UPDATE ON trigger_test
@@ -258,7 +258,7 @@ CREATE FUNCTION stupid6() RETURNS trigger
AS $$
TD["new"] = {1: 'foo', 2: 'bar'}
return "MODIFY";
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER stupid_trigger6
BEFORE UPDATE ON trigger_test
@@ -275,7 +275,7 @@ CREATE FUNCTION stupid7() RETURNS trigger
AS $$
TD["new"] = {'v': 'foo', 'a': 'bar'}
return "MODIFY";
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER stupid_trigger7
BEFORE UPDATE ON trigger_test
@@ -290,9 +290,9 @@ DROP TRIGGER stupid_trigger7 ON trigger_test;
CREATE FUNCTION stupid7u() RETURNS trigger
AS $$
- TD["new"] = {u'v': 'foo', u'a': 'bar'}
+ TD["new"] = {'v': 'foo', 'a': 'bar'}
return "MODIFY"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER stupid_trigger7
BEFORE UPDATE ON trigger_test
@@ -318,7 +318,7 @@ CREATE FUNCTION test_null() RETURNS trigger
AS $$
TD["new"]['v'] = None
return "MODIFY"
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER test_null_trigger
BEFORE UPDATE ON trigger_test
@@ -341,7 +341,7 @@ SET DateStyle = 'ISO';
CREATE FUNCTION set_modif_time() RETURNS trigger AS $$
TD['new']['modif_time'] = '2010-10-13 21:57:28.930486'
return 'MODIFY'
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TABLE pb (a TEXT, modif_time TIMESTAMP(0) WITHOUT TIME ZONE);
@@ -365,7 +365,7 @@ CREATE FUNCTION composite_trigger_f() RETURNS trigger AS $$
TD['new']['f1'] = (3, False)
TD['new']['f2'] = {'k': 7, 'l': 'yes', 'ignored': 10}
return 'MODIFY'
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER composite_trigger BEFORE INSERT ON composite_trigger_test
FOR EACH ROW EXECUTE PROCEDURE composite_trigger_f();
@@ -380,7 +380,7 @@ CREATE TABLE composite_trigger_noop_test (f1 comp1, f2 comp2);
CREATE FUNCTION composite_trigger_noop_f() RETURNS trigger AS $$
return 'MODIFY'
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER composite_trigger_noop BEFORE INSERT ON composite_trigger_noop_test
FOR EACH ROW EXECUTE PROCEDURE composite_trigger_noop_f();
@@ -399,7 +399,7 @@ CREATE TABLE composite_trigger_nested_test(c comp3);
CREATE FUNCTION composite_trigger_nested_f() RETURNS trigger AS $$
return 'MODIFY'
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE TRIGGER composite_trigger_nested BEFORE INSERT ON composite_trigger_nested_test
FOR EACH ROW EXECUTE PROCEDURE composite_trigger_nested_f();
@@ -410,7 +410,7 @@ INSERT INTO composite_trigger_nested_test VALUES (ROW(ROW(NULL, 't'), ROW(1, 'f'
SELECT * FROM composite_trigger_nested_test;
-- check that using a function as a trigger over two tables works correctly
-CREATE FUNCTION trig1234() RETURNS trigger LANGUAGE plpythonu AS $$
+CREATE FUNCTION trig1234() RETURNS trigger LANGUAGE plpython3u AS $$
TD["new"]["data"] = '1234'
return 'MODIFY'
$$;
@@ -432,7 +432,7 @@ SELECT * FROM b;
CREATE TABLE transition_table_test (id int, name text);
INSERT INTO transition_table_test VALUES (1, 'a');
-CREATE FUNCTION transition_table_test_f() RETURNS trigger LANGUAGE plpythonu AS
+CREATE FUNCTION transition_table_test_f() RETURNS trigger LANGUAGE plpython3u AS
$$
rv = plpy.execute("SELECT * FROM old_table")
assert(rv.nrows() == 1)
@@ -455,7 +455,7 @@ DROP FUNCTION transition_table_test_f();
-- dealing with generated columns
CREATE FUNCTION generated_test_func1() RETURNS trigger
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
TD['new']['j'] = 5 # not allowed
return 'MODIFY'
diff --git a/src/pl/plpython/sql/plpython_types.sql b/src/pl/plpython/sql/plpython_types.sql
index 0d207d9c015..40f4f79d99f 100644
--- a/src/pl/plpython/sql/plpython_types.sql
+++ b/src/pl/plpython/sql/plpython_types.sql
@@ -9,7 +9,7 @@
CREATE FUNCTION test_type_conversion_bool(x bool) RETURNS bool AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_bool(true);
SELECT * FROM test_type_conversion_bool(false);
@@ -35,7 +35,7 @@ elif n == 5:
ret = [0]
plpy.info(ret, not not ret)
return ret
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_bool_other(0);
SELECT * FROM test_type_conversion_bool_other(1);
@@ -48,7 +48,7 @@ SELECT * FROM test_type_conversion_bool_other(5);
CREATE FUNCTION test_type_conversion_char(x char) RETURNS char AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_char('a');
SELECT * FROM test_type_conversion_char(null);
@@ -57,7 +57,7 @@ SELECT * FROM test_type_conversion_char(null);
CREATE FUNCTION test_type_conversion_int2(x int2) RETURNS int2 AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_int2(100::int2);
SELECT * FROM test_type_conversion_int2(-100::int2);
@@ -67,7 +67,7 @@ SELECT * FROM test_type_conversion_int2(null);
CREATE FUNCTION test_type_conversion_int4(x int4) RETURNS int4 AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_int4(100);
SELECT * FROM test_type_conversion_int4(-100);
@@ -77,7 +77,7 @@ SELECT * FROM test_type_conversion_int4(null);
CREATE FUNCTION test_type_conversion_int8(x int8) RETURNS int8 AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_int8(100);
SELECT * FROM test_type_conversion_int8(-100);
@@ -90,7 +90,7 @@ CREATE FUNCTION test_type_conversion_numeric(x numeric) RETURNS numeric AS $$
# between decimal and cdecimal
plpy.info(str(x), x.__class__.__name__)
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_numeric(100);
SELECT * FROM test_type_conversion_numeric(-100);
@@ -105,7 +105,7 @@ SELECT * FROM test_type_conversion_numeric(null);
CREATE FUNCTION test_type_conversion_float4(x float4) RETURNS float4 AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_float4(100);
SELECT * FROM test_type_conversion_float4(-100);
@@ -116,7 +116,7 @@ SELECT * FROM test_type_conversion_float4(null);
CREATE FUNCTION test_type_conversion_float8(x float8) RETURNS float8 AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_float8(100);
SELECT * FROM test_type_conversion_float8(-100);
@@ -128,7 +128,7 @@ SELECT * FROM test_type_conversion_float8(100100100.654321);
CREATE FUNCTION test_type_conversion_oid(x oid) RETURNS oid AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_oid(100);
SELECT * FROM test_type_conversion_oid(2147483649);
@@ -138,7 +138,7 @@ SELECT * FROM test_type_conversion_oid(null);
CREATE FUNCTION test_type_conversion_text(x text) RETURNS text AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_text('hello world');
SELECT * FROM test_type_conversion_text(null);
@@ -147,7 +147,7 @@ SELECT * FROM test_type_conversion_text(null);
CREATE FUNCTION test_type_conversion_bytea(x bytea) RETURNS bytea AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_bytea('hello world');
SELECT * FROM test_type_conversion_bytea(E'null\\000byte');
@@ -157,7 +157,7 @@ SELECT * FROM test_type_conversion_bytea(null);
CREATE FUNCTION test_type_marshal() RETURNS bytea AS $$
import marshal
return marshal.dumps('hello world')
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_type_unmarshal(x bytea) RETURNS text AS $$
import marshal
@@ -165,7 +165,7 @@ try:
return marshal.loads(x)
except ValueError as e:
return 'FAILED: ' + str(e)
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT test_type_unmarshal(x) FROM test_type_marshal() x;
@@ -178,7 +178,7 @@ CREATE DOMAIN booltrue AS bool CHECK (VALUE IS TRUE OR VALUE IS NULL);
CREATE FUNCTION test_type_conversion_booltrue(x booltrue, y bool) RETURNS booltrue AS $$
return y
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_booltrue(true, true);
SELECT * FROM test_type_conversion_booltrue(false, true);
@@ -190,7 +190,7 @@ CREATE DOMAIN uint2 AS int2 CHECK (VALUE >= 0);
CREATE FUNCTION test_type_conversion_uint2(x uint2, y int) RETURNS uint2 AS $$
plpy.info(x, type(x))
return y
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_uint2(100::uint2, 50);
SELECT * FROM test_type_conversion_uint2(100::uint2, -50);
@@ -201,7 +201,7 @@ CREATE DOMAIN nnint AS int CHECK (VALUE IS NOT NULL);
CREATE FUNCTION test_type_conversion_nnint(x nnint, y int) RETURNS nnint AS $$
return y
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_nnint(10, 20);
SELECT * FROM test_type_conversion_nnint(null, 20);
@@ -213,7 +213,7 @@ CREATE DOMAIN bytea10 AS bytea CHECK (octet_length(VALUE) = 10 AND VALUE IS NOT
CREATE FUNCTION test_type_conversion_bytea10(x bytea10, y bytea) RETURNS bytea10 AS $$
plpy.info(x, type(x))
return y
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_bytea10('hello wold', 'hello wold');
SELECT * FROM test_type_conversion_bytea10('hello world', 'hello wold');
@@ -229,7 +229,7 @@ SELECT * FROM test_type_conversion_bytea10('hello word', null);
CREATE FUNCTION test_type_conversion_array_int4(x int4[]) RETURNS int4[] AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_int4(ARRAY[0, 100]);
SELECT * FROM test_type_conversion_array_int4(ARRAY[0,-100,55]);
@@ -243,14 +243,14 @@ SELECT * FROM test_type_conversion_array_int4('[2:4]={1,2,3}');
CREATE FUNCTION test_type_conversion_array_int8(x int8[]) RETURNS int8[] AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_int8(ARRAY[[[1,2,NULL],[NULL,5,6]],[[NULL,8,9],[10,11,12]]]::int8[]);
CREATE FUNCTION test_type_conversion_array_date(x date[]) RETURNS date[] AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_date(ARRAY[[['2016-09-21','2016-09-22',NULL],[NULL,'2016-10-21','2016-10-22']],
[[NULL,'2016-11-21','2016-10-21'],['2015-09-21','2015-09-22','2014-09-21']]]::date[]);
@@ -258,7 +258,7 @@ SELECT * FROM test_type_conversion_array_date(ARRAY[[['2016-09-21','2016-09-22',
CREATE FUNCTION test_type_conversion_array_timestamp(x timestamp[]) RETURNS timestamp[] AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_timestamp(ARRAY[[['2016-09-21 15:34:24.078792-04','2016-10-22 11:34:24.078795-04',NULL],
[NULL,'2016-10-21 11:34:25.078792-04','2016-10-21 11:34:24.098792-04']],
@@ -270,7 +270,7 @@ CREATE OR REPLACE FUNCTION pyreturnmultidemint4(h int4, i int4, j int4, k int4 )
m = [[[[x for x in range(h)] for y in range(i)] for z in range(j)] for w in range(k)]
plpy.info(m, type(m))
return m
-$BODY$ LANGUAGE plpythonu;
+$BODY$ LANGUAGE plpython3u;
select pyreturnmultidemint4(8,5,3,2);
@@ -278,7 +278,7 @@ CREATE OR REPLACE FUNCTION pyreturnmultidemint8(h int4, i int4, j int4, k int4 )
m = [[[[x for x in range(h)] for y in range(i)] for z in range(j)] for w in range(k)]
plpy.info(m, type(m))
return m
-$BODY$ LANGUAGE plpythonu;
+$BODY$ LANGUAGE plpython3u;
select pyreturnmultidemint8(5,5,3,2);
@@ -286,7 +286,7 @@ CREATE OR REPLACE FUNCTION pyreturnmultidemfloat4(h int4, i int4, j int4, k int4
m = [[[[x for x in range(h)] for y in range(i)] for z in range(j)] for w in range(k)]
plpy.info(m, type(m))
return m
-$BODY$ LANGUAGE plpythonu;
+$BODY$ LANGUAGE plpython3u;
select pyreturnmultidemfloat4(6,5,3,2);
@@ -294,14 +294,14 @@ CREATE OR REPLACE FUNCTION pyreturnmultidemfloat8(h int4, i int4, j int4, k int4
m = [[[[x for x in range(h)] for y in range(i)] for z in range(j)] for w in range(k)]
plpy.info(m, type(m))
return m
-$BODY$ LANGUAGE plpythonu;
+$BODY$ LANGUAGE plpython3u;
select pyreturnmultidemfloat8(7,5,3,2);
CREATE FUNCTION test_type_conversion_array_text(x text[]) RETURNS text[] AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_text(ARRAY['foo', 'bar']);
SELECT * FROM test_type_conversion_array_text(ARRAY[['foo', 'bar'],['foo2', 'bar2']]);
@@ -310,59 +310,59 @@ SELECT * FROM test_type_conversion_array_text(ARRAY[['foo', 'bar'],['foo2', 'bar
CREATE FUNCTION test_type_conversion_array_bytea(x bytea[]) RETURNS bytea[] AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_bytea(ARRAY[E'\\xdeadbeef'::bytea, NULL]);
CREATE FUNCTION test_type_conversion_array_mixed1() RETURNS text[] AS $$
return [123, 'abc']
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_mixed1();
CREATE FUNCTION test_type_conversion_array_mixed2() RETURNS int[] AS $$
return [123, 'abc']
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_mixed2();
CREATE FUNCTION test_type_conversion_mdarray_malformed() RETURNS int[] AS $$
return [[1,2,3],[4,5]]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_mdarray_malformed();
CREATE FUNCTION test_type_conversion_mdarray_toodeep() RETURNS int[] AS $$
return [[[[[[[1]]]]]]]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_mdarray_toodeep();
CREATE FUNCTION test_type_conversion_array_record() RETURNS type_record[] AS $$
return [{'first': 'one', 'second': 42}, {'first': 'two', 'second': 11}]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_record();
CREATE FUNCTION test_type_conversion_array_string() RETURNS text[] AS $$
return 'abc'
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_string();
CREATE FUNCTION test_type_conversion_array_tuple() RETURNS text[] AS $$
return ('abc', 'def')
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_tuple();
CREATE FUNCTION test_type_conversion_array_error() RETURNS int[] AS $$
return 5
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_error();
@@ -376,14 +376,14 @@ CREATE DOMAIN ordered_pair_domain AS integer[] CHECK (array_length(VALUE,1)=2 AN
CREATE FUNCTION test_type_conversion_array_domain(x ordered_pair_domain) RETURNS ordered_pair_domain AS $$
plpy.info(x, type(x))
return x
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_domain(ARRAY[0, 100]::ordered_pair_domain);
SELECT * FROM test_type_conversion_array_domain(NULL::ordered_pair_domain);
CREATE FUNCTION test_type_conversion_array_domain_check_violation() RETURNS ordered_pair_domain AS $$
return [2,1]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT * FROM test_type_conversion_array_domain_check_violation();
@@ -394,13 +394,13 @@ SELECT * FROM test_type_conversion_array_domain_check_violation();
CREATE FUNCTION test_read_uint2_array(x uint2[]) RETURNS uint2 AS $$
plpy.info(x, type(x))
return x[0]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
select test_read_uint2_array(array[1::uint2]);
CREATE FUNCTION test_build_uint2_array(x int2) RETURNS uint2[] AS $$
return [x, x]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
select test_build_uint2_array(1::int2);
select test_build_uint2_array(-1::int2); -- fail
@@ -413,7 +413,7 @@ select test_build_uint2_array(-1::int2); -- fail
CREATE FUNCTION test_type_conversion_domain_array(x integer[])
RETURNS ordered_pair_domain[] AS $$
return [x, x]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
select test_type_conversion_domain_array(array[2,4]);
select test_type_conversion_domain_array(array[4,2]); -- fail
@@ -422,7 +422,7 @@ CREATE FUNCTION test_type_conversion_domain_array2(x ordered_pair_domain)
RETURNS integer AS $$
plpy.info(x, type(x))
return x[1]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
select test_type_conversion_domain_array2(array[2,4]);
select test_type_conversion_domain_array2(array[4,2]); -- fail
@@ -431,7 +431,7 @@ CREATE FUNCTION test_type_conversion_array_domain_array(x ordered_pair_domain[])
RETURNS ordered_pair_domain AS $$
plpy.info(x, type(x))
return x[0]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
select test_type_conversion_array_domain_array(array[array[2,4]::ordered_pair_domain]);
@@ -450,7 +450,7 @@ INSERT INTO employee VALUES ('John', 100, 10), ('Mary', 200, 10);
CREATE OR REPLACE FUNCTION test_composite_table_input(e employee) RETURNS integer AS $$
return e['basesalary'] + e['bonus']
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT name, test_composite_table_input(employee.*) FROM employee;
@@ -470,7 +470,7 @@ CREATE TYPE named_pair AS (
CREATE OR REPLACE FUNCTION test_composite_type_input(p named_pair) RETURNS integer AS $$
return sum(p.values())
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT test_composite_type_input(row(1, 2));
@@ -487,7 +487,7 @@ CREATE TYPE nnint_container AS (f1 int, f2 nnint);
CREATE FUNCTION nnint_test(x int, y int) RETURNS nnint_container AS $$
return {'f1': x, 'f2': y}
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT nnint_test(null, 3);
SELECT nnint_test(3, null); -- fail
@@ -501,21 +501,21 @@ CREATE DOMAIN ordered_named_pair AS named_pair_2 CHECK((VALUE).i <= (VALUE).j);
CREATE FUNCTION read_ordered_named_pair(p ordered_named_pair) RETURNS integer AS $$
return p['i'] + p['j']
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT read_ordered_named_pair(row(1, 2));
SELECT read_ordered_named_pair(row(2, 1)); -- fail
CREATE FUNCTION build_ordered_named_pair(i int, j int) RETURNS ordered_named_pair AS $$
return {'i': i, 'j': j}
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT build_ordered_named_pair(1,2);
SELECT build_ordered_named_pair(2,1); -- fail
CREATE FUNCTION build_ordered_named_pairs(i int, j int) RETURNS ordered_named_pair[] AS $$
return [{'i': i, 'j': j}, {'i': i, 'j': j+1}]
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
SELECT build_ordered_named_pairs(1,2);
SELECT build_ordered_named_pairs(2,1); -- fail
@@ -526,7 +526,7 @@ SELECT build_ordered_named_pairs(2,1); -- fail
--
CREATE OR REPLACE FUNCTION test_prep_bool_input() RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
plan = plpy.prepare("SELECT CASE WHEN $1 THEN 1 ELSE 0 END AS val", ['boolean'])
rv = plpy.execute(plan, ['fa'], 5) # 'fa' is true in Python
@@ -537,7 +537,7 @@ SELECT test_prep_bool_input(); -- 1
CREATE OR REPLACE FUNCTION test_prep_bool_output() RETURNS bool
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
plan = plpy.prepare("SELECT $1 = 1 AS val", ['int'])
rv = plpy.execute(plan, [0], 5)
@@ -549,7 +549,7 @@ SELECT test_prep_bool_output(); -- false
CREATE OR REPLACE FUNCTION test_prep_bytea_input(bb bytea) RETURNS int
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
plan = plpy.prepare("SELECT octet_length($1) AS val", ['bytea'])
rv = plpy.execute(plan, [bb], 5)
@@ -560,7 +560,7 @@ SELECT test_prep_bytea_input(E'a\\000b'); -- 3 (embedded null formerly truncated
CREATE OR REPLACE FUNCTION test_prep_bytea_output() RETURNS bytea
-LANGUAGE plpythonu
+LANGUAGE plpython3u
AS $$
plan = plpy.prepare("SELECT decode('aa00bb', 'hex') AS val")
rv = plpy.execute(plan, [], 5)
diff --git a/src/pl/plpython/sql/plpython_unicode.sql b/src/pl/plpython/sql/plpython_unicode.sql
index a11e5eeaa21..14f7b4e0053 100644
--- a/src/pl/plpython/sql/plpython_unicode.sql
+++ b/src/pl/plpython/sql/plpython_unicode.sql
@@ -14,28 +14,28 @@ CREATE TABLE unicode_test (
);
CREATE FUNCTION unicode_return() RETURNS text AS E'
-return u"\\xA0"
-' LANGUAGE plpythonu;
+return "\\xA0"
+' LANGUAGE plpython3u;
CREATE FUNCTION unicode_trigger() RETURNS trigger AS E'
-TD["new"]["testvalue"] = u"\\xA0"
+TD["new"]["testvalue"] = "\\xA0"
return "MODIFY"
-' LANGUAGE plpythonu;
+' LANGUAGE plpython3u;
CREATE TRIGGER unicode_test_bi BEFORE INSERT ON unicode_test
FOR EACH ROW EXECUTE PROCEDURE unicode_trigger();
CREATE FUNCTION unicode_plan1() RETURNS text AS E'
plan = plpy.prepare("SELECT $1 AS testvalue", ["text"])
-rv = plpy.execute(plan, [u"\\xA0"], 1)
+rv = plpy.execute(plan, ["\\xA0"], 1)
return rv[0]["testvalue"]
-' LANGUAGE plpythonu;
+' LANGUAGE plpython3u;
CREATE FUNCTION unicode_plan2() RETURNS text AS E'
-plan = plpy.prepare("SELECT $1 || $2 AS testvalue", ["text", u"text"])
+plan = plpy.prepare("SELECT $1 || $2 AS testvalue", ["text", "text"])
rv = plpy.execute(plan, ["foo", "bar"], 1)
return rv[0]["testvalue"]
-' LANGUAGE plpythonu;
+' LANGUAGE plpython3u;
SELECT unicode_return();
diff --git a/src/pl/plpython/sql/plpython_void.sql b/src/pl/plpython/sql/plpython_void.sql
index 77d7f59e4c7..5a1a6711fb0 100644
--- a/src/pl/plpython/sql/plpython_void.sql
+++ b/src/pl/plpython/sql/plpython_void.sql
@@ -4,16 +4,16 @@
CREATE FUNCTION test_void_func1() RETURNS void AS $$
x = 10
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
-- illegal: can't return non-None value in void-returning func
CREATE FUNCTION test_void_func2() RETURNS void AS $$
return 10
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
CREATE FUNCTION test_return_none() RETURNS int AS $$
None
-$$ LANGUAGE plpythonu;
+$$ LANGUAGE plpython3u;
-- Tests for functions returning void
--
2.23.0.385.gbc12974a89
Attachment: v5-0004-meson-prereq-output-and-depencency-tracking-work.patch (text/x-diff; charset=us-ascii)
From 604a2932f353f0bd12f7ed45ed7fe7124e8aad7f Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Mon, 8 Mar 2021 13:47:39 -0800
Subject: [PATCH v5 04/16] meson: prereq: output and depencency tracking work.
---
src/backend/utils/misc/Makefile | 5 ++++-
src/backend/utils/misc/guc.c | 2 +-
src/bin/initdb/initdb.c | 5 +++--
src/bin/psql/Makefile | 4 ++--
src/bin/psql/create_help.pl | 16 ++++++++++++----
src/tools/msvc/MSBuildProject.pm | 9 +++++++--
src/tools/msvc/Mkvcbuild.pm | 3 +++
src/tools/msvc/Solution.pm | 2 +-
src/tools/msvc/pgflex.pl | 4 ++--
9 files changed, 35 insertions(+), 15 deletions(-)
diff --git a/src/backend/utils/misc/Makefile b/src/backend/utils/misc/Makefile
index 1d5327cf644..14861fd96b2 100644
--- a/src/backend/utils/misc/Makefile
+++ b/src/backend/utils/misc/Makefile
@@ -37,8 +37,11 @@ endif
include $(top_srcdir)/src/backend/common.mk
+guc-file.c.h: guc-file.l
+ flex -o $@ $<
+
# guc-file is compiled as part of guc
-guc.o: guc-file.c
+guc.o: guc-file.c.h
# Note: guc-file.c is not deleted by 'make clean',
# since we want to ship it in distribution tarballs.
diff --git a/src/backend/utils/misc/guc.c b/src/backend/utils/misc/guc.c
index e91d5a3cfda..a0ac8cf0341 100644
--- a/src/backend/utils/misc/guc.c
+++ b/src/backend/utils/misc/guc.c
@@ -12572,4 +12572,4 @@ check_default_with_oids(bool *newval, void **extra, GucSource source)
return true;
}
-#include "guc-file.c"
+#include "guc-file.c.h"
diff --git a/src/bin/initdb/initdb.c b/src/bin/initdb/initdb.c
index 1ed4808d53f..9067a06e58a 100644
--- a/src/bin/initdb/initdb.c
+++ b/src/bin/initdb/initdb.c
@@ -1368,8 +1368,9 @@ bootstrap_template1(void)
if (strcmp(headerline, *bki_lines) != 0)
{
- pg_log_error("input file \"%s\" does not belong to PostgreSQL %s",
- bki_file, PG_VERSION);
+		pg_log_error("input file \"%s\" does not belong to PostgreSQL %s (expected %s, got %s)",
+ bki_file, PG_VERSION,
+ headerline, *bki_lines);
fprintf(stderr,
_("Check your installation or specify the correct path "
"using the option -L.\n"));
diff --git a/src/bin/psql/Makefile b/src/bin/psql/Makefile
index d00881163c0..3851da1c8ef 100644
--- a/src/bin/psql/Makefile
+++ b/src/bin/psql/Makefile
@@ -56,7 +56,7 @@ sql_help.c: sql_help.h
touch $@
sql_help.h: create_help.pl $(wildcard $(REFDOCDIR)/*.sgml)
- $(PERL) $< $(REFDOCDIR) $*
+ $(PERL) $< $(REFDOCDIR) . $*
psqlscanslash.c: FLEXFLAGS = -Cfe -p -p
psqlscanslash.c: FLEX_NO_BACKUP=yes
@@ -81,7 +81,7 @@ clean distclean:
# files removed here are supposed to be in the distribution tarball,
# so do not clean them in the clean/distclean rules
maintainer-clean: distclean
- rm -f sql_help.h sql_help.c psqlscanslash.c
+ rm -f sql_help.h sql_help.c sql_help.dep psqlscanslash.c
check:
$(prove_check)
diff --git a/src/bin/psql/create_help.pl b/src/bin/psql/create_help.pl
index 83324239740..40eb6ac2d3f 100644
--- a/src/bin/psql/create_help.pl
+++ b/src/bin/psql/create_help.pl
@@ -23,9 +23,12 @@ use strict;
use warnings;
my $docdir = $ARGV[0] or die "$0: missing required argument: docdir\n";
-my $hfile = $ARGV[1] . '.h'
+my $outdir = $ARGV[1] or die "$0: missing required argument: outdir\n";
+
+my $hfile = $ARGV[2] . '.h'
or die "$0: missing required argument: output file\n";
-my $cfile = $ARGV[1] . '.c';
+my $cfile = $ARGV[2] . '.c';
+my $depfile = $ARGV[2] . '.dep';
my $hfilebasename;
if ($hfile =~ m!.*/([^/]+)$!)
@@ -43,10 +46,12 @@ $define =~ s/\W/_/g;
opendir(DIR, $docdir)
or die "$0: could not open documentation source dir '$docdir': $!\n";
-open(my $hfile_handle, '>', $hfile)
+open(my $hfile_handle, '>', $outdir . '/' . $hfile)
or die "$0: could not open output file '$hfile': $!\n";
-open(my $cfile_handle, '>', $cfile)
+open(my $cfile_handle, '>', $outdir . '/' . $cfile)
or die "$0: could not open output file '$cfile': $!\n";
+open(my $depfile_handle, '>', $outdir . '/' . $depfile)
+ or die "$0: could not open output file '$depfile': $!\n";
print $hfile_handle "/*
* *** Do not change this file by hand. It is automatically
@@ -98,6 +103,8 @@ foreach my $file (sort readdir DIR)
my ($cmdid, @cmdnames, $cmddesc, $cmdsynopsis);
$file =~ /\.sgml$/ or next;
+ print $depfile_handle "$cfile $hfile: $docdir/$file\n";
+
open(my $fh, '<', "$docdir/$file") or next;
my $filecontent = join('', <$fh>);
close $fh;
@@ -216,4 +223,5 @@ print $hfile_handle "
close $cfile_handle;
close $hfile_handle;
+close $depfile_handle;
closedir DIR;
diff --git a/src/tools/msvc/MSBuildProject.pm b/src/tools/msvc/MSBuildProject.pm
index fdd22e89eb2..036e44fcb83 100644
--- a/src/tools/msvc/MSBuildProject.pm
+++ b/src/tools/msvc/MSBuildProject.pm
@@ -211,14 +211,19 @@ EOF
}
else #if ($grammarFile =~ /\.l$/)
{
+ if ($outputFile eq 'src/backend/utils/misc/guc-file.c')
+ {
+ $outputFile = 'src/backend/utils/misc/guc-file.c.h';
+ }
+
print $f <<EOF;
<CustomBuild Include="$grammarFile">
<Message Condition="'\$(Configuration)|\$(Platform)'=='Debug|$self->{platform}'">Running flex on $grammarFile</Message>
- <Command Condition="'\$(Configuration)|\$(Platform)'=='Debug|$self->{platform}'">perl "src\\tools\\msvc\\pgflex.pl" "$grammarFile"</Command>
+ <Command Condition="'\$(Configuration)|\$(Platform)'=='Debug|$self->{platform}'">perl "src\\tools\\msvc\\pgflex.pl" "$grammarFile" "$outputFile"</Command>
<AdditionalInputs Condition="'\$(Configuration)|\$(Platform)'=='Debug|$self->{platform}'">%(AdditionalInputs)</AdditionalInputs>
<Outputs Condition="'\$(Configuration)|\$(Platform)'=='Debug|$self->{platform}'">$outputFile;%(Outputs)</Outputs>
<Message Condition="'\$(Configuration)|\$(Platform)'=='Release|$self->{platform}'">Running flex on $grammarFile</Message>
- <Command Condition="'\$(Configuration)|\$(Platform)'=='Release|$self->{platform}'">perl "src\\tools\\msvc\\pgflex.pl" "$grammarFile"</Command>
+ <Command Condition="'\$(Configuration)|\$(Platform)'=='Release|$self->{platform}'">perl "src\\tools\\msvc\\pgflex.pl" "$grammarFile" "$outputFile"</Command>
<AdditionalInputs Condition="'\$(Configuration)|\$(Platform)'=='Release|$self->{platform}'">%(AdditionalInputs)</AdditionalInputs>
<Outputs Condition="'\$(Configuration)|\$(Platform)'=='Release|$self->{platform}'">$outputFile;%(Outputs)</Outputs>
</CustomBuild>
diff --git a/src/tools/msvc/Mkvcbuild.pm b/src/tools/msvc/Mkvcbuild.pm
index 4362bd44fd1..b8e62c6d3f7 100644
--- a/src/tools/msvc/Mkvcbuild.pm
+++ b/src/tools/msvc/Mkvcbuild.pm
@@ -330,6 +330,7 @@ sub mkvcbuild
$pgregress_ecpg->AddFile('src/test/regress/pg_regress.c');
$pgregress_ecpg->AddIncludeDir('src/port');
$pgregress_ecpg->AddIncludeDir('src/test/regress');
+ $pgregress_ecpg->AddDefine('DLSUFFIX=".dll"');
$pgregress_ecpg->AddDefine('HOST_TUPLE="i686-pc-win32vc"');
$pgregress_ecpg->AddLibrary('ws2_32.lib');
$pgregress_ecpg->AddDirResourceFile('src/interfaces/ecpg/test');
@@ -345,6 +346,7 @@ sub mkvcbuild
$isolation_tester->AddIncludeDir('src/port');
$isolation_tester->AddIncludeDir('src/test/regress');
$isolation_tester->AddIncludeDir('src/interfaces/libpq');
+ $isolation_tester->AddDefine('DLSUFFIX=".dll"');
$isolation_tester->AddDefine('HOST_TUPLE="i686-pc-win32vc"');
$isolation_tester->AddLibrary('ws2_32.lib');
$isolation_tester->AddDirResourceFile('src/test/isolation');
@@ -356,6 +358,7 @@ sub mkvcbuild
$pgregress_isolation->AddFile('src/test/regress/pg_regress.c');
$pgregress_isolation->AddIncludeDir('src/port');
$pgregress_isolation->AddIncludeDir('src/test/regress');
+ $pgregress_isolation->AddDefine('DLSUFFIX=".dll"');
$pgregress_isolation->AddDefine('HOST_TUPLE="i686-pc-win32vc"');
$pgregress_isolation->AddLibrary('ws2_32.lib');
$pgregress_isolation->AddDirResourceFile('src/test/isolation');
diff --git a/src/tools/msvc/Solution.pm b/src/tools/msvc/Solution.pm
index 43fd1be0888..dc78d3b65c3 100644
--- a/src/tools/msvc/Solution.pm
+++ b/src/tools/msvc/Solution.pm
@@ -689,7 +689,7 @@ sub GenerateFiles
{
print "Generating sql_help.h...\n";
chdir('src/bin/psql');
- system("perl create_help.pl ../../../doc/src/sgml/ref sql_help");
+ system("perl create_help.pl ../../../doc/src/sgml/ref . sql_help");
chdir('../../..');
}
diff --git a/src/tools/msvc/pgflex.pl b/src/tools/msvc/pgflex.pl
index 0728b85d4de..19f26ff213f 100644
--- a/src/tools/msvc/pgflex.pl
+++ b/src/tools/msvc/pgflex.pl
@@ -29,6 +29,8 @@ unless ($verparts[0] == 2
}
my $input = shift;
+my $output = shift;
+
if ($input !~ /\.l$/)
{
print "Input must be a .l file\n";
@@ -40,8 +42,6 @@ elsif (!-e $input)
exit 1;
}
-(my $output = $input) =~ s/\.l$/.c/;
-
# get flex flags from make file
my $makefile = dirname($input) . "/Makefile";
my ($mf, $make);
--
2.23.0.385.gbc12974a89
Attachment: v5-0005-meson-prereq-move-snowball_create.sql-creation-in.patch (text/x-diff; charset=us-ascii)
From 931ff07e713d7869a16a23dcf770f5fd30c18df8 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Mon, 8 Mar 2021 14:59:22 -0800
Subject: [PATCH v5 05/16] meson: prereq: move snowball_create.sql creation
into perl file.
FIXME: deduplicate with Install.pm
---
src/backend/snowball/Makefile | 27 +-----
src/backend/snowball/snowball_create.pl | 110 ++++++++++++++++++++++++
2 files changed, 113 insertions(+), 24 deletions(-)
create mode 100644 src/backend/snowball/snowball_create.pl
diff --git a/src/backend/snowball/Makefile b/src/backend/snowball/Makefile
index 50b9199910c..259104f8eb3 100644
--- a/src/backend/snowball/Makefile
+++ b/src/backend/snowball/Makefile
@@ -119,29 +119,8 @@ all: all-shared-lib $(SQLSCRIPT)
include $(top_srcdir)/src/Makefile.shlib
-$(SQLSCRIPT): Makefile snowball_func.sql.in snowball.sql.in
- echo '-- Language-specific snowball dictionaries' > $@
- cat $(srcdir)/snowball_func.sql.in >> $@
- @set -e; \
- set $(LANGUAGES) ; \
- while [ "$$#" -gt 0 ] ; \
- do \
- lang=$$1; shift; \
- nonascdictname=$$lang; \
- ascdictname=$$1; shift; \
- if [ -s $(srcdir)/stopwords/$${lang}.stop ] ; then \
- stop=", StopWords=$${lang}" ; \
- else \
- stop=""; \
- fi; \
- cat $(srcdir)/snowball.sql.in | \
- sed -e "s#_LANGNAME_#$$lang#g" | \
- sed -e "s#_DICTNAME_#$${lang}_stem#g" | \
- sed -e "s#_CFGNAME_#$$lang#g" | \
- sed -e "s#_ASCDICTNAME_#$${ascdictname}_stem#g" | \
- sed -e "s#_NONASCDICTNAME_#$${nonascdictname}_stem#g" | \
- sed -e "s#_STOPWORDS_#$$stop#g" ; \
- done >> $@
+$(SQLSCRIPT): snowball_create.pl Makefile snowball_func.sql.in snowball.sql.in
+ $(PERL) $< --input ${srcdir} --output .
install: all installdirs install-lib
$(INSTALL_DATA) $(SQLSCRIPT) '$(DESTDIR)$(datadir)'
@@ -171,4 +150,4 @@ uninstall: uninstall-lib
done
clean distclean maintainer-clean: clean-lib
- rm -f $(OBJS) $(SQLSCRIPT)
+ rm -f $(OBJS) $(SQLSCRIPT) snowball_create.dep
diff --git a/src/backend/snowball/snowball_create.pl b/src/backend/snowball/snowball_create.pl
new file mode 100644
index 00000000000..d9d79f3668f
--- /dev/null
+++ b/src/backend/snowball/snowball_create.pl
@@ -0,0 +1,110 @@
+#!/usr/bin/perl
+
+use strict;
+use warnings;
+use Getopt::Long;
+
+my $output_path = '';
+my $makefile_path = '';
+my $input_path = '';
+
+GetOptions(
+ 'output:s' => \$output_path,
+ 'input:s' => \$input_path) || usage();
+
+# Make sure input_path ends in a slash if needed.
+if ($input_path ne '' && substr($input_path, -1) ne '/')
+{
+	$input_path .= '/';
+}
+
+# Make sure output_path ends in a slash if needed.
+if ($output_path ne '' && substr($output_path, -1) ne '/')
+{
+ $output_path .= '/';
+}
+
+GenerateTsearchFiles();
+
+sub usage
+{
+ die <<EOM;
+Usage: snowball_create.pl --input <path> --output <path>
+ --output Output directory (default '.')
+ --input Input directory
+
+snowball_create.pl creates snowball_create.sql from snowball_func.sql.in and snowball.sql.in
+EOM
+}
+
+sub GenerateTsearchFiles
+{
+ my $target = shift;
+ my $output_file = "$output_path/snowball_create.sql";
+
+ print "Generating tsearch script...";
+ my $F;
+ my $D;
+ my $tmpl = read_file("$input_path/snowball.sql.in");
+ my $mf = read_file("$input_path/Makefile");
+
+ open($D, '>', "$output_path/snowball_create.dep")
+ || die "Could not write snowball_create.dep";
+
+ print $D "$output_file: $input_path/Makefile\n";
+ print $D "$output_file: $input_path/snowball.sql.in\n";
+ print $D "$output_file: $input_path/snowball_func.sql.in\n";
+
+ $mf =~ s{\\\r?\n}{}g;
+ $mf =~ /^LANGUAGES\s*=\s*(.*)$/m
+ || die "Could not find LANGUAGES line in snowball Makefile\n";
+ my @pieces = split /\s+/, $1;
+ open($F, '>', $output_file)
+ || die "Could not write snowball_create.sql";
+
+ print $F "-- Language-specific snowball dictionaries\n";
+
+ print $F read_file("$input_path/snowball_func.sql.in");
+
+ while ($#pieces > 0)
+ {
+ my $lang = shift @pieces || last;
+ my $asclang = shift @pieces || last;
+ my $txt = $tmpl;
+ my $stop = '';
+ my $stopword_path = "$input_path/stopwords/$lang.stop";
+
+ if (-s "$stopword_path")
+ {
+ $stop = ", StopWords=$lang";
+
+ print $D "$output_file: $stopword_path\n";
+ }
+
+ $txt =~ s#_LANGNAME_#${lang}#gs;
+ $txt =~ s#_DICTNAME_#${lang}_stem#gs;
+ $txt =~ s#_CFGNAME_#${lang}#gs;
+ $txt =~ s#_ASCDICTNAME_#${asclang}_stem#gs;
+ $txt =~ s#_NONASCDICTNAME_#${lang}_stem#gs;
+ $txt =~ s#_STOPWORDS_#$stop#gs;
+ print $F $txt;
+ print ".";
+ }
+ close($F);
+ close($D);
+ print "\n";
+ return;
+}
+
+
+sub read_file
+{
+ my $filename = shift;
+ my $F;
+ local $/ = undef;
+ open($F, '<', $filename) || die "Could not open file $filename\n";
+ my $txt = <$F>;
+ close($F);
+
+ return $txt;
+}
--
2.23.0.385.gbc12974a89
Attachment: v5-0006-meson-prereq-add-output-path-arg-in-generate-lwlo.patch (text/x-diff; charset=us-ascii)
From 5e712fa7763cb28a4ab8f763cab6d3746c24f857 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Wed, 10 Mar 2021 01:43:07 -0800
Subject: [PATCH v5 06/16] meson: prereq: add output path arg in
generate-lwlocknames.pl
---
src/backend/storage/lmgr/generate-lwlocknames.pl | 14 ++++++++++----
1 file changed, 10 insertions(+), 4 deletions(-)
diff --git a/src/backend/storage/lmgr/generate-lwlocknames.pl b/src/backend/storage/lmgr/generate-lwlocknames.pl
index 8a44946594d..315156b29f1 100644
--- a/src/backend/storage/lmgr/generate-lwlocknames.pl
+++ b/src/backend/storage/lmgr/generate-lwlocknames.pl
@@ -5,15 +5,21 @@
use strict;
use warnings;
+use Getopt::Long;
+
+my $output_path = '.';
my $lastlockidx = -1;
my $continue = "\n";
+GetOptions(
+ 'output:s' => \$output_path);
+
open my $lwlocknames, '<', $ARGV[0] or die;
# Include PID in suffix in case parallel make runs this multiple times.
-my $htmp = "lwlocknames.h.tmp$$";
-my $ctmp = "lwlocknames.c.tmp$$";
+my $htmp = "$output_path/lwlocknames.h.tmp$$";
+my $ctmp = "$output_path/lwlocknames.c.tmp$$";
open my $h, '>', $htmp or die "Could not open $htmp: $!";
open my $c, '>', $ctmp or die "Could not open $ctmp: $!";
@@ -65,7 +71,7 @@ printf $h "#define NUM_INDIVIDUAL_LWLOCKS %s\n", $lastlockidx + 1;
close $h;
close $c;
-rename($htmp, 'lwlocknames.h') || die "rename: $htmp: $!";
-rename($ctmp, 'lwlocknames.c') || die "rename: $ctmp: $!";
+rename($htmp, "$output_path/lwlocknames.h") || die "rename: $htmp to $output_path/lwlocknames.h: $!";
+rename($ctmp, "$output_path/lwlocknames.c") || die "rename: $ctmp: $!";
close $lwlocknames;
--
2.23.0.385.gbc12974a89
Attachment: v5-0007-meson-prereq-add-src-tools-gen_versioning_script..patch (text/x-diff; charset=us-ascii)
From 0fd47642701e4941c6a2bb3eca29b9509b999399 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Wed, 10 Mar 2021 15:11:13 -0800
Subject: [PATCH v5 07/16] meson: prereq: add
src/tools/gen_versioning_script.pl.
Currently the logic is all in src/Makefile.shlib. This adds a sketch
of a generation script that can be used from meson.
---
src/tools/gen_versioning_script.pl | 58 ++++++++++++++++++++++++++++++
1 file changed, 58 insertions(+)
create mode 100644 src/tools/gen_versioning_script.pl
diff --git a/src/tools/gen_versioning_script.pl b/src/tools/gen_versioning_script.pl
new file mode 100644
index 00000000000..862b5e14aad
--- /dev/null
+++ b/src/tools/gen_versioning_script.pl
@@ -0,0 +1,58 @@
+use strict;
+use warnings;
+
+my $format = $ARGV[0] or die "$0: missing required argument: format\n";
+my $input = $ARGV[1] or die "$0: missing required argument: input\n";
+my $output = $ARGV[2] or die "$0: missing required argument: output\n";
+
+#FIXME: handle format argument, so we can reuse the one script for several platforms
+if (not ($format eq 'gnu' or $format eq 'darwin'))
+{
+	die "$0: $format is not yet handled (only gnu and darwin are)\n";
+}
+
+open(my $input_handle, '<', $input)
+ or die "$0: could not open input file '$input': $!\n";
+
+open(my $output_handle, '>', $output)
+ or die "$0: could not open output file '$output': $!\n";
+
+
+if ($format eq 'gnu')
+{
+ print $output_handle "{
+ global:
+";
+}
+
+while (<$input_handle>)
+{
+ if (/^#/)
+ {
+ # don't do anything with a comment
+ }
+ elsif (/^([^\s]+)\s+([^\s]+)/)
+ {
+ if ($format eq 'gnu')
+ {
+ print $output_handle " $1;\n";
+ }
+ elsif ($format eq 'darwin')
+ {
+ print $output_handle " _$1\n";
+ }
+ }
+ else
+ {
+ die "$0: unexpected line $_\n";
+ }
+}
+
+if ($format eq 'gnu')
+{
+ print $output_handle " local: *;
+};
+";
+}
+
+exit(0);
--
2.23.0.385.gbc12974a89
Attachment: v5-0008-meson-prereq-generate-errcodes.pl-accept-output-f.patch (text/x-diff; charset=us-ascii)
From 8e6ee0e31e25238ff8ea50938a20f1775001c88d Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Mon, 27 Sep 2021 00:14:09 -0700
Subject: [PATCH v5 08/16] meson: prereq: generate-errcodes.pl: accept output
file
---
src/backend/utils/Makefile | 2 +-
src/backend/utils/generate-errcodes.pl | 13 ++++++++-----
src/tools/msvc/Solution.pm | 2 +-
3 files changed, 10 insertions(+), 7 deletions(-)
diff --git a/src/backend/utils/Makefile b/src/backend/utils/Makefile
index ef8df254826..469caf0d704 100644
--- a/src/backend/utils/Makefile
+++ b/src/backend/utils/Makefile
@@ -52,7 +52,7 @@ fmgr-stamp: Gen_fmgrtab.pl $(catalogdir)/Catalog.pm $(top_srcdir)/src/include/ca
touch $@
errcodes.h: $(top_srcdir)/src/backend/utils/errcodes.txt generate-errcodes.pl
- $(PERL) $(srcdir)/generate-errcodes.pl $< > $@
+ $(PERL) $(srcdir)/generate-errcodes.pl $< $@
ifneq ($(enable_dtrace), yes)
probes.h: Gen_dummy_probes.sed
diff --git a/src/backend/utils/generate-errcodes.pl b/src/backend/utils/generate-errcodes.pl
index c5cdd388138..57ec2a5ca21 100644
--- a/src/backend/utils/generate-errcodes.pl
+++ b/src/backend/utils/generate-errcodes.pl
@@ -6,11 +6,13 @@
use strict;
use warnings;
-print
+open my $errcodes, '<', $ARGV[0] or die;
+open my $out, '>', $ARGV[1] or die;
+
+print $out
"/* autogenerated from src/backend/utils/errcodes.txt, do not edit */\n";
-print "/* there is deliberately not an #ifndef ERRCODES_H here */\n";
+print $out "/* there is deliberately not an #ifndef ERRCODES_H here */\n";
-open my $errcodes, '<', $ARGV[0] or die;
while (<$errcodes>)
{
@@ -25,7 +27,7 @@ while (<$errcodes>)
{
my $header = $1;
$header =~ s/^\s+//;
- print "\n/* $header */\n";
+ print $out "\n/* $header */\n";
next;
}
@@ -40,7 +42,8 @@ while (<$errcodes>)
# And quote them
$sqlstate =~ s/([^,])/'$1'/g;
- print "#define $errcode_macro MAKE_SQLSTATE($sqlstate)\n";
+ print $out "#define $errcode_macro MAKE_SQLSTATE($sqlstate)\n";
}
close $errcodes;
+close $out;
diff --git a/src/tools/msvc/Solution.pm b/src/tools/msvc/Solution.pm
index dc78d3b65c3..40cd6020421 100644
--- a/src/tools/msvc/Solution.pm
+++ b/src/tools/msvc/Solution.pm
@@ -659,7 +659,7 @@ sub GenerateFiles
{
print "Generating errcodes.h...\n";
system(
- 'perl src/backend/utils/generate-errcodes.pl src/backend/utils/errcodes.txt > src/backend/utils/errcodes.h'
+ 'perl src/backend/utils/generate-errcodes.pl src/backend/utils/errcodes.txt src/backend/utils/errcodes.h'
);
copyFile('src/backend/utils/errcodes.h',
'src/include/utils/errcodes.h');
--
2.23.0.385.gbc12974a89
Attachment: v5-0009-meson-prereq-remove-unhelpful-chattiness-in-snowb.patch (text/x-diff; charset=us-ascii)
From b47c7af10cea83c78e8f3c75bf083c42d6adfd55 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Mon, 27 Sep 2021 15:41:24 -0700
Subject: [PATCH v5 09/16] meson: prereq: remove unhelpful chattiness in
snowball_create.pl.
---
src/backend/snowball/snowball_create.pl | 3 ---
1 file changed, 3 deletions(-)
diff --git a/src/backend/snowball/snowball_create.pl b/src/backend/snowball/snowball_create.pl
index d9d79f3668f..285cf4f5d90 100644
--- a/src/backend/snowball/snowball_create.pl
+++ b/src/backend/snowball/snowball_create.pl
@@ -42,7 +42,6 @@ sub GenerateTsearchFiles
my $target = shift;
my $output_file = "$output_path/snowball_create.sql";
- print "Generating tsearch script...";
my $F;
my $D;
my $tmpl = read_file("$input_path/snowball.sql.in");
@@ -88,11 +87,9 @@ sub GenerateTsearchFiles
$txt =~ s#_NONASCDICTNAME_#${lang}_stem#gs;
$txt =~ s#_STOPWORDS_#$stop#gs;
print $F $txt;
- print ".";
}
close($F);
close($D);
- print "\n";
return;
}
--
2.23.0.385.gbc12974a89
Attachment: v5-0010-meson-prereq-Can-we-get-away-with-not-export-all-.patch (text/x-diff; charset=us-ascii)
From bfcc214e472040810bb59110280e734b2c5f1f2b Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Wed, 29 Sep 2021 00:29:10 -0700
Subject: [PATCH v5 10/16] meson: prereq: Can we get away with not
export-all'ing libraries?
---
configure | 49 ++++++++++++++++++++++
configure.ac | 10 +++++
contrib/hstore/hstore.h | 16 +++----
contrib/ltree/ltree.h | 40 +++++++++---------
src/Makefile.global.in | 1 +
src/Makefile.shlib | 12 ++++++
src/include/c.h | 15 +++++--
src/include/fmgr.h | 6 ++-
src/include/jit/jit.h | 2 +-
src/include/pg_config.h.in | 3 ++
src/include/replication/output_plugin.h | 2 +
src/pl/plpython/plpy_elog.h | 8 ++--
src/pl/plpython/plpy_typeio.h | 18 ++++----
src/pl/plpython/plpy_util.h | 8 ++--
src/test/modules/test_shm_mq/test_shm_mq.h | 2 +-
src/test/modules/worker_spi/worker_spi.c | 2 +-
src/tools/msvc/Solution.pm | 1 +
17 files changed, 142 insertions(+), 53 deletions(-)
diff --git a/configure b/configure
index 1b5fd12a432..fd15801b34c 100755
--- a/configure
+++ b/configure
@@ -735,6 +735,7 @@ CPP
CFLAGS_SL
BITCODE_CXXFLAGS
BITCODE_CFLAGS
+CFLAGS_SL_MOD
CFLAGS_VECTORIZE
CFLAGS_UNROLL_LOOPS
PERMIT_DECLARATION_AFTER_STATEMENT
@@ -6421,6 +6422,54 @@ fi
if test -n "$NOT_THE_CFLAGS"; then
CFLAGS="$CFLAGS -Wno-stringop-truncation"
fi
+
+ # If the compiler knows how to hide symbols, use that. But only for shared libraries,
+ # for postgres itself that'd be too verbose for now.
+ { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether ${CC} supports -fvisibility=hidden, for CFLAGS_SL_MOD" >&5
+$as_echo_n "checking whether ${CC} supports -fvisibility=hidden, for CFLAGS_SL_MOD... " >&6; }
+if ${pgac_cv_prog_CC_cflags__fvisibility_hidden+:} false; then :
+ $as_echo_n "(cached) " >&6
+else
+ pgac_save_CFLAGS=$CFLAGS
+pgac_save_CC=$CC
+CC=${CC}
+CFLAGS="${CFLAGS_SL_MOD} -fvisibility=hidden"
+ac_save_c_werror_flag=$ac_c_werror_flag
+ac_c_werror_flag=yes
+cat confdefs.h - <<_ACEOF >conftest.$ac_ext
+/* end confdefs.h. */
+
+int
+main ()
+{
+
+ ;
+ return 0;
+}
+_ACEOF
+if ac_fn_c_try_compile "$LINENO"; then :
+ pgac_cv_prog_CC_cflags__fvisibility_hidden=yes
+else
+ pgac_cv_prog_CC_cflags__fvisibility_hidden=no
+fi
+rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext
+ac_c_werror_flag=$ac_save_c_werror_flag
+CFLAGS="$pgac_save_CFLAGS"
+CC="$pgac_save_CC"
+fi
+{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $pgac_cv_prog_CC_cflags__fvisibility_hidden" >&5
+$as_echo "$pgac_cv_prog_CC_cflags__fvisibility_hidden" >&6; }
+if test x"$pgac_cv_prog_CC_cflags__fvisibility_hidden" = x"yes"; then
+ CFLAGS_SL_MOD="${CFLAGS_SL_MOD} -fvisibility=hidden"
+fi
+
+
+ if test "$pgac_cv_prog_CC_cflags__fvisibility_hidden" = yes; then
+
+$as_echo "#define HAVE_VISIBILITY_ATTRIBUTE 1" >>confdefs.h
+
+ fi
+
elif test "$ICC" = yes; then
# Intel's compiler has a bug/misoptimization in checking for
# division by NAN (NaN == 0), -mp1 fixes it, so add it to the CFLAGS.
diff --git a/configure.ac b/configure.ac
index 44ee3ebe2f1..973f83db52c 100644
--- a/configure.ac
+++ b/configure.ac
@@ -541,6 +541,15 @@ if test "$GCC" = yes -a "$ICC" = no; then
if test -n "$NOT_THE_CFLAGS"; then
CFLAGS="$CFLAGS -Wno-stringop-truncation"
fi
+
+ # If the compiler knows how to hide symbols, use that. But only for shared libraries,
+ # for postgres itself that'd be too verbose for now.
+ PGAC_PROG_CC_VAR_OPT(CFLAGS_SL_MOD, [-fvisibility=hidden])
+ if test "$pgac_cv_prog_CC_cflags__fvisibility_hidden" = yes; then
+ AC_DEFINE(HAVE_VISIBILITY_ATTRIBUTE, 1,
+ [Define to 1 if your compiler knows the visibility("hidden") attribute.])
+ fi
+
elif test "$ICC" = yes; then
# Intel's compiler has a bug/misoptimization in checking for
# division by NAN (NaN == 0), -mp1 fixes it, so add it to the CFLAGS.
@@ -564,6 +573,7 @@ fi
AC_SUBST(CFLAGS_UNROLL_LOOPS)
AC_SUBST(CFLAGS_VECTORIZE)
+AC_SUBST(CFLAGS_SL_MOD)
# Determine flags used to emit bitcode for JIT inlining. Need to test
# for behaviour changing compiler flags, to keep compatibility with
diff --git a/contrib/hstore/hstore.h b/contrib/hstore/hstore.h
index bf4a565ed9b..625134c9f69 100644
--- a/contrib/hstore/hstore.h
+++ b/contrib/hstore/hstore.h
@@ -147,7 +147,7 @@ typedef struct
} while (0)
/* DatumGetHStoreP includes support for reading old-format hstore values */
-extern HStore *hstoreUpgrade(Datum orig);
+extern PGDLLEXPORT HStore *hstoreUpgrade(Datum orig);
#define DatumGetHStoreP(d) hstoreUpgrade(d)
@@ -168,14 +168,14 @@ typedef struct
bool needfree; /* need to pfree the value? */
} Pairs;
-extern int hstoreUniquePairs(Pairs *a, int32 l, int32 *buflen);
-extern HStore *hstorePairs(Pairs *pairs, int32 pcount, int32 buflen);
+extern PGDLLEXPORT int hstoreUniquePairs(Pairs *a, int32 l, int32 *buflen);
+extern PGDLLEXPORT HStore *hstorePairs(Pairs *pairs, int32 pcount, int32 buflen);
-extern size_t hstoreCheckKeyLen(size_t len);
-extern size_t hstoreCheckValLen(size_t len);
+extern PGDLLEXPORT size_t hstoreCheckKeyLen(size_t len);
+extern PGDLLEXPORT size_t hstoreCheckValLen(size_t len);
-extern int hstoreFindKey(HStore *hs, int *lowbound, char *key, int keylen);
-extern Pairs *hstoreArrayToPairs(ArrayType *a, int *npairs);
+extern PGDLLEXPORT int hstoreFindKey(HStore *hs, int *lowbound, char *key, int keylen);
+extern PGDLLEXPORT Pairs *hstoreArrayToPairs(ArrayType *a, int *npairs);
#define HStoreContainsStrategyNumber 7
#define HStoreExistsStrategyNumber 9
@@ -194,7 +194,7 @@ extern Pairs *hstoreArrayToPairs(ArrayType *a, int *npairs);
#if HSTORE_POLLUTE_NAMESPACE
#define HSTORE_POLLUTE(newname_,oldname_) \
PG_FUNCTION_INFO_V1(oldname_); \
- Datum newname_(PG_FUNCTION_ARGS); \
+ extern PGDLLEXPORT Datum newname_(PG_FUNCTION_ARGS); \
Datum oldname_(PG_FUNCTION_ARGS) { return newname_(fcinfo); } \
extern int no_such_variable
#else
diff --git a/contrib/ltree/ltree.h b/contrib/ltree/ltree.h
index 5b4be5e680a..d8bcdedbdbe 100644
--- a/contrib/ltree/ltree.h
+++ b/contrib/ltree/ltree.h
@@ -176,30 +176,30 @@ typedef struct
/* use in array iterator */
-Datum ltree_isparent(PG_FUNCTION_ARGS);
-Datum ltree_risparent(PG_FUNCTION_ARGS);
-Datum ltq_regex(PG_FUNCTION_ARGS);
-Datum ltq_rregex(PG_FUNCTION_ARGS);
-Datum lt_q_regex(PG_FUNCTION_ARGS);
-Datum lt_q_rregex(PG_FUNCTION_ARGS);
-Datum ltxtq_exec(PG_FUNCTION_ARGS);
-Datum ltxtq_rexec(PG_FUNCTION_ARGS);
-Datum _ltq_regex(PG_FUNCTION_ARGS);
-Datum _ltq_rregex(PG_FUNCTION_ARGS);
-Datum _lt_q_regex(PG_FUNCTION_ARGS);
-Datum _lt_q_rregex(PG_FUNCTION_ARGS);
-Datum _ltxtq_exec(PG_FUNCTION_ARGS);
-Datum _ltxtq_rexec(PG_FUNCTION_ARGS);
-Datum _ltree_isparent(PG_FUNCTION_ARGS);
-Datum _ltree_risparent(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltree_isparent(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltree_risparent(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltq_regex(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltq_rregex(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum lt_q_regex(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum lt_q_rregex(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltxtq_exec(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltxtq_rexec(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum _ltq_regex(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum _ltq_rregex(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum _lt_q_regex(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum _lt_q_rregex(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum _ltxtq_exec(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum _ltxtq_rexec(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum _ltree_isparent(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum _ltree_risparent(PG_FUNCTION_ARGS);
/* Concatenation functions */
-Datum ltree_addltree(PG_FUNCTION_ARGS);
-Datum ltree_addtext(PG_FUNCTION_ARGS);
-Datum ltree_textadd(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltree_addltree(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltree_addtext(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltree_textadd(PG_FUNCTION_ARGS);
/* Util function */
-Datum ltree_in(PG_FUNCTION_ARGS);
+PGDLLEXPORT Datum ltree_in(PG_FUNCTION_ARGS);
bool ltree_execute(ITEM *curitem, void *checkval,
bool calcnot, bool (*chkcond) (void *checkval, ITEM *val));
diff --git a/src/Makefile.global.in b/src/Makefile.global.in
index 533c12fef95..c7c701dcb93 100644
--- a/src/Makefile.global.in
+++ b/src/Makefile.global.in
@@ -259,6 +259,7 @@ SUN_STUDIO_CC = @SUN_STUDIO_CC@
CXX = @CXX@
CFLAGS = @CFLAGS@
CFLAGS_SL = @CFLAGS_SL@
+CFLAGS_SL_MOD = @CFLAGS_SL_MOD@
CFLAGS_UNROLL_LOOPS = @CFLAGS_UNROLL_LOOPS@
CFLAGS_VECTORIZE = @CFLAGS_VECTORIZE@
CFLAGS_SSE42 = @CFLAGS_SSE42@
diff --git a/src/Makefile.shlib b/src/Makefile.shlib
index 551023c6fb0..d36782aa942 100644
--- a/src/Makefile.shlib
+++ b/src/Makefile.shlib
@@ -253,6 +253,18 @@ ifeq ($(PORTNAME), win32)
endif
+# If the shared library doesn't have an export file, mark all symbols not
+# explicitly exported using PGDLLEXPORT as hidden. We can't pass these flags
+# when building a library with explicit exports, as the symbols would be
+# hidden before the linker script / exported symbol list takes effect.
+#
+# XXX: This probably isn't the best location, but it's not clear where instead?
+ifeq ($(SHLIB_EXPORTS),)
+ LDFLAGS += $(CFLAGS_SL_MOD)
+ override CFLAGS += $(CFLAGS_SL_MOD)
+ override CXXFLAGS += $(CFLAGS_SL_MOD)
+endif
+
##
## BUILD
diff --git a/src/include/c.h b/src/include/c.h
index c8ede082739..9b539a2657b 100644
--- a/src/include/c.h
+++ b/src/include/c.h
@@ -1312,11 +1312,18 @@ extern long long strtoll(const char *str, char **endptr, int base);
extern unsigned long long strtoull(const char *str, char **endptr, int base);
#endif
-/* no special DLL markers on most ports */
-#ifndef PGDLLIMPORT
-#define PGDLLIMPORT
+/*
+ * If the platform knows __attribute__((visibility("*"))), i.e. gcc like
+ * compilers, we use that.
+ */
+#if !defined(PGDLLIMPORT) && defined(HAVE_VISIBILITY_ATTRIBUTE)
+#define PGDLLIMPORT __attribute__((visibility("default")))
+#define PGDLLEXPORT __attribute__((visibility("default")))
#endif
-#ifndef PGDLLEXPORT
+
+/* No special DLL markers on the remaining ports. */
+#if !defined(PGDLLIMPORT)
+#define PGDLLIMPORT
#define PGDLLEXPORT
#endif
diff --git a/src/include/fmgr.h b/src/include/fmgr.h
index ab7b85c86e1..679443cca19 100644
--- a/src/include/fmgr.h
+++ b/src/include/fmgr.h
@@ -413,7 +413,7 @@ typedef const Pg_finfo_record *(*PGFInfoFunction) (void);
* info function, since authors shouldn't need to be explicitly aware of it.
*/
#define PG_FUNCTION_INFO_V1(funcname) \
-extern Datum funcname(PG_FUNCTION_ARGS); \
+extern PGDLLEXPORT Datum funcname(PG_FUNCTION_ARGS); \
extern PGDLLEXPORT const Pg_finfo_record * CppConcat(pg_finfo_,funcname)(void); \
const Pg_finfo_record * \
CppConcat(pg_finfo_,funcname) (void) \
@@ -424,6 +424,10 @@ CppConcat(pg_finfo_,funcname) (void) \
extern int no_such_variable
+extern PGDLLEXPORT void _PG_init(void);
+extern PGDLLEXPORT void _PG_fini(void);
+
+
/*-------------------------------------------------------------------------
* Support for verifying backend compatibility of loaded modules
*
diff --git a/src/include/jit/jit.h b/src/include/jit/jit.h
index b634df30b98..74617ad1b64 100644
--- a/src/include/jit/jit.h
+++ b/src/include/jit/jit.h
@@ -63,7 +63,7 @@ typedef struct JitContext
typedef struct JitProviderCallbacks JitProviderCallbacks;
-extern void _PG_jit_provider_init(JitProviderCallbacks *cb);
+extern PGDLLEXPORT void _PG_jit_provider_init(JitProviderCallbacks *cb);
typedef void (*JitProviderInit) (JitProviderCallbacks *cb);
typedef void (*JitProviderResetAfterErrorCB) (void);
typedef void (*JitProviderReleaseContextCB) (JitContext *context);
diff --git a/src/include/pg_config.h.in b/src/include/pg_config.h.in
index 15ffdd895aa..e3ab1c7752f 100644
--- a/src/include/pg_config.h.in
+++ b/src/include/pg_config.h.in
@@ -710,6 +710,9 @@
/* Define to 1 if you have the <uuid/uuid.h> header file. */
#undef HAVE_UUID_UUID_H
+/* Define to 1 if your compiler knows the visibility("hidden") attribute. */
+#undef HAVE_VISIBILITY_ATTRIBUTE
+
/* Define to 1 if you have the `wcstombs_l' function. */
#undef HAVE_WCSTOMBS_L
diff --git a/src/include/replication/output_plugin.h b/src/include/replication/output_plugin.h
index 810495ed0e4..a087f14dadd 100644
--- a/src/include/replication/output_plugin.h
+++ b/src/include/replication/output_plugin.h
@@ -35,6 +35,8 @@ typedef struct OutputPluginOptions
*/
typedef void (*LogicalOutputPluginInit) (struct OutputPluginCallbacks *cb);
+extern PGDLLEXPORT void _PG_output_plugin_init(struct OutputPluginCallbacks *cb);
+
/*
* Callback that gets called in a user-defined plugin. ctx->private_data can
* be set to some private data.
diff --git a/src/pl/plpython/plpy_elog.h b/src/pl/plpython/plpy_elog.h
index e02ef4ffe9f..aeade82ce10 100644
--- a/src/pl/plpython/plpy_elog.h
+++ b/src/pl/plpython/plpy_elog.h
@@ -34,13 +34,13 @@ extern PyObject *PLy_exc_spi_error;
} while(0)
#endif /* HAVE__BUILTIN_CONSTANT_P */
-extern void PLy_elog_impl(int elevel, const char *fmt,...) pg_attribute_printf(2, 3);
+extern PGDLLEXPORT void PLy_elog_impl(int elevel, const char *fmt,...) pg_attribute_printf(2, 3);
-extern void PLy_exception_set(PyObject *exc, const char *fmt,...) pg_attribute_printf(2, 3);
+extern PGDLLEXPORT void PLy_exception_set(PyObject *exc, const char *fmt,...) pg_attribute_printf(2, 3);
-extern void PLy_exception_set_plural(PyObject *exc, const char *fmt_singular, const char *fmt_plural,
+extern PGDLLEXPORT void PLy_exception_set_plural(PyObject *exc, const char *fmt_singular, const char *fmt_plural,
unsigned long n,...) pg_attribute_printf(2, 5) pg_attribute_printf(3, 5);
-extern void PLy_exception_set_with_details(PyObject *excclass, ErrorData *edata);
+extern PGDLLEXPORT void PLy_exception_set_with_details(PyObject *excclass, ErrorData *edata);
#endif /* PLPY_ELOG_H */
diff --git a/src/pl/plpython/plpy_typeio.h b/src/pl/plpython/plpy_typeio.h
index d11e6ae1b89..87e3b2c464e 100644
--- a/src/pl/plpython/plpy_typeio.h
+++ b/src/pl/plpython/plpy_typeio.h
@@ -147,29 +147,29 @@ struct PLyObToDatum
};
-extern PyObject *PLy_input_convert(PLyDatumToOb *arg, Datum val);
-extern Datum PLy_output_convert(PLyObToDatum *arg, PyObject *val,
+extern PGDLLEXPORT PyObject *PLy_input_convert(PLyDatumToOb *arg, Datum val);
+extern PGDLLEXPORT Datum PLy_output_convert(PLyObToDatum *arg, PyObject *val,
bool *isnull);
-extern PyObject *PLy_input_from_tuple(PLyDatumToOb *arg, HeapTuple tuple,
+extern PGDLLEXPORT PyObject *PLy_input_from_tuple(PLyDatumToOb *arg, HeapTuple tuple,
TupleDesc desc, bool include_generated);
-extern void PLy_input_setup_func(PLyDatumToOb *arg, MemoryContext arg_mcxt,
+extern PGDLLEXPORT void PLy_input_setup_func(PLyDatumToOb *arg, MemoryContext arg_mcxt,
Oid typeOid, int32 typmod,
struct PLyProcedure *proc);
-extern void PLy_output_setup_func(PLyObToDatum *arg, MemoryContext arg_mcxt,
+extern PGDLLEXPORT void PLy_output_setup_func(PLyObToDatum *arg, MemoryContext arg_mcxt,
Oid typeOid, int32 typmod,
struct PLyProcedure *proc);
-extern void PLy_input_setup_tuple(PLyDatumToOb *arg, TupleDesc desc,
+extern PGDLLEXPORT void PLy_input_setup_tuple(PLyDatumToOb *arg, TupleDesc desc,
struct PLyProcedure *proc);
-extern void PLy_output_setup_tuple(PLyObToDatum *arg, TupleDesc desc,
+extern PGDLLEXPORT void PLy_output_setup_tuple(PLyObToDatum *arg, TupleDesc desc,
struct PLyProcedure *proc);
-extern void PLy_output_setup_record(PLyObToDatum *arg, TupleDesc desc,
+extern PGDLLEXPORT void PLy_output_setup_record(PLyObToDatum *arg, TupleDesc desc,
struct PLyProcedure *proc);
/* conversion from Python objects to C strings --- exported for transforms */
-extern char *PLyObject_AsString(PyObject *plrv);
+extern PGDLLEXPORT char *PLyObject_AsString(PyObject *plrv);
#endif /* PLPY_TYPEIO_H */
diff --git a/src/pl/plpython/plpy_util.h b/src/pl/plpython/plpy_util.h
index 7c6577925ea..6f491b0f95b 100644
--- a/src/pl/plpython/plpy_util.h
+++ b/src/pl/plpython/plpy_util.h
@@ -8,10 +8,10 @@
#include "plpython.h"
-extern PyObject *PLyUnicode_Bytes(PyObject *unicode);
-extern char *PLyUnicode_AsString(PyObject *unicode);
+extern PGDLLEXPORT PyObject *PLyUnicode_Bytes(PyObject *unicode);
+extern PGDLLEXPORT char *PLyUnicode_AsString(PyObject *unicode);
-extern PyObject *PLyUnicode_FromString(const char *s);
-extern PyObject *PLyUnicode_FromStringAndSize(const char *s, Py_ssize_t size);
+extern PGDLLEXPORT PyObject *PLyUnicode_FromString(const char *s);
+extern PGDLLEXPORT PyObject *PLyUnicode_FromStringAndSize(const char *s, Py_ssize_t size);
#endif /* PLPY_UTIL_H */
diff --git a/src/test/modules/test_shm_mq/test_shm_mq.h b/src/test/modules/test_shm_mq/test_shm_mq.h
index a6661218347..a7a36714a48 100644
--- a/src/test/modules/test_shm_mq/test_shm_mq.h
+++ b/src/test/modules/test_shm_mq/test_shm_mq.h
@@ -40,6 +40,6 @@ extern void test_shm_mq_setup(int64 queue_size, int32 nworkers,
shm_mq_handle **input);
/* Main entrypoint for a worker. */
-extern void test_shm_mq_main(Datum) pg_attribute_noreturn();
+extern PGDLLEXPORT void test_shm_mq_main(Datum) pg_attribute_noreturn();
#endif
diff --git a/src/test/modules/worker_spi/worker_spi.c b/src/test/modules/worker_spi/worker_spi.c
index 0b6246676b6..e267bc3cffa 100644
--- a/src/test/modules/worker_spi/worker_spi.c
+++ b/src/test/modules/worker_spi/worker_spi.c
@@ -47,7 +47,7 @@ PG_MODULE_MAGIC;
PG_FUNCTION_INFO_V1(worker_spi_launch);
void _PG_init(void);
-void worker_spi_main(Datum) pg_attribute_noreturn();
+PGDLLEXPORT void worker_spi_main(Datum) pg_attribute_noreturn();
/* GUC variables */
static int worker_spi_naptime = 10;
diff --git a/src/tools/msvc/Solution.pm b/src/tools/msvc/Solution.pm
index 40cd6020421..85b877b80b2 100644
--- a/src/tools/msvc/Solution.pm
+++ b/src/tools/msvc/Solution.pm
@@ -432,6 +432,7 @@ sub GenerateFiles
HAVE_WINLDAP_H => undef,
HAVE_WCSTOMBS_L => 1,
HAVE_WCTYPE_H => 1,
+ HAVE_VISIBILITY_ATTRIBUTE => undef,
HAVE_WRITEV => undef,
HAVE_X509_GET_SIGNATURE_NID => 1,
HAVE_X86_64_POPCNTQ => undef,
--
2.23.0.385.gbc12974a89
Attachment: v5-0011-meson-prereq-Handle-DLSUFFIX-in-msvc-builds-simil.patch (text/x-diff; charset=us-ascii)
From c1571bfd8f3eb811ad0f8916d189f94d3f98ec80 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Thu, 30 Sep 2021 10:20:24 -0700
Subject: [PATCH v5 11/16] meson: prereq: Handle DLSUFFIX in msvc builds
similar to other build envs.
---
src/include/port/win32_port.h | 3 ---
src/tools/msvc/Mkvcbuild.pm | 3 +++
2 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/src/include/port/win32_port.h b/src/include/port/win32_port.h
index c1c4831595a..72b2d2b5a01 100644
--- a/src/include/port/win32_port.h
+++ b/src/include/port/win32_port.h
@@ -529,9 +529,6 @@ typedef unsigned short mode_t;
#define W_OK 2
#define R_OK 4
-/* Pulled from Makefile.port in MinGW */
-#define DLSUFFIX ".dll"
-
#endif /* _MSC_VER */
#if (defined(_MSC_VER) && (_MSC_VER < 1900)) || \
diff --git a/src/tools/msvc/Mkvcbuild.pm b/src/tools/msvc/Mkvcbuild.pm
index b8e62c6d3f7..47b5c43357a 100644
--- a/src/tools/msvc/Mkvcbuild.pm
+++ b/src/tools/msvc/Mkvcbuild.pm
@@ -195,6 +195,7 @@ sub mkvcbuild
'syncrep_gram.y');
$postgres->AddFiles('src/backend/utils/adt', 'jsonpath_scan.l',
'jsonpath_gram.y');
+ $postgres->AddDefine('DLSUFFIX=".dll"');
$postgres->AddDefine('BUILDING_DLL');
$postgres->AddLibrary('secur32.lib');
$postgres->AddLibrary('ws2_32.lib');
@@ -298,6 +299,7 @@ sub mkvcbuild
my $libecpg = $solution->AddProject('libecpg', 'dll', 'interfaces',
'src/interfaces/ecpg/ecpglib');
$libecpg->AddDefine('FRONTEND');
+ $libecpg->AddDefine('DLSUFFIX=".dll"');
$libecpg->AddIncludeDir('src/interfaces/ecpg/include');
$libecpg->AddIncludeDir('src/interfaces/libpq');
$libecpg->AddIncludeDir('src/port');
@@ -845,6 +847,7 @@ sub mkvcbuild
$pgregress->AddFile('src/test/regress/pg_regress.c');
$pgregress->AddFile('src/test/regress/pg_regress_main.c');
$pgregress->AddIncludeDir('src/port');
+ $pgregress->AddDefine('DLSUFFIX=".dll"');
$pgregress->AddDefine('HOST_TUPLE="i686-pc-win32vc"');
$pgregress->AddLibrary('ws2_32.lib');
$pgregress->AddDirResourceFile('src/test/regress');
--
2.23.0.385.gbc12974a89
Attachment: v5-0012-prereq-make-unicode-targets-work-in-vpath-builds.patch (text/x-diff; charset=us-ascii)
From 386bab39b951a71336c57611f8ff10977995ec9f Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Wed, 27 Oct 2021 09:59:33 -0700
Subject: [PATCH v5 12/16] prereq: make unicode targets work in vpath builds.
---
contrib/unaccent/Makefile | 4 ++--
src/common/unicode/Makefile | 8 ++++----
src/common/unicode/generate-unicode_norm_table.pl | 11 ++++++-----
3 files changed, 12 insertions(+), 11 deletions(-)
diff --git a/contrib/unaccent/Makefile b/contrib/unaccent/Makefile
index b8307d1601e..d6c466e07ad 100644
--- a/contrib/unaccent/Makefile
+++ b/contrib/unaccent/Makefile
@@ -27,12 +27,12 @@ include $(top_builddir)/src/Makefile.global
include $(top_srcdir)/contrib/contrib-global.mk
endif
-update-unicode: unaccent.rules
+update-unicode: $(srcdir)/unaccent.rules
# Allow running this even without --with-python
PYTHON ?= python
-unaccent.rules: generate_unaccent_rules.py ../../src/common/unicode/UnicodeData.txt Latin-ASCII.xml
+$(srcdir)/unaccent.rules: generate_unaccent_rules.py ../../src/common/unicode/UnicodeData.txt Latin-ASCII.xml
$(PYTHON) $< --unicode-data-file $(word 2,$^) --latin-ascii-file $(word 3,$^) >$@
# Only download it once; dependencies must match src/common/unicode/
diff --git a/src/common/unicode/Makefile b/src/common/unicode/Makefile
index a3683dd86b9..40a5f7bc0fe 100644
--- a/src/common/unicode/Makefile
+++ b/src/common/unicode/Makefile
@@ -12,14 +12,14 @@ subdir = src/common/unicode
top_builddir = ../../..
include $(top_builddir)/src/Makefile.global
-override CPPFLAGS := -DFRONTEND $(CPPFLAGS)
+override CPPFLAGS := -DFRONTEND -I$(abs_top_builddir)/src/common/unicode $(CPPFLAGS)
LIBS += $(PTHREAD_LIBS)
# By default, do nothing.
all:
update-unicode: unicode_norm_table.h unicode_combining_table.h unicode_east_asian_fw_table.h unicode_normprops_table.h unicode_norm_hashfunc.h
- mv $^ ../../../src/include/common/
+ mv $^ $(top_srcdir)/src/include/common/
$(MAKE) normalization-check
# These files are part of the Unicode Character Database. Download
@@ -33,7 +33,7 @@ UnicodeData.txt EastAsianWidth.txt DerivedNormalizationProps.txt CompositionExcl
unicode_norm_hashfunc.h: unicode_norm_table.h
unicode_norm_table.h: generate-unicode_norm_table.pl UnicodeData.txt CompositionExclusions.txt
- $(PERL) generate-unicode_norm_table.pl
+ $(PERL) $< $(CURDIR)
unicode_combining_table.h: generate-unicode_combining_table.pl UnicodeData.txt
$(PERL) $^ >$@
@@ -58,7 +58,7 @@ submake-common:
$(MAKE) -C .. all
norm_test_table.h: generate-norm_test_table.pl NormalizationTest.txt
- perl generate-norm_test_table.pl NormalizationTest.txt $@
+ perl $^ $@
.PHONY: normalization-check
diff --git a/src/common/unicode/generate-unicode_norm_table.pl b/src/common/unicode/generate-unicode_norm_table.pl
index 114ab30d3f1..4d2c603ff27 100644
--- a/src/common/unicode/generate-unicode_norm_table.pl
+++ b/src/common/unicode/generate-unicode_norm_table.pl
@@ -15,15 +15,16 @@ use FindBin;
use lib "$FindBin::RealBin/../../tools/";
use PerfectHash;
-my $output_table_file = "unicode_norm_table.h";
-my $output_func_file = "unicode_norm_hashfunc.h";
+my $directory = $ARGV[0];
+my $output_table_file = "$directory/unicode_norm_table.h";
+my $output_func_file = "$directory/unicode_norm_hashfunc.h";
my $FH;
# Read list of codes that should be excluded from re-composition.
my @composition_exclusion_codes = ();
-open($FH, '<', "CompositionExclusions.txt")
- or die "Could not open CompositionExclusions.txt: $!.";
+open($FH, '<', "$directory/CompositionExclusions.txt")
+ or die "Could not open $directory/CompositionExclusions.txt: $!.";
while (my $line = <$FH>)
{
if ($line =~ /^([[:xdigit:]]+)/)
@@ -38,7 +39,7 @@ close $FH;
# and character decomposition mapping
my @characters = ();
my %character_hash = ();
-open($FH, '<', "UnicodeData.txt")
+open($FH, '<', "$directory/UnicodeData.txt")
or die "Could not open UnicodeData.txt: $!.";
while (my $line = <$FH>)
{
--
2.23.0.385.gbc12974a89
Attachment: v5-0013-wip-don-t-run-ldap-tests-on-windows.patch (text/x-diff; charset=us-ascii)
From 418ac644ce3c51c73c50166f126392610c5c2d0e Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Sun, 10 Oct 2021 13:49:12 -0700
Subject: [PATCH v5 13/16] wip: don't run ldap tests on windows.
---
src/test/ldap/t/001_auth.pl | 7 +++++++
1 file changed, 7 insertions(+)
diff --git a/src/test/ldap/t/001_auth.pl b/src/test/ldap/t/001_auth.pl
index 5a9a0098327..8850fc1cb1b 100644
--- a/src/test/ldap/t/001_auth.pl
+++ b/src/test/ldap/t/001_auth.pl
@@ -6,6 +6,13 @@ use warnings;
use PostgreSQL::Test::Utils;
use PostgreSQL::Test::Cluster;
use Test::More;
+use Config;
+
+if ($Config{osname} eq 'MSWin32')
+{
+	plan skip_all => 'ldap tests not supported on Windows';
+ exit;
+}
if ($ENV{with_ldap} eq 'yes')
{
--
2.23.0.385.gbc12974a89
Attachment: v5-0014-wip-split-TESTDIR-into-two.patch (text/x-diff; charset=us-ascii)
From fb2933a9bba05d79ce920f9f161a732c615c4897 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Wed, 27 Oct 2021 10:10:37 -0700
Subject: [PATCH v5 14/16] wip: split TESTDIR into two.
---
src/Makefile.global.in | 9 ++++---
src/bin/psql/t/010_tab_completion.pl | 34 +++++++++++++-------------
src/test/perl/PostgreSQL/Test/Utils.pm | 2 +-
src/tools/msvc/vcregress.pl | 2 ++
4 files changed, 26 insertions(+), 21 deletions(-)
diff --git a/src/Makefile.global.in b/src/Makefile.global.in
index c7c701dcb93..014029c9a9c 100644
--- a/src/Makefile.global.in
+++ b/src/Makefile.global.in
@@ -450,7 +450,8 @@ define prove_installcheck
rm -rf '$(CURDIR)'/tmp_check
$(MKDIR_P) '$(CURDIR)'/tmp_check
cd $(srcdir) && \
- TESTDIR='$(CURDIR)' PATH="$(bindir):$(CURDIR):$$PATH" \
+ TESTOUTDIR='$(CURDIR)/tmp_check' TESTDIR='$(CURDIR)' \
+ PATH="$(bindir):$(CURDIR):$$PATH" \
PGPORT='6$(DEF_PGPORT)' top_builddir='$(CURDIR)/$(top_builddir)' \
PG_REGRESS='$(CURDIR)/$(top_builddir)/src/test/regress/pg_regress' \
$(PROVE) $(PG_PROVE_FLAGS) $(PROVE_FLAGS) $(if $(PROVE_TESTS),$(PROVE_TESTS),t/*.pl)
@@ -460,8 +461,9 @@ define prove_installcheck
rm -rf '$(CURDIR)'/tmp_check
$(MKDIR_P) '$(CURDIR)'/tmp_check
cd $(srcdir) && \
- TESTDIR='$(CURDIR)' PATH="$(bindir):$(CURDIR):$$PATH" \
- PGPORT='6$(DEF_PGPORT)' top_builddir='$(top_builddir)' \
+ TESTOUTDIR='$(CURDIR)/tmp_check' TESTDIR='$(CURDIR)' \
+ PATH="$(bindir):$(CURDIR):$$PATH" PGPORT='6$(DEF_PGPORT)' \
+ top_builddir='$(top_builddir)' \
PG_REGRESS='$(top_builddir)/src/test/regress/pg_regress' \
$(PROVE) $(PG_PROVE_FLAGS) $(PROVE_FLAGS) $(if $(PROVE_TESTS),$(PROVE_TESTS),t/*.pl)
endef
@@ -471,6 +473,7 @@ define prove_check
rm -rf '$(CURDIR)'/tmp_check
$(MKDIR_P) '$(CURDIR)'/tmp_check
cd $(srcdir) && \
+ TESTOUTDIR='$(CURDIR)/tmp_check' \
TESTDIR='$(CURDIR)' $(with_temp_install) PGPORT='6$(DEF_PGPORT)' \
PG_REGRESS='$(CURDIR)/$(top_builddir)/src/test/regress/pg_regress' \
$(PROVE) $(PG_PROVE_FLAGS) $(PROVE_FLAGS) $(if $(PROVE_TESTS),$(PROVE_TESTS),t/*.pl)
diff --git a/src/bin/psql/t/010_tab_completion.pl b/src/bin/psql/t/010_tab_completion.pl
index 55b318517ea..33f6a6c12fd 100644
--- a/src/bin/psql/t/010_tab_completion.pl
+++ b/src/bin/psql/t/010_tab_completion.pl
@@ -67,23 +67,23 @@ delete $ENV{LS_COLORS};
# to run in the build directory so that we can use relative paths to
# access the tmp_check subdirectory; otherwise the output from filename
# completion tests is too variable.
-if ($ENV{TESTDIR})
+if ($ENV{TESTOUTDIR})
{
- chdir $ENV{TESTDIR} or die "could not chdir to \"$ENV{TESTDIR}\": $!";
+ chdir "$ENV{TESTOUTDIR}" or die "could not chdir to \"$ENV{TESTOUTDIR}\": $!";
}
# Create some junk files for filename completion testing.
my $FH;
-open $FH, ">", "tmp_check/somefile"
- or die("could not create file \"tmp_check/somefile\": $!");
+open $FH, ">", "somefile"
+ or die("could not create file \"somefile\": $!");
print $FH "some stuff\n";
close $FH;
-open $FH, ">", "tmp_check/afile123"
- or die("could not create file \"tmp_check/afile123\": $!");
+open $FH, ">", "afile123"
+ or die("could not create file \"afile123\": $!");
print $FH "more stuff\n";
close $FH;
-open $FH, ">", "tmp_check/afile456"
- or die("could not create file \"tmp_check/afile456\": $!");
+open $FH, ">", "afile456"
+ or die("could not create file \"afile456\": $!");
print $FH "other stuff\n";
close $FH;
@@ -184,16 +184,16 @@ clear_query();
# check filename completion
check_completion(
- "\\lo_import tmp_check/some\t",
- qr|tmp_check/somefile |,
+ "\\lo_import some\t",
+ qr|somefile |,
"filename completion with one possibility");
clear_query();
# note: readline might print a bell before the completion
check_completion(
- "\\lo_import tmp_check/af\t",
- qr|tmp_check/af\a?ile|,
+ "\\lo_import af\t",
+ qr|af\a?ile|,
"filename completion with multiple possibilities");
clear_query();
@@ -202,15 +202,15 @@ clear_query();
# note: broken versions of libedit want to backslash the closing quote;
# not much we can do about that
check_completion(
- "COPY foo FROM tmp_check/some\t",
- qr|'tmp_check/somefile\\?' |,
+ "COPY foo FROM some\t",
+ qr|'somefile\\?' |,
"quoted filename completion with one possibility");
clear_line();
check_completion(
- "COPY foo FROM tmp_check/af\t",
- qr|'tmp_check/afile|,
+ "COPY foo FROM af\t",
+ qr|'afile|,
"quoted filename completion with multiple possibilities");
# some versions of readline/libedit require two tabs here, some only need one
@@ -218,7 +218,7 @@ check_completion(
# the quotes might appear, too
check_completion(
"\t\t",
- qr|afile123'? +'?(tmp_check/)?afile456|,
+ qr|afile123'? +'?afile456|,
"offer multiple file choices");
clear_line();
diff --git a/src/test/perl/PostgreSQL/Test/Utils.pm b/src/test/perl/PostgreSQL/Test/Utils.pm
index f29d43f1f32..7878bc4ef48 100644
--- a/src/test/perl/PostgreSQL/Test/Utils.pm
+++ b/src/test/perl/PostgreSQL/Test/Utils.pm
@@ -187,7 +187,7 @@ INIT
# Determine output directories, and create them. The base path is the
# TESTDIR environment variable, which is normally set by the invoking
# Makefile.
- $tmp_check = $ENV{TESTDIR} ? "$ENV{TESTDIR}/tmp_check" : "tmp_check";
+ $tmp_check = $ENV{TESTOUTDIR} ? "$ENV{TESTOUTDIR}" : "tmp_check";
$log_path = "$tmp_check/log";
mkdir $tmp_check;
diff --git a/src/tools/msvc/vcregress.pl b/src/tools/msvc/vcregress.pl
index fc826da3ff2..cf099033ebd 100644
--- a/src/tools/msvc/vcregress.pl
+++ b/src/tools/msvc/vcregress.pl
@@ -252,6 +252,8 @@ sub tap_check
# add the module build dir as the second element in the PATH
$ENV{PATH} =~ s!;!;$topdir/$Config/$module;!;
+ $ENV{TESTOUTDIR} = "$dir/tmp_check";
+
rmtree('tmp_check');
system(@args);
my $status = $? >> 8;
--
2.23.0.385.gbc12974a89
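(To make the TESTDIR/TESTOUTDIR split concrete, a prove_check run now ends up with an
environment roughly like the following, paths purely illustrative:

    TESTDIR=/path/to/build/src/bin/psql               # per-test directory, as before
    TESTOUTDIR=/path/to/build/src/bin/psql/tmp_check  # where Utils.pm now roots tmp_check/ and log/

which is what allows 010_tab_completion.pl to chdir straight into the output directory and drop
the tmp_check/ prefixes from its filename-completion expectations.)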
Attachment: v5-0015-meson-Add-draft-of-a-meson-based-buildsystem.patch (text/x-diff)
From 8152a34caee2ce97112a860c01b1c6a0b4cef73f Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Fri, 10 Sep 2021 09:51:51 -0700
Subject: [PATCH v5 15/16] meson: Add draft of a meson based buildsystem.
Author: Andres Freund
Author: Thomas Munro
Author: John Naylor <john.naylor@enterprisedb.com>
---
configure | 6 +
configure.ac | 6 +
contrib/adminpack/meson.build | 20 +
contrib/amcheck/meson.build | 37 +
contrib/auth_delay/meson.build | 4 +
contrib/auto_explain/meson.build | 13 +
contrib/bloom/meson.build | 38 +
contrib/bool_plperl/meson.build | 37 +
contrib/btree_gin/meson.build | 51 +
contrib/btree_gist/meson.build | 79 +
contrib/citext/meson.build | 29 +
contrib/cube/meson.build | 42 +
contrib/dblink/meson.build | 29 +
contrib/dict_int/meson.build | 19 +
contrib/dict_xsyn/meson.build | 26 +
contrib/earthdistance/meson.build | 20 +
contrib/file_fdw/meson.build | 19 +
contrib/fuzzystrmatch/meson.build | 23 +
contrib/hstore/meson.build | 36 +
contrib/hstore_plperl/meson.build | 38 +
contrib/hstore_plpython/meson.build | 34 +
contrib/intagg/meson.build | 6 +
contrib/intarray/meson.build | 34 +
contrib/isn/meson.build | 25 +
contrib/jsonb_plperl/meson.build | 37 +
contrib/jsonb_plpython/meson.build | 33 +
contrib/lo/meson.build | 24 +
contrib/ltree/meson.build | 36 +
contrib/ltree_plpython/meson.build | 34 +
contrib/meson.build | 63 +
contrib/oid2name/meson.build | 14 +
contrib/old_snapshot/meson.build | 14 +
contrib/pageinspect/meson.build | 45 +
contrib/passwordcheck/meson.build | 27 +
contrib/pg_buffercache/meson.build | 16 +
contrib/pg_freespacemap/meson.build | 15 +
contrib/pg_prewarm/meson.build | 16 +
contrib/pg_stat_statements/meson.build | 31 +
contrib/pg_surgery/meson.build | 23 +
contrib/pg_trgm/meson.build | 33 +
contrib/pg_visibility/meson.build | 25 +
contrib/pgcrypto/meson.build | 117 +
contrib/pgrowlocks/meson.build | 15 +
contrib/pgstattuple/meson.build | 30 +
contrib/postgres_fdw/meson.build | 31 +
contrib/seg/meson.build | 40 +
contrib/sepgsql/meson.build | 34 +
contrib/spi/meson.build | 43 +
contrib/sslinfo/meson.build | 21 +
contrib/tablefunc/meson.build | 23 +
contrib/tcn/meson.build | 13 +
contrib/test_decoding/meson.build | 69 +
contrib/tsm_system_rows/meson.build | 22 +
contrib/tsm_system_time/meson.build | 22 +
contrib/unaccent/meson.build | 30 +
contrib/uuid-ossp/meson.build | 31 +
contrib/vacuumlo/meson.build | 14 +
contrib/xml2/meson.build | 30 +
conversion_helpers.txt | 6 +
doc/src/sgml/meson.build | 241 ++
doc/src/sgml/resolv.xsl | 7 +
doc/src/sgml/version.sgml.in | 2 +
meson.build | 2130 +++++++++++++++++
meson_options.txt | 90 +
src/backend/access/brin/meson.build | 12 +
src/backend/access/common/meson.build | 18 +
src/backend/access/gin/meson.build | 17 +
src/backend/access/gist/meson.build | 13 +
src/backend/access/hash/meson.build | 12 +
src/backend/access/heap/meson.build | 11 +
src/backend/access/index/meson.build | 6 +
src/backend/access/meson.build | 13 +
src/backend/access/nbtree/meson.build | 13 +
src/backend/access/rmgrdesc/meson.build | 26 +
src/backend/access/spgist/meson.build | 13 +
src/backend/access/table/meson.build | 6 +
src/backend/access/tablesample/meson.build | 5 +
src/backend/access/transam/meson.build | 28 +
src/backend/bootstrap/meson.build | 12 +
src/backend/catalog/meson.build | 41 +
src/backend/commands/meson.build | 50 +
src/backend/executor/meson.build | 67 +
src/backend/foreign/meson.build | 3 +
src/backend/jit/llvm/meson.build | 41 +
src/backend/jit/meson.build | 3 +
src/backend/lib/meson.build | 12 +
src/backend/libpq/meson.build | 28 +
src/backend/main/meson.build | 2 +
src/backend/meson.build | 197 ++
src/backend/nodes/meson.build | 17 +
src/backend/optimizer/geqo/meson.build | 17 +
src/backend/optimizer/meson.build | 5 +
src/backend/optimizer/path/meson.build | 11 +
src/backend/optimizer/plan/meson.build | 10 +
src/backend/optimizer/prep/meson.build | 7 +
src/backend/optimizer/util/meson.build | 16 +
src/backend/parser/meson.build | 43 +
src/backend/partitioning/meson.build | 5 +
src/backend/port/meson.build | 28 +
src/backend/port/win32/meson.build | 6 +
src/backend/postmaster/meson.build | 15 +
src/backend/regex/meson.build | 15 +
.../replication/libpqwalreceiver/meson.build | 13 +
src/backend/replication/logical/meson.build | 14 +
src/backend/replication/meson.build | 42 +
src/backend/replication/pgoutput/meson.build | 11 +
src/backend/rewrite/meson.build | 9 +
src/backend/snowball/meson.build | 83 +
src/backend/statistics/meson.build | 6 +
src/backend/storage/buffer/meson.build | 7 +
src/backend/storage/file/meson.build | 8 +
src/backend/storage/freespace/meson.build | 5 +
src/backend/storage/ipc/meson.build | 20 +
src/backend/storage/large_object/meson.build | 3 +
src/backend/storage/lmgr/meson.build | 18 +
src/backend/storage/meson.build | 9 +
src/backend/storage/page/meson.build | 5 +
src/backend/storage/smgr/meson.build | 4 +
src/backend/storage/sync/meson.build | 4 +
src/backend/tcop/meson.build | 8 +
src/backend/tsearch/meson.build | 21 +
src/backend/utils/activity/meson.build | 5 +
src/backend/utils/adt/meson.build | 118 +
src/backend/utils/cache/meson.build | 16 +
src/backend/utils/error/meson.build | 4 +
src/backend/utils/fmgr/meson.build | 8 +
src/backend/utils/hash/meson.build | 4 +
src/backend/utils/init/meson.build | 4 +
.../utils/mb/conversion_procs/meson.build | 38 +
src/backend/utils/mb/meson.build | 9 +
src/backend/utils/meson.build | 13 +
src/backend/utils/misc/meson.build | 28 +
src/backend/utils/mmgr/meson.build | 10 +
src/backend/utils/resowner/meson.build | 3 +
src/backend/utils/sort/meson.build | 7 +
src/backend/utils/time/meson.build | 4 +
src/bin/initdb/meson.build | 24 +
src/bin/meson.build | 20 +
src/bin/pg_amcheck/meson.build | 22 +
src/bin/pg_archivecleanup/meson.build | 14 +
src/bin/pg_basebackup/meson.build | 44 +
src/bin/pg_checksums/meson.build | 16 +
src/bin/pg_config/meson.build | 14 +
src/bin/pg_controldata/meson.build | 14 +
src/bin/pg_ctl/meson.build | 17 +
src/bin/pg_dump/meson.build | 69 +
src/bin/pg_resetwal/meson.build | 15 +
src/bin/pg_rewind/meson.build | 34 +
src/bin/pg_test_fsync/meson.build | 14 +
src/bin/pg_test_timing/meson.build | 14 +
src/bin/pg_upgrade/meson.build | 26 +
src/bin/pg_verifybackup/meson.build | 25 +
src/bin/pg_waldump/meson.build | 23 +
src/bin/pgbench/meson.build | 38 +
src/bin/pgevent/meson.build | 32 +
src/bin/psql/meson.build | 65 +
src/bin/scripts/meson.build | 46 +
src/common/meson.build | 155 ++
src/common/unicode/meson.build | 99 +
src/fe_utils/meson.build | 27 +
src/include/catalog/meson.build | 129 +
src/include/meson.build | 50 +
src/include/parser/meson.build | 10 +
src/include/pch/c_pch.h | 1 +
src/include/pch/postgres_pch.h | 1 +
src/include/pg_config_ext.h.meson | 7 +
src/include/storage/meson.build | 16 +
src/include/utils/meson.build | 22 +
src/interfaces/libpq/meson.build | 99 +
src/meson.build | 10 +
src/pl/meson.build | 4 +
src/pl/plperl/meson.build | 81 +
src/pl/plpgsql/meson.build | 1 +
src/pl/plpgsql/src/meson.build | 67 +
src/pl/plpython/meson.build | 78 +
src/port/meson.build | 191 ++
src/port/win32ver.rc.in | 41 +
src/test/authentication/meson.build | 9 +
src/test/isolation/meson.build | 49 +
src/test/kerberos/meson.build | 12 +
src/test/ldap/meson.build | 9 +
src/test/meson.build | 19 +
src/test/modules/brin/meson.build | 19 +
src/test/modules/commit_ts/meson.build | 20 +
src/test/modules/delay_execution/meson.build | 15 +
src/test/modules/dummy_index_am/meson.build | 20 +
src/test/modules/dummy_seclabel/meson.build | 20 +
src/test/modules/libpq_pipeline/meson.build | 21 +
src/test/modules/meson.build | 25 +
src/test/modules/plsample/meson.build | 20 +
src/test/modules/snapshot_too_old/meson.build | 11 +
src/test/modules/spgist_name_ops/meson.build | 20 +
.../ssl_passphrase_callback/meson.build | 45 +
src/test/modules/test_bloomfilter/meson.build | 20 +
src/test/modules/test_ddl_deparse/meson.build | 40 +
src/test/modules/test_extensions/meson.build | 38 +
.../modules/test_ginpostinglist/meson.build | 20 +
src/test/modules/test_integerset/meson.build | 20 +
src/test/modules/test_misc/meson.build | 8 +
src/test/modules/test_parser/meson.build | 20 +
src/test/modules/test_pg_dump/meson.build | 24 +
src/test/modules/test_predtest/meson.build | 20 +
src/test/modules/test_rbtree/meson.build | 20 +
src/test/modules/test_regex/meson.build | 21 +
src/test/modules/test_rls_hooks/meson.build | 19 +
src/test/modules/test_shm_mq/meson.build | 24 +
src/test/modules/unsafe_tests/meson.build | 9 +
src/test/modules/worker_spi/meson.build | 23 +
src/test/recovery/meson.build | 33 +
src/test/regress/meson.build | 57 +
src/test/ssl/meson.build | 10 +
src/test/subscription/meson.build | 34 +
src/timezone/meson.build | 50 +
src/timezone/tznames/meson.build | 20 +
src/tools/find_meson | 20 +
src/tools/irlink | 28 +
src/tools/msvc/export2def.pl | 22 +
src/tools/msvc/gendef2.pl | 177 ++
.../relativize_shared_library_references | 84 +
src/tools/relpath.py | 6 +
src/tools/testwrap | 22 +
221 files changed, 8521 insertions(+)
create mode 100644 contrib/adminpack/meson.build
create mode 100644 contrib/amcheck/meson.build
create mode 100644 contrib/auth_delay/meson.build
create mode 100644 contrib/auto_explain/meson.build
create mode 100644 contrib/bloom/meson.build
create mode 100644 contrib/bool_plperl/meson.build
create mode 100644 contrib/btree_gin/meson.build
create mode 100644 contrib/btree_gist/meson.build
create mode 100644 contrib/citext/meson.build
create mode 100644 contrib/cube/meson.build
create mode 100644 contrib/dblink/meson.build
create mode 100644 contrib/dict_int/meson.build
create mode 100644 contrib/dict_xsyn/meson.build
create mode 100644 contrib/earthdistance/meson.build
create mode 100644 contrib/file_fdw/meson.build
create mode 100644 contrib/fuzzystrmatch/meson.build
create mode 100644 contrib/hstore/meson.build
create mode 100644 contrib/hstore_plperl/meson.build
create mode 100644 contrib/hstore_plpython/meson.build
create mode 100644 contrib/intagg/meson.build
create mode 100644 contrib/intarray/meson.build
create mode 100644 contrib/isn/meson.build
create mode 100644 contrib/jsonb_plperl/meson.build
create mode 100644 contrib/jsonb_plpython/meson.build
create mode 100644 contrib/lo/meson.build
create mode 100644 contrib/ltree/meson.build
create mode 100644 contrib/ltree_plpython/meson.build
create mode 100644 contrib/meson.build
create mode 100644 contrib/oid2name/meson.build
create mode 100644 contrib/old_snapshot/meson.build
create mode 100644 contrib/pageinspect/meson.build
create mode 100644 contrib/passwordcheck/meson.build
create mode 100644 contrib/pg_buffercache/meson.build
create mode 100644 contrib/pg_freespacemap/meson.build
create mode 100644 contrib/pg_prewarm/meson.build
create mode 100644 contrib/pg_stat_statements/meson.build
create mode 100644 contrib/pg_surgery/meson.build
create mode 100644 contrib/pg_trgm/meson.build
create mode 100644 contrib/pg_visibility/meson.build
create mode 100644 contrib/pgcrypto/meson.build
create mode 100644 contrib/pgrowlocks/meson.build
create mode 100644 contrib/pgstattuple/meson.build
create mode 100644 contrib/postgres_fdw/meson.build
create mode 100644 contrib/seg/meson.build
create mode 100644 contrib/sepgsql/meson.build
create mode 100644 contrib/spi/meson.build
create mode 100644 contrib/sslinfo/meson.build
create mode 100644 contrib/tablefunc/meson.build
create mode 100644 contrib/tcn/meson.build
create mode 100644 contrib/test_decoding/meson.build
create mode 100644 contrib/tsm_system_rows/meson.build
create mode 100644 contrib/tsm_system_time/meson.build
create mode 100644 contrib/unaccent/meson.build
create mode 100644 contrib/uuid-ossp/meson.build
create mode 100644 contrib/vacuumlo/meson.build
create mode 100644 contrib/xml2/meson.build
create mode 100644 conversion_helpers.txt
create mode 100644 doc/src/sgml/meson.build
create mode 100644 doc/src/sgml/resolv.xsl
create mode 100644 doc/src/sgml/version.sgml.in
create mode 100644 meson.build
create mode 100644 meson_options.txt
create mode 100644 src/backend/access/brin/meson.build
create mode 100644 src/backend/access/common/meson.build
create mode 100644 src/backend/access/gin/meson.build
create mode 100644 src/backend/access/gist/meson.build
create mode 100644 src/backend/access/hash/meson.build
create mode 100644 src/backend/access/heap/meson.build
create mode 100644 src/backend/access/index/meson.build
create mode 100644 src/backend/access/meson.build
create mode 100644 src/backend/access/nbtree/meson.build
create mode 100644 src/backend/access/rmgrdesc/meson.build
create mode 100644 src/backend/access/spgist/meson.build
create mode 100644 src/backend/access/table/meson.build
create mode 100644 src/backend/access/tablesample/meson.build
create mode 100644 src/backend/access/transam/meson.build
create mode 100644 src/backend/bootstrap/meson.build
create mode 100644 src/backend/catalog/meson.build
create mode 100644 src/backend/commands/meson.build
create mode 100644 src/backend/executor/meson.build
create mode 100644 src/backend/foreign/meson.build
create mode 100644 src/backend/jit/llvm/meson.build
create mode 100644 src/backend/jit/meson.build
create mode 100644 src/backend/lib/meson.build
create mode 100644 src/backend/libpq/meson.build
create mode 100644 src/backend/main/meson.build
create mode 100644 src/backend/meson.build
create mode 100644 src/backend/nodes/meson.build
create mode 100644 src/backend/optimizer/geqo/meson.build
create mode 100644 src/backend/optimizer/meson.build
create mode 100644 src/backend/optimizer/path/meson.build
create mode 100644 src/backend/optimizer/plan/meson.build
create mode 100644 src/backend/optimizer/prep/meson.build
create mode 100644 src/backend/optimizer/util/meson.build
create mode 100644 src/backend/parser/meson.build
create mode 100644 src/backend/partitioning/meson.build
create mode 100644 src/backend/port/meson.build
create mode 100644 src/backend/port/win32/meson.build
create mode 100644 src/backend/postmaster/meson.build
create mode 100644 src/backend/regex/meson.build
create mode 100644 src/backend/replication/libpqwalreceiver/meson.build
create mode 100644 src/backend/replication/logical/meson.build
create mode 100644 src/backend/replication/meson.build
create mode 100644 src/backend/replication/pgoutput/meson.build
create mode 100644 src/backend/rewrite/meson.build
create mode 100644 src/backend/snowball/meson.build
create mode 100644 src/backend/statistics/meson.build
create mode 100644 src/backend/storage/buffer/meson.build
create mode 100644 src/backend/storage/file/meson.build
create mode 100644 src/backend/storage/freespace/meson.build
create mode 100644 src/backend/storage/ipc/meson.build
create mode 100644 src/backend/storage/large_object/meson.build
create mode 100644 src/backend/storage/lmgr/meson.build
create mode 100644 src/backend/storage/meson.build
create mode 100644 src/backend/storage/page/meson.build
create mode 100644 src/backend/storage/smgr/meson.build
create mode 100644 src/backend/storage/sync/meson.build
create mode 100644 src/backend/tcop/meson.build
create mode 100644 src/backend/tsearch/meson.build
create mode 100644 src/backend/utils/activity/meson.build
create mode 100644 src/backend/utils/adt/meson.build
create mode 100644 src/backend/utils/cache/meson.build
create mode 100644 src/backend/utils/error/meson.build
create mode 100644 src/backend/utils/fmgr/meson.build
create mode 100644 src/backend/utils/hash/meson.build
create mode 100644 src/backend/utils/init/meson.build
create mode 100644 src/backend/utils/mb/conversion_procs/meson.build
create mode 100644 src/backend/utils/mb/meson.build
create mode 100644 src/backend/utils/meson.build
create mode 100644 src/backend/utils/misc/meson.build
create mode 100644 src/backend/utils/mmgr/meson.build
create mode 100644 src/backend/utils/resowner/meson.build
create mode 100644 src/backend/utils/sort/meson.build
create mode 100644 src/backend/utils/time/meson.build
create mode 100644 src/bin/initdb/meson.build
create mode 100644 src/bin/meson.build
create mode 100644 src/bin/pg_amcheck/meson.build
create mode 100644 src/bin/pg_archivecleanup/meson.build
create mode 100644 src/bin/pg_basebackup/meson.build
create mode 100644 src/bin/pg_checksums/meson.build
create mode 100644 src/bin/pg_config/meson.build
create mode 100644 src/bin/pg_controldata/meson.build
create mode 100644 src/bin/pg_ctl/meson.build
create mode 100644 src/bin/pg_dump/meson.build
create mode 100644 src/bin/pg_resetwal/meson.build
create mode 100644 src/bin/pg_rewind/meson.build
create mode 100644 src/bin/pg_test_fsync/meson.build
create mode 100644 src/bin/pg_test_timing/meson.build
create mode 100644 src/bin/pg_upgrade/meson.build
create mode 100644 src/bin/pg_verifybackup/meson.build
create mode 100644 src/bin/pg_waldump/meson.build
create mode 100644 src/bin/pgbench/meson.build
create mode 100644 src/bin/pgevent/meson.build
create mode 100644 src/bin/psql/meson.build
create mode 100644 src/bin/scripts/meson.build
create mode 100644 src/common/meson.build
create mode 100644 src/common/unicode/meson.build
create mode 100644 src/fe_utils/meson.build
create mode 100644 src/include/catalog/meson.build
create mode 100644 src/include/meson.build
create mode 100644 src/include/parser/meson.build
create mode 100644 src/include/pch/c_pch.h
create mode 100644 src/include/pch/postgres_pch.h
create mode 100644 src/include/pg_config_ext.h.meson
create mode 100644 src/include/storage/meson.build
create mode 100644 src/include/utils/meson.build
create mode 100644 src/interfaces/libpq/meson.build
create mode 100644 src/meson.build
create mode 100644 src/pl/meson.build
create mode 100644 src/pl/plperl/meson.build
create mode 100644 src/pl/plpgsql/meson.build
create mode 100644 src/pl/plpgsql/src/meson.build
create mode 100644 src/pl/plpython/meson.build
create mode 100644 src/port/meson.build
create mode 100644 src/port/win32ver.rc.in
create mode 100644 src/test/authentication/meson.build
create mode 100644 src/test/isolation/meson.build
create mode 100644 src/test/kerberos/meson.build
create mode 100644 src/test/ldap/meson.build
create mode 100644 src/test/meson.build
create mode 100644 src/test/modules/brin/meson.build
create mode 100644 src/test/modules/commit_ts/meson.build
create mode 100644 src/test/modules/delay_execution/meson.build
create mode 100644 src/test/modules/dummy_index_am/meson.build
create mode 100644 src/test/modules/dummy_seclabel/meson.build
create mode 100644 src/test/modules/libpq_pipeline/meson.build
create mode 100644 src/test/modules/meson.build
create mode 100644 src/test/modules/plsample/meson.build
create mode 100644 src/test/modules/snapshot_too_old/meson.build
create mode 100644 src/test/modules/spgist_name_ops/meson.build
create mode 100644 src/test/modules/ssl_passphrase_callback/meson.build
create mode 100644 src/test/modules/test_bloomfilter/meson.build
create mode 100644 src/test/modules/test_ddl_deparse/meson.build
create mode 100644 src/test/modules/test_extensions/meson.build
create mode 100644 src/test/modules/test_ginpostinglist/meson.build
create mode 100644 src/test/modules/test_integerset/meson.build
create mode 100644 src/test/modules/test_misc/meson.build
create mode 100644 src/test/modules/test_parser/meson.build
create mode 100644 src/test/modules/test_pg_dump/meson.build
create mode 100644 src/test/modules/test_predtest/meson.build
create mode 100644 src/test/modules/test_rbtree/meson.build
create mode 100644 src/test/modules/test_regex/meson.build
create mode 100644 src/test/modules/test_rls_hooks/meson.build
create mode 100644 src/test/modules/test_shm_mq/meson.build
create mode 100644 src/test/modules/unsafe_tests/meson.build
create mode 100644 src/test/modules/worker_spi/meson.build
create mode 100644 src/test/recovery/meson.build
create mode 100644 src/test/regress/meson.build
create mode 100644 src/test/ssl/meson.build
create mode 100644 src/test/subscription/meson.build
create mode 100644 src/timezone/meson.build
create mode 100644 src/timezone/tznames/meson.build
create mode 100755 src/tools/find_meson
create mode 100644 src/tools/irlink
create mode 100644 src/tools/msvc/export2def.pl
create mode 100644 src/tools/msvc/gendef2.pl
create mode 100755 src/tools/relativize_shared_library_references
create mode 100755 src/tools/relpath.py
create mode 100755 src/tools/testwrap
diff --git a/configure b/configure
index fd15801b34c..c91295fcab5 100755
--- a/configure
+++ b/configure
@@ -21045,3 +21045,9 @@ if test -n "$ac_unrecognized_opts" && test "$enable_option_checking" != no; then
$as_echo "$as_me: WARNING: unrecognized options: $ac_unrecognized_opts" >&2;}
fi
+
+# Ensure that any meson build directories would reconfigure and see that
+# there's a conflicting in-tree build and can error out.
+if test "$vpath_build"="no"; then
+ touch meson.build
+fi
diff --git a/configure.ac b/configure.ac
index 973f83db52c..95c1f7ac6c9 100644
--- a/configure.ac
+++ b/configure.ac
@@ -2494,3 +2494,9 @@ AC_CONFIG_HEADERS([src/interfaces/ecpg/include/ecpg_config.h],
[echo >src/interfaces/ecpg/include/stamp-h])
AC_OUTPUT
+
+# Ensure that any meson build directories would reconfigure and see that
+# there's a conflicting in-tree build and can error out.
+if test "$vpath_build"="no"; then
+ touch meson.build
+fi
diff --git a/contrib/adminpack/meson.build b/contrib/adminpack/meson.build
new file mode 100644
index 00000000000..457a6089445
--- /dev/null
+++ b/contrib/adminpack/meson.build
@@ -0,0 +1,20 @@
+adminpack = shared_module('adminpack',
+ ['adminpack.c'],
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'adminpack.control',
+ 'adminpack--1.0.sql',
+ 'adminpack--1.0--1.1.sql',
+ 'adminpack--1.1--2.0.sql',
+ 'adminpack--2.0--2.1.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'adminpack',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': ['adminpack'],
+}
diff --git a/contrib/amcheck/meson.build b/contrib/amcheck/meson.build
new file mode 100644
index 00000000000..e656d35f41f
--- /dev/null
+++ b/contrib/amcheck/meson.build
@@ -0,0 +1,37 @@
+amcheck = shared_module('amcheck', [
+ 'verify_heapam.c',
+ 'verify_nbtree.c',
+ ],
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'amcheck.control',
+ 'amcheck--1.0.sql',
+ 'amcheck--1.0--1.1.sql',
+ 'amcheck--1.1--1.2.sql',
+ 'amcheck--1.2--1.3.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'amcheck',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'check',
+ 'check_btree',
+ 'check_heap'
+ ],
+}
+
+tap_tests += {
+ 'name': 'amcheck',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_verify_heapam.pl',
+ 't/002_cic.pl',
+ 't/003_cic_2pc.pl',
+ ],
+}
diff --git a/contrib/auth_delay/meson.build b/contrib/auth_delay/meson.build
new file mode 100644
index 00000000000..941bb6f39a4
--- /dev/null
+++ b/contrib/auth_delay/meson.build
@@ -0,0 +1,4 @@
+auth_delay = shared_module('auth_delay',
+ ['auth_delay.c'],
+ kwargs: contrib_mod_args,
+)
diff --git a/contrib/auto_explain/meson.build b/contrib/auto_explain/meson.build
new file mode 100644
index 00000000000..321896efa2c
--- /dev/null
+++ b/contrib/auto_explain/meson.build
@@ -0,0 +1,13 @@
+auto_explain = shared_module('auto_explain',
+ files('auto_explain.c'),
+ kwargs: contrib_mod_args,
+)
+
+tap_tests += {
+ 'name': 'auto_explain',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_auto_explain.pl',
+ ]
+}
diff --git a/contrib/bloom/meson.build b/contrib/bloom/meson.build
new file mode 100644
index 00000000000..5c5d33c7f7a
--- /dev/null
+++ b/contrib/bloom/meson.build
@@ -0,0 +1,38 @@
+bloom_sources = files(
+ 'blcost.c',
+ 'blinsert.c',
+ 'blscan.c',
+ 'blutils.c',
+ 'blvacuum.c',
+ 'blvalidate.c',
+)
+
+bloom = shared_module('bloom',
+ bloom_sources,
+ c_pch: '../../src/include/pch/c_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'bloom.control',
+ 'bloom--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'bloom',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'bloom'
+ ],
+}
+
+tap_tests += {
+ 'name': 'bloom',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_wal.pl',
+ ],
+}
diff --git a/contrib/bool_plperl/meson.build b/contrib/bool_plperl/meson.build
new file mode 100644
index 00000000000..e15dc5285eb
--- /dev/null
+++ b/contrib/bool_plperl/meson.build
@@ -0,0 +1,37 @@
+if not perl_dep.found()
+ subdir_done()
+endif
+
+bool_plperl_sources = files(
+ 'bool_plperl.c',
+)
+
+bool_plperl = shared_module('bool_plperl',
+ bool_plperl_sources,
+ include_directories: [plperl_inc, include_directories('.')],
+ kwargs: pg_mod_args + {
+ 'dependencies': [perl_dep, contrib_mod_args['dependencies']],
+ },
+)
+
+install_data(
+ 'bool_plperl.control',
+ 'bool_plperl--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+install_data(
+ 'bool_plperlu.control',
+ 'bool_plperlu--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'bool_plperl',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'bool_plperl',
+ 'bool_plperlu',
+ ],
+}
diff --git a/contrib/btree_gin/meson.build b/contrib/btree_gin/meson.build
new file mode 100644
index 00000000000..d25ece7500e
--- /dev/null
+++ b/contrib/btree_gin/meson.build
@@ -0,0 +1,51 @@
+btree_gin = shared_module('btree_gin',
+ files('btree_gin.c'),
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'btree_gin.control',
+ 'btree_gin--1.0.sql',
+ 'btree_gin--1.0--1.1.sql',
+ 'btree_gin--1.1--1.2.sql',
+ 'btree_gin--1.2--1.3.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'btree_gin',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'install_btree_gin',
+ 'int2',
+ 'int4',
+ 'int8',
+ 'float4',
+ 'float8',
+ 'money',
+ 'oid',
+ 'timestamp',
+ 'timestamptz',
+ 'time',
+ 'timetz',
+ 'date',
+ 'interval',
+ 'macaddr',
+ 'macaddr8',
+ 'inet',
+ 'cidr',
+ 'text',
+ 'varchar',
+ 'char',
+ 'bytea',
+ 'bit',
+ 'varbit',
+ 'numeric',
+ 'enum',
+ 'uuid',
+ 'name',
+ 'bool',
+ 'bpchar',
+ ],
+}
diff --git a/contrib/btree_gist/meson.build b/contrib/btree_gist/meson.build
new file mode 100644
index 00000000000..8ee0faea401
--- /dev/null
+++ b/contrib/btree_gist/meson.build
@@ -0,0 +1,79 @@
+btree_gist_sources = files(
+ 'btree_bit.c',
+ 'btree_bytea.c',
+ 'btree_cash.c',
+ 'btree_date.c',
+ 'btree_enum.c',
+ 'btree_float4.c',
+ 'btree_float8.c',
+ 'btree_gist.c',
+ 'btree_inet.c',
+ 'btree_int2.c',
+ 'btree_int4.c',
+ 'btree_int8.c',
+ 'btree_interval.c',
+ 'btree_macaddr.c',
+ 'btree_macaddr8.c',
+ 'btree_numeric.c',
+ 'btree_oid.c',
+ 'btree_text.c',
+ 'btree_time.c',
+ 'btree_ts.c',
+ 'btree_utils_num.c',
+ 'btree_utils_var.c',
+ 'btree_uuid.c',
+)
+
+btree_gist = shared_module('btree_gist',
+ btree_gist_sources,
+ c_pch: '../../src/include/pch/c_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'btree_gist.control',
+ 'btree_gist--1.0--1.1.sql',
+ 'btree_gist--1.1--1.2.sql',
+ 'btree_gist--1.2.sql',
+ 'btree_gist--1.2--1.3.sql',
+ 'btree_gist--1.3--1.4.sql',
+ 'btree_gist--1.4--1.5.sql',
+ 'btree_gist--1.5--1.6.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'btree_gist',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'init',
+ 'int2',
+ 'int4',
+ 'int8',
+ 'float4',
+ 'float8',
+ 'cash',
+ 'oid',
+ 'timestamp',
+ 'timestamptz',
+ 'time',
+ 'timetz',
+ 'date',
+ 'interval',
+ 'macaddr',
+ 'macaddr8',
+ 'inet',
+ 'cidr',
+ 'text',
+ 'varchar',
+ 'char',
+ 'bytea',
+ 'bit',
+ 'varbit',
+ 'numeric',
+ 'uuid',
+ 'not_equal',
+ 'enum',
+ ],
+}
diff --git a/contrib/citext/meson.build b/contrib/citext/meson.build
new file mode 100644
index 00000000000..f2e9ff2117d
--- /dev/null
+++ b/contrib/citext/meson.build
@@ -0,0 +1,29 @@
+citext_sources = files(
+ 'citext.c',
+)
+
+citext = shared_module('citext',
+ citext_sources,
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'citext.control',
+ 'citext--1.0--1.1.sql',
+ 'citext--1.1--1.2.sql',
+ 'citext--1.2--1.3.sql',
+ 'citext--1.3--1.4.sql',
+ 'citext--1.4.sql',
+ 'citext--1.4--1.5.sql',
+ 'citext--1.5--1.6.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'citext',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'citext'
+ ],
+}
diff --git a/contrib/cube/meson.build b/contrib/cube/meson.build
new file mode 100644
index 00000000000..3cf7ebdd8ea
--- /dev/null
+++ b/contrib/cube/meson.build
@@ -0,0 +1,42 @@
+cube_sources = files(
+ 'cube.c',
+)
+
+# cubescan is compiled as part of cubeparse
+cubescan = custom_target('cubescan',
+ input: ['cubescan.l'],
+ output: ['cubescan.c'],
+ command: [flex, '-o', '@OUTPUT@', '@INPUT@'])
+
+cube_sources += custom_target('cubeparse',
+ input: 'cubeparse.y',
+ output: 'cubeparse.c',
+ depends: cubescan,
+ command: [bison, bisonflags, '-o', '@OUTPUT@', '@INPUT0@'])
+
+cube = shared_module('cube',
+ cube_sources,
+ include_directories: include_directories('.'),
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'cube.control',
+ 'cube--1.0--1.1.sql',
+ 'cube--1.1--1.2.sql',
+ 'cube--1.2.sql',
+ 'cube--1.2--1.3.sql',
+ 'cube--1.3--1.4.sql',
+ 'cube--1.4--1.5.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'cube',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'cube',
+ 'cube_sci',
+ ],
+}
diff --git a/contrib/dblink/meson.build b/contrib/dblink/meson.build
new file mode 100644
index 00000000000..7ac253700c9
--- /dev/null
+++ b/contrib/dblink/meson.build
@@ -0,0 +1,29 @@
+dblink_sources = files(
+ 'dblink.c',
+)
+
+dblink = shared_module('dblink',
+ dblink_sources,
+ kwargs: contrib_mod_args + {
+ 'dependencies': pg_mod_args['dependencies'] + [libpq],
+ },
+)
+
+install_data(
+ 'dblink.control',
+ 'dblink--1.0--1.1.sql',
+ 'dblink--1.1--1.2.sql',
+ 'dblink--1.2.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'dblink',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'paths',
+ 'dblink'
+ ],
+ 'regress_args': ['--dlpath', meson.build_root() / 'src/test/regress'],
+}
diff --git a/contrib/dict_int/meson.build b/contrib/dict_int/meson.build
new file mode 100644
index 00000000000..7c23b275c5a
--- /dev/null
+++ b/contrib/dict_int/meson.build
@@ -0,0 +1,19 @@
+dict_int = shared_module('dict_int',
+ files('dict_int.c'),
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'dict_int.control',
+ 'dict_int--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'dict_int',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'dict_int'
+ ],
+}
diff --git a/contrib/dict_xsyn/meson.build b/contrib/dict_xsyn/meson.build
new file mode 100644
index 00000000000..7cbabba02f1
--- /dev/null
+++ b/contrib/dict_xsyn/meson.build
@@ -0,0 +1,26 @@
+dict_xsyn = shared_module('dict_xsyn',
+ files('dict_xsyn.c'),
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'dict_xsyn.control',
+ 'dict_xsyn--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+install_data(
+ 'xsyn_sample.rules',
+ kwargs: contrib_data_args + {
+ 'install_dir': get_option('datadir') / 'tsearch_data'
+ }
+)
+
+regress_tests += {
+ 'name': 'dict_xsyn',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'dict_xsyn'
+ ],
+}
diff --git a/contrib/earthdistance/meson.build b/contrib/earthdistance/meson.build
new file mode 100644
index 00000000000..d56abf4f260
--- /dev/null
+++ b/contrib/earthdistance/meson.build
@@ -0,0 +1,20 @@
+earthdistance = shared_module('earthdistance',
+ files('earthdistance.c'),
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'earthdistance.control',
+ 'earthdistance--1.0--1.1.sql',
+ 'earthdistance--1.1.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'earthdistance',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'earthdistance'
+ ],
+}
diff --git a/contrib/file_fdw/meson.build b/contrib/file_fdw/meson.build
new file mode 100644
index 00000000000..0cd3348dfd0
--- /dev/null
+++ b/contrib/file_fdw/meson.build
@@ -0,0 +1,19 @@
+file_fdw = shared_module('file_fdw',
+ files('file_fdw.c'),
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'file_fdw.control',
+ 'file_fdw--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'file_fdw',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'file_fdw'
+ ],
+}
diff --git a/contrib/fuzzystrmatch/meson.build b/contrib/fuzzystrmatch/meson.build
new file mode 100644
index 00000000000..d1e75479668
--- /dev/null
+++ b/contrib/fuzzystrmatch/meson.build
@@ -0,0 +1,23 @@
+fuzzystrmatch = shared_module('fuzzystrmatch',
+ files(
+ 'fuzzystrmatch.c',
+ 'dmetaphone.c'
+ ),
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'fuzzystrmatch.control',
+ 'fuzzystrmatch--1.0--1.1.sql',
+ 'fuzzystrmatch--1.1.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'fuzzystrmatch',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'fuzzystrmatch'
+ ],
+}
diff --git a/contrib/hstore/meson.build b/contrib/hstore/meson.build
new file mode 100644
index 00000000000..661e61f9692
--- /dev/null
+++ b/contrib/hstore/meson.build
@@ -0,0 +1,36 @@
+# .. so that includes of hstore/hstore.h work
+hstore_inc = include_directories('.', '../')
+
+hstore = shared_module('hstore',
+ files(
+ 'hstore_compat.c',
+ 'hstore_gin.c',
+ 'hstore_gist.c',
+ 'hstore_io.c',
+ 'hstore_op.c',
+ 'hstore_subs.c',
+ ),
+ c_pch: '../../src/include/pch/c_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'hstore.control',
+ 'hstore--1.1--1.2.sql',
+ 'hstore--1.3--1.4.sql',
+ 'hstore--1.4.sql',
+ 'hstore--1.4--1.5.sql',
+ 'hstore--1.5--1.6.sql',
+ 'hstore--1.6--1.7.sql',
+ 'hstore--1.7--1.8.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'hstore',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'hstore'
+ ],
+}
diff --git a/contrib/hstore_plperl/meson.build b/contrib/hstore_plperl/meson.build
new file mode 100644
index 00000000000..48231cb1c9e
--- /dev/null
+++ b/contrib/hstore_plperl/meson.build
@@ -0,0 +1,38 @@
+if not perl_dep.found()
+ subdir_done()
+endif
+
+hstore_plperl_sources = files(
+ 'hstore_plperl.c',
+)
+
+hstore_plperl = shared_module('hstore_plperl',
+ hstore_plperl_sources,
+ include_directories: [plperl_inc, hstore_inc],
+ kwargs: pg_mod_args + {
+ 'dependencies': [perl_dep, contrib_mod_args['dependencies']],
+ },
+)
+
+install_data(
+ 'hstore_plperl.control',
+ 'hstore_plperl--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+install_data(
+ 'hstore_plperlu.control',
+ 'hstore_plperlu--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'hstore_plperl',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'hstore_plperl',
+ 'hstore_plperlu',
+ 'create_transform',
+ ],
+}
diff --git a/contrib/hstore_plpython/meson.build b/contrib/hstore_plpython/meson.build
new file mode 100644
index 00000000000..98114e46f5e
--- /dev/null
+++ b/contrib/hstore_plpython/meson.build
@@ -0,0 +1,34 @@
+if not python3.found()
+ subdir_done()
+endif
+
+hstore_plpython_sources = files(
+ 'hstore_plpython.c',
+)
+
+hstore_plpython = shared_module('hstore_plpython3',
+ hstore_plpython_sources,
+ include_directories: [plpython_inc, hstore_inc, ],
+ kwargs: pg_mod_args + {
+ 'c_args': ['-DPLPYTHON_LIBNAME="plpython3"'] + contrib_mod_args['c_args'],
+ 'dependencies': [python3, contrib_mod_args['dependencies']],
+ },
+)
+
+install_data(
+ 'hstore_plpython3u--1.0.sql',
+ 'hstore_plpython3u.control',
+ kwargs: contrib_data_args,
+)
+
+hstore_plpython_regress = [
+ 'hstore_plpython'
+]
+
+regress_tests += {
+ 'name': 'hstore_plpython',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': hstore_plpython_regress,
+ 'regress_args': ['--load-extension=hstore'],
+}
diff --git a/contrib/intagg/meson.build b/contrib/intagg/meson.build
new file mode 100644
index 00000000000..a0f39366f47
--- /dev/null
+++ b/contrib/intagg/meson.build
@@ -0,0 +1,6 @@
+install_data(
+ 'intagg.control',
+ 'intagg--1.0--1.1.sql',
+ 'intagg--1.1.sql',
+ kwargs: contrib_data_args,
+)
diff --git a/contrib/intarray/meson.build b/contrib/intarray/meson.build
new file mode 100644
index 00000000000..d4fe99ca275
--- /dev/null
+++ b/contrib/intarray/meson.build
@@ -0,0 +1,34 @@
+intarray_sources = files(
+ '_int_bool.c',
+ '_int_gin.c',
+ '_int_gist.c',
+ '_int_op.c',
+ '_int_selfuncs.c',
+ '_int_tool.c',
+ '_intbig_gist.c',
+)
+
+intarray = shared_module('_int',
+ intarray_sources,
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'intarray.control',
+ 'intarray--1.0--1.1.sql',
+ 'intarray--1.1--1.2.sql',
+ 'intarray--1.2.sql',
+ 'intarray--1.2--1.3.sql',
+ 'intarray--1.3--1.4.sql',
+ 'intarray--1.4--1.5.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'intarray',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ '_int'
+ ],
+}
diff --git a/contrib/isn/meson.build b/contrib/isn/meson.build
new file mode 100644
index 00000000000..ca919800b9f
--- /dev/null
+++ b/contrib/isn/meson.build
@@ -0,0 +1,25 @@
+isn_sources = files(
+ 'isn.c',
+)
+
+isn = shared_module('isn',
+ isn_sources,
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'isn.control',
+ 'isn--1.0--1.1.sql',
+ 'isn--1.1--1.2.sql',
+ 'isn--1.1.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'isn',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'isn'
+ ],
+}
diff --git a/contrib/jsonb_plperl/meson.build b/contrib/jsonb_plperl/meson.build
new file mode 100644
index 00000000000..c34090e5f5c
--- /dev/null
+++ b/contrib/jsonb_plperl/meson.build
@@ -0,0 +1,37 @@
+if not perl_dep.found()
+ subdir_done()
+endif
+
+jsonb_plperl_sources = files(
+ 'jsonb_plperl.c',
+)
+
+jsonb_plperl = shared_module('jsonb_plperl',
+ jsonb_plperl_sources,
+ include_directories: [plperl_inc],
+ kwargs: pg_mod_args + {
+ 'dependencies': [perl_dep, contrib_mod_args['dependencies']],
+ },
+)
+
+install_data(
+ 'jsonb_plperl.control',
+ 'jsonb_plperl--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+install_data(
+ 'jsonb_plperlu.control',
+ 'jsonb_plperlu--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'jsonb_plperl',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'jsonb_plperl',
+ 'jsonb_plperlu',
+ ],
+}
diff --git a/contrib/jsonb_plpython/meson.build b/contrib/jsonb_plpython/meson.build
new file mode 100644
index 00000000000..aa372d52e26
--- /dev/null
+++ b/contrib/jsonb_plpython/meson.build
@@ -0,0 +1,33 @@
+if not python3.found()
+ subdir_done()
+endif
+
+jsonb_plpython_sources = files(
+ 'jsonb_plpython.c',
+)
+
+jsonb_plpython = shared_module('jsonb_plpython3',
+ jsonb_plpython_sources,
+ include_directories: [plpython_inc],
+ kwargs: pg_mod_args + {
+ 'c_args': ['-DPLPYTHON_LIBNAME="plpython3"'] + contrib_mod_args['c_args'],
+ 'dependencies': [python3, contrib_mod_args['dependencies']],
+ },
+)
+
+install_data(
+ 'jsonb_plpython3u.control',
+ 'jsonb_plpython3u--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+jsonb_plpython_regress = [
+ 'jsonb_plpython'
+]
+
+regress_tests += {
+ 'name': 'jsonb_plpython',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': jsonb_plpython_regress,
+}
diff --git a/contrib/lo/meson.build b/contrib/lo/meson.build
new file mode 100644
index 00000000000..ca9bbc42015
--- /dev/null
+++ b/contrib/lo/meson.build
@@ -0,0 +1,24 @@
+lo_sources = files(
+ 'lo.c',
+)
+
+lo = shared_module('lo',
+ lo_sources,
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'lo.control',
+ 'lo--1.0--1.1.sql',
+ 'lo--1.1.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'lo',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'lo'
+ ],
+}
diff --git a/contrib/ltree/meson.build b/contrib/ltree/meson.build
new file mode 100644
index 00000000000..12909945350
--- /dev/null
+++ b/contrib/ltree/meson.build
@@ -0,0 +1,36 @@
+ltree_sources = files(
+ '_ltree_gist.c',
+ '_ltree_op.c',
+ 'crc32.c',
+ 'lquery_op.c',
+ 'ltree_gist.c',
+ 'ltree_io.c',
+ 'ltree_op.c',
+ 'ltxtquery_io.c',
+ 'ltxtquery_op.c',
+)
+
+# .. so that includes of ltree/ltree.h work
+ltree_inc = include_directories('.', '../')
+
+ltree = shared_module('ltree',
+ ltree_sources,
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'ltree.control',
+ 'ltree--1.0--1.1.sql',
+ 'ltree--1.1--1.2.sql',
+ 'ltree--1.1.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'ltree',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'ltree'
+ ],
+}
diff --git a/contrib/ltree_plpython/meson.build b/contrib/ltree_plpython/meson.build
new file mode 100644
index 00000000000..df30c84c6a9
--- /dev/null
+++ b/contrib/ltree_plpython/meson.build
@@ -0,0 +1,34 @@
+if not python3.found()
+ subdir_done()
+endif
+
+ltree_plpython_sources = files(
+ 'ltree_plpython.c',
+)
+
+ltree_plpython = shared_module('ltree_plpython3',
+ ltree_plpython_sources,
+ include_directories: [plpython_inc, ltree_inc],
+ kwargs: pg_mod_args + {
+ 'c_args': ['-DPLPYTHON_LIBNAME="plpython3"'] + contrib_mod_args['c_args'],
+ 'dependencies': [python3, contrib_mod_args['dependencies']],
+ },
+)
+
+install_data(
+ 'ltree_plpython3u--1.0.sql',
+ 'ltree_plpython3u.control',
+ kwargs: contrib_data_args,
+)
+
+ltree_plpython_regress = [
+ 'ltree_plpython'
+]
+
+regress_tests += {
+ 'name': 'ltree_plpython',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': ltree_plpython_regress,
+ 'regress_args': ['--load-extension=ltree'],
+}
diff --git a/contrib/meson.build b/contrib/meson.build
new file mode 100644
index 00000000000..720181023c8
--- /dev/null
+++ b/contrib/meson.build
@@ -0,0 +1,63 @@
+contrib_mod_args = pg_mod_args
+
+contrib_data_dir = get_option('datadir') / 'extension'
+contrib_data_args = {
+ 'install_dir': contrib_data_dir
+}
+
+subdir('adminpack')
+subdir('amcheck')
+subdir('auth_delay')
+subdir('auto_explain')
+subdir('bloom')
+subdir('bool_plperl')
+subdir('btree_gin')
+subdir('btree_gist')
+subdir('citext')
+subdir('cube')
+subdir('dblink')
+subdir('dict_int')
+subdir('dict_xsyn')
+subdir('earthdistance')
+subdir('file_fdw')
+subdir('fuzzystrmatch')
+subdir('hstore')
+subdir('hstore_plperl')
+subdir('hstore_plpython')
+subdir('intagg')
+subdir('intarray')
+subdir('isn')
+subdir('jsonb_plperl')
+subdir('jsonb_plpython')
+subdir('lo')
+subdir('ltree')
+subdir('ltree_plpython')
+subdir('oid2name')
+subdir('old_snapshot')
+subdir('pageinspect')
+subdir('passwordcheck')
+subdir('pg_buffercache')
+subdir('pgcrypto')
+subdir('pg_freespacemap')
+subdir('pg_prewarm')
+subdir('pgrowlocks')
+subdir('pg_stat_statements')
+subdir('pgstattuple')
+subdir('pg_surgery')
+subdir('pg_trgm')
+subdir('pg_visibility')
+subdir('postgres_fdw')
+subdir('seg')
+subdir('sepgsql')
+subdir('spi')
+subdir('sslinfo')
+# start-scripts doesn't contain build products
+subdir('tablefunc')
+subdir('tcn')
+subdir('test_decoding')
+subdir('tsm_system_rows')
+subdir('tsm_system_time')
+subdir('unaccent')
+subdir('uuid-ossp')
+subdir('vacuumlo')
+subdir('xml2')
diff --git a/contrib/oid2name/meson.build b/contrib/oid2name/meson.build
new file mode 100644
index 00000000000..bee34d2137c
--- /dev/null
+++ b/contrib/oid2name/meson.build
@@ -0,0 +1,14 @@
+executable('oid2name',
+ ['oid2name.c'],
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name' : 'oid2name',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests' :[
+ 't/001_basic.pl',
+ ]
+}
diff --git a/contrib/old_snapshot/meson.build b/contrib/old_snapshot/meson.build
new file mode 100644
index 00000000000..5785c29e9f8
--- /dev/null
+++ b/contrib/old_snapshot/meson.build
@@ -0,0 +1,14 @@
+old_snapshot_sources = files(
+ 'time_mapping.c',
+)
+
+old_snapshot = shared_module('old_snapshot',
+ old_snapshot_sources,
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'old_snapshot.control',
+ 'old_snapshot--1.0.sql',
+ kwargs: contrib_data_args,
+)
diff --git a/contrib/pageinspect/meson.build b/contrib/pageinspect/meson.build
new file mode 100644
index 00000000000..4bd5b1784e0
--- /dev/null
+++ b/contrib/pageinspect/meson.build
@@ -0,0 +1,45 @@
+pageinspect = shared_module('pageinspect',
+ files(
+ 'brinfuncs.c',
+ 'btreefuncs.c',
+ 'fsmfuncs.c',
+ 'ginfuncs.c',
+ 'gistfuncs.c',
+ 'hashfuncs.c',
+ 'heapfuncs.c',
+ 'rawpage.c',
+ ),
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'pageinspect--1.0--1.1.sql',
+ 'pageinspect--1.1--1.2.sql',
+ 'pageinspect--1.2--1.3.sql',
+ 'pageinspect--1.3--1.4.sql',
+ 'pageinspect--1.4--1.5.sql',
+ 'pageinspect--1.5--1.6.sql',
+ 'pageinspect--1.5.sql',
+ 'pageinspect--1.6--1.7.sql',
+ 'pageinspect--1.7--1.8.sql',
+ 'pageinspect--1.8--1.9.sql',
+ 'pageinspect--1.9--1.10.sql',
+ 'pageinspect.control',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'pageinspect',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'page',
+ 'btree',
+ 'brin',
+ 'gin',
+ 'gist',
+ 'hash',
+ 'checksum',
+ 'oldextversions',
+ ],
+}
diff --git a/contrib/passwordcheck/meson.build b/contrib/passwordcheck/meson.build
new file mode 100644
index 00000000000..ae5db94d0e1
--- /dev/null
+++ b/contrib/passwordcheck/meson.build
@@ -0,0 +1,27 @@
+passwordcheck_sources = files(
+ 'passwordcheck.c',
+)
+
+passwordcheck_c_args = []
+passwordcheck_deps = []
+
+# uncomment the following two lines to enable cracklib support
+# passwordcheck_c_args += ['-DUSE_CRACKLIB', '-DCRACKLIB_DICTPATH="/usr/lib/cracklib_dict"']
+# passwordcheck_deps += [cc.find_library('crack')]
+
+passwordcheck = shared_module('passwordcheck',
+ passwordcheck_sources,
+ kwargs: contrib_mod_args + {
+ 'c_args': contrib_mod_args.get('c_args') + passwordcheck_c_args,
+ 'dependencies': contrib_mod_args.get('dependencies') + passwordcheck_deps,
+ }
+)
+
+regress_tests += {
+ 'name': 'passwordcheck',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'passwordcheck'
+ ],
+}
diff --git a/contrib/pg_buffercache/meson.build b/contrib/pg_buffercache/meson.build
new file mode 100644
index 00000000000..f4f540218b6
--- /dev/null
+++ b/contrib/pg_buffercache/meson.build
@@ -0,0 +1,16 @@
+pg_buffercache = shared_module('pg_buffercache',
+ files(
+ 'pg_buffercache_pages.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'pg_buffercache--1.0--1.1.sql',
+ 'pg_buffercache--1.1--1.2.sql',
+ 'pg_buffercache--1.2--1.3.sql',
+ 'pg_buffercache--1.2.sql',
+ 'pg_buffercache.control',
+ kwargs: contrib_data_args,
+)
diff --git a/contrib/pg_freespacemap/meson.build b/contrib/pg_freespacemap/meson.build
new file mode 100644
index 00000000000..feb1e225f48
--- /dev/null
+++ b/contrib/pg_freespacemap/meson.build
@@ -0,0 +1,15 @@
+pg_freespacemap = shared_module('pg_freespacemap',
+ files(
+ 'pg_freespacemap.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'pg_freespacemap--1.0--1.1.sql',
+ 'pg_freespacemap--1.1--1.2.sql',
+ 'pg_freespacemap--1.1.sql',
+ 'pg_freespacemap.control',
+ kwargs: contrib_data_args,
+)
diff --git a/contrib/pg_prewarm/meson.build b/contrib/pg_prewarm/meson.build
new file mode 100644
index 00000000000..c93ccc2db6d
--- /dev/null
+++ b/contrib/pg_prewarm/meson.build
@@ -0,0 +1,16 @@
+pg_prewarm = shared_module('pg_prewarm',
+ files(
+ 'autoprewarm.c',
+ 'pg_prewarm.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'pg_prewarm--1.0--1.1.sql',
+ 'pg_prewarm--1.1--1.2.sql',
+ 'pg_prewarm--1.1.sql',
+ 'pg_prewarm.control',
+ kwargs: contrib_data_args,
+)
diff --git a/contrib/pg_stat_statements/meson.build b/contrib/pg_stat_statements/meson.build
new file mode 100644
index 00000000000..6ed70ac0f18
--- /dev/null
+++ b/contrib/pg_stat_statements/meson.build
@@ -0,0 +1,31 @@
+pg_stat_statements = shared_module('pg_stat_statements',
+ files('pg_stat_statements.c'),
+ kwargs: contrib_mod_args + {
+ 'dependencies': contrib_mod_args['dependencies'],
+ },
+)
+
+install_data(
+ 'pg_stat_statements.control',
+ 'pg_stat_statements--1.4.sql',
+ 'pg_stat_statements--1.8--1.9.sql',
+ 'pg_stat_statements--1.7--1.8.sql',
+ 'pg_stat_statements--1.6--1.7.sql',
+ 'pg_stat_statements--1.5--1.6.sql',
+ 'pg_stat_statements--1.4--1.5.sql',
+ 'pg_stat_statements--1.3--1.4.sql',
+ 'pg_stat_statements--1.2--1.3.sql',
+ 'pg_stat_statements--1.1--1.2.sql',
+ 'pg_stat_statements--1.0--1.1.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'pg_stat_statements',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'pg_stat_statements'
+ ],
+ 'regress_args': ['--temp-config', files('pg_stat_statements.conf')],
+}
diff --git a/contrib/pg_surgery/meson.build b/contrib/pg_surgery/meson.build
new file mode 100644
index 00000000000..58de871b041
--- /dev/null
+++ b/contrib/pg_surgery/meson.build
@@ -0,0 +1,23 @@
+pg_surgery = shared_module('pg_surgery',
+ files(
+ 'heap_surgery.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'pg_surgery--1.0.sql',
+ 'pg_surgery.control',
+ kwargs: contrib_data_args,
+)
+
+
+regress_tests += {
+ 'name': 'pg_surgery',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'heap_surgery'
+ ],
+}
diff --git a/contrib/pg_trgm/meson.build b/contrib/pg_trgm/meson.build
new file mode 100644
index 00000000000..0a56926ad6b
--- /dev/null
+++ b/contrib/pg_trgm/meson.build
@@ -0,0 +1,33 @@
+pg_trgm = shared_module('pg_trgm',
+ files(
+ 'trgm_gin.c',
+ 'trgm_gist.c',
+ 'trgm_op.c',
+ 'trgm_regexp.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'pg_trgm--1.0--1.1.sql',
+ 'pg_trgm--1.1--1.2.sql',
+ 'pg_trgm--1.2--1.3.sql',
+ 'pg_trgm--1.3--1.4.sql',
+ 'pg_trgm--1.3.sql',
+ 'pg_trgm--1.4--1.5.sql',
+ 'pg_trgm--1.5--1.6.sql',
+ 'pg_trgm.control',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'pg_trgm',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'pg_trgm',
+ 'pg_word_trgm',
+ 'pg_strict_word_trgm',
+ ],
+}
diff --git a/contrib/pg_visibility/meson.build b/contrib/pg_visibility/meson.build
new file mode 100644
index 00000000000..5fd0e8cd986
--- /dev/null
+++ b/contrib/pg_visibility/meson.build
@@ -0,0 +1,25 @@
+pg_visibility = shared_module('pg_visibility',
+ files(
+ 'pg_visibility.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'pg_visibility--1.0--1.1.sql',
+ 'pg_visibility--1.1--1.2.sql',
+ 'pg_visibility--1.1.sql',
+ 'pg_visibility.control',
+ kwargs: contrib_data_args,
+)
+
+
+regress_tests += {
+ 'name': 'pg_visibility',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'pg_visibility'
+ ],
+}
diff --git a/contrib/pgcrypto/meson.build b/contrib/pgcrypto/meson.build
new file mode 100644
index 00000000000..f20076b84a0
--- /dev/null
+++ b/contrib/pgcrypto/meson.build
@@ -0,0 +1,117 @@
+pgcrypto_sources = files(
+ 'crypt-blowfish.c',
+ 'crypt-des.c',
+ 'crypt-gensalt.c',
+ 'crypt-md5.c',
+ 'mbuf.c',
+ 'pgcrypto.c',
+ 'pgp-armor.c',
+ 'pgp-cfb.c',
+ 'pgp-compress.c',
+ 'pgp-decrypt.c',
+ 'pgp-encrypt.c',
+ 'pgp-info.c',
+ 'pgp-mpi.c',
+ 'pgp-pgsql.c',
+ 'pgp-pubdec.c',
+ 'pgp-pubenc.c',
+ 'pgp-pubkey.c',
+ 'pgp-s2k.c',
+ 'pgp.c',
+ 'px-crypt.c',
+ 'px-hmac.c',
+ 'px.c',
+)
+
+pgcrypto_regress = [
+ 'init',
+ 'md5',
+ 'sha1',
+ 'hmac-md5',
+ 'hmac-sha1',
+ 'blowfish',
+ 'rijndael',
+ 'crypt-des',
+ 'crypt-md5',
+ 'crypt-blowfish',
+ 'crypt-xdes',
+ 'pgp-armor',
+ 'pgp-decrypt',
+ 'pgp-encrypt',
+ 'pgp-pubkey-decrypt',
+ 'pgp-pubkey-encrypt',
+ 'pgp-info',
+]
+
+
+pgcrypto_internal_sources = files(
+ 'internal.c',
+ 'internal-sha2.c',
+ 'blf.c',
+ 'rijndael.c',
+ 'pgp-mpi-internal.c',
+ 'imath.c',
+)
+
+pgcrypto_internal_regress = [
+ 'sha2',
+]
+
+
+pgcrypto_openssl_sources = files(
+ 'openssl.c',
+ 'pgp-mpi-openssl.c',
+)
+pgcrypto_openssl_regress = [
+ 'sha2',
+ 'des',
+ '3des',
+ 'cast5',
+]
+
+# TODO: implement rijndael.c generation
+
+pgcrypto_deps = []
+if ssl.found()
+ pgcrypto_deps += ssl
+ pgcrypto_sources += pgcrypto_openssl_sources
+ pgcrypto_regress += pgcrypto_openssl_regress
+else
+ pgcrypto_sources += pgcrypto_internal_sources
+ pgcrypto_regress += pgcrypto_internal_regress
+endif
+
+if zlib.found()
+ pgcrypto_deps += zlib
+ pgcrypto_regress += 'pgp-compression'
+else
+ pgcrypto_regress += 'pgp-zlib-DISABLED'
+endif
+
+pgcrypto = shared_module('pgcrypto',
+ pgcrypto_sources,
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args + {
+ 'dependencies': [pgcrypto_deps, contrib_mod_args['dependencies']]
+ },
+)
+
+install_data(
+ 'pgcrypto--1.0--1.1.sql',
+ 'pgcrypto--1.1--1.2.sql',
+ 'pgcrypto--1.2--1.3.sql',
+ 'pgcrypto--1.3.sql',
+ 'pgcrypto.control',
+ kwargs: contrib_data_args,
+)
+
+
+regress_tests += {
+ 'name': 'pgcrypto',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ pgcrypto_regress
+ ],
+}
+
diff --git a/contrib/pgrowlocks/meson.build b/contrib/pgrowlocks/meson.build
new file mode 100644
index 00000000000..26c68248fce
--- /dev/null
+++ b/contrib/pgrowlocks/meson.build
@@ -0,0 +1,15 @@
+pgrowlocks = shared_module('pgrowlocks',
+ files(
+ 'pgrowlocks.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'pgrowlocks--1.0--1.1.sql',
+ 'pgrowlocks--1.1--1.2.sql',
+ 'pgrowlocks--1.2.sql',
+ 'pgrowlocks.control',
+ kwargs: contrib_data_args,
+)
diff --git a/contrib/pgstattuple/meson.build b/contrib/pgstattuple/meson.build
new file mode 100644
index 00000000000..4ed41b3743c
--- /dev/null
+++ b/contrib/pgstattuple/meson.build
@@ -0,0 +1,30 @@
+pgstattuple = shared_module('pgstattuple',
+ files(
+ 'pgstatapprox.c',
+ 'pgstatindex.c',
+ 'pgstattuple.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'pgstattuple--1.0--1.1.sql',
+ 'pgstattuple--1.1--1.2.sql',
+ 'pgstattuple--1.2--1.3.sql',
+ 'pgstattuple--1.3--1.4.sql',
+ 'pgstattuple--1.4--1.5.sql',
+ 'pgstattuple--1.4.sql',
+ 'pgstattuple.control',
+ kwargs: contrib_data_args,
+)
+
+
+regress_tests += {
+ 'name': 'pgstattuple',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'pgstattuple'
+ ],
+}
diff --git a/contrib/postgres_fdw/meson.build b/contrib/postgres_fdw/meson.build
new file mode 100644
index 00000000000..507d01448b1
--- /dev/null
+++ b/contrib/postgres_fdw/meson.build
@@ -0,0 +1,31 @@
+postgres_fdw_sources = files(
+ 'connection.c',
+ 'deparse.c',
+ 'option.c',
+ 'postgres_fdw.c',
+ 'shippable.c',
+)
+
+postgres_fdw = shared_module('postgres_fdw',
+ postgres_fdw_sources,
+ kwargs: contrib_mod_args + {
+ 'dependencies': pg_mod_args['dependencies'] + [libpq],
+ },
+)
+
+install_data(
+ 'postgres_fdw.control',
+ 'postgres_fdw--1.0.sql',
+ 'postgres_fdw--1.0--1.1.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'postgres_fdw',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'postgres_fdw'
+ ],
+ 'regress_args': ['--dlpath', meson.build_root() / 'src/test/regress'],
+}
diff --git a/contrib/seg/meson.build b/contrib/seg/meson.build
new file mode 100644
index 00000000000..a66a4b4c218
--- /dev/null
+++ b/contrib/seg/meson.build
@@ -0,0 +1,40 @@
+seg_sources = files(
+ 'seg.c',
+)
+
+# segscan is compiled as part of segparse
+segscan = custom_target('segscan',
+ input: ['segscan.l'],
+ output: ['segscan.c'],
+ command: [flex, '-o', '@OUTPUT@', '@INPUT@'])
+
+seg_sources += custom_target('segparse',
+ input: 'segparse.y',
+ output: 'segparse.c',
+ depends: segscan,
+ command: [bison, bisonflags, '-o', '@OUTPUT@', '@INPUT0@'])
+
+seg = shared_module('seg',
+ seg_sources,
+ include_directories: include_directories('.'),
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'seg.control',
+ 'seg--1.0--1.1.sql',
+ 'seg--1.1--1.2.sql',
+ 'seg--1.1.sql',
+ 'seg--1.2--1.3.sql',
+ 'seg--1.3--1.4.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'seg',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'seg',
+ ],
+}
diff --git a/contrib/sepgsql/meson.build b/contrib/sepgsql/meson.build
new file mode 100644
index 00000000000..852c6740569
--- /dev/null
+++ b/contrib/sepgsql/meson.build
@@ -0,0 +1,34 @@
+if not selinux.found()
+ subdir_done()
+endif
+
+sepgsql_sources = files(
+ 'database.c',
+ 'dml.c',
+ 'hooks.c',
+ 'label.c',
+ 'proc.c',
+ 'relation.c',
+ 'schema.c',
+ 'selinux.c',
+ 'uavc.c',
+)
+
+sepgsql = shared_module('sepgsql',
+ sepgsql_sources,
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args + {
+ 'dependencies': [selinux, pg_mod_args['dependencies']],
+ }
+)
+
+custom_target('sepgsql.sql',
+ input: 'sepgsql.sql.in',
+ output: 'sepgsql.sql',
+ command: [sed, '-e', 's,MODULE_PATHNAME,$libdir/sepgsql,g', '@INPUT@'],
+ capture: true,
+ install: true,
+ install_dir: contrib_data_args['install_dir'],
+)
+
+# TODO: implement sepgsql tests
diff --git a/contrib/spi/meson.build b/contrib/spi/meson.build
new file mode 100644
index 00000000000..51bc96ea657
--- /dev/null
+++ b/contrib/spi/meson.build
@@ -0,0 +1,43 @@
+autoinc = shared_module('autoinc',
+ ['autoinc.c'],
+ kwargs: contrib_mod_args,
+)
+
+install_data('autoinc.control', 'autoinc--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+
+insert_username = shared_module('insert_username',
+ ['insert_username.c'],
+ kwargs: contrib_mod_args,
+)
+
+install_data('insert_username.control', 'insert_username--1.0.sql',
+ install_dir: get_option('datadir') / 'extension'
+)
+
+
+moddatetime = shared_module('moddatetime',
+ ['moddatetime.c'],
+ kwargs: contrib_mod_args,
+)
+
+install_data('moddatetime.control', 'moddatetime--1.0.sql',
+ install_dir: get_option('datadir') / 'extension'
+)
+
+# this is needed for the regression tests;
+# comment out if you want a quieter refint package for other uses
+refint_cflags = ['-DREFINT_VERBOSE']
+
+refint = shared_module('refint',
+ ['refint.c'],
+ kwargs: contrib_mod_args + {
+ 'c_args': refint_cflags + contrib_mod_args['c_args'],
+ },
+)
+
+install_data('refint.control', 'refint--1.0.sql',
+ kwargs: contrib_data_args,
+)
diff --git a/contrib/sslinfo/meson.build b/contrib/sslinfo/meson.build
new file mode 100644
index 00000000000..27ee59d62d5
--- /dev/null
+++ b/contrib/sslinfo/meson.build
@@ -0,0 +1,21 @@
+if not ssl.found()
+ subdir_done()
+endif
+
+sslinfo = shared_module('sslinfo',
+ files(
+ 'sslinfo.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args + {
+ 'dependencies': [ssl, pg_mod_args['dependencies']],
+ }
+)
+
+install_data(
+ 'sslinfo--1.0--1.1.sql',
+ 'sslinfo--1.1--1.2.sql',
+ 'sslinfo--1.2.sql',
+ 'sslinfo.control',
+ kwargs: contrib_data_args,
+)
diff --git a/contrib/tablefunc/meson.build b/contrib/tablefunc/meson.build
new file mode 100644
index 00000000000..955b1ae6795
--- /dev/null
+++ b/contrib/tablefunc/meson.build
@@ -0,0 +1,23 @@
+tablefunc = shared_module('tablefunc',
+ files(
+ 'tablefunc.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'tablefunc--1.0.sql',
+ 'tablefunc.control',
+ kwargs: contrib_data_args,
+)
+
+
+regress_tests += {
+ 'name': 'tablefunc',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'tablefunc'
+ ],
+}
diff --git a/contrib/tcn/meson.build b/contrib/tcn/meson.build
new file mode 100644
index 00000000000..b3a663f64bc
--- /dev/null
+++ b/contrib/tcn/meson.build
@@ -0,0 +1,13 @@
+tcn = shared_module('tcn',
+ files(
+ 'tcn.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'tcn--1.0.sql',
+ 'tcn.control',
+ kwargs: contrib_data_args,
+)
diff --git a/contrib/test_decoding/meson.build b/contrib/test_decoding/meson.build
new file mode 100644
index 00000000000..d26b43cbe79
--- /dev/null
+++ b/contrib/test_decoding/meson.build
@@ -0,0 +1,69 @@
+test_decoding_sources = files(
+ 'test_decoding.c',
+)
+
+test_decoding = shared_module('test_decoding',
+ test_decoding_sources,
+ kwargs: contrib_mod_args,
+)
+
+
+regress_tests += {
+ 'name': 'test_decoding',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'ddl',
+ 'xact',
+ 'rewrite',
+ 'toast',
+ 'permissions',
+ 'decoding_in_xact',
+ 'decoding_into_rel',
+ 'binary',
+ 'prepared',
+ 'replorigin',
+ 'time',
+ 'messages',
+ 'spill',
+ 'slot',
+ 'truncate',
+ 'stream',
+ 'stats',
+ 'twophase',
+ 'twophase_stream',
+ ],
+ 'regress_args': [
+ '--temp-config', files('logical.conf')
+ ]
+}
+
+isolation_tests += {
+ 'name': 'test_decoding',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'specs': [
+ 'mxact',
+ 'delayed_startup',
+ 'ondisk_startup',
+ 'concurrent_ddl_dml',
+ 'oldest_xmin',
+ 'snapshot_transfer',
+ 'subxact_without_top',
+ 'concurrent_stream',
+ 'twophase_snapshot',
+ ],
+ 'regress_args': [
+ '--temp-config', files('logical.conf')
+ ]
+}
+
+
+tap_tests += {
+ 'name': 'test_decoding',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_repl_stats.pl',
+ ],
+}
diff --git a/contrib/tsm_system_rows/meson.build b/contrib/tsm_system_rows/meson.build
new file mode 100644
index 00000000000..2c8f4487f8d
--- /dev/null
+++ b/contrib/tsm_system_rows/meson.build
@@ -0,0 +1,22 @@
+tsm_system_rows = shared_module('tsm_system_rows',
+ files(
+ 'tsm_system_rows.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'tsm_system_rows--1.0.sql',
+ 'tsm_system_rows.control',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'tsm_system_rows',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'tsm_system_rows',
+ ],
+}
diff --git a/contrib/tsm_system_time/meson.build b/contrib/tsm_system_time/meson.build
new file mode 100644
index 00000000000..df9c4aa4b51
--- /dev/null
+++ b/contrib/tsm_system_time/meson.build
@@ -0,0 +1,22 @@
+tsm_system_time = shared_module('tsm_system_time',
+ files(
+ 'tsm_system_time.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'tsm_system_time--1.0.sql',
+ 'tsm_system_time.control',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'tsm_system_time',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'tsm_system_time',
+ ],
+}
diff --git a/contrib/unaccent/meson.build b/contrib/unaccent/meson.build
new file mode 100644
index 00000000000..e77bf790d8c
--- /dev/null
+++ b/contrib/unaccent/meson.build
@@ -0,0 +1,30 @@
+unaccent = shared_module('unaccent',
+ files(
+ 'unaccent.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args,
+)
+
+install_data(
+ 'unaccent--1.0--1.1.sql',
+ 'unaccent--1.1.sql',
+ 'unaccent.control',
+ kwargs: contrib_data_args,
+)
+
+install_data(
+ 'unaccent.rules',
+ install_dir: get_option('datadir') / 'tsearch_data'
+)
+
+# XXX: Implement download
+regress_tests += {
+ 'name': 'unaccent',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'unaccent',
+ ],
+ 'regress_args': ['--encoding=UTF8'],
+}
diff --git a/contrib/uuid-ossp/meson.build b/contrib/uuid-ossp/meson.build
new file mode 100644
index 00000000000..dad1ec228bd
--- /dev/null
+++ b/contrib/uuid-ossp/meson.build
@@ -0,0 +1,31 @@
+if not uuid.found()
+ subdir_done()
+endif
+
+uuid_ossp = shared_module('uuid-ossp',
+ files(
+ 'uuid-ossp.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args + {
+ 'dependencies': [uuid, pg_mod_args['dependencies']],
+ },
+
+)
+
+install_data(
+ 'uuid-ossp--1.0--1.1.sql',
+ 'uuid-ossp--1.1.sql',
+ 'uuid-ossp.control',
+ kwargs: contrib_data_args,
+)
+
+
+regress_tests += {
+ 'name': 'uuid-ossp',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'uuid_ossp'
+ ],
+}
diff --git a/contrib/vacuumlo/meson.build b/contrib/vacuumlo/meson.build
new file mode 100644
index 00000000000..99e76daacf9
--- /dev/null
+++ b/contrib/vacuumlo/meson.build
@@ -0,0 +1,14 @@
+executable('vacuumlo',
+ ['vacuumlo.c'],
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name' : 'vacuumlo',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests' :[
+ 't/001_basic.pl',
+ ]
+}
diff --git a/contrib/xml2/meson.build b/contrib/xml2/meson.build
new file mode 100644
index 00000000000..6f8a26e4f0a
--- /dev/null
+++ b/contrib/xml2/meson.build
@@ -0,0 +1,30 @@
+if not libxml.found()
+ subdir_done()
+endif
+
+xml2 = shared_module('pgxml',
+ files(
+ 'xpath.c',
+ 'xslt_proc.c',
+ ),
+ c_pch: '../../src/include/pch/postgres_pch.h',
+ kwargs: contrib_mod_args + {
+ 'dependencies': [libxml, libxslt, contrib_mod_args['dependencies']],
+ },
+)
+
+install_data(
+ 'xml2--1.0--1.1.sql',
+ 'xml2--1.1.sql',
+ 'xml2.control',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'xml2',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'xml2',
+ ],
+}
diff --git a/conversion_helpers.txt b/conversion_helpers.txt
new file mode 100644
index 00000000000..e5879b4fe77
--- /dev/null
+++ b/conversion_helpers.txt
@@ -0,0 +1,6 @@
+convert list of files to quoted-one-per-line:
+
+ ?\b\(\(?:\w\|\d\|_\|-\)+\)\.o ?\(?:\\
\)? → '\1.c',
+
+
diff --git a/doc/src/sgml/meson.build b/doc/src/sgml/meson.build
new file mode 100644
index 00000000000..4576e20a189
--- /dev/null
+++ b/doc/src/sgml/meson.build
@@ -0,0 +1,241 @@
+alldocs = []
+doc_generated = []
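+# 'alldocs' collects all documentation targets for the 'alldocs' alias target
+# defined at the bottom; 'doc_generated' collects generated SGML that the
+# various document builds depend on.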
+
+xmllint = find_program('xmllint', native: true, required: false)
+pandoc = find_program('pandoc', native: true, required: false)
+xsltproc = find_program('xsltproc', native: true, required: false)
+fop = find_program('fop', native: true, required: false)
+
+
+configure_file(
+ input: 'version.sgml.in',
+ output: 'version.sgml',
+ configuration: cdata,
+)
+
+doc_generated += custom_target('features-supported.sgml',
+ input: files(
+ '../../../src/backend/catalog/sql_feature_packages.txt',
+ '../../../src/backend/catalog/sql_features.txt'),
+ output: 'features-supported.sgml',
+ command: [perl, files('mk_feature_tables.pl'), 'YES', '@INPUT@'],
+ build_by_default: false,
+ install: false,
+ capture: true)
+
+doc_generated += custom_target('features-unsupported.sgml',
+ input: files(
+ '../../../src/backend/catalog/sql_feature_packages.txt',
+ '../../../src/backend/catalog/sql_features.txt'),
+ output: 'features-unsupported.sgml',
+ command: [perl, files('mk_feature_tables.pl'), 'NO', '@INPUT@'],
+ build_by_default: false,
+ install: false,
+ capture: true)
+
+doc_generated += custom_target('errcodes-table.sgml',
+ input: files(
+ '../../../src/backend/utils/errcodes.txt'),
+ output: 'errcodes-table.sgml',
+ command: [perl, files('generate-errcodes-table.pl'), '@INPUT@'],
+ build_by_default: false,
+ install: false,
+ capture: true)
+
+# FIXME: this actually has further inputs, adding depfile support to
+# generate-keywords-table.pl is probably the best way to address that
+# robustly.
+doc_generated += custom_target('keywords-table.sgml',
+ input: files(
+ '../../../src/include/parser/kwlist.h'),
+ output: 'keywords-table.sgml',
+ command: [perl, files('generate-keywords-table.pl'), '@CURRENT_SOURCE_DIR@'],
+ build_by_default: false,
+ install: false,
+ capture: true)
+
+# For everything else we need at least xmllint
+if not xmllint.found()
+ subdir_done()
+endif
+
+# Compute validity just once
+postgres_sgml_valid = custom_target('postgres.sgml.valid',
+ input: 'postgres.sgml',
+ output: 'postgres.sgml.valid',
+  command: [xmllint, '--noout', '--valid', '--path', '@OUTDIR@', '@INPUT@'],
+ build_by_default: true,
+ capture: true,
+)
+alldocs += postgres_sgml_valid
+
+
+#
+# Full documentation as html, text
+#
+if xsltproc.found()
+ xsltproc_flags = [
+ '--stringparam', 'pg.version', pg_version,
+ '--param', 'website.stylesheet', '1'
+ ]
+
+
+ # FIXME: Should use a wrapper around xsltproc --load-trace to compute a
+ # depfile
+ html = custom_target('html',
+ input: ['stylesheet.xsl', 'postgres.sgml'],
+ output: 'html',
+ depends: doc_generated + [postgres_sgml_valid],
+ command: [xsltproc, '--path', '@OUTDIR@', '-o', '@OUTDIR@/', xsltproc_flags, '@INPUT@'],
+ build_by_default: false,
+ )
+ alldocs += html
+
+
+ html_help = custom_target('html_help',
+ input: ['stylesheet-hh.xsl', 'postgres.sgml'],
+ output: 'htmlhelp',
+ depends: doc_generated + [postgres_sgml_valid],
+ command: [xsltproc, '--path', '@OUTDIR@', '-o', '@OUTDIR@/', xsltproc_flags, '@INPUT@'],
+ build_by_default: false,
+ )
+ alldocs += html_help
+
+
+ # single-page HTML
+ postgres_html = custom_target('postgres.html',
+ input: ['stylesheet-html-nochunk.xsl', 'postgres.sgml'],
+ output: 'postgres.html',
+ depends: doc_generated + [postgres_sgml_valid],
+ command: [xsltproc, '--path', '@OUTDIR@', '-o', '@OUTPUT@', xsltproc_flags, '@INPUT@'],
+ build_by_default: false,
+ )
+ alldocs += postgres_html
+
+ # single-page text
+ if pandoc.found()
+ postgres_txt = custom_target('postgres.txt',
+ input: [postgres_html],
+ output: 'postgres.txt',
+ depends: doc_generated + [postgres_sgml_valid],
+ command: [pandoc, '-t', 'plain', '-o', '@OUTPUT@', '@INPUT@'],
+ build_by_default: false,
+ )
+ alldocs += postgres_txt
+ endif
+endif
+
+
+#
+# INSTALL in html, text
+#
+if xsltproc.found()
+ install_xml = custom_target('INSTALL.xml',
+ input: ['standalone-profile.xsl', 'standalone-install.xml'],
+ output: 'INSTALL.xml',
+ depends: doc_generated + [postgres_sgml_valid],
+ command: [xsltproc, '--path', '@OUTDIR@', '-o', '@OUTPUT@', xsltproc_flags, '--xinclude', '@INPUT@'],
+ build_by_default: false,
+ )
+ install_html = custom_target('INSTALL.html',
+ input: ['stylesheet-text.xsl', install_xml],
+ output: 'INSTALL.html',
+ depends: doc_generated + [postgres_sgml_valid],
+ command: [xsltproc, '--path', '@OUTDIR@', '-o', '@OUTPUT@', xsltproc_flags, '@INPUT@'],
+ build_by_default: false,
+ )
+ alldocs += install_html
+
+ if pandoc.found()
+ # XXX: Makefile does an iconv translit here, but unclear why?
+ install = custom_target('INSTALL',
+ input: [install_html],
+ output: 'INSTALL',
+ depends: doc_generated + [postgres_sgml_valid],
+ command: [pandoc, '-t', 'plain', '-o', '@OUTPUT@', '@INPUT@'],
+ build_by_default: false,
+ )
+    alldocs += install
+ endif
+
+endif
+
+
+#
+# Man pages
+#
+if xsltproc.found()
+ # FIXME: implement / consider sqlmansectnum logic
+ man = custom_target('man',
+ input: ['stylesheet-man.xsl', 'postgres.sgml'],
+ output: ['man1', 'man3', 'man7'],
+ depends: doc_generated + [postgres_sgml_valid],
+ command: [xsltproc, '--path', '@OUTDIR@', '-o', '@OUTDIR@', xsltproc_flags, '@INPUT@'],
+ build_by_default: false,
+ )
+endif
+
+
+#
+# Full documentation as PDF
+#
+if fop.found() and xsltproc.found()
+ xsltproc_fo_flags = xsltproc_flags + ['--stringparam', 'img.src.path', meson.current_source_dir() + '/']
+
+ foreach format, detail: {'A4': 'A4', 'US': 'USletter'}
+ postgres_x_fo_f = 'postgres-@0@.fo'.format(format)
+ postgres_x_pdf_f = 'postgres-@0@.pdf'.format(format)
+
+ postgres_x_fo = custom_target(postgres_x_fo_f,
+ input: ['stylesheet-fo.xsl', 'postgres.sgml'],
+ output: [postgres_x_fo_f],
+ depends: doc_generated + [postgres_sgml_valid],
+ command: [xsltproc, '--path', '@OUTDIR@/', xsltproc_fo_flags,
+ '--stringparam', 'paper.type', detail,
+ '-o', '@OUTPUT@', '@INPUT@'],
+ build_by_default: false,
+ )
+
+ postgres_x_pdf = custom_target(postgres_x_pdf_f,
+ input: [postgres_x_fo],
+ output: [postgres_x_pdf_f],
+ command: [fop, '-fo', '@INPUT@', '-pdf', '@OUTPUT@'],
+ build_by_default: false,
+ )
+ alldocs += postgres_x_pdf
+ endforeach
+endif
+
+
+#
+# epub
+#
+
+# This was previously implemented using dbtoepub - but that doesn't seem to
+# support running in build != source directory (i.e. VPATH builds already
+# weren't supported).
+if pandoc.found() and xsltproc.found()
+ # XXX: Wasn't able to make pandoc successfully resolve entities
+ # XXX: Perhaps we should just make all targets use this, to avoid repeatedly
+  # building the whole thing? It's comparatively fast though.
+ postgres_full_xml = custom_target('postgres-full.xml',
+ input: ['resolv.xsl', 'postgres.sgml'],
+ output: ['postgres-full.xml'],
+ depends: doc_generated + [postgres_sgml_valid],
+ command: [xsltproc, '--path', '@OUTDIR@/', xsltproc_flags,
+ '-o', '@OUTPUT@', '@INPUT@'],
+ build_by_default: false,
+ )
+
+ postgres_epub = custom_target('postgres.epub',
+ input: [postgres_full_xml],
+ output: 'postgres.epub',
+ command: [pandoc, '-f', 'docbook', '-t', 'epub', '-o', '@OUTPUT@', '--resource-path=@CURRENT_SOURCE_DIR@',
+ '@INPUT@'],
+ build_by_default: false,
+ )
+ alldocs += postgres_epub
+endif
+
+
+alias_target('alldocs', alldocs)
diff --git a/doc/src/sgml/resolv.xsl b/doc/src/sgml/resolv.xsl
new file mode 100644
index 00000000000..c69ba714dab
--- /dev/null
+++ b/doc/src/sgml/resolv.xsl
@@ -0,0 +1,7 @@
+<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
+ <xsl:template match="@*|node()">
+ <xsl:copy>
+ <xsl:apply-templates select="@*|node()"/>
+ </xsl:copy>
+ </xsl:template>
+</xsl:stylesheet>
diff --git a/doc/src/sgml/version.sgml.in b/doc/src/sgml/version.sgml.in
new file mode 100644
index 00000000000..fa5ff343f40
--- /dev/null
+++ b/doc/src/sgml/version.sgml.in
@@ -0,0 +1,2 @@
+<!ENTITY version @PG_VERSION@>
+<!ENTITY majorversion @PG_MAJORVERSION@>
diff --git a/meson.build b/meson.build
new file mode 100644
index 00000000000..4cf621f375b
--- /dev/null
+++ b/meson.build
@@ -0,0 +1,2130 @@
+project('postgresql',
+ ['c'],
+ version: '15devel',
+ license: 'PostgreSQL',
+ meson_version: '>=0.54',
+ default_options: [
+ 'warning_level=2',
+ 'b_pie=true',
+ 'b_pch=false',
+ 'buildtype=release',
+ ]
+)
+
+
+
+###############################################################
+# Basic prep
+###############################################################
+
+fs = import('fs')
+windows = import('windows')
+
+thread_dep = dependency('threads')
+
+cpu_family = host_machine.cpu_family()
+
+
+# It's very easy to get into confusing states when the source directory
+# contains an in-place build. E.g. the wrong pg_config.h will be used. So just
+# refuse to build in that case.
+if fs.exists(meson.current_source_dir() / 'src' / 'include' / 'pg_config.h')
+ error('''
+****
+Non-clean source code directory detected.
+
+To build with meson the source tree may not have an in-place, ./configure
+style, build configured. Use a separate checkout for meson-based builds, or
+run make distclean in the source tree.
+
+You can have both meson and ./configure style builds for the same source tree
+by building out-of-source / VPATH with configure as well.
+****
+''')
+endif
+
+
+
+###############################################################
+# Version and other metadata
+###############################################################
+
+pg_version = meson.project_version()
+
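+# Derive numeric major/minor versions from the version string
+# ('15devel' => 15/0, '15beta1' => 15/1, '15.1' => 15/1).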
+if pg_version.endswith('devel')
+ pg_version_arr = [pg_version.split('devel')[0], '0']
+elif pg_version.contains('beta')
+ pg_version_arr = pg_version.split('beta')
+elif pg_version.contains('rc')
+ pg_version_arr = pg_version.split('rc')
+else
+ pg_version_arr = pg_version.split('.')
+endif
+
+pg_version_major = pg_version_arr[0].to_int()
+pg_version_minor = pg_version_arr[1].to_int()
+
+cc = meson.get_compiler('c')
+
+cdata = configuration_data()
+
+
+cdata.set_quoted('PACKAGE_NAME', 'PostgreSQL')
+cdata.set_quoted('PACKAGE_BUGREPORT', 'pgsql-bugs@lists.postgresql.org')
+cdata.set_quoted('PACKAGE_URL', 'https://www.postgresql.org/')
+
+cdata.set_quoted('PG_VERSION', pg_version)
+cdata.set_quoted('PG_VERSION_STR', 'PostgreSQL @0@ on @1@, compiled by @2@-@3@'.format(
+ pg_version, target_machine.cpu_family(), cc.get_id(), cc.version()))
+cdata.set_quoted('PG_MAJORVERSION', pg_version_major.to_string())
+cdata.set('PG_MAJORVERSION_NUM', pg_version_major)
+cdata.set_quoted('PG_MINORVERSION', pg_version_minor.to_string())
+cdata.set('PG_MINORVERSION_NUM', pg_version_minor)
+cdata.set('PG_VERSION_NUM', (pg_version_major*10000)+pg_version_minor)
+cdata.set_quoted('CONFIGURE_ARGS', '')
+
+
+
+###############################################################
+# Search paths
+#
+# NB: Arguments added globally (via the below, or CFLAGS etc) are not taken
+# into account for configuration-time checks (so they are more
+# isolated). Flags that have to be taken into account for configure checks
+# have to be explicitly specified in configure tests.
+###############################################################
+
+g_inc = []
+g_c_args = []
+g_l_args = []
+
+if host_machine.system() == 'darwin'
+ # XXX, should this be required?
+ xcrun = find_program('xcrun', native: true, required: true)
+
+ sysroot = run_command(xcrun, '--show-sdk-path', check: true).stdout().strip()
+ message('sysroot is >@0@<'.format(sysroot))
+
+ g_c_args += ['-isysroot', sysroot]
+ g_l_args += ['-isysroot', sysroot]
+endif
+
+if host_machine.system() == 'linux' or host_machine.system() == 'cygwin'
+ g_c_args += '-D_GNU_SOURCE'
+endif
+
+g_c_inc = []
+
+g_c_inc += include_directories(get_option('extra_include_dirs'))
+g_c_lib = get_option('extra_lib_dirs')
+
+add_project_arguments(g_c_args, language: ['c', 'cpp'])
+add_project_link_arguments(g_l_args, language: ['c', 'cpp'])
+
+
+
+###############################################################
+# Program paths
+###############################################################
+
+# External programs
+perl = find_program(get_option('PERL'), required: true)
+flex = find_program(get_option('FLEX'), native: true)
+bison = find_program(get_option('BISON'), native: true, version: '>= 1.875')
+sed = find_program(get_option('SED'), 'sed', native: true)
+prove = find_program(get_option('PROVE'))
+tar = find_program(get_option('TAR'), native: true)
+gzip = find_program(get_option('GZIP'), native: true)
+touch = find_program('touch', native: true)
+
+# Internal programs
+find_meson = find_program('src/tools/find_meson', native: true)
+testwrap = find_program('src/tools/testwrap', native: true)
+relpath = find_program('src/tools/relpath.py', native: true)
+
+bisonflags = []
+if bison.found()
+ bison_version_c = run_command(bison, '--version', check: true)
+ # bison version string helpfully is something like
+ # >>bison (GNU bison) 3.8.1<<
+ bison_version = bison_version_c.stdout().split(' ')[3].split('\n')[0]
+ if bison_version.version_compare('>=3.0')
+ bisonflags += ['-Wno-deprecated']
+ endif
+endif
+
+
+wget = find_program('wget', required: false, native: true)
+wget_flags = ['-O', '@OUTPUT0@', '--no-use-server-timestamps']
+
+
+###############################################################
+# Path to meson (for tests etc)
+###############################################################
+
+# FIXME: this should really be part of meson, see
+# https://github.com/mesonbuild/meson/issues/8511
+meson_binpath_r = run_command(find_meson, check: true)
+
+if meson_binpath_r.returncode() != 0 or meson_binpath_r.stdout() == ''
+ error('huh, could not run find_meson.\nerrcode: @0@\nstdout: @1@\nstderr: @2@'.format(
+ meson_binpath_r.returncode(),
+ meson_binpath_r.stdout(),
+ meson_binpath_r.stderr()))
+endif
+
+meson_binpath_s = meson_binpath_r.stdout().split('\n')
+meson_binpath_len = meson_binpath_s.length()
+
+if meson_binpath_len < 1
+ error('unexpected introspect line @0@'.format(meson_binpath_r.stdout()))
+endif
+
+i = 0
+meson_binpath = ''
+meson_args = []
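+# The first line of find_meson's output is the path to the meson binary; any
+# further lines are arguments that have to be passed along when invoking it.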
+foreach e : meson_binpath_s
+ if i == 0
+ meson_binpath = e
+ else
+ meson_args += e
+ endif
+ i += 1
+endforeach
+
+meson_bin = find_program(meson_binpath, native: true)
+
+
+
+###############################################################
+# Option Handling
+###############################################################
+
+cdata.set('USE_ASSERT_CHECKING', get_option('cassert'))
+
+cdata.set('BLCKSZ', 8192, description: '''
+ Size of a disk block --- this also limits the size of a tuple. You
+ can set it bigger if you need bigger tuples (although TOAST should
+ reduce the need to have large tuples, since fields can be spread
+ across multiple tuples).
+
+ BLCKSZ must be a power of 2. The maximum possible value of BLCKSZ
+ is currently 2^15 (32768). This is determined by the 15-bit widths
+ of the lp_off and lp_len fields in ItemIdData (see
+ include/storage/itemid.h).
+
+ Changing BLCKSZ requires an initdb.
+''')
+
+cdata.set('XLOG_BLCKSZ', 8192)
+cdata.set('RELSEG_SIZE', 131072)
+cdata.set('DEF_PGPORT', 5432)
+cdata.set_quoted('DEF_PGPORT_STR', '5432')
+cdata.set_quoted('PG_KRB_SRVNAM', 'postgres')
+
+
+
+###############################################################
+# Library: GSSAPI
+###############################################################
+
+gssapiopt = get_option('gssapi')
+if not gssapiopt.disabled()
+ gssapi = dependency('krb5-gssapi', required: gssapiopt)
+
+ if gssapi.found() and \
+ cc.check_header('gssapi/gssapi.h', args: g_c_args, dependencies: gssapi, required: gssapiopt)
+
+ if not cc.has_function('gss_init_sec_context', args: g_c_args, dependencies: gssapi)
+ error('''could not find function 'gss_init_sec_context' required for GSSAPI''')
+ endif
+ cdata.set('ENABLE_GSS', 1)
+ endif
+
+else
+ gssapi = dependency('', required : false)
+endif
+
+
+
+###############################################################
+# Library: ldap
+###############################################################
+
+ldapopt = get_option('ldap')
+if not ldapopt.disabled()
+
+ if host_machine.system() == 'windows'
+ ldap = cc.find_library('wldap32')
+ ldap_r = ldap
+ else
+ ldap = dependency('ldap', required: false)
+
+ # Before 2.5 openldap didn't have a pkg-config file..
+ if ldap.found()
+ ldap_r = ldap
+ else
+ ldap = cc.find_library('ldap', required: ldapopt)
+ ldap_r = cc.find_library('ldap_r', required: ldapopt)
+
+ # Use ldap_r for FE if available, else assume ldap is thread-safe.
+ # On some platforms ldap_r fails to link without PTHREAD_LIBS.
+ if ldap.found() and not ldap_r.found()
+ ldap_r = ldap
+ endif
+ endif
+
+ if ldap.found() and cc.has_function('ldap_initialize', args: g_c_args, dependencies: [ldap, thread_dep])
+ cdata.set('HAVE_LDAP_INITIALIZE', 1)
+ endif
+ endif
+
+ if ldap.found()
+ cdata.set('USE_LDAP', 1)
+ endif
+
+else
+ ldap = dependency('', required : false)
+ ldap_r = ldap
+endif
+
+
+
+###############################################################
+# Library: LLVM
+###############################################################
+
+llvmopt = get_option('llvm')
+if not llvmopt.disabled()
+ add_languages('cpp', required : true, native: false)
+ llvm = dependency('llvm', version : '>=3.9', method: 'config-tool', required: llvmopt)
+
+ if llvm.found()
+
+ cdata.set('USE_LLVM', 1)
+
+ cpp = meson.get_compiler('cpp')
+
+ llvm_binpath = llvm.get_variable(configtool: 'bindir')
+
+ ccache = find_program('ccache', required: false)
+ clang = find_program(llvm_binpath / 'clang', required: true)
+ llvm_lto = find_program(llvm_binpath / 'llvm-lto', required: true)
+
+ # FIXME: the includes hardcoded here suck
+ llvm_irgen_args = [
+ '-c', '-o', '@OUTPUT@', '@INPUT@',
+ '-flto=thin', '-emit-llvm',
+ '-MD', '-MQ', '@OUTPUT@', '-MF', '@DEPFILE@',
+ '-I', '@SOURCE_ROOT@/src/include',
+ '-I', '@BUILD_ROOT@/src/include',
+ '-I', '@BUILD_ROOT@/src/backend/utils/misc',
+ '-I', '@CURRENT_SOURCE_DIR@',
+ '-O2',
+ '-Wno-ignored-attributes',
+ '-Wno-empty-body',
+ ]
+
+ if ccache.found()
+ llvm_irgen_command = ccache
+ llvm_irgen_args = [clang.path()] + llvm_irgen_args
+ else
+ llvm_irgen_command = clang
+ endif
+
+ llvm_irgen_kw = {
+ 'command': [llvm_irgen_command] + llvm_irgen_args,
+ 'depfile': '@BASENAME@.c.bc.d',
+ }
+
+ irlink = find_program('src/tools/irlink', native: true)
+
+ llvm_irlink_kw = {
+ 'command':[
+ irlink,
+ '@SOURCE_ROOT@',
+ '@BUILD_ROOT@',
+ llvm_lto,
+ '-o', '@OUTPUT0@',
+ '@PRIVATE_DIR@',
+ '@INPUT@',
+ ],
+ 'install': true,
+ 'install_dir': get_option('libdir'),
+ }
+
+ endif
+else
+ llvm = dependency('', required: false)
+endif
+
+
+
+###############################################################
+# Library: icu
+###############################################################
+
+if not get_option('icu').disabled()
+ icu = dependency('icu-uc', required: get_option('icu').enabled())
+ icu_i18n = dependency('icu-i18n', required: get_option('icu').enabled())
+
+ if icu.found()
+ cdata.set('USE_ICU', 1)
+ endif
+
+else
+ icu = dependency('', required : false)
+ icu_i18n = dependency('', required : false)
+endif
+
+
+
+###############################################################
+# Library: libxml
+###############################################################
+
+libxmlopt = get_option('libxml')
+if not libxmlopt.disabled()
+ libxml = dependency('libxml-2.0', required: libxmlopt, version: '>= 2.6.23')
+
+ if libxml.found()
+ cdata.set('USE_LIBXML', 1)
+ endif
+else
+ libxml = dependency('', required : false)
+endif
+
+
+
+###############################################################
+# Library: libxslt
+###############################################################
+
+libxsltopt = get_option('libxslt')
+if not libxsltopt.disabled()
+ libxslt = dependency('libxslt', required: libxsltopt)
+
+ if libxslt.found()
+ cdata.set('USE_LIBXSLT', 1)
+ endif
+else
+ libxslt = dependency('', required : false)
+endif
+
+
+
+###############################################################
+# Library: lz4
+###############################################################
+
+lz4opt = get_option('lz4')
+if not lz4opt.disabled()
+ lz4 = dependency('liblz4', required: lz4opt)
+
+ if lz4.found()
+ cdata.set('USE_LZ4', 1)
+ cdata.set('HAVE_LIBLZ4', 1)
+ endif
+
+else
+ lz4 = dependency('', required : false)
+endif
+
+
+
+###############################################################
+# Library: Perl (for plperl)
+###############################################################
+
+perlopt = get_option('perl')
+perl_dep = dependency('', required: false)
+
+if perlopt.disabled()
+ perl_may_work = false
+else
+ perl_may_work = true
+
+ # First verify that perl has the necessary dependencies installed
+ perl_mods = run_command(
+ [perl,
+ '-MConfig', '-MOpcode', '-MExtUtils::Embed', '-MExtUtils::ParseXS',
+ '-e', ''],
+ check: false)
+ if perl_mods.returncode() != 0
+ perl_may_work = false
+ perl_msg = 'perl installation does not have the required modules'
+ endif
+
+ # Then inquire perl about its configuration
+ if perl_may_work
+ # FIXME: include copy-edited comments from perl.m4
+ perl_conf_cmd = [perl, '-MConfig', '-e', 'print $Config{$ARGV[0]}']
+ perlversion = run_command(perl_conf_cmd, 'api_versionstring', check: true).stdout()
+ archlibexp = run_command(perl_conf_cmd, 'archlibexp', check: true).stdout()
+ privlibexp = run_command(perl_conf_cmd, 'privlibexp', check: true).stdout()
+ useshrplib = run_command(perl_conf_cmd, 'useshrplib', check: true).stdout()
+ libperl = run_command(perl_conf_cmd, 'libperl', check: true).stdout()
+
+ perl_inc_dir = '@0@/CORE'.format(archlibexp)
+
+ perl_ccflags = []
+
+ if useshrplib != 'true'
+ perl_may_work = false
+ perl_msg = 'need a shared perl'
+ endif
+ endif
+
+ # XXX: should we only add directories that exist? Seems a bit annoying with
+ # macos' sysroot stuff...
+ #
+ # NB: For unknown reasons msys' python doesn't see these paths, despite gcc
+ # et al seeing them. So we can't use include_directories(), as that checks
+ # file existence.
+ if perl_may_work
+ perl_ccflags += ['-I@0@'.format(perl_inc_dir)]
+ if host_machine.system() == 'darwin'
+ perl_ccflags += ['-iwithsysroot', perl_inc_dir]
+ endif
+ endif
+
+ # check required headers are present
+ if perl_may_work and not \
+ cc.has_header('perl.h', args: g_c_args + perl_ccflags, required: false)
+ perl_may_work = false
+ perl_msg = 'missing perl.h'
+ endif
+
+ # Find perl library. This is made more complicated by the fact that the name
+ # Config.pm returns isn't directly usable (sometimes lib needs to be chopped
+ # off)
+ if perl_may_work
+ foreach p : ['perl', 'libperl', libperl, libperl.strip('lib'), fs.stem(libperl), fs.stem(libperl).strip('lib')]
+ perl_dep_int = cc.find_library(p,
+ dirs: ['@0@/CORE'.format(archlibexp)],
+ required: false)
+ if perl_dep_int.found()
+ break
+ endif
+ endforeach
+
+ if not perl_dep_int.found()
+ perl_may_work = false
+ perl_msg = 'missing libperl'
+ endif
+ endif
+
+ if perl_may_work
+ perl_ccflags_r = run_command(perl_conf_cmd, 'ccflags', check: true).stdout()
+ message('CCFLAGS recommended by Perl: @0@'.format(perl_ccflags_r))
+
+ foreach flag : perl_ccflags_r.split(' ')
+ if flag.startswith('-D') and \
+        (not flag.startswith('-D_') or flag == '-D_USE_32BIT_TIME_T')
+ perl_ccflags += flag
+ endif
+ endforeach
+
+ if host_machine.system() == 'windows'
+ perl_ccflags += ['-DPLPERL_HAVE_UID_GID']
+ endif
+
+ message('CCFLAGS for embedding perl: @0@'.format(' '.join(perl_ccflags)))
+
+ # perl.m4 sayeth:
+ #
+ # We are after Embed's ldopts, but without the subset mentioned in
+ # Config's ccdlflags;
+ #
+ # FIXME: andres sayeth: But why?
+
+ ldopts = run_command(perl, '-MExtUtils::Embed', '-e', 'ldopts', check: true).stdout().strip()
+ ccdlflags = run_command(perl_conf_cmd, 'ccdlflags', check: true).stdout().strip()
+
+ ccdlflags_dict = {}
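+    # Build a lookup of the individual ccdlflags so they can be filtered out
+    # of ldopts below.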
+
+ foreach ccdlflag : ccdlflags.split(' ')
+ ccdlflags_dict += {ccdlflag: 1}
+ endforeach
+
+ perl_ldopts = []
+ foreach ldopt : ldopts.split(' ')
+ if ldopt == ''
+ continue
+ elif ccdlflags_dict.has_key(ldopt)
+ continue
+ # strawberry perl unhelpfully has that in ldopts
+ elif ldopt == '-s'
+ continue
+ endif
+
+ perl_ldopts += ldopt.strip('"')
+ endforeach
+
+ # FIXME: check if windows handling is necessary
+
+ message('LDFLAGS for embedding perl: "@0@" (ccdlflags: "@1@", ldopts: "@2@")'.format(
+ ' '.join(perl_ldopts), ccdlflags, ldopts))
+
+ if perl_dep_int.found()
+ perl_dep = declare_dependency(
+ compile_args: perl_ccflags,
+ link_args: perl_ldopts,
+ version: perlversion,
+ )
+ endif
+ endif # perl_may_work
+
+ if not perl_may_work
+ if perlopt.enabled()
+ error('dependency perl failed: @0@'.format(perl_msg))
+ else
+ message('disabling optional dependency perl: @0@'.format(perl_msg))
+ endif
+ endif
+endif
+
+
+
+###############################################################
+# Library: Python (for plpython)
+###############################################################
+
+pyopt = get_option('python')
+if not pyopt.disabled()
+ pm = import('python')
+ python3_inst = pm.find_installation(required: pyopt.enabled())
+ python3 = python3_inst.dependency(embed: true, required: pyopt.enabled())
+else
+ python3 = dependency('', required: false)
+endif
+
+
+
+###############################################################
+# Library: Readline
+#
+# FIXME: editline support
+###############################################################
+
+if not get_option('readline').disabled()
+ readline = dependency('readline', required: false)
+ if not readline.found()
+ readline = cc.find_library('readline',
+ required: get_option('readline').enabled())
+ endif
+
+ if readline.found()
+ cdata.set('HAVE_LIBREADLINE', 1)
+
+ if cc.has_header('readline/history.h', args: g_c_args, dependencies: [readline], required: false)
+ history_h = 'readline/history.h'
+ cdata.set('HAVE_READLINE_HISTORY_H', 1)
+ cdata.set('HAVE_READLINE_H', false)
+ elif cc.has_header('history.h', args: g_c_args, dependencies: [readline], required: false)
+ history_h = 'history.h'
+ cdata.set('HAVE_READLINE_HISTORY_H', false)
+ cdata.set('HAVE_HISTORY_H', 1)
+ else
+ error('''readline header not found
+If you have readline already installed, see meson-logs/meson-log.txt for details on the
+failure. It is possible the compiler isn't looking in the proper directory.
+Use -Dreadline=disabled to disable readline support.''')
+ endif
+
+ if cc.has_header('readline/readline.h', args: g_c_args, dependencies: [readline], required: false)
+ readline_h = 'readline/readline.h'
+ cdata.set('HAVE_READLINE_READLINE_H', 1)
+ elif cc.has_header('readline.h', args: g_c_args, dependencies: [readline], required: false)
+ readline_h = 'readline.h'
+ cdata.set('HAVE_READLINE_H', 1)
+ else
+ error('''readline header not found
+If you have readline already installed, see meson-logs/meson-log.txt for details on the
+failure. It is possible the compiler isn't looking in the proper directory.
+Use -Dreadline=disabled to disable readline support.''')
+ endif
+
+ check_funcs = [
+ 'rl_completion_matches',
+ 'rl_filename_completion_function',
+ 'rl_reset_screen_size',
+ 'append_history',
+ 'history_truncate_file',
+ ]
+
+ foreach func : check_funcs
+ cdata.set('HAVE_'+func.to_upper(),
+ cc.has_function(func, args: g_c_args, dependencies: [readline]) ? 1 : false)
+ endforeach
+
+ check_vars = [
+ 'rl_completion_append_character',
+ 'rl_completion_suppress_quote',
+ 'rl_filename_quote_characters',
+ 'rl_filename_quoting_function',
+ ]
+
+ foreach var : check_vars
+ cdata.set('HAVE_'+var.to_upper(),
+ cc.has_header_symbol(readline_h, var, args: g_c_args, dependencies: [readline]) ? 1 : false)
+ endforeach
+ endif
+else
+ readline = dependency('', required : false)
+endif
+
+
+
+###############################################################
+# Library: selinux
+###############################################################
+
+selinux = dependency('', required : false)
+selinuxopt = get_option('selinux')
+if not selinuxopt.disabled()
+ selinux = dependency('libselinux', required: selinuxopt, version: '>= 2.1.10')
+endif
+cdata.set('HAVE_LIBSELINUX',
+ selinux.found() ? 1 : false)
+
+
+
+###############################################################
+# Library: systemd
+###############################################################
+
+systemd = dependency('', required : false)
+systemdopt = get_option('systemd')
+if meson.version().version_compare('>=0.59')
+ systemdopt = systemdopt.disable_auto_if(host_machine.system() != 'linux')
+endif
+if not systemdopt.disabled()
+ systemd = dependency('libsystemd', required: systemdopt)
+endif
+cdata.set('USE_SYSTEMD',
+ systemd.found() ? 1 : false)
+
+
+
+###############################################################
+# Library: SSL
+###############################################################
+
+if get_option('ssl') == 'openssl'
+
+ # Try to find openssl via pkg-config et al, if that doesn't work, look for
+ # the library names that we know about.
+
+ # via pkg-config et al
+ ssl = dependency('openssl', required: false)
+
+ # via library + headers
+ if not ssl.found()
+ ssl_lib = cc.find_library('ssl',
+ dirs: g_c_lib,
+ header_include_directories: g_c_inc,
+ has_headers: ['openssl/ssl.h', 'openssl/err.h'])
+ crypto_lib = cc.find_library('crypto',
+ dirs: g_c_lib,
+ header_include_directories: g_c_inc)
+ ssl_int = [ssl_lib, crypto_lib]
+
+ ssl = declare_dependency(dependencies: ssl_int,
+ include_directories: g_c_inc)
+ else
+ cc.has_header('openssl/ssl.h', args: g_c_args, dependencies: ssl, required: true)
+ cc.has_header('openssl/err.h', args: g_c_args, dependencies: ssl, required: true)
+
+ ssl_int = [ssl]
+ endif
+
+ cdata.set_quoted('WITH_SSL', get_option('ssl'))
+
+ check_funcs = [
+ ['CRYPTO_new_ex_data', {'required': true}],
+ ['SSL_new', {'required': true}],
+
+ # Function introduced in OpenSSL 1.0.2.
+ ['X509_get_signature_nid'],
+
+ # Functions introduced in OpenSSL 1.1.0. We used to check for
+ # OPENSSL_VERSION_NUMBER, but that didn't work with 1.1.0, because LibreSSL
+ # defines OPENSSL_VERSION_NUMBER to claim version 2.0.0, even though it
+ # doesn't have these OpenSSL 1.1.0 functions. So check for individual
+ # functions.
+ ['OPENSSL_init_ssl'],
+ ['BIO_get_data'],
+ ['BIO_meth_new'],
+ ['ASN1_STRING_get0_data'],
+ ['HMAC_CTX_new'],
+ ['HMAC_CTX_free'],
+
+ # OpenSSL versions before 1.1.0 required setting callback functions, for
+ # thread-safety. In 1.1.0, it's no longer required, and CRYPTO_lock()
+ # function was removed.
+ ['CRYPTO_lock'],
+ ]
+
+ foreach c : check_funcs
+ func = c.get(0)
+ val = cc.has_function(func, args: g_c_args, dependencies: ssl_int)
+ if not val and c.get(1, {}).get('required', false)
+ error('openssl function @0@ is required'.format(func))
+ endif
+ cdata.set('HAVE_'+func.to_upper(), val ? 1 : false)
+ endforeach
+
+ cdata.set('USE_OPENSSL', 1,
+ description: 'Define to 1 to build with OpenSSL support. (-Dssl=openssl)')
+
+  cdata.set('OPENSSL_API_COMPAT', '0x10001000L',
+ description: 'Define to the OpenSSL API version in use. This avoids deprecation warnings from newer OpenSSL versions.')
+else
+ ssl = dependency('', required : false)
+endif
+
+
+
+###############################################################
+# Library: uuid
+###############################################################
+
+uuidopt = get_option('uuid')
+if uuidopt != 'none'
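+  # Each supported UUID library is identified by its header and one
+  # representative function to probe for.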
+ uuidname = uuidopt.to_upper()
+ if uuidopt == 'e2fs'
+ uuid = dependency('uuid', required: true)
+ uuidfunc = 'uuid_generate'
+ uuidheader = 'uuid/uuid.h'
+ elif uuidopt == 'bsd'
+ # libc should have uuid function
+ uuid = declare_dependency()
+ uuidfunc = 'uuid_to_string'
+ uuidheader = 'uuid.h'
+ elif uuidopt == 'ossp'
+ uuid = dependency('ossp-uuid', required: true)
+ uuidfunc = 'uuid_export'
+ uuidheader = 'ossp/uuid.h'
+ else
+ error('huh')
+ endif
+
+ if not cc.has_header_symbol(uuidheader, uuidfunc, dependencies: uuid)
+ error('uuid library @0@ missing required function @1@'.format(uuidopt, uuidfunc))
+ endif
+ cdata.set('HAVE_@0@'.format(uuidheader.underscorify().to_upper()), 1)
+
+ cdata.set('HAVE_UUID_@0@'.format(uuidname), 1,
+ description: 'Define to 1 if you have @0@ UUID support.'.format(uuidname))
+else
+ uuid = dependency('', required : false)
+endif
+
+
+
+###############################################################
+# Library: zlib
+###############################################################
+
+zlibopt = get_option('zlib')
+zlib = dependency('', required : false)
+if not zlibopt.disabled()
+ zlib_t = dependency('zlib', required: zlibopt)
+
+ if zlib_t.type_name() == 'internal'
+ # if fallback was used, we don't need to test if headers are present (they
+ # aren't built yet, so we can't test)
+ zlib = zlib_t
+ elif not zlib_t.found()
+ warning('did not find zlib')
+ elif not cc.has_header('zlib.h', args: g_c_args, dependencies: [zlib_t], required: zlibopt.enabled())
+ warning('zlib header not found')
+ elif not cc.has_type('z_streamp', args: g_c_args, dependencies: [zlib_t], prefix: '#include <zlib.h>')
+ if zlibopt.enabled()
+ error('zlib version is too old')
+ else
+ warning('zlib version is too old')
+ endif
+ else
+ zlib = zlib_t
+ endif
+
+ if zlib.found()
+ cdata.set('HAVE_LIBZ', 1)
+ endif
+endif
+
+
+
+###############################################################
+# Compiler tests
+###############################################################
+
+sizeof_long = cc.sizeof('long', args: g_c_args)
+cdata.set('SIZEOF_LONG', sizeof_long)
+if sizeof_long == 8
+ cdata.set('HAVE_LONG_INT_64', 1)
+ cdata.set('PG_INT64_TYPE', 'long int')
+ cdata.set_quoted('INT64_MODIFIER', 'l')
+elif sizeof_long == 4 and cc.sizeof('long long', args: g_c_args) == 8
+ cdata.set('HAVE_LONG_LONG_INT_64', 1)
+ cdata.set('PG_INT64_TYPE', 'long long int')
+ cdata.set_quoted('INT64_MODIFIER', 'll')
+else
+ error('do not know how to get a 64bit int')
+endif
+
+
+cdata.set('MAXIMUM_ALIGNOF', 8)
+cdata.set('ALIGNOF_SHORT', cc.alignment('short', args: g_c_args))
+cdata.set('ALIGNOF_INT', cc.alignment('int', args: g_c_args))
+cdata.set('ALIGNOF_LONG', cc.alignment('long', args: g_c_args))
+cdata.set('ALIGNOF_DOUBLE', cc.alignment('double', args: g_c_args))
+cdata.set('SIZEOF_VOID_P', cc.sizeof('void *', args: g_c_args))
+cdata.set('SIZEOF_SIZE_T', cc.sizeof('size_t', args: g_c_args))
+
+# Check if the C compiler knows computed gotos (gcc extension, also
+# available in at least clang). If so, define HAVE_COMPUTED_GOTO.
+#
+# Checking whether computed gotos are supported syntax-wise ought to
+# be enough, as the syntax is otherwise illegal.
+if cc.compiles('''
+ static inline int foo(void)
+ {
+ void *labeladdrs[] = {&&my_label};
+ goto *labeladdrs[0];
+ my_label:
+ return 1;
+ }''',
+ args: g_c_args)
+ cdata.set('HAVE_COMPUTED_GOTO', 1)
+endif
+
+
+# XXX: for now just assume that compiler knows __func__ - it's C99 after all.
+cdata.set('HAVE_FUNCNAME__FUNC', 1)
+
+# Check if the C compiler understands _Static_assert(),
+# and define HAVE__STATIC_ASSERT if so.
+#
+# We actually check the syntax ({ _Static_assert(...) }), because we need
+# gcc-style compound expressions to be able to wrap the thing into macros.
+if cc.compiles('''
+ int main(int arg, char **argv)
+ {
+ ({ _Static_assert(1, "foo"); })
+ }
+ ''',
+ args: g_c_args)
+ cdata.set('HAVE__STATIC_ASSERT', 1)
+endif
+
+# We use <stdbool.h> if we have it and it declares type bool as having
+# size 1. Otherwise, c.h will fall back to declaring bool as unsigned char.
+if cc.has_type('_Bool', args: g_c_args) \
+ and cc.has_type('bool', prefix: '#include <stdbool.h>', args: g_c_args) \
+ and cc.sizeof('bool', prefix: '#include <stdbool.h>', args: g_c_args) == 1
+ cdata.set('HAVE__BOOL', 1)
+ cdata.set('PG_USE_STDBOOL', 1)
+endif
+
+
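+# Find the first format archetype the compiler accepts for checking
+# printf-style format strings.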
+printf_attributes = ['gnu_printf', '__syslog__', 'printf']
+testsrc = 'extern void pgac_write(int ignore, const char *fmt,...) __attribute__((format(@0@, 2,3)));'
+foreach a : printf_attributes
+ if cc.compiles(testsrc.format(a), args: g_c_args + ['-Werror'], name: 'format ' + a)
+ cdata.set('PG_PRINTF_ATTRIBUTE', a)
+ break
+ endif
+endforeach
+
+if cc.has_function_attribute('visibility:default') and \
+ cc.has_function_attribute('visibility:hidden')
+ cdata.set('HAVE_VISIBILITY_ATTRIBUTE', 1)
+endif
+
+
+if cc.has_function('__builtin_unreachable', args: g_c_args)
+ cdata.set('HAVE__BUILTIN_UNREACHABLE', 1)
+endif
+
+if cc.has_function('__builtin_constant_p', args: g_c_args)
+ cdata.set('HAVE__BUILTIN_CONSTANT_P', 1)
+
+ if host_machine.cpu_family() == 'ppc' or host_machine.cpu_family() == 'ppc64'
+ # Check if compiler accepts "i"(x) when __builtin_constant_p(x).
+ if cc.compiles('''
+ static inline int
+ addi(int ra, int si)
+ {
+ int res = 0;
+ if (__builtin_constant_p(si))
+ __asm__ __volatile__(
+ " addi %0,%1,%2\n" : "=r"(res) : "b"(ra), "i"(si));
+ return res;
+ }
+ int test_adds(int x) { return addi(3, x) + addi(x, 5); }
+ ''',
+ args: g_c_args)
+ cdata.set('HAVE_I_CONSTRAINT__BUILTIN_CONSTANT_P', 1)
+ endif
+ endif
+endif
+
+
+
+# XXX: The configure.ac check for __cpuid() is broken, we don't copy that
+# here. To prevent problems due to two detection methods working, stop
+# checking after one.
+if cc.links('''
+ #include <cpuid.h>
+ int main(int arg, char **argv)
+ {
+ unsigned int exx[4] = {0, 0, 0, 0};
+ __get_cpuid(1, &exx[0], &exx[1], &exx[2], &exx[3]);
+ }
+ ''', name: '__get_cpuid',
+ args: g_c_args)
+ cdata.set('HAVE__GET_CPUID', 1)
+elif cc.links('''
+ #include <intrin.h>
+ int main(int arg, char **argv)
+ {
+ unsigned int exx[4] = {0, 0, 0, 0};
+ __cpuid(exx, 1);
+ }
+ ''', name: '__cpuid',
+ args: g_c_args)
+ cdata.set('HAVE__CPUID', 1)
+endif
+
+
+
+###############################################################
+# Compiler flags
+###############################################################
+
+common_functional_flags = [
+ # Disable strict-aliasing rules; needed for gcc 3.3+
+ '-fno-strict-aliasing',
+ # Disable optimizations that assume no overflow; needed for gcc 4.3+
+ '-fwrapv',
+ '-fexcess-precision=standard'
+]
+
+add_project_arguments(cc.get_supported_arguments(common_functional_flags), language: 'c')
+
+vectorize_cflags = cc.get_supported_arguments(['-ftree-vectorize'])
+unroll_loops_cflags = cc.get_supported_arguments(['-funroll-loops'])
+
+
+common_warning_flags = [
+ '-Wmissing-prototypes',
+ '-Wpointer-arith',
+ '-Werror=vla',
+ '-Wendif-labels',
+ '-Wmissing-format-attribute',
+ '-Wimplicit-fallthrough=3',
+ '-Wcast-function-type',
+ '-Wformat-security',
+]
+
+add_project_arguments(cc.get_supported_arguments(common_warning_flags), language: 'c')
+
+if llvm.found()
+ add_project_arguments(cpp.get_supported_arguments(common_warning_flags), language: 'cpp')
+endif
+
+# A few places with imported code get a pass on -Wdeclaration-after-statement, remember
+# the result for them
+if cc.has_argument('-Wdeclaration-after-statement')
+ add_project_arguments('-Wdeclaration-after-statement', language: 'c')
+ using_declaration_after_statement_warning = true
+else
+ using_declaration_after_statement_warning = false
+endif
+
+
+# We want to suppress a few unhelpful warnings - but gcc won't
+# complain about unrecognized -Wno-foo switches, so we have to test
+# for the positive form and if that works, add the negative form
+
+negative_warning_flags = [
+ 'unused-command-line-argument',
+ 'format-truncation',
+ 'stringop-truncation',
+
+ # FIXME: from andres's local config
+ 'clobbered',
+ 'missing-field-initializers',
+ 'sign-compare',
+ 'unused-parameter',
+]
+
+foreach w : negative_warning_flags
+ if cc.has_argument('-W'+w)
+ add_project_arguments('-Wno-'+w, language: 'c')
+ endif
+
+ if llvm.found() and cpp.has_argument('-W'+w)
+ add_project_arguments('-Wno-'+w, language: 'cpp')
+ endif
+endforeach
+
+
+# From Project.pm
+if cc.get_id() == 'msvc'
+ add_project_arguments('/wd4018', '/wd4244', '/wd4273', '/wd4102', '/wd4090', '/wd4267',
+ language: 'c')
+ add_project_arguments('/DWIN32', '/DWINDOWS', '/D__WINDOWS__', '/D__WIN32__',
+ '/DWIN32_STACK_RLIMIT=4194304', '/D_CRT_SECURE_NO_DEPRECATE', '/D_CRT_NONSTDC_NO_DEPRECATE',
+ language: 'c')
+endif
+
+
+
+###############################################################
+# Atomics
+###############################################################
+
+# FIXME
+cdata.set('HAVE_SPINLOCKS', 1)
+
+if get_option('atomics')
+ # FIXME
+ cdata.set('HAVE_ATOMICS', 1)
+
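+  # Probe which __sync / __atomic builtins the compiler supports; each check
+  # is a small program that has to link successfully.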
+ atomic_checks = [
+ {'name': 'HAVE_GCC__SYNC_CHAR_TAS',
+ 'desc': '__sync_lock_test_and_set(char)',
+ 'test': '''
+char lock = 0;
+__sync_lock_test_and_set(&lock, 1);
+__sync_lock_release(&lock);'''},
+
+ {'name': 'HAVE_GCC__SYNC_INT32_TAS',
+ 'desc': '__sync_lock_test_and_set(int32)',
+ 'test': '''
+int lock = 0;
+__sync_lock_test_and_set(&lock, 1);
+__sync_lock_release(&lock);'''},
+
+ {'name': 'HAVE_GCC__SYNC_INT32_CAS',
+ 'desc': '__sync_val_compare_and_swap(int32)',
+ 'test': '''
+int val = 0;
+__sync_val_compare_and_swap(&val, 0, 37);'''},
+
+# FIXME: int64 reference
+ {'name': 'HAVE_GCC__SYNC_INT64_CAS',
+ 'desc': '__sync_val_compare_and_swap(int64)',
+ 'test': '''
+long val = 0;
+__sync_val_compare_and_swap(&val, 0, 37);'''},
+
+ {'name': 'HAVE_GCC__ATOMIC_INT32_CAS',
+ 'desc': ' __atomic_compare_exchange_n(int32)',
+ 'test': '''
+int val = 0;
+int expect = 0;
+__atomic_compare_exchange_n(&val, &expect, 37, 0, __ATOMIC_SEQ_CST, __ATOMIC_RELAXED);'''},
+
+# FIXME: int64 reference
+ {'name': 'HAVE_GCC__ATOMIC_INT64_CAS',
+ 'desc': ' __atomic_compare_exchange_n(int64)',
+ 'test': '''
+long val = 0;
+int expect = 0;
+__atomic_compare_exchange_n(&val, &expect, 37, 0, __ATOMIC_SEQ_CST, __ATOMIC_RELAXED);'''},
+ ]
+
+ foreach check : atomic_checks
+ test = '''
+int main(void)
+{
+@0@
+}'''.format(check['test'])
+
+ cdata.set(check['name'],
+ cc.links(test, name: check['desc'], args: g_c_args))
+ endforeach
+
+endif
+
+
+
+###############################################################
+# CRC
+###############################################################
+
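+# Use a hardware-accelerated CRC32C implementation where available (SSE 4.2
+# on x86, the CRC extension on ARMv8); otherwise fall back to slicing-by-8.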
+have_optimized_crc = false
+cflags_crc = []
+if cpu_family == 'x86' or cpu_family == 'x86_64'
+
+ if cc.get_id() == 'msvc'
+ cdata.set('USE_SSE42_CRC32C', false)
+ cdata.set('USE_SSE42_CRC32C_WITH_RUNTIME_CHECK', 1)
+ have_optimized_crc = true
+ else
+
+ prog = '''
+#include <nmmintrin.h>
+
+int main(void)
+{
+ unsigned int crc = 0;
+ crc = _mm_crc32_u8(crc, 0);
+ crc = _mm_crc32_u32(crc, 0);
+ /* return computed value, to prevent the above being optimized away */
+ return crc == 0;
+}
+'''
+
+ if cc.links(prog, name: '_mm_crc32_u8 and _mm_crc32_u32 without -msse4.2', args: g_c_args)
+ cdata.set('USE_SSE42_CRC32C', 1)
+ have_optimized_crc = true
+ elif cc.links(prog, name: '_mm_crc32_u8 and _mm_crc32_u32 with -msse4.2', args: g_c_args + ['-msse4.2'])
+ cflags_crc += '-msse4.2'
+ cdata.set('USE_SSE42_CRC32C', false)
+ cdata.set('USE_SSE42_CRC32C_WITH_RUNTIME_CHECK', 1)
+ have_optimized_crc = true
+ endif
+
+ endif
+
+elif cpu_family == 'arm' or cpu_family == 'aarch64'
+
+ prog = '''
+#include <arm_acle.h>
+
+int main(void)
+{
+ unsigned int crc = 0;
+ crc = __crc32cb(crc, 0);
+ crc = __crc32ch(crc, 0);
+ crc = __crc32cw(crc, 0);
+ crc = __crc32cd(crc, 0);
+
+ /* return computed value, to prevent the above being optimized away */
+ return crc == 0;
+}
+'''
+
+ if cc.links(prog, name: '__crc32cb, __crc32ch, __crc32cw, and __crc32cd without -march=armv8-a+crc',
+ args: g_c_args)
+ cdata.set('USE_ARMV8_CRC32C', true)
+ have_optimized_crc = true
+ elif cc.links(prog, name: '__crc32cb, __crc32ch, __crc32cw, and __crc32cd with -march=armv8-a+crc',
+ args: g_c_args + ['-march=armv8-a+crc'])
+ cflags_crc += '-march=armv8-a+crc'
+ cdata.set('USE_ARMV8_CRC32C', false)
+ cdata.set('USE_ARMV8_CRC32C_WITH_RUNTIME_CHECK', 1)
+ have_optimized_crc = true
+ endif
+endif
+
+if not have_optimized_crc
+ cdata.set('USE_SLICING_BY_8_CRC32C', 1)
+endif
+
+
+
+###############################################################
+# Library / OS tests
+###############################################################
+
+header_checks = [
+ ['atomic.h'],
+ ['stdbool.h'],
+ ['copyfile.h'],
+ ['execinfo.h'],
+ ['getopt.h'],
+ ['ifaddrs.h'],
+ ['langinfo.h'],
+ ['mbarrier.h'],
+ ['poll.h'],
+ ['sys/epoll.h'],
+ ['sys/event.h'],
+ ['sys/ipc.h'],
+ ['sys/prctl.h'],
+ ['sys/procctl.h'],
+ ['sys/pstat.h'],
+ ['sys/resource.h'],
+ ['sys/select.h'],
+ ['sys/sem.h'],
+ ['sys/shm.h'],
+ ['sys/sockio.h'],
+ ['sys/tas.h'],
+ ['sys/uio.h'],
+ ['sys/un.h'],
+ ['termios.h'],
+ ['ucred.h'],
+ # FIXME: openbsd workaround
+ ['sys/ucred.h'],
+ ['wctype.h'],
+ ['netinet/tcp.h'],
+ ['net/if.h'],
+ ['crtdefs.h'],
+]
+
+foreach c : header_checks
+ varname = 'HAVE_'+c.get(0).underscorify().to_upper()
+
+ # Emulate autoconf behaviour of not-found->undef, found->1
+ found = cc.has_header(c.get(0), include_directories: g_inc, args: g_c_args)
+ cdata.set(varname, found ? 1 : false,
+            description: 'Define to 1 if you have the <@0@> header file.'.format(c.get(0)))
+endforeach
+
+
+
+decl_checks = [
+ ['F_FULLFSYNC', 'fcntl.h'],
+ ['RTLD_GLOBAL', 'dlfcn.h'],
+ ['RTLD_NOW', 'dlfcn.h'],
+ ['fdatasync', 'unistd.h'],
+ ['posix_fadvise', 'fcntl.h'],
+ ['sigwait', 'signal.h'],
+ ['strlcat', 'string.h'],
+ ['strlcpy', 'string.h'],
+ ['strnlen', 'string.h'],
+ ['strsignal', 'string.h'],
+ ['strtoll', 'stdlib.h'], ['strtoull', 'stdlib.h'], # strto[u]ll may exist but not be declared
+]
+
+# Need to check for function declarations for these functions, because
+# checking for library symbols wouldn't handle deployment target
+# restrictions on macOS
+decl_checks += [
+ ['preadv', 'sys/uio.h'],
+ ['pwritev', 'sys/uio.h'],
+]
+
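+# Unlike the HAVE_xxx_H macros above, the HAVE_DECL_xxx macros are always
+# defined, to either 1 or 0 (hence set10), matching autoconf's AC_CHECK_DECLS.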
+foreach c : decl_checks
+ varname = 'HAVE_DECL_'+c.get(0).underscorify().to_upper()
+
+ found = cc.has_header_symbol(c.get(1), c.get(0), args: g_c_args, kwargs: c.get(2, {}))
+ cdata.set10(varname, found, description:
+'''Define to 1 if you have the declaration of `@0@', and to 0 if you
+   don't.'''.format(c.get(0)))
+endforeach
+
+
+
+# Pass -DHAVE_SYS_UCRED_H down explicitly, as the config header recording the
+# result of the earlier check doesn't exist yet.
+if cc.has_type('struct cmsgcred',
+    include_directories: g_inc,
+    args: g_c_args + ['@0@'.format(cdata.get('HAVE_SYS_UCRED_H')) != 'false' ? '-DHAVE_SYS_UCRED_H' : ''],
+ prefix: '''
+#include <sys/socket.h>
+#include <sys/param.h>
+#ifdef HAVE_SYS_UCRED_H
+#include <sys/ucred.h>
+#endif''')
+ cdata.set('HAVE_STRUCT_CMSGCRED', 1)
+else
+ cdata.set('HAVE_STRUCT_CMSGCRED', false)
+endif
+
+if cc.has_function('getopt', args: g_c_args) and \
+ cc.has_function('getopt_long', args: g_c_args) and \
+ cc.has_type('struct option', args: g_c_args, prefix: '#include <getopt.h>')
+ cdata.set('HAVE_GETOPT', 1)
+ cdata.set('HAVE_GETOPT_LONG', 1)
+ cdata.set('HAVE_STRUCT_OPTION', 1)
+else
+ warning('not yet implemented')
+endif
+
+
+foreach c : ['opterr', 'optreset']
+ varname = 'HAVE_INT_'+c.underscorify().to_upper()
+
+ if cc.links('''
+#include <unistd.h>
+int main(void)
+{
+ extern int @0@;
+ @0@ = 1;
+}
+'''.format(c), name: c, args: g_c_args)
+ cdata.set(varname, 1)
+ else
+ cdata.set(varname, false)
+ endif
+endforeach
+
+
+if cc.has_type('struct sockaddr_storage', args: g_c_args, prefix: '''
+#include <sys/types.h>
+#include <sys/socket.h>''')
+ cdata.set('HAVE_STRUCT_SOCKADDR_STORAGE', 1)
+endif
+
+if cc.has_member('struct sockaddr_storage', 'ss_family', args: g_c_args,
+ prefix: '''#include <sys/types.h>
+#include <sys/socket.h>''')
+ cdata.set('HAVE_STRUCT_SOCKADDR_STORAGE_SS_FAMILY', 1)
+endif
+
+if cc.has_member('struct sockaddr_storage', '__ss_family', args: g_c_args,
+ prefix: '''
+#include <sys/types.h>
+#include <sys/socket.h>''')
+ cdata.set('HAVE_STRUCT_SOCKADDR_STORAGE___SS_FAMILY', 1)
+endif
+
+if cc.has_type('struct sockaddr_un', args: g_c_args, prefix: '''
+#include <sys/types.h>
+#include <sys/un.h>''')
+ cdata.set('HAVE_STRUCT_SOCKADDR_UN', 1)
+endif
+
+if cc.has_type('struct addrinfo', args: g_c_args, prefix: '''
+#include <sys/types.h>
+#include <sys/socket.h>
+#include <netdb.h>
+''')
+ cdata.set('HAVE_STRUCT_ADDRINFO', 1)
+endif
+
+if host_machine.system() == 'windows'
+ cdata.set('HAVE_STRUCT_SOCKADDR_STORAGE', 1)
+ cdata.set('HAVE_STRUCT_SOCKADDR_STORAGE_SS_FAMILY', 1)
+endif
+
+if cc.has_type('struct sockaddr_in6', args: g_c_args, prefix: '''
+#include <netinet/in.h>''')
+ cdata.set('HAVE_IPV6', 1)
+endif
+
+
+if cc.has_member('struct tm', 'tm_zone', args: g_c_args, prefix: '''
+#include <sys/types.h>
+#include <time.h>
+''')
+ cdata.set('HAVE_STRUCT_TM_TM_ZONE', 1)
+endif
+
+if cc.compiles('''
+#include <time.h>
+extern int foo(void);
+int foo(void)
+{
+ return timezone / 60;
+}
+''', name: 'Check if the global variable `timezone\' exists', args: g_c_args,)
+ cdata.set('HAVE_INT_TIMEZONE', 1)
+else
+ cdata.set('HAVE_INT_TIMEZONE', false)
+endif
+
+# FIXME: sys/ipc.h, sys/sem.h includes were conditional
+if cc.has_type('union semun', args: g_c_args, prefix: '''
+#include <sys/types.h>
+#include <sys/ipc.h>
+#include <sys/sem.h>
+''')
+ cdata.set('HAVE_UNION_SEMUN', 1)
+endif
+
+if cc.compiles('''
+#include <string.h>
+int main(void)
+{
+ char buf[100];
+ switch (strerror_r(1, buf, sizeof(buf)))
+ { case 0: break; default: break; }
+}''', args: g_c_args)
+ cdata.set('STRERROR_R_INT', 1)
+else
+ cdata.set('STRERROR_R_INT', false)
+endif
+
+# FIXME
+cdata.set('pg_restrict', '__restrict')
+
+# FIXME
+if host_machine.system() == 'windows'
+ cdata.set('ACCEPT_TYPE_ARG1', 'unsigned int')
+ cdata.set('ACCEPT_TYPE_ARG2', 'struct sockaddr *')
+ cdata.set('ACCEPT_TYPE_ARG3', 'int')
+ cdata.set('ACCEPT_TYPE_RETURN', 'unsigned int PASCAL')
+else
+ cdata.set('ACCEPT_TYPE_ARG1', 'int')
+ cdata.set('ACCEPT_TYPE_ARG2', 'struct sockaddr')
+ cdata.set('ACCEPT_TYPE_ARG3', 'socklen_t')
+ cdata.set('ACCEPT_TYPE_RETURN', 'int')
+endif
+
+cdata.set('HAVE_STRUCT_ADDRINFO', 1)
+
+
+cdata.set('MEMSET_LOOP_LIMIT', 1024)
+
+
+if cc.links('''
+#include <machine/vmparam.h>
+#include <sys/exec.h>
+
+int main(void)
+{
+ PS_STRINGS->ps_nargvstr = 1;
+ PS_STRINGS->ps_argvstr = "foo";
+}
+''',
+ name: 'PS_STRINGS', args: g_c_args)
+ cdata.set('HAVE_PS_STRINGS', 1)
+else
+ cdata.set('HAVE_PS_STRINGS', false)
+endif
+
+
+m_dep = cc.find_library('m', required : false)
+
+# Most libraries are included only if they demonstrably provide a function we
+# need, but libm is an exception: always include it, because there are too
+# many compilers that play cute optimization games that will break probes for
+# standard functions such as pow().
+os_deps = [m_dep]
+
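+# Libraries that may be needed by some of the functions probed below; each is
+# added to os_deps only if a check actually turns out to require it.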
+rt_dep = cc.find_library('rt', required : false)
+
+dl_dep = cc.find_library('dl', required : false)
+
+util_dep = cc.find_library('util', required : false)
+posix4_dep = cc.find_library('posix4', required : false)
+
+getopt_dep = cc.find_library('getopt', required : false)
+gnugetopt_dep = cc.find_library('gnugetopt', required : false)
+
+execinfo_dep = cc.find_library('execinfo', required : false)
+
+func_checks = [
+ ['_configthreadlocale'],
+ ['backtrace_symbols', {'dependencies': [execinfo_dep]}],
+ ['clock_gettime', {'dependencies': [rt_dep, posix4_dep]}],
+ ['copyfile'],
+ ['dlopen', {'dependencies': [dl_dep]}],
+ ['explicit_bzero'],
+ ['fdatasync', {'dependencies': [rt_dep, posix4_dep]}],
+ ['fls'],
+ ['getaddrinfo'],
+ ['gethostbyname_r', {'dependencies': [thread_dep]}],
+ ['getifaddrs'],
+ ['getopt', {'dependencies': [getopt_dep, gnugetopt_dep]}],
+ ['getopt_long',{'dependencies': [getopt_dep, gnugetopt_dep]}],
+ ['getpeereid'],
+ ['getpeerucred'],
+ ['getpwuid_r', {'dependencies': [thread_dep]}],
+ ['getrlimit'],
+ ['getrusage'],
+ ['gettimeofday'], # XXX: This seems to be in the autoconf case
+ ['inet_aton'],
+ ['kqueue'],
+ ['link'],
+ ['mbstowcs_l'],
+ ['memset_s'],
+ ['mkdtemp'],
+ ['poll'],
+ ['posix_fadvise'],
+ ['posix_fallocate'],
+ ['ppoll'],
+ ['pread'],
+ ['pstat'],
+ ['pthread_is_threaded_np'],
+ ['pwrite'],
+ ['random'],
+ ['readlink'],
+ ['readv'],
+ ['setenv'], # FIXME: windows handling
+ ['setproctitle', {'dependencies': [util_dep]}],
+ ['setproctitle_fast'],
+ ['setsid'],
+ ['shm_open', {'dependencies': [rt_dep]}],
+ ['shm_unlink', {'dependencies': [rt_dep]}],
+ ['srandom'],
+ ['strchrnul'],
+ ['strerror_r', {'dependencies': [thread_dep]}],
+ ['strlcat'],
+ ['strlcpy'],
+ ['strnlen'],
+ ['strsignal'],
+ ['strtof'], # strsignal is checked separately
+ ['strtoll'], ['__strtoll'], ['strtoq'],
+ ['strtoull'], ['__strtoull'], ['strtouq'],
+ ['symlink'],
+ ['sync_file_range'],
+ ['syncfs'],
+ ['unsetenv'],
+ ['uselocale'],
+ ['wcstombs_l'],
+ ['writev'],
+]
+
+foreach c : func_checks
+ func = c.get(0)
+ kwargs = c.get(1, {})
+ deps = kwargs.get('dependencies', [])
+
+ varname = 'HAVE_'+func.underscorify().to_upper()
+
+ found = cc.has_function(func, args: g_c_args,
+ kwargs: kwargs + {'dependencies': []})
+
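+  # If not found in the default libraries, retry with each candidate library
+  # and remember the one that provides the function.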
+ if not found
+ foreach dep : deps
+ if not dep.found()
+ continue
+ endif
+ found = cc.has_function(func, args: g_c_args,
+ kwargs: kwargs + {'dependencies': [dep]})
+ if found
+ os_deps += dep
+ break
+ endif
+ endforeach
+ endif
+
+ # Emulate autoconf behaviour of not-found->undef, found->1
+ cdata.set(varname, found ? 1 : false,
+            description: 'Define to 1 if you have the `@0@\' function.'.format(func))
+endforeach
+
+
+
+
+
+if host_machine.system() == 'linux' or host_machine.system() == 'freebsd'
+ dlsuffix = '.so'
+elif host_machine.system() == 'darwin'
+ dlsuffix = '.dylib'
+elif host_machine.system() == 'windows'
+ dlsuffix = '.dll'
+else
+ error('not yet')
+endif
+
+cdata.set_quoted('DLSUFFIX', dlsuffix)
+
+if host_machine.system() == 'windows'
+ cdata.set('USE_WIN32_SEMAPHORES', 1)
+ cdata.set('USE_WIN32_SHARED_MEMORY', 1)
+elif host_machine.system() == 'darwin'
+ cdata.set('USE_SYSV_SEMAPHORES', 1)
+ cdata.set('USE_SYSV_SHARED_MEMORY', 1)
+else
+ cdata.set('USE_UNNAMED_POSIX_SEMAPHORES', 1)
+ cdata.set('USE_SYSV_SHARED_MEMORY', 1)
+endif
+
+
+if host_machine.system() == 'windows'
+ cdata.set('HAVE_IPV6', 1)
+ cdata.set('HAVE_SYMLINK', 1)
+ cdata.set('WIN32_STACK_RLIMIT', 4194304)
+ cdata.set('HAVE__CONFIGTHREADLOCALE', 1)
+endif
+
+if cc.get_id() == 'msvc'
+ add_project_link_arguments(
+ '/fixed:no',
+ '/dynamicbase',
+ '/nxcompat',
+ language : ['c', 'cpp'],
+ )
+endif
+
+if host_machine.system() == 'windows'
+ os_deps += cc.find_library('ws2_32', required: true)
+endif
+
+
+
+###############################################################
+# Threading
+###############################################################
+
+# Probably not worth implementing other cases anymore
+cdata.set('ENABLE_THREAD_SAFETY', 1)
+
+if thread_dep.found()
+ if cc.has_function('pthread_is_threaded_np', args: g_c_args, dependencies: [thread_dep])
+ cdata.set('HAVE_PTHREAD_IS_THREADED_NP', 1)
+ endif
+ if cc.has_function('pthread_barrier_wait', args: g_c_args, dependencies: [thread_dep])
+ cdata.set('HAVE_PTHREAD_BARRIER_WAIT', 1)
+ endif
+endif
+
+
+
+###############################################################
+# Build
+###############################################################
+
+# Collect a number of lists of things while recursing through the source
+# tree. Later steps then can use those.
+
+test_deps = []
+backend_targets = []
+
+
+# Lists of tests we later generate test() invocations for. The main reason
+# for collecting them this way, instead of sprinkling test() invocations
+# throughout the tree, is that the invocations would otherwise get too large.
+# A second benefit is that it would make it easier to generate data for
+# another test runner.
+tap_tests = []
+isolation_tests = []
+regress_tests = []
+
+
+# Default options for targets
+
+default_target_args = {
+ 'implicit_include_directories': false,
+ 'install': true,
+}
+
+default_lib_args = default_target_args + {
+ 'name_prefix': 'lib',
+}
+
+internal_lib_args = default_lib_args + {
+ 'build_by_default': false,
+ 'install': false,
+}
+
+default_mod_args = default_lib_args + {
+ 'name_prefix': '',
+}
+
+default_bin_args = default_target_args + {
+}
+
+if host_machine.system() == 'windows'
+ # nothing to do
+else
+ if host_machine.system() == 'darwin'
+ rpath_var = '@loader_path'
+ else
+ rpath_var = '$ORIGIN'
+ endif
+
+  # PG binaries might need to link to libpq; use a relative rpath to reference it
+ bin_to_lib = run_command(relpath,
+ get_option('bindir'), get_option('libdir'), check: true).stdout().strip()
+ default_bin_args += {'install_rpath': rpath_var / bin_to_lib}
+
+  # PG extension modules might need to link to libpq; use a relative rpath to
+  # reference it (often just .)
+  mod_to_lib = run_command(relpath,
+    get_option('libdir'), get_option('libdir'), check: true).stdout().strip()
+  default_mod_args += {'install_rpath': rpath_var / mod_to_lib}
+endif
+
+
+###
+### windows resources related stuff
+###
+
+rc_cdata = configuration_data()
+rc_cdata.set_quoted('ICO', meson.source_root() / 'src' / 'port' / 'win32.ico')
+
+# configuration_data objects are references, so copy via merge_from() rather
+# than plain assignment, which would leave both variables pointing at rc_cdata.
+rc_lib_cdata = configuration_data()
+rc_lib_cdata.merge_from(rc_cdata)
+rc_lib_cdata.set('VFT_TYPE', 'VFT_DLL')
+
+rc_bin_cdata = configuration_data()
+rc_bin_cdata.merge_from(rc_cdata)
+rc_bin_cdata.set('VFT_TYPE', 'VFT_APP')
+
+win32ver_rc_in = files('src/port/win32ver.rc.in')
+
+
+# First visit src/include - all targets creating headers are defined
+# within. That makes it easy to add the necessary dependencies for the
+# subsequent build steps.
+
+generated_headers = []
+generated_backend_headers = []
+
+postgres_inc = [include_directories('src/include')]
+
+if host_machine.system() == 'windows'
+ postgres_inc += include_directories('src/include/port/win32')
+
+ if cc.get_id() == 'msvc'
+ postgres_inc += include_directories('src/include/port/win32_msvc')
+ endif
+endif
+
+subdir('src/include')
+
+
+# Then through src/port and src/common, as most other things depend on them
+
+frontend_port_code = declare_dependency(
+ compile_args: ['-DFRONTEND'],
+ include_directories: [postgres_inc],
+ sources: [errcodes],
+ dependencies: os_deps,
+)
+
+backend_port_code = declare_dependency(
+ compile_args: ['-DBUILDING_DLL'],
+ include_directories: [postgres_inc],
+ sources: [errcodes],
+ dependencies: os_deps,
+)
+
+subdir('src/port')
+
+frontend_common_code = declare_dependency(
+ compile_args: ['-DFRONTEND'],
+ include_directories: [postgres_inc],
+ sources: generated_headers,
+ dependencies: os_deps,
+)
+
+backend_common_code = declare_dependency(
+ compile_args: ['-DBUILDING_DLL'],
+ include_directories: [postgres_inc],
+ sources: generated_headers,
+)
+
+subdir('src/common')
+
+frontend_shlib_code = declare_dependency(
+ compile_args: ['-DFRONTEND'],
+ include_directories: [postgres_inc],
+ link_with: [pgport_shlib, common_shlib],
+ sources: generated_headers,
+ dependencies: os_deps,
+)
+
+subdir('src/interfaces/libpq')
+subdir('src/fe_utils')
+
+frontend_code = declare_dependency(
+ compile_args: ['-DFRONTEND'],
+ include_directories: [postgres_inc],
+ link_with: [pgport_static, common_static, fe_utils],
+ sources: generated_headers,
+ dependencies: os_deps,
+)
+
+backend_code = declare_dependency(
+ compile_args: ['-DBUILDING_DLL'],
+ include_directories: [postgres_inc],
+ link_with: [],
+ sources: generated_headers + generated_backend_headers,
+ dependencies: [os_deps, ssl, lz4, icu, icu_i18n, ldap, gssapi, libxml, systemd],
+)
+
+# Note there's intentionally no dependency on pgport/common here - we want the
+# symbols from the main binary for extension modules, rather than the
+# extension linking separately to pgport/common.
+backend_mod_code = declare_dependency(
+ compile_args: [],
+ include_directories: [postgres_inc],
+ link_with: [],
+ sources: generated_headers + generated_backend_headers,
+ dependencies: [os_deps, ssl, lz4, icu, icu_i18n, ldap, gssapi, libxml, systemd],
+)
+
+# Then through the main sources. That way contrib can have dependencies on
+# main sources. Note that this intentionally doesn't recurse into src/test
+# yet; right now a few regression tests depend on contrib files, so src/test
+# is entered after contrib.
+
+subdir('src')
+
+subdir('contrib')
+
+subdir('src/test')
+
+subdir('doc/src/sgml')
+
+
+if host_machine.system() == 'darwin'
+ meson.add_install_script('src/tools/relativize_shared_library_references')
+endif
+
+
+
+###############################################################
+# Test prep
+###############################################################
+
+# The determination of where a DESTDIR install points to is ugly; it's
+# somewhat hard to combine two absolute paths portably...
+
+prefix = get_option('prefix')
+
+test_prefix = prefix
+
+if fs.is_absolute(get_option('prefix'))
+ if host_machine.system() == 'windows'
+ if prefix.split(':\\').length() == 1
+ # just a drive
+ test_prefix = ''
+ else
+ test_prefix = prefix.split(':\\')[1]
+ endif
+ else
+ test_prefix = prefix.substring(1)
+ endif
+endif
+
+# DESTDIR for the installation used to run tests in
+test_install_destdir = meson.build_root() / 'tmp_install/'
+# DESTDIR + prefix appropriately munged
+test_install_location = test_install_destdir / test_prefix
+
+
+test('tmp_install',
+ meson_bin, args: meson_args + ['install', '--quiet', '--only-changed', '--no-rebuild'],
+ env: {'DESTDIR':test_install_destdir},
+ priority: 100,
+ is_parallel: false,
+ suite: ['setup'])
+
+test_result_dir = meson.build_root() / 'testrun'
+
+
+# XXX: pg_regress doesn't assign unique ports on windows. To avoid the
+# inevitable conflicts from running tests in parallel, hackishly assign
+# different ports for different tests.
+
+testport=40000
+
+test_env = environment()
+
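+# Point the tests at the temporary installation and at the build products
+# they need to locate at run time.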
+test_env.prepend('PATH', test_install_location / get_option('bindir'))
+test_env.set('PG_REGRESS', meson.build_root() / 'src/test/regress/pg_regress')
+test_env.set('REGRESS_SHLIB', regress_module.full_path())
+
+
+
+###############################################################
+# Test Generation
+###############################################################
+
+# Define all 'pg_regress' style tests
+foreach t : regress_tests
+ test_command = [
+ pg_regress,
+ '--temp-instance', test_result_dir / t['name'] / 'pg_regress' / 'tmp_check',
+ '--inputdir', t['sd'],
+ '--outputdir', test_result_dir / t['name'] / 'pg_regress',
+ '--bindir', '',
+ '--dlpath', t['bd'],
+ '--max-concurrent-tests=20',
+ '--port=@0@'.format(testport),
+ ]
+
+ if t.has_key('regress_args')
+ test_command += t['regress_args']
+ endif
+
+ if t.has_key('schedule')
+ test_command += ['--schedule', t['schedule'],]
+ else
+ test_command += t['sql']
+ endif
+
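+  # Arguments for the testwrap wrapper come first (build root, test build
+  # dir, test name, test kind), followed by the actual pg_regress invocation.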
+ test_kwargs = {
+ 'suite': ['pg_regress', t['name']],
+ 'priority': 10,
+ 'timeout': 300,
+ 'depends': test_deps + t.get('deps', []),
+ 'env': test_env,
+ 'workdir': t['sd'],
+ 'args': [
+ meson.build_root(),
+ t['bd'],
+ t['name'],
+ 'pg_regress',
+ test_command,
+ ]
+ }
+
+ # Allow test definition to override arguments
+ if t.has_key('test_kwargs')
+ test_kwargs += t['test_kwargs']
+ endif
+
+ test(t['name'] / 'pg_regress',
+ testwrap,
+ kwargs: test_kwargs,
+ )
+
+ testport = testport + 1
+endforeach
+
+
+# Define all 'isolationtester' style tests
+foreach t : isolation_tests
+ test_command = [
+ pg_isolation_regress,
+ '--temp-instance', test_result_dir / t['name'] / 'isolation' / 'tmp_check',
+ '--inputdir', t['sd'],
+ '--outputdir', test_result_dir / t['name'] / 'isolation',
+ '--bindir', '',
+ '--dlpath', t['bd'],
+ '--max-concurrent-tests=20',
+ '--port=@0@'.format(testport),
+ ]
+
+ if t.has_key('regress_args')
+ test_command += t['regress_args']
+ endif
+
+ if t.has_key('schedule')
+ test_command += ['--schedule', t['schedule'],]
+ else
+ test_command += t['specs']
+ endif
+
+ test_kwargs = {
+ 'suite': ['isolation', t['name']],
+ 'priority': 20,
+ 'timeout': 300,
+ 'depends': test_deps + t.get('deps', []),
+ 'workdir': t['sd'],
+ 'env': test_env,
+ 'args': [
+ meson.build_root(),
+ t['bd'],
+ t['name'],
+ 'isolation',
+ test_command,
+ ]
+ }
+
+ # Allow test definition to override arguments
+ if t.has_key('test_kwargs')
+ test_kwargs += t['test_kwargs']
+ endif
+
+ test(t['name'] / 'isolation',
+ testwrap,
+ kwargs: test_kwargs,
+ )
+
+ testport = testport + 1
+endforeach
+
+
+# Define all 'tap' style tests
+# FIXME: dependencies for each test
+foreach t : tap_tests
+ env = test_env
+
+ foreach name, value : t.get('env', {})
+ if name == 'PATH'
+ # FIXME: manually setting PATH again, because repeated prepend didn't work
+ # before meson 0.58.
+ env.prepend('PATH', value, test_install_location / get_option('bindir'))
+ else
+ env.set(name, value)
+ endif
+ endforeach
+
+ foreach onetap : t['tests']
+ test(t['name'] / onetap,
+ testwrap,
+ workdir: t['sd'],
+ args: [
+ meson.build_root(),
+ t['bd'],
+ t['name'],
+ onetap,
+ 'perl',
+ '-I', meson.source_root() / 'src/test/perl',
+ '-I', t['sd'],
+ t['sd'] / onetap
+ ],
+ protocol: 'tap',
+ suite: ['tap', t['name']],
+ env: env,
+ depends: test_deps + t.get('deps', []),
+ timeout: 300,
+ )
+ endforeach
+endforeach
+
+
+
+###############################################################
+# Pseudo targets
+###############################################################
+
+alias_target('backend', backend_targets)
+
+
+
+###############################################################
+# The End, The End, My Friend
+###############################################################
+
+if meson.version().version_compare('>=0.57')
+
+ summary({
+ 'Data Block Size' : cdata.get('BLCKSZ'),
+ 'WAL Block Size' : cdata.get('XLOG_BLCKSZ')
+ }, section: 'Data Layout'
+ )
+
+ summary(
+ {
+ 'host system' : '@0@ @1@'.format(host_machine.system(), host_machine.cpu_family()),
+ 'build system' : '@0@ @1@'.format(build_machine.system(), build_machine.cpu_family()),
+ },
+ section: 'System'
+ )
+
+ summary(
+ {
+ 'linker': '@0@'.format(cc.get_linker_id()),
+ 'C compiler': '@0@ @1@'.format(cc.get_id(), cc.version()),
+ },
+ section: 'Compiler'
+ )
+
+ if llvm.found()
+ summary(
+ {
+ 'C++ compiler': '@0@ @1@'.format(cpp.get_id(), cpp.version())
+ },
+ section: 'Compiler')
+ endif
+
+ summary(
+ {
+ 'bison' : '@0@ @1@'.format(bison.full_path(), bison_version),
+ },
+ section: 'Programs'
+ )
+
+ summary(
+ {
+ 'GSS': gssapi,
+ 'LDAP': ldap,
+ 'LLVM': llvm,
+ 'icu': icu,
+ 'libxml': libxml,
+ 'libxslt': libxslt,
+ 'lz4': lz4,
+ 'perl': perl_dep,
+ 'python3': python3,
+ 'readline': readline,
+ 'selinux': selinux,
+ 'ssl': ssl,
+ 'systemd': systemd,
+ 'uuid': uuid,
+ 'zlib': zlib,
+ },
+ section: 'External Libraries'
+ )
+
+endif
diff --git a/meson_options.txt b/meson_options.txt
new file mode 100644
index 00000000000..d80d8fa5820
--- /dev/null
+++ b/meson_options.txt
@@ -0,0 +1,90 @@
+# Data layout influencing options
+option('BLCKSZ', type : 'combo', choices : ['1', '2', '4', '8', '16', '32'], value : '8',
+ description: 'set table block size in kB')
+
+
+# You get it
+option('cassert', type : 'boolean', value: false,
+ description: 'enable assertion checks (for debugging)')
+
+option('atomics', type : 'boolean', value: true,
+ description: 'whether to use atomic operations')
+
+
+# Compilation options
+
+option('extra_include_dirs', type : 'array',
+ description: 'non-default directories to be searched for headers')
+option('extra_lib_dirs', type : 'array',
+ description: 'non-default directories to be searched for libs')
+
+
+# External dependencies
+
+option('gssapi', type : 'feature', value: 'auto',
+ description: 'GSSAPI support')
+
+option('ldap', type : 'feature', value: 'auto',
+ description: 'LDAP support')
+
+option('llvm', type : 'feature', value: 'disabled',
+ description: 'whether to use llvm')
+
+option('icu', type : 'feature', value: 'auto',
+ description: 'ICU support')
+
+option('libxml', type : 'feature', value: 'auto',
+ description: 'XML support')
+
+option('libxslt', type : 'feature', value: 'auto',
+ description: 'XSLT support in contrib/xml2')
+
+option('lz4', type : 'feature', value: 'auto',
+ description: 'LZ4 support')
+
+option('perl', type : 'feature', value: 'auto',
+ description: 'build Perl modules (PL/Perl)')
+
+option('python', type : 'feature', value: 'auto',
+ description: 'build Python modules (PL/Python)')
+
+option('readline', type : 'feature', value : 'auto',
+ description: 'use GNU Readline or BSD Libedit for editing')
+
+option('selinux', type : 'feature', value : 'disabled',
+ description: 'build with SELinux support')
+
+option('ssl', type : 'combo', choices : ['none', 'openssl'], value : 'none',
+ description: 'use LIB for SSL/TLS support (openssl)')
+
+option('systemd', type : 'feature', value: 'auto',
+ description: 'build with systemd support')
+
+option('uuid', type : 'combo', choices : ['none', 'bsd', 'e2fs', 'ossp'], value : 'none',
+ description: 'build contrib/uuid-ossp using LIB')
+
+option('zlib', type : 'feature', value: 'auto',
+ description: 'whether to use zlib')
+
+
+# Programs
+option('BISON', type : 'string', value: 'bison',
+ description: 'path to bison binary')
+
+option('FLEX', type : 'string', value: 'flex',
+ description: 'path to flex binary')
+
+option('GZIP', type : 'string', value: 'gzip',
+ description: 'path to gzip binary')
+
+option('PERL', type : 'string', value: 'perl',
+ description: 'path to perl binary')
+
+option('PROVE', type : 'string', value: 'prove',
+ description: 'path to prove binary')
+
+option('SED', type : 'string', value: 'gsed',
+ description: 'path to sed binary')
+
+option('TAR', type : 'string', value: 'tar',
+ description: 'path to tar binary')
diff --git a/src/backend/access/brin/meson.build b/src/backend/access/brin/meson.build
new file mode 100644
index 00000000000..a54c7532927
--- /dev/null
+++ b/src/backend/access/brin/meson.build
@@ -0,0 +1,12 @@
+backend_sources += files(
+ 'brin.c',
+ 'brin_bloom.c',
+ 'brin_inclusion.c',
+ 'brin_minmax.c',
+ 'brin_minmax_multi.c',
+ 'brin_pageops.c',
+ 'brin_revmap.c',
+ 'brin_tuple.c',
+ 'brin_validate.c',
+ 'brin_xlog.c',
+)
diff --git a/src/backend/access/common/meson.build b/src/backend/access/common/meson.build
new file mode 100644
index 00000000000..857beaa32d3
--- /dev/null
+++ b/src/backend/access/common/meson.build
@@ -0,0 +1,18 @@
+backend_sources += files(
+ 'attmap.c',
+ 'bufmask.c',
+ 'detoast.c',
+ 'heaptuple.c',
+ 'indextuple.c',
+ 'printsimple.c',
+ 'printtup.c',
+ 'relation.c',
+ 'reloptions.c',
+ 'scankey.c',
+ 'session.c',
+ 'syncscan.c',
+ 'toast_compression.c',
+ 'toast_internals.c',
+ 'tupconvert.c',
+ 'tupdesc.c',
+)
diff --git a/src/backend/access/gin/meson.build b/src/backend/access/gin/meson.build
new file mode 100644
index 00000000000..56d6f343d54
--- /dev/null
+++ b/src/backend/access/gin/meson.build
@@ -0,0 +1,17 @@
+backend_sources += files(
+ 'ginarrayproc.c',
+ 'ginbtree.c',
+ 'ginbulk.c',
+ 'gindatapage.c',
+ 'ginentrypage.c',
+ 'ginfast.c',
+ 'ginget.c',
+ 'gininsert.c',
+ 'ginlogic.c',
+ 'ginpostinglist.c',
+ 'ginscan.c',
+ 'ginutil.c',
+ 'ginvacuum.c',
+ 'ginvalidate.c',
+ 'ginxlog.c',
+)
diff --git a/src/backend/access/gist/meson.build b/src/backend/access/gist/meson.build
new file mode 100644
index 00000000000..1a996b5e25d
--- /dev/null
+++ b/src/backend/access/gist/meson.build
@@ -0,0 +1,13 @@
+backend_sources += files(
+ 'gist.c',
+ 'gistbuild.c',
+ 'gistbuildbuffers.c',
+ 'gistget.c',
+ 'gistproc.c',
+ 'gistscan.c',
+ 'gistsplit.c',
+ 'gistutil.c',
+ 'gistvacuum.c',
+ 'gistvalidate.c',
+ 'gistxlog.c',
+)
diff --git a/src/backend/access/hash/meson.build b/src/backend/access/hash/meson.build
new file mode 100644
index 00000000000..22f2c691c34
--- /dev/null
+++ b/src/backend/access/hash/meson.build
@@ -0,0 +1,12 @@
+backend_sources += files(
+ 'hash.c',
+ 'hash_xlog.c',
+ 'hashfunc.c',
+ 'hashinsert.c',
+ 'hashovfl.c',
+ 'hashpage.c',
+ 'hashsearch.c',
+ 'hashsort.c',
+ 'hashutil.c',
+ 'hashvalidate.c',
+)
diff --git a/src/backend/access/heap/meson.build b/src/backend/access/heap/meson.build
new file mode 100644
index 00000000000..f1dca73743c
--- /dev/null
+++ b/src/backend/access/heap/meson.build
@@ -0,0 +1,11 @@
+backend_sources += files(
+ 'heapam.c',
+ 'heapam_handler.c',
+ 'heapam_visibility.c',
+ 'heaptoast.c',
+ 'hio.c',
+ 'pruneheap.c',
+ 'rewriteheap.c',
+ 'vacuumlazy.c',
+ 'visibilitymap.c',
+)
diff --git a/src/backend/access/index/meson.build b/src/backend/access/index/meson.build
new file mode 100644
index 00000000000..18af5533e65
--- /dev/null
+++ b/src/backend/access/index/meson.build
@@ -0,0 +1,6 @@
+backend_sources += files(
+ 'amapi.c',
+ 'amvalidate.c',
+ 'genam.c',
+ 'indexam.c',
+)
diff --git a/src/backend/access/meson.build b/src/backend/access/meson.build
new file mode 100644
index 00000000000..9874291fc0a
--- /dev/null
+++ b/src/backend/access/meson.build
@@ -0,0 +1,13 @@
+subdir('brin')
+subdir('common')
+subdir('gin')
+subdir('gist')
+subdir('hash')
+subdir('heap')
+subdir('index')
+subdir('nbtree')
+subdir('rmgrdesc')
+subdir('spgist')
+subdir('table')
+subdir('tablesample')
+subdir('transam')
diff --git a/src/backend/access/nbtree/meson.build b/src/backend/access/nbtree/meson.build
new file mode 100644
index 00000000000..07dc29e8190
--- /dev/null
+++ b/src/backend/access/nbtree/meson.build
@@ -0,0 +1,13 @@
+backend_sources += files(
+ 'nbtcompare.c',
+ 'nbtdedup.c',
+ 'nbtinsert.c',
+ 'nbtpage.c',
+ 'nbtree.c',
+ 'nbtsearch.c',
+ 'nbtsort.c',
+ 'nbtsplitloc.c',
+ 'nbtutils.c',
+ 'nbtvalidate.c',
+ 'nbtxlog.c',
+)
diff --git a/src/backend/access/rmgrdesc/meson.build b/src/backend/access/rmgrdesc/meson.build
new file mode 100644
index 00000000000..f3a6e0a571b
--- /dev/null
+++ b/src/backend/access/rmgrdesc/meson.build
@@ -0,0 +1,26 @@
+# used by frontend programs like pg_waldump
+rmgr_desc_sources = files(
+ 'brindesc.c',
+ 'clogdesc.c',
+ 'committsdesc.c',
+ 'dbasedesc.c',
+ 'genericdesc.c',
+ 'gindesc.c',
+ 'gistdesc.c',
+ 'hashdesc.c',
+ 'heapdesc.c',
+ 'logicalmsgdesc.c',
+ 'mxactdesc.c',
+ 'nbtdesc.c',
+ 'relmapdesc.c',
+ 'replorigindesc.c',
+ 'seqdesc.c',
+ 'smgrdesc.c',
+ 'spgdesc.c',
+ 'standbydesc.c',
+ 'tblspcdesc.c',
+ 'xactdesc.c',
+ 'xlogdesc.c',
+)
+
+backend_sources += rmgr_desc_sources
diff --git a/src/backend/access/spgist/meson.build b/src/backend/access/spgist/meson.build
new file mode 100644
index 00000000000..f18d0d2e53f
--- /dev/null
+++ b/src/backend/access/spgist/meson.build
@@ -0,0 +1,13 @@
+backend_sources += files(
+ 'spgdoinsert.c',
+ 'spginsert.c',
+ 'spgkdtreeproc.c',
+ 'spgproc.c',
+ 'spgquadtreeproc.c',
+ 'spgscan.c',
+ 'spgtextproc.c',
+ 'spgutils.c',
+ 'spgvacuum.c',
+ 'spgvalidate.c',
+ 'spgxlog.c',
+)
diff --git a/src/backend/access/table/meson.build b/src/backend/access/table/meson.build
new file mode 100644
index 00000000000..66c706d640e
--- /dev/null
+++ b/src/backend/access/table/meson.build
@@ -0,0 +1,6 @@
+backend_sources += files(
+ 'table.c',
+ 'tableam.c',
+ 'tableamapi.c',
+ 'toast_helper.c',
+)
diff --git a/src/backend/access/tablesample/meson.build b/src/backend/access/tablesample/meson.build
new file mode 100644
index 00000000000..63ee8203226
--- /dev/null
+++ b/src/backend/access/tablesample/meson.build
@@ -0,0 +1,5 @@
+backend_sources += files(
+ 'bernoulli.c',
+ 'system.c',
+ 'tablesample.c',
+)
diff --git a/src/backend/access/transam/meson.build b/src/backend/access/transam/meson.build
new file mode 100644
index 00000000000..fe3703e0f21
--- /dev/null
+++ b/src/backend/access/transam/meson.build
@@ -0,0 +1,28 @@
+backend_sources += files(
+ 'clog.c',
+ 'commit_ts.c',
+ 'generic_xlog.c',
+ 'multixact.c',
+ 'parallel.c',
+ 'rmgr.c',
+ 'slru.c',
+ 'subtrans.c',
+ 'timeline.c',
+ 'transam.c',
+ 'twophase.c',
+ 'twophase_rmgr.c',
+ 'varsup.c',
+ 'xact.c',
+ 'xlog.c',
+ 'xlogarchive.c',
+ 'xlogfuncs.c',
+ 'xloginsert.c',
+ 'xlogutils.c',
+)
+
+# used by frontend programs to build a frontend xlogreader
+xlogreader_sources = files(
+ 'xlogreader.c',
+)
+
+backend_sources += xlogreader_sources
diff --git a/src/backend/bootstrap/meson.build b/src/backend/bootstrap/meson.build
new file mode 100644
index 00000000000..55c0be68cc4
--- /dev/null
+++ b/src/backend/bootstrap/meson.build
@@ -0,0 +1,12 @@
+backend_sources += files(
+ 'bootstrap.c')
+
+bootscanner = custom_target('bootscanner',
+ input: ['bootscanner.l'],
+ output: ['bootscanner.c'],
+ command: [flex, '-o', '@OUTPUT@', '@INPUT@'])
+
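+# bootscanner is compiled as part of bootparse; the generated scanner is
+# listed as an additional input only to ensure it is generated before
+# bootparse.c.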
+generated_backend_sources += custom_target('bootparse',
+ input: ['bootparse.y', bootscanner[0]],
+ output: ['bootparse.c'],
+ command: [bison, bisonflags, '-o', '@OUTPUT@', '@INPUT0@'])
diff --git a/src/backend/catalog/meson.build b/src/backend/catalog/meson.build
new file mode 100644
index 00000000000..2cc23582e35
--- /dev/null
+++ b/src/backend/catalog/meson.build
@@ -0,0 +1,41 @@
+backend_sources += files(
+ 'aclchk.c',
+ 'catalog.c',
+ 'dependency.c',
+ 'heap.c',
+ 'index.c',
+ 'indexing.c',
+ 'namespace.c',
+ 'objectaccess.c',
+ 'objectaddress.c',
+ 'partition.c',
+ 'pg_aggregate.c',
+ 'pg_cast.c',
+ 'pg_class.c',
+ 'pg_collation.c',
+ 'pg_constraint.c',
+ 'pg_conversion.c',
+ 'pg_db_role_setting.c',
+ 'pg_depend.c',
+ 'pg_enum.c',
+ 'pg_inherits.c',
+ 'pg_largeobject.c',
+ 'pg_namespace.c',
+ 'pg_operator.c',
+ 'pg_proc.c',
+ 'pg_publication.c',
+ 'pg_range.c',
+ 'pg_shdepend.c',
+ 'pg_subscription.c',
+ 'pg_type.c',
+ 'storage.c',
+ 'toasting.c',
+)
+
+
+install_data(
+ 'information_schema.sql',
+ 'sql_features.txt',
+ 'system_functions.sql',
+ 'system_views.sql',
+ install_dir: 'share/')
diff --git a/src/backend/commands/meson.build b/src/backend/commands/meson.build
new file mode 100644
index 00000000000..8e73b29a263
--- /dev/null
+++ b/src/backend/commands/meson.build
@@ -0,0 +1,50 @@
+backend_sources += files(
+ 'aggregatecmds.c',
+ 'alter.c',
+ 'amcmds.c',
+ 'analyze.c',
+ 'async.c',
+ 'cluster.c',
+ 'collationcmds.c',
+ 'comment.c',
+ 'constraint.c',
+ 'conversioncmds.c',
+ 'copy.c',
+ 'copyfrom.c',
+ 'copyfromparse.c',
+ 'copyto.c',
+ 'createas.c',
+ 'dbcommands.c',
+ 'define.c',
+ 'discard.c',
+ 'dropcmds.c',
+ 'event_trigger.c',
+ 'explain.c',
+ 'extension.c',
+ 'foreigncmds.c',
+ 'functioncmds.c',
+ 'indexcmds.c',
+ 'lockcmds.c',
+ 'matview.c',
+ 'opclasscmds.c',
+ 'operatorcmds.c',
+ 'policy.c',
+ 'portalcmds.c',
+ 'prepare.c',
+ 'proclang.c',
+ 'publicationcmds.c',
+ 'schemacmds.c',
+ 'seclabel.c',
+ 'sequence.c',
+ 'statscmds.c',
+ 'subscriptioncmds.c',
+ 'tablecmds.c',
+ 'tablespace.c',
+ 'trigger.c',
+ 'tsearchcmds.c',
+ 'typecmds.c',
+ 'user.c',
+ 'vacuum.c',
+ 'variable.c',
+ 'view.c',
+)
diff --git a/src/backend/executor/meson.build b/src/backend/executor/meson.build
new file mode 100644
index 00000000000..518674cfa28
--- /dev/null
+++ b/src/backend/executor/meson.build
@@ -0,0 +1,67 @@
+backend_sources += files(
+ 'execAmi.c',
+ 'execAsync.c',
+ 'execCurrent.c',
+ 'execExpr.c',
+ 'execExprInterp.c',
+ 'execGrouping.c',
+ 'execIndexing.c',
+ 'execJunk.c',
+ 'execMain.c',
+ 'execParallel.c',
+ 'execPartition.c',
+ 'execProcnode.c',
+ 'execReplication.c',
+ 'execSRF.c',
+ 'execScan.c',
+ 'execTuples.c',
+ 'execUtils.c',
+ 'functions.c',
+ 'instrument.c',
+ 'nodeAgg.c',
+ 'nodeAppend.c',
+ 'nodeBitmapAnd.c',
+ 'nodeBitmapHeapscan.c',
+ 'nodeBitmapIndexscan.c',
+ 'nodeBitmapOr.c',
+ 'nodeCtescan.c',
+ 'nodeCustom.c',
+ 'nodeForeignscan.c',
+ 'nodeFunctionscan.c',
+ 'nodeGather.c',
+ 'nodeGatherMerge.c',
+ 'nodeGroup.c',
+ 'nodeHash.c',
+ 'nodeHashjoin.c',
+ 'nodeIncrementalSort.c',
+ 'nodeIndexonlyscan.c',
+ 'nodeIndexscan.c',
+ 'nodeLimit.c',
+ 'nodeLockRows.c',
+ 'nodeMaterial.c',
+ 'nodeMemoize.c',
+ 'nodeMergeAppend.c',
+ 'nodeMergejoin.c',
+ 'nodeModifyTable.c',
+ 'nodeNamedtuplestorescan.c',
+ 'nodeNestloop.c',
+ 'nodeProjectSet.c',
+ 'nodeRecursiveunion.c',
+ 'nodeResult.c',
+ 'nodeSamplescan.c',
+ 'nodeSeqscan.c',
+ 'nodeSetOp.c',
+ 'nodeSort.c',
+ 'nodeSubplan.c',
+ 'nodeSubqueryscan.c',
+ 'nodeTableFuncscan.c',
+ 'nodeTidrangescan.c',
+ 'nodeTidscan.c',
+ 'nodeUnique.c',
+ 'nodeValuesscan.c',
+ 'nodeWindowAgg.c',
+ 'nodeWorktablescan.c',
+ 'spi.c',
+ 'tqueue.c',
+ 'tstoreReceiver.c',
+)
diff --git a/src/backend/foreign/meson.build b/src/backend/foreign/meson.build
new file mode 100644
index 00000000000..57463db92c1
--- /dev/null
+++ b/src/backend/foreign/meson.build
@@ -0,0 +1,3 @@
+backend_sources += files(
+ 'foreign.c'
+)
diff --git a/src/backend/jit/llvm/meson.build b/src/backend/jit/llvm/meson.build
new file mode 100644
index 00000000000..83a90770bca
--- /dev/null
+++ b/src/backend/jit/llvm/meson.build
@@ -0,0 +1,41 @@
+if llvm.found()
+
+ llvmjit_sources = []
+
+ # Infrastructure
+ llvmjit_sources += files(
+ 'llvmjit.c',
+ 'llvmjit_error.cpp',
+ 'llvmjit_inline.cpp',
+ 'llvmjit_wrap.cpp',
+ )
+
+ # Code generation
+ llvmjit_sources += files(
+ 'llvmjit_deform.c',
+ 'llvmjit_expr.c',
+ )
+
+ llvmjit = shared_module('llvmjit',
+ llvmjit_sources,
+ kwargs: pg_mod_args + {
+ 'dependencies': pg_mod_args['dependencies'] + [llvm],
+ }
+ )
+
+ backend_targets += llvmjit
+
+  # Note this is intentionally not installed into bitcodedir, as it's not
+  # meant for inlining
+ llvmjit_types = custom_target('llvmjit_types.bc',
+ kwargs: llvm_irgen_kw + {
+ 'input': 'llvmjit_types.c',
+ 'output': 'llvmjit_types.bc',
+ 'depends': [postgres],
+ 'install': true,
+ 'install_dir': get_option('libdir')
+ }
+ )
+ backend_targets += llvmjit_types
+
+endif
diff --git a/src/backend/jit/meson.build b/src/backend/jit/meson.build
new file mode 100644
index 00000000000..63cd33a4bed
--- /dev/null
+++ b/src/backend/jit/meson.build
@@ -0,0 +1,3 @@
+backend_sources += files(
+ 'jit.c'
+)
diff --git a/src/backend/lib/meson.build b/src/backend/lib/meson.build
new file mode 100644
index 00000000000..53292563d34
--- /dev/null
+++ b/src/backend/lib/meson.build
@@ -0,0 +1,12 @@
+backend_sources += files(
+ 'binaryheap.c',
+ 'bipartite_match.c',
+ 'bloomfilter.c',
+ 'dshash.c',
+ 'hyperloglog.c',
+ 'ilist.c',
+ 'integerset.c',
+ 'knapsack.c',
+ 'pairingheap.c',
+ 'rbtree.c'
+)
diff --git a/src/backend/libpq/meson.build b/src/backend/libpq/meson.build
new file mode 100644
index 00000000000..49867647155
--- /dev/null
+++ b/src/backend/libpq/meson.build
@@ -0,0 +1,28 @@
+backend_sources += files(
+ 'auth-sasl.c',
+ 'auth-scram.c',
+ 'auth.c',
+ 'be-fsstubs.c',
+ 'be-secure-common.c',
+ 'be-secure.c',
+ 'crypt.c',
+ 'hba.c',
+ 'ifaddr.c',
+ 'pqcomm.c',
+ 'pqformat.c',
+ 'pqmq.c',
+ 'pqsignal.c',
+)
+
+if ssl.found()
+ backend_sources += files('be-secure-openssl.c')
+endif
+
+if gssapi.found()
+ backend_sources += files(
+ 'be-secure-gssapi.c',
+ 'be-gssapi-common.c'
+ )
+endif
+
+install_data('pg_hba.conf.sample', 'pg_ident.conf.sample', install_dir: 'share/')
diff --git a/src/backend/main/meson.build b/src/backend/main/meson.build
new file mode 100644
index 00000000000..241e125f089
--- /dev/null
+++ b/src/backend/main/meson.build
@@ -0,0 +1,2 @@
+main_file = files('main.c')
+backend_sources += main_file
diff --git a/src/backend/meson.build b/src/backend/meson.build
new file mode 100644
index 00000000000..0098411c6b2
--- /dev/null
+++ b/src/backend/meson.build
@@ -0,0 +1,197 @@
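+# Collect the backend's sources, libraries to link with, and dependencies
+# while recursing through the subdirectories; the postgres executable is
+# assembled from these further down.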
+backend_build_deps = [backend_code]
+backend_deps = [dl_dep, thread_dep]
+backend_sources = []
+backend_link_with = [pgport_srv, common_srv]
+backend_c_args = []
+
+generated_backend_sources = []
+
+subdir('access')
+subdir('bootstrap')
+subdir('catalog')
+subdir('commands')
+subdir('executor')
+subdir('foreign')
+subdir('jit')
+subdir('lib')
+subdir('libpq')
+subdir('main')
+subdir('nodes')
+subdir('optimizer')
+subdir('parser')
+subdir('partitioning')
+subdir('port')
+subdir('postmaster')
+subdir('regex')
+subdir('replication')
+subdir('rewrite')
+subdir('statistics')
+subdir('storage')
+subdir('tcop')
+subdir('tsearch')
+subdir('utils')
+
+
+postgres_link_args = []
+postgres_link_depends = []
+
+if host_machine.system() == 'windows'
+ backend_deps += cc.find_library('secur32', required: true)
+
+ if cc.get_id() == 'msvc'
+ postgres_link_args += '/STACK:@0@'.format(cdata.get('WIN32_STACK_RLIMIT'))
+ else
+ postgres_link_args += '-Wl,--stack,@0@'.format(cdata.get('WIN32_STACK_RLIMIT'))
+ endif
+endif
+
+
+# On Windows, when compiling with MSVC, we need to make postgres export all
+# its symbols so that extension libraries can use them. For that we need to
+# scan the constituent objects and generate a file marking all the functions
+# as exported (variables need an "import" declaration in the header, hence
+# PGDLLIMPORT, but functions work without that, due to import libraries
+# basically being trampolines).
+#
+# I haven't found an easy way to do this in meson so far. So we build a
+# static library with all the input objects, run our script to generate the
+# exports, and build the final executable using that static library.
+#
+# XXX: This needs to be improved.
+#
+
+# NB: There's an outer and an inner layer of == windows checks, to allow
+# exercising most of this on !windows by widening the outer "layer".
+
+if cc.get_id() == 'msvc' # or true
+
+ postgres_lib = static_library('postgres_lib',
+ backend_sources + timezone_sources + generated_backend_sources,
+ link_whole: backend_link_with,
+ c_pch: '../include/pch/postgres_pch.h',
+ c_args: backend_c_args,
+ implicit_include_directories: false,
+ dependencies: backend_build_deps,
+ build_by_default: false,
+ install: false,
+ )
+
+ postgres_def = custom_target('postgres.def',
+ command: [perl, files('../tools/msvc/gendef2.pl'), 'x64', '@OUTPUT@', '@PRIVATE_DIR@', '@INPUT@'],
+ input: [postgres_lib, common_srv, pgport_srv],
+ output: 'postgres.def',
+ depends: [postgres_lib, common_srv, pgport_srv],
+ install: false,
+ )
+
+ if cc.get_id() == 'msvc'
+ postgres_link_args += '/DEF:@0@'.format(postgres_def.full_path())
+ postgres_link_depends += postgres_def
+ endif
+
+ # Unfortunately the msvc linker whines when building an executable with just
+ # libraries, hence the reuse of the 'main' object directly.
+
+ postgres = executable('postgres',
+ objects: [postgres_lib.extract_objects(main_file)],
+ link_with: [postgres_lib],
+ link_args: postgres_link_args,
+ link_depends: postgres_link_depends,
+ dependencies: backend_deps,
+ export_dynamic: true,
+ implib: true,
+ kwargs: default_bin_args,
+ )
+
+else
+
+ postgres = executable('postgres',
+ backend_sources + generated_backend_sources + timezone_sources,
+ c_pch: '../include/pch/postgres_pch.h',
+ c_args: backend_c_args,
+ link_args: postgres_link_args,
+ link_with: backend_link_with,
+ export_dynamic: true,
+ dependencies: [backend_build_deps, backend_deps],
+ kwargs: default_bin_args,
+ )
+
+endif
+
+backend_targets += postgres
+
+pg_mod_args = default_mod_args + {
+ 'dependencies': [backend_mod_code],
+ 'c_args': [],
+ 'cpp_args': [],
+ }
+
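+# If the compiler supports symbol visibility attributes, build modules with
+# hidden default visibility, so that only explicitly exported symbols are
+# visible.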
+if cdata.has('HAVE_VISIBILITY_ATTRIBUTE')
+ pg_mod_args = pg_mod_args + {
+ 'c_args': pg_mod_args['c_args'] + ['-fvisibility=hidden'],
+    'cpp_args': pg_mod_args['cpp_args'] + ['-fvisibility=hidden', '-fvisibility-inlines-hidden'],
+ }
+endif
+
+# On Windows and macOS, shared modules are linked against the postgres
+# executable. To avoid unnecessary build-time dependencies on other operating
+# systems, only add the link when necessary.
+if host_machine.system() == 'windows' or host_machine.system() == 'darwin'
+ pg_mod_args = pg_mod_args + {'link_with': [postgres]}
+endif
+if host_machine.system() == 'darwin'
+ pg_mod_args = pg_mod_args + {'link_args': ['-bundle_loader', '@0@'.format(postgres.full_path())]}
+endif
+
+
+# Shared modules that, on some OSs, link against the server binary. Only enter
+# these after we defined the server build.
+
+subdir('jit/llvm')
+subdir('replication/libpqwalreceiver')
+subdir('replication/pgoutput')
+subdir('snowball')
+subdir('utils/mb/conversion_procs')
+
+
+if llvm.found()
+
+  # custom_target() insists on placing its output files in the current
+  # directory. But we have files with the same name in different
+  # subdirectories. generator()s don't have that problem, but their results
+  # are not installable. The irlink command copies the files for us.
+  #
+  # FIXME: this needs to be in a central place
+  #
+  # Neither generator() nor custom_target() accept a CustomTargetIndex as
+  # 'depends', nor do they like targets with more than one output. However, a
+  # custom_target() accepts them as input without a problem. So we have the
+  # below transitive target :(
+
+ transitive_depend_target = custom_target('stamp',
+ input: generated_headers + generated_backend_headers + generated_backend_sources,
+ output: 'stamp',
+ command: [touch, '@OUTPUT@'],
+ install: false)
+
+ llvm_gen = generator(llvm_irgen_command,
+ arguments: llvm_irgen_args + g_c_args,
+ depends: transitive_depend_target,
+ depfile: '@BASENAME@.c.bc.d',
+ output: ['@PLAINNAME@.bc']
+ )
+
+ bc_backend_sources = llvm_gen.process(backend_sources,
+ preserve_path_from: meson.current_source_dir())
+
+ postgres_llvm = custom_target('postgres.index.bc',
+ kwargs: llvm_irlink_kw + {
+ 'input': bc_backend_sources,
+ 'output': ['bitcode'],
+ },
+ )
+
+ backend_targets += postgres_llvm
+
+endif
diff --git a/src/backend/nodes/meson.build b/src/backend/nodes/meson.build
new file mode 100644
index 00000000000..9fca83fba44
--- /dev/null
+++ b/src/backend/nodes/meson.build
@@ -0,0 +1,17 @@
+backend_sources += files(
+ 'bitmapset.c',
+ 'copyfuncs.c',
+ 'equalfuncs.c',
+ 'extensible.c',
+ 'list.c',
+ 'makefuncs.c',
+ 'nodeFuncs.c',
+ 'nodes.c',
+ 'outfuncs.c',
+ 'params.c',
+ 'print.c',
+ 'read.c',
+ 'readfuncs.c',
+ 'tidbitmap.c',
+ 'value.c',
+)
diff --git a/src/backend/optimizer/geqo/meson.build b/src/backend/optimizer/geqo/meson.build
new file mode 100644
index 00000000000..c04f1dc2dfd
--- /dev/null
+++ b/src/backend/optimizer/geqo/meson.build
@@ -0,0 +1,17 @@
+backend_sources += files(
+ 'geqo_copy.c',
+ 'geqo_cx.c',
+ 'geqo_erx.c',
+ 'geqo_eval.c',
+ 'geqo_main.c',
+ 'geqo_misc.c',
+ 'geqo_mutation.c',
+ 'geqo_ox1.c',
+ 'geqo_ox2.c',
+ 'geqo_pmx.c',
+ 'geqo_pool.c',
+ 'geqo_px.c',
+ 'geqo_random.c',
+ 'geqo_recombination.c',
+ 'geqo_selection.c',
+)
diff --git a/src/backend/optimizer/meson.build b/src/backend/optimizer/meson.build
new file mode 100644
index 00000000000..1ab1d9934ae
--- /dev/null
+++ b/src/backend/optimizer/meson.build
@@ -0,0 +1,5 @@
+subdir('geqo')
+subdir('path')
+subdir('plan')
+subdir('prep')
+subdir('util')
diff --git a/src/backend/optimizer/path/meson.build b/src/backend/optimizer/path/meson.build
new file mode 100644
index 00000000000..310042e7aee
--- /dev/null
+++ b/src/backend/optimizer/path/meson.build
@@ -0,0 +1,11 @@
+backend_sources += files(
+ 'allpaths.c',
+ 'clausesel.c',
+ 'costsize.c',
+ 'equivclass.c',
+ 'indxpath.c',
+ 'joinpath.c',
+ 'joinrels.c',
+ 'pathkeys.c',
+ 'tidpath.c',
+)
diff --git a/src/backend/optimizer/plan/meson.build b/src/backend/optimizer/plan/meson.build
new file mode 100644
index 00000000000..22ec65a3845
--- /dev/null
+++ b/src/backend/optimizer/plan/meson.build
@@ -0,0 +1,10 @@
+backend_sources += files(
+ 'analyzejoins.c',
+ 'createplan.c',
+ 'initsplan.c',
+ 'planagg.c',
+ 'planmain.c',
+ 'planner.c',
+ 'setrefs.c',
+ 'subselect.c',
+)
diff --git a/src/backend/optimizer/prep/meson.build b/src/backend/optimizer/prep/meson.build
new file mode 100644
index 00000000000..4549a5b0e79
--- /dev/null
+++ b/src/backend/optimizer/prep/meson.build
@@ -0,0 +1,7 @@
+backend_sources += files(
+ 'prepagg.c',
+ 'prepjointree.c',
+ 'prepqual.c',
+ 'preptlist.c',
+ 'prepunion.c',
+)
diff --git a/src/backend/optimizer/util/meson.build b/src/backend/optimizer/util/meson.build
new file mode 100644
index 00000000000..e7ceaf566b5
--- /dev/null
+++ b/src/backend/optimizer/util/meson.build
@@ -0,0 +1,16 @@
+backend_sources += files(
+ 'appendinfo.c',
+ 'clauses.c',
+ 'inherit.c',
+ 'joininfo.c',
+ 'orclauses.c',
+ 'paramassign.c',
+ 'pathnode.c',
+ 'placeholder.c',
+ 'plancat.c',
+ 'predtest.c',
+ 'relnode.c',
+ 'restrictinfo.c',
+ 'tlist.c',
+ 'var.c',
+)
diff --git a/src/backend/parser/meson.build b/src/backend/parser/meson.build
new file mode 100644
index 00000000000..5ce4d09f31b
--- /dev/null
+++ b/src/backend/parser/meson.build
@@ -0,0 +1,43 @@
+backend_sources += files(
+ 'analyze.c',
+ 'parse_agg.c',
+ 'parse_clause.c',
+ 'parse_coerce.c',
+ 'parse_collate.c',
+ 'parse_cte.c',
+ 'parse_enr.c',
+ 'parse_expr.c',
+ 'parse_func.c',
+ 'parse_node.c',
+ 'parse_oper.c',
+ 'parse_param.c',
+ 'parse_relation.c',
+ 'parse_target.c',
+ 'parse_type.c',
+ 'parse_utilcmd.c',
+ 'scansup.c',
+)
+
+# Build a small utility static lib for the parser. This makes it easier for
+# most of the other code not to depend on gram.h already having been
+# generated (the other code only depends on the generated headers having been
+# generated). The generation of the parser is slow...
+
+parser_sources = [files('parser.c')]
+
+backend_scanner = custom_target('scan',
+ input: ['scan.l'],
+ output: ['scan.c'],
+ command: [flex, '-b', '-CF', '-p', '-p', '-o', '@OUTPUT@', '@INPUT0@'])
+parser_sources += backend_scanner[0]
+
+parser_sources += backend_parser_header[0]
+parser_sources += backend_parser_header[1]
+
+parser = static_library('parser',
+ parser_sources + generated_headers,
+ c_pch: '../../include/pch/c_pch.h',
+ dependencies: [backend_code],
+ kwargs: default_lib_args + {'install': false},
+)
+backend_link_with += parser
diff --git a/src/backend/partitioning/meson.build b/src/backend/partitioning/meson.build
new file mode 100644
index 00000000000..e5e3806a0cc
--- /dev/null
+++ b/src/backend/partitioning/meson.build
@@ -0,0 +1,5 @@
+backend_sources += files(
+ 'partbounds.c',
+ 'partdesc.c',
+ 'partprune.c',
+)
diff --git a/src/backend/port/meson.build b/src/backend/port/meson.build
new file mode 100644
index 00000000000..f1bf7f6d929
--- /dev/null
+++ b/src/backend/port/meson.build
@@ -0,0 +1,28 @@
+backend_sources += files(
+ 'atomics.c',
+)
+
+
+if cdata.has('USE_UNNAMED_POSIX_SEMAPHORES') or cdata.has('USE_NAMED_POSIX_SEMAPHORES')
+ backend_sources += files('posix_sema.c')
+endif
+
+if cdata.has('USE_SYSV_SEMAPHORES')
+ backend_sources += files('sysv_sema.c')
+endif
+
+if cdata.has('USE_WIN32_SEMAPHORES')
+ backend_sources += files('win32_sema.c')
+endif
+
+if cdata.has('USE_SYSV_SHARED_MEMORY')
+ backend_sources += files('sysv_shmem.c')
+endif
+
+if cdata.has('USE_WIN32_SHARED_MEMORY')
+ backend_sources += files('win32_shmem.c')
+endif
+
+if host_machine.system() == 'windows'
+ subdir('win32')
+endif
diff --git a/src/backend/port/win32/meson.build b/src/backend/port/win32/meson.build
new file mode 100644
index 00000000000..68fe4cc3cd0
--- /dev/null
+++ b/src/backend/port/win32/meson.build
@@ -0,0 +1,6 @@
+backend_sources += files(
+ 'crashdump.c',
+ 'signal.c',
+ 'socket.c',
+ 'timer.c',
+)
diff --git a/src/backend/postmaster/meson.build b/src/backend/postmaster/meson.build
new file mode 100644
index 00000000000..803405683e2
--- /dev/null
+++ b/src/backend/postmaster/meson.build
@@ -0,0 +1,15 @@
+backend_sources += files(
+ 'autovacuum.c',
+ 'auxprocess.c',
+ 'bgworker.c',
+ 'bgwriter.c',
+ 'checkpointer.c',
+ 'fork_process.c',
+ 'interrupt.c',
+ 'pgarch.c',
+ 'pgstat.c',
+ 'postmaster.c',
+ 'startup.c',
+ 'syslogger.c',
+ 'walwriter.c',
+)
diff --git a/src/backend/regex/meson.build b/src/backend/regex/meson.build
new file mode 100644
index 00000000000..d08e21cd6d6
--- /dev/null
+++ b/src/backend/regex/meson.build
@@ -0,0 +1,15 @@
+backend_sources += files(
+ 'regcomp.c',
+ 'regerror.c',
+ 'regexec.c',
+ 'regexport.c',
+ 'regfree.c',
+ 'regprefix.c'
+)
+
+#FIXME
+# mark inclusion dependencies between .c files explicitly
+#regcomp.o: regcomp.c regc_lex.c regc_color.c regc_nfa.c regc_cvec.c \
+# regc_locale.c regc_pg_locale.c
+#
+#regexec.o: regexec.c rege_dfa.c
diff --git a/src/backend/replication/libpqwalreceiver/meson.build b/src/backend/replication/libpqwalreceiver/meson.build
new file mode 100644
index 00000000000..3fc786c80a0
--- /dev/null
+++ b/src/backend/replication/libpqwalreceiver/meson.build
@@ -0,0 +1,13 @@
+libpqwalreceiver_sources = files(
+ 'libpqwalreceiver.c',
+)
+
+libpqwalreceiver = shared_module('pqwalreceiver',
+ libpqwalreceiver_sources,
+ kwargs: pg_mod_args + {
+ 'name_prefix': 'lib',
+ 'dependencies': pg_mod_args['dependencies'] + [libpq],
+ }
+)
+
+backend_targets += libpqwalreceiver
diff --git a/src/backend/replication/logical/meson.build b/src/backend/replication/logical/meson.build
new file mode 100644
index 00000000000..773583a12ba
--- /dev/null
+++ b/src/backend/replication/logical/meson.build
@@ -0,0 +1,14 @@
+backend_sources += files(
+ 'decode.c',
+ 'launcher.c',
+ 'logical.c',
+ 'logicalfuncs.c',
+ 'message.c',
+ 'origin.c',
+ 'proto.c',
+ 'relation.c',
+ 'reorderbuffer.c',
+ 'snapbuild.c',
+ 'tablesync.c',
+ 'worker.c',
+)
diff --git a/src/backend/replication/meson.build b/src/backend/replication/meson.build
new file mode 100644
index 00000000000..ee12c6d49da
--- /dev/null
+++ b/src/backend/replication/meson.build
@@ -0,0 +1,42 @@
+backend_sources += files(
+ 'backup_manifest.c',
+ 'basebackup.c',
+ 'slot.c',
+ 'slotfuncs.c',
+ 'syncrep.c',
+ 'walreceiver.c',
+ 'walreceiverfuncs.c',
+ 'walsender.c',
+)
+
+# [sync]repl_scanner is compiled as part of [sync]repl_gram. The
+# ordering is enforced by making the generation of the grammar depend on
+# the scanner generation. That's unnecessarily strict, but overall
+# harmless.
+
+repl_scanner = custom_target('repl_scanner',
+ input : files('repl_scanner.l'),
+ output : ['repl_scanner.c'],
+ command: [flex, '-o', '@OUTPUT@', '@INPUT@']
+)
+
+generated_backend_sources += custom_target('repl_gram',
+ input: 'repl_gram.y',
+ output: 'repl_gram.c',
+ depends: repl_scanner,
+ command: [bison, bisonflags, '-o', '@OUTPUT@', '@INPUT0@'])
+
+
+syncrep_scanner = custom_target('syncrep_scanner',
+ input: 'syncrep_scanner.l',
+ output: 'syncrep_scanner.c',
+ command: [flex, '-o', '@OUTPUT@', '@INPUT@'])
+
+generated_backend_sources += custom_target('syncrep_gram',
+ input: 'syncrep_gram.y',
+ output: 'syncrep_gram.c',
+ depends: syncrep_scanner,
+ command: [bison, bisonflags, '-o', '@OUTPUT@', '@INPUT0@'])
+
+
+subdir('logical')
diff --git a/src/backend/replication/pgoutput/meson.build b/src/backend/replication/pgoutput/meson.build
new file mode 100644
index 00000000000..8ff0a0c6133
--- /dev/null
+++ b/src/backend/replication/pgoutput/meson.build
@@ -0,0 +1,11 @@
+pgoutput_sources = files(
+ 'pgoutput.c',
+)
+
+pgoutput = shared_module('pgoutput',
+ pgoutput_sources,
+ kwargs: pg_mod_args + {
+ }
+)
+
+backend_targets += pgoutput
diff --git a/src/backend/rewrite/meson.build b/src/backend/rewrite/meson.build
new file mode 100644
index 00000000000..032e2e409b5
--- /dev/null
+++ b/src/backend/rewrite/meson.build
@@ -0,0 +1,9 @@
+backend_sources += files(
+ 'rewriteDefine.c',
+ 'rewriteHandler.c',
+ 'rewriteManip.c',
+ 'rewriteRemove.c',
+ 'rewriteSearchCycle.c',
+ 'rewriteSupport.c',
+ 'rowsecurity.c'
+)
diff --git a/src/backend/snowball/meson.build b/src/backend/snowball/meson.build
new file mode 100644
index 00000000000..b1e52e9a0c3
--- /dev/null
+++ b/src/backend/snowball/meson.build
@@ -0,0 +1,83 @@
+dict_snowball_sources = files(
+ 'dict_snowball.c',
+ 'libstemmer/api.c',
+ 'libstemmer/utilities.c',
+)
+
+dict_snowball_sources += files(
+ 'libstemmer/stem_ISO_8859_1_basque.c',
+ 'libstemmer/stem_ISO_8859_1_catalan.c',
+ 'libstemmer/stem_ISO_8859_1_danish.c',
+ 'libstemmer/stem_ISO_8859_1_dutch.c',
+ 'libstemmer/stem_ISO_8859_1_english.c',
+ 'libstemmer/stem_ISO_8859_1_finnish.c',
+ 'libstemmer/stem_ISO_8859_1_french.c',
+ 'libstemmer/stem_ISO_8859_1_german.c',
+ 'libstemmer/stem_ISO_8859_1_indonesian.c',
+ 'libstemmer/stem_ISO_8859_1_irish.c',
+ 'libstemmer/stem_ISO_8859_1_italian.c',
+ 'libstemmer/stem_ISO_8859_1_norwegian.c',
+ 'libstemmer/stem_ISO_8859_1_porter.c',
+ 'libstemmer/stem_ISO_8859_1_portuguese.c',
+ 'libstemmer/stem_ISO_8859_1_spanish.c',
+ 'libstemmer/stem_ISO_8859_1_swedish.c',
+ 'libstemmer/stem_ISO_8859_2_hungarian.c',
+ 'libstemmer/stem_ISO_8859_2_romanian.c',
+ 'libstemmer/stem_KOI8_R_russian.c',
+ 'libstemmer/stem_UTF_8_arabic.c',
+ 'libstemmer/stem_UTF_8_armenian.c',
+ 'libstemmer/stem_UTF_8_basque.c',
+ 'libstemmer/stem_UTF_8_catalan.c',
+ 'libstemmer/stem_UTF_8_danish.c',
+ 'libstemmer/stem_UTF_8_dutch.c',
+ 'libstemmer/stem_UTF_8_english.c',
+ 'libstemmer/stem_UTF_8_finnish.c',
+ 'libstemmer/stem_UTF_8_french.c',
+ 'libstemmer/stem_UTF_8_german.c',
+ 'libstemmer/stem_UTF_8_greek.c',
+ 'libstemmer/stem_UTF_8_hindi.c',
+ 'libstemmer/stem_UTF_8_hungarian.c',
+ 'libstemmer/stem_UTF_8_indonesian.c',
+ 'libstemmer/stem_UTF_8_irish.c',
+ 'libstemmer/stem_UTF_8_italian.c',
+ 'libstemmer/stem_UTF_8_lithuanian.c',
+ 'libstemmer/stem_UTF_8_nepali.c',
+ 'libstemmer/stem_UTF_8_norwegian.c',
+ 'libstemmer/stem_UTF_8_porter.c',
+ 'libstemmer/stem_UTF_8_portuguese.c',
+ 'libstemmer/stem_UTF_8_romanian.c',
+ 'libstemmer/stem_UTF_8_russian.c',
+ 'libstemmer/stem_UTF_8_serbian.c',
+ 'libstemmer/stem_UTF_8_spanish.c',
+ 'libstemmer/stem_UTF_8_swedish.c',
+ 'libstemmer/stem_UTF_8_tamil.c',
+ 'libstemmer/stem_UTF_8_turkish.c',
+ 'libstemmer/stem_UTF_8_yiddish.c',
+)
+
+# see comment in src/include/snowball/header.h
+stemmer_inc = include_directories('../../include/snowball')
+
+dict_snowball = shared_module('dict_snowball',
+ dict_snowball_sources,
+ c_pch: '../../include/pch/postgres_pch.h',
+ kwargs: pg_mod_args + {
+ 'include_directories': [stemmer_inc],
+ }
+)
+
+snowball_create = custom_target('snowball_create',
+ input: ['snowball_create.pl'],
+ output: ['snowball_create.sql'],
+ depfile: 'snowball_create.dep',
+ command: [perl, '@INPUT0@', '--input', '@CURRENT_SOURCE_DIR@', '--output', '@OUTDIR@'],
+ install: true,
+ install_dir: get_option('datadir'))
+
+# FIXME: check whether the logic to select languages currently in Makefile is needed
+install_subdir('stopwords',
+ install_dir: get_option('datadir') / 'tsearch_data',
+ strip_directory: true)
+
+backend_targets += dict_snowball
+backend_targets += snowball_create
diff --git a/src/backend/statistics/meson.build b/src/backend/statistics/meson.build
new file mode 100644
index 00000000000..8530c55f73c
--- /dev/null
+++ b/src/backend/statistics/meson.build
@@ -0,0 +1,6 @@
+backend_sources += files(
+ 'dependencies.c',
+ 'extended_stats.c',
+ 'mcv.c',
+ 'mvdistinct.c',
+)
diff --git a/src/backend/storage/buffer/meson.build b/src/backend/storage/buffer/meson.build
new file mode 100644
index 00000000000..56a59b52484
--- /dev/null
+++ b/src/backend/storage/buffer/meson.build
@@ -0,0 +1,7 @@
+backend_sources += files(
+ 'buf_init.c',
+ 'buf_table.c',
+ 'bufmgr.c',
+ 'freelist.c',
+ 'localbuf.c',
+)
diff --git a/src/backend/storage/file/meson.build b/src/backend/storage/file/meson.build
new file mode 100644
index 00000000000..e1d5047d4aa
--- /dev/null
+++ b/src/backend/storage/file/meson.build
@@ -0,0 +1,8 @@
+backend_sources += files(
+ 'buffile.c',
+ 'copydir.c',
+ 'fd.c',
+ 'fileset.c',
+ 'reinit.c',
+ 'sharedfileset.c',
+)
diff --git a/src/backend/storage/freespace/meson.build b/src/backend/storage/freespace/meson.build
new file mode 100644
index 00000000000..e4200ea6527
--- /dev/null
+++ b/src/backend/storage/freespace/meson.build
@@ -0,0 +1,5 @@
+backend_sources += files(
+ 'freespace.c',
+ 'fsmpage.c',
+ 'indexfsm.c',
+)
diff --git a/src/backend/storage/ipc/meson.build b/src/backend/storage/ipc/meson.build
new file mode 100644
index 00000000000..516bc1d0193
--- /dev/null
+++ b/src/backend/storage/ipc/meson.build
@@ -0,0 +1,20 @@
+backend_sources += files(
+ 'barrier.c',
+ 'dsm.c',
+ 'dsm_impl.c',
+ 'ipc.c',
+ 'ipci.c',
+ 'latch.c',
+ 'pmsignal.c',
+ 'procarray.c',
+ 'procsignal.c',
+ 'shm_mq.c',
+ 'shm_toc.c',
+ 'shmem.c',
+ 'shmqueue.c',
+ 'signalfuncs.c',
+ 'sinval.c',
+ 'sinvaladt.c',
+ 'standby.c',
+
+)
diff --git a/src/backend/storage/large_object/meson.build b/src/backend/storage/large_object/meson.build
new file mode 100644
index 00000000000..8a181ab9b34
--- /dev/null
+++ b/src/backend/storage/large_object/meson.build
@@ -0,0 +1,3 @@
+backend_sources += files(
+ 'inv_api.c',
+)
diff --git a/src/backend/storage/lmgr/meson.build b/src/backend/storage/lmgr/meson.build
new file mode 100644
index 00000000000..938e7f89894
--- /dev/null
+++ b/src/backend/storage/lmgr/meson.build
@@ -0,0 +1,18 @@
+backend_sources += files(
+ 'condition_variable.c',
+ 'deadlock.c',
+ 'lmgr.c',
+ 'lock.c',
+ 'lwlock.c',
+ 'predicate.c',
+ 'proc.c',
+ 's_lock.c',
+ 'spin.c',
+)
+
+lwlocknames_backend = custom_target('lwlocknames',
+ input : files('lwlocknames.txt'),
+ output : ['lwlocknames.c', 'lwlocknames.h'],
+ command : [perl, files('generate-lwlocknames.pl'), '-o', '@OUTDIR@', '@INPUT@']
+)
+generated_backend_sources += lwlocknames_backend[0]
diff --git a/src/backend/storage/meson.build b/src/backend/storage/meson.build
new file mode 100644
index 00000000000..daad628d74c
--- /dev/null
+++ b/src/backend/storage/meson.build
@@ -0,0 +1,9 @@
+subdir('buffer')
+subdir('file')
+subdir('freespace')
+subdir('ipc')
+subdir('large_object')
+subdir('lmgr')
+subdir('page')
+subdir('smgr')
+subdir('sync')
diff --git a/src/backend/storage/page/meson.build b/src/backend/storage/page/meson.build
new file mode 100644
index 00000000000..2ecd16c952c
--- /dev/null
+++ b/src/backend/storage/page/meson.build
@@ -0,0 +1,5 @@
+backend_sources += files(
+ 'bufpage.c',
+ 'checksum.c',
+ 'itemptr.c',
+)
diff --git a/src/backend/storage/smgr/meson.build b/src/backend/storage/smgr/meson.build
new file mode 100644
index 00000000000..fdeb1223b32
--- /dev/null
+++ b/src/backend/storage/smgr/meson.build
@@ -0,0 +1,4 @@
+backend_sources += files(
+ 'md.c',
+ 'smgr.c',
+)
diff --git a/src/backend/storage/sync/meson.build b/src/backend/storage/sync/meson.build
new file mode 100644
index 00000000000..05148b91a8e
--- /dev/null
+++ b/src/backend/storage/sync/meson.build
@@ -0,0 +1,4 @@
+backend_sources += files(
+ 'sync.c',
+
+)
diff --git a/src/backend/tcop/meson.build b/src/backend/tcop/meson.build
new file mode 100644
index 00000000000..fb54aae8122
--- /dev/null
+++ b/src/backend/tcop/meson.build
@@ -0,0 +1,8 @@
+backend_sources += files(
+ 'cmdtag.c',
+ 'dest.c',
+ 'fastpath.c',
+ 'postgres.c',
+ 'pquery.c',
+ 'utility.c',
+)
diff --git a/src/backend/tsearch/meson.build b/src/backend/tsearch/meson.build
new file mode 100644
index 00000000000..460036b6d4c
--- /dev/null
+++ b/src/backend/tsearch/meson.build
@@ -0,0 +1,21 @@
+backend_sources += files(
+ 'dict.c',
+ 'dict_ispell.c',
+ 'dict_simple.c',
+ 'dict_synonym.c',
+ 'dict_thesaurus.c',
+ 'regis.c',
+ 'spell.c',
+ 'to_tsany.c',
+ 'ts_locale.c',
+ 'ts_parse.c',
+ 'ts_selfuncs.c',
+ 'ts_typanalyze.c',
+ 'ts_utils.c',
+ 'wparser.c',
+ 'wparser_def.c',
+)
+
+install_subdir('dicts',
+ install_dir: get_option('datadir') / 'tsearch_data',
+ strip_directory: true)
diff --git a/src/backend/utils/activity/meson.build b/src/backend/utils/activity/meson.build
new file mode 100644
index 00000000000..cef26eb564b
--- /dev/null
+++ b/src/backend/utils/activity/meson.build
@@ -0,0 +1,5 @@
+backend_sources += files(
+ 'backend_progress.c',
+ 'backend_status.c',
+ 'wait_event.c',
+)
diff --git a/src/backend/utils/adt/meson.build b/src/backend/utils/adt/meson.build
new file mode 100644
index 00000000000..e1cea1eb4e4
--- /dev/null
+++ b/src/backend/utils/adt/meson.build
@@ -0,0 +1,118 @@
+backend_sources += files(
+ 'acl.c',
+ 'amutils.c',
+ 'array_expanded.c',
+ 'array_selfuncs.c',
+ 'array_typanalyze.c',
+ 'array_userfuncs.c',
+ 'arrayfuncs.c',
+ 'arraysubs.c',
+ 'arrayutils.c',
+ 'ascii.c',
+ 'bool.c',
+ 'cash.c',
+ 'char.c',
+ 'cryptohashfuncs.c',
+ 'date.c',
+ 'datetime.c',
+ 'datum.c',
+ 'dbsize.c',
+ 'domains.c',
+ 'encode.c',
+ 'enum.c',
+ 'expandeddatum.c',
+ 'expandedrecord.c',
+ 'float.c',
+ 'format_type.c',
+ 'formatting.c',
+ 'genfile.c',
+ 'geo_ops.c',
+ 'geo_selfuncs.c',
+ 'geo_spgist.c',
+ 'inet_cidr_ntop.c',
+ 'inet_net_pton.c',
+ 'int.c',
+ 'int8.c',
+ 'json.c',
+ 'jsonb.c',
+ 'jsonb_gin.c',
+ 'jsonb_op.c',
+ 'jsonb_util.c',
+ 'jsonfuncs.c',
+ 'jsonbsubs.c',
+ 'jsonpath.c',
+ 'jsonpath_exec.c',
+ 'like.c',
+ 'like_support.c',
+ 'lockfuncs.c',
+ 'mac.c',
+ 'mac8.c',
+ 'mcxtfuncs.c',
+ 'misc.c',
+ 'multirangetypes.c',
+ 'multirangetypes_selfuncs.c',
+ 'name.c',
+ 'network.c',
+ 'network_gist.c',
+ 'network_selfuncs.c',
+ 'network_spgist.c',
+ 'numeric.c',
+ 'numutils.c',
+ 'oid.c',
+ 'oracle_compat.c',
+ 'orderedsetaggs.c',
+ 'partitionfuncs.c',
+ 'pg_locale.c',
+ 'pg_lsn.c',
+ 'pg_upgrade_support.c',
+ 'pgstatfuncs.c',
+ 'pseudotypes.c',
+ 'quote.c',
+ 'rangetypes.c',
+ 'rangetypes_gist.c',
+ 'rangetypes_selfuncs.c',
+ 'rangetypes_spgist.c',
+ 'rangetypes_typanalyze.c',
+ 'regexp.c',
+ 'regproc.c',
+ 'ri_triggers.c',
+ 'rowtypes.c',
+ 'ruleutils.c',
+ 'selfuncs.c',
+ 'tid.c',
+ 'timestamp.c',
+ 'trigfuncs.c',
+ 'tsginidx.c',
+ 'tsgistidx.c',
+ 'tsquery.c',
+ 'tsquery_cleanup.c',
+ 'tsquery_gist.c',
+ 'tsquery_op.c',
+ 'tsquery_rewrite.c',
+ 'tsquery_util.c',
+ 'tsrank.c',
+ 'tsvector.c',
+ 'tsvector_op.c',
+ 'tsvector_parser.c',
+ 'uuid.c',
+ 'varbit.c',
+ 'varchar.c',
+ 'varlena.c',
+ 'version.c',
+ 'windowfuncs.c',
+ 'xid.c',
+ 'xid8funcs.c',
+ 'xml.c',
+)
+
+
+jsonpath_scan = custom_target('jsonpath_scan',
+ input: ['jsonpath_scan.l'],
+ output: ['jsonpath_scan.c'],
+ command: [flex, '-b', '-CF', '-p', '-p', '-o', '@OUTPUT@', '@INPUT@'])
+
+# jsonpath_scan is compiled as part of jsonpath_gram
+generated_backend_sources += custom_target('jsonpath_parse',
+ input: ['jsonpath_gram.y', jsonpath_scan[0]],
+ output: ['jsonpath_gram.c'],
+ command: [bison, bisonflags, '-o', '@OUTPUT@', '@INPUT0@'])
diff --git a/src/backend/utils/cache/meson.build b/src/backend/utils/cache/meson.build
new file mode 100644
index 00000000000..92972db52ad
--- /dev/null
+++ b/src/backend/utils/cache/meson.build
@@ -0,0 +1,16 @@
+backend_sources += files(
+ 'attoptcache.c',
+ 'catcache.c',
+ 'evtcache.c',
+ 'inval.c',
+ 'lsyscache.c',
+ 'partcache.c',
+ 'plancache.c',
+ 'relcache.c',
+ 'relfilenodemap.c',
+ 'relmapper.c',
+ 'spccache.c',
+ 'syscache.c',
+ 'ts_cache.c',
+ 'typcache.c',
+)
diff --git a/src/backend/utils/error/meson.build b/src/backend/utils/error/meson.build
new file mode 100644
index 00000000000..ff0ae388263
--- /dev/null
+++ b/src/backend/utils/error/meson.build
@@ -0,0 +1,4 @@
+backend_sources += files(
+ 'assert.c',
+ 'elog.c',
+ )
diff --git a/src/backend/utils/fmgr/meson.build b/src/backend/utils/fmgr/meson.build
new file mode 100644
index 00000000000..e545b424fd2
--- /dev/null
+++ b/src/backend/utils/fmgr/meson.build
@@ -0,0 +1,8 @@
+backend_sources += files(
+ 'dfmgr.c',
+ 'fmgr.c',
+ 'funcapi.c',
+)
+
+# fmgrtab.c
+generated_backend_sources += fmgrtab_target[2]
diff --git a/src/backend/utils/hash/meson.build b/src/backend/utils/hash/meson.build
new file mode 100644
index 00000000000..242e2f0ecdf
--- /dev/null
+++ b/src/backend/utils/hash/meson.build
@@ -0,0 +1,4 @@
+backend_sources += files(
+ 'dynahash.c',
+ 'pg_crc.c'
+)
diff --git a/src/backend/utils/init/meson.build b/src/backend/utils/init/meson.build
new file mode 100644
index 00000000000..ec9d72c3df1
--- /dev/null
+++ b/src/backend/utils/init/meson.build
@@ -0,0 +1,4 @@
+backend_sources += files(
+ 'globals.c',
+ 'miscinit.c',
+ 'postinit.c')
diff --git a/src/backend/utils/mb/conversion_procs/meson.build b/src/backend/utils/mb/conversion_procs/meson.build
new file mode 100644
index 00000000000..b84a78b6318
--- /dev/null
+++ b/src/backend/utils/mb/conversion_procs/meson.build
@@ -0,0 +1,38 @@
+encodings = {
+ 'cyrillic_and_mic': ['cyrillic_and_mic/cyrillic_and_mic.c'],
+ 'euc2004_sjis2004': ['euc2004_sjis2004/euc2004_sjis2004.c'],
+ 'euc_cn_and_mic': ['euc_cn_and_mic/euc_cn_and_mic.c'],
+ 'euc_jp_and_sjis': ['euc_jp_and_sjis/euc_jp_and_sjis.c'],
+ 'euc_kr_and_mic': ['euc_kr_and_mic/euc_kr_and_mic.c'],
+ 'euc_tw_and_big5': [
+ 'euc_tw_and_big5/euc_tw_and_big5.c',
+ 'euc_tw_and_big5/big5.c',
+ ],
+ 'latin2_and_win1250': ['latin2_and_win1250/latin2_and_win1250.c'],
+ 'latin_and_mic': ['latin_and_mic/latin_and_mic.c'],
+ 'utf8_and_big5': ['utf8_and_big5/utf8_and_big5.c'],
+ 'utf8_and_cyrillic': ['utf8_and_cyrillic/utf8_and_cyrillic.c'],
+ 'utf8_and_euc2004': ['utf8_and_euc2004/utf8_and_euc2004.c'],
+ 'utf8_and_euc_cn': ['utf8_and_euc_cn/utf8_and_euc_cn.c'],
+ 'utf8_and_euc_jp': ['utf8_and_euc_jp/utf8_and_euc_jp.c'],
+ 'utf8_and_euc_kr': ['utf8_and_euc_kr/utf8_and_euc_kr.c'],
+ 'utf8_and_euc_tw': ['utf8_and_euc_tw/utf8_and_euc_tw.c'],
+ 'utf8_and_gb18030': ['utf8_and_gb18030/utf8_and_gb18030.c'],
+ 'utf8_and_gbk': ['utf8_and_gbk/utf8_and_gbk.c'],
+ 'utf8_and_iso8859': ['utf8_and_iso8859/utf8_and_iso8859.c'],
+ 'utf8_and_iso8859_1': ['utf8_and_iso8859_1/utf8_and_iso8859_1.c'],
+ 'utf8_and_johab': ['utf8_and_johab/utf8_and_johab.c'],
+ 'utf8_and_sjis': ['utf8_and_sjis/utf8_and_sjis.c'],
+ 'utf8_and_sjis2004': ['utf8_and_sjis2004/utf8_and_sjis2004.c'],
+ 'utf8_and_uhc': ['utf8_and_uhc/utf8_and_uhc.c'],
+ 'utf8_and_win': ['utf8_and_win/utf8_and_win.c'],
+}
+
+foreach encoding, sources : encodings
+ backend_targets += shared_module(encoding,
+ sources,
+ kwargs: pg_mod_args + {
+ }
+ )
+
+endforeach
diff --git a/src/backend/utils/mb/meson.build b/src/backend/utils/mb/meson.build
new file mode 100644
index 00000000000..39e45638db0
--- /dev/null
+++ b/src/backend/utils/mb/meson.build
@@ -0,0 +1,9 @@
+backend_sources += files(
+ 'conv.c',
+ 'mbutils.c',
+ 'stringinfo_mb.c',
+ 'wstrcmp.c',
+ 'wstrncmp.c',
+)
+
+# Note we only enter conversion_procs once the backend build is defined
diff --git a/src/backend/utils/meson.build b/src/backend/utils/meson.build
new file mode 100644
index 00000000000..afb1c0346ba
--- /dev/null
+++ b/src/backend/utils/meson.build
@@ -0,0 +1,13 @@
+subdir('activity')
+subdir('adt')
+subdir('cache')
+subdir('error')
+subdir('fmgr')
+subdir('hash')
+subdir('init')
+subdir('mb')
+subdir('misc')
+subdir('mmgr')
+subdir('resowner')
+subdir('sort')
+subdir('time')
diff --git a/src/backend/utils/misc/meson.build b/src/backend/utils/misc/meson.build
new file mode 100644
index 00000000000..2c0090ad337
--- /dev/null
+++ b/src/backend/utils/misc/meson.build
@@ -0,0 +1,28 @@
+backend_sources += files(
+ 'help_config.c',
+ 'pg_config.c',
+ 'pg_controldata.c',
+ 'pg_rusage.c',
+ 'ps_status.c',
+ 'queryenvironment.c',
+ 'queryjumble.c',
+ 'rls.c',
+ 'sampling.c',
+ 'superuser.c',
+ 'timeout.c',
+ 'tzparser.c',
+ 'guc.c',
+)
+
+# guc-file.c.h is compiled as part of guc.c
+guc_scan = custom_target('guc_scan',
+ input: ['guc-file.l'],
+ output: ['guc-file.c.h'],
+ command: [flex, '-o', '@OUTPUT@', '@INPUT@'])
+
+generated_backend_sources += guc_scan
+
+backend_build_deps += declare_dependency(sources: [guc_scan],
+ include_directories: include_directories('.'))
+
+install_data('postgresql.conf.sample', install_dir: get_option('datadir'))
diff --git a/src/backend/utils/mmgr/meson.build b/src/backend/utils/mmgr/meson.build
new file mode 100644
index 00000000000..641bb181ba1
--- /dev/null
+++ b/src/backend/utils/mmgr/meson.build
@@ -0,0 +1,10 @@
+backend_sources += files(
+ 'aset.c',
+ 'dsa.c',
+ 'freepage.c',
+ 'generation.c',
+ 'mcxt.c',
+ 'memdebug.c',
+ 'portalmem.c',
+ 'slab.c',
+)
diff --git a/src/backend/utils/resowner/meson.build b/src/backend/utils/resowner/meson.build
new file mode 100644
index 00000000000..d30891ca027
--- /dev/null
+++ b/src/backend/utils/resowner/meson.build
@@ -0,0 +1,3 @@
+backend_sources += files(
+ 'resowner.c'
+)
diff --git a/src/backend/utils/sort/meson.build b/src/backend/utils/sort/meson.build
new file mode 100644
index 00000000000..b626bdc9d96
--- /dev/null
+++ b/src/backend/utils/sort/meson.build
@@ -0,0 +1,7 @@
+backend_sources += files(
+ 'logtape.c',
+ 'sharedtuplestore.c',
+ 'sortsupport.c',
+ 'tuplesort.c',
+ 'tuplestore.c',
+)
diff --git a/src/backend/utils/time/meson.build b/src/backend/utils/time/meson.build
new file mode 100644
index 00000000000..6fff8792bb0
--- /dev/null
+++ b/src/backend/utils/time/meson.build
@@ -0,0 +1,4 @@
+backend_sources += files(
+ 'combocid.c',
+ 'snapmgr.c',
+)
diff --git a/src/bin/initdb/meson.build b/src/bin/initdb/meson.build
new file mode 100644
index 00000000000..52f679e3116
--- /dev/null
+++ b/src/bin/initdb/meson.build
@@ -0,0 +1,24 @@
+initdb_sources = files(
+ 'findtimezone.c',
+ 'initdb.c'
+)
+
+initdb_sources += timezone_localtime_source
+
+# FIXME: reimplement libpq_pgport logic
+
+executable('initdb',
+ initdb_sources,
+ include_directories: [timezone_inc],
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'initdb',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_initdb.pl'
+ ]
+}
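
An aside for those skimming the patch: the tap_tests dictionaries accumulated in this and the following src/bin subdirectories are not turned into tests here; that happens in central glue elsewhere in the series. As a minimal sketch only - assuming each TAP file were simply run with perl via meson's test(), which glosses over the real test wrapper and environment setup - the consumption looks roughly like:

    foreach t : tap_tests
      foreach tap : t['tests']
        # 'sd'/'bd' are the test's source and build directories
        test(t['name'] / tap,
          perl,
          args: [t['sd'] / tap],
          env: t.get('env', {}),
          workdir: t['bd'])
      endforeach
    endforeach
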
diff --git a/src/bin/meson.build b/src/bin/meson.build
new file mode 100644
index 00000000000..5fd5a9d2f98
--- /dev/null
+++ b/src/bin/meson.build
@@ -0,0 +1,20 @@
+subdir('initdb')
+subdir('pg_amcheck')
+subdir('pg_archivecleanup')
+subdir('pg_basebackup')
+subdir('pg_checksums')
+subdir('pg_config')
+subdir('pg_controldata')
+subdir('pg_ctl')
+subdir('pg_dump')
+subdir('pg_resetwal')
+subdir('pg_rewind')
+subdir('pg_test_fsync')
+subdir('pg_test_timing')
+subdir('pg_upgrade')
+subdir('pg_verifybackup')
+subdir('pg_waldump')
+subdir('pgbench')
+subdir('pgevent')
+subdir('psql')
+subdir('scripts')
diff --git a/src/bin/pg_amcheck/meson.build b/src/bin/pg_amcheck/meson.build
new file mode 100644
index 00000000000..69eaef8f141
--- /dev/null
+++ b/src/bin/pg_amcheck/meson.build
@@ -0,0 +1,22 @@
+pg_amcheck_sources = files(
+ 'pg_amcheck.c'
+)
+
+pg_amcheck = executable('pg_amcheck',
+ pg_amcheck_sources,
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_amcheck',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_basic.pl',
+ 't/002_nonesuch.pl',
+ 't/003_check.pl',
+ 't/004_verify_heapam.pl',
+ 't/005_opclass_damage.pl',
+ ]
+}
diff --git a/src/bin/pg_archivecleanup/meson.build b/src/bin/pg_archivecleanup/meson.build
new file mode 100644
index 00000000000..27742fafab7
--- /dev/null
+++ b/src/bin/pg_archivecleanup/meson.build
@@ -0,0 +1,14 @@
+pg_archivecleanup = executable('pg_archivecleanup',
+ ['pg_archivecleanup.c'],
+ dependencies: [frontend_code],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_archivecleanup',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/010_pg_archivecleanup.pl'
+ ]
+}
diff --git a/src/bin/pg_basebackup/meson.build b/src/bin/pg_basebackup/meson.build
new file mode 100644
index 00000000000..a629b8b02f5
--- /dev/null
+++ b/src/bin/pg_basebackup/meson.build
@@ -0,0 +1,44 @@
+common_sources = files(
+ 'receivelog.c',
+ 'streamutil.c',
+ 'walmethods.c',
+)
+
+pg_basebackup_common = static_library('pg_basebackup_common',
+ common_sources,
+ dependencies: [frontend_code, libpq, zlib],
+ kwargs: internal_lib_args,
+)
+
+executable('pg_basebackup',
+ 'pg_basebackup.c',
+ link_with: [pg_basebackup_common],
+ dependencies: [frontend_code, libpq, zlib],
+ kwargs: default_bin_args,
+)
+
+executable('pg_receivewal',
+ 'pg_receivewal.c',
+ link_with: [pg_basebackup_common],
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+)
+
+executable('pg_recvlogical',
+ 'pg_recvlogical.c',
+ link_with: [pg_basebackup_common],
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_basebackup',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'env': {'GZIP_PROGRAM': gzip.path(), 'TAR': tar.path()},
+ 'tests': [
+ 't/010_pg_basebackup.pl',
+ 't/020_pg_receivewal.pl',
+ 't/030_pg_recvlogical.pl',
+ ]
+}
diff --git a/src/bin/pg_checksums/meson.build b/src/bin/pg_checksums/meson.build
new file mode 100644
index 00000000000..bbf9582b904
--- /dev/null
+++ b/src/bin/pg_checksums/meson.build
@@ -0,0 +1,16 @@
+executable('pg_checksums',
+ ['pg_checksums.c'],
+ include_directories: [timezone_inc],
+ dependencies: [frontend_code],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_checksums',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_basic.pl',
+ 't/002_actions.pl'
+ ]
+}
diff --git a/src/bin/pg_config/meson.build b/src/bin/pg_config/meson.build
new file mode 100644
index 00000000000..df0eb13f636
--- /dev/null
+++ b/src/bin/pg_config/meson.build
@@ -0,0 +1,14 @@
+executable('pg_config',
+ ['pg_config.c'],
+ dependencies: [frontend_code],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_config',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_pg_config.pl',
+ ]
+}
diff --git a/src/bin/pg_controldata/meson.build b/src/bin/pg_controldata/meson.build
new file mode 100644
index 00000000000..fa6057afa54
--- /dev/null
+++ b/src/bin/pg_controldata/meson.build
@@ -0,0 +1,14 @@
+executable('pg_controldata',
+ ['pg_controldata.c'],
+ dependencies: [frontend_code],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_controldata',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_pg_controldata.pl'
+ ]
+}
diff --git a/src/bin/pg_ctl/meson.build b/src/bin/pg_ctl/meson.build
new file mode 100644
index 00000000000..ac0d4f18192
--- /dev/null
+++ b/src/bin/pg_ctl/meson.build
@@ -0,0 +1,17 @@
+executable('pg_ctl',
+ ['pg_ctl.c'],
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_ctl',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_start_stop.pl',
+ 't/002_status.pl',
+ 't/003_promote.pl',
+ 't/004_logrotate.pl'
+ ]
+}
diff --git a/src/bin/pg_dump/meson.build b/src/bin/pg_dump/meson.build
new file mode 100644
index 00000000000..ce5ef11eaeb
--- /dev/null
+++ b/src/bin/pg_dump/meson.build
@@ -0,0 +1,69 @@
+pg_dump_common_sources = files(
+ 'compress_io.c',
+ 'dumputils.c',
+ 'parallel.c',
+ 'pg_backup_archiver.c',
+ 'pg_backup_custom.c',
+ 'pg_backup_db.c',
+ 'pg_backup_directory.c',
+ 'pg_backup_null.c',
+ 'pg_backup_tar.c',
+ 'pg_backup_utils.c',
+)
+
+pg_dump_common = static_library('pg_dump_common',
+ pg_dump_common_sources,
+ c_pch: '../../include/pch/c_pch.h',
+ dependencies: [frontend_code, libpq, zlib],
+ kwargs: internal_lib_args,
+)
+
+pg_dump_sources = files(
+ 'pg_dump.c',
+ 'common.c',
+ 'pg_dump_sort.c',
+)
+
+executable('pg_dump',
+ pg_dump_sources,
+ link_with: [pg_dump_common],
+ dependencies: [frontend_code, libpq, zlib],
+ kwargs: default_bin_args,
+)
+
+
+pg_dumpall_sources = files(
+ 'pg_dumpall.c',
+)
+
+executable('pg_dumpall',
+ pg_dumpall_sources,
+ link_with: [pg_dump_common],
+ dependencies: [frontend_code, libpq, zlib],
+ kwargs: default_bin_args,
+)
+
+
+pg_restore_sources = files(
+ 'pg_restore.c',
+)
+
+executable('pg_restore',
+ pg_restore_sources,
+ link_with: [pg_dump_common],
+ dependencies: [frontend_code, libpq, zlib],
+ kwargs: default_bin_args,
+)
+
+
+tap_tests += {
+ 'name': 'pg_dump',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_basic.pl',
+ 't/002_pg_dump.pl',
+ 't/003_pg_dump_with_server.pl',
+ 't/010_dump_connstr.pl',
+ ]
+}
diff --git a/src/bin/pg_resetwal/meson.build b/src/bin/pg_resetwal/meson.build
new file mode 100644
index 00000000000..7450c0f6432
--- /dev/null
+++ b/src/bin/pg_resetwal/meson.build
@@ -0,0 +1,15 @@
+executable('pg_resetwal',
+ files('pg_resetwal.c'),
+ dependencies: [frontend_code],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_resetwal',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_basic.pl',
+ 't/002_corrupted.pl'
+ ]
+}
diff --git a/src/bin/pg_rewind/meson.build b/src/bin/pg_rewind/meson.build
new file mode 100644
index 00000000000..c7c59e9e523
--- /dev/null
+++ b/src/bin/pg_rewind/meson.build
@@ -0,0 +1,34 @@
+pg_rewind_sources = files(
+ 'datapagemap.c',
+ 'file_ops.c',
+ 'filemap.c',
+ 'libpq_source.c',
+ 'local_source.c',
+ 'parsexlog.c',
+ 'pg_rewind.c',
+ 'timeline.c',
+ '../../backend/access/transam/xlogreader.c',
+)
+
+pg_rewind = executable('pg_rewind',
+ pg_rewind_sources,
+ dependencies: [frontend_code, libpq, lz4],
+ kwargs: default_bin_args,
+)
+
+
+tap_tests += {
+ 'name': 'pg_rewind',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_basic.pl',
+ 't/002_databases.pl',
+ 't/003_extrafiles.pl',
+ 't/004_pg_xlog_symlink.pl',
+ 't/005_same_timeline.pl',
+ 't/006_options.pl',
+ 't/007_standby_source.pl',
+ 't/008_min_recovery_point.pl',
+ ]
+}
diff --git a/src/bin/pg_test_fsync/meson.build b/src/bin/pg_test_fsync/meson.build
new file mode 100644
index 00000000000..527be88d125
--- /dev/null
+++ b/src/bin/pg_test_fsync/meson.build
@@ -0,0 +1,14 @@
+executable('pg_test_fsync',
+ ['pg_test_fsync.c'],
+ dependencies: [frontend_code],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_test_fsync',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_basic.pl',
+ ]
+}
diff --git a/src/bin/pg_test_timing/meson.build b/src/bin/pg_test_timing/meson.build
new file mode 100644
index 00000000000..c74577df493
--- /dev/null
+++ b/src/bin/pg_test_timing/meson.build
@@ -0,0 +1,14 @@
+pg_test_timing = executable('pg_test_timing',
+ ['pg_test_timing.c'],
+ dependencies: [frontend_code],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_test_timing',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_basic.pl'
+ ]
+}
diff --git a/src/bin/pg_upgrade/meson.build b/src/bin/pg_upgrade/meson.build
new file mode 100644
index 00000000000..88d0e03446b
--- /dev/null
+++ b/src/bin/pg_upgrade/meson.build
@@ -0,0 +1,26 @@
+pg_upgrade_sources = files(
+ 'check.c',
+ 'controldata.c',
+ 'dump.c',
+ 'exec.c',
+ 'file.c',
+ 'function.c',
+ 'info.c',
+ 'option.c',
+ 'parallel.c',
+ 'pg_upgrade.c',
+ 'relfilenode.c',
+ 'server.c',
+ 'tablespace.c',
+ 'util.c',
+ 'version.c',
+)
+
+pg_upgrade = executable('pg_upgrade',
+ pg_upgrade_sources,
+ c_pch: '../../include/pch/c_pch.h',
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+)
+
+# FIXME: add test
diff --git a/src/bin/pg_verifybackup/meson.build b/src/bin/pg_verifybackup/meson.build
new file mode 100644
index 00000000000..c7039ddcc49
--- /dev/null
+++ b/src/bin/pg_verifybackup/meson.build
@@ -0,0 +1,25 @@
+pg_verifybackup_sources = files(
+ 'parse_manifest.c',
+ 'pg_verifybackup.c'
+)
+
+pg_verifybackup = executable('pg_verifybackup',
+ pg_verifybackup_sources,
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_verifybackup',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_basic.pl',
+ 't/002_algorithm.pl',
+ 't/003_corruption.pl',
+ 't/004_options.pl',
+ 't/005_bad_manifest.pl',
+ 't/006_encoding.pl',
+ 't/007_wal.pl',
+ ]
+}
diff --git a/src/bin/pg_waldump/meson.build b/src/bin/pg_waldump/meson.build
new file mode 100644
index 00000000000..f89139f89f5
--- /dev/null
+++ b/src/bin/pg_waldump/meson.build
@@ -0,0 +1,23 @@
+pg_waldump_sources = files(
+ 'compat.c',
+ 'pg_waldump.c',
+ 'rmgrdesc.c',
+)
+
+pg_waldump_sources += rmgr_desc_sources
+pg_waldump_sources += xlogreader_sources
+
+pg_waldump = executable('pg_waldump',
+ pg_waldump_sources,
+ dependencies: [frontend_code, lz4],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'pg_waldump',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_basic.pl',
+ ]
+}
diff --git a/src/bin/pgbench/meson.build b/src/bin/pgbench/meson.build
new file mode 100644
index 00000000000..bc135abebfc
--- /dev/null
+++ b/src/bin/pgbench/meson.build
@@ -0,0 +1,38 @@
+pgbench_sources = files(
+ 'pgbench.c',
+)
+
+# exprscan is compiled as part of exprparse. The ordering is enforced by making
+# the generation of the grammar depend on the scanner generation. That's
+# unnecessarily strict, but overall harmless.
+
+exprscan = custom_target('exprscan',
+ input : files('exprscan.l'),
+ output : ['exprscan.c'],
+ command : [flex, '-o', '@OUTPUT0@', '@INPUT@']
+)
+
+exprparse = custom_target('exprparse',
+ input: 'exprparse.y',
+ output: 'exprparse.c',
+ depends: exprscan,
+ command: [bison, bisonflags, '-o', '@OUTPUT@', '@INPUT0@'])
+pgbench_sources += exprparse
+
+executable('pgbench',
+ pgbench_sources,
+ dependencies: [frontend_code, libpq, thread_dep],
+ include_directories: include_directories('.'),
+ kwargs: default_bin_args,
+)
+
+
+tap_tests += {
+ 'name': 'pgbench',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_pgbench_with_server.pl',
+ 't/002_pgbench_no_server.pl'
+ ]
+}
diff --git a/src/bin/pgevent/meson.build b/src/bin/pgevent/meson.build
new file mode 100644
index 00000000000..b35287d074b
--- /dev/null
+++ b/src/bin/pgevent/meson.build
@@ -0,0 +1,32 @@
+if host_machine.system() != 'windows'
+ subdir_done()
+endif
+
+pgevent_sources = files(
+ 'pgevent.c',
+)
+
+pgevent_cdata = rc_lib_cdata
+pgevent_cdata.set_quoted('FILEDESC', 'Eventlog message formatter')
+pgevent_cdata.set_quoted('INTERNAL_NAME', 'pgevent')
+pgevent_cdata.set_quoted('ORIGINAL_NAME', 'pgevent.dll')
+
+configure_file(
+ output: 'win32ver.rc',
+ input: win32ver_rc_in,
+ configuration: pgevent_cdata
+)
+
+pgevent_sources += windows.compile_resources(
+ 'pgmsgevent.rc',
+ include_directories: [include_directories('.'), postgres_inc],
+)
+
+shared_library('pgevent',
+ pgevent_sources,
+ dependencies: [frontend_code],
+ vs_module_defs: 'pgevent.def',
+ kwargs: default_lib_args + {
+ 'name_prefix': '',
+ },
+)
diff --git a/src/bin/psql/meson.build b/src/bin/psql/meson.build
new file mode 100644
index 00000000000..75905a52c13
--- /dev/null
+++ b/src/bin/psql/meson.build
@@ -0,0 +1,65 @@
+psql_sources = files(
+ 'command.c',
+ 'common.c',
+ 'copy.c',
+ 'crosstabview.c',
+ 'describe.c',
+ 'help.c',
+ 'input.c',
+ 'large_obj.c',
+ 'mainloop.c',
+ 'prompt.c',
+ 'startup.c',
+ 'stringutils.c',
+ 'tab-complete.c',
+ 'variables.c',
+)
+
+psql_sources += custom_target('psqlscanslash',
+ input: ['psqlscanslash.l'],
+ output: ['psqlscanslash.c'],
+ command: [flex, '-b', '-Cfe', '-p', '-p', '-o', '@OUTPUT@', '@INPUT@'])
+
+psql_sources += custom_target('psql_help',
+ input: ['create_help.pl'],
+ output: ['sql_help.c', 'sql_help.h'],
+ depfile: 'sql_help.dep',
+ command: [perl, '@INPUT0@', '@SOURCE_ROOT@/doc/src/sgml/ref', '@OUTDIR@', 'sql_help'])
+
+if host_machine.system() == 'windows'
+ psql_cdata = rc_bin_cdata
+ psql_cdata.set_quoted('FILEDESC', 'psql - the PostgreSQL interactive terminal')
+ psql_cdata.set_quoted('INTERNAL_NAME', 'psql')
+ psql_cdata.set_quoted('ORIGINAL_NAME', 'psql.exe')
+
+ win32_ver_rc = configure_file(
+ output: 'win32ver.rc',
+ input: win32ver_rc_in,
+ configuration: psql_cdata
+ )
+
+ psql_sources += windows.compile_resources(
+ win32_ver_rc,
+ include_directories: postgres_inc,
+ )
+endif
+
+executable('psql',
+ psql_sources,
+ c_pch: '../../include/pch/c_pch.h',
+ include_directories: include_directories('.'),
+ dependencies : [frontend_code, libpq, readline],
+ kwargs: default_bin_args,
+)
+
+tap_tests += {
+ 'name': 'psql',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'env': {'with_readline': readline.found() ? 'yes' : 'no'},
+ 'tests': [
+ 't/001_basic.pl',
+ 't/010_tab_completion.pl',
+ 't/020_cancel.pl',
+ ],
+}
diff --git a/src/bin/scripts/meson.build b/src/bin/scripts/meson.build
new file mode 100644
index 00000000000..547a53500a4
--- /dev/null
+++ b/src/bin/scripts/meson.build
@@ -0,0 +1,46 @@
+scripts_common = static_library('scripts_common',
+ files('common.c'),
+ dependencies: [frontend_code, libpq],
+ kwargs: internal_lib_args,
+)
+
+binaries = [
+ 'createdb',
+ 'dropdb',
+ 'createuser',
+ 'dropuser',
+ 'clusterdb',
+ 'vacuumdb',
+ 'reindexdb',
+ 'pg_isready',
+]
+
+foreach binary : binaries
+ executable(binary,
+ files(binary + '.c'),
+ link_with: [scripts_common],
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args,
+ )
+endforeach
+
+tap_tests += {
+ 'name': 'scripts',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/010_clusterdb.pl',
+ 't/011_clusterdb_all.pl',
+ 't/020_createdb.pl',
+ 't/040_createuser.pl',
+ 't/050_dropdb.pl',
+ 't/070_dropuser.pl',
+ 't/080_pg_isready.pl',
+ 't/090_reindexdb.pl',
+ 't/091_reindexdb_all.pl',
+ 't/100_vacuumdb.pl',
+ 't/101_vacuumdb_all.pl',
+ 't/102_vacuumdb_stages.pl',
+ 't/200_connstr.pl',
+ ]
+}
diff --git a/src/common/meson.build b/src/common/meson.build
new file mode 100644
index 00000000000..bb48a3c50a0
--- /dev/null
+++ b/src/common/meson.build
@@ -0,0 +1,155 @@
+common_sources = files(
+ 'archive.c',
+ 'base64.c',
+ 'checksum_helper.c',
+ 'config_info.c',
+ 'controldata_utils.c',
+ 'encnames.c',
+ 'exec.c',
+ 'file_perm.c',
+ 'file_utils.c',
+ 'hashfn.c',
+ 'ip.c',
+ 'jsonapi.c',
+ 'keywords.c',
+ 'kwlookup.c',
+ 'link-canary.c',
+ 'md5_common.c',
+ 'pg_get_line.c',
+ 'pg_lzcompress.c',
+ 'pgfnames.c',
+ 'psprintf.c',
+ 'relpath.c',
+ 'rmtree.c',
+ 'saslprep.c',
+ 'scram-common.c',
+ 'string.c',
+ 'stringinfo.c',
+ 'unicode_norm.c',
+ 'username.c',
+ 'wait_error.c',
+ 'wchar.c',
+)
+
+# FIXME: implement openssl
+if ssl.found()
+ common_sources += files(
+ 'cryptohash_openssl.c',
+ 'hmac_openssl.c',
+ 'protocol_openssl.c',
+ )
+else
+ common_sources += files(
+ 'cryptohash.c',
+ 'hmac.c',
+ 'md5.c',
+ 'sha1.c',
+ 'sha2.c',
+ )
+endif
+
+common_sources += custom_target('kwlist',
+ input: files('../include/parser/kwlist.h'),
+ output: 'kwlist_d.h',
+ command: [perl, '-I', '@SOURCE_ROOT@/src/tools', files('../tools/gen_keywordlist.pl'),
+ '--extern', '--output', '@OUTDIR@', '@INPUT@'])
+
+
+# The code imported from Ryu gets a pass on declaration-after-statement,
+# in order to keep it more closely aligned with its upstream.
+ryu_sources = files(
+ 'd2s.c',
+ 'f2s.c',
+)
+ryu_cflags = []
+
+if using_declaration_after_statement_warning
+ ryu_cflags += ['-Wno-declaration-after-statement']
+endif
+
+common_cflags = {'ryu': ryu_cflags}
+common_sources_cflags = {'ryu': ryu_sources}
+
+
+# A few files are currently only built for frontend, not server
+# (Mkvcbuild.pm has a copy of this list, too). logging.c is excluded
+# from OBJS_FRONTEND_SHLIB (shared library) as a matter of policy,
+# because it is not appropriate for general purpose libraries such
+# as libpq to report errors directly.
+
+common_sources_frontend_shlib = common_sources
+common_sources_frontend_shlib += files(
+ 'fe_memutils.c',
+ 'restricted_token.c',
+ 'sprompt.c',
+)
+
+common_sources_frontend_static = common_sources_frontend_shlib
+common_sources_frontend_static += files(
+ 'logging.c',
+)
+
+# Build pgcommon once for backend, once for use in frontend binaries, and once
+# for use in shared libraries
+#
+# XXX: in most environments we could probably link_whole pgcommon_shlib
+# against pgcommon_static, instead of compiling twice.
+#
+# For the server build of pgcommon, depend on lwlocknames_h, because at least
+# cryptohash_openssl.c, hmac_openssl.c depend on it. That's arguably a
+# layering violation, but ...
+pgcommon = {}
+pgcommon_variants = {
+ 'srv' : internal_lib_args + {
+ 'sources': common_sources + [lwlocknames_h],
+ 'dependencies': [backend_common_code],
+ },
+ 'static' : default_lib_args + {
+ 'sources': common_sources_frontend_static,
+ 'dependencies': [frontend_common_code],
+ },
+ 'shlib' : default_lib_args + {
+ 'pic': true,
+ 'sources': common_sources_frontend_shlib,
+ 'dependencies': [frontend_common_code],
+ },
+}
+
+foreach name, opts : pgcommon_variants
+
+ # Build internal static libraries for sets of files that need to be built
+ # with different cflags
+ cflag_libs = []
+ foreach cflagname, sources: common_sources_cflags
+ if sources.length() == 0
+ continue
+ endif
+ c_args = opts.get('c_args', []) + common_cflags[cflagname]
+ cflag_libs += static_library('pgcommon_@0@_@1@'.format(cflagname, name),
+ c_pch: '../include/pch/c_pch.h',
+ include_directories: include_directories('.'),
+ kwargs: opts + {
+ 'sources': sources,
+ 'c_args': c_args,
+ 'build_by_default': false,
+ 'install': false,
+ },
+ )
+ endforeach
+
+ lib = static_library('pgcommon_@0@'.format(name),
+ link_with: cflag_libs,
+ c_pch: '../include/pch/c_pch.h',
+ include_directories: include_directories('.'),
+ kwargs: opts + {
+ 'dependencies': opts['dependencies'] + [ssl],
+ }
+ )
+ pgcommon += {name: lib}
+endforeach
+
+common_srv = pgcommon['srv']
+common_shlib = pgcommon['shlib']
+common_static = pgcommon['static']
+
+subdir('unicode')
diff --git a/src/common/unicode/meson.build b/src/common/unicode/meson.build
new file mode 100644
index 00000000000..e8145e138c0
--- /dev/null
+++ b/src/common/unicode/meson.build
@@ -0,0 +1,99 @@
+# These files are part of the Unicode Character Database. Download
+# them on demand.
+
+UNICODE_VERSION = '14.0.0'
+
+unicode_data = {}
+unicode_baseurl = 'https://www.unicode.org/Public/@0@/ucd/@1@'
+
+if not wget.found()
+ subdir_done()
+endif
+
+foreach f : ['UnicodeData.txt', 'EastAsianWidth.txt', 'DerivedNormalizationProps.txt', 'CompositionExclusions.txt', 'NormalizationTest.txt']
+ url = unicode_baseurl.format(UNICODE_VERSION, f)
+ target = custom_target(f,
+ output: f,
+ command: [wget, wget_flags, url],
+ build_by_default: false,
+ )
+ unicode_data += {f: target}
+endforeach
+
+
+update_unicode_targets = []
+
+update_unicode_targets += \
+ custom_target('unicode_norm_table.h',
+ input: [unicode_data['UnicodeData.txt'], unicode_data['CompositionExclusions.txt']],
+ output: ['unicode_norm_table.h', 'unicode_norm_hashfunc.h'],
+ command: [perl, files('generate-unicode_norm_table.pl'), '@OUTDIR@', '@INPUT@'],
+ build_by_default: false,
+ )
+
+update_unicode_targets += \
+ custom_target('unicode_combining_table.h',
+ input: [unicode_data['UnicodeData.txt']],
+ output: ['unicode_combining_table.h'],
+ command: [perl, files('generate-unicode_combining_table.pl'), '@INPUT@'],
+ build_by_default: false,
+ capture: true,
+ )
+
+update_unicode_targets += \
+ custom_target('unicode_east_asian_fw_table.h',
+ input: [unicode_data['EastAsianWidth.txt']],
+ output: ['unicode_east_asian_fw_table.h'],
+ command: [perl, files('generate-unicode_east_asian_fw_table.pl'), '@INPUT@'],
+ build_by_default: false,
+ capture: true,
+ )
+
+update_unicode_targets += \
+ custom_target('unicode_normprops_table.h',
+ input: [unicode_data['DerivedNormalizationProps.txt']],
+ output: ['unicode_normprops_table.h'],
+ command: [perl, files('generate-unicode_normprops_table.pl'), '@INPUT@'],
+ build_by_default: false,
+ capture: true,
+ )
+
+norm_test_table = custom_target('norm_test_table.h',
+ input: [unicode_data['NormalizationTest.txt']],
+ output: ['norm_test_table.h'],
+ command: [perl, files('generate-norm_test_table.pl'), '@INPUT@', '@OUTPUT@'],
+ build_by_default: false,
+ )
+
+inc = include_directories('.')
+
+norm_test = executable('norm_test',
+ ['norm_test.c', norm_test_table],
+ dependencies: [frontend_port_code],
+ include_directories: inc,
+ link_with: [pgport_static, common_static],
+ build_by_default: false,
+ kwargs: default_bin_args + {
+ 'install': false,
+ }
+)
+
+if not meson.is_cross_build()
+ norm_test_valid = custom_target('norm_test.valid',
+ output: 'norm_test.valid',
+ depends: update_unicode_targets,
+ command: [norm_test],
+ build_by_default: false,
+ capture: true,
+ )
+
+ run_target('update-unicode',
+ depends: norm_test_valid,
+ command: ['cp', update_unicode_targets, '@SOURCE_ROOT@/src/include/common/']
+ )
+else
+ run_target('update-unicode',
+ depends: update_unicode_targets,
+ command: ['cp', update_unicode_targets, '@SOURCE_ROOT@/src/include/common/']
+ )
+endif
diff --git a/src/fe_utils/meson.build b/src/fe_utils/meson.build
new file mode 100644
index 00000000000..e3f0b34cf13
--- /dev/null
+++ b/src/fe_utils/meson.build
@@ -0,0 +1,27 @@
+fe_utils_sources = files(
+ 'archive.c',
+ 'cancel.c',
+ 'conditional.c',
+ 'connect_utils.c',
+ 'mbprint.c',
+ 'option_utils.c',
+ 'parallel_slot.c',
+ 'print.c',
+ 'query_utils.c',
+ 'recovery_gen.c',
+ 'simple_list.c',
+ 'string_utils.c',
+)
+
+fe_utils_sources += custom_target('psqlscan',
+ input: ['psqlscan.l'],
+ output: ['psqlscan.c'],
+ command: [flex, '-b', '-Cfe', '-p', '-p', '-o', '@OUTPUT@', '@INPUT@'])
+
+fe_utils = static_library('fe_utils',
+ fe_utils_sources + generated_headers,
+ c_pch: '../include/pch/c_pch.h',
+ include_directories : [postgres_inc, libpq_inc],
+ c_args: ['-DFRONTEND'],
+ kwargs: default_lib_args,
+)
diff --git a/src/include/catalog/meson.build b/src/include/catalog/meson.build
new file mode 100644
index 00000000000..f5bc294c814
--- /dev/null
+++ b/src/include/catalog/meson.build
@@ -0,0 +1,129 @@
+catalog_headers = [
+ 'pg_proc.h',
+ 'pg_type.h',
+ 'pg_attribute.h',
+ 'pg_class.h',
+ 'pg_attrdef.h',
+ 'pg_constraint.h',
+ 'pg_inherits.h',
+ 'pg_index.h',
+ 'pg_operator.h',
+ 'pg_opfamily.h',
+ 'pg_opclass.h',
+ 'pg_am.h',
+ 'pg_amop.h',
+ 'pg_amproc.h',
+ 'pg_language.h',
+ 'pg_largeobject_metadata.h',
+ 'pg_largeobject.h',
+ 'pg_aggregate.h',
+ 'pg_statistic.h',
+ 'pg_statistic_ext.h',
+ 'pg_statistic_ext_data.h',
+ 'pg_rewrite.h',
+ 'pg_trigger.h',
+ 'pg_event_trigger.h',
+ 'pg_description.h',
+ 'pg_cast.h',
+ 'pg_enum.h',
+ 'pg_namespace.h',
+ 'pg_conversion.h',
+ 'pg_depend.h',
+ 'pg_database.h',
+ 'pg_db_role_setting.h',
+ 'pg_tablespace.h',
+ 'pg_authid.h',
+ 'pg_auth_members.h',
+ 'pg_shdepend.h',
+ 'pg_shdescription.h',
+ 'pg_ts_config.h',
+ 'pg_ts_config_map.h',
+ 'pg_ts_dict.h',
+ 'pg_ts_parser.h',
+ 'pg_ts_template.h',
+ 'pg_extension.h',
+ 'pg_foreign_data_wrapper.h',
+ 'pg_foreign_server.h',
+ 'pg_user_mapping.h',
+ 'pg_foreign_table.h',
+ 'pg_policy.h',
+ 'pg_replication_origin.h',
+ 'pg_default_acl.h',
+ 'pg_init_privs.h',
+ 'pg_seclabel.h',
+ 'pg_shseclabel.h',
+ 'pg_collation.h',
+ 'pg_partitioned_table.h',
+ 'pg_range.h',
+ 'pg_transform.h',
+ 'pg_sequence.h',
+ 'pg_publication.h',
+ 'pg_publication_namespace.h',
+ 'pg_publication_rel.h',
+ 'pg_subscription.h',
+ 'pg_subscription_rel.h',
+]
+
+bki_data = files(
+ 'pg_aggregate.dat',
+ 'pg_am.dat',
+ 'pg_amop.dat',
+ 'pg_amproc.dat',
+ 'pg_authid.dat',
+ 'pg_cast.dat',
+ 'pg_class.dat',
+ 'pg_collation.dat',
+ 'pg_conversion.dat',
+ 'pg_database.dat',
+ 'pg_language.dat',
+ 'pg_namespace.dat',
+ 'pg_opclass.dat',
+ 'pg_operator.dat',
+ 'pg_opfamily.dat',
+ 'pg_proc.dat',
+ 'pg_range.dat',
+ 'pg_tablespace.dat',
+ 'pg_ts_config.dat',
+ 'pg_ts_config_map.dat',
+ 'pg_ts_dict.dat',
+ 'pg_ts_parser.dat',
+ 'pg_ts_template.dat',
+ 'pg_type.dat',
+ )
+
+
+input = []
+output_files = ['postgres.bki', 'schemapg.h', 'system_fk_info.h', 'system_constraints.sql']
+output_install = [get_option('datadir'), get_option('includedir'), get_option('includedir'), get_option('datadir')]
+
+foreach h : catalog_headers
+ fname = h.split('.h')[0]+'_d.h'
+ input += files(h)
+ output_files += fname
+ output_install += get_option('includedir')
+endforeach
+
+generated_headers += custom_target('generated_catalog_headers',
+ input: input,
+ depend_files: bki_data,
+ build_by_default: true,
+ install: true,
+ output: output_files,
+ install_dir: output_install,
+ command: [perl, files('../../backend/catalog/genbki.pl'), '--include-path=@SOURCE_ROOT@/src/include', '--set-version='+pg_version_major.to_string(), '--output=@OUTDIR@', '@INPUT@']
+ )
+
+
+# 'reformat-dat-files' is a convenience target for rewriting the
+# catalog data files in our standard format. This includes collapsing
+# out any entries that are redundant with a BKI_DEFAULT annotation.
+run_target('reformat-dat-files',
+ command: [perl, files('reformat_dat_file.pl'), '--output', '@CURRENT_SOURCE_DIR@', bki_data],
+)
+
+# 'expand-dat-files' is a convenience target for expanding out all
+# default values in the catalog data files. This should be run before
+# altering or removing any BKI_DEFAULT annotation.
+run_target('expand-dat-files',
+ command: [perl, files('reformat_dat_file.pl'), '--output', '@CURRENT_SOURCE_DIR@', bki_data, '--full-tuples'],
+)
diff --git a/src/include/meson.build b/src/include/meson.build
new file mode 100644
index 00000000000..c3af4a2574f
--- /dev/null
+++ b/src/include/meson.build
@@ -0,0 +1,50 @@
+configure_file(input : 'pg_config_ext.h.meson',
+ output : 'pg_config_ext.h',
+ configuration : cdata)
+
+system = host_machine.system()
+if system == 'windows'
+ system = 'win32'
+endif
+
+configure_file(
+ output : 'pg_config_os.h',
+ input: files('port/@0@.h'.format(system)),
+ install: true,
+ install_dir : get_option('includedir'),
+ copy : true)
+
+configure_file(
+ output : 'pg_config.h',
+ install : true,
+ install_dir : get_option('includedir'),
+ configuration : cdata)
+
+
+config_paths_data = configuration_data()
+config_paths_data.set_quoted('PGBINDIR', get_option('prefix') / get_option('bindir'))
+config_paths_data.set_quoted('PGSHAREDIR', get_option('prefix') / get_option('datadir'))
+config_paths_data.set_quoted('SYSCONFDIR', get_option('prefix') / get_option('sysconfdir'))
+config_paths_data.set_quoted('INCLUDEDIR', get_option('prefix') / get_option('includedir'))
+# FIXME: shouldn't be the same
+config_paths_data.set_quoted('PKGINCLUDEDIR', get_option('prefix') / get_option('includedir'))
+config_paths_data.set_quoted('INCLUDEDIRSERVER', get_option('prefix') / get_option('includedir'))
+config_paths_data.set_quoted('LIBDIR', get_option('prefix') / get_option('libdir'))
+# FIXME: figure out logic for pkglibdir
+config_paths_data.set_quoted('PKGLIBDIR', get_option('prefix') / get_option('libdir'))
+config_paths_data.set_quoted('LOCALEDIR', get_option('prefix') / get_option('localedir'))
+config_paths_data.set_quoted('DOCDIR', get_option('prefix') / get_option('datadir') / 'doc/postgresql')
+config_paths_data.set_quoted('HTMLDIR', get_option('prefix') / get_option('datadir') / 'doc/postgresql')
+config_paths_data.set_quoted('MANDIR', get_option('prefix') / get_option('datadir') / 'doc/postgresql')
+
+configure_file(
+ output: 'pg_config_paths.h',
+ configuration: config_paths_data,
+ install: false
+)
+
+
+subdir('utils')
+subdir('storage')
+subdir('catalog')
+subdir('parser')
diff --git a/src/include/parser/meson.build b/src/include/parser/meson.build
new file mode 100644
index 00000000000..caf4c092909
--- /dev/null
+++ b/src/include/parser/meson.build
@@ -0,0 +1,10 @@
+backend_parser_header = custom_target('gram',
+ input: [files('../../backend/parser/gram.y')],
+ output: ['gram.c', 'gram.h'],
+ command: [bison, bisonflags, '-d', '-o', '@OUTPUT0@', '@INPUT0@'],
+ install: true,
+ # Only install gram.h, not gram.c
+ install_dir: [false, get_option('includedir')]
+)
+
+#generated_backend_headers += backend_parser[1]
diff --git a/src/include/pch/c_pch.h b/src/include/pch/c_pch.h
new file mode 100644
index 00000000000..f40c757ca62
--- /dev/null
+++ b/src/include/pch/c_pch.h
@@ -0,0 +1 @@
+#include "c.h"
diff --git a/src/include/pch/postgres_pch.h b/src/include/pch/postgres_pch.h
new file mode 100644
index 00000000000..71b2f35f76b
--- /dev/null
+++ b/src/include/pch/postgres_pch.h
@@ -0,0 +1 @@
+#include "postgres.h"
diff --git a/src/include/pg_config_ext.h.meson b/src/include/pg_config_ext.h.meson
new file mode 100644
index 00000000000..57cdfca0cfd
--- /dev/null
+++ b/src/include/pg_config_ext.h.meson
@@ -0,0 +1,7 @@
+/*
+ * src/include/pg_config_ext.h.in. This is generated manually, not by
+ * autoheader, since we want to limit which symbols get defined here.
+ */
+
+/* Define to the name of a signed 64-bit integer type. */
+#mesondefine PG_INT64_TYPE
diff --git a/src/include/storage/meson.build b/src/include/storage/meson.build
new file mode 100644
index 00000000000..ef2bbb7c6f7
--- /dev/null
+++ b/src/include/storage/meson.build
@@ -0,0 +1,16 @@
+# FIXME: this creates an unnecessary lwlocknames.c - but it's not
+# obvious how to avoid that: meson insist on output files being in the
+# current dir.
+
+lwlocknames = custom_target('lwlocknames',
+ input : files('../../backend/storage/lmgr/lwlocknames.txt'),
+ output : ['lwlocknames.h', 'lwlocknames.c'],
+ command : [perl, files('../../backend/storage/lmgr/generate-lwlocknames.pl'), '-o', '@OUTDIR@', '@INPUT@'],
+ build_by_default: true,
+ install: true,
+ install_dir: [get_option('includedir'), false],
+)
+
+lwlocknames_h = lwlocknames[0]
+
+generated_backend_headers += lwlocknames_h
diff --git a/src/include/utils/meson.build b/src/include/utils/meson.build
new file mode 100644
index 00000000000..b9c959b474d
--- /dev/null
+++ b/src/include/utils/meson.build
@@ -0,0 +1,22 @@
+errcodes = custom_target('errcodes',
+ input : files('../../backend/utils/errcodes.txt'),
+ output : ['errcodes.h'],
+ command : [perl, files('../../backend/utils/generate-errcodes.pl'), '@INPUT@', '@OUTPUT@']
+)
+generated_headers += errcodes
+
+generated_backend_headers += custom_target('probes.d',
+ input: files('../../backend/utils/probes.d'),
+ output : 'probes.h',
+ capture: true,
+ command : [sed, '-f', files('../../backend/utils/Gen_dummy_probes.sed'), '@INPUT@']
+)
+
+fmgrtab_target = custom_target('fmgrtab',
+ input: '../catalog/pg_proc.dat',
+ output : ['fmgroids.h', 'fmgrprotos.h', 'fmgrtab.c'],
+ command: [perl, '-I', '@SOURCE_ROOT@/src/backend/catalog/', files('../../backend/utils/Gen_fmgrtab.pl'), '--include-path=@SOURCE_ROOT@/src/include', '--output=@OUTDIR@', '@INPUT@']
+)
+
+generated_backend_headers += fmgrtab_target[0]
+generated_backend_headers += fmgrtab_target[1]
diff --git a/src/interfaces/libpq/meson.build b/src/interfaces/libpq/meson.build
new file mode 100644
index 00000000000..4b716f0e89d
--- /dev/null
+++ b/src/interfaces/libpq/meson.build
@@ -0,0 +1,99 @@
+libpq_sources = files(
+ 'fe-auth-scram.c',
+ 'fe-connect.c',
+ 'fe-exec.c',
+ 'fe-lobj.c',
+ 'fe-misc.c',
+ 'fe-print.c',
+ 'fe-protocol3.c',
+ 'fe-secure.c',
+ 'fe-trace.c',
+ 'legacy-pqsignal.c',
+ 'libpq-events.c',
+ 'pqexpbuffer.c',
+ 'fe-auth.c',
+)
+
+if host_machine.system() == 'windows'
+ libpq_sources += files('win32.c', 'pthread-win32.c')
+endif
+
+if ssl.found()
+ libpq_sources += files('fe-secure-common.c')
+ libpq_sources += files('fe-secure-openssl.c')
+endif
+
+if gssapi.found()
+ libpq_sources += files(
+ 'fe-secure-gssapi.c',
+ 'fe-gssapi-common.c'
+ )
+endif
+
+export_file = custom_target('libpq_exports.list',
+ input: [files('exports.txt')],
+ output: ['@BASENAME@.list'],
+ command: [perl, files('../../tools/gen_versioning_script.pl'),
+ host_machine.system() == 'darwin' ? 'darwin' : 'gnu',
+ '@INPUT0@', '@OUTPUT0@'],
+ build_by_default: false,
+ install: false,
+)
+
+libpq_def = custom_target('libpq.def',
+ command: [perl, files('../../tools/msvc/export2def.pl'), '@OUTPUT@', '@INPUT0@', 'libpq'],
+ input: files('exports.txt'),
+ output: 'libpq.def',
+ build_by_default: false,
+ install: false,
+)
+
+# port needs to be in include path due to pthread-win32.h
+libpq_inc = include_directories('.', '../../port')
+libpq_deps = [frontend_shlib_code, thread_dep, ssl, ldap, gssapi]
+libpq_link_depends = []
+
+libpq_kwargs = default_lib_args + {
+ 'version': '5.'+pg_version_major.to_string(),
+}
+
+
+if host_machine.system() == 'darwin'
+ libpq_kwargs = libpq_kwargs + {
+ 'link_args': ['-exported_symbols_list', export_file.full_path()],
+ 'link_depends': export_file,
+ 'soversion': '5',
+ }
+elif host_machine.system() == 'windows'
+ libpq_deps += cc.find_library('secur32', required: true)
+
+ libpq_kwargs = libpq_kwargs + {
+ 'vs_module_defs': libpq_def,
+ 'soversion': '',
+ }
+else
+ libpq_kwargs = libpq_kwargs + {
+ 'link_args': '-Wl,--version-script=' + export_file.full_path(),
+ 'link_depends': export_file,
+ 'soversion': '5',
+ }
+endif
+
+libpq_so = shared_library('pq',
+ libpq_sources,
+ include_directories : [libpq_inc, postgres_inc],
+ c_args: ['-DFRONTEND'],
+ c_pch: '../../include/pch/c_pch.h',
+ dependencies: libpq_deps,
+ kwargs: libpq_kwargs,
+)
+
+libpq = declare_dependency(
+ link_with: [libpq_so],
+ include_directories: [include_directories('.')]
+)
+
+install_headers('libpq-fe.h', 'libpq-events.h')
+# FIXME: adjust path
+install_headers('libpq-int.h', 'pqexpbuffer.h')
+install_data('pg_service.conf.sample', install_dir: get_option('datadir'))
diff --git a/src/meson.build b/src/meson.build
new file mode 100644
index 00000000000..414be1db419
--- /dev/null
+++ b/src/meson.build
@@ -0,0 +1,10 @@
+# Libraries that other subsystems might depend upon are built first, in their
+# respective dependency order.
+
+subdir('timezone')
+
+subdir('backend')
+
+subdir('bin')
+
+subdir('pl')
diff --git a/src/pl/meson.build b/src/pl/meson.build
new file mode 100644
index 00000000000..b720e922093
--- /dev/null
+++ b/src/pl/meson.build
@@ -0,0 +1,4 @@
+subdir('plpgsql')
+
+subdir('plperl')
+subdir('plpython')
diff --git a/src/pl/plperl/meson.build b/src/pl/plperl/meson.build
new file mode 100644
index 00000000000..a5a994e845f
--- /dev/null
+++ b/src/pl/plperl/meson.build
@@ -0,0 +1,81 @@
+if not perl_dep.found()
+ subdir_done()
+endif
+
+plperl_sources = files(
+ 'plperl.c',
+)
+
+subppdir = run_command(perl, '-e', 'use List::Util qw(first); print first { -r "$_/ExtUtils/xsubpp" } @INC',
+ check: true).stdout()
+xsubpp = '@0@/ExtUtils/xsubpp'.format(subppdir)
+typemap = '@0@/ExtUtils/typemap'.format(subppdir)
+
+plperl_sources += custom_target('perlchunks.h',
+ input: files('plc_perlboot.pl', 'plc_trusted.pl'),
+ output: 'perlchunks.h',
+ capture: true,
+ command: [perl, files('text2macro.pl'), '--strip=^(\#.*|\s*)$', '@INPUT@']
+)
+
+plperl_sources += custom_target('plperl_opmask.h',
+ input: files('plperl_opmask.pl'),
+ output: 'plperl_opmask.h',
+ command: [perl, '@INPUT@', '@OUTPUT@']
+)
+
+foreach n : ['SPI', 'Util']
+ xs = files(n+'.xs')
+ xs_c_name = n+'.c'
+
+ # FIXME: the -output option is only available since perl 5.9.3 - but that's
+ # probably a fine minimum requirement?
+ xs_c = custom_target(xs_c_name,
+ input: xs,
+ output: xs_c_name,
+ command: [perl, xsubpp, '-typemap', typemap, '-output', '@OUTPUT@', '@INPUT@']
+ )
+ plperl_sources += xs_c
+endforeach
+
+plperl_inc = include_directories('.')
+shared_module('plperl',
+ plperl_sources,
+ c_pch: '../../include/pch/postgres_pch.h',
+ include_directories: [plperl_inc, postgres_inc],
+ kwargs: pg_mod_args + {
+ 'dependencies': [perl_dep, pg_mod_args['dependencies']],
+ },
+)
+
+install_data(
+ 'plperl.control',
+ 'plperl--1.0.sql',
+ install_dir: get_option('datadir') / 'extension'
+)
+
+install_data(
+ 'plperlu.control',
+ 'plperlu--1.0.sql',
+ install_dir: get_option('datadir') / 'extension'
+)
+
+regress_tests += {
+ 'name': 'plperl',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'plperl_setup',
+ 'plperl',
+ 'plperl_lc',
+ 'plperl_trigger',
+ 'plperl_shared',
+ 'plperl_elog',
+ 'plperl_util',
+ 'plperl_init',
+ 'plperlu',
+ 'plperl_array',
+ 'plperl_call',
+ 'plperl_transaction',
+ ],
+}
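
As with the TAP entries above, the regress_tests dictionaries added for plperl here (and for plpgsql and plpython below) are only collected at this point; the pg_regress invocation is generated centrally. A minimal sketch - assuming a pg_regress program target defined elsewhere, and ignoring the locale/port/temp-install handling a real run needs - could look like:

    foreach rt : regress_tests
      # pass the listed regression scripts straight to pg_regress
      test(rt['name'] + '/regress',
        pg_regress,
        args: ['--inputdir=' + rt['sd'],
               '--outputdir=' + rt['bd']] + rt['sql'],
        workdir: rt['bd'])
    endforeach
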
diff --git a/src/pl/plpgsql/meson.build b/src/pl/plpgsql/meson.build
new file mode 100644
index 00000000000..9537275d67c
--- /dev/null
+++ b/src/pl/plpgsql/meson.build
@@ -0,0 +1 @@
+subdir('src')
diff --git a/src/pl/plpgsql/src/meson.build b/src/pl/plpgsql/src/meson.build
new file mode 100644
index 00000000000..b040e5e8507
--- /dev/null
+++ b/src/pl/plpgsql/src/meson.build
@@ -0,0 +1,67 @@
+plpgsql_sources = files(
+ 'pl_comp.c',
+ 'pl_exec.c',
+ 'pl_funcs.c',
+ 'pl_handler.c',
+ 'pl_scanner.c',
+)
+
+plpgsql_sources += custom_target('gram',
+ input: ['pl_gram.y'],
+ output: ['pl_gram.c', 'pl_gram.h'],
+ command: [bison, bisonflags, '-d', '-o', '@OUTPUT0@', '@INPUT0@'])
+
+gen_plerrcodes = files('generate-plerrcodes.pl')
+plpgsql_sources += custom_target('plerrcodes',
+ input: ['../../../../src/backend/utils/errcodes.txt'],
+ output: ['plerrcodes.h'],
+ command: [perl, gen_plerrcodes, '@INPUT0@'],
+ capture: true)
+
+gen_keywordlist = files('../../../../src/tools/gen_keywordlist.pl')
+plpgsql_sources += custom_target('pl_reserved_kwlist',
+ input: ['pl_reserved_kwlist.h'],
+ output: ['pl_reserved_kwlist_d.h'],
+ command: [perl, '-I', '@SOURCE_ROOT@/src/tools', gen_keywordlist, '--output', '@OUTDIR@', '--varname', 'ReservedPLKeywords', '@INPUT@']
+)
+
+plpgsql_sources += custom_target('pl_unreserved_kwlist',
+ input: ['pl_unreserved_kwlist.h'],
+ output: ['pl_unreserved_kwlist_d.h'],
+ command: [perl, '-I', '@SOURCE_ROOT@/src/tools', gen_keywordlist, '--output', '@OUTDIR@', '--varname', 'UnreservedPLKeywords', '@INPUT@']
+)
+
+shared_module('plpgsql',
+ plpgsql_sources,
+ c_pch: '../../../include/pch/postgres_pch.h',
+ include_directories: include_directories('.'),
+ kwargs: pg_mod_args,
+)
+
+install_data('plpgsql.control', 'plpgsql--1.0.sql',
+ install_dir: get_option('datadir') / 'extension'
+)
+
+install_headers('plpgsql.h',
+ install_dir: get_option('includedir') / 'server')
+
+
+regress_tests += {
+ 'name': 'plpgsql',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'plpgsql_array',
+ 'plpgsql_call',
+ 'plpgsql_control',
+ 'plpgsql_copy',
+ 'plpgsql_domain',
+ 'plpgsql_record',
+ 'plpgsql_cache',
+ 'plpgsql_simple',
+ 'plpgsql_transaction',
+ 'plpgsql_trap',
+ 'plpgsql_trigger',
+ 'plpgsql_varprops',
+ ],
+}
diff --git a/src/pl/plpython/meson.build b/src/pl/plpython/meson.build
new file mode 100644
index 00000000000..48a02b532aa
--- /dev/null
+++ b/src/pl/plpython/meson.build
@@ -0,0 +1,78 @@
+if not python3.found()
+ subdir_done()
+endif
+
+plpython_sources = files(
+ 'plpy_cursorobject.c',
+ 'plpy_elog.c',
+ 'plpy_exec.c',
+ 'plpy_main.c',
+ 'plpy_planobject.c',
+ 'plpy_plpymodule.c',
+ 'plpy_procedure.c',
+ 'plpy_resultobject.c',
+ 'plpy_spi.c',
+ 'plpy_subxactobject.c',
+ 'plpy_typeio.c',
+ 'plpy_util.c',
+)
+
+plpython_sources += custom_target('spiexceptions.h',
+ input: files('../../backend/utils/errcodes.txt'),
+ output: 'spiexceptions.h',
+ command: [perl, files('generate-spiexceptions.pl'), '@INPUT@'],
+ capture: true
+ )
+
+
+# FIXME: need to duplicate import library ugliness?
+plpython_inc = include_directories('.')
+
+shared_module('plpython3',
+ plpython_sources,
+ c_pch: '../../include/pch/postgres_pch.h',
+ include_directories: [plpython_inc, postgres_inc],
+ kwargs: pg_mod_args + {
+ 'dependencies': [python3, pg_mod_args['dependencies']],
+ },
+)
+
+# FIXME: Only install the relevant versions
+install_data(
+ 'plpython3u.control',
+ 'plpython3u--1.0.sql',
+ install_dir: get_option('datadir') / 'extension'
+)
+
+plpython_regress = [
+ 'plpython_schema',
+ 'plpython_populate',
+ 'plpython_test',
+ 'plpython_do',
+ 'plpython_global',
+ 'plpython_import',
+ 'plpython_spi',
+ 'plpython_newline',
+ 'plpython_void',
+ 'plpython_call',
+ 'plpython_params',
+ 'plpython_setof',
+ 'plpython_record',
+ 'plpython_trigger',
+ 'plpython_types',
+ 'plpython_error',
+ 'plpython_ereport',
+ 'plpython_unicode',
+ 'plpython_quote',
+ 'plpython_composite',
+ 'plpython_subtransaction',
+ 'plpython_transaction',
+ 'plpython_drop',
+]
+
+regress_tests += {
+ 'name': 'plpython',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': plpython_regress,
+}
diff --git a/src/port/meson.build b/src/port/meson.build
new file mode 100644
index 00000000000..c2f35c02f1f
--- /dev/null
+++ b/src/port/meson.build
@@ -0,0 +1,191 @@
+pgport_sources = [
+ 'bsearch_arg.c',
+ 'chklocale.c',
+ 'erand48.c',
+ 'inet_net_ntop.c',
+ 'noblock.c',
+ 'path.c',
+ 'pg_bitutils.c',
+ 'pg_strong_random.c',
+ 'pgcheckdir.c',
+ 'pgmkdirp.c',
+ 'pgsleep.c',
+ 'pgstrcasecmp.c',
+ 'pgstrsignal.c',
+ 'pqsignal.c',
+ 'qsort.c',
+ 'qsort_arg.c',
+ 'quotes.c',
+ 'snprintf.c',
+ 'strerror.c',
+ 'tar.c',
+ 'thread.c',
+]
+
+if host_machine.system() == 'windows'
+ pgport_sources += files(
+ 'dirmod.c',
+ 'kill.c',
+ 'open.c',
+ 'system.c',
+ 'win32env.c',
+ 'win32error.c',
+ 'win32security.c',
+ 'win32setlocale.c',
+ 'win32stat.c',
+ )
+endif
+
+if cc.get_id() == 'msvc'
+ pgport_sources += files(
+ 'dirent.c',
+ )
+endif
+
+# Replacement functionality to be built if corresponding configure symbol
+# is false
+replace_funcs_neg = [
+ ['dlopen'],
+ ['explicit_bzero'],
+ ['fls'],
+ ['getaddrinfo'],
+ ['getopt'],
+ ['getopt_long'],
+ ['getpeereid'],
+ ['getrusage'],
+ ['gettimeofday'],
+ ['inet_aton'],
+ ['link'],
+ ['mkdtemp'],
+ ['pread'],
+ ['preadv', 'HAVE_DECL_PREADV'],
+ ['pwrite'],
+ ['pwritev', 'HAVE_DECL_PWRITEV'],
+ ['random'],
+ ['srandom'],
+ ['strlcat'],
+ ['strlcpy'],
+ ['strnlen'],
+]
+
+# Replacement functionality to be built if corresponding configure symbol
+# is true
+replace_funcs_pos = [
+ # x86/x64
+ ['pg_crc32c_sse42', 'USE_SSE42_CRC32C'],
+ ['pg_crc32c_sse42', 'USE_SSE42_CRC32C_WITH_RUNTIME_CHECK', 'crc'],
+ ['pg_crc32c_sse42_choose', 'USE_SSE42_CRC32C_WITH_RUNTIME_CHECK'],
+ ['pg_crc32c_sb8', 'USE_SSE42_CRC32C_WITH_RUNTIME_CHECK'],
+
+ # arm / aarch64
+ ['pg_crc32c_armv8', 'USE_ARMV8_CRC32C'],
+ ['pg_crc32c_armv8', 'USE_ARMV8_CRC32C_WITH_RUNTIME_CHECK', 'crc'],
+ ['pg_crc32c_armv8_choose', 'USE_ARMV8_CRC32C_WITH_RUNTIME_CHECK'],
+ ['pg_crc32c_sb8', 'USE_ARMV8_CRC32C_WITH_RUNTIME_CHECK'],
+
+ # generic fallback
+ ['pg_crc32c_sb8', 'USE_SLICING_BY_8_CRC32C'],
+]
+
+pgport_cflags = {'crc': cflags_crc}
+pgport_sources_cflags = {'crc': []}
+
+foreach f : replace_funcs_neg
+ func = f.get(0)
+ varname = f.get(1, 'HAVE_@0@'.format(func.to_upper()))
+ filename = '@0@.c'.format(func)
+
+ val = '@0@'.format(cdata.get(varname, 'false'))
+ if val == 'false' or val == '0'
+ pgport_sources += files(filename)
+ endif
+endforeach
+
+foreach f : replace_funcs_pos
+ func = f.get(0)
+ varname = f.get(1, 'HAVE_@0@'.format(func.to_upper()))
+ filename = '@0@.c'.format(func)
+
+ val = '@0@'.format(cdata.get(varname, 'false'))
+ if val == 'true' or val == '1'
+ src = files(filename)
+ if f.length() > 2
+ pgport_sources_cflags += {f[2]: pgport_sources_cflags[f[2]] + src}
+ else
+ pgport_sources += src
+ endif
+ endif
+endforeach
+
+
+if (host_machine.system() == 'windows' or host_machine.system() == 'cygwin') and \
+ (cc.get_id() != 'msvc' or cc.version().version_compare('<14.0'))
+
+ # Cygwin and (apparently, based on test results) Mingw both
+ # have a broken strtof(), so substitute the same replacement
+ # code we use with VS2013. That's not a perfect fix, since
+ # (unlike with VS2013) it doesn't avoid double-rounding, but
+ # we have no better options. To get that, though, we have to
+ # force the file to be compiled despite HAVE_STRTOF.
+ pgport_sources += files('strtof.c')
+ message('On @0@ with compiler @1@ @2@ we will use our strtof wrapper.'.format(
+ host_machine.system(), cc.get_id(), cc.version()))
+endif
+
+if not cdata.has('HAVE_PTHREAD_BARRIER_WAIT') and host_machine.system() != 'windows'
+ pgport_sources += files('pthread_barrier_wait.c')
+endif
+
+
+# Build pgport once for backend, once for use in frontend binaries, and once
+# for use in shared libraries
+pgport = {}
+pgport_variants = {
+ 'srv' : internal_lib_args + {
+ 'dependencies': [backend_port_code],
+ },
+ 'static' : default_lib_args + {
+ 'dependencies': [frontend_port_code],
+ },
+ 'shlib' : default_lib_args + {
+ 'pic': true,
+ 'dependencies': [frontend_port_code],
+ },
+}
+
+foreach name, opts : pgport_variants
+
+ # Build internal static libraries for sets of files that need to be built
+ # with different cflags
+ cflag_libs = []
+ foreach cflagname, sources: pgport_sources_cflags
+ if sources.length() == 0
+ continue
+ endif
+ c_args = opts.get('c_args', []) + pgport_cflags[cflagname]
+ cflag_libs += static_library('pgport_@0@_@1@'.format(cflagname, name),
+ sources,
+ c_pch: '../include/pch/c_pch.h',
+ kwargs: opts + {
+ 'c_args': c_args,
+ 'build_by_default': false,
+ 'install': false,
+ },
+ )
+ endforeach
+
+ lib = static_library('pgport_@0@'.format(name),
+ pgport_sources,
+ link_with: cflag_libs,
+ c_pch: '../include/pch/c_pch.h',
+ kwargs: opts + {
+ 'dependencies': opts['dependencies'] + [ssl],
+ }
+ )
+ pgport += {name: lib}
+endforeach
+
+pgport_srv = pgport['srv']
+pgport_static = pgport['static']
+pgport_shlib = pgport['shlib']
diff --git a/src/port/win32ver.rc.in b/src/port/win32ver.rc.in
new file mode 100644
index 00000000000..d5c98e2e8fb
--- /dev/null
+++ b/src/port/win32ver.rc.in
@@ -0,0 +1,41 @@
+#include <winver.h>
+#include "pg_config.h"
+
+// https://docs.microsoft.com/en-us/windows/win32/menurc/versioninfo-resource
+
+VS_VERSION_INFO VERSIONINFO
+ FILEVERSION PG_MAJORVERSION_NUM,0,PG_MINORVERSION_NUM,0
+ PRODUCTVERSION PG_MAJORVERSION_NUM,0,PG_MINORVERSION_NUM,0
+ FILEFLAGSMASK VS_FFI_FILEFLAGSMASK
+ FILEFLAGS 0x0L
+ FILEOS VOS_NT_WINDOWS32
+ FILETYPE @VFT_TYPE@
+ FILESUBTYPE 0x0L
+BEGIN
+ BLOCK "StringFileInfo"
+ BEGIN
+ BLOCK "040904B0" // U.S. English, Unicode
+ BEGIN
+ VALUE "CompanyName", "PostgreSQL Global Development Group"
+ VALUE "FileDescription", @FILEDESC@
+ /*
+ * XXX: In the autoconf / src/tools/msvc build this was set differently than
+ * ProductVersion below, using the current date. But that doesn't seem like a
+ * good idea, because it makes the build not reproducible and causes
+ * unnecessary rebuilds?
+ */
+ VALUE "FileVersion", PG_VERSION
+ VALUE "InternalName", @INTERNAL_NAME@
+ VALUE "LegalCopyright", "Portions Copyright (c) 1996-2021, PostgreSQL Global Development Group. Portions Copyright (c) 1994, Regents of the University of California."
+ VALUE "OriginalFileName", @ORIGINAL_NAME@
+ VALUE "ProductName", "PostgreSQL"
+ VALUE "ProductVersion", PG_VERSION
+ END
+ END
+ BLOCK "VarFileInfo"
+ BEGIN
+ VALUE "Translation", 0x0409, 1200 // U.S. English, Unicode
+ END
+END
+
+IDI_ICON ICON @ICO@
diff --git a/src/test/authentication/meson.build b/src/test/authentication/meson.build
new file mode 100644
index 00000000000..be41fb314a5
--- /dev/null
+++ b/src/test/authentication/meson.build
@@ -0,0 +1,9 @@
+tap_tests += {
+ 'name': 'authentication',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_password.pl',
+ 't/002_saslprep.pl',
+ ],
+}
diff --git a/src/test/isolation/meson.build b/src/test/isolation/meson.build
new file mode 100644
index 00000000000..ea8baa20634
--- /dev/null
+++ b/src/test/isolation/meson.build
@@ -0,0 +1,49 @@
+# pg_regress_c helpfully provided by regress/meson.build
+
+isolation_sources = pg_regress_c + files(
+ 'isolation_main.c',
+)
+
+# see src/backend/replication/meson.build for depend logic
+spec_scanner = custom_target('specscanner',
+ input : files('specscanner.l'),
+ output : ['specscanner.c'],
+ command : [flex, '-o', '@OUTPUT0@', '@INPUT@']
+)
+
+isolationtester_sources = files('isolationtester.c')
+isolationtester_sources += custom_target('specparse',
+ input: 'specparse.y',
+ output: 'specparse.c',
+ depends: spec_scanner,
+ command: [bison, bisonflags, '-o', '@OUTPUT@', '@INPUT0@'])
+
+pg_isolation_regress = executable('pg_isolation_regress',
+ isolation_sources,
+ c_args: pg_regress_cflags,
+ include_directories: [pg_regress_inc],
+ dependencies: [frontend_code],
+ kwargs: default_bin_args + {
+ 'install': false
+ },
+)
+
+isolationtester = executable('isolationtester',
+ isolationtester_sources,
+ include_directories: include_directories('.'),
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args + {
+ 'install': false
+ },
+)
+
+isolation_tests += {
+ 'name': 'main',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'schedule': files('isolation_schedule'),
+ 'test_kwargs': {
+ 'priority': 40,
+ 'timeout': 1000,
+ },
+}
diff --git a/src/test/kerberos/meson.build b/src/test/kerberos/meson.build
new file mode 100644
index 00000000000..9f9957a3b4c
--- /dev/null
+++ b/src/test/kerberos/meson.build
@@ -0,0 +1,12 @@
+tap_tests += {
+ 'name': 'kerberos',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_auth.pl',
+ ],
+ 'env' : {
+ 'with_gssapi': gssapi.found() ? 'yes' : 'no',
+ 'with_krb_srvnam': 'postgres',
+ },
+}
diff --git a/src/test/ldap/meson.build b/src/test/ldap/meson.build
new file mode 100644
index 00000000000..58eb9adc6f5
--- /dev/null
+++ b/src/test/ldap/meson.build
@@ -0,0 +1,9 @@
+tap_tests += {
+ 'name': 'ldap',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_auth.pl',
+ ],
+ 'env' : {'with_ldap': ldap.found() ? 'yes' : 'no'},
+}
diff --git a/src/test/meson.build b/src/test/meson.build
new file mode 100644
index 00000000000..f0b0d3d3b5e
--- /dev/null
+++ b/src/test/meson.build
@@ -0,0 +1,19 @@
+subdir('regress')
+subdir('isolation')
+
+subdir('authentication')
+subdir('recovery')
+subdir('subscription')
+subdir('modules')
+
+if ssl.found()
+ subdir('ssl')
+endif
+
+if ldap.found()
+ subdir('ldap')
+endif
+
+if gssapi.found()
+ subdir('kerberos')
+endif
diff --git a/src/test/modules/brin/meson.build b/src/test/modules/brin/meson.build
new file mode 100644
index 00000000000..99ccaac5b38
--- /dev/null
+++ b/src/test/modules/brin/meson.build
@@ -0,0 +1,19 @@
+isolation_tests += {
+ 'name': 'brin',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'specs': [
+ 'summarization-and-inprogress-insertion',
+ ]
+}
+
+
+tap_tests += {
+ 'name': 'brin',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/01_workitems.pl',
+ ],
+}
+
diff --git a/src/test/modules/commit_ts/meson.build b/src/test/modules/commit_ts/meson.build
new file mode 100644
index 00000000000..2794d837c35
--- /dev/null
+++ b/src/test/modules/commit_ts/meson.build
@@ -0,0 +1,20 @@
+regress_tests += {
+ 'name': 'commit_ts',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'commit_timestamp',
+ ]
+}
+
+tap_tests += {
+ 'name': 'commit_ts',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_base.pl',
+ 't/002_standby.pl',
+ 't/003_standby_2.pl',
+ 't/004_restart.pl',
+ ],
+}
diff --git a/src/test/modules/delay_execution/meson.build b/src/test/modules/delay_execution/meson.build
new file mode 100644
index 00000000000..58fe5a1a21d
--- /dev/null
+++ b/src/test/modules/delay_execution/meson.build
@@ -0,0 +1,15 @@
+# FIXME: prevent install during main install, but not during test :/
+delay_execution = shared_module('delay_execution',
+ ['delay_execution.c'],
+ kwargs: pg_mod_args,
+)
+
+isolation_tests += {
+ 'name': 'delay_execution',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'specs': [
+ 'partition-addition',
+ 'partition-removal-1',
+ ]
+}
diff --git a/src/test/modules/dummy_index_am/meson.build b/src/test/modules/dummy_index_am/meson.build
new file mode 100644
index 00000000000..a9c49bd9554
--- /dev/null
+++ b/src/test/modules/dummy_index_am/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+dummy_index_am = shared_module('dummy_index_am',
+ ['dummy_index_am.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'dummy_index_am.control',
+ 'dummy_index_am--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'dummy_index_am',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'reloptions',
+ ]
+}
diff --git a/src/test/modules/dummy_seclabel/meson.build b/src/test/modules/dummy_seclabel/meson.build
new file mode 100644
index 00000000000..ed31d8f9530
--- /dev/null
+++ b/src/test/modules/dummy_seclabel/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+dummy_seclabel = shared_module('dummy_seclabel',
+ ['dummy_seclabel.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'dummy_seclabel.control',
+ 'dummy_seclabel--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'dummy_seclabel',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'dummy_seclabel',
+ ]
+}
diff --git a/src/test/modules/libpq_pipeline/meson.build b/src/test/modules/libpq_pipeline/meson.build
new file mode 100644
index 00000000000..2f850215a6f
--- /dev/null
+++ b/src/test/modules/libpq_pipeline/meson.build
@@ -0,0 +1,21 @@
+libpq_pipeline = executable('libpq_pipeline',
+ files(
+ 'libpq_pipeline.c',
+ ),
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args + {
+ 'install': false,
+ },
+)
+
+tap_tests += {
+ 'name': 'libpq_pipeline',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'env': {
+ 'PATH': meson.current_build_dir(),
+ },
+ 'tests': [
+ 't/001_libpq_pipeline.pl',
+ ]
+}
diff --git a/src/test/modules/meson.build b/src/test/modules/meson.build
new file mode 100644
index 00000000000..c98225c6e7b
--- /dev/null
+++ b/src/test/modules/meson.build
@@ -0,0 +1,25 @@
+subdir('brin')
+subdir('commit_ts')
+subdir('delay_execution')
+subdir('dummy_index_am')
+subdir('dummy_seclabel')
+subdir('libpq_pipeline')
+subdir('plsample')
+subdir('snapshot_too_old')
+subdir('spgist_name_ops')
+subdir('ssl_passphrase_callback')
+subdir('test_bloomfilter')
+subdir('test_ddl_deparse')
+subdir('test_extensions')
+subdir('test_ginpostinglist')
+subdir('test_integerset')
+subdir('test_misc')
+subdir('test_parser')
+subdir('test_pg_dump')
+subdir('test_predtest')
+subdir('test_rbtree')
+subdir('test_regex')
+subdir('test_rls_hooks')
+subdir('test_shm_mq')
+subdir('unsafe_tests')
+subdir('worker_spi')
diff --git a/src/test/modules/plsample/meson.build b/src/test/modules/plsample/meson.build
new file mode 100644
index 00000000000..3f70688fb89
--- /dev/null
+++ b/src/test/modules/plsample/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+plsample = shared_module('plsample',
+ ['plsample.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'plsample.control',
+ 'plsample--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'plsample',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'plsample',
+ ]
+}
diff --git a/src/test/modules/snapshot_too_old/meson.build b/src/test/modules/snapshot_too_old/meson.build
new file mode 100644
index 00000000000..cdf4afd18b8
--- /dev/null
+++ b/src/test/modules/snapshot_too_old/meson.build
@@ -0,0 +1,11 @@
+isolation_tests += {
+ 'name': 'snapshot_too_old',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'specs': [
+ 'sto_using_cursor',
+ 'sto_using_select',
+ 'sto_using_hash_index',
+ ],
+ 'regress_args': ['--temp-config', files('sto.conf')],
+}
diff --git a/src/test/modules/spgist_name_ops/meson.build b/src/test/modules/spgist_name_ops/meson.build
new file mode 100644
index 00000000000..19aa00892f1
--- /dev/null
+++ b/src/test/modules/spgist_name_ops/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+spgist_name_ops = shared_module('spgist_name_ops',
+ ['spgist_name_ops.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'spgist_name_ops.control',
+ 'spgist_name_ops--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'spgist_name_ops',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'spgist_name_ops',
+ ]
+}
diff --git a/src/test/modules/ssl_passphrase_callback/meson.build b/src/test/modules/ssl_passphrase_callback/meson.build
new file mode 100644
index 00000000000..b9fa5ee1cdc
--- /dev/null
+++ b/src/test/modules/ssl_passphrase_callback/meson.build
@@ -0,0 +1,45 @@
+if not ssl.found()
+ subdir_done()
+endif
+
+# FIXME: prevent install during main install, but not during test :/
+ssl_passphrase_callback = shared_module('ssl_passphrase_func',
+ ['ssl_passphrase_func.c'],
+ kwargs: pg_mod_args + {
+ 'dependencies': [ssl, pg_mod_args['dependencies']],
+ }
+)
+
+# Targets to regenerate the SSL certificate and key. The generated files need
+# to be copied back into the source tree afterwards; normally this is not needed.
+
+openssl = find_program('openssl', native: true, required: false)
+
+if openssl.found()
+ cert = custom_target('server.crt',
+ output: ['server.crt', 'server.ckey'],
+ command: [openssl, 'req', '-new', '-x509', '-days', '10000', '-nodes', '-out', '@OUTPUT0@',
+ '-keyout', '@OUTPUT1@', '-subj', '/CN=localhost'],
+ build_by_default: false,
+ install: false,
+ )
+
+ # needs to agree with what's in the test script
+ pass = 'FooBaR1'
+
+ enccert = custom_target('server.key',
+ input: [cert[1]],
+ output: ['server.key'],
+ command: [openssl, 'rsa', '-aes256', '-in', '@INPUT0@', '-out', '@OUTPUT0@', '-passout', 'pass:@0@'.format(pass)]
+ )
+endif
+
+tap_tests += {
+ 'name': 'ssl_passphrase_callback',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_testfunc.pl',
+ ],
+ 'env': {'with_ssl': 'openssl'},
+}
diff --git a/src/test/modules/test_bloomfilter/meson.build b/src/test/modules/test_bloomfilter/meson.build
new file mode 100644
index 00000000000..2e995310876
--- /dev/null
+++ b/src/test/modules/test_bloomfilter/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+test_bloomfilter = shared_module('test_bloomfilter',
+ ['test_bloomfilter.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_bloomfilter.control',
+ 'test_bloomfilter--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_bloomfilter',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_bloomfilter',
+ ]
+}
diff --git a/src/test/modules/test_ddl_deparse/meson.build b/src/test/modules/test_ddl_deparse/meson.build
new file mode 100644
index 00000000000..3618229594d
--- /dev/null
+++ b/src/test/modules/test_ddl_deparse/meson.build
@@ -0,0 +1,40 @@
+# FIXME: prevent install during main install, but not during test :/
+test_ddl_deparse = shared_module('test_ddl_deparse',
+ ['test_ddl_deparse.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_ddl_deparse.control',
+ 'test_ddl_deparse--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_ddl_deparse',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_ddl_deparse',
+ 'create_extension',
+ 'create_schema',
+ 'create_type',
+ 'create_conversion',
+ 'create_domain',
+ 'create_sequence_1',
+ 'create_table',
+ 'create_transform',
+ 'alter_table',
+ 'create_view',
+ 'create_trigger',
+ 'create_rule',
+ 'comment_on',
+ 'alter_function',
+ 'alter_sequence',
+ 'alter_ts_config',
+ 'alter_type_enum',
+ 'opfamily',
+ 'defprivs',
+ 'matviews',
+ ]
+}
diff --git a/src/test/modules/test_extensions/meson.build b/src/test/modules/test_extensions/meson.build
new file mode 100644
index 00000000000..2ca504f8588
--- /dev/null
+++ b/src/test/modules/test_extensions/meson.build
@@ -0,0 +1,38 @@
+# FIXME: prevent install during main install, but not during test :/
+install_data(
+ 'test_ext1--1.0.sql',
+ 'test_ext1.control',
+ 'test_ext2--1.0.sql',
+ 'test_ext2.control',
+ 'test_ext3--1.0.sql',
+ 'test_ext3.control',
+ 'test_ext4--1.0.sql',
+ 'test_ext4.control',
+ 'test_ext5--1.0.sql',
+ 'test_ext5.control',
+ 'test_ext6--1.0.sql',
+ 'test_ext6.control',
+ 'test_ext7--1.0--2.0.sql',
+ 'test_ext7--1.0.sql',
+ 'test_ext7.control',
+ 'test_ext8--1.0.sql',
+ 'test_ext8.control',
+ 'test_ext_cyclic1--1.0.sql',
+ 'test_ext_cyclic1.control',
+ 'test_ext_cyclic2--1.0.sql',
+ 'test_ext_cyclic2.control',
+ 'test_ext_evttrig--1.0--2.0.sql',
+ 'test_ext_evttrig--1.0.sql',
+ 'test_ext_evttrig.control',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_extensions',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_extensions',
+ 'test_extdepend',
+ ]
+}
diff --git a/src/test/modules/test_ginpostinglist/meson.build b/src/test/modules/test_ginpostinglist/meson.build
new file mode 100644
index 00000000000..e177e90019f
--- /dev/null
+++ b/src/test/modules/test_ginpostinglist/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+test_ginpostinglist = shared_module('test_ginpostinglist',
+ ['test_ginpostinglist.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_ginpostinglist.control',
+ 'test_ginpostinglist--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_ginpostinglist',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_ginpostinglist',
+ ]
+}
diff --git a/src/test/modules/test_integerset/meson.build b/src/test/modules/test_integerset/meson.build
new file mode 100644
index 00000000000..ccb8db725e5
--- /dev/null
+++ b/src/test/modules/test_integerset/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+test_integerset = shared_module('test_integerset',
+ ['test_integerset.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_integerset.control',
+ 'test_integerset--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_integerset',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_integerset',
+ ]
+}
diff --git a/src/test/modules/test_misc/meson.build b/src/test/modules/test_misc/meson.build
new file mode 100644
index 00000000000..4ee8c562ac0
--- /dev/null
+++ b/src/test/modules/test_misc/meson.build
@@ -0,0 +1,8 @@
+tap_tests += {
+ 'name': 'misc',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_constraint_validation.pl',
+ ],
+}
diff --git a/src/test/modules/test_parser/meson.build b/src/test/modules/test_parser/meson.build
new file mode 100644
index 00000000000..c43ae95cf2c
--- /dev/null
+++ b/src/test/modules/test_parser/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+test_parser = shared_module('test_parser',
+ ['test_parser.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_parser.control',
+ 'test_parser--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_parser',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_parser',
+ ]
+}
diff --git a/src/test/modules/test_pg_dump/meson.build b/src/test/modules/test_pg_dump/meson.build
new file mode 100644
index 00000000000..110b3876832
--- /dev/null
+++ b/src/test/modules/test_pg_dump/meson.build
@@ -0,0 +1,24 @@
+# FIXME: prevent install during main install, but not during test :/
+install_data(
+ 'test_pg_dump.control',
+ 'test_pg_dump--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_pg_dump',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_pg_dump',
+ ]
+}
+
+tap_tests += {
+ 'name': 'test_pg_dump',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests': [
+ 't/001_base.pl',
+ ]
+}
diff --git a/src/test/modules/test_predtest/meson.build b/src/test/modules/test_predtest/meson.build
new file mode 100644
index 00000000000..9f9a9475c8b
--- /dev/null
+++ b/src/test/modules/test_predtest/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+test_predtest = shared_module('test_predtest',
+ ['test_predtest.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_predtest.control',
+ 'test_predtest--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_predtest',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_predtest',
+ ]
+}
diff --git a/src/test/modules/test_rbtree/meson.build b/src/test/modules/test_rbtree/meson.build
new file mode 100644
index 00000000000..6bbeca39ec9
--- /dev/null
+++ b/src/test/modules/test_rbtree/meson.build
@@ -0,0 +1,20 @@
+# FIXME: prevent install during main install, but not during test :/
+test_rbtree = shared_module('test_rbtree',
+ ['test_rbtree.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_rbtree.control',
+ 'test_rbtree--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_rbtree',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_rbtree',
+ ]
+}
diff --git a/src/test/modules/test_regex/meson.build b/src/test/modules/test_regex/meson.build
new file mode 100644
index 00000000000..c5fd92ee1c6
--- /dev/null
+++ b/src/test/modules/test_regex/meson.build
@@ -0,0 +1,21 @@
+# FIXME: prevent install during main install, but not during test :/
+test_regex = shared_module('test_regex',
+ ['test_regex.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_regex.control',
+ 'test_regex--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_regex',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_regex',
+ 'test_regex_utf8',
+ ]
+}
diff --git a/src/test/modules/test_rls_hooks/meson.build b/src/test/modules/test_rls_hooks/meson.build
new file mode 100644
index 00000000000..fb8b697e160
--- /dev/null
+++ b/src/test/modules/test_rls_hooks/meson.build
@@ -0,0 +1,19 @@
+# FIXME: prevent install during main install, but not during test :/
+test_rls_hooks = shared_module('test_rls_hooks',
+ ['test_rls_hooks.c'],
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_rls_hooks.control',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_rls_hooks',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_rls_hooks',
+ ]
+}
diff --git a/src/test/modules/test_shm_mq/meson.build b/src/test/modules/test_shm_mq/meson.build
new file mode 100644
index 00000000000..159943f861e
--- /dev/null
+++ b/src/test/modules/test_shm_mq/meson.build
@@ -0,0 +1,24 @@
+# FIXME: prevent install during main install, but not during test :/
+test_shm_mq = shared_module('test_shm_mq',
+ files(
+ 'setup.c',
+ 'test.c',
+ 'worker.c',
+ ),
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'test_shm_mq.control',
+ 'test_shm_mq--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'test_shm_mq',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'test_shm_mq',
+ ]
+}
diff --git a/src/test/modules/unsafe_tests/meson.build b/src/test/modules/unsafe_tests/meson.build
new file mode 100644
index 00000000000..9ed4d587721
--- /dev/null
+++ b/src/test/modules/unsafe_tests/meson.build
@@ -0,0 +1,9 @@
+regress_tests += {
+ 'name': 'unsafe_tests',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'rolenames',
+ 'alter_system_table',
+ ],
+}
diff --git a/src/test/modules/worker_spi/meson.build b/src/test/modules/worker_spi/meson.build
new file mode 100644
index 00000000000..a80bd493ea7
--- /dev/null
+++ b/src/test/modules/worker_spi/meson.build
@@ -0,0 +1,23 @@
+# FIXME: prevent install during main install, but not during test :/
+test_worker_spi = shared_module('worker_spi',
+ files(
+ 'worker_spi.c',
+ ),
+ kwargs: pg_mod_args,
+)
+
+install_data(
+ 'worker_spi.control',
+ 'worker_spi--1.0.sql',
+ kwargs: contrib_data_args,
+)
+
+regress_tests += {
+ 'name': 'worker_spi',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'sql': [
+ 'worker_spi',
+ ],
+ 'regress_args': ['--temp-config', files('dynamic.conf'), '--dbname=contrib_regression'],
+}
diff --git a/src/test/recovery/meson.build b/src/test/recovery/meson.build
new file mode 100644
index 00000000000..5678e1d27ae
--- /dev/null
+++ b/src/test/recovery/meson.build
@@ -0,0 +1,33 @@
+tap_tests += {
+ 'name': 'recovery',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'tests' : [
+ 't/001_stream_rep.pl',
+ 't/002_archiving.pl',
+ 't/003_recovery_targets.pl',
+ 't/004_timeline_switch.pl',
+ 't/005_replay_delay.pl',
+ 't/006_logical_decoding.pl',
+ 't/007_sync_rep.pl',
+ 't/008_fsm_truncation.pl',
+ 't/009_twophase.pl',
+ 't/010_logical_decoding_timelines.pl',
+ 't/011_crash_recovery.pl',
+ 't/012_subtransactions.pl',
+ 't/013_crash_restart.pl',
+ 't/014_unlogged_reinit.pl',
+ 't/015_promotion_pages.pl',
+ 't/016_min_consistency.pl',
+ 't/017_shm.pl',
+ 't/018_wal_optimize.pl',
+ 't/019_replslot_limit.pl',
+ 't/020_archive_status.pl',
+ 't/021_row_visibility.pl',
+ 't/022_crash_temp_files.pl',
+ 't/023_pitr_prepared_xact.pl',
+ 't/024_archive_recovery.pl',
+ 't/025_stuck_on_old_timeline.pl',
+ 't/026_overwrite_contrecord.pl',
+ ]
+}
diff --git a/src/test/regress/meson.build b/src/test/regress/meson.build
new file mode 100644
index 00000000000..1a2f7675e87
--- /dev/null
+++ b/src/test/regress/meson.build
@@ -0,0 +1,57 @@
+# also used by isolationtester
+pg_regress_c = files('pg_regress.c')
+pg_regress_inc = include_directories('.')
+
+regress_sources = pg_regress_c + files(
+ 'pg_regress_main.c'
+)
+
+pg_regress_cflags = ['-DHOST_TUPLE="frak"', '-DSHELLPROG="/bin/sh"']
+
+pg_regress = executable('pg_regress',
+ regress_sources,
+ c_args: pg_regress_cflags,
+ dependencies: [frontend_code, libpq],
+ kwargs: default_bin_args + {
+ 'install': false
+ },
+)
+
+regress_module = shared_module('regress',
+ ['regress.c'],
+ kwargs: pg_mod_args + {
+ 'install': false,
+ },
+)
+
+# Get some extra C modules from contrib/spi but mark them as not to be
+# installed.
+# FIXME: avoid the duplication.
+
+shared_module('autoinc',
+ ['../../../contrib/spi/autoinc.c'],
+ kwargs: pg_mod_args + {
+ 'install': false,
+ },
+)
+
+shared_module('refint',
+ ['../../../contrib/spi/refint.c'],
+ kwargs: pg_mod_args + {
+ 'c_args': refint_cflags + contrib_mod_args['c_args'],
+ 'install': false,
+ },
+)
+
+
+regress_tests += {
+ 'name': 'main',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'schedule': files('parallel_schedule'),
+ 'regress_args': ['--make-testtablespace-dir'],
+ 'test_kwargs': {
+ 'priority': 50,
+ 'timeout': 1000,
+ },
+}
diff --git a/src/test/ssl/meson.build b/src/test/ssl/meson.build
new file mode 100644
index 00000000000..42e34c9f632
--- /dev/null
+++ b/src/test/ssl/meson.build
@@ -0,0 +1,10 @@
+tap_tests += {
+ 'name': 'ssl',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'env' : {'with_ssl': get_option('ssl')},
+ 'tests': [
+ 't/001_ssltests.pl',
+ 't/002_scram.pl'
+ ],
+}
diff --git a/src/test/subscription/meson.build b/src/test/subscription/meson.build
new file mode 100644
index 00000000000..1024952e25b
--- /dev/null
+++ b/src/test/subscription/meson.build
@@ -0,0 +1,34 @@
+tap_tests += {
+ 'name': 'subscription',
+ 'sd': meson.current_source_dir(),
+ 'bd': meson.current_build_dir(),
+ 'env' : {'with_icu': icu.found() ? 'yes' : 'no'},
+ 'tests': [
+ 't/001_rep_changes.pl',
+ 't/002_types.pl',
+ 't/003_constraints.pl',
+ 't/004_sync.pl',
+ 't/005_encoding.pl',
+ 't/006_rewrite.pl',
+ 't/007_ddl.pl',
+ 't/008_diff_schema.pl',
+ 't/009_matviews.pl',
+ 't/010_truncate.pl',
+ 't/011_generated.pl',
+ 't/012_collation.pl',
+ 't/013_partition.pl',
+ 't/014_binary.pl',
+ 't/015_stream.pl',
+ 't/016_stream_subxact.pl',
+ 't/017_stream_ddl.pl',
+ 't/018_stream_subxact_abort.pl',
+ 't/019_stream_subxact_ddl_abort.pl',
+ 't/020_messages.pl',
+ 't/021_twophase.pl',
+ 't/022_twophase_cascade.pl',
+ 't/023_twophase_stream.pl',
+ 't/024_add_drop_pub.pl',
+ 't/025_rep_changes_for_schema.pl',
+ 't/100_bugs.pl',
+ ],
+}
diff --git a/src/timezone/meson.build b/src/timezone/meson.build
new file mode 100644
index 00000000000..c3703a5ec7d
--- /dev/null
+++ b/src/timezone/meson.build
@@ -0,0 +1,50 @@
+# files to build into backend
+timezone_sources = files(
+ 'localtime.c',
+ 'pgtz.c',
+ 'strftime.c',
+)
+
+
+timezone_inc = include_directories('.')
+
+timezone_localtime_source = files('localtime.c')
+
+# files needed to build zic utility program
+zic_sources = files(
+ 'zic.c'
+)
+
+# we now distribute the timezone data as a single file
+tzdata = files(
+ 'data/tzdata.zi'
+)
+
+
+# FIXME: For cross builds, we would need a natively built libpgport/pgcommon to
+# build our zic. But for that we'd need to run a good chunk of the configure
+# tests both natively and cross. Unclear if it's worth it.
+if meson.is_cross_build()
+ zic = find_program('zic', native: true, required: false)
+else
+ zic = executable('zic', zic_sources,
+ dependencies: [frontend_code],
+ kwargs: default_bin_args + {'install': false}
+ )
+endif
+
+# FIXME: this used to be sorted - but also isn't actually used
+abbrevs_txt = custom_target('abbrevs.txt',
+ input: tzdata,
+ output: ['abbrevs.txt'],
+ command: [zic, '-P', '-b', 'fat', 'junkdir', '@INPUT@'],
+ capture: true)
+
+tzdata = custom_target('tzdata',
+ input: tzdata,
+ output: ['timezone'],
+ command: [zic, '-d', '@OUTPUT@', '@INPUT@'],
+ install: true,
+ install_dir: get_option('datadir'))
+
+subdir('tznames')
diff --git a/src/timezone/tznames/meson.build b/src/timezone/tznames/meson.build
new file mode 100644
index 00000000000..effd2880ce7
--- /dev/null
+++ b/src/timezone/tznames/meson.build
@@ -0,0 +1,20 @@
+tznames = files(
+ 'Africa.txt',
+ 'America.txt',
+ 'Antarctica.txt',
+ 'Asia.txt',
+ 'Atlantic.txt',
+ 'Australia.txt',
+ 'Etc.txt',
+ 'Europe.txt',
+ 'Indian.txt',
+ 'Pacific.txt',
+)
+
+tznames_sets = files(
+ 'Default',
+ 'Australia',
+ 'India')
+
+install_data(tznames, install_dir: get_option('datadir') / 'timezonesets')
+install_data(tznames_sets, install_dir: get_option('datadir') / 'timezonesets')
diff --git a/src/tools/find_meson b/src/tools/find_meson
new file mode 100755
index 00000000000..2d75537374e
--- /dev/null
+++ b/src/tools/find_meson
@@ -0,0 +1,20 @@
+#!/usr/bin/env python3
+
+import os
+import shlex
+import sys
+
+mesonintrospect = os.environ['MESONINTROSPECT']
+components = shlex.split(mesonintrospect)
+
+if len(components) < 2:
+ print(f'expected at least two components in MESONINTROSPECT, got: {components}')
+ sys.exit(1)
+
+if components[-1] != 'introspect':
+ print("expected 'introspect' as the last component")
+ sys.exit(1)
+
+print('\n'.join(components[:-1]), end='')
+
+sys.exit(0)
diff --git a/src/tools/irlink b/src/tools/irlink
new file mode 100644
index 00000000000..efc2c700277
--- /dev/null
+++ b/src/tools/irlink
@@ -0,0 +1,28 @@
+#!/bin/bash
+
+set -e
+
+srcdir="$1"
+builddir="$2"
+llvm_lto="$3"
+outputdir=$(realpath "$5")
+index="$outputdir/postgres.index.bc"
+priv="$6"
+shift 6
+numinput=$#
+
+if [ ! -d "$outputdir" ]; then
+ mkdir -p "$outputdir/postgres"
+fi
+
+cd "$priv"
+
+# FIXME: remove old contents
+cp -r . "$outputdir/postgres"
+
+cd "$outputdir"
+
+filenames=$(for f in "$@"; do echo "postgres/${f#$priv/}"; done)
+"$llvm_lto" -thinlto -thinlto-action=thinlink -o "$index" $filenames
+
+exit 0
diff --git a/src/tools/msvc/export2def.pl b/src/tools/msvc/export2def.pl
new file mode 100644
index 00000000000..fb88e8b8ab9
--- /dev/null
+++ b/src/tools/msvc/export2def.pl
@@ -0,0 +1,22 @@
+# Copyright (c) 2021, PostgreSQL Global Development Group
+
+use strict;
+use warnings;
+use 5.8.0;
+use List::Util qw(max);
+
+my ($deffile, $txtfile, $libname) = @ARGV;
+
+print STDERR "Generating $deffile...\n";
+open(my $if, '<', $txtfile) || die("Could not open $txtfile\n");
+open(my $of, '>', $deffile) || die("Could not open $deffile for writing\n");
+print $of "LIBRARY $libname\nEXPORTS\n";
+while (<$if>)
+{
+ next if (/^#/);
+ next if (/^\s*$/);
+ my ($f, $o) = split;
+ print $of " $f @ $o\n";
+}
+close($of);
+close($if);
diff --git a/src/tools/msvc/gendef2.pl b/src/tools/msvc/gendef2.pl
new file mode 100644
index 00000000000..3b905d6f5da
--- /dev/null
+++ b/src/tools/msvc/gendef2.pl
@@ -0,0 +1,177 @@
+
+# Copyright (c) 2021, PostgreSQL Global Development Group
+
+use strict;
+use warnings;
+use 5.8.0;
+use List::Util qw(max);
+
+my @def;
+
+#
+# Script that generates a .DEF file for all objects in a directory
+#
+# src/tools/msvc/gendef2.pl
+#
+
+# Given a symbol file path, loops over its contents
+# and returns a list of symbols of interest as a dictionary
+# of 'symbolname' -> symtype, where symtype is:
+#
+# 0 a CODE symbol, left undecorated in the .DEF
+# 1 A DATA symbol, i.e. global var export
+#
+sub extract_syms
+{
+ my ($symfile, $def) = @_;
+ open(my $f, '<', $symfile) || die "Could not open $symfile: $!\n";
+ while (<$f>)
+ {
+
+ # Expected symbol lines look like:
+ #
+ # 0 1 2 3 4 5 6
+ # IDX SYMBOL SECT SYMTYPE SYMSTATIC SYMNAME
+ # ------------------------------------------------------------------------
+ # 02E 00000130 SECTA notype External | _standbyState
+ # 02F 00000009 SECT9 notype Static | _LocalRecoveryInProgress
+ # 064 00000020 SECTC notype () Static | _XLogCheckBuffer
+ # 065 00000000 UNDEF notype () External | _BufferGetTag
+ #
+ # See http://msdn.microsoft.com/en-us/library/b842y285.aspx
+ #
+ # We're not interested in the symbol index or offset.
+ #
+ # SECT[ION] is only examined to see whether the symbol is defined in a
+ # COFF section of the local object file; if UNDEF, it's a symbol to be
+ # resolved at link time from another object so we can't export it.
+ #
+ # SYMTYPE is always notype for C symbols as there's no typeinfo and no
+ # way to get the symbol type from name (de)mangling. However, we care
+ # if "notype" is suffixed by "()" or not. The presence of () means the
+ # symbol is a function, the absence means it isn't.
+ #
+ # SYMSTATIC indicates whether it's a compilation-unit local "static"
+ # symbol ("Static"), or whether it's available for use from other
+ # compilation units ("External"). We export all symbols that aren't
+ # static as part of the whole program DLL interface to produce UNIX-like
+ # default linkage.
+ #
+ # SYMNAME is, obviously, the symbol name. The leading underscore
+ # indicates that the _cdecl calling convention is used. See
+ # http://www.unixwiz.net/techtips/win32-callconv.html
+ # http://www.codeproject.com/Articles/1388/Calling-Conventions-Demystified
+ #
+ s/notype \(\)/func/g;
+ s/notype/data/g;
+
+ my @pieces = split;
+
+ # Skip file and section headers and other non-symbol entries
+ next unless defined($pieces[0]) and $pieces[0] =~ /^[A-F0-9]{3,}$/;
+
+ # Skip blank symbol names
+ next unless $pieces[6];
+
+ # Skip externs used from another compilation unit
+ next if ($pieces[2] eq "UNDEF");
+
+ # Skip static symbols
+ next unless ($pieces[4] eq "External");
+
+ # Skip some more MSVC-generated crud
+ next if $pieces[6] =~ /^@/;
+ next if $pieces[6] =~ /^\(/;
+
+ # __real and __xmm are out-of-line floating point literals and
+ # (for __xmm) their SIMD equivalents. They shouldn't be part
+ # of the DLL interface.
+ next if $pieces[6] =~ /^__real/;
+ next if $pieces[6] =~ /^__xmm/;
+
+ # __imp entries are imports from other DLLs, eg __imp__malloc .
+ # (We should never have one of these that hasn't already been skipped
+ # by the UNDEF test above, though).
+ next if $pieces[6] =~ /^__imp/;
+
+ # More under-documented internal crud
+ next if $pieces[6] =~ /NULL_THUNK_DATA$/;
+ next if $pieces[6] =~ /^__IMPORT_DESCRIPTOR/;
+ next if $pieces[6] =~ /^__NULL_IMPORT/;
+
+ # Skip string literals
+ next if $pieces[6] =~ /^\?\?_C/;
+
+ # We assume that if a symbol is defined as data, then as a function,
+ # the linker will reject the binary anyway. So it's OK to just pick
+ # whatever came last.
+ $def->{ $pieces[6] } = $pieces[3];
+ }
+ close($f);
+ return;
+}
+
+sub writedef
+{
+ my ($deffile, $platform, $def) = @_;
+ open(my $fh, '>', $deffile) || die "Could not write to $deffile\n";
+ print $fh "EXPORTS\n";
+ foreach my $f (sort keys %{$def})
+ {
+ my $isdata = $def->{$f} eq 'data';
+
+ # Strip the leading underscore for win32, but not x64
+ $f =~ s/^_//
+ unless ($platform eq "x64");
+
+ # Emit just the name if it's a function symbol, or emit the name
+ # decorated with the DATA option for variables.
+ if ($isdata)
+ {
+ print $fh " $f DATA\n";
+ }
+ else
+ {
+ print $fh " $f\n";
+ }
+ }
+ close($fh);
+ return;
+}
+
+
+sub usage
+{
+ die( "Usage: gendef.pl platform outputfile tempdir sourcelib\n"
+ . " modulepath: path to dir with obj files, no trailing slash"
+ . " platform: Win32 | x64");
+}
+
+usage()
+ unless scalar(@ARGV) >= 4;
+
+my $platform = $ARGV[0];
+shift;
+my $deffile = $ARGV[0];
+shift;
+my $tempdir = $ARGV[0];
+shift;
+
+print STDERR "Generating $deffile in tmp dir $tempdir from ".join(' ', @ARGV)."\n";
+
+my %def = ();
+
+my $symfile = "$tempdir/all.sym";
+my $tmpfile = "$tempdir/tmp.sym";
+mkdir($tempdir);
+print STDERR "dumpbin /symbols /out:$tmpfile ".join(' ', @ARGV)."\n";
+system("dumpbin /symbols /out:$tmpfile ".join(' ', @ARGV))
+ && die "Could not call dumpbin";
+rename($tmpfile, $symfile);
+print "generated symfile to $symfile (via $tmpfile)\n";
+extract_syms($symfile, \%def);
+print "\n";
+
+writedef($deffile, $platform, \%def);
+
+print "Generated " . scalar(keys(%def)) . " symbols\n";
diff --git a/src/tools/relativize_shared_library_references b/src/tools/relativize_shared_library_references
new file mode 100755
index 00000000000..db6431639f1
--- /dev/null
+++ b/src/tools/relativize_shared_library_references
@@ -0,0 +1,84 @@
+#!/usr/bin/env python3
+# -*-python-*-
+
+# This script updates a macos postgres installation to reference all internal
+# shared libraries using rpaths, leaving absolute install_names in the
+# libraries themselves intact.
+
+import os
+import shlex
+import sys
+import json
+import subprocess
+import shutil
+
+
+def installed_path(destdir, path):
+ if destdir is not None:
+ return f'{destdir}{path}'
+ else:
+ return path
+
+
+def collect_information():
+ shared_libraries = []
+ executables = []
+ shared_modules = []
+
+ targets = json.load(open(os.path.join(build_root, 'meson-info', 'intro-targets.json')))
+ installed = json.load(open(os.path.join(build_root, 'meson-info', 'intro-installed.json')))
+
+ for target in targets:
+ if not target['installed']:
+ continue
+
+ filenames = target['filename']
+
+ if target['type'] == 'shared library':
+ assert(len(filenames) == 1)
+ filename = filenames[0]
+
+ shared_libraries.append(installed[filename])
+
+ if target['type'] == 'executable':
+ assert(len(filenames) == 1)
+ filename = filenames[0]
+ executables.append(installed[filename])
+
+ if target['type'] == 'shared module':
+ assert(len(filenames) == 1)
+ filename = filenames[0]
+ shared_modules.append(installed[filename])
+
+ return shared_libraries, executables, shared_modules
+
+
+def patch_references(destdir, shared_libraries, executables, shared_modules):
+ install_name_tool = [shutil.which('install_name_tool')]
+
+ for lib in shared_libraries:
+ libname = os.path.basename(lib)
+ libpath = installed_path(destdir, lib)
+ newref = f'@rpath/{libname}'
+
+ for patch in shared_modules + executables:
+ patchpath = installed_path(destdir, patch)
+
+ #print(f'in {patchpath} replace reference to {libpath} with {newref}')
+ if not os.path.exists(patchpath):
+ print(f"path {patchpath} doesn't exist", file=sys.stderr)
+ sys.exit(1)
+
+ subprocess.check_call(install_name_tool + ['-change', lib, newref, patchpath])
+
+
+if __name__ == '__main__':
+ build_root = os.environ['MESON_BUILD_ROOT']
+ destdir = os.environ.get('DESTDIR', None)
+
+ print(f'making references to shared libraries relative, destdir is {destdir}', file=sys.stderr)
+
+ shared_libraries, executables, shared_modules = collect_information()
+ patch_references(destdir, shared_libraries, executables, shared_modules)
+
+ sys.exit(0)
diff --git a/src/tools/relpath.py b/src/tools/relpath.py
new file mode 100755
index 00000000000..87bcb496ab5
--- /dev/null
+++ b/src/tools/relpath.py
@@ -0,0 +1,6 @@
+#!/usr/bin/env python3
+
+import os
+import sys
+
+print(os.path.relpath(sys.argv[2], start=sys.argv[1]))
diff --git a/src/tools/testwrap b/src/tools/testwrap
new file mode 100755
index 00000000000..aeb2019b099
--- /dev/null
+++ b/src/tools/testwrap
@@ -0,0 +1,22 @@
+#!/bin/sh
+#
+# FIXME: I should probably be a perl or python script
+#
+
+# FIXME: argument parsing
+
+basedir=$1
+builddir=$2
+testgroup=$3
+testname=$(basename -s .pl "$4")
+shift 4
+
+testdir="$basedir/testrun/$testgroup/$testname"
+echo "# executing test in $testdir group $testgroup test $testname, builddir $builddir"
+rm -rf "$testdir/"
+mkdir -p "$testdir"
+
+export TESTOUTDIR="$testdir"
+export TESTDIR="$builddir"
+
+exec "$@"
--
2.23.0.385.gbc12974a89
v5-0016-meson-ci-Build-both-with-meson-and-as-before.patch
From 0585858de3602b88b2c8f11924d3439f457b5ab2 Mon Sep 17 00:00:00 2001
From: Andres Freund <andres@anarazel.de>
Date: Fri, 8 Oct 2021 17:29:10 -0700
Subject: [PATCH v5 16/16] meson: ci: Build both with meson and as before.
---
.cirrus.yml | 466 ++++++++++++++++++++++++++++++++++------------------
1 file changed, 309 insertions(+), 157 deletions(-)
diff --git a/.cirrus.yml b/.cirrus.yml
index 2bb6f4a14d7..a7f955b3c63 100644
--- a/.cirrus.yml
+++ b/.cirrus.yml
@@ -13,14 +13,13 @@ env:
task:
- name: FreeBSD
only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*freebsd.*'
compute_engine_instance:
image_project: pg-vm-images-aio
image: family/pg-aio-freebsd-13-0
platform: freebsd
- cpu: 2
- memory: 2G
+ cpu: 4
+ memory: 4G
disk: 50
env:
CCACHE_DIR: "/tmp/ccache_dir"
@@ -39,33 +38,52 @@ task:
- mkdir -p /tmp/ccache_dir
- chown -R postgres:postgres /tmp/ccache_dir
- configure_script: |
- su postgres -c './configure \
- --enable-cassert --enable-debug --enable-tap-tests \
- --enable-nls \
- \
- --with-icu \
- --with-ldap \
- --with-libxml \
- --with-libxslt \
- \
- --with-lz4 \
- --with-pam \
- --with-perl \
- --with-python \
- --with-ssl=openssl \
- --with-tcl --with-tclconfig=/usr/local/lib/tcl8.6/ \
- --with-uuid=bsd \
- \
- --with-includes=/usr/local/include --with-libs=/usr/local/lib \
- CC="ccache cc"'
- build_script:
- - su postgres -c 'gmake -s -j3 && gmake -s -j3 -C contrib'
- upload_caches:
- - ccache
+ matrix:
+ - name: FreeBSD autoconf
+
+ configure_script: |
+ su postgres -c './configure \
+ --enable-cassert --enable-debug --enable-tap-tests \
+ --enable-nls \
+ \
+ --with-icu \
+ --with-ldap \
+ --with-libxml \
+ --with-libxslt \
+ \
+ --with-lz4 \
+ --with-pam \
+ --with-perl \
+ --with-python \
+ --with-ssl=openssl \
+ --with-tcl --with-tclconfig=/usr/local/lib/tcl8.6/ \
+ --with-uuid=bsd \
+ \
+ --with-includes=/usr/local/include --with-libs=/usr/local/lib \
+ CC="ccache cc"'
+ build_script:
+ - su postgres -c 'gmake -s -j4 && gmake -s -j4 -C contrib'
+ upload_caches:
+ - ccache
+
+ tests_script:
+ - su postgres -c 'time gmake -s -j4 ${CHECK} ${CHECKFLAGS}'
+
+ - name: FreeBSD meson
+
+ configure_script:
+ - su postgres -c 'meson setup --buildtype debug -Dcassert=true -Dssl=openssl -Duuid=bsd build'
+ build_script:
+ - su postgres -c 'ninja -C build'
+ upload_caches:
+ - ccache
+ run_tests_script:
+ - su postgres -c 'meson test --no-rebuild -C build'
- tests_script:
- - su postgres -c 'time gmake -s -j2 ${CHECK} ${CHECKFLAGS}'
+ always:
+ meson_log_artifacts:
+ path: "build/meson-logs/*.txt"
+ type: text/plain
on_failure:
cores_script: |
@@ -83,14 +101,16 @@ task:
tap_artifacts:
path: "**/regress_log_*"
type: text/plain
+ meson_test_artifacts:
+ path: "build/meson-logs/testlog.junit.xml"
+ type: text/xml
+ format: junit
task:
- name: Linux
only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*linux.*'
compute_engine_instance:
image_project: pg-vm-images-aio
- image: family/pg-aio-bullseye
platform: linux
cpu: 4
memory: 2G
@@ -120,37 +140,78 @@ task:
- su postgres -c 'ulimit -l -S'
- echo '/tmp/%e-%s-%p.core' > /proc/sys/kernel/core_pattern
- configure_script: |
- su postgres -c './configure \
- --enable-cassert --enable-debug --enable-tap-tests \
- --enable-nls \
- \
- --with-gssapi \
- --with-icu \
- --with-ldap \
- --with-libxml \
- --with-libxslt \
- --with-llvm \
- --with-lz4 \
- --with-pam \
- --with-perl \
- --with-python \
- --with-ssl=openssl \
- --with-systemd \
- --with-tcl --with-tclconfig=/usr/lib/tcl8.6/ \
- --with-uuid=e2fs \
- \
- CC="ccache gcc" CXX="ccache g++" CLANG="ccache clang" CFLAGS="-O0 -ggdb"'
- build_script:
- - su postgres -c 'make -s -j4 && make -s -j4 -C contrib'
- upload_caches:
- - ccache
+ matrix:
+ - name: Linux Autoconf
+
+ compute_engine_instance:
+ image: family/pg-aio-bullseye
+
+ configure_script: |
+ su postgres -c './configure \
+ --enable-cassert --enable-debug --enable-tap-tests \
+ --enable-nls \
+ \
+ --with-gssapi \
+ --with-icu \
+ --with-ldap \
+ --with-libxml \
+ --with-libxslt \
+ --with-llvm \
+ --with-lz4 \
+ --with-pam \
+ --with-perl \
+ --with-python \
+ --with-ssl=openssl \
+ --with-systemd \
+ --with-tcl --with-tclconfig=/usr/lib/tcl8.6/ \
+ --with-uuid=e2fs \
+ \
+ CC="ccache gcc" CXX="ccache g++" CLANG="ccache clang" CFLAGS="-O0 -ggdb"'
+ build_script:
+ - su postgres -c 'make -s -j4 && make -s -j4 -C contrib'
+ upload_caches:
+ - ccache
+
+ tests_script: |
+ su postgres -c '\
+ ulimit -c unlimited; \
+ make -s ${CHECK} ${CHECKFLAGS} -j8 \
+ '
+
+ - name: Linux Meson
+
+ compute_engine_instance:
+ image: family/pg-aio-bullseye
+
+ configure_script:
+ - su postgres -c 'meson setup --buildtype debug -Dcassert=true -Dssl=openssl -Duuid=e2fs build'
+ build_script:
+ - su postgres -c 'ninja -C build'
+ upload_caches:
+ - ccache
+
+ tests_script:
+ - su postgres -c 'meson test --no-rebuild -C build'
+
+ - name: Linux Meson Sid
+
+ compute_engine_instance:
+ image: family/pg-aio-sid
+
+ configure_script:
+ - su postgres -c 'meson setup --buildtype debug -Dcassert=true -Dssl=openssl -Duuid=ossp build'
+ build_script:
+ - su postgres -c 'ninja -C build'
+ upload_caches:
+ - ccache
+
+ tests_script:
+ - su postgres -c 'meson test --no-rebuild -C build'
- tests_script: |
- su postgres -c '\
- ulimit -c unlimited; \
- make -s ${CHECK} ${CHECKFLAGS} -j8 \
- '
+ always:
+ meson_log_artifacts:
+ path: "build/meson-logs/*.txt"
+ type: text/plain
on_failure:
cores_script: |
@@ -168,10 +229,13 @@ task:
tap_artifacts:
path: "**/regress_log_*"
type: text/plain
+ meson_test_artifacts:
+ path: "build/meson-logs/testlog.junit.xml"
+ type: text/xml
+ format: junit
task:
- name: macOS
only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*(macos|darwin|osx).*'
osx_instance:
image: big-sur-base
@@ -201,56 +265,87 @@ task:
- sudo chmod 777 /cores
homebrew_install_script:
- brew install make coreutils ccache icu4c lz4 tcl-tk openldap
+ - brew install meson ninja python@3.9
upload_caches:
- homebrew
- configure_script: |
- LIBS="/usr/local/lib:$LIBS"
- INCLUDES="/usr/local/include:$INCLUDES"
-
- INCLUDES="/usr/local/opt/openssl/include:$INCLUDES"
- LIBS="/usr/local/opt/openssl/lib:$LIBS"
-
- PKG_CONFIG_PATH="/usr/local/opt/icu4c/lib/pkgconfig:$PKG_CONFIG_PATH"
- INCLUDES="/usr/local/opt/icu4c/include:$INCLUDES"
- LIBS="/usr/local/opt/icu4c/lib:$LIBS"
-
- LIBS="/usr/local/opt/openldap/lib:$LIBS"
- INCLUDES="/usr/local/opt/openldap/include:$INCLUDES"
-
- export PKG_CONFIG_PATH
-
- ./configure \
- --prefix=$HOME/install \
- --with-includes="$INCLUDES" \
- --with-libs="$LIBS" \
- \
- --enable-cassert --enable-debug --enable-tap-tests \
- --enable-nls \
- \
- --with-icu \
- --with-ldap \
- --with-libxml \
- --with-libxslt \
- \
- --with-lz4 \
- --with-perl \
- --with-python \
- --with-ssl=openssl \
- --with-tcl --with-tclconfig=/usr/local/opt/tcl-tk/lib/ \
- --with-uuid=e2fs \
- \
- CC="ccache gcc" CFLAGS="-O0 -ggdb" \
- PYTHON=python3
- build_script:
- - gmake -s -j12 && gmake -s -j12 -C contrib
- upload_caches:
- - ccache
+ matrix:
+ - name: macOS autoconf
+
+ configure_script: |
+ LIBS="/usr/local/lib:$LIBS"
+ INCLUDES="/usr/local/include:$INCLUDES"
+
+ PKG_CONFIG_PATH="/usr/local/opt/openssl/lib/pkgconfig:$PKG_CONFIG_PATH"
+ INCLUDES="/usr/local/opt/openssl/include:$INCLUDES"
+ LIBS="/usr/local/opt/openssl/lib:$LIBS"
+
+ PKG_CONFIG_PATH="/usr/local/opt/icu4c/lib/pkgconfig:$PKG_CONFIG_PATH"
+ INCLUDES="/usr/local/opt/icu4c/include:$INCLUDES"
+ LIBS="/usr/local/opt/icu4c/lib:$LIBS"
+
+ PKG_CONFIG_PATH="/usr/local/opt/openldap/lib/pkgconfig:$PKG_CONFIG_PATH"
+ LIBS="/usr/local/opt/openldap/lib:$LIBS"
+ INCLUDES="/usr/local/opt/openldap/include:$INCLUDES"
+
+ export PKG_CONFIG_PATH
+
+ ./configure \
+ --prefix=$HOME/install \
+ --with-includes="$INCLUDES" \
+ --with-libs="$LIBS" \
+ \
+ --enable-cassert --enable-debug --enable-tap-tests \
+ --enable-nls \
+ \
+ --with-icu \
+ --with-ldap \
+ --with-libxml \
+ --with-libxslt \
+ \
+ --with-lz4 \
+ --with-perl \
+ --with-python \
+ --with-ssl=openssl \
+ --with-tcl --with-tclconfig=/usr/local/opt/tcl-tk/lib/ \
+ --with-uuid=e2fs \
+ \
+ CC="ccache gcc" CFLAGS="-O0 -ggdb" \
+ PYTHON=python3
+ build_script:
+ - gmake -s -j12 && gmake -s -j12 -C contrib
+ upload_caches:
+ - ccache
+
+ tests_script:
+ - ulimit -c unlimited
+ - ulimit -n 1024
+ - gmake -s -j12 ${CHECK} ${CHECKFLAGS}
+
+ - name: macOS meson
+
+ configure_script: |
+ PKG_CONFIG_PATH="/usr/local/opt/openssl/lib/pkgconfig:$PKG_CONFIG_PATH"
+ PKG_CONFIG_PATH="/usr/local/opt/icu4c/lib/pkgconfig:$PKG_CONFIG_PATH"
+ PKG_CONFIG_PATH="/usr/local/opt/openldap/lib/pkgconfig:$PKG_CONFIG_PATH"
+
+ export PKG_CONFIG_PATH
+
+ meson setup --buildtype debug -Dcassert=true -Dssl=openssl -Duuid=e2fs build
+ build_script:
+ - ninja -C build
+ upload_caches:
+ - ccache
+
+ tests_script:
+ - ulimit -c unlimited
+ - ulimit -n 1024
+ - meson test --no-rebuild -C build
- tests_script:
- - ulimit -c unlimited
- - ulimit -n 1024
- - gmake -s -j12 ${CHECK} ${CHECKFLAGS}
+ always:
+ meson_log_artifacts:
+ path: "build/meson-logs/*.txt"
+ type: text/plain
on_failure:
cores_script: |
@@ -266,10 +361,13 @@ task:
tap_artifacts:
path: "**/regress_log_*"
type: text/plain
+ meson_test_artifacts:
+ path: "build/meson-logs/testlog.junit.xml"
+ type: text/xml
+ format: junit
task:
- name: Windows
only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*windows.*'
windows_container:
dockerfile: ci/docker/windows_vs_2019
@@ -282,6 +380,8 @@ task:
TEMP_CONFIG: ${CIRRUS_WORKING_DIR}/ci/pg_ci_base.conf
# Avoid re-installing over and over
NO_TEMP_INSTALL: 1
+ # Try to hide git's tar
+ PATH: c:\windows\system32;${PATH}
sysinfo_script:
- chcp
@@ -290,55 +390,103 @@ task:
- ps: Get-Item -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AeDebug'
- set
- configure_script:
- - copy ci\windows_build_config.pl src\tools\msvc\config.pl
- - vcvarsall x64
- - perl src/tools/msvc/mkvcbuild.pl
- build_script:
- - vcvarsall x64
- # Disable file tracker, we're never going to rebuild...
- - msbuild -m /p:TrackFileAccess=false pgsql.sln
- tempinstall_script:
- # Installation on windows currently only completely works from src\tools\msvc
- - cd src\tools\msvc && perl .\install.pl %CIRRUS_WORKING_DIR%\tmp_install
-
- check_test_script:
- - perl src/tools/msvc/vcregress.pl check parallel
- startcreate_test_script:
- - tmp_install\bin\pg_ctl.exe initdb -D tmp_check\db -l tmp_check\initdb.log
- - echo include '%TEMP_CONFIG%' >> tmp_check\db\postgresql.conf
- - tmp_install\bin\pg_ctl.exe start -D tmp_check\db -l tmp_check\postmaster.log
- plcheck_test_script:
- - perl src/tools/msvc/vcregress.pl plcheck
- isolationcheck_test_script:
- - perl src/tools/msvc/vcregress.pl isolationcheck
- modulescheck_test_script:
- - perl src/tools/msvc/vcregress.pl modulescheck
- contribcheck_test_script:
- - perl src/tools/msvc/vcregress.pl contribcheck
- stop_test_script:
- - tmp_install\bin\pg_ctl.exe stop -D tmp_check\db -l tmp_check\postmaster.log
- ssl_test_script:
- - set with_ssl=openssl
- - perl src/tools/msvc/vcregress.pl taptest .\src\test\ssl\
- subscriptioncheck_test_script:
- - perl src/tools/msvc/vcregress.pl taptest .\src\test\subscription\
- authentication_test_script:
- - perl src/tools/msvc/vcregress.pl taptest .\src\test\authentication\
- recoverycheck_test_script:
- - perl src/tools/msvc/vcregress.pl recoverycheck
- bincheck_test_script:
- - perl src/tools/msvc/vcregress.pl bincheck
- upgradecheck_test_script:
- - perl src/tools/msvc/vcregress.pl upgradecheck
- ecpgcheck_test_script:
- # tries to build additional stuff
- - vcvarsall x64
- # References ecpg_regression.proj in the current dir
- - cd src\tools\msvc
- - perl vcregress.pl ecpgcheck
+ matrix:
+ - name: Windows homegrown
+
+ configure_script:
+ - copy ci\windows_build_config.pl src\tools\msvc\config.pl
+ - vcvarsall x64
+ - perl src/tools/msvc/mkvcbuild.pl
+ build_script:
+ - vcvarsall x64
+ # Disable file tracker, we're never going to rebuild...
+ - msbuild -m /p:TrackFileAccess=false pgsql.sln
+ tempinstall_script:
+ # Installation on windows currently only completely works from src\tools\msvc
+ - cd src\tools\msvc && perl .\install.pl %CIRRUS_WORKING_DIR%\tmp_install
+
+ check_test_script:
+ - perl src/tools/msvc/vcregress.pl check parallel
+ startcreate_test_script:
+ - tmp_install\bin\pg_ctl.exe initdb -D tmp_check\db -l tmp_check\initdb.log
+ - echo include '%TEMP_CONFIG%' >> tmp_check\db\postgresql.conf
+ - tmp_install\bin\pg_ctl.exe start -D tmp_check\db -l tmp_check\postmaster.log
+ plcheck_test_script:
+ - perl src/tools/msvc/vcregress.pl plcheck
+ isolationcheck_test_script:
+ - perl src/tools/msvc/vcregress.pl isolationcheck
+ modulescheck_test_script:
+ - perl src/tools/msvc/vcregress.pl modulescheck
+ contribcheck_test_script:
+ - perl src/tools/msvc/vcregress.pl contribcheck
+ stop_test_script:
+ - tmp_install\bin\pg_ctl.exe stop -D tmp_check\db -l tmp_check\postmaster.log
+ ssl_test_script:
+ - set with_ssl=openssl
+ - perl src/tools/msvc/vcregress.pl taptest .\src\test\ssl\
+ subscriptioncheck_test_script:
+ - perl src/tools/msvc/vcregress.pl taptest .\src\test\subscription\
+ authentication_test_script:
+ - perl src/tools/msvc/vcregress.pl taptest .\src\test\authentication\
+ recoverycheck_test_script:
+ - perl src/tools/msvc/vcregress.pl recoverycheck
+ bincheck_test_script:
+ - perl src/tools/msvc/vcregress.pl bincheck
+ upgradecheck_test_script:
+ - perl src/tools/msvc/vcregress.pl upgradecheck
+ ecpgcheck_test_script:
+ # tries to build additional stuff
+ - vcvarsall x64
+ # References ecpg_regression.proj in the current dir
+ - cd src\tools\msvc
+ - perl vcregress.pl ecpgcheck
+
+ - name: Windows Meson+vs+Ninja
+
+ meson_script:
+ - pip install meson
+ - pip install ninja
+ configure_script:
+ - vcvarsall x64
+ - mkdir subprojects
+ - meson wrap install lz4
+ - meson wrap install zlib
+ - meson setup --buildtype debug --backend ninja -Dcassert=true -Db_pch=true -Dssl=openssl -Dlz4=enabled -Dzlib=enabled -Dextra_lib_dirs=c:\openssl\1.1.1l\lib -Dextra_include_dirs=c:\openssl\1.1.1l\include build
+ build_script:
+ - vcvarsall x64
+ - ninja -C build
+
+ check_script:
+ - vcvarsall x64
+ - meson test --no-rebuild -C build
+
+ - name: Windows Meson+vs+msbuild
+
+ # Need a development version of meson for now
+ meson_dev_script:
+ - git clone https://github.com/mesonbuild/meson.git
+
+ configure_script:
+ - vcvarsall x64
+ - mkdir subprojects
+ - .\meson\meson.py wrap install lz4
+ - .\meson\meson.py wrap install zlib
+ - .\meson\meson.py setup --buildtype debug --backend vs -Dcassert=true -Db_pch=true -Dssl=openssl -Dlz4=enabled -Dzlib=enabled -Dextra_lib_dirs=c:\openssl\1.1.1l\lib -Dextra_include_dirs=c:\openssl\1.1.1l\include build
+
+ build_script:
+ - vcvarsall x64
+ - msbuild -m /p:UseMultiToolTask=true build\postgresql.sln
+
+ check_script:
+ - vcvarsall x64
+ - .\meson\meson.py test --no-rebuild -C build
always:
+ meson_log_artifacts:
+ path: "build/meson-logs/*.txt"
+ type: text/plain
+ cat_dumps_script:
+
cores_script:
- cat crashlog.txt || true
dump_artifacts:
@@ -355,12 +503,16 @@ task:
tap_artifacts:
path: "**/regress_log_*"
type: text/plain
+ meson_test_artifacts:
+ path: "build/meson-logs/testlog.junit.xml"
+ type: text/xml
+ format: junit
task:
name: CompilerWarnings
depends_on:
- - Linux
+ - Linux Autoconf
# task that did not run count as a success, so we need to recheck Linux' condition here :/
only_if: $CIRRUS_CHANGE_MESSAGE !=~ '.*\nci-os-only:.*' || $CIRRUS_CHANGE_MESSAGE =~ '.*\nci-os-only:[^\n]*linux.*'
container:
--
2.23.0.385.gbc12974a89
On 01.11.21 00:24, Andres Freund wrote:
- remaining hardcoded configure tests (e.g. ACCEPT_TYPE_ARG*)
I think we can get rid of that one.
That test originally catered to some strange edge cases where the third
argument was size_t that was not the same size as int. That is long
gone, if it ever really existed. All systems currently of interest use
either socklen_t or int, and socklen_t is always int. (A few build farm
animals report size_t, but they are all 32-bit.)
I think we can change the code to use socklen_t and add a simple check
to typedef socklen_t as int if not available. See attached patch.
Attachments:
0001-Remove-check-for-accept-argument-types.patch (text/plain)
From 6229ba9973134dfb184eb21bc62822d83ba554d8 Mon Sep 17 00:00:00 2001
From: Peter Eisentraut <peter@eisentraut.org>
Date: Thu, 4 Nov 2021 13:50:25 +0100
Subject: [PATCH] Remove check for accept() argument types
---
aclocal.m4 | 1 -
config/ac_func_accept_argtypes.m4 | 78 -----------------------------
configure | 82 +++++--------------------------
configure.ac | 2 +-
src/backend/libpq/auth.c | 2 +-
src/backend/libpq/pqcomm.c | 8 +--
src/backend/postmaster/pgstat.c | 4 +-
src/include/c.h | 4 ++
src/include/libpq/pqcomm.h | 2 +-
src/include/pg_config.h.in | 15 ++----
src/interfaces/libpq/fe-connect.c | 2 +-
src/port/getpeereid.c | 4 +-
src/tools/msvc/Solution.pm | 5 +-
13 files changed, 31 insertions(+), 178 deletions(-)
delete mode 100644 config/ac_func_accept_argtypes.m4
diff --git a/aclocal.m4 b/aclocal.m4
index 5e22482cd5..58ade65046 100644
--- a/aclocal.m4
+++ b/aclocal.m4
@@ -1,5 +1,4 @@
dnl aclocal.m4
-m4_include([config/ac_func_accept_argtypes.m4])
m4_include([config/ax_prog_perl_modules.m4])
m4_include([config/ax_pthread.m4])
m4_include([config/c-compiler.m4])
diff --git a/config/ac_func_accept_argtypes.m4 b/config/ac_func_accept_argtypes.m4
deleted file mode 100644
index 178ef67818..0000000000
--- a/config/ac_func_accept_argtypes.m4
+++ /dev/null
@@ -1,78 +0,0 @@
-# config/ac_func_accept_argtypes.m4
-# This comes from the official Autoconf macro archive at
-# <http://research.cys.de/autoconf-archive/>
-
-
-dnl @synopsis AC_FUNC_ACCEPT_ARGTYPES
-dnl
-dnl Checks the data types of the three arguments to accept(). Results are
-dnl placed into the symbols ACCEPT_TYPE_RETURN and ACCEPT_TYPE_ARG[123],
-dnl consistent with the following example:
-dnl
-dnl #define ACCEPT_TYPE_RETURN int
-dnl #define ACCEPT_TYPE_ARG1 int
-dnl #define ACCEPT_TYPE_ARG2 struct sockaddr *
-dnl #define ACCEPT_TYPE_ARG3 socklen_t
-dnl
-dnl NOTE: This is just a modified version of the AC_FUNC_SELECT_ARGTYPES
-dnl macro. Credit for that one goes to David MacKenzie et. al.
-dnl
-dnl @version $Id: ac_func_accept_argtypes.m4,v 1.1 1999/12/03 11:29:29 simons Exp $
-dnl @author Daniel Richard G. <skunk@mit.edu>
-dnl
-
-# PostgreSQL local changes: In the original version ACCEPT_TYPE_ARG3
-# is a pointer type. That's kind of useless because then you can't
-# use the macro to define a corresponding variable. We also make the
-# reasonable(?) assumption that you can use arg3 for getsocktype etc.
-# as well (i.e., anywhere POSIX.2 has socklen_t).
-#
-# arg2 can also be `const' (e.g., RH 4.2). Change the order of tests
-# for arg3 so that `int' is first, in case there is no prototype at all.
-#
-# Solaris 7 and 8 have arg3 as 'void *' (disguised as 'Psocklen_t'
-# which is *not* 'socklen_t *'). If we detect that, then we assume
-# 'int' as the result, because that ought to work best.
-#
-# On Win32, accept() returns 'unsigned int PASCAL'
-# Win64 uses SOCKET for return and arg1
-
-AC_DEFUN([AC_FUNC_ACCEPT_ARGTYPES],
-[AC_MSG_CHECKING([types of arguments for accept()])
- AC_CACHE_VAL(ac_cv_func_accept_return,dnl
- [AC_CACHE_VAL(ac_cv_func_accept_arg1,dnl
- [AC_CACHE_VAL(ac_cv_func_accept_arg2,dnl
- [AC_CACHE_VAL(ac_cv_func_accept_arg3,dnl
- [for ac_cv_func_accept_return in 'int' 'SOCKET WSAAPI' 'unsigned int PASCAL'; do
- for ac_cv_func_accept_arg1 in 'int' 'SOCKET' 'unsigned int'; do
- for ac_cv_func_accept_arg2 in 'struct sockaddr *' 'const struct sockaddr *' 'void *'; do
- for ac_cv_func_accept_arg3 in 'int' 'size_t' 'socklen_t' 'unsigned int' 'void'; do
- AC_COMPILE_IFELSE([AC_LANG_SOURCE(
-[#include <sys/types.h>
-#include <sys/socket.h>
-extern $ac_cv_func_accept_return accept ($ac_cv_func_accept_arg1, $ac_cv_func_accept_arg2, $ac_cv_func_accept_arg3 *);])],
- [ac_not_found=no; break 4], [ac_not_found=yes])
- done
- done
- done
- done
- if test "$ac_not_found" = yes; then
- AC_MSG_ERROR([could not determine argument types])
- fi
- if test "$ac_cv_func_accept_arg3" = "void"; then
- ac_cv_func_accept_arg3=int
- fi
- ])dnl AC_CACHE_VAL
- ])dnl AC_CACHE_VAL
- ])dnl AC_CACHE_VAL
- ])dnl AC_CACHE_VAL
- AC_MSG_RESULT([$ac_cv_func_accept_return, $ac_cv_func_accept_arg1, $ac_cv_func_accept_arg2, $ac_cv_func_accept_arg3 *])
- AC_DEFINE_UNQUOTED(ACCEPT_TYPE_RETURN, $ac_cv_func_accept_return,
- [Define to the return type of 'accept'])
- AC_DEFINE_UNQUOTED(ACCEPT_TYPE_ARG1, $ac_cv_func_accept_arg1,
- [Define to the type of arg 1 of 'accept'])
- AC_DEFINE_UNQUOTED(ACCEPT_TYPE_ARG2, $ac_cv_func_accept_arg2,
- [Define to the type of arg 2 of 'accept'])
- AC_DEFINE_UNQUOTED(ACCEPT_TYPE_ARG3, $ac_cv_func_accept_arg3,
- [Define to the type of arg 3 of 'accept'])
-])
diff --git a/configure b/configure
index 4ffefe4655..e60e78efdf 100755
--- a/configure
+++ b/configure
@@ -14615,6 +14615,17 @@ cat >>confdefs.h <<_ACEOF
_ACEOF
+fi
+
+ac_fn_c_check_type "$LINENO" "socklen_t" "ac_cv_type_socklen_t" "#include <sys/socket.h>
+"
+if test "x$ac_cv_type_socklen_t" = xyes; then :
+
+cat >>confdefs.h <<_ACEOF
+#define HAVE_SOCKLEN_T 1
+_ACEOF
+
+
fi
ac_fn_c_check_type "$LINENO" "struct sockaddr_un" "ac_cv_type_struct_sockaddr_un" "#include <sys/types.h>
@@ -15327,77 +15338,6 @@ if test x"$pgac_cv_var_int_timezone" = xyes ; then
$as_echo "#define HAVE_INT_TIMEZONE 1" >>confdefs.h
fi
-{ $as_echo "$as_me:${as_lineno-$LINENO}: checking types of arguments for accept()" >&5
-$as_echo_n "checking types of arguments for accept()... " >&6; }
- if ${ac_cv_func_accept_return+:} false; then :
- $as_echo_n "(cached) " >&6
-else
- if ${ac_cv_func_accept_arg1+:} false; then :
- $as_echo_n "(cached) " >&6
-else
- if ${ac_cv_func_accept_arg2+:} false; then :
- $as_echo_n "(cached) " >&6
-else
- if ${ac_cv_func_accept_arg3+:} false; then :
- $as_echo_n "(cached) " >&6
-else
- for ac_cv_func_accept_return in 'int' 'SOCKET WSAAPI' 'unsigned int PASCAL'; do
- for ac_cv_func_accept_arg1 in 'int' 'SOCKET' 'unsigned int'; do
- for ac_cv_func_accept_arg2 in 'struct sockaddr *' 'const struct sockaddr *' 'void *'; do
- for ac_cv_func_accept_arg3 in 'int' 'size_t' 'socklen_t' 'unsigned int' 'void'; do
- cat confdefs.h - <<_ACEOF >conftest.$ac_ext
-/* end confdefs.h. */
-#include <sys/types.h>
-#include <sys/socket.h>
-extern $ac_cv_func_accept_return accept ($ac_cv_func_accept_arg1, $ac_cv_func_accept_arg2, $ac_cv_func_accept_arg3 *);
-_ACEOF
-if ac_fn_c_try_compile "$LINENO"; then :
- ac_not_found=no; break 4
-else
- ac_not_found=yes
-fi
-rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext
- done
- done
- done
- done
- if test "$ac_not_found" = yes; then
- as_fn_error $? "could not determine argument types" "$LINENO" 5
- fi
- if test "$ac_cv_func_accept_arg3" = "void"; then
- ac_cv_func_accept_arg3=int
- fi
-
-fi
-
-fi
-
-fi
-
-fi
- { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_func_accept_return, $ac_cv_func_accept_arg1, $ac_cv_func_accept_arg2, $ac_cv_func_accept_arg3 *" >&5
-$as_echo "$ac_cv_func_accept_return, $ac_cv_func_accept_arg1, $ac_cv_func_accept_arg2, $ac_cv_func_accept_arg3 *" >&6; }
-
-cat >>confdefs.h <<_ACEOF
-#define ACCEPT_TYPE_RETURN $ac_cv_func_accept_return
-_ACEOF
-
-
-cat >>confdefs.h <<_ACEOF
-#define ACCEPT_TYPE_ARG1 $ac_cv_func_accept_arg1
-_ACEOF
-
-
-cat >>confdefs.h <<_ACEOF
-#define ACCEPT_TYPE_ARG2 $ac_cv_func_accept_arg2
-_ACEOF
-
-
-cat >>confdefs.h <<_ACEOF
-#define ACCEPT_TYPE_ARG3 $ac_cv_func_accept_arg3
-_ACEOF
-
-
{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether gettimeofday takes only one argument" >&5
$as_echo_n "checking whether gettimeofday takes only one argument... " >&6; }
if ${pgac_cv_func_gettimeofday_1arg+:} false; then :
diff --git a/configure.ac b/configure.ac
index 44ee3ebe2f..22cabe1b60 100644
--- a/configure.ac
+++ b/configure.ac
@@ -1552,6 +1552,7 @@ PGAC_C_BUILTIN_UNREACHABLE
PGAC_C_COMPUTED_GOTO
PGAC_STRUCT_TIMEZONE
PGAC_UNION_SEMUN
+AC_CHECK_TYPES(socklen_t, [], [], [#include <sys/socket.h>])
PGAC_STRUCT_SOCKADDR_UN
PGAC_STRUCT_SOCKADDR_STORAGE
PGAC_STRUCT_SOCKADDR_STORAGE_MEMBERS
@@ -1686,7 +1687,6 @@ fi
##
PGAC_VAR_INT_TIMEZONE
-AC_FUNC_ACCEPT_ARGTYPES
PGAC_FUNC_GETTIMEOFDAY_1ARG
PGAC_FUNC_WCSTOMBS_L
diff --git a/src/backend/libpq/auth.c b/src/backend/libpq/auth.c
index a317aef1c9..7bcf52523b 100644
--- a/src/backend/libpq/auth.c
+++ b/src/backend/libpq/auth.c
@@ -3026,7 +3026,7 @@ PerformRadiusTransaction(const char *server, const char *secret, const char *por
struct addrinfo hint;
struct addrinfo *serveraddrs;
int port;
- ACCEPT_TYPE_ARG3 addrsize;
+ socklen_t addrsize;
fd_set fdset;
struct timeval endtime;
int i,
diff --git a/src/backend/libpq/pqcomm.c b/src/backend/libpq/pqcomm.c
index 89a5f901aa..6f2b2bbb37 100644
--- a/src/backend/libpq/pqcomm.c
+++ b/src/backend/libpq/pqcomm.c
@@ -1620,7 +1620,7 @@ pq_getkeepalivesidle(Port *port)
if (port->default_keepalives_idle == 0)
{
#ifndef WIN32
- ACCEPT_TYPE_ARG3 size = sizeof(port->default_keepalives_idle);
+ socklen_t size = sizeof(port->default_keepalives_idle);
if (getsockopt(port->sock, IPPROTO_TCP, PG_TCP_KEEPALIVE_IDLE,
(char *) &port->default_keepalives_idle,
@@ -1705,7 +1705,7 @@ pq_getkeepalivesinterval(Port *port)
if (port->default_keepalives_interval == 0)
{
#ifndef WIN32
- ACCEPT_TYPE_ARG3 size = sizeof(port->default_keepalives_interval);
+ socklen_t size = sizeof(port->default_keepalives_interval);
if (getsockopt(port->sock, IPPROTO_TCP, TCP_KEEPINTVL,
(char *) &port->default_keepalives_interval,
@@ -1788,7 +1788,7 @@ pq_getkeepalivescount(Port *port)
if (port->default_keepalives_count == 0)
{
- ACCEPT_TYPE_ARG3 size = sizeof(port->default_keepalives_count);
+ socklen_t size = sizeof(port->default_keepalives_count);
if (getsockopt(port->sock, IPPROTO_TCP, TCP_KEEPCNT,
(char *) &port->default_keepalives_count,
@@ -1863,7 +1863,7 @@ pq_gettcpusertimeout(Port *port)
if (port->default_tcp_user_timeout == 0)
{
- ACCEPT_TYPE_ARG3 size = sizeof(port->default_tcp_user_timeout);
+ socklen_t size = sizeof(port->default_tcp_user_timeout);
if (getsockopt(port->sock, IPPROTO_TCP, TCP_USER_TIMEOUT,
(char *) &port->default_tcp_user_timeout,
diff --git a/src/backend/postmaster/pgstat.c b/src/backend/postmaster/pgstat.c
index b7d0fbaefd..8c166e5e16 100644
--- a/src/backend/postmaster/pgstat.c
+++ b/src/backend/postmaster/pgstat.c
@@ -391,7 +391,7 @@ static void pgstat_recv_tempfile(PgStat_MsgTempFile *msg, int len);
void
pgstat_init(void)
{
- ACCEPT_TYPE_ARG3 alen;
+ socklen_t alen;
struct addrinfo *addrs = NULL,
*addr,
hints;
@@ -624,7 +624,7 @@ pgstat_init(void)
{
int old_rcvbuf;
int new_rcvbuf;
- ACCEPT_TYPE_ARG3 rcvbufsize = sizeof(old_rcvbuf);
+ socklen_t rcvbufsize = sizeof(old_rcvbuf);
if (getsockopt(pgStatSock, SOL_SOCKET, SO_RCVBUF,
(char *) &old_rcvbuf, &rcvbufsize) < 0)
diff --git a/src/include/c.h b/src/include/c.h
index c8ede08273..7c790f557e 100644
--- a/src/include/c.h
+++ b/src/include/c.h
@@ -408,6 +408,10 @@ typedef unsigned char bool;
* ----------------------------------------------------------------
*/
+#ifndef HAVE_SOCKLEN_T
+typedef int socklen_t;
+#endif
+
/*
* Pointer
* Variable holding address of any memory resident object.
diff --git a/src/include/libpq/pqcomm.h b/src/include/libpq/pqcomm.h
index be9d970574..1bcc189dee 100644
--- a/src/include/libpq/pqcomm.h
+++ b/src/include/libpq/pqcomm.h
@@ -62,7 +62,7 @@ struct sockaddr_storage
typedef struct
{
struct sockaddr_storage addr;
- ACCEPT_TYPE_ARG3 salen;
+ socklen_t salen;
} SockAddr;
/* Configure the UNIX socket location for the well known port. */
diff --git a/src/include/pg_config.h.in b/src/include/pg_config.h.in
index 15ffdd895a..ca3592465e 100644
--- a/src/include/pg_config.h.in
+++ b/src/include/pg_config.h.in
@@ -1,17 +1,5 @@
/* src/include/pg_config.h.in. Generated from configure.ac by autoheader. */
-/* Define to the type of arg 1 of 'accept' */
-#undef ACCEPT_TYPE_ARG1
-
-/* Define to the type of arg 2 of 'accept' */
-#undef ACCEPT_TYPE_ARG2
-
-/* Define to the type of arg 3 of 'accept' */
-#undef ACCEPT_TYPE_ARG3
-
-/* Define to the return type of 'accept' */
-#undef ACCEPT_TYPE_RETURN
-
/* Define if building universal (internal helper macro) */
#undef AC_APPLE_UNIVERSAL_BUILD
@@ -518,6 +506,9 @@
/* Define to 1 if you have the `shm_open' function. */
#undef HAVE_SHM_OPEN
+/* Define to 1 if the system has the type `socklen_t'. */
+#undef HAVE_SOCKLEN_T
+
/* Define to 1 if you have spinlocks. */
#undef HAVE_SPINLOCKS
diff --git a/src/interfaces/libpq/fe-connect.c b/src/interfaces/libpq/fe-connect.c
index b288d346f9..0b7ee3e3c8 100644
--- a/src/interfaces/libpq/fe-connect.c
+++ b/src/interfaces/libpq/fe-connect.c
@@ -2744,7 +2744,7 @@ PQconnectPoll(PGconn *conn)
case CONNECTION_STARTED:
{
- ACCEPT_TYPE_ARG3 optlen = sizeof(optval);
+ socklen_t optlen = sizeof(optval);
/*
* Write ready, since we've made it here, so the connection
diff --git a/src/port/getpeereid.c b/src/port/getpeereid.c
index d6aa755d30..4631869180 100644
--- a/src/port/getpeereid.c
+++ b/src/port/getpeereid.c
@@ -37,7 +37,7 @@ getpeereid(int sock, uid_t *uid, gid_t *gid)
#if defined(SO_PEERCRED)
/* Linux: use getsockopt(SO_PEERCRED) */
struct ucred peercred;
- ACCEPT_TYPE_ARG3 so_len = sizeof(peercred);
+ socklen_t so_len = sizeof(peercred);
if (getsockopt(sock, SOL_SOCKET, SO_PEERCRED, &peercred, &so_len) != 0 ||
so_len != sizeof(peercred))
@@ -48,7 +48,7 @@ getpeereid(int sock, uid_t *uid, gid_t *gid)
#elif defined(LOCAL_PEERCRED)
/* Debian with FreeBSD kernel: use getsockopt(LOCAL_PEERCRED) */
struct xucred peercred;
- ACCEPT_TYPE_ARG3 so_len = sizeof(peercred);
+ socklen_t so_len = sizeof(peercred);
if (getsockopt(sock, 0, LOCAL_PEERCRED, &peercred, &so_len) != 0 ||
so_len != sizeof(peercred) ||
diff --git a/src/tools/msvc/Solution.pm b/src/tools/msvc/Solution.pm
index 43fd1be088..a013951e0d 100644
--- a/src/tools/msvc/Solution.pm
+++ b/src/tools/msvc/Solution.pm
@@ -205,10 +205,6 @@ sub GenerateFiles
# Every symbol in pg_config.h.in must be accounted for here. Set
# to undef if the symbol should not be defined.
my %define = (
- ACCEPT_TYPE_ARG1 => 'unsigned int',
- ACCEPT_TYPE_ARG2 => 'struct sockaddr *',
- ACCEPT_TYPE_ARG3 => 'int',
- ACCEPT_TYPE_RETURN => 'unsigned int PASCAL',
ALIGNOF_DOUBLE => 8,
ALIGNOF_INT => 4,
ALIGNOF_LONG => 4,
@@ -365,6 +361,7 @@ sub GenerateFiles
HAVE_SETPROCTITLE_FAST => undef,
HAVE_SETSID => undef,
HAVE_SHM_OPEN => undef,
+ HAVE_SOCKLEN_T => 1,
HAVE_SPINLOCKS => 1,
HAVE_SRANDOM => undef,
HAVE_STDBOOL_H => 1,
--
2.33.1
Hi,
On 2021-11-04 19:17:05 +0100, Peter Eisentraut wrote:
On 01.11.21 00:24, Andres Freund wrote:
- remaining hardcoded configure tests (e.g. ACCEPT_TYPE_ARG*)
I think we can get rid of that one.
Oh, nice!
I was somewhat confused by "unsigned int PASCAL" as a type.
That test originally catered to some strange edge cases where the third
argument was size_t that was not the same size as int. That is long gone,
if it ever really existed. All systems currently of interest use either
socklen_t or int, and socklen_t is always int. (A few build farm animals
report size_t, but they are all 32-bit.)
diff --git a/src/include/c.h b/src/include/c.h
index c8ede08273..7c790f557e 100644
--- a/src/include/c.h
+++ b/src/include/c.h
@@ -408,6 +408,10 @@ typedef unsigned char bool;
 * ----------------------------------------------------------------
 */
+#ifndef HAVE_SOCKLEN_T
+typedef int socklen_t;
+#endif
I'd put this in port.h instead of c.h, or is there a reason not to do so?
Probably worth putting this in fairly soon independent of whether anything
happens wrt meson?
Greetings,
Andres Freund
On 04.11.21 19:48, Andres Freund wrote:
Probably worth putting this in fairly soon independent of whether anything
happens wrt meson?
OK, done. Let's see what happens. ;-)
On 01.11.21 00:24, Andres Freund wrote:
Hi,
Attached is an updated version of the meson patchset.
Nanoreview: I think the patch
Subject: [PATCH v5 11/16] meson: prereq: Handle DLSUFFIX in msvc builds
similar to other build envs.
is good to go. It's not clear why it's needed in this context, but it
seems good in general to make these things more consistent.
Hi,
On 2021-11-10 11:07:02 +0100, Peter Eisentraut wrote:
On 01.11.21 00:24, Andres Freund wrote:
Hi,
Attached is an updated version of the meson patchset.
Nanoreview: I think the patch
Thanks for looking!
Subject: [PATCH v5 11/16] meson: prereq: Handle DLSUFFIX in msvc builds
similar to other build envs.
is good to go. It's not clear why it's needed in this context, but it seems
good in general to make these things more consistent.
The way it is set is currently inconsistent between msvc and other builds, by
virtue of win32_port.h defining it for msvc:
/* Things that exist in MinGW headers, but need to be added to MSVC */
#ifdef _MSC_VER
...
/* Pulled from Makefile.port in MinGW */
#define DLSUFFIX ".dll"
it'd have needed unnecessarily contorted logic to continue setting DLSUFFIX
via the command line for !msvc, given that the meson stuff is the same for msvc
and !msvc.
Greetings,
Andres Freund
Hi,
FWIW, I tried building postgres on a few other operating systems using
meson, after I got access to the gcc compile farm. Here's the results:
- openbsd: Compiled fine. Hit one issue running tests:
openbsd has *completely* broken $ORIGIN support. It resolves $ORIGIN in
rpaths to the CWD, which obviously breaks for binaries invoked via PATH. So there
goes the idea of only using $ORIGIN to run tests. Still seems worth using
on other platforms, particularly because it works with SIP on macos.
I understand not supporting $ORIGIN at all. But implementing it this way
seems insane.
I also ran into some problems with the semaphore limits. I had to switch to
USE_NAMED_POSIX_SEMAPHORES to make the tests pass at all.
- netbsd: Compiled fine after a minor fix. There's a bit more to fix around
many libraries not being in the normal library directory, but in
/usr/pkg/lib, which is not in the library search path (i.e. we need to add
an rpath for that in a few more places).
- AIX: Compiled and basic postgres runs fine after a few fixes (big endian
test, converting exports.txt into the right format). Doesn't yet
successfully run more than trivial tests, because I didn't implement the
necessary generation of import files for postgres, but that's just a bit of
work.
This is hampered by the fact that the vanilla postgres crashes for me. I
haven't quite figured out what the problem is. Might be a system issue -
lots of other tools, e.g. perl, segfault frequently.
One important thing to call out: Meson has support for the AIX linker, but
*not* the xlc compiler. I.e. one has to use gcc (or clang, but I didn't
try). I don't know if we'd require adding support for xlc to meson - xlc is
pretty buggy and it doesn't seem particularly crucial to support such an old
crufty compiler on a platform that's not used to a significant degree?
Greetings,
Andres
Andres Freund <andres@anarazel.de> writes:
One important thing to call out: Meson has support for the AIX linker, but
*not* the xlc compiler. I.e. one has to use gcc (or clang, but I didn't
try). I don't know if we'd require adding support for xlc to meson - xlc is
pretty buggy and it doesn't seem particularly crucial to support such an old
crufty compiler on a platform that's not used to a significant degree?
While I have no particular interest in AIX or xlc specifically, I do
worry about us becoming a builds-on-gcc-or-workalikes-only project.
I suppose MSVC provides a little bit of a cross-check, but I don't
really like giving up on other compilers. Discounting gcc+clang+MSVC
leaves just a few buildfarm animals, and the xlc ones are a significant
part of that population. (In fact, unless somebody renews fossa/husky's
icc license, the three xlc animals will be an outright majority of
them, because wrasse and anole are the only other active animals with
non-mainstream compilers.)
Having said that, I don't plan to be the one trying to get meson
to add xlc support ...
regards, tom lane
Hi,
On 2021-11-15 14:11:25 -0500, Tom Lane wrote:
Andres Freund <andres@anarazel.de> writes:
One important thing to call out: Meson has support for the AIX linker, but
*not* the xlc compiler. I.e. one has to use gcc (or clang, but I didn't
try). I don't know if we'd require adding support for xlc to meson - xlc is
pretty buggy and it doesn't seem particularly crucial to support such an old
crufty compiler on a platform that's not used to a significant degree?
While I have no particular interest in AIX or xlc specifically, I do
worry about us becoming a builds-on-gcc-or-workalikes-only project.
I suppose MSVC provides a little bit of a cross-check, but I don't
really like giving up on other compilers. Discounting gcc+clang+MSVC
leaves just a few buildfarm animals, and the xlc ones are a significant
part of that population.
Yea, that's a reasonable concern. I wonder if there's some non-mainstream
compiler that actually works on, um, more easily available platforms that we
could utilize.
(In fact, unless somebody renews fossa/husky's
icc license, the three xlc animals will be an outright majority of
them, because wrasse and anole are the only other active animals with
non-mainstream compilers.)
It should probably be doable to get somebody to run another icc animal. Icc is
supported by meson, fwiw.
Having said that, I don't plan to be the one trying to get meson
to add xlc support ...
It'd probably not be too hard. But given that it's quite hard to get access to
AIX + xlc, I'm not sure it's something I want to propose. I haven't found any
resources to run halfway regular tests on...
It's good to make sure we're not growing too reliant on some compiler(s), but
imo that only really makes sense if the alternative compilers are meaningfully
available and maintained.
Greetings,
Andres Freund
On Mon, Nov 15, 2021 at 2:23 PM Andres Freund <andres@anarazel.de> wrote:
It's good to make sure we're not growing too reliant on some compiler(s), but
imo only really makes sense if the alternative compilers are meaningfully
available and maintained.
That's a sensible position. I do worry that with this proposed move
we're going to be giving up some of the flexibility that we have right
now. I'm not sure exactly what that means in practice. But make is
just a way of running shell commands, and so you can run any shell
commands you want. The concept of some compiler not being supported
isn't really a thing that even makes sense in a world that is powered
by make. With a big enough hammer you can run any commands you like,
including any compilation commands you like. The whole thing is likely
to be a bit crufty which is a downside, and you might spend more time
fiddling with it than you really want. But nothing is really ever
blocked.
--
Robert Haas
EDB: http://www.enterprisedb.com
On Tue, Nov 16, 2021 at 8:23 AM Andres Freund <andres@anarazel.de> wrote:
On 2021-11-15 14:11:25 -0500, Tom Lane wrote:
Having said that, I don't plan to be the one trying to get meson
to add xlc support ...
It'd probably not be too hard. But given that it's quite hard to get access to
AIX + xlc, I'm not sure it's something I want to propose. There's no resources
to run halfway regular tests on that I found...
FWIW there's a free-as-in-beer edition of xlc for Linux (various
distros, POWER only) so you could use qemu, though of course there
will be differences WRT AIX especially around linking, and I suppose a
big part of that work would be precisely understanding stuff like
linker details.
It looks like we have two xlc 12.1 compilers in the farm, but those
compilers are EOL'd[1]. The current release is 16.1, and we have one
of those. The interesting thing about 16.1 is that you can invoke it
as xlclang to get the new clang frontend and, I think, possibly use
more clang/gcc-ish compiler switches[2].
[1]: https://www.ibm.com/support/pages/lifecycle/search?q=xl%20c%2Fc%2B%2B
[2]: https://www.ibm.com/docs/en/xl-c-and-cpp-aix/16.1?topic=new-clang-based-front-end
Thomas Munro <thomas.munro@gmail.com> writes:
... The interesting thing about 16.1 is that you can invoke it
as xlclang to get the new clang frontend and, I think, possibly use
more clang/gcc-ish compiler switches[2].
[2] https://www.ibm.com/docs/en/xl-c-and-cpp-aix/16.1?topic=new-clang-based-front-end
Ho, that's an earful ;-). Though I wonder whether that frontend
hides the AIX-specific linking issues you mentioned. (Also, although
I see /opt/IBM/xlc/16.1.0/ on gcc119, there's no xlclang there.
So whether we have useful access to it right now is unclear.)
This plays into something that was nagging at me while I wrote my
upthread screed about not giving up on non-gcc/clang compilers:
are those compilers outcompeting all the proprietary ones, to the
extent that the latter will be dead soon anyway? I think Microsoft
is rich enough and stubborn enough to keep on developing MSVC no
matter what, but other compiler vendors may see the handwriting
on the wall. Writing C compilers can't be a growth industry these
days.
regards, tom lane
Hi,
On 2021-11-15 17:34:33 -0500, Tom Lane wrote:
Thomas Munro <thomas.munro@gmail.com> writes:
... The interesting thing about 16.1 is that you can invoke it
as xlclang to get the new clang frontend and, I think, possibly use
more clang/gcc-ish compiler switches[2].
[2] https://www.ibm.com/docs/en/xl-c-and-cpp-aix/16.1?topic=new-clang-based-front-end
Ho, that's an earful ;-). Though I wonder whether that frontend
hides the AIX-specific linking issues you mentioned. (Also, although
I see /opt/IBM/xlc/16.1.0/ on gcc119, there's no xlclang there.
So whether we have useful access to it right now is unclear.)
It's actually available there, but in /opt/IBM/xlC/16.1.0/bin/xlclang++ (note
the upper case C).
It doesn't really hide the linking issues afaict. I think they're basically an
ABI rather than a linker invocation issue. It's not that hard to address them
though, it's basically making mkldexport.sh a tiny bit more general and
integrating it into src/backend/postgres' build.
We don't have to generate export files for shared libraries anymore though,
afaict, because there's 'expall', which suffices for our purposes. dlopen()
doesn't require an import file.
This plays into something that was nagging at me while I wrote my
upthread screed about not giving up on non-gcc/clang compilers:
are those compilers outcompeting all the proprietary ones, to the
extent that the latter will be dead soon anyway?
I think that's a pretty clear trend. The ones that aren't dying seem to be
incrementally rebasing more and more onto llvm tooling.
It doesn't help that most of those compilers are primarily for OSs that, uh,
aren't exactly growing. Which limits their potential usability significantly.
Greetings,
Andres Freund
On Tue, Nov 16, 2021 at 11:08 AM Thomas Munro <thomas.munro@gmail.com> wrote:
FWIW there's a free-as-in-beer edition of xlc for Linux (various
distros, POWER only) so you could use qemu,
(It's also known to be possible to run AIX 7.2 on qemu, but the
install media is not made available to developers for testing/CI
without a hardware serial number. Boo.)
On Tue, Nov 16, 2021 at 8:23 AM Andres Freund <andres@anarazel.de> wrote:
On 2021-11-15 14:11:25 -0500, Tom Lane wrote:
(In fact, unless somebody renews fossa/husky's
icc license, the three xlc animals will be an outright majority of
them, because wrasse and anole are the only other active animals with
non-mainstream compilers.)
It should probably be doable to get somebody to run another icc animal. Icc is
supported by meson, fwiw.
FWIW, in case someone is interested in bringing ICC back to the farm,
some light googling tells me that newer editions of "classic" ICC (as
opposed to "data parallel" ICC, parts of some kind of rebrand) no
longer require regular licence bureaucracy, and can be installed in
modern easier to maintain ways. For example, I see that some people
add Intel's APT repository and apt-get install the compiler inside CI
jobs, on Ubuntu.
On 10/13/21 16:06, Andrew Dunstan wrote:
On 10/13/21 1:26 PM, Andres Freund wrote:
pexports will be in the resulting path, and the build will use the
native compiler.
I don't see pexports anywhere in the msys installation. I can see it available
on sourceforge, and I see a few others asking where to get it from in the
context of msys, and being pointed to manually downloading it.
Weird. fairywren has it, which means that it must have been removed from
the packages at some stage, fairly recently as fairywren isn't that old.
I just confirmed the absence on a 100% fresh install.
It is in Strawberry's c/bin directory.
Seems like we should consider using gendef instead of pexports, given it's
available in msys?
Yeah. It's missing on my ancient msys animal (frogmouth), but it doesn't
build --with-perl.
jacana seems to have it.
If you prep a patch I'll test it.
Here's a patch. I've tested the perl piece on master and it works fine.
It applies cleanly down to 9.4, which is before we got transform modules
(9.5) which fail if we just omit doing this platform-specific piece.
Before that only plpython uses pexports, and we're not committed to
supporting plpython at all on old branches.
cheers
andrew
--
Andrew Dunstan
EDB: https://www.enterprisedb.com
Attachments:
gendef.patch (text/x-patch)
commit 823b36d98b
Author: Andrew Dunstan <andrew@dunslane.net>
Date: Sun Feb 6 10:53:06 2022 -0500
Use gendef instead of pexports for building windows .def files
Modern msys systems lack pexports but have gendef instead, so use that.
Discussion: https://postgr.es/m/20d4fa3a-e0f1-0ac4-1657-0698ee1c511f@dunslane.net
diff --git a/src/pl/plperl/GNUmakefile b/src/pl/plperl/GNUmakefile
index 919d46453f..a2e6410f53 100644
--- a/src/pl/plperl/GNUmakefile
+++ b/src/pl/plperl/GNUmakefile
@@ -48,7 +48,7 @@ lib$(perlwithver).a: $(perlwithver).def
dlltool --dllname $(perlwithver).dll --def $(perlwithver).def --output-lib lib$(perlwithver).a
$(perlwithver).def: $(PERLDLL)
- pexports $^ > $@
+ gendef - $^ > $@
endif # win32
diff --git a/src/pl/plpython/Makefile b/src/pl/plpython/Makefile
index 9e95285af8..a83ae8865c 100644
--- a/src/pl/plpython/Makefile
+++ b/src/pl/plpython/Makefile
@@ -69,7 +69,7 @@ libpython${pytverstr}.a: python${pytverstr}.def
dlltool --dllname python${pytverstr}.dll --def python${pytverstr}.def --output-lib libpython${pytverstr}.a
python${pytverstr}.def:
- pexports $(PYTHONDLL) > $@
+ gendef - $(PYTHONDLL) > $@
endif # win32
diff --git a/src/pl/tcl/Makefile b/src/pl/tcl/Makefile
index 1e7740da3f..25e65189b6 100644
--- a/src/pl/tcl/Makefile
+++ b/src/pl/tcl/Makefile
@@ -46,7 +46,7 @@ lib$(tclwithver).a: $(tclwithver).def
dlltool --dllname $(tclwithver).dll --def $(tclwithver).def --output-lib lib$(tclwithver).a
$(tclwithver).def: $(TCLDLL)
- pexports $^ > $@
+ gendef - $^ > $@
endif # win32
Hi,
On 2022-02-06 12:06:41 -0500, Andrew Dunstan wrote:
Here's a patch. I've tested the perl piece on master and it works fine.
It applies cleanly down to 9.4, which is before we got transform modules
(9.5) which fail if we just omit doing this platform-specific piece.
Given /messages/by-id/34e972bc-6e75-0754-9e6d-cde2518773a1@dunslane.net
wouldn't it make sense to simply remove the pexports/gendef logic instead of
moving to gendef?
Greetings,
Andres Freund
On 2/6/22 13:39, Andres Freund wrote:
Hi,
On 2022-02-06 12:06:41 -0500, Andrew Dunstan wrote:
Here's a patch. I've tested the perl piece on master and it works fine.
It applies cleanly down to 9.4, which is before we got transform modules
(9.5) which fail if we just omit doing this platform-specific piece.
Given /messages/by-id/34e972bc-6e75-0754-9e6d-cde2518773a1@dunslane.net
wouldn't it make sense to simply remove the pexports/gendef logic instead of
moving to gendef?
I haven't found a way to fix the transform builds if we do that. So
let's leave that as a separate exercise unless you have a solution for
that - this patch is really trivial.
cheers
andrew
--
Andrew Dunstan
EDB: https://www.enterprisedb.com
Hi,
I've been wondering whether we should try to have the generated pg_config.h
look as similar as possible to autoconf/autoheader's, or not. And whether the
way autoconf/autoheader define symbols makes sense when not using either
anymore.
To be honest, I do not really understand the logic behind when autoconf ends
up with #defines that define a macro to 0/1, when a macro ends up defined or
not, and when we end up with a macro defined to 1 or not defined at all.
So far I've tried to mirror the logic, but not the description / comment
formatting of the individual macros.
The "defined to 1 or not defined at all" behaviour is a mildly awkward to
achieve with meson, because it doesn't match the behaviour for booleans
options meson has (there are two builtin behaviours, one to define/undefine a
macro, the other to set the macro to 0/1. But there's none that defines a
macro to 1 or undefines it).
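To illustrate, using made-up HAVE_FOO/HAVE_BAR/HAVE_BAZ symbols (these are not
from any of the patches), the three shapes in pg_config.h terms would be:

/* autoconf's common style: defined to 1, or not present at all */
#define HAVE_FOO 1

/* meson's builtin define/undefine style: defined without a value, or absent */
#define HAVE_BAR

/* meson's builtin 0/1 style: always defined, to either 0 or 1 */
#define HAVE_BAZ 0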
Probably best to initially have the macros defined as similar as reasonably
possible, but subsequently clean things up a bit.
A second aspect that I'm wondering about is whether we should try to split
pg_config.h output a bit:
With meson it's easy to change options like whether to build with some
dependency in an existing build tree and then still get a reliable build
result (ninja rebuilds if the commandline changed from the last invocation).
But right now doing so often ends up with way bigger rebuilds than necessary,
because for a lot of options we add #define USE_LDAP 1 etc to pg_config.h,
which of course requires rebuilding a lot of files. Even though most of these
symbols are only checked in a handful of files, often only .c files.
It seems like it might make sense to separate out defines that depend on the
compiler / "standard libraries" (e.g. {SIZEOF,ALIGNOF}_*,
HAVE_DECL_{STRNLEN,...}, HAVE_*_H) from feature defines (like
USE_{LDAP,ICU,...}). The header containing the latter could then be included in
the places needing it (or we could have one header for each of the places
using it).
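A rough sketch of what a consumer could look like under such a split (the
header name pg_config_features.h is hypothetical at this point, nothing in the
patchset defines it yet):

/* hypothetical: only files that actually care about a feature include the
 * feature header, so toggling e.g. -Dldap only rebuilds those files */
#include "pg_config_features.h"

#ifdef USE_LDAP
/* LDAP-specific code would live here */
#endif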
Perhaps we should also separate out configure-time settings like BLCKSZ,
DEF_PGPORT, etc. Realistically most of them are going to require a "full tree"
recompile anyway, but it seems like it might make things easier to understand.
I think a split into pg_config_{platform,features,settings}.h could make sense.
Similar to above, it's probably best to do this separately after merging meson
support. But knowing what the split should eventually look like would be
helpful before, to ensure it's easy to do.
Greetings,
Andres Freund
Andres Freund <andres@anarazel.de> writes:
I've been wondering whether we should try to have the generated pg_config.h
look as similar as possible to autoconf/autoheader's, or not. And whether the
way autoconf/autoheader define symbols makes sense when not using either
anymore.
To be honest, I do not really understand the logic behind when autoconf ends
up with #defines that define a macro to 0/1 and when a macro ends defined/or
not and when we end up with a macro defined to 1 or not defined at all.
Agreed, that always seemed entirely random to me too. I'd be content
to end up with "defined or not defined" as the standard. I think
we have way more #ifdef tests than #if tests, so changing the latter
seems more sensible than changing the former.
A second aspect that I'm wondering about is whether we should try to split
pg_config.h output a bit:
TBH I can't get excited about that. I do not think that rebuilding
with different options is a critical path. ccache already does most
of the heavy lifting when you do that sort of thing, anyway.
regards, tom lane
Hi,
I was trying to fix a few perl embedding oddities in the meson
patchset.
Whenever I have looked at the existing code, I've been a bit confused about
the following
code/comment in perl.m4:
# PGAC_CHECK_PERL_EMBED_LDFLAGS
# -----------------------------
# We are after Embed's ldopts, but without the subset mentioned in
# Config's ccdlflags; [...]
pgac_tmp1=`$PERL -MExtUtils::Embed -e ldopts`
pgac_tmp2=`$PERL -MConfig -e 'print $Config{ccdlflags}'`
perl_embed_ldflags=`echo X"$pgac_tmp1" | sed -e "s/^X//" -e "s%$pgac_tmp2%%" -e ["s/ -arch [-a-zA-Z0-9_]*//g"]`
What is the reason behind subtracting ccdlflags?
The comment originates in:
commit d69a419e682c2d39c2355105a7e5e2b90357c8f0
Author: Tom Lane <tgl@sss.pgh.pa.us>
Date: 2009-09-08 18:15:55 +0000
Remove any -arch switches given in ExtUtils::Embed's ldopts from our
perl_embed_ldflags setting. On OS X it seems that ExtUtils::Embed is
trying to force a universal binary to be built, but you need to specify
that a lot further upstream if you want Postgres built that way; the only
result of including -arch in perl_embed_ldflags is some warnings at the
plperl.so link step. Per my complaint and Jan Otto's suggestion.
but the subtraction goes all the way back to
commit 7662419f1bc1a994193c319c9304dfc47e121c98
Author: Peter Eisentraut <peter_e@gmx.net>
Date: 2002-05-28 16:57:53 +0000
Change PL/Perl and Pg interface build to use configured compiler and
Makefile.shlib system, not MakeMaker.
Greetings,
Andres Freund
Hi,
On 2022-02-07 16:30:53 -0500, Tom Lane wrote:
A second aspect that I'm wondering about is whether we should try to split
pg_config.h output a bit:
TBH I can't get excited about that. I do not think that rebuilding
with different options is a critical path. ccache already does most
of the heavy lifting when you do that sort of thing, anyway.
I've found it to be pretty painful when building with msvc, which doesn't have
ccache (yet at least), and where the process startup overhead is bigger.
Even on some other platforms it's useful - it takes a while on net/openbsd to
recompile postgres, even if everything is in ccache. If I test on some
platform I'll often install the most basic set, get the tests to run, and then
incrementally figure out what other packages need to be installed etc.
Greetings,
Andres Freund
Andres Freund <andres@anarazel.de> writes:
What is the reason behind subtracting ccdlflags?
It looks like the coding actually originated here:
commit f5d0c6cad5bb2706e0e63f3f8f32e431ea428100
Author: Bruce Momjian <bruce@momjian.us>
Date: Wed Jun 20 00:26:06 2001 +0000
Apparently, on some systems, ExtUtils::Embed and MakeMaker are slightly
broken, and its impossible to make a shared library when compiling with
both CCDLFLAGS and LDDLFAGS, you have to pick one or the other.
Alex Pilosov
and Peter just copied the logic in 7662419f1. Considering that
the point of 7662419f1 was to get rid of MakeMaker, maybe we no
longer needed that at that point.
On my RHEL box, the output of ldopts is sufficiently redundant
that the subtraction doesn't actually accomplish much:
$ perl -MExtUtils::Embed -e ldopts
-Wl,--enable-new-dtags -Wl,-z,relro -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -Wl,-z,relro -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -fstack-protector-strong -L/usr/local/lib -L/usr/lib64/perl5/CORE -lperl -lpthread -lresolv -ldl -lm -lcrypt -lutil -lc
$ perl -MConfig -e 'print $Config{ccdlflags}'
-Wl,--enable-new-dtags -Wl,-z,relro -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld
which leads to
$ grep perl_embed_ldflags src/Makefile.global
perl_embed_ldflags = -Wl,-z,relro -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -fstack-protector-strong -L/usr/local/lib -L/usr/lib64/perl5/CORE -lperl -lpthread -lresolv -ldl -lm -lcrypt -lutil -lc
so the only thing we actually got rid of was -Wl,--enable-new-dtags,
which I think we'll put back anyway.
Things might be different elsewhere of course, but I'm tempted
to take out the ccdlflags subtraction and see what the buildfarm
says.
regards, tom lane
I wrote:
Andres Freund <andres@anarazel.de> writes:
What is the reason behind subtracting ccdlflags?
It looks like the coding actually originated here:
commit f5d0c6cad5bb2706e0e63f3f8f32e431ea428100
Ah, here's the thread leading up to that:
/messages/by-id/200106191206.f5JC6R108371@candle.pha.pa.us
The use of ldopts rather than hand-hacked link options seems to date to
0ed7864d6, only a couple days before that. I don't think we had a
buildfarm then, but I'd bet against the problem being especially
widespread even then, or more people would've complained.
BTW, the business with zapping arch options seems to not be necessary
anymore either on recent macOS:
$ perl -MExtUtils::Embed -e ldopts
-fstack-protector-strong -L/System/Library/Perl/5.30/darwin-thread-multi-2level/CORE -lperl
$ perl -MConfig -e 'print $Config{ccdlflags}'
$
(same results on either Intel or ARM Mac). However, it looks like it
is still necessary to keep locust happy, and I have no idea just when
Apple stopped using arch switches here, so we'd better keep that.
regards, tom lane
Hi,
On 2022-02-07 20:42:09 -0500, Tom Lane wrote:
Andres Freund <andres@anarazel.de> writes:
What is the reason behind subtracting ccdlflags?
It looks like the coding actually originated here:
commit f5d0c6cad5bb2706e0e63f3f8f32e431ea428100
Author: Bruce Momjian <bruce@momjian.us>
Date: Wed Jun 20 00:26:06 2001 +0000Apparently, on some systems, ExtUtils::Embed and MakeMaker are slightly
broken, and its impossible to make a shared library when compiling with
both CCDLFLAGS and LDDLFAGS, you have to pick one or the other.
Alex Pilosov
and Peter just copied the logic in 7662419f1. Considering that
the point of 7662419f1 was to get rid of MakeMaker, maybe we no
longer needed that at that point.
Yea. And maybe what was broken in 2001 isn't broken anymore either ;)
Looking at a number of OSs:
debian sid:
embed: -Wl,-E -fstack-protector-strong -L/usr/local/lib -L/usr/lib/x86_64-linux-gnu/perl/5.34/CORE -lperl -ldl -lm -lpthread -lc -lcrypt
ldopts: -Wl,-E
fedora:
embed: -Wl,--enable-new-dtags -Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fstack-protector-strong -L/usr/local/lib -L/usr/lib64/perl5/CORE -lperl -lpthread -lresolv -ldl -lm -lcrypt -lutil -lc
ldopts: -Wl,--enable-new-dtags -Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1
suse tumbleweed:
embed: -Wl,-E -Wl,-rpath,/usr/lib/perl5/5.34.0/x86_64-linux-thread-multi/CORE -L/usr/local/lib64 -fstack-protector-strong -L/usr/lib/perl5/5.34.0/x86_64-linux-thread-multi/CORE -lperl -lm -ldl -lcrypt -lpthread
ldopts: -Wl,-E -Wl,-rpath,/usr/lib/perl5/5.34.0/x86_64-linux-thread-multi/CORE
freebsd:
embed: -Wl,-R/usr/local/lib/perl5/5.30/mach/CORE -pthread -Wl,-E -fstack-protector-strong -L/usr/local/lib -L/usr/local/lib/perl5/5.30/mach/CORE -lperl -lpthread -lm -lcrypt -lutil
ldopts: -Wl,-R/usr/local/lib/perl5/5.30/mach/CORE
netbsd:
embed: -Wl,-E -Wl,-R/usr/pkg/lib/perl5/5.34.0/x86_64-netbsd-thread-multi/CORE -pthread -L/usr/lib -Wl,-R/usr/lib -Wl,-R/usr/pkg/lib -L/usr/pkg/lib -L/usr/pkg/lib/perl5/5.34.0/x86_64-netbsd-thread-multi/CORE -lperl -lm -lcrypt -lpthread
ldopts: -Wl,-E -Wl,-R/usr/pkg/lib/perl5/5.34.0/x86_64-netbsd-thread-multi/CORE
openbsd:
embed: -Wl,-R/usr/libdata/perl5/amd64-openbsd/CORE -Wl,-E -fstack-protector-strong -L/usr/local/lib -L/usr/libdata/perl5/amd64-openbsd/CORE -lperl -lm -lc
ldopts: -Wl,-R/usr/libdata/perl5/amd64-openbsd/CORE
aix:
embed: -bE:/usr/opt/perl5/lib64/5.28.1/aix-thread-multi-64all/CORE/perl.exp -bE:/usr/opt/perl5/lib64/5.28.1/aix-thread-multi-64all/CORE/perl.exp -brtl -bdynamic -b64 -L/usr/opt/perl5/lib64/5.28.1/aix-thread-multi-64all/CORE -lperl -lpthread -lbind -lnsl -ldl -lld -lm -lcrypt -lpthreads -lc
ldopts: -bE:/usr/opt/perl5/lib64/5.28.1/aix-thread-multi-64all/CORE/perl.exp -bE:/usr/opt/perl5/lib64/5.28.1/aix-thread-multi-64all/CORE/perl.exp
mac m1 monterey:
embed: -fstack-protector-strong -L/System/Library/Perl/5.30/darwin-thread-multi-2level/CORE -lperl
ldopts:
windows msys install ucrt perl:
embed: -s -L"C:\dev\msys64\ucrt64\lib\perl5\core_perl\CORE" -L"C:\dev\msys64\ucrt64\lib" "C:\dev\msys64\ucrt64\lib\perl5\core_perl\CORE\libperl532.a"
ldopts:
windows strawberrry perl:
embed: -s -L"C:\STRAWB~1\perl\lib\CORE" -L"C:\STRAWB~1\c\lib" "C:\STRAWB~1\perl\lib\CORE\libperl530.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\libmoldname.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\libkernel32.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\libuser32.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\libgdi32.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\libwinspool.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\libcomdlg32.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\libadvapi32.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\libshell32.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\libole32.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\liboleaut32.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\libnetapi32.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\libuuid.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\libws2_32.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\libmpr.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\libwinmm.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\libversion.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\libodbc32.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\libodbccp32.a" "C:\STRAWB~1\c\x86_64-w64-mingw32\lib\libcomctl32.a"
ldopts:
So on windows and macos it makes no difference because ldopts is empty.
On various linuxes, except red-hat and debian ones, as well as on the BSDs, it
removes rpath. Which we then add back in various places (pl and transform
modules). On debian the added rpath never will contain the library.
AIX is the one exception. Specifying -bE... doesn't seem right for building
plperl etc. So possibly the subtraction accidentally does work for us there...
Things might be different elsewhere of course, but I'm tempted
to take out the ccdlflags subtraction and see what the buildfarm
says.
Except for the AIX thing I agree :(
Greetings,
Andres Freund
On 07.02.22 20:24, Andres Freund wrote:
To be honest, I do not really understand the logic behind when autoconf ends
up with #defines that define a macro to 0/1 and when a macro ends defined/or
not and when we end up with a macro defined to 1 or not defined at all.
The default is to define to 1 or not at all. The reason for this is
presumably that originally, autoconf (or its predecessor practices) just
populated the command line with a few -DHAVE_THIS options. Creating a
header file came later. And -DFOO is equivalent to #define FOO 1.
Also, this behavior allows code to use both the #ifdef HAVE_THIS and the
#if HAVE_THIS style.
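As a minimal illustration, using a made-up HAVE_FOO symbol: with the symbol
defined to 1 both styles work, whereas a symbol defined to nothing only
supports the #ifdef form, and #if would be a preprocessor error:

#define HAVE_FOO 1

#ifdef HAVE_FOO    /* true for "#define HAVE_FOO 1" and for "#define HAVE_FOO" */
int have_foo_ifdef = 1;
#endif

#if HAVE_FOO       /* needs a value; "#define HAVE_FOO" with no value would be
                    * an error here, and an undefined HAVE_FOO evaluates to 0 */
int have_foo_if = 1;
#endif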
The cases that deviate from this have a special reason for this. One
issue to consider is that depending on how the configure script is set
up or structured, a test might not run at all. But for example, if you
have a check for a declaration of a function, and the test doesn't run
in a particular configuration, the fallback in your own code would
normally be to then manually declare the function yourself. But if you
didn't even run the test, then adding a declaration of a function you
didn't want in the first place might be bad. In that case, you can
check with #ifdef whether the test was run, and then check the value of
the macro for the test outcome.
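A sketch of that pattern, with a made-up HAVE_DECL_FOO symbol and a made-up
foo() declaration: when the check runs, the symbol is defined to 0 or 1, so a
consumer can distinguish "check said no" from "check never ran":

#if defined(HAVE_DECL_FOO) && !HAVE_DECL_FOO
/* the check ran and found no declaration, so supply our own */
extern int foo(void);
#endif
/* if HAVE_DECL_FOO is not defined at all, the check never ran and we
 * deliberately declare nothing */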
On 2/7/22 21:40, Tom Lane wrote:
I wrote:
Andres Freund <andres@anarazel.de> writes:
What is the reason behind subtracting ccdlflags?
It looks like the coding actually originated here:
commit f5d0c6cad5bb2706e0e63f3f8f32e431ea428100
Ah, here's the thread leading up to that:
/messages/by-id/200106191206.f5JC6R108371@candle.pha.pa.us
The use of ldopts rather than hand-hacked link options seems to date to
0ed7864d6, only a couple days before that. I don't think we had a
buildfarm then, but I'd bet against the problem being especially
widespread even then, or more people would've complained.
The buildfarm's first entry is from 22 Oct 2004.
cheers
andrew
--
Andrew Dunstan
EDB: https://www.enterprisedb.com
Andres Freund <andres@anarazel.de> writes:
On 2022-02-07 20:42:09 -0500, Tom Lane wrote:
... Peter just copied the logic in 7662419f1. Considering that
the point of 7662419f1 was to get rid of MakeMaker, maybe we no
longer needed that at that point.
Yea. And maybe what was broken in 2001 isn't broken anymore either ;)
Yeah --- note that Bruce was complaining about a problem on
Perl 5.005, which was already a bit over-the-hill in 2001.
AIX is the one exception. Specifying -bE... doesn't seem right for building
plperl etc. So possibly the subtraction accidentally does work for us there...
I tried this on AIX 7.2 (using the gcc farm, same build options
as hoverfly). The build still works and passes regression tests,
but you get a warning about each symbol exported by Perl itself:
...
ld: 0711-415 WARNING: Symbol PL_veto_cleanup is already exported.
ld: 0711-415 WARNING: Symbol PL_warn_nl is already exported.
ld: 0711-415 WARNING: Symbol PL_warn_nosemi is already exported.
ld: 0711-415 WARNING: Symbol PL_warn_reserved is already exported.
ld: 0711-415 WARNING: Symbol PL_warn_uninit is already exported.
ld: 0711-415 WARNING: Symbol PL_WB_invlist is already exported.
ld: 0711-415 WARNING: Symbol PL_XPosix_ptrs is already exported.
ld: 0711-415 WARNING: Symbol PL_Yes is already exported.
ld: 0711-415 WARNING: Symbol PL_Zero is already exported.
So there's about 1200 such warnings for plperl, and then the same
again for each contrib foo_plperl module. Maybe that's annoying
enough that we should keep the logic. OTOH, it seems entirely
accidental that it has that effect. I'd be a little inclined to
replace it with some rule about stripping '-bE:' switches out of
the ldopts result.
regards, tom lane
On Tue, Oct 12, 2021, at 10:37, Andres Freund wrote:
- PGXS - and I don't yet know what to best do about it. One
backward-compatible way would be to continue use makefiles for pgxs,
but do the necessary replacement of Makefile.global.in via meson (and
not use that for postgres' own build). But that doesn't really
provide a nicer path for building postgres extensions on windows, so
it'd definitely not be a long-term path.
To help evaluate meson, I've put together a list consisting of 6165 Github repos with (?m)^PGXS in the Makefile.
It's structured in the alphabetical order of each parent repo, with possible children repos underneath, using Markdown nested lists.
https://github.com/joelonsql/postgresql-extension-repos
Perhaps such a list could also be useful for other purposes,
maybe to create some new type of automated tests.
/Joel
Hi,
On 2022-02-08 18:42:33 -0500, Tom Lane wrote:
I'd be a little inclined to replace it with some rule about stripping '-bE:'
switches out of the ldopts result.
Similar. That's a lot easier to understand than -bE ending up stripped by
what we're doing. Should I do so, or do you want to?
Greetings,
Andres Freund
Andres Freund <andres@anarazel.de> writes:
On 2022-02-08 18:42:33 -0500, Tom Lane wrote:
I'd be a little inclined to replace it with some rule about stripping '-bE:'
switches out of the ldopts result.
Similar. That's a lot easier to understand than -bE ending up stripped by
what we're doing. Should I do so, or do you want to?
I could look at it later, but if you want to do it, feel free.
regards, tom lane
On 2/6/22 15:57, Andrew Dunstan wrote:
On 2/6/22 13:39, Andres Freund wrote:
Hi,
On 2022-02-06 12:06:41 -0500, Andrew Dunstan wrote:
Here's a patch. I've tested the perl piece on master and it works fine.
It applies cleanly down to 9.4, which is before we got transform modules
(9.5) which fail if we just omit doing this platform-specific piece.
Given /messages/by-id/34e972bc-6e75-0754-9e6d-cde2518773a1@dunslane.net
wouldn't it make sense to simply remove the pexports/gendef logic instead of
moving to gendef?
I haven't found a way to fix the transform builds if we do that. So
let's leave that as a separate exercise unless you have a solution for
that - this patch is really trivial.
Any objection to my moving ahead with this? My current workaround is this:
cat > /usr/bin/pexports <<EOF
#!/bin/sh
/ucrt64/bin/gendef - "$@"
EOF
chmod +x /usr/bin/pexports
(gendef is available in the ucrt64/mingw-w64-ucrt-x86_64-tools-git
package on msys2)
cheers
andrew
--
Andrew Dunstan
EDB: https://www.enterprisedb.com
Hi,
On 2022-02-10 12:00:16 -0500, Andrew Dunstan wrote:
Any objection to my moving ahead with this?
No. I don't yet understand what the transforms issue is and whether it can be
avoided, but clearly it's an improvement to be able to build with builtin
msys tools vs not...
Greetings,
Andres Freund
On 2/10/22 12:52, Andres Freund wrote:
Hi,
On 2022-02-10 12:00:16 -0500, Andrew Dunstan wrote:
Any objection to my moving ahead with this?
No. I don't yet understand what the transforms issue is and whether it can be
avoided, but clearly it's an improvement to be able to build with builtin
msys tools vs not...
OK, thanks, done.
cheers
andrew
--
Andrew Dunstan
EDB: https://www.enterprisedb.com
Hi,
On 2021-10-13 13:54:10 +0200, Daniel Gustafsson wrote:
I added a --tap option for TAP output to pg_regress together with Jinbao Chen
for giggles and killing some time a while back.
Sorry for not replying to this earlier. I somehow thought I had, but the
archives disagree.
I think this would be great.
If it's helpful and there's any interest in this, I'm happy to finish it up now.
Yes! Probably worth starting a new thread for...
One thing that came out of this, is that we don't really handle the ignored
tests in the way the code thinks it does for normal output, the attached treats
ignored tests as SKIP tests.
I can't really parse the first sentence...
if (exit_status != 0)
log_child_failure(exit_status);
@@ -2152,6 +2413,7 @@ regression_main(int argc, char *argv[],
{"config-auth", required_argument, NULL, 24},
{"max-concurrent-tests", required_argument, NULL, 25},
{"make-testtablespace-dir", no_argument, NULL, 26},
+ {"tap", no_argument, NULL, 27},
{NULL, 0, NULL, 0}
};
I'd make it a --format=(regress|tap) or such.
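As a rough standalone sketch of what a --format=(regress|tap) switch plus TAP-style reporting could look like (the enum, helper names, and output here are illustrative assumptions, not the posted patch):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <stdbool.h>

/* Illustrative only -- not pg_regress code. */
enum test_output_format { FORMAT_REGRESS, FORMAT_TAP };

static enum test_output_format
parse_output_format(const char *name)
{
    if (strcmp(name, "regress") == 0)
        return FORMAT_REGRESS;
    if (strcmp(name, "tap") == 0)
        return FORMAT_TAP;
    fprintf(stderr, "invalid output format \"%s\"\n", name);
    exit(2);
}

/*
 * Emit one result.  In TAP, an "ignored" test can be reported as a pass with
 * a SKIP directive, so harnesses don't count it as a failure.
 */
static void
report_result(enum test_output_format fmt, int testno, const char *name,
              bool ok, bool ignored)
{
    if (fmt == FORMAT_TAP)
    {
        if (ignored)
            printf("ok %d - %s # SKIP ignored test\n", testno, name);
        else
            printf("%s %d - %s\n", ok ? "ok" : "not ok", testno, name);
    }
    else
        printf("%-20s ... %s\n", name, ok ? "ok" : "FAILED");
}

int
main(void)
{
    enum test_output_format fmt = parse_output_format("tap");

    report_result(fmt, 1, "boolean", true, false);
    report_result(fmt, 2, "random", false, true);   /* ignored -> SKIP */
    printf("1..2\n");                                /* TAP plan */
    return 0;
}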
Greetings,
Andres Freund
Is there a current patch set to review in this thread at the moment?
Hi,
On 2022-03-07 14:56:24 +0100, Peter Eisentraut wrote:
Is there a current patch set to review in this thread at the moment?
I've been regularly rebasing and improving the patchset, but didn't post to
the thread about it most of the time.
I've just pushed another rebase, will work to squash it into a reasonable
number of patches and then repost that.
Greetings,
Andres Freund
Hi,
Attached is v6 of the meson patchset. There are lots of changes since the
last version posted. These include:
- python2 removal is now committed, so not needed in here anymore
- CI changed to be based on the CI now merged into postgres
- CI also tests suse, rhel, fedora (Nazir Bilal Yavuz). Found several bugs. I
don't think we'd merge all of those, but while working on the meson branch,
it's really useful.
- all dependencies, except for pl/tcl (should be done soon)
- several missing options added (segsize, extra_{lib,include}_dirs, enable-tap-tests)
- several portability fixes, builds on net/openbsd without changes now
- improvements to a number of "configure" tests
- lots of ongoing rebasing changes
- ...
Greetings,
Andres Freund
Attachments:
On 2022-03-07 09:58:41 -0800, Andres Freund wrote:
On 2022-03-07 14:56:24 +0100, Peter Eisentraut wrote:
Is there a current patch set to review in this thread at the moment?
I've been regularly rebasing and improving the patchset, but didn't post to
the thread about it most of the time.
I've just pushed another rebase, will work to squash it into a reasonable
number of patches and then repost that.
Now done, see /messages/by-id/20220308025629.3xh2yo4sau74oafo@alap3.anarazel.de
On 08.03.22 03:56, Andres Freund wrote:
Attached is v6 of the meson patchset. There are lots of changes since the
last version posted. These include:
- python2 removal is now committed, so not needed in here anymore
- CI changed to be based on the CI now merged into postgres
- CI also tests suse, rhel, fedora (Nazir Bilal Yavuz). Found several bugs. I
don't think we'd merge all of those, but while working on the meson branch,
it's really useful.
- all dependencies, except for pl/tcl (should be done soon)
- several missing options added (segsize, extra_{lib,include}_dirs, enable-tap-tests)
- several portability fixes, builds on net/openbsd without changes now
- improvements to a number of "configure" tests
- lots of ongoing rebasing changes
- ...
I looked at this today mainly to consider whether some of the prereq
work is ready for adoption now. A lot of the work has to do with
making various scripts write the output to other directories. I
suspect this has something to do with how meson handles separate build
directories and how we have so far handled files created in the
distribution tarball. But the whole picture isn't clear to me.
More generally, I don't see a distprep target in the meson build
files. I wonder what your plan for that is, or whether that would
even work under meson. In [0], I argued for getting rid of the
distprep step. Perhaps it is time to reconsider that now.
[0]: /messages/by-id/cf0bec33-d965-664d-e0ec-fb15290f2273@2ndquadrant.com
For the short term, I think the patches 0002, 0008, 0010, and 0011
could be adopted, if they are finished as described.
Patch 0007 seems unrelated, or at least independently significant, and
should be discussed separately.
The rest is really all part of the same
put-things-in-the-right-directory issue.
For the overall patch set, I did a quick test with
meson setup build
cd build
ninja
which failed with
Undefined symbols for architecture x86_64:
"_bbsink_zstd_new", referenced from:
_SendBaseBackup in replication_basebackup.c.o
So maybe your patch set is not up to date with this new zstd build
option.
Details:
v6-0001-meson-prereq-output-and-depencency-tracking-work.patch.gz
This all looks kind of reasonable, but lacks explanation in some
cases, so I can't fully judge it.
v6-0002-meson-prereq-move-snowball_create.sql-creation-in.patch.gz
Looks like a reasonable direction, would be good to deduplicate with
Install.pm.
v6-0003-meson-prereq-add-output-path-arg-in-generate-lwlo.patch.gz
Ok. Similar to 0001. (But unlike 0001, nothing in this patch
actually uses the new output dir option. That only comes in 0013.)
v6-0004-meson-prereq-add-src-tools-gen_versioning_script..patch.gz
This isn't used until 0013, and there it is patched again, so I'm not
sure if this is in the right position of this patch series.
v6-0005-meson-prereq-generate-errcodes.pl-accept-output-f.patch.gz
Also similar to 0001.
v6-0006-meson-prereq-remove-unhelpful-chattiness-in-snowb.patch.gz
Might as well include this into 0002.
v6-0007-meson-prereq-Can-we-get-away-with-not-export-all-.patch.gz
This is a separate discussion. It's not clear to me why this is part
of this patch series.
v6-0008-meson-prereq-Handle-DLSUFFIX-in-msvc-builds-simil.patch.gz
Part of this was already done in 0001, so check if these patches are
split correctly.
I think the right way here is actually to go the other way around:
Move DLSUFFIX into header files for all platforms. Move the DLSUFFIX
assignment from src/makefiles/ to src/templates/, have configure read
it, and then substitute it into Makefile.global and pg_config.h.
Then we also don't have to patch the Windows build code a bunch of
times to add the DLSUFFIX define everywhere.
There is code in configure already that would benefit from this, which
currently says
# We don't know the platform DLSUFFIX here, so check 'em all.
v6-0009-prereq-make-unicode-targets-work-in-vpath-builds.patch.gz
Another directory issue
v6-0010-ldap-tests-Don-t-run-on-unsupported-operating-sys.patch.gz
Not sure what this is supposed to do, but it looks independent of this
patch series. Does it currently not work on "unsupported" operating
systems?
v6-0011-ldap-tests-Add-paths-for-openbsd.patch.gz
The more the merrier, although I'm a little bit worried about pointing
to a /usr/local/share/examples/ directory.
v6-0012-wip-split-TESTDIR-into-two.patch.gz
v6-0013-meson-Add-meson-based-buildsystem.patch.gz
v6-0014-meson-ci-Build-both-with-meson-and-as-before.patch.gz
I suggest in the interim to add a README.meson to show how to invoke
this. Eventually, of course, we'd rewrite the installation
instructions.
Hi,
On 2022-03-09 13:37:23 +0100, Peter Eisentraut wrote:
I looked at this today mainly to consider whether some of the prereq
work is ready for adoption now.
Thanks!
A lot of the work has to do with
making various scripts write the output to other directories. I
suspect this has something to do with how meson handles separate build
directories and how we have so far handled files created in the
distribution tarball. But the whole picture isn't clear to me.
A big part of it is that when building with ninja tools are invoked in the
top-level build directory, but right now a bunch of our scripts put their
output in CWD.
More generally, I don't see a distprep target in the meson build
files. I wonder what your plan for that is, or whether that would
even work under meson. In [0], I argued for getting rid of the
distprep step. Perhaps it is time to reconsider that now.
[0]: /messages/by-id/cf0bec33-d965-664d-e0ec-fb15290f2273@2ndquadrant.com
I think it should be doable to add something roughly like the current distprep. The
cleanest way would be to use
https://mesonbuild.com/Reference-manual_builtin_meson.html#mesonadd_dist_script
to copy the files into the generated tarball.
Of course not adding it would be even easier ;)
For the short term, I think the patches 0002, 0008, 0010, and 0011
could be adopted, if they are finished as described.
Cool.
Patch 0007 seems unrelated, or at least independently significant, and
should be discussed separately.
It's related - it saves us a lot of extra complexity on
windows. I've brought it up as a separate thread too:
/messages/by-id/20211101020311.av6hphdl6xbjbuif@alap3.anarazel.de
I'm currently a bit stuck implementing this properly for the configure / make
system, as outlined in:
/messages/by-id/20220111025328.iq5g6uck53j5qtin@alap3.anarazel.de
The rest is really all part of the same put-things-in-the-right-directory
issue.
For the overall patch set, I did a quick test with
meson setup build
cd build
ninja
which failed with
Undefined symbols for architecture x86_64:
"_bbsink_zstd_new", referenced from:
_SendBaseBackup in replication_basebackup.c.o
So maybe your patch set is not up to date with this new zstd build
option.
Yep, I posted it before "7cf085f077d - Add support for zstd base backup
compression." went in, but after 6c417bbcc8f. So the meson build knew about
the zstd dependency, but didn't yet specify it for postgres /
pg_basebackup. So all that's needed was / is adding the dependency to those
two places.
Updated patches attached. This just contains the fix for this issue, doesn't
yet address review comments.
FWIW, I'd already pushed those fixes out to the git tree. There are frequent
enough small changes that reposting every time seems too noisy.
v6-0001-meson-prereq-output-and-depencency-tracking-work.patch.gz
This all looks kind of reasonable, but lacks explanation in some
cases, so I can't fully judge it.
I'll try to clean it up.
v6-0007-meson-prereq-Can-we-get-away-with-not-export-all-.patch.gz
This is a separate discussion. It's not clear to me why this is part
of this patch series.
See above.
v6-0008-meson-prereq-Handle-DLSUFFIX-in-msvc-builds-simil.patch.gz
Part of this was already done in 0001, so check if these patches are
split correctly.
I think the right way here is actually to go the other way around:
Move DLSUFFIX into header files for all platforms. Move the DLSUFFIX
assignment from src/makefiles/ to src/templates/, have configure read
it, and then substitute it into Makefile.global and pg_config.h.
Then we also don't have to patch the Windows build code a bunch of
times to add the DLSUFFIX define everywhere.
There is code in configure already that would benefit from this, which
currently says
# We don't know the platform DLSUFFIX here, so check 'em all.
I'll try it out.
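As a rough standalone sketch of the direction (the DLSUFFIX value and the helper name are assumptions for illustration, not the actual patch): once pg_config.h carries DLSUFFIX on every platform, code can append the suffix without per-platform guesses:

#include <stdio.h>
#include <string.h>

/*
 * Assumption for illustration: configure / meson / the MSVC scripts would
 * define DLSUFFIX in pg_config.h (".so", ".dylib", ".dll", ...).  Hard-coded
 * here only so the sketch compiles on its own.
 */
#ifndef DLSUFFIX
#define DLSUFFIX ".so"
#endif

/* Append the platform's shared-library suffix unless the name already has it. */
static void
append_dlsuffix(const char *name, char *buf, size_t buflen)
{
    size_t      namelen = strlen(name);
    size_t      suffixlen = strlen(DLSUFFIX);

    if (namelen >= suffixlen &&
        strcmp(name + namelen - suffixlen, DLSUFFIX) == 0)
        snprintf(buf, buflen, "%s", name);
    else
        snprintf(buf, buflen, "%s%s", name, DLSUFFIX);
}

int
main(void)
{
    char        path[1024];

    append_dlsuffix("plperl", path, sizeof(path));
    puts(path);                 /* plperl.so, or whatever DLSUFFIX is */
    return 0;
}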
v6-0009-prereq-make-unicode-targets-work-in-vpath-builds.patch.gz
Another directory issue
I think it's a tad different, in that it's fixing something that's currently
broken in VPATH builds.
v6-0010-ldap-tests-Don-t-run-on-unsupported-operating-sys.patch.gz
Not sure what this is supposed to do, but it looks independent of this
patch series. Does it currently not work on "unsupported" operating
systems?
Right now if you run the ldap tests on windows, openbsd, ... the tests
fail. The only reason it doesn't cause trouble on the buildfarm is that we
currently don't run those tests by default...
v6-0011-ldap-tests-Add-paths-for-openbsd.patch.gz
The more the merrier, although I'm a little bit worried about pointing
to a /usr/local/share/examples/ directory.
It's where the files are in the package :/.
v6-0012-wip-split-TESTDIR-into-two.patch.gz
v6-0013-meson-Add-meson-based-buildsystem.patch.gz
v6-0014-meson-ci-Build-both-with-meson-and-as-before.patch.gz
I suggest in the interim to add a README.meson to show how to invoke
this. Eventually, of course, we'd rewrite the installation
instructions.
Good idea.
Greetings,
Andres Freund
Attachments:
v7-0013-meson-Add-meson-based-buildsystem.patch.gz