chore: dependencies upgrade and move to go 1.22

Commit: 322e6a65ae (parent: 73cf9b0ba1)
Date: 2024-04-02 19:24:34 +02:00
246 changed files with 10165 additions and 5482 deletions

146
vendor/gocv.io/x/gocv/CHANGELOG.md generated vendored

@@ -1,3 +1,149 @@
0.36.0
---
* **all**
* Add support for OpenCV 4.9.0
* update Go to version 1.22
* update minimum go version to 1.21
* **bugfix**
* aruco: correct test from latest OpenCV update
* **build**
* add GH action for Windows
* remove appveyor
* adjusted Makefile to build for debian bookworm
* **core**
* Add additional signature for MinMaxLoc.
* add color conversion alias
* add Mahalanobis(), Inv(), Row(), and Col() functions
* add MulTransposed() function
* add PCABackProject() and PCAProject() functions
* add PSNR() function
* add SVBackSubst() and SVDecomp() functions
* **calib3d**
* add FisheyeCalibrate, FisheyeDistortPoints, and CheckChessboard functions
* Add func comments and update readme
* add Rodrigues function
* add SolvePnP function
* Add more smoke tests
* Initial commit of more stereo bindings
* **feature2d**
* Add interface for `Feature2D` algorithms
* Asserting some algorithms conform to `Feature2D`
* Prepend "Feature2D" prefix to component interfaces of Feature2D
* **imgproc**
* add CreateHanningWindow()
* add EMD()
* Add float version of BoxPoints and MinAreaRect
* Add new binding for cv::Erode.
* **videoio**
* add Retrieve function
* **contrib/xfeatures2d**
* Add BriefDescriptorExtractor to xfeatures2d (#1114)
* add NewSURFWithParams func
* Add separate "Compute" bindings for detection algorithms (#1117)
* **cuda/core**
* ADD Cuda MultiplyWithStream (#1142)
0.35.0
---
* **all**
* Add support for OpenCV 4.8.1
* correct Go formatting
* **features2d**
* Add Match method for BFMatcher
* **build**
* remove extra files from GH actions runner so GPU image builds have enough temp file space to run correctly
* **make**
* for build_raspi, added conditional cmake build for 64-bit and 32-bit platforms
* remove ENABLE_VFPV3=ON from, and add WITH_TBB=ON to, the 64-bit build
* added sudo_pre_install_clean to raspberry pi and jetson installs
* change sudo_pre_install_clean to support cleanup on 64-bit architectures (arm and x86)
0.34.0
---
* **all**
* Add support for OpenCV 4.8.0
* Add support for Go 1.21
* **build**
* update all builds to use OpenCV 4.8.0
* **core**
* Adds support for PCACompute
* **docker**
* add dockerfile for OpenCV static build
* **make**
* Leave one processor free instead of using all of them when building
0.33.0
---
* **bugfix**
* Remove opencv2/aruco.hpp include
* **all**
* build performance tests with all OpenCV builds
* **build**
* build and push Ubuntu 22.04 base image with OpenCV 4.7.0
* docker images with opencv
* docker production images with opencv 4.7.0
* Docker push to GHCR
* **core**
* Add ReduceArgMax and ReduceArgMin
* **dnn**
* improved NMSBoxes code
* **docker**
* add dockerfile for Ubuntu 22.04 OpenCV base image
* updates to migrate to GHCR
* **examples**
* Deallocate Mats in feature-matching example.
* Fix G108 (CWE-200) and G114 (CWE-676)
* Fix G304 (CWE-22) and G307 (CWE-703)
* Fix G304 (CWE-22) and G307 (CWE-703)
* Missed #nosec tag
* **make**
* Ubuntu Jammy (22) opencv build support.
0.32.0
---
* **all**
* update to OpenCV 4.7.0
* **core**
* Add setter and getter for the number of threads
* **calib3d**
* add EstimateAffinePartial2DWithParams()
* **imgcodecs**
* Add IMDecodeIntoMat to reduce heap allocations (#1035)
* **imgproc**
* add matchShapes function support
* **objdetect**
* move aruco from contrib and also refactor/update to match current OpenCV API
* **photo**
* add inpaint function
* **video**
* cv::KalmanFilter bindings.
* **cuda**
* add support for cuda::TemplateMatching
* **docker**
* update all dockerfiles for OpenCV 4.7.0/GoCV 0.32.0
* multiplatform for both amd64 and arm64
* install libjpeg-turbo into docker image
* add Ubuntu 18.04 and 20.04 prebuilt OpenCV images
* add dockerfile for older version of CUDA for those who cannot upgrade
* **ci**
* remove circleci
* correct actions that trigger build
* **make**
* change download path for OpenCV release tag
* **windows**
* Update win_build_opencv.cmd
* **docs**
* correct docs on building docker
* update ROADMAP
* typo in comment
* update comments style with gofmt
* **openvino**
* Add openvino Dockerfile
* Fix OpenvinoVersion dangling pointer
* Update env.sh and README.md for 2022.1
0.31.0
---
* **all**
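As a rough illustration of the new core bindings listed under 0.36.0 above, here is a minimal Go sketch exercising the PSNR binding. It is an assumption-laden example, not taken from this diff: the PSNR signature (two Mats in, float64 out) follows the usual gocv convention, and the image paths are placeholders.

```go
package main

import (
	"fmt"

	"gocv.io/x/gocv"
)

func main() {
	// Hypothetical use of the PSNR binding added in gocv 0.36.0;
	// the signature is assumed, not shown in this diff.
	ref := gocv.IMRead("reference.png", gocv.IMReadColor)
	defer ref.Close()
	enc := gocv.IMRead("encoded.png", gocv.IMReadColor)
	defer enc.Close()

	if ref.Empty() || enc.Empty() {
		fmt.Println("could not read input images")
		return
	}

	// PSNR compares two images of the same size and type.
	fmt.Printf("PSNR: %.2f dB\n", gocv.PSNR(ref, enc))
}
```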

2
vendor/gocv.io/x/gocv/Dockerfile generated vendored

@@ -1,6 +1,6 @@
# to build this docker image:
# docker build .
FROM gocv/opencv:4.6.0
FROM ghcr.io/hybridgroup/opencv:4.9.0
ENV GOPATH /go


@@ -3,10 +3,10 @@
#
# To run tests:
# xhost +
# docker run -it --rm -e DISPLAY=$DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix gocv-test
# docker run -it --rm -e DISPLAY=$DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix gocv-test-4.x
# xhost -
#
FROM gocv/opencv:4.6.0 AS gocv-test
FROM ghcr.io/hybridgroup/opencv:4.9.0 AS gocv-test-4.7
ENV GOPATH /go
@@ -14,6 +14,6 @@ COPY . /go/src/gocv.io/x/gocv/
WORKDIR /go/src/gocv.io/x/gocv
RUN go get -u github.com/rakyll/gotest
RUN go install github.com/rakyll/gotest@latest
ENTRYPOINT ["gotest", "-v", ".", "./contrib/..."]


@@ -4,7 +4,7 @@
# To run tests:
# docker run -it --rm --gpus all gocv-test-gpu-cuda-10
#
FROM gocv/opencv:4.6.0-gpu-cuda-10 AS gocv-gpu-test-cuda-10
FROM ghcr.io/hybridgroup/opencv:4.9.0-gpu-cuda-10 AS gocv-gpu-test-cuda-10
ENV GOPATH /go
ENV PATH="${PATH}:/go/bin"
@@ -13,6 +13,6 @@ COPY . /go/src/gocv.io/x/gocv/
WORKDIR /go/src/gocv.io/x/gocv
RUN go get -u github.com/rakyll/gotest
RUN go install github.com/rakyll/gotest@latest
ENTRYPOINT ["gotest", "-v", "./cuda/..."]


@@ -4,7 +4,7 @@
# To run tests:
# docker run -it --rm --gpus all gocv-test-gpu-cuda-11
#
FROM gocv/opencv:4.6.0-gpu-cuda-11 AS gocv-gpu-test-cuda-11
FROM ghcr.io/hybridgroup/opencv:4.9.0-gpu-cuda-11 AS gocv-gpu-test-cuda-11
ENV GOPATH /go
ENV PATH="${PATH}:/go/bin"
@@ -13,6 +13,6 @@ COPY . /go/src/gocv.io/x/gocv/
WORKDIR /go/src/gocv.io/x/gocv
RUN go get -u github.com/rakyll/gotest
RUN go install github.com/rakyll/gotest@latest
ENTRYPOINT ["gotest", "-v", "./cuda/..."]

18
vendor/gocv.io/x/gocv/Dockerfile-test.gpu-cuda-11.2.2 generated vendored Normal file

@@ -0,0 +1,18 @@
# To build:
# docker build -f Dockerfile-test.gpu-cuda-11.2.2 -t gocv-test-gpu-cuda-11.2.2 .
#
# To run tests:
# docker run -it --rm --gpus all gocv-test-gpu-cuda-11.2.2
#
FROM ghcr.io/hybridgroup/opencv:4.9.0-gpu-cuda-11.2.2 AS gocv-gpu-test-cuda-11
ENV GOPATH /go
ENV PATH="${PATH}:/go/bin"
COPY . /go/src/gocv.io/x/gocv/
WORKDIR /go/src/gocv.io/x/gocv
RUN go install github.com/rakyll/gotest@latest
ENTRYPOINT ["gotest", "-v", "./cuda/..."]


@@ -1,6 +1,6 @@
# to build this docker image:
# docker build -f Dockerfile.gpu .
FROM gocv/opencv:4.6.0-gpu-cuda-11 AS gocv-gpu
FROM ghcr.io/hybridgroup/opencv:4.9.0-gpu-cuda-11 AS gocv-gpu
ENV GOPATH /go


@@ -1,45 +1,140 @@
# to build this docker image:
# docker build -f Dockerfile.opencv -t gocv/opencv:4.6.0 .
FROM golang:1.18-buster AS opencv
# OpenCV 4 prebuilt multiarchitecture image
#
# To build release:
# docker buildx build -f Dockerfile.opencv -t ghcr.io/hybridgroup/opencv:4.9.0 -t ghcr.io/hybridgroup/opencv:latest --platform=linux/arm64,linux/amd64 --push .
#
# To build prerelease:
# docker buildx build --build-arg OPENCV_VERSION="4.x" --build-arg OPENCV_FILE="https://github.com/opencv/opencv/archive/refs/heads/4.x.zip" --build-arg OPENCV_CONTRIB_FILE="https://github.com/opencv/opencv_contrib/archive/refs/heads/4.x.zip" -f Dockerfile.opencv -t ghcr.io/hybridgroup/opencv:4.9.0-dev --platform=linux/arm64,linux/amd64 --push .
###################
# amd64 build stage
###################
FROM --platform=linux/amd64 golang:1.22-bullseye AS opencv-base-amd64
LABEL maintainer="hybridgroup"
RUN apt-get update && apt-get install -y \
git build-essential cmake pkg-config unzip libgtk2.0-dev \
curl ca-certificates libcurl4-openssl-dev libssl-dev \
libavcodec-dev libavformat-dev libswscale-dev libtbb2 libtbb-dev \
libjpeg62-turbo-dev libpng-dev libtiff-dev libdc1394-22-dev nasm && \
rm -rf /var/lib/apt/lists/*
FROM --platform=linux/amd64 opencv-base-amd64 AS opencv-build-amd64
ARG OPENCV_VERSION="4.9.0"
ENV OPENCV_VERSION $OPENCV_VERSION
ARG OPENCV_FILE="https://github.com/opencv/opencv/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_FILE $OPENCV_FILE
ARG OPENCV_CONTRIB_FILE="https://github.com/opencv/opencv_contrib/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_CONTRIB_FILE $OPENCV_CONTRIB_FILE
RUN curl -Lo opencv.zip ${OPENCV_FILE} && \
unzip -q opencv.zip && \
curl -Lo opencv_contrib.zip ${OPENCV_CONTRIB_FILE} && \
unzip -q opencv_contrib.zip && \
rm opencv.zip opencv_contrib.zip
RUN cd opencv-${OPENCV_VERSION} && \
mkdir build && cd build && \
cmake -D CMAKE_BUILD_TYPE=RELEASE \
-D WITH_IPP=OFF \
-D WITH_OPENGL=OFF \
-D WITH_QT=OFF \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib-${OPENCV_VERSION}/modules \
-D OPENCV_ENABLE_NONFREE=ON \
-D WITH_JASPER=OFF \
-D WITH_TBB=ON \
-D BUILD_JPEG=ON \
-D WITH_SIMD=ON \
-D ENABLE_LIBJPEG_TURBO_SIMD=ON \
-D BUILD_DOCS=OFF \
-D BUILD_EXAMPLES=OFF \
-D BUILD_TESTS=OFF \
-D BUILD_PERF_TESTS=ON \
-D BUILD_opencv_java=NO \
-D BUILD_opencv_python=NO \
-D BUILD_opencv_python2=NO \
-D BUILD_opencv_python3=NO \
-D OPENCV_GENERATE_PKGCONFIG=ON .. && \
make -j $(nproc --all) && \
make preinstall && make install && ldconfig && \
cd / && rm -rf opencv*
###################
# arm64 build stage
###################
FROM --platform=linux/arm64 golang:1.22-bullseye AS opencv-base-arm64
LABEL maintainer="hybridgroup"
RUN apt-get update && apt-get install -y --no-install-recommends \
git build-essential cmake pkg-config unzip libgtk2.0-dev \
curl ca-certificates libcurl4-openssl-dev libssl-dev \
libavcodec-dev libavformat-dev libswscale-dev libtbb2 libtbb-dev \
libjpeg-dev libpng-dev libtiff-dev libdc1394-22-dev && \
rm -rf /var/lib/apt/lists/*
git build-essential cmake pkg-config unzip libgtk2.0-dev \
curl ca-certificates libcurl4-openssl-dev libssl-dev \
libavcodec-dev libavformat-dev libswscale-dev libtbb2 libtbb-dev \
libjpeg62-turbo-dev libpng-dev libtiff-dev libdc1394-22-dev && \
apt-get autoremove -y && apt-get autoclean -y
ARG OPENCV_VERSION="4.6.0"
FROM --platform=linux/arm64 opencv-base-arm64 AS opencv-build-arm64
ARG OPENCV_VERSION="4.9.0"
ENV OPENCV_VERSION $OPENCV_VERSION
RUN curl -Lo opencv.zip https://github.com/opencv/opencv/archive/${OPENCV_VERSION}.zip && \
unzip -q opencv.zip && \
curl -Lo opencv_contrib.zip https://github.com/opencv/opencv_contrib/archive/${OPENCV_VERSION}.zip && \
unzip -q opencv_contrib.zip && \
rm opencv.zip opencv_contrib.zip && \
cd opencv-${OPENCV_VERSION} && \
mkdir build && cd build && \
cmake -D CMAKE_BUILD_TYPE=RELEASE \
-D WITH_IPP=OFF \
-D WITH_OPENGL=OFF \
-D WITH_QT=OFF \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib-${OPENCV_VERSION}/modules \
-D OPENCV_ENABLE_NONFREE=ON \
-D WITH_JASPER=OFF \
-D WITH_TBB=ON \
-D BUILD_DOCS=OFF \
-D BUILD_EXAMPLES=OFF \
-D BUILD_TESTS=OFF \
-D BUILD_PERF_TESTS=OFF \
-D BUILD_opencv_java=NO \
-D BUILD_opencv_python=NO \
-D BUILD_opencv_python2=NO \
-D BUILD_opencv_python3=NO \
-D OPENCV_GENERATE_PKGCONFIG=ON .. && \
make -j $(nproc --all) && \
make preinstall && make install && ldconfig && \
cd / && rm -rf opencv*
ARG OPENCV_FILE="https://github.com/opencv/opencv/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_FILE $OPENCV_FILE
CMD ["go version"]
ARG OPENCV_CONTRIB_FILE="https://github.com/opencv/opencv_contrib/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_CONTRIB_FILE $OPENCV_CONTRIB_FILE
RUN curl -Lo opencv.zip ${OPENCV_FILE} && \
unzip -q opencv.zip && \
curl -Lo opencv_contrib.zip ${OPENCV_CONTRIB_FILE} && \
unzip -q opencv_contrib.zip && \
rm opencv.zip opencv_contrib.zip
RUN cd opencv-${OPENCV_VERSION} && \
mkdir build && cd build && \
cmake -D CMAKE_BUILD_TYPE=RELEASE \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib-${OPENCV_VERSION}/modules \
-D ENABLE_NEON=ON \
-D WITH_FFMPEG=ON \
-D WITH_TBB=ON \
-D BUILD_TBB=ON \
-D BUILD_TESTS=OFF \
-D WITH_EIGEN=OFF \
-D WITH_GSTREAMER=OFF \
-D WITH_V4L=ON \
-D WITH_LIBV4L=ON \
-D WITH_VTK=OFF \
-D WITH_QT=OFF \
-D BUILD_JPEG=ON \
-D OPENCV_ENABLE_NONFREE=ON \
-D BUILD_DOCS=OFF \
-D BUILD_EXAMPLES=OFF \
-D BUILD_TESTS=OFF \
-D BUILD_PERF_TESTS=ON \
-D BUILD_opencv_java=NO \
-D BUILD_opencv_python=NO \
-D BUILD_opencv_python2=NO \
-D BUILD_opencv_python3=NO \
-D OPENCV_GENERATE_PKGCONFIG=ON \
-D CMAKE_TOOLCHAIN_FILE=../platforms/linux/aarch64-gnu.toolchain.cmake .. && \
make -j $(nproc --all) && \
make preinstall && make install && ldconfig && \
cd / && rm -rf opencv*
ARG TARGETARCH
###################
# multiarch build stage
###################
FROM opencv-build-${TARGETARCH} as opencv-final
CMD ["opencv_version", "-b"]


@@ -1,5 +1,5 @@
# to build this docker image:
# docker build -f Dockerfile.opencv-gpu-cuda-10 -t gocv/opencv:4.6.0-gpu-cuda-10 .
# docker build -f Dockerfile.opencv-gpu-cuda-10 -t ghcr.io/hybridgroup/opencv:4.9.0-gpu-cuda-10 .
FROM nvidia/cuda:10.2-cudnn8-devel AS opencv-gpu-base
LABEL maintainer="hybridgroup"
@@ -9,13 +9,13 @@ LABEL maintainer="hybridgroup"
RUN apt-key adv --fetch-keys http://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/3bf863cc.pub
RUN apt-get update && apt-get install -y --no-install-recommends \
git build-essential cmake pkg-config unzip libgtk2.0-dev \
wget curl ca-certificates libcurl4-openssl-dev libssl-dev \
libavcodec-dev libavformat-dev libswscale-dev libtbb2 libtbb-dev \
libjpeg-dev libpng-dev libtiff-dev libdc1394-22-dev && \
rm -rf /var/lib/apt/lists/*
git build-essential cmake pkg-config unzip libgtk2.0-dev \
wget curl ca-certificates libcurl4-openssl-dev libssl-dev \
libavcodec-dev libavformat-dev libswscale-dev libtbb2 libtbb-dev \
libjpeg-dev libpng-dev libtiff-dev libdc1394-22-dev && \
rm -rf /var/lib/apt/lists/*
ARG OPENCV_VERSION="4.6.0"
ARG OPENCV_VERSION="4.9.0"
ENV OPENCV_VERSION $OPENCV_VERSION
RUN curl -Lo opencv.zip https://github.com/opencv/opencv/archive/${OPENCV_VERSION}.zip && \
@@ -26,32 +26,32 @@ RUN curl -Lo opencv.zip https://github.com/opencv/opencv/archive/${OPENCV_VERSIO
cd opencv-${OPENCV_VERSION} && \
mkdir build && cd build && \
cmake -D CMAKE_BUILD_TYPE=RELEASE \
-D WITH_IPP=OFF \
-D WITH_OPENGL=OFF \
-D WITH_QT=OFF \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib-${OPENCV_VERSION}/modules \
-D OPENCV_ENABLE_NONFREE=ON \
-D WITH_JASPER=OFF \
-D BUILD_DOCS=OFF \
-D BUILD_EXAMPLES=OFF \
-D BUILD_TESTS=OFF \
-D BUILD_PERF_TESTS=OFF \
-D BUILD_opencv_java=NO \
-D BUILD_opencv_python=NO \
-D BUILD_opencv_python2=NO \
-D BUILD_opencv_python3=NO \
-D WITH_TBB=ON \
-D WITH_CUDA=ON \
-D ENABLE_FAST_MATH=1 \
-D CUDA_FAST_MATH=1 \
-D WITH_CUBLAS=1 \
-D CUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda/ \
-D BUILD_opencv_cudacodec=OFF \
-D WITH_CUDNN=ON \
-D OPENCV_DNN_CUDA=ON \
-D CUDA_GENERATION=Auto \
-D OPENCV_GENERATE_PKGCONFIG=ON .. && \
-D WITH_IPP=OFF \
-D WITH_OPENGL=OFF \
-D WITH_QT=OFF \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib-${OPENCV_VERSION}/modules \
-D OPENCV_ENABLE_NONFREE=ON \
-D WITH_JASPER=OFF \
-D BUILD_DOCS=OFF \
-D BUILD_EXAMPLES=OFF \
-D BUILD_TESTS=OFF \
-D BUILD_PERF_TESTS=ON \
-D BUILD_opencv_java=NO \
-D BUILD_opencv_python=NO \
-D BUILD_opencv_python2=NO \
-D BUILD_opencv_python3=NO \
-D WITH_TBB=ON \
-D WITH_CUDA=ON \
-D ENABLE_FAST_MATH=1 \
-D CUDA_FAST_MATH=1 \
-D WITH_CUBLAS=1 \
-D CUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda/ \
-D BUILD_opencv_cudacodec=OFF \
-D WITH_CUDNN=ON \
-D OPENCV_DNN_CUDA=ON \
-D CUDA_GENERATION=Auto \
-D OPENCV_GENERATE_PKGCONFIG=ON .. && \
make -j $(nproc --all) && \
make preinstall && make install && ldconfig && \
cd / && rm -rf opencv*
@@ -59,7 +59,7 @@ RUN curl -Lo opencv.zip https://github.com/opencv/opencv/archive/${OPENCV_VERSIO
# install golang here
FROM opencv-gpu-base AS opencv-gpu-golang
ENV GO_RELEASE=1.18.3
ENV GO_RELEASE=1.22.0
RUN wget https://dl.google.com/go/go${GO_RELEASE}.linux-amd64.tar.gz && \
tar xfv go${GO_RELEASE}.linux-amd64.tar.gz -C /usr/local && \
rm go${GO_RELEASE}.linux-amd64.tar.gz


@@ -1,53 +1,63 @@
# to build this docker image:
# docker build -f Dockerfile.opencv-gpu-cuda-11 -t gocv/opencv:4.6.0-gpu-cuda-11 .
FROM nvidia/cuda:11.5.2-cudnn8-devel-ubuntu20.04 AS opencv-gpu-cuda-11-base
# docker build -f Dockerfile.opencv-gpu-cuda-11 -t ghcr.io/hybridgroup/opencv:4.9.0-gpu-cuda-11 .
# docker build --build-arg OPENCV_VERSION="4.x" --build-arg OPENCV_FILE="https://github.com/opencv/opencv/archive/refs/heads/4.x.zip" --build-arg OPENCV_CONTRIB_FILE="https://github.com/opencv/opencv_contrib/archive/refs/heads/4.x.zip" -f Dockerfile.opencv-gpu-cuda-11 -t ghcr.io/hybridgroup/opencv:4.9.0-dev-gpu-cuda-11 .
FROM nvidia/cuda:11.8.0-cudnn8-devel-ubuntu20.04 AS opencv-gpu-cuda-11-base
LABEL maintainer="hybridgroup"
ENV DEBIAN_FRONTEND=noninteractive
RUN apt-get update && apt-get install -y --no-install-recommends \
git build-essential cmake pkg-config unzip libgtk2.0-dev \
wget curl ca-certificates libcurl4-openssl-dev libssl-dev \
libavcodec-dev libavformat-dev libswscale-dev libtbb2 libtbb-dev \
libjpeg-dev libpng-dev libtiff-dev libdc1394-22-dev && \
rm -rf /var/lib/apt/lists/*
git build-essential cmake pkg-config unzip libgtk2.0-dev \
wget curl ca-certificates libcurl4-openssl-dev libssl-dev \
libavcodec-dev libavformat-dev libswscale-dev libtbb2 libtbb-dev \
libjpeg-turbo8-dev libpng-dev libtiff-dev libdc1394-22-dev nasm && \
rm -rf /var/lib/apt/lists/*
ARG OPENCV_VERSION="4.6.0"
ARG OPENCV_VERSION="4.9.0"
ENV OPENCV_VERSION $OPENCV_VERSION
RUN curl -Lo opencv.zip https://github.com/opencv/opencv/archive/${OPENCV_VERSION}.zip && \
ARG OPENCV_FILE="https://github.com/opencv/opencv/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_FILE $OPENCV_FILE
ARG OPENCV_CONTRIB_FILE="https://github.com/opencv/opencv_contrib/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_CONTRIB_FILE $OPENCV_CONTRIB_FILE
RUN curl -Lo opencv.zip ${OPENCV_FILE} && \
unzip -q opencv.zip && \
curl -Lo opencv_contrib.zip https://github.com/opencv/opencv_contrib/archive/${OPENCV_VERSION}.zip && \
curl -Lo opencv_contrib.zip ${OPENCV_CONTRIB_FILE} && \
unzip -q opencv_contrib.zip && \
rm opencv.zip opencv_contrib.zip && \
cd opencv-${OPENCV_VERSION} && \
mkdir build && cd build && \
cmake -D CMAKE_BUILD_TYPE=RELEASE \
-D WITH_IPP=OFF \
-D WITH_OPENGL=OFF \
-D WITH_QT=OFF \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib-${OPENCV_VERSION}/modules \
-D OPENCV_ENABLE_NONFREE=ON \
-D WITH_JASPER=OFF \
-D BUILD_DOCS=OFF \
-D BUILD_EXAMPLES=OFF \
-D BUILD_TESTS=OFF \
-D BUILD_PERF_TESTS=OFF \
-D BUILD_opencv_java=NO \
-D BUILD_opencv_python=NO \
-D BUILD_opencv_python2=NO \
-D BUILD_opencv_python3=NO \
-D WITH_TBB=ON \
-D WITH_CUDA=ON \
-D ENABLE_FAST_MATH=1 \
-D CUDA_FAST_MATH=1 \
-D WITH_CUBLAS=1 \
-D CUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda/ \
-D BUILD_opencv_cudacodec=OFF \
-D WITH_CUDNN=ON \
-D OPENCV_DNN_CUDA=ON \
-D CUDA_GENERATION=Auto \
-D OPENCV_GENERATE_PKGCONFIG=ON .. && \
-D WITH_IPP=OFF \
-D WITH_OPENGL=OFF \
-D WITH_QT=OFF \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib-${OPENCV_VERSION}/modules \
-D OPENCV_ENABLE_NONFREE=ON \
-D WITH_JASPER=OFF \
-D BUILD_DOCS=OFF \
-D BUILD_EXAMPLES=OFF \
-D BUILD_TESTS=OFF \
-D BUILD_PERF_TESTS=ON \
-D BUILD_opencv_java=NO \
-D BUILD_opencv_python=NO \
-D BUILD_opencv_python2=NO \
-D BUILD_opencv_python3=NO \
-D WITH_TBB=ON \
-D BUILD_JPEG=ON \
-D WITH_SIMD=ON \
-D WITH_LIBJPEG_TURBO_SIMD=ON \
-D WITH_CUDA=ON \
-D ENABLE_FAST_MATH=1 \
-D CUDA_FAST_MATH=1 \
-D WITH_CUBLAS=1 \
-D CUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda/ \
-D BUILD_opencv_cudacodec=OFF \
-D WITH_CUDNN=ON \
-D OPENCV_DNN_CUDA=ON \
-D CUDA_ARCH_BIN=6.0,6.1,7.0,7.5,8.0,8.6 \
-D OPENCV_GENERATE_PKGCONFIG=ON .. && \
make -j $(nproc --all) && \
make preinstall && make install && ldconfig && \
cd / && rm -rf opencv*
@@ -55,7 +65,7 @@ RUN curl -Lo opencv.zip https://github.com/opencv/opencv/archive/${OPENCV_VERSIO
# install golang here
FROM opencv-gpu-cuda-11-base AS opencv-gpu-cuda-11-golang
ENV GO_RELEASE=1.18.3
ENV GO_RELEASE=1.22.0
RUN wget https://dl.google.com/go/go${GO_RELEASE}.linux-amd64.tar.gz && \
tar xfv go${GO_RELEASE}.linux-amd64.tar.gz -C /usr/local && \
rm go${GO_RELEASE}.linux-amd64.tar.gz


@@ -0,0 +1,74 @@
# to build this docker image:
# docker build -f Dockerfile.opencv-gpu-cuda-11.2.2 -t ghcr.io/hybridgroup/opencv:4.9.0-gpu-cuda-11.2.2 .
# docker build --build-arg OPENCV_VERSION="4.x" --build-arg OPENCV_FILE="https://github.com/opencv/opencv/archive/refs/heads/4.x.zip" --build-arg OPENCV_CONTRIB_FILE="https://github.com/opencv/opencv_contrib/archive/refs/heads/4.x.zip" -f Dockerfile.opencv-gpu-cuda-11.2.2 -t ghcr.io/hybridgroup/opencv:4.9.0-dev-gpu-cuda-11.2.2 .
FROM nvidia/cuda:11.2.2-cudnn8-devel-ubuntu20.04 AS opencv-gpu-cuda-11-base
LABEL maintainer="hybridgroup"
ENV DEBIAN_FRONTEND=noninteractive
RUN apt-get update && apt-get install -y --no-install-recommends \
git build-essential cmake pkg-config unzip libgtk2.0-dev \
wget curl ca-certificates libcurl4-openssl-dev libssl-dev \
libavcodec-dev libavformat-dev libswscale-dev libtbb2 libtbb-dev \
libjpeg-turbo8-dev libpng-dev libtiff-dev libdc1394-22-dev nasm && \
rm -rf /var/lib/apt/lists/*
ARG OPENCV_VERSION="4.9.0"
ENV OPENCV_VERSION $OPENCV_VERSION
ARG OPENCV_FILE="https://github.com/opencv/opencv/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_FILE $OPENCV_FILE
ARG OPENCV_CONTRIB_FILE="https://github.com/opencv/opencv_contrib/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_CONTRIB_FILE $OPENCV_CONTRIB_FILE
RUN curl -Lo opencv.zip ${OPENCV_FILE} && \
unzip -q opencv.zip && \
curl -Lo opencv_contrib.zip ${OPENCV_CONTRIB_FILE} && \
unzip -q opencv_contrib.zip && \
rm opencv.zip opencv_contrib.zip && \
cd opencv-${OPENCV_VERSION} && \
mkdir build && cd build && \
cmake -D CMAKE_BUILD_TYPE=RELEASE \
-D WITH_IPP=OFF \
-D WITH_OPENGL=OFF \
-D WITH_QT=OFF \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib-${OPENCV_VERSION}/modules \
-D OPENCV_ENABLE_NONFREE=ON \
-D WITH_JASPER=OFF \
-D BUILD_DOCS=OFF \
-D BUILD_EXAMPLES=OFF \
-D BUILD_TESTS=OFF \
-D BUILD_PERF_TESTS=ON \
-D BUILD_opencv_java=NO \
-D BUILD_opencv_python=NO \
-D BUILD_opencv_python2=NO \
-D BUILD_opencv_python3=NO \
-D WITH_TBB=ON \
-D BUILD_JPEG=ON \
-D WITH_SIMD=ON \
-D WITH_LIBJPEG_TURBO_SIMD=ON \
-D WITH_CUDA=ON \
-D ENABLE_FAST_MATH=1 \
-D CUDA_FAST_MATH=1 \
-D WITH_CUBLAS=1 \
-D CUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda/ \
-D BUILD_opencv_cudacodec=OFF \
-D WITH_CUDNN=ON \
-D OPENCV_DNN_CUDA=ON \
-D CUDA_ARCH_BIN=6.0,6.1,7.0,7.5,8.0,8.6 \
-D OPENCV_GENERATE_PKGCONFIG=ON .. && \
make -j $(nproc --all) && \
make preinstall && make install && ldconfig && \
cd / && rm -rf opencv*
# install golang here
FROM opencv-gpu-cuda-11-base AS opencv-gpu-cuda-11-golang
ENV GO_RELEASE=1.22.0
RUN wget https://dl.google.com/go/go${GO_RELEASE}.linux-amd64.tar.gz && \
tar xfv go${GO_RELEASE}.linux-amd64.tar.gz -C /usr/local && \
rm go${GO_RELEASE}.linux-amd64.tar.gz
ENV PATH="${PATH}:/usr/local/go/bin"
CMD ["go version"]

74
vendor/gocv.io/x/gocv/Dockerfile.opencv-gpu-cuda-12 generated vendored Normal file

@@ -0,0 +1,74 @@
# to build this docker image:
# docker build -f Dockerfile.opencv-gpu-cuda-12 -t ghcr.io/hybridgroup/opencv:4.9.0-gpu-cuda-12 .
# docker build --build-arg OPENCV_VERSION="4.x" --build-arg OPENCV_FILE="https://github.com/opencv/opencv/archive/refs/heads/4.x.zip" --build-arg OPENCV_CONTRIB_FILE="https://github.com/opencv/opencv_contrib/archive/refs/heads/4.x.zip" -f Dockerfile.opencv-gpu-cuda-12 -t ghcr.io/hybridgroup/opencv:4.9.0-dev-gpu-cuda-12 .
FROM nvidia/cuda:12.1.0-cudnn8-devel-ubuntu22.04 AS opencv-gpu-cuda-12-base
LABEL maintainer="hybridgroup"
ENV DEBIAN_FRONTEND=noninteractive
RUN apt-get update && apt-get install -y --no-install-recommends \
git build-essential cmake pkg-config unzip libgtk2.0-dev \
wget curl ca-certificates libcurl4-openssl-dev libssl-dev \
libavcodec-dev libavformat-dev libswscale-dev libtbb2 libtbb-dev \
libjpeg-turbo8-dev libpng-dev libtiff-dev libdc1394-dev nasm && \
rm -rf /var/lib/apt/lists/*
ARG OPENCV_VERSION="4.9.0"
ENV OPENCV_VERSION $OPENCV_VERSION
ARG OPENCV_FILE="https://github.com/opencv/opencv/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_FILE $OPENCV_FILE
ARG OPENCV_CONTRIB_FILE="https://github.com/opencv/opencv_contrib/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_CONTRIB_FILE $OPENCV_CONTRIB_FILE
RUN curl -Lo opencv.zip ${OPENCV_FILE} && \
unzip -q opencv.zip && \
curl -Lo opencv_contrib.zip ${OPENCV_CONTRIB_FILE} && \
unzip -q opencv_contrib.zip && \
rm opencv.zip opencv_contrib.zip && \
cd opencv-${OPENCV_VERSION} && \
mkdir build && cd build && \
cmake -D CMAKE_BUILD_TYPE=RELEASE \
-D WITH_IPP=OFF \
-D WITH_OPENGL=OFF \
-D WITH_QT=OFF \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib-${OPENCV_VERSION}/modules \
-D OPENCV_ENABLE_NONFREE=ON \
-D WITH_JASPER=OFF \
-D BUILD_DOCS=OFF \
-D BUILD_EXAMPLES=OFF \
-D BUILD_TESTS=OFF \
-D BUILD_PERF_TESTS=ON \
-D BUILD_opencv_java=NO \
-D BUILD_opencv_python=NO \
-D BUILD_opencv_python2=NO \
-D BUILD_opencv_python3=NO \
-D WITH_TBB=ON \
-D BUILD_JPEG=ON \
-D WITH_SIMD=ON \
-D WITH_LIBJPEG_TURBO_SIMD=ON \
-D WITH_CUDA=ON \
-D ENABLE_FAST_MATH=1 \
-D CUDA_FAST_MATH=1 \
-D WITH_CUBLAS=1 \
-D CUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda/ \
-D BUILD_opencv_cudacodec=OFF \
-D WITH_CUDNN=ON \
-D OPENCV_DNN_CUDA=ON \
-D CUDA_ARCH_BIN=6.0,6.1,7.0,7.5,8.0,8.6,8.9,9.0 \
-D OPENCV_GENERATE_PKGCONFIG=ON .. && \
make -j $(nproc --all) && \
make preinstall && make install && ldconfig && \
cd / && rm -rf opencv*
# install golang here
FROM opencv-gpu-cuda-12-base AS opencv-gpu-cuda-12-golang
ENV GO_RELEASE=1.22.0
RUN wget https://dl.google.com/go/go${GO_RELEASE}.linux-amd64.tar.gz && \
tar xfv go${GO_RELEASE}.linux-amd64.tar.gz -C /usr/local && \
rm go${GO_RELEASE}.linux-amd64.tar.gz
ENV PATH="${PATH}:/usr/local/go/bin"
CMD ["go version"]

57
vendor/gocv.io/x/gocv/Dockerfile.opencv-openvino generated vendored Normal file

@@ -0,0 +1,57 @@
# to build this docker image:
# docker build -f Dockerfile.opencv-openvino -t ghcr.io/hybridgroup/opencv:4.9.0-openvino
FROM openvino/ubuntu20_dev:2022.1.0 AS opencv-openvino-base
LABEL maintainer="hybridgroup"
ENV DEBIAN_FRONTEND=noninteractive
USER root
RUN apt-get update && apt-get install -y --no-install-recommends \
git build-essential cmake pkg-config unzip libgtk2.0-dev \
wget curl ca-certificates libcurl4-openssl-dev libssl-dev \
libavcodec-dev libavformat-dev libswscale-dev libtbb2 libtbb-dev \
libjpeg-dev libpng-dev libtiff-dev libdc1394-22-dev && \
rm -rf /var/lib/apt/lists/*
ARG OPENCV_VERSION="4.9.0"
ENV OPENCV_VERSION $OPENCV_VERSION
RUN curl -Lo opencv.zip https://github.com/opencv/opencv/archive/${OPENCV_VERSION}.zip && \
unzip -q opencv.zip && \
curl -Lo opencv_contrib.zip https://github.com/opencv/opencv_contrib/archive/${OPENCV_VERSION}.zip && \
unzip -q opencv_contrib.zip && \
rm opencv.zip opencv_contrib.zip && \
cd opencv-${OPENCV_VERSION} && \
mkdir build && cd build && \
cmake -D CMAKE_BUILD_TYPE=RELEASE \
-D WITH_IPP=OFF \
-D WITH_OPENGL=OFF \
-D WITH_QT=OFF \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib-${OPENCV_VERSION}/modules \
-D OPENCV_ENABLE_NONFREE=ON \
-D WITH_JASPER=OFF \
-D BUILD_DOCS=OFF \
-D BUILD_EXAMPLES=OFF \
-D BUILD_TESTS=OFF \
-D BUILD_PERF_TESTS=ON \
-D BUILD_opencv_java=NO \
-D BUILD_opencv_python=NO \
-D BUILD_opencv_python2=NO \
-D BUILD_opencv_python3=NO \
-D WITH_TBB=ON \
-D WITH_OPENVINO=1 \
-D ENABLE_FAST_MATH=1 \
-D OPENCV_GENERATE_PKGCONFIG=ON .. && \
make -j $(nproc --all) && \
make preinstall && make install && ldconfig && \
cd / && rm -rf opencv*
# install golang here
FROM opencv-openvino-base AS opencv-openvino-golang
ENV GO_RELEASE=1.22.0
RUN wget https://dl.google.com/go/go${GO_RELEASE}.linux-amd64.tar.gz && \
tar xfv go${GO_RELEASE}.linux-amd64.tar.gz -C /usr/local && \
rm go${GO_RELEASE}.linux-amd64.tar.gz
ENV PATH="${PATH}:/usr/local/go/bin"
USER openvino
CMD ["go version"]

138
vendor/gocv.io/x/gocv/Dockerfile.opencv-static generated vendored Normal file

@@ -0,0 +1,138 @@
# OpenCV 4 prebuilt multiarchitecture image
#
# To build release:
# docker buildx build -f Dockerfile.opencv-static -t ghcr.io/hybridgroup/opencv:4.9.0-static --platform=linux/arm64,linux/amd64 --push .
#
# To build prerelease:
# docker buildx build --build-arg OPENCV_VERSION="4.x" --build-arg OPENCV_FILE="https://github.com/opencv/opencv/archive/refs/heads/4.x.zip" --build-arg OPENCV_CONTRIB_FILE="https://github.com/opencv/opencv_contrib/archive/refs/heads/4.x.zip" -f Dockerfile.opencv -t ghcr.io/hybridgroup/opencv:4.9.0-dev --platform=linux/arm64,linux/amd64 --push .
###################
# amd64 build stage
###################
FROM --platform=linux/amd64 golang:1.22-bullseye AS opencv-base-amd64
LABEL maintainer="hybridgroup"
RUN apt-get update && apt-get install -y \
git build-essential cmake pkg-config unzip libgtk2.0-dev \
curl ca-certificates libcurl4-openssl-dev libssl-dev \
libavcodec-dev libavformat-dev libswscale-dev libtbb2 libtbb-dev \
libjpeg62-turbo-dev libpng-dev libtiff-dev libdc1394-22-dev nasm && \
rm -rf /var/lib/apt/lists/*
FROM --platform=linux/amd64 opencv-base-amd64 AS opencv-build-amd64
ARG OPENCV_VERSION="4.9.0"
ENV OPENCV_VERSION $OPENCV_VERSION
ARG OPENCV_FILE="https://github.com/opencv/opencv/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_FILE $OPENCV_FILE
ARG OPENCV_CONTRIB_FILE="https://github.com/opencv/opencv_contrib/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_CONTRIB_FILE $OPENCV_CONTRIB_FILE
RUN curl -Lo opencv.zip ${OPENCV_FILE} && \
unzip -q opencv.zip && \
curl -Lo opencv_contrib.zip ${OPENCV_CONTRIB_FILE} && \
unzip -q opencv_contrib.zip && \
rm opencv.zip opencv_contrib.zip
RUN cd opencv-${OPENCV_VERSION} && \
mkdir build && cd build && \
cmake -D CMAKE_BUILD_TYPE=RELEASE \
-D WITH_IPP=ON \
-D BUILD_WITH_DYNAMIC_IPP=OFF \
-D BUILD_IPP_IW=ON \
-D WITH_OPENGL=OFF \
-D WITH_QT=OFF \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D BUILD_SHARED_LIBS=OFF \
-D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib-${OPENCV_VERSION}/modules \
-D OPENCV_ENABLE_NONFREE=ON \
-D WITH_JASPER=OFF \
-D WITH_TBB=ON \
-D BUILD_JPEG=ON \
-D WITH_SIMD=ON \
-D ENABLE_LIBJPEG_TURBO_SIMD=ON \
-D BUILD_DOCS=OFF \
-D BUILD_EXAMPLES=OFF \
-D BUILD_TESTS=OFF \
-D BUILD_PERF_TESTS=ON \
-D BUILD_opencv_java=NO \
-D BUILD_opencv_python=NO \
-D BUILD_opencv_python2=NO \
-D BUILD_opencv_python3=NO \
-D OPENCV_GENERATE_PKGCONFIG=ON .. && \
make -j $(nproc --all) && \
make preinstall && make install && ldconfig && \
cd / && rm -rf opencv*
###################
# arm64 build stage
###################
FROM --platform=linux/arm64 golang:1.22-bullseye AS opencv-base-arm64
LABEL maintainer="hybridgroup"
RUN apt-get update && apt-get install -y --no-install-recommends \
git build-essential cmake pkg-config unzip libgtk2.0-dev \
curl ca-certificates libcurl4-openssl-dev libssl-dev \
libavcodec-dev libavformat-dev libswscale-dev libtbb2 libtbb-dev \
libjpeg62-turbo-dev libpng-dev libtiff-dev libdc1394-22-dev && \
rm -rf /var/lib/apt/lists/*
FROM --platform=linux/arm64 opencv-base-arm64 AS opencv-build-arm64
ARG OPENCV_VERSION="4.9.0"
ENV OPENCV_VERSION $OPENCV_VERSION
ARG OPENCV_FILE="https://github.com/opencv/opencv/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_FILE $OPENCV_FILE
ARG OPENCV_CONTRIB_FILE="https://github.com/opencv/opencv_contrib/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_CONTRIB_FILE $OPENCV_CONTRIB_FILE
RUN curl -Lo opencv.zip ${OPENCV_FILE} && \
unzip -q opencv.zip && \
curl -Lo opencv_contrib.zip ${OPENCV_CONTRIB_FILE} && \
unzip -q opencv_contrib.zip && \
rm opencv.zip opencv_contrib.zip
RUN cd opencv-${OPENCV_VERSION} && \
mkdir build && cd build && \
cmake -D CMAKE_BUILD_TYPE=RELEASE \
-D WITH_IPP=OFF \
-D WITH_OPENGL=OFF \
-D WITH_QT=OFF \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D BUILD_SHARED_LIBS=OFF \
-D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib-${OPENCV_VERSION}/modules \
-D OPENCV_ENABLE_NONFREE=ON \
-D WITH_JASPER=OFF \
-D WITH_TBB=ON \
-D BUILD_JPEG=ON \
-D WITH_SIMD=ON \
-D ENABLE_LIBJPEG_TURBO_SIMD=ON \
-D BUILD_DOCS=OFF \
-D BUILD_EXAMPLES=OFF \
-D BUILD_TESTS=OFF \
-D BUILD_PERF_TESTS=ON \
-D BUILD_opencv_java=NO \
-D BUILD_opencv_python=NO \
-D BUILD_opencv_python2=NO \
-D BUILD_opencv_python3=NO \
-D OPENCV_GENERATE_PKGCONFIG=ON .. && \
make -j $(nproc --all) && \
make preinstall && make install && ldconfig && \
cd / && rm -rf opencv*
ARG TARGETARCH
###################
# multiarch build stage
###################
FROM opencv-build-${TARGETARCH} as opencv-final
CMD ["opencv_version", "-b"]

55
vendor/gocv.io/x/gocv/Dockerfile.opencv-ubuntu-18.04 generated vendored Normal file

@@ -0,0 +1,55 @@
# to build this docker image:
# docker build -f Dockerfile.opencv-ubuntu-18.04 -t ghcr.io/hybridgroup/opencv:4.9.0-ubuntu-18.04 .
# docker build --build-arg OPENCV_VERSION="4.x" --build-arg OPENCV_FILE="https://github.com/opencv/opencv/archive/refs/heads/4.x.zip" --build-arg OPENCV_CONTRIB_FILE="https://github.com/opencv/opencv_contrib/archive/refs/heads/4.x.zip" -f Dockerfile.opencv-ubuntu-18.04 -t ghcr.io/hybridgroup/opencv:4.9.0-dev-ubuntu-18.04 .
FROM ubuntu:18.04 AS opencv-base
LABEL maintainer="hybridgroup"
RUN apt-get update && apt-get install -y --no-install-recommends \
git build-essential cmake pkg-config wget unzip libgtk2.0-dev \
curl ca-certificates libcurl4-openssl-dev libssl-dev \
libavcodec-dev libavformat-dev libswscale-dev libtbb2 libtbb-dev \
libjpeg-turbo8-dev libpng-dev libtiff-dev libdc1394-22-dev nasm && \
rm -rf /var/lib/apt/lists/*
ARG OPENCV_VERSION="4.9.0"
ENV OPENCV_VERSION $OPENCV_VERSION
ARG OPENCV_FILE="https://github.com/opencv/opencv/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_FILE $OPENCV_FILE
ARG OPENCV_CONTRIB_FILE="https://github.com/opencv/opencv_contrib/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_CONTRIB_FILE $OPENCV_CONTRIB_FILE
RUN curl -Lo opencv.zip ${OPENCV_FILE} && \
unzip -q opencv.zip && \
curl -Lo opencv_contrib.zip ${OPENCV_CONTRIB_FILE} && \
unzip -q opencv_contrib.zip && \
rm opencv.zip opencv_contrib.zip && \
cd opencv-${OPENCV_VERSION} && \
mkdir build && cd build && \
cmake -D CMAKE_BUILD_TYPE=RELEASE \
-D WITH_IPP=OFF \
-D WITH_OPENGL=OFF \
-D WITH_QT=OFF \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib-${OPENCV_VERSION}/modules \
-D OPENCV_ENABLE_NONFREE=ON \
-D WITH_JASPER=OFF \
-D WITH_TBB=ON \
-D BUILD_JPEG=ON \
-D WITH_SIMD=ON \
-D ENABLE_LIBJPEG_TURBO_SIMD=ON \
-D BUILD_DOCS=OFF \
-D BUILD_EXAMPLES=OFF \
-D BUILD_TESTS=OFF \
-D BUILD_PERF_TESTS=ON \
-D BUILD_opencv_java=NO \
-D BUILD_opencv_python=NO \
-D BUILD_opencv_python2=NO \
-D BUILD_opencv_python3=NO \
-D OPENCV_GENERATE_PKGCONFIG=ON .. && \
make -j $(nproc --all) && \
make preinstall && make install && ldconfig && \
cd / && rm -rf opencv*
CMD ["opencv_version", "-b"]

58
vendor/gocv.io/x/gocv/Dockerfile.opencv-ubuntu-20.04 generated vendored Normal file

@@ -0,0 +1,58 @@
# to build this docker image:
# docker build -f Dockerfile.opencv-ubuntu-20.04 -t ghcr.io/hybridgroup/opencv:4.9.0-ubuntu-20.04 .
# docker build --build-arg OPENCV_VERSION="4.x" --build-arg OPENCV_FILE="https://github.com/opencv/opencv/archive/refs/heads/4.x.zip" --build-arg OPENCV_CONTRIB_FILE="https://github.com/opencv/opencv_contrib/archive/refs/heads/4.x.zip" -f Dockerfile.opencv-ubuntu-20.04 -t ghcr.io/hybridgroup/opencv:4.9.0-dev-ubuntu-20.04 .
FROM ubuntu:20.04 AS opencv-base
LABEL maintainer="hybridgroup"
ENV TZ=Europe/Madrid
RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone
RUN apt-get update && apt-get install -y --no-install-recommends \
tzdata git build-essential cmake pkg-config wget unzip libgtk2.0-dev \
curl ca-certificates libcurl4-openssl-dev libssl-dev \
libavcodec-dev libavformat-dev libswscale-dev libtbb2 libtbb-dev \
libjpeg-turbo8-dev libpng-dev libtiff-dev libdc1394-22-dev nasm && \
rm -rf /var/lib/apt/lists/*
ARG OPENCV_VERSION="4.9.0"
ENV OPENCV_VERSION $OPENCV_VERSION
ARG OPENCV_FILE="https://github.com/opencv/opencv/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_FILE $OPENCV_FILE
ARG OPENCV_CONTRIB_FILE="https://github.com/opencv/opencv_contrib/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_CONTRIB_FILE $OPENCV_CONTRIB_FILE
RUN curl -Lo opencv.zip ${OPENCV_FILE} && \
unzip -q opencv.zip && \
curl -Lo opencv_contrib.zip ${OPENCV_CONTRIB_FILE} && \
unzip -q opencv_contrib.zip && \
rm opencv.zip opencv_contrib.zip && \
cd opencv-${OPENCV_VERSION} && \
mkdir build && cd build && \
cmake -D CMAKE_BUILD_TYPE=RELEASE \
-D WITH_IPP=OFF \
-D WITH_OPENGL=OFF \
-D WITH_QT=OFF \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib-${OPENCV_VERSION}/modules \
-D OPENCV_ENABLE_NONFREE=ON \
-D WITH_JASPER=OFF \
-D WITH_TBB=ON \
-D BUILD_JPEG=ON \
-D WITH_SIMD=ON \
-D ENABLE_LIBJPEG_TURBO_SIMD=ON \
-D BUILD_DOCS=OFF \
-D BUILD_EXAMPLES=OFF \
-D BUILD_TESTS=OFF \
-D BUILD_PERF_TESTS=ON \
-D BUILD_opencv_java=NO \
-D BUILD_opencv_python=NO \
-D BUILD_opencv_python2=NO \
-D BUILD_opencv_python3=NO \
-D OPENCV_GENERATE_PKGCONFIG=ON .. && \
make -j $(nproc --all) && \
make preinstall && make install && ldconfig && \
cd / && rm -rf opencv*
CMD ["opencv_version", "-b"]

58
vendor/gocv.io/x/gocv/Dockerfile.opencv-ubuntu-22.04 generated vendored Normal file

@@ -0,0 +1,58 @@
# to build this docker image:
# docker build -f Dockerfile.opencv-ubuntu-22.04 -t ghcr.io/hybridgroup/opencv:4.9.0-ubuntu-22.04 .
# docker build --build-arg OPENCV_VERSION="4.x" --build-arg OPENCV_FILE="https://github.com/opencv/opencv/archive/refs/heads/4.x.zip" --build-arg OPENCV_CONTRIB_FILE="https://github.com/opencv/opencv_contrib/archive/refs/heads/4.x.zip" -f Dockerfile.opencv-ubuntu-22.04 -t ghcr.io/hybridgroup/opencv:4.9.0-dev-ubuntu-22.04 .
FROM ubuntu:22.04 AS opencv-base
LABEL maintainer="hybridgroup"
ENV TZ=Europe/Madrid
RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone
RUN apt-get update && apt-get install -y --no-install-recommends \
tzdata git build-essential cmake pkg-config wget unzip libgtk2.0-dev \
curl ca-certificates libcurl4-openssl-dev libssl-dev \
libavcodec-dev libavformat-dev libswscale-dev libtbb2 libtbb-dev \
libjpeg-turbo8-dev libpng-dev libtiff-dev libdc1394-dev nasm && \
rm -rf /var/lib/apt/lists/*
ARG OPENCV_VERSION="4.9.0"
ENV OPENCV_VERSION $OPENCV_VERSION
ARG OPENCV_FILE="https://github.com/opencv/opencv/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_FILE $OPENCV_FILE
ARG OPENCV_CONTRIB_FILE="https://github.com/opencv/opencv_contrib/archive/${OPENCV_VERSION}.zip"
ENV OPENCV_CONTRIB_FILE $OPENCV_CONTRIB_FILE
RUN curl -Lo opencv.zip ${OPENCV_FILE} && \
unzip -q opencv.zip && \
curl -Lo opencv_contrib.zip ${OPENCV_CONTRIB_FILE} && \
unzip -q opencv_contrib.zip && \
rm opencv.zip opencv_contrib.zip && \
cd opencv-${OPENCV_VERSION} && \
mkdir build && cd build && \
cmake -D CMAKE_BUILD_TYPE=RELEASE \
-D WITH_IPP=OFF \
-D WITH_OPENGL=OFF \
-D WITH_QT=OFF \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib-${OPENCV_VERSION}/modules \
-D OPENCV_ENABLE_NONFREE=ON \
-D WITH_JASPER=OFF \
-D WITH_TBB=ON \
-D BUILD_JPEG=ON \
-D WITH_SIMD=ON \
-D ENABLE_LIBJPEG_TURBO_SIMD=ON \
-D BUILD_DOCS=OFF \
-D BUILD_EXAMPLES=OFF \
-D BUILD_TESTS=OFF \
-D BUILD_PERF_TESTS=ON \
-D BUILD_opencv_java=NO \
-D BUILD_opencv_python=NO \
-D BUILD_opencv_python2=NO \
-D BUILD_opencv_python3=NO \
-D OPENCV_GENERATE_PKGCONFIG=ON .. && \
make -j $(nproc --all) && \
make preinstall && make install && ldconfig && \
cd / && rm -rf opencv*
CMD ["opencv_version", "-b"]

94
vendor/gocv.io/x/gocv/Makefile generated vendored

@@ -2,13 +2,13 @@
.PHONY: test deps download build clean astyle cmds docker
# GoCV version to use.
GOCV_VERSION?="v0.31.0"
GOCV_VERSION?="v0.35.0"
# OpenCV version to use.
OPENCV_VERSION?=4.6.0
OPENCV_VERSION?=4.9.0
# Go version to use when building Docker image
GOVERSION?=1.16.2
GOVERSION?=1.22.0
# Temporary directory to put files into.
TMP_DIR?=/tmp/
@@ -19,6 +19,8 @@ BUILD_SHARED_LIBS?=ON
# Package list for each well-known Linux distribution
RPMS=cmake curl wget git gtk2-devel libpng-devel libjpeg-devel libtiff-devel tbb tbb-devel libdc1394-devel unzip gcc-c++
DEBS=unzip wget build-essential cmake curl git libgtk2.0-dev pkg-config libavcodec-dev libavformat-dev libswscale-dev libtbb2 libtbb-dev libjpeg-dev libpng-dev libtiff-dev libdc1394-22-dev
DEBS_BOOKWORM=unzip wget build-essential cmake curl git libgtk2.0-dev pkg-config libavcodec-dev libavformat-dev libswscale-dev libtbbmalloc2 libtbb-dev libjpeg-dev libpng-dev libtiff-dev
DEBS_UBUNTU_JAMMY=unzip wget build-essential cmake curl git libgtk2.0-dev pkg-config libavcodec-dev libavformat-dev libswscale-dev libtbb2 libtbb-dev libjpeg-dev libpng-dev libtiff-dev libdc1394-dev
JETSON=build-essential cmake git unzip pkg-config libjpeg-dev libpng-dev libtiff-dev libavcodec-dev libavformat-dev libswscale-dev libgtk2.0-dev libcanberra-gtk* libxvidcore-dev libx264-dev libgtk-3-dev libtbb2 libtbb-dev libdc1394-22-dev libv4l-dev v4l-utils libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev libavresample-dev libvorbis-dev libxine2-dev libfaac-dev libmp3lame-dev libtheora-dev libopencore-amrnb-dev libopencore-amrwb-dev libopenblas-dev libatlas-base-dev libblas-dev liblapack-dev libeigen3-dev gfortran libhdf5-dev protobuf-compiler libprotobuf-dev libgoogle-glog-dev libgflags-dev
explain:
@@ -30,7 +32,15 @@ ifneq ($(shell which dnf 2>/dev/null),)
distro_deps=deps_fedora
else
ifneq ($(shell which apt-get 2>/dev/null),)
ifneq ($(shell cat /etc/os-release 2>/dev/null | grep "Jammy Jellyfish"),)
distro_deps=deps_ubuntu_jammy
else
ifneq ($(shell cat /etc/debian_version 2>/dev/null | grep "12."),)
distro_deps=deps_debian_bookworm
else
distro_deps=deps_debian
endif
endif
else
ifneq ($(shell which yum 2>/dev/null),)
distro_deps=deps_rh_centos
@@ -47,10 +57,18 @@ deps_rh_centos:
deps_fedora:
sudo dnf -y install pkgconf-pkg-config $(RPMS)
deps_debian_bookworm:
sudo apt-get -y update
sudo apt-get -y install $(DEBS_BOOKWORM)
deps_debian:
sudo apt-get -y update
sudo apt-get -y install $(DEBS)
deps_ubuntu_jammy:
sudo apt-get -y update
sudo apt-get -y install $(DEBS_UBUNTU_JAMMY)
deps_jetson:
sudo sh -c "echo '/usr/local/cuda/lib64' >> /etc/ld.so.conf.d/nvidia-tegra.conf"
sudo ldconfig
@@ -62,9 +80,9 @@ download:
rm -rf $(TMP_DIR)opencv
mkdir $(TMP_DIR)opencv
cd $(TMP_DIR)opencv
curl -Lo opencv.zip https://github.com/opencv/opencv/archive/$(OPENCV_VERSION).zip
curl -Lo opencv.zip https://github.com/opencv/opencv/archive/refs/tags/$(OPENCV_VERSION).zip
unzip -q opencv.zip
curl -Lo opencv_contrib.zip https://github.com/opencv/opencv_contrib/archive/$(OPENCV_VERSION).zip
curl -Lo opencv_contrib.zip https://github.com/opencv/opencv_contrib/archive/refs/tags/$(OPENCV_VERSION).zip
unzip -q opencv_contrib.zip
rm opencv.zip opencv_contrib.zip
cd -
@@ -86,7 +104,7 @@ build_openvino_package:
cd build
sudo rm -rf *
sudo cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_SHARED_LIBS=${BUILD_SHARED_LIBS} -D ENABLE_VPU=ON -D ENABLE_MKL_DNN=ON -D ENABLE_CLDNN=ON ..
sudo $(MAKE) -j $(shell nproc --all)
sudo $(MAKE) -j $(shell nproc --all --ignore 1)
sudo touch VERSION
sudo mkdir -p src/ngraph
sudo cp thirdparty/ngraph/src/ngraph/version.hpp src/ngraph
@@ -98,8 +116,8 @@ build:
mkdir build
cd build
rm -rf *
cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_SHARED_LIBS=${BUILD_SHARED_LIBS} -D OPENCV_EXTRA_MODULES_PATH=$(TMP_DIR)opencv/opencv_contrib-$(OPENCV_VERSION)/modules -D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=OFF -D BUILD_opencv_java=NO -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO -D WITH_JASPER=OFF -D WITH_TBB=ON -DOPENCV_GENERATE_PKGCONFIG=ON ..
$(MAKE) -j $(shell nproc --all)
cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_SHARED_LIBS=${BUILD_SHARED_LIBS} -D OPENCV_EXTRA_MODULES_PATH=$(TMP_DIR)opencv/opencv_contrib-$(OPENCV_VERSION)/modules -D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=ON -D BUILD_opencv_java=NO -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO -D WITH_JASPER=OFF -D WITH_TBB=ON -DOPENCV_GENERATE_PKGCONFIG=ON ..
$(MAKE) -j $(shell nproc --all --ignore 1)
$(MAKE) preinstall
cd -
@@ -109,8 +127,12 @@ build_raspi:
mkdir build
cd build
rm -rf *
cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_SHARED_LIBS=${BUILD_SHARED_LIBS} -D OPENCV_EXTRA_MODULES_PATH=$(TMP_DIR)opencv/opencv_contrib-$(OPENCV_VERSION)/modules -D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=OFF -D BUILD_opencv_java=OFF -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO -D ENABLE_NEON=ON -D ENABLE_VFPV3=ON -D WITH_JASPER=OFF -D OPENCV_GENERATE_PKGCONFIG=ON ..
$(MAKE) -j $(shell nproc --all)
ifneq ($(shell uname -m | grep "aarch64"),)
cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_SHARED_LIBS=${BUILD_SHARED_LIBS} -D OPENCV_EXTRA_MODULES_PATH=$(TMP_DIR)opencv/opencv_contrib-$(OPENCV_VERSION)/modules -D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=ON -D BUILD_opencv_java=OFF -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO -D ENABLE_NEON=ON -D WITH_JASPER=OFF -D WITH_TBB=ON -D OPENCV_GENERATE_PKGCONFIG=ON ..
else
cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_SHARED_LIBS=${BUILD_SHARED_LIBS} -D OPENCV_EXTRA_MODULES_PATH=$(TMP_DIR)opencv/opencv_contrib-$(OPENCV_VERSION)/modules -D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=ON -D BUILD_opencv_java=OFF -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO -D ENABLE_NEON=ON -D ENABLE_VFPV3=ON -D WITH_JASPER=OFF -D OPENCV_GENERATE_PKGCONFIG=ON ..
endif
$(MAKE) -j $(shell nproc --all --ignore 1)
$(MAKE) preinstall
cd -
@@ -120,8 +142,8 @@ build_raspi_zero:
mkdir build
cd build
rm -rf *
cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_SHARED_LIBS=${BUILD_SHARED_LIBS} -D OPENCV_EXTRA_MODULES_PATH=$(TMP_DIR)opencv/opencv_contrib-$(OPENCV_VERSION)/modules -D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=OFF -D BUILD_opencv_java=OFF -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO -D ENABLE_VFPV2=ON -D WITH_JASPER=OFF -D OPENCV_GENERATE_PKGCONFIG=ON ..
$(MAKE) -j $(shell nproc --all)
cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_SHARED_LIBS=${BUILD_SHARED_LIBS} -D OPENCV_EXTRA_MODULES_PATH=$(TMP_DIR)opencv/opencv_contrib-$(OPENCV_VERSION)/modules -D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=ON -D BUILD_opencv_java=OFF -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO -D ENABLE_VFPV2=ON -D WITH_JASPER=OFF -D OPENCV_GENERATE_PKGCONFIG=ON ..
$(MAKE) -j $(shell nproc --all --ignore 1)
$(MAKE) preinstall
cd -
@@ -136,7 +158,7 @@ build_jetson:
-D EIGEN_INCLUDE_PATH=/usr/include/eigen3 \
-D BUILD_SHARED_LIBS=${BUILD_SHARED_LIBS} \
-D OPENCV_EXTRA_MODULES_PATH=$(TMP_DIR)opencv/opencv_contrib-$(OPENCV_VERSION)/modules \
-D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=OFF -D BUILD_opencv_java=OFF -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO \
-D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=ON -D BUILD_opencv_java=OFF -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO \
-D WITH_OPENCL=OFF \
-D WITH_CUDA=ON \
-D CUDA_ARCH_BIN=5.3 \
@@ -160,7 +182,7 @@ build_jetson:
-D WITH_V4L=ON \
-D WITH_LIBV4L=ON \
-D OPENCV_GENERATE_PKGCONFIG=ON ..
$(MAKE) -j $(shell nproc --all)
$(MAKE) -j $(shell nproc --all --ignore 1)
$(MAKE) preinstall
cd -
@@ -170,8 +192,8 @@ build_nonfree:
mkdir build
cd build
rm -rf *
cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_SHARED_LIBS=${BUILD_SHARED_LIBS} -D OPENCV_EXTRA_MODULES_PATH=$(TMP_DIR)opencv/opencv_contrib-$(OPENCV_VERSION)/modules -D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=OFF -D BUILD_opencv_java=NO -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO -D WITH_JASPER=OFF -D WITH_TBB=ON -DOPENCV_GENERATE_PKGCONFIG=ON -DOPENCV_ENABLE_NONFREE=ON ..
$(MAKE) -j $(shell nproc --all)
cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_SHARED_LIBS=${BUILD_SHARED_LIBS} -D OPENCV_EXTRA_MODULES_PATH=$(TMP_DIR)opencv/opencv_contrib-$(OPENCV_VERSION)/modules -D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=ON -D BUILD_opencv_java=NO -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO -D WITH_JASPER=OFF -D WITH_TBB=ON -DOPENCV_GENERATE_PKGCONFIG=ON -DOPENCV_ENABLE_NONFREE=ON ..
$(MAKE) -j $(shell nproc --all --ignore 1)
$(MAKE) preinstall
cd -
@@ -181,8 +203,8 @@ build_openvino:
mkdir build
cd build
rm -rf *
cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_SHARED_LIBS=${BUILD_SHARED_LIBS} -D ENABLE_CXX11=ON -D OPENCV_EXTRA_MODULES_PATH=$(TMP_DIR)opencv/opencv_contrib-$(OPENCV_VERSION)/modules -D WITH_INF_ENGINE=ON -D InferenceEngine_DIR=/usr/local/dldt/inference-engine/build -D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=OFF -D BUILD_opencv_java=NO -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO -D WITH_JASPER=OFF -D WITH_TBB=ON -DOPENCV_GENERATE_PKGCONFIG=ON -DOPENCV_ENABLE_NONFREE=ON ..
$(MAKE) -j $(shell nproc --all)
cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_SHARED_LIBS=${BUILD_SHARED_LIBS} -D ENABLE_CXX11=ON -D OPENCV_EXTRA_MODULES_PATH=$(TMP_DIR)opencv/opencv_contrib-$(OPENCV_VERSION)/modules -D WITH_INF_ENGINE=ON -D InferenceEngine_DIR=/usr/local/dldt/inference-engine/build -D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=ON -D BUILD_opencv_java=NO -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO -D WITH_JASPER=OFF -D WITH_TBB=ON -DOPENCV_GENERATE_PKGCONFIG=ON -DOPENCV_ENABLE_NONFREE=ON ..
$(MAKE) -j $(shell nproc --all --ignore 1)
$(MAKE) preinstall
cd -
@@ -192,19 +214,19 @@ build_cuda:
mkdir build
cd build
rm -rf *
cmake -j $(shell nproc --all) -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_SHARED_LIBS=${BUILD_SHARED_LIBS} -D OPENCV_EXTRA_MODULES_PATH=$(TMP_DIR)opencv/opencv_contrib-$(OPENCV_VERSION)/modules -D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=OFF -D BUILD_opencv_java=NO -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO -D WITH_JASPER=OFF -D WITH_TBB=ON -DOPENCV_GENERATE_PKGCONFIG=ON -DWITH_CUDA=ON -DENABLE_FAST_MATH=1 -DCUDA_FAST_MATH=1 -DWITH_CUBLAS=1 -DCUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda/ -DBUILD_opencv_cudacodec=OFF -D WITH_CUDNN=ON -D OPENCV_DNN_CUDA=ON -D CUDA_GENERATION=Auto ..
$(MAKE) -j $(shell nproc --all)
cmake -j $(shell nproc --all --ignore 1) -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_SHARED_LIBS=${BUILD_SHARED_LIBS} -D OPENCV_EXTRA_MODULES_PATH=$(TMP_DIR)opencv/opencv_contrib-$(OPENCV_VERSION)/modules -D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=ON -D BUILD_opencv_java=NO -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO -D WITH_JASPER=OFF -D WITH_TBB=ON -DOPENCV_GENERATE_PKGCONFIG=ON -DWITH_CUDA=ON -DENABLE_FAST_MATH=1 -DCUDA_FAST_MATH=1 -DWITH_CUBLAS=1 -DCUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda/ -DBUILD_opencv_cudacodec=OFF -D WITH_CUDNN=ON -D OPENCV_DNN_CUDA=ON -D CUDA_GENERATION=Auto ..
$(MAKE) -j $(shell nproc --all --ignore 1)
$(MAKE) preinstall
cd -
# Build OpenCV staticly linked
# Build OpenCV statically linked
build_static:
cd $(TMP_DIR)opencv/opencv-$(OPENCV_VERSION)
mkdir build
cd build
rm -rf *
cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_SHARED_LIBS=OFF -D OPENCV_EXTRA_MODULES_PATH=$(TMP_DIR)opencv/opencv_contrib-$(OPENCV_VERSION)/modules -D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=OFF -D BUILD_opencv_java=NO -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO -DWITH_JASPER=OFF -DWITH_QT=OFF -DWITH_GTK=OFF -DWITH_FFMPEG=OFF -DWITH_TIFF=OFF -DWITH_WEBP=OFF -DWITH_PNG=OFF -DWITH_1394=OFF -DWITH_OPENJPEG=OFF -DOPENCV_GENERATE_PKGCONFIG=ON ..
$(MAKE) -j $(shell nproc --all)
cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_SHARED_LIBS=OFF -D OPENCV_EXTRA_MODULES_PATH=$(TMP_DIR)opencv/opencv_contrib-$(OPENCV_VERSION)/modules -D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=ON -D BUILD_opencv_java=NO -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO -DWITH_JASPER=OFF -DWITH_QT=OFF -DWITH_GTK=OFF -DWITH_FFMPEG=OFF -DWITH_TIFF=OFF -DWITH_WEBP=OFF -DWITH_PNG=OFF -DWITH_1394=OFF -DWITH_OPENJPEG=OFF -DOPENCV_GENERATE_PKGCONFIG=ON ..
$(MAKE) -j $(shell nproc --all --ignore 1)
$(MAKE) preinstall
cd -
@@ -214,8 +236,8 @@ build_all:
mkdir build
cd build
rm -rf *
cmake -j $(shell nproc --all) -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_SHARED_LIBS=${BUILD_SHARED_LIBS} -D ENABLE_CXX11=ON -D OPENCV_EXTRA_MODULES_PATH=$(TMP_DIR)opencv/opencv_contrib-$(OPENCV_VERSION)/modules -D WITH_INF_ENGINE=ON -D InferenceEngine_DIR=/usr/local/dldt/inference-engine/build -D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=OFF -D BUILD_opencv_java=NO -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO -D WITH_JASPER=OFF -D WITH_TBB=ON -DOPENCV_GENERATE_PKGCONFIG=ON -DWITH_CUDA=ON -DENABLE_FAST_MATH=1 -DCUDA_FAST_MATH=1 -DWITH_CUBLAS=1 -DCUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda/ -DBUILD_opencv_cudacodec=OFF -D WITH_CUDNN=ON -D OPENCV_DNN_CUDA=ON -D CUDA_GENERATION=Auto ..
$(MAKE) -j $(shell nproc --all)
cmake -j $(shell nproc --all --ignore 1) -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_SHARED_LIBS=${BUILD_SHARED_LIBS} -D ENABLE_CXX11=ON -D OPENCV_EXTRA_MODULES_PATH=$(TMP_DIR)opencv/opencv_contrib-$(OPENCV_VERSION)/modules -D WITH_INF_ENGINE=ON -D InferenceEngine_DIR=/usr/local/dldt/inference-engine/build -D BUILD_DOCS=OFF -D BUILD_EXAMPLES=OFF -D BUILD_TESTS=OFF -D BUILD_PERF_TESTS=ON -D BUILD_opencv_java=NO -D BUILD_opencv_python=NO -D BUILD_opencv_python2=NO -D BUILD_opencv_python3=NO -D WITH_JASPER=OFF -D WITH_TBB=ON -DOPENCV_GENERATE_PKGCONFIG=ON -DWITH_CUDA=ON -DENABLE_FAST_MATH=1 -DCUDA_FAST_MATH=1 -DWITH_CUBLAS=1 -DCUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda/ -DBUILD_opencv_cudacodec=OFF -D WITH_CUDNN=ON -D OPENCV_DNN_CUDA=ON -D CUDA_GENERATION=Auto ..
$(MAKE) -j $(shell nproc --all --ignore 1)
$(MAKE) preinstall
cd -
@@ -226,22 +248,38 @@ clean:
# Cleanup old library files.
sudo_pre_install_clean:
ifneq (,$(wildcard /usr/local/lib/libopencv*))
sudo rm -rf /usr/local/lib/cmake/opencv4/
sudo rm -rf /usr/local/lib/libopencv*
sudo rm -rf /usr/local/lib/pkgconfig/opencv*
sudo rm -rf /usr/local/include/opencv*
else
ifneq (,$(wildcard /usr/local/lib64/libopencv*))
sudo rm -rf /usr/local/lib64/cmake/opencv4/
sudo rm -rf /usr/local/lib64/libopencv*
sudo rm -rf /usr/local/lib64/pkgconfig/opencv*
sudo rm -rf /usr/local/include/opencv*
else
ifneq (,$(wildcard /usr/local/lib/aarch64-linux-gnu/libopencv*))
sudo rm -rf /usr/local/lib/aarch64-linux-gnu/cmake/opencv4/
sudo rm -rf /usr/local/lib/aarch64-linux-gnu/libopencv*
sudo rm -rf /usr/local/lib/aarch64-linux-gnu/pkgconfig/opencv*
sudo rm -rf /usr/local/include/opencv*
endif
endif
endif
# Do everything.
install: deps download sudo_pre_install_clean build sudo_install clean verify
# Do everything on Raspbian.
install_raspi: deps download build_raspi sudo_install clean verify
install_raspi: deps download sudo_pre_install_clean build_raspi sudo_install clean verify
# Do everything on the raspberry pi zero.
install_raspi_zero: deps download build_raspi_zero sudo_install clean verify
install_raspi_zero: deps download sudo_pre_install_clean build_raspi_zero sudo_install clean verify
# Do everything on Jetson.
install_jetson: deps download build_jetson sudo_install clean verify
install_jetson: deps download sudo_pre_install_clean build_jetson sudo_install clean verify
# Do everything with cuda.
install_cuda: deps download sudo_pre_install_clean build_cuda sudo_install clean verify verify_cuda

167
vendor/gocv.io/x/gocv/README.md generated vendored
View File

@@ -4,13 +4,13 @@
[![Go Reference](https://pkg.go.dev/badge/gocv.io/x/gocv.svg)](https://pkg.go.dev/gocv.io/x/gocv)
[![Linux](https://github.com/hybridgroup/gocv/actions/workflows/linux.yml/badge.svg?branch=dev)](https://github.com/hybridgroup/gocv/actions/workflows/linux.yml)
[![Windows](https://ci.appveyor.com/api/projects/status/9asd5foet54ru69q/branch/dev?svg=true)](https://ci.appveyor.com/project/deadprogram/gocv/branch/dev)
[![Windows](https://github.com/hybridgroup/gocv/actions/workflows/windows.yml/badge.svg?branch=dev)](https://github.com/hybridgroup/gocv/actions/workflows/windows.yml)
[![Go Report Card](https://goreportcard.com/badge/github.com/hybridgroup/gocv)](https://goreportcard.com/report/github.com/hybridgroup/gocv)
[![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://github.com/hybridgroup/gocv/blob/release/LICENSE.txt)
The GoCV package provides Go language bindings for the [OpenCV 4](http://opencv.org/) computer vision library.
The GoCV package supports the latest releases of Go and OpenCV (v4.6.0) on Linux, macOS, and Windows. We intend to make the Go language a "first-class" client compatible with the latest developments in the OpenCV ecosystem.
The GoCV package supports the latest releases of Go and OpenCV (v4.9.0) on Linux, macOS, and Windows. We intend to make the Go language a "first-class" client compatible with the latest developments in the OpenCV ecosystem.
GoCV supports [CUDA](https://en.wikipedia.org/wiki/CUDA) for hardware acceleration using Nvidia GPUs. Check out the [CUDA README](./cuda/README.md) for more info on how to use GoCV with OpenCV/CUDA.
@@ -122,7 +122,7 @@ There are examples in the [cmd directory](./cmd) of this repo in the form of var
## How to install
To install GoCV, you must first have the matching version of OpenCV installed on your system. The current release of GoCV requires OpenCV 4.6.0.
To install GoCV, you must first have the matching version of OpenCV installed on your system. The current release of GoCV requires OpenCV 4.9.0.
Here are instructions for Ubuntu, Raspbian, macOS, and Windows.
@@ -130,30 +130,30 @@ Here are instructions for Ubuntu, Raspian, macOS, and Windows.
### Installation
You can use `make` to install OpenCV 4.6.0 with the handy `Makefile` included with this repo. If you already have installed OpenCV, you do not need to do so again. The installation performed by the `Makefile` is minimal, so it may remove OpenCV options such as Python or Java wrappers if you have already installed OpenCV some other way.
You can use `make` to install OpenCV 4.9.0 with the handy `Makefile` included with this repo. If you already have installed OpenCV, you do not need to do so again. The installation performed by the `Makefile` is minimal, so it may remove OpenCV options such as Python or Java wrappers if you have already installed OpenCV some other way.
#### Quick Install
First, change directories to where you want to install GoCV, and then use git to clone the repository to your local machine like this:
cd $HOME/folder/with/your/src/
git clone https://github.com/hybridgroup/gocv.git
Make sure to change `$HOME/folder/with/your/src/` to where you actually want to save the code.
Once you have cloned the repo, the following commands should do everything to download and install OpenCV 4.6.0 on Linux:
Once you have cloned the repo, the following commands should do everything to download and install OpenCV 4.9.0 on Linux:
cd gocv
make install
If you need static opencv libraries
make install BUILD_SHARED_LIBS=OFF
If it works correctly, at the end of the entire process, the following message should be displayed:
gocv version: 0.31.0
opencv lib version: 4.6.0
gocv version: 0.36.0
opencv lib version: 4.9.0
That's it, now you are ready to use GoCV.
@@ -164,22 +164,22 @@ See the [cuda directory](./cuda) for information.
#### Using OpenVINO with GoCV
See the [openvino directory](./openvino) for information.
#### Make Install for OpenVINO and Cuda
The following commands should do everything to download and install OpenCV 4.6.0 with CUDA and OpenVINO on Linux. Make sure to change `$HOME/folder/with/your/src/` to the directory you used to clone GoCV:
The following commands should do everything to download and install OpenCV 4.9.0 with CUDA and OpenVINO on Linux. Make sure to change `$HOME/folder/with/your/src/` to the directory you used to clone GoCV:
cd $HOME/folder/with/gocv/
make install_all
If you need static opencv libraries
make install_all BUILD_SHARED_LIBS=OFF
If it works correctly, at the end of the entire process, the following message should be displayed:
gocv version: 0.31.0
opencv lib version: 4.6.0-openvino
gocv version: 0.36.0
opencv lib version: 4.9.0-openvino
cuda information:
Device 0: "GeForce MX150" 2003Mb, sm_61, Driver/Runtime ver.10.0/10.0
@@ -189,8 +189,8 @@ If you have already done the "Quick Install" as described above, you do not need
First, change directories to where you want to install GoCV, and then use git to clone the repository to your local machine like this:
cd $HOME/folder/with/your/src/
git clone https://github.com/hybridgroup/gocv.git
Make sure to change `$HOME/folder/with/your/src/` to where you actually want to save the code.
@@ -198,33 +198,33 @@ Make sure to change `$HOME/folder/with/your/src/` to where you actually want to
First, you need to change the current directory to the location where you cloned the GoCV repo, so you can access the `Makefile`:
cd $HOME/folder/with/your/src/gocv
Next, you need to update the system, and install any required packages:
make deps
#### Download source
Now, download the OpenCV 4.6.0 and OpenCV Contrib source code:
Now, download the OpenCV 4.9.0 and OpenCV Contrib source code:
make download
#### Build
Build everything. This will take quite a while:
make build
If you need static opencv libraries
make build BUILD_SHARED_LIBS=OFF
#### Install
Once the code is built, you are ready to install:
make sudo_install
### Verifying the installation
@@ -232,22 +232,22 @@ To verify your installation you can run one of the included examples.
First, change the current directory to the location of the GoCV repo:
cd $HOME/src/gocv.io/x/gocv
Now you should be able to build or run any of the examples:
go run ./cmd/version/main.go
The version program should output the following:
gocv version: 0.31.0
opencv lib version: 4.6.0
gocv version: 0.36.0
opencv lib version: 4.9.0
#### Cleanup extra files
After the installation is complete, you can remove the extra files and folders:
make clean
### Custom Environment
@@ -255,12 +255,12 @@ By default, pkg-config is used to determine the correct flags for compiling and
For example:
export CGO_CPPFLAGS="-I/usr/local/include"
export CGO_LDFLAGS="-L/usr/local/lib -lopencv_core -lopencv_face -lopencv_videoio -lopencv_imgproc -lopencv_highgui -lopencv_imgcodecs -lopencv_objdetect -lopencv_features2d -lopencv_video -lopencv_dnn -lopencv_xfeatures2d"
Please note that you will need to run these 2 lines of code once in your current session to set up the ENV variables needed to build or run the code. Once you have done so, you can execute code that uses GoCV with your custom environment like this:
go run -tags customenv ./cmd/version/main.go
### Docker
@@ -270,17 +270,18 @@ The project now provides `Dockerfile` which lets you build [GoCV](https://gocv.i
make docker
```
By default, the Docker image built by running the command above ships [Go](https://golang.org/) version `1.16.5`, but if you would like to build an image which uses a different version of `Go` you can override the default value when running the target command:
By default, the Docker image built by running the command above ships [Go](https://golang.org/) version `1.20.2`, but if you would like to build an image which uses a different version of `Go` you can override the default value when running the target command:
```
make docker GOVERSION='1.15'
make docker GOVERSION='1.22.0'
```
#### Running GUI programs in Docker on macOS
Sometimes your `GoCV` programs create graphical interfaces like windows, e.g. when you use the `gocv.Window` type to display an image or video stream. Running programs which create graphical interfaces in a Docker container on macOS is unfortunately a bit elaborate, but not impossible. First you need to satisfy the following prerequisites:
* install [xquartz](https://www.xquartz.org/). You can also install xquartz using [homebrew](https://brew.sh/) by running `brew cask install xquartz`
* install [socat](https://linux.die.net/man/1/socat) `brew install socat`
- install [xquartz](https://www.xquartz.org/). You can also install xquartz using [homebrew](https://brew.sh/) by running `brew cask install xquartz`
- install [socat](https://linux.die.net/man/1/socat) `brew install socat`
Note, you will have to log out and log back in to your machine once you have installed `xquartz`. This is so the X window system is reloaded.
@@ -289,7 +290,8 @@ Once you have installed all the prerequisites you need to allow connections from
```shell
open -a xquartz
```
Click on *Security* tab in preferences and check the "Allow connections" box:
Click on _Security_ tab in preferences and check the "Allow connections" box:
![app image](./images/xquartz.png)
@@ -298,6 +300,7 @@ Next, you need to create a TCP proxy using `socat` which will stream [X Window](
```shell
lsof -i TCP:6000
```
Now you can start a local proxy which will proxy the X Window traffic into xquartz, which acts as your local X server:
```shell
@@ -320,26 +323,26 @@ There is a Docker image with Alpine 3.7 that has been created by project contrib
### Installation
We have a special installation for the Raspberry Pi that includes some hardware optimizations. You use `make` to install OpenCV 4.6.0 with the handy `Makefile` included with this repo. If you already have installed OpenCV, you do not need to do so again. The installation performed by the `Makefile` is minimal, so it may remove OpenCV options such as Python or Java wrappers if you have already installed OpenCV some other way.
We have a special installation for the Raspberry Pi that includes some hardware optimizations. You use `make` to install OpenCV 4.9.0 with the handy `Makefile` included with this repo. If you already have installed OpenCV, you do not need to do so again. The installation performed by the `Makefile` is minimal, so it may remove OpenCV options such as Python or Java wrappers if you have already installed OpenCV some other way.
#### Quick Install
First, change directories to where you want to install GoCV, and then use git to clone the repository to your local machine like this:
cd $HOME/folder/with/your/src/
git clone https://github.com/hybridgroup/gocv.git
Make sure to change `$HOME/folder/with/your/src/` to where you actually want to save the code.
The following make command should do everything to download and install OpenCV 4.6.0 on Raspbian:
The following make command should do everything to download and install OpenCV 4.9.0 on Raspbian:
cd $HOME/folder/with/your/src/gocv
make install_raspi
If it works correctly, at the end of the entire process, the following message should be displayed:
gocv version: 0.31.0
opencv lib version: 4.6.0
gocv version: 0.36.0
opencv lib version: 4.9.0
That's it, now you are ready to use GoCV.
@@ -347,17 +350,18 @@ That's it, now you are ready to use GoCV.
### Installation
You can install OpenCV 4.6.0 using Homebrew.
You can install OpenCV 4.9.0 using Homebrew.
If you already have an earlier version of OpenCV (3.4.x) installed, you should probably remove it before installing the new version:
brew uninstall opencv
You can then install OpenCV 4.6.0:
You can then install OpenCV 4.9.0:
brew install opencv
### pkgconfig Installation
pkg-config is used to determine the correct flags for compiling and linking OpenCV.
You can install it by using Homebrew:
@@ -369,16 +373,16 @@ To verify your installation you can run one of the included examples.
First, change the current directory to the location of the GoCV repo:
cd $HOME/folder/with/your/src/gocv
Now you should be able to build or run any of the examples:
go run ./cmd/version/main.go
The version program should output the following:
gocv version: 0.31.0
opencv lib version: 4.6.0
gocv version: 0.36.0
opencv lib version: 4.9.0
### Custom Environment
@@ -386,13 +390,13 @@ By default, pkg-config is used to determine the correct flags for compiling and
For example:
export CGO_CXXFLAGS="--std=c++11"
export CGO_CPPFLAGS="-I/usr/local/Cellar/opencv/4.6.0/include"
export CGO_LDFLAGS="-L/usr/local/Cellar/opencv/4.6.0/lib -lopencv_stitching -lopencv_superres -lopencv_videostab -lopencv_aruco -lopencv_bgsegm -lopencv_bioinspired -lopencv_ccalib -lopencv_dnn_objdetect -lopencv_dpm -lopencv_face -lopencv_photo -lopencv_fuzzy -lopencv_hfs -lopencv_img_hash -lopencv_line_descriptor -lopencv_optflow -lopencv_reg -lopencv_rgbd -lopencv_saliency -lopencv_stereo -lopencv_structured_light -lopencv_phase_unwrapping -lopencv_surface_matching -lopencv_tracking -lopencv_datasets -lopencv_dnn -lopencv_plot -lopencv_xfeatures2d -lopencv_shape -lopencv_video -lopencv_ml -lopencv_ximgproc -lopencv_calib3d -lopencv_features2d -lopencv_highgui -lopencv_videoio -lopencv_flann -lopencv_xobjdetect -lopencv_imgcodecs -lopencv_objdetect -lopencv_xphoto -lopencv_imgproc -lopencv_core"
export CGO_CXXFLAGS="--std=c++11"
export CGO_CPPFLAGS="-I/usr/local/Cellar/opencv/4.9.0/include"
export CGO_LDFLAGS="-L/usr/local/Cellar/opencv/4.9.0/lib -lopencv_stitching -lopencv_superres -lopencv_videostab -lopencv_aruco -lopencv_bgsegm -lopencv_bioinspired -lopencv_ccalib -lopencv_dnn_objdetect -lopencv_dpm -lopencv_face -lopencv_photo -lopencv_fuzzy -lopencv_hfs -lopencv_img_hash -lopencv_line_descriptor -lopencv_optflow -lopencv_reg -lopencv_rgbd -lopencv_saliency -lopencv_stereo -lopencv_structured_light -lopencv_phase_unwrapping -lopencv_surface_matching -lopencv_tracking -lopencv_datasets -lopencv_dnn -lopencv_plot -lopencv_xfeatures2d -lopencv_shape -lopencv_video -lopencv_ml -lopencv_ximgproc -lopencv_calib3d -lopencv_features2d -lopencv_highgui -lopencv_videoio -lopencv_flann -lopencv_xobjdetect -lopencv_imgcodecs -lopencv_objdetect -lopencv_xphoto -lopencv_imgproc -lopencv_core"
Please note that you will need to run these 3 lines of code once in your current session to set up the ENV variables needed to build or run the code. Once you have done so, you can execute code that uses GoCV with your custom environment like this:
go run -tags customenv ./cmd/version/main.go
## Windows
@@ -400,7 +404,7 @@ Please note that you will need to run these 3 lines of code one time in your cur
The following assumes that you are running a 64-bit version of Windows 10.
In order to build and install OpenCV 4.6.0 on Windows, you must first download and install MinGW-W64 and CMake, as follows.
In order to build and install OpenCV 4.9.0 on Windows, you must first download and install MinGW-W64 and CMake, as follows.
#### MinGW-W64
@@ -416,12 +420,12 @@ Add the `C:\Program Files\mingw-w64\x86_64-8.1.0-posix-seh-rt_v6-rev0\mingw64\bi
Download and install CMake [https://cmake.org/download/](https://cmake.org/download/) to the default location. CMake installer will add CMake to your system path.
#### OpenCV 4.6.0 and OpenCV Contrib Modules
#### OpenCV 4.9.0 and OpenCV Contrib Modules
The following commands should do everything to download and install OpenCV 4.6.0 on Windows:
The following commands should do everything to download and install OpenCV 4.9.0 on Windows:
chdir %GOPATH%\src\gocv.io\x\gocv
win_build_opencv.cmd
It might take up to one hour.
@@ -431,16 +435,16 @@ Last, add `C:\opencv\build\install\x64\mingw\bin` to your System Path.
Change the current directory to the location of the GoCV repo:
chdir %GOPATH%\src\gocv.io\x\gocv
Now you should be able to build or run any of the command examples:
go run cmd\version\main.go
The version program should output the following:
gocv version: 0.31.0
opencv lib version: 4.6.0
gocv version: 0.36.0
opencv lib version: 4.9.0
That's it, now you are ready to use GoCV.
@@ -452,31 +456,31 @@ Due to the way OpenCV produces DLLs, including the version in the name, using th
For example:
set CGO_CXXFLAGS="--std=c++11"
set CGO_CPPFLAGS=-IC:\opencv\build\install\include
set CGO_LDFLAGS=-LC:\opencv\build\install\x64\mingw\lib -lopencv_core460 -lopencv_face460 -lopencv_videoio460 -lopencv_imgproc460 -lopencv_highgui460 -lopencv_imgcodecs460 -lopencv_objdetect460 -lopencv_features2d460 -lopencv_video460 -lopencv_dnn460 -lopencv_xfeatures2d460 -lopencv_plot460 -lopencv_tracking460 -lopencv_img_hash460
set CGO_CXXFLAGS="--std=c++11"
set CGO_CPPFLAGS=-IC:\opencv\build\install\include
set CGO_LDFLAGS=-LC:\opencv\build\install\x64\mingw\lib -lopencv_core490 -lopencv_face490 -lopencv_videoio490 -lopencv_imgproc490 -lopencv_highgui490 -lopencv_imgcodecs490 -lopencv_objdetect490 -lopencv_features2d490 -lopencv_video490 -lopencv_dnn490 -lopencv_xfeatures2d490 -lopencv_plot490 -lopencv_tracking490 -lopencv_img_hash490
Please note that you will need to run these 3 lines of code once in your current session to set up the ENV variables needed to build or run the code. Once you have done so, you can execute code that uses GoCV with your custom environment like this:
go run -tags customenv cmd\version\main.go
## Android
There is some work in progress for running GoCV on Android using Gomobile. For information on how to install OpenCV/GoCV for Android, please see:
https://gist.github.com/ogero/c19458cf64bd3e91faae85c3ac887481
https://gist.github.com/ogero/c19458cf64bd3e91faae85c3ac887490
See original discussion here:
https://github.com/hybridgroup/gocv/issues/235
## Profiling
Since memory allocations for images in GoCV are done through C based code, the go garbage collector will not clean all resources associated with a `Mat`. As a result, any `Mat` created *must* be closed to avoid memory leaks.
Since memory allocations for images in GoCV are done through C based code, the go garbage collector will not clean all resources associated with a `Mat`. As a result, any `Mat` created _must_ be closed to avoid memory leaks.
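As a minimal sketch of that rule (illustrative only, using the `gocv.NewMat` and `Close` calls shown elsewhere in this README):

```go
package main

import "gocv.io/x/gocv"

func main() {
	// A Mat wraps C-allocated memory, so the Go garbage collector alone will not free it.
	img := gocv.NewMat()
	defer img.Close() // pair every allocation with a Close to avoid leaks

	// ... use img ...
}
```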
To ease the detection and repair of the resource leaks, GoCV provides a `Mat` profiler that records when each `Mat` is created and closed. Each time a `Mat` is allocated, the stack trace is added to the profile. When it is closed, the stack trace is removed. See the [runtime/pprof documentation](https://golang.org/pkg/runtime/pprof/#Profile).
In order to include the MatProfile custom profiler, you MUST build or run your application or tests using the `-tags matprofile` build tag. For example:
go run -tags matprofile cmd/version/main.go
go run -tags matprofile cmd/version/main.go
You can get the profile's count at any time using:
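The snippet itself falls outside this diff hunk; as a hedged sketch, `MatProfile` is a standard `*pprof.Profile`, so its `Count` method reports how many Mats are currently open when the program is built with the `matprofile` tag:

```go
//go:build matprofile

package main

import (
	"fmt"

	"gocv.io/x/gocv"
)

func main() {
	m := gocv.NewMat()
	fmt.Println("open Mats:", gocv.MatProfile.Count()) // expected: 1
	m.Close()
	fmt.Println("open Mats:", gocv.MatProfile.Count()) // expected: 0
}
```

Run it with `go run -tags matprofile .` so the profiler is compiled in.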
@@ -492,7 +496,7 @@ gocv.MatProfile.WriteTo(&b, 1)
fmt.Print(b.String())
```
This can be very helpful to track down a leak. For example, suppose you have
the following nonsense program:
```go
@@ -534,7 +538,7 @@ gocv.io/x/gocv.Mat profile: total 1
# 0x402cd86 runtime.main+0x206 /usr/local/Cellar/go/1.11.1/libexec/src/runtime/proc.go:201
```
We can see that this program would leak memory. As it exited, it had one `Mat` that was never closed. The stack trace points to exactly which line the allocation happened on (line 11, the `gocv.NewMat()`).
Furthermore, if the program is a long-running process or if GoCV is being used on a web server, it may be helpful to install the HTTP interface (see the [net/http/pprof](https://golang.org/pkg/net/http/pprof/) package). For example:
@@ -567,8 +571,7 @@ func main() {
```
This will leak a `Mat` once per second. You can see the current profile count and stack traces by going to the installed HTTP debug interface: [http://localhost:6060/debug/pprof/gocv.io/x/gocv.Mat](http://localhost:6060/debug/pprof/gocv.io/x/gocv.Mat?debug=1).
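A hedged sketch of what installing that HTTP interface typically looks like (a standard `net/http/pprof` side-effect import serving on the `localhost:6060` address linked above; the leaking loop is purely illustrative, not the README's elided program):

```go
//go:build matprofile

package main

import (
	"net/http"
	_ "net/http/pprof" // registers the /debug/pprof/ handlers, including custom profiles
	"time"

	"gocv.io/x/gocv"
)

func main() {
	go func() {
		// Serve the profiler; browse to /debug/pprof/gocv.io/x/gocv.Mat as linked above.
		_ = http.ListenAndServe("localhost:6060", nil)
	}()

	for {
		_ = gocv.NewMat() // deliberately never closed: one leaked Mat per second
		time.Sleep(time.Second)
	}
}
```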
## How to contribute

88
vendor/gocv.io/x/gocv/ROADMAP.md generated vendored
View File

@@ -14,21 +14,7 @@ Your pull requests will be greatly appreciated!
- [ ] **core. Core functionality - WORK STARTED**
- [X] **Basic structures**
- [ ] **Operations on arrays - WORK STARTED**. The following functions still need implementation:
- [ ] [Mahalanobis](https://docs.opencv.org/master/d2/de8/group__core__array.html#ga4493aee129179459cbfc6064f051aa7d)
- [ ] [mulTransposed](https://docs.opencv.org/master/d2/de8/group__core__array.html#gadc4e49f8f7a155044e3be1b9e3b270ab)
- [ ] [PCABackProject](https://docs.opencv.org/master/d2/de8/group__core__array.html#gab26049f30ee8e94f7d69d82c124faafc)
- [ ] [PCACompute](https://docs.opencv.org/master/d2/de8/group__core__array.html#ga4e2073c7311f292a0648f04c37b73781)
- [ ] [PCAProject](https://docs.opencv.org/master/d2/de8/group__core__array.html#ga6b9fbc7b3a99ebfd441bbec0a6bc4f88)
- [ ] [PSNR](https://docs.opencv.org/master/d2/de8/group__core__array.html#ga07aaf34ae31d226b1b847d8bcff3698f)
- [ ] [randn](https://docs.opencv.org/master/d2/de8/group__core__array.html#gaeff1f61e972d133a04ce3a5f81cf6808)
- [ ] [randShuffle](https://docs.opencv.org/master/d2/de8/group__core__array.html#ga6a789c8a5cb56c6dd62506179808f763)
- [ ] [randu](https://docs.opencv.org/master/d2/de8/group__core__array.html#ga1ba1026dca0807b27057ba6a49d258c0)
- [ ] [setRNGSeed](https://docs.opencv.org/master/d2/de8/group__core__array.html#ga757e657c037410d9e19e819569e7de0f)
- [ ] [SVBackSubst](https://docs.opencv.org/master/d2/de8/group__core__array.html#gab4e620e6fc6c8a27bb2be3d50a840c0b)
- [ ] [SVDecomp](https://docs.opencv.org/master/d2/de8/group__core__array.html#gab477b5b7b39b370bb03e75b19d2d5109)
- [ ] [theRNG](https://docs.opencv.org/master/d2/de8/group__core__array.html#ga75843061d150ad6564b5447e38e57722)
- [X] **Operations on arrays**
- [ ] XML/YAML Persistence
- [ ] [FileStorage](https://docs.opencv.org/master/da/d56/classcv_1_1FileStorage.html)
@@ -47,7 +33,7 @@ Your pull requests will be greatly appreciated!
- [ ] [getGaborKernel](https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#gae84c92d248183bd92fa713ce51cc3599)
- [ ] [morphologyExWithParams](https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#ga67493776e3ad1a3df63883829375201f)
- [ ] [pyrMeanShiftFiltering](https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#ga9fabdce9543bd602445f5db3827e4cc0)
- [ ] **Geometric Image Transformations - WORK STARTED** The following functions still need implementation:
- [ ] [convertMaps](https://docs.opencv.org/master/da/d54/group__imgproc__transform.html#ga9156732fa8f01be9ebd1a194f2728b7f)
- [ ] [getDefaultNewCameraMatrix](https://docs.opencv.org/master/da/d54/group__imgproc__transform.html#ga744529385e88ef7bc841cbe04b35bfbf)
@@ -64,13 +50,10 @@ Your pull requests will be greatly appreciated!
- [ ] [ellipse2Poly](https://docs.opencv.org/master/d6/d6e/group__imgproc__draw.html#ga727a72a3f6a625a2ae035f957c61051f)
- [ ] [fillConvexPoly](https://docs.opencv.org/master/d6/d6e/group__imgproc__draw.html#ga906aae1606ea4ed2f27bec1537f6c5c2)
- [ ] [getFontScaleFromHeight](https://docs.opencv.org/master/d6/d6e/group__imgproc__draw.html#ga442ff925c1a957794a1309e0ed3ba2c3)
- [ ] ColorMaps in OpenCV
- [ ] Planar Subdivision
- [ ] **Histograms - WORK STARTED** The following functions still need implementation:
- [ ] [EMD](https://docs.opencv.org/master/d6/dc7/group__imgproc__hist.html#ga902b8e60cc7075c8947345489221e0e0)
- [ ] [wrapperEMD](https://docs.opencv.org/master/d6/dc7/group__imgproc__hist.html#ga31fdda0864e64ca6b9de252a2611758b)
- [X] **Histograms**
- [ ] **Structural Analysis and Shape Descriptors - WORK STARTED** The following functions still need implementation:
- [ ] [fitEllipse](https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#gaf259efaad93098103d6c27b9e4900ffa)
- [ ] [fitEllipseAMS](https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#ga69e90cda55c4e192a8caa0b99c3e4550)
@@ -78,14 +61,10 @@ Your pull requests will be greatly appreciated!
- [ ] [HuMoments](https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#gab001db45c1f1af6cbdbe64df04c4e944)
- [ ] [intersectConvexConvex](https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#ga8e840f3f3695613d32c052bec89e782c)
- [ ] [isContourConvex](https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#ga8abf8010377b58cbc16db6734d92941b)
- [ ] [matchShapes](https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#gaadc90cb16e2362c9bd6e7363e6e4c317)
- [ ] [minEnclosingTriangle](https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#ga1513e72f6bbdfc370563664f71e0542f)
- [ ] [rotatedRectangleIntersection](https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#ga8740e7645628c59d238b0b22c2abe2d4)
- [ ] **Motion Analysis and Object Tracking - WORK STARTED** The following functions still need implementation:
- [ ] [createHanningWindow](https://docs.opencv.org/master/d7/df3/group__imgproc__motion.html#ga80e5c3de52f6bab3a7c1e60e89308e1b)
- [ ] [phaseCorrelate](https://docs.opencv.org/master/d7/df3/group__imgproc__motion.html#ga552420a2ace9ef3fb053cd630fdb4952)
- [X] **Motion Analysis and Object Tracking**
- [ ] **Feature Detection - WORK STARTED** The following functions still need implementation:
- [ ] [cornerEigenValsAndVecs](https://docs.opencv.org/master/dd/d1a/group__imgproc__feature.html#ga4055896d9ef77dd3cacf2c5f60e13f1c)
- [ ] [cornerHarris](https://docs.opencv.org/master/dd/d1a/group__imgproc__feature.html#gac1fc3598018010880e370e2f709b4345)
@@ -93,7 +72,16 @@ Your pull requests will be greatly appreciated!
- [ ] [createLineSegmentDetector](https://docs.opencv.org/master/dd/d1a/group__imgproc__feature.html#ga6b2ad2353c337c42551b521a73eeae7d)
- [ ] [preCornerDetect](https://docs.opencv.org/master/dd/d1a/group__imgproc__feature.html#gaa819f39b5c994871774081803ae22586)
- [X] **Object Detection**
- [ ] **Object Detection - WORK STARTED**
- [ ] **aruco. ArUco Marker Detection - WORK STARTED**
- [ ] [refineDetectedMarkers](https://docs.opencv.org/4.x/d2/d1a/classcv_1_1aruco_1_1ArucoDetector.html#ad806c9310cfc826a178b0aefdf09bab6)
- [ ] [drawDetectedCornersCharuco](https://docs.opencv.org/4.x/de/d67/group__objdetect__aruco.html#ga7225eee644190f791e1583c499b7ab10)
- [ ] [drawDetectedDiamonds](https://docs.opencv.org/4.x/de/d67/group__objdetect__aruco.html#ga0dbf27203267fb8e9f282554cf0d3433)
- [ ] [extendDictionary](https://docs.opencv.org/4.x/de/d67/group__objdetect__aruco.html#ga928c031e9a782b18405af56c851d9549)
- [ ] [CharucoDetector](https://docs.opencv.org/4.x/d9/df5/classcv_1_1aruco_1_1CharucoDetector.html#ad7647d1c3d0e2db97bedc31f743e796b)
- [ ] [detectBoard](https://docs.opencv.org/4.x/d9/df5/classcv_1_1aruco_1_1CharucoDetector.html#aacbea601612a3a0feaa45ebb7fb255fd)
- [ ] [detectDiamonds](https://docs.opencv.org/4.x/d9/df5/classcv_1_1aruco_1_1CharucoDetector.html#a50342803f68deb1e6b0b79f61d4b1a73)
- [X] **imgcodecs. Image file reading and writing.**
- [X] **videoio. Video I/O**
@@ -108,20 +96,17 @@ Your pull requests will be greatly appreciated!
- [ ] [CamShift](https://docs.opencv.org/master/dc/d6b/group__video__track.html#gaef2bd39c8356f423124f1fe7c44d54a1)
- [ ] [DualTVL1OpticalFlow](https://docs.opencv.org/master/dc/d47/classcv_1_1DualTVL1OpticalFlow.html)
- [ ] [FarnebackOpticalFlow](https://docs.opencv.org/master/de/d9e/classcv_1_1FarnebackOpticalFlow.html)
- [ ] [KalmanFilter](https://docs.opencv.org/master/dd/d6a/classcv_1_1KalmanFilter.html)
- [ ] [SparsePyrLKOpticalFlow](https://docs.opencv.org/master/d7/d08/classcv_1_1SparsePyrLKOpticalFlow.html)
- [ ] [GOTURN](https://docs.opencv.org/master/d7/d4c/classcv_1_1TrackerGOTURN.html)
- [ ] **calib3d. Camera Calibration and 3D Reconstruction - WORK STARTED**. The following functions still need implementation:
- [ ] **Camera Calibration - WORK STARTED** The following functions still need implementation:
- [X] [calibrateCamera](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [calibrateCameraRO](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [calibrateHandEye](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [calibrateCameraRO](https://docs.opencv.org/4.x/d9/d0c/group__calib3d.html#gacb6b35670216b24b67c70fcd21519ead)
- [ ] [calibrateHandEye](https://docs.opencv.org/4.x/d9/d0c/group__calib3d.html#gaebfc1c9f7434196a374c382abf43439b)
- [ ] [calibrateRobotWorldHandEye](https://docs.opencv.org/4.x/d9/d0c/group__calib3d.html#ga41b1a8dd70eae371eba707d101729c36)
- [ ] [calibrationMatrixValues](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [checkChessboard](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [composeRT](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [computeCorrespondEpilines](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [convertPointsFromHomogeneous](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [convertPointsHomogeneous](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [convertPointsToHomogeneous](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [correctMatches](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
@@ -130,13 +115,10 @@ Your pull requests will be greatly appreciated!
- [ ] [decomposeProjectionMatrix](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [drawChessboardCorners](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [drawFrameAxes](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [X] [estimateAffine2D](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [estimateAffine3D](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [filterHomographyDecompByVisibleRefpoints](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [filterSpeckles](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [find4QuadCornerSubpix](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [X] [findChessboardCorners](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [X] [findChessboardCornersSB](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [findCirclesGrid](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [findEssentialMat](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [findFundamentalMat](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
@@ -151,11 +133,9 @@ Your pull requests will be greatly appreciated!
- [ ] [recoverPose](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [rectify3Collinear](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [reprojectImageTo3D](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [Rodrigues](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [RQDecomp3x3](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [sampsonDistance](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [solveP3P](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [solvePnP](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [solvePnPGeneric](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [solvePnPRansac](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [solvePnPRefineLM](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
@@ -163,12 +143,9 @@ Your pull requests will be greatly appreciated!
- [ ] [stereoCalibrate](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [stereoRectify](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [stereoRectifyUncalibrated](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [triangulatePoints](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] [validateDisparity](https://docs.opencv.org/master/d9/d0c/group__calib3d.html)
- [ ] **Fisheye - WORK STARTED** The following functions still need implementation:
- [ ] [calibrate](https://docs.opencv.org/master/db/d58/group__calib3d__fisheye.html#gad626a78de2b1dae7489e152a5a5a89e1)
- [ ] [distortPoints](https://docs.opencv.org/master/db/d58/group__calib3d__fisheye.html#ga75d8877a98e38d0b29b6892c5f8d7765)
- [ ] [projectPoints](https://docs.opencv.org/master/db/d58/group__calib3d__fisheye.html#gab1ad1dc30c42ee1a50ce570019baf2c4)
- [ ] [stereoCalibrate](https://docs.opencv.org/master/db/d58/group__calib3d__fisheye.html#gadbb3a6ca6429528ef302c784df47949b)
- [ ] [stereoRectify](https://docs.opencv.org/master/db/d58/group__calib3d__fisheye.html#gac1af58774006689056b0f2ef1db55ecc)
@@ -188,9 +165,6 @@ Your pull requests will be greatly appreciated!
- [ ] **photo. Computational Photography - WORK STARTED** The following functions still need implementation:
- [ ] [inpaint](https://docs.opencv.org/master/d7/d8b/group__photo__inpaint.html#gaedd30dfa0214fec4c88138b51d678085)
- [ ] [denoise_TVL1](https://docs.opencv.org/master/d1/d79/group__photo__denoise.html#ga7602ed5ae17b7de40152b922227c4e4f)
- [X] [fastNlMeansDenoising](https://docs.opencv.org/master/d1/d79/group__photo__denoise.html#ga4c6b0031f56ea3f98f768881279ffe93)
- [X] [fastNlMeansDenoisingColored](https://docs.opencv.org/master/d1/d79/group__photo__denoise.html#ga03aa4189fc3e31dafd638d90de335617)
- [X] [fastNlMeansDenoisingMulti](https://docs.opencv.org/master/d1/d79/group__photo__denoise.html#gaf4421bf068c4d632ea7f0aa38e0bf172)
- [ ] [createCalibrateDebevec](https://docs.opencv.org/master/d6/df5/group__photo__hdr.html#ga7fed9707ad5f2cc0e633888867109f90)
- [ ] [createCalibrateRobertson](https://docs.opencv.org/master/d6/df5/group__photo__hdr.html#gae77813a21cd351a596619e5ff013be5d)
- [ ] [createMergeDebevec](https://docs.opencv.org/master/d6/df5/group__photo__hdr.html#gaa8eab36bc764abb2a225db7c945f87f9)
@@ -200,10 +174,6 @@ Your pull requests will be greatly appreciated!
- [ ] [createTonemapMantiuk](https://docs.opencv.org/master/d6/df5/group__photo__hdr.html#ga3b3f3bf083b7515802f039a6a70f2d21)
- [ ] [createTonemapReinhard](https://docs.opencv.org/master/d6/df5/group__photo__hdr.html#gadabe7f6bf1fa96ad0fd644df9182c2fb)
- [ ] [decolor](https://docs.opencv.org/master/d4/d32/group__photo__decolor.html#ga4864d4c007bda5dacdc5e9d4ed7e222c)
- [X] [detailEnhance](https://docs.opencv.org/master/df/dac/group__photo__render.html#ga0de660cb6f371a464a74c7b651415975)
- [X] [edgePreservingFilter](https://docs.opencv.org/master/df/dac/group__photo__render.html#gafaee2977597029bc8e35da6e67bd31f7)
- [X] [pencilSketch](https://docs.opencv.org/master/df/dac/group__photo__render.html#gae5930dd822c713b36f8529b21ddebd0c)
- [X] [stylization](https://docs.opencv.org/master/df/dac/group__photo__render.html#gacb0f7324017df153d7b5d095aed53206)
- [ ] stitching. Images stitching
@@ -212,9 +182,6 @@ Your pull requests will be greatly appreciated!
- [ ] **core. - WORK STARTED** The following functions still need implementation:
- [ ] [cv::cuda::convertFp16](https://docs.opencv.org/master/d8/d40/group__cudacore__init.html#gaa1c52258763197958eb9e6681917f723)
- [ ] [cv::cuda::deviceSupports](https://docs.opencv.org/master/d8/d40/group__cudacore__init.html#ga170b10cc9af4aa8cce8c0afdb4b1d08c)
- [X] [cv::cuda::getDevice](https://docs.opencv.org/master/d8/d40/group__cudacore__init.html#ga6ded4ed8e4fc483a9863d31f34ec9c0e)
- [X] [cv::cuda::resetDevice](https://docs.opencv.org/master/d8/d40/group__cudacore__init.html#ga6153b6f461101374e655a54fc77e725e)
- [X] [cv::cuda::setDevice](https://docs.opencv.org/master/d8/d40/group__cudacore__init.html#gaefa34186b185de47851836dba537828b)
- [ ] **cudaarithm. Operations on Matrices - WORK STARTED** The following functions still need implementation:
- [ ] **core** The following functions still need implementation:
@@ -225,32 +192,17 @@ Your pull requests will be greatly appreciated!
- [ ] [cv::cuda::transpose](https://docs.opencv.org/master/de/d09/group__cudaarithm__core.html#ga327b71c3cb811a904ccf5fba37fc29f2)
- [ ] **per-element operations - WORK STARTED** The following functions still need implementation:
- [X] [cv::cuda::absdiff](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#gac062b283cf46ee90f74a773d3382ab54)
- [X] [cv::cuda::add](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#ga5d9794bde97ed23d1c1485249074a8b1)
- [ ] [cv::cuda::addWeighted](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#ga2cd14a684ea70c6ab2a63ee90ffe6201)
- [X] [cv::cuda::bitwise_and](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#ga78d7c1a013877abd4237fbfc4e13bd76)
- [X] [cv::cuda::bitwise_not](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#gae58159a2259ae1acc76b531c171cf06a)
- [X] [cv::cuda::bitwise_or](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#gafd098ee3e51c68daa793999c1da3dfb7)
- [X] [cv::cuda::bitwise_xor](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#ga3d95d4faafb099aacf18e8b915a4ad8d)
- [ ] [cv::cuda::cartToPolar](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#ga82210c7d1c1d42e616e554bf75a53480)
- [ ] [cv::cuda::compare](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#ga4d41cd679f4a83862a3de71a6057db54)
- [X] [cv::cuda::divide](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#ga124315aa226260841e25cc0b9ea99dc3)
- [X] [cv::cuda::exp](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#gac6e51541d3bb0a7a396128e4d5919b61)
- [ ] [cv::cuda::inRange](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#gaf611ab6b1d85e951feb6f485b1ed9672)
- [X] [cv::cuda::log](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#gaae9c60739e2d1a977b4d3250a0be42ca)
- [ ] [cv::cuda::lshift](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#gafd072accecb14c9adccdad45e3bf2300)
- [ ] [cv::cuda::magnitude](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#ga3d17f4fcd79d7c01fadd217969009463)
- [ ] [cv::cuda::magnitudeSqr](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#ga7613e382d257e150033d0ce4d6098f6a)
- [X] [cv::cuda::max](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#gadb5dd3d870f10c0866035755b929b1e7)
- [X] [cv::cuda::min](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#ga74f0b05a65b3d949c237abb5e6c60867)
- [X] [cv::cuda::multiply](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#ga497cc0615bf717e1e615143b56f00591)
- [ ] [cv::cuda::phase](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#ga5b75ec01be06dcd6e27ada09a0d4656a)
- [ ] [cv::cuda::polarToCart](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#ga01516a286a329c303c2db746513dd9df)
- [ ] [cv::cuda::pow](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#ga82d04ef4bcc4dfa9bfbe76488007c6c4)
- [ ] [cv::cuda::rshift](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#ga87af0b66358cc302676f35c1fd56c2ed)
- [X] [cv::cuda::sqr](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#ga8aae233da90ce0ffe309ab8004342acb)
- [X] [cv::cuda::sqrt](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#ga09303680cb1a5521a922b6d392028d8c)
- [X] [cv::cuda::subtract](https://docs.opencv.org/master/d8/d34/group__cudaarithm__elem.html#ga6eab60fc250059e2fda79c5636bd067f)
- [ ] **matrix reductions** The following functions still need implementation:
- [ ] [cv::cuda::absSum](https://docs.opencv.org/master/d5/de6/group__cudaarithm__reduce.html#ga690fa79ba4426c53f7d2bebf3d37a32a)
@@ -306,7 +258,6 @@ Your pull requests will be greatly appreciated!
- [ ] [cv::cuda::createSeparableLinearFilter](https://docs.opencv.org/master/dc/d66/group__cudafilters.html#gaf7b79a9a92992044f328dad07a52c4bf)
- [ ] **cudaimgproc. Image Processing - WORK STARTED** The following functions still need implementation:
- [ ] [cv::cuda::TemplateMatching](https://docs.opencv.org/master/d2/d58/classcv_1_1cuda_1_1TemplateMatching.html)
- [ ] [cv::cuda::alphaComp](https://docs.opencv.org/master/db/d8c/group__cudaimgproc__color.html#ga08a698700458d9311390997b57fbf8dc)
- [ ] [cv::cuda::demosaicing](https://docs.opencv.org/master/db/d8c/group__cudaimgproc__color.html#ga7fb153572b573ebd2d7610fcbe64166e)
- [ ] [cv::cuda::gammaCorrection](https://docs.opencv.org/master/db/d8c/group__cudaimgproc__color.html#gaf4195a8409c3b8fbfa37295c2b2c4729)
@@ -353,9 +304,8 @@ Your pull requests will be greatly appreciated!
## Contrib modules list
- [ ] alphamat. Alpha Matting
- [X] **aruco. ArUco Marker Detection - WORK STARTED**
- [ ] barcode. Barcode detecting and decoding methods
- [X] **bgsegm. Improved Background-Foreground Segmentation Methods - WORK STARTED**
- [ ] **bgsegm. Improved Background-Foreground Segmentation Methods - WORK STARTED**
- [ ] bioinspired. Biologically inspired vision models and derivated tools
- [ ] ccalib. Custom Calibration Pattern for 3D reconstruction
- [ ] cnn_3dobj. 3D object recognition and pose estimation API

36
vendor/gocv.io/x/gocv/appveyor.yml generated vendored
View File

@@ -1,36 +0,0 @@
version: "{build}"
clone_folder: c:\gopath\src\gocv.io\x\gocv
platform:
- MinGW_x64
environment:
GOPATH: c:\gopath
GOROOT: c:\go
GOVERSION: 1.16
TEST_EXTERNAL: 1
APPVEYOR_SAVE_CACHE_ON_ERROR: true
cache:
- C:\opencv -> appveyor_build_opencv.cmd
install:
- if not exist "C:\opencv" appveyor_build_opencv.cmd
- set PATH=C:\Perl\site\bin;C:\Perl\bin;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Program Files\7-Zip;C:\Program Files\Microsoft\Web Platform Installer\;C:\Tools\PsTools;C:\Program Files (x86)\CMake\bin;C:\go\bin;C:\Tools\NuGet;C:\Program Files\LLVM\bin;C:\Tools\curl\bin;C:\ProgramData\chocolatey\bin;C:\Program Files (x86)\Yarn\bin;C:\Users\appveyor\AppData\Local\Yarn\bin;C:\Program Files\AppVeyor\BuildAgent\
- set PATH=%PATH%;C:\mingw-w64\x86_64-7.3.0-posix-seh-rt_v5-rev0\mingw64\bin
- set PATH=%PATH%;C:\Tools\GitVersion;C:\Program Files\Git LFS;C:\Program Files\Git\cmd;C:\Program Files\Git\usr\bin;C:\opencv\build\install\x64\mingw\bin;
- echo %PATH%
- echo %GOPATH%
- go version
- cd c:\gopath\src\gocv.io\x\gocv
- go get -d .
- set GOCV_CAFFE_TEST_FILES=C:\opencv\testdata
- set GOCV_TENSORFLOW_TEST_FILES=C:\opencv\testdata
- set GOCV_ONNX_TEST_FILES=C:\opencv\testdata
- set OPENCV_ENABLE_NONFREE=ON
- go env
build_script:
- go test -tags matprofile -v .
- go test -tags matprofile -v ./contrib

29
vendor/gocv.io/x/gocv/appveyor_build_opencv.cmd generated vendored
View File

@@ -1,29 +0,0 @@
if not exist "C:\opencv" mkdir "C:\opencv"
if not exist "C:\opencv\build" mkdir "C:\opencv\build"
if not exist "C:\opencv\testdata" mkdir "C:\opencv\testdata"
appveyor DownloadFile https://github.com/opencv/opencv/archive/4.6.0.zip -FileName c:\opencv\opencv-4.6.0.zip
7z x c:\opencv\opencv-4.6.0.zip -oc:\opencv -y
del c:\opencv\opencv-4.6.0.zip /q
appveyor DownloadFile https://github.com/opencv/opencv_contrib/archive/4.6.0.zip -FileName c:\opencv\opencv_contrib-4.6.0.zip
7z x c:\opencv\opencv_contrib-4.6.0.zip -oc:\opencv -y
del c:\opencv\opencv_contrib-4.6.0.zip /q
cd C:\opencv\build
set PATH=C:\Perl\site\bin;C:\Perl\bin;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Program Files\7-Zip;C:\Program Files\Microsoft\Web Platform Installer\;C:\Tools\PsTools;C:\Program Files (x86)\CMake\bin;C:\go\bin;C:\Tools\NuGet;C:\Program Files\LLVM\bin;C:\Tools\curl\bin;C:\ProgramData\chocolatey\bin;C:\Program Files (x86)\Yarn\bin;C:\Users\appveyor\AppData\Local\Yarn\bin;C:\Program Files\AppVeyor\BuildAgent\
set PATH=%PATH%;C:\mingw-w64\x86_64-8.1.0-posix-seh-rt_v6-rev0\mingw64\bin
dir C:\opencv
cmake C:\opencv\opencv-4.6.0 -G "MinGW Makefiles" -BC:\opencv\build -DENABLE_CXX11=ON -DOPENCV_EXTRA_MODULES_PATH=C:\opencv\opencv_contrib-4.6.0\modules -DBUILD_SHARED_LIBS=ON -DWITH_IPP=OFF -DWITH_MSMF=OFF -DBUILD_EXAMPLES=OFF -DBUILD_TESTS=OFF -DBUILD_PERF_TESTS=OFF -DBUILD_opencv_java=OFF -DBUILD_opencv_python=OFF -DBUILD_opencv_python2=OFF -DBUILD_opencv_python3=OFF -DBUILD_DOCS=OFF -DENABLE_PRECOMPILED_HEADERS=OFF -DBUILD_opencv_saliency=OFF -DBUILD_opencv_wechat_qrcode=ON -DCPU_DISPATCH= -DBUILD_opencv_gapi=OFF -DOPENCV_GENERATE_PKGCONFIG=ON -DOPENCV_ENABLE_NONFREE=ON -DWITH_OPENCL_D3D11_NV=OFF -DOPENCV_ALLOCATOR_STATS_COUNTER_TYPE=int64_t -DWITH_TBB=ON -Wno-dev
mingw32-make -j%NUMBER_OF_PROCESSORS%
mingw32-make install
appveyor DownloadFile https://raw.githubusercontent.com/opencv/opencv_extra/master/testdata/dnn/bvlc_googlenet.prototxt -FileName C:\opencv\testdata\bvlc_googlenet.prototxt
appveyor DownloadFile https://raw.githubusercontent.com/WeChatCV/opencv_3rdparty/wechat_qrcode/detect.caffemodel -FileName C:\opencv\testdata\detect.caffemodel
appveyor DownloadFile https://raw.githubusercontent.com/WeChatCV/opencv_3rdparty/wechat_qrcode/detect.prototxt -FileName C:\opencv\testdata\detect.prototxt
appveyor DownloadFile https://raw.githubusercontent.com/WeChatCV/opencv_3rdparty/wechat_qrcode/sr.caffemodel -FileName C:\opencv\testdata\sr.caffemodel
appveyor DownloadFile https://raw.githubusercontent.com/WeChatCV/opencv_3rdparty/wechat_qrcode/sr.prototxt -FileName C:\opencv\testdata\sr.prototxt
appveyor DownloadFile http://dl.caffe.berkeleyvision.org/bvlc_googlenet.caffemodel -FileName C:\opencv\testdata\bvlc_googlenet.caffemodel
appveyor DownloadFile https://storage.googleapis.com/download.tensorflow.org/models/inception5h.zip -FileName C:\opencv\testdata\inception5h.zip
appveyor DownloadFile https://github.com/onnx/models/raw/main/vision/classification/inception_and_googlenet/googlenet/model/googlenet-9.onnx -FileName C:\opencv\testdata\googlenet-9.onnx
7z x C:\opencv\testdata\inception5h.zip -oC:\opencv\testdata tensorflow_inception_graph.pb -y
rmdir c:\opencv\opencv-4.6.0 /s /q
rmdir c:\opencv\opencv_contrib-4.6.0 /s /q

287
vendor/gocv.io/x/gocv/aruco.cpp generated vendored Normal file
View File

@@ -0,0 +1,287 @@
#include "aruco.h"
ArucoDetector ArucoDetector_New() {
return new cv::aruco::ArucoDetector();
}
ArucoDetector ArucoDetector_NewWithParams(ArucoDictionary dictionary, ArucoDetectorParameters params) {
return new cv::aruco::ArucoDetector(*dictionary, *params);
}
void ArucoDetector_Close(ArucoDetector ad) {
delete ad;
}
void ArucoDetector_DetectMarkers(ArucoDetector ad, Mat inputArr, Points2fVector markerCorners, IntVector *markerIds, Points2fVector rejectedCandidates) {
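// Detect into a temporary std::vector, then copy the ids into a plain C array
// that the Go caller reads and frees (see DetectMarkers in aruco.go).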
std::vector<int> _markerIds;
ad->detectMarkers(*inputArr, *markerCorners, _markerIds, *rejectedCandidates);
int *ids = new int[_markerIds.size()];
for (size_t i = 0; i < _markerIds.size(); ++i)
{
ids[i] = _markerIds[i];
}
markerIds->length = _markerIds.size();
markerIds->val = ids;
}
ArucoDetectorParameters ArucoDetectorParameters_Create()
{
return new cv::aruco::DetectorParameters();
}
void ArucoDetectorParameters_SetAdaptiveThreshWinSizeMin(ArucoDetectorParameters ap, int adaptiveThreshWinSizeMin) {
ap->adaptiveThreshWinSizeMin = adaptiveThreshWinSizeMin;
}
int ArucoDetectorParameters_GetAdaptiveThreshWinSizeMin(ArucoDetectorParameters ap) {
return ap->adaptiveThreshWinSizeMin;
}
void ArucoDetectorParameters_SetAdaptiveThreshWinSizeMax(ArucoDetectorParameters ap, int adaptiveThreshWinSizeMax) {
ap->adaptiveThreshWinSizeMax = adaptiveThreshWinSizeMax;
}
int ArucoDetectorParameters_GetAdaptiveThreshWinSizeMax(ArucoDetectorParameters ap) {
return ap->adaptiveThreshWinSizeMax;
}
void ArucoDetectorParameters_SetAdaptiveThreshWinSizeStep(ArucoDetectorParameters ap, int adaptiveThreshWinSizeStep) {
ap->adaptiveThreshWinSizeStep = adaptiveThreshWinSizeStep;
}
int ArucoDetectorParameters_GetAdaptiveThreshWinSizeStep(ArucoDetectorParameters ap) {
return ap->adaptiveThreshWinSizeStep;
}
void ArucoDetectorParameters_SetAdaptiveThreshConstant(ArucoDetectorParameters ap, double adaptiveThreshConstant) {
ap->adaptiveThreshConstant = adaptiveThreshConstant;
}
double ArucoDetectorParameters_GetAdaptiveThreshConstant(ArucoDetectorParameters ap) {
return ap->adaptiveThreshConstant;
}
void ArucoDetectorParameters_SetMinMarkerPerimeterRate(ArucoDetectorParameters ap, double minMarkerPerimeterRate) {
ap->minMarkerPerimeterRate = minMarkerPerimeterRate;
}
double ArucoDetectorParameters_GetMinMarkerPerimeterRate(ArucoDetectorParameters ap){
return ap->minMarkerPerimeterRate;
}
void ArucoDetectorParameters_SetMaxMarkerPerimeterRate(ArucoDetectorParameters ap, double maxMarkerPerimeterRate) {
ap->maxMarkerPerimeterRate = maxMarkerPerimeterRate;
}
double ArucoDetectorParameters_GetMaxMarkerPerimeterRate(ArucoDetectorParameters ap){
return ap->maxMarkerPerimeterRate;
}
void ArucoDetectorParameters_SetPolygonalApproxAccuracyRate(ArucoDetectorParameters ap, double polygonalApproxAccuracyRate) {
ap->polygonalApproxAccuracyRate = polygonalApproxAccuracyRate;
}
double ArucoDetectorParameters_GetPolygonalApproxAccuracyRate(ArucoDetectorParameters ap){
return ap->polygonalApproxAccuracyRate;
}
void ArucoDetectorParameters_SetMinCornerDistanceRate(ArucoDetectorParameters ap, double minCornerDistanceRate) {
ap->minCornerDistanceRate = minCornerDistanceRate;
}
double ArucoDetectorParameters_GetMinCornerDistanceRate(ArucoDetectorParameters ap) {
return ap->minCornerDistanceRate;
}
void ArucoDetectorParameters_SetMinDistanceToBorder(ArucoDetectorParameters ap, int minDistanceToBorder) {
ap->minDistanceToBorder = minDistanceToBorder;
}
int ArucoDetectorParameters_GetMinDistanceToBorder(ArucoDetectorParameters ap) {
return ap->minDistanceToBorder;
}
void ArucoDetectorParameters_SetMinMarkerDistanceRate(ArucoDetectorParameters ap, double minMarkerDistanceRate) {
ap->minMarkerDistanceRate = minMarkerDistanceRate;
}
double ArucoDetectorParameters_GetMinMarkerDistanceRate(ArucoDetectorParameters ap) {
return ap->minMarkerDistanceRate;
}
void ArucoDetectorParameters_SetCornerRefinementMethod(ArucoDetectorParameters ap, int cornerRefinementMethod) {
ap->cornerRefinementMethod = cv::aruco::CornerRefineMethod(cornerRefinementMethod);
}
int ArucoDetectorParameters_GetCornerRefinementMethod(ArucoDetectorParameters ap) {
return ap->cornerRefinementMethod;
}
void ArucoDetectorParameters_SetCornerRefinementWinSize(ArucoDetectorParameters ap, int cornerRefinementWinSize) {
ap->cornerRefinementWinSize = cornerRefinementWinSize;
}
int ArucoDetectorParameters_GetCornerRefinementWinSize(ArucoDetectorParameters ap) {
return ap->cornerRefinementWinSize;
}
void ArucoDetectorParameters_SetCornerRefinementMaxIterations(ArucoDetectorParameters ap, int cornerRefinementMaxIterations) {
ap->cornerRefinementMaxIterations = cornerRefinementMaxIterations;
}
int ArucoDetectorParameters_GetCornerRefinementMaxIterations(ArucoDetectorParameters ap) {
return ap->cornerRefinementMaxIterations;
}
void ArucoDetectorParameters_SetCornerRefinementMinAccuracy(ArucoDetectorParameters ap, double cornerRefinementMinAccuracy) {
ap->cornerRefinementMinAccuracy = cornerRefinementMinAccuracy;
}
double ArucoDetectorParameters_GetCornerRefinementMinAccuracy(ArucoDetectorParameters ap) {
return ap->cornerRefinementMinAccuracy;
}
void ArucoDetectorParameters_SetMarkerBorderBits(ArucoDetectorParameters ap, int markerBorderBits) {
ap->markerBorderBits = markerBorderBits;
}
int ArucoDetectorParameters_GetMarkerBorderBits(ArucoDetectorParameters ap) {
return ap->markerBorderBits;
}
void ArucoDetectorParameters_SetPerspectiveRemovePixelPerCell(ArucoDetectorParameters ap, int perspectiveRemovePixelPerCell) {
ap->perspectiveRemovePixelPerCell = perspectiveRemovePixelPerCell;
}
int ArucoDetectorParameters_GetPerspectiveRemovePixelPerCell(ArucoDetectorParameters ap) {
return ap->perspectiveRemovePixelPerCell;
}
void ArucoDetectorParameters_SetPerspectiveRemoveIgnoredMarginPerCell(ArucoDetectorParameters ap, double perspectiveRemoveIgnoredMarginPerCell) {
ap->perspectiveRemoveIgnoredMarginPerCell = perspectiveRemoveIgnoredMarginPerCell;
}
double ArucoDetectorParameters_GetPerspectiveRemoveIgnoredMarginPerCell(ArucoDetectorParameters ap) {
return ap->perspectiveRemoveIgnoredMarginPerCell;
}
void ArucoDetectorParameters_SetMaxErroneousBitsInBorderRate(ArucoDetectorParameters ap, double maxErroneousBitsInBorderRate) {
ap->maxErroneousBitsInBorderRate = maxErroneousBitsInBorderRate;
}
double ArucoDetectorParameters_GetMaxErroneousBitsInBorderRate(ArucoDetectorParameters ap) {
return ap->maxErroneousBitsInBorderRate;
}
void ArucoDetectorParameters_SetMinOtsuStdDev(ArucoDetectorParameters ap, double minOtsuStdDev) {
ap->minOtsuStdDev = minOtsuStdDev;
}
double ArucoDetectorParameters_GetMinOtsuStdDev(ArucoDetectorParameters ap) {
return ap->minOtsuStdDev;
}
void ArucoDetectorParameters_SetErrorCorrectionRate(ArucoDetectorParameters ap, double errorCorrectionRate) {
ap->errorCorrectionRate = errorCorrectionRate;
}
double ArucoDetectorParameters_GetErrorCorrectionRate(ArucoDetectorParameters ap) {
return ap->errorCorrectionRate;
}
void ArucoDetectorParameters_SetAprilTagQuadDecimate(ArucoDetectorParameters ap, float aprilTagQuadDecimate) {
ap->aprilTagQuadDecimate = aprilTagQuadDecimate;
}
float ArucoDetectorParameters_GetAprilTagQuadDecimate(ArucoDetectorParameters ap) {
return ap->aprilTagQuadDecimate;
}
void ArucoDetectorParameters_SetAprilTagQuadSigma(ArucoDetectorParameters ap, float aprilTagQuadSigma) {
ap->aprilTagQuadSigma = aprilTagQuadSigma;
}
float ArucoDetectorParameters_GetAprilTagQuadSigma(ArucoDetectorParameters ap) {
return ap->aprilTagQuadSigma;
}
void ArucoDetectorParameters_SetAprilTagMinClusterPixels(ArucoDetectorParameters ap, int aprilTagMinClusterPixels) {
ap->aprilTagMinClusterPixels = aprilTagMinClusterPixels;
}
int ArucoDetectorParameters_GetAprilTagMinClusterPixels(ArucoDetectorParameters ap) {
return ap->aprilTagMinClusterPixels;
}
void ArucoDetectorParameters_SetAprilTagMaxNmaxima(ArucoDetectorParameters ap, int aprilTagMaxNmaxima) {
ap->aprilTagMaxNmaxima = aprilTagMaxNmaxima;
}
int ArucoDetectorParameters_GetAprilTagMaxNmaxima(ArucoDetectorParameters ap) {
return ap->aprilTagMaxNmaxima;
}
void ArucoDetectorParameters_SetAprilTagCriticalRad(ArucoDetectorParameters ap, float aprilTagCriticalRad) {
ap->aprilTagCriticalRad = aprilTagCriticalRad;
}
float ArucoDetectorParameters_GetAprilTagCriticalRad(ArucoDetectorParameters ap) {
return ap->aprilTagCriticalRad;
}
void ArucoDetectorParameters_SetAprilTagMaxLineFitMse(ArucoDetectorParameters ap, float aprilTagMaxLineFitMse) {
ap->aprilTagMaxLineFitMse = aprilTagMaxLineFitMse;
}
float ArucoDetectorParameters_GetAprilTagMaxLineFitMse(ArucoDetectorParameters ap) {
return ap->aprilTagMaxLineFitMse;
}
void ArucoDetectorParameters_SetAprilTagMinWhiteBlackDiff(ArucoDetectorParameters ap, int aprilTagMinWhiteBlackDiff) {
ap->aprilTagMinWhiteBlackDiff = aprilTagMinWhiteBlackDiff;
}
int ArucoDetectorParameters_GetAprilTagMinWhiteBlackDiff(ArucoDetectorParameters ap) {
return ap->aprilTagMinWhiteBlackDiff;
}
void ArucoDetectorParameters_SetAprilTagDeglitch(ArucoDetectorParameters ap, int aprilTagDeglitch) {
ap->aprilTagDeglitch = aprilTagDeglitch;
}
int ArucoDetectorParameters_GetAprilTagDeglitch(ArucoDetectorParameters ap) {
return ap->aprilTagDeglitch;
}
void ArucoDetectorParameters_SetDetectInvertedMarker(ArucoDetectorParameters ap, bool detectInvertedMarker) {
ap->detectInvertedMarker = detectInvertedMarker;
}
bool ArucoDetectorParameters_GetDetectInvertedMarker(ArucoDetectorParameters ap) {
return ap->detectInvertedMarker;
}
void ArucoDrawDetectedMarkers(Mat image, Points2fVector markerCorners, IntVector markerIds, Scalar borderColor)
{
std::vector<int> _markerIds;
for (int i = 0, *v = markerIds.val; i < markerIds.length; ++v, ++i)
{
_markerIds.push_back(*v);
}
cv::Scalar _borderColor = cv::Scalar(borderColor.val1, borderColor.val2, borderColor.val3);
cv::aruco::drawDetectedMarkers(*image, *markerCorners, _markerIds, _borderColor);
}
void ArucoGenerateImageMarker(int dictionaryId, int id, int sidePixels, Mat img, int borderBits)
{
cv::aruco::Dictionary dict = cv::aruco::getPredefinedDictionary(dictionaryId);
cv::aruco::generateImageMarker(dict, id, sidePixels, *img, borderBits);
}
ArucoDictionary getPredefinedDictionary(int dictionaryId)
{
return new cv::aruco::Dictionary(cv::aruco::getPredefinedDictionary(dictionaryId));
}

337
vendor/gocv.io/x/gocv/aruco.go generated vendored Normal file

@@ -0,0 +1,337 @@
package gocv
/*
#include <stdlib.h>
#include "aruco.h"
#include "core.h"
*/
import "C"
import (
"reflect"
"unsafe"
)
type ArucoDetector struct {
p C.ArucoDetector
}
// NewArucoDetector returns a new ArucoDetector.
func NewArucoDetector() ArucoDetector {
return ArucoDetector{p: C.ArucoDetector_New()}
}
// NewArucoDetectorWithParams returns a new ArucoDetector configured with the given dictionary and detector parameters.
func NewArucoDetectorWithParams(dictionary ArucoDictionary, params ArucoDetectorParameters) ArucoDetector {
return ArucoDetector{p: C.ArucoDetector_NewWithParams(dictionary.p, params.p)}
}
// Close deletes the ArucoDetector's pointer.
func (a *ArucoDetector) Close() error {
C.ArucoDetector_Close(a.p)
a.p = nil
return nil
}
// DetectMarkers does basic marker detection.
//
// For further details, please see:
// https://docs.opencv.org/master/d9/d6a/group__aruco.html#gab9159aa69250d8d3642593e508cb6baa
func (a *ArucoDetector) DetectMarkers(input Mat) (
markerCorners [][]Point2f, markerIds []int, rejectedCandidates [][]Point2f,
) {
pvsCorners := NewPoints2fVector()
defer pvsCorners.Close()
pvsRejected := NewPoints2fVector()
defer pvsRejected.Close()
cmarkerIds := C.IntVector{}
// free the C-allocated id buffer after the call; the closure defers reading
// cmarkerIds.val until DetectMarkers has actually filled it in
defer func() { C.free(unsafe.Pointer(cmarkerIds.val)) }()
C.ArucoDetector_DetectMarkers(a.p, C.Mat(input.Ptr()), C.Points2fVector(pvsCorners.P()),
&cmarkerIds, C.Points2fVector(pvsRejected.P()))
h := &reflect.SliceHeader{
Data: uintptr(unsafe.Pointer(cmarkerIds.val)),
Len: int(cmarkerIds.length),
Cap: int(cmarkerIds.length),
}
pcids := *(*[]C.int)(unsafe.Pointer(h))
markerIds = []int{}
for i := 0; i < int(cmarkerIds.length); i++ {
markerIds = append(markerIds, int(pcids[i]))
}
return pvsCorners.ToPoints(), markerIds, pvsRejected.ToPoints()
}
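// ArucoDrawDetectedMarkers draws the detected markers onto img, outlining each
// marker with borderColor and labelling it with its id.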
func ArucoDrawDetectedMarkers(img Mat, markerCorners [][]Point2f, markerIds []int, borderColor Scalar) {
cMarkerIds := make([]C.int, len(markerIds))
for i, s := range markerIds {
cMarkerIds[i] = C.int(s)
}
markerIdsIntVec := C.IntVector{
val: (*C.int)(&cMarkerIds[0]),
length: C.int(len(cMarkerIds)),
}
_markerCorners := NewPoints2fVectorFromPoints(markerCorners)
defer _markerCorners.Close()
cBorderColor := C.struct_Scalar{
val1: C.double(borderColor.Val1),
val2: C.double(borderColor.Val2),
val3: C.double(borderColor.Val3),
val4: C.double(borderColor.Val4),
}
C.ArucoDrawDetectedMarkers(
C.Mat(img.Ptr()),
C.Points2fVector(_markerCorners.P()),
markerIdsIntVec,
cBorderColor,
)
}
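// ArucoGenerateImageMarker renders the canonical image of the marker with the
// given id from the chosen predefined dictionary into img.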
func ArucoGenerateImageMarker(dictionaryId ArucoDictionaryCode, id int, sidePixels int, img Mat, borderBits int) {
C.ArucoGenerateImageMarker(C.int(dictionaryId), C.int(id), C.int(sidePixels), C.Mat(img.Ptr()), C.int(borderBits))
}
type ArucoDetectorParameters struct {
p C.ArucoDetectorParameters
}
// NewArucoDetectorParameters returns the default detection parameters for an ArucoDetector.
func NewArucoDetectorParameters() ArucoDetectorParameters {
return ArucoDetectorParameters{p: C.ArucoDetectorParameters_Create()}
}
func (ap *ArucoDetectorParameters) SetAdaptiveThreshWinSizeMin(adaptiveThreshWinSizeMin int) {
C.ArucoDetectorParameters_SetAdaptiveThreshWinSizeMin(ap.p, C.int(adaptiveThreshWinSizeMin))
}
func (ap *ArucoDetectorParameters) GetAdaptiveThreshWinSizeMin() int {
return int(C.ArucoDetectorParameters_GetAdaptiveThreshWinSizeMin(ap.p))
}
func (ap *ArucoDetectorParameters) SetAdaptiveThreshWinSizeMax(adaptiveThreshWinSizeMax int) {
C.ArucoDetectorParameters_SetAdaptiveThreshWinSizeMax(ap.p, C.int(adaptiveThreshWinSizeMax))
}
func (ap *ArucoDetectorParameters) GetAdaptiveThreshWinSizeMax() int {
return int(C.ArucoDetectorParameters_GetAdaptiveThreshWinSizeMax(ap.p))
}
func (ap *ArucoDetectorParameters) SetAdaptiveThreshWinSizeStep(adaptiveThreshWinSizeStep int) {
C.ArucoDetectorParameters_SetAdaptiveThreshWinSizeStep(ap.p, C.int(adaptiveThreshWinSizeStep))
}
func (ap *ArucoDetectorParameters) GetAdaptiveThreshWinSizeStep() int {
return int(C.ArucoDetectorParameters_GetAdaptiveThreshWinSizeStep(ap.p))
}
func (ap *ArucoDetectorParameters) SetAdaptiveThreshConstant(adaptiveThreshConstant float64) {
C.ArucoDetectorParameters_SetAdaptiveThreshConstant(ap.p, C.double(adaptiveThreshConstant))
}
func (ap *ArucoDetectorParameters) GetAdaptiveThreshConstant() float64 {
return float64(C.ArucoDetectorParameters_GetAdaptiveThreshConstant(ap.p))
}
func (ap *ArucoDetectorParameters) SetMinMarkerPerimeterRate(minMarkerPerimeterRate float64) {
C.ArucoDetectorParameters_SetMinMarkerPerimeterRate(ap.p, C.double(minMarkerPerimeterRate))
}
func (ap *ArucoDetectorParameters) GetMinMarkerPerimeterRate() float64 {
return float64(C.ArucoDetectorParameters_GetMinMarkerPerimeterRate(ap.p))
}
func (ap *ArucoDetectorParameters) SetMaxMarkerPerimeterRate(maxMarkerPerimeterRate float64) {
C.ArucoDetectorParameters_SetMaxMarkerPerimeterRate(ap.p, C.double(maxMarkerPerimeterRate))
}
func (ap *ArucoDetectorParameters) GetMaxMarkerPerimeterRate() float64 {
return float64(C.ArucoDetectorParameters_GetMaxMarkerPerimeterRate(ap.p))
}
func (ap *ArucoDetectorParameters) SetPolygonalApproxAccuracyRate(polygonalApproxAccuracyRate float64) {
C.ArucoDetectorParameters_SetPolygonalApproxAccuracyRate(ap.p, C.double(polygonalApproxAccuracyRate))
}
func (ap *ArucoDetectorParameters) GetPolygonalApproxAccuracyRate() float64 {
return float64(C.ArucoDetectorParameters_GetPolygonalApproxAccuracyRate(ap.p))
}
func (ap *ArucoDetectorParameters) SetMinCornerDistanceRate(minCornerDistanceRate float64) {
C.ArucoDetectorParameters_SetMinCornerDistanceRate(ap.p, C.double(minCornerDistanceRate))
}
func (ap *ArucoDetectorParameters) GetMinCornerDistanceRate() float64 {
return float64(C.ArucoDetectorParameters_GetMinCornerDistanceRate(ap.p))
}
func (ap *ArucoDetectorParameters) SetMinDistanceToBorder(minDistanceToBorder int) {
C.ArucoDetectorParameters_SetMinDistanceToBorder(ap.p, C.int(minDistanceToBorder))
}
func (ap *ArucoDetectorParameters) GetMinDistanceToBorder() int {
return int(C.ArucoDetectorParameters_GetMinDistanceToBorder(ap.p))
}
func (ap *ArucoDetectorParameters) SetMinMarkerDistanceRate(minMarkerDistanceRate float64) {
C.ArucoDetectorParameters_SetMinMarkerDistanceRate(ap.p, C.double(minMarkerDistanceRate))
}
func (ap *ArucoDetectorParameters) GetMinMarkerDistanceRate() float64 {
return float64(C.ArucoDetectorParameters_GetMinMarkerDistanceRate(ap.p))
}
func (ap *ArucoDetectorParameters) SetCornerRefinementMethod(cornerRefinementMethod int) {
C.ArucoDetectorParameters_SetCornerRefinementMethod(ap.p, C.int(cornerRefinementMethod))
}
func (ap *ArucoDetectorParameters) GetCornerRefinementMethod() int {
return int(C.ArucoDetectorParameters_GetCornerRefinementMethod(ap.p))
}
func (ap *ArucoDetectorParameters) SetCornerRefinementWinSize(cornerRefinementWinSize int) {
C.ArucoDetectorParameters_SetCornerRefinementWinSize(ap.p, C.int(cornerRefinementWinSize))
}
func (ap *ArucoDetectorParameters) GetCornerRefinementWinSize() int {
return int(C.ArucoDetectorParameters_GetCornerRefinementWinSize(ap.p))
}
func (ap *ArucoDetectorParameters) SetCornerRefinementMaxIterations(cornerRefinementMaxIterations int) {
C.ArucoDetectorParameters_SetCornerRefinementMaxIterations(ap.p, C.int(cornerRefinementMaxIterations))
}
func (ap *ArucoDetectorParameters) GetCornerRefinementMaxIterations() int {
return int(C.ArucoDetectorParameters_GetCornerRefinementMaxIterations(ap.p))
}
func (ap *ArucoDetectorParameters) SetCornerRefinementMinAccuracy(cornerRefinementMinAccuracy float64) {
C.ArucoDetectorParameters_SetCornerRefinementMinAccuracy(ap.p, C.double(cornerRefinementMinAccuracy))
}
func (ap *ArucoDetectorParameters) GetCornerRefinementMinAccuracy() float64 {
return float64(C.ArucoDetectorParameters_GetCornerRefinementMinAccuracy(ap.p))
}
func (ap *ArucoDetectorParameters) SetMarkerBorderBits(markerBorderBits int) {
C.ArucoDetectorParameters_SetMarkerBorderBits(ap.p, C.int(markerBorderBits))
}
func (ap *ArucoDetectorParameters) GetMarkerBorderBits() int {
return int(C.ArucoDetectorParameters_GetMarkerBorderBits(ap.p))
}
func (ap *ArucoDetectorParameters) SetPerspectiveRemovePixelPerCell(perspectiveRemovePixelPerCell int) {
C.ArucoDetectorParameters_SetPerspectiveRemovePixelPerCell(ap.p, C.int(perspectiveRemovePixelPerCell))
}
func (ap *ArucoDetectorParameters) GetPerspectiveRemovePixelPerCell() int {
return int(C.ArucoDetectorParameters_GetPerspectiveRemovePixelPerCell(ap.p))
}
func (ap *ArucoDetectorParameters) SetPerspectiveRemoveIgnoredMarginPerCell(perspectiveRemoveIgnoredMarginPerCell float64) {
C.ArucoDetectorParameters_SetPerspectiveRemoveIgnoredMarginPerCell(ap.p, C.double(perspectiveRemoveIgnoredMarginPerCell))
}
func (ap *ArucoDetectorParameters) GetPerspectiveRemoveIgnoredMarginPerCell() float64 {
return float64(C.ArucoDetectorParameters_GetPerspectiveRemoveIgnoredMarginPerCell(ap.p))
}
func (ap *ArucoDetectorParameters) SetMaxErroneousBitsInBorderRate(maxErroneousBitsInBorderRate float64) {
C.ArucoDetectorParameters_SetMaxErroneousBitsInBorderRate(ap.p, C.double(maxErroneousBitsInBorderRate))
}
func (ap *ArucoDetectorParameters) GetMaxErroneousBitsInBorderRate() float64 {
return float64(C.ArucoDetectorParameters_GetMaxErroneousBitsInBorderRate(ap.p))
}
func (ap *ArucoDetectorParameters) SetMinOtsuStdDev(minOtsuStdDev float64) {
C.ArucoDetectorParameters_SetMinOtsuStdDev(ap.p, C.double(minOtsuStdDev))
}
func (ap *ArucoDetectorParameters) GetMinOtsuStdDev() float64 {
return float64(C.ArucoDetectorParameters_GetMinOtsuStdDev(ap.p))
}
func (ap *ArucoDetectorParameters) SetErrorCorrectionRate(errorCorrectionRate float64) {
C.ArucoDetectorParameters_SetErrorCorrectionRate(ap.p, C.double(errorCorrectionRate))
}
func (ap *ArucoDetectorParameters) GetErrorCorrectionRate() float64 {
return float64(C.ArucoDetectorParameters_GetErrorCorrectionRate(ap.p))
}
func (ap *ArucoDetectorParameters) SetAprilTagQuadDecimate(aprilTagQuadDecimate float32) {
C.ArucoDetectorParameters_SetAprilTagQuadDecimate(ap.p, C.float(aprilTagQuadDecimate))
}
func (ap *ArucoDetectorParameters) GetAprilTagQuadDecimate() float32 {
return float32(C.ArucoDetectorParameters_GetAprilTagQuadDecimate(ap.p))
}
func (ap *ArucoDetectorParameters) SetAprilTagQuadSigma(aprilTagQuadSigma float32) {
C.ArucoDetectorParameters_SetAprilTagQuadSigma(ap.p, C.float(aprilTagQuadSigma))
}
func (ap *ArucoDetectorParameters) GetAprilTagQuadSigma() float32 {
return float32(C.ArucoDetectorParameters_GetAprilTagQuadSigma(ap.p))
}
func (ap *ArucoDetectorParameters) SetAprilTagMinClusterPixels(aprilTagMinClusterPixels int) {
C.ArucoDetectorParameters_SetAprilTagMinClusterPixels(ap.p, C.int(aprilTagMinClusterPixels))
}
func (ap *ArucoDetectorParameters) GetAprilTagMinClusterPixels() int {
return int(C.ArucoDetectorParameters_GetAprilTagMinClusterPixels(ap.p))
}
func (ap *ArucoDetectorParameters) SetAprilTagMaxNmaxima(aprilTagMaxNmaxima int) {
C.ArucoDetectorParameters_SetAprilTagMaxNmaxima(ap.p, C.int(aprilTagMaxNmaxima))
}
func (ap *ArucoDetectorParameters) GetAprilTagMaxNmaxima() int {
return int(C.ArucoDetectorParameters_GetAprilTagMaxNmaxima(ap.p))
}
func (ap *ArucoDetectorParameters) SetAprilTagCriticalRad(aprilTagCriticalRad float32) {
C.ArucoDetectorParameters_SetAprilTagCriticalRad(ap.p, C.float(aprilTagCriticalRad))
}
func (ap *ArucoDetectorParameters) GetAprilTagCriticalRad() float32 {
return float32(C.ArucoDetectorParameters_GetAprilTagCriticalRad(ap.p))
}
func (ap *ArucoDetectorParameters) SetAprilTagMaxLineFitMse(aprilTagMaxLineFitMse float32) {
C.ArucoDetectorParameters_SetAprilTagMaxLineFitMse(ap.p, C.float(aprilTagMaxLineFitMse))
}
func (ap *ArucoDetectorParameters) GetAprilTagMaxLineFitMse() float32 {
return float32(C.ArucoDetectorParameters_GetAprilTagMaxLineFitMse(ap.p))
}
func (ap *ArucoDetectorParameters) SetAprilTagMinWhiteBlackDiff(aprilTagMinWhiteBlackDiff int) {
C.ArucoDetectorParameters_SetAprilTagMinWhiteBlackDiff(ap.p, C.int(aprilTagMinWhiteBlackDiff))
}
func (ap *ArucoDetectorParameters) GetAprilTagMinWhiteBlackDiff() int {
return int(C.ArucoDetectorParameters_GetAprilTagMinWhiteBlackDiff(ap.p))
}
func (ap *ArucoDetectorParameters) SetAprilTagDeglitch(aprilTagDeglitch int) {
C.ArucoDetectorParameters_SetAprilTagDeglitch(ap.p, C.int(aprilTagDeglitch))
}
func (ap *ArucoDetectorParameters) GetAprilTagDeglitch() int {
return int(C.ArucoDetectorParameters_GetAprilTagDeglitch(ap.p))
}
func (ap *ArucoDetectorParameters) SetDetectInvertedMarker(detectInvertedMarker bool) {
C.ArucoDetectorParameters_SetDetectInvertedMarker(ap.p, C.bool(detectInvertedMarker))
}
func (ap *ArucoDetectorParameters) GetDetectInvertedMarker() bool {
return bool(C.ArucoDetectorParameters_GetDetectInvertedMarker(ap.p))
}
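
A minimal usage sketch of the detector API above; the image path, dictionary choice and border colour are illustrative assumptions, not taken from this commit:

package main

import "gocv.io/x/gocv"

func main() {
	// Any BGR image containing ArUco markers; the path is a placeholder.
	frame := gocv.IMRead("markers.jpg", gocv.IMReadColor)
	defer frame.Close()

	params := gocv.NewArucoDetectorParameters()
	dict := gocv.GetPredefinedDictionary(gocv.ArucoDict6x6_250)
	detector := gocv.NewArucoDetectorWithParams(dict, params)
	defer detector.Close()

	corners, ids, _ := detector.DetectMarkers(frame)
	if len(ids) > 0 {
		// Outline the accepted markers in green and save the result.
		gocv.ArucoDrawDetectedMarkers(frame, corners, ids, gocv.NewScalar(0, 255, 0, 0))
	}
	gocv.IMWrite("markers_annotated.jpg", frame)
}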

96
vendor/gocv.io/x/gocv/aruco.h generated vendored Normal file

@@ -0,0 +1,96 @@
#ifndef _OPENCV3_ARUCO_H_
#define _OPENCV3_ARUCO_H_
#ifdef __cplusplus
#include <opencv2/opencv.hpp>
extern "C" {
#endif
#include "core.h"
#ifdef __cplusplus
typedef cv::aruco::Dictionary* ArucoDictionary;
typedef cv::aruco::DetectorParameters* ArucoDetectorParameters;
typedef cv::aruco::ArucoDetector* ArucoDetector;
#else
typedef void *ArucoDictionary;
typedef void *ArucoDetectorParameters;
typedef void *ArucoDetector;
#endif
ArucoDetectorParameters ArucoDetectorParameters_Create();
void ArucoDetectorParameters_SetAdaptiveThreshWinSizeMin(ArucoDetectorParameters ap, int adaptiveThreshWinSizeMin);
int ArucoDetectorParameters_GetAdaptiveThreshWinSizeMin(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetAdaptiveThreshWinSizeMax(ArucoDetectorParameters ap, int adaptiveThreshWinSizeMax);
int ArucoDetectorParameters_GetAdaptiveThreshWinSizeMax(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetAdaptiveThreshWinSizeStep(ArucoDetectorParameters ap, int adaptiveThreshWinSizeStep);
int ArucoDetectorParameters_GetAdaptiveThreshWinSizeStep(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetAdaptiveThreshConstant(ArucoDetectorParameters ap, double adaptiveThreshConstant);
double ArucoDetectorParameters_GetAdaptiveThreshConstant(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetMinMarkerPerimeterRate(ArucoDetectorParameters ap, double minMarkerPerimeterRate);
double ArucoDetectorParameters_GetMinMarkerPerimeterRate(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetMaxMarkerPerimeterRate(ArucoDetectorParameters ap, double maxMarkerPerimeterRate);
double ArucoDetectorParameters_GetMaxMarkerPerimeterRate(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetPolygonalApproxAccuracyRate(ArucoDetectorParameters ap, double polygonalApproxAccuracyRate);
double ArucoDetectorParameters_GetPolygonalApproxAccuracyRate(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetMinCornerDistanceRate(ArucoDetectorParameters ap, double minCornerDistanceRate);
double ArucoDetectorParameters_GetMinCornerDistanceRate(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetMinDistanceToBorder(ArucoDetectorParameters ap, int minDistanceToBorder);
int ArucoDetectorParameters_GetMinDistanceToBorder(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetMinMarkerDistanceRate(ArucoDetectorParameters ap, double minMarkerDistanceRate);
double ArucoDetectorParameters_GetMinMarkerDistanceRate(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetCornerRefinementMethod(ArucoDetectorParameters ap, int cornerRefinementMethod);
int ArucoDetectorParameters_GetCornerRefinementMethod(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetCornerRefinementWinSize(ArucoDetectorParameters ap, int cornerRefinementWinSize);
int ArucoDetectorParameters_GetCornerRefinementWinSize(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetCornerRefinementMaxIterations(ArucoDetectorParameters ap, int cornerRefinementMaxIterations);
int ArucoDetectorParameters_GetCornerRefinementMaxIterations(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetCornerRefinementMinAccuracy(ArucoDetectorParameters ap, double cornerRefinementMinAccuracy);
double ArucoDetectorParameters_GetCornerRefinementMinAccuracy(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetMarkerBorderBits(ArucoDetectorParameters ap, int markerBorderBits);
int ArucoDetectorParameters_GetMarkerBorderBits(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetPerspectiveRemovePixelPerCell(ArucoDetectorParameters ap, int perspectiveRemovePixelPerCell);
int ArucoDetectorParameters_GetPerspectiveRemovePixelPerCell(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetPerspectiveRemoveIgnoredMarginPerCell(ArucoDetectorParameters ap, double perspectiveRemoveIgnoredMarginPerCell);
double ArucoDetectorParameters_GetPerspectiveRemoveIgnoredMarginPerCell(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetMaxErroneousBitsInBorderRate(ArucoDetectorParameters ap, double maxErroneousBitsInBorderRate);
double ArucoDetectorParameters_GetMaxErroneousBitsInBorderRate(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetMinOtsuStdDev(ArucoDetectorParameters ap, double minOtsuStdDev);
double ArucoDetectorParameters_GetMinOtsuStdDev(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetErrorCorrectionRate(ArucoDetectorParameters ap, double errorCorrectionRate);
double ArucoDetectorParameters_GetErrorCorrectionRate(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetAprilTagQuadDecimate(ArucoDetectorParameters ap, float aprilTagQuadDecimate);
float ArucoDetectorParameters_GetAprilTagQuadDecimate(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetAprilTagQuadSigma(ArucoDetectorParameters ap, float aprilTagQuadSigma);
float ArucoDetectorParameters_GetAprilTagQuadSigma(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetAprilTagMinClusterPixels(ArucoDetectorParameters ap, int aprilTagMinClusterPixels);
int ArucoDetectorParameters_GetAprilTagMinClusterPixels(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetAprilTagMaxNmaxima(ArucoDetectorParameters ap, int aprilTagMaxNmaxima);
int ArucoDetectorParameters_GetAprilTagMaxNmaxima(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetAprilTagCriticalRad(ArucoDetectorParameters ap, float aprilTagCriticalRad);
float ArucoDetectorParameters_GetAprilTagCriticalRad(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetAprilTagMaxLineFitMse(ArucoDetectorParameters ap, float aprilTagMaxLineFitMse);
float ArucoDetectorParameters_GetAprilTagMaxLineFitMse(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetAprilTagMinWhiteBlackDiff(ArucoDetectorParameters ap, int aprilTagMinWhiteBlackDiff);
int ArucoDetectorParameters_GetAprilTagMinWhiteBlackDiff(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetAprilTagDeglitch(ArucoDetectorParameters ap, int aprilTagDeglitch);
int ArucoDetectorParameters_GetAprilTagDeglitch(ArucoDetectorParameters ap);
void ArucoDetectorParameters_SetDetectInvertedMarker(ArucoDetectorParameters ap, bool detectInvertedMarker);
bool ArucoDetectorParameters_GetDetectInvertedMarker(ArucoDetectorParameters ap);
ArucoDictionary getPredefinedDictionary(int dictionaryId);
ArucoDetector ArucoDetector_New();
ArucoDetector ArucoDetector_NewWithParams(ArucoDictionary dictionary, ArucoDetectorParameters params);
void ArucoDetector_Close(ArucoDetector ad);
void ArucoDetector_DetectMarkers(ArucoDetector ad, Mat inputArr, Points2fVector markerCorners, IntVector *markerIds, Points2fVector rejectedCandidates);
void ArucoDrawDetectedMarkers(Mat image, Points2fVector markerCorners, IntVector markerIds, Scalar borderColor);
void ArucoGenerateImageMarker(int dictionaryId, int id, int sidePixels, Mat img, int borderBits);
#ifdef __cplusplus
}
#endif
#endif //_OPENCV3_ARUCO_H_

43
vendor/gocv.io/x/gocv/aruco_dictionaries.go generated vendored Normal file

@@ -0,0 +1,43 @@
package gocv
/*
#include <stdlib.h>
#include "aruco.h"
#include "core.h"
*/
import "C"
type ArucoDictionaryCode int
const (
ArucoDict4x4_50 ArucoDictionaryCode = iota
ArucoDict4x4_100 ArucoDictionaryCode = iota
ArucoDict4x4_250 ArucoDictionaryCode = iota
ArucoDict4x4_1000 ArucoDictionaryCode = iota
ArucoDict5x5_50 ArucoDictionaryCode = iota
ArucoDict5x5_100 ArucoDictionaryCode = iota
ArucoDict5x5_250 ArucoDictionaryCode = iota
ArucoDict5x5_1000 ArucoDictionaryCode = iota
ArucoDict6x6_50 ArucoDictionaryCode = iota
ArucoDict6x6_100 ArucoDictionaryCode = iota
ArucoDict6x6_250 ArucoDictionaryCode = iota
ArucoDict6x6_1000 ArucoDictionaryCode = iota
ArucoDict7x7_50 ArucoDictionaryCode = iota
ArucoDict7x7_100 ArucoDictionaryCode = iota
ArucoDict7x7_250 ArucoDictionaryCode = iota
ArucoDict7x7_1000 ArucoDictionaryCode = iota
ArucoDictArucoOriginal ArucoDictionaryCode = iota
ArucoDictAprilTag_16h5 ArucoDictionaryCode = iota ///< 4x4 bits, minimum hamming distance between any two codes = 5, 30 codes
ArucoDictAprilTag_25h9 ArucoDictionaryCode = iota ///< 5x5 bits, minimum hamming distance between any two codes = 9, 35 codes
ArucoDictAprilTag_36h10 ArucoDictionaryCode = iota ///< 6x6 bits, minimum hamming distance between any two codes = 10, 2320 codes
ArucoDictAprilTag_36h11 ArucoDictionaryCode = iota ///< 6x6 bits, minimum hamming distance between any two codes = 11, 587 codes
)
type ArucoDictionary struct {
p C.ArucoDictionary
}
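// GetPredefinedDictionary returns one of OpenCV's predefined ArUco dictionaries,
// selected by its ArucoDictionaryCode.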
func GetPredefinedDictionary(dictionaryId ArucoDictionaryCode) ArucoDictionary {
var p C.ArucoDictionary = C.getPredefinedDictionary(C.int(dictionaryId))
return ArucoDictionary{p: p}
}
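
Paired with ArucoGenerateImageMarker from aruco.go, the dictionary codes above can be used to render a marker image; a small sketch (marker id, size and output path are illustrative):

package main

import "gocv.io/x/gocv"

func main() {
	// Render marker id 23 of the predefined 6x6_250 dictionary as a
	// 200x200 pixel image with a one-module black border.
	marker := gocv.NewMat()
	defer marker.Close()
	gocv.ArucoGenerateImageMarker(gocv.ArucoDict6x6_250, 23, 200, marker, 1)
	gocv.IMWrite("marker_23.png", marker)
}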

View File

@@ -1,3 +1,4 @@
//go:build openvino
// +build openvino
package gocv

33
vendor/gocv.io/x/gocv/calib3d.cpp generated vendored

@@ -1,5 +1,13 @@
#include "calib3d.h"
double Fisheye_Calibrate(Points3fVector objectPoints, Points2fVector imagePoints, Size size, Mat k, Mat d, Mat rvecs, Mat tvecs, int flags) {
cv::Size sz(size.width, size.height);
return cv::fisheye::calibrate(*objectPoints, *imagePoints, sz, *k, *d, *rvecs, *tvecs, flags);
}
void Fisheye_DistortPoints(Mat undistorted, Mat distorted, Mat k, Mat d) {
cv::fisheye::distortPoints(*undistorted, *distorted, *k, *d);
}
void Fisheye_UndistortImage(Mat distorted, Mat undistorted, Mat k, Mat d) {
cv::fisheye::undistortImage(*distorted, *undistorted, *k, *d);
@@ -49,6 +57,11 @@ void UndistortPoints(Mat distorted, Mat undistorted, Mat k, Mat d, Mat r, Mat p)
cv::undistortPoints(*distorted, *undistorted, *k, *d, *r, *p);
}
bool CheckChessboard(Mat image, Size size) {
cv::Size sz(size.width, size.height);
return cv::checkChessboard(*image, sz);
}
bool FindChessboardCorners(Mat image, Size patternSize, Mat corners, int flags) {
cv::Size sz(patternSize.width, patternSize.height);
return cv::findChessboardCorners(*image, sz, *corners, flags);
@@ -73,6 +86,10 @@ Mat EstimateAffinePartial2D(Point2fVector from, Point2fVector to) {
return new cv::Mat(cv::estimateAffinePartial2D(*from, *to));
}
Mat EstimateAffinePartial2DWithParams(Point2fVector from, Point2fVector to, Mat inliers, int method, double ransacReprojThreshold, size_t maxIters, double confidence, size_t refineIters) {
return new cv::Mat(cv::estimateAffinePartial2D(*from, *to, *inliers, method, ransacReprojThreshold, maxIters, confidence, refineIters));
}
Mat EstimateAffine2D(Point2fVector from, Point2fVector to) {
return new cv::Mat(cv::estimateAffine2D(*from, *to));
}
@@ -80,3 +97,19 @@ Mat EstimateAffine2D(Point2fVector from, Point2fVector to) {
Mat EstimateAffine2DWithParams(Point2fVector from, Point2fVector to, Mat inliers, int method, double ransacReprojThreshold, size_t maxIters, double confidence, size_t refineIters) {
return new cv::Mat(cv::estimateAffine2D(*from, *to, *inliers, method, ransacReprojThreshold, maxIters, confidence, refineIters));
}
void TriangulatePoints(Mat projMatr1, Mat projMatr2, Point2fVector projPoints1, Point2fVector projPoints2, Mat points4D) {
return cv::triangulatePoints(*projMatr1, *projMatr2, *projPoints1, *projPoints2, *points4D);
}
void ConvertPointsFromHomogeneous(Mat src, Mat dst) {
return cv::convertPointsFromHomogeneous(*src, *dst);
}
void Rodrigues(Mat src, Mat dst) {
cv::Rodrigues(*src, *dst);
}
bool SolvePnP(Point3fVector objectPoints, Point2fVector imagePoints, Mat cameraMatrix, Mat distCoeffs, Mat rvec, Mat tvec, bool useExtrinsicGuess, int flags) {
return cv::solvePnP(*objectPoints, *imagePoints, *cameraMatrix, *distCoeffs, *rvec, *tvec, useExtrinsicGuess, flags);
}
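
The Go wrappers for these shims follow in calib3d.go; a hedged sketch of the chessboard helpers (file names and pattern size are illustrative assumptions, and a flags value of 0 simply takes the defaults):

package main

import (
	"image"

	"gocv.io/x/gocv"
)

func main() {
	frame := gocv.IMRead("board.jpg", gocv.IMReadColor)
	defer frame.Close()

	// cv::checkChessboard works on 8-bit single-channel images, so convert first.
	gray := gocv.NewMat()
	defer gray.Close()
	gocv.CvtColor(frame, &gray, gocv.ColorBGRToGray)

	// Inner-corner grid of a standard 10x7 square calibration board.
	pattern := image.Pt(9, 6)
	if !gocv.CheckChessboard(gray, pattern) {
		return
	}

	corners := gocv.NewMat()
	defer corners.Close()
	found := gocv.FindChessboardCorners(gray, pattern, &corners, 0)
	gocv.DrawChessboardCorners(&frame, pattern, corners, found)
	gocv.IMWrite("board_corners.jpg", frame)
}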

87
vendor/gocv.io/x/gocv/calib3d.go generated vendored

@@ -55,6 +55,27 @@ const (
CalibFixPrincipalPoint
)
// FisheyeCalibrate performs camera calibration.
//
// For further details, please see:
// https://docs.opencv.org/4.x/db/d58/group__calib3d__fisheye.html#gad626a78de2b1dae7489e152a5a5a89e1
func FisheyeCalibrate(objectPoints Points3fVector, imagePoints Points2fVector, size image.Point, k, d, rvecs, tvecs *Mat, flags CalibFlag) float64 {
sz := C.struct_Size{
width: C.int(size.X),
height: C.int(size.Y),
}
return float64(C.Fisheye_Calibrate(objectPoints.p, imagePoints.p, sz, k.p, d.p, rvecs.p, tvecs.p, C.int(flags)))
}
// FisheyeDistortPoints distorts 2D points using fisheye model.
//
// For further details, please see:
// https://docs.opencv.org/master/db/d58/group__calib3d__fisheye.html#gab738cdf90ceee97b2b52b0d0e7511541
func FisheyeDistortPoints(undistorted Mat, distorted *Mat, k, d Mat) {
C.Fisheye_DistortPoints(undistorted.Ptr(), distorted.Ptr(), k.Ptr(), d.Ptr())
}
// FisheyeUndistortImage transforms an image to compensate for fisheye lens distortion
func FisheyeUndistortImage(distorted Mat, undistorted *Mat, k, d Mat) {
C.Fisheye_UndistortImage(distorted.Ptr(), undistorted.Ptr(), k.Ptr(), d.Ptr())
@@ -97,7 +118,6 @@ func EstimateNewCameraMatrixForUndistortRectify(k, d Mat, imgSize image.Point, r
//
// For further details, please see:
// https://docs.opencv.org/master/d9/d0c/group__calib3d.html#ga7dfb72c9cf9780a347fbe3d1c47e5d5a
//
func InitUndistortRectifyMap(cameraMatrix Mat, distCoeffs Mat, r Mat, newCameraMatrix Mat, size image.Point, m1type int, map1 Mat, map2 Mat) {
sz := C.struct_Size{
width: C.int(size.X),
@@ -110,7 +130,6 @@ func InitUndistortRectifyMap(cameraMatrix Mat, distCoeffs Mat, r Mat, newCameraM
//
// For further details, please see:
// https://docs.opencv.org/master/d9/d0c/group__calib3d.html#ga7a6c4e032c97f03ba747966e6ad862b1
//
func GetOptimalNewCameraMatrixWithParams(cameraMatrix Mat, distCoeffs Mat, imageSize image.Point, alpha float64, newImgSize image.Point, centerPrincipalPoint bool) (Mat, image.Rectangle) {
sz := C.struct_Size{
width: C.int(imageSize.X),
@@ -128,7 +147,6 @@ func GetOptimalNewCameraMatrixWithParams(cameraMatrix Mat, distCoeffs Mat, image
//
// For further details, please see:
// https://docs.opencv.org/master/d9/d0c/group__calib3d.html#ga3207604e4b1a1758aa66acb6ed5aa65d
//
func CalibrateCamera(objectPoints Points3fVector, imagePoints Points2fVector, imageSize image.Point,
cameraMatrix *Mat, distCoeffs *Mat, rvecs *Mat, tvecs *Mat, calibFlag CalibFlag) float64 {
sz := C.struct_Size{
@@ -140,6 +158,10 @@ func CalibrateCamera(objectPoints Points3fVector, imagePoints Points2fVector, im
return float64(res)
}
// Undistort transforms an image to compensate for lens distortion.
//
// For further details, please see:
// https://docs.opencv.org/4.x/d9/d0c/group__calib3d.html#ga69f2545a8b62a6b0fc2ee060dc30559d
func Undistort(src Mat, dst *Mat, cameraMatrix Mat, distCoeffs Mat, newCameraMatrix Mat) {
C.Undistort(src.Ptr(), dst.Ptr(), cameraMatrix.Ptr(), distCoeffs.Ptr(), newCameraMatrix.Ptr())
}
@@ -152,6 +174,18 @@ func UndistortPoints(src Mat, dst *Mat, cameraMatrix, distCoeffs, rectificationT
C.UndistortPoints(src.Ptr(), dst.Ptr(), cameraMatrix.Ptr(), distCoeffs.Ptr(), rectificationTransform.Ptr(), newCameraMatrix.Ptr())
}
// CheckChessboard determines whether a chessboard pattern of the given size can be found in the image.
//
// For further details, please see:
// https://docs.opencv.org/master/d9/d0c/group__calib3d.html#ga6a10b0bb120c4907e5eabbcd22319022
func CheckChessboard(image Mat, size image.Point) bool {
sz := C.struct_Size{
width: C.int(size.X),
height: C.int(size.Y),
}
return bool(C.CheckChessboard(image.Ptr(), sz))
}
// CalibCBFlag value for chessboard calibration
// For more details, please see:
// https://docs.opencv.org/master/d9/d0c/group__calib3d.html#ga93efa9b0aa890de240ca32b11253dd4a
@@ -181,7 +215,6 @@ const (
//
// For further details, please see:
// https://docs.opencv.org/master/d9/d0c/group__calib3d.html#ga93efa9b0aa890de240ca32b11253dd4a
//
func FindChessboardCorners(image Mat, patternSize image.Point, corners *Mat, flags CalibCBFlag) bool {
sz := C.struct_Size{
width: C.int(patternSize.X),
@@ -194,7 +227,6 @@ func FindChessboardCorners(image Mat, patternSize image.Point, corners *Mat, fla
//
// For further details, please see:
// https://docs.opencv.org/master/d9/d0c/group__calib3d.html#gadc5bcb05cb21cf1e50963df26986d7c9
//
func FindChessboardCornersSB(image Mat, patternSize image.Point, corners *Mat, flags CalibCBFlag) bool {
sz := C.struct_Size{
width: C.int(patternSize.X),
@@ -207,7 +239,6 @@ func FindChessboardCornersSB(image Mat, patternSize image.Point, corners *Mat, f
//
// For further details, please see:
// https://docs.opencv.org/master/d9/d0c/group__calib3d.html#ga93efa9b0aa890de240ca32b11253dd4a
//
func FindChessboardCornersSBWithMeta(image Mat, patternSize image.Point, corners *Mat, flags CalibCBFlag, meta *Mat) bool {
sz := C.struct_Size{
width: C.int(patternSize.X),
@@ -220,7 +251,6 @@ func FindChessboardCornersSBWithMeta(image Mat, patternSize image.Point, corners
//
// For further details, please see:
// https://docs.opencv.org/master/d9/d0c/group__calib3d.html#ga6a10b0bb120c4907e5eabbcd22319022
//
func DrawChessboardCorners(image *Mat, patternSize image.Point, corners Mat, patternWasFound bool) {
sz := C.struct_Size{
width: C.int(patternSize.X),
@@ -238,6 +268,16 @@ func EstimateAffinePartial2D(from, to Point2fVector) Mat {
return newMat(C.EstimateAffinePartial2D(from.p, to.p))
}
// EstimateAffinePartial2DWithParams computes an optimal limited affine transformation
// with 4 degrees of freedom between two 2D point sets
// with additional optional parameters.
//
// For further details, please see:
// https://docs.opencv.org/master/d9/d0c/group__calib3d.html#gad767faff73e9cbd8b9d92b955b50062d
func EstimateAffinePartial2DWithParams(from Point2fVector, to Point2fVector, inliers Mat, method int, ransacReprojThreshold float64, maxIters uint, confidence float64, refineIters uint) Mat {
return newMat(C.EstimateAffinePartial2DWithParams(from.p, to.p, inliers.p, C.int(method), C.double(ransacReprojThreshold), C.size_t(maxIters), C.double(confidence), C.size_t(refineIters)))
}
// EstimateAffine2D Computes an optimal affine transformation between two 2D point sets.
//
// For further details, please see:
@@ -254,3 +294,36 @@ func EstimateAffine2D(from, to Point2fVector) Mat {
func EstimateAffine2DWithParams(from Point2fVector, to Point2fVector, inliers Mat, method int, ransacReprojThreshold float64, maxIters uint, confidence float64, refineIters uint) Mat {
return newMat(C.EstimateAffine2DWithParams(from.p, to.p, inliers.p, C.int(method), C.double(ransacReprojThreshold), C.size_t(maxIters), C.double(confidence), C.size_t(refineIters)))
}
// TriangulatePoints reconstructs 3-dimensional points (in homogeneous coordinates)
// by using their observations with a stereo camera.
//
// For further details, please see:
// https://docs.opencv.org/4.x/d9/d0c/group__calib3d.html#gad3fc9a0c82b08df034234979960b778c
func TriangulatePoints(projMatr1, projMatr2 Mat, projPoints1, projPoints2 Point2fVector, points4D *Mat) {
C.TriangulatePoints(projMatr1.Ptr(), projMatr2.Ptr(), projPoints1.p, projPoints2.p, points4D.Ptr())
}
// ConvertPointsFromHomogeneous converts points from homogeneous to Euclidean space.
//
// For further details, please see:
// https://docs.opencv.org/4.x/d9/d0c/group__calib3d.html#gac42edda3a3a0f717979589fcd6ac0035
func ConvertPointsFromHomogeneous(src Mat, dst *Mat) {
C.ConvertPointsFromHomogeneous(src.Ptr(), dst.Ptr())
}
// Rodrigues converts a rotation matrix to a rotation vector or vice versa.
//
// For further details, please see:
// https://docs.opencv.org/4.0.0/d9/d0c/group__calib3d.html#ga61585db663d9da06b68e70cfbf6a1eac
func Rodrigues(src Mat, dst *Mat) {
C.Rodrigues(src.p, dst.p)
}
// SolvePnP finds an object pose from 3D-2D point correspondences.
//
// For further details, please see:
// https://docs.opencv.org/4.0.0/d9/d0c/group__calib3d.html#ga549c2075fac14829ff4a58bc931c033d
func SolvePnP(objectPoints Point3fVector, imagePoints Point2fVector, cameraMatrix, distCoeffs Mat, rvec, tvec *Mat, useExtrinsicGuess bool, flags int) bool {
return bool(C.SolvePnP(objectPoints.p, imagePoints.p, cameraMatrix.p, distCoeffs.p, rvec.p, tvec.p, C.bool(useExtrinsicGuess), C.int(flags)))
}
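
A hedged pose-estimation sketch on top of the new SolvePnP and Rodrigues bindings; the correspondences and camera intrinsics are made-up illustrative values, and the Point2fVector/Point3fVector constructors are assumed to be the existing gocv ones:

package main

import "gocv.io/x/gocv"

func main() {
	// Four corners of a unit square on the Z=0 plane and where they appear in the image.
	obj := gocv.NewPoint3fVectorFromPoints([]gocv.Point3f{
		{X: 0, Y: 0, Z: 0}, {X: 1, Y: 0, Z: 0}, {X: 1, Y: 1, Z: 0}, {X: 0, Y: 1, Z: 0},
	})
	defer obj.Close()
	img := gocv.NewPoint2fVectorFromPoints([]gocv.Point2f{
		{X: 320, Y: 240}, {X: 420, Y: 240}, {X: 420, Y: 340}, {X: 320, Y: 340},
	})
	defer img.Close()

	// Simple pinhole intrinsics; an empty Mat stands in for zero lens distortion.
	k := gocv.NewMatWithSize(3, 3, gocv.MatTypeCV64F)
	defer k.Close()
	k.SetDoubleAt(0, 0, 800)
	k.SetDoubleAt(1, 1, 800)
	k.SetDoubleAt(0, 2, 320)
	k.SetDoubleAt(1, 2, 240)
	k.SetDoubleAt(2, 2, 1)
	dist := gocv.NewMat()
	defer dist.Close()

	rvec, tvec := gocv.NewMat(), gocv.NewMat()
	defer rvec.Close()
	defer tvec.Close()
	if gocv.SolvePnP(obj, img, k, dist, &rvec, &tvec, false, 0) {
		// Expand the Rodrigues rotation vector into a full 3x3 rotation matrix.
		rmat := gocv.NewMat()
		defer rmat.Close()
		gocv.Rodrigues(rvec, &rmat)
	}
}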

8
vendor/gocv.io/x/gocv/calib3d.h generated vendored

@@ -12,6 +12,8 @@ extern "C" {
#include "core.h"
//Calib
double Fisheye_Calibrate(Points3fVector objectPoints, Points2fVector imagePoints, Size size, Mat k, Mat d, Mat rvecs, Mat tvecs, int flags);
void Fisheye_DistortPoints(Mat undistorted, Mat distorted, Mat k, Mat d);
void Fisheye_UndistortImage(Mat distorted, Mat undistorted, Mat k, Mat d);
void Fisheye_UndistortImageWithParams(Mat distorted, Mat undistorted, Mat k, Mat d, Mat knew, Size size);
void Fisheye_UndistortPoints(Mat distorted, Mat undistorted, Mat k, Mat d, Mat R, Mat P);
@@ -22,13 +24,19 @@ Mat GetOptimalNewCameraMatrixWithParams(Mat cameraMatrix,Mat distCoeffs,Size siz
double CalibrateCamera(Points3fVector objectPoints, Points2fVector imagePoints, Size imageSize, Mat cameraMatrix, Mat distCoeffs, Mat rvecs, Mat tvecs, int flag);
void Undistort(Mat src, Mat dst, Mat cameraMatrix, Mat distCoeffs, Mat newCameraMatrix);
void UndistortPoints(Mat distorted, Mat undistorted, Mat k, Mat d, Mat r, Mat p);
bool CheckChessboard(Mat image, Size sz);
bool FindChessboardCorners(Mat image, Size patternSize, Mat corners, int flags);
bool FindChessboardCornersSB(Mat image, Size patternSize, Mat corners, int flags);
bool FindChessboardCornersSBWithMeta(Mat image, Size patternSize, Mat corners, int flags, Mat meta);
void DrawChessboardCorners(Mat image, Size patternSize, Mat corners, bool patternWasFound);
Mat EstimateAffinePartial2D(Point2fVector from, Point2fVector to);
Mat EstimateAffinePartial2DWithParams(Point2fVector from, Point2fVector to, Mat inliers, int method, double ransacReprojThreshold, size_t maxIters, double confidence, size_t refineIters);
Mat EstimateAffine2D(Point2fVector from, Point2fVector to);
Mat EstimateAffine2DWithParams(Point2fVector from, Point2fVector to, Mat inliers, int method, double ransacReprojThreshold, size_t maxIters, double confidence, size_t refineIters);
void TriangulatePoints(Mat projMatr1, Mat projMatr2, Point2fVector projPoints1, Point2fVector projPoints2, Mat points4D);
void ConvertPointsFromHomogeneous(Mat src, Mat dst);
void Rodrigues(Mat src, Mat dst);
bool SolvePnP(Point3fVector objectPoints, Point2fVector imagePoints, Mat cameraMatrix, Mat distCoeffs, Mat rvec, Mat tvec, bool useExtrinsicGuess, int flags);
#ifdef __cplusplus
}
#endif

2
vendor/gocv.io/x/gocv/cgo.go generated vendored

@@ -9,6 +9,6 @@ package gocv
#cgo !windows pkg-config: opencv4
#cgo CXXFLAGS: --std=c++11
#cgo windows CPPFLAGS: -IC:/opencv/build/install/include
#cgo windows LDFLAGS: -LC:/opencv/build/install/x64/mingw/lib -lopencv_core460 -lopencv_face460 -lopencv_videoio460 -lopencv_imgproc460 -lopencv_highgui460 -lopencv_imgcodecs460 -lopencv_objdetect460 -lopencv_features2d460 -lopencv_video460 -lopencv_dnn460 -lopencv_xfeatures2d460 -lopencv_plot460 -lopencv_tracking460 -lopencv_img_hash460 -lopencv_calib3d460 -lopencv_bgsegm460 -lopencv_photo460 -lopencv_aruco460 -lopencv_wechat_qrcode460 -lopencv_ximgproc460
#cgo windows LDFLAGS: -LC:/opencv/build/install/x64/mingw/lib -lopencv_core490 -lopencv_face490 -lopencv_videoio490 -lopencv_imgproc490 -lopencv_highgui490 -lopencv_imgcodecs490 -lopencv_objdetect490 -lopencv_features2d490 -lopencv_video490 -lopencv_dnn490 -lopencv_xfeatures2d490 -lopencv_plot490 -lopencv_tracking490 -lopencv_img_hash490 -lopencv_calib3d490 -lopencv_bgsegm490 -lopencv_photo490 -lopencv_aruco490 -lopencv_wechat_qrcode490 -lopencv_ximgproc490
*/
import "C"

View File

@@ -10,6 +10,6 @@ package gocv
#cgo !windows CPPFLAGS: -I/usr/local/include -I/usr/local/include/opencv4
#cgo !windows LDFLAGS: -L/usr/local/lib -L/usr/local/lib/opencv4/3rdparty -lopencv_gapi -lopencv_stitching -lopencv_aruco -lopencv_bgsegm -lopencv_bioinspired -lopencv_ccalib -lopencv_dnn_objdetect -lopencv_dpm -lopencv_face -lopencv_fuzzy -lopencv_hfs -lopencv_img_hash -lopencv_line_descriptor -lopencv_quality -lopencv_reg -lopencv_rgbd -lopencv_saliency -lopencv_stereo -lopencv_structured_light -lopencv_phase_unwrapping -lopencv_superres -lopencv_optflow -lopencv_surface_matching -lopencv_tracking -lopencv_datasets -lopencv_text -lopencv_highgui -lopencv_dnn -lopencv_plot -lopencv_videostab -lopencv_video -lopencv_videoio -lopencv_xfeatures2d -lopencv_shape -lopencv_ml -lopencv_ximgproc -lopencv_xobjdetect -lopencv_objdetect -lopencv_calib3d -lopencv_imgcodecs -lopencv_features2d -lopencv_flann -lopencv_xphoto -lopencv_wechat_qrcode -lopencv_photo -lopencv_imgproc -lopencv_core -littnotify -llibprotobuf -lIlmImf -lquirc -lippiw -lippicv -lade -lz -ljpeg -ldl -lm -lpthread -lrt -lquadmath
#cgo windows CPPFLAGS: -IC:/opencv/build/install/include
#cgo windows LDFLAGS: -LC:/opencv/build/install/x64/mingw/staticlib -lopencv_stereo460 -lopencv_tracking460 -lopencv_superres460 -lopencv_stitching460 -lopencv_optflow460 -lopencv_gapi460 -lopencv_face460 -lopencv_dpm460 -lopencv_dnn_objdetect460 -lopencv_ccalib460 -lopencv_bioinspired460 -lopencv_bgsegm460 -lopencv_aruco460 -lopencv_xobjdetect460 -lopencv_ximgproc460 -lopencv_xfeatures2d460 -lopencv_videostab460 -lopencv_video460 -lopencv_structured_light460 -lopencv_shape460 -lopencv_rgbd460 -lopencv_rapid460 -lopencv_objdetect460 -lopencv_mcc460 -lopencv_highgui460 -lopencv_datasets460 -lopencv_calib3d460 -lopencv_videoio460 -lopencv_text460 -lopencv_line_descriptor460 -lopencv_imgcodecs460 -lopencv_img_hash460 -lopencv_hfs460 -lopencv_fuzzy460 -lopencv_features2d460 -lopencv_dnn_superres460 -lopencv_dnn460 -lopencv_xphoto460 -lopencv_wechat_qrcode460 -lopencv_surface_matching460 -lopencv_reg460 -lopencv_quality460 -lopencv_plot460 -lopencv_photo460 -lopencv_phase_unwrapping460 -lopencv_ml460 -lopencv_intensity_transform460 -lopencv_imgproc460 -lopencv_flann460 -lopencv_core460 -lade -lquirc -llibprotobuf -lIlmImf -llibpng -llibopenjp2 -llibwebp -llibtiff -llibjpeg-turbo -lzlib -lkernel32 -lgdi32 -lwinspool -lshell32 -lole32 -loleaut32 -luuid -lcomdlg32 -ladvapi32 -luser32
#cgo windows LDFLAGS: -LC:/opencv/build/install/x64/mingw/staticlib -lopencv_stereo490 -lopencv_tracking490 -lopencv_superres490 -lopencv_stitching490 -lopencv_optflow490 -lopencv_gapi490 -lopencv_face490 -lopencv_dpm490 -lopencv_dnn_objdetect490 -lopencv_ccalib490 -lopencv_bioinspired490 -lopencv_bgsegm490 -lopencv_aruco490 -lopencv_xobjdetect490 -lopencv_ximgproc490 -lopencv_xfeatures2d490 -lopencv_videostab490 -lopencv_video490 -lopencv_structured_light490 -lopencv_shape490 -lopencv_rgbd490 -lopencv_rapid490 -lopencv_objdetect490 -lopencv_mcc490 -lopencv_highgui490 -lopencv_datasets490 -lopencv_calib3d490 -lopencv_videoio490 -lopencv_text490 -lopencv_line_descriptor490 -lopencv_imgcodecs490 -lopencv_img_hash490 -lopencv_hfs490 -lopencv_fuzzy490 -lopencv_features2d490 -lopencv_dnn_superres490 -lopencv_dnn490 -lopencv_xphoto490 -lopencv_wechat_qrcode490 -lopencv_surface_matching490 -lopencv_reg490 -lopencv_quality490 -lopencv_plot490 -lopencv_photo490 -lopencv_phase_unwrapping490 -lopencv_ml490 -lopencv_intensity_transform490 -lopencv_imgproc490 -lopencv_flann490 -lopencv_core490 -lade -lquirc -llibprotobuf -lIlmImf -llibpng -llibopenjp2 -llibwebp -llibtiff -llibjpeg-turbo -lzlib -lkernel32 -lgdi32 -lwinspool -lshell32 -lole32 -loleaut32 -luuid -lcomdlg32 -ladvapi32 -luser32
*/
import "C"

81
vendor/gocv.io/x/gocv/core.cpp generated vendored

@@ -95,6 +95,18 @@ bool Mat_IsContinuous(Mat m) {
return m->isContinuous();
}
void Mat_Inv(Mat m) {
// assign the result back so the inversion actually modifies the Mat in place
*m = m->inv();
}
Mat Mat_Col(Mat m, int c) {
return new cv::Mat(m->col(c));
}
Mat Mat_Row(Mat m, int r) {
return new cv::Mat(m->row(r));
}
// Mat_Clone returns a clone of this Mat
Mat Mat_Clone(Mat m) {
return new cv::Mat(m->clone());
@@ -504,6 +516,30 @@ void Mat_EigenNonSymmetric(Mat src, Mat eigenvalues, Mat eigenvectors) {
cv::eigenNonSymmetric(*src, *eigenvalues, *eigenvectors);
}
void Mat_PCABackProject(Mat data, Mat mean, Mat eigenvectors, Mat result) {
cv::PCABackProject(*data, *mean, *eigenvectors, *result);
}
void Mat_PCACompute(Mat src, Mat mean, Mat eigenvectors, Mat eigenvalues, int maxComponents) {
cv::PCACompute(*src, *mean, *eigenvectors, *eigenvalues, maxComponents);
}
void Mat_PCAProject(Mat data, Mat mean, Mat eigenvectors, Mat result) {
cv::PCAProject(*data, *mean, *eigenvectors, *result);
}
double PSNR(Mat src1, Mat src2) {
return cv::PSNR(*src1, *src2);
}
void SVBackSubst(Mat w, Mat u, Mat vt, Mat rhs, Mat dst) {
cv::SVBackSubst(*w, *u, *vt, *rhs, *dst);
}
void SVDecomp(Mat src, Mat w, Mat u, Mat vt) {
cv::SVDecomp(*src, *w, *u, *vt);
}
void Mat_Exp(Mat src, Mat dst) {
cv::exp(*src, *dst);
}
@@ -587,6 +623,14 @@ void Mat_Magnitude(Mat x, Mat y, Mat magnitude) {
cv::magnitude(*x, *y, *magnitude);
}
double Mat_Mahalanobis(Mat v1, Mat v2, Mat icovar) {
return cv::Mahalanobis(*v1, *v2, *icovar);
}
void MulTransposed(Mat src, Mat dest, bool ata) {
cv::mulTransposed(*src, *dest, ata);
}
void Mat_Max(Mat src1, Mat src2, Mat dst) {
cv::max(*src1, *src2, *dst);
}
@@ -624,6 +668,17 @@ void Mat_MinMaxLoc(Mat m, double* minVal, double* maxVal, Point* minLoc, Point*
maxLoc->y = cMaxLoc.y;
}
void Mat_MinMaxLocWithMask(Mat m, double* minVal, double* maxVal, Point* minLoc, Point* maxLoc, Mat mask) {
cv::Point cMinLoc;
cv::Point cMaxLoc;
cv::minMaxLoc(*m, minVal, maxVal, &cMinLoc, &cMaxLoc, *mask);
minLoc->x = cMinLoc.x;
minLoc->y = cMinLoc.y;
maxLoc->x = cMaxLoc.x;
maxLoc->y = cMaxLoc.y;
}
void Mat_MixChannels(struct Mats src, struct Mats dst, struct IntVector fromTo) {
std::vector<cv::Mat> srcMats;
@@ -690,6 +745,15 @@ void Mat_Reduce(Mat src, Mat dst, int dim, int rType, int dType) {
cv::reduce(*src, *dst, dim, rType, dType);
}
void Mat_ReduceArgMax(Mat src, Mat dst, int axis, bool lastIndex) {
cv::reduceArgMax(*src, *dst, axis, lastIndex);
}
void Mat_ReduceArgMin(Mat src, Mat dst, int axis, bool lastIndex) {
cv::reduceArgMin(*src, *dst, axis, lastIndex);
}
void Mat_Repeat(Mat src, int nY, int nX, Mat dst) {
cv::repeat(*src, nY, nX, *dst);
}
@@ -801,6 +865,16 @@ void Points_Close(Points ps) {
void Point_Close(Point p) {}
void Points2f_Close(Points2f ps) {
for (size_t i = 0; i < ps.length; i++) {
Point2f_Close(ps.points[i]);
}
delete[] ps.points;
}
void Point2f_Close(Point2f p) {}
void Rects_Close(struct Rects rs) {
delete[] rs.rects;
}
@@ -1145,3 +1219,10 @@ void Points3fVector_Close(Points3fVector ps) {
delete ps;
}
void SetNumThreads(int n) {
cv::setNumThreads(n);
}
int GetNumThreads() {
return cv::getNumThreads();
}

266
vendor/gocv.io/x/gocv/core.go generated vendored

File diff suppressed because it is too large

34
vendor/gocv.io/x/gocv/core.h generated vendored

@@ -121,6 +121,12 @@ typedef struct Size {
int height;
} Size;
// Wrapper for an individual cv::Size2f
typedef struct Size2f {
float width;
float height;
} Size2f;
// Wrapper for an individual cv::RotatedRect
typedef struct RotatedRect {
Points pts;
@@ -130,6 +136,15 @@ typedef struct RotatedRect {
double angle;
} RotatedRect;
// Wrapper for an individual cv::RotatedRect2f
typedef struct RotatedRect2f {
Points2f pts;
Rect boundingRect;
Point2f center;
Size2f size;
double angle;
} RotatedRect2f;
// Wrapper for an individual cv::cvScalar
typedef struct Scalar {
double val1;
@@ -268,6 +283,8 @@ void Rects_Close(struct Rects rs);
void Mats_Close(struct Mats mats);
void Point_Close(struct Point p);
void Points_Close(struct Points ps);
void Point2f_Close(struct Point2f p);
void Points2f_Close(struct Points2f ps);
void DMatches_Close(struct DMatches ds);
void MultiDMatches_Close(struct MultiDMatches mds);
@@ -283,6 +300,9 @@ Mat Mat_FromPtr(Mat m, int rows, int cols, int type, int prows, int pcols);
void Mat_Close(Mat m);
int Mat_Empty(Mat m);
bool Mat_IsContinuous(Mat m);
void Mat_Inv(Mat m);
Mat Mat_Col(Mat m, int c);
Mat Mat_Row(Mat m, int r);
Mat Mat_Clone(Mat m);
void Mat_CopyTo(Mat m, Mat dst);
int Mat_Total(Mat m);
@@ -379,6 +399,12 @@ void Mat_DFT(Mat m, Mat dst, int flags);
void Mat_Divide(Mat src1, Mat src2, Mat dst);
bool Mat_Eigen(Mat src, Mat eigenvalues, Mat eigenvectors);
void Mat_EigenNonSymmetric(Mat src, Mat eigenvalues, Mat eigenvectors);
void Mat_PCABackProject(Mat data, Mat mean, Mat eigenvectors, Mat result);
void Mat_PCACompute(Mat src, Mat mean, Mat eigenvectors, Mat eigenvalues, int maxComponents);
void Mat_PCAProject(Mat data, Mat mean, Mat eigenvectors, Mat result);
double PSNR(Mat src1, Mat src2);
void SVBackSubst(Mat w, Mat u, Mat vt, Mat rhs, Mat dst);
void SVDecomp(Mat src, Mat w, Mat u, Mat vt);
void Mat_Exp(Mat src, Mat dst);
void Mat_ExtractChannel(Mat src, Mat dst, int coi);
void Mat_FindNonZero(Mat src, Mat idx);
@@ -398,12 +424,15 @@ double KMeans(Mat data, int k, Mat bestLabels, TermCriteria criteria, int attemp
double KMeansPoints(PointVector pts, int k, Mat bestLabels, TermCriteria criteria, int attempts, int flags, Mat centers);
void Mat_Log(Mat src, Mat dst);
void Mat_Magnitude(Mat x, Mat y, Mat magnitude);
double Mat_Mahalanobis(Mat v1, Mat v2, Mat icovar);
void MulTransposed(Mat src, Mat dest, bool ata);
void Mat_Max(Mat src1, Mat src2, Mat dst);
void Mat_MeanStdDev(Mat src, Mat dstMean, Mat dstStdDev);
void Mat_Merge(struct Mats mats, Mat dst);
void Mat_Min(Mat src1, Mat src2, Mat dst);
void Mat_MinMaxIdx(Mat m, double* minVal, double* maxVal, int* minIdx, int* maxIdx);
void Mat_MinMaxLoc(Mat m, double* minVal, double* maxVal, Point* minLoc, Point* maxLoc);
void Mat_MinMaxLocWithMask(Mat m, double* minVal, double* maxVal, Point* minLoc, Point* maxLoc, Mat mask);
void Mat_MixChannels(struct Mats src, struct Mats dst, struct IntVector fromTo);
void Mat_MulSpectrums(Mat a, Mat b, Mat c, int flags);
void Mat_Multiply(Mat src1, Mat src2, Mat dst);
@@ -417,6 +446,8 @@ bool Mat_Solve(Mat src1, Mat src2, Mat dst, int flags);
int Mat_SolveCubic(Mat coeffs, Mat roots);
double Mat_SolvePoly(Mat coeffs, Mat roots, int maxIters);
void Mat_Reduce(Mat src, Mat dst, int dim, int rType, int dType);
void Mat_ReduceArgMax(Mat src, Mat dst, int axis, bool lastIndex);
void Mat_ReduceArgMin(Mat src, Mat dst, int axis, bool lastIndex);
void Mat_Repeat(Mat src, int nY, int nX, Mat dst);
void Mat_ScaleAdd(Mat src1, double alpha, Mat src2, Mat dst);
void Mat_SetIdentity(Mat src, double scalar);
@@ -512,6 +543,9 @@ Point3fVector Points3fVector_At(Points3fVector ps, int idx);
void Points3fVector_Append(Points3fVector psv, Point3fVector pv);
void Points3fVector_Close(Points3fVector ps);
void SetNumThreads(int n);
int GetNumThreads();
#ifdef __cplusplus
}
#endif
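
A small sketch of the thread-count accessors declared above, assuming the matching Go wrappers SetNumThreads/GetNumThreads exported from core.go (whose diff is suppressed earlier in this commit):

package main

import (
	"fmt"

	"gocv.io/x/gocv"
)

func main() {
	// Limit OpenCV's internal parallelism, e.g. when sharing a host with other workloads.
	gocv.SetNumThreads(2)
	fmt.Println("OpenCV worker threads:", gocv.GetNumThreads())
}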

40
vendor/gocv.io/x/gocv/dnn.go generated vendored

@@ -15,7 +15,6 @@ import (
//
// For further details, please see:
// https://docs.opencv.org/master/db/d30/classcv_1_1dnn_1_1Net.html
//
type Net struct {
// C.Net
p unsafe.Pointer
@@ -140,7 +139,6 @@ func (net *Net) Close() error {
//
// For further details, please see:
// https://docs.opencv.org/master/db/d30/classcv_1_1dnn_1_1Net.html#a6a5778787d5b8770deab5eda6968e66c
//
func (net *Net) Empty() bool {
return bool(C.Net_Empty((C.Net)(net.p)))
}
@@ -149,7 +147,6 @@ func (net *Net) Empty() bool {
//
// For further details, please see:
// https://docs.opencv.org/trunk/db/d30/classcv_1_1dnn_1_1Net.html#a672a08ae76444d75d05d7bfea3e4a328
//
func (net *Net) SetInput(blob Mat, name string) {
cName := C.CString(name)
defer C.free(unsafe.Pointer(cName))
@@ -161,7 +158,6 @@ func (net *Net) SetInput(blob Mat, name string) {
//
// For further details, please see:
// https://docs.opencv.org/trunk/db/d30/classcv_1_1dnn_1_1Net.html#a98ed94cb6ef7063d3697259566da310b
//
func (net *Net) Forward(outputName string) Mat {
cName := C.CString(outputName)
defer C.free(unsafe.Pointer(cName))
@@ -173,7 +169,6 @@ func (net *Net) Forward(outputName string) Mat {
//
// For further details, please see:
// https://docs.opencv.org/3.4.1/db/d30/classcv_1_1dnn_1_1Net.html#adb34d7650e555264c7da3b47d967311b
//
func (net *Net) ForwardLayers(outBlobNames []string) (blobs []Mat) {
cMats := C.struct_Mats{}
C.Net_ForwardLayers((C.Net)(net.p), &(cMats), toCStrings(outBlobNames))
@@ -189,7 +184,6 @@ func (net *Net) ForwardLayers(outBlobNames []string) (blobs []Mat) {
//
// For further details, please see:
// https://docs.opencv.org/3.4/db/d30/classcv_1_1dnn_1_1Net.html#a7f767df11386d39374db49cd8df8f59e
//
func (net *Net) SetPreferableBackend(backend NetBackendType) error {
C.Net_SetPreferableBackend((C.Net)(net.p), C.int(backend))
return nil
@@ -199,7 +193,6 @@ func (net *Net) SetPreferableBackend(backend NetBackendType) error {
//
// For further details, please see:
// https://docs.opencv.org/3.4/db/d30/classcv_1_1dnn_1_1Net.html#a9dddbefbc7f3defbe3eeb5dc3d3483f4
//
func (net *Net) SetPreferableTarget(target NetTargetType) error {
C.Net_SetPreferableTarget((C.Net)(net.p), C.int(target))
return nil
@@ -209,7 +202,6 @@ func (net *Net) SetPreferableTarget(target NetTargetType) error {
//
// For further details, please see:
// https://docs.opencv.org/3.4/d6/d0f/group__dnn.html#ga3b34fe7a29494a6a4295c169a7d32422
//
func ReadNet(model string, config string) Net {
cModel := C.CString(model)
defer C.free(unsafe.Pointer(cModel))
@@ -223,7 +215,6 @@ func ReadNet(model string, config string) Net {
//
// For further details, please see:
// https://docs.opencv.org/master/d6/d0f/group__dnn.html#ga138439da76f26266fdefec9723f6c5cd
//
func ReadNetBytes(framework string, model []byte, config []byte) (Net, error) {
cFramework := C.CString(framework)
defer C.free(unsafe.Pointer(cFramework))
@@ -242,7 +233,6 @@ func ReadNetBytes(framework string, model []byte, config []byte) (Net, error) {
//
// For further details, please see:
// https://docs.opencv.org/master/d6/d0f/group__dnn.html#ga29d0ea5e52b1d1a6c2681e3f7d68473a
//
func ReadNetFromCaffe(prototxt string, caffeModel string) Net {
cprototxt := C.CString(prototxt)
defer C.free(unsafe.Pointer(cprototxt))
@@ -256,7 +246,6 @@ func ReadNetFromCaffe(prototxt string, caffeModel string) Net {
//
// For further details, please see:
// https://docs.opencv.org/master/d6/d0f/group__dnn.html#ga946b342af1355185a7107640f868b64a
//
func ReadNetFromCaffeBytes(prototxt []byte, caffeModel []byte) (Net, error) {
bPrototxt, err := toByteArray(prototxt)
if err != nil {
@@ -273,7 +262,6 @@ func ReadNetFromCaffeBytes(prototxt []byte, caffeModel []byte) (Net, error) {
//
// For further details, please see:
// https://docs.opencv.org/master/d6/d0f/group__dnn.html#gad820b280978d06773234ba6841e77e8d
//
func ReadNetFromTensorflow(model string) Net {
cmodel := C.CString(model)
defer C.free(unsafe.Pointer(cmodel))
@@ -284,7 +272,6 @@ func ReadNetFromTensorflow(model string) Net {
//
// For further details, please see:
// https://docs.opencv.org/master/d6/d0f/group__dnn.html#gacdba30a7c20db2788efbf5bb16a7884d
//
func ReadNetFromTensorflowBytes(model []byte) (Net, error) {
bModel, err := toByteArray(model)
if err != nil {
@@ -294,11 +281,11 @@ func ReadNetFromTensorflowBytes(model []byte) (Net, error) {
}
// ReadNetFromTorch reads a network model stored in Torch framework's format (t7).
// check net.Empty() for read failure
//
// check net.Empty() for read failure
//
// For further details, please see:
// https://docs.opencv.org/master/d6/d0f/group__dnn.html#gaaaed8c8530e9e92fe6647700c13d961e
//
func ReadNetFromTorch(model string) Net {
cmodel := C.CString(model)
defer C.free(unsafe.Pointer(cmodel))
@@ -306,11 +293,11 @@ func ReadNetFromTorch(model string) Net {
}
// ReadNetFromONNX reads a network model stored in ONNX framework's format.
// check net.Empty() for read failure
//
// check net.Empty() for read failure
//
// For further details, please see:
// https://docs.opencv.org/master/d6/d0f/group__dnn.html#ga7faea56041d10c71dbbd6746ca854197
//
func ReadNetFromONNX(model string) Net {
cmodel := C.CString(model)
defer C.free(unsafe.Pointer(cmodel))
@@ -321,7 +308,6 @@ func ReadNetFromONNX(model string) Net {
//
// For further details, please see:
// https://docs.opencv.org/master/d6/d0f/group__dnn.html#ga9198ecaac7c32ddf0aa7a1bcbd359567
//
func ReadNetFromONNXBytes(model []byte) (Net, error) {
bModel, err := toByteArray(model)
if err != nil {
@@ -336,7 +322,6 @@ func ReadNetFromONNXBytes(model []byte) (Net, error) {
//
// For further details, please see:
// https://docs.opencv.org/trunk/d6/d0f/group__dnn.html#ga152367f253c81b53fe6862b299f5c5cd
//
func BlobFromImage(img Mat, scaleFactor float64, size image.Point, mean Scalar,
swapRB bool, crop bool) Mat {
@@ -361,7 +346,6 @@ func BlobFromImage(img Mat, scaleFactor float64, size image.Point, mean Scalar,
//
// For further details, please see:
// https://docs.opencv.org/master/d6/d0f/group__dnn.html#ga2b89ed84432e4395f5a1412c2926293c
//
func BlobFromImages(imgs []Mat, blob *Mat, scaleFactor float64, size image.Point, mean Scalar,
swapRB bool, crop bool, ddepth MatType) {
@@ -395,7 +379,6 @@ func BlobFromImages(imgs []Mat, blob *Mat, scaleFactor float64, size image.Point
//
// For further details, please see:
// https://docs.opencv.org/master/d6/d0f/group__dnn.html#ga4051b5fa2ed5f54b76c059a8625df9f5
//
func ImagesFromBlob(blob Mat, imgs []Mat) {
cMats := C.struct_Mats{}
C.Net_ImagesFromBlob(blob.p, &(cMats))
@@ -407,14 +390,13 @@ func ImagesFromBlob(blob Mat, imgs []Mat) {
// GetBlobChannel extracts a single (2d)channel from a 4 dimensional blob structure
// (this might e.g. contain the results of a SSD or YOLO detection,
// a bones structure from pose detection, or a color plane from Colorization)
//
// a bones structure from pose detection, or a color plane from Colorization)
func GetBlobChannel(blob Mat, imgidx int, chnidx int) Mat {
return newMat(C.Net_GetBlobChannel(blob.p, C.int(imgidx), C.int(chnidx)))
}
// GetBlobSize retrieves the 4 dimensional size information in (N,C,H,W) order
//
func GetBlobSize(blob Mat) Scalar {
s := C.Net_GetBlobSize(blob.p)
return NewScalar(float64(s.val1), float64(s.val2), float64(s.val3), float64(s.val4))
@@ -430,7 +412,6 @@ type Layer struct {
//
// For further details, please see:
// https://docs.opencv.org/master/db/d30/classcv_1_1dnn_1_1Net.html#a70aec7f768f38c32b1ee25f3a56526df
//
func (net *Net) GetLayer(layer int) Layer {
return Layer{p: unsafe.Pointer(C.Net_GetLayer((C.Net)(net.p), C.int(layer)))}
}
@@ -439,7 +420,6 @@ func (net *Net) GetLayer(layer int) Layer {
//
// For further details, please see:
// https://docs.opencv.org/master/db/d30/classcv_1_1dnn_1_1Net.html#a06ce946f675f75d1c020c5ddbc78aedc
//
func (net *Net) GetPerfProfile() float64 {
return float64(C.Net_GetPerfProfile((C.Net)(net.p)))
}
@@ -448,7 +428,6 @@ func (net *Net) GetPerfProfile() float64 {
//
// For further details, please see:
// https://docs.opencv.org/master/db/d30/classcv_1_1dnn_1_1Net.html#ae62a73984f62c49fd3e8e689405b056a
//
func (net *Net) GetUnconnectedOutLayers() (ids []int) {
cids := C.IntVector{}
C.Net_GetUnconnectedOutLayers((C.Net)(net.p), &cids)
@@ -471,7 +450,6 @@ func (net *Net) GetUnconnectedOutLayers() (ids []int) {
//
// For further details, please see:
// https://docs.opencv.org/master/db/d30/classcv_1_1dnn_1_1Net.html#ae8be9806024a0d1d41aba687cce99e6b
//
func (net *Net) GetLayerNames() (names []string) {
cstrs := C.CStrings{}
defer C.CStrings_Close(cstrs)
@@ -500,7 +478,6 @@ func (l *Layer) GetType() string {
//
// For further details, please see:
// https://docs.opencv.org/master/d3/d6c/classcv_1_1dnn_1_1Layer.html#a60ffc8238f3fa26cd3f49daa7ac0884b
//
func (l *Layer) InputNameToIndex(name string) int {
cName := C.CString(name)
defer C.free(unsafe.Pointer(cName))
@@ -511,7 +488,6 @@ func (l *Layer) InputNameToIndex(name string) int {
//
// For further details, please see:
// https://docs.opencv.org/master/d3/d6c/classcv_1_1dnn_1_1Layer.html#a60ffc8238f3fa26cd3f49daa7ac0884b
//
func (l *Layer) OutputNameToIndex(name string) int {
cName := C.CString(name)
defer C.free(unsafe.Pointer(cName))
@@ -522,7 +498,7 @@ func (l *Layer) OutputNameToIndex(name string) int {
//
// For further details, please see:
// https://docs.opencv.org/4.4.0/d6/d0f/group__dnn.html#ga9d118d70a1659af729d01b10233213ee
func NMSBoxes(bboxes []image.Rectangle, scores []float32, scoreThreshold float32, nmsThreshold float32, indices []int) {
func NMSBoxes(bboxes []image.Rectangle, scores []float32, scoreThreshold float32, nmsThreshold float32) (indices []int) {
bboxesRectArr := []C.struct_Rect{}
for _, v := range bboxes {
bbox := C.struct_Rect{
@@ -560,6 +536,7 @@ func NMSBoxes(bboxes []image.Rectangle, scores []float32, scoreThreshold float32
ptr := *(*[]C.int)(unsafe.Pointer(h))
indices = make([]int, indicesVector.length)
for i := 0; i < int(indicesVector.length); i++ {
indices[i] = int(ptr[i])
}
@@ -570,7 +547,7 @@ func NMSBoxes(bboxes []image.Rectangle, scores []float32, scoreThreshold float32
//
// For further details, please see:
// https://docs.opencv.org/4.4.0/d6/d0f/group__dnn.html#ga9d118d70a1659af729d01b10233213ee
func NMSBoxesWithParams(bboxes []image.Rectangle, scores []float32, scoreThreshold float32, nmsThreshold float32, indices []int, eta float32, topK int) {
func NMSBoxesWithParams(bboxes []image.Rectangle, scores []float32, scoreThreshold float32, nmsThreshold float32, eta float32, topK int) (indices []int) {
bboxesRectArr := []C.struct_Rect{}
for _, v := range bboxes {
bbox := C.struct_Rect{
@@ -608,6 +585,7 @@ func NMSBoxesWithParams(bboxes []image.Rectangle, scores []float32, scoreThresho
ptr := *(*[]C.int)(unsafe.Pointer(h))
indices = make([]int, indicesVector.length)
for i := 0; i < int(indicesVector.length); i++ {
indices[i] = int(ptr[i])
}

View File

@@ -1,3 +1,4 @@
//go:build openvino
// +build openvino
package gocv
@@ -17,7 +18,6 @@ import "C"
//
// For further details, please see:
// https://docs.opencv.org/trunk/db/d30/classcv_1_1dnn_1_1Net.html#a814890154ea9e10b132fec00b6f6ba30
//
func (net *Net) ForwardAsync(outputName string) AsyncArray {
cName := C.CString(outputName)
defer C.free(unsafe.Pointer(cName))

133
vendor/gocv.io/x/gocv/features2d.cpp generated vendored
View File

@@ -26,6 +26,30 @@ struct KeyPoints AKAZE_Detect(AKAZE a, Mat src) {
return ret;
}
struct KeyPoints AKAZE_Compute(AKAZE a, Mat src, struct KeyPoints kp, Mat desc) {
std::vector<cv::KeyPoint> computed;
for (size_t i = 0; i < kp.length; i++) {
cv::KeyPoint k = cv::KeyPoint(kp.keypoints[i].x, kp.keypoints[i].y,
kp.keypoints[i].size, kp.keypoints[i].angle, kp.keypoints[i].response,
kp.keypoints[i].octave, kp.keypoints[i].classID);
computed.push_back(k);
}
(*a)->compute(*src, computed, *desc);
KeyPoint* kps = new KeyPoint[computed.size()];
for (size_t i = 0; i < computed.size(); ++i) {
KeyPoint k = {computed[i].pt.x, computed[i].pt.y, computed[i].size, computed[i].angle,
computed[i].response, computed[i].octave, computed[i].class_id
};
kps[i] = k;
}
KeyPoints ret = {kps, (int)computed.size()};
return ret;
}
struct KeyPoints AKAZE_DetectAndCompute(AKAZE a, Mat src, Mat mask, Mat desc) {
std::vector<cv::KeyPoint> detected;
(*a)->detectAndCompute(*src, *mask, detected, *desc);
@@ -95,6 +119,30 @@ struct KeyPoints BRISK_Detect(BRISK b, Mat src) {
return ret;
}
struct KeyPoints BRISK_Compute(BRISK b, Mat src, struct KeyPoints kp, Mat desc) {
std::vector<cv::KeyPoint> computed;
for (size_t i = 0; i < kp.length; i++) {
cv::KeyPoint k = cv::KeyPoint(kp.keypoints[i].x, kp.keypoints[i].y,
kp.keypoints[i].size, kp.keypoints[i].angle, kp.keypoints[i].response,
kp.keypoints[i].octave, kp.keypoints[i].classID);
computed.push_back(k);
}
(*b)->compute(*src, computed, *desc);
KeyPoint* kps = new KeyPoint[computed.size()];
for (size_t i = 0; i < computed.size(); ++i) {
KeyPoint k = {computed[i].pt.x, computed[i].pt.y, computed[i].size, computed[i].angle,
computed[i].response, computed[i].octave, computed[i].class_id
};
kps[i] = k;
}
KeyPoints ret = {kps, (int)computed.size()};
return ret;
}
struct KeyPoints BRISK_DetectAndCompute(BRISK b, Mat src, Mat mask, Mat desc) {
std::vector<cv::KeyPoint> detected;
(*b)->detectAndCompute(*src, *mask, detected, *desc);
@@ -164,6 +212,30 @@ struct KeyPoints KAZE_Detect(KAZE a, Mat src) {
return ret;
}
struct KeyPoints KAZE_Compute(KAZE a, Mat src, struct KeyPoints kp, Mat desc) {
std::vector<cv::KeyPoint> computed;
for (size_t i = 0; i < kp.length; i++) {
cv::KeyPoint k = cv::KeyPoint(kp.keypoints[i].x, kp.keypoints[i].y,
kp.keypoints[i].size, kp.keypoints[i].angle, kp.keypoints[i].response,
kp.keypoints[i].octave, kp.keypoints[i].classID);
computed.push_back(k);
}
(*a)->compute(*src, computed, *desc);
KeyPoint* kps = new KeyPoint[computed.size()];
for (size_t i = 0; i < computed.size(); ++i) {
KeyPoint k = {computed[i].pt.x, computed[i].pt.y, computed[i].size, computed[i].angle,
computed[i].response, computed[i].octave, computed[i].class_id
};
kps[i] = k;
}
KeyPoints ret = {kps, (int)computed.size()};
return ret;
}
struct KeyPoints KAZE_DetectAndCompute(KAZE a, Mat src, Mat mask, Mat desc) {
std::vector<cv::KeyPoint> detected;
(*a)->detectAndCompute(*src, *mask, detected, *desc);
@@ -265,6 +337,30 @@ struct KeyPoints ORB_Detect(ORB o, Mat src) {
return ret;
}
struct KeyPoints ORB_Compute(ORB o, Mat src, struct KeyPoints kp, Mat desc) {
std::vector<cv::KeyPoint> computed;
for (size_t i = 0; i < kp.length; i++) {
cv::KeyPoint k = cv::KeyPoint(kp.keypoints[i].x, kp.keypoints[i].y,
kp.keypoints[i].size, kp.keypoints[i].angle, kp.keypoints[i].response,
kp.keypoints[i].octave, kp.keypoints[i].classID);
computed.push_back(k);
}
(*o)->compute(*src, computed, *desc);
KeyPoint* kps = new KeyPoint[computed.size()];
for (size_t i = 0; i < computed.size(); ++i) {
KeyPoint k = {computed[i].pt.x, computed[i].pt.y, computed[i].size, computed[i].angle,
computed[i].response, computed[i].octave, computed[i].class_id
};
kps[i] = k;
}
KeyPoints ret = {kps, (int)computed.size()};
return ret;
}
struct KeyPoints ORB_DetectAndCompute(ORB o, Mat src, Mat mask, Mat desc) {
std::vector<cv::KeyPoint> detected;
(*o)->detectAndCompute(*src, *mask, detected, *desc);
@@ -380,6 +476,19 @@ void BFMatcher_Close(BFMatcher b) {
delete b;
}
struct DMatches BFMatcher_Match(BFMatcher b, Mat query, Mat train) {
std::vector<cv::DMatch> matches;
(*b)->match(*query, *train, matches);
DMatch *dmatches = new DMatch[matches.size()];
for (size_t i = 0; i < matches.size(); ++i) {
DMatch dmatch = {matches[i].queryIdx, matches[i].trainIdx, matches[i].imgIdx, matches[i].distance};
dmatches[i] = dmatch;
}
DMatches ret = {dmatches, (int) matches.size()};
return ret;
}
struct MultiDMatches BFMatcher_KnnMatch(BFMatcher b, Mat query, Mat train, int k) {
std::vector< std::vector<cv::DMatch> > matches;
(*b)->knnMatch(*query, *train, matches, k);
@@ -502,6 +611,30 @@ struct KeyPoints SIFT_Detect(SIFT d, Mat src) {
return ret;
}
struct KeyPoints SIFT_Compute(SIFT d, Mat src, struct KeyPoints kp, Mat desc) {
std::vector<cv::KeyPoint> computed;
for (size_t i = 0; i < kp.length; i++) {
cv::KeyPoint k = cv::KeyPoint(kp.keypoints[i].x, kp.keypoints[i].y,
kp.keypoints[i].size, kp.keypoints[i].angle, kp.keypoints[i].response,
kp.keypoints[i].octave, kp.keypoints[i].classID);
computed.push_back(k);
}
(*d)->compute(*src, computed, *desc);
KeyPoint* kps = new KeyPoint[computed.size()];
for (size_t i = 0; i < computed.size(); ++i) {
KeyPoint k = {computed[i].pt.x, computed[i].pt.y, computed[i].size, computed[i].angle,
computed[i].response, computed[i].octave, computed[i].class_id
};
kps[i] = k;
}
KeyPoints ret = {kps, (int)computed.size()};
return ret;
}
struct KeyPoints SIFT_DetectAndCompute(SIFT d, Mat src, Mat mask, Mat desc) {
std::vector<cv::KeyPoint> detected;
(*d)->detectAndCompute(*src, *mask, detected, *desc);

209
vendor/gocv.io/x/gocv/features2d.go generated vendored
View File

@@ -7,21 +7,42 @@ package gocv
import "C"
import (
"image/color"
"io"
"reflect"
"unsafe"
)
type Feature2DDetector interface {
Detect(src Mat) []KeyPoint
}
type Feature2DComputer interface {
Compute(src Mat, mask Mat, kps []KeyPoint) ([]KeyPoint, Mat)
}
type Feature2DDetectComputer interface {
DetectAndCompute(src Mat, mask Mat) ([]KeyPoint, Mat)
}
type Feature2D interface {
io.Closer
Feature2DDetector
Feature2DComputer
Feature2DDetectComputer
}
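These interfaces let callers accept any of the detectors in this file interchangeably. A minimal sketch under that assumption (not part of the patch; assumes the gocv.io/x/gocv import):

func describe(f gocv.Feature2D, img gocv.Mat) ([]gocv.KeyPoint, gocv.Mat) {
	mask := gocv.NewMat() // empty mask: use the whole image
	defer mask.Close()
	return f.DetectAndCompute(img, mask)
}

// usage sketch: ak := gocv.NewAKAZE(); defer ak.Close(); kps, desc := describe(&ak, img)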
// AKAZE is a wrapper around the cv::AKAZE algorithm.
type AKAZE struct {
// C.AKAZE
p unsafe.Pointer
}
var _ Feature2D = (*AKAZE)(nil)
// NewAKAZE returns a new AKAZE algorithm
//
// For further details, please see:
// https://docs.opencv.org/master/d8/d30/classcv_1_1AKAZE.html
//
func NewAKAZE() AKAZE {
return AKAZE{p: unsafe.Pointer(C.AKAZE_Create())}
}
@@ -37,7 +58,6 @@ func (a *AKAZE) Close() error {
//
// For further details, please see:
// https://docs.opencv.org/master/d0/d13/classcv_1_1Feature2D.html#aa4e9a7082ec61ebc108806704fbd7887
//
func (a *AKAZE) Detect(src Mat) []KeyPoint {
ret := C.AKAZE_Detect((C.AKAZE)(a.p), src.p)
defer C.KeyPoints_Close(ret)
@@ -45,11 +65,37 @@ func (a *AKAZE) Detect(src Mat) []KeyPoint {
return getKeyPoints(ret)
}
// Compute computes the descriptors for a set of keypoints detected in an image using AKAZE.
//
// For further details, please see:
// https://docs.opencv.org/4.x/d0/d13/classcv_1_1Feature2D.html#ab3cce8d56f4fc5e1d530b5931e1e8dc0
func (a *AKAZE) Compute(src Mat, mask Mat, kps []KeyPoint) ([]KeyPoint, Mat) {
desc := NewMat()
kp2arr := make([]C.struct_KeyPoint, len(kps))
for i, kp := range kps {
kp2arr[i].x = C.double(kp.X)
kp2arr[i].y = C.double(kp.Y)
kp2arr[i].size = C.double(kp.Size)
kp2arr[i].angle = C.double(kp.Angle)
kp2arr[i].response = C.double(kp.Response)
kp2arr[i].octave = C.int(kp.Octave)
kp2arr[i].classID = C.int(kp.ClassID)
}
cKeyPoints := C.struct_KeyPoints{
keypoints: (*C.struct_KeyPoint)(&kp2arr[0]),
length: (C.int)(len(kps)),
}
ret := C.AKAZE_Compute((C.AKAZE)(a.p), src.p, cKeyPoints, desc.p)
defer C.KeyPoints_Close(ret)
return getKeyPoints(ret), desc
}
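A hedged end-to-end sketch of the new Compute binding: detect keypoints first, then compute descriptors for them. The same pattern applies to the BRISK, KAZE, ORB, and SIFT Compute methods added below; it assumes a valid input Mat and the gocv.io/x/gocv import.

func akazeDescriptors(img gocv.Mat) ([]gocv.KeyPoint, gocv.Mat) {
	ak := gocv.NewAKAZE()
	defer ak.Close()

	mask := gocv.NewMat()
	defer mask.Close()

	kps := ak.Detect(img)                   // locate keypoints
	kps, desc := ak.Compute(img, mask, kps) // descriptors for those keypoints
	return kps, desc
}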
// DetectAndCompute detects keypoints and computes their descriptors in an image using AKAZE.
//
// For further details, please see:
// https://docs.opencv.org/master/d0/d13/classcv_1_1Feature2D.html#a8be0d1c20b08eb867184b8d74c15a677
//
func (a *AKAZE) DetectAndCompute(src Mat, mask Mat) ([]KeyPoint, Mat) {
desc := NewMat()
ret := C.AKAZE_DetectAndCompute((C.AKAZE)(a.p), src.p, mask.p, desc.p)
@@ -68,7 +114,6 @@ type AgastFeatureDetector struct {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/d19/classcv_1_1AgastFeatureDetector.html
//
func NewAgastFeatureDetector() AgastFeatureDetector {
return AgastFeatureDetector{p: unsafe.Pointer(C.AgastFeatureDetector_Create())}
}
@@ -84,7 +129,6 @@ func (a *AgastFeatureDetector) Close() error {
//
// For further details, please see:
// https://docs.opencv.org/master/d0/d13/classcv_1_1Feature2D.html#aa4e9a7082ec61ebc108806704fbd7887
//
func (a *AgastFeatureDetector) Detect(src Mat) []KeyPoint {
ret := C.AgastFeatureDetector_Detect((C.AgastFeatureDetector)(a.p), src.p)
defer C.KeyPoints_Close(ret)
@@ -98,11 +142,12 @@ type BRISK struct {
p unsafe.Pointer
}
var _ Feature2D = (*BRISK)(nil)
// NewBRISK returns a new BRISK algorithm
//
// For further details, please see:
// https://docs.opencv.org/master/d8/d30/classcv_1_1AKAZE.html
//
func NewBRISK() BRISK {
return BRISK{p: unsafe.Pointer(C.BRISK_Create())}
}
@@ -118,7 +163,6 @@ func (b *BRISK) Close() error {
//
// For further details, please see:
// https://docs.opencv.org/master/d0/d13/classcv_1_1Feature2D.html#aa4e9a7082ec61ebc108806704fbd7887
//
func (b *BRISK) Detect(src Mat) []KeyPoint {
ret := C.BRISK_Detect((C.BRISK)(b.p), src.p)
defer C.KeyPoints_Close(ret)
@@ -126,11 +170,37 @@ func (b *BRISK) Detect(src Mat) []KeyPoint {
return getKeyPoints(ret)
}
// Compute computes the descriptors for a set of keypoints detected in an image using BRISK.
//
// For further details, please see:
// https://docs.opencv.org/4.x/d0/d13/classcv_1_1Feature2D.html#ab3cce8d56f4fc5e1d530b5931e1e8dc0
func (b *BRISK) Compute(src Mat, mask Mat, kps []KeyPoint) ([]KeyPoint, Mat) {
desc := NewMat()
kp2arr := make([]C.struct_KeyPoint, len(kps))
for i, kp := range kps {
kp2arr[i].x = C.double(kp.X)
kp2arr[i].y = C.double(kp.Y)
kp2arr[i].size = C.double(kp.Size)
kp2arr[i].angle = C.double(kp.Angle)
kp2arr[i].response = C.double(kp.Response)
kp2arr[i].octave = C.int(kp.Octave)
kp2arr[i].classID = C.int(kp.ClassID)
}
cKeyPoints := C.struct_KeyPoints{
keypoints: (*C.struct_KeyPoint)(&kp2arr[0]),
length: (C.int)(len(kps)),
}
ret := C.BRISK_Compute((C.BRISK)(b.p), src.p, cKeyPoints, desc.p)
defer C.KeyPoints_Close(ret)
return getKeyPoints(ret), desc
}
// DetectAndCompute detects keypoints and computes their descriptors in an image using BRISK.
//
// For further details, please see:
// https://docs.opencv.org/master/d0/d13/classcv_1_1Feature2D.html#a8be0d1c20b08eb867184b8d74c15a677
//
func (b *BRISK) DetectAndCompute(src Mat, mask Mat) ([]KeyPoint, Mat) {
desc := NewMat()
ret := C.BRISK_DetectAndCompute((C.BRISK)(b.p), src.p, mask.p, desc.p)
@@ -164,7 +234,6 @@ type FastFeatureDetector struct {
//
// For further details, please see:
// https://docs.opencv.org/master/df/d74/classcv_1_1FastFeatureDetector.html
//
func NewFastFeatureDetector() FastFeatureDetector {
return FastFeatureDetector{p: unsafe.Pointer(C.FastFeatureDetector_Create())}
}
@@ -173,7 +242,6 @@ func NewFastFeatureDetector() FastFeatureDetector {
//
// For further details, please see:
// https://docs.opencv.org/master/df/d74/classcv_1_1FastFeatureDetector.html#ab986f2ff8f8778aab1707e2642bc7f8e
//
func NewFastFeatureDetectorWithParams(threshold int, nonmaxSuppression bool, typ FastFeatureDetectorType) FastFeatureDetector {
return FastFeatureDetector{p: unsafe.Pointer(C.FastFeatureDetector_CreateWithParams(C.int(threshold), C.bool(nonmaxSuppression), C.int(typ)))}
}
@@ -189,7 +257,6 @@ func (f *FastFeatureDetector) Close() error {
//
// For further details, please see:
// https://docs.opencv.org/master/d0/d13/classcv_1_1Feature2D.html#aa4e9a7082ec61ebc108806704fbd7887
//
func (f *FastFeatureDetector) Detect(src Mat) []KeyPoint {
ret := C.FastFeatureDetector_Detect((C.FastFeatureDetector)(f.p), src.p)
defer C.KeyPoints_Close(ret)
@@ -207,7 +274,6 @@ type GFTTDetector struct {
//
// For further details, please see:
// https://docs.opencv.org/master/df/d21/classcv_1_1GFTTDetector.html
//
func NewGFTTDetector() GFTTDetector {
return GFTTDetector{p: unsafe.Pointer(C.GFTTDetector_Create())}
}
@@ -223,7 +289,6 @@ func (a *GFTTDetector) Close() error {
//
// For further details, please see:
// https://docs.opencv.org/master/d0/d13/classcv_1_1Feature2D.html#aa4e9a7082ec61ebc108806704fbd7887
//
func (a *GFTTDetector) Detect(src Mat) []KeyPoint {
ret := C.GFTTDetector_Detect((C.GFTTDetector)(a.p), src.p)
defer C.KeyPoints_Close(ret)
@@ -237,11 +302,12 @@ type KAZE struct {
p unsafe.Pointer
}
var _ Feature2D = (*KAZE)(nil)
// NewKAZE returns a new KAZE algorithm
//
// For further details, please see:
// https://docs.opencv.org/master/d3/d61/classcv_1_1KAZE.html
//
func NewKAZE() KAZE {
return KAZE{p: unsafe.Pointer(C.KAZE_Create())}
}
@@ -257,7 +323,6 @@ func (a *KAZE) Close() error {
//
// For further details, please see:
// https://docs.opencv.org/master/d0/d13/classcv_1_1Feature2D.html#aa4e9a7082ec61ebc108806704fbd7887
//
func (a *KAZE) Detect(src Mat) []KeyPoint {
ret := C.KAZE_Detect((C.KAZE)(a.p), src.p)
defer C.KeyPoints_Close(ret)
@@ -265,11 +330,37 @@ func (a *KAZE) Detect(src Mat) []KeyPoint {
return getKeyPoints(ret)
}
// Compute computes the descriptors for a set of keypoints detected in an image using KAZE.
//
// For further details, please see:
// https://docs.opencv.org/4.x/d0/d13/classcv_1_1Feature2D.html#ab3cce8d56f4fc5e1d530b5931e1e8dc0
func (a *KAZE) Compute(src Mat, mask Mat, kps []KeyPoint) ([]KeyPoint, Mat) {
desc := NewMat()
kp2arr := make([]C.struct_KeyPoint, len(kps))
for i, kp := range kps {
kp2arr[i].x = C.double(kp.X)
kp2arr[i].y = C.double(kp.Y)
kp2arr[i].size = C.double(kp.Size)
kp2arr[i].angle = C.double(kp.Angle)
kp2arr[i].response = C.double(kp.Response)
kp2arr[i].octave = C.int(kp.Octave)
kp2arr[i].classID = C.int(kp.ClassID)
}
cKeyPoints := C.struct_KeyPoints{
keypoints: (*C.struct_KeyPoint)(&kp2arr[0]),
length: (C.int)(len(kps)),
}
ret := C.KAZE_Compute((C.KAZE)(a.p), src.p, cKeyPoints, desc.p)
defer C.KeyPoints_Close(ret)
return getKeyPoints(ret), desc
}
// DetectAndCompute detects keypoints and computes their descriptors in an image using KAZE.
//
// For further details, please see:
// https://docs.opencv.org/master/d0/d13/classcv_1_1Feature2D.html#a8be0d1c20b08eb867184b8d74c15a677
//
func (a *KAZE) DetectAndCompute(src Mat, mask Mat) ([]KeyPoint, Mat) {
desc := NewMat()
ret := C.KAZE_DetectAndCompute((C.KAZE)(a.p), src.p, mask.p, desc.p)
@@ -288,7 +379,6 @@ type MSER struct {
//
// For further details, please see:
// https://docs.opencv.org/master/d3/d28/classcv_1_1MSER.html
//
func NewMSER() MSER {
return MSER{p: unsafe.Pointer(C.MSER_Create())}
}
@@ -304,7 +394,6 @@ func (a *MSER) Close() error {
//
// For further details, please see:
// https://docs.opencv.org/master/d0/d13/classcv_1_1Feature2D.html#aa4e9a7082ec61ebc108806704fbd7887
//
func (a *MSER) Detect(src Mat) []KeyPoint {
ret := C.MSER_Detect((C.MSER)(a.p), src.p)
defer C.KeyPoints_Close(ret)
@@ -318,11 +407,12 @@ type ORB struct {
p unsafe.Pointer
}
var _ Feature2D = (*ORB)(nil)
// NewORB returns a new ORB algorithm
//
// For further details, please see:
// https://docs.opencv.org/master/db/d95/classcv_1_1ORB.html
//
func NewORB() ORB {
return ORB{p: unsafe.Pointer(C.ORB_Create())}
}
@@ -331,7 +421,6 @@ func NewORB() ORB {
//
// For further details, please see:
// https://docs.opencv.org/master/db/d95/classcv_1_1ORB.html#aeff0cbe668659b7ca14bb85ff1c4073b
//
func NewORBWithParams(nFeatures int, scaleFactor float32, nLevels int, edgeThreshold int, firstLevel int, WTAK int, scoreType ORBScoreType, patchSize int, fastThreshold int) ORB {
return ORB{p: unsafe.Pointer(C.ORB_CreateWithParams(
C.int(nFeatures),
@@ -364,7 +453,6 @@ func (o *ORB) Close() error {
//
// For further details, please see:
// https://docs.opencv.org/master/d0/d13/classcv_1_1Feature2D.html#aa4e9a7082ec61ebc108806704fbd7887
//
func (o *ORB) Detect(src Mat) []KeyPoint {
ret := C.ORB_Detect((C.ORB)(o.p), src.p)
defer C.KeyPoints_Close(ret)
@@ -372,11 +460,37 @@ func (o *ORB) Detect(src Mat) []KeyPoint {
return getKeyPoints(ret)
}
// Compute computes the descriptors for a set of keypoints detected in an image using ORB.
//
// For further details, please see:
// https://docs.opencv.org/4.x/d0/d13/classcv_1_1Feature2D.html#ab3cce8d56f4fc5e1d530b5931e1e8dc0
func (o *ORB) Compute(src Mat, mask Mat, kps []KeyPoint) ([]KeyPoint, Mat) {
desc := NewMat()
kp2arr := make([]C.struct_KeyPoint, len(kps))
for i, kp := range kps {
kp2arr[i].x = C.double(kp.X)
kp2arr[i].y = C.double(kp.Y)
kp2arr[i].size = C.double(kp.Size)
kp2arr[i].angle = C.double(kp.Angle)
kp2arr[i].response = C.double(kp.Response)
kp2arr[i].octave = C.int(kp.Octave)
kp2arr[i].classID = C.int(kp.ClassID)
}
cKeyPoints := C.struct_KeyPoints{
keypoints: (*C.struct_KeyPoint)(&kp2arr[0]),
length: (C.int)(len(kps)),
}
ret := C.ORB_Compute((C.ORB)(o.p), src.p, cKeyPoints, desc.p)
defer C.KeyPoints_Close(ret)
return getKeyPoints(ret), desc
}
// DetectAndCompute detects keypoints and computes their descriptors from an image using ORB.
//
// For further details, please see:
// https://docs.opencv.org/master/d0/d13/classcv_1_1Feature2D.html#a8be0d1c20b08eb867184b8d74c15a677
//
func (o *ORB) DetectAndCompute(src Mat, mask Mat) ([]KeyPoint, Mat) {
desc := NewMat()
ret := C.ORB_DetectAndCompute((C.ORB)(o.p), src.p, mask.p, desc.p)
@@ -400,7 +514,6 @@ type SimpleBlobDetectorParams struct {
//
// For further details, please see:
// https://docs.opencv.org/master/d0/d7a/classcv_1_1SimpleBlobDetector.html
//
func NewSimpleBlobDetector() SimpleBlobDetector {
return SimpleBlobDetector{p: unsafe.Pointer(C.SimpleBlobDetector_Create())}
}
@@ -409,7 +522,6 @@ func NewSimpleBlobDetector() SimpleBlobDetector {
//
// For further details, please see:
// https://docs.opencv.org/master/d0/d7a/classcv_1_1SimpleBlobDetector.html
//
func NewSimpleBlobDetectorWithParams(params SimpleBlobDetectorParams) SimpleBlobDetector {
return SimpleBlobDetector{p: unsafe.Pointer(C.SimpleBlobDetector_Create_WithParams(params.p))}
}
@@ -620,7 +732,6 @@ func (p *SimpleBlobDetectorParams) GetThresholdStep() float64 {
//
// For further details, please see:
// https://docs.opencv.org/master/d0/d13/classcv_1_1Feature2D.html#aa4e9a7082ec61ebc108806704fbd7887
//
func (b *SimpleBlobDetector) Detect(src Mat) []KeyPoint {
ret := C.SimpleBlobDetector_Detect((C.SimpleBlobDetector)(b.p), src.p)
defer C.KeyPoints_Close(ret)
@@ -657,7 +768,6 @@ type BFMatcher struct {
//
// For further details, please see:
// https://docs.opencv.org/master/d3/da1/classcv_1_1BFMatcher.html#abe0bb11749b30d97f60d6ade665617bd
//
func NewBFMatcher() BFMatcher {
return BFMatcher{p: unsafe.Pointer(C.BFMatcher_Create())}
}
@@ -667,7 +777,6 @@ func NewBFMatcher() BFMatcher {
//
// For further details, please see:
// https://docs.opencv.org/master/d3/da1/classcv_1_1BFMatcher.html#abe0bb11749b30d97f60d6ade665617bd
//
func NewBFMatcherWithParams(normType NormType, crossCheck bool) BFMatcher {
return BFMatcher{p: unsafe.Pointer(C.BFMatcher_CreateWithParams(C.int(normType), C.bool(crossCheck)))}
}
@@ -679,11 +788,21 @@ func (b *BFMatcher) Close() error {
return nil
}
// Match finds the best match for each descriptor from a query set.
//
// For further details, please see:
// https://docs.opencv.org/4.x/db/d39/classcv_1_1DescriptorMatcher.html#a0f046f47b68ec7074391e1e85c750cba
func (b *BFMatcher) Match(query, train Mat) []DMatch {
ret := C.BFMatcher_Match((C.BFMatcher)(b.p), query.p, train.p)
defer C.DMatches_Close(ret)
return getDMatches(ret)
}
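A short, hedged sketch of the new Match binding; queryDesc and trainDesc stand for any two descriptor Mats, such as those returned by the Compute calls above (assumes the gocv.io/x/gocv import):

func bestMatches(queryDesc, trainDesc gocv.Mat) []gocv.DMatch {
	bf := gocv.NewBFMatcher()
	defer bf.Close()
	return bf.Match(queryDesc, trainDesc)
}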
// KnnMatch finds the k best matches for each descriptor from a query set.
//
// For further details, please see:
// https://docs.opencv.org/master/db/d39/classcv_1_1DescriptorMatcher.html#aa880f9353cdf185ccf3013e08210483a
//
func (b *BFMatcher) KnnMatch(query, train Mat, k int) [][]DMatch {
ret := C.BFMatcher_KnnMatch((C.BFMatcher)(b.p), query.p, train.p, C.int(k))
defer C.MultiDMatches_Close(ret)
@@ -701,7 +820,6 @@ type FlannBasedMatcher struct {
//
// For further details, please see:
// https://docs.opencv.org/master/dc/de2/classcv_1_1FlannBasedMatcher.html#ab9114a6471e364ad221f89068ca21382
//
func NewFlannBasedMatcher() FlannBasedMatcher {
return FlannBasedMatcher{p: unsafe.Pointer(C.FlannBasedMatcher_Create())}
}
@@ -717,7 +835,6 @@ func (f *FlannBasedMatcher) Close() error {
//
// For further details, please see:
// https://docs.opencv.org/master/db/d39/classcv_1_1DescriptorMatcher.html#aa880f9353cdf185ccf3013e08210483a
//
func (f *FlannBasedMatcher) KnnMatch(query, train Mat, k int) [][]DMatch {
ret := C.FlannBasedMatcher_KnnMatch((C.FlannBasedMatcher)(f.p), query.p, train.p, C.int(k))
defer C.MultiDMatches_Close(ret)
@@ -816,11 +933,12 @@ type SIFT struct {
p unsafe.Pointer
}
var _ Feature2D = (*SIFT)(nil)
// NewSIFT returns a new SIFT algorithm.
//
// For further details, please see:
// https://docs.opencv.org/master/d5/d3c/classcv_1_1xfeatures2d_1_1SIFT.html
//
func NewSIFT() SIFT {
return SIFT{p: unsafe.Pointer(C.SIFT_Create())}
}
@@ -836,7 +954,6 @@ func (d *SIFT) Close() error {
//
// For further details, please see:
// https://docs.opencv.org/master/d0/d13/classcv_1_1Feature2D.html#aa4e9a7082ec61ebc108806704fbd7887
//
func (d *SIFT) Detect(src Mat) []KeyPoint {
ret := C.SIFT_Detect((C.SIFT)(d.p), C.Mat(src.Ptr()))
defer C.KeyPoints_Close(ret)
@@ -844,11 +961,37 @@ func (d *SIFT) Detect(src Mat) []KeyPoint {
return getKeyPoints(ret)
}
// Compute computes the descriptors for a set of keypoints detected in an image using SIFT.
//
// For further details, please see:
// https://docs.opencv.org/4.x/d0/d13/classcv_1_1Feature2D.html#ab3cce8d56f4fc5e1d530b5931e1e8dc0
func (d *SIFT) Compute(src Mat, mask Mat, kps []KeyPoint) ([]KeyPoint, Mat) {
desc := NewMat()
kp2arr := make([]C.struct_KeyPoint, len(kps))
for i, kp := range kps {
kp2arr[i].x = C.double(kp.X)
kp2arr[i].y = C.double(kp.Y)
kp2arr[i].size = C.double(kp.Size)
kp2arr[i].angle = C.double(kp.Angle)
kp2arr[i].response = C.double(kp.Response)
kp2arr[i].octave = C.int(kp.Octave)
kp2arr[i].classID = C.int(kp.ClassID)
}
cKeyPoints := C.struct_KeyPoints{
keypoints: (*C.struct_KeyPoint)(&kp2arr[0]),
length: (C.int)(len(kps)),
}
ret := C.SIFT_Compute((C.SIFT)(d.p), src.p, cKeyPoints, desc.p)
defer C.KeyPoints_Close(ret)
return getKeyPoints(ret), desc
}
// DetectAndCompute detects keypoints and computes their descriptors in an image using SIFT.
//
// For further details, please see:
// https://docs.opencv.org/master/d0/d13/classcv_1_1Feature2D.html#a8be0d1c20b08eb867184b8d74c15a677
//
func (d *SIFT) DetectAndCompute(src Mat, mask Mat) ([]KeyPoint, Mat) {
desc := NewMat()
ret := C.SIFT_DetectAndCompute((C.SIFT)(d.p), C.Mat(src.Ptr()), C.Mat(mask.Ptr()),

6
vendor/gocv.io/x/gocv/features2d.h generated vendored
View File

@@ -39,6 +39,7 @@ typedef void* SIFT;
AKAZE AKAZE_Create();
void AKAZE_Close(AKAZE a);
struct KeyPoints AKAZE_Detect(AKAZE a, Mat src);
struct KeyPoints AKAZE_Compute(AKAZE a, Mat src, struct KeyPoints kp, Mat desc);
struct KeyPoints AKAZE_DetectAndCompute(AKAZE a, Mat src, Mat mask, Mat desc);
AgastFeatureDetector AgastFeatureDetector_Create();
@@ -48,6 +49,7 @@ struct KeyPoints AgastFeatureDetector_Detect(AgastFeatureDetector a, Mat src);
BRISK BRISK_Create();
void BRISK_Close(BRISK b);
struct KeyPoints BRISK_Detect(BRISK b, Mat src);
struct KeyPoints BRISK_Compute(BRISK b, Mat src, struct KeyPoints kp, Mat desc);
struct KeyPoints BRISK_DetectAndCompute(BRISK b, Mat src, Mat mask, Mat desc);
FastFeatureDetector FastFeatureDetector_Create();
@@ -62,6 +64,7 @@ struct KeyPoints GFTTDetector_Detect(GFTTDetector a, Mat src);
KAZE KAZE_Create();
void KAZE_Close(KAZE a);
struct KeyPoints KAZE_Detect(KAZE a, Mat src);
struct KeyPoints KAZE_Compute(KAZE a, Mat src, struct KeyPoints kp, Mat desc);
struct KeyPoints KAZE_DetectAndCompute(KAZE a, Mat src, Mat mask, Mat desc);
MSER MSER_Create();
@@ -72,6 +75,7 @@ ORB ORB_Create();
ORB ORB_CreateWithParams(int nfeatures, float scaleFactor, int nlevels, int edgeThreshold, int firstLevel, int WTA_K, int scoreType, int patchSize, int fastThreshold);
void ORB_Close(ORB o);
struct KeyPoints ORB_Detect(ORB o, Mat src);
struct KeyPoints ORB_Compute(ORB o, Mat src, struct KeyPoints kp, Mat desc);
struct KeyPoints ORB_DetectAndCompute(ORB o, Mat src, Mat mask, Mat desc);
SimpleBlobDetector SimpleBlobDetector_Create();
@@ -83,6 +87,7 @@ SimpleBlobDetectorParams SimpleBlobDetectorParams_Create();
BFMatcher BFMatcher_Create();
BFMatcher BFMatcher_CreateWithParams(int normType, bool crossCheck);
void BFMatcher_Close(BFMatcher b);
struct DMatches BFMatcher_Match(BFMatcher b, Mat query, Mat train);
struct MultiDMatches BFMatcher_KnnMatch(BFMatcher b, Mat query, Mat train, int k);
FlannBasedMatcher FlannBasedMatcher_Create();
@@ -94,6 +99,7 @@ void DrawKeyPoints(Mat src, struct KeyPoints kp, Mat dst, const Scalar s, int fl
SIFT SIFT_Create();
void SIFT_Close(SIFT f);
struct KeyPoints SIFT_Detect(SIFT f, Mat src);
struct KeyPoints SIFT_Compute(SIFT f, Mat src, struct KeyPoints kp, Mat desc);
struct KeyPoints SIFT_DetectAndCompute(SIFT f, Mat src, Mat mask, Mat desc);
void DrawMatches(Mat img1, struct KeyPoints kp1, Mat img2, struct KeyPoints kp2, struct DMatches matches1to2, Mat outImg, const Scalar matchesColor, const Scalar pointColor, struct ByteArray matchesMask, int flags);

1
vendor/gocv.io/x/gocv/gocv.go generated vendored
View File

@@ -7,5 +7,4 @@
//
// For further details, please see:
// http://docs.opencv.org/master/d1/dfb/intro.html
//
package gocv // import "gocv.io/x/gocv"

19
vendor/gocv.io/x/gocv/highgui.go generated vendored
View File

@@ -19,7 +19,6 @@ import (
//
// For further details, please see:
// http://docs.opencv.org/master/d7/dfc/group__highgui.html
//
type Window struct {
name string
open bool
@@ -29,7 +28,6 @@ type Window struct {
//
// For further details, please see:
// http://docs.opencv.org/master/d7/dfc/group__highgui.html#ga5afdf8410934fd099df85c75b2e0888b
//
func NewWindow(name string) *Window {
runtime.LockOSThread()
@@ -45,7 +43,6 @@ func NewWindow(name string) *Window {
//
// For further details, please see:
// http://docs.opencv.org/master/d7/dfc/group__highgui.html#ga851ccdd6961022d1d5b4c4f255dbab34
//
func (w *Window) Close() error {
cName := C.CString(w.name)
defer C.free(unsafe.Pointer(cName))
@@ -109,7 +106,6 @@ const (
//
// For further details, please see:
// https://docs.opencv.org/master/d7/dfc/group__highgui.html#gaaf9504b8f9cf19024d9d44a14e461656
//
func (w *Window) GetWindowProperty(flag WindowPropertyFlag) float64 {
cName := C.CString(w.name)
defer C.free(unsafe.Pointer(cName))
@@ -121,7 +117,6 @@ func (w *Window) GetWindowProperty(flag WindowPropertyFlag) float64 {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/dfc/group__highgui.html#ga66e4a6db4d4e06148bcdfe0d70a5df27
//
func (w *Window) SetWindowProperty(flag WindowPropertyFlag, value WindowFlag) {
cName := C.CString(w.name)
defer C.free(unsafe.Pointer(cName))
@@ -133,7 +128,6 @@ func (w *Window) SetWindowProperty(flag WindowPropertyFlag, value WindowFlag) {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/dfc/group__highgui.html#ga56f8849295fd10d0c319724ddb773d96
//
func (w *Window) SetWindowTitle(title string) {
cName := C.CString(w.name)
defer C.free(unsafe.Pointer(cName))
@@ -150,7 +144,6 @@ func (w *Window) SetWindowTitle(title string) {
//
// For further details, please see:
// http://docs.opencv.org/master/d7/dfc/group__highgui.html#ga453d42fe4cb60e5723281a89973ee563
//
func (w *Window) IMShow(img Mat) {
cName := C.CString(w.name)
defer C.free(unsafe.Pointer(cName))
@@ -165,7 +158,6 @@ func (w *Window) IMShow(img Mat) {
//
// For further details, please see:
// http://docs.opencv.org/master/d7/dfc/group__highgui.html#ga5628525ad33f52eab17feebcfba38bd7
//
func (w *Window) WaitKey(delay int) int {
return int(C.Window_WaitKey(C.int(delay)))
}
@@ -174,7 +166,6 @@ func (w *Window) WaitKey(delay int) int {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/dfc/group__highgui.html#ga8d86b207f7211250dbe6e28f76307ffb
//
func (w *Window) MoveWindow(x, y int) {
cName := C.CString(w.name)
defer C.free(unsafe.Pointer(cName))
@@ -186,7 +177,6 @@ func (w *Window) MoveWindow(x, y int) {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/dfc/group__highgui.html#ga9e80e080f7ef33f897e415358aee7f7e
//
func (w *Window) ResizeWindow(width, height int) {
cName := C.CString(w.name)
defer C.free(unsafe.Pointer(cName))
@@ -203,7 +193,6 @@ func (w *Window) ResizeWindow(width, height int) {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/dfc/group__highgui.html#ga8daf4730d3adf7035b6de9be4c469af5
//
func (w *Window) SelectROI(img Mat) image.Rectangle {
cName := C.CString(w.name)
defer C.free(unsafe.Pointer(cName))
@@ -222,7 +211,6 @@ func (w *Window) SelectROI(img Mat) image.Rectangle {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/dfc/group__highgui.html#ga0f11fad74a6432b8055fb21621a0f893
//
func (w *Window) SelectROIs(img Mat) []image.Rectangle {
cName := C.CString(w.name)
defer C.free(unsafe.Pointer(cName))
@@ -256,7 +244,6 @@ func SelectROIs(name string, img Mat) []image.Rectangle {
// WaitKey waits for a pressed key; it is not attached to a specific Window.
// Only use this when no Window exists in your application, e.g. a command line app.
//
func WaitKey(delay int) int {
return int(C.Window_WaitKey(C.int(delay)))
}
@@ -271,7 +258,6 @@ type Trackbar struct {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/dfc/group__highgui.html#gaf78d2155d30b728fc413803745b67a9b
//
func (w *Window) CreateTrackbar(name string, max int) *Trackbar {
cName := C.CString(w.name)
defer C.free(unsafe.Pointer(cName))
@@ -288,7 +274,6 @@ func (w *Window) CreateTrackbar(name string, max int) *Trackbar {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/dfc/group__highgui.html#gaf78d2155d30b728fc413803745b67a9b
//
func (w *Window) CreateTrackbarWithValue(name string, value *int, max int) *Trackbar {
cName := C.CString(w.name)
defer C.free(unsafe.Pointer(cName))
@@ -304,7 +289,6 @@ func (w *Window) CreateTrackbarWithValue(name string, value *int, max int) *Trac
//
// For further details, please see:
// https://docs.opencv.org/master/d7/dfc/group__highgui.html#ga122632e9e91b9ec06943472c55d9cda8
//
func (t *Trackbar) GetPos() int {
cName := C.CString(t.parent.name)
defer C.free(unsafe.Pointer(cName))
@@ -319,7 +303,6 @@ func (t *Trackbar) GetPos() int {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/dfc/group__highgui.html#ga67d73c4c9430f13481fd58410d01bd8d
//
func (t *Trackbar) SetPos(pos int) {
cName := C.CString(t.parent.name)
defer C.free(unsafe.Pointer(cName))
@@ -334,7 +317,6 @@ func (t *Trackbar) SetPos(pos int) {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/dfc/group__highgui.html#gabe26ffe8d2b60cc678895595a581b7aa
//
func (t *Trackbar) SetMin(pos int) {
cName := C.CString(t.parent.name)
defer C.free(unsafe.Pointer(cName))
@@ -349,7 +331,6 @@ func (t *Trackbar) SetMin(pos int) {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/dfc/group__highgui.html#ga7e5437ccba37f1154b65210902fc4480
//
func (t *Trackbar) SetMax(pos int) {
cName := C.CString(t.parent.name)
defer C.free(unsafe.Pointer(cName))

View File

@@ -42,3 +42,8 @@ Mat Image_IMDecode(ByteArray buf, int flags) {
cv::Mat img = cv::imdecode(data, flags);
return new cv::Mat(img);
}
void Image_IMDecodeIntoMat(ByteArray buf, int flags, Mat dest) {
std::vector<uchar> data(buf.data, buf.data + buf.length);
cv::imdecode(data, flags, dest);
}

25
vendor/gocv.io/x/gocv/imgcodecs.go generated vendored
View File

@@ -135,7 +135,6 @@ const (
//
// For further details, please see:
// http://docs.opencv.org/master/d4/da8/group__imgcodecs.html#ga288b8b3da0892bd651fce07b3bbd3a56
//
func IMRead(name string, flags IMReadFlag) Mat {
cName := C.CString(name)
defer C.free(unsafe.Pointer(cName))
@@ -147,7 +146,6 @@ func IMRead(name string, flags IMReadFlag) Mat {
//
// For further details, please see:
// http://docs.opencv.org/master/d4/da8/group__imgcodecs.html#gabbc7ef1aa2edfaa87772f1202d67e0ce
//
func IMWrite(name string, img Mat) bool {
cName := C.CString(name)
defer C.free(unsafe.Pointer(cName))
@@ -160,7 +158,6 @@ func IMWrite(name string, img Mat) bool {
//
// For further details, please see:
// http://docs.opencv.org/master/d4/da8/group__imgcodecs.html#gabbc7ef1aa2edfaa87772f1202d67e0ce
//
func IMWriteWithParams(name string, img Mat, params []int) bool {
cName := C.CString(name)
defer C.free(unsafe.Pointer(cName))
@@ -196,7 +193,6 @@ const (
//
// For further details, please see:
// http://docs.opencv.org/master/d4/da8/group__imgcodecs.html#ga461f9ac09887e47797a54567df3b8b63
//
func IMEncode(fileExt FileExt, img Mat) (buf *NativeByteBuffer, err error) {
cfileExt := C.CString(string(fileExt))
defer C.free(unsafe.Pointer(cfileExt))
@@ -211,11 +207,11 @@ func IMEncode(fileExt FileExt, img Mat) (buf *NativeByteBuffer, err error) {
// using the image format passed in in the form of a file extension string.
//
// Usage example:
//
// buffer, err := gocv.IMEncodeWithParams(gocv.JPEGFileExt, img, []int{gocv.IMWriteJpegQuality, quality})
//
// For further details, please see:
// http://docs.opencv.org/master/d4/da8/group__imgcodecs.html#ga461f9ac09887e47797a54567df3b8b63
//
func IMEncodeWithParams(fileExt FileExt, img Mat, params []int) (buf *NativeByteBuffer, err error) {
cfileExt := C.CString(string(fileExt))
defer C.free(unsafe.Pointer(cfileExt))
@@ -242,7 +238,6 @@ func IMEncodeWithParams(fileExt FileExt, img Mat, params []int) (buf *NativeByte
//
// For further details, please see:
// https://docs.opencv.org/master/d4/da8/group__imgcodecs.html#ga26a67788faa58ade337f8d28ba0eb19e
//
func IMDecode(buf []byte, flags IMReadFlag) (Mat, error) {
data, err := toByteArray(buf)
if err != nil {
@@ -250,3 +245,19 @@ func IMDecode(buf []byte, flags IMReadFlag) (Mat, error) {
}
return newMat(C.Image_IMDecode(*data, C.int(flags))), nil
}
// IMDecodeIntoMat reads an image from a buffer in memory into the destination matrix.
// If the buffer is too short or contains invalid data, the function
// returns an error.
//
// For further details, please see:
// https://docs.opencv.org/4.x/d4/da8/group__imgcodecs.html#ga5a0acefe5cbe0a81e904e452ec7ca733
func IMDecodeIntoMat(buf []byte, flags IMReadFlag, dest *Mat) error {
data, err := toByteArray(buf)
if err != nil {
return err
}
C.Image_IMDecodeIntoMat(*data, C.int(flags), dest.p)
return nil
}
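A minimal sketch of the allocation-friendly decode path this binding enables: decode repeatedly into one caller-owned Mat instead of allocating a new one per frame. The file name is hypothetical; assumes the os and gocv.io/x/gocv imports.

func decodeReusing(dest *gocv.Mat) error {
	buf, err := os.ReadFile("image.jpg") // hypothetical input file
	if err != nil {
		return err
	}
	// Decode into the caller-owned Mat rather than returning a fresh one.
	return gocv.IMDecodeIntoMat(buf, gocv.IMReadColor, dest)
}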

1
vendor/gocv.io/x/gocv/imgcodecs.h generated vendored
View File

@@ -17,6 +17,7 @@ void Image_IMEncode(const char* fileExt, Mat img, void* vector);
void Image_IMEncode_WithParams(const char* fileExt, Mat img, IntVector params, void* vector);
Mat Image_IMDecode(ByteArray buf, int flags);
void Image_IMDecodeIntoMat(ByteArray buf, int flag, Mat dest);
#ifdef __cplusplus
}

51
vendor/gocv.io/x/gocv/imgproc.cpp generated vendored
View File

@@ -76,6 +76,10 @@ double CompareHist(Mat hist1, Mat hist2, int method) {
return cv::compareHist(*hist1, *hist2, method);
}
float EMD(Mat sig1, Mat sig2, int distType) {
return cv::EMD(*sig1, *sig2, distType);
}
struct RotatedRect FitEllipse(PointVector pts)
{
cv::RotatedRect bRect = cv::fitEllipse(*pts);
@@ -150,6 +154,14 @@ void ErodeWithParams(Mat src, Mat dst, Mat kernel, Point anchor, int iterations,
cv::erode(*src, *dst, *kernel, pt1, iterations, borderType, cv::morphologyDefaultBorderValue());
}
void ErodeWithParamsAndBorderValue(Mat src, Mat dst, Mat kernel, Point anchor, int iterations, int borderType, Scalar borderValue) {
cv::Point pt1(anchor.x, anchor.y);
cv::Scalar c = cv::Scalar(borderValue.val1, borderValue.val2, borderValue.val3, borderValue.val4);
cv::erode(*src, *dst, *kernel, pt1, iterations, borderType, c);
}
void MatchTemplate(Mat image, Mat templ, Mat result, int method, Mat mask) {
cv::matchTemplate(*image, *templ, *result, method, *mask);
}
@@ -186,6 +198,13 @@ void BoxPoints(RotatedRect rect, Mat boxPts){
cv::boxPoints(rotatedRectangle, *boxPts);
}
void BoxPoints2f(RotatedRect2f rect, Mat boxPts){
cv::Point2f centerPt(rect.center.x , rect.center.y);
cv::Size2f rSize(rect.size.width, rect.size.height);
cv::RotatedRect rotatedRectangle(centerPt, rSize, rect.angle);
cv::boxPoints(rotatedRectangle, *boxPts);
}
double ContourArea(PointVector pts) {
return cv::contourArea(*pts);
}
@@ -213,6 +232,29 @@ struct RotatedRect MinAreaRect(PointVector pts){
return retrect;
}
struct RotatedRect2f MinAreaRect2f(PointVector pts){
cv::RotatedRect cvrect = cv::minAreaRect(*pts);
Point2f* rpts = new Point2f[4];
cv::Point2f* pts4 = new cv::Point2f[4];
cvrect.points(pts4);
for (size_t j = 0; j < 4; j++) {
Point2f pt = {pts4[j].x, pts4[j].y};
rpts[j] = pt;
}
delete[] pts4;
cv::Rect bRect = cvrect.boundingRect();
Rect r = {bRect.x, bRect.y, bRect.width, bRect.height};
Point2f centrpt = {cvrect.center.x, cvrect.center.y};
Size2f szsz = {cvrect.size.width, cvrect.size.height};
RotatedRect2f retrect = {(Contour2f){rpts, 4}, r, centrpt, szsz, cvrect.angle};
return retrect;
}
void MinEnclosingCircle(PointVector pts, Point2f* center, float* radius){
cv::Point2f center2f;
cv::minEnclosingCircle(*pts, center2f, *radius);
@@ -608,6 +650,10 @@ void LinearPolar(Mat src, Mat dst, Point center, double maxRadius, int flags) {
cv::linearPolar(*src, *dst, centerPt, maxRadius, flags);
}
double MatchShapes(PointVector contour1, PointVector contour2, int method, double parameter) {
return cv::matchShapes(*contour1, *contour2, method, parameter);
}
CLAHE CLAHE_Create() {
return new cv::Ptr<cv::CLAHE>(cv::createCLAHE());
}
@@ -639,6 +685,11 @@ Point2f PhaseCorrelate(Mat src1, Mat src2, Mat window, double* response) {
return result2f;
}
void CreateHanningWindow(Mat dst, Size size, int typ) {
cv::Size sz(size.width, size.height);
cv::createHanningWindow(*dst, sz, typ);
}
void Mat_Accumulate(Mat src, Mat dst) {
cv::accumulate(*src, *dst);
}

227
vendor/gocv.io/x/gocv/imgproc.go generated vendored
View File

@@ -18,7 +18,6 @@ import (
// For further details, please see:
//
// https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#ga8d26483c636be6b35c3ec6335798a47c
//
func ArcLength(curve PointVector, isClosed bool) float64 {
return float64(C.ArcLength(curve.p, C.bool(isClosed)))
}
@@ -28,7 +27,6 @@ func ArcLength(curve PointVector, isClosed bool) float64 {
// For further details, please see:
//
// https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#ga0012a5fdaea70b8a9970165d98722b4c
//
func ApproxPolyDP(curve PointVector, epsilon float64, closed bool) PointVector {
return PointVector{p: C.ApproxPolyDP(curve.p, C.double(epsilon), C.bool(closed))}
}
@@ -37,7 +35,6 @@ func ApproxPolyDP(curve PointVector, epsilon float64, closed bool) PointVector {
//
// For further details, please see:
// https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#ga014b28e56cb8854c0de4a211cb2be656
//
func ConvexHull(points PointVector, hull *Mat, clockwise bool, returnPoints bool) {
C.ConvexHull(points.p, hull.p, C.bool(clockwise), C.bool(returnPoints))
}
@@ -46,7 +43,6 @@ func ConvexHull(points PointVector, hull *Mat, clockwise bool, returnPoints bool
//
// For further details, please see:
// https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#gada4437098113fd8683c932e0567f47ba
//
func ConvexityDefects(contour PointVector, hull Mat, result *Mat) {
C.ConvexityDefects(contour.p, hull.p, result.p)
}
@@ -57,7 +53,6 @@ func ConvexityDefects(contour PointVector, hull Mat, result *Mat) {
//
// For further details, please see:
// http://docs.opencv.org/master/d7/d1b/group__imgproc__misc.html#ga4e0972be5de079fed4e3a10e24ef5ef0
//
func CvtColor(src Mat, dst *Mat, code ColorConversionCode) {
C.CvtColor(src.p, dst.p, C.int(code))
}
@@ -181,10 +176,17 @@ func CompareHist(hist1 Mat, hist2 Mat, method HistCompMethod) float32 {
return float32(C.CompareHist(hist1.p, hist2.p, C.int(method)))
}
// EMD Computes the "minimal work" distance between two weighted point configurations.
//
// For further details, please see:
// https://docs.opencv.org/4.x/d6/dc7/group__imgproc__hist.html#ga902b8e60cc7075c8947345489221e0e0
func EMD(signature1, signature2 Mat, typ DistanceTypes) float32 {
return float32(C.EMD(signature1.p, signature2.p, C.int(typ)))
}
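A hedged sketch of calling the new EMD binding on two tiny hand-built signatures, where each row holds a weight followed by a 1-D coordinate; it assumes gocv's NewMatWithSize, MatTypeCV32F, SetFloatAt, and the DistL2 distance-type constant.

func emdExample() float32 {
	sig1 := gocv.NewMatWithSize(2, 2, gocv.MatTypeCV32F)
	defer sig1.Close()
	sig1.SetFloatAt(0, 0, 0.5) // weight of bin 0
	sig1.SetFloatAt(0, 1, 0.0) // position of bin 0
	sig1.SetFloatAt(1, 0, 0.5)
	sig1.SetFloatAt(1, 1, 1.0)

	sig2 := gocv.NewMatWithSize(2, 2, gocv.MatTypeCV32F)
	defer sig2.Close()
	sig2.SetFloatAt(0, 0, 0.5)
	sig2.SetFloatAt(0, 1, 1.0)
	sig2.SetFloatAt(1, 0, 0.5)
	sig2.SetFloatAt(1, 1, 2.0)

	return gocv.EMD(sig1, sig2, gocv.DistL2)
}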
// ClipLine clips the line against the image rectangle.
// For further details, please see:
// https://docs.opencv.org/master/d6/d6e/group__imgproc__draw.html#gaf483cb46ad6b049bc35ec67052ef1c2c
//
func ClipLine(imgSize image.Point, pt1 image.Point, pt2 image.Point) bool {
pSize := C.struct_Size{
width: C.int(imgSize.X),
@@ -214,7 +216,6 @@ func ClipLine(imgSize image.Point, pt1 image.Point, pt2 image.Point) bool {
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#ga9d7064d478c95d60003cf839430737ed
//
func BilateralFilter(src Mat, dst *Mat, diameter int, sigmaColor float64, sigmaSpace float64) {
C.BilateralFilter(src.p, dst.p, C.int(diameter), C.double(sigmaColor), C.double(sigmaSpace))
}
@@ -223,7 +224,6 @@ func BilateralFilter(src Mat, dst *Mat, diameter int, sigmaColor float64, sigmaS
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#ga8c45db9afe636703801b0b2e440fce37
//
func Blur(src Mat, dst *Mat, ksize image.Point) {
pSize := C.struct_Size{
width: C.int(ksize.X),
@@ -237,7 +237,6 @@ func Blur(src Mat, dst *Mat, ksize image.Point) {
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#gad533230ebf2d42509547d514f7d3fbc3
//
func BoxFilter(src Mat, dst *Mat, depth int, ksize image.Point) {
pSize := C.struct_Size{
height: C.int(ksize.X),
@@ -250,7 +249,6 @@ func BoxFilter(src Mat, dst *Mat, depth int, ksize image.Point) {
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#ga045028184a9ef65d7d2579e5c4bff6c0
//
func SqBoxFilter(src Mat, dst *Mat, depth int, ksize image.Point) {
pSize := C.struct_Size{
height: C.int(ksize.X),
@@ -263,7 +261,6 @@ func SqBoxFilter(src Mat, dst *Mat, depth int, ksize image.Point) {
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#ga4ff0f3318642c4f469d0e11f242f3b6c
//
func Dilate(src Mat, dst *Mat, kernel Mat) {
C.Dilate(src.p, dst.p, kernel.p)
}
@@ -318,7 +315,6 @@ const (
//
// For further details, please see:
// https://docs.opencv.org/master/d7/d1b/group__imgproc__misc.html#ga8a0b7fdfcb7a13dde018988ba3a43042
//
func DistanceTransform(src Mat, dst *Mat, labels *Mat, distType DistanceTypes, maskSize DistanceTransformMasks, labelType DistanceTransformLabelTypes) {
C.DistanceTransform(src.p, dst.p, labels.p, C.int(distType), C.int(maskSize), C.int(labelType))
}
@@ -327,7 +323,6 @@ func DistanceTransform(src Mat, dst *Mat, labels *Mat, distType DistanceTypes, m
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#gaeb1e0c1033e3f6b891a25d0511362aeb
//
func Erode(src Mat, dst *Mat, kernel Mat) {
C.Erode(src.p, dst.p, kernel.p)
}
@@ -336,7 +331,6 @@ func Erode(src Mat, dst *Mat, kernel Mat) {
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#gaeb1e0c1033e3f6b891a25d0511362aeb
//
func ErodeWithParams(src Mat, dst *Mat, kernel Mat, anchor image.Point, iterations, borderType int) {
cAnchor := C.struct_Point{
x: C.int(anchor.X),
@@ -346,6 +340,28 @@ func ErodeWithParams(src Mat, dst *Mat, kernel Mat, anchor image.Point, iteratio
C.ErodeWithParams(src.p, dst.p, kernel.p, cAnchor, C.int(iterations), C.int(borderType))
}
// ErodeWithParamsAndBorderValue erodes an image by using a specific structuring
// element. Same as ErodeWithParams but requires an additional borderValue
// parameter.
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#gaeb1e0c1033e3f6b891a25d0511362aeb
func ErodeWithParamsAndBorderValue(src Mat, dst *Mat, kernel Mat, anchor image.Point, iterations, borderType int, borderValue Scalar) {
cAnchor := C.struct_Point{
x: C.int(anchor.X),
y: C.int(anchor.Y),
}
bv := C.struct_Scalar{
val1: C.double(borderValue.Val1),
val2: C.double(borderValue.Val2),
val3: C.double(borderValue.Val3),
val4: C.double(borderValue.Val4),
}
C.ErodeWithParamsAndBorderValue(src.p, dst.p, kernel.p, cAnchor, C.int(iterations), C.int(borderType), bv)
}
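A hedged usage sketch for the new border-value variant (kernel size, border type, and the white border value are illustrative; assumes the image and gocv.io/x/gocv imports):

func erodeWithWhiteBorder(src gocv.Mat, dst *gocv.Mat) {
	kernel := gocv.GetStructuringElement(gocv.MorphRect, image.Pt(3, 3))
	defer kernel.Close()

	anchor := image.Pt(-1, -1) // default anchor: kernel center
	border := gocv.NewScalar(255, 255, 255, 255)

	// Note that borderType is a plain int in this binding's signature.
	gocv.ErodeWithParamsAndBorderValue(src, dst, kernel, anchor, 1, int(gocv.BorderConstant), border)
}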
// RetrievalMode is the mode of the contour retrieval algorithm.
type RetrievalMode int
@@ -401,7 +417,6 @@ const (
//
// For further details, please see:
// https://docs.opencv.org/3.3.0/d3/dc0/group__imgproc__shape.html#gacb413ddce8e48ff3ca61ed7cf626a366
//
func BoundingRect(contour PointVector) image.Rectangle {
r := C.BoundingRect(contour.p)
rect := image.Rect(int(r.x), int(r.y), int(r.x+r.width), int(r.y+r.height))
@@ -412,7 +427,6 @@ func BoundingRect(contour PointVector) image.Rectangle {
//
// For further Details, please see:
// https://docs.opencv.org/3.3.0/d3/dc0/group__imgproc__shape.html#gaf78d467e024b4d7936cf9397185d2f5c
//
func BoxPoints(rect RotatedRect, pts *Mat) {
rPoints := toCPoints(rect.Points)
@@ -444,11 +458,45 @@ func BoxPoints(rect RotatedRect, pts *Mat) {
C.BoxPoints(r, pts.p)
}
// BoxPoints2f finds the four vertices of a rotated rect. Useful to draw the rotated rectangle.
//
// For further Details, please see:
// https://docs.opencv.org/3.3.0/d3/dc0/group__imgproc__shape.html#gaf78d467e024b4d7936cf9397185d2f5c
func BoxPoints2f(rect RotatedRect2f, pts *Mat) {
rPoints := toCPoints2f(rect.Points)
rRect := C.struct_Rect{
x: C.int(rect.BoundingRect.Min.X),
y: C.int(rect.BoundingRect.Min.Y),
width: C.int(rect.BoundingRect.Max.X - rect.BoundingRect.Min.X),
height: C.int(rect.BoundingRect.Max.Y - rect.BoundingRect.Min.Y),
}
rCenter := C.struct_Point2f{
x: C.float(rect.Center.X),
y: C.float(rect.Center.Y),
}
rSize := C.struct_Size2f{
width: C.float(rect.Width),
height: C.float(rect.Height),
}
r := C.struct_RotatedRect2f{
pts: rPoints,
boundingRect: rRect,
center: rCenter,
size: rSize,
angle: C.double(rect.Angle),
}
C.BoxPoints2f(r, pts.p)
}
// ContourArea calculates a contour area.
//
// For further details, please see:
// https://docs.opencv.org/3.3.0/d3/dc0/group__imgproc__shape.html#ga2c759ed9f497d4a618048a2f56dc97f1
//
func ContourArea(contour PointVector) float64 {
result := C.ContourArea(contour.p)
return float64(result)
@@ -463,8 +511,16 @@ type RotatedRect struct {
Angle float64
}
type RotatedRect2f struct {
Points []Point2f
BoundingRect image.Rectangle
Center Point2f
Width float32
Height float32
Angle float64
}
// toPoints converts C.Contour to []image.Point
//
func toPoints(points C.Contour) []image.Point {
pArray := points.points
pLength := int(points.length)
@@ -483,11 +539,29 @@ func toPoints(points C.Contour) []image.Point {
return points4
}
// toPoints2f converts C.Contour2f to []Point2f
func toPoints2f(points C.Contour2f) []Point2f {
pArray := points.points
pLength := int(points.length)
pHdr := reflect.SliceHeader{
Data: uintptr(unsafe.Pointer(pArray)),
Len: pLength,
Cap: pLength,
}
sPoints := *(*[]C.Point2f)(unsafe.Pointer(&pHdr))
points4 := make([]Point2f, pLength)
for j, pt := range sPoints {
points4[j] = NewPoint2f(float32(pt.x), float32(pt.y))
}
return points4
}
// MinAreaRect finds a rotated rectangle of the minimum area enclosing the input 2D point set.
//
// For further details, please see:
// https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#ga3d476a3417130ae5154aea421ca7ead9
//
func MinAreaRect(points PointVector) RotatedRect {
result := C.MinAreaRect(points.p)
defer C.Points_Close(result.pts)
@@ -502,11 +576,28 @@ func MinAreaRect(points PointVector) RotatedRect {
}
}
// MinAreaRect2f finds a rotated rectangle of the minimum area enclosing the input 2D point set.
//
// For further details, please see:
// https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#ga3d476a3417130ae5154aea421ca7ead9
func MinAreaRect2f(points PointVector) RotatedRect2f {
result := C.MinAreaRect2f(points.p)
defer C.Points2f_Close(result.pts)
return RotatedRect2f{
Points: toPoints2f(result.pts),
BoundingRect: image.Rect(int(result.boundingRect.x), int(result.boundingRect.y), int(result.boundingRect.x)+int(result.boundingRect.width), int(result.boundingRect.y)+int(result.boundingRect.height)),
Center: NewPoint2f(float32(result.center.x), float32(result.center.y)),
Width: float32(result.size.width),
Height: float32(result.size.height),
Angle: float64(result.angle),
}
}
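A brief, hedged sketch combining the new float-precision minimum-area rectangle with BoxPoints2f above (the contour argument stands for any gocv.PointVector; the caller is responsible for closing the returned Mat):

func rotatedBox(contour gocv.PointVector) (gocv.RotatedRect2f, gocv.Mat) {
	rect := gocv.MinAreaRect2f(contour)

	pts := gocv.NewMat()
	gocv.BoxPoints2f(rect, &pts) // 4x2 Mat holding the rectangle's corner coordinates
	return rect, pts
}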
// FitEllipse Fits an ellipse around a set of 2D points.
//
// For further details, please see:
// https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#gaf259efaad93098103d6c27b9e4900ffa
//
func FitEllipse(pts PointVector) RotatedRect {
cRect := C.FitEllipse(pts.p)
defer C.Points_Close(cRect.pts)
@@ -539,7 +630,6 @@ func MinEnclosingCircle(pts PointVector) (x, y, radius float32) {
//
// For further details, please see:
// https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#ga95f5b48d01abc7c2e0732db24689837b
//
func FindContours(src Mat, mode RetrievalMode, method ContourApproximationMode) PointsVector {
hierarchy := NewMat()
defer hierarchy.Close()
@@ -550,7 +640,6 @@ func FindContours(src Mat, mode RetrievalMode, method ContourApproximationMode)
//
// For further details, please see:
// https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#ga17ed9f5d79ae97bd4c7cf18403e1689a
//
func FindContoursWithParams(src Mat, hierarchy *Mat, mode RetrievalMode, method ContourApproximationMode) PointsVector {
return PointsVector{p: C.FindContours(src.p, hierarchy.p, C.int(mode), C.int(method))}
}
@@ -559,7 +648,6 @@ func FindContoursWithParams(src Mat, hierarchy *Mat, mode RetrievalMode, method
//
// For further details, please see:
// https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#ga1a539e8db2135af2566103705d7a5722
//
func PointPolygonTest(pts PointVector, pt image.Point, measureDist bool) float64 {
cp := C.struct_Point{
x: C.int(pt.X),
@@ -568,7 +656,7 @@ func PointPolygonTest(pts PointVector, pt image.Point, measureDist bool) float64
return float64(C.PointPolygonTest(pts.p, cp, C.bool(measureDist)))
}
// ConnectedComponentsAlgorithmType specifies the type for ConnectedComponents
type ConnectedComponentsAlgorithmType int
const (
@@ -586,7 +674,6 @@ const (
//
// For further details, please see:
// https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#gaedef8c7340499ca391d459122e51bef5
//
func ConnectedComponents(src Mat, labels *Mat) int {
return int(C.ConnectedComponents(src.p, labels.p, C.int(8), C.int(MatTypeCV32S), C.int(CCL_DEFAULT)))
}
@@ -595,7 +682,6 @@ func ConnectedComponents(src Mat, labels *Mat) int {
//
// For further details, please see:
// https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#gaedef8c7340499ca391d459122e51bef5
//
func ConnectedComponentsWithParams(src Mat, labels *Mat, conn int, ltype MatType,
ccltype ConnectedComponentsAlgorithmType) int {
return int(C.ConnectedComponents(src.p, labels.p, C.int(conn), C.int(ltype), C.int(ccltype)))
@@ -628,7 +714,6 @@ const (
//
// For further details, please see:
// https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#ga107a78bf7cd25dec05fb4dfc5c9e765f
//
func ConnectedComponentsWithStats(src Mat, labels *Mat, stats *Mat, centroids *Mat) int {
return int(C.ConnectedComponentsWithStats(src.p, labels.p, stats.p, centroids.p,
C.int(8), C.int(MatTypeCV32S), C.int(CCL_DEFAULT)))
@@ -639,7 +724,6 @@ func ConnectedComponentsWithStats(src Mat, labels *Mat, stats *Mat, centroids *M
//
// For further details, please see:
// https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#ga107a78bf7cd25dec05fb4dfc5c9e765f
//
func ConnectedComponentsWithStatsWithParams(src Mat, labels *Mat, stats *Mat, centroids *Mat,
conn int, ltype MatType, ccltype ConnectedComponentsAlgorithmType) int {
return int(C.ConnectedComponentsWithStats(src.p, labels.p, stats.p, centroids.p, C.int(conn),
@@ -668,7 +752,6 @@ const (
//
// For further details, please see:
// https://docs.opencv.org/master/df/dfb/group__imgproc__object.html#ga586ebfb0a7fb604b35a23d85391329be
//
func MatchTemplate(image Mat, templ Mat, result *Mat, method TemplateMatchMode, mask Mat) {
C.MatchTemplate(image.p, templ.p, result.p, C.int(method), mask.p)
}
@@ -678,7 +761,6 @@ func MatchTemplate(image Mat, templ Mat, result *Mat, method TemplateMatchMode,
//
// For further details, please see:
// https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html#ga556a180f43cab22649c23ada36a8a139
//
func Moments(src Mat, binaryImage bool) map[string]float64 {
r := C.Moments(src.p, C.bool(binaryImage))
@@ -715,7 +797,6 @@ func Moments(src Mat, binaryImage bool) map[string]float64 {
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#gaf9bba239dfca11654cb7f50f889fc2ff
//
func PyrDown(src Mat, dst *Mat, ksize image.Point, borderType BorderType) {
pSize := C.struct_Size{
height: C.int(ksize.X),
@@ -728,7 +809,6 @@ func PyrDown(src Mat, dst *Mat, ksize image.Point, borderType BorderType) {
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#gada75b59bdaaca411ed6fee10085eb784
//
func PyrUp(src Mat, dst *Mat, ksize image.Point, borderType BorderType) {
pSize := C.struct_Size{
height: C.int(ksize.X),
@@ -742,7 +822,6 @@ func PyrUp(src Mat, dst *Mat, ksize image.Point, borderType BorderType) {
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#ga94756fad83d9d24d29c9bf478558c40a
//
func MorphologyDefaultBorderValue() Scalar {
var scalar C.Scalar = C.MorphologyDefaultBorderValue()
return NewScalar(float64(scalar.val1), float64(scalar.val2), float64(scalar.val3), float64(scalar.val4))
@@ -752,7 +831,6 @@ func MorphologyDefaultBorderValue() Scalar {
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#ga67493776e3ad1a3df63883829375201f
//
func MorphologyEx(src Mat, dst *Mat, op MorphType, kernel Mat) {
C.MorphologyEx(src.p, dst.p, C.int(op), kernel.p)
}
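A minimal sketch combining GetStructuringElement with MorphologyEx to open a binary mask (kernel size and path are illustrative):

package main

import (
	"image"

	"gocv.io/x/gocv"
)

func main() {
	src := gocv.IMRead("mask.png", gocv.IMReadGrayScale)
	defer src.Close()
	dst := gocv.NewMat()
	defer dst.Close()

	// 5x5 rectangular structuring element; an opening removes small speckles.
	kernel := gocv.GetStructuringElement(gocv.MorphRect, image.Pt(5, 5))
	defer kernel.Close()
	gocv.MorphologyEx(src, &dst, gocv.MorphOpen, kernel)
}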
@@ -761,7 +839,6 @@ func MorphologyEx(src Mat, dst *Mat, op MorphType, kernel Mat) {
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#ga67493776e3ad1a3df63883829375201f
//
func MorphologyExWithParams(src Mat, dst *Mat, op MorphType, kernel Mat, iterations int, borderType BorderType) {
pt := C.struct_Point{
x: C.int(-1),
@@ -789,7 +866,6 @@ const (
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#gac342a1bb6eabf6f55c803b09268e36dc
//
func GetStructuringElement(shape MorphShape, ksize image.Point) Mat {
sz := C.struct_Size{
width: C.int(ksize.X),
@@ -863,7 +939,6 @@ const (
//
// For further details, please see:
// http://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#gaabe8c836e97159a9193fb0b11ac52cf1
//
func GaussianBlur(src Mat, dst *Mat, ksize image.Point, sigmaX float64,
sigmaY float64, borderType BorderType) {
pSize := C.struct_Size{
@@ -894,7 +969,6 @@ func GetGaussianKernelWithParams(ksize int, sigma float64, ktype MatType) Mat {
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#gacea54f142e81b6758cb6f375ce782c8d
//
func Sobel(src Mat, dst *Mat, ddepth MatType, dx, dy, ksize int, scale, delta float64, borderType BorderType) {
C.Sobel(src.p, dst.p, C.int(ddepth), C.int(dx), C.int(dy), C.int(ksize), C.double(scale), C.double(delta), C.int(borderType))
}
@@ -903,7 +977,6 @@ func Sobel(src Mat, dst *Mat, ddepth MatType, dx, dy, ksize int, scale, delta fl
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#ga405d03b20c782b65a4daf54d233239a2
//
func SpatialGradient(src Mat, dx, dy *Mat, ksize MatType, borderType BorderType) {
C.SpatialGradient(src.p, dx.p, dy.p, C.int(ksize), C.int(borderType))
}
@@ -912,7 +985,6 @@ func SpatialGradient(src Mat, dx, dy *Mat, ksize MatType, borderType BorderType)
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#gad78703e4c8fe703d479c1860d76429e6
//
func Laplacian(src Mat, dst *Mat, dDepth MatType, size int, scale float64,
delta float64, borderType BorderType) {
C.Laplacian(src.p, dst.p, C.int(dDepth), C.int(size), C.double(scale), C.double(delta), C.int(borderType))
@@ -922,7 +994,6 @@ func Laplacian(src Mat, dst *Mat, dDepth MatType, size int, scale float64,
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#gaa13106761eedf14798f37aa2d60404c9
//
func Scharr(src Mat, dst *Mat, dDepth MatType, dx int, dy int, scale float64,
delta float64, borderType BorderType) {
C.Scharr(src.p, dst.p, C.int(dDepth), C.int(dx), C.int(dy), C.double(scale), C.double(delta), C.int(borderType))
@@ -932,7 +1003,6 @@ func Scharr(src Mat, dst *Mat, dDepth MatType, dx int, dy int, scale float64,
//
// For further details, please see:
// https://docs.opencv.org/master/d4/d86/group__imgproc__filter.html#ga564869aa33e58769b4469101aac458f9
//
func MedianBlur(src Mat, dst *Mat, ksize int) {
C.MedianBlur(src.p, dst.p, C.int(ksize))
}
@@ -947,7 +1017,6 @@ func MedianBlur(src Mat, dst *Mat, ksize int) {
//
// For further details, please see:
// http://docs.opencv.org/master/dd/d1a/group__imgproc__feature.html#ga04723e007ed888ddf11d9ba04e2232de
//
func Canny(src Mat, edges *Mat, t1 float32, t2 float32) {
C.Canny(src.p, edges.p, C.double(t1), C.double(t2))
}
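A minimal sketch of the two-threshold Canny binding (threshold values and input path are illustrative):

package main

import "gocv.io/x/gocv"

func main() {
	src := gocv.IMRead("input.png", gocv.IMReadGrayScale)
	defer src.Close()
	edges := gocv.NewMat()
	defer edges.Close()

	// Pixels above 150 are strong edges; pixels between 50 and 150 are kept
	// only when connected to a strong edge (hysteresis).
	gocv.Canny(src, &edges, 50, 150)
}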
@@ -957,7 +1026,6 @@ func Canny(src Mat, edges *Mat, t1 float32, t2 float32) {
//
// For further details, please see:
// https://docs.opencv.org/master/dd/d1a/group__imgproc__feature.html#ga354e0d7c86d0d9da75de9b9701a9a87e
//
func CornerSubPix(img Mat, corners *Mat, winSize image.Point, zeroZone image.Point, criteria TermCriteria) {
winSz := C.struct_Size{
width: C.int(winSize.X),
@@ -978,7 +1046,6 @@ func CornerSubPix(img Mat, corners *Mat, winSize image.Point, zeroZone image.Poi
//
// For further details, please see:
// https://docs.opencv.org/master/dd/d1a/group__imgproc__feature.html#ga1d6bb77486c8f92d79c8793ad995d541
//
func GoodFeaturesToTrack(img Mat, corners *Mat, maxCorners int, quality float64, minDist float64) {
C.GoodFeaturesToTrack(img.p, corners.p, C.int(maxCorners), C.double(quality), C.double(minDist))
}
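A minimal sketch of GoodFeaturesToTrack on a grayscale frame (corner count, quality level, and minimum distance are illustrative values):

package main

import (
	"fmt"

	"gocv.io/x/gocv"
)

func main() {
	gray := gocv.IMRead("frame.png", gocv.IMReadGrayScale)
	defer gray.Close()

	corners := gocv.NewMat()
	defer corners.Close()

	// Up to 500 corners, quality relative to the strongest corner, at least 10px apart.
	gocv.GoodFeaturesToTrack(gray, &corners, 500, 0.01, 10)
	fmt.Println("corners found:", corners.Rows())
}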
@@ -1005,7 +1072,6 @@ const (
// The function implements the GrabCut image segmentation algorithm.
// For further details, please see:
// https://docs.opencv.org/master/d7/d1b/group__imgproc__misc.html#ga909c1dda50efcbeaa3ce126be862b37f
//
func GrabCut(img Mat, mask *Mat, r image.Rectangle, bgdModel *Mat, fgdModel *Mat, iterCount int, mode GrabCutMode) {
cRect := C.struct_Rect{
x: C.int(r.Min.X),
@@ -1042,7 +1108,6 @@ const (
//
// For further details, please see:
// https://docs.opencv.org/master/dd/d1a/group__imgproc__feature.html#ga47849c3be0d0406ad3ca45db65a25d2d
//
func HoughCircles(src Mat, circles *Mat, method HoughMode, dp, minDist float64) {
C.HoughCircles(src.p, circles.p, C.int(method), C.double(dp), C.double(minDist))
}
@@ -1052,7 +1117,6 @@ func HoughCircles(src Mat, circles *Mat, method HoughMode, dp, minDist float64)
//
// For further details, please see:
// https://docs.opencv.org/master/dd/d1a/group__imgproc__feature.html#ga47849c3be0d0406ad3ca45db65a25d2d
//
func HoughCirclesWithParams(src Mat, circles *Mat, method HoughMode, dp, minDist, param1, param2 float64, minRadius, maxRadius int) {
C.HoughCirclesWithParams(src.p, circles.p, C.int(method), C.double(dp), C.double(minDist), C.double(param1), C.double(param2), C.int(minRadius), C.int(maxRadius))
}
@@ -1063,7 +1127,6 @@ func HoughCirclesWithParams(src Mat, circles *Mat, method HoughMode, dp, minDist
//
// For further details, please see:
// http://docs.opencv.org/master/dd/d1a/group__imgproc__feature.html#ga46b4e588934f6c8dfd509cc6e0e4545a
//
func HoughLines(src Mat, lines *Mat, rho float32, theta float32, threshold int) {
C.HoughLines(src.p, lines.p, C.double(rho), C.double(theta), C.int(threshold))
}
@@ -1074,7 +1137,6 @@ func HoughLines(src Mat, lines *Mat, rho float32, theta float32, threshold int)
//
// For further details, please see:
// http://docs.opencv.org/master/dd/d1a/group__imgproc__feature.html#ga8618180a5948286384e3b7ca02f6feeb
//
func HoughLinesP(src Mat, lines *Mat, rho float32, theta float32, threshold int) {
C.HoughLinesP(src.p, lines.p, C.double(rho), C.double(theta), C.int(threshold))
}
@@ -1088,7 +1150,6 @@ func HoughLinesPWithParams(src Mat, lines *Mat, rho float32, theta float32, thre
//
// For further details, please see:
// https://docs.opencv.org/master/dd/d1a/group__imgproc__feature.html#ga2858ef61b4e47d1919facac2152a160e
//
func HoughLinesPointSet(points Mat, lines *Mat, linesMax int, threshold int,
minRho float32, maxRho float32, rhoStep float32,
minTheta float32, maxTheta float32, thetaStep float32) {
@@ -1100,7 +1161,6 @@ func HoughLinesPointSet(points Mat, lines *Mat, linesMax int, threshold int,
// Integral calculates one or more integral images for the source image.
// For further details, please see:
// https://docs.opencv.org/master/d7/d1b/group__imgproc__misc.html#ga97b87bec26908237e8ba0f6e96d23e28
//
func Integral(src Mat, sum *Mat, sqsum *Mat, tilted *Mat) {
C.Integral(src.p, sum.p, sqsum.p, tilted.p)
}
@@ -1138,7 +1198,6 @@ const (
//
// For further details, please see:
// https://docs.opencv.org/3.3.0/d7/d1b/group__imgproc__misc.html#gae8a4a146d1ca78c626a53577199e9c57
//
func Threshold(src Mat, dst *Mat, thresh float32, maxvalue float32, typ ThresholdType) (threshold float32) {
return float32(C.Threshold(src.p, dst.p, C.double(thresh), C.double(maxvalue), C.int(typ)))
}
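A minimal sketch of Threshold combining a binary threshold with Otsu's method, in which case the returned value is the threshold Otsu selected (input path hypothetical):

package main

import (
	"fmt"

	"gocv.io/x/gocv"
)

func main() {
	src := gocv.IMRead("page.png", gocv.IMReadGrayScale)
	defer src.Close()
	dst := gocv.NewMat()
	defer dst.Close()

	// The thresh argument is ignored when Otsu's method is requested.
	otsu := gocv.Threshold(src, &dst, 0, 255, gocv.ThresholdBinary+gocv.ThresholdOtsu)
	fmt.Println("otsu threshold:", otsu)
}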
@@ -1158,7 +1217,6 @@ const (
//
// For further details, please see:
// https://docs.opencv.org/master/d7/d1b/group__imgproc__misc.html#ga72b913f352e4a1b1b397736707afcde3
//
func AdaptiveThreshold(src Mat, dst *Mat, maxValue float32, adaptiveTyp AdaptiveThresholdType, typ ThresholdType, blockSize int, c float32) {
C.AdaptiveThreshold(src.p, dst.p, C.double(maxValue), C.int(adaptiveTyp), C.int(typ), C.int(blockSize), C.double(c))
}
@@ -1168,7 +1226,6 @@ func AdaptiveThreshold(src Mat, dst *Mat, maxValue float32, adaptiveTyp Adaptive
//
// For further details, please see:
// https://docs.opencv.org/master/d6/d6e/group__imgproc__draw.html#ga0a165a3ca093fd488ac709fdf10c05b2
//
func ArrowedLine(img *Mat, pt1 image.Point, pt2 image.Point, c color.RGBA, thickness int) {
sp1 := C.struct_Point{
x: C.int(pt1.X),
@@ -1194,7 +1251,6 @@ func ArrowedLine(img *Mat, pt1 image.Point, pt2 image.Point, c color.RGBA, thick
//
// For further details, please see:
// https://docs.opencv.org/master/d6/d6e/group__imgproc__draw.html#gaf10604b069374903dbd0f0488cb43670
//
func Circle(img *Mat, center image.Point, radius int, c color.RGBA, thickness int) {
pc := C.struct_Point{
x: C.int(center.X),
@@ -1215,7 +1271,6 @@ func Circle(img *Mat, center image.Point, radius int, c color.RGBA, thickness in
//
// For further details, please see:
// https://docs.opencv.org/master/d6/d6e/group__imgproc__draw.html#gaf10604b069374903dbd0f0488cb43670
//
func CircleWithParams(img *Mat, center image.Point, radius int, c color.RGBA, thickness int, lineType LineType, shift int) {
pc := C.struct_Point{
x: C.int(center.X),
@@ -1236,7 +1291,6 @@ func CircleWithParams(img *Mat, center image.Point, radius int, c color.RGBA, th
//
// For further details, please see:
// https://docs.opencv.org/master/d6/d6e/group__imgproc__draw.html#ga28b2267d35786f5f890ca167236cbc69
//
func Ellipse(img *Mat, center, axes image.Point, angle, startAngle, endAngle float64, c color.RGBA, thickness int) {
pc := C.struct_Point{
x: C.int(center.X),
@@ -1261,7 +1315,6 @@ func Ellipse(img *Mat, center, axes image.Point, angle, startAngle, endAngle flo
//
// For further details, please see:
// https://docs.opencv.org/master/d6/d6e/group__imgproc__draw.html#ga28b2267d35786f5f890ca167236cbc69
//
func EllipseWithParams(img *Mat, center, axes image.Point, angle, startAngle, endAngle float64, c color.RGBA, thickness int, lineType LineType, shift int) {
pc := C.struct_Point{
x: C.int(center.X),
@@ -1286,7 +1339,6 @@ func EllipseWithParams(img *Mat, center, axes image.Point, angle, startAngle, en
//
// For further details, please see:
// https://docs.opencv.org/master/d6/d6e/group__imgproc__draw.html#ga7078a9fae8c7e7d13d24dac2520ae4a2
//
func Line(img *Mat, pt1 image.Point, pt2 image.Point, c color.RGBA, thickness int) {
sp1 := C.struct_Point{
x: C.int(pt1.X),
@@ -1313,7 +1365,6 @@ func Line(img *Mat, pt1 image.Point, pt2 image.Point, c color.RGBA, thickness in
//
// For further details, please see:
// http://docs.opencv.org/master/d6/d6e/group__imgproc__draw.html#ga346ac30b5c74e9b5137576c9ee9e0e8c
//
func Rectangle(img *Mat, r image.Rectangle, c color.RGBA, thickness int) {
cRect := C.struct_Rect{
x: C.int(r.Min.X),
@@ -1337,7 +1388,6 @@ func Rectangle(img *Mat, r image.Rectangle, c color.RGBA, thickness int) {
//
// For further details, please see:
// http://docs.opencv.org/master/d6/d6e/group__imgproc__draw.html#ga346ac30b5c74e9b5137576c9ee9e0e8c
//
func RectangleWithParams(img *Mat, r image.Rectangle, c color.RGBA, thickness int, lineType LineType, shift int) {
cRect := C.struct_Rect{
x: C.int(r.Min.X),
@@ -1411,7 +1461,6 @@ func Polylines(img *Mat, pts PointsVector, isClosed bool, c color.RGBA, thicknes
//
// For more information, see:
// http://sources.isc.org/utils/misc/hershey-font.txt
//
type HersheyFont int
const (
@@ -1441,7 +1490,6 @@ const (
//
// For more information, see:
// https://vovkos.github.io/doxyrest-showcase/opencv/sphinx_rtd_theme/enum_cv_LineTypes.html
//
type LineType int
const (
@@ -1461,7 +1509,6 @@ const (
//
// For further details, please see:
// http://docs.opencv.org/master/d6/d6e/group__imgproc__draw.html#ga3d2abfcb995fd2db908c8288199dba82
//
func GetTextSize(text string, fontFace HersheyFont, fontScale float64, thickness int) image.Point {
cText := C.CString(text)
defer C.free(unsafe.Pointer(cText))
@@ -1476,7 +1523,6 @@ func GetTextSize(text string, fontFace HersheyFont, fontScale float64, thickness
//
// For further details, please see:
// http://docs.opencv.org/master/d6/d6e/group__imgproc__draw.html#ga3d2abfcb995fd2db908c8288199dba82
//
func GetTextSizeWithBaseline(text string, fontFace HersheyFont, fontScale float64, thickness int) (image.Point, int) {
cText := C.CString(text)
defer C.free(unsafe.Pointer(cText))
@@ -1493,7 +1539,6 @@ func GetTextSizeWithBaseline(text string, fontFace HersheyFont, fontScale float6
//
// For further details, please see:
// http://docs.opencv.org/master/d6/d6e/group__imgproc__draw.html#ga5126f47f883d730f633d74f07456c576
//
func PutText(img *Mat, text string, org image.Point, fontFace HersheyFont, fontScale float64, c color.RGBA, thickness int) {
cText := C.CString(text)
defer C.free(unsafe.Pointer(cText))
@@ -1521,7 +1566,6 @@ func PutText(img *Mat, text string, org image.Point, fontFace HersheyFont, fontS
//
// For further details, please see:
// http://docs.opencv.org/master/d6/d6e/group__imgproc__draw.html#ga5126f47f883d730f633d74f07456c576
//
func PutTextWithParams(img *Mat, text string, org image.Point, fontFace HersheyFont, fontScale float64, c color.RGBA, thickness int, lineType LineType, bottomLeftOrigin bool) {
cText := C.CString(text)
defer C.free(unsafe.Pointer(cText))
@@ -1784,7 +1828,6 @@ const (
//
// For further details, please see:
// https://docs.opencv.org/master/d9/d0c/group__calib3d.html#ga4abc2ece9fab9398f2e560d53c8c9780
//
func FindHomography(srcPoints Mat, dstPoints *Mat, method HomographyMethod, ransacReprojThreshold float64, mask *Mat, maxIters int, confidence float64) Mat {
return newMat(C.FindHomography(srcPoints.Ptr(), dstPoints.Ptr(), C.int(method), C.double(ransacReprojThreshold), mask.Ptr(), C.int(maxIters), C.double(confidence)))
}
@@ -1793,7 +1836,6 @@ func FindHomography(srcPoints Mat, dstPoints *Mat, method HomographyMethod, rans
//
// For further details, please see:
// https://docs.opencv.org/master/d6/d6e/group__imgproc__draw.html#ga746c0625f1781f1ffc9056259103edbc
//
func DrawContours(img *Mat, contours PointsVector, contourIdx int, c color.RGBA, thickness int) {
sColor := C.struct_Scalar{
val1: C.double(c.B),
@@ -1809,7 +1851,6 @@ func DrawContours(img *Mat, contours PointsVector, contourIdx int, c color.RGBA,
//
// For further details, please see:
// https://docs.opencv.org/master/d6/d6e/group__imgproc__draw.html#ga746c0625f1781f1ffc9056259103edbc
//
func DrawContoursWithParams(img *Mat, contours PointsVector, contourIdx int, c color.RGBA, thickness int, lineType LineType, hierarchy Mat, maxLevel int, offset image.Point) {
sColor := C.struct_Scalar{
val1: C.double(c.B),
@@ -1912,6 +1953,26 @@ func FitLine(pts PointVector, line *Mat, distType DistanceTypes, param, reps, ae
C.FitLine(pts.p, line.p, C.int(distType), C.double(param), C.double(reps), C.double(aeps))
}
// Shape matching methods.
//
// For further details, please see:
// https://docs.opencv.org/4.x/d3/dc0/group__imgproc__shape.html#gaadc90cb16e2362c9bd6e7363e6e4c317
type ShapeMatchModes int
const (
ContoursMatchI1 ShapeMatchModes = 1
ContoursMatchI2 ShapeMatchModes = 2
ContoursMatchI3 ShapeMatchModes = 3
)
// MatchShapes compares two shapes.
//
// For further details, please see:
// https://docs.opencv.org/4.x/d3/dc0/group__imgproc__shape.html#gaadc90cb16e2362c9bd6e7363e6e4c317
func MatchShapes(contour1 PointVector, contour2 PointVector, method ShapeMatchModes, parameter float64) float64 {
return float64(C.MatchShapes(contour1.p, contour2.p, C.int(method), C.double(parameter)))
}
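A minimal sketch of the new MatchShapes binding, comparing the first outer contour of two binary masks (paths hypothetical); lower scores mean more similar shapes:

package main

import (
	"fmt"

	"gocv.io/x/gocv"
)

func main() {
	a := gocv.IMRead("shape_a.png", gocv.IMReadGrayScale)
	defer a.Close()
	b := gocv.IMRead("shape_b.png", gocv.IMReadGrayScale)
	defer b.Close()

	ca := gocv.FindContours(a, gocv.RetrievalExternal, gocv.ChainApproxSimple)
	defer ca.Close()
	cb := gocv.FindContours(b, gocv.RetrievalExternal, gocv.ChainApproxSimple)
	defer cb.Close()

	// Hu-moment based distance using the I1 metric; the last parameter is unused here.
	score := gocv.MatchShapes(ca.At(0), cb.At(0), gocv.ContoursMatchI1, 0)
	fmt.Println("shape distance:", score)
}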
// CLAHE is a wrapper around the cv::CLAHE algorithm.
type CLAHE struct {
// C.CLAHE
@@ -1922,7 +1983,6 @@ type CLAHE struct {
//
// For further details, please see:
// https://docs.opencv.org/master/d6/db6/classcv_1_1CLAHE.html
//
func NewCLAHE() CLAHE {
return CLAHE{p: unsafe.Pointer(C.CLAHE_Create())}
}
@@ -1931,7 +1991,6 @@ func NewCLAHE() CLAHE {
//
// For further details, please see:
// https://docs.opencv.org/master/d6/db6/classcv_1_1CLAHE.html
//
func NewCLAHEWithParams(clipLimit float64, tileGridSize image.Point) CLAHE {
pSize := C.struct_Size{
width: C.int(tileGridSize.X),
@@ -1951,7 +2010,6 @@ func (c *CLAHE) Close() error {
//
// For further details, please see:
// https://docs.opencv.org/master/d6/db6/classcv_1_1CLAHE.html#a4e92e0e427de21be8d1fae8dcd862c5e
//
func (c *CLAHE) Apply(src Mat, dst *Mat) {
C.CLAHE_Apply((C.CLAHE)(c.p), src.p, dst.p)
}
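A minimal sketch of CLAHE on an 8-bit grayscale image using the parameterised constructor (clip limit and tile grid size are illustrative):

package main

import (
	"image"

	"gocv.io/x/gocv"
)

func main() {
	src := gocv.IMRead("lowcontrast.png", gocv.IMReadGrayScale)
	defer src.Close()
	dst := gocv.NewMat()
	defer dst.Close()

	// Contrast-limited adaptive histogram equalization on 8x8 tiles.
	clahe := gocv.NewCLAHEWithParams(2.0, image.Pt(8, 8))
	defer clahe.Close()
	clahe.Apply(src, &dst)
}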
@@ -1964,7 +2022,6 @@ func InvertAffineTransform(src Mat, dst *Mat) {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/df3/group__imgproc__motion.html#ga552420a2ace9ef3fb053cd630fdb4952
//
func PhaseCorrelate(src1, src2, window Mat) (phaseShift Point2f, response float64) {
var responseDouble C.double
result := C.PhaseCorrelate(src1.p, src2.p, window.p, &responseDouble)
@@ -1975,6 +2032,19 @@ func PhaseCorrelate(src1, src2, window Mat) (phaseShift Point2f, response float6
}, float64(responseDouble)
}
// CreateHanningWindow computes a Hanning window coefficients in two dimensions.
//
// For further details, please see:
// https://docs.opencv.org/4.x/d7/df3/group__imgproc__motion.html#ga80e5c3de52f6bab3a7c1e60e89308e1b
func CreateHanningWindow(img *Mat, size image.Point, typ MatType) {
sz := C.struct_Size{
width: C.int(size.X),
height: C.int(size.Y),
}
C.CreateHanningWindow(img.p, sz, C.int(typ))
}
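A minimal sketch combining the new CreateHanningWindow binding with PhaseCorrelate to estimate the translation between two frames (paths hypothetical; PhaseCorrelate expects floating-point single-channel inputs):

package main

import (
	"fmt"
	"image"

	"gocv.io/x/gocv"
)

func main() {
	a := gocv.IMRead("frame1.png", gocv.IMReadGrayScale)
	defer a.Close()
	b := gocv.IMRead("frame2.png", gocv.IMReadGrayScale)
	defer b.Close()

	// Convert to 32-bit float as required by phase correlation.
	af, bf := gocv.NewMat(), gocv.NewMat()
	defer af.Close()
	defer bf.Close()
	a.ConvertTo(&af, gocv.MatTypeCV32F)
	b.ConvertTo(&bf, gocv.MatTypeCV32F)

	// Hanning window of matching size reduces edge effects in the FFT.
	win := gocv.NewMat()
	defer win.Close()
	gocv.CreateHanningWindow(&win, image.Pt(a.Cols(), a.Rows()), gocv.MatTypeCV32F)

	shift, response := gocv.PhaseCorrelate(af, bf, win)
	fmt.Printf("shift=(%.2f, %.2f) response=%.3f\n", shift.X, shift.Y, response)
}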
// ToImage converts a Mat to a image.Image.
func (m *Mat) ToImage() (image.Image, error) {
switch m.Type() {
@@ -2180,7 +2250,6 @@ func Accumulate(src Mat, dst *Mat) {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/df3/group__imgproc__motion.html#ga1a567a79901513811ff3b9976923b199
//
func AccumulateWithMask(src Mat, dst *Mat, mask Mat) {
C.Mat_AccumulateWithMask(src.p, dst.p, mask.p)
}
@@ -2189,7 +2258,6 @@ func AccumulateWithMask(src Mat, dst *Mat, mask Mat) {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/df3/group__imgproc__motion.html#gacb75e7ffb573227088cef9ceaf80be8c
//
func AccumulateSquare(src Mat, dst *Mat) {
C.Mat_AccumulateSquare(src.p, dst.p)
}
@@ -2198,7 +2266,6 @@ func AccumulateSquare(src Mat, dst *Mat) {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/df3/group__imgproc__motion.html#gacb75e7ffb573227088cef9ceaf80be8c
//
func AccumulateSquareWithMask(src Mat, dst *Mat, mask Mat) {
C.Mat_AccumulateSquareWithMask(src.p, dst.p, mask.p)
}
@@ -2207,7 +2274,6 @@ func AccumulateSquareWithMask(src Mat, dst *Mat, mask Mat) {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/df3/group__imgproc__motion.html#ga82518a940ecfda49460f66117ac82520
//
func AccumulateProduct(src1 Mat, src2 Mat, dst *Mat) {
C.Mat_AccumulateProduct(src1.p, src2.p, dst.p)
}
@@ -2216,7 +2282,6 @@ func AccumulateProduct(src1 Mat, src2 Mat, dst *Mat) {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/df3/group__imgproc__motion.html#ga82518a940ecfda49460f66117ac82520
//
func AccumulateProductWithMask(src1 Mat, src2 Mat, dst *Mat, mask Mat) {
C.Mat_AccumulateProductWithMask(src1.p, src2.p, dst.p, mask.p)
}
@@ -2225,7 +2290,6 @@ func AccumulateProductWithMask(src1 Mat, src2 Mat, dst *Mat, mask Mat) {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/df3/group__imgproc__motion.html#ga4f9552b541187f61f6818e8d2d826bc7
//
func AccumulatedWeighted(src Mat, dst *Mat, alpha float64) {
C.Mat_AccumulatedWeighted(src.p, dst.p, C.double(alpha))
}
@@ -2234,7 +2298,6 @@ func AccumulatedWeighted(src Mat, dst *Mat, alpha float64) {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/df3/group__imgproc__motion.html#ga4f9552b541187f61f6818e8d2d826bc7
//
func AccumulatedWeightedWithMask(src Mat, dst *Mat, alpha float64, mask Mat) {
C.Mat_AccumulatedWeightedWithMask(src.p, dst.p, C.double(alpha), mask.p)
}

6
vendor/gocv.io/x/gocv/imgproc.h generated vendored
View File

@@ -23,6 +23,7 @@ void EqualizeHist(Mat src, Mat dst);
void CalcHist(struct Mats mats, IntVector chans, Mat mask, Mat hist, IntVector sz, FloatVector rng, bool acc);
void CalcBackProject(struct Mats mats, IntVector chans, Mat hist, Mat backProject, FloatVector rng, bool uniform);
double CompareHist(Mat hist1, Mat hist2, int method);
float EMD(Mat sig1, Mat sig2, int distType);
void ConvexHull(PointVector points, Mat hull, bool clockwise, bool returnPoints);
void ConvexityDefects(PointVector points, Mat hull, Mat result);
void BilateralFilter(Mat src, Mat dst, int d, double sc, double ss);
@@ -34,14 +35,17 @@ void DilateWithParams(Mat src, Mat dst, Mat kernel, Point anchor, int iterations
void DistanceTransform(Mat src, Mat dst, Mat labels, int distanceType, int maskSize, int labelType);
void Erode(Mat src, Mat dst, Mat kernel);
void ErodeWithParams(Mat src, Mat dst, Mat kernel, Point anchor, int iterations, int borderType);
void ErodeWithParamsAndBorderValue(Mat src, Mat dst, Mat kernel, Point anchor, int iterations, int borderType, Scalar borderValue);
void MatchTemplate(Mat image, Mat templ, Mat result, int method, Mat mask);
struct Moment Moments(Mat src, bool binaryImage);
void PyrDown(Mat src, Mat dst, Size dstsize, int borderType);
void PyrUp(Mat src, Mat dst, Size dstsize, int borderType);
struct Rect BoundingRect(PointVector pts);
void BoxPoints(RotatedRect rect, Mat boxPts);
void BoxPoints2f(RotatedRect2f rect, Mat boxPts);
double ContourArea(PointVector pts);
struct RotatedRect MinAreaRect(PointVector pts);
struct RotatedRect2f MinAreaRect2f(PointVector pts);
struct RotatedRect FitEllipse(PointVector pts);
void MinEnclosingCircle(PointVector pts, Point2f* center, float* radius);
PointsVector FindContours(Mat src, Mat hierarchy, int mode, int method);
@@ -124,6 +128,7 @@ void SepFilter2D(Mat src, Mat dst, int ddepth, Mat kernelX, Mat kernelY, Point a
void LogPolar(Mat src, Mat dst, Point center, double m, int flags);
void FitLine(PointVector pts, Mat line, int distType, double param, double reps, double aeps);
void LinearPolar(Mat src, Mat dst, Point center, double maxRadius, int flags);
double MatchShapes(PointVector contour1, PointVector contour2, int method, double parameter);
bool ClipLine(Size imgSize, Point pt1, Point pt2);
CLAHE CLAHE_Create();
CLAHE CLAHE_CreateWithParams(double clipLimit, Size tileGridSize);
@@ -131,6 +136,7 @@ void CLAHE_Close(CLAHE c);
void CLAHE_Apply(CLAHE c, Mat src, Mat dst);
void InvertAffineTransform(Mat src, Mat dst);
Point2f PhaseCorrelate(Mat src1, Mat src2, Mat window, double* response);
void CreateHanningWindow(Mat dst, Size size, int typ);
void Mat_Accumulate(Mat src, Mat dst);
void Mat_AccumulateWithMask(Mat src, Mat dst, Mat mask);
void Mat_AccumulateSquare(Mat src, Mat dst);

View File

@@ -4,28 +4,40 @@ package gocv
//
// For further details, please see:
// http://docs.opencv.org/master/d7/d1b/group__imgproc__misc.html#ga4e0972be5de079fed4e3a10e24ef5ef0
//
type ColorConversionCode int
const (
// ColorBGRToBGRA adds alpha channel to BGR image.
ColorBGRToBGRA ColorConversionCode = 0
// ColorRGBToRGBA adds alpha channel to RGB image.
ColorRGBToRGBA ColorConversionCode = ColorBGRToBGRA
// ColorBGRAToBGR removes alpha channel from BGR image.
ColorBGRAToBGR ColorConversionCode = 1
// ColorRGBAToRGB removes alpha channel from RGB image.
ColorRGBAToRGB ColorConversionCode = ColorBGRAToBGR
// ColorBGRToRGBA converts from BGR to RGB with alpha channel.
ColorBGRToRGBA ColorConversionCode = 2
// ColorRGBToBGRA converts from RGB to BGR with alpha channel.
ColorRGBToBGRA ColorConversionCode = ColorBGRToRGBA
// ColorRGBAToBGR converts from RGB with alpha to BGR color space.
ColorRGBAToBGR ColorConversionCode = 3
// ColorBGRAToRGB converts from BGR with alpha to RGB color space.
ColorBGRAToRGB ColorConversionCode = ColorRGBAToBGR
// ColorBGRToRGB converts from BGR to RGB without alpha channel.
ColorBGRToRGB ColorConversionCode = 4
// ColorRGBToBGR converts from RGB to BGR without alpha channel.
ColorRGBToBGR ColorConversionCode = ColorBGRToRGB
// ColorBGRAToRGBA converts from BGR with alpha channel
// to RGB with alpha channel.
ColorBGRAToRGBA ColorConversionCode = 5
// ColorRGBAToBGRA converts from RGB with alpha channel
// to BGR with alpha channel.
ColorRGBAToBGRA ColorConversionCode = ColorBGRAToRGBA
// ColorBGRToGray converts from BGR to grayscale.
ColorBGRToGray ColorConversionCode = 6
@@ -35,9 +47,13 @@ const (
// ColorGrayToBGR converts from grayscale to BGR.
ColorGrayToBGR ColorConversionCode = 8
// ColorGrayToRGB converts from grayscale to RGB.
ColorGrayToRGB ColorConversionCode = ColorGrayToBGR
// ColorGrayToBGRA converts from grayscale to BGR with alpha channel.
ColorGrayToBGRA ColorConversionCode = 9
// ColorGrayToRGBA converts from grayscale to RGB with alpha channel.
ColorGrayToRGBA ColorConversionCode = ColorGrayToBGRA
// ColorBGRAToGray converts from BGR with alpha channel to grayscale.
ColorBGRAToGray ColorConversionCode = 10
@@ -324,6 +340,11 @@ const (
ColorBayerRGToBGR ColorConversionCode = 48
ColorBayerGRToBGR ColorConversionCode = 49
ColorBayerBGToRGB ColorConversionCode = ColorBayerRGToBGR
ColorBayerGBToRGB ColorConversionCode = ColorBayerGRToBGR
ColorBayerRGToRGB ColorConversionCode = ColorBayerBGToBGR
ColorBayerGRToRGB ColorConversionCode = ColorBayerGBToBGR
ColorBayerBGToGRAY ColorConversionCode = 86
ColorBayerGBToGRAY ColorConversionCode = 87
ColorBayerRGToGRAY ColorConversionCode = 88
@@ -335,17 +356,32 @@ const (
ColorBayerRGToBGRVNG ColorConversionCode = 64
ColorBayerGRToBGRVNG ColorConversionCode = 65
ColorBayerBGToRGBVNG ColorConversionCode = ColorBayerRGToBGRVNG
ColorBayerGBToRGBVNG ColorConversionCode = ColorBayerGRToBGRVNG
ColorBayerRGToRGBVNG ColorConversionCode = ColorBayerBGToBGRVNG
ColorBayerGRToRGBVNG ColorConversionCode = ColorBayerGBToBGRVNG
// Edge-Aware Demosaicing
ColorBayerBGToBGREA ColorConversionCode = 135
ColorBayerGBToBGREA ColorConversionCode = 136
ColorBayerRGToBGREA ColorConversionCode = 137
ColorBayerGRToBGREA ColorConversionCode = 138
ColorBayerBGToRGBEA ColorConversionCode = ColorBayerRGToBGREA
ColorBayerGBToRGBEA ColorConversionCode = ColorBayerGRToBGREA
ColorBayerRGToRGBEA ColorConversionCode = ColorBayerBGToBGREA
ColorBayerGRToRGBEA ColorConversionCode = ColorBayerGBToBGREA
// Demosaicing with alpha channel
ColorBayerBGToBGRA ColorConversionCode = 139
ColorBayerGBToBGRA ColorConversionCode = 140
ColorBayerRGToBGRA ColorConversionCode = 141
ColorBayerGRToBGRA ColorConversionCode = 142
ColorBayerBGToRGBA ColorConversionCode = ColorBayerRGToBGRA
ColorBayerGBToRGBA ColorConversionCode = ColorBayerGRToBGRA
ColorBayerRGToRGBA ColorConversionCode = ColorBayerBGToBGRA
ColorBayerGRToRGBA ColorConversionCode = ColorBayerGBToBGRA
ColorCOLORCVTMAX ColorConversionCode = 143
)
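A minimal sketch using one of the conversion codes above with CvtColor (path hypothetical); IMRead returns BGR-ordered data, so the BGR variants are the matching codes:

package main

import "gocv.io/x/gocv"

func main() {
	img := gocv.IMRead("photo.jpg", gocv.IMReadColor)
	defer img.Close()
	gray := gocv.NewMat()
	defer gray.Close()

	// BGR input (as loaded by IMRead) to single-channel grayscale.
	gocv.CvtColor(img, &gray, gocv.ColorBGRToGray)
}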

View File

@@ -1,3 +1,4 @@
//go:build !matprofile
// +build !matprofile
package gocv

View File

@@ -1,3 +1,4 @@
//go:build matprofile
// +build matprofile
package gocv
@@ -25,7 +26,7 @@ import (
//
// and you can display the current entries with:
//
// var b bytes.Buffer
// var b bytes.Buffer
// gocv.MatProfile.WriteTo(&b, 1)
// fmt.Print(b.String())
//

18
vendor/gocv.io/x/gocv/objdetect.go generated vendored
View File

@@ -14,7 +14,6 @@ import (
//
// For further details, please see:
// http://docs.opencv.org/master/d1/de5/classcv_1_1CascadeClassifier.html
//
type CascadeClassifier struct {
p C.CascadeClassifier
}
@@ -35,7 +34,6 @@ func (c *CascadeClassifier) Close() error {
//
// For further details, please see:
// http://docs.opencv.org/master/d1/de5/classcv_1_1CascadeClassifier.html#a1a5884c8cc749422f9eb77c2471958bc
//
func (c *CascadeClassifier) Load(name string) bool {
cName := C.CString(name)
defer C.free(unsafe.Pointer(cName))
@@ -47,7 +45,6 @@ func (c *CascadeClassifier) Load(name string) bool {
//
// For further details, please see:
// http://docs.opencv.org/master/d1/de5/classcv_1_1CascadeClassifier.html#aaf8181cb63968136476ec4204ffca498
//
func (c *CascadeClassifier) DetectMultiScale(img Mat) []image.Rectangle {
ret := C.CascadeClassifier_DetectMultiScale(c.p, img.p)
defer C.Rects_Close(ret)
@@ -60,7 +57,6 @@ func (c *CascadeClassifier) DetectMultiScale(img Mat) []image.Rectangle {
//
// For further details, please see:
// http://docs.opencv.org/master/d1/de5/classcv_1_1CascadeClassifier.html#aaf8181cb63968136476ec4204ffca498
//
func (c *CascadeClassifier) DetectMultiScaleWithParams(img Mat, scale float64,
minNeighbors, flags int, minSize, maxSize image.Point) []image.Rectangle {
@@ -85,7 +81,6 @@ func (c *CascadeClassifier) DetectMultiScaleWithParams(img Mat, scale float64,
//
// For further details, please see:
// https://docs.opencv.org/master/d5/d33/structcv_1_1HOGDescriptor.html#a723b95b709cfd3f95cf9e616de988fc8
//
type HOGDescriptor struct {
p C.HOGDescriptor
}
@@ -107,7 +102,6 @@ func (h *HOGDescriptor) Close() error {
//
// For further details, please see:
// https://docs.opencv.org/master/d5/d33/structcv_1_1HOGDescriptor.html#a660e5cd036fd5ddf0f5767b352acd948
//
func (h *HOGDescriptor) DetectMultiScale(img Mat) []image.Rectangle {
ret := C.HOGDescriptor_DetectMultiScale(h.p, img.p)
defer C.Rects_Close(ret)
@@ -120,7 +114,6 @@ func (h *HOGDescriptor) DetectMultiScale(img Mat) []image.Rectangle {
//
// For further details, please see:
// https://docs.opencv.org/master/d5/d33/structcv_1_1HOGDescriptor.html#a660e5cd036fd5ddf0f5767b352acd948
//
func (h *HOGDescriptor) DetectMultiScaleWithParams(img Mat, hitThresh float64,
winStride, padding image.Point, scale, finalThreshold float64, useMeanshiftGrouping bool) []image.Rectangle {
wSz := C.struct_Size{
@@ -144,7 +137,6 @@ func (h *HOGDescriptor) DetectMultiScaleWithParams(img Mat, hitThresh float64,
//
// For further details, please see:
// https://docs.opencv.org/master/d5/d33/structcv_1_1HOGDescriptor.html#a660e5cd036fd5ddf0f5767b352acd948
//
func HOGDefaultPeopleDetector() Mat {
return newMat(C.HOG_GetDefaultPeopleDetector())
}
@@ -153,7 +145,6 @@ func HOGDefaultPeopleDetector() Mat {
//
// For further details, please see:
// https://docs.opencv.org/master/d5/d33/structcv_1_1HOGDescriptor.html#a09e354ad701f56f9c550dc0385dc36f1
//
func (h *HOGDescriptor) SetSVMDetector(det Mat) error {
C.HOGDescriptor_SetSVMDetector(h.p, det.p)
return nil
@@ -163,7 +154,6 @@ func (h *HOGDescriptor) SetSVMDetector(det Mat) error {
//
// For further details, please see:
// https://docs.opencv.org/master/d5/d54/group__objdetect.html#ga3dba897ade8aa8227edda66508e16ab9
//
func GroupRectangles(rects []image.Rectangle, groupThreshold int, eps float64) []image.Rectangle {
cRectArray := make([]C.struct_Rect, len(rects))
for i, r := range rects {
@@ -189,7 +179,6 @@ func GroupRectangles(rects []image.Rectangle, groupThreshold int, eps float64) [
//
// For further details, please see:
// https://docs.opencv.org/master/de/dc3/classcv_1_1QRCodeDetector.html
//
type QRCodeDetector struct {
p C.QRCodeDetector
}
@@ -214,7 +203,6 @@ func (a *QRCodeDetector) Close() error {
// Returns true as long as some QR code was detected, even in cases where decoding failed
// For further details, please see:
// https://docs.opencv.org/master/de/dc3/classcv_1_1QRCodeDetector.html#a7290bd6a5d59b14a37979c3a14fbf394
//
func (a *QRCodeDetector) DetectAndDecode(input Mat, points *Mat, straight_qrcode *Mat) string {
goResult := C.GoString(C.QRCodeDetector_DetectAndDecode(a.p, input.p, points.p, straight_qrcode.p))
return string(goResult)
@@ -224,7 +212,6 @@ func (a *QRCodeDetector) DetectAndDecode(input Mat, points *Mat, straight_qrcode
//
// For further details, please see:
// https://docs.opencv.org/master/de/dc3/classcv_1_1QRCodeDetector.html#a64373f7d877d27473f64fe04bb57d22b
//
func (a *QRCodeDetector) Detect(input Mat, points *Mat) bool {
result := C.QRCodeDetector_Detect(a.p, input.p, points.p)
return bool(result)
@@ -234,7 +221,6 @@ func (a *QRCodeDetector) Detect(input Mat, points *Mat) bool {
//
// For further details, please see:
// https://docs.opencv.org/master/de/dc3/classcv_1_1QRCodeDetector.html#a4172c2eb4825c844fb1b0ae67202d329
//
func (a *QRCodeDetector) Decode(input Mat, points Mat, straight_qrcode *Mat) string {
goResult := C.GoString(C.QRCodeDetector_DetectAndDecode(a.p, input.p, points.p, straight_qrcode.p))
return string(goResult)
@@ -252,13 +238,13 @@ func (a *QRCodeDetector) DetectMulti(input Mat, points *Mat) bool {
return bool(result)
}
// Detects QR codes in image and finds of the quadrangles containing the codes and decode the decode the QRCodes to strings.
// Detects QR codes in image, finds the quadrangles containing the codes, and decodes the QRCodes to strings.
//
// Each quadrangle is returned as a row in the `points` Mat and each point is a Vecf.
// Returns true as long as some QR code was detected, even in cases where decoding failed
// For usage please see TestQRCodeDetector
// For further details, please see:
//https://docs.opencv.org/master/de/dc3/classcv_1_1QRCodeDetector.html#a188b63ffa17922b2c65d8a0ab7b70775
// https://docs.opencv.org/master/de/dc3/classcv_1_1QRCodeDetector.html#a188b63ffa17922b2c65d8a0ab7b70775
func (a *QRCodeDetector) DetectAndDecodeMulti(input Mat, decoded *[]string, points *Mat, qrCodes *[]Mat) bool {
cDecoded := C.CStrings{}
defer C.CStrings_Close(cDecoded)

4
vendor/gocv.io/x/gocv/photo.cpp generated vendored
View File

@@ -116,3 +116,7 @@ void PencilSketch(Mat src, Mat dst1, Mat dst2, float sigma_s, float sigma_r, flo
void Stylization(Mat src, Mat dst, float sigma_s, float sigma_r) {
cv::stylization(*src, *dst, sigma_s, sigma_r);
}
void PhotoInpaint(Mat src, Mat mask, Mat dst, float inpaint_radius, int algorithm_type) {
cv::inpaint(*src, *mask, *dst, inpaint_radius, algorithm_type);
}

44
vendor/gocv.io/x/gocv/photo.go generated vendored
View File

@@ -11,7 +11,7 @@ import (
"unsafe"
)
//SeamlessCloneFlags seamlessClone algorithm flags
// SeamlessCloneFlags seamlessClone algorithm flags
type SeamlessCloneFlags int
// MergeMertens is a wrapper around the cv::MergeMertens.
@@ -39,7 +39,6 @@ const (
//
// For further details, please see:
// https://docs.opencv.org/master/df/da0/group__photo__clone.html#ga6684f35dc669ff6196a7c340dc73b98e
//
func ColorChange(src, mask Mat, dst *Mat, red_mul, green_mul, blue_mul float32) {
C.ColorChange(src.p, mask.p, dst.p, C.float(red_mul), C.float(green_mul), C.float(blue_mul))
}
@@ -48,7 +47,6 @@ func ColorChange(src, mask Mat, dst *Mat, red_mul, green_mul, blue_mul float32)
//
// For further details, please see:
// https://docs.opencv.org/master/df/da0/group__photo__clone.html#ga2bf426e4c93a6b1f21705513dfeca49d
//
func SeamlessClone(src, dst, mask Mat, p image.Point, blend *Mat, flags SeamlessCloneFlags) {
cp := C.struct_Point{
x: C.int(p.X),
@@ -62,7 +60,6 @@ func SeamlessClone(src, dst, mask Mat, p image.Point, blend *Mat, flags Seamless
//
// For further details, please see:
// https://docs.opencv.org/master/df/da0/group__photo__clone.html#gac5025767cf2febd8029d474278e886c7
//
func IlluminationChange(src, mask Mat, dst *Mat, alpha, beta float32) {
C.IlluminationChange(src.p, mask.p, dst.p, C.float(alpha), C.float(beta))
}
@@ -71,7 +68,6 @@ func IlluminationChange(src, mask Mat, dst *Mat, alpha, beta float32) {
//
// For further details, please see:
// https://docs.opencv.org/master/df/da0/group__photo__clone.html#gad55df6aa53797365fa7cc23959a54004
//
func TextureFlattening(src, mask Mat, dst *Mat, lowThreshold, highThreshold float32, kernelSize int) {
C.TextureFlattening(src.p, mask.p, dst.p, C.float(lowThreshold), C.float(highThreshold), C.int(kernelSize))
}
@@ -80,7 +76,6 @@ func TextureFlattening(src, mask Mat, dst *Mat, lowThreshold, highThreshold floa
//
// For further details, please see:
// https://docs.opencv.org/master/d1/d79/group__photo__denoise.html#gaa501e71f52fb2dc17ff8ca5e7d2d3619
//
func FastNlMeansDenoisingColoredMulti(src []Mat, dst *Mat, imgToDenoiseIndex int, temporalWindowSize int) {
cMatArray := make([]C.Mat, len(src))
for i, r := range src {
@@ -97,7 +92,6 @@ func FastNlMeansDenoisingColoredMulti(src []Mat, dst *Mat, imgToDenoiseIndex int
//
// For further details, please see:
// https://docs.opencv.org/master/d1/d79/group__photo__denoise.html#gaa501e71f52fb2dc17ff8ca5e7d2d3619
//
func FastNlMeansDenoisingColoredMultiWithParams(src []Mat, dst *Mat, imgToDenoiseIndex int, temporalWindowSize int, h float32, hColor float32, templateWindowSize int, searchWindowSize int) {
cMatArray := make([]C.Mat, len(src))
for i, r := range src {
@@ -118,7 +112,6 @@ func FastNlMeansDenoisingColoredMultiWithParams(src []Mat, dst *Mat, imgToDenois
// https://docs.opencv.org/master/d6/df5/group__photo__hdr.html
// https://docs.opencv.org/master/d7/dd6/classcv_1_1MergeMertens.html
// https://docs.opencv.org/master/d6/df5/group__photo__hdr.html#ga79d59aa3cb3a7c664e59a4b5acc1ccb6
//
func NewMergeMertens() MergeMertens {
return MergeMertens{p: unsafe.Pointer(C.MergeMertens_Create())}
}
@@ -131,7 +124,6 @@ func NewMergeMertens() MergeMertens {
// https://docs.opencv.org/master/d6/df5/group__photo__hdr.html
// https://docs.opencv.org/master/d7/dd6/classcv_1_1MergeMertens.html
// https://docs.opencv.org/master/d6/df5/group__photo__hdr.html#ga79d59aa3cb3a7c664e59a4b5acc1ccb6
//
func NewMergeMertensWithParams(contrast_weight float32, saturation_weight float32, exposure_weight float32) MergeMertens {
return MergeMertens{p: unsafe.Pointer(C.MergeMertens_CreateWithParams(C.float(contrast_weight), C.float(saturation_weight), C.float(exposure_weight)))}
}
@@ -147,7 +139,6 @@ func (b *MergeMertens) Close() error {
// Returns an image Mat: 8-bit, 3-channel image (RGB, 8 bits per channel)
// For further details, please see:
// https://docs.opencv.org/master/d7/dd6/classcv_1_1MergeMertens.html#a2d2254b2aab722c16954de13a663644d
//
func (b *MergeMertens) Process(src []Mat, dst *Mat) {
cMatArray := make([]C.Mat, len(src))
for i, r := range src {
@@ -172,7 +163,6 @@ func (b *MergeMertens) Process(src []Mat, dst *Mat) {
// https://docs.opencv.org/master/d6/df5/group__photo__hdr.html
// https://docs.opencv.org/master/d7/db6/classcv_1_1AlignMTB.html
// https://docs.opencv.org/master/d6/df5/group__photo__hdr.html#ga2f1fafc885a5d79dbfb3542e08db0244
//
func NewAlignMTB() AlignMTB {
return AlignMTB{p: unsafe.Pointer(C.AlignMTB_Create())}
}
@@ -186,7 +176,6 @@ func NewAlignMTB() AlignMTB {
// https://docs.opencv.org/master/d6/df5/group__photo__hdr.html
// https://docs.opencv.org/master/d7/db6/classcv_1_1AlignMTB.html
// https://docs.opencv.org/master/d6/df5/group__photo__hdr.html#ga2f1fafc885a5d79dbfb3542e08db0244
//
func NewAlignMTBWithParams(max_bits int, exclude_range int, cut bool) AlignMTB {
return AlignMTB{p: unsafe.Pointer(C.AlignMTB_CreateWithParams(C.int(max_bits), C.int(exclude_range), C.bool(cut)))}
}
@@ -202,7 +191,6 @@ func (b *AlignMTB) Close() error {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/db6/classcv_1_1AlignMTB.html#a37b3417d844f362d781f34155cbcb201
//
func (b *AlignMTB) Process(src []Mat, dst *[]Mat) {
cSrcArray := make([]C.Mat, len(src))
@@ -232,7 +220,6 @@ func (b *AlignMTB) Process(src []Mat, dst *[]Mat) {
//
// For further details, please see:
// https://docs.opencv.org/4.x/d1/d79/group__photo__denoise.html#ga4c6b0031f56ea3f98f768881279ffe93
//
func FastNlMeansDenoising(src Mat, dst *Mat) {
C.FastNlMeansDenoising(src.p, dst.p)
}
@@ -242,7 +229,6 @@ func FastNlMeansDenoising(src Mat, dst *Mat) {
//
// For further details, please see:
// https://docs.opencv.org/4.x/d1/d79/group__photo__denoise.html#ga4c6b0031f56ea3f98f768881279ffe93
//
func FastNlMeansDenoisingWithParams(src Mat, dst *Mat, h float32, templateWindowSize int, searchWindowSize int) {
C.FastNlMeansDenoisingWithParams(src.p, dst.p, C.float(h), C.int(templateWindowSize), C.int(searchWindowSize))
}
@@ -251,7 +237,6 @@ func FastNlMeansDenoisingWithParams(src Mat, dst *Mat, h float32, templateWindow
//
// For further details, please see:
// https://docs.opencv.org/4.x/d1/d79/group__photo__denoise.html#ga21abc1c8b0e15f78cd3eff672cb6c476
//
func FastNlMeansDenoisingColored(src Mat, dst *Mat) {
C.FastNlMeansDenoisingColored(src.p, dst.p)
}
@@ -260,7 +245,6 @@ func FastNlMeansDenoisingColored(src Mat, dst *Mat) {
//
// For further details, please see:
// https://docs.opencv.org/4.x/d1/d79/group__photo__denoise.html#ga21abc1c8b0e15f78cd3eff672cb6c476
//
func FastNlMeansDenoisingColoredWithParams(src Mat, dst *Mat, h float32, hColor float32, templateWindowSize int, searchWindowSize int) {
C.FastNlMeansDenoisingColoredWithParams(src.p, dst.p, C.float(h), C.float(hColor), C.int(templateWindowSize), C.int(searchWindowSize))
}
@@ -269,7 +253,6 @@ func FastNlMeansDenoisingColoredWithParams(src Mat, dst *Mat, h float32, hColor
//
// For further details, please see:
// https://docs.opencv.org/4.x/df/dac/group__photo__render.html#gae5930dd822c713b36f8529b21ddebd0c
//
func DetailEnhance(src Mat, dst *Mat, sigma_s, sigma_r float32) {
C.DetailEnhance(src.p, dst.p, C.float(sigma_s), C.float(sigma_r))
}
@@ -289,7 +272,6 @@ const (
//
// For further details, please see:
// https://docs.opencv.org/4.x/df/dac/group__photo__render.html#gafaee2977597029bc8e35da6e67bd31f7
//
func EdgePreservingFilter(src Mat, dst *Mat, filter EdgeFilter, sigma_s, sigma_r float32) {
C.EdgePreservingFilter(src.p, dst.p, C.int(filter), C.float(sigma_s), C.float(sigma_r))
}
@@ -298,7 +280,6 @@ func EdgePreservingFilter(src Mat, dst *Mat, filter EdgeFilter, sigma_s, sigma_r
//
// For further details, please see:
// https://docs.opencv.org/4.x/df/dac/group__photo__render.html#gae5930dd822c713b36f8529b21ddebd0c
//
func PencilSketch(src Mat, dst1, dst2 *Mat, sigma_s, sigma_r, shade_factor float32) {
C.PencilSketch(src.p, dst1.p, dst2.p, C.float(sigma_s), C.float(sigma_r), C.float(shade_factor))
}
@@ -310,7 +291,28 @@ func PencilSketch(src Mat, dst1, dst2 *Mat, sigma_s, sigma_r, shade_factor float
//
// For further details, please see:
// https://docs.opencv.org/4.x/df/dac/group__photo__render.html#gacb0f7324017df153d7b5d095aed53206
//
func Stylization(src Mat, dst *Mat, sigma_s, sigma_r float32) {
C.Stylization(src.p, dst.p, C.float(sigma_s), C.float(sigma_r))
}
// InpaintMethods is the set of methods available for the inpainting process.
type InpaintMethods int
const (
// NS inpaints using the Navier-Stokes based method, created by Marcelo Bertalmio,
// Andrea L. Bertozzi, and Guillermo Sapiro in 2001.
NS InpaintMethods = 0
// Telea inpaints using Fast Marching Method proposed by Alexandru Telea in 2004.
Telea InpaintMethods = 1
)
// Inpaint reconstructs the selected image area from the pixels near the area boundary.
// The function may be used to remove dust and scratches from a scanned photo, or to
// remove undesirable objects from still images or video.
//
// For further details, please see:
// https://docs.opencv.org/4.x/d7/d8b/group__photo__inpaint.html#gaedd30dfa0214fec4c88138b51d678085
func Inpaint(src Mat, mask Mat, dst *Mat, inpaintRadius float32, algorithmType InpaintMethods) {
C.PhotoInpaint(C.Mat(src.Ptr()), C.Mat(mask.Ptr()), C.Mat(dst.Ptr()), C.float(inpaintRadius), C.int(algorithmType))
}
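A minimal sketch of the new Inpaint binding using the Telea method (paths and radius are illustrative); the mask is an 8-bit single-channel image whose non-zero pixels mark the area to reconstruct:

package main

import "gocv.io/x/gocv"

func main() {
	src := gocv.IMRead("damaged.jpg", gocv.IMReadColor)
	defer src.Close()
	mask := gocv.IMRead("scratches_mask.png", gocv.IMReadGrayScale)
	defer mask.Close()
	dst := gocv.NewMat()
	defer dst.Close()

	// Fill the masked pixels from a 3px neighbourhood using the Fast Marching Method.
	gocv.Inpaint(src, mask, &dst, 3, gocv.Telea)
	gocv.IMWrite("restored.jpg", dst)
}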

2
vendor/gocv.io/x/gocv/photo.h generated vendored
View File

@@ -49,6 +49,8 @@ void EdgePreservingFilter(Mat src, Mat dst, int filter, float sigma_s, float sig
void PencilSketch(Mat src, Mat dst1, Mat dst2, float sigma_s, float sigma_r, float shade_factor);
void Stylization(Mat src, Mat dst, float sigma_s, float sigma_r);
void PhotoInpaint(Mat src, Mat mask, Mat dst, float inpaint_radius, int algorithm_type);
#ifdef __cplusplus
}
#endif

2
vendor/gocv.io/x/gocv/version.go generated vendored
View File

@@ -7,7 +7,7 @@ package gocv
import "C"
// GoCVVersion of this package, for display purposes.
const GoCVVersion = "0.31.0"
const GoCVVersion = "0.36.0"
// Version returns the current golang package version
func Version() string {

132
vendor/gocv.io/x/gocv/video.cpp generated vendored
View File

@@ -75,3 +75,135 @@ TrackerMIL TrackerMIL_Create() {
void TrackerMIL_Close(TrackerMIL self) {
delete self;
}
KalmanFilter KalmanFilter_New(int dynamParams, int measureParams) {
return new cv::KalmanFilter(dynamParams, measureParams, 0, CV_32F);
}
KalmanFilter KalmanFilter_NewWithParams(int dynamParams, int measureParams, int controlParams, int type) {
return new cv::KalmanFilter(dynamParams, measureParams, controlParams, type);
}
void KalmanFilter_Init(KalmanFilter kf, int dynamParams, int measureParams) {
kf->init(dynamParams, measureParams, 0, CV_32F);
}
void KalmanFilter_InitWithParams(KalmanFilter kf, int dynamParams, int measureParams, int controlParams, int type) {
kf->init(dynamParams, measureParams, controlParams, type);
}
void KalmanFilter_Close(KalmanFilter kf) {
delete kf;
}
Mat KalmanFilter_Predict(KalmanFilter kf) {
return new cv::Mat(kf->predict());
}
Mat KalmanFilter_PredictWithParams(KalmanFilter kf, Mat control) {
return new cv::Mat(kf->predict(*control));
}
Mat KalmanFilter_Correct(KalmanFilter kf, Mat measurement) {
return new cv::Mat(kf->correct(*measurement));
}
Mat KalmanFilter_GetStatePre(KalmanFilter kf) {
return new cv::Mat(kf->statePre);
}
Mat KalmanFilter_GetStatePost(KalmanFilter kf) {
return new cv::Mat(kf->statePost);
}
Mat KalmanFilter_GetTransitionMatrix(KalmanFilter kf) {
return new cv::Mat(kf->transitionMatrix);
}
Mat KalmanFilter_GetControlMatrix(KalmanFilter kf) {
return new cv::Mat(kf->controlMatrix);
}
Mat KalmanFilter_GetMeasurementMatrix(KalmanFilter kf) {
return new cv::Mat(kf->measurementMatrix);
}
Mat KalmanFilter_GetProcessNoiseCov(KalmanFilter kf) {
return new cv::Mat(kf->processNoiseCov);
}
Mat KalmanFilter_GetMeasurementNoiseCov(KalmanFilter kf) {
return new cv::Mat(kf->measurementNoiseCov);
}
Mat KalmanFilter_GetErrorCovPre(KalmanFilter kf) {
return new cv::Mat(kf->errorCovPre);
}
Mat KalmanFilter_GetGain(KalmanFilter kf) {
return new cv::Mat(kf->gain);
}
Mat KalmanFilter_GetErrorCovPost(KalmanFilter kf) {
return new cv::Mat(kf->errorCovPost);
}
Mat KalmanFilter_GetTemp1(KalmanFilter kf) {
return new cv::Mat(kf->temp1);
}
Mat KalmanFilter_GetTemp2(KalmanFilter kf) {
return new cv::Mat(kf->temp2);
}
Mat KalmanFilter_GetTemp3(KalmanFilter kf) {
return new cv::Mat(kf->temp3);
}
Mat KalmanFilter_GetTemp4(KalmanFilter kf) {
return new cv::Mat(kf->temp4);
}
Mat KalmanFilter_GetTemp5(KalmanFilter kf) {
return new cv::Mat(kf->temp5);
}
void KalmanFilter_SetStatePre(KalmanFilter kf, Mat statePre) {
kf->statePre = *statePre;
}
void KalmanFilter_SetStatePost(KalmanFilter kf, Mat statePost) {
kf->statePost = *statePost;
}
void KalmanFilter_SetTransitionMatrix(KalmanFilter kf, Mat transitionMatrix) {
kf->transitionMatrix = *transitionMatrix;
}
void KalmanFilter_SetControlMatrix(KalmanFilter kf, Mat controlMatrix) {
kf->controlMatrix = *controlMatrix;
}
void KalmanFilter_SetMeasurementMatrix(KalmanFilter kf, Mat measurementMatrix) {
kf->measurementMatrix = *measurementMatrix;
}
void KalmanFilter_SetProcessNoiseCov(KalmanFilter kf, Mat processNoiseCov) {
kf->processNoiseCov = *processNoiseCov;
}
void KalmanFilter_SetMeasurementNoiseCov(KalmanFilter kf, Mat measurementNoiseCov) {
kf->measurementNoiseCov = *measurementNoiseCov;
}
void KalmanFilter_SetErrorCovPre(KalmanFilter kf, Mat errorCovPre) {
kf->errorCovPre = *errorCovPre;
}
void KalmanFilter_SetGain(KalmanFilter kf, Mat gain) {
kf->gain = *gain;
}
void KalmanFilter_SetErrorCovPost(KalmanFilter kf, Mat errorCovPost) {
kf->errorCovPost = *errorCovPost;
}

292
vendor/gocv.io/x/gocv/video.go generated vendored
View File

@@ -10,11 +10,13 @@ import (
"unsafe"
)
/**
cv::OPTFLOW_USE_INITIAL_FLOW = 4,
cv::OPTFLOW_LK_GET_MIN_EIGENVALS = 8,
cv::OPTFLOW_FARNEBACK_GAUSSIAN = 256
For further details, please see: https://docs.opencv.org/master/dc/d6b/group__video__track.html#gga2c6cc144c9eee043575d5b311ac8af08a9d4430ac75199af0cf6fcdefba30eafe
/*
*
cv::OPTFLOW_USE_INITIAL_FLOW = 4,
cv::OPTFLOW_LK_GET_MIN_EIGENVALS = 8,
cv::OPTFLOW_FARNEBACK_GAUSSIAN = 256
For further details, please see: https://docs.opencv.org/master/dc/d6b/group__video__track.html#gga2c6cc144c9eee043575d5b311ac8af08a9d4430ac75199af0cf6fcdefba30eafe
*/
const (
OptflowUseInitialFlow = 4
@@ -22,12 +24,14 @@ const (
OptflowFarnebackGaussian = 256
)
/**
cv::MOTION_TRANSLATION = 0,
cv::MOTION_EUCLIDEAN = 1,
cv::MOTION_AFFINE = 2,
cv::MOTION_HOMOGRAPHY = 3
For further details, please see: https://docs.opencv.org/4.x/dc/d6b/group__video__track.html#ggaaedb1f94e6b143cef163622c531afd88a01106d6d20122b782ff25eaeffe9a5be
/*
*
cv::MOTION_TRANSLATION = 0,
cv::MOTION_EUCLIDEAN = 1,
cv::MOTION_AFFINE = 2,
cv::MOTION_HOMOGRAPHY = 3
For further details, please see: https://docs.opencv.org/4.x/dc/d6b/group__video__track.html#ggaaedb1f94e6b143cef163622c531afd88a01106d6d20122b782ff25eaeffe9a5be
*/
const (
MotionTranslation = 0
@@ -49,7 +53,6 @@ type BackgroundSubtractorMOG2 struct {
// For further details, please see:
// https://docs.opencv.org/master/de/de1/group__video__motion.html#ga2beb2dee7a073809ccec60f145b6b29c
// https://docs.opencv.org/master/d7/d7b/classcv_1_1BackgroundSubtractorMOG2.html
//
func NewBackgroundSubtractorMOG2() BackgroundSubtractorMOG2 {
return BackgroundSubtractorMOG2{p: unsafe.Pointer(C.BackgroundSubtractorMOG2_Create())}
}
@@ -61,7 +64,6 @@ func NewBackgroundSubtractorMOG2() BackgroundSubtractorMOG2 {
// For further details, please see:
// https://docs.opencv.org/master/de/de1/group__video__motion.html#ga2beb2dee7a073809ccec60f145b6b29c
// https://docs.opencv.org/master/d7/d7b/classcv_1_1BackgroundSubtractorMOG2.html
//
func NewBackgroundSubtractorMOG2WithParams(history int, varThreshold float64, detectShadows bool) BackgroundSubtractorMOG2 {
return BackgroundSubtractorMOG2{p: unsafe.Pointer(C.BackgroundSubtractorMOG2_CreateWithParams(C.int(history), C.double(varThreshold), C.bool(detectShadows)))}
}
@@ -77,7 +79,6 @@ func (b *BackgroundSubtractorMOG2) Close() error {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/df6/classcv_1_1BackgroundSubtractor.html#aa735e76f7069b3fa9c3f32395f9ccd21
//
func (b *BackgroundSubtractorMOG2) Apply(src Mat, dst *Mat) {
C.BackgroundSubtractorMOG2_Apply((C.BackgroundSubtractorMOG2)(b.p), src.p, dst.p)
return
@@ -96,7 +97,6 @@ type BackgroundSubtractorKNN struct {
// For further details, please see:
// https://docs.opencv.org/master/de/de1/group__video__motion.html#gac9be925771f805b6fdb614ec2292006d
// https://docs.opencv.org/master/db/d88/classcv_1_1BackgroundSubtractorKNN.html
//
func NewBackgroundSubtractorKNN() BackgroundSubtractorKNN {
return BackgroundSubtractorKNN{p: unsafe.Pointer(C.BackgroundSubtractorKNN_Create())}
}
@@ -108,7 +108,6 @@ func NewBackgroundSubtractorKNN() BackgroundSubtractorKNN {
// For further details, please see:
// https://docs.opencv.org/master/de/de1/group__video__motion.html#gac9be925771f805b6fdb614ec2292006d
// https://docs.opencv.org/master/db/d88/classcv_1_1BackgroundSubtractorKNN.html
//
func NewBackgroundSubtractorKNNWithParams(history int, dist2Threshold float64, detectShadows bool) BackgroundSubtractorKNN {
return BackgroundSubtractorKNN{p: unsafe.Pointer(C.BackgroundSubtractorKNN_CreateWithParams(C.int(history), C.double(dist2Threshold), C.bool(detectShadows)))}
}
@@ -124,7 +123,6 @@ func (k *BackgroundSubtractorKNN) Close() error {
//
// For further details, please see:
// https://docs.opencv.org/master/d7/df6/classcv_1_1BackgroundSubtractor.html#aa735e76f7069b3fa9c3f32395f9ccd21
//
func (k *BackgroundSubtractorKNN) Apply(src Mat, dst *Mat) {
C.BackgroundSubtractorKNN_Apply((C.BackgroundSubtractorKNN)(k.p), src.p, dst.p)
return
@@ -135,7 +133,6 @@ func (k *BackgroundSubtractorKNN) Apply(src Mat, dst *Mat) {
//
// For further details, please see:
// https://docs.opencv.org/master/dc/d6b/group__video__track.html#ga5d10ebbd59fe09c5f650289ec0ece5af
//
func CalcOpticalFlowFarneback(prevImg Mat, nextImg Mat, flow *Mat, pyrScale float64, levels int, winsize int,
iterations int, polyN int, polySigma float64, flags int) {
C.CalcOpticalFlowFarneback(prevImg.p, nextImg.p, flow.p, C.double(pyrScale), C.int(levels), C.int(winsize),
@@ -148,7 +145,6 @@ func CalcOpticalFlowFarneback(prevImg Mat, nextImg Mat, flow *Mat, pyrScale floa
//
// For further details, please see:
// https://docs.opencv.org/master/dc/d6b/group__video__track.html#ga473e4b886d0bcc6b65831eb88ed93323
//
func CalcOpticalFlowPyrLK(prevImg Mat, nextImg Mat, prevPts Mat, nextPts Mat, status *Mat, err *Mat) {
C.CalcOpticalFlowPyrLK(prevImg.p, nextImg.p, prevPts.p, nextPts.p, status.p, err.p)
return
@@ -159,7 +155,6 @@ func CalcOpticalFlowPyrLK(prevImg Mat, nextImg Mat, prevPts Mat, nextPts Mat, st
//
// For further details, please see:
// https://docs.opencv.org/master/dc/d6b/group__video__track.html#ga473e4b886d0bcc6b65831eb88ed93323
//
func CalcOpticalFlowPyrLKWithParams(prevImg Mat, nextImg Mat, prevPts Mat, nextPts Mat, status *Mat, err *Mat,
winSize image.Point, maxLevel int, criteria TermCriteria, flags int, minEigThreshold float64) {
winSz := C.struct_Size{
@@ -174,7 +169,6 @@ func CalcOpticalFlowPyrLKWithParams(prevImg Mat, nextImg Mat, prevPts Mat, nextP
//
// For further details, please see:
// https://docs.opencv.org/4.x/dc/d6b/group__video__track.html#ga1aa357007eaec11e9ed03500ecbcbe47
//
func FindTransformECC(templateImage Mat, inputImage Mat, warpMatrix *Mat, motionType int, criteria TermCriteria, inputMask Mat, gaussFiltSize int) float64 {
return float64(C.FindTransformECC(templateImage.p, inputImage.p, warpMatrix.p, C.int(motionType), criteria.p, inputMask.p, C.int(gaussFiltSize)))
}
@@ -182,7 +176,6 @@ func FindTransformECC(templateImage Mat, inputImage Mat, warpMatrix *Mat, motion
// Tracker is the base interface for object tracking.
//
// see: https://docs.opencv.org/master/d0/d0a/classcv_1_1Tracker.html
//
type Tracker interface {
// Close closes, as Trackers need to be Closed manually.
//
@@ -230,7 +223,6 @@ func trackerUpdate(trk C.Tracker, img Mat) (image.Rectangle, bool) {
//
// For further details, please see:
// https://docs.opencv.org/master/d0/d26/classcv_1_1TrackerMIL.html
//
type TrackerMIL struct {
p C.TrackerMIL
}
@@ -256,3 +248,257 @@ func (trk TrackerMIL) Init(img Mat, boundingBox image.Rectangle) bool {
func (trk TrackerMIL) Update(img Mat) (image.Rectangle, bool) {
return trackerUpdate(C.Tracker(trk.p), img)
}
// KalmanFilter implements a standard Kalman filter http://en.wikipedia.org/wiki/Kalman_filter.
// However, you can modify transitionMatrix, controlMatrix, and measurementMatrix
// to get extended Kalman filter functionality.
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html
type KalmanFilter struct {
p C.KalmanFilter
}
// NewKalmanFilter returns a new KalmanFilter.
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#ac0799f0611baee9e7e558f016e4a7b40
func NewKalmanFilter(dynamParams int, measureParams int) KalmanFilter {
return KalmanFilter{p: C.KalmanFilter_New(C.int(dynamParams), C.int(measureParams))}
}
// NewKalmanFilterWithParams returns a new KalmanFilter.
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#abac82ecfa530611a163255bc7d91c088
func NewKalmanFilterWithParams(dynamParams int, measureParams int, controlParams int, matType MatType) KalmanFilter {
return KalmanFilter{p: C.KalmanFilter_NewWithParams(C.int(dynamParams), C.int(measureParams), C.int(controlParams), C.int(matType))}
}
// Init re-initializes the Kalman filter. The previous content is destroyed.
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#a4f136c39c016d3530c7c5801dd1ddb3b
func (kf *KalmanFilter) Init(dynamParams int, measureParams int) {
C.KalmanFilter_Init(kf.p, C.int(dynamParams), C.int(measureParams))
}
// Predict computes a predicted state.
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#aa710d2255566bec8d6ce608d103d4fa7
func (kf *KalmanFilter) Predict() Mat {
return newMat(C.KalmanFilter_Predict(kf.p))
}
// PredictWithParams computes a predicted state.
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#aa710d2255566bec8d6ce608d103d4fa7
func (kf *KalmanFilter) PredictWithParams(control Mat) Mat {
return newMat(C.KalmanFilter_PredictWithParams(kf.p, control.p))
}
// Correct updates the predicted state from the given measurement.
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#a60eb7feb569222ad0657ef1875884b5e
func (kf *KalmanFilter) Correct(measurement Mat) Mat {
return newMat(C.KalmanFilter_Correct(kf.p, measurement.p))
}
// Close closes the Kalman filter.
func (kf *KalmanFilter) Close() {
C.KalmanFilter_Close(kf.p)
kf.p = nil
}
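The constructor, Predict, Correct, and Close calls above form the basic filter cycle. Below is a minimal sketch, assuming a 4-state/2-measurement filter and placeholder measurements; a real filter would also configure the transition and measurement matrices through the setters further down in this file.

```go
package main

import (
	"fmt"

	"gocv.io/x/gocv"
)

func main() {
	// 4 state variables (x, y, vx, vy), 2 measured variables (x, y),
	// no control input, 32-bit float matrices.
	kf := gocv.NewKalmanFilterWithParams(4, 2, 0, gocv.MatTypeCV32F)
	defer kf.Close()

	measurement := gocv.NewMatWithSize(2, 1, gocv.MatTypeCV32F)
	defer measurement.Close()

	for i := 0; i < 10; i++ {
		prediction := kf.Predict()

		// Placeholder observations standing in for a real detector.
		measurement.SetFloatAt(0, 0, float32(i))
		measurement.SetFloatAt(1, 0, float32(2*i))

		corrected := kf.Correct(measurement)
		fmt.Println("predicted x:", prediction.GetFloatAt(0, 0),
			"corrected x:", corrected.GetFloatAt(0, 0))

		prediction.Close()
		corrected.Close()
	}
}
```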
// GetStatePre returns the Kalman filter's statePre Mat.
//
// predicted state (x'(k)): x'(k)=A*x(k-1)+B*u(k)
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#a60eb7feb569222ad0657ef1875884b5e
func (kf *KalmanFilter) GetStatePre() Mat {
return newMat(C.KalmanFilter_GetStatePre(kf.p))
}
// GetStatePost returns the Kalman filter's statePost Mat.
//
// corrected state (x(k)): x(k)=x'(k)+K(k)*(z(k)-H*x'(k))
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#add8fb5ac9c04b4600b679698dcb0447d
func (kf *KalmanFilter) GetStatePost() Mat {
return newMat(C.KalmanFilter_GetStatePost(kf.p))
}
// GetTransitionMatrix returns the Kalman filter's transitionMatrix Mat.
//
// state transition matrix (A)
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#a0657173e411acbf40d2d3c6b46e03b19
func (kf *KalmanFilter) GetTransitionMatrix() Mat {
return newMat(C.KalmanFilter_GetTransitionMatrix(kf.p))
}
// GetControlMatrix returns the Kalman filter's controlMatrix Mat.
//
// control matrix (B) (not used if there is no control)
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#a6486e7287114810636fb33953280ed52
func (kf *KalmanFilter) GetControlMatrix() Mat {
return newMat(C.KalmanFilter_GetControlMatrix(kf.p))
}
// GetMeasurementMatrix returns the Kalman filter's measurementMatrix Mat.
//
// measurement matrix (H)
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#a0f60b78726d8eccf74a1f2479c2d1f97
func (kf *KalmanFilter) GetMeasurementMatrix() Mat {
return newMat(C.KalmanFilter_GetMeasurementMatrix(kf.p))
}
// GetProcessNoiseCov returns the Kalman filter's processNoiseCov Mat.
//
// process noise covariance matrix (Q)
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#af19be9c0630d0f658bdbaea409a35cda
func (kf *KalmanFilter) GetProcessNoiseCov() Mat {
return newMat(C.KalmanFilter_GetProcessNoiseCov(kf.p))
}
// GetMeasurementNoiseCov returns the Kalman filter's measurementNoiseCov Mat.
//
// measurement noise covariance matrix (R)
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#a828d051035ba807966ad65edf288a08e
func (kf *KalmanFilter) GetMeasurementNoiseCov() Mat {
return newMat(C.KalmanFilter_GetMeasurementNoiseCov(kf.p))
}
// GetErrorCovPre returns the Kalman filter's errorCovPre Mat.
//
// a priori error estimate covariance matrix (P'(k)): P'(k)=A*P(k-1)*At + Q
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#ae1bd3a86f10753d723e7174d570d9ac1
func (kf *KalmanFilter) GetErrorCovPre() Mat {
return newMat(C.KalmanFilter_GetErrorCovPre(kf.p))
}
// GetGain returns the Kalman filter's gain Mat.
//
// Kalman gain matrix (K(k)): K(k)=P'(k)*Ht*inv(H*P'(k)*Ht+R)
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#a077d73eb075b00779dc009a9057c27c3
func (kf *KalmanFilter) GetGain() Mat {
return newMat(C.KalmanFilter_GetGain(kf.p))
}
// GetErrorCovPost returns the Kalman filter's errorCovPost Mat.
//
// a posteriori error estimate covariance matrix (P(k)): P(k)=(I-K(k)*H)*P'(k)
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#a446d8e9a0105b0aa35cd66119c529803
func (kf *KalmanFilter) GetErrorCovPost() Mat {
return newMat(C.KalmanFilter_GetErrorCovPost(kf.p))
}
// GetTemp1 returns the Kalman filter's temp1 Mat.
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#aa3d064a9194c2815dbe19c056b6dc763
func (kf *KalmanFilter) GetTemp1() Mat {
return newMat(C.KalmanFilter_GetTemp1(kf.p))
}
// GetTemp2 returns the Kalman filter's temp2 Mat.
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#a14866bd506668eb0ed57b3974b3a1ee7
func (kf *KalmanFilter) GetTemp2() Mat {
return newMat(C.KalmanFilter_GetTemp2(kf.p))
}
// GetTemp3 returns the Kalman filter's temp3 Mat.
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#afdbe36066a7d7f560aa02abe6be114d8
func (kf *KalmanFilter) GetTemp3() Mat {
return newMat(C.KalmanFilter_GetTemp3(kf.p))
}
// GetTemp4 returns the Kalman filter's temp4 Mat.
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#a84342f2d9dec1e6389025ad229401809
func (kf *KalmanFilter) GetTemp4() Mat {
return newMat(C.KalmanFilter_GetTemp4(kf.p))
}
// GetTemp5 returns the Kalman filter's temp5 Mat.
//
// For further details, please see:
// https://docs.opencv.org/4.6.0/dd/d6a/classcv_1_1KalmanFilter.html#a846c2a6222c6e5d8b1385dfbccc83ae0
func (kf *KalmanFilter) GetTemp5() Mat {
return newMat(C.KalmanFilter_GetTemp5(kf.p))
}
// SetStatePre sets the Kalman filter's statePre Mat.
func (kf *KalmanFilter) SetStatePre(statePre Mat) {
C.KalmanFilter_SetStatePre(kf.p, statePre.p)
}
// SetStatePost sets the Kalman filter's statePost Mat.
func (kf *KalmanFilter) SetStatePost(statePost Mat) {
C.KalmanFilter_SetStatePost(kf.p, statePost.p)
}
// SetTransitionMatrix sets the Kalman filter's transitionMatrix Mat.
func (kf *KalmanFilter) SetTransitionMatrix(transitionMatrix Mat) {
C.KalmanFilter_SetTransitionMatrix(kf.p, transitionMatrix.p)
}
// SetControlMatrix sets the Kalman filter's controlMatrix Mat.
func (kf *KalmanFilter) SetControlMatrix(controlMatrix Mat) {
C.KalmanFilter_SetControlMatrix(kf.p, controlMatrix.p)
}
// SetMeasurementMatrix sets the Kalman filter's measurementMatrix Mat.
func (kf *KalmanFilter) SetMeasurementMatrix(measurementMatrix Mat) {
C.KalmanFilter_SetMeasurementMatrix(kf.p, measurementMatrix.p)
}
// SetProcessNoiseCov sets the Kalman filter's processNoiseCov Mat.
func (kf *KalmanFilter) SetProcessNoiseCov(processNoiseCov Mat) {
C.KalmanFilter_SetProcessNoiseCov(kf.p, processNoiseCov.p)
}
// SetMeasurementNoiseCov sets the Kalman filter's measurementNoiseCov Mat.
func (kf *KalmanFilter) SetMeasurementNoiseCov(measurementNoiseCov Mat) {
C.KalmanFilter_SetMeasurementNoiseCov(kf.p, measurementNoiseCov.p)
}
// SetErrorCovPre sets the Kalman filter's errorCovPre Mat.
func (kf *KalmanFilter) SetErrorCovPre(errorCovPre Mat) {
C.KalmanFilter_SetErrorCovPre(kf.p, errorCovPre.p)
}
// SetGain sets the Kalman filter's gain Mat.
func (kf *KalmanFilter) SetGain(gain Mat) {
C.KalmanFilter_SetGain(kf.p, gain.p)
}
// SetErrorCovPost sets the Kalman filter's errorCovPost Mat.
func (kf *KalmanFilter) SetErrorCovPost(errorCovPost Mat) {
C.KalmanFilter_SetErrorCovPost(kf.p, errorCovPost.p)
}
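The setters above are how you describe the model matrices quoted in the getter comments (A, H, Q, R). The following is an illustrative, untuned sketch of a 1-D constant-velocity configuration; the noise values and the use of NewMatWithSizeFromScalar are assumptions for illustration, not values mandated by the binding.

```go
package kalmanconfig

import "gocv.io/x/gocv"

// configureConstantVelocity is an illustrative sketch: it describes a 1-D
// constant-velocity model (state = [position, velocity]) on a filter created
// with NewKalmanFilterWithParams(2, 1, 0, gocv.MatTypeCV32F).
// cv::Mat is reference-counted, so closing the local Mats after the setters
// does not invalidate the filter's copies.
func configureConstantVelocity(kf *gocv.KalmanFilter, dt float32) {
	// State transition matrix A:
	// | 1 dt |
	// | 0  1 |
	a := gocv.NewMatWithSize(2, 2, gocv.MatTypeCV32F)
	defer a.Close()
	a.SetFloatAt(0, 0, 1)
	a.SetFloatAt(0, 1, dt)
	a.SetFloatAt(1, 0, 0)
	a.SetFloatAt(1, 1, 1)
	kf.SetTransitionMatrix(a)

	// Measurement matrix H: only the position is observed.
	h := gocv.NewMatWithSize(1, 2, gocv.MatTypeCV32F)
	defer h.Close()
	h.SetFloatAt(0, 0, 1)
	h.SetFloatAt(0, 1, 0)
	kf.SetMeasurementMatrix(h)

	// Process noise covariance Q (diagonal, illustrative values).
	q := gocv.NewMatWithSize(2, 2, gocv.MatTypeCV32F)
	defer q.Close()
	q.SetFloatAt(0, 0, 1e-4)
	q.SetFloatAt(0, 1, 0)
	q.SetFloatAt(1, 0, 0)
	q.SetFloatAt(1, 1, 1e-4)
	kf.SetProcessNoiseCov(q)

	// Measurement noise covariance R (a single illustrative scalar).
	r := gocv.NewMatWithSizeFromScalar(gocv.NewScalar(0.1, 0, 0, 0), 1, 1, gocv.MatTypeCV32F)
	defer r.Close()
	kf.SetMeasurementNoiseCov(r)
}
```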

39
vendor/gocv.io/x/gocv/video.h generated vendored
View File

@@ -15,12 +15,14 @@ typedef cv::Ptr<cv::BackgroundSubtractorKNN>* BackgroundSubtractorKNN;
typedef cv::Ptr<cv::Tracker>* Tracker;
typedef cv::Ptr<cv::TrackerMIL>* TrackerMIL;
typedef cv::Ptr<cv::TrackerGOTURN>* TrackerGOTURN;
typedef cv::KalmanFilter* KalmanFilter;
#else
typedef void* BackgroundSubtractorMOG2;
typedef void* BackgroundSubtractorKNN;
typedef void* Tracker;
typedef void* TrackerMIL;
typedef void* TrackerGOTURN;
typedef void* KalmanFilter;
#endif
BackgroundSubtractorMOG2 BackgroundSubtractorMOG2_Create();
@@ -47,6 +49,43 @@ bool Tracker_Update(Tracker self, Mat image, Rect* boundingBox);
TrackerMIL TrackerMIL_Create();
void TrackerMIL_Close(TrackerMIL self);
KalmanFilter KalmanFilter_New(int dynamParams, int measureParams);
KalmanFilter KalmanFilter_NewWithParams(int dynamParams, int measureParams, int controlParams, int type);
void KalmanFilter_Close(KalmanFilter kf);
void KalmanFilter_Init(KalmanFilter kf, int dynamParams, int measureParams);
void KalmanFilter_InitWithParams(KalmanFilter kf, int dynamParams, int measureParams, int controlParams, int type);
Mat KalmanFilter_Predict(KalmanFilter kf);
Mat KalmanFilter_PredictWithParams(KalmanFilter kf, Mat control);
Mat KalmanFilter_Correct(KalmanFilter kf, Mat measurement);
Mat KalmanFilter_GetStatePre(KalmanFilter kf);
Mat KalmanFilter_GetStatePost(KalmanFilter kf);
Mat KalmanFilter_GetTransitionMatrix(KalmanFilter kf);
Mat KalmanFilter_GetControlMatrix(KalmanFilter kf);
Mat KalmanFilter_GetMeasurementMatrix(KalmanFilter kf);
Mat KalmanFilter_GetProcessNoiseCov(KalmanFilter kf);
Mat KalmanFilter_GetMeasurementNoiseCov(KalmanFilter kf);
Mat KalmanFilter_GetErrorCovPre(KalmanFilter kf);
Mat KalmanFilter_GetGain(KalmanFilter kf);
Mat KalmanFilter_GetErrorCovPost(KalmanFilter kf);
Mat KalmanFilter_GetTemp1(KalmanFilter kf);
Mat KalmanFilter_GetTemp2(KalmanFilter kf);
Mat KalmanFilter_GetTemp3(KalmanFilter kf);
Mat KalmanFilter_GetTemp4(KalmanFilter kf);
Mat KalmanFilter_GetTemp5(KalmanFilter kf);
void KalmanFilter_SetStatePre(KalmanFilter kf, Mat statePre);
void KalmanFilter_SetStatePost(KalmanFilter kf, Mat statePost);
void KalmanFilter_SetTransitionMatrix(KalmanFilter kf, Mat transitionMatrix);
void KalmanFilter_SetControlMatrix(KalmanFilter kf, Mat controlMatrix);
void KalmanFilter_SetMeasurementMatrix(KalmanFilter kf, Mat measurementMatrix);
void KalmanFilter_SetProcessNoiseCov(KalmanFilter kf, Mat processNoiseCov);
void KalmanFilter_SetMeasurementNoiseCov(KalmanFilter kf, Mat measurementNoiseCov);
void KalmanFilter_SetErrorCovPre(KalmanFilter kf, Mat errorCovPre);
void KalmanFilter_SetGain(KalmanFilter kf, Mat gain);
void KalmanFilter_SetErrorCovPost(KalmanFilter kf, Mat errorCovPost);
#ifdef __cplusplus
}
#endif

4
vendor/gocv.io/x/gocv/videoio.cpp generated vendored
View File

@@ -47,6 +47,10 @@ void VideoCapture_Grab(VideoCapture v, int skip) {
}
}
int VideoCapture_Retrieve(VideoCapture v, Mat buf) {
return v->retrieve(*buf);
}
// VideoWriter
VideoWriter VideoWriter_New() {
return new cv::VideoWriter();

13
vendor/gocv.io/x/gocv/videoio.go generated vendored
View File

@@ -286,7 +286,6 @@ const (
//
// For further details, please see:
// http://docs.opencv.org/master/d8/dfe/classcv_1_1VideoCapture.html
//
type VideoCapture struct {
p C.VideoCapture
}
@@ -380,6 +379,14 @@ func (v *VideoCapture) Grab(skip int) {
C.VideoCapture_Grab(v.p, C.int(skip))
}
// Retrieve decodes and returns the grabbed video frame. It should be used after Grab.
//
// For further details, please see:
// http://docs.opencv.org/master/d8/dfe/classcv_1_1VideoCapture.html#a9ac7f4b1cdfe624663478568486e6712
func (v *VideoCapture) Retrieve(m *Mat) bool {
return C.VideoCapture_Retrieve(v.p, m.p) != 0
}
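A hedged sketch of the Grab/Retrieve pair on a capture opened from a file; VideoCaptureFile comes from elsewhere in this package, and the file name is a placeholder.

```go
package main

import (
	"fmt"

	"gocv.io/x/gocv"
)

func main() {
	capture, err := gocv.VideoCaptureFile("input.avi") // placeholder path
	if err != nil {
		fmt.Println("cannot open file:", err)
		return
	}
	defer capture.Close()

	img := gocv.NewMat()
	defer img.Close()

	// Grab advances the stream without decoding; Retrieve then decodes the
	// most recently grabbed frame into img.
	capture.Grab(1)
	if !capture.Retrieve(&img) || img.Empty() {
		fmt.Println("no frame retrieved")
		return
	}
	fmt.Println("frame size:", img.Cols(), "x", img.Rows())
}
```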
// CodecString returns a string representation of FourCC bytes, i.e. the name of a codec.
func (v *VideoCapture) CodecString() string {
res := ""
@@ -406,7 +413,6 @@ func (v *VideoCapture) ToCodec(codec string) float64 {
//
// For further details, please see:
// http://docs.opencv.org/master/dd/d9e/classcv_1_1VideoWriter.html
//
type VideoWriter struct {
mu *sync.RWMutex
p C.VideoWriter
@@ -418,7 +424,6 @@ type VideoWriter struct {
//
// For further details, please see:
// http://docs.opencv.org/master/dd/d9e/classcv_1_1VideoWriter.html#a0901c353cd5ea05bba455317dab81130
//
func VideoWriterFile(name string, codec string, fps float64, width int, height int, isColor bool) (vw *VideoWriter, err error) {
if fps == 0 || width == 0 || height == 0 {
@@ -452,7 +457,6 @@ func (vw *VideoWriter) Close() error {
//
// For further details, please see:
// http://docs.opencv.org/master/dd/d9e/classcv_1_1VideoWriter.html#a9a40803e5f671968ac9efa877c984d75
//
func (vw *VideoWriter) IsOpened() bool {
isOpened := C.VideoWriter_IsOpened(vw.p)
return isOpened != 0
@@ -462,7 +466,6 @@ func (vw *VideoWriter) IsOpened() bool {
//
// For further details, please see:
// http://docs.opencv.org/master/dd/d9e/classcv_1_1VideoWriter.html#a3115b679d612a6a0b5864a0c88ed4b39
//
func (vw *VideoWriter) Write(img Mat) error {
vw.mu.Lock()
defer vw.mu.Unlock()
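Putting the writer together with the capture API: a minimal sketch that copies one frame from an input file to an output file. The codec string, frame rate, and file names are placeholders.

```go
package main

import (
	"fmt"

	"gocv.io/x/gocv"
)

func main() {
	src, err := gocv.VideoCaptureFile("input.avi") // placeholder path
	if err != nil {
		fmt.Println("cannot open input:", err)
		return
	}
	defer src.Close()

	img := gocv.NewMat()
	defer img.Close()
	if !src.Read(&img) || img.Empty() {
		return
	}

	// The frame size passed here must match the frames written later.
	writer, err := gocv.VideoWriterFile("output.avi", "MJPG", 25, img.Cols(), img.Rows(), true)
	if err != nil {
		fmt.Println("cannot open writer:", err)
		return
	}
	defer writer.Close()

	if err := writer.Write(img); err != nil {
		fmt.Println("write failed:", err)
	}
}
```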

1
vendor/gocv.io/x/gocv/videoio.h generated vendored
View File

@@ -28,6 +28,7 @@ double VideoCapture_Get(VideoCapture v, int prop);
int VideoCapture_IsOpened(VideoCapture v);
int VideoCapture_Read(VideoCapture v, Mat buf);
void VideoCapture_Grab(VideoCapture v, int skip);
int VideoCapture_Retrieve(VideoCapture v, Mat buf);
// VideoWriter
VideoWriter VideoWriter_New();

View File

@@ -1,4 +1,4 @@
echo off
@echo off
if not exist "C:\opencv" mkdir "C:\opencv"
if not exist "C:\opencv\build" mkdir "C:\opencv\build"
@@ -11,24 +11,24 @@ echo.
REM This is why there is no progress bar:
REM https://github.com/PowerShell/PowerShell/issues/2138
echo Downloading: opencv-4.6.0.zip [91MB]
powershell -command "[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12; $ProgressPreference = 'SilentlyContinue'; Invoke-WebRequest -Uri https://github.com/opencv/opencv/archive/4.6.0.zip -OutFile c:\opencv\opencv-4.6.0.zip"
echo Downloading: opencv-4.9.0.zip [91MB]
powershell -command "[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12; $ProgressPreference = 'SilentlyContinue'; Invoke-WebRequest -Uri https://github.com/opencv/opencv/archive/4.9.0.zip -OutFile c:\opencv\opencv-4.9.0.zip"
echo Extracting...
powershell -command "$ProgressPreference = 'SilentlyContinue'; Expand-Archive -Path c:\opencv\opencv-4.6.0.zip -DestinationPath c:\opencv"
del c:\opencv\opencv-4.6.0.zip /q
powershell -command "$ProgressPreference = 'SilentlyContinue'; Expand-Archive -Path c:\opencv\opencv-4.9.0.zip -DestinationPath c:\opencv"
del c:\opencv\opencv-4.9.0.zip /q
echo.
echo Downloading: opencv_contrib-4.6.0.zip [58MB]
powershell -command "[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12; $ProgressPreference = 'SilentlyContinue'; Invoke-WebRequest -Uri https://github.com/opencv/opencv_contrib/archive/4.6.0.zip -OutFile c:\opencv\opencv_contrib-4.6.0.zip"
echo Downloading: opencv_contrib-4.9.0.zip [58MB]
powershell -command "[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12; $ProgressPreference = 'SilentlyContinue'; Invoke-WebRequest -Uri https://github.com/opencv/opencv_contrib/archive/4.9.0.zip -OutFile c:\opencv\opencv_contrib-4.9.0.zip"
echo Extracting...
powershell -command "$ProgressPreference = 'SilentlyContinue'; Expand-Archive -Path c:\opencv\opencv_contrib-4.6.0.zip -DestinationPath c:\opencv"
del c:\opencv\opencv_contrib-4.6.0.zip /q
powershell -command "$ProgressPreference = 'SilentlyContinue'; Expand-Archive -Path c:\opencv\opencv_contrib-4.9.0.zip -DestinationPath c:\opencv"
del c:\opencv\opencv_contrib-4.9.0.zip /q
echo.
echo Done with downloading and extracting sources.
echo.
echo on
@echo on
cd /D C:\opencv\build
set PATH=%PATH%;C:\Program Files (x86)\CMake\bin;C:\mingw-w64\x86_64-8.1.0-posix-seh-rt_v6-rev0\mingw64\bin
@@ -38,9 +38,9 @@ if [%1]==[static] (
) else (
set enable_shared=ON
)
cmake C:\opencv\opencv-4.6.0 -G "MinGW Makefiles" -BC:\opencv\build -DENABLE_CXX11=ON -DOPENCV_EXTRA_MODULES_PATH=C:\opencv\opencv_contrib-4.6.0\modules -DBUILD_SHARED_LIBS=%enable_shared% -DWITH_IPP=OFF -DWITH_MSMF=OFF -DBUILD_EXAMPLES=OFF -DBUILD_TESTS=OFF -DBUILD_PERF_TESTS=OFF -DBUILD_opencv_java=OFF -DBUILD_opencv_python=OFF -DBUILD_opencv_python2=OFF -DBUILD_opencv_python3=OFF -DBUILD_DOCS=OFF -DENABLE_PRECOMPILED_HEADERS=OFF -DBUILD_opencv_saliency=OFF -DBUILD_opencv_wechat_qrcode=ON -DCPU_DISPATCH= -DOPENCV_GENERATE_PKGCONFIG=ON -DWITH_OPENCL_D3D11_NV=OFF -DOPENCV_ALLOCATOR_STATS_COUNTER_TYPE=int64_t -Wno-dev
cmake C:\opencv\opencv-4.9.0 -G "MinGW Makefiles" -BC:\opencv\build -DENABLE_CXX11=ON -DOPENCV_EXTRA_MODULES_PATH=C:\opencv\opencv_contrib-4.9.0\modules -DBUILD_SHARED_LIBS=%enable_shared% -DWITH_IPP=OFF -DWITH_MSMF=OFF -DBUILD_EXAMPLES=OFF -DBUILD_TESTS=OFF -DBUILD_PERF_TESTS=ON -DBUILD_opencv_java=OFF -DBUILD_opencv_python=OFF -DBUILD_opencv_python2=OFF -DBUILD_opencv_python3=OFF -DBUILD_DOCS=OFF -DENABLE_PRECOMPILED_HEADERS=OFF -DBUILD_opencv_saliency=OFF -DBUILD_opencv_wechat_qrcode=ON -DCPU_DISPATCH= -DOPENCV_GENERATE_PKGCONFIG=ON -DWITH_OPENCL_D3D11_NV=OFF -DOPENCV_ALLOCATOR_STATS_COUNTER_TYPE=int64_t -Wno-dev
mingw32-make -j%NUMBER_OF_PROCESSORS%
mingw32-make install
rmdir c:\opencv\opencv-4.6.0 /s /q
rmdir c:\opencv\opencv_contrib-4.6.0 /s /q
rmdir c:\opencv\opencv-4.9.0 /s /q
rmdir c:\opencv\opencv_contrib-4.9.0 /s /q
chdir /D %GOPATH%\src\gocv.io\x\gocv