video-processing
Here are 747 public repositories matching this topic...
The main new feature in version 1.0 is the ability to customise the progress bars using the Proglog library. This is almost entirely undocumented apart from a short section in the README. The relevant commit is Zulko/moviepy@bfad5ea.
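For readers looking for a starting point, here is a minimal sketch of a custom progress logger, assuming write_videofile in MoviePy 1.0 accepts a Proglog logger via its logger parameter; the class name and file names are illustrative, not taken from the project's documentation.

from proglog import ProgressBarLogger
from moviepy.editor import VideoFileClip

class PercentLogger(ProgressBarLogger):
    # Proglog calls this whenever a bar attribute (index, total, ...) changes.
    def bars_callback(self, bar, attr, value, old_value=None):
        total = self.bars[bar]["total"]
        if attr == "index" and total:
            print(f"{bar}: {100 * value / total:.0f}%")

clip = VideoFileClip("input.mp4")
clip.write_videofile("output.mp4", logger=PercentLogger())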
Hi, I noticed this line of code in README.md:
printf("\tCodec %s ID %d bit_rate %lld", pLocalCodec->long_name, pLocalCodec->id, pCodecParameters->bit_rate);
in which pLocalCodec is used before it is declared anywhere in the article, which might cause confusion in my view. Digging into the C code, I found this line:
https://github.com/leandromoreira/ffmpeg-libav-tutorial/blob/cdd616ce871078e
There is no way to specify the name of the .csv file where the scene list is saved (after reading the docs).
This is an important feature to have, as rerunning with a different threshold parameter will (I think) overwrite the previous Filename+'-Stats.csv' file. It would also be nice if the default behaviour were to save it to something like Filename+Stats+Threshold.csv (movi
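As an illustration of the naming scheme suggested above, here is a minimal Python sketch; the helper name and variables are made up for the example and are not part of the tool's API.

from pathlib import Path

def stats_csv_name(video_path, threshold):
    # Encode the threshold in the filename so a rerun with a different
    # threshold does not overwrite an earlier stats file.
    return f"{Path(video_path).stem}-Stats-{threshold}.csv"

# stats_csv_name("movie.mp4", 30) -> "movie-Stats-30.csv"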
The Read the Docs documentation references classes from the project, but they are not clickable. There are no links to take you either to that class's API documentation on Read the Docs or to its source code.
Feel free to propose your ideas on how to improve this.
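One common way to make class references clickable on Read the Docs is Sphinx's autodoc/viewcode extensions together with cross-reference roles. The snippet below is a hedged sketch assuming the docs are built with Sphinx (not verified for this project); mypackage.MyClass is a placeholder name.

# conf.py (Sphinx) -- illustrative only
extensions = [
    "sphinx.ext.autodoc",     # generate API pages from docstrings
    "sphinx.ext.viewcode",    # add "[source]" links next to documented objects
    "sphinx.ext.intersphinx", # resolve references to other projects' docs
]
# In the .rst sources, :class:`mypackage.MyClass` then renders as a link
# to that class's API page.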
<?xml version="1.0" encoding="utf-8"?>
<mlt>
  <profile description="" width="640" height="360"/>
  <producer id="16da6066-6d38-76bd-75de-5394693db212">
    <property name="resource">http://kb-oss-daily.oss-cn-zhangjiakou.aliyuncs.com/video/2019/08/06/67399a58-8788-474b-90d0-56
Travis and AppVeyor need to be updated for OpenCV 3 and to work with the new eos version.
4dface itself builds just fine; the build failures are only because Travis and AppVeyor haven't been updated yet.
Hi, I don't know if I am being stupid, but are the value ranges for the filters listed anywhere in the documentation or source code?
For example, if I want to find the min/max values for the Saturation filter so that I can map them to a slider, where can I find them? I had a look through the Filters folder in the framework but was unable to find any info.
Thanks
Checklist
- I've read the Documentation
Describe the bug:
The "Pixelization" value is not linear. It starts at "0,0", which means no pixelization at all, increases through "0,9", which means heavy pixelization, but ends at "1,0", which does not pixelate at all (i.e. "0,0" and "1,0" behave the same). This makes it hard for the user to understand and hard to avoid mistakes in usage.
Steps to reproduce the behavior:
- Use Pixe
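For illustration only, here is a hedged sketch contrasting the behaviour the report expects with what it describes, using a made-up mapping from the 0–1 parameter to a pixel block size (not the application's actual implementation).

def expected_block_size(value, max_block=64):
    # Monotonic mapping the report expects: 0.0 -> 1 (no pixelization),
    # 1.0 -> max_block (heaviest pixelization).
    return 1 + round(value * (max_block - 1))

def reported_block_size(value, max_block=64):
    # Behaviour the report describes: the parameter wraps at 1.0,
    # so 1.0 gives the same result as 0.0.
    return expected_block_size(value % 1.0, max_block)

print(expected_block_size(1.0))  # 64
print(reported_block_size(1.0))  # 1, identical to value 0.0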
https://github.com/scikit-video/scikit-video/blob/master/CONTRIBUTING.rst wants flake8 (pyflakes and pep8); do you want to run that automatically in the Travis build? We could use something like env: TOXENV=check in https://github.com/ionelmc/cookiecutter-pylibrary/blob/master/%7B%7Bcookiecutter.repo_name%7D%7D/ci/templates/.travis.yml#L15 with
[testenv:check]
deps =
    flake8
skip_install = true
Change the documentation to expose the monit port (2812) so that monitoring is available:
docker run -d --name=kaltura -p 80:80 -p 443:443 -p 1935:1935 -p 2812:2812 -p 88:88 -p 8443:8443 kaltura/server
Hello, dear MediaPipe folks.
I want to run hand-pose inference with the MediaPipe model together with my own model.
I have my own TF Lite models, and they work on an RGB bitmap.
I am trying to get the RGB bitmap from the input frame via a data packet.
My code is