/tech/ - Technology

Technology & Computing




1504024316530.png
(137.5KB, 478x365)
Nobody wants to make the thread edition

See also /agdg/ at >>>/v/ for videogames.

What are you working on?
1_out36.webm
(1.1MB, 1000x650, 00:43)
I learned that FFmpeg can record your screen, so I immediately decided to make a screen recorder to replace ShareX. It's so much more convenient to have everything I need pop up, immediately configurable, than to go to some settings screen in a menu within a menu within a menu and still not get the settings I really want, all while the program hijacks various other hotkeys.

It records a lossless mp4 at the moment. I wanted to add a way to compress the video after recording, but I'm not sure how to fit that into the UI, since I'll probably want to configure the format and bitrate anyway. I'll just use a batch script to compress videos for now.

The audio is also very awkward: I have to type the name of the audio device in manually, since FFmpeg can't select it on its own, and I have to enable a recording device in my audio settings. I'm pretty sure other programs like OBS can just record whatever comes out of the speakers, but I don't know how to do that.
I could query audio devices with FFmpeg and then pick a random one, or offer a dropdown selector with all the options, but it would still require you to have a recording device enabled in your audio settings.
Replies: >>8956 >>8967
>>8955
Here's also the ffmpeg command in case anyone is curious:
ffmpeg -y   -f dshow -i audio="Stereo Mix (ASUS Essence STX II"   -f gdigrab -framerate 30 -offset_x 0 -offset_y 0 -video_size 600x400 -i desktop   -c:v libx264rgb -preset ultrafast -qp 0   -c:a libopus -b:a 320k   "output.mp4"
The audio="" is where you need to put your audio device, which you can query with this command:
ffmpeg   -hide_banner   -f dshow -list_devices true   -i dummy

On Linux you need different input devices: "x11grab" instead of "gdigrab" for the screen, and "pulse" or "alsa" instead of "dshow" for audio.
Replies: >>8957
>>8956
Oh, and the recording is stopped by typing "q" into the ffmpeg command line.
Just learning C for now, and I'm getting much farther than I ever have before in my life trying to do this. I'll learn Vulkan immediately after so I can try my hand at a "game engine" (which might also be useful for 3D animation, who knows?).
Replies: >>8974
>>8955
Finished it for the most part and uploaded it here:
https://sundee.neocities.org/regioncap/index.html
You'll have to manually compress recorded videos afterwards.
>>8958
>I'll learn Vulkan immediately after so I can try my hand at a "game engine"
It's going to take years of programming and math study before you can write a game engine.
Replies: >>8976
>>8974
A game engine, no.
A good game engine, yes.
Replies: >>8977
>>8976
1) I have all the time in the world to study (I tried starting out in the morning about a week ago and was able to study for about 10 hours that day)
2) I mainly want to do it because it's fun

If I can get moneys on some decently reliable donation platform or something for FOSSing my code, that's just a cherry on top.
Replies: >>8988
thumbscat.jpg
(14.2KB, 180x192)
>>8977
Good luck with your programming and your studies, anon. I'm also a NEET programmer.
>do subpixel font rendering
>can barely tell a difference
>it's 9-30x slower
Well that was disappointing. 4 different size comparisons, subpixel renderer on top, cached glyph sprites on bottom.
Replies: >>9011 >>9015
fonts.png
(103.4KB, 878x568)
>>9010
Forgot image.
Replies: >>9012 >>9013
subpixel_font_rendering.png
(11.1KB, 2012x120)
>>9011
You'd see some rainbow colors in the zoomed in versions if that was subpixel font rendering.
Replies: >>9013
>>9011
Realized that I forgot to do proper alpha blending (which is why one of them seemed brighter); fixing that made it slightly faster (the difference doesn't go above 20x anymore). I guess that's because my blending function has a much faster branch for fully transparent and fully opaque pixels.
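For reference, a blend function with fast branches like that might look roughly like this in C. This is just a sketch, not the actual code from the post: the 0xAARRGGBB straight-alpha pixel layout, the opaque destination, and the function name are all assumptions.

```c
#include <stdint.h>

/* Source-over blend of two 0xAARRGGBB pixels (straight alpha).
 * The two early-return branches skip the per-channel math for
 * fully transparent and fully opaque source pixels, which is
 * where most glyph pixels fall.
 * Assumes the destination is opaque; result alpha is forced to 255. */
static uint32_t blend_over(uint32_t src, uint32_t dst)
{
    uint32_t a = src >> 24;
    if (a == 0)   return dst; /* fully transparent: keep destination */
    if (a == 255) return src; /* fully opaque: overwrite */

    uint32_t inv = 255u - a;
    /* +127 rounds the /255 division to nearest */
    uint32_t r = (((src >> 16) & 0xFFu) * a + ((dst >> 16) & 0xFFu) * inv + 127u) / 255u;
    uint32_t g = (((src >>  8) & 0xFFu) * a + ((dst >>  8) & 0xFFu) * inv + 127u) / 255u;
    uint32_t b = (( src        & 0xFFu) * a + ( dst        & 0xFFu) * inv + 127u) / 255u;
    return (0xFFu << 24) | (r << 16) | (g << 8) | b;
}
```

Antialiased text is mostly fully transparent background and fully opaque glyph interior, so the two early returns cover the bulk of the pixels and only the edge fringe pays for the multiplies.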

>>9012
I mean subpixel positioning, so the glyph starting point isn't aligned to pixels.
>>9010
How is the initial rendering done? I just thought of this quickly, but if it isn't being done already, would it potentially be faster and better looking?
>Assess screen size, scale, window dimensions, etc.
>Select the closest available size of the chosen font to the rendered size ("A" of font X has (clean) sizes 1, 2, 4, 8, etc.; a scale approximately 6.5 times larger than size 1 would select size 8 and scale it downwards)
>Cache the font approximation for speed (You could have 1000+ extremely high quality sizes of font Xs character set due to how cheap storage is nowadays...)
>Insert characters according to the arrangement of the text
>Combine the characters into one massive, static texture that only needs to be translated when the page moves, rather than re-rendered frame by frame, game-engine style (think of this entire post body as a single image with interactivity imposed on top of it: buttons (cursor "collision", if that's the right term here), text "highlighting", etc.)
As far as I know, browsers render game-engine style, with each frame independent of the others, right?
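The size-selection step above could be sketched like this in C. Only the sorted power-of-two size list and the "round up, then scale down" rule come from the example in the post; the function name and everything else are made up for illustration.

```c
#include <stddef.h>

/* Pick which cached glyph size to downscale from: the smallest cached
 * size that is >= the target size, falling back to the largest one.
 * `sizes` is assumed sorted ascending (e.g. 1, 2, 4, 8, ... as in the
 * example above); scaling a bigger glyph down generally looks better
 * than scaling a smaller one up. */
static int pick_glyph_size(const int *sizes, size_t count, double target)
{
    for (size_t i = 0; i < count; i++)
        if ((double)sizes[i] >= target)
            return sizes[i];
    return sizes[count - 1]; /* target is bigger than everything cached */
}
```

With cached sizes {1, 2, 4, 8, 16}, a target of 6.5 picks 8, matching the example in the list above.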
14 replies | 5 files