<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Arthur Brainville's website on Ybalrid.info</title><link>https://www.ybalrid.info/</link><description>Recent content in Arthur Brainville's website on Ybalrid.info</description><generator>Hugo</generator><language>en-us</language><atom:link href="https://www.ybalrid.info/index.xml" rel="self" type="application/rss+xml"/><item><title>Darkroom Notes</title><link>https://www.ybalrid.info/posts/darkroom-notes/</link><pubDate>Sun, 29 Sep 2024 16:53:38 +0200</pubDate><guid>https://www.ybalrid.info/posts/darkroom-notes/</guid><description>&lt;p&gt;I have not been very active writing on this website. But I have decided to start writing a bit more about random hobbies of mine.&lt;/p&gt;
&lt;p&gt;In the last year or so, I started playing with old cameras, and with somewhat toxic chemicals &amp;hellip; 😅&lt;/p&gt;
&lt;p&gt;I do black and white and color development. I still have a lot to learn about wet printing and enlargement techniques.&lt;/p&gt;
&lt;p&gt;I do not really plan to publish pictures I take, unless they are useful to illustrate something.&lt;/p&gt;</description></item><item><title>Mixing C++ with AMD64 (x86_64) assembly</title><link>https://www.ybalrid.info/2019/11/mixing-c-with-amd64-x86_64-assembly/</link><pubDate>Sat, 09 Nov 2019 04:05:23 +0000</pubDate><guid>https://www.ybalrid.info/2019/11/mixing-c-with-amd64-x86_64-assembly/</guid><description>&lt;p&gt;Lately, I’ve been dabbling into some “closer to the metal” kind of programming.&lt;/p&gt;
&lt;p&gt;On most compilers (Visual Studio’s one for instance) it &lt;em&gt;used to be&lt;/em&gt; rather easy to mix &lt;em&gt;assembly&lt;/em&gt; code and &lt;em&gt;C++&lt;/em&gt; code using a feature called &lt;em&gt;inline assembly&lt;/em&gt;, where the ASM code is put in a block (decorated with a special macro/symbol like &lt;code&gt;_asm&lt;/code&gt; for instance); when the compiler sees that block, it puts its content “as is” into the compiled code.&lt;/p&gt;</description></item><item><title>SDL: not (just) a 2D graphics library</title><link>https://www.ybalrid.info/2019/09/sdl-not-just-a-2d-graphics-library/</link><pubDate>Tue, 03 Sep 2019 19:03:04 +0000</pubDate><guid>https://www.ybalrid.info/2019/09/sdl-not-just-a-2d-graphics-library/</guid><description>&lt;p&gt;My first real introduction to programming was with the C programming language, on a French website that was, at the time, known as “Le Site Du Zéro” (think “The newbie’s website”).&lt;/p&gt;
&lt;p&gt;At that time, I was in middle school, and I was pretty bored by it. It’s around then that I really started to have access to the internet at home, and started spending a lot of time on it.&lt;/p&gt;</description></item><item><title>Ogre_glTF: A glTF loader plugin for Ogre 2.x</title><link>https://www.ybalrid.info/2019/02/ogre_gltf-a-gltf-loader-plugin-for-ogre-2-x/</link><pubDate>Mon, 18 Feb 2019 19:47:19 +0000</pubDate><guid>https://www.ybalrid.info/2019/02/ogre_gltf-a-gltf-loader-plugin-for-ogre-2-x/</guid><description>&lt;p&gt;If there’s one open source library that I really like and think has a great level of usefulness for both myself and a whole community, it’s Ogre.&lt;/p&gt;
&lt;p&gt;Before going into the story of why I felt loading glTF files into Ogre was a necessary thing to do, and why I decided to actually write a loader myself, I need to tell you a bit about Ogre:&lt;/p&gt;
&lt;h2 id="good-old-ogre3d"&gt;Good old Ogre3D&lt;/h2&gt;
&lt;p&gt;I prefer to write it “Ogre”, but really, its name is OGRE. This is an acronym that stands for “&lt;strong&gt;O&lt;/strong&gt;bject-oriented &lt;strong&gt;G&lt;/strong&gt;raphical &lt;strong&gt;R&lt;/strong&gt;endering &lt;strong&gt;E&lt;/strong&gt;ngine”.&lt;/p&gt;</description></item><item><title>No nonsense networking for C++: Introducing kissnet, a K.I.S.S. socket library!</title><link>https://www.ybalrid.info/2018/12/no-nonsense-networking-for-c-introducing-kissnet-a-k-i-s-s-socket-library/</link><pubDate>Tue, 04 Dec 2018 21:02:36 +0000</pubDate><guid>https://www.ybalrid.info/2018/12/no-nonsense-networking-for-c-introducing-kissnet-a-k-i-s-s-socket-library/</guid><description>&lt;p&gt;Sometimes I wonder why some things are inside the C and C++ standard libraries, and some aren’t.&lt;/p&gt;
&lt;p&gt;As far as I can be bothered to read the actual “standards” documents (which are mostly written in &lt;em&gt;legalese&lt;/em&gt;, not in English understandable by you and me), these languages are defined against an “abstract machine”, and the actual real-world implementation of them, on computers that actually exist, should follow the behavior described for that thing, modulo &lt;em&gt;some implementation details&lt;/em&gt;.&lt;/p&gt;</description></item><item><title>Why glTF 2.0 is awesome!</title><link>https://www.ybalrid.info/2018/07/why-gltf-2-0-is-awesome/</link><pubDate>Mon, 16 Jul 2018 13:40:58 +0000</pubDate><guid>https://www.ybalrid.info/2018/07/why-gltf-2-0-is-awesome/</guid><description>&lt;p&gt;There’s one single thing that I find truly frustrating when dealing with multiple 3D-related software: making them exchange 3D assets.&lt;/p&gt;
&lt;p&gt;You don’t have much guarantee that what comes out of one piece of software will look the same in something else (e.g. a game engine. You may work in meters, and find out that Unreal works in centimeters. They could use different conventions for texturing, material definitions may just “not work”…)&lt;/p&gt;</description></item><item><title>Install and run SteamVR on ArchLinux (for using an HTC-Vive) and do OpenGL/OpenVR development</title><link>https://www.ybalrid.info/2018/03/install-and-run-steamvr-on-archlinux-for-using-an-htc-vive-and-do-opengl-openvr-developement/</link><pubDate>Tue, 20 Mar 2018 10:33:54 +0000</pubDate><guid>https://www.ybalrid.info/2018/03/install-and-run-steamvr-on-archlinux-for-using-an-htc-vive-and-do-opengl-openvr-developement/</guid><description>&lt;p&gt;So, I recently had the chance to try out an HTC-Vive on a &lt;strong&gt;Linux&lt;/strong&gt; machine. Naturally, I installed Arch on it 😉&lt;/p&gt;
&lt;p&gt;The installation is pretty straightforward, but there are some little catches if you want to do OpenGL development on Linux with OpenVR (OpenVR is the API you use to talk to the SteamVR runtime).&lt;/p&gt;
&lt;p&gt;SteamVR has had a Linux &lt;strong&gt;beta&lt;/strong&gt; since February 2017. They also announced that the SteamVR runtime itself is implemented with &lt;strong&gt;Vulkan&lt;/strong&gt; only.&lt;/p&gt;</description></item><item><title>“Scenario Testing” a game engine by misusing a unit test framework.</title><link>https://www.ybalrid.info/2017/11/scenario-testing-a-game-engine-by-misusing-an-unit-test-framework/</link><pubDate>Fri, 17 Nov 2017 13:23:41 +0000</pubDate><guid>https://www.ybalrid.info/2017/11/scenario-testing-a-game-engine-by-misusing-an-unit-test-framework/</guid><description>&lt;p&gt;I don’t post regularly on this blog, but I really should post more… ^^”&lt;/p&gt;
&lt;p&gt;If you have ever read me here before, you probably know that one of my pet projects is a game engine called Annwvyn.&lt;/p&gt;
&lt;h2 id="where-did-i-get-from"&gt;Where did I get from&lt;/h2&gt;
&lt;p&gt;Annwvyn was just “a few classes to act as glue code around a few free software libraries”. I really thought that in 2 months I had some piece of software worthy of bearing the name &lt;strong&gt;game engine&lt;/strong&gt;. Obviously, I was just a foolish little nerd playing around with an Oculus DK1 in his room, but still, I did actually manage to have something render in real time on the Rift with some physics and sound inside! That was cool!&lt;/p&gt;
&lt;p&gt;Everything started as just a test project; then I decided to remove the &lt;code&gt;int main(void)&lt;/code&gt; function I had and stash everything else inside a DLL file. That was quickly done (after banging my head against the MSDN website and Visual Studio 2010’s project settings, and writing a macro to insert &lt;code&gt;__declspec(dllexport)&lt;/code&gt; or &lt;code&gt;__declspec(dllimport)&lt;/code&gt; everywhere).&lt;/p&gt;
&lt;h2 id="the-need-for-testability-and-the-difficulties-of-retrofitting-tests"&gt;The need for testability and the difficulties of retrofitting tests&lt;/h2&gt;
&lt;p&gt;So let’s be clear: I know about &lt;em&gt;good&lt;/em&gt; development practices, about &lt;em&gt;automated testing&lt;/em&gt;, about &lt;em&gt;TDD&lt;/em&gt;, about &lt;em&gt;software architecture&lt;/em&gt;, about &lt;em&gt;UML class diagrams&lt;/em&gt; and all that jazz. Heck, I’m a student in those things. But this little hobby project wasn’t intended to grow into 17,000 lines of C++, with a lot of modules, bindings to a scripting language, an event dispatch system, and a lot of interconnected components that abstract writing data to the file system (well, it’s for video game save files), render to multiple different kinds of VR hardware, or extend Ogre’s resource manager. Hell, I did not know that Ogre had such a complex resource management system. I thought that Ogre was a C++ thing that drew polygons on the screen without me having to learn OpenGL. (I still had to actually learn quite a lot about OpenGL because I needed to hack into its guts, but I &lt;a href="https://blog.ybalrid.info/2016/02/29/using-ogre3ds-opengl-renderer-with-the-oculus-rift-sdk/"&gt;blogged about that already&lt;/a&gt;.)&lt;/p&gt;
&lt;p&gt;Let’s just say that things were really getting out of hand, and that I seriously needed to start thinking about making the code saner, and about being able to detect when I break stuff.&lt;/p&gt;</description></item><item><title>Shoehorning anything (with `operator&lt;&lt;()`) into `qDebug()` the quick and dirty templated way</title><link>https://www.ybalrid.info/2017/06/shoehorning-anything-with-operator/</link><pubDate>Thu, 22 Jun 2017 21:05:50 +0000</pubDate><guid>https://www.ybalrid.info/2017/06/shoehorning-anything-with-operator/</guid><description>&lt;p&gt;So, the other day I was working on some Ogre + Qt5 code.&lt;/p&gt;
&lt;p&gt;I haven’t really worked with Qt that much since Qt4 was the hot new thing, so I was a bit rusty, but I definitely like the new things I’ve seen in version 5. But I’m not here to discuss Qt 5 today. ^^&lt;/p&gt;
&lt;p&gt;There are a few weird things Qt does that I can’t really wrap my head around. One is the incompatibility between QString and std::string (there’s probably a nasty problem called “Unicode” behind this), but another is that QDebug is not an std::ostream-derived object.&lt;/p&gt;</description></item><item><title>Getting the name of an audio device from its GUID: Using the Oculus Rift selected audio device with OpenAL</title><link>https://www.ybalrid.info/2017/05/getting-the-name-of-an-audio-device-from-its-guid-using-the-oculus-rift-selected-audio-device-with-openal/</link><pubDate>Sun, 21 May 2017 13:39:50 +0000</pubDate><guid>https://www.ybalrid.info/2017/05/getting-the-name-of-an-audio-device-from-its-guid-using-the-oculus-rift-selected-audio-device-with-openal/</guid><description>&lt;p&gt;So, while working on my &lt;a href="https://blog.ybalrid.info/2016/12/31/the-annwvyn-game-engine-and-how-i-started-doing-vr/"&gt;game engine&lt;/a&gt;, I was curious about looking at the &lt;a href="https://developer.oculus.com/distribute/latest/concepts/publish-rift-app-submission/" target="_blank" rel="noopener noreferrer"&gt;technical requirements for submitting an application to the Oculus Store&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;One of the things required is that you need to target the audio output (and input) devices selected by the user in the Oculus app.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://www.ybalrid.info/wp-content/uploads/2017/05/OculusClient_2017-05-21_13-30-27.png"&gt;&lt;img loading="lazy" class="wp-image-156 size-medium aligncenter" src="https://www.ybalrid.info/wp-content/uploads/2017/05/OculusClient_2017-05-21_13-30-27-300x225.png" alt="" width="300" height="225" srcset="https://www.ybalrid.info/wp-content/uploads/2017/05/OculusClient_2017-05-21_13-30-27-300x225.png 300w, https://www.ybalrid.info/wp-content/uploads/2017/05/OculusClient_2017-05-21_13-30-27-768x576.png 768w, https://www.ybalrid.info/wp-content/uploads/2017/05/OculusClient_2017-05-21_13-30-27.png 1024w, https://www.ybalrid.info/wp-content/uploads/2017/05/OculusClient_2017-05-21_13-30-27-705x529.png 705w" sizes="(max-width: 300px) 100vw, 300px" /&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;So, how does the Oculus SDK tell you which device is selected?&lt;/p&gt;</description></item><item><title>The locomotion problem in Virtual Reality</title><link>https://www.ybalrid.info/2017/01/the-locomotion-problem-in-virtual-reality/</link><pubDate>Tue, 17 Jan 2017 03:05:16 +0000</pubDate><guid>https://www.ybalrid.info/2017/01/the-locomotion-problem-in-virtual-reality/</guid><description>&lt;iframe width="560" height="315" src="https://www.youtube.com/embed/POWsFzSFLCE" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen&gt;&lt;/iframe&gt;
&lt;p&gt;(Seriously, I hesitated for some time between this version and the original, but that’s not the point of this article, and I kinda like the 80’s vibe anyway…)&lt;/p&gt;
&lt;p&gt;I think we can all agree here: Virtual Reality (VR) is &lt;strong&gt;now&lt;/strong&gt;, and not science-fiction anymore. “Accessible” (not cheap by any stretch of the imagination) hardware is available for consumers to buy and enjoy. Now you can experience being immersed in virtual worlds generated in real time by a gaming computer and feel &lt;em&gt;presence&lt;/em&gt; in them.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;The subject that I’m about to address doesn’t really apply to mobile (smartphone-powered) VR, since these experiences tend to be static ones. Mobile VR will need to have reliable positional tracking of the user’s head before hitting this issue… We will limit the discussion to actual computer-based VR.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;One problem that still bothers me, and the whole VR community as well, is: in order to explore a virtual world, you have to, well, &lt;em&gt;walk inside the virtual world&lt;/em&gt;. And doing this comfortably for the user is, interestingly, more complex than you might think.&lt;/p&gt;
&lt;p&gt;You will always have a limited space for your VR play room. You can’t physically walk from one town to another in &lt;strong&gt;Skyrim&lt;/strong&gt; inside your living room; the open world of that game is a bit bigger than a few square meters.&lt;/p&gt;
&lt;p&gt;The case of cockpit games like &lt;strong&gt;Elite:Dangerous&lt;/strong&gt; aside, simulating locomotion is tricky. Any situation where you’re moving can induce nausea.&lt;/p&gt;
&lt;p&gt;Cockpit-based games ground you in the fact that you’re seated somewhere and “not moving”, because most of the objects around you don’t move (the inside of the spaceship/car/plane). This makes it mostly a non-problem: you can do barrel rolls and loops all day long and keep your meal inside your stomach. And you have less chance of killing yourself than inside an actual fighter jet 😉&lt;/p&gt;
&lt;p&gt;Simulator (VR) sickness is induced by a disparity between the visual cues of acceleration you get from your visual system, and what your vestibular system senses. The vestibular system is your equilibrium center; it’s a bit like a natural accelerometer located inside your inner ears.&lt;/p&gt;</description></item><item><title>The Annwvyn Game Engine, and how I started doing VR</title><link>https://www.ybalrid.info/2016/12/the-annwvyn-game-engine-and-how-i-started-doing-vr/</link><pubDate>Sat, 31 Dec 2016 19:28:34 +0000</pubDate><guid>https://www.ybalrid.info/2016/12/the-annwvyn-game-engine-and-how-i-started-doing-vr/</guid><description>&lt;p&gt;If you know me, you also probably know that I’m developing a small C++ game engine, aimed at simplifying the creation of VR games and experiences for “consumer grade” VR systems (mainly the Oculus Rift, more recently the Vive too), called &lt;strong&gt;Annwvyn.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;The funny question is: with the existence of tools like Unreal Engine 4 or Unity 5, which are &lt;em&gt;free (or almost free)&lt;/em&gt; to use, why bother?&lt;/p&gt;
&lt;p&gt;There are multiple reasons, but to understand why, I should add some context. This story started in 2013, at a time when you had to actually pay to use Unity with the first Oculus Rift Development Kit (aka DK1), and when UDK (the version of Unreal Engine 3 you were able to use) was such a mess I wouldn’t want to touch it…&lt;/p&gt;</description></item><item><title>Using Ogre3D’s OpenGL renderer with the Oculus Rift SDK</title><link>https://www.ybalrid.info/2016/02/using-ogre3ds-opengl-renderer-with-the-oculus-rift-sdk/</link><pubDate>Sun, 28 Feb 2016 23:18:38 +0000</pubDate><guid>https://www.ybalrid.info/2016/02/using-ogre3ds-opengl-renderer-with-the-oculus-rift-sdk/</guid><description>&lt;p&gt;Hello there!&lt;br&gt;
The process of getting a scene rendered by Ogre to the Oculus Rift is a bit involved. With basic knowledge of Ogre, and trial and error while browsing the Ogre wiki, documentation and source code itself, I got the thing running each time Oculus changed the way it worked.&lt;br&gt;
Since we are at version 0.8 of the SDK, and 1.0 will probably not come with much change on this front, I think I can write some sort of guide, while browsing my Ogre-powered VR game engine, and tell you the story of &lt;em&gt;how it works, step by step&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;I’ll paste here some code with explanations. It’s not structured into classes because I don’t know how you want to structure yours. I don’t use the Ogre application framework because I want to choose myself the order in which things happen.&lt;/p&gt;</description></item><item><title>Using the Meopta Color 3 head, but for black and white</title><link>https://www.ybalrid.info/darkroom/using-the-meopta-color-3-head-but-for-black-and-white/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://www.ybalrid.info/darkroom/using-the-meopta-color-3-head-but-for-black-and-white/</guid><description>&lt;p&gt;This is the color head of my enlarger; it can be used for contrast control of black and white prints too, but this is not the intended use.&lt;/p&gt;
&lt;p&gt;This information should match the Meopta Color 4 too; they use the exact same color filters.&lt;/p&gt;
&lt;h2 id="filtering-wheels"&gt;Filtering wheels&lt;/h2&gt;
&lt;p&gt;The head has the following subtractive filters, intended for print color correction:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;C: Cyan&lt;/li&gt;
&lt;li&gt;M: Magenta&lt;/li&gt;
&lt;li&gt;Y: Yellow&lt;/li&gt;
&lt;li&gt;D: Neutral Density&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="contrast-control-with-black-and-white-using-ilford-multigrade"&gt;Contrast control with black and white using ILFORD Multigrade&lt;/h2&gt;
&lt;p&gt;As a guideline, ILFORD&amp;rsquo;s &lt;a href="https://www.ilfordphoto.com/wp/wp-content/uploads/2017/03/Contrast-control-for-Ilford-Multigrade.pdf"&gt;&lt;em&gt;Technical Information Contrast Control&lt;/em&gt;&lt;/a&gt; memo provides the following conversion table between Multigrade filters and Meopta color filtering values.&lt;/p&gt;</description></item><item><title>About Me</title><link>https://www.ybalrid.info/about-me/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://www.ybalrid.info/about-me/</guid><description>&lt;p&gt;Hi there!&lt;/p&gt;
&lt;p&gt;My name is Arthur Brainville, but I generally go by &amp;ldquo;Ybalrid&amp;rdquo; on the internet.&lt;/p&gt;
&lt;p&gt;I&amp;rsquo;m a C++ developer specialising in low-level systems programming, real-time graphics, and XR technologies. For the past seven years I worked at LIV, a startup building mixed reality capture and streaming software. I worked on everything from GPU texture sharing across Direct3D, OpenGL and Vulkan, to OpenXR API layers, to integrating ML-based human segmentation so streamers could ditch the green screen. It was a great run.&lt;/p&gt;</description></item><item><title>Black and White developers</title><link>https://www.ybalrid.info/darkroom/black-and-white-developers/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://www.ybalrid.info/darkroom/black-and-white-developers/</guid><description>&lt;p&gt;These are just a relatively sparse list of developers I have used for black and white film.&lt;/p&gt;
&lt;h2 id="rodinal-from-labo-argentique"&gt;Rodinal (from Labo-Argentique)&lt;/h2&gt;
&lt;p&gt;High acutance; my most used so far, generally at 1+25 to 1+50. Reveals grain as it is.&lt;/p&gt;
&lt;p&gt;Used it twice for semi-stand development at 1+100: agitate 1 minute, stand 30 minutes, 4 inversions, then stand 30 more minutes.&lt;/p&gt;
&lt;p&gt;When developing Ilford Delta 3200, it is better to push development time one extra stop when using the time from the Massive Dev Chart, to get a more contrasted/denser negative that is easier to scan.&lt;/p&gt;</description></item><item><title>C41 Developer depletion</title><link>https://www.ybalrid.info/darkroom/c41-developer-depletion/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://www.ybalrid.info/darkroom/c41-developer-depletion/</guid><description>&lt;p&gt;&lt;em&gt;After getting bad results from one development session and not being sure why, I decided that every C41 development should be logged.&lt;/em&gt;&lt;/p&gt;
&lt;h2 id="film-surface-area"&gt;Film surface area&lt;/h2&gt;
&lt;p&gt;The best kit I can get for C41 color chemistry is the Bellini one. It is a kit for 16 rolls of 24 exposures as per its datasheet.&lt;/p&gt;
&lt;p&gt;Because the lifetime of the bleach, fixer, and stabilizer is longer than that of the developer, you can also purchase the developer part of the Bellini kit separately to &amp;ldquo;recharge&amp;rdquo; it. Everything here refers to one bottle of the &lt;em&gt;developer&lt;/em&gt; specifically, not the lifetime of &lt;em&gt;one kit&lt;/em&gt;.&lt;/p&gt;</description></item><item><title>Caffenol experiment</title><link>https://www.ybalrid.info/darkroom/caffenol-experiment/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://www.ybalrid.info/darkroom/caffenol-experiment/</guid><description>&lt;p&gt;These are attempts at developing film with &lt;a href="https://en.wikipedia.org/wiki/Caffenol"&gt;Caffenol&lt;/a&gt;.&lt;/p&gt;
&lt;h2 id="first-roll-fomapan-100-from-flexaret-test-2024-09-08"&gt;First roll, Fomapan 100 from Flexaret (test) 2024-09-08&lt;/h2&gt;
&lt;p&gt;This is a roll of film that I knew was damaged by the camera (rollers for 35mm film dug into the emulsion and backing paper; I have since fixed it).&lt;/p&gt;
&lt;p&gt;Tried to develop it in Caffenol&amp;rsquo;s main recipe:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Sodium carbonate: 54 g&lt;/li&gt;
&lt;li&gt;Vitamin C: 16 g&lt;/li&gt;
&lt;li&gt;Instant coffee (Belle France, store brand): 40 g&lt;/li&gt;
&lt;li&gt;Tap water to 1 L&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;I think the temperature was around 24 degrees, but I proceeded as if it were 20 degrees.&lt;/p&gt;</description></item><item><title>Find me elsewhere</title><link>https://www.ybalrid.info/links/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://www.ybalrid.info/links/</guid><description>&lt;p&gt;Besides my ramblings in this very blog (&lt;a rel="noreferrer noopener" href="https://ybalrid.info/" target="_blank"&gt;https://ybalrid.info/&lt;/a&gt;), I&amp;rsquo;m also findable on some other interwebz. Here&amp;rsquo;s a selection of hypertext links, just for you!&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/Ybalrid/"&gt;GitHub&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://twitter.com/ybalrid"&gt;Twitter&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://fosstodon.org/@ybalrid/"&gt;Mastodon (Fosstodon)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://bsky.app/profile/ybalrid.info"&gt;Blue Sky&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.threads.net/@ybalrid"&gt;Threads&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.linkedin.com/in/arthurbrainville/"&gt;LinkedIn&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://dev.to/ybalrid"&gt;DEV.to&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.polutropon.games/"&gt;My indie game studio project&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.youtube.com/channel/UCbI9uVnBqn407EvL4WrjSlw"&gt;Useless YouTube&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://liv.tv/"&gt;LIV&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;My messaging application of choice is Discord, and you&amp;rsquo;ll find me as &lt;code&gt;Ybalrid#1337&lt;/code&gt; on there.&lt;/p&gt;</description></item><item><title>Making slides from Kodak Aerocolor IV (Santacolor 100, elektra 100, REFLX LAB Pro 100)</title><link>https://www.ybalrid.info/darkroom/making-slides-from-kodak-aerocolor-iv-santacolor-100-elektra-100-reflx-lab-pro-100/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://www.ybalrid.info/darkroom/making-slides-from-kodak-aerocolor-iv-santacolor-100-elektra-100-reflx-lab-pro-100/</guid><description>&lt;p&gt;I love shooting color slides. But slide film is expensive today. Development is also expensive at most labs (though I do develop my E-6 film myself using &lt;a href="https://www.bellinifoto.it/en/prodotto/kit-amateur-e6/"&gt;this kit from Bellini Foto&lt;/a&gt;; you can find it for a bit less than 45€ at most online photography stores in Europe, and it has a capacity of 9 rolls of 135-36).&lt;/p&gt;
&lt;p&gt;I like shooting slides for the sake of making slides that can be projected. Not &lt;em&gt;specifically&lt;/em&gt; for the color rendering and contrast. Though, I am in love with Fuji Velvia 100, a film that I really hope they will continue to make.&lt;/p&gt;</description></item><item><title>Negative scans and inversion</title><link>https://www.ybalrid.info/darkroom/negative-scans-and-inversion/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://www.ybalrid.info/darkroom/negative-scans-and-inversion/</guid><description>&lt;p&gt;I scan all my film myself using a DSLR. I am currently waiting on a copy stand. I used a contraption I 3D-printed in the past, but recently got an &lt;a href="https://clifforth.co.uk/"&gt;Essential Film Holder&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;I attempt to expose the histogram to the right to get the most information possible in the shadows.&lt;/p&gt;
&lt;p&gt;I align the camera with the film by looking at a mirror through the camera and trying to have the center of the lens in the center of the frame.&lt;/p&gt;</description></item><item><title>Printing hardware</title><link>https://www.ybalrid.info/darkroom/hardware/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://www.ybalrid.info/darkroom/hardware/</guid><description>&lt;h2 id="easel"&gt;Easel&lt;/h2&gt;
&lt;h3 id="l-p-l--21-x-26-cm"&gt;L P L 21 x 26 cm&lt;/h3&gt;
&lt;p&gt;Max paper size usable on this thing is 20x25 cm (8&amp;quot;x10&amp;quot;); there are screws on the back to set the offset of the top left corner for the margins. Sadly, it is not graduated in any way, so you cannot easily set it to half an inch&amp;hellip;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note&lt;/strong&gt;: The base is magnetic, so it should be possible to use Paterson marginless magnetic holders, if those are any good?&lt;/p&gt;
&lt;/blockquote&gt;</description></item><item><title>RA-4 Filtrations for Meopta filters</title><link>https://www.ybalrid.info/darkroom/ra-4-filtrations-for-meopta-filters/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://www.ybalrid.info/darkroom/ra-4-filtrations-for-meopta-filters/</guid><description>&lt;p&gt;These numbers are in the Meopta filter scale. The Meopta Color 3 user manual has a conversion table to other filters, in case you want to cross-reference my numbers.&lt;/p&gt;
&lt;p&gt;These are acceptable starting filtrations, from one sample of whatever batch of film I bought, onto whatever RA-4 paper I was able to buy. All RA-4 processing has been done at 35°C, with a kit from Bellini.&lt;/p&gt;
&lt;p&gt;Because of where most of those results tend to hover, and of where the &amp;ldquo;it looks alright most of the time&amp;rdquo; value with usual, good quality film tends to be, I will always use &lt;strong&gt;C0 M90 Y90&lt;/strong&gt; as the starting value for my Meopta Color 3. This may be due to the combo of paper manufacturer and chemistry I use (Fuji and Bellini). I do not know. But it works for me.&lt;/p&gt;</description></item><item><title>RA-4 paper development</title><link>https://www.ybalrid.info/darkroom/ra-4-paper-development/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://www.ybalrid.info/darkroom/ra-4-paper-development/</guid><description>&lt;p&gt;I use the Bellini RA-4 kit.&lt;/p&gt;
&lt;p&gt;The Bellini kit is for 5 L, though I prepared only what was needed for 1 L of each (besides the stabilizer).&lt;/p&gt;
&lt;p&gt;I store all chemicals in collapsible bottles so they do not contain air, to limit oxidation.&lt;/p&gt;
&lt;p&gt;I am developing using an 8x10 drum for Cibachrome, on top of an old French-made &amp;ldquo;Rotocuve&amp;rdquo; motorized base.&lt;/p&gt;
&lt;p&gt;I used to roll the drum back and forth on a table, so, you know, that does not really matter.&lt;/p&gt;</description></item><item><title>Scanning setup</title><link>https://www.ybalrid.info/darkroom/scanning-setup/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://www.ybalrid.info/darkroom/scanning-setup/</guid><description>&lt;p&gt;My scanning setup works better on full, uncut rolls of film.&lt;/p&gt;
&lt;p&gt;I currently use the &lt;a href="https://clifforth.co.uk/index.php"&gt;Essential Film Holder&lt;/a&gt; on top of a CineStill CsLight. It does not fit super well but works okay.&lt;/p&gt;
&lt;p&gt;I have an old Durst enlarger column that was converted into a copy stand.&lt;/p&gt;
&lt;p&gt;I use a Canon DSLR and a 100mm f/2.8 USM Macro. This lens is &lt;em&gt;awesome&lt;/em&gt;.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Cut a piece of the leader at a slant for easier insertion&lt;/li&gt;
&lt;li&gt;Put in Essential Film Holder&lt;/li&gt;
&lt;li&gt;Turn on light source
&lt;ul&gt;
&lt;li&gt;Cool light for color negs&lt;/li&gt;
&lt;li&gt;Medium light for black and white (does not matter too much)&lt;/li&gt;
&lt;li&gt;Warm light for color slide (at least for Ektachrome)&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Plug the DSLR into a PC or Mac
&lt;ul&gt;
&lt;li&gt;EOS Utility is probably running, if not start it&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Camera in Av mode at f/5.6&lt;/li&gt;
&lt;li&gt;Set white balance to correspond to the configured light&lt;/li&gt;
&lt;li&gt;Attach camera to the Durst enlarger column&lt;/li&gt;
&lt;li&gt;Align film holder on top of light&lt;/li&gt;
&lt;li&gt;Fill the frame with the negative, and focus on the grain using the zoom in live view, or use autofocus if the camera is able to see the grain.&lt;/li&gt;
&lt;li&gt;Set camera to 100 ISO&lt;/li&gt;
&lt;li&gt;Expose with EV comp to push histogram to the right&lt;/li&gt;
&lt;li&gt;Take a shot of the first frame
&lt;ul&gt;
&lt;li&gt;This creates a folder in &lt;code&gt;C:\Users\&amp;lt;logon name&amp;gt;\Pictures\&amp;lt;today's date&amp;gt;&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Open Filmomat SmartConvert&lt;/li&gt;
&lt;li&gt;In preferences, change the hot folder to the one found above&lt;/li&gt;
&lt;li&gt;You should see that picture; adjust settings for density, contrast, color&amp;hellip;&lt;/li&gt;
&lt;li&gt;Each time you take a picture of the next frame, check the exposure to see that all the dynamic range is available at the top of the histogram, then adjust frame by frame in SmartConvert&lt;/li&gt;
&lt;li&gt;Once you are done, export the TIFF files&lt;/li&gt;
&lt;li&gt;Open those for further editing (I do not crop them in Filmomat, I do that in DarkTable for example)&lt;/li&gt;
&lt;/ul&gt;</description></item><item><title>Tips with the C41 1K kit from Bellini</title><link>https://www.ybalrid.info/darkroom/tips-with-the-c41-1l-kit-from-bellini/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://www.ybalrid.info/darkroom/tips-with-the-c41-1l-kit-from-bellini/</guid><description>&lt;h2 id="stabilizer"&gt;Stabilizer&lt;/h2&gt;
&lt;p&gt;The kit comes with enough stabilizer concentrate to make like 20 L of the stuff. The stabilizer is used as a preservative for the dyes, and should be the last thing to hit your film. I would advise:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Mix it with deionized or demineralized water (whatever you put in a clothes iron should be fine)&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Add a wetting agent to the stabilizer.&lt;/em&gt; I personally use Agepon because my lab supplier stocks BERGGER products very well; Photo-Flo or the Ilford one would work too, it does not matter. Add it at the concentration recommended on the packaging when you mix your stabilizer (Agepon is 1:500)&lt;/li&gt;
&lt;li&gt;After fixing, wash using the &lt;a href="https://www.ilfordphoto.com/wp/wp-content/uploads/2017/03/Reducing-Wash-Water.pdf"&gt;ILFORD method&lt;/a&gt; of doing 5, then 10, then 20 inversions with running water. But do it with the water temperature in the range for the stabilizer (between 32°C and 38°C)&lt;/li&gt;
&lt;li&gt;Stabilizer should be used one-shot (only use it fresh). Do the 3-minute continuous agitation. Because of the added wetting agent, it may get very foamy at the top of the tank. Let the film stand in the wetting agent for a minute or so, then pull the reels out and shake them to remove the excess. Squeegee the film between two gloved fingers wet with the stabilizer solution.&lt;/li&gt;
&lt;/ul&gt;</description></item></channel></rss>