White, Black, and DDG

Andrew Irving It is a little odd that DD does not like to include white in its results but loves black.

Andrew Irving Very often, DD makes a real mess of users’ results by including too much black, and yet fails entirely to include true white (as opposed to off-grey) when the content image has lots of white. (The result is roughly a × content + b × style, where a and b are the fractional weights we choose…)
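A minimal sketch of that weighted blend, for anyone who wants to see it written out. How DDG weights things internally is an assumption here; this simply mirrors the standard neural style transfer objective (Gatys et al.), and the names and numbers are illustrative:

```python
# Sketch of the weighted objective Andrew refers to; the optimiser
# minimises a blend of the two loss terms. DDG's internal weighting
# is an assumption, not documented behaviour.

def total_loss(content_loss: float, style_loss: float,
               a: float, b: float) -> float:
    """Blend the content and style terms with fractional weights a and b."""
    return a * content_loss + b * style_loss

# e.g. a 70/30 content/style split (hypothetical values):
print(total_loss(content_loss=2.0, style_loss=5.0, a=0.7, b=0.3))
```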

Ben Beekman I tend to work with very bright colors in my style images, and I suppose I do have to overcompensate proportionally in my collages to get areas of true white in the output. For example, the collage of yours I use all the time does well at rendering white pets against darker backgrounds, but it uses a lot of white and light colors. Thanks for this observation; I hadn’t thought hard about the issue, but I had noticed I needed an EXTREMELY white style image to style a mostly white source image without the dream coming out far darker. Here’s your collage that does well at rendering patches of white. Considering how exaggerated the white is in this collage, it only supports your point that white is under-represented in the final output.

Andrew Irving Yes, I guess. Also, changing ‘near off-white’ to something ‘more appropriate’ (whatever that means) is relatively straightforward, but trying to resolve a ‘big black mess’ embedded in many dark colours … don’t even bother; just delete and start again with a different style. DD could fix this, as it comes down to their internal brightness and contrast settings; the problem arises as they approach the light and dark boundaries … perhaps they should think about conformal mappings and/or variable-mesh methods in the context of tone and colour, etc.
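As a toy illustration of the kind of non-linear tone handling Andrew is gesturing at (far simpler than conformal mappings, and my own sketch rather than anything DD documents): a curve that treats the approach to the light and dark boundaries differently from a flat brightness/contrast shift, so true black and true white are preserved while the shadows are lifted.

```python
# Toy non-linear tone remap (illustrative only; not DD's internals).
import numpy as np

def remap_tone(values: np.ndarray, gamma: float = 0.8) -> np.ndarray:
    """Gamma remap for values in [0, 1]: 0 and 1 stay fixed;
    gamma < 1 lifts the shadows, gamma > 1 deepens them."""
    return np.clip(values, 0.0, 1.0) ** gamma

# 0.05 is lifted noticeably, 0.95 barely moves, and true white survives:
print(remap_tone(np.array([0.05, 0.5, 0.95, 1.0])))
```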

Ben Beekman Now that I think about it, I’ve been grappling with this problem for a long time: one of the uniting factors of almost all of my most frequently used styles is a palette that places heavy emphasis on white and light colors, so I’ve been compensating for this factor for quite a while without conscious awareness. And when I have to style something that’s almost all white, I know from experience that only my “snowstorm” styles have a shot at it.

Andrew Irving Hi Ben, yes, light-coloured style images seem to work better than dark ones. But dark style images can still work … particularly if you tweak the brightness and contrast of the image before using it. My guess is that things will continue to evolve, and we will have to develop a new set of empirical rules after each new DD release… which is a double-edged sword, but better that things change in the long run… P.S. You may also note that I sometimes include a deliberate white region in the style image, just to force DD to recognise white in the content image (see the example at the end).
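For anyone wanting to try the pre-tweak on a dark style image before uploading it, a minimal Pillow sketch. The file names and enhancement factors are illustrative starting points, not values Andrew specifies:

```python
# Pre-brighten a dark style image before upload (illustrative values).
from PIL import Image, ImageEnhance

style = Image.open("dark_style.jpg")                  # hypothetical file name
style = ImageEnhance.Brightness(style).enhance(1.4)   # factor > 1.0 lightens
style = ImageEnhance.Contrast(style).enhance(1.1)     # mild contrast boost
style.save("dark_style_brightened.jpg")
```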

Ben Beekman I imagine the real challenge will be fixing this issue without crippling style images that were useful precisely because they compensated for DD’s skewed value scale. It wouldn’t be very user-friendly for previously rendered dreams to come out worse in a re-render after the fix.

Andrew Irving Hahaha… you’d think. No; just remember the step change when the second release of DD happened: all that went before was swept away, and we all rushed to explore the new DD space with whatever seemed to work best… there must be a moral in there somewhere… haha.

Ben Beekman I was actually absent from the site during the transition; I only had a vague recollection of how the site used to function when I came back after an absence of many months that spanned the redesign. So it’s good to have you fill in the gaps for me.

Andrew Irving You were lucky, then. I spent some time developing a complex system of applying many styles in sequence to a content image, but I had to throw that away with the new system, which is less tolerant of multiple sequential operations: the results lose detail and forget the original colours (kind of obvious, in a way) but retain the form of the original composition. I can still achieve good results that way, but it is much harder and takes a couple of days for each result image (instead of one or two per hour; a no-brainer choice, really).

Ben Beekman That’s the thing about AI… Deep Dream was initially conceived partly as a way to get a better look at the thought processes of a visual neural net… but even when we manage to figure out what it’s thinking, an upgrade can throw all our understanding out the window, and we’re back to a mostly unknown “black box”. I respect your pioneering spirit and am sorry for the sudden obsolescence of your multi-step processes brought on by the upgraded (read: obfuscated) AI art engine.

Andrew Irving AI (previously computer science) in the ’80s and ’90s was a mix of straight applied maths and programming, which combined and branched out into neural nets, symbolic algebra, OOP, and multifunctional code writing (hence C, etc.). Neural nets have two primary roots: one is the attempt to emulate the architecture and connections of our lizard-brain section, and the other is the need to solve sets of simultaneous ODEs (using Taylor or Volterra series, etc.). (I worked part time for about 15 years on Volterra series and coupled nonlinear ODEs, with bits of OOP and the theory of multifunctional programming; I also did chaos, forecasting, hysteresis, etc.)

Neural nets appeared to do well at fault finding, mainly for the DoD, HP, and Ford (e.g. on helicopter/car gearboxes, etc.), and that is the strand from which the Deep Learning approaches stem: least-squares fitting to coupled quadratic nonlinear ODEs, or the hardware-equivalent analogue boxes (neural nets…). The pioneering spirit, as you say, is the only game in town at the moment; nobody knows how to interpret (or improve) neural nets (I managed an analytical solution for the one-variable expanding-tree case in 1992), so empirical testing is the only real option we have. But that is not bad, because we still learn, can still produce good results, and still have more to explore.

Xymo Nau The empirical stuff is more pleasing to me than knowing exactly the result I am going to get. You can do that in any image editor. But DDG’s element of unpredictability is the fun – albeit often frustrating – bit. To me, anyway. 😉


Example of adding white or colour to a style image

Andrew Irving If you are trying to get white (or coloured) sections into the result image, try including some white (or coloured) sections in the style image, and try to make the shapes of the edges of those sections ‘sympathetic’ to the shapes of some edges in the content image … see the example result below, with the content and style images.
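A minimal Pillow sketch of the trick, for anyone preparing a style image this way. The file names, patch shape, size, and position are placeholders; in practice you would shape the patch to echo an edge in your content image, as Andrew describes:

```python
# Paste a deliberate white region into a style image so DDG has true
# white to draw on (illustrative placement; shape it to suit the content).
from PIL import Image, ImageDraw

style = Image.open("style.jpg").convert("RGB")   # hypothetical file name
draw = ImageDraw.Draw(style)
w, h = style.size

# An ellipse roughly a quarter of the frame wide, in the upper-left;
# swap in any shape 'sympathetic' to the content image's edges.
draw.ellipse([w // 8, h // 8, w // 8 + w // 4, h // 8 + h // 4],
             fill=(255, 255, 255))
style.save("style_with_white.jpg")
```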

Janet Rekesius DDG does accept a strong hint. I like that you created the shape – I usually hunt for a match.

Andrew Irving Hi Janet, yes, I also try to match aspects, but generally speaking that is not possible … so I try to help DD in whatever way I can to achieve an acceptable result.