Translate 3D Rectangular Target to Pan/Tilt/Focus?

Request a feature that you would like to see in QLC+.
Explain in detail why you would need it and what your use case is.
AaronD
Posts: 4
Joined: Fri Feb 14, 2020 7:37 pm
Real Name: Aaron Duerksen

A few years ago, I wrote my own PC-based DMX controller because nothing I found even came close to what I wanted. I was just about to do a major update on it when I discovered QLC+. It seems to do everything I did and more... except for one critical function: the ability to have multiple movers follow a single target point around the room with a joystick. (or a finger on a touchscreen over an image of the room)

Did I just miss that? Or is it really yet to be added?

I suspect the latter because, to make it work, each fixture needs to know its own mounting point in the same coordinate system, and I don't see that either.
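
To make the geometry concrete, here's a minimal sketch of the per-fixture math I'm asking for (all names and numbers are hypothetical; it assumes the pan axis is vertical and the fixture homes to pointing straight down at 50% pan / 50% tilt, which a real implementation would replace with the fixture's actual hanging orientation):

```python
import math

def pan_tilt_for_target(fixture_pos, target_pos,
                        pan_range_deg=540.0, tilt_range_deg=270.0):
    """Rough sketch: aim one mover at a shared 3D target.

    fixture_pos / target_pos are (x, y, z) in metres in the same room
    coordinate system.  Assumes the fixture hangs pointing straight down
    at pan=50%, tilt=50%, with its pan axis vertical.
    """
    dx = target_pos[0] - fixture_pos[0]
    dy = target_pos[1] - fixture_pos[1]
    dz = target_pos[2] - fixture_pos[2]              # usually negative (target below)

    pan_deg = math.degrees(math.atan2(dy, dx))       # heading in the horizontal plane
    horiz = math.hypot(dx, dy)
    tilt_deg = math.degrees(math.atan2(horiz, -dz))  # 0 = straight down

    # Map angles onto 8-bit DMX values around the fixture's home position.
    pan_dmx = round((pan_deg / pan_range_deg + 0.5) * 255)
    tilt_dmx = round((tilt_deg / tilt_range_deg + 0.5) * 255)
    return max(0, min(255, pan_dmx)), max(0, min(255, tilt_dmx))

# One target, several movers at known mounting points:
target = (2.0, 1.5, 1.7)                             # e.g. a performer at head height
for mount in [(0, 0, 5), (8, 0, 5), (0, 6, 5), (8, 6, 5)]:
    print(mount, pan_tilt_for_target(mount, target))
```

Run that for each mover with its own mounting point and the same target, and they all converge on the same spot.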
GGGss
Posts: 2733
Joined: Mon Sep 12, 2016 7:15 pm
Location: Belgium
Real Name: Fredje Gallon

As of now, QLC+ v4 is a 2D-aware product.

Rest assured that even today (2020), in the forums for the big desks (GrandMA, Hog, etc.), this topic is still very much alive and has not been answered with a complete solution.
(automated follow spots, highlights for solos, ...)

Even though you could calculate all the vectors in a 3D world, you'd lack the depth information to pinpoint the object on a 2D screen.
And then the problem arises of nearby sources over-saturating the object versus creating an acceptable light level on the object in the presence of ambient light.
Taking into account frosts, filters, haze, the conversion from lumens at a distance and under an angle to lux, and e.g. television broadcasts in 4K would need a very complex mathematical model.
It is still considered an art for the operator to decide the stage looks, and people are hired to do so - because the outcome is nicer, better, more sellable, ..., more adapted to specific needs.

And what if the lens of an HQI source is dirty? An LED source with 4k hours on it isn't as bright anymore, a misaligned fixture needs channel modifiers, someone's focus is off, ..., a blue filter is burnt, ...
Throw that into the equation ;-)

Theoretically it IS possible ... have a look on the web for e.g. Kinect-guided lights - all very, very nice projects - but not at a production scale...
Lucky me - still in business and not replaced by a robot.
All electric machines work on smoke... when the smoke escapes... they don't work anymore
sandinak
Posts: 188
Joined: Mon Apr 03, 2017 5:40 pm
Location: Yorktown, VA
Real Name: Branson Matheson

Heya! So I too had this problem and decided to go around it by writing tooling that works independently of the DMX controller. Please check out

https://github.com/sandinak/dmx-followspot

That's a project I wrote that acts as a proxy between the controller and the heads to do exactly what you're looking for. It stores 'scenes' with saved positions that are callable from the master console, etc. I'm looking for contributors and feedback. I haven't done much with it recently, but I'll be back to it after comp season this year, as we're going to have a specific need next year given our stage design.
AaronD
Posts: 4
Joined: Fri Feb 14, 2020 7:37 pm
Real Name: Aaron Duerksen

GGGss wrote: Mon Feb 17, 2020 2:25 pm Even though you could calculate all the vectors in a 3D world, you'd lack the depth information to pinpoint the object on a 2D screen.
I did. Touchscreen + joystick offset for 2 of the 3 dimensions, fader for the 3rd. That's operated manually in real time, with 4 movers tracking that one spot from very different mounting locations. In my case, it's a plan view of the floor, and the fader is the height off the floor, but it could just as easily be a spot on the fourth wall, with the fader as up-/downstage.

Some permutations would take more guesswork than others - my specific case hardly takes any - but I think that the real-time 3D math by itself is easily a worthwhile addition.
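
For reference, the input mapping I mean is nothing more than this (screen and room sizes are made-up example numbers):

```python
def touch_to_target(touch_x_px, touch_y_px, fader_value,
                    screen_w_px=800, screen_h_px=600,
                    room_w_m=10.0, room_d_m=8.0, max_height_m=3.0):
    """Sketch: a plan-view touchscreen gives x/y on the floor,
    a fader (0-255) gives the height off the floor."""
    x = touch_x_px / screen_w_px * room_w_m
    y = touch_y_px / screen_h_px * room_d_m
    z = fader_value / 255.0 * max_height_m
    return (x, y, z)

# e.g. a touch near the middle of the screen, fader at roughly head height:
print(touch_to_target(420, 310, 144))
```

The resulting (x, y, z) is what each mover's pan/tilt math consumes.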
GGGss wrote: Mon Feb 17, 2020 2:25 pm And then the problem arises of nearby sources over-saturating the object versus creating an acceptable light level on the object in the presence of ambient light.
Taking into account frosts, filters, haze, the conversion from lumens at a distance and under an angle to lux, and e.g. television broadcasts in 4K would need a very complex mathematical model.
It is still considered an art for the operator to decide the stage looks, and people are hired to do so - because the outcome is nicer, better, more sellable, ..., more adapted to specific needs.

And what if the lens of an HQI source is dirty? An LED source with 4k hours on it isn't as bright anymore, a misaligned fixture needs channel modifiers, someone's focus is off, ..., a blue filter is burnt, ...
Throw that into the equation ;-)

Theoretically it IS possible ... have a look on the web for e.g. Kinect-guided lights - all very, very nice projects - but not at a production scale...
Lucky me - still in business and not replaced by a robot.
The appearance of the spot is outside the scope of this feature request. It might be a separate deal, and possibly based on the calculated distance, but not an inherent requirement.
sandinak
Posts: 188
Joined: Mon Apr 03, 2017 5:40 pm
Location: Yorktown, VA
Real Name: Branson Matheson

I agree with this ... a hand spot doesn't take any of that into account and is still used as a tool for highlighting performers regardless of the scenes behind them. Having a hand spot replaced by strategically placed movers would just simplify things and give us more options as lighting directors. Not saying it wouldn't be a fun/cool project to go to this level .. but we have enough higher-priority issues on the table. And I'm not wanting to take any jobs ;)
Last edited by sandinak on Tue Feb 25, 2020 1:17 pm, edited 1 time in total.
GGGss
Posts: 2733
Joined: Mon Sep 12, 2016 7:15 pm
Location: Belgium
Real Name: Fredje Gallon

Actually I came across a new product yesterday.
It uses the ultra-high band (6-7 GHz), and the stage has to be turned into a mesh network of receivers.
Every spot keeps track of its beacon on stage.
I didn't study it any further, but apparently the system is capable of adapting focus and beam width in real time.

I'm getting off topic here.

I find it an interesting thread - so keep on sending in ideas / thoughts.
All electric machines work on smoke... when the smoke escapes... they don't work anymore
AaronD
Posts: 4
Joined: Fri Feb 14, 2020 7:37 pm
Real Name: Aaron Duerksen

GGGss wrote: Tue Feb 25, 2020 8:50 am ...the stage has to be turned into a mesh network of receivers.
Every spot keeps track of its beacon on stage...
I thought about doing that too, but decided that it was too ambitious for a first version, and therefore out of scope here as well. Manual control is fine, as are pre-programmed chases/patterns.

My version would have been more like the GPS system, but adapted for very short range. The "satellites" would be fixed in different places around the room, and the hat-mounted infrared "dumb blinkenlight" target would have a known distance from each one. Or maybe an angle. Either way works, but requires different math. Put those distances or angles together with their anchor points, and the intersection is where the target must be.
This then becomes a general-purpose input to the overall lighting software, completely independent of any fixtures. You *could* use it to hit that spot with multiple movers, or you could use it to black out when an actor goes offstage, or whatever.
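
For what it's worth, the distance version of that intersection is ordinary least-squares trilateration; here's a rough sketch (made-up anchor positions, measurement noise ignored):

```python
import numpy as np

def locate_target(anchors, distances):
    """Least-squares trilateration sketch: given the fixed 'satellite'
    positions and the measured distance from each one, solve for the
    target point.  Needs at least 4 non-coplanar anchors in 3D."""
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    # Subtract the first sphere equation from the others to linearise:
    #   |p - a_i|^2 - |p - a_0|^2 = d_i^2 - d_0^2
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2)
         - (d[1:] ** 2 - d[0] ** 2))
    point, *_ = np.linalg.lstsq(A, b, rcond=None)
    return point

anchors = [(0, 0, 3), (10, 0, 3), (0, 8, 3), (10, 8, 2.5)]
true_p = np.array([4.0, 5.0, 1.7])
dists = [np.linalg.norm(true_p - np.array(a)) for a in anchors]
print(locate_target(anchors, dists))   # ~ [4.0, 5.0, 1.7]
```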

But again, it's out of scope for this thread. This thread assumes a 3D rectangular target that is already known *somehow*, and then tracks that target with multiple movers. Different movers may have different targets - like an arbitrary two hitting "here", another three hitting "there", etc. - while all of the targets can move independently and all of the fixtures keep up with their assignments.
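
The bookkeeping for those assignments is trivial - something like this per refresh tick (made-up names; aim() stands in for the per-fixture pan/tilt math sketched earlier):

```python
# Sketch of the assignment model: several independent targets, each
# fixture follows whichever target it is assigned to.
def aim(mount, target):
    print(f"fixture at {mount} -> target {target}")   # placeholder for the pan/tilt math

targets = {"here": (2.0, 1.5, 1.7), "there": (6.0, 4.0, 1.7)}
assignments = {"mover1": "here", "mover2": "here",
               "mover3": "there", "mover4": "there", "mover5": "there"}
mounts = {"mover1": (0.0, 0.0, 5.0), "mover2": (8.0, 0.0, 5.0),
          "mover3": (0.0, 6.0, 5.0), "mover4": (8.0, 6.0, 5.0),
          "mover5": (4.0, 6.0, 5.0)}

# Run this every DMX refresh tick; the targets can move independently.
for name, target_name in assignments.items():
    aim(mounts[name], targets[target_name])
```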