[Ohrrpgce] Slice based menus
Ralph Versteegen
teeemcee at gmail.com
Wed Feb 22 01:21:43 PST 2012
On 22 February 2012 07:32, Mike Caron <caron.mike at gmail.com> wrote:
> On 2012-02-21, at 12:36 PM, James Paige <Bob at hamsterrepublic.com> wrote:
>> On Tue, Feb 21, 2012 at 07:47:06PM +1300, Ralph Versteegen wrote:
>>> On 21 February 2012 06:46, Jay Tennant <hierandel8 at crazyleafgames.com> wrote:
>>>>> From: James Paige <Bob at HamsterRepublic.com>
>>>>> Sent: Monday, February 20, 2012 10:00 AM
>>>>>
>>>>> I have been reading this whole thread, and I have to admit, I am totally
>>>>> confused. I am not sure at all what you are working on. Would you mind
>>>>> explaining this project to me?
>>>>>
>>>>> ---
>>>>> James
>>>>
>>>> Of course. I'll explain the problem, goal, and implementation of
>>>> this project.
>>>
>>> I'll chip in as well...
>>>
>>>> 1. Problem: the OHRRPGCE GUI lacks a uniform implementation
>>>>
>>>> I like the OHRRPGCE's interface, but it is difficult to upgrade
>>>> it to be point-and-clickable. Additional monitoring would be
>>>> necessary to capture the mouse coordinates and button presses.
>>>>
>>>> While that can be done for one menu, the lack of a uniform
>>>> implementation makes it troublesome to monitor each menu. And
>>>> if there were interest in adding more buttons later on,
>>>> understanding the context and rewriting the monitoring code
>>>> could become daunting.
>>>>
>>>> It would be easier if the menu items were objects that could
>>>> respond to operations performed on them--encapsulation. It
>>>> would be easier if all menu objects interfaced the same way.
>>>> It would be easier if there was a manager that uniformly
>>>> passed messages to the appropriate menu objects that are the
>>>> focus of the user's input.
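A minimal sketch of that idea in C++ (the language of Jay's framework; the engine itself is FreeBASIC). All names here, GuiMessage, GuiObject, Button, are hypothetical, not the actual API:

```cpp
#include <iostream>
#include <string>

// Hypothetical message ids; the real set would be larger.
enum GuiMessage { GUI_CLICK, GUI_KEYPRESS, GUI_DRAW };

// Every widget exposes the same interface: one message procedure.
struct GuiObject {
    virtual ~GuiObject() {}
    // Returns true if the widget consumed the message.
    virtual bool handleMessage(GuiMessage msg, int param) = 0;
};

struct Button : GuiObject {
    std::string label;
    explicit Button(std::string l) : label(l) {}
    bool handleMessage(GuiMessage msg, int param) override {
        if (msg == GUI_CLICK) {
            std::cout << "button '" << label << "' clicked\n";
            return true;
        }
        return false;  // not handled; a manager could forward it elsewhere
    }
};
```

Because every widget shares the one entry point, a manager can dispatch to any of them without knowing their concrete types.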
>>>>
>>>> 2. Goal: create a GUI system that brings uniformity and
>>>> flexibility to the rendering, interfacing, and management of
>>>> menus in the engine.
>>>
>>> This is for the "graphical" menus in Custom, of which there aren't
>>> many currently: font, sprite, tile, and map editors. The others are
>>> just lists of menu items and aren't included in this.
>>>
>>> The map and sprite editors have lots of different widgets visible
>>> onscreen, and the code for them is a mess because we have these
>>> monolithic functions that handle everything instead of splitting
>>> things up.
>>
>> The map editor and sprite editor code is terrible because it is some of
>> the oldest code in the engine, and because I knew nothing about good
>> design when I wrote it :)
>>
>> There is nothing inherently difficult about creating a complex irregular
>> screen layout that prevents the code from being well organized-- it just
>> doesn't happen to be that way right now :)
>>
>> I love the idea of being able to add callbacks to slices. Consider the
>> sprite editor. I like the idea of having button slices that run a
>> callback if you click on them. They could also have a key shortcut
>> callback so that the existing keyboard shortcuts for them could continue
>> to work.
>>
>> I also like the idea of mouse-down and mouse-up callbacks for the sprite
>> drawing area. I was originally thinking that we would just use a slice
>> to specify the position where the drawing area is located on the screen,
>> and hand-code the rest, but it would be much better if the slice could
>> receive mouse and keyboard events and pass them on to
>> appropriate sprite-drawing handlers.
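A sketch of what such callback-carrying slices could look like, in C++ rather than the engine's FreeBASIC, with entirely hypothetical names (Slice, onClick, onKey are illustrative, not the real slice code):

```cpp
#include <functional>
#include <vector>

// A slice that optionally carries callbacks for mouse and keyboard input.
struct Slice {
    int x, y, w, h;
    std::function<void()> onClick;       // run when clicked inside the rect
    std::function<bool(int key)> onKey;  // keyboard shortcut handler
};

// Dispatch a mouse click to the first slice containing the point.
// Returns true if some slice handled it.
bool dispatchClick(std::vector<Slice>& slices, int mx, int my) {
    for (auto& s : slices) {
        if (mx >= s.x && mx < s.x + s.w &&
            my >= s.y && my < s.y + s.h && s.onClick) {
            s.onClick();
            return true;
        }
    }
    return false;
}
```

The same lookup could route key presses through onKey, so the existing keyboard shortcuts keep working alongside the mouse.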
>>
>>>> This is a non-OS specific goal. Everything is rendered inside
>>>> the engine, as it is now. The objective is to make widgets
>>>> that are intuitive for users. It is a native implementation.
>>>
>>> "native implementation"? What does "native" mean anyway? I always
>>> thought it meant provided by/tightly dependent on the platform
>>> (OS/hardware).
>>>
>>> There was also another goal: look into whether it would be possible to
>>> abstract different GUI frameworks. Maybe we could draw widgets using
>>> either slices, or something like wxWidgets (which I was especially
>>> interested in, due to my gfx_wx attempt)
>>>
>>>> A GUI manager can create, destroy, and pass messages to
>>>> widgets. A widget can optionally process the messages, or
>>>> delegate the message processing to another widget. All
>>>> widgets have a message procedure.
>>>>
>>>> Functionally, this is very similar to Windows' design.
>>>> Messages generated by the GUI manager are added to a queue,
>>>> then processed in the order they are added.
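The queue-then-process design could be sketched like this in C++ (GuiManager, post, pump are assumed names for illustration, not the code under discussion):

```cpp
#include <queue>

// A message addressed to some widget.
struct Msg { int id; int target; };

class GuiManager {
    std::queue<Msg> pending;
public:
    // Posted messages are queued, not dispatched immediately.
    void post(Msg m) { pending.push(m); }

    // Drain the queue in FIFO order, handing each message to the
    // dispatcher. Returns how many messages were processed.
    template <typename Handler>
    int pump(Handler dispatch) {
        int n = 0;
        while (!pending.empty()) {
            dispatch(pending.front());
            pending.pop();
            ++n;
        }
        return n;
    }
};
```

Messages come back out in the order they were posted, which is the property being described above.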
>>>
>>> Why queue messages instead of dispatching them immediately? So that
>>> widgets can peek ahead?
>>>
>>>> 3. Implementation: GuiManager, GuiObject, GUI_* messages,
>>>> input synthesis, and rendering
>>>>
>>>> I've already been working on this since October of last year,
>>>> though I stopped somewhere in late December (work, etc.). I
>>>> developed a GuiManager object, which provides methods for
>>>> sending and posting messages, changing focus, and synthesizing
>>> input to appropriate GuiObjects. A GuiObject is an implementation
>>>> of a widget. I successfully created a Button and Text widget,
>>>> and a custom widget to test the button and text on.
>>>
>>> Jay mentioned that he was working on a GUI framework written in C++;
>>> not directly OHR-related at the time, but I believe built on
>>> gfx_directx? I said that the OHR needed such a thing to clean up the
>>> map and sprite editors, so I'd like to see whether we could reuse any
>>> of his code.
>>>
>>> Jay's uses message passing instead of callbacks/methods like GTK+ and
>>> wxWidgets do (actually I think wxWidgets does both, it's a mess).
>>> Methods don't work as well when your language doesn't support
>>> inheritance or virtual methods :) Message passing also has the
>>> advantage that you can easily catch messages of any kind and forward
>>> them to something else. Messages allow a type of inheritance. The
>>> disadvantage is slightly less clean code.
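The "inheritance via forwarding" point can be made concrete with a tiny C++ sketch (names are mine, purely illustrative): a wrapper catches every message, of any kind, then delegates to the wrapped handler.

```cpp
#include <functional>

// A handler takes a message id and reports whether it consumed it.
using Handler = std::function<bool(int msg)>;

// Wrap a handler so every message is observed before being forwarded
// unchanged. This is the message-passing analogue of overriding a
// method and calling the base class.
Handler withLogging(Handler inner, int* count) {
    return [inner, count](int msg) {
        ++*count;            // see every message, whatever its kind
        return inner(msg);   // then forward to the original handler
    };
}
```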
>>>
>>> But James, I know that you have tonnes of experience with building
>>> applications with pyGTK, and I have very little experience with GUIs,
>>> so ultimately I think you should have final say on anything.
>>
>> Callbacks are definitely what we want. If there are technical barriers
>> that prevent us from using callbacks, then some other message-handling
>> scheme is okay, if that is the only way to "fake" object-oriented
>> callbacks. Since FB is lacking in inheritance features, maybe messages
>> are the right way to go.
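For comparison, callbacks can also be faked without language-level inheritance by storing a hand-rolled table of function pointers, the way C programs do. A sketch (in C++ syntax, but nothing here needs virtual methods; all names are hypothetical):

```cpp
// A per-"class" table of callbacks, shared by every instance.
struct WidgetOps {
    int (*onClick)(void* self);
    int (*onKey)(void* self, int key);
};

// An "instance": a pointer to its ops table plus its own state.
struct Counter {
    WidgetOps* ops;
    int clicks;
};

int counterClick(void* self) {
    return ++static_cast<Counter*>(self)->clicks;
}
int counterKey(void* self, int key) {
    return key;  // echo; a real widget would map shortcuts to actions
}

WidgetOps counterOps = { counterClick, counterKey };
```

This is essentially a vtable by hand, so the callbacks-vs-messages choice is less about what the language permits than about which style keeps the code cleaner.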
>
> Just to provide some historical perspective, I can think of four
> reasons Windows uses messages instead of callbacks:
>
> 1. Back in the 80s, when they were coming up with this stuff, memory
> was tight, and they probably preferred storing one function pointer,
> rather than potentially dozens per window.
>
> 2. There are messages other than the ones defined by the window
> manager. Basically every control and application defined their own
> messages, and the window manager had no way of keeping track of them.
> So, it just said "I'll give them all to you, and you can figure it
> out."
>
> 3. Having a place where the application stops and actively seeks out
> messages is a good way of ensuring that you are allowed to run code on
> that application thread without introducing all kinds of interesting
> problems (eg, signal handlers in POSIX environments).
>
> 4. Applications run in different address spaces, so your event
> handling function pointers don't mean anything to other applications.
> This causes problems because applications need to send messages to
> each other. Messages allow applications to just send the data and let
> the kernel worry about where it goes.
Those are all important, but I insist that the couple I mentioned are
also very real benefits.
Not just Windows applications use a central message handler, after
all: GTK+, X11, wxWidgets, SDL, fbgfx all do too, sometimes
implementing callbacks on top. Can't say I actually understand how the
heck Cocoa works (I don't understand ObjC under the hood), but I think
it might ironically be the exception (of course *everything* is a
"message" on OSX).
Of course, all these examples are totally irrelevant, for exactly the
reasons you gave next...
> That all said, in this case everything is single threaded in the same
> application in 2012 so none of them apply to this situation. I suspect
> that means you actually want to use callbacks instead of message
> passing.
Actually, we might not be single threaded. I would really love the
ability to open multiple editors/copies of an editor (for comparing
two enemies, for example), and the easiest way to support that may be
to run each in its own thread (and make most of the allmodex state
per-thread). However in that case we would probably be doing
per-thread GUI anyway.
Anyway... I'm not at all opposed to using individual callbacks instead
of message passing. I think there's little difference either way, so
if everyone else prefers function pointers ...