This is the mail archive of the ecos-discuss@sources.redhat.com mailing list for the eCos project.



Re: using the Qt virtual frame buffer (qvfb) from ecos synth. target


>>>>> "Alexander" == Alexander Neundorf <neundorf@kde.org> writes:

    Alexander> Then a general frame buffer API is needed. We could try
    Alexander> to implement it compatible to the linux frame buffer
    Alexander> (with ioctl()'s and stuff). Or a native ecos
    Alexander> framebuffer API. This could help the GUI stuff for
    Alexander> ecos.

A general frame buffer API for eCos is something I have been thinking
about recently. It is a hard problem: frame buffer hardware varies
widely. I think an object-based API is needed, based around cyg_frame
structures. There would be a generic package CYGPKG_IO_FRAME to define
the API, and various frame buffer device drivers below
CYGPKG_DEVS_FRAME.

An embedded system may have multiple displays. Usually there will be
only one, so the API should allow that case to be optimized with
macros and inline functions, but you don't want the API to preclude
multiple displays.

Each display may run in several different modes, affecting the
resolution and colour depth. Ideally a single mode would be configured
statically, but again you do not want to preclude dynamic mode
changing in a generic API.

Sometimes the distinction between displays and modes is unclear. For
example, an LCD controller may support picture-in-picture mode,
effectively giving you two separate frame buffers talking to the same
LCD display. Double-buffered displays for animation involve a similar
complication. Hence it is probably best to encapsulate both displays
and screen modes in a single cyg_frame structure.
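
As a very rough sketch, such a structure might combine the mode
information with the driver-supplied operations. All the field names
below are hypothetical, chosen only to illustrate the shape; they
anticipate points discussed further down:

  #include <cyg/infra/cyg_type.h>

  /* Hypothetical sketch only, not a proposed API. Each cyg_frame
     describes one display running in one particular mode. */
  typedef struct cyg_frame cyg_frame;
  struct cyg_frame {
      /* Mode information: resolution, depth and memory layout. */
      cyg_ucount16 fb_width;       /* resolution in pixels         */
      cyg_ucount16 fb_height;
      cyg_ucount16 fb_depth;       /* bits per pixel               */
      cyg_ucount16 fb_stride;      /* bytes per scan line          */
      void*        fb_base;        /* linear frame buffer, or NULL */

      /* Operations supplied by the device driver. */
      int  (*fb_init)(cyg_frame*);
      void (*fb_plot_pixel)(cyg_frame*, cyg_ucount16 x, cyg_ucount16 y,
                            cyg_uint32 colour);
      void (*fb_flush)(cyg_frame*);
  };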

For any given target there would be one or more cyg_frame objects,
one per display/mode combination. Each object would be controlled by
a configuration option, and each option would implement an interface
CYGINT_IO_FRAME_BUFFERS. This would allow other packages to
"require { CYGINT_IO_FRAME_BUFFERS > 0 }", and would provide an easy
route to optimization:

  #if (1 == CYGINT_IO_FRAME_BUFFERS)
    ...
  #else
    ...
  #endif

An init call like "cyg_frame_init(cyg_frame* buffer)" would just
access a function pointer in the cyg_frame structure. The exact
operation would depend on the circumstances. It might just initialize
a display, or it might change a screen mode. There should be at least
one init() call per active display. The application developer or
higher-level code would decide which cyg_frame objects to initialize -
possibly they should be held in a table. Arguably, if there is only a
single cyg_frame then it should be self-initializing.
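
A minimal sketch of that init path, assuming the hypothetical
cyg_frame fields above and an int return for error reporting:

  /* Sketch only: just delegate to the driver-supplied function
     pointer. The driver decides whether this powers up a display,
     programs a screen mode, or both. */
  static inline int
  cyg_frame_init(cyg_frame* buffer)
  {
      return (*buffer->fb_init)(buffer);
  }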

The next issue is access to the frame buffer. On sensible hardware
you'll have a linear frame buffer which higher-level code can
manipulate directly. The cyg_frame structure should contain base
address and stride information. Unfortunately not all hardware is
sensible, so it may instead be necessary to go via a plot_pixel()
function pointer in the cyg_frame structure, and perhaps also provide
functions for common operations like filling rectangles and block
copies. That sort of thing can help performance on e.g. a PC VGA
display in various modes. Care must be taken not to define too many of
these functions and to only define them when the driver can really
provide a more efficient operation than normal: because they are
accessed via function pointers in a cyg_frame structure they cannot be
removed by linker garbage collection.
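
As a concrete but again hypothetical illustration, a pixel helper
could prefer the linear frame buffer and only fall back to the
driver's routine when there is none. This assumes the sketched fields
above and an 8 bits per pixel display for the direct path:

  /* Sketch only: write directly when a linear frame buffer exists,
     otherwise use the driver's plot routine. */
  static inline void
  cyg_frame_plot_pixel(cyg_frame* frame, cyg_ucount16 x, cyg_ucount16 y,
                       cyg_uint32 colour)
  {
      if (frame->fb_base != NULL) {
          cyg_uint8* row = (cyg_uint8*)frame->fb_base
                           + (y * frame->fb_stride);
          row[x] = (cyg_uint8) colour;    /* assumes 8bpp */
      } else {
          (*frame->fb_plot_pixel)(frame, x, y, colour);
      }
  }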

There would have to be a flush call, usually optimized to nothing, to
handle hardware which needs to be informed that the frame buffer has
been updated. That would be needed for the synthetic target, and
possibly for things like double-buffered animations.
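
A sketch of that, using the hypothetical fb_flush pointer from above;
a driver's header could additionally #define the whole call away when
the single configured driver needs no flush:

  /* Sketch only: do nothing unless the driver supplies a flush
     routine, as a synthetic target driver would need to. */
  static inline void
  cyg_frame_flush(cyg_frame* frame)
  {
      if (frame->fb_flush != NULL) {
          (*frame->fb_flush)(frame);
      }
  }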

Screen resolution can be managed by two fields in the cyg_frame
structure. It is also possible to optimize the corresponding accessors
when they are exported in the device driver's header:

  #ifdef CYGVAR_I386_PC_FRAME_MODE_X
  extern cyg_frame	cyg_frame_vga_modex;

  # if (1 == CYGINT_IO_FRAME_BUFFERS)
  #  define cyg_frame_width(_frame_)	320
  #  define cyg_frame_height(_frame_)	240
  # endif

  #endif

  ...

  #if (1 != CYGINT_IO_FRAME_BUFFERS)
  #  define cyg_frame_width(_frame_)	((_frame_)->fb_width)
  #  define cyg_frame_height(_frame_)	((_frame_)->fb_height)
  #endif
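
Higher-level code could then be written against these accessors
without caring whether they expand to constants or to structure
reads. For example, hypothetical and reusing the plot helper sketched
earlier:

  /* Clear the whole display to a single colour. With only the Mode X
     frame buffer configured, the loop bounds become the constants
     320 and 240. */
  void
  clear_screen(cyg_frame* frame, cyg_uint32 colour)
  {
      cyg_ucount16 x, y;
      for (y = 0; y < cyg_frame_height(frame); y++) {
          for (x = 0; x < cyg_frame_width(frame); x++) {
              cyg_frame_plot_pixel(frame, x, y, colour);
          }
      }
  }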


Colour gets messy. 24-bit colour may be ubiquitous on desktop systems
these days, but not for embedded systems. It is still necessary to
worry about monochrome, grey scale, and paletted colour displays of
various depths. I don't yet have any clear ideas on how to handle
this in a generic API.

Then there are issues like hardware mouse cursors or, if you want to
get really messy, 3D acceleration. These are probably best ignored for
now.

Bart

-- 
Bart Veer                       eCos Configuration Architect
http://www.ecoscentric.com/     The eCos and RedBoot experts
Visit us at Embedded World 2005, Nürnberg, Germany, 22-24 Feb, Stand 11-124


