Please find attached a Remote UI BoF FAQ, which explains some of the problems
with existing protocols and clarifies the scope of the proposed work. To
discuss this in more detail, a mailing list is available at
remoteui(_at_)ietf(_dot_)org and you can subscribe here.
Also, a starting point for this work can be found here.
Frequently Asked Questions - Remote UI BoF
1. What problem is the Remote UI protocol trying to solve?
Remote UI is a mechanism that enables user interfaces to be rendered on
devices other than those that have the application logic. The Remote UI
protocol provides the mechanism that enables a client device to generate a UI,
received from a server device, using the client device's native UI
capabilities, while keeping the UI synchronised with the application logic.
2. Who needs this solution?
Device manufacturers invest a lot of energy in optimising their devices for
certain environments. Because the devices are intended for a diverse range of
purposes, their UI capabilities can vary considerably; screen size and aspect
ratio, colour depth, windowing systems with various widget sets, and input
methods make the environment highly heterogeneous. At the same time,
application developers and UI designers try to create user interfaces that are
highly optimised for the rendering platform, so that the user experience is
improved by making the respective application easy to learn and use.
Therefore, when a UI is rendered on a device other than the one that is
running the application logic, provisions need to be made so that the user can
perceive the UI as a local application, making it intuitively usable. For
example, an application UI may appear differently when remoted to a Windows XP
desktop than to a mobile phone.
3. Why can this not be done with existing protocols?
Existing protocols can be split into two categories: framebuffer-level and
graphics-level.
In a framebuffer-level protocol, the contents of the framebuffer (i.e. the
individual pixels) are copied across the network to a framebuffer on the
client. To avoid sending the full screen every time something changes on the
screen, these protocols typically send only the pixels that have changed
inside the clipping regions. Examples of such protocols are VNC and protocols
based on T.120, such as Microsoft's RDP.
In a graphics-level protocol, the drawing requests to the graphical device
interface (GDI), such as DrawLine(), DrawString(), etc., are copied across the
network. The client is responsible for interpreting these commands and
rendering the lines, rectangles, strings, etc. in its framebuffer. An example
of such a protocol is the X Window System.
The problem with these approaches is that, in order to render the UI, the
clients blindly follow the instructions received from the server; they have no
means to influence the appearance of the UI. They simply render it using the
graphical elements and instructions that the server provides, which are
specific to the server platform.
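The contrast between the two categories can be sketched as the shape of the messages each one puts on the wire. The message layouts and function names below are purely illustrative, not any real protocol's format:

```python
# Framebuffer-level: ship only the pixels inside a changed clipping
# region, as VNC-style protocols do; the client blits them verbatim.
def framebuffer_update(x, y, width, height, pixels):
    return {"type": "fb_update", "rect": (x, y, width, height),
            "pixels": pixels}

# Graphics-level: ship the drawing request itself, as X-style protocols
# do; the client rasterises it into its own framebuffer.
def draw_string(x, y, text, font="default"):
    return {"type": "DrawString", "x": x, "y": y,
            "text": text, "font": font}

# Either way the client follows the server blindly: it cannot, for
# example, substitute a native button widget for the pixels or line
# segments that happen to depict one.
update = framebuffer_update(10, 10, 2, 1, [0xFFFFFF, 0x000000])
command = draw_string(10, 20, "OK")
```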
4. What are the pieces, building blocks that are needed to build this solution?
To correct the problems pointed out with the existing protocols, a mechanism
is needed for sending information about the widgets, or widget trees, to the
client, so that it can generate the UI using the client's native widgets. This
mechanism usually has three components: a UI description language, a UI
remoting protocol, and session setup.
The UI description language contains the descriptions of the widgets, their
properties, and the relationships between the widgets. Typically the
widget-level UI descriptions are augmented with stylesheets containing hints
such as the preferred colours, layout, background picture, etc. Examples of
description languages are Mozilla's XUL and OASIS' UIML.
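To make the widget-tree idea concrete, here is a hypothetical widget-level description sketched as Python data rather than the XML that languages like XUL or UIML actually use; all widget names and hint keys are invented for illustration:

```python
# A widget tree for a small login form. The client maps each abstract
# widget ("textbox", "button", ...) onto its own native widget set.
login_form = {
    "widget": "window", "id": "login", "title": "Sign in",
    "children": [
        {"widget": "textbox", "id": "user", "label": "User name"},
        {"widget": "textbox", "id": "pass", "label": "Password",
         "masked": True},
        {"widget": "button", "id": "ok", "label": "OK"},
    ],
    # Stylesheet-like hints: the client may honour or ignore these and
    # still render a perfectly usable native UI.
    "hints": {"layout": "vertical", "preferred-colour": "#003366"},
}

def widget_ids(node):
    """Collect the ids in a widget tree, depth-first."""
    ids = [node["id"]]
    for child in node.get("children", []):
        ids.extend(widget_ids(child))
    return ids
```

Because the description carries intent (a button, a masked text box) rather than pixels or drawing commands, a phone and a desktop can both render it idiomatically.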
The UI remoting protocol is the transport protocol responsible for
communicating (partial) UI updates from the server to the client, and UI
events, triggered through changes in widget states made by the user, from the
client to the server. Usually a UI update can have three meanings:
* UI Initialisation - an application can send the full description of
the UI at session setup.
* UI Fragment - an application sends partial descriptions of the UI as
it becomes aware of which parts of the UI will be used, depending on the
runtime status.
* UI Update - an application sends notifications when widget properties
change due to changes in application state.
The actual structures that are exchanged with UI updates and UI events are
described using the UI description language.
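The three update kinds, plus a client-side event, can be sketched as message constructors. This assumes the payloads are fragments of the UI description language; the message names and field layout here are made up for illustration:

```python
def ui_init(full_description):
    """Full UI description, sent once at session setup."""
    return {"msg": "ui-init", "ui": full_description}

def ui_fragment(parent_id, partial_description):
    """Partial description, grafted under an existing widget."""
    return {"msg": "ui-fragment", "parent": parent_id,
            "ui": partial_description}

def ui_update(widget_id, property_name, value):
    """Server -> client: a widget property changed in the application."""
    return {"msg": "ui-update", "widget": widget_id,
            "property": property_name, "value": value}

def ui_event(widget_id, event, state):
    """Client -> server: the user changed a widget's state."""
    return {"msg": "ui-event", "widget": widget_id,
            "event": event, "state": state}

# e.g. the server disables a button, then the user fills in a field:
server_to_client = ui_update("ok", "enabled", False)
client_to_server = ui_event("user", "changed", "alice")
```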
Session setup is responsible for identifying compatible servers and clients
and initiating the UI remoting session between them. A client is compatible
with a server when they support the same UI remoting protocol and the same UI
description language. Examples of session setup protocols are SIP/SDP and
UPnP.
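The compatibility rule above amounts to a capability intersection. In practice the capabilities would travel inside an SDP offer or a UPnP device description; the structures and protocol/language tokens below are illustrative only:

```python
# A client and server are compatible when they share at least one UI
# remoting protocol AND at least one UI description language.
def compatible(client_caps, server_caps):
    protocols = set(client_caps["protocols"]) & set(server_caps["protocols"])
    languages = set(client_caps["languages"]) & set(server_caps["languages"])
    return bool(protocols) and bool(languages)

client = {"protocols": ["rui/1.0"], "languages": ["UIML", "XUL"]}
server = {"protocols": ["rui/1.0"], "languages": ["UIML"]}
# compatible(client, server) is True: they agree on rui/1.0 and UIML.
```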
5. What is the goal of the Remote UI effort in IETF?
Create a UI remoting protocol for applications that are using UI Description
Languages.
From: ext Pekka Savola [mailto:pekkas(_at_)netcore(_dot_)fi]
Sent: 30 June, 2005 20:42
To: Dave Crocker
Cc: Stirbu Vlad (Nokia-TP-MSW/Tampere); ietf(_at_)ietf(_dot_)org
Subject: Re: Remote UI BoF at IETF63
On Thu, 30 Jun 2005, Dave Crocker wrote:
Are there any materials to use as input, either as exemplars or
possibly even as a beginning specification? This latter would be
particularly helpful for the success of the effort.
FWIW, when I read the BoF description, I started wondering about the
same things. Specifically I started thinking about the half a dozen
protocols that already exist in this space (vnc, rdp, ...). It'd be
very good to understand the scope of the work, and how the existing
protocols fit (or don't fit) the bill.
Pekka Savola                 "You each name yourselves king, yet the
Netcore Oy                    kingdom bleeds."
Systems. Networks. Security.  -- George R.R. Martin: A Clash of Kings