This particular wording dates to when libei created devices and the EIS
implementation would ack/nack those devices. This isn't the case anymore
so let's reword this a bit.
Closes #62
libreis was intended for an intermediary to set some information that
the libei client cannot be entrusted with. In particular this was the
application name, the allowed capabilities, and some properties that -
once set - the client could no longer change (appid as probably the only
really useful one). The price for this was a rather complicated version
negotiation dance before the initial CONNECT request.
Now that we have a clear view of what's going to happen -
RemoteDesktop.ConnectToEIS and the InputCapture portal - there is no
longer any need for libreis. The extra information that libreis would've
sent is communicated out-of-band in both portals and is known to the
compositor at the time the connection is being established.
So we can simply drop this; it's no longer required, and dropping it
makes the protocol significantly simpler anyway.
libei used to have direct portal support code (see the git history) but:
- that code was a custom proposed portal that never went anywhere
- libei has slowly changed to be more of an input event transport layer since
  it is now also used for sending events *to* a libei context
- a number of libei users will never need the DBus code, either because they
  don't want it or because they talk DBus themselves and don't need this
  abstraction.
Luckily, it's quite easy to move this into a separate library with a
simple API that does, effectively, the same trick as the old portal backend.
This API aims to be as simple as possible because tools that
require anything more complex should talk to DBus directly.
An example tool that uses the API to retrieve an EIS fd over the
RemoteDesktop portal is included in this patch.
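Roughly, the caller side then looks like the sketch below. The oeffis_*
names and the event name are placeholders to illustrate the flow rather than
the final API; only ei_setup_backend_fd() is an existing libei call:

    #include <poll.h>
    #include <libei.h>

    /* ei is the caller's existing struct ei context */
    struct oeffis *oeffis = oeffis_new(NULL);

    /* Ask the library to negotiate an EIS connection via the
     * RemoteDesktop portal */
    oeffis_create_session(oeffis);

    /* One fd to plug into the caller's poll/epoll loop */
    struct pollfd fds = { .fd = oeffis_get_fd(oeffis), .events = POLLIN };

    while (poll(&fds, 1, -1) > 0) {
            oeffis_dispatch(oeffis);
            if (oeffis_get_event(oeffis) == OEFFIS_EVENT_CONNECTED_TO_EIS) {
                    /* DBus is done, hand the fd over to libei */
                    ei_setup_backend_fd(ei, oeffis_get_eis_fd(oeffis));
                    break;
            }
    }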
"Öffis" is a German word meaning public transport. It also sounds like the
French Œuf, the word for egg.
Co-authored-by: Olivier Fourdan <ofourdan@redhat.com>
At least for event replaying it looks like we'll go through the
RemoteDesktop portal instead of creating another one that doesn't
provide any additional benefits.
See https://github.com/flatpak/xdg-desktop-portal/pull/762
Xwayland is not a compositor, it's a Wayland client.
Also fix a few typos while at it (XWayland -> Xwayland).
Signed-off-by: Olivier Fourdan <ofourdan@redhat.com>
There's nothing in the protocol to modify the client device state from
the server, so a pause/resume cycle must leave the client with the
same(-ish) state. Pause is really just that, a short "no event now
please". Anything that would require e.g. modifying the device state by
releasing keys or buttons should result in the device being removed and
re-added.
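In EIS implementation terms that means something like the sketch below,
where the eis_device_* calls and the helper are assumed names, not
necessarily the real server-side API:

    /* Fine: a pause is just "no events for now", the client's device
     * state is preserved across it */
    static void
    suspend_briefly(struct eis_device *device)
    {
            eis_device_pause(device);
            /* ... some time later ... */
            eis_device_resume(device);
    }

    /* Not fine to mutate client state, e.g. to force-release keys.
     * Remove the device and advertise a fresh one instead. */
    static void
    force_release_all_keys(struct eis_seat *seat, struct eis_device *device)
    {
            eis_device_remove(device);
            add_new_keyboard_device(seat); /* hypothetical helper */
    }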
Signed-off-by: Peter Hutterer <peter.hutterer@who-t.net>
Build the doxygen API documentation. This is copied from libinput, so it
takes over that style (which is more readable than the default doxygen
style). Some extra documentation is added too, and all the immediate errors
are fixed in this commit, but doxygen still warns about undocumented
parameters.
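For reference, the style in use is plain doxygen comments, roughly like the
(illustrative, not necessarily exact) declaration below:

    /**
     * Set up the ei context to use the given socket to an EIS
     * implementation, e.g. an fd obtained through a portal.
     *
     * @param ei The ei context
     * @param fd A connected socket to the EIS implementation
     * @return zero on success or a negative errno on failure
     */
    int ei_setup_backend_fd(struct ei *ei, int fd);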
Signed-off-by: Peter Hutterer <peter.hutterer@who-t.net>
The original idea here was that a libei client could request the Pointer
capability to be notified of any pointer movements, thus providing a simple
way to capture input for the synergy use-case.
This is a can of worms better left untouched. How input events are captured
and what information is available is quite specific to the display server, let
alone the triggers for when it needs to start and stop. To have that in libei
requires something like triggers ("start when pointer hits the edge") which
again opens a new can of worms. Which seat are we referring to? What is a
screen edge? How about shortcuts?
Receiving input events can be handled by libeis anyway: any EIS server is
capable of receiving input events by definition, so the capability monitoring
could be solved by making the capturing compositor a libei client and the
other process an EIS server, i.e. the circle is closed with:
[compositor|libei] -> [EIS|synergy-client]
                                ||
[synergy-server|libei] -> [EIS|compositor]
Signed-off-by: Peter Hutterer <peter.hutterer@who-t.net>
This is a basic example of how an EIS server can map to uinput, just to
illustrate that there's nothing in libeis that requires a display server
protocol (X or Wayland). All that's done here is setting up a keyboard and
pointer device and routing the input events through those.
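The uinput leg is the usual libevdev-uinput dance, roughly as below (pointer
only shown, and the hookup to the libeis events is omitted):

    #include <libevdev/libevdev.h>
    #include <libevdev/libevdev-uinput.h>

    /* Create a virtual pointer device backed by uinput */
    static struct libevdev_uinput *
    create_pointer(void)
    {
            struct libevdev *dev = libevdev_new();
            struct libevdev_uinput *uinput = NULL;

            libevdev_set_name(dev, "eis demo pointer");
            libevdev_enable_event_type(dev, EV_REL);
            libevdev_enable_event_code(dev, EV_REL, REL_X, NULL);
            libevdev_enable_event_code(dev, EV_REL, REL_Y, NULL);
            libevdev_enable_event_type(dev, EV_KEY);
            libevdev_enable_event_code(dev, EV_KEY, BTN_LEFT, NULL);

            libevdev_uinput_create_from_device(dev,
                                               LIBEVDEV_UINPUT_OPEN_MANAGED,
                                               &uinput);
            libevdev_free(dev);
            return uinput;
    }

    /* Called for each relative motion event received over EIS */
    static void
    forward_motion(struct libevdev_uinput *uinput, double dx, double dy)
    {
            libevdev_uinput_write_event(uinput, EV_REL, REL_X, (int)dx);
            libevdev_uinput_write_event(uinput, EV_REL, REL_Y, (int)dy);
            libevdev_uinput_write_event(uinput, EV_SYN, SYN_REPORT, 0);
    }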
Signed-off-by: Peter Hutterer <peter.hutterer@who-t.net>
The current implementation of that portal has two methods: EmulateInput to
authenticate and Connect to get the fd to the EIS implementation. The portal
implementation is in charge of finding EIS and restricting it if need be.
This uses libsystemd because we can integrate that with epoll and our
libei_dispatch() method. GDBus requires a glib mainloop, so it's not really
suitable here. Given how simple this is anyway, it's easy to just do the DBus
bits in the caller and then hand the fd to ei_setup_backend_fd().
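I.e. something along these lines on the caller side, where the bus name,
object path and interface are placeholders and only the sd-bus calls and
ei_setup_backend_fd() are meant as-is:

    #include <fcntl.h>
    #include <systemd/sd-bus.h>
    #include <libei.h>

    static int
    connect_to_eis(struct ei *ei)
    {
            sd_bus *bus = NULL;
            sd_bus_error error = SD_BUS_ERROR_NULL;
            sd_bus_message *reply = NULL;
            int fd = -1, rc;

            rc = sd_bus_open_user(&bus);
            if (rc < 0)
                    return rc;

            rc = sd_bus_call_method(bus,
                                    "org.example.InputEmulation",  /* placeholders */
                                    "/org/example/InputEmulation",
                                    "org.example.InputEmulation",
                                    "Connect",
                                    &error, &reply, "");
            if (rc >= 0 && sd_bus_message_read(reply, "h", &fd) >= 0) {
                    /* the fd is owned by the reply, keep our own copy */
                    rc = ei_setup_backend_fd(ei, fcntl(fd, F_DUPFD_CLOEXEC, 0));
            }

            sd_bus_message_unref(reply);
            sd_bus_error_free(&error);
            sd_bus_unref(bus);
            return rc;
    }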
An eis-fake-portal is provided for testing; this "portal" can use the custom
portal bus name and connect the eis-demo-client to the eis-demo-server.
Signed-off-by: Peter Hutterer <peter.hutterer@who-t.net>
Short story: we can't do it, but it's simple enough to work around in the
caller, so let them do it.
Signed-off-by: Peter Hutterer <peter.hutterer@who-t.net>
Keymap handling is difficult because a lot of it relies on specific server
implementation details. So let's provide an API for a client to assign a
specific keymap to the device and for the server to accept/refuse/override
that keymap.
Where the server refuses, it's up to the client to figure out the rest.
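The flow would be something along these lines, with the keymap-related names
below being hypothetical and only there to show the accept/refuse handling:

    /* Client: offer an xkb keymap, passed as memfd + size */
    ei_device_set_keymap(device, EI_KEYMAP_TYPE_XKB_V1, keymap_fd, keymap_size);

    /* Client: the server's decision arrives later as an event */
    switch (ei_event_get_type(event)) {
    case EI_EVENT_KEYMAP_ACCEPTED:
            /* our keymap is in effect, interpret keycodes with it */
            break;
    case EI_EVENT_KEYMAP_REFUSED:
            /* use whatever keymap the server provides instead, or map
             * keycodes some other way - this is now our problem */
            break;
    }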
Signed-off-by: Peter Hutterer <peter.hutterer@who-t.net>