What is the proper way to scale fringe-bitmaps for high-DPI displays?

19 messages

What is the proper way to scale fringe-bitmaps for high-DPI displays?

Clément Pit-Claudel
Hi all,

Users of Flycheck are complaining about the poor readability of our fringe bitmaps on high-DPI monitors, as the bitmaps look tiny on such screens.  An easy fix is to double the size of the bitmaps, but that leaves users of low-DPI monitors out in the cold.  A trickier fix would be to dynamically detect the current monitor's density and pick the appropriate bitmap accordingly, but I'm not entirely sure how to detect these high-DPI monitors:

- x-display-monitor-attributes-list seems OK, but looks more complex than what we need (based on looking at the C code) — is it OK to call it repeatedly to figure out the current monitor's density for a given frame?

- x-display-pixel-width and x-display-mm-width seem simpler, but the documentation says 'On "multi-monitor" setups this refers to the pixel width for all
physical monitors associated with TERMINAL.'  What does this mean?
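
For concreteness, here's the kind of helper I'm imagining, using `frame-monitor-attributes' to narrow things down to the monitor that dominates a given frame.  The names and the 144 DPI cutoff below are made up, not existing Flycheck or Emacs APIs:

```elisp
;; Rough, untested sketch; the names and the 144 DPI cutoff are
;; arbitrary, not existing Flycheck or Emacs APIs.
(defun flycheck--frame-dpi (&optional frame)
  "Approximate horizontal DPI of the monitor dominating FRAME."
  (let* ((attrs (frame-monitor-attributes frame))
         (px (nth 2 (alist-get 'geometry attrs)))  ; monitor width, pixels
         (mm (car (alist-get 'mm-size attrs))))    ; monitor width, mm
    (/ (* 25.4 px) mm)))

(defun flycheck--high-dpi-p (&optional frame)
  (> (flycheck--frame-dpi frame) 144))
```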

Also, how do applications typically deal with frames being moved from a low-DPI monitor to a high-DPI one? Is that an issue in practice?

Thanks!
Clément.


Re: What is the proper way to scale fringe-bitmaps for high-DPI displays?

Eli Zaretskii
> From: Clément Pit-Claudel <[hidden email]>
> Date: Wed, 20 Mar 2019 10:55:43 -0400
>
> Users of Flycheck are complaining about the poor readability of our fringe bitmaps on high-DPI monitors, as the bitmaps look tiny on such screens.  An easy fix is to double the size of the bitmaps, but that leaves users of low-DPI monitors out in the cold.  A trickier fix would be to dynamically detect the current monitor's density and pick the appropriate bitmap accordingly, but I'm not entirely sure how to detect these high-DPI monitors:
>
> - x-display-monitor-attributes-list seems OK, but looks more complex than what we need (based on looking at the C code) — is it OK to call it repeatedly to figure out the current monitor's density for a given frame?

That'd be very inelegant, IMO.

Instead, I think when a frame is created, we should record its
high-DPI state in the frame structure, or maybe in the frame's
parameters, and then use that when we prepare the fringe bitmaps for
display.
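
In Lisp-visible terms, the shape of that idea would be something like the following sketch, where `display-scale-factor' is a hypothetical function standing in for whatever C-level detection ends up being exposed (it does not currently exist):

```elisp
;; Hypothetical sketch; `display-scale-factor' does not exist, it
;; stands in for C-level DPI detection exposed to Lisp.
(defun my-record-scale (frame)
  (set-frame-parameter frame 'scale-factor (display-scale-factor frame)))
(add-hook 'after-make-frame-functions #'my-record-scale)

;; The display code would then consult
;;   (frame-parameter frame 'scale-factor)
;; when preparing a fringe bitmap for that frame.
```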

> Also, how do applications typically deal with frames being moved from a low-DPI monitor to a high-DPI one? Is that an issue in practice?

If a frame can be moved from high-DPI to low-DPI, then I guess we will
need to query the low-level interfaces which report that in
update_frame or thereabouts.


Re: What is the proper way to scale fringe-bitmaps for high-DPI displays?

Clément Pit-Claudel
On 2019-03-20 13:32, Eli Zaretskii wrote:
>> From: Clément Pit-Claudel <[hidden email]>
>> Date: Wed, 20 Mar 2019 10:55:43 -0400
>>
>> Users of Flycheck are complaining about the poor readability of our fringe bitmaps on high-DPI monitors, as the bitmaps look tiny on such screens.  An easy fix is to double the size of the bitmaps, but that leaves users of low-DPI monitors out in the cold.  A trickier fix would be to dynamically detect the current monitor's density and pick the appropriate bitmap accordingly, but I'm not entirely sure how to detect these high-DPI monitors:
>>
>> - x-display-monitor-attributes-list seems OK, but looks more complex than what we need (based on looking at the C code) — is it OK to call it repeatedly to figure out the current monitor's density for a given frame?
>
> That'd be very inelegant, IMO.

Understood. Thanks for your input! (And for the quick answer)

> Instead, I think when a frame is created, we should record its
> high-DPI state in the frame structure, or maybe in the frame's
> parameters, and then use that when we prepare the fringe bitmaps for
> display.

That would be nice.  In fact, we already have code to detect high-DPI displays in C, in x_get_scale_factor in xterm.c (used to scale wavy underlines).  Would the way to go be to record the value returned by this function in the frame's parameters?
 
>> Also, how do applications typically deal with frames being moved from a low-DPI monitor to a high-DPI one? Is that an issue in practice?
>
> If a frame can be moved from high-DPI to low-DPI, then I guess we will
> need to query the low-level interfaces which report that in
> update_frame or thereabouts.

Got it. Thanks!

I guess that another approach would be to support scalable images in the fringe, rather than bitmaps.  If the fringe image was SVG instead of a bitmap, for example, we could make it fill the whole width of the fringe.

Do you have a sense of how hard that would be?

Clément.


Re: What is the proper way to scale fringe-bitmaps for high-DPI displays?

Clément Pit-Claudel
In reply to this post by Eli Zaretskii
On 2019-03-20 13:32, Eli Zaretskii wrote:

>> From: Clément Pit-Claudel <[hidden email]>
>> Date: Wed, 20 Mar 2019 10:55:43 -0400
>>
>> Users of Flycheck are complaining about the poor readability of our fringe bitmaps on high-DPI monitors, as the bitmaps look tiny on such screens.  An easy fix is to double the size of the bitmaps, but that leaves users of low-DPI monitors out in the cold.  A trickier fix would be to dynamically detect the current monitor's density and pick the appropriate bitmap accordingly, but I'm not entirely sure how to detect these high-DPI monitors:
>>
>> - x-display-monitor-attributes-list seems OK, but looks more complex than what we need (based on looking at the C code) — is it OK to call it repeatedly to figure out the current monitor's density for a given frame?
>
> That'd be very inelegant, IMO.
>
> Instead, I think when a frame is created, we should record its
> high-DPI state in the frame structure, or maybe in the frame's
> parameters, and then use that when we prepare the fringe bitmaps for
> display.

In addition to my previous message, I should add that other parts of Emacs would likely benefit from this as well, since we have code (see cb73c70180f57f3fb99fae3aaefbacf0a61cea3f) that computes DPI values in gamegrid.el.

Cheers,
Clément.



Re: What is the proper way to scale fringe-bitmaps for high-DPI displays?

Eli Zaretskii
In reply to this post by Clément Pit-Claudel
> Cc: [hidden email]
> From: Clément Pit-Claudel <[hidden email]>
> Date: Wed, 20 Mar 2019 15:34:26 -0400
>
> > Instead, I think when a frame is created, we should record its
> > high-DPI state in the frame structure, or maybe in the frame's
> > parameters, and then use that when we prepare the fringe bitmaps for
> > display.
>
> That would be nice.  In fact, we already have code to detect high-DPI displays in C, in x_get_scale_factor in xterm.c (used to scale wavy underlines).  Would the way to go be to record the value returned by this function in the frame's parameters?

The frame's parameters are a better way if we think such a parameter
will be useful to Lisp programs, and calling a function for that is
too much overhead.  Otherwise, a simple field of 'struct frame' will
be somewhat less hassle, because you don't need to mess with the likes
of frame-parameter to teach them about this new parameter.  But either
way, the job is not hard.

> I guess that another approach would be to support scalable images in the fringe, rather than bitmaps.  If the fringe image was SVG instead of a bitmap, for example, we could make it fill the whole width of the fringe.
>
> Do you have a sense of how hard that would be?

No, I don't, sorry.

And in any case, I don't think we should rely on SVG support for a
feature as basic as fringe bitmaps, since we use that in the most
basic display on GUI frames, whereas some people intentionally build
Emacs without SVG support.


Re: What is the proper way to scale fringe-bitmaps for high-DPI displays?

Clément Pit-Claudel
On 2019-03-20 15:44, Eli Zaretskii wrote:

>> Cc: [hidden email]
>> From: Clément Pit-Claudel <[hidden email]>
>> Date: Wed, 20 Mar 2019 15:34:26 -0400
>>
>>> Instead, I think when a frame is created, we should record its
>>> high-DPI state in the frame structure, or maybe in the frame's
>>> parameters, and then use that when we prepare the fringe bitmaps for
>>> display.
>>
>> That would be nice.  In fact, we already have code to detect high-DPI displays in C, in x_get_scale_factor in xterm.c (used to scale wavy underlines).  Would the way to go be to record the value returned by this function in the frame's parameters?
>
> The frame's parameters is a better way if we think such a parameter
> will be useful to Lisp programs, and calling a function for that is
> too much overhead.  Otherwise, a simple field of 'struct frame' will
> be somewhat less hassle, because you don't need to mess with the likes
> of frame-parameter to teach them about this new parameter.  But either
> way, the job is not hard.

Oh, so Emacs' C code would scale the bitmaps? I expected the Lisp code would do that.



Re: What is the proper way to scale fringe-bitmaps for high-DPI displays?

Eli Zaretskii
> Cc: [hidden email]
> From: Clément Pit-Claudel <[hidden email]>
> Date: Wed, 20 Mar 2019 16:05:48 -0400
>
> > The frame's parameters is a better way if we think such a parameter
> > will be useful to Lisp programs, and calling a function for that is
> > too much overhead.  Otherwise, a simple field of 'struct frame' will
> > be somewhat less hassle, because you don't need to mess with the likes
> > of frame-parameter to teach them about this new parameter.  But either
> > way, the job is not hard.
>
> Oh, so Emacs' C code would scale the bitmaps? I expected the Lisp code would do that.

Fringes are displayed in C.  Doing this in Lisp will produce
flickering, I'm afraid.


Re: What is the proper way to scale fringe-bitmaps for high-DPI displays?

Clément Pit-Claudel
On 2019-03-20 16:17, Eli Zaretskii wrote:

>> Cc: [hidden email]
>> From: Clément Pit-Claudel <[hidden email]>
>> Date: Wed, 20 Mar 2019 16:05:48 -0400
>>
>>> The frame's parameters is a better way if we think such a parameter
>>> will be useful to Lisp programs, and calling a function for that is
>>> too much overhead.  Otherwise, a simple field of 'struct frame' will
>>> be somewhat less hassle, because you don't need to mess with the likes
>>> of frame-parameter to teach them about this new parameter.  But either
>>> way, the job is not hard.
>>
>> Oh, so Emacs' C code would scale the bitmaps? I expected the Lisp code would do that.
>
> Fringes are displayed in C.  Doing this in Lisp will produce
> flickering, I'm afraid.

I thought the C code would read the scaling factor and set the bitmap accordingly just once, when creating overlays or applying text properties.


Re: What is the proper way to scale fringe-bitmaps for high-DPI displays?

Eli Zaretskii
> Cc: [hidden email]
> From: Clément Pit-Claudel <[hidden email]>
> Date: Wed, 20 Mar 2019 17:17:16 -0400
>
> >> Oh, so Emacs' C code would scale the bitmaps? I expected the Lisp code would do that.
> >
> > Fringes are displayed in C.  Doing this in Lisp will produce
> > flickering, I'm afraid.
>
> I thought the C code would read the scaling factor and set the bitmap accordingly just once, when creating overlays or applying text properties.

But you were saying that a frame can move from a high-DPI terminal to
a low-DPI one, which seems to mean we cannot compute that just once.
And besides, there are fringe bitmaps that we display regardless of
any overlays and text properties (e.g., truncation and continuation
indicators), which are displayed directly from C.


Re: What is the proper way to scale fringe-bitmaps for high-DPI displays?

Daniel Pittman-3
On Thu, Mar 21, 2019 at 3:32 AM Eli Zaretskii <[hidden email]> wrote:
> Cc: [hidden email]
> From: Clément Pit-Claudel <[hidden email]>
> Date: Wed, 20 Mar 2019 17:17:16 -0400
>
> >> Oh, so Emacs' C code would scale the bitmaps? I expected the Lisp code would do that.
> >
> > Fringes are displayed in C.  Doing this in Lisp will produce
> > flickering, I'm afraid.
>
> I thought the C code would read the scaling factor and set the bitmap accordingly just once, when creating overlays or applying text properties.

> But you were saying that a frame can move from a high-DPI terminal to
> a low-DPI one, which seems to mean we cannot compute that just once.

They can, at least on macOS, where that is entirely trivial to achieve by plugging an external (low DPI) monitor into a (high DPI) laptop with the panel open.  You could even enjoy the fun situation where your frame is displaying half the window on each of them, so technically has two different and concurrent densities.
 
> And besides, there are fringe bitmaps that we display regardless of
> any overlays and text properties (e.g., truncation and continuation
> indicators), which are displayed directly from C.

What you really want here is a resolution-independent unit for specifying the size of the output, or a macOS-like ability to give multiple-resolution bitmaps and have the most appropriate one selected by Emacs, yeah?  Anyway, I'd certainly say that having the C code scale the bitmap is the most reasonable "no changes to anything else" solution – macOS did that during the early transition to those high-density displays, and it worked out pretty well overall.  (Though they have a rendering model vastly less tied to physical units than most things.)

As a potentially useful aside in this context, HTML specifies that the "pixel" is a resolution-independent unit, and should probably approximate a 72 DPI display as the 1:1 logical:physical device.  I mention this because applying similar logic in Emacs would give the closest approximation of what people have been trained to expect in other media.  (Though the choice to not scale the content of the img tag... was not great.)

Re: What is the proper way to scale fringe-bitmaps for high-DPI displays?

Yuri Khan
On Thu, Mar 21, 2019 at 6:50 PM Daniel Pittman <[hidden email]> wrote:
>> But you were saying that a frame can move from a high-DPI terminal to
>> a low-DPI one, which seems to mean we cannot compute that just once.
>
> They can, at least on macOS, where that is entirely trivial to achieve by plugging an external (low DPI) monitor into a (high DPI) laptop with the panel open.  You could even enjoy the fun situation where your frame is displaying half the window on each of them, so technically has two different and concurrent densities.

I hear GTK+/Wayland also has this ability. Never tried it though.
GTK+/X is as far as I know constant DPI over the whole X display.

> What you really want here is a resolution independent unit for specifying the size of the output, or a macOS-alike ability to give multiple resolution bitmaps and have the most appropriate selected by Emacs, yeah?

I can see the following options:

* Migrate everything to SVG. Teach developers SVG is good, bitmaps are
bad. Package developer provides a single vector image. Failure mode:
developer is on a high DPI screen, makes a high-detail image, low DPI
users complain “image is blurry”.

* Keep bitmaps and upscale them for high DPI. Package developer
provides a single bitmap image. Failure mode 1: Nearest neighbor
upscaling looks ugly at non-integer factors. Failure mode 2: all other
upscaling algorithms look ugly pretty much always.

* Keep bitmaps and downscale them for low DPI. Package developer
provides a single, fairly large bitmap image. Failure mode: small
details get lost on low resolutions, image looks blurry.

* Migrate to multi-resolution bitmaps. Package developer has to
provide multiple bitmaps. Failure mode 1: Nobody knows what sizes they
need. Failure mode 2: Some will only include one for the low DPI. This
can be combined with up/downscaling, trading the corresponding failure
modes around.
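
For what it's worth, the integer-factor case of the second option is purely mechanical: each row of the bitmap has its bits doubled horizontally and is then emitted twice.  A sketch (not existing Emacs code):

```elisp
;; Sketch of 2x nearest-neighbor scaling for a fringe bitmap given as
;; a list of row bit-patterns; not existing Emacs code.
(defun my-double-row (row width)
  "Double each of the low WIDTH bits of ROW horizontally."
  (let ((out 0))
    (dotimes (i width out)
      (when (/= 0 (logand row (ash 1 i)))
        (setq out (logior out (ash 3 (* 2 i))))))))

(defun my-double-bitmap (rows width)
  "Scale ROWS (a list of WIDTH-bit integers) 2x in both dimensions."
  (apply #'append
         (mapcar (lambda (row)
                   (let ((r (my-double-row row width)))
                     (list r r)))   ; repeat each row vertically
                 rows)))

;; E.g. the 4-bit row #b0110 becomes the 8-bit row #b00111100, twice.
```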


> As a potentially useful aside in this context, HTML specifies that the "pixel" is a resolution-independent unit, and should probably approximate a 72 DPI display as the 1:1 logical:physical device.

Actually, the HTML pixel is specified as 1/96 of an inch.
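
So if Emacs adopted the same convention, the logical scale factor of a display would simply be its physical DPI divided by 96.  As an illustration (the helper name is made up):

```elisp
;; CSS reference pixel = 1/96 in, so device-pixel-ratio = DPI / 96.
(defun my-css-scale (dpi)
  (/ dpi 96.0))
;; (my-css-scale 96)  => 1.0   classic desktop monitor
;; (my-css-scale 192) => 2.0   typical high-DPI panel
```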


Re: What is the proper way to scale fringe-bitmaps for high-DPI displays?

Eli Zaretskii
> From: Yuri Khan <[hidden email]>
> Date: Thu, 21 Mar 2019 20:33:33 +0700
> Cc: Eli Zaretskii <[hidden email]>, Clément Pit-Claudel <[hidden email]>,
> emacs-devel <[hidden email]>
>
> * Migrate everything to SVG. Teach developers SVG is good, bitmaps are
> bad. Package developer provides a single vector image. Failure mode:
> developer is on a high DPI screen, makes a high-detail image, low DPI
> users complain “image is blurry”.

The failure mode that bothers me much more is that Emacs without SVG
support will be unable to show the standard fringe indicators.

In general, having the basic Emacs functionality depend on image
libraries is a non-starter, IMO.

> * Keep bitmaps and upscale them for high DPI. Package developer
> provides a single bitmap image. Failure mode 1: Nearest neighbor
> upscaling looks ugly at non-integer factors. Failure mode 2: all other
> upscaling algorithms look ugly pretty much always.

Is this worse than the current situation?

> * Keep bitmaps and downscale them for low DPI. Package developer
> provides a single, fairly large bitmap image. Failure mode: small
> details get lost on low resolutions, image looks blurry.

Is this worse than the current situation?

> * Migrate to multi-resolution bitmaps. Package developer has to
> provide multiple bitmaps. Failure mode 1: Nobody knows what sizes they
> need. Failure mode 2: Some will only include one for the low DPI. This
> can be combined with up/downscaling, trading the corresponding failure
> modes around.

Is this worse than the current situation?


Re: What is the proper way to scale fringe-bitmaps for high-DPI displays?

Dmitry Gutov
In reply to this post by Eli Zaretskii
On 20.03.2019 19:32, Eli Zaretskii wrote:

> Instead, I think when a frame is created, we should record its
> high-DPI state in the frame structure, or maybe in the frame's
> parameters, and then use that when we prepare the fringe bitmaps for
> display.

I'm all for scaling the standard bitmaps (question mark, etc.), but there
are also bitmaps that don't need scaling, such as the ones diff-hl makes
dynamically, depending on the current width of the fringe. We could use
the high-DPI state as a hint, e.g. to draw thicker lines, but proportional
scaling won't do in that case.

Just wanted to point that out. So a way to opt out of scaling would be
nice, at least.
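
For context, a width-dependent bitmap in the style diff-hl uses can be sketched like this (simplified; not diff-hl's actual code):

```elisp
;; Simplified sketch of a bitmap sized to the current fringe width;
;; not diff-hl's actual code.
(defun my-define-fringe-bar ()
  (let* ((width (or (car (window-fringes)) 8))  ; left fringe, pixels
         (row (1- (ash 1 (min width 16)))))     ; WIDTH bits, all set
    (define-fringe-bitmap 'my-fringe-bar (vector row) nil width '(top t))))

;; The bitmap already spans the whole fringe, so scaling it 2x would
;; overflow the fringe rather than improve it.
```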


Re: What is the proper way to scale fringe-bitmaps for high-DPI displays?

Eli Zaretskii
> Cc: [hidden email]
> From: Dmitry Gutov <[hidden email]>
> Date: Thu, 21 Mar 2019 17:24:44 +0200
>
> Just wanted to point that out. So a way to opt out of scaling would be
> nice, at least.

Right, thanks.


Re: What is the proper way to scale fringe-bitmaps for high-DPI displays?

Alex Gramiak
In reply to this post by Eli Zaretskii
Eli Zaretskii <[hidden email]> writes:

>> From: Yuri Khan <[hidden email]>
>> Date: Thu, 21 Mar 2019 20:33:33 +0700
>> Cc: Eli Zaretskii <[hidden email]>, Clément Pit-Claudel <[hidden email]>,
>> emacs-devel <[hidden email]>
>>
>> * Migrate everything to SVG. Teach developers SVG is good, bitmaps are
>> bad. Package developer provides a single vector image. Failure mode:
>> developer is on a high DPI screen, makes a high-detail image, low DPI
>> users complain “image is blurry”.
>
> The failure mode that bothers me much more is that Emacs without SVG
> support will be unable to show the standard fringe indicators.
>
> In general, having the basic Emacs functionality depend on image
> libraries is a non-starter, IMO.

What would be the issue in having this support be conditional if it is
otherwise the best solution?


Re: What is the proper way to scale fringe-bitmaps for high-DPI displays?

Clément Pit-Claudel
In reply to this post by Eli Zaretskii
On 2019-03-21 10:32, Eli Zaretskii wrote:

>> From: Yuri Khan <[hidden email]>
>> Date: Thu, 21 Mar 2019 20:33:33 +0700
>> Cc: Eli Zaretskii <[hidden email]>, Clément Pit-Claudel <[hidden email]>,
>> emacs-devel <[hidden email]>
>>
>> * Migrate everything to SVG. Teach developers SVG is good, bitmaps are
>> bad. Package developer provides a single vector image. Failure mode:
>> developer is on a high DPI screen, makes a high-detail image, low DPI
>> users complain “image is blurry”.
>
> The failure mode that bothers me much more is that Emacs without SVG
> support will be unable to show the standard fringe indicators.

I was expecting something like the toolbar images, where the package author would provide both an SVG and a bitmap, and Emacs would use the latter when SVG support isn't available.


Re: What is the proper way to scale fringe-bitmaps for high-DPI displays?

Clément Pit-Claudel
In reply to this post by Eli Zaretskii
On 2019-03-20 23:32, Eli Zaretskii wrote:

>> Cc: [hidden email]
>> From: Clément Pit-Claudel <[hidden email]>
>> Date: Wed, 20 Mar 2019 17:17:16 -0400
>>
>>>> Oh, so Emacs' C code would scale the bitmaps? I expected the Lisp code would do that.
>>>
>>> Fringes are displayed in C.  Doing this in Lisp will produce
>>> flickering, I'm afraid.
>>
>> I thought the C code would read the scaling factor and set the bitmap accordingly just once, when creating overlays or applying text properties.
>
> But you were saying that a frame can move from a high-DPI terminal to
> a low-DPI one, which seems to mean we cannot compute that just once.

True, but I wouldn't be too unhappy if switching from high to low DPI (or vice versa) didn't work so well.  I also don't know if it's actually possible to have different-DPI displays on GNU/Linux systems.

> And besides, there are fringe bitmaps that we display regardless of
> any overlays and text properties (e.g., truncation and continuation
> indicators), which are displayed directly from C.

Good point.



Re: What is the proper way to scale fringe-bitmaps for high-DPI displays?

Eli Zaretskii
In reply to this post by Alex Gramiak
> From: Alex <[hidden email]>
> Date: Thu, 21 Mar 2019 11:32:38 -0600
> Cc: Yuri Khan <[hidden email]>, [hidden email],
> [hidden email], [hidden email]
>
> > In general, having the basic Emacs functionality depend on image
> > libraries is a non-starter, IMO.
>
> What would be the issue in having this support be conditional if it is
> otherwise the best solution?

As an option, sure; patches to that effect are welcome.  But we should
make fringe bitmaps look reasonably good on high-DPI displays even
without that.  IOW, leaving it as it is now when SVG is not compiled
in is not an idea that we should welcome, IMO.


Re: What is the proper way to scale fringe-bitmaps for high-DPI displays?

Eli Zaretskii
In reply to this post by Clément Pit-Claudel
> Cc: [hidden email], [hidden email]
> From: Clément Pit-Claudel <[hidden email]>
> Date: Thu, 21 Mar 2019 13:38:45 -0400
>
> > The failure mode that bothers me much more is that Emacs without SVG
> > support will be unable to show the standard fringe indicators.
>
> I was expecting something like the toolbars images, where the package author would provide both an SVG and a bitmap, and Emacs would use the latter when SVG support isn't available.

I was talking about the standard indicator bitmaps, where we are "the
package author".