This ensures the window gets resized properly when a user changes
the scaler options in the GUI. Simply unlocking the window size on
a call to setGraphicsMode is not good enough, because the scaler
mode can be changed by games during mode switches, and we don't
want to reset the window size in that case.
|
32bpp cursor scaling is not available, but this should be fine as
many of the software scalers are not designed to work with >16bpp
data in any case.
This change also includes some minor cleanup of unnecessary #ifdefs
around code that works equally well with or without USE_RGB_COLOR,
to simplify the implementation.
|
Normally with SDL, a mouse motion event will be sent after the
system mouse cursor has been moved by a call to
SDL_WarpMouseInWindow, but if the system cursor cannot be moved
(e.g. because the window does not have mouse focus), games still
need to receive these mouse events so they can successfully update
the mouse position internally. Otherwise, games continue to think
the mouse is still in the original position and will continue to
try to perform whatever action is associated with that mouse
position.
Refs Trac#9689.
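A minimal sketch of the idea, assuming SDL2; the
deliverMouseEvent hook here is hypothetical, standing in for
however the backend actually pushes events to the game:

    #include <SDL.h>

    // Sketch: warp the hardware cursor when we can, otherwise hand
    // the target position straight to the game.
    void warpMouse(SDL_Window *window, int x, int y,
                   void (*deliverMouseEvent)(int, int)) {
        if (SDL_GetMouseFocus() == window) {
            // SDL generates an SDL_MOUSEMOTION event after this call.
            SDL_WarpMouseInWindow(window, x, y);
        } else {
            // The system cursor cannot be moved, but the game still
            // needs to see the new position or it will keep acting on
            // the stale one.
            deliverMouseEvent(x, y);
        }
    }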
|
The SDL graphics manager was just ignoring calls from CursorMan to
set the cursor to a blank cursor, which meant engines that did not
immediately send a cursor to CursorMan at startup would still show
the launcher's cursor (usually with a broken palette).
The OpenGL graphics manager would try to generate and draw an
invalid cursor surface when receiving an empty cursor.
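A sketch of the fix, with hypothetical names (the real state
lives inside the graphics managers): an empty cursor now means
"draw nothing".

    #include "graphics/surface.h"

    // Hypothetical cursor state; the real managers keep this
    // internally.
    struct CursorState {
        Graphics::Surface surface;
        bool visible;
    };

    // Treat an empty cursor as "draw nothing" instead of ignoring
    // the call (SDL) or building a texture from a zero-sized
    // surface (OpenGL).
    void setMouseCursor(CursorState &cursor, const void *buf,
                        uint w, uint h) {
        if (!buf || w == 0 || h == 0) {
            cursor.visible = false;
            cursor.surface.free();
            return;
        }
        cursor.visible = true;
        // ... copy buf into cursor.surface and update the backend ...
    }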
|
The only reason we show the system cursor outside the game area is
to show users where their mouse is when the window is resized and
the mouse is outside the game area. If the game cannot be
interacted with, then the mouse also does not need to be shown in
the black areas.
|
This patch refactors the OpenGL and SDL graphics backends,
primarily to unify window scaling and mouse handling, and to
fix coordinate mapping between the ScummVM window and the
virtual game screen when they have different aspect ratios.
Unified code for these two backends has been moved to a new
header-only WindowedGraphicsManager class, so named because it
contains code for managing graphics managers that interact with
a windowing system and render virtual screens within a larger
physical content window.
The biggest behavioral change here is with the coordinate
system mapping:
Previously, mouse offsets were converted by mapping the entire
window area onto the virtual game screen without maintaining
aspect ratio. This was done to prevent 'stickiness' when the
mouse cursor was within the window but outside of the virtual
game screen, but it caused noticeable distortion of mouse
movement speed on the axis with blank space.
Instead of introducing mouse speed distortion to prevent
stickiness, this patch makes the coordinate transformation
maintain aspect ratio, and avoids stickiness either by showing
the system cursor when the mouse moves outside of the virtual
game screen (when mouse grab is off), or by holding the mouse
inside the virtual game screen, rather than the entire window
(when mouse grab is on).
This patch also improves some other properties of the
GraphicsManager/PaletteManager interfaces:
* Nullipotent operations (getWidth, getHeight, etc.) of the
PaletteManager/GraphicsManager interfaces are now const
* Methods marked `virtual` but not overridden by any subclass
have been de-virtualized
* Extra unnecessary calculations of hardware height in
SurfaceSdlGraphicsManager have been removed
* Methods have been renamed where appropriate for clarity
(setWindowSize -> handleResize, etc.)
* C++11 support improved, with the `override` specifier added
to overridden virtual methods in subclasses (primarily to avoid
accidentally creating new methods in the subclasses by changing
types/names during refactoring)
Additional refactoring can and should be done at some point to
continue to deduplicate code between the OpenGL and SDL backends.
Since the primary goal here was to improve the coordinate mapping,
full refactoring of these backends was not completed here.
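A minimal sketch of the new mapping, with illustrative names
rather than the actual WindowedGraphicsManager code: fit the
virtual screen into the window preserving aspect ratio, then
transform (and, with mouse grab on, clamp) window coordinates
against that fitted rectangle.

    struct Rect { int x, y, w, h; };

    // Fit gameW x gameH into windowW x windowH, centred,
    // preserving the aspect ratio.
    Rect fitGameRect(int windowW, int windowH, int gameW, int gameH) {
        Rect r;
        if ((long long)windowW * gameH <= (long long)windowH * gameW) {
            r.w = windowW;                 // letterbox: bars top/bottom
            r.h = windowW * gameH / gameW;
        } else {
            r.h = windowH;                 // pillarbox: bars left/right
            r.w = windowH * gameW / gameH;
        }
        r.x = (windowW - r.w) / 2;
        r.y = (windowH - r.h) / 2;
        return r;
    }

    // Map a window-space point into game-screen space. Clamping to
    // the fitted rect is what "holding the mouse inside the virtual
    // game screen" means when grab is on; movement speed is the
    // same on both axes because the scale preserves aspect ratio.
    void windowToGame(const Rect &r, int gameW, int gameH,
                      int wx, int wy, int &gx, int &gy) {
        if (wx < r.x)           wx = r.x;
        if (wx > r.x + r.w - 1) wx = r.x + r.w - 1;
        if (wy < r.y)           wy = r.y;
        if (wy > r.y + r.h - 1) wy = r.y + r.h - 1;
        gx = (wx - r.x) * gameW / r.w;
        gy = (wy - r.y) * gameH / r.h;
    }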
|
This matches the other ScummVM and SDL APIs for mouse warp.
|
The GUI is not redrawn when the window size changes, but that is
not as important as being able to resize the games themselves.
|
There is no particular reason why backends that don't need to
calculate screen dimensions in advance should still need to
implement initSizeHint at this point.
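A sketch of what that looks like, with the signature assumed
from the ModeList-based hint: the base class gets a no-op
default and only backends that care override it.

    #include "graphics/mode.h"

    class SystemSketch {
    public:
        virtual ~SystemSketch() {}
        // No-op by default: only backends that must size things in
        // advance need to override this.
        virtual void initSizeHint(const Graphics::ModeList &modes) {}
    };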
|
This flag is removed for a few reasons:
* Engines universally set this flag to true for widths > 320,
which made it redundant everywhere;
* This flag functioned primarily as a "force 1x scaler" flag,
since its behaviour was almost completely undocumented and users
would need to figure out that they'd need an explicit non-default
scaler set to get a scaler to operate at widths > 320;
* (Most importantly) engines should not be in the business of
deciding how the backend may choose to render its virtual screen.
The choice of rendering behaviour belongs to the user, and the
backend, in that order.
A follow-up commit restores the default 1x scaler behaviour in
the SDL backend for the moment, but in the future it is my
hope that there will be a better configuration UI to allow users
to specify how they want scaling to work for high resolutions.
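For illustration, assuming the flag in question is the old
boolean parameter on initGraphics (a sketch, not the exact
diff):

    #include "engines/util.h"  // declares initGraphics (assumed)

    void setupScreen() {
        // Before (sketch): engines passed a "default to 1x" flag,
        // effectively always (width > 320):
        //   initGraphics(640, 480, true);

        // After: the engine declares only its virtual screen size;
        // scaler choice is left to the user and the backend.
        initGraphics(640, 480);
    }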
|
This change allows:
* Engines to update their target rendering surface/size and pixel
format with the backend multiple times during gameplay;
* Users to resize the ScummVM window without having it reset
size/position every time an engine updates its target surface
format;
* Conversions/scaling to continue to run efficiently in hardware,
instead of requiring engines to pick their maximum possible
output format once and upscale inefficiently in software;
* The window to reset size once when an engine calls to set its
initial output size, and to reset again once ScummVM returns to
the launcher.
This is relevant for at least SCI32 and DreamWeb engines, which
perform graphics mode switches during games.
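A sketch of such a mid-game switch as an engine would perform
it through the usual OSystem transaction API (the specific size
and format here are illustrative):

    #include "common/system.h"
    #include "graphics/pixelformat.h"

    // E.g. SCI32 moving from a low-res room into a hi-res mode.
    // With this change the window no longer resets size/position
    // on every such switch.
    void switchTo640x480Paletted() {
        Graphics::PixelFormat format =
            Graphics::PixelFormat::createFormatCLUT8();
        g_system->beginGFXTransaction();
        g_system->initSize(640, 480, &format);
        g_system->endGFXTransaction();
    }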
|
Also remove a couple of TODOs. I think we can limit the CoreAudio
plugin to the Apple DLS softsynth, since we have the CoreMidi
plugin to access other MIDI devices.
|
Also fix the MemoryReadWriteStream managed buffer being leaked.
Fixes #9718.
|
This function is only defined when USE_RGB_COLOR is defined, so
these additional conditions are redundant.
|
The use of DoubleBufferSDLMixerManager in the OpenPandora backend was
removed in commit b157269 but the include for it was left behind.
|
The use of DoubleBufferSDLMixerManager in the GPH backend was removed
in commit 3b6398c but the include for it was left behind.
|
Since the macosx backend now does the same as the base SDL
backend, we can just let the base class do its stuff.
|
This mixer type was added in
943b4c2036002454b276e0190dfc2c8919fb0cbf because "anything which
produces sampled data with high latency (like the MT-32 emulator)
will sound terribly", but as far as I can see (or reproduce), this
mixer doesn't do anything that would solve that problem, except
that it effectively doubles the size of the audio buffer so there's
less chance of an underflow due to slower-than-realtime synthesis
by the softsynth. But you don't need the overhead of a separate
thread to do that, you just need to increase the buffer size.
|
If it turns out that this change breaks things the
double-buffering manager had previously fixed, they probably
could have been fixed just as well by increasing the audio
buffer size in SdlMixerManager. :\
|
The previous default buffer size of 4096 samples for 44kHz mixer
would add up to 93ms of audio latency, which is fine for early
adventure games, but this is significantly more latency than is
acceptable for games with full motion video. For these games,
the latency needs to be kept within roughly +15ms and -45ms of
video frame presentation to avoid lip sync problems. With this
change, the default audio buffer size is calculated to be 1024
samples at 44kHz (which happens to match what DOSBox uses).
There is a possibility that the reduced latency may cause issues
that did not previously exist with things like the MT-32 emulator,
where a larger buffer size allowed for a larger window where
high-complexity synthesis that could not be generated in realtime
could be balanced out by low-complexity synthesis that could be
generated faster than realtime. In this case, rather than
increasing the system mixer buffer size again, please move the
MT-32 emulator into its own thread and give it its own larger ring
buffer into which it can generate more sample data in advance,
independently from the rest of the audio system.
For other systems where this buffer size reduction might cause a
problem with audio drop-outs, a new audio_buffer_size
configuration option has been added to allow users to tweak the
audio buffer size to match their machine's ability to generate
audio samples.
Fixes Trac#10033. Also improves playback of samples in SCI that
were programmed to restart across several consecutive frames,
relying on lower audio latency in the original engine for this to
not sound bad, like the hopping sound at the start of chapter 1
of KQ7, and the sound of turning on the power in the digger train
in the Lighthouse volcano.
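To make the arithmetic concrete: latency is buffer samples
divided by sample rate, so 4096 / 44100 ≈ 93 ms and
1024 / 44100 ≈ 23 ms. A sketch of how the buffer size might be
chosen, assuming SDL; the heuristic and the handling of the
configuration value are illustrative, not the exact patch:

    #include <SDL.h>

    static Uint16 pickBufferSize(int freq, int configuredSize) {
        if (configuredSize > 0)
            return (Uint16)configuredSize; // audio_buffer_size override
        // Default: roughly 1024 samples at 44100 Hz, scaled to the
        // rate and rounded to a power of two, which audio drivers
        // tend to prefer.
        Uint16 samples = 32;
        while (samples * 2 <= freq / 43)   // 44100 / 43 ≈ 1025
            samples *= 2;
        return samples;
    }

    bool openMixer(int freq, int configuredSize,
                   SDL_AudioCallback callback) {
        SDL_AudioSpec desired, obtained;
        SDL_zero(desired);
        desired.freq = freq;
        desired.format = AUDIO_S16SYS;
        desired.channels = 2;
        desired.samples = pickBufferSize(freq, configuredSize);
        desired.callback = callback;
        return SDL_OpenAudio(&desired, &obtained) == 0;
    }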
|
Removing this GUI control was suggested as far back as 2011 at
<http://lists.scummvm.org/pipermail/scummvm-devel/2011-November/010416.html>.
There were no objections, but it was never removed. When working
on audio latency bugs, I independently rediscovered that the GUI
option was broken: the per-game options would *never* work, and the
option would not take effect until ScummVM was restarted because
there is no API for interacting with the backend audio mixer. So
now, it is finally gone.
Primarily for the sake of future troubleshooting, configurability
of the audio sample frequency within SdlMixerManager is maintained
for the moment, but now users will need to edit their ScummVM
configuration file manually to change it.
|
_hwscreen is always initialized to 16bpp, so the supported 32bpp
pixel formats would never be put into the list of supported
pixel formats, making it useless for engines to query the list
for usable 32bpp formats.
This patch changes things so that the native desktop pixel format
is at the top of the supported formats list, and all pixel formats
<= the default desktop pixel format will now show up in the list
of supported formats. ("Supported" is somewhat of a misnomer here
since there is no hardware querying beyond checking the default
desktop pixel format. SDL generally accepts a wide variety of pixel
formats and tries to convert them to whatever the hardware
supports.)
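A sketch of the new ordering logic, with the candidate
enumeration left abstract (names are illustrative):

    #include "common/list.h"
    #include "graphics/pixelformat.h"

    // Native desktop format first, then every candidate no deeper
    // than the desktop format.
    Common::List<Graphics::PixelFormat> getSupportedFormats(
            const Graphics::PixelFormat &desktopFormat,
            const Common::List<Graphics::PixelFormat> &candidates) {
        Common::List<Graphics::PixelFormat> formats;
        formats.push_back(desktopFormat);
        for (Common::List<Graphics::PixelFormat>::const_iterator it =
                 candidates.begin(); it != candidates.end(); ++it) {
            if (*it != desktopFormat &&
                it->bytesPerPixel <= desktopFormat.bytesPerPixel)
                formats.push_back(*it);
        }
        return formats;
    }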
|
When -Wundeclared-selector is enabled (recommended by Apple), the
calls to the setBadgeLabel selector in MacOSXTaskbarManager
generate warnings because the NSDockTile declarations are never
included: NSDockTile does not exist in macOS 10.4 and earlier.
While I don't know that we even support such old macOS versions
these days, it is simple enough to fix this problem when
compiling for modern macOS versions by conditionally including
the necessary header.
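The guard presumably looks something like this (exact version
check assumed):

    // NSDockTile appeared in macOS 10.5, so only pull in its
    // declarations when the SDK can actually provide them; this
    // silences -Wundeclared-selector for setBadgeLabel: on modern
    // SDKs.
    #if MAC_OS_X_VERSION_MAX_ALLOWED >= 1050 // MAC_OS_X_VERSION_10_5
    #include <AppKit/NSDockTile.h>
    #endif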
|
Translation strings come from external data sources and can cause
a stack buffer overflow here just by accidentally (or maliciously)
being too long.
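A sketch of the safe pattern, using a dynamically sized string
in place of a fixed stack buffer (names are illustrative):

    #include "common/str.h"

    void showTranslatedMessage(const char *translated) {
        // Unsafe (the kind of code being fixed):
        //   char buf[256];
        //   strcpy(buf, translated); // overflows if input > 255 chars
        Common::String msg(translated); // allocates to fit the input
        // ... hand msg to the GUI ...
    }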
|
I think these are the last ones that were already flagged as
being deliberate.
|
Previously we were clearing the whole backbuffer for 3 frames after a
window size change, and then only clearing the game area. This assumed
the OpenGL driver uses at most 3 render buffers and cycles through them
in sequential order, which does not seem to be the case on Linux when
using an Intel integrated GPU.
Instead we now clear the whole backbuffer on each frame to make sure
there are no leftovers remaining on the screen. All semi-recent GPUs
have hardware clear anyway, so this should not negatively impact
performance.
Possibly fixes #10025.
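A sketch of the per-frame clear, assuming an SDL/OpenGL render
loop (names are illustrative):

    #include <SDL_opengl.h>

    // Clear the whole backbuffer every frame rather than counting
    // frames since the last resize, which baked in assumptions about
    // how many buffers the driver cycles through.
    void drawFrame() {
        glClear(GL_COLOR_BUFFER_BIT); // hardware clear; effectively free
        // ... draw the game screen, overlay and cursor ...
        // SDL_GL_SwapWindow(window);
    }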
|