Up until now we relied on implicit loading of middlewares and reducers,
through imports in each feature's index.js.
This leads to many complex import cycles, which result in (sometimes)
hard-to-fix bugs and (often) break mobile, because a web-only feature gets
imported on mobile too, thanks to the implicit loading.
This PR changes that to make the process explicit. Both middlewares and
reducers are imported in a single place, the app entrypoint. They have been
divided into 3 categories: any (shared by all platforms), web and native.
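A minimal sketch of what the explicit loading looks like at the entrypoint
(the file names here are illustrative, not necessarily the actual ones):

    // Sketch: the entrypoint imports the shared set plus the set for the
    // platform being built, so web-only features never reach mobile.
    import './middlewares.any';
    import './middlewares.web'; // './middlewares.native' on mobile
    import './reducers.any';
    import './reducers.web';    // './reducers.native' on mobile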
Ideally no feature should have an index.js exporting actions, action types
and components, but that's a larger ordeal, so this is just the first step in
getting there. In order to both set an example and avoid large cycles, the
app feature has been refactored to not have an index.js itself.
> The playsinline attribute needs to be set to true to stop local video from
> playing in full screen mode in Safari on iOS.
> This applies to the local video thumbnails and the camera previews from the
> device selection menu and the video preview button.
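For reference, a sketch of the attribute on a React video element (the ref
name is illustrative):

    // Sketch: playsInline keeps iOS Safari from forcing fullscreen playback.
    <video
        autoPlay = { true }
        playsInline = { true }
        ref = { this._setVideoElement } />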
- Add the ability to share video as a "PiP" when screenshare is in progress.
- Add a method for creating a local presenter track.
- Make sure isLocalVideoTrackMuted returns the correct mute state when only
  screenshare is present.
- Make sure we get the updated size of the window being shared before
  painting it on the canvas.
- Make sure we check whether the shared window has been resized (see the
  sketch below).
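A minimal sketch of that resize check (function and variable names are
illustrative, not the actual implementation):

    // Sketch: re-read the desktop track's settings before each paint so
    // live resizes of the shared window are reflected on the canvas.
    function paintFrame(canvas, videoElement, desktopTrack) {
        const { height, width } = desktopTrack.getSettings();

        if (canvas.width !== width || canvas.height !== height) {
            // Resizing the canvas also clears its contents.
            canvas.width = width;
            canvas.height = height;
        }
        canvas.getContext('2d').drawImage(videoElement, 0, 0, width, height);
    }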
This adds an option to disable video autoplay that will be used mostly with
malleus (our selenium-based load testing tool for testing the new bridge).
Disabling video rendering lowers the resource utilisation of the selenium
nodes.
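A sketch of what enabling it might look like in config.js (the option name
here is an assumption, used only for illustration):

    // Sketch: the option name noAutoPlayVideo is illustrative.
    testing: {
        // Skip autoplaying remote video to lower resource usage on the
        // selenium nodes.
        noAutoPlayVideo: true
    }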
React Native defines neither __filename nor __dirname, so do it artisanally.
In addition, this helps with centralizing the configuration passed to
loggers.
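A minimal sketch of the hand-rolled stand-ins (the values are illustrative):

    // Sketch: React Native doesn't provide these Node globals, so define
    // stand-ins before any logger configuration reads them.
    global.__filename = '<unknown>';
    global.__dirname = '<unknown>';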
This refactors all handling of audio-only and last N into 2 features, in
preparation for "low bandwidth mode".
The main motivation for doing this is that lastN is a "global" setting, so it
helps to have all processing for it in a single place.
Using anything non-serializable for action types is discouraged:
https://redux.js.org/faq/actions#actions
In fact, this is the Flow definition for dispatching actions:

    declare export type DispatchAPI<A> = (action: A) => A;
    declare export type Dispatch<A: { type: $Subtype<string> }> = DispatchAPI<A>;

Note how the `type` field is defined as a subtype of string, which Symbol
isn't.
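For illustration (the action type name is made up):

    // A plain string constant: serializable and a subtype of string.
    export const SET_AUDIO_ONLY = 'SET_AUDIO_ONLY';

    // A Symbol: neither serializable nor a subtype of string, so it fails
    // the Dispatch definition above. Avoid.
    // export const SET_AUDIO_ONLY = Symbol('SET_AUDIO_ONLY');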
It doesn't seem like videoTrack needs to be stored in state if it can be
accessed directly from props. Removing the state also removes the need for
the deprecated componentWillReceiveProps.
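A sketch of the resulting shape (component and prop names are illustrative):

    import React, { Component } from 'react';

    // Sketch: read the track straight from props instead of mirroring it
    // into state; with no copy to keep in sync, the deprecated
    // componentWillReceiveProps is no longer needed.
    class VideoTrackView extends Component {
        render() {
            return <Video videoTrack = { this.props.videoTrack } />;
        }
    }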
For the most part the changes consist of taking the "static propTypes"
declaration off of components and declaring the props as Flow types (a
conversion sketch follows the list below). Sometimes, to support Flow, some
method signatures had to be added. There are some exceptions in which more
had to be done to tame the beast:
- AbstractVideoTrack: put in additional truthy checks for videoTrack.
- Video: add truthy checks for the _videoElement ref.
- shouldRenderVideoTrack function: Some components could pass null for the
  videoTrack argument and Flow wanted that called out explicitly.
- DisplayName: Add a truthy check for the input ref before acting on it.
- NumbersList: Move array checks inline so Flow can tell that array methods
  may be called. Add type checks in the Object.entries loop, as the value is
  assumed to be a mixed type by Flow.
- AbstractToolbarButton: add an additional truthy check for the passed-in
  type.
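A sketch of the conversion pattern (component and prop names are
illustrative):

    import { Component } from 'react';

    // Before: runtime validation via propTypes.
    //     static propTypes = {
    //         videoTrack: PropTypes.object
    //     };

    // After: a static Flow type, checked at build time.
    type Props = {

        /**
         * The video track to render, if any.
         */
        videoTrack: ?Object
    };

    class Video extends Component<Props> {
        // ...
    }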
It's too sensitive and most of the time I cannot perform an onPress. In
contrast, the builtin/default/standard onPress is noticeably more forgiving.
While we fix the sensitivity of "pinch to zoom", don't use its onPress unless
absolutely necessary, i.e. use it only for desktop streams.
On Android the files will be copied to the assets/sounds directory of the
SDK bundle at build time. To play them, the "asset:/" prefix has to be used
to locate the files correctly.
On iOS each sound file must be added to the SDK's Xcode project in order to
be bundled correctly. For playback we need to know the path of the SDK
bundle, which is now exposed by the AppInfo iOS module.
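A sketch of resolving the per-platform sound path (the sdkBundlePath property
name and the file name are assumptions, used only for illustration):

    import { NativeModules, Platform } from 'react-native';

    // Sketch: sdkBundlePath and the file name are illustrative.
    const soundPath
        = Platform.OS === 'android'
            ? 'asset:/sounds/joined.mp3'
            : `${NativeModules.AppInfo.sdkBundlePath}/joined.mp3`;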
TouchableWithoutFeedback and TouchableHighlight interfere with the
implementation of 'pinch to zoom' to come. We prepare for it by driving the
onClick/onPress handler(s) out of Conference, through LargeVideo and
ParticipantView, into Video itself, where the bulk of 'pinch to zoom' will be
implemented.
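A sketch of how the handler now travels (names and signatures are
illustrative):

    // Sketch: the press handler is passed down via props instead of being
    // attached through a Touchable* wrapper higher up.
    // Conference renders:
    //     <LargeVideo onClick = { this._onClick } />
    // LargeVideo and ParticipantView forward it along:
    //     <Video onPress = { this.props.onClick } />
    // Video attaches it itself, next to where 'pinch to zoom' will live.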