React Native defines neither __filename nor __dirname, so define them
artisanally. In addition, this helps with centralizing the configuration passed
to loggers.
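A minimal sketch of the idea, assuming a small shared module; the module path
and the DEFAULT_ID constant are hypothetical, only getLogger comes from
jitsi-meet-logger:

    // react/features/base/logging/functions.js (hypothetical location)
    import { getLogger as _getLogger } from 'jitsi-meet-logger';

    // React Native defines neither __filename nor __dirname, so features pass
    // an explicit, hand-written id instead (default is hypothetical).
    const DEFAULT_ID = 'jitsi-meet';

    /**
     * Returns a logger for the given id, keeping whatever configuration is
     * passed to loggers in this one central place.
     */
    export function getLogger(id = DEFAULT_ID) {
        return _getLogger(id);
    }

    // Usage in a feature that previously relied on __filename:
    // const logger = getLogger('features/base/tracks');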
For the most part, the changes consist of taking the "static propTypes"
declaration off of components and declaring the props as Flow types (see the
sketch after the list below). In some cases method signatures had to be added
to satisfy Flow. There are a few exceptions in which more had to be done to
tame the beast:
- AbstractVideoTrack: put in additional truthy checks for videoTrack.
- Video: add truthy checks for the _videoElement ref.
- shouldRenderVideoTrack function: Some components may pass null for the
videoTrack argument, and Flow wanted that called out explicitly.
- DisplayName: Add a truthy check for the input ref before acting on it.
- NumbersList: Move array checks inline so Flow understands that array methods
can be called. Add type checks in the Object.entries loop, as Flow assumes the
value is of the mixed type.
- AbstractToolbarButton: add an additional truthy check for the passed-in type.
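A minimal sketch of the typical conversion described above; the prop set is
illustrative rather than copied from the diff:

    // @flow

    import { Component } from 'react';

    /**
     * The type of the React Component props, replacing the former
     * "static propTypes" declaration.
     */
    type Props = {

        /**
         * The video track to render, if any.
         */
        videoTrack: ?Object
    };

    export default class AbstractVideoTrack extends Component<Props> {
        /**
         * Method signature added (and implemented by subclasses) to satisfy
         * Flow.
         */
        _renderVideo: (Object) => React$Node;

        render() {
            const { videoTrack } = this.props;

            // One of the additional truthy checks Flow asks for once the prop
            // is declared nullable.
            return videoTrack ? this._renderVideo(videoTrack) : null;
        }
    }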
The "pinch to zoom" onPress is too sensitive and most of the time I cannot
perform an onPress. In contrast, the builtin/default/standard onPress is
noticeably more forgiving. While we fix the sensitivity of "pinch to zoom",
don't use its onPress unless absolutely necessary, i.e. use it only for desktop
streams (a sketch of that policy follows).
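A minimal sketch of that policy; the videoType check and the helper name are
assumptions, not lifted from the change:

    /**
     * Picks the onPress to hand to Video: the "pinch to zoom" one only for
     * desktop streams, undefined otherwise so the standard Touchable handling
     * stays in effect.
     */
    function choosePressHandler(videoTrack, pinchToZoomOnPress) {
        return videoTrack && videoTrack.videoType === 'desktop'
            ? pinchToZoomOnPress
            : undefined;
    }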
On Android the files will be copied to the assets/sounds directory of
the SDK bundle at build time. To play them, the "asset:/" prefix has to be
used to locate the files correctly.
On iOS each sound file must be added to the SDK's Xcode project in order
to be bundled correctly. For playback we need to know the path of the SDK
bundle, which is now exposed by the AppInfo iOS module.
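A minimal sketch of resolving a sound file's location per platform; the
sdkBundlePath property name on AppInfo is an assumption:

    import { NativeModules, Platform } from 'react-native';

    /**
     * Returns the platform-specific source for a bundled sound file.
     */
    export function getSoundSrc(fileName) {
        if (Platform.OS === 'android') {
            // Copied into assets/sounds of the SDK bundle at build time; the
            // asset:/ prefix locates the file at run time.
            return `asset:/sounds/${fileName}`;
        }

        // Bundled with the SDK on iOS; the bundle path comes from the AppInfo
        // native module (property name assumed).
        return `${NativeModules.AppInfo.sdkBundlePath}/${fileName}`;
    }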
TouchableWithoutFeedback and TouchableHighlight interfere with the
implementation of 'pinch to zoom' to come. We prepare for it by driving
the onClick/onPress handler(s) out of Conference, through LargeVideo and
ParticipantView into Video itself where the bulk of 'pinch to zoom' will
be implemented.
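A minimal sketch of the resulting prop threading; the components are reduced to
bare forwarding and the import path is indicative only:

    import React from 'react';

    import { Video } from '../../base/media';

    // ParticipantView and LargeVideo merely forward the handler...
    const ParticipantView = ({ onPress }) => <Video onPress = { onPress } />;
    const LargeVideo = ({ onPress }) => <ParticipantView onPress = { onPress } />;

    // ...which Conference used to attach to a Touchable wrapper around the
    // large video. Video receives it directly and will also host the bulk of
    // "pinch to zoom" later.
    const Conference = ({ onClick }) => <LargeVideo onPress = { onClick } />;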
In preparation for "pinch to zoom" support in desktop streams on mobile, make
certain Views not intervene in touch event handling. While the modification is
necessary for "pinch to zoom", which is coming later, it makes sense on its
own: the modified Views only aid layout and/or animations and should behave, as
far as the user is concerned, as if they're not there.
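A minimal sketch, assuming the standard React Native mechanism for this (a
View's pointerEvents prop); the component name is made up:

    import React from 'react';
    import { View } from 'react-native';

    /**
     * A purely presentational wrapper: with pointerEvents set to 'box-none'
     * the View itself never becomes a touch target, so to the user it is not
     * there, while its children still receive touches.
     */
    const LayoutOnlyView = ({ children, style }) => (
        <View
            pointerEvents = 'box-none'
            style = { style }>
            { children }
        </View>
    );

    export default LayoutOnlyView;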
Adds the base/sounds feature, which allows other features to register a sound
source under a specified id. A new SoundsCollection component then renders a
corresponding HTMLAudioElement for each such sound. Once the "setRef" callback
is called by the HTMLAudioElement, the element is added to the Redux store.
From that point on, the sound can be played through the new 'playSound' action,
which calls the play() method on the stored HTMLAudioElement instance.
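A minimal sketch of that flow; apart from playSound, SoundsCollection and
setRef, which are named above, everything below (action names, state shape, the
thunk form) is an assumption:

    // 1. A feature registers a sound source under an id, e.g.
    //    dispatch(registerSound('INCOMING_MESSAGE_SOUND', incomingMessageSrc));
    //
    // 2. SoundsCollection renders an Audio element for every registered sound;
    //    its setRef callback stores the rendered HTMLAudioElement in the Redux
    //    store next to the sound's id.
    //
    // 3. playSound looks the element up and calls play() on it.
    export function playSound(soundId) {
        return (dispatch, getState) => {
            const sound = getState()['features/base/sounds'].get(soundId);

            sound && sound.audioElement && sound.audioElement.play();
        };
    }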
Android uses a SurfaceView to render video, which is not quite a View, so the
fade-in animation (which varies the opacity) doesn't work.
Instead, add an opaque black view covering the video, which transitions to
transparent. This creates much smoother transitions on Android, while behaving
the same as before from the user's point of view.
In addition, I removed the flip animation for local tracks, which is no longer
used, since the camera is switched without changing tracks.
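A minimal sketch of the cover approach; the component and style handling are
illustrative:

    import React, { Component } from 'react';
    import { Animated, StyleSheet, View } from 'react-native';

    /**
     * Lays an opaque black Animated.View over the SurfaceView-backed video and
     * fades it to transparent, since animating the video's own opacity does
     * not work on Android.
     */
    export default class FadeInVideoCover extends Component {
        _opacity = new Animated.Value(1);

        componentDidMount() {
            Animated.timing(this._opacity, {
                duration: 300,
                toValue: 0
            }).start();
        }

        render() {
            return (
                <View style = { this.props.style }>
                    { this.props.children /* the video */ }
                    <Animated.View
                        pointerEvents = 'none'
                        style = { [
                            StyleSheet.absoluteFill,
                            {
                                backgroundColor: 'black',
                                opacity: this._opacity
                            }
                        ] } />
                </View>
            );
        }
    }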
Introduce certain React Components which may be used to write cross-platform
source code: Audio (like Web's audio), Container (like Web's div), Text (like
Web's p), etc.
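A minimal sketch of the pattern for one of them; the file layout is indicative
only:

    // Container.native.js: the native counterpart of Web's div.
    import React, { Component } from 'react';
    import { View } from 'react-native';

    export default class Container extends Component {
        render() {
            return <View { ...this.props } />;
        }
    }

    // Container.web.js would render a plain div instead:
    // export default props => <div { ...props } />;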
As an intermediate step on the path to merging jitsi-meet and
jitsi-meet-react, import the whole source code of jitsi-meet-react as it
stands at
2f23d98424
i.e. the latest master at the time of this import. No modifications are
applied to the imported source code in order to preserve a complete
snapshot of it in the repository of jitsi-meet and, thus, facilitate
comparison later on. Consequently, the source code of jitsi-meet and/or
jitsi-meet-react may not work. For example, jitsi-meet's jshint may be
unable to parse jitsi-meet-react's source code.