In 49e3b03885 we turned on SW encoders / decoders
on account of some devices having broken HW *encoders* and our desire to use
simulcast.
Well, the astute reader may have noticed that only *encoding* was mentioned.
Indeed, we should be able to keep using the HW decoder just fine.
This shouldn't be needed, as ConnectionService should take care of it, but we
suspect some devices don't do it, since we got reports of people not hearing
other users and the problem went away when CS was disabled.
Fall back to the non-ConnectionService case for any error. Also, handle errors
when registering the phone account; Pixel C devices throw UnsupportedException.
Some Samsung devices will fail to fully engage ConnectionService if no SIM card
was ever inserted in the device. We could check for that, but it would require
the CALL_PHONE permission, which is not something we want to request, so fall
back to not using ConnectionService.
Some devices seem to have a bug in their Android versions and startCall fails
with SecurityError because the CALL_PHONE permission is not granted. This is
not a requirement for self-managed connection services as per the official
documentation though:
https://developer.android.com/guide/topics/connectivity/telecom/selfManaged
Alas, ConnectionService takes over audio device management too, so let's
handle the error and disable CS if we get SecurityError.
Samsung devices (of course) seem to stick with the earpiece if we first select
Bluetooth but then set speaker to false. Reverse the order to make everyone
happy.
This only applies to the generic and legacy handlers.
When ConnectionService is used (the default) we were attaching the handlers too
early, and since attaching them requires that the RNConnectionService module is
loaded, it silently failed. Instead, use the initialize() method, which gets
called after all the Catalyst (aka native) modules have been loaded.
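A minimal sketch of the pattern, with a hypothetical setAudioDeviceHandler()
helper standing in for the actual handler-attachment code:

~~~
import com.facebook.react.bridge.ReactApplicationContext;
import com.facebook.react.bridge.ReactContextBaseJavaModule;

class AudioModeModule extends ReactContextBaseJavaModule {
    AudioModeModule(ReactApplicationContext reactContext) {
        super(reactContext);
    }

    @Override
    public String getName() {
        return "AudioMode";
    }

    @Override
    public void initialize() {
        // Called once all native (Catalyst) modules have been created, so
        // RNConnectionService is guaranteed to be loadable at this point.
        setAudioDeviceHandler();
    }

    private void setAudioDeviceHandler() {
        // Hypothetical: pick and attach the appropriate handler here.
    }
}
~~~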
Separate each implementation (3 as of this writing) into its own "handler"
class.
This should make the code easier to understand, maintain and extend.
We are downloading code off the Internet and executing it on the user's device,
so run it sandboxed to avoid potential bad actors.
Since it's impossible to eval() safely in JS and React Native doesn't offer
something akin to Node's vm module, here we are rolling our own.
On Android it uses the Duktape JavaScript engine and on iOS the builtin
JavaScriptCore engine. The extra JS engine is *only* used for evaluating the
downloaded code and returning a JSON string which is then passed back to RN.
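On the Android side the core of that could look roughly like this; a sketch
assuming the Square Duktape-Android bindings and an illustrative method name:

~~~
import com.squareup.duktape.Duktape;

final class SandboxedEvaluator {
    // Runs untrusted, downloaded JavaScript in a throwaway Duktape VM and
    // returns the resulting JSON string, which is all that crosses back
    // into React Native.
    static String evaluateDownloadedCode(String script) {
        Duktape duktape = Duktape.create();
        try {
            Object result = duktape.evaluate(script);
            return result != null ? result.toString() : null;
        } finally {
            duktape.close();
        }
    }
}
~~~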
If the Activity is put into the background before the ReactContext is created we
get an NPE here. While the window might be short, it's technically possible to
hit this, as our Crashlytics reports show.
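The guard itself is trivial; a sketch, assuming a hypothetical
getReactInstanceManager() accessor and that the lifecycle call in question is
onHostPause():

~~~
ReactInstanceManager reactInstanceManager = getReactInstanceManager();
ReactContext reactContext
    = reactInstanceManager == null
        ? null : reactInstanceManager.getCurrentReactContext();

// The Activity may have been backgrounded before the ReactContext was
// created; in that case there is nothing to notify, so just bail out.
if (reactContext != null) {
    reactContext.onHostPause();
}
~~~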
In
b53a034aaf (diff-0339cf92cc68bc5981fe6df601316c1c)
I removed this because RN had updated the builtin JSC version. On the next
release, however, RN introduced a new JS engine (Hermes), so JSC is now a
regular RN dependency. Thus, add the magic spells to publish the AARs to Maven.
These provide the ability to integrate the SDK with some other application
loggers.
At the time this was written we use Timber on Android and CocoaLumberjack on iOS.
In addition to the integration capabilities, a LogBridge React Native module
provides log transports for JavaScript code, thus centralizing all logs on the
native loggers.
Will emit a new 'network.info' action with the online/offline status and, on
native, extra details such as the network type and the
'isConnectionExpensive' flag.
This commit refactors device selection (more heavily on iOS) to make it
consistent across platforms.
Due to its complexity I couldn't break out each step into separate commits,
apologies to the reviewer.
Changes made to device handling:
- speaker is always the default, regardless of the mode
- "Phone" shows as a selectable option, even in video call mode
- "Phone" is not displayed when wired headphones are present
- Shared device picker between iOS and Android
- Runtime device updates while the picker is open
Set our own audio device manager so we can tweak it if need be (enabling /
disabling the HW AEC on specific devices).
Switch to using the software video encoder / decoder. This may feel like a
downgrade, but it has advantages:
- simulcast is now working (on par with iOS)
- certain devices have broken VP8 HW encoders (I'm looking at you Samsung Galaxy
S7) so this fixes that
After calling startService we are supposed to have a bit of time before turning
the service into a foreground service, but certain devices seem to be more
spartan and we've seen the following failure:
Caused by java.lang.IllegalStateException: Not allowed to start service Intent { act=JitsiMeetOngoingConferenceService:START cmp=org.jitsi.meet/.sdk.JitsiMeetOngoingConferenceService }: app is in background uid UidRecord{f6778d5 u0a220 CAC bg:+1m1s417ms idle change:idle procs:1 proclist:15604, seq(0,0,0)}
at android.app.ContextImpl.startServiceCommon + 1600(ContextImpl.java:1600)
at android.app.ContextImpl.startService + 1546(ContextImpl.java:1546)
at android.content.ContextWrapper.startService + 669(ContextWrapper.java:669)
at org.jitsi.meet.sdk.JitsiMeetOngoingConferenceService.launch + 50(JitsiMeetOngoingConferenceService.java:50)
Be explicit and call startForegroundService on supported platforms.
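Concretely, that amounts to something along these lines (sketch; the intent
construction is illustrative, the action string is the one from the trace
above):

~~~
import android.content.Context;
import android.content.Intent;
import android.os.Build;

import org.jitsi.meet.sdk.JitsiMeetOngoingConferenceService;

final class ServiceStarter {
    static void launch(Context context) {
        Intent intent = new Intent(context, JitsiMeetOngoingConferenceService.class);
        intent.setAction("JitsiMeetOngoingConferenceService:START");

        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
            // On Oreo and later declare up front that this will become a
            // foreground service, instead of relying on the startService()
            // grace period, which some devices cut short.
            context.startForegroundService(intent);
        } else {
            context.startService(intent);
        }
    }
}
~~~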
The app is about to crash at that stage, so it was a moot point to try to leave
the conference anyway.
Stopping ConnectionService is still a good idea though, since a crash may leave
the device in a bad state otherwise.
Fixes this issue:
~~~
java.util.ConcurrentModificationException
at java.util.HashMap$HashIterator.nextNode(HashMap.java:1441)
at java.util.HashMap$KeyIterator.next(HashMap.java:1465)
at org.jitsi.meet.sdk.OngoingConferenceTracker.updateListeners(OngoingConferenceTracker.java:89)
at org.jitsi.meet.sdk.OngoingConferenceTracker.onExternalAPIEvent(OngoingConferenceTracker.java:74)
at org.jitsi.meet.sdk.ExternalAPIModule.sendEvent(ExternalAPIModule.java:71)
at java.lang.reflect.Method.invoke(Native Method)
at com.facebook.react.bridge.JavaMethodWrapper.invoke(JavaMethodWrapper.java:372)
at com.facebook.react.bridge.JavaModuleWrapper.invoke(JavaModuleWrapper.java:158)
at com.facebook.react.bridge.queue.NativeRunnable.run(Native Method)
at android.os.Handler.handleCallback(Handler.java:873)
at android.os.Handler.dispatchMessage(Handler.java:99)
at com.facebook.react.bridge.queue.MessageQueueThreadHandler.dispatchMessage(MessageQueueThreadHandler.java:29)
at android.os.Looper.loop(Looper.java:214)
at com.facebook.react.bridge.queue.MessageQueueThreadImpl$4.run(MessageQueueThreadImpl.java:232)
at java.lang.Thread.run(Thread.java:764)
~~~
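The usual fix applies; a sketch, assuming the listeners live in a plain
collection that updateListeners() iterates while callbacks may add or remove
entries (interface and method names are illustrative):

~~~
import java.util.Set;
import java.util.concurrent.CopyOnWriteArraySet;

final class OngoingConferenceTracker {
    interface OngoingConferenceListener {
        void onCurrentConferenceChanged(String conferenceUrl);
    }

    // CopyOnWriteArraySet iterates over a snapshot, so listeners can be
    // added or removed from within a callback without triggering
    // ConcurrentModificationException.
    private final Set<OngoingConferenceListener> listeners = new CopyOnWriteArraySet<>();

    private void updateListeners(String conferenceUrl) {
        for (OngoingConferenceListener listener : listeners) {
            listener.onCurrentConferenceChanged(conferenceUrl);
        }
    }
}
~~~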
This helper method gets the current Activity attached to React Native (via the
ReactContext). This is useful for modules which need access to it, without being
actual React Native modules.
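A sketch of what such a helper might boil down to, given a ReactContext:

~~~
import android.app.Activity;
import com.facebook.react.bridge.ReactContext;

final class ActivityUtils {
    // Returns the Activity currently attached to React Native, or null if
    // there is none (e.g. the app is in the background).
    static Activity getCurrentActivity(ReactContext reactContext) {
        return reactContext != null ? reactContext.getCurrentActivity() : null;
    }
}
~~~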
Its main task is to clean up conferences (especially the ConnectionService
stuff) to make sure the system is left in a working state even when the
unexpected happens.
Entering PiP mode while the permissions dialog is displayed will not only
fail, but also mess up the Activity lifecycle on some OS versions.
We may end up with two activity/fragment instances and a situation where
the onStop callback has not been called yet on instance #1 while
onResume has already been called on instance #2.
On some Samsung devices calls placed with ConnectionService end up
in the native call history, which we don't want. That's fixable by
marking the Connection as "external" just before the call is
disconnected.
Another issue, specific to Samsung devices, is that audio focus is not
always released when the call ends. That's fixable by marking the call
as on hold just before disconnecting it.
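In telecom API terms the two workarounds might look roughly like this (sketch;
endCall() is an illustrative helper operating on the active Connection):

~~~
import android.telecom.Connection;
import android.telecom.DisconnectCause;

final class ConnectionCleanup {
    static void endCall(Connection connection) {
        // Mark the call as external so Samsung devices don't log it in the
        // native call history...
        connection.setConnectionProperties(
            connection.getConnectionProperties()
                | Connection.PROPERTY_IS_EXTERNAL_CALL);

        // ...and put it on hold so audio focus is actually released when
        // the call ends.
        connection.setOnHold();

        connection.setDisconnected(new DisconnectCause(DisconnectCause.LOCAL));
        connection.destroy();
    }
}
~~~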
They greatly simplify starting a JitsiMeetActivity by encapsulating the creation
of the Intent and the placement of extras.
In order to make this possible JitsiMeetConferenceOptions now implements
Parcelable so it can be serialized and passed around when creating an Intent.
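With the options being Parcelable, such a helper boils down to something like
this (sketch; the extra key is illustrative):

~~~
import android.content.Context;
import android.content.Intent;

import org.jitsi.meet.sdk.JitsiMeetActivity;
import org.jitsi.meet.sdk.JitsiMeetConferenceOptions;

final class JitsiMeetActivityLauncher {
    static void launch(Context context, JitsiMeetConferenceOptions options) {
        Intent intent = new Intent(context, JitsiMeetActivity.class);
        // JitsiMeetConferenceOptions implements Parcelable, so it can travel
        // inside the Intent as a regular extra.
        intent.putExtra("JitsiMeetConferenceOptions", options);
        context.startActivity(intent);
    }
}
~~~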
Now that we have both a Fragment and an Activity there are lifecycle methods
that overlap. If a Fragment requests permission by calling requestPermissions
then the result handler will be called on itself. React Native's permissions
module, however, calls ActivityCompat.requestPermissions on the Activity, thus
we need to handle the results at the Activity level and not at the Fragment
level.
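In practice that means overriding the callback on the Activity and handing the
result over to React Native's permissions machinery; a sketch, with the
listener plumbing kept illustrative:

~~~
import androidx.appcompat.app.AppCompatActivity;
import com.facebook.react.modules.core.PermissionListener;

public class JitsiMeetActivity extends AppCompatActivity {
    // Hypothetical field holding the listener handed over by the React
    // Native permissions module when it requested the permissions.
    private PermissionListener permissionListener;

    @Override
    public void onRequestPermissionsResult(
            int requestCode, String[] permissions, int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);

        // React Native calls ActivityCompat.requestPermissions() on the
        // Activity, so the result arrives here and not in any Fragment.
        if (permissionListener != null
                && permissionListener.onRequestPermissionsResult(
                        requestCode, permissions, grantResults)) {
            permissionListener = null;
        }
    }
}
~~~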
It's a number which must be ever increasing with each build submitted to the
store.
Automate its value by using the number of seconds since 1st of January 2019.
That should be enough for ~680 years.
Turns out that on Samsung phones the calls placed with
the ConnectionService appear in the call log as weird long numbers.
The system mangles the address we give it ("sip:meet.jit.si/something")
into this weird long number and the call to request.getAddress() returns
that. Turn off the presentation, as neither this number nor our address
makes sense. This way the call appears as coming from an "Unknown" caller
in the call history, which is still not perfect, but better than the random
number.
Note that other phones will preserve the originally passed address value
(tested on a OnePlus 5).
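The workaround is a single call when the Connection is set up (sketch;
connection and request being the android.telecom.Connection and
ConnectionRequest at hand):

~~~
// Neither the mangled number nor our "sip:..." address makes sense in the
// call log, so don't present any address at all; the call then shows up as
// coming from an "Unknown" caller on the affected Samsung devices.
connection.setAddress(request.getAddress(), TelecomManager.PRESENTATION_UNKNOWN);
~~~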
Turns out the microphone will not work on some devices when starting in
"audio only", because the audio mode is not set to MODE_IN_COMMUNICATION,
but to MODE_IN_CALL. Calling setAudioModeIsVoip(true) makes
the system adjust to MODE_IN_COMMUNICATION and the mic works fine.
When running the app from Android Studio the React packager is not automatically
started. In vanilla RN projects this is done by the "react-native run-android"
command, but oftentimes it is desirable to run from Android Studio.
This fixes that by starting the packager from Gradle.
It's enabled by default, but marked as experimental (uh?!). It creates trouble,
as sometimes the packager goes bananas. Disable them until further notice; our
bundle is not that large anyway.
Consolidate all failure cases into a single one: CONFERENCE_TERMINATED. If the
conference ended gracefully no error indicator will be present, otherwise there
will be.
In practice, we are never going to be in a position where we don't have a
ReactContext but we do have some React Native code running. So let's not expect
the impossible.
It was never used, and typically the Activity / Fragment holding the
JitsiMeetView object will be the listener.
In addition, once we refactor the events they will be reduced to far fewer.
* feat(Android): implement ConnectionService
Adds basic integration with Android's ConnectionService by implementing
the outgoing call scenario (a rough sketch of the moving pieces follows this
list).
* ref(callkit): rename _SET_CALLKIT_SUBSCRIPTIONS
* ref(callkit): move feature to call-integration directory
* feat(ConnectionService): synchronize video state
* ref(AudioMode): use ConnectionService on API >= 26
Not ready yet - a few details left, mentioned in the FIXMEs
* feat(ConnectionService): add debug logs
Adds logs to trace the calls.
* fix(ConnectionService): leaking ConnectionImpl instances
Turns out there is no callback fired back from the JavaScript side after
the disconnect or abort event is sent from the native. The connection
must be marked as disconnected and removed immediately.
* feat(ConnectionService): handle onCreateOutgoingConnectionFailed
* ref(ConnectionService): merge classes and move to the sdk package
* feat(CallIntegration): show Alert if outgoing call fails
* fix(ConnectionService): alternatively get call UUID from the account
Some Android flavours (or versions?) don't copy over the extras to
the onCreateOutgoingConnectionFailed callback. But the call UUID is also
set as the PhoneAccount's label, so eventually it should be available
there.
* ref(ConnectionService): use call UUID as PhoneAccount ID.
The extra is not reliable on some custom Android flavours. It also makes
sense to use a unique id for the account instead of the URL, given that
it's created on a per-call basis.
* fix(ConnectionService): abort the call when hold is requested
Turns out Android P can sometimes request HOLD even though there's no
HOLD capability added to the connection (what!?), so just abort the call
in that case.
* fix(ConnectionService): unregister account on call failure
Unregister the PhoneAccount in onCreateOutgoingConnectionFailed. At that
point the ConnectionImpl instance, which is normally responsible for doing
that, has not been created yet.
* fix(AudioModeModule): make package private and run on the audio thread
* address other review comments
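For orientation, the moving pieces referenced above could be sketched like
this; a rough, hedged outline rather than the actual SDK code, with names kept
illustrative:

~~~
import android.content.ComponentName;
import android.content.Context;
import android.net.Uri;
import android.telecom.Connection;
import android.telecom.ConnectionRequest;
import android.telecom.ConnectionService;
import android.telecom.DisconnectCause;
import android.telecom.PhoneAccount;
import android.telecom.PhoneAccountHandle;
import android.telecom.TelecomManager;

public class ConnectionServiceSketch extends ConnectionService {

    static PhoneAccountHandle registerPhoneAccount(
            Context context, Uri address, String callUUID) {
        // The call UUID doubles as the PhoneAccount id (and label), since the
        // extras are not reliably propagated on some Android flavours.
        PhoneAccountHandle handle = new PhoneAccountHandle(
            new ComponentName(context, ConnectionServiceSketch.class), callUUID);
        PhoneAccount account = PhoneAccount.builder(handle, callUUID)
            .setAddress(address)
            .setCapabilities(PhoneAccount.CAPABILITY_SELF_MANAGED)
            .build();
        context.getSystemService(TelecomManager.class).registerPhoneAccount(account);
        return handle;
    }

    @Override
    public Connection onCreateOutgoingConnection(
            PhoneAccountHandle accountHandle, ConnectionRequest request) {
        Connection connection = new Connection() {
            @Override
            public void onDisconnect() {
                // No callback comes back from the JavaScript side, so mark
                // the connection disconnected and clean it up immediately.
                setDisconnected(new DisconnectCause(DisconnectCause.LOCAL));
                destroy();
            }
        };
        connection.setAddress(request.getAddress(), TelecomManager.PRESENTATION_ALLOWED);
        connection.setAudioModeIsVoip(true);
        return connection;
    }

    @Override
    public void onCreateOutgoingConnectionFailed(
            PhoneAccountHandle accountHandle, ConnectionRequest request) {
        // The ConnectionImpl instance never gets created in this path, so
        // the PhoneAccount has to be unregistered here.
        getSystemService(TelecomManager.class).unregisterPhoneAccount(accountHandle);
    }
}
~~~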
Provide a builtin default implementation which finishes the Activity, same as
before.
What this PR removes is the ability to provide a custom default handler because
applications can already take this decision when calling `onBackPressed`. In
addition, make `onBackPressed` return `void` because it's virtually impossible
for it to return `false` (that would mean that there is no
`ReactInstanceManager`, which means there is no app to begin with).
In addition, remove the use of `BackAndroid` since `BackHandler` contains an iOS
shim now.
This is done at the app level, not the SDK.
Currently 2 Firebase services are used:
- Crashlytics
- Dynamic Links
They are enabled in tandem, if the appropriate Google services file
(GoogleService-Info.plist on iOS or google-services.json on Android) is found.
Each service needs to be individually enabled in the Firebase console.
Note that Android 9 Pie (API 28) disallows HTTP requests by default, so an
exception was needed in the app in order for the Metro bundler to work in debug
mode.
Glide (which is used by react-native-fast-image) can cause trouble if the host
app (the one using the SDK) is using Glide already.
To avoid this, don't use the builtin AppGlideModule (as the docs recommend) and
let apps define it.
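Host apps that want Glide's generated API then define their own module, along
the lines of this sketch (class name illustrative):

~~~
import com.bumptech.glide.annotation.GlideModule;
import com.bumptech.glide.module.AppGlideModule;

// Defined by the host application, not the SDK, so the two Glide setups
// don't clash; an empty subclass is enough to get the generated API.
@GlideModule
public final class MyAppGlideModule extends AppGlideModule {
}
~~~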
Set them to the next release versions. In addition, the buildNumber variable
will be used to match the requirements of versionCode:
https://developer.android.com/studio/publish/versioning
that is, a monotonically increasing number, independent of the app / sdk
version.
The upstream package has been unmaintained for 2 years now, and making the
little changes needed as React Native needs them is getting old. The actual
functionality is a couple of one-liners plus tons of boilerplate, which gets
reduced by quite a bit if we just embed it. So here it goes.
Due to a switch to a newer version of JSCore, the jsc-android dependency is now
used by the SDK. As this dependency is not (yet) available in the Jitsi Maven
repository, an error like this is reported when an application that uses the
SDK is run:
com.facebook.react.common.JavascriptException: Can't find variable: Symbol
This commit primarily improves the instructions on how to create a local Maven repository
that contains all required dependencies, including the JSCore dependency that was missing.
This intends to address the issue described in https://github.com/jitsi/jitsi-meet/issues/3399
This makes the PermissionsAndroid builtin module work.
Introduce the JitsiMeetActivityInterface, which defines the interface that
activities using JitsiMeetView directly must implement in order to ensure full
functionality.