I'm a buyer of the first version of SALSA, but I never used it. I was surprised that the asset was gone. On the other hand, I understand that further development has to be financed somehow. So that I don't have this experience again, I would like to clarify a few things before I buy the upgrade.

1. On Youtube you can find the SALSA LipSync Suite (v2) tutorial series. Is another (paid) upgrade coming, or is this already the version I purchase and download from the Asset Store?

2. In my current project I want to make a UMA character speak. I also got a recommendation from the UMA2 forum; however, I see here that it doesn't seem to work that easily after all.

3. I found the following implementation steps in this forum:

"Same as previous, apply the OneClick to your avatar root. On the UmaUepDriver, disable (uncheck) the 'UMA Character is Dynamic' option. Configure an UMAExpressionPlayer on your avatar with these parameters:
UMAExpressionPlayer.expressionSet = yourExpressionSet
UMAExpressionPlayer.umaData = yourUmaData
Then, call our UEP driver manual start function, passing a link to the UMAExpressionPlayer you just configured:
UmaUepDriver.ManualStart(yourUMAExpressionPlayer)"

"Same as previous"?! I do not understand the context.

4. Unfortunately there is no tutorial for UMA2 and LipSync Suite v2 on Youtube, but for the "old" version (probably SALSA LipSync Suite v1) I found a tutorial for the integration of UMA2. Is it a good idea to do this tutorial with v1 and then upgrade SALSA Suite to v2? I watched the first two videos of the v2 series; in the dropdown of the Viseme controller type there is UMA…

Reply:

Regardless of the controller, blendshapes can only be set to one value at a time. The only way to prevent blendshape conflicts with any system is to have enough blendshapes to prevent dual usage. This is why we're starting to see character systems implementing emote blendshapes in addition to visemes. SALSA's queue deals with conflicts using the following prioritization rules: Eyes override nothing, except blink (which is sent to the queue as an emote).

Fuse characters barely have enough blendshapes for speech to begin with, unless you're adding additional blendshapes (you could do this with our MorphMixer asset) or using a smaller, less detailed set than our one-click setup uses. There simply aren't enough blendshapes to do comprehensive lipsync and emoting at the same time using Fuse character default blendshapes. If you use a shared blendshape for speech and an emote, the emote will fire, but as soon as lipsync needs that blendshape, lipsync will override the emote.

SALSA LipSync Suite v2.2.2 is now available on the AssetStore. This is a feature release which contains several goodies requested by our awesome customers:
- Check SALSA's processing status via an active check to Salsa.IsSALSAing, or subscribe to the C# events for notification.
- Timing and Easing overrides - set the global values for all viseme components.
- Easy mode for animation timings - 3 sliders, interlinked to help you find the perfect look-and-feel for your SALSA-based project.
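For reference, the manual-start steps quoted in the thread could be wired into a single Unity component. This is only a sketch based on the snippets above, not SALSA's official integration code: the component name, field names, and the SALSA namespace are assumptions, and the exact UmaUepDriver API (including the "UMA Character is Dynamic" toggle and the ManualStart signature) should be checked against the SALSA Suite documentation.

```csharp
using UMA.PoseTools;  // UMAExpressionPlayer lives here in UMA 2
using UnityEngine;

// Sketch only. Assumes the SALSA OneClick has already been applied to the
// avatar root and that "UMA Character is Dynamic" is unchecked on the driver.
public class ManualUepStart : MonoBehaviour
{
    // Assign these in the inspector. The expressionPlayer must already have
    // its expressionSet and umaData configured, per the steps quoted above.
    public UMAExpressionPlayer expressionPlayer;
    public CrazyMinnow.SALSA.UmaUepDriver uepDriver; // namespace assumed

    void Start()
    {
        // Hand the configured UMAExpressionPlayer to the UEP driver,
        // as described in the forum steps: UmaUepDriver.ManualStart(...)
        uepDriver.ManualStart(expressionPlayer);
    }
}
```

The point of the component is simply ordering: the UMAExpressionPlayer must be fully configured before the driver's manual start is called, which is why the quoted steps disable the "dynamic" option and defer startup to your own code.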