Prompt Muse (https://promptmuse.com): A.I Tutorials, News, Reviews and Community

How I Made A K-Pop Singer: Full Process with phone mocap!
https://promptmuse.com/how-i-made-a-k-pop-singer-full-process-with-phone-mocap/
Published Thu, 01 Aug 2024 11:07:03 +0000



How to create a 3D character with Move.ai and Suno
Character Creator & Blender Pipeline

Introduction

Welcome to this written tutorial on creating 3D characters with Character Creator, Blender, and Unreal Engine. Please see the full video here

In this tutorial, I’ll guide you through the entire process of creating 3D characters using Character Creator and the free CC Blender Pipeline plugin: from character creation to importing into Blender and finally into Unreal Engine. We’ll cover every step in detail and try out some new plugins! Please note that Reallusion is currently running a competition with cash prizes; check it out here!

What You’ll Learn

  • How to quickly put together a 3D character using Character Creator and the Blender pipeline.
  • How to integrate the plugin into your workflow, bring everything back into Character Creator, and add mocap.

Prerequisites

  1. Character Creator: Used to create and export your character (currently 40% off with a 30-day trial).
  2. Unreal Engine: Ensure you have it installed.
  3. CC Rig Plugin: Available on the Epic Store Marketplace.
  4. MetaHuman Plugin: Install from the Marketplace.
  5. Unreal Auto Setup: Download from the Reallusion website.
  6. Live Link Face App: Free via your phone’s app store.
  7. Ensuring All Plugins Are Active in Unreal Engine:

    To ensure a smooth workflow, you need to make sure all necessary plugins are active in Unreal Engine. Follow these steps:

    1. Activate Required Plugins:
      • In the Plugins window, use the search bar to find each of the required plugins:
        • Quixel Bridge
        • CC Rig Plugin
        • MetaHuman Plugin
        • Unreal Auto Setup
        • Live Link
        • Apple ARKit (for face tracking)
      • Make sure each of these plugins is enabled by checking the box next to their names.
      • Click Restart Now if prompted to restart Unreal Engine after enabling plugins.

Step 1: Creating Your 3D Character

We start with Character Creator to design and customize our 3D character. This powerful tool allows you to sculpt detailed and lifelike characters with ease. Sliders let you quickly adjust facial features, body proportions, and clothing to match your desired look.

Character Creator with the female character in the viewport

Importing the Character into Blender

  1. Once our character is ready, we’ll use the free CC Blender Pipeline plugin to export the character and import it into Blender.

  2. This plugin simplifies the process, ensuring that all character details are preserved. In Blender, we’ll begin sculpting and use additional plugins to paint details and create stylized hair quickly.

Exporting the character using the CC Blender Pipeline
Using the sculpting tools in Blender on the CC character

Step 2: Creating a Stylized 3D K-pop Outfit

If you would like access to a premade marketplace of clothes, the Reallusion Content Store has many assets that can be downloaded directly into your CC program. This makes it easy to get quality clothing onto your character. Click here to enter the Content Store.

But if you would prefer to make your own clothes, this comes with an additional learning curve and the expense of a third-party product. I get asked regularly about creating clothes from scratch, so I thought I’d try out a product for you called Style3D Atelier, which we will use to design a bespoke K-pop style outfit for our character.

 
Character Creator Content Store for all your clothing needs
Final outfit
Step 3: Adding Mocap with Move.ai One

Once our character is complete, we’ll add motion capture (mocap) using Move.ai One. This tool claims to produce mocap using just one camera. Although it may not be as precise as mocap suits or the multi-camera option (Move.ai Pro), I’m curious to see its capabilities.

Setting up Move.ai was straightforward: I simply downloaded the app on my iPhone and recorded using the free trial, capturing 10-second intervals and uploading the finished .fbx files to my PC. My first impressions were good. I knew it would be a wobbly animation that would require mocap clean-up, but it was good enough for my little example.

Using Move.ai to record mocap
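If you record a lot of takes, a tiny script can keep them organized once they land on the PC. This is just a housekeeping sketch of my own; the folder layout and `take_NN.fbx` naming scheme are hypothetical, not part of Move.ai:

```python
import shutil
from pathlib import Path

def collect_takes(downloads: str, project: str) -> list[str]:
    """Copy every .fbx take into the project folder, renamed in
    modification-time order (take_01.fbx, take_02.fbx, ...)."""
    dst = Path(project)
    dst.mkdir(parents=True, exist_ok=True)
    # sort by mtime so the numbering follows recording order
    takes = sorted(Path(downloads).glob("*.fbx"), key=lambda p: p.stat().st_mtime)
    names = []
    for i, take in enumerate(takes, start=1):
        name = f"take_{i:02d}.fbx"
        shutil.copy2(take, dst / name)  # copy2 preserves timestamps
        names.append(name)
    return names
```

Run it against whatever folder your phone uploads into, and the takes come out numbered and in one place, ready to be cleaned up one by one.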

Step 4: Cleaning Up Mocap Data

Of course, some clean-up is necessary, especially when you are recording mocap from one iPhone! I recommend using iClone for this, which will help you clean up the mocap fast. If you’re on a budget, Blender is a great alternative; however, it has a steeper learning curve and is not as intuitive as iClone.

I imported my 3D character into iClone and used a free retargeting file to streamline the process. It was easy and enjoyable!

I repeated the clean-up steps for each mocap import from my phone until I had a good selection of dance moves, which I saved in my iClone library for future use.

Cleaning up mocap in iClone

Step 5: Exporting to Unreal Engine

At this point, you could export your character to Blender using the CC Pipeline. However, I opted to take my character into Unreal Engine so I could use the iPhone to record my facial animations. (You can also do this in iClone with AccuFace.)

I simply exported my character without any animations as an FBX file, and then exported each of my animations as an FBX.

In Unreal Engine, I imported my skeletal mesh and animation sequences into the content drawer, added them to a new level sequence, and converted my character to a CC Control Rig to enable facial animation.

Cleaning up mocap in iClone
Importing the skeleton and animations into UE

Step 6: Recording Animation

To record in Unreal Engine you need to set up a Level Sequence; think of this as your timeline to add animation to. I converted my imported skeletal mesh to a CC Control Rig, a fantastic free plugin which you can grab here. It allows me to adjust my animation further in Unreal Engine and to copy and paste animation onto the facial controls from my MetaHuman.

Recording Singing with MetaHuman and Live Link

To keep things organized and efficient, I opened a separate project for this step. I imported a MetaHuman into my scene, used Live Link and the Live Link Face app on my phone to record singing, and exported the animation as an .fbx file. Finally, I imported this into my original project and applied it to my character’s facial control rig.

Putting everything together in Unreal Engine

DOWNLOAD LINKS

Promptmuse X
  •  #CharacterCreator #iClone #RiggedCharacter #UnrealEngine #UE #Controlrig

More To Explore

Mocap with Custom Characters
https://promptmuse.com/mocap-with-custom-characters/
Published Fri, 28 Jun 2024 09:38:57 +0000



Epic Games Face App mocap for custom characters

Introduction

Welcome to this written tutorial on how to animate custom characters in Unreal Engine using the Live Link Face app. This guide will show you how to easily transfer facial animations from your iPhone to your custom characters, including setting up body animations. Full video here

What You’ll Learn

  • How to set up and use the Live Link Face app with Unreal Engine
  • How to import and animate characters from Character Creator
  • How to add body animations to your characters

Prerequisites

  1. Character Creator: Used to create and export your character (currently 40% off with a 30-day trial).
  2. Unreal Engine: Ensure you have it installed.
  3. Quixel Bridge: Download and install from the Epic Store Marketplace.
  4. CC Rig Plugin: Available on the Epic Store Marketplace.
  5. MetaHuman Plugin: Install from the Marketplace.
  6. Unreal Auto Setup: Download from the Reallusion website.
  7. Live Link Face App: Free via your phone’s app store.
  8. Ensuring All Plugins Are Active in Unreal Engine:

    To ensure a smooth workflow, you need to make sure all necessary plugins are active in Unreal Engine. Follow these steps:

    1. Activate Required Plugins:
      • In the Plugins window, use the search bar to find each of the required plugins:
        • Quixel Bridge
        • CC Rig Plugin
        • MetaHuman Plugin
        • Unreal Auto Setup
        • Live Link
        • Apple ARKit (for face tracking)
      • Make sure each of these plugins is enabled by checking the box next to their names.
      • Click Restart Now if prompted to restart Unreal Engine after enabling plugins.

Step 1: Create an Unreal Project 

(Installing the Auto Setup as a bridge from Character Creator to Unreal Engine)

  1. Launch Unreal Engine and create a new blank project.
  2. Name the project (e.g., “AutoTutorial”) and create it.
  3. Close the project to install the necessary files.
Initial setup of the mocap project in Unreal Engine, preparing to animate MetaHuman characters

Step 2: Install Unreal Auto Setup

  1. Download the Auto Setup from the Reallusion website and unzip it.
  2. Run the setup executable file.
  3. Copy the contents and plugin folders from the downloaded files to your Unreal Engine project folder (e.g., Documents > Unreal Engine > AutoTutorial).
  4. Replace the files when prompted.
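If you reinstall Auto Setup often (for example, once per new project), steps 3 and 4 above can be scripted. This is a minimal sketch, assuming the unzipped download contains `Content` and `Plugins` folders; the paths in the usage comment are hypothetical, so adjust them to your own machine:

```python
import shutil
from pathlib import Path

def install_auto_setup(auto_setup_dir: str, project_dir: str) -> list[str]:
    """Copy the Auto Setup 'Content' and 'Plugins' folders into an
    Unreal project folder, overwriting existing files (the manual
    'Replace the files' prompt)."""
    copied = []
    for folder in ("Content", "Plugins"):
        src = Path(auto_setup_dir) / folder
        dst = Path(project_dir) / folder
        if src.is_dir():
            # dirs_exist_ok=True merges into existing folders and
            # replaces files that are already there
            shutil.copytree(src, dst, dirs_exist_ok=True)
            copied.append(folder)
    return copied

# Hypothetical paths - adjust to your own download and project locations:
# install_auto_setup(r"C:\Downloads\AutoSetup",
#                    r"C:\Users\me\Documents\Unreal Projects\AutoTutorial")
```

The effect is identical to the manual copy; the script just saves the clicking when you repeat the setup for each new project.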
 
Downloading the AutoSetup plugin from Reallusion for seamless integration with Unreal Engine
Step 3: Import Your Custom Character
  1. Open Character Creator and select your character (this has to be a CC3 character).
  2. Export as FBX with Unreal Engine as the target preset. File>Export>FBX>Clothed Character
  3. Import the character into your Unreal Engine project, ensuring CC Control Rig is installed.
Exporting a character from Character Creator to create a control rig in Unreal Engine

Step 4: Setup MetaHuman

We are now going to use a MetaHuman as a dummy to record our facial animations onto.

  1. Import a MetaHuman character via Quixel Bridge and add it to your Unreal project.
  2. Set up Live Link Face App on your iPhone and ensure it is connected to your computer.
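Before opening the Sequencer, it can help to confirm the phone can reach your PC at all. Live Link streams over the network via UDP (port 11111 is the default target port, as far as I can tell; check the app's settings). The listener sketch below, run on the PC, simply reports whether any datagram arrives. It is purely a diagnostic aid of my own, not part of the official workflow:

```python
import socket

def wait_for_packet(port: int = 11111, timeout: float = 10.0) -> bool:
    """Listen on the given UDP port and report whether any datagram
    arrives within the timeout. A received packet means the phone
    can reach this machine on that port."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("0.0.0.0", port))
        sock.settimeout(timeout)
        try:
            data, addr = sock.recvfrom(65535)
            print(f"Received {len(data)} bytes from {addr[0]}")
            return True
        except socket.timeout:
            return False
```

Note that Unreal must be closed while you run this, since only one program can bind the port at a time. If nothing arrives, check that both devices are on the same network and that the app is pointed at the PC's IP address.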

Step 5: Connect MetaHuman to Live Link

  1. In Unreal Engine, select your MetaHuman character.
  2. Set up the Live Link connection in the details panel and ensure facial tracking is active.

Step 6: Recording Animation

  1. To record in Unreal Engine you need to set up a Level Sequence; think of this as your timeline to add animation to. Below are the steps to set up your Level Sequencer and record your facial animation to it via your iPhone:

Create a New Level Sequence:
In the Content Browser, right-click and go to Animation > Level Sequence.
Name your sequence and double-click to open it in the Sequencer.

 

Add Your Metahuman to the Sequence:
In the Sequencer window, click the + Track button.
Select Actor to Sequencer, then choose your Metahuman character from the list.

 

Start Recording:
Click the Record button in the Sequencer toolbar (the red button at the left of the screen). A countdown will begin.
Once recording starts, ensure your ARKit face-tracking device is capturing your facial movements, and perform the desired expressions and movements.

Stop Recording:
Once you’ve finished the performance, click the Stop button in the Sequencer toolbar.
The recorded animation will appear as keyframes in the Sequencer timeline.

 

Review and Edit the Animation:
Scrub through the timeline to review the recorded animation.
You can adjust keyframes, refine movements, and blend animations as needed for a polished result.

 

Save Your Work:
Always save your Level Sequence and project to avoid losing any progress.

Step 7: Baking The Key Frames

  1. After stopping the recording, select the recorded track in the Sequencer.
  2. Right-click on the track and choose Bake To Control Rig > Face_ControlBoard_CtrlRig. This process will convert the live link data into keyframes, which we now can copy and paste on to our custom Character.
Baking live link animation data into keyframes for a Metahuman character in Unreal Engine

Step 8: Copying the Keyframes to Your Custom Character

Select the baked keyframes in the Sequencer for your Metahuman character.
Right-click and choose Copy.
Add your Character Creator (CC) character to the Sequencer by clicking the + Track button and selecting your CC character.
Navigate to the appropriate track on your CC character where you want to paste the keyframes.
Right-click on the track and choose Paste to apply the baked keyframes to your CC character.

 


Conclusion

That’s it for this tutorial on using the Live Link Face app to animate custom characters in Unreal Engine. If you have any questions or want to share your creations, feel free to tag me on social media @PromptMuse

DOWNLOAD LINKS

More To Explore

Custom Blender Hair To Metahuman Guide
https://promptmuse.com/creating-custom-hair-for-metahumans-in-unreal-engine-using-blender/
Published Fri, 19 Jan 2024 10:27:16 +0000

Introduction

Tired of the default hair options for Metahumans in Unreal Engine? This comprehensive guide will walk you through creating and customizing Metahuman hair using Blender. This process involves no plugins and uses entirely free software.

Please Join my Patreon to get access to Blender Resources and additional guides! PromptMuse | 3D & AI Learning Community | Patreon

Prerequisites

Basic familiarity with Blender and Unreal Engine. This tutorial uses 100% free software.

If you don’t have these installed, refer to my getting started with Unreal Engine and Metahumans here (Coming soon).


1.1 Create a Metahuman

1.2 Setting Up your Unreal Project

  • Open Unreal Engine via the Epic Launcher, navigate to the Marketplace on the top tab, and search “Metahuman”; download the free ‘Metahuman Lighting’ rig.
  • Create a new project and launch it.
  • In Unreal Engine, navigate to Window > Quixel Bridge.
  • In Quixel Bridge, download your Metahuman by selecting “My Metahumans” from the left-hand navigation. Once downloaded, select your Metahuman’s card and, from the right-hand panel, select “Add”. This will add your Metahuman to the content drawer in your Unreal project.

Step 2: Exporting Mesh to Blender

  • In Unreal Engine, select your Metahuman and remove the existing hair if needed.
  • Now, export the head mesh. Find the mesh in the content browser, right-click, and select “Export.” Choose FBX format for Blender compatibility. Save the file in a convenient location

Step 3: Creating Hair in Blender

3.1 Importing and Prepping the Mesh

  • Open Blender (I’m using Blender 4.0). In a new scene, delete the default objects.
  • Go to File > Import > FBX, and select the exported head mesh.
  • In the viewport, separate the scalp from the face. This isolation will help in focusing the hair creation on the scalp area.

3.2 Designing the Hair

  • With the scalp selected, enter Sculpt Mode.
  • Use the Add Hair tool to begin placing hair guides on the scalp. These guides will shape the overall hairstyle.
  • Adjust hair length, density, and curvature. For long hairstyles, increase the length and density. For short styles, reduce these parameters.
  • Apply modifiers for specific textures and effects:
    • Clump Modifier: To create grouped strands of hair.
    • Curl Modifier: For curly or wavy hair.
    • Frizz Modifier: Adds a frizzy, unstructured look to the hair.
  • Focus on the hairline and parting. Add more hair guides here to ensure a natural, dense look.

3.3 Exporting Hair Back to Unreal Engine

  • Once you’re satisfied with the hair design, export it as an Alembic (.abc) file. This format preserves the hair details for Unreal Engine.

Step 4: Finalizing in Unreal Engine

4.1 Importing Hair into Unreal

  • Back in Unreal Engine, import the Alembic file. Navigate to the content browser, right-click, and select ‘Import to /Game’. Locate your Alembic file and import it.

4.2 Adjusting Hair Settings

  • Select the imported hair in the content browser. In the details panel, fine-tune the settings:
    • Interpolation: Adjust for smooth hair transitions.
    • Root and Tip Scale: Control the thickness of the hair at the root and the tip.
    • Color Settings: Customize the hair color and texture to match your character’s style.
  • Enable physics simulation for realistic hair movement.

4.3 Binding Hair to Metahuman

  • To attach the hair to your character, use the ‘Create Binding’ option in Unreal Engine. This step ensures that the hair moves naturally with your character’s animations.

Conclusion

You’ve successfully created and customized hair for your Metahuman character in Unreal Engine. Experiment with different styles and modifiers to enhance your digital characters further. Don’t forget to save your project to preserve your work.

Additional Resources

How to Animate Game Characters and Import Them into Blender and Unreal Engine 5
https://promptmuse.com/how-to-animate-game-characters-and-import-them-into-blender-and-unreal-engine-5/
Published Mon, 25 Sep 2023 08:58:36 +0000

In this tutorial, I will guide you through the process of animating your game characters and importing them into Blender and Unreal Engine 5. This tutorial is designed for those who don’t have a budget for expensive animation software or motion capture suits. The full tutorial video of this process can be found on my YouTube channel here.

We will be using a software called “Cascadeur,” which has been around for about 10 years and offers a free version with some powerful animation tools. While this method is not a replacement for professional animation software or mocap, it’s a viable alternative if you’re on a tight budget.

Note: Before you start, make sure you have Cascadeur and Accurig installed on your computer. You can download Cascadeur from the official website, and Accurig is a free auto-rigging tool that complements Cascadeur.

Watch the video: https://www.youtube.com/embed/ScQTV2Xb–0

Let’s get started!

Part 1: Rigging Your Character in Accurig

  1. Open Accurig and click on “Choose File” to select your 3D character’s FBX file. You can use a sample character from Cascadeur, Mixamo, Sketchfab, or your own custom character.
  2. After loading your character’s mesh, click on “Rig Body” to generate the joint structure for your character’s skeleton.
  3. Accurig will display circles where joints should be placed. Ensure symmetry is checked to work on one side of the character.
  4. Position the joint guides according to your character’s anatomy, following the on-screen guides for reference.
  5. Use the tools in the bottom left corner to rotate and move around your character for precise joint placement.
  6. Repeat the process for other body parts, such as arms and legs, ensuring correct joint placement.
  7. Use the “Preview Motion” window to check the animation on various body parts, including fingers.
  8. Ensure your character is in a neutral pose (A-pose or T-pose) before exporting.
  9. Click “Upload to AccuRig” and then “Export” > “Export FBX.” Set the target application to “Maya” and check “Embed Texture.” Click “Export” to save the rig.
  10. Export another FBX file of your character’s base mesh but set the target application to “Blender” for later use.
AccuRig auto-rigger tool by Reallusion

Part 2: Creating a Basic Idle Animation in Cascadeur

  1. Open Cascadeur and start a new scene. Import the FBX file with Maya settings that you exported from Accurig.
  2. Cascadeur will ask if you want to enter “Rig Mode.” Click “Yes.”
  3. In the “Rig Mode Helper” dialog, click “Yes” and then “OK” on the next dialog.
  4. Click “Add Rig Elements” at the bottom of the “Quick Rigging Tool” dialog.
  5. Rotate your character by holding ALT and the left mouse button to navigate.
  6. Select the “Auto Pose” tool to enable automatic control point positioning as you move your character.
  7. Position your character into an initial pose for your idle animation by moving and rotating control points. Use ‘W’ to move and ‘E’ to rotate.
  8. Add a keyframe at frame 10 by clicking the key icon.
  9. Change the hand pose on frame 10 to create a hand open/close animation.
  10. Duplicate the first frame to frame 20 and mirror the pose for variety.
  11. Duplicate the second keyframe to frame 35 and mirror it to frame 45.
  12. Extend the timeline to add more frames for smoother animation (e.g., 200 frames in total).
  13. Copy the first frame to frame 145 and the second keyframe to frame 110.
  14. Apply bezier curve interpolation for smoother animation between keyframes.
  15. Review and refine your animation by adding subtle movements, such as chest and shoulder motion.
  16. Create a seamless loop by ensuring the first and last frames are identical and adjust frame numbers accordingly.
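The keyframe layout above (poses at frames 1, 10, 20, 35, 45, 110, and 145, plus a copy of the first pose at the end) is easier to sanity-check when written out as data. The sketch below is purely illustrative; the pose names and the `is_seamless_loop` helper are my own inventions, not Cascadeur features:

```python
def mirror(pose: str) -> str:
    """Hypothetical left/right mirror of a named pose."""
    return pose + "_mirrored"

# Keyframe plan from the steps above: pose A at frame 1, a hand
# change (pose B) at frame 10, mirrored copies spread across the
# timeline, and pose A repeated on the final frame so the loop closes.
keys = {
    1: "A",
    10: "B",
    20: mirror("A"),
    35: "B",
    45: mirror("B"),
    110: "B",
    145: "A",
    200: "A",  # final frame repeats frame 1 for a seamless loop
}

def is_seamless_loop(keyframes: dict[int, str]) -> bool:
    """A loop is seamless when the first and last keyed poses match."""
    frames = sorted(keyframes)
    return keyframes[frames[0]] == keyframes[frames[-1]]

print(is_seamless_loop(keys))  # True
```

Laying the plan out like this makes step 16 concrete: whatever else you change mid-timeline, the first and last keys must hold the identical pose or the idle animation will pop when it repeats.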

Part 3: Exporting the Animation to Blender

  1. Export the animation from Cascadeur to Blender by going to “File” > “Export” > “FBX.” Name the file and click “Save.”
  2. In Blender, import the animation by going to “File” > “Import” > “FBX.” Use the default settings and click “Import FBX.”
  3. Delete any existing objects in the Blender scene and select the imported Armature.
  4. Adjust the Armature’s rotation to face the front and place it in the scene.
  5. Create an animation track for the imported animation and rename it.
  6. Copy the animation keyframes from the imported Armature and paste them onto your character’s Armature.
  7. Delete the imported Armature to keep your scene clean.
  8. Create an animation loop for your idle animation in Blender using the NLA (Non-Linear Animation) Editor.
Blender animation import

Part 4: Exporting the Animation to Unreal Engine 5

  1. In Unreal Engine 5, create a new project and organize your project folder.
  2. Import your character’s 3D mesh into Unreal Engine by right-clicking in the “Content” folder, selecting “Import,” and choosing your FBX file. Ensure it’s assigned to the correct skeleton.
  3. Add a Level Sequence to your project by right-clicking in the “Content” folder and selecting “Level Sequence.”
  4. Drag your character’s skeletal mesh into the Level Sequence.
  5. Add your idle animation to the Level Sequence by clicking the plus icon and selecting the animation.
  6. Adjust the timeline as needed and press the spacebar to preview your animation.
  7. Extend the timeline and blend your idle and walk animations for a seamless transition.

Part 5: Adding Free Mocap Data

  1. Visit the ActorCore website and explore the free motion resources.
  2. Download free motion data compatible with Cascadeur.
  3. Import the downloaded motion data into Cascadeur, and apply it to your character as needed.
  4. Refine and customize the imported motion data to suit your character and animation needs.

This tutorial should help you get started with animating and importing your game characters into Blender and Unreal Engine 5 using Cascadeur. Feel free to explore further features and animation possibilities in Cascadeur to enhance your character animations.

Remember, practice makes perfect, and with time, you’ll be creating stunning animations for your game characters. Enjoy animating!

Turn AI Images into 3D Animated Characters: Tutorial
https://promptmuse.com/turn-ai-images-into-3d-animated-characters-tutorial/
Published Fri, 13 Jan 2023 17:00:13 +0000

Welcome to this tutorial on how to turn an AI generated character into a 3D animated character. This workflow can be used to create AI influencers, bring a music video to life, or even create a feature film.

Before we begin, you will need a trained model to produce the head shots. You can either follow a tutorial to create your own unique trained AI model, or use the one provided in this tutorial below.

Please select what is compatible with your phone, as you may require a different type of adapter:
Apple Lightning to Ethernet adapter

Ethernet cable

RESOURCES: Download my Redhead.ckpt model from HERE

Stable Diffusion (Use local or remote)

Step 1: Gather Pose Reference Images

Take some photos of yourself to use as headshot references. These photos will be used to ensure that the output pose of your AI generated character is consistent when it is run through stable diffusion. It is important to note that the reference images do not need to look like the final character.

Step 2: Use the Automatic1111 webui (you can use either local or remote; I’ll add a tutorial soon!)

Use the Automatic1111 webui to run Stable Diffusion 1.5. Load your Redhead.ckpt into the models folder within the Automatic1111 directory.

Step 3: Run stable diffusion

In Stable Diffusion, select your redhead.ckpt from the drop-down list. Navigate to the img2img tab and upload your front, side, and perspective headshot references.

Step 4: Create consistent images of your character

Use your reference images as img2img inputs to create consistent images of your character.
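If you would rather script this step than click through the web UI, Automatic1111 also exposes a local API when launched with the `--api` flag. The sketch below only builds the img2img request payload; the prompt text and parameter values are illustrative, and `denoising_strength` is the setting that controls how closely the output follows your reference pose:

```python
import base64
from pathlib import Path

def build_img2img_payload(reference_path: str, prompt: str) -> dict:
    """Build a JSON payload for Automatic1111's /sdapi/v1/img2img
    endpoint. A lower denoising_strength keeps the pose of the
    reference image; a higher one gives the model more freedom."""
    image_b64 = base64.b64encode(Path(reference_path).read_bytes()).decode()
    return {
        "init_images": [image_b64],
        "prompt": prompt,
        "denoising_strength": 0.45,  # illustrative: keeps the pose, restyles the face
        "steps": 30,
    }

# Usage (assumes the webui is running locally with --api enabled):
# import requests
# payload = build_img2img_payload("front_headshot.png",
#                                 "redhead woman, studio headshot")
# r = requests.post("http://127.0.0.1:7860/sdapi/v1/img2img", json=payload)
```

Scripting the call makes it easy to run the same settings over your front, side, and perspective references in one batch, which helps keep the character consistent across all three views.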

With these steps, you should now have consistent images of your character to build a 3D version from. Be creative and experiment with different poses to bring your character to life!

Blender

Use the FaceBuilder plug-in to create a 3D head mesh based on your reference images. This tool is very useful, as the sculpting tools in MetaHuman Creator are limited and can be very laggy. However, this stage is optional.

Step 1: Download and install Blender here (it’s free) and the FaceBuilder plug-in by KeenTools here

Step 2: Open Blender and import your reference images

Step 3: Use the Facebuilder plug-in to create the 3D model head mesh

Step 4: Export your head mesh as an .fbx file.

 

Note: The creator of this tutorial is not paid in any way to promote the Facebuilder plug-in. It is just a tool that they found useful and thought others may also find it helpful.

With these steps, you should now have a 3D head mesh based on your reference images. You can now continue to the MetaHuman Creator section to bring your character to life with animations and other features.

Epic Launcher & Unreal

Step 1: Follow this link here to download the Epic Games Launcher and Unreal Engine.

Please avoid 5.1 (the new release) due to compatibility issues with MetaHumans. I’m sure there will be an update soon to fix a few of the issues, but until then I’d advise downloading Unreal version 5.0.3.

Once the above is installed, get Quixel Bridge for Unreal Engine:

https://docs.unrealengine.com/5.0/en-US/quixel-bridge-plugin-for-unreal-engine/

 
