
The best AI is still free! Forge & Flux for FREE (Easy Guide) + Animation

Introduction

Welcome to my tutorial on how to use Forge UI and the Flux model to create and animate images for free. This step-by-step guide will help you harness the power of these free AI tools and bring your creative ideas to life with ease. The accompanying video (https://www.youtube.com/watch?v=dbqiz6jt0o8) walks through the process step by step.

Forge and Flux are powerful, free AI-driven tools that allow you to create stunning images and animations effortlessly. Whether you're an artist, content creator, or just curious about AI, this guide will walk you through everything you need to get started.

System Requirements

  • Windows Tutorial: The video is aimed at Windows users, with a minimum VRAM requirement of 6 GB for Forge.
  • Storage Requirements: Forge requires at least 1.7 GB, while Flux needs between 12-17 GB, depending on the version you use (NF4 or FP8).

Installing Forge UI

Forge is popular because it offers a clean, user-friendly interface that makes AI image generation accessible to both beginners and advanced users. It’s free, open-source, and provides compatibility with various AI models, including Flux and Stable Diffusion, allowing for versatile creative projects. The tool is optimized for fast performance, particularly for users with strong GPUs, and supports LoRA models for further customization. Forge’s ability to deliver unlimited image generation without subscriptions, coupled with its easy-to-use design, has made it a go-to option for those looking for a powerful yet efficient AI tool.
 
Download Forge Here:

Forge Download: https://github.com/lllyasviel/stable-diffusion-webui-forge

 

 

Installing Flux

Flux is a popular AI model known for its ability to generate high-quality images, from hyperrealistic art to anime and digital paintings, while also excelling at comprehending and integrating text within images. It is designed to work efficiently on lower-end GPUs, making it accessible to a broader range of users, even those with limited hardware. With its compatibility in tools like Forge, it allows for flexible creativity and fast performance, offering a streamlined experience for AI-based image generation without sacrificing quality. These features, combined with its ability to run on systems with moderate VRAM, make Flux a go-to model for AI enthusiasts.

Download Flux Here:

Flux Download: https://github.com/lllyasviel/stable-diffusion-webui-forge/discussions/981

NF4 Version:

  • Recommended for VRAM between 6 GB and 12 GB.
  • This is the developer-recommended version because it is very fast and efficient.
  • Ideal for users with moderate hardware specifications. It provides a good balance between speed and image quality while taking up less space.

FP8 Version:

  • Recommended for VRAM of 12 GB and higher.
  • This version is more demanding on your system and offers higher precision and quality. It is suitable for users with more powerful GPUs.
  • If you have ample VRAM (12 GB or more), the FP8 version can generate more detailed images, though it will require more system resources and take up more storage.

Choosing Between NF4 and FP8:

  • If you have 6-12 GB of VRAM, it’s suggested to go with the NF4 version, as it’s optimized for speed and performance with lower hardware requirements.
  • For users with 12 GB of VRAM or more, you can opt for the FP8 version for higher-quality image generation. (Not sure how much VRAM your card has? See the quick check below.)
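If you're unsure how much VRAM your graphics card has, the short Python sketch below is one quick way to check. It assumes a working PyTorch install with CUDA (not part of the Forge setup itself); the 6 GB and 12 GB thresholds simply mirror the guidance above.

```python
# Quick VRAM check to help choose between the NF4 and FP8 Flux versions.
# Assumes PyTorch with CUDA support is installed.
import torch

if not torch.cuda.is_available():
    print("No CUDA GPU detected - Forge/Flux will be extremely slow on CPU.")
else:
    props = torch.cuda.get_device_properties(0)
    vram_gb = props.total_memory / (1024 ** 3)
    print(f"GPU: {props.name}, VRAM: {vram_gb:.1f} GB")

    if vram_gb < 6:
        print("Below the 6 GB minimum recommended for Forge + Flux.")
    elif vram_gb < 12:
        print("6-12 GB: the NF4 version is the recommended choice.")
    else:
        print("12 GB or more: you can opt for the FP8 version for higher quality.")
```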

 

Prompt I used in the YouTube video:

    Cinematic composition, Digital art, The central figure is a young woman with long, flowing, silver-white hair cascading down from her head, blending seamlessly with the surrounding white to form a surreal, ethereal background. Her expression is calm and serene, with closed eyes, suggesting a state of relaxation or meditation. She is surrounded by a white Japanese dragon. The dragon's smooth body twists around hers, the end of the dragon's body turning to dust. She is facing forwards towards the viewer with her eyes open, and has one hand elegantly placed on the dragon. She has extremely sharp, long red nails. There is a glowing ring light in the background. The overall style is a blend of fantasy and majestic science fiction, with a high level of detail and smooth, polished textures.

 
Steps: 40
Sampler: Euler
Schedule type: Simple
CFG scale: 1
Distilled CFG scale: 3.5
Seed: 739566982
Size: 896×1152
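Forge handles all of these settings through its UI, but if you ever want to reproduce a similar render from a script, the Hugging Face diffusers library exposes Flux through its FluxPipeline class. The sketch below is illustrative rather than part of the Forge workflow: it assumes you have accepted the FLUX.1-dev licence on Hugging Face, have the diffusers and torch packages installed, and have enough VRAM (or patience with CPU offload). The guidance_scale argument plays the role of the Distilled CFG scale above.

```python
# Illustrative sketch: reproducing similar generation settings with diffusers,
# outside of Forge. Assumes the FLUX.1-dev licence has been accepted on Hugging Face.
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
)
pipe.enable_model_cpu_offload()  # helps on GPUs with limited VRAM

# Paste the full prompt from above here; shortened for readability.
prompt = "Cinematic composition, Digital art, a young woman with long silver-white hair, surrounded by a white Japanese dragon"

image = pipe(
    prompt,
    num_inference_steps=40,                                   # Steps: 40
    guidance_scale=3.5,                                       # Distilled CFG scale: 3.5
    height=1152,
    width=896,                                                # Size: 896x1152
    generator=torch.Generator("cpu").manual_seed(739566982),  # Seed: 739566982
).images[0]

image.save("flux_forge_style.png")
```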
 
Conclusion

Congratulations! You've learned how to use Forge and Flux to create stunning AI-driven images and animations for free. With these tools, the possibilities for creativity are endless. Don't hesitate to experiment with new features and explore more advanced techniques.

 

Resource links:

  • https://blackforestlabs.ai/

Thank you for joining me, and happy image creation!

Promptmuse X

#fluxai #forge #forgeui #fluxanimation #flux


How To Install Flux On ComfyUI

Installing Flux AI on ComfyUI

Introduction

Welcome to this written tutorial on how to install Flux on ComfyUI. The accompanying video can be found here:

What is Flux?

Flux, released by Black Forest Labs in mid-2024, quickly gained native support in ComfyUI, where it significantly enhanced image generation capabilities. It is now also available in web UIs such as Automatic1111 and Forge.

The Flux AI model excels in prompt adherence, producing high-quality images with accurate anatomy and demonstrating strong capabilities in generating text.

 

Flux was developed by Black Forest Labs, and the model is freely available to use locally; however, its license terms and conditions have to be agreed to.

It’s important to use it responsibly and within the bounds of ethical guidelines.

Resource links:

  • https://blackforestlabs.ai/
  • https://github.com/comfyanonymous/ComfyUI_examples/tree/master/flux

 

 

Hardware Requirements:

FP8
  • Overview: Optimized for low VRAM, best for efficient development
  • Visual Quality: Good, performs slower than the Schnell version
  • Image Detail: Good
  • VRAM Requirement: 8-12 GB+ (some users report running it on 8 GB, but only at a push)
  • Model Size: 17.2 GB

Schnell
  • Overview: Fastest version for lower-end GPUs, optimized for speed
  • Visual Quality: Compromises on visual quality for speed, less detail
  • Image Detail: Good
  • VRAM Requirement: 8-12 GB+
  • Model Size: 17.2 GB

Dev
  • Overview: Great for quality and efficiency
  • Visual Quality: Better details, more prompt adherence, high quality
  • Image Detail: High
  • VRAM Requirement: 12 GB+
  • Model Size: 23.8 GB
For all versions, make sure you have an updated ComfyUI: simply go to the ComfyUI Manager, click Manager > Update All, and allow the update process to complete.

1. FP8

Faster, optimized version for users with more limited GPU resources

Download Flux FP8

Save the flux1-dev-fp8.safetensors file into the ComfyUI\models\checkpoints folder on your PC.

Load up ComfyUI and Update via the ComfyUI Manager. Update ALL

Download the simple Flux workflow below, then drag and drop the JSON file into your ComfyUI window. Alternatively, load it in via the manager, then Update All.

2. Schnell

The fastest version, optimized for speed on lower-end GPUs.

Download the Schnell model here and put it into ComfyUI > models > unet.

Download the VAE here and put it into ComfyUI > models > vae.

Download the CLIP models clip_l.safetensors and t5xxl_fp8_e4m3fn.safetensors and put them into ComfyUI > models > clip.

Load up ComfyUI and Update via the ComfyUI Manager. Update ALL
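File placement is the most common stumbling block, so here is a small optional helper that checks whether the Schnell files listed above ended up in the expected ComfyUI folders. The install path and exact file names are assumptions based on the steps above; edit them to match what you actually downloaded.

```python
# Optional helper: verify the Flux (Schnell) files are in the expected ComfyUI folders.
# COMFYUI_DIR and the file names below are assumptions - edit them to match your setup.
from pathlib import Path

COMFYUI_DIR = Path(r"C:\ComfyUI")  # change to your ComfyUI install location

expected_files = {
    "models/unet": ["flux1-schnell.safetensors"],                              # Schnell model
    "models/vae": ["ae.safetensors"],                                          # Flux VAE
    "models/clip": ["clip_l.safetensors", "t5xxl_fp8_e4m3fn.safetensors"],     # text encoders
}

for subfolder, files in expected_files.items():
    for name in files:
        path = COMFYUI_DIR / subfolder / name
        status = "OK     " if path.exists() else "MISSING"
        print(f"{status} {path}")
```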

3. Dev

 

For development and personal use; capable of producing high-quality images, but requires more powerful hardware than the Schnell version.

Download the .safetensors model here and put it into ComfyUI > models > unet.

Download the following two CLIP models, and put them in ComfyUI > models > clip.

(https://huggingface.co/comfyanonymous/flux_text_encoders/tree/main)

  • Download the ae.safetensors VAE file and put it in ComfyUI > models > vae.

 

Download the FLUX Dev model:

    Go to the Flux dev model Hugging Face page. Sign up (it's free) or log in, and accept the terms and conditions.

    Click on the Files and versions tab and download the flux-dev.safetensors file (see image below).
A screenshot of an image modification software interface, ComfyUI, shows nodes connected to the “Save Image” module with a cat photo at the bottom. A mouse cursor points to the "Load" button.
Load in Json Workflow
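Dragging the JSON into the ComfyUI window is all you need for this tutorial, but workflows saved in API format (enable the dev mode options in the settings, then use Save (API Format)) can also be queued against the local ComfyUI server. Below is a minimal sketch, assuming ComfyUI is running on its default address of 127.0.0.1:8188 and that you exported a file called workflow_api.json.

```python
# Minimal sketch: queue a Flux workflow against a locally running ComfyUI server.
# Assumes ComfyUI is listening on the default 127.0.0.1:8188 and that
# workflow_api.json was exported via "Save (API Format)" with dev mode enabled.
import json
import urllib.request

with open("workflow_api.json", "r", encoding="utf-8") as f:
    workflow = json.load(f)

payload = json.dumps({"prompt": workflow}).encode("utf-8")
request = urllib.request.Request(
    "http://127.0.0.1:8188/prompt",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    print(response.read().decode("utf-8"))  # the reply includes the queued prompt_id
```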

 

Resource links:

  • https://blackforestlabs.ai/
  • https://github.com/comfyanonymous/ComfyUI_examples/tree/master/flux

Thank you for joining me, and happy image creation!

Promptmuse X

#fluxai #fluxcomfyui #comfyui


How I Made A K-Pop Singer: Full Process with phone mocap!

How to create a 3D character with Move.ai and Suno
Character Creator & Blender Pipeline

Introduction

Welcome to this written tutorial on creating 3D characters with Character Creator, Blender and Unreal Engine. Please see the full video here.

In this tutorial, I'll guide you through the entire process of creating 3D characters using Character Creator and the free CC Blender Pipeline plugin, from character creation to importing into Blender and finally into Unreal Engine. We'll cover every step in detail and try out some new plugins! Please note that Reallusion currently has a competition with cash prizes; check it out here!

What You’ll Learn

  • How to quickly put together a 3D character using Character Creator and the Blender pipeline
  • How to integrate the plugin into your workflow, bring everything back into Character Creator, and add mocap

Prerequisites

  1. Character Creator: Used to create and export your character (currently 40% off with a 30-day trial).
  2. Unreal Engine: Ensure you have it installed.
  3. CC Rig Plugin: Available on the Epic Store Marketplace.
  4. MetaHuman Plugin: Install from the Marketplace.
  5. Unreal Auto Setup: Download from the Reallusion website.
  6. Live Link Face App: Free via your phone's app store.
  7. Ensuring All Plugins Are Active In Unreal Engine:

     To ensure a smooth workflow, you need to make sure all necessary plugins are active in Unreal Engine. Follow these steps:

     1. Activate Required Plugins:
       • In the Plugins window, use the search bar to find each of the required plugins:
         • Quixel Bridge
         • CC Rig Plugin
         • MetaHuman Plugin
         • Unreal Auto Setup
         • Live Link
         • Apple ARKit (for ARFaceKit functionality)
       • Make sure each of these plugins is enabled by checking the box next to their names.
       • Click Restart Now if prompted to restart Unreal Engine after enabling plugins.

Step 1: Creating Your 3D Character

We start with Character Creator to design and customize our 3D character. This powerful tool allows you to sculpt detailed and lifelike characters with ease. Using sliders, you can easily adjust facial features, body proportions, and clothing to match your desired look.

A computer screen displays a 3D character modeling application with a female K-Pop singer avatar in the center. A small inset shows a woman speaking and gesturing towards the screen, explaining the full process of using phone mocap for realistic movement.
Designing and customizing the 3D character in Character Creator

Importing the Character into Blender

  1. Once our character is ready, we'll use the CC Blender Pipeline plugin (a free plugin) to export the character and import it into Blender.

  2. This plugin simplifies the process, ensuring that all character details are preserved. In Blender, we'll begin sculpting and using additional plugins to paint details and create stylized hair quickly.

Screenshot of the iClone 7 software interface with a character model displayed on the right side. A dropdown menu under "ZBrush Pipeline" shows options including "Export Character to Blender." The interface supports phone mocap and even features a preset for creating a K-Pop Singer.
Exporting the Character Using CC Blender Pipeline
A 3D model of a face is displayed on a computer screen with Blender software. In the bottom right corner, a person, possibly a K-Pop singer, is visible, perhaps demonstrating or explaining the full process.
Using the sculpting tools in Blender on the CC character

Step 2: Creating a Stylized 3D K-pop Outfit

If you would like access to a premade marketplace of clothes, the Reallusion Content Store has many assets that can be downloaded directly into your CC program. This makes it easy to get quality clothing onto your character. Click here to enter the Content Store.

But if you would prefer to make your own clothes, this comes with an additional learning curve and the expense of a third-party product. I do get asked regularly about creating clothes from scratch, so I thought I'd try out a product for you called Style3D Atelier, which we will use to design a bespoke K-pop style outfit for our character.

 
A computer screen displays a website offering digital characters for download, showcasing diverse 3D models in various outfits. The site features a small video chat window at the bottom right where a woman appears, demonstrating the full process of phone mocap to animate a K-Pop Singer.
Character Creator Content Store for all your clothing needs
A computer screen displays character modeling software with a 3D character model in the center. A video feed of a K-Pop singer giving a full process tutorial is in the bottom right corner.
Final Outfit
Step 3: Adding Mocap with Move.ai One

Once our character is complete, we'll add motion capture (mocap) using Move.ai One.

This tool claims to produce mocap using just one camera. Although it may not be as precise as mocap suits or the multi-cam option (Move.ai Pro), I'm curious to see its capabilities.

Setting up Move.ai was straightforward: I simply downloaded the app on my iPhone and recorded using the free trial, capturing 10-second intervals and uploading the finished .fbx files to my PC. My first impressions were good. I knew it would be a wobbly animation that would require mocap clean-up, but it was good enough for my little example.

A K-Pop singer standing with arms outstretched in a living room appears on the main screen of a smartphone. Inset image shows the same singer talking. "Stop recording" button is visible, capturing the full process with phone mocap technology.
Using Move.ai to create Mocap

 Step 4:

Cleaning Up Mocap Data

Of course, some clean-up is necessary, especially when you are recording mocap from a single iPhone! I recommend using iClone for this, which will help you clean up the mocap fast. If you're on a budget, Blender is a great alternative; however, it may require a bit more of a learning curve and might not be as intuitive as iClone.

I imported my 3D character into iClone and used a free retargeting file to streamline the process. It was easy and enjoyable!

I repeated the clean-up steps for each mocap import from my phone until I had a good selection of dance moves, which I saved in my iClone library for future use.

A 3D modeling software interface displaying a female character model in a dance pose reminiscent of a K-Pop singer. The left panel lists animation options, while the right panel shows character settings and modifications, seamlessly integrating phone mocap for capturing the full process.
Cleaning up Mocap in Iclone

Step 5: 

Exporting To Unreal Engine

At this point, you can export your character to Blender using the CC Pipeline. However, I opted to take my character into Unreal Engine so I could use the iPhone to record my facial animations. (You can do this in iClone with AccuFace.)

I simply exported my character without any animations as an FBX file, and then exported each of my animations as an FBX.

In Unreal Engine, I imported my skeletal mesh and animation sequences into the content drawer, added them to a new level sequence, and converted my character to a CC Control Rig to enable facial animation.

A computer screen displaying character modeling software with a 3D character model of a K-Pop singer in the center and various customization options on the side panels.
Cleaning Up Mocap In Iclone
A screen displays the FBX Import Options panel in 3D software, showing animation import settings and file selection. The environment includes UI elements, a sky background, and folders at the bottom, perfect for a K-Pop singer using phone mocap to capture their moves through the full process.
Importing skeleton and animations into UE

Step 6: Recording Animation

  1. To record in Unreal Engine you need to set up a Level Sequence; think of this as your timeline to add animation to. Below are the steps to set up your Level Sequencer and record your facial animation via your iPhone to the sequencer. I converted my imported skeletal mesh to a CC Control Rig, a fantastic free plugin which you can grab here. The plugin allows me to adjust my animation further in Unreal Engine and to copy and paste animation onto the facial controls from my MetaHuman.

  2. Recording Singing with MetaHuman and Live Link

     To keep things organized and efficient, I opened a separate project for this step. I imported a MetaHuman into my scene, used Live Link and the Live Link Face app on my phone to record singing, and exported the animation as an .FBX file. Finally, I imported this into my original project and applied it to my character's facial control rig.

A computer screen displays video editing software with an animated K-Pop singer in the center. The character, with red hair and a colorful outfit, is being edited using phone mocap technology. The editing timeline and tools are visible at the bottom, showcasing the full process of animation.
Putting everything together in Unreal Engine

DOWNLOAD LINKS

Promptmuse X
  •  #CharacterCreator #iClone #RiggedCharacter #UnrealEngine #UE #Controlrig


Mocap with Custom Characters

A woman on a video call gestures to a large cartoon minotaur on a blue studio interface. An arrow points from her to the minotaur, indicating a Mocap transformation or connection with Custom Characters.
Epic Games Face App mocap for custom characters

Introduction

Welcome to this written tutorial on how to animate custom characters in Unreal Engine using the Live Link Face app. This guide will show you how to easily transfer facial animations from your iPhone to your custom characters, including setting up body animations. Full video here

What You’ll Learn

  • How to set up and use the Live Link Face app with Unreal Engine
  • How to import and animate characters from Character Creator
  • How to add body animations to your characters

Prerequisites

  1. Character Creator: Used to create and export your character (currently 40% off with a 30-day trial).
  2. Unreal Engine: Ensure you have it installed.
  3. Quixel Bridge: Download and install from the Epic Store Marketplace.
  4. CC Rig Plugin: Available on the Epic Store Marketplace.
  5. MetaHuman Plugin: Install from the Marketplace.
  6. Unreal Auto Setup: Download from the Reallusion website.
  7. Live Link Face App: Free via your phone's app store.
  8. Ensuring All Plugins Are Active In Unreal Engine:

     To ensure a smooth workflow, you need to make sure all necessary plugins are active in Unreal Engine. Follow these steps:

     1. Activate Required Plugins:
       • In the Plugins window, use the search bar to find each of the required plugins:
         • Quixel Bridge
         • CC Rig Plugin
         • MetaHuman Plugin
         • Unreal Auto Setup
         • Live Link
         • Apple ARKit (for ARFaceKit functionality)
       • Make sure each of these plugins is enabled by checking the box next to their names.
       • Click Restart Now if prompted to restart Unreal Engine after enabling plugins.

Step 1: Create an Unreal Project 

(Installing the Auto Setup as a bridge from Character Creator to Unreal Engine)

  1. Launch Unreal Engine and create a new blank project.
  2. Name the project (e.g., “AutoTutorial”) and create it.
  3. Close the project to install the necessary files.
Screenshot of setting up a mocap project in Unreal Engine for Metahuman character animation.
Initial setup of the mocap project in Unreal Engine, preparing to animate Metahuman characters

Step 2: Install Unreal Auto Setup

  1. Download the Auto Setup from the Reallusion website and unzip it.
  2. Run the setup executable file.
  3. Copy the contents and plugin folders from the downloaded files to your Unreal Engine project folder (e.g., Documents > Unreal Engine > AutoTutorial).
  4. Replace the files when prompted.
 
Screenshot of downloading the AutoSetup plugin from Reallusion for Unreal Engine.
Downloading the AutoSetup plugin from Reallusion for seamless integration with Unreal Engine
Step 3: Import Your Custom Character
  1. Open Character Creator and select your character. (Has to be a CC3 character)
  2. Export as FBX with Unreal Engine as the target preset. File>Export>FBX>Clothed Character
  3. Import the character into your Unreal Engine project, ensuring CC Control Rig is installed.
Screenshot of exporting a character from Character Creator to create a control rig in Unreal Engine
Exporting a character from Character Creator to create a control rig in Unreal Engine

 Step 4: Setup MetaHuman

We are now going to use a MetaHuman as a dummy to record our facial animations onto.

  1. Import a MetaHuman character via Quixel Bridge and add it to your Unreal project.
  2. Set up Live Link Face App on your iPhone and ensure it is connected to your computer.

Step 5: Connect MetaHuman to Live Link

  1. In Unreal Engine, select your MetaHuman character.
  2. Set up the Live Link connection in the details panel and ensure facial tracking is active.

Step 6: Recording Animation

  1. To record in Unreal Engine you need to set up a Level Sequence; think of this as your timeline to add animation to. Below are the steps to set up your Level Sequencer and record your facial animation via your iPhone to the sequencer:

Create a New Level Sequence:
In the Content Browser, right-click and go to Animation > Level Sequence.
Name your sequence and double-click to open it in the Sequencer.

 

Add Your Metahuman to the Sequence:
In the Sequencer window, click the + Track button.
Select Actor to Sequencer, then choose your Metahuman character from the list.

 

Start Recording:
Click the Record button in the Sequencer toolbar (the red button at the left of the screen). A countdown will begin.
When the countdown ends, ensure your iPhone running the Live Link Face app (ARKit) is capturing your facial movements, and perform the desired expressions and movements.

Stop Recording:
Once you’ve finished the performance, click the Stop button in the Sequencer toolbar.
The recorded animation will appear as keyframes in the Sequencer timeline.

 

Review and Edit the Animation:
Scrub through the timeline to review the recorded animation.
You can adjust keyframes, refine movements, and blend animations as needed for a polished result.

 

Save Your Work:
Always save your Level Sequence and project to avoid losing any progress.
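If you record a lot of takes, the Level Sequence setup above can also be scripted with Unreal's built-in Python support. The sketch below is an optional extra, not part of the original workflow: it assumes the Python Editor Script Plugin is enabled, and the /Game/Sequences path and asset name are placeholders.

```python
# Rough sketch (Unreal Editor Python): create a Level Sequence and add the
# currently selected actor to it, mirroring the manual steps above.
# Requires the Python Editor Script Plugin; path and names are placeholders.
import unreal

asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
sequence = asset_tools.create_asset(
    asset_name="FaceCaptureSequence",
    package_path="/Game/Sequences",
    asset_class=unreal.LevelSequence,
    factory=unreal.LevelSequenceFactoryNew(),
)

# Add whichever actor is currently selected in the level (e.g. your MetaHuman).
selected = unreal.EditorLevelLibrary.get_selected_level_actors()
if selected:
    binding = sequence.add_possessable(selected[0])
    print("Added binding:", binding.get_display_name())

unreal.EditorAssetLibrary.save_loaded_asset(sequence)
```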

Step 7: Baking The Key Frames

  1. After stopping the recording, select the recorded track in the Sequencer.
  2. Right-click on the track and choose Bake To Control Rig > Face_ControlBoard_CtrlRig. This process converts the Live Link data into keyframes, which we can now copy and paste onto our custom character.
Screenshot of baking animation keys in Unreal Engine for a Metahuman character.
Baking live link animation data into keyframes for a Metahuman character in Unreal Engine

Step 8: Copying The Keyframes To Your Custom Character

Select the baked keyframes in the Sequencer for your Metahuman character.
Right-click and choose Copy.
Add your Character Creator (CC) character to the Sequencer by clicking the + Track button and selecting your CC character.
Navigate to the appropriate track on your CC character where you want to paste the keyframes.
Right-click on the track and choose Paste to apply the baked keyframes to your CC character.

 


Conclusion

That’s it for this tutorial on using the Live Link Face app to animate custom characters in Unreal Engine. If you have any questions or want to share your creations, feel free to tag me on social media @PromptMuse

DOWNLOAD LINKS

Character Creator Workflow For Fast 3D Animation Production

The world of 3D character design is evolving rapidly, and two software giants, Character Creator and ZBrush, are at the forefront of this revolution. These tools are not just simplifying the character creation process but are also ensuring that characters are ready for animation and production, saving both time and resources.

Character Creator stands out for its ability to quickly generate base models that are ripe for customization. With its intuitive interface and versatile morph sliders, creators can easily shape the basic anatomy of their characters. This software shines in its ability to offer a base that includes rigs and morph targets, essential for both facial and body animation.

A character creator workflow for fast production using base meshes

Once the basic form is created in Character Creator, ZBrush comes into play. Known for its robust sculpting tools, ZBrush allows artists to add intricate details and a unique personality to their characters. Its brushes are perfect for crafting stylized aesthetics, moving away from realism and embracing a more exaggerated, cartoon-like appearance.

A fast production screen shot of a 3D model being transferred to Zbrush

One of the most significant advantages of using Character Creator and ZBrush is their seamless integration. With just a click, characters can be transferred between the two programs without losing any detail. This integration is a game-changer, ensuring that the creative flow isn’t interrupted.

Characters created with these tools are not just visually appealing but are also production-ready. They come with clean topology, rigs, and weights, making them perfect for animation projects. This readiness significantly cuts down the time from concept to production, a crucial factor in fast-paced project environments.

Easy Facial Edit tools in Character Creator create seamless workflow for character production

For those who use Blender for compositing, the good news is these characters are fully compatible. With the help of a simple plugin, characters can be imported into Blender, retaining all their rigging and morphing qualities. This flexibility opens up avenues for creators who operate in different software environments.

The combination of Character Creator and ZBrush is a testament to how technology is simplifying yet enhancing the art of 3D character creation. By reducing the technical barriers, these tools allow artists to focus more on the creative aspect of character design. As we continue to see advancements in these tools, the future of character creation looks more exciting than ever.

How to Animate Game Characters and Import Them into Blender and Unreal Engine 5

In this tutorial, I will guide you through the process of animating your game characters and importing them into Blender and Unreal Engine 5. This tutorial is designed for those who don't have a budget for expensive animation software or motion capture suits. The full tutorial video of this process can be found on my YouTube channel here.

We will be using a software called “Cascadeur,” which has been around for about 10 years and offers a free version with some powerful animation tools. While this method is not a replacement for professional animation software or mocap, it’s a viable alternative if you’re on a tight budget.

Note: Before you start, make sure you have Cascadeur and Accurig installed on your computer. You can download Cascadeur from the official website, and Accurig is a free auto-rigging tool that complements Cascadeur.

Video: https://www.youtube.com/embed/ScQTV2Xb–0?si=_4-LUd5vW3w7Nz64

Let’s get started!

Part 1: Rigging Your Character in Accurig

  1. Open Accurig and click on “Choose File” to select your 3D character’s FBX file. You can use a sample character from Cascadeur, Mixamo, Sketchfab, or your own custom character.
  2. After loading your character’s mesh, click on “Rig Body” to generate the joint structure for your character’s skeleton.
  3. Accurig will display circles where joints should be placed. Ensure symmetry is checked to work on one side of the character.
  4. Position the joint guides according to your character’s anatomy, following the on-screen guides for reference.
  5. Use the tools in the bottom left corner to rotate and move around your character for precise joint placement.
  6. Repeat the process for other body parts, such as arms and legs, ensuring correct joint placement.
  7. Use the “Preview Motion” window to check the animation on various body parts, including fingers.
  8. Ensure your character is in a neutral pose (A-pose or T-pose) before exporting.
  9. Click “Upload to AccuRig” and then “Export” > “Export FBX.” Set the target application to “Maya” and check “Embed Texture.” Click “Export” to save the rig.
  10. Export another FBX file of your character’s base mesh but set the target application to “Blender” for later use.
 Accurig Auto Rigger Tool
Accurig Auto Rigger Tool by Reallusion

Part 2: Creating a Basic Idle Animation in Cascadeur

  1. Open Cascadeur and start a new scene. Import the FBX file with Maya settings that you exported from Accurig.
  2. Cascadeur will ask if you want to enter “Rig Mode.” Click “Yes.”
  3. In the “Rig Mode Helper” dialog, click “Yes” and then “OK” on the next dialog.
  4. Click “Add Rig Elements” at the bottom of the “Quick Rigging Tool” dialog.
  5. Rotate your character by holding ALT and the left mouse button to navigate.
  6. Select the “Auto Pose” tool to enable automatic control point positioning as you move your character.
  7. Position your character into an initial pose for your idle animation by moving and rotating control points. Use ‘W’ to move and ‘E’ to rotate.
  8. Add a keyframe at frame 10 by clicking the key icon.
  9. Change the hand pose on frame 10 to create a hand open/close animation.
  10. Duplicate the first frame to frame 20 and mirror the pose for variety.
  11. Duplicate the second keyframe to frame 35 and mirror it to frame 45.
  12. Extend the timeline to add more frames for smoother animation (e.g., 200 frames in total).
  13. Copy the first frame to frame 145 and the second keyframe to frame 110.
  14. Apply bezier curve interpolation for smoother animation between keyframes.
  15. Review and refine your animation by adding subtle movements, such as chest and shoulder motion.
  16. Create a seamless loop by ensuring the first and last frames are identical and adjust frame numbers accordingly.
Cascadeur Tutorial

Part 3: Exporting the Animation to Blender

  1. Export the animation from Cascadeur to Blender by going to “File” > “Export” > “FBX.” Name the file and click “Save.”
  2. In Blender, import the animation by going to “File” > “Import” > “FBX.” Use the default settings and click “Import FBX.”
  3. Delete any existing objects in the Blender scene and select the imported Armature.
  4. Adjust the Armature’s rotation to face the front and place it in the scene.
  5. Create an animation track for the imported animation and rename it.
  6. Copy the animation keyframes from the imported Armature and paste them onto your character’s Armature.
  7. Delete the imported Armature to keep your scene clean.
  8. Create an animation loop for your idle animation in Blender using the NLA (Non-Linear Animation) Editor.
Blender_Animation_Import
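Steps 2 to 7 can also be driven from Blender's Scripting workspace if you want the import to be repeatable. The sketch below is a rough outline rather than the exact method used in the video: the file path and the "Armature" object name are assumptions, and simply reassigning the action like this only works because both rigs come from the same AccuRig skeleton, so the bone names match.

```python
# Minimal sketch (Blender Python): import the Cascadeur FBX and assign its action
# to an existing character armature. File and object names are assumptions.
import bpy

before = set(bpy.data.actions)
bpy.ops.import_scene.fbx(filepath=r"C:\exports\idle.fbx")  # adjust to your export path
imported_actions = [a for a in bpy.data.actions if a not in before]

target = bpy.data.objects["Armature"]  # your character's armature object
if imported_actions:
    if target.animation_data is None:
        target.animation_data_create()
    target.animation_data.action = imported_actions[0]
    print(f"Assigned action '{imported_actions[0].name}' to {target.name}")
```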

Part 4: Exporting the Animation to Unreal Engine 5

  1. In Unreal Engine 5, create a new project and organize your project folder.
  2. Import your character’s 3D mesh into Unreal Engine by right-clicking in the “Content” folder, selecting “Import,” and choosing your FBX file. Ensure it’s assigned to the correct skeleton.
  3. Add a Level Sequence to your project by right-clicking in the “Content” folder and selecting “Level Sequence.”
  4. Drag your character’s skeletal mesh into the Level Sequence.
  5. Add your idle animation to the Level Sequence by clicking the plus icon and selecting the animation.
  6. Adjust the timeline as needed and press the spacebar to preview your animation.
  7. Extend the timeline and blend your idle and walk animations for a seamless transition.
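If you find yourself re-importing the mesh and animations often, the same import step can be scripted with Unreal's Python API (again requiring the Python Editor Script Plugin). This is a rough sketch, not the method shown in the video; the file path and the /Game/Characters destination are placeholders.

```python
# Rough sketch (Unreal Editor Python): import a character FBX into the project,
# mirroring the manual Import step above. Paths are placeholders - adjust them.
import unreal

task = unreal.AssetImportTask()
task.filename = r"C:\exports\my_character.fbx"   # exported from Cascadeur/Accurig
task.destination_path = "/Game/Characters"
task.automated = True
task.save = True

options = unreal.FbxImportUI()
options.import_mesh = True
options.import_as_skeletal = True     # character comes in as a Skeletal Mesh
options.import_animations = True      # bring the idle animation in with it
task.options = options

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
print("Imported:", list(task.imported_object_paths))
```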

Part 5: Adding Free Mocap Data

  1. Visit the ActorCore website and explore the free motion resources.
  2. Download free motion data compatible with Cascadeur.
  3. Import the downloaded motion data into Cascadeur, and apply it to your character as needed.
  4. Refine and customize the imported motion data to suit your character and animation needs.

This tutorial should help you get started with animating and importing your game characters into Blender and Unreal Engine 5 using Cascadeur. Feel free to explore further features and animation possibilities in Cascadeur to enhance your character animations.

Remember, practice makes perfect, and with time, you’ll be creating stunning animations for your game characters. Enjoy animating!

I Turned Myself Into A 3D Game Character – Tutorial

Introduction

In this tutorial, I will show you how to use the Headshot 2 plug-in for Character Creator to create a 3D character from a scanned head mesh. This new plugin from Reallusion uses AI technology to turn any mesh into a retopologized 3D character, with UVs, a rigged mesh and blend shapes for facial animation. There is a full-length video of the tutorial here.

What you will need

Polycam
Polycam LiDAR head scan with iPhone

Step 1: Import the head mesh into Character Creator

  1. Open Character Creator software.
  2. Go to File > Import > .obj.
  3. Select the head .obj mesh file that you want to import.
Import obj into character creator
Character Creator Import obj mesh

Step 2: Headshot 2 Plugin 

  1. Click on the Headshot 2 plug-in in the top toolbar. You need to install this beforehand from the Reallusion website here.
  2. The Headshot 2 dialog box will open.
  3. Click on the Align Points button.
  4. The starter pins will be automatically placed on the head mesh.
  5. Move the pins so that they are aligned with the corresponding points on the Character Creator mesh. Delete pins by Ctrl & click the pin you want to remove.
    Headshot 2 Plugin: Aligning Points

Step 3: Generate the head mesh

  1. Click on the Head Gen button from the horizontal toolbar.
  2. A preview of the generated head mesh will be displayed. You can select the area of the mesh you wish to be generated onto the CC3+ model.
  3. If you are happy with the preview, click on the Generate button, otherwise you can go back and adjust your points.

Step 4: Refine the head mesh

  1. Use the brushes in the panel to refine the head mesh.
  2. You can smooth the mesh, move the mesh, or project specific areas.
Refine the head mesh with brushes

Step 5: Attach the head mesh to a body

  1. Click on the Attach to Body button in the Headshot 2 dialog box.
  2. Select the body type that you want to attach the head mesh to.
  3. Click on the Attach button.

Step 6: Add skin textures and materials

  1. Open the contents dialog.

  2. Use the presets in the SkinGen plugin to add realism to the skin texture, such as pores, muscle definition and freckles.

  3. You can also add makeup and decals to the character.

  4. In the Modify panel, go to the Morphs options and adjust the depth of the eyeballs and height of the teeth using the sliders.

SkinGen plugin for realistic skin presets

Step 7: Add hair

  1. I used the hair plugin (Hair Builder) and additional hair assets from the Reallusion marketplace.

Conclusion

This is just a basic tutorial on how to use the Headshot 2 plug-in for Character Creator. There are many more things that you can do with this plug-in, so I encourage you to experiment and explore.

I hope this tutorial is helpful. Please let me know if you have any questions.
