Replies: 2 comments
-
I'm very curious how you trained this too, as it's leagues better than most other ControlNets for SDXL; any information about the training would be much appreciated.
-
This is truly fantastic work! You have indeed achieved a grand unification of ControlNet for any type of line-art input. I have personally tested it and found that the support for lineart, softedge, and canny is excellent, better than other trained ControlNets I've tried, and the consistency of the lines is truly great.

I am very curious how you specifically achieved this level of generalization. Additionally, what order of magnitude of data did you use, and what does the data distribution look like? I would like to try applying the same approach to other types of controls.
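
For reference, this is roughly how I exercised the canny case (a minimal sketch using the diffusers library; "path/to/the-controlnet" is a placeholder for the actual checkpoint id, and the prompt, scale, and step count are just values I happened to use, not anything from this repo):

```python
import cv2
import numpy as np
import torch
from PIL import Image
from diffusers import ControlNetModel, StableDiffusionXLControlNetPipeline

# Load the (placeholder) unified line-art ControlNet and an SDXL base pipeline.
controlnet = ControlNetModel.from_pretrained(
    "path/to/the-controlnet", torch_dtype=torch.float16
)
pipe = StableDiffusionXLControlNetPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

# Canny edges as one example of a line-art style condition image;
# lineart or softedge preprocessors can be swapped in the same way.
source = np.array(Image.open("input.png").convert("RGB"))
edges = cv2.Canny(source, 100, 200)
control_image = Image.fromarray(np.stack([edges] * 3, axis=-1))

result = pipe(
    "a detailed illustration",
    image=control_image,
    controlnet_conditioning_scale=0.8,
    num_inference_steps=30,
).images[0]
result.save("output.png")
```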