No.106477
Hmm, I honestly have no idea. All I'd want to say is that SD may need some sort of queue mechanism to stop it from crashing when two people use it at the same time, and you might want to limit certain parameters even prior to that.
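Something minimal along these lines would probably do it (just a sketch, not tied to any particular SD frontend; generate() here is a stand-in for whatever actually calls the pipeline):
```python
import asyncio

# A single worker drains a queue, so only one generation ever runs on the GPU at a time,
# and user-supplied parameters are clamped before they reach it.
job_queue: asyncio.Queue = asyncio.Queue()
MAX_STEPS = 40
MAX_SIDE = 768

async def worker(pipe):
    # start once at startup with asyncio.create_task(worker(pipe))
    while True:
        prompt, params, reply = await job_queue.get()
        params["steps"] = min(params.get("steps", 28), MAX_STEPS)
        params["width"] = min(params.get("width", 512), MAX_SIDE)
        params["height"] = min(params.get("height", 512), MAX_SIDE)
        try:
            # generate() is hypothetical: whatever wraps the actual SD call
            image = await asyncio.to_thread(generate, pipe, prompt, params)
            reply.set_result(image)
        except Exception as e:
            reply.set_exception(e)
        finally:
            job_queue.task_done()

async def submit(prompt, params):
    reply = asyncio.get_running_loop().create_future()
    await job_queue.put((prompt, params, reply))
    return await reply  # two users at once simply wait in line instead of crashing it
```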
No.106478
A game server shouldn't cost more than an old computer to run.
For SD it depends on how fast you want images to be generated. And sometimes you can get more computing power per buck by using multiple GPUs, which pairs nicely with a queue system.
No.106479
>>106477 >>106478
A queue sounds nice, but I'm not really sure how you'd set one up...
Also an issue we've considered is who'd take care of the server. I probably have the best space to put it, but I don't know anything about managing a server and would probably prefer vermin be able to fix whatever issues may arise.
No.106484
>>106477Stable Diffusion already works off of a queue, so that shouldn't be an issue.
>>106479
>I don't know anything about managing a server and would probably prefer vermin be able to fix whatever issues may arise.
Depending on the level of complexity we're talking about, some type of out-of-band management would be ideal for this. Server motherboards and some workstations have "IPMI", which allows managing a computer and viewing its status across a network. It also allows advanced features such as turning on a computer and even accessing the BIOS without being physically in the same location. The more "enthusiast", DIY solution would be a PiKVM: essentially the same thing as an IPMI motherboard, but based on a Raspberry Pi that you plug into the motherboard headers (for power on, power off, and reset), a USB port, and an HDMI port.
Now, depending on how complex the setup we're talking about is, and how much performance (and cost) we would want, there are a few different routes to consider:
1. Server-level hardware - reaching into the Enterprise realm, likely making use of a lot of old server equipment.
2. Prosumer - High end consumer hardware.
3. Minimum viable product - a few generations out of date to cost-optimize.
So... We should ask ourselves what our requirements are.
1. For game servers, are we just hosting a single server at a time and shutting it down once activity falls off, or are we keeping them online indefinitely?
2. For Stable Diffusion, what sort of performance do we want? Is 4090 performance necessary (2 seconds to generate a 512x512), or is 3060-level performance okay (~30 seconds)?
If the answer to the first question is, "just one server at a time," then we could go for a more budget-oriented build. Otherwise, we would likely want a CPU that has a high core count to support multiple servers at once without bottle-necking. The second question kind of answers itself.
The third and final question is related to storage:
1. How much data do we actually need?
2. How fast do we realistically need our storage to be?
3. How valuable is our data? Do we care if the server's hard drive dies?
With a bunch of models, I'm currently using around 100GB for Stable Diffusion. A moderately sized Minecraft world can grow to a few gigabytes.
For storage speed, remember that we are limited by the network connection. A 200MB/s 3.5" HDD is generally going to be just as fast as a 5000MB/s PCIe 4.0 x4 NVMe SSD if our network connection is 200MB/s (1 gigabit per second is roughly 125MB/s).
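As a quick sanity check (using just the illustrative numbers above):
```python
# Back-of-envelope: the slowest link in the chain wins.
network_mb_s = 1_000 / 8  # 1 Gb/s line is roughly 125 MB/s
drives = {"3.5in HDD": 200, "PCIe 4.0 x4 NVMe SSD": 5000}

for name, speed_mb_s in drives.items():
    effective = min(speed_mb_s, network_mb_s)
    print(f"{name}: {effective:.0f} MB/s as seen over the network")
# Both come out to ~125 MB/s: a remote user can't tell the two drives apart.
```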
If our data is very valuable and we cannot tolerate a drive dying, we likely want to consider some form of RAID. How much storage we need and how much redundancy we want will determine which RAID level to go for. Bear in mind that the RAID level will also affect read speed. Otherwise, if we don't really care about drive deaths, we could use any old drive.
Finally:
1. What sort of operating system would we plan on using?
2. Would this server stay as-is once it's built, or would we want future expandability?
For a more complex setup, it may be beneficial to run a hypervisor such as Proxmox and then run virtual machines for whatever we need to do. For a more basic solution, something like Windows 10 or 11 would likely work fine, however.
If we want future expandability, we should consider what that might mean: if NAS functionality is desirable, then a case that can fit a lot of hard drives like a Fractal Define R5 (8 3.5" HDD capacity) might work well, but it likely wouldn't fit a 4090. Something like a Fractal Meshify 2 XL or Define 7 XL (16 3.5" HDDs and very spacious) might be worth considering.
No.106504
>>106484I think for performance/cost I want to be somewhere between prosumer and enterprise. Although, I'm not really sure how much better enterprise would be once the next generation of GPUs is out, which is set to outclass most everything on the market, so I'd probably stick with prosumer for now to reduce how much we need to spend on stuff that may just get upgraded.
In terms of game servers I think we'll just host one game at a time; it's not like we're going to be running 2 /qa/ games at once. For SD, I think we want a fairly fast setup to reduce headaches on the user end (assuming multiple people will be using it at similar times), so I'd probably go with the 4090 option.
For storage I would probably want a fairly large HDD to hold whatever necessary files we may need, and then maybe a nice SSD for the main system and AI generation (although maybe there's a way to store resulting images on the HDD without affecting the speed of generation...). The RAID consideration is probably a good idea, but I've never set one up myself so I don't really know what it does besides serve as a backup.
>A 200MB/s 3.5" HDD is generally going to be just as fast a 5000MB/s PCIe 4.0 x4 NVME SSD if our network connection is 200MB/s Maybe that's true for transferring over the net but when it comes to actually writing stuff onto the SSD is that really the case? I've heard somewhere that a fast SSD can actually improve speeds somehow but this may just be gibberish I've heard online.
>Finally:
>1. What sort of operating system would we plan on using?
>2. Would this server stay as-is once it's built, or would we want future expandability?
Now this is the part where I completely give up any thoughts and opinions of my own and transfer all my decision-making onto vermin. I really have no idea here.
No.106507
>>106504
>Although maybe there's a way to store resulting images on the HDD without affecting the speed of generation...
In the case of Stable Diffusion, bear in mind that the model is loaded into VRAM and RAM, so HDD and SSD speed doesn't matter. I run my SD bot off of a WD Gold HDD. The only practical difference it would make is in load times when changing models. I don't think an end user is going to notice the time it takes to save a ~300KiB to ~2MiB image.
>The RAID consideration is probably a good idea, but I've never set one up myself so I don't really know what it does besides serve as a backup.
The general idea is that striped arrays increase read speeds by however many non-parity drives you have. A RAID 0 of 2 disks is a 2x increase. RAID 5 is an increase of n-1. RAID 6 is an increase of n-2. To make something both fast and cheap, I think a RAID 10 of some SATA SSDs would be okay: 4 drives total, 2 mirrored pairs striped together. That gives a fault tolerance of 1 disk per pair, up to 2 disks total as long as they're in different pairs, but if both disks in the same pair fail at the same time then all of the data is lost. Synology has a rough calculator that gives an idea of space and fault tolerance:
https://www.synology.com/en-us/support/RAID_calculator?hdds=2%20TB|2%20TB|2%20TB|2%20TB
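For a rough idea without the calculator, the arithmetic is simple enough (rule-of-thumb numbers; real-world scaling is messier):
```python
def raid10(n_drives: int, size_tb: float):
    """Stripe of mirrors: half the raw space is usable."""
    assert n_drives >= 4 and n_drives % 2 == 0
    usable_tb = n_drives // 2 * size_tb
    read_scale = n_drives        # reads can be spread across every disk
    write_scale = n_drives // 2  # writes hit both disks of each mirror
    return usable_tb, read_scale, write_scale

def raid5_usable(n_drives: int, size_tb: float):
    return (n_drives - 1) * size_tb   # one drive's worth of parity

def raid6_usable(n_drives: int, size_tb: float):
    return (n_drives - 2) * size_tb   # two drives' worth of parity

print(raid10(4, 2))        # 4x 2TB SSDs -> (4 TB usable, ~4x reads, ~2x writes)
print(raid5_usable(4, 2))  # 6 TB usable, survives 1 failure
print(raid6_usable(4, 2))  # 4 TB usable, survives 2 failures
```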
>Maybe that's true for transferring over the net but when it comes to actually writing stuff onto the SSD is that really the case?
For random reads and writes (i.e. lots of tiny files), an SSD is going to be considerably faster than an HDD, whereas for sequential reads and writes (a few big files) an HDD will be perfectly fine. The worst case scenario for reads would be reading from a parked hard drive that needs to spin up, but eh. Shouldn't really be a big deal either way.
If reads are a genuine concern, you could use a program like Primocache. For reference, this is the read speed of my HDD while using Primocache with a RAM cache. Nothing is going to be faster than RAM, but RAM is expensive and you likely wouldn't want to dedicate a large amount of RAM to caching. In that case you could add an M.2 drive (preferably an Intel Optane drive, but anything would work) for read caching so that reads would be accelerated to the speed of the SSD. You could do write caching too, but this is a bit risky since data is liable to become corrupted in case of a power fault or if the system is shut down unsafely; a UPS would be an absolute requirement.
Overall, this would be my tentative recommendation:
https://pcpartpicker.com/user/ciuieque/saved/Rw8H3C
The only thing you might want to change is the drives. I chose those Intel data center SSDs because they've got a crazy high write endurance of 36.5 petabytes, but something more pedestrian like WD Blue SSDs or even HDDs would be fine. Stock fans should be okay since the 4090 has a huge heatsink and the 13500 comes with a heatsink that should be good enough.
No.106508
>>106507LGA1700 would also have options for future upgrade depending on the life of the server: potentially, you could upgrade to a 13700K or even a 13900K, but realistically it wouldn't help all that much since nobody's going to be gaming on this PC. I've heard that Intel is going to release a Raptor lake refresh so that would mean there'd be Intel 14th gen CPUs to upgrade to as well.
This would be my recommendation if someone wins the lottery though :P
https://pcpartpicker.com/user/ciuieque/saved/jLfmLk
No.106518
>>106507
>Total: $3068.06
When you consider that the GPU is $1600, that's crazy cheap for the rest of the parts, which I'd consider pretty nice. I think I'd probably want to swap out an SSD for a nice WD Black HDD, yeah, but aside from that I can't really figure out any other issues I'd have with this setup. Best part is that a 3k price tag is probably cheap enough for vermin to not feel insane pressure taking care of it, so that problem sorts itself out nicely. Can probably get started on that as soon as I have a good day...
>>106508Brb buying a powerball ticket
No.106519
>>106508what is this monstrosity whyyy howwww
No.106523
>>106518
>I think I'd probably want to swap out an SSD for a nice WD black HDD
If endurance and drive lifetime are a concern, I would recommend WD Gold HDDs instead. From what I remember, the WD Blacks have an MTBF of 1 million hours versus 2.5 million hours on the WD Golds. The Golds are also a bit faster. The only downside is that they can be a bit noisy since they're enterprise grade and not optimized for quiet operation like you would want for a hard drive at someone's desk. Granted, the Corsair 4000D only has room for 2 3.5" drives, so a different case would need to be chosen if you wanted to replace all of the Intel data center SSDs with WD Golds or some other hard drives.
>Brb buying a powerball ticketI hope you win!
>>106519
64-core Threadripper for supporting a high number of virtual machines.
2TB of RAM, largely for ZFS cache, but would also help with high performance virtual machines.
15 14TB WD Gold 3.5" HDDs in a RAIDz3 of 11 drives, with 4 drives as hotspares. Usable capacity of 112TB (capacity math sketched at the end of this post). Used for NAS and bulk storage.
7 Intel Optane P1600X's, mostly for ZFS special vdevs: log, cache, and metadata. Metadata vdev in a RAID10 for safety (if the Metadata vdev dies, all data is lost because it tells where data is stored on each drive, so redundancy is extremely important).
2 Intel Optane P5800X's, maybe for storing virtual machines. I mostly just wanted to utilize the U.2 connectors on the motherboard, and the P5800X is the top of the line of Intel's Optane lineup from before they scrapped it.
4x Nvidia RTX A4000's for giving virtual machines dedicated GPUs. Also for giving a Plex instance a GPU for GPU accelerated transcoding. Chose A4000's over A6000's or RTX 6000 Ada's due to power draw concerns. Likewise, the A4000 is a single slot card meaning more can be used, unless the other cards were using water blocks, but that would require a water cooling loop and the HDD's take up most of the space where that would go.
Fans, case, PSU, and UPS are self-explanatory.
Drive brackets because the case only comes with a few.
Mellanox SFP+ 10Gb NIC for high speed networking. In theory, you could do 40Gb/s or 100Gb/s with a different card for direct-attach storage to another device, but I don't think it's even possible to buy 10Gb internet, let alone 40Gb or 100Gb.
LSI 9400-16i used as an HBA for connecting the 15 hard drives.
Rolly cart because it would be too heavy to carry. Napkin math, it would probably weigh around 80 lbs, I think.
Displayport EDID emulator because sometimes machines get angry if they don't have a "monitor" connected. This emulates a monitor.
Then, finally, an SFP+ RJ45 transceiver because the Mellanox NIC is SFP+ not RJ45.
Overall, the system would be incredibly versatile, capable of running many virtual machines with their own dedicated cores and even GPUs, to the point where you could run Windows virtual machines on par with a midrange RTX 3070 system with 64GB of RAM. If I had the money for it, I would love to use a system like this to host everything Kissu: from multiple concurrent game servers to stable diffusion to potentially even making virtual machines accessible for people to work on their own projects because their own hardware isn't powerful enough, or just to let people mess around with. Most server tasks are fairly lightweight so a system like this could easily last a decade of continuous usage. It's just a pipe dream, of course, but it would be a nice one to come true.
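The 112TB figure comes straight out of simple arithmetic (ignoring ZFS overhead and TB-vs-TiB differences):
```python
drives_total = 15
hotspares = 4
parity_drives = 3   # RAIDz3 keeps three drives' worth of parity
size_tb = 14

in_pool = drives_total - hotspares         # 11 drives actually in the vdev
data_drives = in_pool - parity_drives      # 8 drives hold data
print(data_drives * size_tb, "TB usable")  # 112 TB before overhead
```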
No.106527
>>106523How much space would this monster take up? The server room next to me is like 6x5m.
No.106530
>>106507Price is not absurd
Way too many drives
>Why not a dedicated network card?
Likewise, I would rather strip off the $1000 of SSDs and buy myself a home security system and cameras for my room that notify me on motion detection
No.106531
>>106508Damn, 2TB of RAM. Does that mean text language models could have far longer "memories"?
No.106532
>>106531Almost certainly, yes.
No.106533
>>106531 >>106532
Tard here, doesn't text generation rely on VRAM only so any memory would also need to be in VRAM or is there some way to separate the two?
No.106534
>>106530
>Way too many drives
The idea was for a RAID 10 of SSDs, and RAID 10 has a minimum requirement of 4 drives. As mentioned, if the data isn't that important, then you could easily scale down or change the drives to something else.
>Why not a dedicated network card?
2.5Gb should be good enough, and there likely wouldn't be any real-world benefit to using an add-in card. Not sure about clearance with the 4090, either. You could potentially get a 10GbE networking card, but, again, you're limited by the speed of your internet, so if you have a 10GbE NIC going to a 1Gb network connection it's just a wasted expense.
>>106533Depends on the model we're talking about. The Facebook Llama models and their derivatives use RAM instead of VRAM. You are right, however, in regards to other models using VRAM primarily.
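For reference, this is roughly what CPU/RAM inference looks like with llama.cpp's Python bindings (the model file and parameters here are just placeholders):
```python
from llama_cpp import Llama

# A quantized Llama derivative loaded entirely into system RAM; no GPU required.
llm = Llama(
    model_path="models/llama-2-13b.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=4096,    # context window; more RAM lets you raise this
    n_threads=8,   # CPU threads to use
)

out = llm("Q: What would 2TB of RAM be good for? A:", max_tokens=128)
print(out["choices"][0]["text"])
```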
No.106535
>>106534I can agree with those reasons. I think the SSDs go beyond the requirements, though.
No.106536
A dedicated network card is excessive for a simple setup. However, I believe there are network devices such as switches, modems, routers, cables and the like that also factor into the theme and cost.
No.106537
>>106535I guess this would be my recommendation without SSDs:
https://pcpartpicker.com/user/ciuieque/saved/MfCrGX
Drops price by $200 to $2897.70. Had to switch cases to accommodate 4 HDDs, however.
No.106539
>>106537But, again, if data integrity isn't that important you could shave off another $239.94 and use a single HDD alongside the boot SSD for a reduced cost of $2,657.76.
No.106544
>>106537or you could just mount it on a piece of plywood and stick it to the wall
No.107186
>>107185Can't wait to play PissuCraft.
No.107708
>>107705When you ask for price savings you have to concede something...
No.108017
>>108006Just install Ubuntu with the UI.
You'll just waste too much time trying to do things through the CLI, and the system resources are high enough that you can get by with a GUI running
No.108048
>>108040The difference is running a desktop vs not running one. Removing it is a performance booster.
I think adding one to a server is not the hardest thing.
https://phoenixnap.com/kb/how-to-install-a-gui-on-ubuntu
No.108049
>>108048
https://askubuntu.com/questions/435314/switching-between-console-and-gui-in-ubuntu
this seems neat, can keep the performance boost of cli while letting my tard self lean on gui when i need to
No.108060
I'm messing around with Stable Diffusion models again; I might be able to make a better one than Kissu Megamix... maybe. I guess it would be an option at least. Although, I don't really know what I'm aiming for at the moment
No.108209
Alright, the Kissu Megamix v2 is a go! (top is original, bottom is v2)
I wasn't planning on making another merge any time soon, but with this kissu server thing happening I thought I'd try to make an improvement. However, it's not a pure gain: it trades off some of the 2D flat coloring for more detail, and the faces might be a little uncanny sometimes since it's basically adding more real-life influence, albeit in a subdued way. But I think a lot of that can be altered with tags like "realistic" in the negative prompt.
There will certainly be times when people prefer v1, and I'm sure some will flatly prefer the old one in all situations. But, I wanted to see if I could create a version that's better than Kissu Megamix in some ways. After generating about 100 comparison charts I think it's good enough to be considered an upgrade.
This most recent merge was about 20 hours of tinkering. I say 20 hours, but like 80% of the time was me doing something else while it generates. But, there was no way for me to make it shorter.
My AI stuff is centered around fapping, so this image doesn't really do it justice. (I'll throw a version of this image on /megu/ with the 'penis' and 'nsfw' tags added)
Anyway, I'll probably do a little bit more tinkering tomorrow to see if there's anything I'm missing, but once the Kissu AI image generation stuff is set up people will have access to this super merge of mine that's hundreds of hours in the making and in my humble opinion the best model in the world for what it does, as these prompts here in the image are all very plain and don't have any LORAs or elaborate scene modifiers. The first set is literally just "1girl, portrait, face".
No.108223
>>108220
YAY! Yeah, I'm doing some testing on it as we speak and it's going pretty well. Man, 4090s are so fast, and I can make images far larger than I could with my own local setup.
Both of my Kissu Megamixes are on it and we can add other models on request, but my experience with them is limited since I prefer my own. We're going to load this thing up with hundreds of LORAs, including the ones I've made myself, such as the fabled Kuon LORA.
No.108477
So how are you going about doing the trusted users thing? Last time I saw this done, albeit with no real safeguards, it didn't take long for trolls to ruin it for everyone and the guy running it took it down out of fear of getting in trouble for hosting a cunny machine.
No.108480
please don't ban cunny
No.108485
>>108478I suppose posting the LORA on an auditing thread should be enough to get it added by a staff member. It's not like it's gonna do something bad when you give it a specific undisclosed prompt, right? Should be pretty straightforward.
No.108486
you just block it on the UI, no?
No.108502
>>108478LORAs can be trained to have a trigger word that can be an obscure token only the maker knows, and the same trigger word is needed for generation to observe its effect.
There is no way to know what's hidden in them if you don't know the exact secret word.
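In diffusers terms it looks something like this (the checkpoint, LORA file, and trigger token are all made up; the point is just that the LORA's concept only visibly kicks in when its trigger appears in the prompt):
```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_single_file(
    "kissu_megamix_v2.safetensors",  # hypothetical local checkpoint
    torch_dtype=torch.float16,
).to("cuda")

# Attach a LORA; its weights are active, but the learned concept is keyed to a rare token.
pipe.load_lora_weights("loras/kuon_v1.safetensors")  # hypothetical LORA file

plain = pipe("1girl, portrait, face").images[0]            # little visible effect
keyed = pipe("1girl, portrait, face, kuon_xq7").images[0]  # "kuon_xq7" = the secret trigger
plain.save("plain.png")
keyed.save("keyed.png")
```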
No.108539
>>108477i'm a cunny plowing machine
No.108757
>>108756Uhh... they're all extremely pornographic...
No.108883
>>108882You could always try uploading to MEGA or something.
No.109106
This is an SFW example comparison image thingie that I can post here on /qa/. If you can't tell, the bottom three rows are set up with an increasing amount of real-life model imagery in the merge.
I've been doing lots of tests and although I made it as an ingredient for the 2.5D mix attempt, I think my REALMIX is really good at what it does. It combines 3 different RL checkpoints in specific ratios and it counters the weaknesses of each one in a pretty good way, at least in my view.
But, uhh.. I feel like it's leaving the scope of Kissu's shtick so I don't think we can include it, among other reasons. But, you can see how using the RL models influences the merges I make.
I think my current goal is to have the flat colors of a furry mix, the relatively decent hands of an RL model, and then to cover all of it with a nice layer of Japanese-style 2D. That is something I could finally crown as Kissumix 3 (or maybe 2 since I don't like 2 that much)
It's pretty hard to get them all working at once, though, especially if you want characters and booru tags to be recognized.
Bleh.
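For anyone curious what "combining checkpoints in specific ratios" means mechanically, it boils down to a weighted average of the model weights. A bare-bones sketch (real merge tools work per-block and handle the VAE/text encoder separately; the file names and ratios here are placeholders):
```python
import torch
from safetensors.torch import load_file, save_file

recipe = [
    ("anime_base.safetensors", 0.6),   # hypothetical checkpoints
    ("realmix.safetensors",    0.25),
    ("furry_flat.safetensors", 0.15),
]

merged = {}
for path, weight in recipe:
    sd = load_file(path)
    for key, tensor in sd.items():
        # Sum weight * tensor across checkpoints; assumes identical architectures/keys.
        merged[key] = merged.get(key, 0) + weight * tensor.to(torch.float32)

save_file({k: v.to(torch.float16) for k, v in merged.items()}, "kissu_test_merge.safetensors")
```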
No.109108
>>109106For the hell of it I'll show you what the furry models look like with these prompts. It's weird to think that these have anything to offer, but they're actually really good at certain things, the most obvious of which is the nice flat coloring that lacks the weird noise of most models.
The top one is actually in kissu megamix (yiffy-e18)
No.109113
>>109108Hmmm, the bottom one seems to be able to do 2D pretty dang well for some reason. Wonder why that is.
No.109124
>>109113I found the model by luck when doing a search on 4chan, because it's only updated on some Discord, so I have no idea what the model actually is apart from its name (easter). It really sucks that so much stuff is locked away behind random Discords instead of shared freely on the internet, but that's a rant I've already done a few times on kissu.
Anyway, it's definitely merged with some booru data somewhere since it recognized Remi, and she's definitely not popular enough on furry sites. The top row is "pure" and that's why it looks so abstract: it uses e621 tags because all of its data is from e621. I never actually learned how to use it since kissu megamix thankfully didn't require e621 tags despite having furry data in it.
No.109397
Make sure to have backups. Lain site got filled with soyjaks today.
No.109398
>>109397If it's lainchan lol
No.109406
lainjaks
No.109447
should have had cloudflare
No.110593
I've spent the past two days doing a whole lot of checkpoint merging and testing for Kissu's eventual AI thingie.
However, due to the very NSFW nature of the testing my blog progress is on >>>/megu/ as usual.
No.110597
>>110595This looks insanely good. Seems like the model testing is going swimmingly from the sound of it?
No.110599
>>110597Yeah, I took a break for a bit to do other stuff since merging isn't terribly fun (90% of the time you're doing other stuff waiting for it to finish, but you need to check it every couple minutes). I feel like this is the pinnacle that I can reach at the moment in any reasonable timeframe. The return on investment for doing more is rapidly fading away again.
The unfortunate thing is I still haven't found many patterns in merging that I can extrapolate to future merges. I can't say I learned anything from the past two days when it comes to SD, just what works with this specific combination of checkpoints.
No.110634
Looking at these and the ones on /megu/ I gotta say, it's beautiful, absolutely beautiful. Honestly way beyond what I expected.
No.110684
Not everyone has /ec/ visible, but I made a thread for testing out prompts that people can submit to me with the primary model we have loaded:
>>>/ec/13144
This is just a temporary thing for testing purposes
No.111791
>>111787
>Pretty rare tag
These models are all built on top of Stable Diffusion 1.5 as a base, so it's not just pulling from boorus; stuff can work that you might not think will work. I really don't know how it functions, but all of this is possible because SD had its own "definitions" of things that the booru scrapes used as a guideline. The best example of this is food, which is surprisingly accurate and realistic. Typing "woman" instead of "1girl" will push it closer to reality, but not by much, since my newer merges don't have much RL data in them (which makes hands a lot less reliable)
"Thai text" is most likely a placebo, but I'd have to test it on my own local model since you need to be able to repeat the seed to test influences.
Hmm... could try to add some real life merges in to improve hands, but ehhh
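Testing whether a tag is a placebo really is just a matter of holding the seed fixed and toggling the tag; roughly like this in diffusers (checkpoint name is a placeholder):
```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_single_file(
    "kissu_megamix.safetensors",  # hypothetical local checkpoint
    torch_dtype=torch.float16,
).to("cuda")

base = "1girl, portrait, face"
for i, suffix in enumerate(("", ", thai text")):
    # Same seed every run, so the only variable is the extra tag.
    gen = torch.Generator("cuda").manual_seed(1234)
    image = pipe(base + suffix, generator=gen, num_inference_steps=28).images[0]
    image.save(f"compare_{i}.png")
```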
No.111792
>>111791I mean, the set of all images given eldritch tags is far smaller than others, so when you ask for this tag it's going to start shoving more humanoid characteristics into it.
So you have to override the humanoid preferences with monster ones by giving the weaker traits a higher priority
No.115208
Just learned about https://github.com/Chainlit/chainlit which is some sort of API thing (I don't know programming) to help make chatgpt-like websites with online or local LLM chat models.
Do you think this has any value?
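From a quick look at the project, the minimal version is something like this (the reply logic is a stub; a real setup would pass the message to whatever local or hosted LLM you have):
```python
# app.py -- run with: chainlit run app.py
import chainlit as cl

@cl.on_message
async def main(message: cl.Message):
    # A real deployment would send message.content to an LLM and stream the answer back.
    reply = f"You said: {message.content}"
    await cl.Message(content=reply).send()
```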
No.115209
>>115208Do you still need the expensive gpu?
No.115210
>>115209Eventually, but I think I'll wait until Cyber Monday/Black Friday to look at them and see if there are any crazy deals. I'm strongly distrustful of used GPUs, even if crypto mining is pretty much dead.
It would be nice if there's a discount somewhere, but I doubt there will be. Lately I've been focusing on refining the image generation side of things and a 3080 is enough for that, although it's obviously slower than a 4090.
Facebook, er "Meta" won't begin training Llama3 until next year, but it's going to have so many GPUs that training will takes days instead of months. Another advance that people are waiting for, and it's likely only a matter of time, is the ability to train text LORAs on consumer hardware so you can add knowledge to it without eating up limited prompt context.
There won't be any new Nvidia GPUs released in the entirety of 2024, so I don't know how prices will fare. I doubt they will go down, at least; monopolies tend to keep prices where they are.
Anyway, the reason I asked is that it would be cool to have something like what I linked on kissu somewhere, but I don't know if it's feasible.
No.115211
>>115210It's feasible if I make a million dollars and can afford to spend crazy amounts on expensive server architecture and multiple 4090s. So I guess you'd better hope for a freezing winter with extreme blizzards.
No.115475
We really need an actual way to do it on kissu proper *cough*, but if you want to prompt with the merges I've made from what I've learned over the past year you can hop in [>>>/chat/580] and type commands in your post and it will generate a reply with a puu.sh link! (also my poorly documented blog for this is at >>>/megu/183 )
Start your post with: .sd or .sdK or .sdP if you want to mess around with 3 different checkpoint models that are loaded. There are actually still 2 more loaded, .sdC and .sd2 but I want to replace those eventually.
The more tags you add, the less likely the "minor" ones will have an effect. I.e. you can expect blonde hair to almost always work, but something like 'earrings' might get ignored. AI stuff still has errors of course, but if you're like me you can ignore things like multiple vanishing points or an extra finger when you're focused elsewhere on the image. Ideally I'd like people to have a UI to do inpainting and stuff, but...
Anyway, make use of the danbooru and e621 wikis to get an idea of how you should tag things. They both allow you to click a tag on an image page to be taken to the wiki page which will include links to related concepts, like 'poses':
https://danbooru.donmai.us/wiki_pages/tag_group%3Aposture https://e621.net/wiki_pages/4612.
I've yet to do extensive testing on my newest .sd model to see which prompts it prefers.
You can also find images you like to get examples. When prompting, make sure to focus on the most specific tag and avoid redundancy. I.e. a girl with a bow in her hair will often have both the "bow" and "hair bow" tags, but you should only put "hair bow" if that's what you want.
.sd - My latest (and perhaps last major upgrade) merge. Not very good at character tags but it's far better at sexual stuff/genitals. My goal with these merges was fap material and it will generate great penises, even from behind. You should use booru and/or e621 tags for it. The more you use booru tags the more booru-like it will appear (less of a nose and such). I make no guarantees for the quality of images made without tags (horrific things sometimes). It is my opinion that this is the best model in existence for my fapping purposes. I know that I could charge money for access to this if I had a way to form a website around it.
.sdK - The Kemono one. Uses e621 tags and recognizes some artists there, but I'm not really familiar with them. It does follow booru tags to a degree. You'd have to try yourself. Use stuff like (kemono:1.3) to give it a more kemono feel, otherwise it can make stuff that's more Western in appearance. Unlike other existing furry models, this one doesn't need elaborate prompting to make it look good.
.sdP - 'Painted'. Through some error in the merging process with specific settings it gives everything a look like a painting, as >>110684 shows. This one doesn't follow tags very well, but it can really make some interesting scenes.
Example (SFW) prompts
.sd 1girl, looking at viewer, (holding paintbrush:1.2), sitting, paint, palette \(object\), (paint splatter, paint splash:1.2), slime, multicolored background, colorful
.sdP 1girl, seashell bikini, (gradient hair:1.3), blue hair, pink hair, (mermaid:1.2), frills, bikini, (gigantic breasts, cleavage:1.1), very long hair, jewelry, (floating hair), arms out, fish tail, (scales:1.2), marine, coral reef, underwater
.sdK female, feline, cat ears, (kemono:1.3), kimono, (blush), shrine, indoors
For NSFW prompts it's not much different, just add stuff that you want to see and use the proper booru/e621 words for it. Nipples, pussy, penis, testicles, anus, etc.
You can try to use sex-related tags, but even with a great model it's unreliable. 'Sex' isn't good on its own since it's a tag applied to many different situations, but adding stuff like "cowgirl position", "pov", and "vaginal penetration" will fare far better at trying to assemble something coherent.
If anyone here shows genuine interest I can download LORAs for you which you can plug in, which will allow characters, outfits and other concepts to be better prompted.
No.115480
>>115211out of context it sounds like a "when hell freezes over" sort of joke... but knowing context you actually want hell to freeze over