  1.  
    Hey guys! Long time no post! I hope everyone has been well! As I work on large scale maps with a ton of entities, I tend to get a lot of down time when hiding, freezing, or changing layers/sheets, as well as during redraws. It's still quicker than CC3. So my question is: does CC3+ rely mostly on the HDD, graphics card, and/or RAM? Most of my time designing large scale maps is spent waiting for it to load, so I was wondering if it's a hardware issue on my end. I'm saving up for an SSD soon; would that help decrease load/render times?
    • CommentAuthorJimP
    • CommentTimeApr 1st 2018
     
    I know it doesn't use the video card.
    • CommentAuthorjslayton
    • CommentTimeApr 1st 2018 edited
     
    CC3+ is generally CPU-bound. The importance of system features is (very approximately):
    total amount of RAM up to about 6GB,
    OS type (64-bit OS will let the program run with 4GB of usable memory; 32-bit OS lets the program run with 2GB of usable memory),
    amount of CPU cache,
    CPU clock speed,
    memory speed,
    hard disk speed (mostly for first load if you have sufficient RAM),
    graphics card speed.

    The general process for drawing in CC3+ is:
    1) clear the background image
    2) load image files needed for bitmaps and fills. After the initial load, decoded images will be cached by CC3+ and encoded images by the OS.
    3) for each sheet in the drawing
    3a) draw everything on a sheet into a memory image
    3b) apply effects to the memory image
    3c) apply the processed sheet image to the background image
    4) show the final image on the screen
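
    To make those steps a little more concrete, here is a minimal sketch of that per-sheet render/effects/composite loop in Python, using Pillow images as stand-ins for CC3+'s in-memory bitmaps. This is not CC3+'s actual code or API; the drawing, sheet, entity and effect names are assumptions for illustration only.

```python
# Illustrative sketch of the redraw loop described above -- NOT CC3+'s real code.
# Pillow images stand in for the in-memory bitmaps; Drawing/Sheet/Entity are
# hypothetical objects used only to show the shape of the loop.
from PIL import Image, ImageFilter

def redraw(drawing, width, height, image_cache):
    background = Image.new("RGBA", (width, height), "white")    # step 1: clear the background

    for sheet in drawing.sheets:                                 # step 3: sheet by sheet
        sheet_img = Image.new("RGBA", (width, height), (0, 0, 0, 0))
        for entity in sheet.entities:                            # step 3a: draw entities
            fill = image_cache.get(entity.fill_name)             # step 2: cached bitmap fills
            entity.draw(sheet_img, fill)                         # hypothetical draw call
        for effect in sheet.effects:                             # step 3b: per-sheet effects
            if effect.kind == "blur":
                sheet_img = sheet_img.filter(ImageFilter.GaussianBlur(effect.radius))
        background.alpha_composite(sheet_img)                    # step 3c: composite onto background

    return background                                            # step 4: blit to the screen
```

    The point of the sketch is that every sheet is rendered, filtered and composited at the full output resolution, which is why steps 3b and 3c dominate and why the cost grows with both the number of sheets and the number of pixels.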

    Disk speed (SSD vs. spinning rust) usually only affects step 2, and then mostly just the first redraw.
    Graphics card speed may have a minor impact on steps 3a and 4, depending on your OS and graphics card.
    CPU speed, cache, and memory speed have a major impact on steps 3b and 3c. These steps are usually the most performance-intensive, with certain effects like blur having a severe impact.

    How long a drawing takes to render is also affected by the entities that you're drawing. The slowest entities overall are bitmap-filled smooth polygons, but the worst performance killer is usually not the entity type, it's the entity fill. If you're drawing a hatch-filled entity (especially one with an oriented hatch fill), then that hatch fill will be composed of thousands of entities that need to be drawn individually.
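
    As a rough illustration of the hatch-fill point (assumed arithmetic, not CC3+'s actual fill code), here is why a single hatch-filled polygon can turn into thousands of individually drawn entities:

```python
# Rough illustration (not CC3+ code) of why a hatch fill explodes into many entities:
# one hatch-filled polygon at a 45-degree orientation with a tight spacing becomes
# thousands of short line segments that each have to be clipped and drawn.
import math

def hatch_line_count(bbox_width, bbox_height, spacing, angle_degrees):
    """Approximate number of hatch lines needed to cover a polygon's bounding box."""
    # Project the bounding box onto the direction perpendicular to the hatch lines.
    theta = math.radians(angle_degrees)
    extent = abs(bbox_width * math.sin(theta)) + abs(bbox_height * math.cos(theta))
    return int(extent / spacing) + 1

# A 1000 x 800 map-unit landmass with a 0.5 map-unit hatch spacing (made-up numbers):
print(hatch_line_count(1000, 800, 0.5, 45))   # ~2546 hatch lines before clipping
```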
  2.  
    Great detail, thanks Joe.

    Farsight, what I do is try to hide sheets and layers that I don't need, especially those with bitmap fills in them.
    • CommentAuthormlesnews
    • CommentTimeApr 1st 2018 edited
     
    I guess that's my problem then; my bottleneck is the CPU. I'm running two 1080s in SLI and they don't seem to be doing anything to contribute to CC3+ performance.
  3.  
    I don't think you're going to get much improvement with another CPU; that's already pretty near the top end. If your maps are that large, you're just going to have to start using the sheets and layers to bring things into acceptable performance.

    For reference, I'm running an i7-4712HQ @ 2.30 GHz. I don't consider my maps all that intense, but I still use blanked sheets and layers.

    I think on my current map the biggest culprit is cave walls: they use a bitmap fill and are fractalized. I had to go in and reduce the number of nodes, and I usually keep them and the cave floors hidden so I get better performance.
    • CommentAuthorMonsen
    • CommentTimeApr 1st 2018
     
    As Joe said, CC3+ is CPU-bound, so the CPU will almost always be the bottleneck no matter how fast it is. Of course, there is a big difference between slow and fast CPUs, but they will always be the limiting factor.

    Because of the processing a program like CC3+ needs to do for each redraw (much, much more than, for example, an image editor), no processor has been invented that CC3+ won't be slow on if the map gets complicated enough. It shouldn't be slow on more "normal" size maps, but I don't know how complicated your maps are. CC3+ doesn't put any limits on what you can do, but I do recommend rethinking complexity if a map gets too slow to work with. Also, on very complex maps it may be an advantage to work with effects off, since calculating those effects all the time really eats CPU cycles.
    • CommentAuthorLoopysue
    • CommentTimeApr 2nd 2018 edited
     
    As a benchmark for how much a complicated map will slow things down on a much smaller machine (64-bit, 4 GB RAM, dual core + 2 virtual core laptop that's officially incompatible with Win 10 but still manages to work despite it): I have a city map 5000 x 5000 map units in size that has approx. 130-150 sheets (each with 1-5 sheet effects), three sets of fills (not including 20 of my own 3000 pixel fills), and well over 2000 symbols including all the houses and trees. It's impractical to run with the effects on. Without the effects everything works at normal speed except zoom, pan and redraw (understandably), which can take 30 seconds to respond.

    It doesn't crash though. It just takes a really long time to do any one of those 3 things - but we are talking about a machine that only ever has 1.7 GB of RAM available nowadays thanks to Win 10, and whose processors have been kneecapped by the recent security upgrades.

    (If this thread is still active by the time I've saved enough for a new laptop with more RAM and processors, I'll update those facts :) )

    [EDIT: actually, thinking about it, could it be the security upgrades over the last 3 months, which were feared to slow things down by up to 30% that are causing this particular problem? Processor speeds will never be what they were last year!]
    • CommentAuthorMonsen
    • CommentTimeApr 2nd 2018 edited
     
    Posted By: Loopysue: [EDIT: actually, thinking about it, could it be the security upgrades over the last 3 months, which were feared to slow things down by up to 30%, that are causing this particular problem? Processor speeds will never be what they were last year!]
    It might be. I know the speed of the modern processors (6th, 7th and 8th gen Intel) is almost unaffected (because these could be fixed in a different way), but the speed of older processors, like the 4th gen that both LoreEntrails and mlesnews have, was hit much harder.
    I had a student in my office the other day with an older i7 processor. It would normally be a beefy machine, but some tasks were slowed to a crawl after the Intel Spectre/Meltdown fixes. Personally, I am running a processor new enough that it could be fixed without taking the severe speed hit, but those older ones can't be. I am not sure if this affects CC3 performance or not, but it sure is a thought.
    • CommentAuthorLoopysue
    • CommentTimeApr 2nd 2018
     
    This is why I desperately need a new machine. Everything is taking so long to do!
    • CommentAuthorjslayton
    • CommentTimeApr 2nd 2018
     
    The exploits for the Spectre and Meltdown processor problems rely on undesirable behavior when making a system call. The fixes involve doing some extra work at every system call to make the undesirable behavior irrelevant. The hardest-hit software will be that which makes a great many system calls in a given period of time. Reading files and rendering the basic drawing elements (lines and polygons) do make system calls and can be affected by the slowdown, but effects should be largely immune because they don't usually make any system calls. The short version of that is that some drawings will be impacted more than others by the processor security updates.
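
    If you want to see that split on your own machine, a crude (and entirely unofficial) microbenchmark is to time a loop that makes a system call on every iteration against one that stays in user space; the mitigations tax the first kind of work far more than the second. Exact numbers will vary a lot by CPU, OS and patch level.

```python
# Crude microbenchmark sketch: the first loop makes a system call per iteration
# (the kind of work the Spectre/Meltdown mitigations tax), the second is pure
# user-space computation (closer to what the effects pipeline does).
import os
import time

N = 200_000

start = time.perf_counter()
for _ in range(N):
    os.getpid()                      # one cheap system call per iteration
syscall_time = time.perf_counter() - start

start = time.perf_counter()
total = 0
for i in range(N):
    total += i * i                   # stays entirely in user space
compute_time = time.perf_counter() - start

print(f"syscall-heavy loop: {syscall_time:.3f}s, compute-only loop: {compute_time:.3f}s")
```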
    • CommentAuthorGathar
    • CommentTimeApr 2nd 2018
     
    From my point of view, the element that has the largest impact on performance is the screen resolution. I have a 4K screen, and when working full screen I have lots of performance issues (even with effects disabled). One of the biggest issues is that very often the drawing cycle, as described by jslayton, happens not once but twice while refreshing the picture (resulting in a green flash on the screen and a doubled refresh time).

    However, if I reduce the window so that the effective drawing resolution becomes much lower, then performance becomes acceptable, and in some circumstances (usually reducing the window size even further) I can even enable effects and continue working. This is annoying, giving the feeling of drawing on a postage stamp, but it is the only way I have found to get decent performance (I don't have any performance issues with my computer with any other program, including games). Sometimes I look at YouTube videos about CC3+, and the performance is far better than what I get in fullscreen.
    • CommentAuthorjslayton
    • CommentTimeApr 2nd 2018
     
    Gathar is completely correct in pointing out that the number of pixels drawn is always the limiting factor in terms of redraw speed. Some of the implementations used in the effects system (especially blurs and point lights) are particularly ill-behaved.
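
    Some back-of-the-envelope arithmetic shows why the window size matters so much, assuming (per the pipeline described earlier) that the per-sheet work scales roughly with the number of pixels drawn:

```python
# Rough pixel counts: redraw work that scales with pixels drawn is about 4x
# heavier at 4K than at HD, and shrinking the CC3+ window to a quarter of a
# 4K screen brings the workload back down to roughly HD size.
resolutions = {
    "HD window (1920x1080)":     1920 * 1080,
    "4K fullscreen (3840x2160)": 3840 * 2160,
    "quarter of a 4K screen":    1920 * 1080,   # half the width, half the height
}

baseline = resolutions["HD window (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels, {pixels / baseline:.1f}x the HD workload")
```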
    • CommentAuthorLoopysue
    • CommentTimeApr 2nd 2018
     
    How big is a 4K screen - in terms of inches?
    • CommentAuthorjslayton
    • CommentTimeApr 2nd 2018 edited
     
    A 4k screen is 3840x2160 pixels (twice the 1920x1080 HD screen dimensions in each direction). The physical size in inches varies from model to model: there are 4k screens available that are well under 10 inches, as well as some that are over 70 inches.

    https://en.wikipedia.org/wiki/720p#/media/File:Vector_Video_Standards2.svg shows some of the more popular video standard resolutions in use over the years. https://en.wikipedia.org/wiki/4K_resolution#/media/File:Digital_video_resolutions_(VCD_to_4K).svg shows how the common broadcast resolutions vary. The rendering system in CC3+ was developed back when HD (1920x1080) was a high-end device.
    • CommentAuthorLoopysue
    • CommentTimeApr 2nd 2018
     
    Thanks Joe :)

    That's the way of things, isn't it? I wonder just how big things will get before we finally accept that 'this is enough', and 'I wouldn't be able to see the difference if we doubled it again because I don't have the eyes of a hawk'?
    • CommentAuthorjslayton
    • CommentTimeApr 2nd 2018
     
    8k screens are available in the wild. It's basically trying to hit about 60+ samples per degree of field of view, which is generally accepted as the limit of human visual acuity. The farther away a screen is, the bigger it needs to physically be to hit that goal for a fixed resolution. There are physical and practical limitations to how high a resolution is useful: there are no such limits for marketing purposes.
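
    For the curious, that 60-samples-per-degree rule of thumb can be turned into a quick check of whether a given screen out-resolves the eye at a given viewing distance. The screen sizes and distances below are illustrative assumptions, and the threshold itself is only an approximation:

```python
# Rough pixels-per-degree check for a screen at a given viewing distance;
# ~60+ pixels per degree is the usual "can't see the individual pixels" rule of thumb.
import math

def pixels_per_degree(horizontal_pixels, screen_width_inches, viewing_distance_inches):
    # Angle subtended by a single pixel near the centre of the screen.
    pixel_width = screen_width_inches / horizontal_pixels
    degrees_per_pixel = math.degrees(2 * math.atan(pixel_width / (2 * viewing_distance_inches)))
    return 1 / degrees_per_pixel

# A 27" 16:9 monitor is roughly 23.5" wide; assume a typical ~28" viewing distance:
print(f'27" 4K  at 28": {pixels_per_degree(3840, 23.5, 28):.0f} px/deg')   # ~80
print(f'27" FHD at 28": {pixels_per_degree(1920, 23.5, 28):.0f} px/deg')   # ~40
```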

    The desire for huge and high-resolution screens is as much for multi-viewer experiences as anything (the home theater). A single human has a relatively small area of their visual field that hits the limit stated above, so you can get by with smaller screens held closer for a single-person device. Sharing devices means that you need to provide high resolution everywhere on the device and at a larger distance.

    How much is enough? It depends on the level of bragging rights you want with your friends. After spatial resolution (pixels on the screen) comes temporal resolution (how many frames per second), but because CC3+ doesn't continually redraw its screen under most situations, temporal resolution is less important for it. That's not to say that I wouldn't like it to be much faster, but the compatibility needs for some of the features that have been implemented in CC3+ are, um, complex.
    • CommentAuthorGathar
    • CommentTimeApr 2nd 2018
     
    I sincerely believe my 4k screen is not only for bragging :)

    Before, with HD, I worked with one application full screen and one document in that application; now I can work with two documents (real-size vertical A4) or two applications side by side, and the text is crystal clear, with individual pixels invisible.


    Question for jslayton: What API is used by CC3+ for drawing? Is it GDI, OpenGL, D3D, or something else?
    • CommentAuthorjslayton
    • CommentTimeApr 2nd 2018
     
    I agree that 4k is definitely useful for most pure strain humans at reasonable monitor sizes. A 27" 4k monitor is about at the perceptual threshold for normal viewing distances and vision, but an 8k monitor at 24" is a bit above that threshold and as much a marketing gimmick as anything for desktop use. A 5.7" phone at 2560x1440 (531ppi) isn't much use for most humans over the age of 30 (or most of them below that age, either). It does play into the "bigger numbers are better" philosophy of most marketers, though.

    Judging from the comments, the FastCAD core dates to the mid 90s. It's straight GDI rendered to in-memory DIBs. It uses custom bitmap software to do its fills, which is why you get nearest-neighbor sampling and artifacts along the edges of some filled entities. The effects system uses the drawn DIBs to apply software-based operations. Switching the rendering engine to something other than its current state is a huge effort, but the initial prototype showed some promise.
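
    To show what nearest-neighbor sampling means for those fill artifacts, here is a minimal, assumed example (not the actual FastCAD fill code): each destination pixel just grabs the single closest source texel with no filtering, which is fast but produces the hard, stair-stepped transitions that bilinear filtering would smooth out.

```python
# Minimal nearest-neighbor texture sampling sketch -- not the actual FastCAD code.
def nearest_neighbor_sample(texture, tex_w, tex_h, u, v):
    """texture: flat row-major list of texels; (u, v) in [0, 1). Returns the nearest texel."""
    x = int(u * tex_w)            # truncate instead of interpolating between texels
    y = int(v * tex_h)
    return texture[y * tex_w + x]

# A 2x2 checkerboard stretched over a large polygon keeps hard texel edges:
# sampling at u=0.49 and u=0.51 jumps between texels with no blending at all.
tex = ["dark", "light",
       "light", "dark"]
print(nearest_neighbor_sample(tex, 2, 2, 0.49, 0.49))   # dark
print(nearest_neighbor_sample(tex, 2, 2, 0.51, 0.49))   # light
```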
    • CommentAuthorJimP
    • CommentTimeApr 3rd 2018
     
    As an old geezer, my eyes see no difference between HD and regular television shows. I noticed that my satellite shows are now all HD; I didn't notice anything other than the channel name changing. So, unless pressed by monitor failure, I won't be getting a 4k screen. Sadly, they are too new to go on sale at a price I can afford.

    My current 24" monitor is 1920 x 1080p. Works fine for me. I don't have to wear my eye glasses when looking at my computer screen.
    • CommentAuthorpvernon
    • CommentTimeApr 3rd 2018
     
    I have a stupid question: are all of your CPUs turned on? Windows defaults to one CPU; you have to manually go in and turn the others on. Just asking, since a lot of people are unaware of that detail.
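
    For anyone who wants to check, here is a quick (and purely illustrative) way to see how many logical processors the OS is actually exposing to programs; if it reports fewer than your CPU physically has, a boot setting like the one pvernon describes may be limiting it.

```python
# Quick check of how many logical processors the OS exposes to programs.
import os

print(f"Logical processors visible to this process: {os.cpu_count()}")

# On Windows you can also compare against the value set at boot time:
print(f"NUMBER_OF_PROCESSORS = {os.environ.get('NUMBER_OF_PROCESSORS', 'not set')}")
```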
    • CommentAuthorLoopysue
    • CommentTimeApr 3rd 2018
     
    How weird!

    I have 2 real ones and 2 virtual ones. All have always been automatically turned on without me having to do anything. There was one app that couldn't 'see' the virtual CPUs no matter what I did (nothing to do with PF software, I hasten to add). I suppose it must be different between different machines, and even between different apps :)