
mcolinp

Forum Replies Created

    in reply to: Shading Artifacts #4484
    mcolinp
    Customer

    Upon further testing, it seems that when I use Remove Doubles, the shading goes wonky. I can use Alt-J to convert to four-sided polygons, and the results still look great (in Blender). It still seems like there are some very minor shading issues with my current results, but I will keep testing to see what I can come up with.
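    For anyone following along, here is a rough sketch of that cleanup pass in Blender's Python console (the operator names are the 2.7x/2.8x ones; the merge threshold is just a placeholder, and it assumes the imported mesh is the active object):

        # Mirror of the manual steps: Remove Doubles, then Alt-J (Tris to Quads).
        import bpy

        bpy.ops.object.mode_set(mode='EDIT')      # assumes the imported mesh is active
        bpy.ops.mesh.select_all(action='SELECT')

        # "Remove Doubles" ("Merge by Distance" in newer builds)
        bpy.ops.mesh.remove_doubles(threshold=0.0001)

        # Alt-J equivalent: rebuild quads from the triangulated export
        bpy.ops.mesh.tris_convert_to_quads()

        bpy.ops.object.mode_set(mode='OBJECT')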

    in reply to: Shading Artifacts #4482
    mcolinp
    Customer

    Also, please keep me posted if you would like any private help in testing your translator in the future.

    in reply to: Shading Artifacts #4480
    mcolinp
    Customer

    Thanks for the feedback. Very valuable, and much appreciated!

    I look forward to any further workflow tips or examples you may provide in the future.

    Thanks again.

    in reply to: Shading Artifacts #4477
    mcolinp
    Customer

    I really wish Blender would implement a NURBS translator that converts NURBS to mesh and could have a modifier that allows the user to play with different resolutions without losing the original data. Perhaps they could implement an NGON converter with an option to make sure the mesh stays a consistent resolution across the entire part, avoiding the issue you pointed out in the last couple of screenshots.

    in reply to: Shading Artifacts #4475
    mcolinp
    Customer

    Mikhail, thank you for your detailed response. I have always been a bit confused as to the best procedure to get good-quality NURBS into an equally good mesh. I do always start with a “Stitched Solid”, though I was given the impression that the triangulated mesh generated by Alias was not the greatest when used for shading in polygonal modeling programs such as Blender.

    Someone suggested I use the OBJ N-GON exporter from MOI (Moment of Inspiration, made by developers who also work on Rhino). It generates what you have seen, and it allows the user to use a slider to affect how many verts and faces are created from the NURBS data. I also always use “Remove Doubles” to make sure the export doesn’t leave any extra, unneeded vertices.

    I can see what you did in the last couple of screenshots and will make some of my own attempts at making the mesh flow better. It can become daunting to go through a whole part manually; I find it quite finicky even when trying to make subtle topological changes.

    So are you suggesting that I should be using a triangulated mesh? I guess the reason I was trying quads (n-gons) is that it was implied they would give lighter geometry overall with good shading. Any feedback you could provide would be greatly appreciated. I replied to your poll on CAD/CAM as well. I would love to see some workflow example videos if you get a chance to create something.
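    In case it is useful to anyone else reading, below is a rough batch version of the import-and-cleanup pass described above (Blender 2.8x Python; the file path and merge threshold are placeholders, and the 2.79 equivalent of the active-object line is noted in a comment):

        # Import the OBJ exported from MOI, then clean up every imported mesh.
        import bpy

        bpy.ops.import_scene.obj(filepath="/path/to/part_from_moi.obj")

        for obj in bpy.context.selected_objects:      # the importer leaves new objects selected
            if obj.type != 'MESH':
                continue
            bpy.context.view_layer.objects.active = obj   # 2.79: bpy.context.scene.objects.active = obj
            bpy.ops.object.mode_set(mode='EDIT')
            bpy.ops.mesh.select_all(action='SELECT')
            bpy.ops.mesh.remove_doubles(threshold=0.0001)  # "Remove Doubles"
            bpy.ops.object.mode_set(mode='OBJECT')
            bpy.ops.object.shade_smooth()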

    in reply to: Which CAD/CAM do you use? #4474
    mcolinp
    Customer

    I primarily use Autodesk Alias Design, but I also use the following:

    Autodesk Inventor
    Autodesk Fusion 360
    Rhino 5 w/ Autodesk VSR Plugin

    in reply to: Shading Artifacts #4425
    mcolinp
    Customer
    in reply to: Materials from an Artists Perspective . . . #4011
    mcolinp
    Customer

    Being able to support the Principled BSDF PBR shader in Eevee would be a game changer, along with support for reflecting the World Environment. I realize there are benefits to workflows which are more complex, but I feel there is a need for both, as you have also indicated. Getting something that renders correctly should be a relatively easy path, while optimizing for the best of all possible scenarios should be a secondary and more involved stage.
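    For context, this is the kind of bare-bones Principled setup I have in mind on the Blender side (a sketch in 2.8x Python; the material name and values are just placeholders, not anything the exporter is confirmed to handle):

        # A minimal Principled BSDF material, with Eevee as the render engine.
        import bpy

        bpy.context.scene.render.engine = 'BLENDER_EEVEE'

        mat = bpy.data.materials.new("PBR_Test")
        mat.use_nodes = True
        bsdf = mat.node_tree.nodes["Principled BSDF"]     # created automatically by use_nodes
        bsdf.inputs["Base Color"].default_value = (0.8, 0.1, 0.1, 1.0)
        bsdf.inputs["Metallic"].default_value = 1.0
        bsdf.inputs["Roughness"].default_value = 0.25

        bpy.context.active_object.data.materials.append(mat)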

    Perhaps a way to simulate multiple devices and give feedback on what might help in various cases could be a future development goal. I’ve used Tumult Hype to create really nice animated HTML5 interactive content. It includes a feedback system that specifies which features of your design will be unsupported or rendered differently than expected in certain OSes and browsers. I like this approach, as it leaves it up to me to decide what is acceptable. (I can always make a note on a webpage that certain features aren’t supported in specific setups...)

    in reply to: Materials from an Artists Perspective . . . #4009
    mcolinp
    Customer

    I’m also curious if there is any beta support for 2.8/Eevee that people can be involved in. Perhaps a dedicated forum for it would be good. I noticed that the plugin does load in the latest 2.8 daily build, but I’ve not really spent any time trying to see what I can do differently, if anything...

    in reply to: Materials from an Artists Perspective . . . #4007
    mcolinp
    Customer

    In your first reply, you mentioned a tutorial for equirectangular environment maps...

    Do you have a link?

    Also, when you mentioned creating 3-4 cubemaps depending on the location of objects... this seems to me like something that could be automated. The user should be able to go into the node tree of each object or material and add a node (or node group) that calculates the appropriate cubemaps from the World environment in relation to the object/material, perhaps with options for how many levels of different cubemaps it creates and toggles between (1-2 up to maybe 5 or 6 total). Think of how much time would be saved with such a setup!
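    To illustrate the idea (this is only a sketch of the proposal, not an existing feature; the probe empties, the "Reflection" node name, and the image paths are all made up):

        # Pick the nearest pre-baked cubemap for each selected object and plug it
        # into an assumed "Reflection" environment texture node in its material.
        import bpy

        probes = {                       # hypothetical probe empties -> baked images
            "Probe_Front": "//cubemaps/front.hdr",
            "Probe_Rear":  "//cubemaps/rear.hdr",
        }

        for obj in bpy.context.selected_objects:
            if obj.type != 'MESH' or not obj.active_material:
                continue
            nearest = min(probes, key=lambda name:
                          (bpy.data.objects[name].location - obj.location).length)
            img = bpy.data.images.load(probes[nearest], check_existing=True)
            node = obj.active_material.node_tree.nodes.get("Reflection")  # assumed node
            if node:
                node.image = img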

    I also realize that there are huge changes coming in 2.8, which you have alluded to, and I am excited to see where things will go. Hopefully this discussion will help shape some of the vision and oversight in the coming changes. I encourage anyone else invested in these workflows to also pitch in to the discussion. My main hope is that there is some unification around what makes “sense” from a user perspective, to streamline the user experience (for creators) as well as increase the quality of the scenes created.

    in reply to: Materials from an Artists Perspective . . . #4002
    mcolinp
    Customer

    “There’s no realtime reflection in Verge3D now. To imitate reflection you need to use cubemaps and insert them directly into the material as textures. It’s a common method for game engines to use cubemaps as an imitation of reflection; you can check Unreal Engine for this, for example, and many more. It’s tricky and not clear at first look, but it’s more optimized for the web. If you check the demos of Blend4Web that have realtime reflection, they are very slow; that’s the price paid for rendering the reflections. But they are needed, so we hope to implement them in the future.”

    Thank you for your detailed responses. I do wonder about this approach with cubemaps; my gut says that too many cubemaps would make your file size balloon... Is this true? This is why I suggested making material layers that can control blurred reflections from a master environment image, controlled overall with a Fresnel node in each material. In my mind, this eliminates the need for so many separate images compared to using cubemaps in every material.
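    To make the suggestion concrete, here is a sketch of that layer as Blender nodes (the material name, the image path, and the use of a pre-blurred copy of the master environment are all assumptions on my part):

        # Fake-reflection layer: the master environment sampled along the
        # Reflection vector, mixed over the base shader by a Fresnel factor.
        import bpy

        mat = bpy.data.materials["MyMaterial"]            # assumed existing material
        nodes, links = mat.node_tree.nodes, mat.node_tree.links

        texco = nodes.new('ShaderNodeTexCoord')
        env = nodes.new('ShaderNodeTexEnvironment')       # master environment image
        env.image = bpy.data.images.load("//env/master_env_blurred.hdr",
                                         check_existing=True)
        links.new(texco.outputs['Reflection'], env.inputs['Vector'])

        refl = nodes.new('ShaderNodeEmission')            # env colour as the reflection layer
        links.new(env.outputs['Color'], refl.inputs['Color'])

        fresnel = nodes.new('ShaderNodeFresnel')
        mix = nodes.new('ShaderNodeMixShader')
        base = nodes['Principled BSDF']                   # assumes the default node setup
        out = nodes['Material Output']

        links.new(fresnel.outputs['Fac'], mix.inputs['Fac'])
        links.new(base.outputs['BSDF'], mix.inputs[1])
        links.new(refl.outputs['Emission'], mix.inputs[2])
        links.new(mix.outputs['Shader'], out.inputs['Surface'])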

    in reply to: Materials from an Artists Perspective . . . #3997
    mcolinp
    Customer

    A possible aspect that could help: create a flag for nodes that are not supported. It could be a yellow triangle with an exclamation point, or a red glow outline (only around unsupported nodes) in the Blender node editor. Having this kind of feedback is essential to know what is not working/supported.
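    As a rough sketch of what I mean (the list of supported node types here is invented purely for illustration):

        # Tint any node whose type is not in a supported list, as a visual warning.
        import bpy

        SUPPORTED = {'BSDF_PRINCIPLED', 'TEX_IMAGE', 'NORMAL_MAP', 'OUTPUT_MATERIAL'}

        for mat in bpy.data.materials:
            if not mat.use_nodes:
                continue
            for node in mat.node_tree.nodes:
                if node.type not in SUPPORTED:
                    node.use_custom_color = True
                    node.color = (0.8, 0.1, 0.1)   # red tint as the "unsupported" flag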

    mcolinp
    Customer

    I also forgot to mention that a nice aspect of Radeon Pro Render is that it fully supports Windows, Mac, and Linux, and it is FREE!

    mcolinp
    Customer

    Yuri,

    Please accept my apologies. I believe that the issue causing my problems was actually my NAS not being recognized/discovered. (It’s where the “scripts” folder is pointing to in my case...)

    So not a problem on your end.

    Thanks for the support!

    in reply to: Roadmap For The Near Term #1393
    mcolinp
    Customer

    Another aspect of using HDR images in Blender is that there doesn’t seem to be a way to make objects appear as if they are actually resting in the scene; they are always “floating”. There are some hacks that work with compositing for Cycles renders, but this doesn’t work or help for interactive scenes.

    In other software I have used, there are “height” adjustments that can be made to the HDRI’s relationship to the scene. This may be partly a Blender issue, but perhaps you can put some pressure on them to add it as a needed feature.
