
Overengineering Modularity

Started by September 18, 2018 07:54 AM
23 comments, last by GNPA 6 years, 1 month ago

 

3 hours ago, Fulcrum.013 said:

Just create a folder called include and place any shared headers in it. For code that isn't just inlined templates and requires adding dependencies, make a libs folder with static-lib projects that have all dependencies compiled in, and reference each one via #pragma comment(lib, "") in the related header, following the concept of one header = one lib. Add all the libs to the solution in case you need to rebuild them later. Add the include and lib paths to the project settings and the output targets of the lib projects. And no nightmare any more. Just include the required header and enjoy.

That's exactly what I call 'nightmare' :)

http://9tawan.net/en/

1 hour ago, Shaarigan said:
4 hours ago, LorenzoGatti said:

Do you really have a "core component"?

Yes, the core component contains anything that is used by other components built on top of it. The Crypto module, for example, uses streams and long-integer math from Core, Input uses the event implementation from Core, Network uses streams, threading, and other stuff from Core... and so on.

So it isn't a cohesive component, it's a collection of low-level libraries that have little to do with each other and would probably benefit from being separated.

Omae Wa Mou Shindeiru

32 minutes ago, mr_tawan said:

That's exactly what I call 'nightmare' :)

This library organization concept has served well since Fortran times. Many other languages have a similar strategy for common libs. C/C++ just has separate headers to solve the module-header duplication and cross-dependency problems. One tiny concept helps turn this "nightmare" into a powerful library organization tool: common component headers and built libs go into a common libs/include folder (or its subfolders), while project-specific code goes into the project folder. Specifically to support this concept, #include has two syntax options: "" to begin the search from the current directory, and <> to begin the search from the directories specified in the project settings.
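For what it's worth, the "one header = one lib" scheme described above can be sketched roughly like this (the geometry name and paths are purely hypothetical, and the auto-link pragma is MSVC-specific, so it is guarded; the inline definition is only there to keep the sketch self-contained, where the real scheme would put it in the static lib):

```cpp
// include/geometry.h -- hypothetical shared header placed under the
// common "include" folder from the scheme above.
#pragma once

#ifdef _MSC_VER
// MSVC-specific: auto-link the matching static library so a consumer
// only has to #include <geometry.h>; the "libs" output folder must be
// on the linker's library search path for this to resolve.
#pragma comment(lib, "geometry.lib")
#endif

namespace geometry {
    // Defined inline purely so this sketch compiles on its own; in the
    // scheme above the definition would live in geometry.lib instead.
    inline double dot(double ax, double ay, double bx, double by) {
        return ax * bx + ay * by;
    }
}
```

A consumer then writes `#include <geometry.h>` (angle brackets, since the common include folder is on the project's search path) and never touches linker settings.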

#define if(a) if((a) && rand()%100)

1 hour ago, Fulcrum.013 said:

MSVC uses a clang-based compiler. It compiles through LLVM, which uses separate native code generators.

Got a citation for that? My understanding was that even VS 2017 still used MS's proprietary (and, from what I've heard, very hard to maintain) compiler toolchain.

3 hours ago, Fulcrum.013 said:

Just create a folder called include and place any shared headers in it. For code that isn't just inlined templates and requires adding dependencies, make a libs folder with static-lib projects that have all dependencies compiled in, and reference each one via #pragma comment(lib, "") in the related header, following the concept of one header = one lib. Add all the libs to the solution in case you need to rebuild them later. Add the include and lib paths to the project settings and the output targets of the lib projects. And no nightmare any more. Just include the required header and enjoy.

That's probably fine for small projects, but how well does that work when your codebase is several million (or even a mere couple of hundred thousand) lines long? :D

Just now, Oberon_Command said:

Got a citation for that? My understanding was that even VS 2017 still used MS's proprietary (and, from what I've heard, very hard to maintain) compiler toolchain.

2015 had an optional experimental clang-based compiler. 2017 rapidly extended the set of supported targets. So it looks like they did the same as Embarcadero: just added their language extensions to clang, and the result became a proprietary tool. Even if they went back to the old compiler, they would have added a similar separate-code-generator technology in any case.

#define if(a) if((a) && rand()%100)

Let me go back to the topic.

I think you should start by evaluating the scope of your project. 'Modular' can be overkill. I think most people don't go that route because they know that once they start a new project, they will just make a copy of the old one and start reiterating (or maybe start from scratch).

I think a 'modular' design is beneficial when many developers work on different projects based on shared modules. You can just pull a module when you need to, and dependencies are automatically managed (including indirect dependencies). One project might use a module for one thing, another project for another, and so on. That fits larger companies (e.g. Google), but probably not an indie developer. It's like you said: 'overengineering'.

Modern languages have ways to manage libraries, modules, and dependencies, so setting up is a breeze. In Java, for example, we can just create a project with a Gradle build file containing the direct dependencies it requires. Gradle can then pull anything required from a centralized repository (publicly or privately hosted). It can even host build steps, so everything is automated. This is something not yet available to us C++ programmers, as C++ modules are not yet standardized. We are pretty close, though. Until then, having a modular design can be quite challenging.

CMake is probably the closest thing to the tools available in other languages, but it's still very awkward to use, and it can only check whether the system has a given module or not. Installing a required module is still essentially manual work.

Anyway, we can have a project with a modularized design structure in C++ (though it is essentially a monolithic system; the code just looks like modules).

http://9tawan.net/en/

37 minutes ago, Oberon_Command said:

but how well does that work when your codebase is several million (or even a mere couple of hundred thousand) lines long?

Most likely, remake it from scratch following scientifically proven requirements for development. Most likely it will require much less code. For example, I have seen a SCRUM-"developed" system that took over 10 person-years to solve 5 specific cases of one geometrical problem, is 300k lines long, and does not work properly. For comparison, a classically developed system that solves the complete set of similar problems universally fits into 15-20k lines of code and can be done from scratch in less than 1 person-year.

#define if(a) if((a) && rand()%100)

1 hour ago, mr_tawan said:

This is something not yet available to us C++ programmers, as C++ modules are not yet standardized.

The C/C++ header system is much more powerful than what any other language has. But like any powerful tool, it just requires some experience and understanding of its concepts to use it properly. The same applies to any other C++ feature that is much more powerful than other languages', like template magic, operator overloading, pointer arithmetic, multiple inheritance, memory management, automatic constructor and destructor calls, and so on. The only problem with C++ is a committee that promotes a conceptually outdated library instead of adding to the standard the core language features that have served well in modern C++ dialects since the mid-90s, like core-implemented properties, delegates, and Reflection/RTTI.
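As a minimal illustration of the "automatic constructor and destructor calls" point (RAII), a sketch with purely hypothetical names: the destructor runs deterministically at scope exit, with no garbage collector involved.

```cpp
// Counter tracking how many resources are currently alive (for the demo).
static int live_objects = 0;

struct ScopedResource {
    ScopedResource()  { ++live_objects; }  // acquire in the constructor
    ~ScopedResource() { --live_objects; }  // release in the destructor
};

// Somewhere in client code:
void use_resource() {
    ScopedResource r;   // constructor runs here, automatically
    // ... work with r ...
}                       // destructor runs here, automatically, even on
                        // early return or exception
```

After any call to `use_resource()`, `live_objects` is back to 0 without the caller writing a single cleanup line; that is the deterministic lifetime management the post refers to.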

 

 

#define if(a) if((a) && rand()%100)

Alright, let me try to organize some thoughts on this subject. Modularized design is one of these things that everyone tosses around as of course that must be a good idea, but very few people actually succeed at doing. As has been noted, C++ is particularly poor for modular design compared to most later language designs, but that's far from the only issue or even the major one.

Whenever code interacts, there are assumptions involved about how both sides behave. It's tempting to say that oh, B strictly depends on module A and so module A can be cheerfully reused, interchanged, etc without having to worry about B. It's also easy to say that things won't ever end up straddling the boundary, or making assumptions about the internal behavior of either piece. This is where we start to head into the land of leaky abstractions, systems that ostensibly live as independent modules and yet expose much more of their implementation than expected or intended.

Eventually at the scale of something like a game, we tend to end up with modules that are entirely co-dependent, at least at a behavioral level and sometimes at a full code level. The physics and rendering code ends up with hacks, taps, or magic variables that exist to enable behaviors and structures that simply did not modularize in any meaningful way. I've seen people argue that the entire concept of code reuse (especially in the context of OOP) is overblown and more of a fantasy than reality, though I think that's extreme.

That's all well and good at an academic level, but you'd probably like some advice about what to do about it. I'm going to give the same advice I give for many design problems: don't do anything at all. The biggest mistake developers routinely make is designing systems and interfaces for what they imagine they would like to solve instead of what they're actually solving. Focus on the exact things you need to actually produce the exact product you're making, and put just enough design effort in so that everything doesn't go to hell if you need to reshape it later. KISS and YAGNI are undervalued approaches. Take your matchmaking server that doesn't need math code. Suppose the math code goes in anyway. Who cares? What has this cost you? What problem has it created? What problem does removing it solve?

Don't get lost trying to solve problems you don't actually have. Create a module when you have need to share blocks of code smoothly across multiple products, and then you'll have the necessary context to design the module properly. There's no sense making a module out of something that only has one use case in the first place.

SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.
1 hour ago, Promit said:

Focus on the exact things you need to actually produce the exact product you're making, and put just enough design effort in so that everything doesn't go to hell if you need to reshape it later. KISS and YAGNI are undervalued approaches.

OK. As we know, any program directly or indirectly implements a state machine. If machine A has n states and machine B has m states, together they have n+m states. But a machine C whose behavior equals both machines at the same time has n*m states. So the tinier the universal pieces used to decompose a task, the shorter and simpler the code required in total. Deep task analysis and high-level abstraction is a much simpler way to get a ready solution than "just code what somebody who knows nothing about the task field said". Also do not forget that any component you code today becomes your tool tomorrow. So universally designed components are just a way to make something for the cost of a couple of machine-seconds of template instantiation instead of person-years of coding. A really good programmer is a lazy programmer, because he prefers to write short universal code that allows shortening the dependent code at least twice, compared with "focus on the exact things you need to actually produce". Because the only aim of a lazy programmer is to make an architecture/components that do 99% of his job for him. And really, as a result, it allows spending another couple of machine-seconds of template instantiation instead of person-years of coding without analyzing the task.
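The state-count arithmetic above can be sketched directly (the sizes 10 and 12 are arbitrary illustrative numbers): two machines analyzed separately contribute n + m states to reason about, while one machine that must track both concerns at once is the product machine with n * m states.

```cpp
#include <cstddef>

// Two machines reasoned about independently: n + m states total.
constexpr std::size_t separate_states(std::size_t n, std::size_t m) {
    return n + m;
}

// One machine that tracks both at once: every pairing of an A-state
// with a B-state is a distinct combined state, so n * m total.
constexpr std::size_t combined_states(std::size_t n, std::size_t m) {
    return n * m;
}

// With n = 10 and m = 12: 22 states if the concerns stay separate,
// 120 if they are tangled into a single machine.
static_assert(separate_states(10, 12) == 22, "n + m");
static_assert(combined_states(10, 12) == 120, "n * m");
```

The gap widens multiplicatively as more concerns are merged, which is the core of the argument for decomposing into small independent pieces.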

 

 

#define if(a) if((a) && rand()%100)

This topic is closed to new replies.
