A FactoryFactoryFactory in Production (hackernoon.com)
69 points by hliyan on Dec 18, 2017 | 46 comments


What is up with the seemingly popular knee-jerk reactions towards OO design patterns? It genuinely feels like some sort of mass hysteria... Yes, strategies like dependency injection might seem like pointless complexity if you don't examine why they exist in the first place, but they do solve genuine problems. And no, these aren't just problems that wouldn't exist if you used a functional language... (no silver bullets or else we'd all be writing Scala or Haskell)

Yes, Java has a lot of problems, but they are mostly ones that have workarounds and aren't fundamental dealbreakers. A few example issues: variables aren't final by default, @Override isn't required when overriding methods, and you can't have abstract static methods (you can sort of work around this now with default interface methods). Java is basically just a really un-opinionated language that requires careful practices to wield properly (see Effective Java by Joshua Bloch), but other than that it isn't a completely terrible language, and you can write some really robust, easy-to-read code with it.

I would also have a lot more time for this article if the author had actually taken the time to explain the motivation behind the *FactoryFactoryFactory class instead of just discounting it without explanation and using it as the proverbial straw man for his article. I'm sure something this unusual must have a very interesting reason for existing (I'm taking him literally here and assuming he isn't just exaggerating for comic effect).


I've personally seen the kind of engineers who simply pattern-match rather than think things through cram as many GoF chapters as they can into a single commit. I was literally told during a code review, when questioning what these layers were achieving for the product, that 'all abstraction is a benefit since you can then always work at an even higher level'.

Java for whatever reason seems to attract these ideas.


My pet speculation on why Java attracts this stuff is that it's actually a very robust and strongly backward compatible (and forward compatible) language that runs on a very robust VM.

I can take a compiled .jar file from the late 90s and in quite a few cases I can run it right now unmodified. I can also link against it and use it in Java code pretty trivially. I can do this with modern tools and modern VMs. Then I can copy the result from an x64 box to an aarch64 box and most of the time it will run the same way. That is just amazing.

Java's strength in the areas of robustness and compatibility means it can sustain a lot of complexity. You can build really insanely complicated towers of Babel in Java and they won't topple over. Java's excellent tooling means they'll even be somewhat maintainable.

This transforms Java's great strengths into its great weakness. Its capability to sustain complexity means that nothing checks the tendency of sophomore programmers to over-engineer everything. In other languages over-engineered rococo monstrosities just fall over, but in Java you can actually get away with it to a shocking degree and still ship working robust code.

Try that kind of over-engineering in C++ and you'll have 20 minute compile times for small projects. (I've seen this, so people still try!) Try it in a dynamic language like JavaScript or Ruby and your complex design pattern monstrosity will become a non-deterministic generator of random runtime bugs. These languages can't sustain code that is riddled with FactoryFactoryFactorySingletonFactoryObserverFactory type stuff. Java can, so the Java ecosystem is where the people who love to over-engineer go.


My pet speculation is that Java programmers are culturally predisposed to, and typically proud of, creating towering architectural ziggurats with at least three layers of unnecessary abstraction.

It's the only language for which I've ever met "architects who didn't code". It wasn't backwards compatibility that made that happen.


Java today is used heavily in the enterprise, which thrives on utilizing marginal talent. Java didn’t look that way in the 90s, and anyways, I bet you’d see drastically different kinds of code comparing enterprise and Android.


> Java's excellent tooling means they'll even be somewhat maintainable.

I think it may be the other way round: excellent tooling seems essential for Java to succeed. Since standard JDK could not even compile a project without ant/gradle/maven etc. Java has tools to parse/filter stack traces, of course, because a few-thousand-line vomit for a missing file or wrong jar means one needs tools to extract the relevant error from the stack trace. A 10-level file hierarchy to read some config data means I can't really code without a full-blown Java IDE.


> Since standard JDK could not even compile a project without ant/gradle/maven etc.

I'm sorry, but I think you have to defend that statement.

Certainly, gradle and maven do a LOT to make it easier to BUILD java projects, but compile?

These tools do more to apply some common-sense patterns to project source and dependency management, so that you're only specifying the things you really need to, but, like most compilers, javac just needs to know where the compile-time dependencies are and where the source files are. The SDK is quite able to compile without a build tool.


That actually makes a lot of sense.

You've got VM tech like polymorphic inline caches, and the kind of speculative devirtualization and subsequent inlining that you can only get from a JIT that'll flatten these towers of complexity at runtime.

And you have a simple static type system that can easily be tooled to flatten it in developer mind-space as well.


I'd say design patterns should never be applied deliberately. They are simply optimal solutions to common design problems in OOP. When you encounter your own problem, derive your own optimal solution, and call it some "pattern" if it happens to resemble one.

IMO pattern names are merely for communication purposes, like saying "abstract factory" instead of having to say "I have an interface F for creating instances of interface A, where different implementations of F can be chosen at runtime to create different instances of implementation of A". And also you are not required to call your factory "Factory".
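The parent's mouthful ("an interface F for creating instances of interface A, where different implementations of F can be chosen at runtime") maps onto a minimal Java sketch like the following; the `Shape`/`ShapeFactory` names are illustrative, not from the article:

```java
// Interface A: the product.
interface Shape {
    String draw();
}

// Interface F: the abstract factory, chosen at runtime.
interface ShapeFactory {
    Shape create();
}

class Circle implements Shape {
    public String draw() { return "circle"; }
}

class Square implements Shape {
    public String draw() { return "square"; }
}

class CircleFactory implements ShapeFactory {
    public Shape create() { return new Circle(); }
}

class SquareFactory implements ShapeFactory {
    public Shape create() { return new Square(); }
}

public class Demo {
    // Client code depends only on the interfaces, never on the
    // concrete classes, so the implementation can be swapped freely.
    static String render(ShapeFactory factory) {
        return factory.create().draw();
    }

    public static void main(String[] args) {
        System.out.println(render(new CircleFactory())); // circle
        System.out.println(render(new SquareFactory())); // square
    }
}
```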


OOP has been a massive step in the wrong direction for programming. To take your example of DI, in Clojure, dependency injection can be done in a few lines for the whole application. In Java/C++/PHP and other languages that use constructor or setter injection, it requires all the code to be rewritten around a DI container, and requires many modifications just to get to the point of writing code that actually does something. There is no comparison. We're talking a dozen lines of code vs. dozens of classes spread over dozens of files to achieve the same thing. Minutes vs. hours spent on boilerplate.
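For reference, the Java side of this comparison in its minimal, container-free form is plain constructor injection; the `MessageSender`/`Notifier` names below are made up for illustration:

```java
// Manual constructor injection: the dependency is passed in rather
// than constructed internally, so it can be swapped at the wiring
// point (or by a DI container) without touching this class.
interface MessageSender {
    void send(String msg);
}

class EmailSender implements MessageSender {
    public void send(String msg) { System.out.println("email: " + msg); }
}

class Notifier {
    private final MessageSender sender; // injected dependency

    Notifier(MessageSender sender) { this.sender = sender; }

    void notify(String msg) { sender.send(msg); }
}

public class DiDemo {
    public static void main(String[] args) {
        // The "composition root": all wiring happens in one place.
        Notifier notifier = new Notifier(new EmailSender());
        notifier.notify("hello");
    }
}
```

Even this minimal form needs an interface, an implementation, and a wiring point per dependency, which is the boilerplate the parent is contrasting with a few lines of Clojure.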

I wouldn't call this a knee-jerk reaction, but rather a well-thought out reaction to a few decades of OOP. When I started programming 25 years ago, I felt OO was a misguided idea but I was a novice and couldn't put my finger on it. Now, it's clear to me that all OO does is add unnecessary complexity and simply cannot model real life at all. I don't see a single problem that it solves better than either procedural or functional paradigms. Those paradigms have a lot less boilerplate, are a lot easier to understand, and can be learned much more easily.

Giving precedence to code over data, IMO, is a huge, irredeemable mistake. That's literally the mistake OOP is built on. Data should be a first-class citizen in any system, yet it's a second- or third-class citizen in OOP. It is hidden in objects, processed only by a limited set of methods, and generally hard to work with (you can have data-only objects, of course, structs, but then you're back to a procedural paradigm). I find myself spending anywhere from 25-75% of my time in OOP languages writing boilerplate to get the right objects talking to each other, and the rest writing the actual business logic that matters. In something like Clojure, time spent writing boilerplate is around 5% at most. I'm using Clojure simply as one example. I'm sure there are other functional and procedural languages that are also good examples.


Because nested factory patterns are a legacy of an era where lambdas were not well understood, or poorly represented by the type system

Continued use of Factory patterns doesn't deserve outright mockery, but it does deserve immediate and firm correction. It introduces confusion, complexity, and needless ontology where a nullary lambda would do everything the pattern does while slicing away all the ambiguity.
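In post-Java-8 terms, the point is that `java.util.function.Supplier` already *is* a nullary factory; a sketch (the `Event` type here is hypothetical):

```java
import java.util.function.Supplier;

class Event {
    final String name;
    Event(String name) { this.name = name; }
}

public class SupplierDemo {
    public static void main(String[] args) {
        // An EventFactory is just a Supplier<Event>...
        Supplier<Event> eventFactory = () -> new Event("click");

        // ...and an EventFactoryFactory is a Supplier<Supplier<Event>>,
        // with no named classes or interfaces required at any level.
        Supplier<Supplier<Event>> factoryFactory = () -> eventFactory;

        Event e = factoryFactory.get().get();
        System.out.println(e.name); // click
    }
}
```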


I don't really see why it matters what the motivation was behind it. The point is that Java for some reason seems to be an attractor for that kind of ridiculous code. It is where you end up when taking "OO design" to its logical conclusion. The OO paradigm is like religion: mostly harmless if you don't take it seriously.


I'm sorry but you can't possibly be serious. There is no sane motivation for a *FactoryFactoryFactory class. No problem on earth is complex enough to benefit from that much abstraction.


What, serious about wanting to know why it is in their codebase? Yes, I'm 100% serious. If they're currently using it in production I'd assume there's a good reason for it. Plus, I'm pretty sure the folks over at LinkedIn engineering are definitely serious. In any case I'm very curious as to why (rightly or wrongly) it's in there.


It's in their codebase because they catastrophically over-engineered it. Even the largest codebases on earth don't require FactoryFactoryFactories.


After reading this yesterday, I felt inspired to write a short article about how modern techniques deprecate the factory pattern (at least in the form GoF offers with specific ontology) and why composed lambda functions offer more functionality with less complexity.

http://extradimensional.space/posts/2017-12-18-Deindustriali...


I've worked with a lot of different languages, developing games and server software, for 15 years. Everyone I've worked with, myself included, always tries our best to give reasonable, readable names to things. I've never needed to call something a "Factory", let alone a "FactoryFactory" or "FactoryFactoryFactory".

I don't know what it is about Java that makes people write code and name things like this.

This kind of stuff is just foreign to me, and I just don't understand any of the rationale behind it. I hear about it all the time, but can't fathom how code ends up like this.


Well, a factory pattern is a well-defined pattern in OO software design, and one that a developer of any experience level can look up and understand. Your particular implementation may give rise to a more plain and contextually-accurate name, but if you call it something other than a Factory, posterity may need more time and context to discern what your code does.

If I'm a new developer on a project, I know what a SprocketFactory does, and I know that I'm going to get SprocketFactorys from a SprocketFactoryFactory, and I know that if I call the SprocketFactoryFactoryFactory, I'm sure as hell gonna get back a SprocketFactoryFactory, which will then give me a SprocketFactory, which will then produce Sprockets.


If I'm a new developer on a project and see a class called "SprocketFactoryFactoryFactory" I'm probably going to cry for a bit and wonder whether I couldn't have got a better gig, and then try to figure out a better way of producing sprockets.

Human short-term memory only has a size limit of about 4 simultaneous chunks so once you're juggling those levels of indirection any development is going to be glacially slow until you've acquired a great deal of experience with the codebase.

Out of interest, can you describe a concrete case where multiple levels of factory patterns have actually been justified and resulted in simpler code than an alternative?


If Java had default parameters and named parameters I think a ton of the need for Builders and Factories would go away.
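For comparison, this is the kind of Builder that named and default parameters would make redundant; the `Pizza` example is hypothetical, and in Kotlin or Python the whole thing would collapse into a single constructor call with named arguments and defaults:

```java
// Without named/default parameters, optional arguments push Java
// toward the Builder pattern.
class Pizza {
    final String size;
    final boolean cheese;
    final boolean pepperoni;

    private Pizza(Builder b) {
        this.size = b.size;
        this.cheese = b.cheese;
        this.pepperoni = b.pepperoni;
    }

    static class Builder {
        // Each field stands in for a default parameter value.
        private String size = "medium";
        private boolean cheese = true;
        private boolean pepperoni = false;

        Builder size(String size) { this.size = size; return this; }
        Builder cheese(boolean c) { this.cheese = c; return this; }
        Builder pepperoni(boolean p) { this.pepperoni = p; return this; }
        Pizza build() { return new Pizza(this); }
    }
}

public class BuilderDemo {
    public static void main(String[] args) {
        // The "named parameters" are the chained setter calls.
        Pizza p = new Pizza.Builder().size("large").pepperoni(true).build();
        System.out.println(p.size); // large
    }
}
```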


I can't, but if you're the FNG on a project and see SprocketFactoryFactoryFactory, you may have a quick and easy path to a better solution. Just make sure that that better solution names its patterns well so the next FNG can make your code even better.


I like how you assume that a mind that doesn't balk at "SprocketFactoryFactoryFactory" is also a mind that will ensure the implied contract of that name is adhered to. But I guess that sort of optimism -does- come with 'new developer on a project', so you're probably correct in what the developer will think they know.


I think between a horrible application of a pattern that has a good chance of being implemented properly and a horribly named class that leaves no potential to intuit its behavior, you might fare better with the former, as a new dev. But at that level it's all subjective.


"There are two hard problems in Computer Science: naming things, cache invalidation, and off by one errors"

People give up on naming things because the name they're seeing makes sense to them in that moment, and they may not realize how confused they'll be when they come back to the code in 3 months. Maybe it's a lack of experience maintaining "old code", or just part of the programming culture they're in.


"Comments are for people other than yourself and that includes you because future you is not you"


I think a portion of it may be java's widespread use in large, "enterprise" environments where due to a variety of cultural and technical factors there's a high level of complexity and abstraction. Pretty soon it becomes the norm. I know plenty of java codebases that are clean, readable, and FactoryFactory-free, but never from an "enterprise" company.


>I don't know what it is about Java that makes people write code and name things like this.

The cause is a lack of abstraction facilities and a very limited (rigid, the opposite of "flexible") object system. The programmer must use specific design patterns as workarounds, and thus names classes according to the semantics of the pattern applied.


In OOP, a factory is a pretty common design pattern, and I'm a bit surprised you've never seen it in 15 years of software development... considering it's taught in schools these days. In a nutshell, a factory is a class whose purpose is to generate instances of other classes (https://en.wikipedia.org/wiki/Factory_method_pattern).

The main reason they're used is to create objects without specifying the class of the object, however there are plenty of other reasons why they're common as well.

You start stretching it when you have FactoryFactory. This presumably would be a factory that creates factories. If I saw this in a code review, I would call it out, as it could most likely be refactored to be simpler. FactoryFactoryFactory is beyond silly.


I think you misread the comment. The poster says that the classes did not have the word «factory» in the name. The functionality was the same.


I think a big part of it is that common libraries do it and ask you to do it more. Spring and Tomcat and etc are, imo, abominations of complexity -- but you still often find yourself buying into their ecosystems. Especially if you're working on code that's a few years old.


I'll add: both Spring and Tomcat exemplify this quality that a lot of earlier (2000-era) Java frameworks have (others: Hibernate, actually literally anything that embraces XML...), of being "extensible mostly in name only". You can, in theory, implement modules that tie into various parts of their functionality -- and in practice I've found myself doing it occasionally -- but it's almost always regrettable.

Anything you figure out how to do in these will itself just become un-figure-outable for the next engineer who handles it.

(One of the most frustrating tasks in my engineering career involved trying to debug why Spring beans loaded by a Spring-Tomcat integration layer, in the initialization of a Tomcat container, weren't discovering XML files loaded at another stage of initialization. When I look back on that year I think I might have indirectly changed jobs because of it. Hah.)


I've never done heavy development in Java, but my day-to-day has been C# for years. I've never seen a FactoryFactoryFactory, or even a FactoryFactory. Since the two languages are so similar, I would think C# would have the same issues as Java in this regard, but in my experience it doesn't. Is there something about C# that has let it avoid this? Or have I just happened to work at places that don't like the Factory pattern as much as everyone else seems to? :D


C# often has "release valves" that allow you to divert from some of the mess that Java tends to exhibit. `events` and `delegates` from C# 1/.NET 1, flawed as they may be in retrospect, alone simplify several major patterns that Java falls into.

On the paradigm issue, I also think Java architects seem more inclined to have a "ravioli code" problem. There's an ancient Java ideal of componentization [1] that a lot of Java code tries to live up to: lots of little components boxed into tight containers that only ever interact through often "reconfigurable" sauce. Ravioli.

There are some benefits to that ideal, it exists because it has some merits in terms of component testing, in particular. It's just that as with spaghetti code and copypasta code, ravioli code also sometimes winds up in a mess where the ideal meets the real world, and somehow a toddler (or junior developer in this analogy?) has thrown it all over the kitchen and you have no idea how to clean it up without destroying the entire kitchen and starting over... May not be a great analogy. ;)

I'd argue that you are more likely to see Java-style ravioli code in C# projects that use IoC containers of one sort or another. Again, there's nothing particularly wrong with IoC containers [2] as an idea connected to some ideals of component testing, it's just that it's an ideal that often falls down and fails to deliver its advantages in the real world.

[1] That somewhat resembles the Smalltalk ideal of object message passing while the language itself lacks much of the architecture for that. Arguably, a lot of Java's worst problems come from trying to replicate a lot of Smalltalk ideas and design patterns in a language that is nothing at all like Smalltalk and misses some important characteristics of Smalltalk.

[2] IoC Containers are a ravioli concept directly imported to C# from the Java world, with the earliest IoC containers such as Spring.NET being Java ports.


It's a cultural issue.

Patterns are things you can avoid using; they simplify a pre-existing problem. But if engineers don't ask themselves what the code would look like if they wrote it differently, a lot of stupid code will be the result. That can be boilerplate, or it can be spaghetti.

More communication within the team will help.


You see this a lot in C# projects that are ports from a Java analogue, like log4net.


That's pretty funny, but EventFactoryFactoryFactory isn't exactly Java's fault.

There's a software development culture that believes adding a layer of abstraction is the answer (it doesn't matter what the problem is) as long as it has a GoF pattern name (it doesn't even need to implement the pattern it's named for).

For whatever reason they've adopted Java, but if you're going to throw abstractions on top of abstractions carelessly you're going to create a mess of spaghetti no matter the language or environment.

(I think the preference for Java is just an accident of timing of when the culture arose. The responsibilities of corporate IT groups were exploding and they needed a language for all the software they had to create. Java was there and it was a good choice.)

I'm feeling bad about my laughing and complaining, which is useless, so here's some practical advice on restraint when it comes to using abstractions and patterns:

* Don't use an abstraction to wrap something you don't understand well. You may feel like you've tamed a confusing, complicated API and turned it into something you do understand and can use effectively. But you've actually just rolled up all the things you didn't account for and don't understand and hidden them away from view, where they will explode if your software actually sees much use. I realize this isn't always easy to avoid, but at least never wrap something in an abstraction *because* you don't understand it that well. And when you're forced to implement an abstraction on top of an API you don't understand well, go through the API, method (or function or endpoint or whatever) by method, look at all the options, and ask yourself what each of those options and methods is there for.

* When considering using a pattern, make sure to check the section titled something like "When Does This Pattern Apply?" and make sure the abstraction you're considering actually solves the problem.


I looked at the JDK codebase: it has about 360 factory classes. I looked at a popular Java project, Gradle: it has about 350 factories. My project at work has about 50 factories.

I agree that Factory* is not Java's fault. However, I think the Factory phenomenon did not start with corporate IT but with people working on the core JDK or big open source projects.


I should clarify:

I don't have a problem with the factory pattern. I have a problem with piling abstractions and indirection on top of abstractions and indirection to create spaghetti systems.

I don't know how EventFactoryFactoryFactory is being used but it looks like we're dealing with four levels of abstraction simultaneously, which seem likely to me to be a horrible failure of design. (I'm kind of assuming this isn't an event planning system, or something, and that "event" is an abstraction for something happening.)


This is a great critique of a coding style, but not really of a language. Being an older person, I can appreciate how much object oriented languages helped organize complex code. But then it became a religion and we got the silliness the author describes. Keep in mind, no matter how smart you are, something you are doing today is going to seem really dumb twenty years from now.


This article has no insights whatsoever.


Author here, I also wrote a more in-depth article on how to apply concepts from functional programming to languages like Java 8: https://hackernoon.com/practical-functional-programming-6d79...


I think most would agree FactoryFactoryFactorys are bad. But how does one fix this? What's the alternative? Maybe a post on how to disentangle one of these?


Eh, are Haskell programmers really in a position to throw stones here? I mean, in Haskell terms, a FactoryFactory corresponds to something like

  type EventFactory = () -> Event
  type EventFactoryFactory = () -> () -> Event
which seems fairly innocuous. Meanwhile, Haskell programs have gems like

  type Prism s t a b = forall p f. (Choice p, Applicative f) => p a (f b) -> p s (f t)


Is there anything a factory does that currying wouldn't solve?
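Largely no, for the parameterized case: a "configured factory" is just a curried function applied to its first argument. A Java sketch using `java.util.function.Function` (the `Connection` names are made up):

```java
import java.util.function.Function;

public class CurryDemo {
    record Connection(String host, int port) {}

    public static void main(String[] args) {
        // A curried constructor: host -> port -> Connection.
        Function<String, Function<Integer, Connection>> connect =
            host -> port -> new Connection(host, port);

        // A "ConnectionFactory" pre-configured with a host is just the
        // curried function partially applied; no factory class needed.
        Function<Integer, Connection> localhostFactory = connect.apply("localhost");

        Connection c = localhostFactory.apply(8080);
        System.out.println(c.host() + ":" + c.port()); // localhost:8080
    }
}
```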


"are you sure you want to delete this repo?"


Let the past die. Kill it, if you have to.



