
Tuesday, February 10, 2015

A simple rate limiter using Java's DelayQueue

It is rare for me to work at this level of use case, but recently we had to manage some workload with limited resources, which led to a simplified, lightweight rate limiter built on Java's concurrency additions.

Guava's RateLimiter is pretty good, but I didn't want to pull in a (fat) dependency on Guava for just one class, so I wrote a variant based on Java's DelayQueue:

import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;
import static java.util.concurrent.TimeUnit.NANOSECONDS;

class DelayedEntry implements Delayed {
  long expireAt;
  final TimeUnit unit;

  DelayedEntry(long delay, TimeUnit tu) {
    unit = tu;
    setDelay(delay);
  }

  void setDelay(long delay) {
    this.expireAt = System.nanoTime() + unit.toNanos(delay);
  }

  // Never called in practice: the queue only ever holds a single token.
  @Override
  public int compareTo(Delayed other) {
    throw new IllegalStateException("Expected single element queue");
  }

  @Override
  public long getDelay(TimeUnit u) {
    return u.convert(expireAt - System.nanoTime(), NANOSECONDS);
  }
}

import java.util.concurrent.DelayQueue;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;
import static java.lang.Math.max;
import static java.util.concurrent.TimeUnit.NANOSECONDS;

class RateLimiter {
  final DelayQueue<DelayedEntry> queue;
  final DelayedEntry token;
  final TimeUnit rateUnit;
  final AtomicInteger rate;

  RateLimiter(int rateLimit) {
    queue = new DelayQueue<>();
    rateUnit = NANOSECONDS;
    rate = new AtomicInteger(rateLimit);
    token = new DelayedEntry(0, NANOSECONDS);
  }

  boolean acquire(int permits) throws InterruptedException {
    long targetDelay = rateUnit.toNanos(permits) / max(1, rate.get());
    // The queue holds at most one token; take() blocks until its delay expires.
    DelayedEntry nextToken = token;
    while (!queue.isEmpty()) {
      nextToken = queue.take();
    }
    assert nextToken != null;
    // Re-arm the token and put it back for the next caller.
    nextToken.setDelay(targetDelay);
    return queue.offer(token);
  }
}
This isn't a mathematically precise rate limiter that can shape bursty traffic into a uniform distribution under volatile rate limits. However, for use cases with predictable timings it works pretty well, with minimal and bounded resource usage.
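To see the pacing in action, here's a self-contained sketch (hypothetical class names) of the same single-token DelayQueue trick; take() blocks until the token's delay expires:

```java
import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;
import static java.util.concurrent.TimeUnit.MILLISECONDS;
import static java.util.concurrent.TimeUnit.NANOSECONDS;

class DelayQueuePacingDemo {
    static class Token implements Delayed {
        final long expireAt;
        Token(long delayNanos) { expireAt = System.nanoTime() + delayNanos; }
        public long getDelay(TimeUnit u) {
            return u.convert(expireAt - System.nanoTime(), NANOSECONDS);
        }
        // Safe to throw here: a single-element queue never compares tokens.
        public int compareTo(Delayed o) {
            throw new IllegalStateException("Expected single element queue");
        }
    }

    public static void main(String[] args) throws InterruptedException {
        DelayQueue<Token> queue = new DelayQueue<>();
        long start = System.nanoTime();
        queue.offer(new Token(MILLISECONDS.toNanos(50)));
        queue.take(); // blocks roughly 50ms, pacing the caller
        long elapsedMs = NANOSECONDS.toMillis(System.nanoTime() - start);
        System.out.println(elapsedMs >= 45); // true: we waited out the token
    }
}
```

This is the whole trick: whoever takes the token is forced to wait out its remaining delay, so callers are paced without any busy loop.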

One thing that shouldn't go unnoticed is that utility implementations like this can instrument collection interfaces such as Iterable and Iterator to decorate an existing code base. Here's an example:
 public <T> Iterator<T> decorate(final Iterator<T> delegate) {
  return new Iterator<T>() {
   public boolean hasNext() {
    return delegate.hasNext();
   }
   public T next() {
    try {
     acquire(1); // rate-limit each element
    } catch (InterruptedException e) {
     Thread.currentThread().interrupt();
     throw new RuntimeException(e);
    }
    return delegate.next();
   }
   public void remove() {
    delegate.remove();
   }
  };
 }

I would love to hear readers' opinions on this approach.

Monday, December 16, 2013

Interesting use-case for SynchronousQueue

While working on a very unusual system, where (1) producers can at times be significantly faster than consumers (by more than a factor of two) and (2) producers have low-latency processing overhead for real-time data, I was contemplating a data structure that is efficient, performant, and models this situation elegantly. After researching probable candidates, I came across Exchanger and SynchronousQueue in Java's java.util.concurrent library.

If I were looking at SynchronousQueue without the above context, I would have wondered why anyone needs a queue that's not really a queue but more like a pointer swap between appropriate threads. But the use case I'm dealing with ("event bursts") is probably the perfect fit: it keeps producers from overwhelming consumers by modeling the problem as a "hand-off" rather than as typical buffered queuing. The central idea behind this data structure is to adapt queue idioms without actually using a queue, very efficiently, with the added property that message production is rate-limited by the consumer's speed of processing. Behind the scenes it uses a dual-stack/dual-queue algorithm (depending on the ordering-fairness preference) to transfer a reference between threads.

SynchronousQueue is more a queue of threads than a queue of data: it maintains a stack/queue of waiting threads (i.e. "consumers"), not a queue of the data itself. You can approximate the same functionality with a BlockingQueue of size 1, or with an explicit object lock and explicit wait/notify on a datum reference, like the example below:

//Example code, probably riddled with concurrency bugs
//(I've only tested it on my laptop :))
public class MyNaiveSyncQueue {
    private final Object LOCK = new Object();
    private Object data; // guarded by LOCK, so volatile isn't needed
    public void put(Object o) throws InterruptedException {
        synchronized (LOCK) {
            while (data != null) { // loop guards against spurious wake-ups
                LOCK.wait();
            }
            data = o;
            LOCK.notifyAll();
        }
    }
    public Object take() throws InterruptedException {
        synchronized (LOCK) {
            while (data == null) {
                LOCK.wait();
            }
            Object o = data;
            data = null;
            LOCK.notifyAll();
            return o;
        }
    }
}

There are several problems with the solution above:
  • Violent locking and memory-fence overhead: on individual queue operations this scales terribly with the number of producers/consumers, especially on server-class SMP hardware.
  • Constant context switching: each successful queue operation involves syscall(s) for context switching, which may involve the kernel scheduler and everything that comes with it (cache flush/register reload et al.).
  • Overhead of fair processing: the JVM does not guarantee any particular wake-up order for an object monitor's wait set, so there is extra overhead in ensuring the first enqueued thread is the one scheduled as a consumer. This may or may not be behavior the programmer cares about.
SynchronousQueue addresses all of these limitations: it offers a choice of trade-offs in scheduler ordering fairness, and it replaces expensive locking with hardware-level CAS (wherever available). It also does a fair bit of spinning before a kernel-level timed wait kicks in, which keeps context switches from becoming the hot spots of message processing.
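As a minimal illustration of the hand-off semantics (class and variable names are mine), here's a fair SynchronousQueue in action; put() doesn't return until a consumer has taken the element:

```java
import java.util.concurrent.SynchronousQueue;

class HandOffDemo {
    public static void main(String[] args) throws InterruptedException {
        // 'true' requests fair (FIFO) ordering of waiting threads.
        SynchronousQueue<String> queue = new SynchronousQueue<>(true);
        Thread consumer = new Thread(() -> {
            try {
                System.out.println(queue.take()); // blocks until a producer arrives
            } catch (InterruptedException ignored) {
            }
        });
        consumer.start();
        queue.put("event"); // returns only after the consumer has the reference
        consumer.join();
    }
}
```

The producer is throttled by the consumer by construction: there is no buffer to fill up, only a rendezvous.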

So far this has worked great for the system I'm dealing with, which processes a couple of hundred messages per millisecond at peak bursts, but I realize it might not be appropriate (or even worth it) for non-realtime, non-bursty producers.


Monday, December 05, 2011

Avoid storing references of java.net.URL

Normally I avoid writing up something so obvious, but since it has bitten me multiple times now, this might help future me.

Never ever store references to java.net.URL in Java collections. The reasoning is pretty simple: the 'equals' and 'hashCode' methods of this class perform an extremely expensive synchronous DNS lookup on every call.

It is not uncommon to see most of your threads' time being spent on monitors:

"pool-2-thread-2" prio=10 tid=0x92061400 nid=0x1744 waiting for monitor entry [0x91fad000]
   java.lang.Thread.State: BLOCKED (on object monitor)
    at java.net.URLStreamHandler.getHostAddress(URLStreamHandler.java:429)
    - waiting to lock <0x9731b200> (a sun.net.www.protocol.http.Handler)
    at java.net.URLStreamHandler.hashCode(URLStreamHandler.java:354)
    at java.net.URL.hashCode(URL.java:875)
    - locked <0xaac87290> (a java.net.URL)
    at java.util.HashMap.getEntry(HashMap.java:361)
    at java.util.HashMap.containsKey(HashMap.java:352)
    at java.util.HashSet.contains(HashSet.java:201)


"pool-2-thread-1" prio=10 tid=0x9205e800 nid=0x1743 runnable [0x91ffe000]
   java.lang.Thread.State: RUNNABLE
    at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
    at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:866)
    at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1258)
    at java.net.InetAddress.getAllByName0(InetAddress.java:1211)
    at java.net.InetAddress.getAllByName(InetAddress.java:1127)
    at java.net.InetAddress.getAllByName(InetAddress.java:1063)
    at java.net.InetAddress.getByName(InetAddress.java:1013)
    at java.net.URLStreamHandler.getHostAddress(URLStreamHandler.java:437)
    - locked <0x9731b200> (a sun.net.www.protocol.http.Handler)
    at java.net.URLStreamHandler.hashCode(URLStreamHandler.java:354)
    at java.net.URL.hashCode(URL.java:875)
    - locked <0xaac97228> (a java.net.URL)
    at java.util.HashMap.getEntry(HashMap.java:361)
    at java.util.HashMap.containsKey(HashMap.java:352)
    at java.util.HashSet.contains(HashSet.java:201)

This stack trace only shows hashCode, but expect similar blocking in 'equals' too. If you care at all about performance, just stay away from this goddamned class.
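If you need set or map semantics over URLs, one workaround is to key on java.net.URI instead, which compares on the string form and never touches DNS; a quick sketch (example URIs are mine):

```java
import java.net.URI;
import java.util.HashSet;
import java.util.Set;

class UriKeyDemo {
    public static void main(String[] args) {
        Set<URI> seen = new HashSet<>();
        seen.add(URI.create("http://example.com/a"));
        // Pure string-based comparison: no blocking DNS lookup, unlike java.net.URL.
        System.out.println(seen.contains(URI.create("http://example.com/a")));
        System.out.println(seen.contains(URI.create("http://example.com/b")));
    }
}
```

Convert to a URL with URI.toURL() only at the point where you actually open a connection.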

Monday, December 13, 2010

Tackling nulls the functional way

Most programmers have suffered null pointers one way or another: usually a core dump followed by a segmentation fault on a development machine, or a production box with the application in smoke. A NullPointerException is the visible embarrassment of not considering that something *could be* null.

Tracking down a null ranges from loading a core dump in gdb and tracing the dereferenced pointer, to reading a stack trace that points to the exact location in source. However, this very ease of tracking nulls makes it tempting to ignore them in practice, and throwing in null checks becomes as common as throwing in one more div to fix IE's layout problems. That is bad.

Problems with Null:
I hate having to ignore nulls, because adding one more null check is not always enough. I am writing this post because I have several problems with nulls:
  1. Each and every reference can be null in languages like Java: method parameters, return values, fields, everything. There is no precise way for a programmer to know that a method might return null or accept null parameters. You have to resort to the actual source code or documentation to see whether it can possibly return null (and good luck with that). All of this is extra work when you really want to focus on the actual problem.
  2. A NullPointerException points to a causal eventuality, not usually the actual cause. The stack trace shows the code path where the damage surfaced, while we are normally interested in where the damage was initiated.
  3. Null is ambiguous. Is it an uninitialized value, the absence of a value, or an error indicator? The paradigm of null fits well in databases, but not in a programming model.
  4. Nulls have major implications for code quality and complexity. It is not unusual to see branches with null checks breeding like rabbits because an API "may" return null, which results in extremely defensive code. This taxes readability significantly.
  5. Null makes Java's type system dumber when calling an overloaded method: writing methodDoingStuff((ActualType) null, otherArgs) isn't exactly a pretty sight, and it invites subtle errors when arguments are non-generic.
In many ways nulls are a necessary evil. Those of us who care about readability and safety can't ignore them, but we shouldn't let them overtake safety and readability either.

I have come across several techniques for tackling nulls. First, there is the Null Object pattern, which is not as ridiculous as the name implies, but it isn't practical in real-life software with hundreds of class hierarchies and thousands of classes, so I will not talk about it. Then there are languages like Haskell and Scala with library classes that treat nulls in, IMO, a better way: Haskell has Maybe and Scala has Option. After using Option in Scala for a while in a side project, I found I was no longer fighting with nulls. I knew exactly when I had to decide that a value is really optional and that I must do alternate processing.

The central idea behind Haskell's Maybe and Scala's Option is a definitive agreement, enforced by the type system, on whether a value may be absent. I will talk about Scala's Option since I have worked with it, but the concept is the same. I will also show how to implement and use Option in Java, since this is much more a functional way of thinking about nulls than a language feature, and it (almost) doesn't take Scala's neat language features to implement.

Treating nulls the better way:

When you are not sure what value to return from a method, the usual course of action is to return null, because you don't have to fear breaking everything (well, mostly), and everyone passes the buck this way.

We can do better with Scala's Option class. We can wrap any reference into Some or None and handle it with pattern matching or a "for comprehension".

Some(x) is a wrapper with x as the actual value; None represents the absence of a value. Some and None are subclasses of Option, which carries all the interesting methods. When a variable in question is absent we can fall back to a default value, evaluate and return a function's computed value, filter, and so on.

Options in Java:
Implementing Option in Java is surprisingly trivial, though not as pleasant as Scala's. The implementation boils down to a wrapper class Option with two children: Some and None. None represents a null, but with a type (None<T>), and Some represents a non-null value.

To make Option interesting we make it extend List, so we can iterate over it and mimic a poor man's "for comprehension". We even go as far as tagging both subtypes with an enum so we can do poor man's pattern matching in a switch. You can find an example implementation of Option in Java with a test case demonstrating its use; here's a small snippet that covers the essence:
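What follows is only a hedged sketch of that idea, not the original snippet (names are illustrative, and the enum tagging is omitted): Some iterates exactly once, None iterates zero times, and getOrElse supplies a default.

```java
import java.util.AbstractList;

class OptionDemo {
    // Option as a zero-or-one-element List: iterating it mimics a
    // poor man's "for comprehension".
    abstract static class Option<T> extends AbstractList<T> {
        public abstract boolean isDefined();
        @Override public T get(int i) { throw new IndexOutOfBoundsException(); }
        @Override public int size() { return isDefined() ? 1 : 0; }
        public T getOrElse(T dflt) { return isDefined() ? get(0) : dflt; }
    }

    static final class Some<T> extends Option<T> {
        private final T value;
        Some(T value) { this.value = value; }
        public boolean isDefined() { return true; }
        @Override public T get(int i) {
            if (i == 0) return value;
            throw new IndexOutOfBoundsException();
        }
    }

    static final class None<T> extends Option<T> {
        public boolean isDefined() { return false; }
    }

    public static void main(String[] args) {
        Option<String> name = new Some<>("scala");
        Option<String> missing = new None<>();
        for (String s : name) System.out.println(s);    // runs once
        for (String s : missing) System.out.println(s); // never runs
        System.out.println(missing.getOrElse("default"));
    }
}
```

Extending AbstractList is what buys the for-each loop for free; everything else is plumbing.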

As you can see, Option opens several doors for fixing the null situation. You now have the choice to compute a value, use a default value, or do arbitrary work when you encounter an absent value.

How my null problems are solved with Option:
  1. By using Option for optional/nullable references, I at least avoid the "anything could be null" problem in my code. When an API returns an Option, I don't have to wonder whether it can return null; the intention is clear.
  2. I am forced to handle absence right at the point of using the API: do alternate processing or use a default. No surprises.
  3. Option is a very clear way of saying that a variable represents a possibly absent value.
  4. Option doesn't solve the problem completely. Method signatures with wrapped Option types can get really long (e.g. def method1(): Map[String, Option[List[Option[String]]]] = ...). However, compared to null checks I would take a long signature any day; the other benefits outweigh this limitation.
  5. Option[Integer] always means only Option[Integer], not Option[String], Option[Character] or Option[Date]; the compiler can resolve the intended call from the generic type instead of from a cast on null.
As good as the concept of optional values is, it won't always save you from null. You will still have to deal with existing libraries that return nulls and cause all these problems and more. However, most of the time null is problematic in your own code.

Where to use Options:
Here are the common places where I think using Options makes more sense:
  1. APIs: make your API as specific and as readable as possible; all optional parameters and return values should be Option.
  2. Your domain model: you already know which columns are nullable, so use Option for nullable fields from your tables. If you use an ORM with an interceptable DB fetch, integration is not hard; you can initialize fields to None when the database contains null, and so on.

In the interest of keeping this post relevant and on topic, I have completely avoided the heavy theoretical baggage (monads et al.) that is inevitable when theoretically minded functional programmers talk about Option. I really hope this post generates some interest in the topic. If you disagree or would like to share more, please leave a comment.

Saturday, May 01, 2010

Future of a Java programmer

Having been a Java-only programmer professionally for a long time, I have been pondering how things are changing around me. For as long as I can remember I had no choice but to use a subset of a C++ dialect (the Java language) with an extremely rich class library and ecosystem (the Java platform).

In the last few years there has been a drastic shift in the number of languages targeting the JVM: dynamic (JavaScript, JRuby, Jython, Groovy), functional plus OO (Scala), a Lisp dialect (Clojure), and many others. While I am excited about all the options I have today, I don't think a single language will dominate the JVM anymore the way Java has so far.

In a way this is a good thing; one tool rarely fits all needs (I couldn't curse Java enough for GUI programming). Like C, Java was never designed for developing dynamic web apps, yet we tried anyway and failed miserably with JSP/JSF and a plethora of frameworks, at least in productivity terms against PHP/Rails/Python. One really good thing Java did was raise the level of abstraction above platform-specific details and memory management. These new JVM languages raise the abstraction level even further, each in its area of strength.

It is not a remote future in which concurrent processes are programmed in Clojure and presented with JRuby/Rails, with intermediate code written in Java. Each layer of an application will be implemented in a different programming language, with the interfaces transparent to the developers working in each layer. This is a big deal; it has never really been envisioned for the Java platform before. The loosest coupling we have seen so far is through remoting (web services et al.), where clients and servers live on different runtimes and languages.

What this means for a Java developer is that if you are

  • A web developer: you are going to learn things which are extremely different from Struts/JSF/JSPs; no more artificial model1/model2 MVCs.
  • A non-web developer: you are going to write code which is far more readable and very specific to your business domain, via DSLs created in any of the languages mentioned above, without the accidental complexity Java and its frameworks imposed on you.

While I could keep classifying developers on the Java platform all day, these two are the major ones whose lives (and resumes) are going to change soon: they will be expected to know multiple programming languages rather than multiple frameworks. Contrary to the cool kids on the interwebz, I don't think Java the language is going to die anytime soon, not just because so many existing libraries are written in it, but because of the number of programmers who know Java, the tooling around it, and the JVM's native support for it. Java is like C in a way: you can do whatever the underlying implementation supports.

Many of you who are like me are going to see this change around you soon. I am thrilled to see how my career will transform as a polyglot programmer. Are you?

Tuesday, May 12, 2009

Scala v/s Java arrays

Here's a Java puzzler for the curious (and a good interview question too!). Given the array merge method below, what will be the output of the following program?



import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class Generics {

    static class A {
    }

    static class B extends A {
    }

    public static void main(String[] args) {
        A[] copy = merge(new B[] { new B() }, new A[] { new A() }, new B[1]);
        System.out.println(copy.length != 1);
    }

    static <Z> Z[] merge(Z[] arr1, Z[] arr2, Z[] store) {
        List<Z> list = new ArrayList<Z>();
        list.addAll(Arrays.asList(arr1));
        list.addAll(Arrays.asList(arr2));
        return list.toArray(store);
    }
}


If you didn't guess it already: the program above results in a runtime exception (java.lang.ArrayStoreException).


Exception in thread "main" java.lang.ArrayStoreException
at java.lang.System.arraycopy(Native Method)
at java.util.Arrays.copyOf(Unknown Source)
at java.util.ArrayList.toArray(Unknown Source)
at name.nirav.Generics.merge(Generics.java:23)
at name.nirav.Generics.main(Generics.java:16)


I am not a huge fan of generics in Java, because we are left with only as much type safety as a half-hearted implementation can give (and I'm not even criticizing that here). It is too much to expect the Java compiler to detect that the program above compromises type safety at the call site, mostly because of how the VM handles arrays. Arrays are special mutable objects whose anonymous components are accessed by index, and an array type is derived entirely from its component type. This is where the problem starts.

Java arrays are covariant: an array of component type T is also an array of component type S whenever T is a subclass of S. This introduces type holes like the one above, where a syntactically valid program is victimized at run time, making Java's "statically typed, type-safe language" designation an irony. If arrays were regular objects without built-in covariance, the compiler would report an error in this code in the absence of explicit variance information.
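The hole exists even without generics in the picture; a two-line demonstration of plain-array covariance:

```java
class CovarianceDemo {
    public static void main(String[] args) {
        Object[] objects = new String[1]; // legal: a String[] is-an Object[]
        try {
            objects[0] = Integer.valueOf(42); // compiles fine, fails at run time
            System.out.println("stored");
        } catch (ArrayStoreException e) {
            System.out.println("ArrayStoreException");
        }
    }
}
```

Every array store therefore carries a hidden run-time type check, which is exactly what fires inside Arrays.copyOf in the puzzler above.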

Arrays are regular objects in Scala; each array is an instance of scala.Array. The code below is equivalent to the Java program above, with some syntactic differences, but unlike the Java code it does not even compile. Scala arrays are nonvariant, and the Scala compiler uses what is called "conservative approximation" to ensure type safety at compile time.


import java.util.{ArrayList, Arrays}

object App extends Application {

  class A

  class B extends A

  def merge[T](arr1: Array[T], arr2: Array[T], store: Array[T]): Array[T] = {
    val list = new ArrayList[T]
    list.addAll(Arrays.asList(arr1: _*)) // :_* is for vararg conversion
    list.addAll(Arrays.asList(arr2: _*))
    list toArray store
  }

  merge(Array[B](new B), Array[A](new A), new Array[B](1)) // Error, type mismatch
}


The Scala compiler reports an error at the "merge" call, complaining about a type mismatch.

Not everyone likes to learn such details until they bite back as million-dollar bugs. Why are Java arrays covariant? Who needs more run-time checks?

Tuesday, April 21, 2009

How Scala's pattern matching can replace Visitors

The primary motivation of the Visitor design pattern is to separate model traversal from operational logic. A visitable model takes on the responsibility of navigation, while the behavior is defined by arbitrary visitors. In this post I will try to explain the problems associated with Visitors in general and how Scala's pattern matching can eliminate them cleanly.

Consider a simplified Insurance Policy model as follows (In Java):


public class PolicyElement {
    static class Quote extends PolicyElement {
        protected final Risk risk;
        public Quote(Risk risk) {
            this.risk = risk;
        }
        public void accept(PolicyVisitor visitor) {
            visitor.visit(this);
            risk.accept(visitor); // delegate traversal down the model
        }
    }

    static class Risk extends PolicyElement {
        protected Coverage coverage;
        public Risk(Coverage coverage) {
            this.coverage = coverage;
        }
        public void accept(PolicyVisitor visitor) {
            visitor.visit(this);
            coverage.accept(visitor);
        }
    }

    static class Coverage extends PolicyElement {
        protected final Premium prem;
        public Coverage(Premium prem) {
            this.prem = prem;
        }
        public void accept(PolicyVisitor visitor) {
            visitor.visit(this);
            prem.accept(visitor);
        }
    }

    static class Premium extends PolicyElement {
        protected final double amt;
        public Premium(double amt) {
            this.amt = amt;
        }
        public void accept(PolicyVisitor visitor) {
            visitor.visit(this);
        }
    }
}

public interface PolicyVisitor {
    public void visit(Quote quote);
    public void visit(Risk risk);
    public void visit(Coverage cvrg);
    public void visit(Premium prem);
}
public class PolicyTest {
    static class PremiumCalcVisitor implements PolicyVisitor {
        private double totalPremium;

        @Override
        public void visit(Premium prem) {
            totalPremium = getTotalPremium() + prem.amt;
        }

        @Override
        public void visit(Coverage cvrg) {
        }

        @Override
        public void visit(Risk risk) {
        }

        @Override
        public void visit(Quote quote) {
        }

        public double getTotalPremium() {
            return totalPremium;
        }
    }

    public static void main(String[] args) {
        Quote quote1 = new Quote(new Risk(new Coverage(new Premium(10))));
        Quote quote2 = new Quote(new Risk(new Coverage(new Premium(30))));
        PremiumCalcVisitor visitor1 = new PremiumCalcVisitor();
        PremiumCalcVisitor visitor2 = new PremiumCalcVisitor();
        quote1.accept(visitor1);
        quote2.accept(visitor2);
        assert visitor1.getTotalPremium() + visitor2.getTotalPremium() == 40;
    }
}


(Generally we introduce an abstract adapter class so visitors can omit the empty implementations, but I have left that out for brevity.)

Now, the not-so-apparent problem here is that when the object model changes (which happens frequently in real life), we have to add one more method to the PolicyVisitor interface, update all visitor implementations if the change is substantial, and have the new policy elements implement accept methods. This invasive nature couples the Visitor tightly to the model.

With pattern matching and views in Scala, you can have an alternative implementation that is precise as well as non-invasive, unlike visitors.

class PolicyElement
case class Quote(risk: Risk) extends PolicyElement
case class Risk(cvrg: Coverage) extends PolicyElement
case class Coverage(limit: Premium) extends PolicyElement
case class Premium(amt: Double) extends PolicyElement

object PremCalcTest {
  class PremCalculator(pol: PolicyElement) {
    def calcPrem: Double = calcPrem(pol)

    def calcPrem(policy: PolicyElement): Double = policy match {
      case Quote(risk)       => calcPrem(risk)
      case Risk(coverage)    => calcPrem(coverage)
      case Coverage(premium) => calcPrem(premium)
      case Premium(amt)      => amt
    }
  }

  implicit def calPremV(pol: PolicyElement) = new PremCalculator(pol)

  def main(string: Array[String]) {
    val risk1 = Risk(Coverage(Premium(10)))
    val risk2 = Risk(Coverage(Premium(30)))
    println(Quote(risk1).calcPrem + Quote(risk2).calcPrem)
  }
}

This code requires some explanation. We labeled the domain classes with Scala's 'case' keyword. A class tagged with 'case' can be used for pattern matching in a switch-like structure, as in the method 'calcPrem'. You don't need to create members or setters/getters; the compiler creates them for you. A case class can be instantiated without the 'new' keyword, so Risk(Coverage(Premium(0))) translates to new Risk(new Coverage(new Premium(0D))) in equivalent Java code.

The code in the 'calcPrem' function can be thought of as something similar to instanceof checks for each possible case in Java, for example:


if (object instanceof Premium)
    return ((Premium) object).amt;


What we have also done, silently, is add a 'calcPrem' method to the PolicyElement class. This happens through the implicitly defined function 'calPremV', which lets us call 'calcPrem' on any PolicyElement without actually modifying the domain model code. This kind of lexically scoped class extension is known as a view in Scala, and is similar to Ruby's open classes, except that Ruby's are not scoped.

If the model changes in this design, we just modify a single function and we are done. These language features free us from the coupling introduced by inheritance.

So it is easy to see that Scala's language features can be elegant and far more powerful than those of other languages (specifically Java) without sacrificing compiler checks and type safety.

Thursday, April 09, 2009

Scala: First impression

If you are curious about programming languages, you have probably heard of Scala: the statically typed, functional and object-oriented language. Scala is a new sun rising to balance 'the burn factor' between the functional and object-oriented schools of thought.

Unlike what this paper suggests [pdf], I think Scala exists because the functional and object-oriented camps are moving in opposite directions, which is not only inefficient but keeps them from leveraging each other's achievements. Read any functional-versus-object-oriented comparison and every argument boils down to productivity, tooling, and maintainability of code. While functional languages (Lisp, Haskell, Python, etc.) offer excellent productivity compared to OO languages (Java, C++, etc.), object-oriented languages offer excellent tooling and are relatively maintainable. I think OO languages have been so popular because the concept is easy to understand and maps readily to real life; for most people who never took a computer science course, OO is easier to grasp than functional techniques like list comprehensions, closures, or higher-order functions, which are rooted in the formal systems of mathematics.

Scala tries to satisfy both groups by providing a grammar and a type system that integrate seamlessly with mainstream platforms (Java, .NET) while offering powerful functional abstractions previously available mostly in dynamic languages. Java developers can write code the same way they do in Java, using existing libraries and frameworks, with the added advantage of functional programming techniques wherever they feel those are more productive. Functional-language enthusiasts get access to rich class libraries and, eventually, powerful tooling.

If you take a look at the Scala language grammar [pdf], you will notice that what you can create with Scala is limited mostly by your creativity. Based on what I have learned so far, I find Scala much more refreshing than Java; it feels a lot more like a programming language of the 21st century! The Scala compiler itself is pluggable, so you can do a heck of a lot of things you can only dream of with javac or ecj. What is missing is tooling; the existing tooling is scrappy, but that should improve with an active community.

Bill Venners of Artima has presented Scala wonderfully; take a look at the presentation [requires Flash] on 'The Feel of Scala'.

Tuesday, March 24, 2009

OPath ported for Java object model

I have been thinking about alternate uses of the OPath language I created for the EVars plug-in, and it occurred to me that I could use it for my unit testing (which involves a pretty complex insurance object model) and for things like JSPs, where I really hate writing ten lines of Java code just to display some value.

So I wrote a port of OPath for the Java object model (less than 200 lines of real code). The following example should explain how it can add some value to your development efforts.

Consider the example code for a simple accessible Swing table.

If you are writing a UI test to check a specific table's column heading, you just write the following:

Collection<Object> findAll = OPathReflectiveInterpreter.findAll(frame, "//dataModel//@val\\$column.*");
assertEquals("First Name",((Object[])findAll.toArray()[0])[0]);

This is a very trivial test, of course, but it is sufficient to show the power of OPath micro-scripting.

If you would like to try it out for unit testing or templating, check out the download here (you will also need opath.jar).

Disclaimer: this is experimental work at best; it might be slow and it might have bugs at this time.

Tuesday, March 04, 2008

How to find which jar file contains your class at 'runtime'

If you have ever wanted to know which jar your class comes from, or wanted to access meta-information from that jar file (such as certificates from MANIFEST.MF), here's how you can do it:

// Requires: java.io.IOException, java.net.JarURLConnection,
// java.net.URL, java.net.URLConnection
public URL getJarURL() {
    URL clsUrl = getClass().getResource(getClass().getSimpleName() + ".class");
    if (clsUrl != null) {
        try {
            URLConnection conn = clsUrl.openConnection();
            if (conn instanceof JarURLConnection) {
                JarURLConnection connection = (JarURLConnection) conn;
                return connection.getJarFileURL();
            }
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
    return null; // not loaded from a jar (e.g. a plain class directory)
}
This works in most enterprisey Java environments (unless your classloader is hijacked by some stupid framework in between); I'm not sure it holds under an OSGi container.
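And once you have a JarURLConnection, reading the jar's meta-information is straightforward. Here is a hedged, self-contained sketch (it builds a throwaway jar first, purely so the example runs anywhere; names are mine):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.net.JarURLConnection;
import java.net.URL;
import java.util.jar.Attributes;
import java.util.jar.JarEntry;
import java.util.jar.JarOutputStream;
import java.util.jar.Manifest;

class ManifestPeek {
    public static void main(String[] args) throws IOException {
        // Build a temporary jar with a manifest attribute we can read back.
        File jar = File.createTempFile("demo", ".jar");
        Manifest mf = new Manifest();
        mf.getMainAttributes().put(Attributes.Name.MANIFEST_VERSION, "1.0");
        mf.getMainAttributes().put(Attributes.Name.IMPLEMENTATION_TITLE, "demo-app");
        try (JarOutputStream out = new JarOutputStream(new FileOutputStream(jar), mf)) {
            out.putNextEntry(new JarEntry("placeholder.txt"));
        }

        // A "jar:<file-url>!/<entry>" URL opens as a JarURLConnection.
        URL entryUrl = new URL("jar:" + jar.toURI() + "!/placeholder.txt");
        JarURLConnection conn = (JarURLConnection) entryUrl.openConnection();
        System.out.println(conn.getManifest().getMainAttributes()
                .getValue(Attributes.Name.IMPLEMENTATION_TITLE));
    }
}
```

In real code you would feed the URL returned by getJarURL() (or the .class resource URL itself) into openConnection() instead of constructing one by hand.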

Why, in the name of programming, do I need to write such trivial hacks? O Java Module System (JSR 277), I await you.

Note: this is the simplest way I could write it; if you know a smarter way, feel free to comment here.

Wednesday, October 11, 2006

Some Lost+Found wisdom from the JMM :)

Findings from an article by Goetz:

“While the JMM specified in chapter 17 of the Java Language Specification was an ambitious attempt to define a consistent, cross-platform memory model, it had some subtle but significant failings. The semantics of synchronized and volatile were quite confusing….

...JSR 133, chartered to fix the JMM...

Most programmers know that the synchronized keyword enforces a mutex (mutual exclusion) …. But synchronization also has another aspect: It enforces certain memory visibility rules as specified by the JMM. It ensures that caches are flushed when exiting a synchronized block and invalidated when entering one, (significant performance degradation on multiprocessors) so that a …. It also ensures that the compiler does not move instructions from inside a synchronized block to outside…. The JMM does not make this guarantee in the absence of synchronization -- which is why synchronization (or its younger sibling, volatile) must be used whenever multiple threads are accessing the same variables.”


I'm not sure whether this cache flush means flushing the entire cache (128K to 2M) or just cache lines, but I consider it really bad either way: all operations stall until the memory request completes.

I never knew this behavior of "synchronized". I had always wondered about the poor performance of the concurrency primitives; facts like these make it clearer why they don't scale.

This gives some idea of how painful it is to achieve "platform independence" beyond the buzzword.

(And who likes a snoopy cache being flushed like a dumb one? :( )

Saturday, September 10, 2005

JVM Optimization

Kudos!!! We've been assigned a new project on performance and profiling of our product, Wallet Service Center. I'm wondering where to start: optimize the JVM through configuration, optimize the code, or optimize the app server? It's hardly easy to improve code overnight, so I think I will look into JVM configuration first. Sun's website is full of performance tips, most of them related to GC and JIT. It was fun to learn some cool aspects of the JIT (Just In Time) compiler's behavior, e.g. the default threshold for method compilation is 1,500 executions: if a method is called more than 1,499 times in the HotSpot Client VM, it gets compiled to native code.

Kewl!! So C++ guys, we are ready to outperform. Generational GC still isn't clear to me; I think I should read more about it. Now that I have Jack Shirazi's book on performance tuning, I'll be doing something better. I'm happy, as I've got the work that I like!!