
The Latest Java Topics

How to Convert a PDF to Text (TXT) Using Java
This article outlines the difficulties in extracting plain text from regular PDF documents at scale and demonstrates two API solutions that efficiently perform that task.
August 21, 2022
by Brian O'Neill CORE
· 6,238 Views · 2 Likes
Spring Data JPA Interview Questions and Answers
One of the most popular starters in Spring Boot applications is Spring Data JPA. Hence, you can hardly avoid JPA-related questions during a job interview. Let's take a look at the most frequently asked questions, with detailed answers.
August 20, 2022
by Georgii Vlasov
· 6,853 Views · 6 Likes
How To Connect a Heroku Java App to a Cloud-Native Database
In this post, continue the journey to create a geo-distributed messenger in Java by looking at several connectivity options for Heroku and YugabyteDB Managed.
August 19, 2022
by Denis Magda CORE
· 5,451 Views · 4 Likes
Troubleshooting Microservices OutOfMemoryError: Metaspace
Confronted with an interesting java.lang.OutOfMemoryError: Metaspace problem in a microservice application? Learn the steps taken to troubleshoot the problem.
August 19, 2022
by Ram Lakshmanan CORE
· 5,861 Views · 1 Like
PHP Over Java? Or Is It Java Over PHP?
In terms of popularity, PHP is used by 81.25 percent of tech companies worldwide to build their web applications.
August 18, 2022
by Jasmine Ronald
· 6,267 Views · 5 Likes
Open-Source SPL Helps Java Handle Files of Open Formats: TXT, CSV, JSON, XML, and XLS
SPL (a JVM-based programming language) can parse structured data files of regular or irregular format, represent 2D/hierarchical data in a uniform way, and more.
August 18, 2022
by Jerry Zhang CORE
· 4,218 Views · 1 Like
Modern Strategy Pattern in Functional Java
This article shows how to use a strategy pattern with a pinch of enums and functional syntactic sugar in functional Java.
August 17, 2022
by Jakub JRZ
· 3,636 Views · 6 Likes
Step by Step Guide to Create Mule Custom Connector Using Java SDK
This tutorial guides you through creating a custom connector for Mule 4 that includes both Source and Operation components.
August 17, 2022
by Jitendra Rawat
· 2,530 Views · 2 Likes
Top 10 Java Language Features
Let's explore ten Java programming features used frequently by developers in their day-to-day programming jobs.
August 15, 2022
by A N M Bazlur Rahman CORE
· 8,411 Views · 14 Likes
On Cosmetics vs. Intrinsics in Programming
Code has cosmetic and intrinsic characteristics. See examples demonstrating how to achieve the same intrinsic with different cosmetics and vice versa.
August 15, 2022
by Nicolas Fränkel CORE
· 3,645 Views · 2 Likes
How To Build a Multi-Zone Java App in Days With Vaadin, YugabyteDB, and Heroku
Welcome to my journal documenting my experience building a geo-distributed app in Java from scratch. Here, I’ll share my first results and any challenges.
August 15, 2022
by Denis Magda CORE
· 5,700 Views · 8 Likes
Brewing Patterns in Java: An Informal Primer
Thoughts on pattern matching, records, and sealing.
August 14, 2022
by Manoj N Palat
· 5,954 Views · 7 Likes
Why "Polyglot Programming" or "Do It Yourself Programming Languages" or "Language Oriented Programming" sucks?
Last year we saw the launch of a new web programming language from Google: Dart - Structured Web Programming. It is a very interesting approach to supporting web application development. Not long after Go, Groovy, Ruby, Scala, << Name your DSL here >>, we now see Dart. Is it a good thing to have at least one programming language to solve each problem? The answer, as we already know, is: it depends.

Stay Away From "Do It Yourself"

It is your choice whether you try to do things yourself or let truly seasoned professionals help out. Some decide to go it alone when they are programming something new, but this often ends up in a less than desirable place. It may even be more expensive than hiring an expert who can help you get it programmed in the first place. Most people do not go it alone with the vast majority of important services in their life, so why should they do so when looking at how to create a website? It is best to avoid this mistake and make progress toward your goals by hiring people who truly know how to help you make the progress you need to make.

Some important background you should know about the multi-programming-language paradigm:

1. You can read Martin Fowler's article about language-oriented programming with language workbenches, which enable you to write small programming languages easily. In this article I see everyone writing their own small language, everywhere. In this concept, DSLs (Domain Specific Languages) are seen as the future of our programming activities. Source: http://martinfowler.com/articles/languageWorkbench.html

2. Neal Ford talked about Polyglot Programming: combining multiple programming languages in application development. Later, Mr. Fowler extended this paradigm with Polyglot Persistence: using different types of databases within one application. Sources: http://memeagora.blogspot.com/2006/12/polyglot-programming.html and http://martinfowler.com/bliki/PolyglotPersistence.html

Since 2006 I have discussed and collected some experiences with the multi-programming-language paradigm:

1. I remember a "hot" discussion in 2006 with Sebastian Meyen, chief editor of JavaMagazin Germany and organizer of the biggest Java conference, JAX. We agreed that the future of programming lies in a multi-language paradigm. I also said that all those languages would be based on the Java VM, and told him that one day SAP would turn ABAP into a language that can run within the Java VM - just another language within the Java environment, with no two different personalities anymore. Today we see the beginning of this in the project called Caffeine ABAP. Source: https://cw.sdn.sap.com/cw/groups/caffeine

2. Also in 2006 I had a project in which we used different kinds of languages and also created our own DSL:
  • Java for most of the implementation
  • UML for the design of the business objects; we generated a lot of things using the concept of MDA (Model Driven Architecture)
  • Groovy for a lot of things, especially for writing unit tests
  • Our own DSL, built with ANTLR, for some aspects of the application

It was really exciting and we had some very good programmers in the project. The result was a very nice and flexible product, just what we expected at the beginning of the project. Please read this article in German for more information: http://www.sigs.de/publications/os/2009/02/dewanto_egger_OS_02_09.pdf

So after all those nice things about the multi-language paradigm, why do I say it sucks in the end? Here are some reasons from my point of view:

1. As is typical in application development, the problems come first in the maintenance phase, after all the capable programmers have left the project. Did you, programming language creators, ever try to teach a new programming language to "normal" 9-to-5 programmers? I'm not talking about 9 (am) till 9 (pm) programmers who love to learn new languages. It is definitely tough to be proficient in even one programming language. This is comparable with the languages we speak every day: I'm not a native English speaker, so I'm quite sure I have made a lot of syntax and grammar errors in this article. It is possible to speak three or four languages perfectly, but that is an exception.

2. Did you ever try to maintain a big Struts web application with AJAX? Just try to add functionality and you will end up creating and editing a lot of files: Action and Form files, Struts XML configuration files, JavaScript files with JSON, and HTML or JSP files. Can you imagine adding Groovy, Scala, and Dart to that web app as well? The complexity of such a project is very high.

3. Creating a new programming language means that you also have to build the environment for it: a good IDE, good documentation, good community support, a clear roadmap, and backward compatibility are some of the things that need to be done. Groovy is a bad example of this. In the early versions of the language, the editor for Eclipse was really bad. After a while they improved the editor, but they made a lot of basic changes in the language, so your old Groovy applications no longer work; you are punished if you update to the new version. This never happens with Java: you can still compile Java 1.1 applications with the Java 6 compiler.

4. Before you create your own DSL with, e.g., ANTLR, ask the language gurus first how hard it is to maintain a programming language for the long term. Until you have discussed it with them, don't create your own DSL - especially if you are working for an SME (small and medium-sized enterprise). With a small team and a small budget, you will never be able to maintain your own language decently.

So in 2012, six years after my support for polyglot programming, I hope to see the following things happen:

1. One language for all aspects of one application is the best concept ever. I call this the "One for All Programming Language" paradigm. Just as we don't use English as a technical language, German as a literary language, and Indonesian as a community language - to communicate internationally with each other we pragmatically use English for all aspects of our life. In Germany, you need to speak German in all aspects to be able to communicate with others. My best solution so far is Java + XML; that's it, no more, no less. No mixing of Groovy, Dart, Ruby, Scala, <> in one application. Especially if you are working as a contractor, please don't try to use all those languages for a small Java web application. I don't say that you should not use the other languages at all. The only important thing is not to mix those languages in one application. In an SME you may also want to use just one programming language for all your applications.

2. Concepts like GWT (a Java-to-JavaScript compiler) or XMLC (an XML compiler which compiles XML and HTML to Java classes) are great: you can work in plain Java. Guice, with all Java and no XML, is also a great solution (I know that the Spring Framework does this as well, with annotations). Android is great because it uses Java as its application programming language. In conclusion, I can only hope to see more such pure and plain Java solutions in 2012!
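The paragraph above praises configuration expressed entirely in Java (GWT, XMLC, Guice, annotation-based Spring). As a small illustration of that "all Java, no XML" wiring style, here is a minimal, hypothetical Guice sketch; the GreetingService, SimpleGreetingService, and AppModule names are invented for this example and are not from the original article.

import com.google.inject.AbstractModule;
import com.google.inject.Guice;
import com.google.inject.Injector;

// Hypothetical service used only to illustrate pure-Java wiring.
interface GreetingService {
    String greet(String name);
}

class SimpleGreetingService implements GreetingService {
    public String greet(String name) {
        return "Hello, " + name;
    }
}

// All bindings are expressed in Java code; no XML configuration file is involved.
class AppModule extends AbstractModule {
    @Override
    protected void configure() {
        bind(GreetingService.class).to(SimpleGreetingService.class);
    }
}

public class PlainJavaWiringDemo {
    public static void main(String[] args) {
        Injector injector = Guice.createInjector(new AppModule());
        GreetingService service = injector.getInstance(GreetingService.class);
        System.out.println(service.greet("DZone"));
    }
}

The same wiring could just as well be expressed with Spring's annotation-based @Configuration classes; the point is simply that no XML file is involved.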
August 13, 2022
by Lofi Dewanto
· 12,720 Views · 5 Likes
Openshift and AWS Lambda Deployment With Quarkus
Nowadays, Quarkus is known as Supersonic Subatomic Java. It provides many features that facilitate build and deployment.
August 13, 2022
by Elina Valieva
· 10,191 Views · 5 Likes
Migrate Serialized Java Objects with XStream and XMT
Java serialization is convenient for storing the state of Java objects. However, serialized data has some drawbacks:

  • It is not human-readable.
  • It is Java-specific and cannot be exchanged with other programming languages.
  • It is not migratable if fields of the associated Java class have changed.

These drawbacks make Java serialization an impractical approach to storing object state in real-world projects. In a product developed recently, we use XStream to serialize/deserialize Java objects, which solves the first and second problems. The third problem is addressed with XMT, an open-source tool we developed to migrate XStream-serialized XMLs. This article introduces the tool with some examples.

Computer Languages Need to Be Simplified

So many of the issues we run into when converting computer languages into something that can be better understood by human beings come down to the fact that computer languages need to be simplified where possible. These languages are great for the computers that speak back and forth with one another, but they don't necessarily work as well when humans try to get involved with them. Many humans end up confused and unable to make much progress on getting these systems cleared up. Thus, it is necessary to clean them up and make them more usable. There are people actively working on this problem right now, but in the meantime, we may simply have to deal with computers that can't do everything we would like them to do.

XStream Deserialization Problem When a Class Evolves

Assume a Task class with a prioritized field indicating whether it is a prioritized task:

package example;

public class Task {
    public boolean prioritized;
}

With XStream, we can serialize objects of this class to XML like below:

import com.thoughtworks.xstream.XStream;

public class Test {
    public static void main(String args[]) {
        Task task = new Task();
        task.prioritized = true;
        String xml = new XStream().toXML(task);
        saveXMLToFileOrDatabase(xml);
    }

    private static void saveXMLToFileOrDatabase(String xml) {
        // save XML to file or database here
    }
}

The resulting XML will be a Task element whose prioritized child element is set to true. And you can deserialize the XML to get the task object back:

import com.thoughtworks.xstream.XStream;
import com.thoughtworks.xstream.io.xml.DomDriver;

public class Test {
    public static void main(String args[]) {
        String xml = readXMLFromFileOrDatabase();
        Task task = (Task) new XStream(new DomDriver()).fromXML(xml);
    }

    private static String readXMLFromFileOrDatabase() {
        // read XML from file or database here
        return null; // placeholder
    }
}

Everything is fine. Now we find that a prioritized flag is not enough, so we enhance the Task class to distinguish between high, medium, and low priority:

package example;

public class Task {
    enum Priority {HIGH, MEDIUM, LOW}
    public Priority priority;
}

However, deserialization of previously saved XML is no longer possible, since the new Task class is not compatible with the previous version.

How XMT Addresses the Problem

XMT comes to the rescue: it introduces the class VersionedDocument to version serialized XMLs and handle migration. With XMT, serialization of the task object can be written as:

package example;

import com.pmease.commons.xmt.VersionedDocument;

public class Test {
    public static void main(String args[]) {
        Task task = new Task();
        task.prioritized = true;
        String xml = VersionedDocument.fromBean(task).toXML();
        saveXMLToFileOrDatabase(xml);
    }

    private static void saveXMLToFileOrDatabase(String xml) {
        // save XML to file or database here
    }
}

For the old version of the Task class, the resulting XML is the same as the XML generated previously with XStream, except that an additional version attribute is added to the root element, indicating the version of the XML. The value is set to "0" unless there are migration methods defined in the class, as we will introduce below.

When the Task class is evolved to use the enum-based priority field, we add a migration method like the one below:

package example;

import java.util.Stack;

import org.dom4j.Element;

import com.pmease.commons.xmt.VersionedDocument;

public class Task {
    enum Priority {HIGH, MEDIUM, LOW}
    public Priority priority;

    @SuppressWarnings("unused")
    private void migrate1(VersionedDocument dom, Stack versions) {
        Element element = dom.getRootElement().element("prioritized");
        element.setName("priority");
        if (element.getText().equals("true"))
            element.setText("HIGH");
        else
            element.setText("LOW");
    }
}

Migration methods must be declared as private methods with names of the form "migrateXXX", where "XXX" is a number indicating the current version of the class. Here the method "migrate1" indicates that the current version of the Task class is "1", and the method migrates the XML from version "0" to version "1". The XML to be migrated is passed as a VersionedDocument object, which implements the dom4j Document interface, and you may use dom4j to migrate it to be compatible with the current version of the class. In this migration method, we read the "prioritized" element of version "0", rename it to "priority", and set the value to "HIGH" if the task was originally a prioritized task; otherwise, we set the value to "LOW".

With this migration method defined, you can now safely deserialize the task object from XML:

package example;

import com.pmease.commons.xmt.VersionedDocument;

public class Test {
    public static void main(String args[]) {
        String xml = readXMLFromFileOrDatabase();
        Task task = (Task) VersionedDocument.fromXML(xml).toBean();
    }

    private static String readXMLFromFileOrDatabase() {
        // read XML from file or database here
        return null; // placeholder
    }
}

The deserialization works not only for XML of the old version but also for XML of the new version. At deserialization time, XMT compares the version of the XML (recorded in the version attribute mentioned earlier) with the current version of the class (the maximum suffix number of the various migrate methods), and runs the applicable migrate methods one by one. In this case, if an XML of version "0" is read, the method migrate1 will be called; if an XML of version "1" is read, no migration methods will be called, since it is already up to date.

As the class keeps evolving, more migration methods can be added to the class by increasing the suffix number of the latest migration method. For example, let's further enhance our Task class so that the priority field takes a numeric value ranging from "1" to "10". We add another migration method to the Task class to embrace the change:

@SuppressWarnings("unused")
private void migrate2(VersionedDocument dom, Stack versions) {
    Element element = dom.getRootElement().element("priority");
    if (element.getText().equals("HIGH"))
        element.setText("10");
    else if (element.getText().equals("MEDIUM"))
        element.setText("5");
    else
        element.setText("1");
}

This method only handles the migration from version "1" to version "2"; we no longer need to care about version "0", since XML of version "0" will first be migrated to version "1" by migrate1 before this method runs. With this change, you will be able to deserialize the task object from XML of the current version and any previous version.

This article demonstrates the idea of how to migrate field changes of classes. XMT can handle many complicated scenarios, such as migrating data defined in multiple tiers of a class hierarchy, addressing class hierarchy changes, etc. For more information about XMT, please visit http://wiki.pmease.com/display/xmt/Documentation+Home
August 13, 2022
by Robin Shen
· 17,911 Views · 1 Like
A Guide to Maven 3 Beta
In just over six years, Apache Maven has become one of the most coveted tools for project build and reporting management. It has been five years since the release of Maven 2, and now the Maven committers have released the next landmark version of the software.

Incredible Improvements in Little Time

It didn't take long for Maven to become one of the most respected and desired tools in computer engineering. However, the product only continues to improve, as Maven 3 is now available for those needing extra computing power. The whole system just gets more and more interesting as time goes on, and that is exactly what people are most interested in when they look through the different software options open to them. They just want something that they know will add to the amount of work they can get done.

We took some time to speak with the visionary founder of the Maven series of products, and we got some answers directly from him about how these products work and what kind of updates we might expect in the future. Believe it or not, just sitting down with him and getting some of these answers was a big help in getting us to a place where we better understand the product. There is so much buzz and excitement about Maven 3 right now, and there should be. However, we wanted to hear directly from its creator to learn which features we should be most excited about. We are happy that he took the time to speak with us and review his innovative product. Here is what we learned directly from the creator.

The first beta release of Maven 3, which is now complete after seven public alphas, was released this week. Maven founder and Sonatype CTO Jason van Zyl answered some questions for DZone about Maven 3 earlier this month. Below are the main new feature categories of Maven 3.

Drop-in Replacement

Users of Maven 1.x may remember the bumpy transition to Maven 2 because of several fundamental changes. The Maven committers remember too, and they have put a lot of extra work into providing backward compatibility and making Maven 3 a simple drop-in replacement for Maven 2.x in most cases. Van Zyl says this was "very difficult given how much of the internals we've changed." Apart from fixing problems with duplicate dependency and plugin declarations, no changes are needed in your POMs, and the command line is fully compatible between Maven 2 and 3.

Polyglot Maven

Polyglot Maven is not part of Maven 3 per se, but it is a tool from van Zyl's company, Sonatype, that can be integrated with Maven 3 via an extension point. The extension points are a new feature in Maven 3 that support tools such as Tycho, Polyglot Maven, and Maven Shell. As you have probably guessed from the name, Polyglot Maven supports dynamic languages and aims to provide first-class, POM-mapped DSL (Domain Specific Language) support for Groovy, Scala, Clojure, Ruby, Xtext, and YAML. Polyglot Maven currently supports YAML. This is a welcome feature for developers who find the original XML format annoying; if you don't, no big deal. Van Zyl also says it is important for these DSLs to have repository interoperability and tooling that leverages M2Eclipse.

M2Eclipse

Maven 3 has changes related to embedding that make it work much better inside M2Eclipse (the first Maven integration plugin for Eclipse). Maven 3 is now capable of a 200 to 300% performance boost while running in this plugin environment built specifically for Maven and the Eclipse IDE. M2Eclipse will provide some extra XML metadata in the Maven POM, which only M2Eclipse recognizes; this is one feature that enables high build performance. M2Eclipse also downloads all sources automatically and has single-click new-project creation for any of your dependencies.

Maven Shell

The Maven Shell is another extension point. It is Maven embedded in a long-lived shell process that caches parsed POMs, avoids start-up costs when invoking Maven repeatedly, supports Maven Archetype integration, provides Nexus integration, includes a built-in help system, and, on Mac OS X, provides Growl support. Van Zyl says typical cases will see a 50% reduction in build times. Version 1.0 of the Maven Shell integrates the make-like reactor mode that builds only the modified modules. Support for project workflow, Hudson, Tycho, and Polyglot Maven is also present.

Other Improvements

Developers working in multi-module or multi-POM projects won't have to specify the parent version in every sub-module in Maven 3; instead, you can add version-less parent elements. Maven 3 will also be able to see which POMs supplied which artifacts; in M2Eclipse, you can then deselect a certain contribution and select others. This is made possible by Maven 3's decoupling of execution plans and execution. Maven 3 also includes the extension points mentioned above, which allow developers to hook into different extension points instead of subclassing a plugin to alter its behavior. You might, for example, use an extension point to alter how web.xml is processed by the WAR plugin. The source code in Maven 3 uses Google Guice for dependency injection and Peaberry to add OSGi capabilities to Guice. The whole dependency-resolution mechanism has been refactored into a standalone product by Sonatype called Mercury, for which Maven 3 is a client. Believe it or not, the Maven 3 codebase ended up being one third smaller than Maven 2.

Maven 3.1

Looking toward the next release, Maven 3.1 will include a security manager with the settings.xml implementation as the default; Sonatype is planning an implementation that interacts with Nexus. Maven 3.1 will also introduce POM mixins, which make the configuration more maintainable and portable. Mixins will help solve the problem in Maven 2.0, where sharing configuration could only be done via inheritance. POM mixins are a type of POM composition that allows parameterized POM fragments to be injected into the current POM with a simple reference.

References: "What's New in Maven 3" (posted by Reikje), the DZone interview with Jason van Zyl, and "EclipseMagazine Interview with Jason van Zyl on the Maven Ecosystem"
August 13, 2022
by Mitch Pronschinske
· 45,023 Views · 2 Likes
Creating an Application Using Spring Roo and Deploying on Google App Engine
Spring Roo is a rapid application development tool that helps you rapidly build Spring-based enterprise applications in the Java programming language. Google App Engine is a cloud computing technology that lets you run your application on Google's infrastructure. Using Spring Roo, you can develop applications that can be deployed on Google App Engine. In this tutorial, we will develop a simple application that can run on Google App Engine. Roo configures and manages your application using the Roo shell. The Roo shell can be launched as a stand-alone command-line tool or as a view pane in the SpringSource Tool Suite IDE.

Create It Fast and Effectively

Most who create applications want to make them fast, and they want to make them effectively. What this means is that if they can figure out a way to create something that will both work for their users and also provide the speed of transaction they need, then it is entirely possible that this will be precisely what they need to do in order to get the best results. Most are looking to Google search as a great way to get their apps out into the world, and it seems like this is as good a place as any to start. Pushing out apps that can help the general population get the help they need with various projects means working with the most popular search engine in the world to make it happen. Thus, you should look to develop apps that work on Google in order to get the kind of results you require.

Prerequisites

Before we can start using the Roo shell, we need to download and install all prerequisites:

  • Download and install SpringSource Tool Suite 2.3.3.M2. Spring Roo 1.1.0.M2 comes bundled with STS.
  • While installing STS, the installer asks for the location where STS should be installed. In that directory, it will create a folder named "roo-%release_number%" which contains the Roo files. Add %spring_roo%/roo-1.1.0.m2/bin to your path so that you can fire Roo commands from the command line.
  • Start STS and go to the dashboard (Help -> Dashboard).
  • Click on the Extensions tab.
  • Install the "Google Plugin for Eclipse" and the "DataNucleus Plugin". Restart STS when prompted.

After installing all of the above, we can start building the application.

The conferenceregistration.roo Application

Conference registration is a simple application where speakers can register themselves and create a session they want to talk about. So, we will have two entities: Speaker and Presentation. Follow these instructions to create the application:

1. Open your operating system's command-line shell.

2. Create a directory named conference-registration.

3. Go to the conference-registration directory in your command-line shell.

4. Fire the roo command. You will see the Roo shell.

5. The hint command gives you the next actions you can take to manage your application. Type the hint command and press Enter. Roo will tell you that first you need to create a project, and that to create a project you should type 'project' and then hit Tab. The hint command is very useful, as you don't have to memorize all the commands; it always gives you the next logical steps you can take at that point.

6. The Roo hint command told us that we have to create the project, so type the project command as shown below:

project --topLevelPackage com.shekhar.conference.registration --java 6

This command creates a new Maven project with the top-level package name com.shekhar.conference.registration and creates directories for storing source code and other resource files. In this command, we also specified that we are using Java version 6.

7. Once you have created the project, type the hint command again. Roo will tell you that now you have to set up persistence. Type the following command:

persistence setup --provider DATANUCLEUS --database GOOGLE_APP_ENGINE --applicationId roo-gae

This command sets up everything required for persistence. It creates persistence.xml and adds all the dependencies required for persistence to pom.xml. We have chosen DataNucleus as the provider and Google App Engine as the database because we are developing our application for Google App Engine, which uses its own datastore. The application id is also required when we deploy our application to Google App Engine. Now our persistence setup is complete.

8. Type the hint command again; Roo will tell you that you have to create entities now. So we need to create our entities, Speaker and Presentation. To create the Speaker entity, type the following commands:

entity --class ~.domain.Speaker --testAutomatically
field string --fieldName fullName --notNull
field string --fieldName email --notNull --regexp ^([0-9a-zA-Z]([-.\w]*[0-9a-zA-Z])*@([0-9a-zA-Z][-\w]*[0-9a-zA-Z]\.)+[a-zA-Z]{2,9})$
field string --fieldName city
field date --fieldName birthDate --type java.util.Date --notNull
field string --fieldName bio

The above six lines create an entity named Speaker with different fields. Here we have used the notNull constraint, email regexp validation, and a date field. Spring Roo on App Engine does not support enums and references yet, which means that you can't define one-to-one or one-to-many relationships between entities. These capabilities are supported in Spring MVC applications, but Spring MVC applications can't be deployed on App Engine as of now. Spring Roo's JIRA tracks these issues; they will be fixed in future releases (I hope so :) ). A rough sketch of the resulting entity follows after these steps.

9. Next, create the second entity of our application, Presentation. To create the Presentation entity, type the following commands in the Roo shell:

entity --class ~.domain.Presentation --testAutomatically
field string --fieldName title --notNull
field string --fieldName description --notNull
field string --fieldName speaker --notNull

The above four lines create a JPA entity called Presentation, located in the domain sub-package, and add three fields: title, description, and speaker. As you can see, the speaker is added as a string (just enter the full name); Spring Roo on Google App Engine still does not support references.

10. Now that we have created our entities, we have to create the face of our application, i.e., the user interface. Currently, only GWT-created UIs run on App Engine, so we will create a GWT user interface. To do that, type:

gwt setup

This command adds the GWT controller as well as all the required UI scaffolding. It may take a couple of minutes if you don't have the dependencies in your Maven repository.

11. Next, you can configure log4j to debug level using the following command:

logging setup --level debug

12. Quit the Roo shell.

13. You can easily run your application locally if you have Maven installed on your system: simply type "mvn gwt:run" at your command-line shell while you are in the directory in which you created the project. This launches GWT development mode so you can test your application. Please note that the application does not run in the Google Chrome browser when started from your development environment, so run it in Firefox instead.

14. To deploy your application to Google App Engine, just type:

mvn gwt:compile gae:deploy

It will ask you for your App Engine credentials (email id and password).
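To make the effect of the entity and field commands above more concrete, here is a rough, hand-written plain-JPA approximation of the resulting Speaker entity. It is only a sketch for illustration: Spring Roo actually generates an annotation-driven class plus AspectJ inter-type declarations (and, on App Engine, datastore-specific key handling), so the real generated code looks different; the class and field names simply mirror the commands.

package com.shekhar.conference.registration.domain;

import java.util.Date;

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;
import javax.validation.constraints.NotNull;
import javax.validation.constraints.Pattern;

// Hand-written approximation of the Speaker entity described by the Roo commands above.
@Entity
public class Speaker {

    @Id
    @GeneratedValue
    private Long id;

    @NotNull
    private String fullName;

    @NotNull
    @Pattern(regexp = "^([0-9a-zA-Z]([-.\\w]*[0-9a-zA-Z])*@([0-9a-zA-Z][-\\w]*[0-9a-zA-Z]\\.)+[a-zA-Z]{2,9})$")
    private String email;

    private String city;

    @NotNull
    @Temporal(TemporalType.DATE)
    private Date birthDate;

    private String bio;

    // Getters and setters omitted for brevity; Roo would generate these
    // (and much more) automatically via its add-ons.
}

The Presentation entity would follow the same shape, with title, description, and speaker as plain string fields.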
August 12, 2022
by Shekhar Gulati
· 49,993 Views · 1 Like
Clojure: Destructuring
In The Joy of Clojure (TJoC), destructuring is described as a mini-language within Clojure. It's not essential to learn this mini-language; however, as the authors of TJoC point out, destructuring facilitates concise, elegant code.

Making Code More Understandable

One of the scariest things for those who are just now learning to code is that they have to figure out what a seemingly impossible set of rules and structures means for the work they are trying to do. It is not easy at all, and many people struggle with it in big ways. Fortunately, there are some people who are going about the process of destructuring code so that it may be broken into smaller and more manageable chunks. If this happens, one can easily see how they might get a lot more value from the process of coding, and even how they can contribute to it themselves in the future. We need to be as encouraging of the next generation of coders as we possibly can, because there is no question that they will ultimately have an outsized impact on how the future of coding is decided. If they are well set up to understand coding and to make sense of its many intricacies, then they will be able to handle it without problems. However, we need to support and encourage them along the way, and that all begins by making coding easier to understand in general.

What Is Destructuring?

Clojure supports abstract structural binding, often called destructuring, in let binding lists, fn parameter lists, and any macro that expands into a let or fn. -- http://clojure.org/special_forms

The simplest example of destructuring is assigning the values of a vector:

user=> (def point [5 7])
#'user/point
user=> (let [[x y] point] (println "x:" x "y:" y))
x: 5 y: 7

Note: I'm using let for my examples of destructuring; however, in practice I tend to use destructuring in function parameter lists at least as often, if not more often. I'll admit that I can't remember ever using destructuring like the first example, but it's a good starting point.

A more realistic example is splitting a vector into a head and a tail. When defining a function with an arglist** you use an ampersand. The same is true in destructuring:

user=> (def indexes [1 2 3])
#'user/indexes
user=> (let [[x & more] indexes] (println "x:" x "more:" more))
x: 1 more: (2 3)

It's also worth noting that you can bind the entire vector to a local using the :as directive:

user=> (def indexes [1 2 3])
#'user/indexes
user=> (let [[x & more :as full-list] indexes] (println "x:" x "more:" more "full list:" full-list))
x: 1 more: (2 3) full list: [1 2 3]

Vector examples are the easiest; however, in practice I find myself using destructuring with maps far more often. Simple destructuring on a map is as easy as choosing a local name and providing the key:

user=> (def point {:x 5 :y 7})
#'user/point
user=> (let [{the-x :x the-y :y} point] (println "x:" the-x "y:" the-y))
x: 5 y: 7

As the example shows, the values of :x and :y are bound to locals with the names the-x and the-y. In practice we would never prepend "the-" to our local names; using different names simply provides a bit of clarity for our first example. In production code you would be much more likely to want locals with the same name as the key. This works perfectly well, as the next example shows:

user=> (def point {:x 5 :y 7})
#'user/point
user=> (let [{x :x y :y} point] (println "x:" x "y:" y))
x: 5 y: 7

While this works perfectly well, creating locals with the same name as the keys becomes tedious and annoying (especially when your keys are longer than one letter). Clojure anticipates this frustration and provides the :keys directive, which allows you to specify keys that you would like as locals with the same name:

user=> (def point {:x 5 :y 7})
#'user/point
user=> (let [{:keys [x y]} point] (println "x:" x "y:" y))
x: 5 y: 7

There are a few directives that work while destructuring maps. The above example shows the use of :keys. In practice I end up using :keys the most; however, I've also used the :as directive while working with maps. The following example illustrates the use of an :as directive to bind a local to the entire map:

user=> (def point {:x 5 :y 7})
#'user/point
user=> (let [{:keys [x y] :as the-point} point] (println "x:" x "y:" y "point:" the-point))
x: 5 y: 7 point: {:x 5, :y 7}

We've now seen the :as directive used for both vectors and maps. In both cases, the local is always assigned the entire expression that is being destructured.

For completeness I'll document the :or directive; however, I must admit that I've never used it in practice. The :or directive is used to assign default values when the map being destructured doesn't contain a specified key:

user=> (def point {:y 7})
#'user/point
user=> (let [{:keys [x y] :or {x 0 y 0}} point] (println "x:" x "y:" y))
x: 0 y: 7

Lastly, it's also worth noting that you can destructure nested maps, vectors, and a combination of both. The following example destructures a nested map:

user=> (def book {:name "SICP" :details {:pages 657 :isbn-10 "0262011530"}})
#'user/book
user=> (let [{name :name {pages :pages isbn-10 :isbn-10} :details} book] (println "name:" name "pages:" pages "isbn-10:" isbn-10))
name: SICP pages: 657 isbn-10: 0262011530

As you would expect, you can also use directives while destructuring nested maps:

user=> (def book {:name "SICP" :details {:pages 657 :isbn-10 "0262011530"}})
#'user/book
user=> (let [{name :name {:keys [pages isbn-10]} :details} book] (println "name:" name "pages:" pages "isbn-10:" isbn-10))
name: SICP pages: 657 isbn-10: 0262011530

Destructuring nested vectors is also very straightforward, as the following example illustrates:

user=> (def numbers [[1 2][3 4]])
#'user/numbers
user=> (let [[[a b][c d]] numbers] (println "a:" a "b:" b "c:" c "d:" d))
a: 1 b: 2 c: 3 d: 4

Since binding forms can be nested within one another arbitrarily, you can pull apart just about anything -- http://clojure.org/special_forms

The following example destructures a map and a vector at the same time:

user=> (def golfer {:name "Jim" :scores [3 5 4 5]})
#'user/golfer
user=> (let [{name :name [hole1 hole2] :scores} golfer] (println "name:" name "hole1:" hole1 "hole2:" hole2))
name: Jim hole1: 3 hole2: 5

The same example can be rewritten as a function definition to show the simplicity of using destructuring in parameter lists:

user=> (defn print-status [{name :name [hole1 hole2] :scores}] (println "name:" name "hole1:" hole1 "hole2:" hole2))
#'user/print-status
user=> (print-status {:name "Jim" :scores [3 5 4 5]})
name: Jim hole1: 3 hole2: 5

There are other (less used) directives and deeper explanations available at http://clojure.org/special_forms and in The Joy of Clojure. I recommend both.

**(defn do-something [x y & more] ... )
August 12, 2022
by Jay Fields
· 13,971 Views · 1 Like
The Challenges of a JavaFX Reboot
In Jonathan Giles's post "An FX Experience Retrospective," he starts by looking at the history of JavaFX and focuses on "what has happened in the world of JavaFX" in 2011. I was highly skeptical of JavaFX prior to JavaOne 2010 (see here and here for examples), but started to think more positively about it after the JavaOne 2010 and JavaOne 2011 announcements related to JavaFX. One thing that has been a little tricky about learning JavaFX since JavaOne 2011's big announcements has been knowing for certain whether a particular resource on JavaFX applies to JavaFX 1.x or JavaFX 2.x. Reading the "An FX Experience Retrospective" post provided a different perspective on the risks and challenges Oracle and the JavaFX team have faced in making this major overhaul.

A JavaFX Reboot Is Going to Be a Challenge

It is absolutely the case that a JavaFX reboot will be an extreme challenge, in that most people haven't given the idea of rebooting this system much thought at all. They assume that the JavaFX system will work just as well for them today as it has in the past, but they may be making a big mistake in making such a sweeping judgment. There are many who feel that JavaFX is in need of an update/upgrade, but it is not clear how that is going to be possible. So many of the elements of JavaFX were built with a specific purpose in mind, and it is far from easy to change course on that idea at this stage of the game. Thus, it seems likely that JavaFX will largely continue to exist as it has for all of these years and not receive the upgrade that it needs at this time. However, there are at least some people who are working around the edges to see what they can do about putting the JavaFX train back on the tracks that were designed for it.

Giles writes in his post, "Another vivid recollection I have from JavaOne 2010 is the various reactions that people had of this news. It varied from those in shock at losing their favorite language, to those who said it was long overdue and was the right way to proceed with JavaFX." I was in the latter group, welcoming this change, and I would have liked to see it happen even sooner. The ability to use JavaFX with the standard Java language and APIs was a huge benefit in my opinion and finally gave credence to the pro-JavaFX argument to Java developers that "JavaFX is Java." The JavaOne 2011 announcements of making JavaFX open source and making it part of standard Java SE were likely less controversial for Java developers (who wouldn't want these characteristics?) and are also important to me in my renewed interest in JavaFX.

For developers just learning JavaFX, it can be a bit tricky to know whether an online resource is for JavaFX 1.x or 2.x without delving into the article. The changes in JavaFX from 1.x to 2.x are significant enough that I generally don't want to risk confusion by reading JavaFX 1.x resources (though some have found value in reading JavaFX 1.x resources in preparation for using JavaFX 2.0). However, there are some clues that can make it quicker and easier to identify which version of JavaFX is applicable.

It is most obvious that an article is about JavaFX 2.0 when it explicitly states so. I try to do this with my blog posts on JavaFX 2.0, though I'm sure I occasionally forget. When an article or blog post does not state the version of JavaFX specifically, another good clue is the date of the resource. In general, it is usually safe to assume that anything written about JavaFX before late 2010 is about JavaFX 1.x, and it is similarly safe to assume that most things written about JavaFX in 2011 or later are about JavaFX 2.x. Another clue to watch for is discussion that includes JavaFX Script or FXML references: the former (JavaFX Script) was exclusive to JavaFX 1.x, and the latter (FXML) is exclusive to JavaFX 2.x.

Some really good documentation on JavaFX 2.x has been made available recently. The JavaFX 2.0 documentation states the following about JavaFX 2.0 versus JavaFX 1.3: "JavaFX 2.0 is the latest major update release for JavaFX. Many of the new features introduced in JavaFX 2.0 are incompatible with JavaFX 1.3. If you are developing a new application in JavaFX, it is recommended that you start with JavaFX 2.0." The JavaFX 2.0 documentation contains many newly written or updated articles and posts on JavaFX 2.0. This set of documentation includes What is JavaFX?, Getting Started with JavaFX, Working with the JavaFX Scene Graph, Introduction to FXML, Getting Started with FXML, and Using JavaFX Charts.

Books on JavaFX provide another perspective on the challenges associated with the major shift in JavaFX's vision. Most JavaFX books that are currently available were written for JavaFX 1.x. berry120 (who has also blogged on JavaFX 2) recently asked, "Any decent books on JavaFX 2?" As far as I can tell, the only JavaFX 2.x book currently available is Carl Dea's JavaFX 2.0: Introduction by Example (I hope to write a review of this short, recipe-oriented book in the near future). This book has a publication date (2011) and JavaFX 2.0 in its title, making it clear that it covers JavaFX 2.0. With books, which typically have a longer time between writing and publishing, even early 2011 publication dates might still mean a book on JavaFX 1.x. Another good clue with books is the price of used copies on the Amazon Marketplace: books on old and/or deprecated language versions tend to be very cheap. Other books on JavaFX 2.0 are likely to come. Pro JavaFX 2 Platform: A Definitive Guide to Script, Desktop and Mobile RIA with Java Technology has an advertised publication date of 12 February 2012.

Carefully wading through online resources and selecting books to purchase is tricky for developers because of the major shift in JavaFX's long-term vision. Giles's post provides some insight into the even greater effort required within Oracle and the JavaFX team to make this major shift. As painful as the shift is, I believe this shift in vision, coming at the cost of short-term pain, gives JavaFX a fighting chance at a prosperous long-term future.
August 12, 2022
by Dustin Marx
· 8,826 Views · 4 Likes
Querydsl vs. JPA Criteria - Introduction
Introduction to the Querydsl series with the goal to highlight the difference from JPA Criteria.
August 9, 2022
by Arnošt Havelka CORE
· 7,531 Views · 6 Likes