Search Results

Search found 4074 results on 163 pages for 'titanium modules'.


  • Apply WCF For Large Projects

    - by svlytns
    We have a large project that has nearly 20 modules in it. We want to use WCF for the business layer. We see three ways to implement WCF in our project: First, use only one data contract and one operation contract; send the ClassName and MethodName to the operation, create the class by reflection, and then invoke the method on the WCF side. Second, put all modules in one WCF application and create their data contracts and operation contracts. Third, create a separate WCF application for each module and host them separately. Which one is the best way? I need your ideas. TIA!
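    By way of illustration, here is a minimal C# sketch (contract and type names are hypothetical, not from the question) contrasting the first option's single reflection-based contract with the explicit per-module contracts of the second and third options; the single contract keeps the service surface tiny, but operations and parameters are no longer checked at compile time.

        using System.Runtime.Serialization;
        using System.ServiceModel;

        // Option 1: one contract for everything; the service resolves the target type
        // and method by reflection from the strings in the request.
        [ServiceContract]
        public interface IDispatcherService
        {
            [OperationContract]
            InvokeResponse Invoke(InvokeRequest request);
        }

        [DataContract]
        public class InvokeRequest
        {
            [DataMember] public string ClassName { get; set; }
            [DataMember] public string MethodName { get; set; }
            [DataMember] public string[] Arguments { get; set; }   // serialized parameters
        }

        [DataContract]
        public class InvokeResponse
        {
            [DataMember] public string Result { get; set; }
        }

        // Options 2 and 3: an explicit, strongly typed contract per module,
        // hosted together in one WCF application or separately per module.
        [ServiceContract]
        public interface IOrderService
        {
            [OperationContract]
            OrderDto GetOrder(int orderId);
        }

        [DataContract]
        public class OrderDto
        {
            [DataMember] public int Id { get; set; }
            [DataMember] public decimal Total { get; set; }
        }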

    Read the article

  • Upgrade from Ubuntu 13.04 to 13.10 causes VMware Workstation 9 problems

    - by dan
    The upgrade caused a problem where running VMware Workstation 9 needed patches to accommodate Linux kernel 3.11. I applied the fixes that others reported, and now I can only run VMware Workstation 9 with sudo. If I run it as a standard user, it says it wants to recompile modules, which it does not do unless I open a terminal and run sudo vmware. That works, but I would like it to work correctly and have the recompiled modules stick. When running under sudo vmware, it does recompile, with errors ((vmware-unity-helper:13019): Gtk-WARNING **: Unable to locate theme engine in module_path: "murrine"), and then it starts up and works OK. Any ideas? Thanks for any help you can provide.

    Read the article

  • Event Driven Communication in Game Engine - Yes or No?

    - by Bunkai.Satori
    As I am reading the book Game Coding Complete (http://www.amazon.com/Game-Coding-Complete-Third-McShaffry/dp/1584506806/ref=sr_1_1?ie=UTF8&qid=1295978774&sr=8-1), the author recommends event-driven communication among all game objects and modules. Basically, all the living game actors and objects should communicate with the key modules (Physics, AI, Game Logic, Game View, etc.) via an internal event messaging system. This would also mean designing an efficient event manager. My question is whether this is a proven and recommended approach. If it is not properly designed, it might consume a lot of CPU cycles that could be used elsewhere. This is especially true if the game is targeted at a mobile platform. What is your opinion and recommendation, please?
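    Purely as an illustration of the pattern under discussion, here is a minimal event-manager sketch in C# (the type and member names are hypothetical and not from the book): modules subscribe to event types, events are queued by whoever raises them, and the queue is drained once per frame.

        using System;
        using System.Collections.Generic;

        // Base type for every game event.
        public abstract class GameEvent { }

        public sealed class ActorMovedEvent : GameEvent
        {
            public int ActorId;
            public float X, Y;
        }

        // Very small event manager: subscribe by event type, queue events,
        // and dispatch the whole queue once per frame from the game loop.
        public sealed class EventManager
        {
            private readonly Dictionary<Type, List<Action<GameEvent>>> listeners =
                new Dictionary<Type, List<Action<GameEvent>>>();
            private readonly Queue<GameEvent> pending = new Queue<GameEvent>();

            public void Subscribe<T>(Action<T> handler) where T : GameEvent
            {
                if (!listeners.TryGetValue(typeof(T), out var handlers))
                    listeners[typeof(T)] = handlers = new List<Action<GameEvent>>();
                handlers.Add(e => handler((T)e));
            }

            public void Queue(GameEvent e) => pending.Enqueue(e);

            // Called once per frame by the game loop.
            public void Dispatch()
            {
                while (pending.Count > 0)
                {
                    var e = pending.Dequeue();
                    if (listeners.TryGetValue(e.GetType(), out var handlers))
                        foreach (var handler in handlers)
                            handler(e);
                }
            }
        }

        // Example wiring: a physics module reacting to actor movement.
        // var events = new EventManager();
        // events.Subscribe<ActorMovedEvent>(e => Console.WriteLine("Physics saw actor " + e.ActorId));
        // events.Queue(new ActorMovedEvent { ActorId = 1, X = 2f, Y = 3f });
        // events.Dispatch();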

    Read the article

  • Can't get Broadcom 43142 drivers to work after installation on 5420 Inspiron

    - by beckett
    I'm a complete newbie to Ubuntu. I've already installed and reinstalled the required Broadcom driver and done a number of steps in the terminal, but I can't seem to get wireless working. However, my wired connection seems to work just fine. I have also tried to follow these instructions, but I get stuck at step 2, where I am told "No package dpkg is available":
    1) Download the file from the link
    2) sudo yum install dpkg
    3) mkdir BCM43142
    4) dpkg-deb -x Downloads/wireless-bcm43142-dkms-6.20.55.19_amd64.deb BCM43142
    5) cd BCM43142/usr/src/wireless-bcm43142-oneiric-dkms-6.20.55.19~bdcom0602.0400.1000.0400/src/wl/sys
    6) sudo yum install kernel-devel kernel-headers
    7) vi wl_linux.c
    8) around line 43, remove the line include
    9) save the file (:wq)
    10) cd ../../..
    11) make (things should work, and you'll have a file called "wl.ko" in the current directory)
    12) sudo yum remove broadcom-wl
    13) sudo mkdir -p /lib/modules/3.5.2-3.fc17.x86_64/extra/wl
    14) sudo cp wl.ko /lib/modules/3.5.2-3.fc17.x86_64/extra/wl
    15) sudo depmod -a
    16) sudo modprobe wl
    I really need help :/

    Read the article

  • Organizing an ASP.NET Single Page Application with Nancy

    - by OnesimusUnbound
    As a personal project, I'm creating a single-page ASP.NET web application using Nancy to provide RESTful services to the single page. Due to the complexity of the single page, particularly the JavaScript used, I think creating a dedicated project for the client side of the web development and another for the service side will organize and simplify the development:

        solution
        |
        +-- web / client side (single html page, js, css)
        |     - contains the asp.net project and the nancy library
        |       to host the modules in the application project folder
        |
        +-- application / service (nancy modules, bootstrap for other layers)
        |
        . . . and other layers (three tier, domain driven, etc.)

    Is this a good way of organizing a complex single page application? Am I over-engineering the web app, incurring too much complexity?
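    For context on the service side of such a split, here is a small illustrative Nancy module (route and type names are hypothetical); the service project would hold modules like this, while the client project holds only the HTML, JavaScript, and CSS that call them.

        using Nancy;

        // A Nancy module in the service-side project; the client-side project
        // contains only the static single page that calls these routes.
        public class TasksModule : NancyModule
        {
            public TasksModule() : base("/api/tasks")
            {
                // GET /api/tasks -> JSON consumed by the single page's JavaScript.
                Get["/"] = _ => Response.AsJson(new[]
                {
                    new { Id = 1, Title = "Write the client" },
                    new { Id = 2, Title = "Write the service" }
                });

                // GET /api/tasks/1
                Get["/{id}"] = parameters => Response.AsJson(new { Id = (int)parameters.id, Title = "Sample task" });
            }
        }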

    Read the article

  • Error while 'processing initramfs-tools' when installing KDE 4.11 through the terminal

    - by saptarshi nag
    The full output is:

        Setting up initramfs-tools (0.103ubuntu0.2.1) ...
        update-initramfs: deferring update (trigger activated)
        Processing triggers for initramfs-tools ...
        update-initramfs: Generating /boot/initrd.img-3.5.0-42-generic
        cp: cannot stat ‘/module-files.d/libpango1.0-0.modules’: No such file or directory
        cp: cannot stat ‘/modules/pango-basic-fc.so’: No such file or directory
        E: /usr/share/initramfs-tools/hooks/plymouth failed with return 1.
        update-initramfs: failed for /boot/initrd.img-3.5.0-42-generic with 1.
        dpkg: error processing initramfs-tools (--configure):
         subprocess installed post-installation script returned error exit status 1
        Errors were encountered while processing:
         initramfs-tools
        E: Sub-process /usr/bin/dpkg returned an error code (1)

    Read the article

  • Which programming career path fits my terms? [closed]

    - by Goward Gerald
    I am sick and tired of my enterprise development job; I need a programming direction like this:
    - Demanded in the job market
    - Demanded in the freelance market
    - Can use Ubuntu as the development environment
    - Not enterprise. Standalone, mobile, web development, anything, just not enterprise.
    Basically, I need a programming direction which doesn't need 20 developers, terribly big database systems, and long-running projects with intense long-term support. I don't want an enterprise job where a lot of people work on one terribly big project and add modules to it all day long. Instead, I need something where:
    - Projects change pretty often
    - Projects are little or medium-sized (in terms of code, modules, and people working on them) but still not enterprise-sized
    - Possible for freelance or solo development, or at least requiring a team of 3-4 programmers. Not like in the enterprise, where you feel like a drop in the sea with your 50 classes while the system itself has hundreds of classes.
    Suggestions please?

    Read the article

  • UPK Pre-Built Content Update

    - by Karen Rihs
    UPK pre-built content development efforts are always underway and growing. Over the last few months, the following new and upgraded modules became available:  NEW CONTENT RELEASES E-Business Suite 12.1 Field Service Manufacturing Operations Center Process Manufacturing:  System Administration Strategic Network Optimization U.S. Federal Financials Oracle Communications 11.1 Oracle Communications UPK for Pricing Design Center, Voice and Data Offerings Oracle Mobile Workforce 2.1.0 Administrative Setup User Tasks Primavera Primavera Portfolio Management 9.0 UPK CONTENT UPGRADES JDE E1 9.1 HCM Fundamentals for EnterpriseOne Manufacturing - Product Data Management Manufacturing Management Discrete Shop Floor Management Procurement and Subcontract Management JDE World A9.3 Accounts Payable Address Book  Common Foundation General Ledger For a list of modules currently available for each product line, visit the UPK Resource Library on Oracle.com. For more information on how your organization can take advantage of UPK pre-built content, see our previous blog,  The Value of UPK Pre-Built Content. - Karen Rihs, UPK Outbound Product Management

    Read the article

  • How to suggest using an ORM instead of stored procedures?

    - by Wayne M
    I work at a company that only uses stored procedures for all data access, which makes it very annoying to keep our local databases in sync, as on every commit we have to run new procs. I have used some basic ORMs in the past and I find the experience much better and cleaner. I'd like to suggest to the development manager and the rest of the team that we look into using an ORM of some kind for future development (the rest of the team are only familiar with stored procedures and have never used anything else). The current architecture is .NET 3.5 written like .NET 1.1, with "god classes" that use a strange implementation of ActiveRecord and return untyped DataSets which are looped over in code-behind files. The classes work something like this:

        class Foo
        {
            public bool LoadFoo()
            {
                bool blnResult = false;
                if (this.FooID == 0)
                {
                    throw new Exception("FooID must be set before calling this method.");
                }
                DataSet ds = // ... call to Sproc
                if (ds.Tables[0].Rows.Count > 0)
                {
                    this.FooName = ds.Tables[0].Rows[0]["FooName"].ToString();
                    // other properties set
                    blnResult = true;
                }
                return blnResult;
            }
        }

        // Consumer
        Foo foo = new Foo();
        foo.FooID = 1234;
        foo.LoadFoo();
        // do stuff with foo...

    There is pretty much no application of any design patterns. There are no tests whatsoever (nobody else knows how to write unit tests, and testing is done by manually loading up the website and poking around). Looking through our database we have: 199 tables, 13 views, a whopping 926 stored procedures, and 93 functions. About 30 or so tables are used for batch jobs or external things; the remainder are used in our core application. Is it even worth pursuing a different approach in this scenario? I'm talking about moving forward only, since we aren't allowed to refactor the existing code because "it works", so we cannot change the existing classes to use an ORM. But I don't know how often we add brand new modules instead of adding to or fixing current modules, so I'm not sure if an ORM is the right approach (too much invested in stored procedures and DataSets). If it is the right choice, how should I present the case for using one? Off the top of my head, the only benefits I can think of are cleaner code (although it might not be, since the current architecture isn't built with ORMs in mind, so we would basically be jury-rigging ORMs onto future modules while the old ones would still be using the DataSets) and less hassle in having to remember which procedure scripts have been run and which still need to be run, but that's it, and I don't know how compelling an argument that would be. Maintainability is another concern, but one that nobody except me seems to be concerned about.
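    For comparison, here is a minimal sketch of how the same Foo lookup might look with an ORM, assuming Entity Framework's code-first DbContext API; the context class and queries are illustrative, not taken from the codebase described above.

        using System.Data.Entity;   // Entity Framework (code-first)
        using System.Linq;

        public class Foo
        {
            public int FooID { get; set; }       // primary key by EF convention
            public string FooName { get; set; }
        }

        public class FooContext : DbContext
        {
            public DbSet<Foo> Foos { get; set; }
        }

        // Consumer: no stored procedure, no DataSet, and the entity is strongly typed.
        class Program
        {
            static void Main()
            {
                using (var db = new FooContext())
                {
                    Foo foo = db.Foos.Find(1234);                   // lookup by primary key
                    var names = db.Foos.Where(f => f.FooName.StartsWith("A"))
                                       .Select(f => f.FooName)
                                       .ToList();                    // translated to SQL by the ORM
                }
            }
        }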

    Read the article

  • .Net Application & Database Modularity/Reuse

    - by Martaver
    I'm looking for some guidance on how to architect an app with regard to modularity, separation of concerns, and re-usability. I'm working on an application (ASP.NET, C#) that has distinctly generic chunks of functionality that I'd love to be able to lift out, all layers, into re-usable components. This means the module handles the database schema, data access, API, everything, so that the next time I want to use it I can just register the module and hook into it. Developing modules of re-usable functionality is a no-brainer, but what is really confusing me is what to do when it comes to handling a core re-usable database schema that serves the module's functionality. In an ideal world, I would register a module and it would ensure that the associated database schema exists in the DB. I would code on the assumption that the tables exist, calling the module's functionality through the DLL, agnostic of the database layer. Kind of like Enterprise Library's Caching/Logging Application Block, which can create a DB schema in the target DB to use as a data store. My question is: what do you think is the best way to achieve this, first in terms of design architecture, and second in terms of solution structure? What patterns/frameworks do you know of that support this kind of thing? My thoughts so far (I mostly use Entity Framework and SQL Server DB Projects):
    - I thought about a 'black box' approach to modules of functionality. I could use a code-first approach in EF4 and use the ObjectContext to create a database when the module is initialized. However, this means that all of the entities that my module encapsulates would be disconnected from the rest of the application because they belong to an abstracted ObjectContext. Further, creating appropriate indexes and references between domain entities and the module's entities would be practically impossible.
    - I've thought of adopting Enterprise Library and creating my own Application Blocks. I'm not sure how this would play nicely with Entity Framework (if at all), though. I like the idea of building on proven patterns & practices to encapsulate established, reusable functionality.
    - I thought of abandoning Entity Framework for the module and just creating a separate DB schema for the module with its own set of stored procedures & ADO.NET, then deploying the script at run time if interrogation shows that it doesn't exist. But once again, for application development outside of the module, I would want to use Entity Framework, and I would have to use the module separately, disconnected from the domain ObjectContext.
    Has anyone had experience developing these sorts of full-stack modules? What advice can you offer? Am I biting off more than I can chew?
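    As a rough sketch of the 'module ensures its own schema' idea, here is a hypothetical example using Entity Framework code-first (the class names and connection-string name are illustrative): the module ships its own context and creates its tables on first use, so the host application only has to register the module.

        using System.Data.Entity;

        // Entities that belong only to the reusable module.
        public class AuditEntry
        {
            public int Id { get; set; }
            public string Message { get; set; }
            public System.DateTime CreatedAt { get; set; }
        }

        // The module's private context; the host application never sees it directly.
        internal class AuditModuleContext : DbContext
        {
            public AuditModuleContext() : base("name=AuditModule") { }   // hypothetical connection-string name
            public DbSet<AuditEntry> Entries { get; set; }
        }

        // Public facade the host registers and calls.
        public class AuditModule
        {
            public void Initialize()
            {
                using (var db = new AuditModuleContext())
                {
                    // Creates the module's schema if it is not already in the target database.
                    db.Database.CreateIfNotExists();
                }
            }

            public void Write(string message)
            {
                using (var db = new AuditModuleContext())
                {
                    db.Entries.Add(new AuditEntry { Message = message, CreatedAt = System.DateTime.UtcNow });
                    db.SaveChanges();
                }
            }
        }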

    Read the article

  • When to use mixins in Ruby

    - by Gilles
    I am wondering when to use mixins? I have read about them. Many authors compare them to interfaces, abstract classes, etc. Mixins are modules that are mixed-in and modules are a way to group similar methods, constants and classes together. I have seen examples where a module for math functions is created. It makes sense to group and reuse such functions but should I only mix these in a class if I am faced with an inheritance situation? Should I mix these in anytime I want to use them in a class? Should they be used exactly like interfaces in other languages or are there other subtleties?

    Read the article

  • AngularJS: structuring a web application with multiple ng-apps

    - by mg1075
    The blogosphere has a number of articles on the topic of AngularJS app structuring guidelines such as these (and others): http://www.johnpapa.net/angular-app-structuring-guidelines/ http://codingsmackdown.tv/blog/2013/04/19/angularjs-modules-for-great-justice/ http://danorlando.com/angularjs-architecture-understanding-modules/ http://henriquat.re/modularizing-angularjs/modularizing-angular-applications/modularizing-angular-applications.html However, one scenario I have yet to come across for guidelines and best practices is the case where you have a large web application containing multiple "mini-spa" apps, and the mini-spa apps all share a certain amount of code. I am not referring to the case of trying to have multiple ng-app declarations on the same page; rather, I mean different sections of a large site that have their own, unique ng-app declaration. As Scott Allen writes in his OdeToCode blog: One scenario I haven't found addressed very well is the scenario where multiple apps exist in the same greater web application and require some shared code on the client. Are there any recommended approaches to take, pitfalls to avoid, or good sample structures of this scenario that you can point to?

    Read the article

  • What's the best way to refactor this Rails controller?

    - by Robert DiNicolas
    I'd like some advice on how best to refactor this controller. The controller builds a page of zones and modules. Page has_many zones, zone has_many modules, so zones are just a cluster of modules wrapped in a container. The problem I'm having is that some modules may have specific queries that I don't want executed on every page, so I've had to add conditions. The conditions just test whether the module is on the page; if it is, the query is executed. One of the problems with this is that if I add a hundred special module queries, the controller has to iterate through each one. I would like to see these module conditions moved out of the controller, as well as all the additional custom actions. I can keep everything in this one controller, but I plan to have many apps using this controller, so it could get messy.

        class PagesController < ApplicationController
          # GET /pages/1
          # GET /pages/1.xml
          # Show is the main page rendering action, page routes are aliased in routes.rb
          def show
            #-+-+-+-+-Core Page Queries-+-+-+-+-
            @page = Page.find(params[:id])
            @zones = @page.zones.find(:all, :order => 'zones.list_order ASC')
            @mods = @page.mods.find(:all)
            @columns = Page.columns
            # restful params to influence page rendering, see routes.rb
            @fragment = params[:fragment] # render single module
            @cluster = params[:cluster]   # render single zone
            @head = params[:head]         # render html, body and head

            #-+-+-+-+-Page Level Json Conversions-+-+-+-+-
            @metas = @page.metas ? ActiveSupport::JSON.decode(@page.metas) : nil
            @javascripts = @page.javascripts ? ActiveSupport::JSON.decode(@page.javascripts) : nil

            #-+-+-+-+-Module Specific Queries-+-+-+-+-
            # would like to refactor this process
            @mods.each do |mod|
              # Reps Module Custom Queries
              if mod.name == "reps"
                @reps = User.find(:all, :joins => :roles, :conditions => { :roles => { :name => 'rep' } })
              end
              # Listing-poc Module Custom Queries
              if mod.name == "listing-poc"
                limit = params[:limit].to_i < 1 ? 10 : params[:limit]
                PropertyEntry.update_from_listing(mod.service_url)
                @properties = PropertyEntry.all(:limit => limit, :order => "city desc")
              end
              # Talents-index Module Custom Queries
              if mod.name == "talents-index"
                @talent = params[:type]
                @reps = User.find(:all, :joins => :talents, :conditions => { :talents => { :name => @talent } })
              end
            end

            respond_to do |format|
              format.html # show.html.erb
              format.xml  { render :xml => @page.to_xml( :include => { :zones => { :include => :mods } } ) }
              format.json { render :json => @page.to_json }
              format.css  # show.css.erb, CSS dependency manager template
            end
          end

          # for property listing ajax request
          def update_properties
            limit = params[:limit].to_i < 1 ? 10 : params[:limit]
            offset = params[:offset]
            @properties = PropertyEntry.all(:limit => limit, :offset => offset, :order => "city desc")
            #render :nothing => true
          end
        end

    So imagine a site with a hundred modules and scores of additional controller actions. I think most would agree that it would be much cleaner if I could move that code out and refactor it to behave more like a configuration.

    Read the article

  • How to compile the Asus Syntek webcam driver (stk11xx)

    - by berot3
    I have an Asus V1S, and some time ago I was able to easily compile and install it, but now compiling fails... I hope you know what I can do, if there is anything that can be done... After

        svn checkout https://syntekdriver.svn.sourceforge.net/svnroot/syntekdriver syntekdriver
        cd ./syntekdriver/trunk/driver
        make -f Makefile.standalone

    I got:

        make -C /lib/modules/2.6.36.2/build SUBDIRS=/home/berot3/syntekdriver/trunk/driver modules
        make[1]: Entering directory '/usr/src/linux-headers-2.6.38-8-generic'
        CC [M] /home/berot3/syntekdriver/trunk/driver/stk11xx-usb.o
        CC [M] /home/berot3/syntekdriver/trunk/driver/stk11xx-v4l.o
        /home/berot3/syntekdriver/trunk/driver/stk11xx-v4l.c:43:28: fatal error: linux/videodev.h: File not found!
        compilation terminated.
        make[2]: *** [/home/berot3/syntekdriver/trunk/driver/stk11xx-v4l.o] Error 1
        make[1]: *** [_module_/home/berot3/syntekdriver/trunk/driver] Error 2
        make[1]: Leaving directory '/usr/src/linux-headers-2.6.38-8-generic'
        make: *** [driver] Error 2

    The same happens for http://bookeldor-net.info/merdier/Makefile-syntekdriver (http://webcache.googleusercontent.com/search?q=cache:OM0zVNlYiVQJ:bookeldor-net.info/merdier/Makefile-syntekdriver+http://bookeldor-net.info/merdier/Makefile-syntekdriver&hl=de&strip=1).

    Read the article

  • Project Jigsaw: Late for the train: The Q&A

    - by Mark Reinhold
    I recently proposed, to the Java community in general and to the SE 8 (JSR 337) Expert Group in particular, to defer Project Jigsaw from Java 8 to Java 9. I also proposed to aim explicitly for a regular two-year release cycle going forward. Herewith a summary of the key questions I’ve seen in reaction to these proposals, along with answers. Making the decision Q Has the Java SE 8 Expert Group decided whether to defer the addition of a module system and the modularization of the Platform to Java SE 9? A No, it has not yet decided. Q By when do you expect the EG to make this decision? A In the next month or so. Q How can I make sure my voice is heard? A The EG will consider all relevant input from the wider community. If you have a prominent blog, column, or other communication channel then there’s a good chance that we’ve already seen your opinion. If not, you’re welcome to send it to the Java SE 8 Comments List, which is the EG’s official feedback channel. Q What’s the overall tone of the feedback you’ve received? A The feedback has been about evenly divided as to whether Java 8 should be delayed for Jigsaw, Jigsaw should be deferred to Java 9, or some other, usually less-realistic, option should be taken. Project Jigsaw Q Why is Project Jigsaw taking so long? A Project Jigsaw started at Sun, way back in August 2008. Like many efforts during the final years of Sun, it was not well staffed. Jigsaw initially ran on a shoestring, with just a handful of mostly part-time engineers, so progress was slow. During the integration of Sun into Oracle all work on Jigsaw was halted for a time, but it was eventually resumed after a thorough consideration of the alternatives. Project Jigsaw was really only fully staffed about a year ago, around the time that Java 7 shipped. We’ve added a few more engineers to the team since then, but that can’t make up for the inadequate initial staffing and the time lost during the transition. Q So it’s really just a matter of staffing limitations and corporate-integration distractions? A Aside from these difficulties, the other main factor in the duration of the project is the sheer technical difficulty of modularizing the JDK. Q Why is modularizing the JDK so hard? A There are two main reasons. The first is that the JDK code base is deeply interconnected at both the API and the implementation levels, having been built over many years primarily in the style of a monolithic software system. We’ve spent considerable effort eliminating or at least simplifying as many API and implementation dependences as possible, so that both the Platform and its implementations can be presented as a coherent set of interdependent modules, but some particularly thorny cases remain. Q What’s the second reason? A We want to maintain as much compatibility with prior releases as possible, most especially for existing classpath-based applications but also, to the extent feasible, for applications composed of modules. Q Is modularizing the JDK even necessary? Can’t you just put it in one big module? A Modularizing the JDK, and more specifically modularizing the Java SE Platform, will enable standard yet flexible Java runtime configurations scaling from large servers down to small embedded devices. In the long term it will enable the convergence of Java SE with the higher-end Java ME Platforms. Q Is Project Jigsaw just about modularizing the JDK? A As originally conceived, Project Jigsaw was indeed focused primarily upon modularizing the JDK. 
The growing demand for a truly standard module system for the Java Platform, which could be used not just for the Platform itself but also for libraries and applications built on top of it, later motivated expanding the scope of the effort. Q As a developer, why should I care about Project Jigsaw? A The introduction of a modular Java Platform will, in the long term, fundamentally change the way that Java implementations, libraries, frameworks, tools, and applications are designed, built, and deployed. Q How much progress has Project Jigsaw made? A We’ve actually made a lot of progress. Much of the core functionality of the module system has been prototyped and works at both compile time and run time. We’ve extended the Java programming language with module declarations, worked out a structure for modular source trees and corresponding compiled-class trees, and implemented these features in javac. We’ve defined an efficient module-file format, extended the JVM to bootstrap a modular JRE, and designed and implemented a preliminary API. We’ve used the module system to make a good first cut at dividing the JDK and the Java SE API into a coherent set of modules. Among other things, we’re currently working to retrofit the java.util.ServiceLoader API to support modular services. Q I want to help! How can I get involved? A Check out the project page, read the draft requirements and design overview documents, download the latest prototype build, and play with it. You can tell us what you think, and follow the rest of our work in real time, on the jigsaw-dev list. The Java Platform Module System JSR Q What’s the relationship between Project Jigsaw and the eventual Java Platform Module System JSR? A At a high level, Project Jigsaw has two phases. In the first phase we’re exploring an approach to modularity that’s markedly different from that of existing Java modularity solutions. We’ve assumed that we can change the Java programming language, the virtual machine, and the APIs. Doing so enables a design which can strongly enforce module boundaries in all program phases, from compilation to deployment to execution. That, in turn, leads to better usability, diagnosability, security, and performance. The ultimate goal of the first phase is produce a working prototype which can inform the work of the Module-System JSR EG. Q What will happen in the second phase of Project Jigsaw? A The second phase will produce the reference implementation of the specification created by the Module-System JSR EG. The EG might ultimately choose an entirely different approach than the one we’re exploring now. If and when that happens then Project Jigsaw will change course as necessary, but either way I think that the end result will be better for having been informed by our current work. Maven & OSGi Q Why not just use Maven? A Maven is a software project management and comprehension tool. As such it can be seen as a kind of build-time module system but, by its nature, it does nothing to support modularity at run time. Q Why not just adopt OSGi? A OSGi is a rich dynamic component system which includes not just a module system but also a life-cycle model and a dynamic service registry. The latter two facilities are useful to some kinds of sophisticated applications, but I don’t think they’re of wide enough interest to be standardized as part of the Java SE Platform. Q Okay, then why not just adopt the module layer of OSGi? 
A The OSGi module layer is not operative at compile time; it only addresses modularity during packaging, deployment, and execution. As it stands, moreover, it’s useful for library and application modules but, since it’s built strictly on top of the Java SE Platform, it can’t be used to modularize the Platform itself. Q If Maven addresses modularity at build time, and the OSGi module layer addresses modularity during deployment and at run time, then why not just use the two together, as many developers already do? A The combination of Maven and OSGi is certainly very useful in practice today. These systems have, however, been built on top of the existing Java platform; they have not been able to change the platform itself. This means, among other things, that module boundaries are weakly enforced, if at all, which makes it difficult to diagnose configuration errors and impossible to run untrusted code securely. The prototype Jigsaw module system, by contrast, aims to define a platform-level solution which extends both the language and the JVM in order to enforce module boundaries strongly and uniformly in all program phases. Q If the EG chooses an approach like the one currently being taken in the Jigsaw prototype, will Maven and OSGi be made obsolete? A No, not at all! No matter what approach is taken, to ensure wide adoption it’s essential that the standard Java Platform Module System interact well with Maven. Applications that depend upon the sophisticated features of OSGi will no doubt continue to use OSGi, so it’s critical that implementations of OSGi be able to run on top of the Java module system and, if suitably modified, support OSGi bundles that depend upon Java modules. Ideas for how to do that are currently being explored in Project Penrose. Java 8 & Java 9 Q Without Jigsaw, won’t Java 8 be a pretty boring release? A No, far from it! It’s still slated to include the widely-anticipated Project Lambda (JSR 335), work on which has been going very well, along with the new Date/Time API (JSR 310), Type Annotations (JSR 308), and a set of smaller features already in progress. Q Won’t deferring Jigsaw to Java 9 delay the eventual convergence of the higher-end Java ME Platforms with Java SE? A It will slow that transition, but it will not stop it. To allow progress toward that convergence to be made with Java 8 I’ve suggested to the Java SE 8 EG that we consider specifying a small number of Profiles which would allow compact configurations of the SE Platform to be built and deployed. Q If Jigsaw is deferred to Java 9, would the Oracle engineers currently working on it be reassigned to other Java 8 features and then return to working on Jigsaw again after Java 8 ships? A No, these engineers would continue to work primarily on Jigsaw from now until Java 9 ships. Q Why not drop Lambda and finish Jigsaw instead? A Even if the engineers currently working on Lambda could instantly switch over to Jigsaw and immediately become productive—which of course they can’t—there are less than nine months remaining in the Java 8 schedule for work on major features. That’s just not enough time for the broad review, testing, and feedback which such a fundamental change to the Java Platform requires. Q Why not ship the module system in Java 8, and then modularize the platform in Java 9? A If we deliver a module system in one release but don’t use it to modularize the JDK until some later release then we run a big risk of getting something fundamentally wrong. 
If that happens then we’d have to fix it in the later release, and fixing fundamental design flaws after the fact almost always leads to a poor end result. Q Why not ship Jigsaw in an 8.5 release, less than two years after 8? Or why not just ship a new release every year, rather than every other year? A Many more developers work on the JDK today than a couple of years ago, both because Oracle has dramatically increased its own investment and because other organizations and individuals have joined the OpenJDK Community. Collectively we don’t, however, have the bandwidth required to ship and then provide long-term support for a big JDK release more frequently than about every other year. Q What’s the feedback been on the two-year release-cycle proposal? A For just about every comment that we should release more frequently, so that new features are available sooner, there’s been another asking for an even slower release cycle so that large teams of enterprise developers who ship mission-critical applications have a chance to migrate at a comfortable pace.

    Read the article

  • Deploy from NetBeans IDE by Twisting an External Dial

    - by Geertjan
    Via this code in a NetBeans module, i.e., a registered NetBeans ModuleInstall class, you can twist the Tinkerforge Rotary Poti Bricklet to deploy the current application in the IDE:

        import com.tinkerforge.BrickMaster;
        import com.tinkerforge.BrickletLCD20x4;
        import com.tinkerforge.BrickletRotaryPoti;
        import com.tinkerforge.IPConnection;
        import javax.swing.Action;
        import javax.swing.JMenuItem;
        import org.netbeans.api.project.Project;
        import org.netbeans.api.project.ProjectUtils;
        import org.openide.awt.Actions;
        import org.openide.modules.ModuleInstall;
        import org.openide.util.Utilities;

        public class Installer extends ModuleInstall {

            private static final String HOST = "localhost";
            private static final int PORT = 4223;
            private static final String MASTERBRICKUID = "abc";
            private static final String LCDUID = "abc";
            private static final String ROTIUID = "abc";
            private static IPConnection ipc;
            private static BrickMaster master = new BrickMaster(MASTERBRICKUID);
            private static BrickletLCD20x4 lcd = new BrickletLCD20x4(LCDUID);
            private static BrickletRotaryPoti poti = new BrickletRotaryPoti(ROTIUID);

            @Override
            public void restored() {
                try {
                    ipc = new IPConnection(HOST, PORT);
                    ipc.addDevice(master);
                    ipc.addDevice(lcd);
                    ipc.addDevice(poti);
                    poti.setPositionCallbackPeriod(50);
                    poti.addListener(new BrickletRotaryPoti.PositionListener() {
                        @Override
                        public void position(final short position) {
                            lcd.backlightOn();
                            lcd.clearDisplay();
                            final Action runAction = Actions.forID("Project","org.netbeans.modules.project.ui.RunMainProject");
                            //The action must be invoked from menu item or toolbar button,
                            //see line 147 in org.netbeans.modules.project.ui.actions.LookupSensitiveAction:
                            JMenuItem jmi = new JMenuItem(runAction);
                            //When position is 100 (range is -150 to 150), deploy the app
                            //and print info about the project to the LCD display:
                            if (position == 100) {
                                jmi.doClick();
                                Project p = Utilities.actionsGlobalContext().lookup(Project.class);
                                lcd.writeLine((short) 0, (short) 0, "Deployed:");
                                lcd.writeLine((short) 1, (short) 0, ProjectUtils.getInformation(p).getDisplayName());
                            } else {
                                lcd.writeLine((short) 0, (short) 0, "Position: " + position);
                            }
                        }
                    });
                } catch (Exception e) {
                }
            }
        }

    Read the article

  • Is programming in layers real?

    - by Aura
    I am fairly new to product development and I am trying to work on a product. The problem I have realized is that people draw diagrams and charts showing different modules and layers, but as I am working alone (I am my own team) I got a bit confused about the interactions I am facing within the programs during development, and I am wondering whether developing a product in modules is real or not. Maybe I am not a great programmer, but I see no boundaries when data starts to travel from the frontend to the backend.
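    As a concrete, if tiny, illustration of a layer boundary (hypothetical names, C# sketch only): the front end depends only on an interface, and the back end implements it, so the "layer" is simply that seam in the code.

        using System.Collections.Generic;

        // Boundary between layers: the front end depends only on this interface.
        public interface IProductCatalog
        {
            IReadOnlyList<string> GetProductNames();
        }

        // Back-end layer: the concrete implementation (could be a database, a file, a web service...).
        public class InMemoryProductCatalog : IProductCatalog
        {
            private readonly List<string> names = new List<string> { "Widget", "Gadget" };
            public IReadOnlyList<string> GetProductNames() => names;
        }

        // Front-end layer: only ever sees the interface, never the implementation.
        public class ProductListScreen
        {
            private readonly IProductCatalog catalog;
            public ProductListScreen(IProductCatalog catalog) { this.catalog = catalog; }

            public void Render()
            {
                foreach (var name in catalog.GetProductNames())
                    System.Console.WriteLine("* " + name);
            }
        }

        // Wiring it together: even a one-person team can keep the seam explicit.
        // new ProductListScreen(new InMemoryProductCatalog()).Render();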

    Read the article

  • Qt 5 RC released, with a restructuring of the documentation

    Qt 5 alpha released: the first major version of the standalone Qt Project focuses on performance and graphics capabilities. Version 5 of Qt has just been released as an alpha. It is the first major version since Qt became independent with the creation of the Qt Project. Many people contributed to this new version, not only developers from Nokia. The various modules have been grouped into two categories: the essentials, installed by default, and the add-ons, installed on demand. The goal of this alpha version is to gather feedback from users, mainly on the essential modules. Lars Knoll, the lead...

    Read the article

  • Error applying iptables rules

    - by user3215
    When I try to use the iptables command on one of my Rackspace cloud servers, I get the following error. I was trying to apply iptables rules with iptables-apply -t 120 /etc/iptables.rules and iptables-restore < /etc/iptables.rules:

        FATAL: Could not load /lib/modules/2.6.35.4-rscloud/modules.dep: No such file or directory
        iptables-restore v1.4.4: iptables-restore: unable to initialize table 'filter'
        Error occurred at line: 2
        Try `iptables-restore -h' or 'iptables-restore --help' for more information.

    How can I fix this? Does anybody have a technique?

    Read the article

  • How can I fix the iptables error message "unable to initialize table 'filter'"?

    - by user3215
    When I try to use the iptables command on one of my Rackspace cloud servers, I get the following error. In an attempt to apply iptables rules with iptables-apply -t 120 /etc/iptables.rules and iptables-restore < /etc/iptables.rules I got the following error:

        FATAL: Could not load /lib/modules/2.6.35.4-rscloud/modules.dep: No such file or directory
        iptables-restore v1.4.4: iptables-restore: unable to initialize table 'filter'
        Error occurred at line: 2
        Try `iptables-restore -h' or 'iptables-restore --help' for more information.

    How do I fix this?

    Read the article

  • No sound card detected [elementary os]

    - by Silouane Gerin
    I set up a computer for my parents with elementary OS. I installed an Asus P8H61-I motherboard, which contains an integrated Realtek ALC887 sound card. The system doesn't recognize the sound card. Here are the results of some commands: lspci | grep audio doesn't return anything; alsa force-reload returns "Unloading ALSA sound driver modules: (none loaded). Loading ALSA sound driver modules: (none to reload)."; aplay -l reports no sound card detected; and cat /proc/asound/cards lets me know that asound doesn't exist. I tried installing Realtek drivers but nothing happened. Here is a log from alsa: Log. Can someone help me? Thanks.

    Read the article

  • HTML Tidy in NetBeans IDE (Part 2)

    - by Geertjan
    This is what I was aiming for in the previous blog entry: What you can see above (especially if you click to enlarge it) is that I have HTML Tidy integrated into the NetBeans analyzer functionality, which is pluggable from 7.2 onwards. Well, if you set an implementation dependency on "Static Analysis Core", since it's not an official API yet. Also, the scopes of the analyzer functionality are not pluggable. That means you can 'only' set the analyzer's scope to one or more projects, one or more packages, or one or more files. Not one or more folders, which means you can't have a bunch of HTML files in a folder that you access via the Favorites window and then run the analyzer on that folder (or on multiple folders). Thus, to try out my new code, I had to put some HTML files into a package inside a Java application. Then I chose that package as the scope of the analyzer. Then I ran all the analyzers (i.e., standard NetBeans Java hints, FindBugs, as well as my HTML Tidy extension) on that package. The screenshot above is the result. Here's all the code for the above, which is a port of the Action code from the previous blog entry into a new Analyzer implementation:

        import java.io.IOException;
        import java.io.PrintWriter;
        import java.io.StringWriter;
        import java.util.ArrayList;
        import java.util.Collections;
        import java.util.List;
        import javax.swing.JComponent;
        import javax.swing.text.Document;
        import org.netbeans.api.fileinfo.NonRecursiveFolder;
        import org.netbeans.modules.analysis.spi.Analyzer;
        import org.netbeans.modules.analysis.spi.Analyzer.AnalyzerFactory;
        import org.netbeans.modules.analysis.spi.Analyzer.Context;
        import org.netbeans.modules.analysis.spi.Analyzer.CustomizerProvider;
        import org.netbeans.modules.analysis.spi.Analyzer.WarningDescription;
        import org.netbeans.spi.editor.hints.ErrorDescription;
        import org.netbeans.spi.editor.hints.ErrorDescriptionFactory;
        import org.netbeans.spi.editor.hints.Severity;
        import org.openide.cookies.EditorCookie;
        import org.openide.filesystems.FileObject;
        import org.openide.loaders.DataObject;
        import org.openide.util.Exceptions;
        import org.openide.util.lookup.ServiceProvider;
        import org.w3c.tidy.Tidy;

        public class TidyAnalyzer implements Analyzer {

            private final Context ctx;

            private TidyAnalyzer(Context cntxt) {
                this.ctx = cntxt;
            }

            @Override
            public Iterable<? extends ErrorDescription> analyze() {
                List<ErrorDescription> result = new ArrayList<ErrorDescription>();
                for (NonRecursiveFolder sr : ctx.getScope().getFolders()) {
                    FileObject folder = sr.getFolder();
                    for (FileObject fo : folder.getChildren()) {
                        for (ErrorDescription ed : doRunHTMLTidy(fo)) {
                            if (fo.getMIMEType().equals("text/html")) {
                                result.add(ed);
                            }
                        }
                    }
                }
                return result;
            }

            private List<ErrorDescription> doRunHTMLTidy(FileObject sr) {
                final List<ErrorDescription> result = new ArrayList<ErrorDescription>();
                Tidy tidy = new Tidy();
                StringWriter stringWriter = new StringWriter();
                PrintWriter errorWriter = new PrintWriter(stringWriter);
                tidy.setErrout(errorWriter);
                try {
                    Document doc = DataObject.find(sr).getLookup().lookup(EditorCookie.class).openDocument();
                    tidy.parse(sr.getInputStream(), System.out);
                    String[] split = stringWriter.toString().split("\n");
                    for (String string : split) {
                        //Bit of ugly string parsing coming up:
                        if (string.startsWith("line")) {
                            final int end = string.indexOf(" c");
                            int lineNumber = Integer.parseInt(string.substring(0, end).replace("line ", ""));
                            string = string.substring(string.indexOf(": ")).replace(":", "");
                            result.add(ErrorDescriptionFactory.createErrorDescription(
                                    Severity.WARNING,
                                    string,
                                    doc,
                                    lineNumber));
                        }
                    }
                } catch (IOException ex) {
                    Exceptions.printStackTrace(ex);
                }
                return result;
            }

            @Override
            public boolean cancel() {
                return true;
            }

            @ServiceProvider(service = AnalyzerFactory.class)
            public static final class MyAnalyzerFactory extends AnalyzerFactory {

                public MyAnalyzerFactory() {
                    super("htmltidy", "HTML Tidy", "org/jtidy/format_misc.gif");
                }

                public Iterable<? extends WarningDescription> getWarnings() {
                    return Collections.EMPTY_LIST;
                }

                @Override
                public <D, C extends JComponent> CustomizerProvider<D, C> getCustomizerProvider() {
                    return null;
                }

                @Override
                public Analyzer createAnalyzer(Context cntxt) {
                    return new TidyAnalyzer(cntxt);
                }
            }
        }

    The above only works on packages, not on projects and not on individual files.

    Read the article

  • Remote Diagnostic Agent (RDA) version 4.30

    - by inowodwo
    Posted by Maurice Bauhahn. Remote Diagnostic Agent (RDA) version 4.30 was released on December 11th. A free download can be accessed via Knowledge Management article 314422.1 and installed in any Enterprise Performance Management 11.1.2.x environment. EPM-specific instructions are available in Knowledge Management article 1304885.1. This RDA version incorporates two new modules (EAS = Essbase Administration Services; HWA = Hyperion Web Analysis) and improvements in modules and profiles relating to twelve other Hyperion applications (EPM, EPMA, ESS, FCM, HFM, HFR, HIR, HPL, HPSV, HSS, PR, and HSV). To follow best practice, run related RDA profiles [for example: "perl rda.pl -vnSCRPp Hyperion1112_EAS"] and attach the output zip file [by default in \rda\output\] to your service requests. The comprehensive set of details provided in such output files should help technicians to avoid delays in handling service requests (by avoiding ping-pong communications resulting from repeated requests for additional values).

    Read the article

  • Qt 5 beta released; this preview shows the completion of the framework's modularization


    Read the article
