Search Results

Search found 17579 results on 704 pages for 'mercurial build'.


  • Usual Suspects: Typical 3rd Party Entities in E-Commerce [closed]

    - by zharvey
    I am doing some requirements/analysis for a web app that I'd like to build (Ruby/Java developer here). This web app would have a store front, shopping cart and would need to be totally compliant with all e-com best practices. It's amazing how much non-technical info comes up when you search for phrases like "how does e-commerce work", but very little comes up in the way of technical details. As such, I'm having extreme frustration finding answers to what I consider pretty straightforward questions. I came here because I believe this question is not off-topic; if it is, please leave a comment as to why this question does not belong here and I will happily remove it myself (upvotes if your comment can point me to the correct place for this question!). So then: What 3rd parties will I need to work with to have a modern, web-compliant e-com site? So far I can account for a payment gateway provider like Authorize.net and an SSL certificate provider like Trustwave. Any others? What other standards besides PCI compliance will I be held to (besides governing laws, of course!)? Vulnerability scans: PCI compliance requires quarterly scans; if I'm a "Level 4" (low volume) merchant, does that still apply to me? Regardless, my backend architecture is quite huge, with web servers, app servers, databases, message brokers and more. Does each of these servers need to be scanned? If not, which servers do need these quarterly scans? I usually hate to ask micro-questions inside of one large one, but these are so closely related I just felt like asking them all separately would be spamming the site with too many petty questions. Thanks in advance!

    Read the article

  • Host an SSTP VPN Server on Windows 8

    - by Maarten
    I have a small server computer running Windows 8 at home. Currently it is hosting a PPTP VPN server using the built-in Windows 8 functionality for that. What I would want is something more secure, like an SSTP VPN server. However, I can't find any functionality of Windows 8 or 3rd-party software that can HOST an SSTP VPN server on Windows 8. I've only seen client stuff and VPN pass-through services via Google, none of which I want/need. The only HOST stuff I find via Google is the PPTP setup I currently have. Is there any way of hosting an SSTP VPN server on my home machine? Thanks in advance, Maarten

    Read the article

  • Can't connect to IIS7 on one of my machines

    - by user38974
    Hello, I have 2 computers, both running Win7 Professional x64. Computer "A" (192.168.0.10) is my work machine; it contains my tools etc. Computer "B" (192.168.0.15) is supposed to be my build server / web server for my projects. I've installed IIS on "B", and installed the IIS7 Remote Manager on "A". I'm trying to connect from A's IIS7 Manager to B's IIS, but I fail. I have little knowledge of IIS, and I feel that's the main reason. I can ping B from A and get positive results, so the machines do see each other. A and B are in the same workgroup but not in a domain (if that matters) - they're in my home network. What do I need to do to "see" B's IIS?
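    For illustration, connecting remotely with IIS7 Manager generally requires the Web Management Service (WMSVC) on the server to be installed, enabled for remote connections, and reachable through the firewall. A hedged sketch of the usual steps on "B", from an elevated command prompt (this assumes the IIS Management Service feature is already installed; the firewall rule name is just an example):

        rem Allow remote connections to IIS Manager
        reg add HKLM\SOFTWARE\Microsoft\WebManagement\Server /v EnableRemoteManagement /t REG_DWORD /d 1 /f

        rem Start the Web Management Service and have it start automatically
        sc config WMSVC start= auto
        net start WMSVC

        rem Open the default management port (8172)
        netsh advfirewall firewall add rule name="IIS Remote Management" dir=in action=allow protocol=TCP localport=8172

    After that, IIS7 Manager on "A" should be able to connect to 192.168.0.15 on port 8172 using a Windows account that exists on "B".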

    Read the article

  • Windows 7: "localhost name resolution is handled within DNS itself". Why?

    - by Portman
    After 18 years of hosts files on Windows, I was surprised to see this in Windows 7 build 7100:
        # localhost name resolution is handled within DNS itself.
        # 127.0.0.1 localhost
        # ::1 localhost
    Does anyone know why this change was introduced? I'm sure there has to be some kind of reasoning. And, perhaps more relevantly, are there any other important DNS-related changes in Windows 7? It scares me a little bit to think that something as fundamental as localhost name resolution has changed... makes me think there are other subtle but important changes to the DNS stack in Win7.

    Read the article

  • What is a preferred method for automatically configuring and setting up an Ubuntu instance?

    - by sutch
    I am tired of manually configuring instances of Ubuntu for testing web applications and for setting up workstations. I'm even more frustrated by the issues caused by inconsistent configurations. Is there a method (hopefully not too time-consuming to learn and set up) that allows for automation of the setup and configuration of an Ubuntu server or workstation from an ISO? This is primarily for virtual machine instances, but it would be helpful to also create instances on hardware. I am specifically looking for a method to automate the installation of libraries (apt-get), configure services (such as Apache and MySQL), add 3rd-party software (download, extract and build), and add libraries to scripting languages (for example, Ruby Gems or CPAN packages for Perl).
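    Whatever tool ends up wrapping it (preseed files, Puppet, Chef, or plain scripts), the automation usually reduces to a post-install script along these lines. A minimal sketch for illustration only; the package names, gem, CPAN module and download URL are assumptions, not from the question:

        #!/bin/sh
        set -e

        # Base packages and services
        apt-get update
        apt-get install -y build-essential apache2 mysql-server

        # Libraries for scripting languages
        apt-get install -y rubygems libperl-dev
        gem install bundler
        cpan -i JSON

        # Third-party software built from source (URL is illustrative)
        wget -q http://example.com/some-tool-1.0.tar.gz
        tar xzf some-tool-1.0.tar.gz
        (cd some-tool-1.0 && ./configure && make && make install)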

    Read the article

  • ArchBeat Link-o-Rama for 2012-09-12

    - by Bob Rhubart
    15 Lessons from 15 Years as a Software Architect | Ingo Rammer
    In this presentation from the GOTO Conference in Copenhagen, Ingo Rammer shares 15 tips regarding people, complexity and technology that he learned doing software architecture for 15 years.
    Adding a runtime picker to a taskflow parameter in WebCenter | Yannick Ongena
    Oracle ACE Yannick Ongena shows how to create an Oracle WebCenter popup to allow users to "select items or do more complex things."
    Oracle Identity Manager 11g R2 Catalog | Daniel Gralewski
    Oracle Fusion Middleware A-Team blogger Daniel Gralewski shares a detailed overview of the new Catalog feature, one of the most talked about features in the latest release of Oracle Identity Manager 11g.
    Cloud API and service designers, stop thinking small | Cloud Computing - InfoWorld
    "The focus must shift away from fine-grained APIs that provide some type of primitive service, such as pushing data to a block of storage or perhaps making a request to a cloud-rooted database," says InfoWorld's David Linthicum. "To go beyond primitives, you must understand how these services should be used in a much larger architectural context. In other words, you need to understand how businesses will employ these services to form real workplace solutions -- inside and outside the enterprise."
    Oracle Solaris 8 P2V with Oracle database 10.2 and ASM | Orgad Kimchi
    Orgad Kimchi's technical post illustrates the migration of "a Solaris 8 physical system, with Oracle database version 10.2.0.5 with ASM file-system located on a SAN storage, into a Solaris 8 branded zone inside a Solaris 10 guest domain on top of a Solaris 11 control domain."
    Thought for the Day
    "The hardest single part of building a software system is deciding precisely what to build." — Fred Brooks
    Source: SoftwareQuotes.com

    Read the article

  • Friday Stats

    - by jjg
    As some of you may have noticed, we've recently opened a new repository in the Code Tools project for small utilities which can be used to gather info about the OpenJDK code base and builds. The latest addition is a utility for analyzing the class file versions in a collection of class files. I've posted an example set of results from analyzing the class files in an OpenJDK build on Linux. Most of the files are version 52 files as you would expect, but there is a surprising number of version 51 and 50 files, as well as a handful of v45.3 files. Digging deeper, it turns out that Nashorn is still using version 51 class files, and the Serviceability Agent is still using version 50 class files and one 45.3 class file, leaving the remainder of the 45.3 class files coming from RMI. For more info on the different class file versions, see Joe Darcy's class file version decoder ring. Thanks to Stuart Marks for planting the seed for the class file version tool. See the project page, repo, and mail archive. http://cr.openjdk.java.net/~jjg/cfv-summary/open/
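    As a side note, the major class file version is just the big-endian 16-bit value at byte offsets 6-7 of a .class file (after the 4-byte magic and the 2-byte minor version), so a quick spot check is possible even without the tool. A rough sketch, assuming a POSIX shell with od and awk:

        # Print the major class file version of each .class file in the current directory
        # (52 = JDK 8, 51 = JDK 7, 50 = JDK 6, 45 = JDK 1.0/1.1)
        for f in *.class; do
            major=$(od -An -j 6 -N 2 -t u1 "$f" | awk '{ print $1 * 256 + $2 }')
            echo "$f: major version $major"
        done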

    Read the article

  • Moving from a static site to a CMS with new URLs and meta-data for pages

    - by Chris J
    Hi, I am in the process of rebuilding a site from static pages to a CMS which will be using mod_rewrite to generate new page URLs. In this process our marketing people and myself have decided to tidy up the descriptions, keywords and titles. E.g.: a page whose URL is currently "website-name/about_us.html" and has a title of "website-name - something not quite page specific" will change to "website-name/about-us/" and title "about us - website-name", and may have a few keywords and the description changed. Our goal with updating the meta data is to improve our page rankings and try to keep in line with some best practices for SEO. Though our current page rankings are quite good in many aspects, there is room for improvement. All of the pages will also have content changes (like rearranging heading tags, new menu on all pages, new content in footer, extra pieces of dynamic content relating to other pages). In this new site process I plan to use 301 redirects for all the old URLs pointing to the new URLs. My question is: what can I expect to happen to the page rankings in Google, in the short term and long term? Will this be like kicking off a new site which will have to build up trust over time, or will the original page rankings have an effect?
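    For illustration, the per-page 301s typically end up as one rule per moved URL in an Apache .htaccess or vhost config. A sketch using the example page from the question (mod_alias first, then a mod_rewrite equivalent; use one style or the other for a given URL, not both):

        # mod_alias: one rule per moved page
        Redirect 301 /about_us.html /about-us/

        # mod_rewrite equivalent
        RewriteEngine On
        RewriteRule ^about_us\.html$ /about-us/ [R=301,L]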

    Read the article

  • Where can I find ready-to-use Windows scripts that use RoboCopy?

    - by Geo
    We are installing the Windows Resource Kit, and that installs RoboCopy. We want to have access to a few Windows scripts that use RoboCopy, so we can start from those to build something else. Any ideas on where I can find a few samples? NOTE 1: A bit of information. Every time we try to copy the D drive to the E drive (new drive) we get an error that says:
        ERROR 32 (0x00000020) Copying File d:\pagefile.sys
        The process cannot access the file because it is being used by another process.
        Waiting 30 seconds.
    Just to help figure it out.
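    For illustration, the kind of batch script usually built around RoboCopy for a whole-drive copy looks like the sketch below; the /XF and /XD exclusions are what avoid the in-use pagefile.sys error quoted above (drive letters and the log path are just examples):

        @echo off
        rem Mirror D: to E:, skipping in-use system files and system folders
        robocopy D:\ E:\ /MIR /COPYALL /R:1 /W:1 ^
            /XF pagefile.sys hiberfil.sys ^
            /XD "System Volume Information" "$RECYCLE.BIN" ^
            /LOG:C:\logs\d_to_e.log /TEE

    Note that /MIR deletes files on E: that no longer exist on D:, so it is only appropriate when E: really is meant to be a mirror.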

    Read the article

  • .NET Dependency Management Systems

    - by StriplingWarrior
    I have some .NET projects that are starting to get large enough to merit looking into Dependency Management solutions, so we don't have to copy binaries from one project to another. Here's what I've found so far:
    NPanday is based on a port of Maven. I can't tell how recently it was worked on, but the last release was in May 2011.
    NuGet seems to be under active development, and it appears to have support directly from Microsoft. Some people complained that it "only addresses dependency resolution," but I don't know what else it should address, or whether it has added more features since that point. It does appear to have recently added the ability to import binaries as part of the build process so we don't have to commit them to our repositories.
    Refix appears to still be in Beta, after having received no attention since Sept 2011.
    Would somebody with recent experience using any of these dependency management tools (or any others that work well) share your experience? Is NuGet mature enough to use it for dependency management? If not, what does it lack?
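    For illustration, NuGet's per-project dependency list is a plain packages.config file committed next to the project, and the binaries themselves can be restored at build time rather than checked in. A minimal sketch; the package IDs and versions below are hypothetical examples, not a recommendation:

        <?xml version="1.0" encoding="utf-8"?>
        <packages>
          <package id="Newtonsoft.Json" version="4.5.11" targetFramework="net40" />
          <package id="NLog" version="2.0.0" targetFramework="net40" />
        </packages>

    On a build server the packages folder can then be recreated with something like nuget.exe install packages.config -OutputDirectory packages before compiling, so no binaries live in the repository.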

    Read the article

  • Where to obtain openssl-devel for SunOS 5.10

    - by user35949
    So I am having an issue that I have seen other people have on many different systems. I have to build Subversion on a SunOS 5.10 box and have run into issues. I have the openssl source code installed and in the subversion-1.6.9 folder, I run the following:
        ./configure --with-ssl --with-libs=/opt/exp/lib/openssl/lib
    and receive the error:
        checking for library containing RSA_new... not found
        configure: error: could not find library containing RSA_new
        configure failed for neon
    I have also tried running the command without the "lib" on the end of the --with-libs path. I read online that I need the openssl-devel packages, but I have been unable to find them for SunOS 5.10, and they do not show up already installed on my system when running pkginfo. I have looked online including http://www.sunfreeware.com/ which I was told was a good SunOS software source. Any help you can provide would be welcomed. Thanks, Sean
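    For what it's worth, the failing check comes from neon and usually means the OpenSSL headers and libraries weren't both found, which --with-libs alone may not fix. A hedged sketch of the kind of invocation that often gets past it, reusing the prefix from the question (the include path is an assumption, and -R sets the Solaris runtime library search path):

        CPPFLAGS="-I/opt/exp/lib/openssl/include" \
        LDFLAGS="-L/opt/exp/lib/openssl/lib -R/opt/exp/lib/openssl/lib" \
        ./configure --with-ssl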

    Read the article

  • Deciding on a company-wide javascript strategy [on hold]

    - by drogon
    Our company is moving most of its software from thick-client WinForms apps to web apps. We are using ASP.NET MVC on the server side. Most of the developers are brand new to the web and need to become efficient and knowledgeable at writing client-side web code (JavaScript). We are deciding on a number of things and would appreciate feedback on the following:
    Angular.js or Backbone.js? Backbone (w/ Underscore) is certainly more lightweight, but requires more custom development. Angular seems to be a full-fledged framework, but would require everyone to embrace it and probably has a longer learning curve(??). (Note: I know nothing about Angular at this point)
    Require.js or script includes w/ MVC bundleconfig? Require.js makes development "feel like" C# (importing namespaces). But integrating the build/minification process can be a pain (especially the configuration). Bundling via MVC requires developers to worry more about which scripts to include, but has less overall development friction.
    TypeScript vs JavaScript? Regardless of frameworks, our developers are going to need to learn the basics. TypeScript is more like C# and MAY be easier for C# developers to understand. However, learning TypeScript before JavaScript may hinder their mastery of JavaScript at the expense of efficiency.

    Read the article

  • Why can't my networks reach each other?

    - by HOLOGRAPHICpizza
    We have two Buffalo WZR-HP-G300NH2 routers, with the default firmware, DD-WRT v24SP2-MULTI (10/31/11) std - build 17798. Each has a separate cable internet connection with a public static IP address. They are both in the 24.123.68.0/24 space. Both of them can contact pretty much the whole internet, and they can both be accessed out on the internet with no problem, but for some reason they can't talk to each other! When I try to ping one from the other I always get "Destination Host Unreachable". There are no strange routing or firewall rules in place. And they are both set to respond to pings, I can ping them from outside. Our main IT guy is going to call our ISP on Monday, but I'm impatient, so does anyone have any ideas?

    Read the article

  • Enterprise Cloud Infrastructure for Dummies eBook

    - by ferhat
    Are you considering "going to the cloud" as a way to cut IT costs and maximize your virtualization investments? Then Enterprise Cloud Infrastructure for Dummies is a no-nonsense guide to help you navigate this hot topic. This user-friendly guide explains how to cut through the noise and take advantage of integrated virtualization and management tools to implement a cloud infrastructure that not only lowers operational costs but that can easily adapt and scale to run a broad range of application services safely and securely. This e-book will serve as a valuable cloud computing guide covering important topics such as:
    The current overall cloud landscape and how to best leverage private cloud infrastructure
    How to build an effective Enterprise Cloud Infrastructure using the Oracle Optimized Solution methodology
    Quantifiable cost savings gained using Oracle's integrated hardware and software and Optimized Solutions
    Download your exclusive copy of Enterprise Cloud Infrastructure, Oracle Special Edition today.

    Read the article

  • windows 7 frequently crashes when running 64 bit debian on virtualbox

    - by erin c
    I've been having some hard times with VirtualBox. I use it to run a 64-bit Debian guest on a Windows 7 host to be able to code for Linux systems. It generally crashes when I build my code in Eclipse CDT, or if I am doing some intensive operations. Should I lower the memory and core usage? Is this some sort of VirtualBox problem? I upgraded to VirtualBox v4.1.8 and the problem still occurs. My virtual machine instance uses 1736MB out of 4GB of RAM, and I use 2 out of 8 processor cores. But still, the whole thing crashes 1 or 2 times every single day.
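    For illustration, guest memory and CPU counts can be dialed down from the host's command line while experimenting with stability, without recreating the VM. A sketch for a Windows host; the VM name is a placeholder and the VM must be powered off first:

        VBoxManage modifyvm "Debian64" --memory 1024 --cpus 1
        VBoxManage showvminfo "Debian64" | findstr /i "memory cpu"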

    Read the article

  • Trying to compile from source newest apache with newest openssl

    - by AlexMA
    I need to install apache 2.4.10 using openssl 1.0.1i. I compiled openssl from source with:
        $ ./config \
            --prefix=/opt/openssl-1.0.1i \
            --openssldir=/opt/openssl-1.0.1i
        $ make
        $ sudo make install
    and Apache with:
        ./configure --prefix=/etc/apache2 \
            --enable-access_compat=shared \
            --enable-actions=shared \
            --enable-alias=shared \
            --enable-allowmethods=shared \
            --enable-auth_basic=shared \
            --enable-authn_core=shared \
            --enable-authn_file=shared \
            --enable-authz_core=shared \
            --enable-authz_groupfile=shared \
            --enable-authz_host=shared \
            --enable-authz_user=shared \
            --enable-autoindex=shared \
            --enable-dir=shared \
            --enable-env=shared \
            --enable-headers=shared \
            --enable-include=shared \
            --enable-log_config=shared \
            --enable-mime=shared \
            --enable-negotiation=shared \
            --enable-proxy=shared \
            --enable-proxy_http=shared \
            --enable-rewrite=shared \
            --enable-setenvif=shared \
            --enable-ssl=shared \
            --enable-unixd=shared \
            --enable-ssl \
            --with-ssl=/opt/openssl-1.0.1i \
            --enable-ssl-staticlib-deps \
            --enable-mods-static=ssl
        make
    (I would run sudo make install next, but I get an error.) I'm essentially following the guide here except with slightly newer versions. My problem is I get a linker error when I run make for apache:
        Making all in support
        make[1]: Entering directory `/home/developer/downloads/httpd-2.4.10/support'
        make[2]: Entering directory `/home/developer/downloads/httpd-2.4.10/support'
        /usr/share/apr-1.0/build/libtool --silent --mode=link x86_64-linux-gnu-gcc -std=gnu99 -pthread -L/opt/openssl-1.0.1i/lib -lssl -lcrypto \
            -o ab ab.lo /usr/lib/x86_64-linux-gnu/libaprutil-1.la /usr/lib/x86_64-linux-gnu/libapr-1.la -lm
        /usr/bin/ld: /opt/openssl-1.0.1i/lib/libcrypto.a(dso_dlfcn.o): undefined reference to symbol 'dlclose@@GLIBC_2.2.5'
    I tried the answer here, but no luck. I would prefer to just use aptitude, but unfortunately the versions I need aren't available yet. If anyone knows how to fix the linker problem (or what I think is a linker problem), or knows of a better way to tell apache to use a newer openssl, it would be greatly appreciated; I've got apache 1.0.1i working otherwise.
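    That particular failure, dlclose left unresolved while linking the static libcrypto.a, is usually cured by adding -ldl to the link line. One hedged way to do that is to rerun httpd's configure with LIBS set, keeping the flags from the question; whether your configure picks it up exactly this way is worth verifying on your system:

        # Re-run configure so -ldl follows -lcrypto on the link line;
        # keep all the --enable-*=shared flags from the original command as well
        LIBS="-ldl" ./configure --prefix=/etc/apache2 \
            --enable-ssl --with-ssl=/opt/openssl-1.0.1i \
            --enable-ssl-staticlib-deps --enable-mods-static=ssl
        make

    An alternative some people prefer is rebuilding OpenSSL with ./config shared so Apache links against the shared library instead of the static archive.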

    Read the article

  • Web development and tips for building a website and the advantages of using HTML 5 in the site

    - by Siddarth
    I am trying to make a website for my college. The program starts from Jan 13 and we get 15 days to develop a running site; the best site will become the college site. I am participating. Until now I used to take part in C and C++ contests, and also won a few, but for the last 2 months I have really been into web development. I knew HTML long ago and recently brushed up on it, learnt JavaScript from "JavaScript and jQuery: The Missing Manual" (sorry for not adding the link), and recently bought "PHP and MySQL Web Development". I am going on fine with it, but there are still a lot of pages to cover in that book. After this, what do I need to know? Is Ajax one technology to concentrate on? What else do I need to do to make this project up and running? Can someone let me know the tricks of this trade and the complete information needed to build a site like this? Right now I am good with JavaScript, HTML and CSS, and that's it. I am also studying HTML5 and CSS3; it's pretty fast and neat. The site is a college website which includes student profiles, where they have to register their info with a college ID number, and that's pretty much it. Think of it as a college site plus a social networking site for students, where they can upload their pics, videos, PDF books, etc.

    Read the article

  • Git can no longer open emacs as its editor

    - by mwilliams
    I'm running Git version 1.7.3.2 that I built from source, zsh is my shell, and emacs is my editor. Recently I started seeing the following:
        /usr/local/Cellar/git/1.7.3.2/libexec/git-core/git-sh-setup: line 106: emacs: command not found
        Could not execute editor
    My zshrc looks like the following so I can use the Cocoa build and the console binary provided with it.
        EMACS_HOME="/Applications/Emacs.app/Contents/MacOS"
        function e() { PATH=$EMACS_HOME/bin:$PATH $EMACS_HOME/Emacs -nw $@ }
        function ec() { PATH=$EMACS_HOME/bin:$PATH emacsclient -t $@ }
        function es() { e --daemon=$1 && ec -s $1 }
        function el() { ps ax|grep Emacs }
        function ek() { $EMACS_HOME/bin/emacsclient -e '(kill-emacs)' -s $1 }
        function ecompile() {
            e -eval "(setq load-path (cons (expand-file-name \".\") load-path))" \
                -batch -f batch-byte-compile $@
        }
        alias emacs=e
        alias emacsclient=ec
    And I also have export EDITOR="emacs" and have tried adding export GIT_EDITOR="emacs" (and swapping that out with "e"). But whatever I try I can't get git to open emacs whenever I need to do a commit or an interactive rebase, etc etc...
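    For what it's worth, git runs the editor through sh, which knows nothing about zsh functions or aliases, so both "emacs" and "e" fail there; pointing git at the actual binary usually resolves this. A sketch, using the Cocoa Emacs path from the zshrc above:

        # Use the bundled console Emacs directly as git's editor
        git config --global core.editor "/Applications/Emacs.app/Contents/MacOS/Emacs -nw"

        # or, equivalently, in the shell startup file:
        export GIT_EDITOR="/Applications/Emacs.app/Contents/MacOS/Emacs -nw"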

    Read the article

  • Windows file server access control by device

    - by Ori Shavit
    I'm trying to build a system where access to certain resources (file shares) in Windows Server, is limited not only by the username (in a Active Directory domain), but also by the client machine. So far, I haven't found a good way to do this; adding the computer account to the DACL is apparently not the way to do it. Windows Server 2012 supports this with Dynamic Access Control, but this method requires all clients to be Windows 8, it seems, with no way to use this with Windows 7 clients. Is there a supported way to do this? (or alternatively, add support for device authorization with Windows 7).

    Read the article

  • Issue with Visual C++ 2010 (Express) External Tools command

    - by espais
    Hi all, Normally we develop in VS 2005 Pro, but I wanted to give VS 2010 a spin. We have custom build tools based off of GNU make tools that are called when creating an executable. This is the error that I see whenever I call my external tool:
        ...\gnu\make.exe): *** couldn't commit memory for cygwin heap, Win32 error 487
    The caveat is that it still works perfectly fine in VS2005, as well as being called straight from the command line. Also, my external tool is set up exactly the same as in VS 2005. Is there some setting somewhere that could cause this error to be thrown?

    Read the article

  • How to Implement Project Type "Copy", "Move", "Rename", and "Delete"

    - by Geertjan
    You've followed the NetBeans Project Type Tutorial and now you'd like to let the user copy, move, rename, and delete the projects conforming to your project type. When they right-click a project, they should see the relevant menu items and those menu items should provide dialogs for user interaction, followed by event handling code to deal with the current operation. Right now, at the end of the tutorial, the "Copy" and "Delete" menu items are present but disabled, while the "Move" and "Rename" menu items are absent.
    The NetBeans Project API provides a built-in mechanism out of the box that you can leverage for project-level "Copy", "Move", "Rename", and "Delete" actions. All the functionality is there for you to use, while all that you need to do is a bit of enablement and configuration, which is described below. To get started, read the following from the NetBeans Project API:
    http://bits.netbeans.org/dev/javadoc/org-netbeans-modules-projectapi/org/netbeans/spi/project/ActionProvider.html
    http://bits.netbeans.org/dev/javadoc/org-netbeans-modules-projectapi/org/netbeans/spi/project/CopyOperationImplementation.html
    http://bits.netbeans.org/dev/javadoc/org-netbeans-modules-projectapi/org/netbeans/spi/project/MoveOrRenameOperationImplementation.html
    http://bits.netbeans.org/dev/javadoc/org-netbeans-modules-projectapi/org/netbeans/spi/project/DeleteOperationImplementation.html
    Now, let's do some work. For each of the menu items we're interested in, we need to do the following:
    1. Provide enablement and invocation handling in an ActionProvider implementation.
    2. Provide appropriate OperationImplementation classes.
    3. Add the new classes to the Project Lookup.
    4. Make the Actions visible on the Project Node.
    5. Run the application and verify the Actions work as you'd like.
    Here we go:
    Create an ActionProvider. Here you specify the Actions that should be supported, the conditions under which they should be enabled, and what should happen when they're invoked, using lots of default code that lets you reuse the functionality provided by the NetBeans Project API:
        class CustomerActionProvider implements ActionProvider {
            @Override
            public String[] getSupportedActions() {
                return new String[]{
                    ActionProvider.COMMAND_RENAME,
                    ActionProvider.COMMAND_MOVE,
                    ActionProvider.COMMAND_COPY,
                    ActionProvider.COMMAND_DELETE
                };
            }
            @Override
            public void invokeAction(String string, Lookup lkp) throws IllegalArgumentException {
                if (string.equalsIgnoreCase(ActionProvider.COMMAND_RENAME)) {
                    DefaultProjectOperations.performDefaultRenameOperation(CustomerProject.this, "");
                }
                if (string.equalsIgnoreCase(ActionProvider.COMMAND_MOVE)) {
                    DefaultProjectOperations.performDefaultMoveOperation(CustomerProject.this);
                }
                if (string.equalsIgnoreCase(ActionProvider.COMMAND_COPY)) {
                    DefaultProjectOperations.performDefaultCopyOperation(CustomerProject.this);
                }
                if (string.equalsIgnoreCase(ActionProvider.COMMAND_DELETE)) {
                    DefaultProjectOperations.performDefaultDeleteOperation(CustomerProject.this);
                }
            }
            @Override
            public boolean isActionEnabled(String command, Lookup lookup) throws IllegalArgumentException {
                if ((command.equals(ActionProvider.COMMAND_RENAME))) {
                    return true;
                } else if ((command.equals(ActionProvider.COMMAND_MOVE))) {
                    return true;
                } else if ((command.equals(ActionProvider.COMMAND_COPY))) {
                    return true;
                } else if ((command.equals(ActionProvider.COMMAND_DELETE))) {
                    return true;
                }
                return false;
            }
        }
    Importantly, to round off this step, add "new CustomerActionProvider()" to the "getLookup" method of the project.
    If you were to run the application right now, all the Actions we're interested in would be enabled (if they are visible, as described in step 4 below), but when you invoke any of them you'd get an error message, because each of the DefaultProjectOperations above looks in the Lookup of the Project for the presence of an implementation of a class for handling the operation. That's what we're going to do in the next step.
    Provide Implementations of Project Operations. For each of our operations, the NetBeans Project API lets you implement classes to handle the operation. The dialogs for interacting with the project are provided by the NetBeans project system, but what happens with the folders and files during the operation can be influenced via the operations. Below are the simplest possible implementations, i.e., here we assume we want nothing special to happen. Each of the below needs to be in the Lookup of the Project in order for the operation invocation to succeed.
        private final class CustomerProjectMoveOrRenameOperation implements MoveOrRenameOperationImplementation {
            @Override
            public List<FileObject> getMetadataFiles() {
                return new ArrayList<FileObject>();
            }
            @Override
            public List<FileObject> getDataFiles() {
                return new ArrayList<FileObject>();
            }
            @Override
            public void notifyRenaming() throws IOException {
            }
            @Override
            public void notifyRenamed(String nueName) throws IOException {
            }
            @Override
            public void notifyMoving() throws IOException {
            }
            @Override
            public void notifyMoved(Project original, File originalPath, String nueName) throws IOException {
            }
        }
        private final class CustomerProjectCopyOperation implements CopyOperationImplementation {
            @Override
            public List<FileObject> getMetadataFiles() {
                return new ArrayList<FileObject>();
            }
            @Override
            public List<FileObject> getDataFiles() {
                return new ArrayList<FileObject>();
            }
            @Override
            public void notifyCopying() throws IOException {
            }
            @Override
            public void notifyCopied(Project prjct, File file, String string) throws IOException {
            }
        }
        private final class CustomerProjectDeleteOperation implements DeleteOperationImplementation {
            @Override
            public List<FileObject> getMetadataFiles() {
                return new ArrayList<FileObject>();
            }
            @Override
            public List<FileObject> getDataFiles() {
                return new ArrayList<FileObject>();
            }
            @Override
            public void notifyDeleting() throws IOException {
            }
            @Override
            public void notifyDeleted() throws IOException {
            }
        }
    Also make sure to put the above methods into the Project Lookup.
    Check the Lookup of the Project. The "getLookup()" method of the project should now include the classes you created above, as shown in bold below:
        @Override
        public Lookup getLookup() {
            if (lkp == null) {
                lkp = Lookups.fixed(new Object[]{
                    this,
                    new Info(),
                    new CustomerProjectLogicalView(this),
                    new CustomerCustomizerProvider(this),
                    new CustomerActionProvider(),
                    new CustomerProjectMoveOrRenameOperation(),
                    new CustomerProjectCopyOperation(),
                    new CustomerProjectDeleteOperation(),
                    new ReportsSubprojectProvider(this),
                });
            }
            return lkp;
        }
    Make Actions Visible on the Project Node. The NetBeans Project API gives you a number of CommonProjectActions, including for the actions we're dealing with. Make sure the items in bold below are in the "getActions" method of the project node:
        @Override
        public Action[] getActions(boolean arg0) {
            return new Action[]{
                CommonProjectActions.newFileAction(),
                CommonProjectActions.copyProjectAction(),
                CommonProjectActions.moveProjectAction(),
                CommonProjectActions.renameProjectAction(),
                CommonProjectActions.deleteProjectAction(),
                CommonProjectActions.customizeProjectAction(),
                CommonProjectActions.closeProjectAction()
            };
        }
    Run the Application. When you run the application, you should see this:
    Let's now try out the various actions:
    Copy. When you invoke the Copy action, you'll see the dialog below. Provide a new project name and location and then the copy action is performed when the Copy button is clicked below: The message you see above, in red, might not be relevant to your project type. When you right-click the application and choose Branding, you can find the string in the Resource Bundles tab, as shown below: However, note that the message will be shown in red, no matter what the text is, hence you can really only put something like a warning message there. If you have no text at all, it will also look odd. If the project has subprojects, the copy operation will not automatically copy the subprojects. Take a look here and here for similar more complex scenarios.
    Move. When you invoke the Move action, the dialog below is shown:
    Rename. The Rename Project dialog below is shown when you invoke the Rename action: I tried it and both the display name and the folder on disk are changed.
    Delete. When you invoke the Delete action, you'll see this dialog: The checkbox is not checkable, in the default scenario, and when the dialog above is confirmed, the project is simply closed, i.e., the node hierarchy is removed from the application. However, if you truly want to let the user delete the project on disk, pass the Project to the DeleteOperationImplementation and then add the children of the Project you want to delete to the getDataFiles method:
        private final class CustomerProjectDeleteOperation implements DeleteOperationImplementation {
            private final CustomerProject project;
            private CustomerProjectDeleteOperation(CustomerProject project) {
                this.project = project;
            }
            @Override
            public List<FileObject> getDataFiles() {
                List<FileObject> files = new ArrayList<FileObject>();
                FileObject[] projectChildren = project.getProjectDirectory().getChildren();
                for (FileObject fileObject : projectChildren) {
                    addFile(project.getProjectDirectory(), fileObject.getNameExt(), files);
                }
                return files;
            }
            private void addFile(FileObject projectDirectory, String fileName, List<FileObject> result) {
                FileObject file = projectDirectory.getFileObject(fileName);
                if (file != null) {
                    result.add(file);
                }
            }
            @Override
            public List<FileObject> getMetadataFiles() {
                return new ArrayList<FileObject>();
            }
            @Override
            public void notifyDeleting() throws IOException {
            }
            @Override
            public void notifyDeleted() throws IOException {
            }
        }
    Now the user will be able to check the checkbox, causing the method above to be called in the DeleteOperationImplementation.
    Hope this answers some questions or at least gets the discussion started. Before asking questions about this topic, please take the steps above and only then attempt to apply them to your own scenario.
    Useful implementations to look at:
    http://kickjava.com/src/org/netbeans/modules/j2ee/clientproject/AppClientProjectOperations.java.htm
    https://kenai.com/projects/nbandroid/sources/mercurial/content/project/src/org/netbeans/modules/android/project/AndroidProjectOperations.java

    Read the article

  • Bluetooth connectivity throws an error

    - by eka anggraini
    I have a USB Bluetooth adapter. Every time I insert it, Bluetooth is active for only a few seconds, after which it cannot be used again. In 'Devices and Printers' the Bluetooth device still appears, but if I try to save its settings it can't, and there is an error message or Bluetooth is disabled. I am using Windows 7, whereas with Windows XP it runs smoothly with no problems. This doesn't only happen on my computer; my friend, who has built-in Bluetooth, also experienced the same thing with the same OS. Is there a patch / fix / update for Windows? I've tried tweaking all the drivers, but it still doesn't work.

    Read the article

  • Compiling custom kernel 3.7.x lowlatency on Ubuntu 12.04

    - by FlabbergastedPickle
    All, I have a peculiar problem with trying to compile a lowlatency flavor of the latest 3.7 kernel. I retrieved the prepatched source from the launchpad using bzr, compiled it using the usual make-kpkg using the current config file plus default options for the rest, installed the kernel and booted into it. Everything works except for the fglrx and wl drivers that I had to install in the original 12.04 lowlatency kernel. So, I tried recompiling these and succeeded with both of them (no errors were reported)--wl driver required a minor adjustment to system.h include while latest fglrx 12.11 beta11 (released yesterday, Dec. 3rd, 2012) compiled without the hitch. Yet, when I try to modprobe either module (both having in common the fact that they were built after the kernel, fglrx as a deb, and wl via the usual make/make install), I get "FATAL: no MODULENAME module found" (MODULENAME being either wl or fglrx). The graphic driver watermark shows 3D crossed out and "for testing purposes" (or "unsupported hardware," can't remember), and no fglrx or wl is loaded. More mysteriously, dmesg shows no attempt on kernel's behalf to load the said drivers, even though they are clearly in the right /lib/modules/KERNEL_VERSION folder. How is this possible? Has something fundamentally changed in 3.7 kernel that would prevent modprobing of these? I know that there is driver signing option that was merged recently but as far as I could tell the kernel config file generated by the build process had that disabled. OTOH, while building wl driver, I did get a warning that the driver was not signed... Then again, even if the kernel disallowed loading of those modules, shouldn't dmesg reflect that? Any thoughts on this one are most appreciated.
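    One thing worth checking in a case like this: modprobe only finds modules that are listed in /lib/modules/KERNEL_VERSION/modules.dep, so if the out-of-tree installs dropped the .ko files in place without refreshing that index, modprobe reports that no module was found even though the files exist, and nothing reaches dmesg because loading never starts. A quick, hedged check:

        # Rebuild the module dependency index for the running kernel, then retry
        sudo depmod -a "$(uname -r)"
        sudo modprobe wl
        sudo modprobe fglrx

        # Confirm the .ko files really landed under the running kernel's version
        find /lib/modules/"$(uname -r)" -name 'wl.ko' -o -name 'fglrx.ko'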

    Read the article

  • Windows 8 Developer Camp - Raleigh September 25th

    - by Jim Duffy
    Time is ticking away and the time to act is now! How's that for some motivation? :-)  Microsoft Developer Evangelist Brian Hitney and I want to help you get your app in the store in time for the October 26 Windows 8 launch. Come join us on Tuesday, September 25, at 9:00 AM in the Microsoft RTP offices to learn how simple it can be to construct a world-class Windows 8 application. Don't think you can be ready to join the Windows 8 launch? Come anyway. You might be surprised. Don't have any idea what kind of app to build? Come anyway. There are plenty of places to look for inspiration. Either way you can learn what it takes to create or tune an app for Windows 8 and publish it in the Windows App Store. Learn more about this free event on the registration page. Registration is now open and space is limited. Have a day.

    Read the article

  • How do I install automake and autoconf on RedHat Enterprise 5?

    - by Kevin Sedgley
    I am attempting to install "uploadprogress" for a PHP application, and have failed on dependencies: firstly on phpize, then php-devel, then on autoconf and automake. I have tried yum, and various repositories, with no luck. I think it's to do with the ultra-tight but annoying setup they have on Rackspace Cloud servers. Does anyone know where I can find a repository that I can tell yum to look at that will contain php-devel, autoconf, automake, etc.? Thanks ever so much. Release details:
        Red Hat Enterprise Linux Server release 5.3 (Tikanga)
        Linux version 2.6.18-128.7.1.el5xen (mockbuild@hs20-bc2-3.build.redhat.com) (gcc version 4.1.2 20080704 (Red Hat 4.1.2-44)) #1 SMP Wed Aug 19 04:17:26 EDT 2009
        Linux Serv001 2.6.18-128.7.1.el5xen #1 SMP Wed Aug 19 04:17:26 EDT 2009 x86_64 x86_64 x86_64 GNU/Linux
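    If no usable repository turns up, autoconf and automake at least can be built from the GNU source tarballs without touching yum; php-devel still has to come from a RHEL or CentOS channel. A hedged sketch of the source route (the version numbers are examples of releases from that era, not a requirement; GNU m4 must already be present):

        # Build autoconf first, then automake, into /usr/local
        wget http://ftp.gnu.org/gnu/autoconf/autoconf-2.63.tar.gz
        tar xzf autoconf-2.63.tar.gz
        (cd autoconf-2.63 && ./configure --prefix=/usr/local && make && make install)

        wget http://ftp.gnu.org/gnu/automake/automake-1.10.2.tar.gz
        tar xzf automake-1.10.2.tar.gz
        (cd automake-1.10.2 && ./configure --prefix=/usr/local && make && make install)

    Some people instead point yum at a matching CentOS 5 base repository to pick up php-devel and friends, but whether that is acceptable on a managed Rackspace image is worth checking first.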

    Read the article
