Search Results

Search found 45324 results on 1813 pages for 'open source'.


  • JSR Updates and EC Nominations open

    - by heathervc
    JSR 310, Date and Time API, has published an Early Draft Review 2; the review closes 14 October.
    JSR 353, Java API for JSON Processing, has published an Early Draft Review; the review closes 7 October.
    JSR 356, Java API for WebSocket, has published an Early Draft Review; the review closes 27 October.
    JSR 339, JAX-RS 2.0: The Java API for RESTful Web Services, has published a Public Review; the review closes 12 November.
    The EC Nominations are now open until 11 October. Any JCP Member may nominate themselves for the two open seats in the 2012 EC Elections. Note that both seats will be for a one-year term only, since all EC Members will stand for election in 2013. The merged EC will take effect in November 2012.

    Read the article

  • Developing an SSRS report using an SSAS Data Source

    After designing several SSRS reports based on regular relational databases, your boss would now like several new reports to be designed and rolled out to production based on your organization's SSAS OLAP cube. How do you get started with designing a report based on a cube?

    Read the article

  • Flex Drag & Drop: Detecting when all data has been moved from source to destination

    - by Adam Tuttle
    I have two mx:TileList controls that I'm using to allow editing of objects in batch. The first contains a collection of all available data, and the second contains the current batch. Both are bound to ArrayCollections, and using the native drag-and-drop functionality of the TileList control, the data is moved from one ArrayCollection to the other when an object is dragged between them. I need to change the currentState to show and reset the batch manipulation controls when the batch count goes from 0 to n or from n to 0 items. Based on the documentation, I would have thought that I should listen to the dragComplete event, but my testing shows that instead of firing after the data has been removed from the source ArrayCollection and added to the destination ArrayCollection, it fires (consistently) between these two actions.

    Both lists are similar to this:

        <mx:TileList id="srcList"
                     dragEnabled="true"
                     dropEnabled="true"
                     dragMoveEnabled="true"
                     dataProvider="{images}"
                     dragComplete="handleDragComplete(event)"
                     allowMultipleSelection="true" />

    And here's the source of the handleDragComplete function:

        private function handleDragComplete(e:DragEvent):void {
            trace(e.dragInitiator.name + '.dragComplete: batch.length=' + batch.length.toString());
            trace(e.dragInitiator.name + '.dragComplete: images.length=' + images.length.toString());
            if (batch.length > 0) {
                currentState = 'show';
            } else {
                currentState = '';
            }
        }

    And lastly, here's some example output from running the code. These are all run one after the other.

    Case 1: The application loads with 10 objects in the first list and the batch is empty. I dragged 1 object from the source list to the batch list.

        srcList.dragComplete: batch.length=1
        srcList.dragComplete: images.length=10   (Expected: 1, 9)

    Clearly, the object has been added to the batch ArrayCollection but not removed from the source.

    Case 2: Now, I'll drag a 2nd object into the batch.

        srcList.dragComplete: batch.length=2
        srcList.dragComplete: images.length=9    (Expected: 2, 8)

    Firstly, we can see that images.length has changed, showing that the object I dragged from the source list to the batch list was removed AFTER the dragComplete event fired. The same thing happens this time: the new object is added to the batch ArrayCollection (batch.length=2), the dragComplete event fires (running these traces), and then the object is removed from the source ArrayCollection.

    Case 3: Now, I'll drag both images from the batch list back to their original location in the source list.

        batchList.dragComplete: batch.length=2
        batchList.dragComplete: images.length=10   (Expected: 0, 10)

    We can see that batch.length hasn't gone down, but the source images array is back at its original length of 10.

    QUESTION: Am I doing something wrong? Is there another event I could listen for? (Note: I tried both DragExit and DragDrop, just to be sure, and those behave as expected, but they are not what I need.) Or is there another way to get the data that I want? Or... have I found a bug in the SDK?

    Read the article

  • build-helper-maven-plugin add-source does not work when trying to add linked resources

    - by Julian
    I am new to Maven and hit a problem that looked easy at first, but it has already kept me busy for a whole day and I cannot get it working. First, as part of running the eclipse:eclipse plugin, I create a linked folder like below:

        <linkedResources>
            <linkedResource>
                <name>properties</name>
                <type>2</type>
                <location>${PARENT-2-PROJECT_LOC}/some_other_project/properties</location>
            </linkedResource>
            <linkedResource>
                <name>properties/messages.properties</name>
                <type>1</type>
                <location>${PARENT-2-PROJECT_LOC}/some_other_project/properties/messages.properties</location>
            </linkedResource>
        </linkedResources>

    And then I am adding that folder as a source folder like below:

        <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>build-helper-maven-plugin</artifactId>
            <version>1.7</version>
            <executions>
                <execution>
                    <id>add-source</id>
                    <phase>generate-sources</phase>
                    <goals>
                        <goal>add-source</goal>
                    </goals>
                    <configuration>
                        <sources>
                            <source>properties</source>
                            <source>some_real_folder</source>
                        </sources>
                    </configuration>
                </execution>
            </executions>
        </plugin>

    However, when I look at the generated .classpath in Eclipse, "some_real_folder" is there but "properties" is not. It looks like, by default, the build-helper-maven-plugin checks whether the folder exists and, if it does not, won't add it. I am using Maven 3.0.4 outside Eclipse to run the build, and I can see something like this in the Maven logs:

        [INFO] Source directory: <some path>\properties added.

    This is my project structure:

        project1
        \-- properties        (this is the real folder)
        project2
        \-- some_real_folder
        \-- properties        (this is the linked resource pointing to the project1/properties folder)

    All I need is to have both "some_real_folder" and the linked resource "properties" added to the .classpath of project2.

    Read the article

  • The Oracle Enterprise Linux Software and Hardware Ecosystem

    - by sergio.leunissen
    It's been nearly four years since we launched the Unbreakable Linux support program and, with it, the free Oracle Enterprise Linux software. Since then, we've built up an extensive ecosystem of hardware and software partners. Oracle works directly with these vendors to ensure joint customers can run Oracle Enterprise Linux. As Oracle Enterprise Linux is fully compatible, both source and binary, with Red Hat Enterprise Linux (RHEL), there is minimal work involved for software and hardware vendors to test their products with it. We develop our software on Oracle Enterprise Linux and perform full certification testing on Oracle Enterprise Linux as well. Due to the compatibility between Oracle Enterprise Linux and RHEL, Oracle also certifies its software for use on RHEL, without any additional testing.

    Oracle Enterprise Linux tracks RHEL by publishing freely downloadable installation media on edelivery.oracle.com/linux and updates, bug fixes and security errata on the Unbreakable Linux Network (ULN). At the same time, Oracle's Linux kernel team is shaping the future of enterprise Linux distributions by developing technologies and features that matter to customers who deploy Linux in the data center, including file systems, memory management, high performance computing, data integrity and virtualization. All this work is contributed to the Linux and Xen communities.

    The list below is a sample of the partners who have certified their products with Oracle Enterprise Linux. If you're interested in certifying your software or hardware with Oracle Enterprise Linux, please contact us via [email protected]

    Chip Manufacturers: Intel, Intel Enabled Server Acceleration Alliance; AMD
    Server Vendors: Cisco Unified Computing System; Dawning; Dell; Egenera; Fujitsu; HP; Huawei; IBM; NEC; Sun/Oracle
    Storage Systems, Volume Management and File Systems: 3Par; Compellent; EMC VPLEX; FalconStor; Fusion-io; Hitachi Data Systems; HP Storage Array Systems; Lustre; Network Appliance; OCFS2; PillarData; Symantec Veritas Storage Foundation
    Networking (Switches, Host Bus Adapters (HBAs), Converged Network Adapters (CNAs), InfiniBand): Brocade; Emulex; Mellanox; QLogic; Voltaire
    SOA and Middleware: ActiveState ActivePerl, ActivePython; Tibco; Zend
    Backup, Recovery & Replication: Arkeia Network Backup Suite; BakBone NetVault; CommVault Simpana 8; EMC Networker, Replication Manager; FalconStor Continuous Data Protector; HP Data Protector; NetApp Snapmanager; Quest LiteSpeed Engine; Steeleye Data Replication, Disaster Recovery; Symantec NetBackup, Veritas Volume Replicator, Symantec Backup Exec; Zmanda Amanda Enterprise
    Data Center Automation: BMC; CA Unicenter; HP Server Automation (formerly Opsware), System Management Homepage; Oracle Enterprise Manager Ops Center; Quest Vizioncore vFoglight Pro; TeamQuest Manager
    Clustering & High Availability: FUJITSU x10sure; NEC Express Cluster X; Steeleye Lifekeeper; Symantec Cluster Server; Univa UniCluster
    Virtualization Platforms and Cloud Providers: Amazon EC2; Citrix XenServer; Rackspace Cloud; VirtualBox; VMWare ESX
    Security Management: ArcSight: Enterprise Security Manager, Logger; CA Access Control; Centrify Suite; Ecora Auditor; FoxT Manager; Likewise: Unix Account Management; Lumension Endpoint Management and Security Suite; QualysGuard Suite; Quest Privilege Manager; McAfee Application Control, Change Control, Integrity Monitor, Integrity Control, PCI Pro; Solidcore S3; Symantec Enterprise Security Manager (ESM); Tripwire; Trusted Computer Solutions

    Read the article

  • How to download source, modify source, recompile and build .deb package?

    - by burnersk
    I have to customize my Apache2 suExec module to ensure some special environment variables get passed through suExec. How do I download the source code for the Debian package apache2-suexec, modify suexec.c (safe_env_lst), recompile, and build a .deb package again to roll out on the production systems? I tried apt-get source apache2-suexec but didn't find suexec.c within the resulting apache2-* folder. The altered source code should look like this (draft based on http://static.askapache.com/httpd/support/suexec.c):

        static const char *const safe_env_lst[] = {
            /* variable name starts with */
            "HTTP_",
            "SSL_",

            /* NEW: Perl debugging variables */
            "PERL5OPT=",
            "PERL5LIB=",
            "PERLDB_OPTS=",
            "DBGP_IDEKEY=",

            /* NEW: FCGI variables */
            "FCGI=",
            "FCGI_CONNECTION=",
            "FCGI_RUNTIME=",
            "FCGI_STARTTIME=",
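    For illustration only (not part of the question): once a rebuilt apache2-suexec package is installed, a tiny CGI script such as the sketch below can confirm whether the new safe_env_lst entries actually let the variables through. The script location and the idea of setting the variables beforehand (e.g. via SetEnv) are assumptions; it simply prints whichever of the listed variables survive suExec.

        #!/usr/bin/env python3
        # Hypothetical verification sketch: place in a suExec-enabled vhost's cgi-bin
        # and request it over HTTP; it prints the Perl/FCGI variables that survived.
        import os

        print("Content-Type: text/plain")
        print()
        for name in sorted(os.environ):
            if name.startswith(("PERL5", "PERLDB_", "DBGP_", "FCGI")):
                print(f"{name}={os.environ[name]}")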

    Read the article

  • gvim looks strange after applying the Source Sans Pro font

    - by abcdabcd987
    I downloaded the Source Sans Pro font and installed it on my Fedora 17 (Xfce) machine. I ran mkfontscale, mkfontdir and fc-cache -fv, and after fc-list I could see it in the list. Then I changed guifont in gvim to Source\ Sans\ Pro\ 10, but it looks quite strange. When I changed it to DejaVu\ Sans\ Mono\ 10 instead, nothing looked strange. So why would this happen, and how can I solve it? Thanks! (Screenshots omitted: Source Sans Pro vs. DejaVu Sans Mono.)

    Read the article

  • Debugging problems in Visual Studio 2005 - No source code available for the current location

    - by espais
    Hi all. I've searched up and down Google for others with a similar problem, and while I can find the error, I don't think other people have the same underlying problem that I do. Basically, I had to create a project for a unit-testing environment in order to run this test suite. First, I add my original C file and compile, and then a test file (C++) is generated. I then exclude my original source from the project, include this test script (which includes the original source at the top), and run. I can debug the test file fine, but when it jumps to the original C file I get the dreaded 'no source code available for the current location' error. Both files are located in the same location, and I compiled the original file without any issue. Anybody have any thoughts about this? It's driving me crazy!

    Read the article

  • Proper line-ending for an open-source PHP project

    - by Mahdi
    What is the proper line-ending preference for an open-source web project? Obviously it includes source code in PHP, HTML, CSS and JavaScript. The source code is managed via GitHub now, and there are Windows (8 & 7), Linux (Ubuntu) and OS X developers on the team, which means all the major operating systems are covered. P.S. We are using Windows CRLF line endings, plus UTF-8 without BOM, right now without facing any problems; however, I think it might be better to use the *nix/OS X LF style. I have heard some stories about problems caused by the additional CR on Linux or OS X.
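    For illustration (not part of the question): whichever convention the team settles on, a quick audit of which files currently carry CRLF endings makes the switch less surprising. A minimal sketch, assuming the usual file extensions for this kind of project:

        #!/usr/bin/env python3
        # Minimal sketch: report files in the working tree that contain CRLF line endings.
        # The extension list is an assumption; adjust it to the project.
        import os

        EXTENSIONS = (".php", ".html", ".css", ".js")

        for root, dirs, files in os.walk("."):
            dirs[:] = [d for d in dirs if d != ".git"]  # skip the Git metadata directory
            for name in files:
                if name.endswith(EXTENSIONS):
                    path = os.path.join(root, name)
                    with open(path, "rb") as handle:
                        if b"\r\n" in handle.read():
                            print("CRLF:", path)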

    Read the article

  • Replacing sick NTP server source and re-synching (with internal time currently 2 minutes late)

    - by l0c0b0x
    One of the external NTP servers we're using as a source (currently the primary one) seems not to be responding to NTP queries. Unfortunately, on our core router (Cisco 6509), NTP has not switched to the secondary external NTP server as expected. As a result, our core router, which is pretty much our main internal NTP source, is 2 minutes behind. I'm planning to fix the issue by making the working external NTP server the primary source. I'm wondering: how much will a 2-minute change affect my users and services? Especially since, these days, we rely heavily on certificate-based authentication. We're a Windows/Cisco shop.

    Internal NTP setup:
    [Core Router 1 / Cisco 6509]: syncs with two external NTP servers (the primary one is not responding to NTP queries)
    [Core Router 2]: syncs with Core Router 1 (primary) and the working external server (secondary)
    [Other Cisco network devices]: sync with Core Router 1 (primary), Core Router 2 (secondary)
    [Domain controller(s)]: sync with Core Router 1
    [All Windows clients/servers]: sync with domain controllers
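    For illustration (not from the post): before repointing the router, it can help to confirm from an inside host which external sources actually answer and how far off the local clock is. A minimal sketch using the third-party ntplib package; the hostnames are placeholders for the two external servers:

        #!/usr/bin/env python3
        # Sketch: query each candidate NTP source and report reachability,
        # stratum and the offset relative to this machine's clock.
        import ntplib

        SOURCES = ["ntp1.example.com", "ntp2.example.com"]  # placeholder hostnames

        client = ntplib.NTPClient()
        for host in SOURCES:
            try:
                response = client.request(host, version=3, timeout=5)
                print(f"{host}: reachable, stratum {response.stratum}, "
                      f"local offset {response.offset:+.3f}s")
            except Exception as exc:  # ntplib raises NTPException on failure/timeout
                print(f"{host}: no NTP response ({exc})")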

    Read the article

  • apt-get for a package when "contrib/source/Sources" is not found

    - by Stuart Woodward
    I tried to install Webmin on Ubuntu by following the instructions on http://www.webmin.com/deb.html. Using the Update Manager GUI I added the key and the repositories:

        deb http://download.webmin.com/download/repository sarge contrib
        deb http://webmin.mirror.somersettechsolutions.co.uk/repository sarge contrib

    However, when I run:

        apt-get update
        apt-get install webmin

    I get the error:

        W: Failed to fetch http://download.webmin.com/download/repository/dists/sarge/Release  Unable to find expected entry 'contrib/source/Sources' in Release file (Wrong sources.list entry or malformed file)
        W: Failed to fetch http://webmin.mirror.somersettechsolutions.co.uk/repository/dists/sarge/Release  Unable to find expected entry 'contrib/source/Sources' in Release file (Wrong sources.list entry or malformed file)

    Looking at the page at that URL I can see:

        7029066c27ac6f5ef18d660d5741979a 20 contrib/source/Sources.gz

    Is the error caused by the fact that the Sources are compressed with gzip, or am I doing something wrong?
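    A side note, not from the post: apt normally only asks a repository for a contrib/source/Sources index when a deb-src entry (or the GUI's "source code" option) is enabled for it, so it is worth ruling out that the Update Manager added deb-src lines alongside the deb lines shown above. A minimal sketch for listing such entries, assuming the standard apt file locations:

        #!/usr/bin/env python3
        # Sketch: list any deb-src entries in the standard apt configuration files.
        import glob

        paths = ["/etc/apt/sources.list"] + glob.glob("/etc/apt/sources.list.d/*.list")
        for path in paths:
            try:
                with open(path) as handle:
                    for lineno, line in enumerate(handle, start=1):
                        if line.strip().startswith("deb-src"):
                            print(f"{path}:{lineno}: {line.strip()}")
            except OSError:
                pass  # file missing or unreadable; skip it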

    Read the article

  • Looking for an open source real-time network analysis program

    - by JrSysAdmin
    Can somebody recommend an open-source real-time network analysis program? What I'm looking for is a program that displays a graph of bandwidth usage by IP within our internal network and that can be viewed quickly any time we need it (typically when we want to find out quickly who is using large amounts of bandwidth and slowing down the network). Ideally we simply want to hook up a monitor on the wall of our server room to a system whose NIC will be in promiscuous mode, logging all network activity visually so it can easily be seen, running 24/7. I prefer open source because I do not have a budget for this project and prefer open-source projects in general. I'd also prefer it to be available for CentOS, but any Linux distro or Windows OS would be acceptable. Thanks!
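    For illustration (not part of the question): the core of such a tool is just a packet capture that tallies bytes per address. A minimal sketch using the third-party scapy package; it needs root privileges, and the interface name "eth0" is an assumption:

        #!/usr/bin/env python3
        # Sketch: capture for 60 seconds and report the top talkers by source IP.
        from collections import Counter
        from scapy.all import IP, sniff

        bytes_by_ip = Counter()

        def tally(packet):
            if IP in packet:
                bytes_by_ip[packet[IP].src] += len(packet)

        sniff(iface="eth0", prn=tally, store=False, timeout=60)

        for ip, total in bytes_by_ip.most_common(10):
            print(f"{ip}: {total} bytes")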

    Read the article

  • Discount Codes Galore

    - by Cassandra Clark
    Saving money is at the top of everyone's list right now. With this in mind, the Oracle Technology Network team has compiled a list of discounts available at the Oracle Store. We are also introducing an Oracle Technology Network member discount from O'Reilly Media. If you subscribe to any of the Oracle Technology newsletters, you also saw special discounts from CRC Press, Packt Publishing and Apress. We are going to do our best to bring you more offers like this every month. Now on to the discounts...

    Oracle Store offers (all expiring May 31st, 2010, so don't miss out!):
    - Expand your productivity with Oracle Open Office and save 15%. Enter OTNOffice at checkout. Buy now!
    - Drive business agility and performance with industry-leading Oracle Database Management Packs and save 10% when you purchase them at the Oracle Store. Enter OTNDBMP at checkout. Buy now!
    - 15% savings on Oracle Virtualization and Unbreakable Linux Support at the Oracle Store. Enter code OTNLinuxVM at checkout. Buy now!
    - 20% savings on Oracle SQL Developer Data Modeler. Use OTNSQL at checkout. Buy now!

    O'Reilly Oracle Technology Network member offer: O'Reilly is generously offering Oracle Technology Network members 35% off print books and 40% off eBooks. Browse Oracle titles at http://oreilly.com/pub/topic/oracle. Use discount code TECNT at checkout.

    Read the article

  • Using SQL Source Control with Fortress or Vault – Part 2

    - by AjarnMark
    In Part 1, I started talking about using Red-Gate's newest version of SQL Source Control and how I really like it as a viable method to source control your database development. It looks like this is going to turn into a little series where I will explain how we have done things in the past and how life is different with SQL Source Control. I will also explain some of my philosophy and methodology around deployment with these tools. But for now, let's talk about some of the good and the bad of the tool itself.

    More Kudos and Features

    I mentioned previously how impressed I was with the responsiveness of Red-Gate's team. I have been having an ongoing email conversation with Gyorgy Pocsi, and as I have run into problems or requested that things behave a little differently, it has not been more than a day or two before a new build is ready for me to download and test. Quite impressive! I'm sure much of what I requested was already in the plans, so I can't really take credit for it, but throughout this conversation Red-Gate has implemented several features that were not in the first Early Access version. Those include:

    - Honoring the Fortress configuration option to require Work Item (Bug) IDs on check-ins.
    - Adding the check-in comment text as a comment to the Work Item.
    - Adding the list of checked-in files, along with the Fortress links for automatic History and DIFF views.
    - Updating the status of a Work Item on check-in (e.g. setting the item to Complete or, in our case, "Dev-Complete").
    - Support for the Fortress 2.0 API, and not just the Vault Pro 5.1 API (see the later notes regarding support for Fortress 2.0).

    These were all features that I felt we really needed to have in place before I could honestly consider converting my team to using SQL Source Control on a regular basis. Now that I have those, my only excuse is not wanting to switch boats on the team mid-stream. So when we wrap up our current release in a few weeks, we will make the jump. In the meantime, I will continue to bang on it to make sure it is stable. It passed one test for stability when I did a test load of one of our larger database schemas into Fortress with SQL Source Control. That database has about 150 tables, 200 user-defined functions and nearly 900 stored procedures. The initial load to source control went smoothly and took just a brief amount of time.

    Warnings

    Remember that this IS still in the pre-release stage, and while I have not had any problems after that first hiccup I wrote about last time, you still need to treat it with a healthy respect. As I understand it, the RTM is targeted for February. There are a couple more features that I hope make it into the final release version, but if not, they'll probably be coming soon thereafter. Those are:

    - A Browse feature to let me look up the Work Item ID instead of having to remember it or look back in my item details. This is just a matter of convenience. I normally have my Work Item list open anyway, so I can easily look it up, but hey, why not make it even easier.
    - A multi-line comment area. The current space for writing check-in comments is a single-line text box. I would like to have a multi-line space, as I sometimes write lengthy commentary. But I recognize that it is a struggle to get most developers to put in more than the word "fixed" as their comment, so this meets the need of the majority as-is, and it's not a show-stopper for us.
    - Merge. SQL Source Control currently does not have a Merge feature. If two or more people make changes to the same database object, you will get a warning of the conflict and have to choose which one wins (and then manually edit to include the others' changes). I think it unlikely you will run into actual conflicts in Stored Procedures and Functions, but you might with Views or Tables. This will be nice to have, but I'm not losing any sleep over it, and I have multiple tools at my disposal to do merges manually, so it is really not a show-stopper for us.
    - Automation has its limits. As cool as this automation is, it has its limits, and there are some changes that you will be better off scripting yourself. For example, if you are refactoring table definitions and want to change a column name, you can write that as a quick sp_rename command and preserve the data within that column. But because this tool is looking just at a before-and-after picture, it cannot tell that you just renamed a column; to the tool, it looks like you dropped one column and added another. This is not a knock against Red-Gate. All automated scripting tools have this issue unless they are actively monitoring your every step to know exactly what you are doing. This means that when you go to deploy your changes, SQL Compare will script the change as a column drop and add, or will attempt to rebuild the entire table. Unfortunately, neither of these approaches will preserve the existing data in that column the way an sp_rename will, so you are better off scripting that change yourself. Thankfully, SQL Compare will produce warnings about the potential loss of data before it does the actual synchronization and give you a chance to intercept the script and do it yourself.

    Also, please note that the current official word is that SQL Source Control supports Vault Professional 5.1 and later. Vault Professional is the new name for what was previously known as Fortress. (You can read about the name change on SourceGear's site.) The last version of Fortress was 2.x, and the API for Fortress 2.x is different from the API for Vault Pro. At my company, we are currently running Fortress 2.0, with plans to upgrade to Vault Pro early next year. Gyorgy was able to come up with a work-around for me to be able to use SQL Source Control with Fortress 2.0, even though it is not officially supported. If you are using Fortress 2.0 and want to use SQL Source Control, be aware that this is not officially supported, but it is working for us, and you can probably get the work-around instructions from Red-Gate if you're really, really nice to them.

    Upcoming Topics

    Some of the other topics I will likely cover in this series over the next few weeks are:

    - How we used to do source control back in the old days (a few weeks ago) before SQL Source Control was available to Vault users
    - What happens when you restore a database that is linked to source control
    - Handling multiple development branches of source code
    - Concurrent development practices and handling conflicts
    - Deployment tips and best practices
    - A recap after using the tool for a while

    Read the article

  • GitHub: Are there external tools for managing issues list vs. project backlog

    - by DXM
    Recently I posted one of my projects[1] on GitHub, and as I was exploring the capabilities of the site, I noticed they have a rather decent issue-tracking section. I want to use that section so that a) other people can report bugs if they'd like, and b) other people can see which bugs I'm aware of. However, as others have noted, the issues list cannot be prioritized in order to create a project backlog. For now my backlog has been a text file, but I'd like it to be integrated so the same information isn't maintained in different places. Having a fully ordered list, which is something we also practice at work, has been very useful, as I can open one file, start with line 1 and fire off 2 or 3 items in one sitting without having to go back to a full issues/stories bucket. GitHub doesn't offer this. What GitHub does offer is a very nice and clean API, so issues can easily be exported into anything else. I've searched to see if there are other websites (like Trello) that integrate with GitHub issues, but did not find anything. Does anyone know of such a product, service or offline tool? Those of you who use GitHub, what is your experience in managing a backlog? I kind of hate the idea of manually managing two disconnected lists, like some people seem to be doing with wiki project pages.

    [1] Are shameless plugs allowed on this site? I searched but didn't find a definite answer. If it's bad practice, STOP and don't read further. As a developer I got sick and tired of navigating to the same set of folders 30 times a day, so I wrote a little auto-collapsible utility that gets stuck to the desktop and allows easy access to the folders you constantly use.
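    For illustration (not from the post): since the API is mentioned as the escape hatch, here is a minimal sketch (using the third-party requests package; "someuser/someproject" and the output filename are placeholders) that dumps the open issues into a plain-text file, which can then be reordered by hand as a prioritized backlog:

        #!/usr/bin/env python3
        # Sketch: export open GitHub issues to a plain-text backlog file.
        import requests

        REPO = "someuser/someproject"  # placeholder owner/repo
        url = f"https://api.github.com/repos/{REPO}/issues"

        response = requests.get(url, params={"state": "open", "per_page": 100}, timeout=10)
        response.raise_for_status()

        with open("backlog.txt", "w", encoding="utf-8") as backlog:
            for issue in response.json():
                if "pull_request" in issue:  # the issues endpoint also returns pull requests
                    continue
                backlog.write(f"#{issue['number']} {issue['title']}\n")

        print("Wrote backlog.txt; reorder its lines by hand to set priorities.")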

    Read the article
