Search Results

Search found 22000 results on 880 pages for 'worker process'.

Page 525/880

  • .jar: why does it work only with a specific JDK version?

    - by sadiq
    Hi friends, I have a website's source code, which includes an adito.jar file. I extract the jar, make my changes, and rebuild it with "jar -cvf adito.jar .", and the rebuilt jar works fine with the website. But this only works when I use JDK 1.6.0_14. If I use JDK 1.5 or the latest JDK 1.6.0_18, the same jar will not run on my website. What is the possible reason for this? I'm new to Java. Thanking you, sadiq.
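    The likely suspects (my assumption, not confirmed in the question) are the manifest and the class-file version: jar -cvf writes a fresh default manifest, discarding attributes such as Main-Class or Class-Path from the original, and classes compiled under a newer JDK will not load on an older runtime. A hedged sketch of a rebuild that keeps the original manifest:

        # Hypothetical sketch: pass the original manifest with the m flag so
        # jar does not replace it with an empty default one.
        jar -cvfm adito.jar extracted/META-INF/MANIFEST.MF -C extracted .

        # If any classes were recompiled, target the runtime you deploy on:
        javac -source 1.5 -target 1.5 MyPatchedClass.java   # hypothetical file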

    Read the article

  • Can I upgrade the kernel to version 3.4 on CentOS 6.2? If so, how?

    - by shiva
    Hi, I have a server running CentOS 6.2 with kernel version 2.6, but I need to increase my application's performance. Kernel 3.4 has the x32 ABI, which could improve performance, so I want to upgrade to 3.4. Is it possible? I tried downloading the kernel source, compiling, and installing it, but I still see the same kernel version. What went wrong? I followed the process mentioned in this link: http://www.tecmint.com/kernel-3-5-released-install-compile-in-redhat-centos-and-fedora/
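    For reference, a hedged sketch of the usual source-build steps (paths and version are assumptions); the most common cause of "same kernel version after install" is booting the old GRUB entry rather than the new one:

        # Hypothetical sketch of a manual kernel build on CentOS 6
        cd /usr/src
        wget https://www.kernel.org/pub/linux/kernel/v3.x/linux-3.4.tar.bz2
        tar xjf linux-3.4.tar.bz2 && cd linux-3.4
        cp /boot/config-$(uname -r) .config    # start from the running config
        make menuconfig
        make && make modules_install && make install   # adds a GRUB entry
        # Reboot, pick the new entry (or set it as the default in
        # /boot/grub/grub.conf), then verify with: uname -r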

    Read the article

  • The MSc gray zone: how to deal with the "too inexperienced for engineering/too under-qualified for research" situation?

    - by Hunter2
    Last year I got an MSc degree in CS. At the beginning of the MSc course, I was keen on moving on with research and going for a PhD. However, as the months passed, I started to feel the urge to write software that people would, well, actually use. The programming bug had bitten me, again. So, I decided that before deciding on getting a PhD degree, I would spend some time in the "real world", working as a software developer. Sadly, most companies here in Brazil are "services" companies that seem to be stuck in the 80s when it comes to software development. I have to fend off pushy managers, less-than-competent coworkers and outrageous software requirements (why does everyone seem to need a 50k Oracle license and a behemoth WebSphere AS for their CRUD applications?) on a daily basis, and even though I still love software development, the situation is starting to touch a nerve. And, mind you, I'm already lucky to have a job at a place that isn't a plain software sweatshop. Sure, there are better places around here, or I could always try my luck abroad, but then I hit the proverbial brick wall: "Sorry, you're too inexperienced as a developer and too under-qualified as a researcher." I've heard this, and variations of it, multiple times. Research position recruiters look for die-hard, publication-ridden, rockstar PhDs, while development position recruiters look for die-hard, experience-ridden, rockstar programmers. To most, my MSc degree seems like a minor bump on my CV (and an outright waste of time to some). Applying for positions abroad is even harder, since the employer would have to deal with the hassle of a visa process, which, I understand, is sometimes too much. Now I feel I've reached a dead end. I'm certain that development (and not research) is my thing, so should I just dismiss my MSc (or play it as a "trump card") and play the "big fish in a small pond" role while I gather some experience and contribute to some open-source projects as a plus? Is there a better way to handle this?

    Read the article

  • Benefits of Behavior Driven Development

    - by Aligned
    Originally posted on: http://geekswithblogs.net/Aligned/archive/2013/07/26/benefits-of-behavior-driven-development.aspx

    Continuing my previous article on BDD, I wanted to point out some benefits of BDD, and since BDD is an extension of Test Driven Development (TDD), you get those as well. I'll add another article on some possible downsides of this approach. There are many articles about the benefits of TDD, and they apply to BDD. I've pointed out some here and copied some of the main points from each article, but there are many more, including the book The Art of Unit Testing by Roy Osherove.

    http://geekswithblogs.net/leesblog/archive/2008/04/30/the-benefits-of-test-driven-development.aspx (Lee Brandt): Stability, Accountability, Design Ability, Separated Concerns, Progress Indicator.

    http://tddftw.com/benefits-of-tdd/: Help maintainers understand the intention behind the code. Bring validation and proper data handling concerns to the forefront. Writing the tests first is fun. Better APIs come from writing testable code. TDD will make you a better developer.

    http://www.slideshare.net/dhelper/benefit-from-unit-testing-in-the-real-world (from Typemock). Take a look at the slides, especially the extra time required for TDD (slide 10) and the next one on the bugs avoided using TDD (slide 11): fewer bugs (slide 11), about testing and development (13), increase confidence in code (14), fearlessly change your code (14), document requirements (14) (also see http://visualstudiomagazine.com/articles/2013/06/01/roc-rocks.aspx), discover usability issues early (14).

    All these points and articles are great, and there are many more. The following are my additions to the benefits of BDD, from using it in real projects for my company.

    July 2013 on MSDN - Behavior-Driven Design with SpecFlow. Scott Allen did a very informative TDD and MVC module, but to me he is doing BDD: Compile and Execute Requirements in Microsoft .NET ~ video from TechEd 2012.

    Communication: I was working through a complicated task whose decision tree kept growing. After writing out the Given, When, Then of the scenario, I was able to tell QA what I had worked through for their initial test cases. They were able to add from there. It is also useful to use this language with other developers, managers, or clients to help make informed decisions on whether something meets the requirements or can be simplified to save time (money).

    Thinking through solutions before starting to code: This was the biggest benefit to me. I like to jump into coding to figure out the problem. Many times I don't understand my path well enough and have to do some parts over. A past supervisor told me several times during reviews that I need to get better at seeing "the forest for the trees". When I sit down and write out the behavior that I need to implement, I force myself to think things out further and catch scenarios before they get to QA. A co-worker who is new to BDD (we've been using it in our new project for the last 6 months) said, "It really clarifies things." It took him a while to understand it all, but now he's seeing the value of this approach (yes, there are some downsides, but that is a different issue).

    Developers' confidence: This is huge for me. With tests in place, my confidence grows that I won't break code that I'm not directly changing. In the past, I've worked on projects without tests, and we would frequently find regression bugs (or, worse, the users would find them). That isn't fun. We don't catch all problems with the tests, but when QA catches one, I can write a test to make sure it doesn't happen again. It's also good for releasing code and telling your manager that it's good to go. As time goes on and the code gets older, how confident are you that checking in code won't break something somewhere else?

    Merging code - pre-release confidence: If you're merging code a lot, it's nice to have the tests to help ensure you didn't merge incorrectly.

    Interrupted work: I had a task that I started and planned out, then was interrupted for a month because of different priorities. When I started it up again and un-shelved my changes, I had the BDD specs, and they helped me remember what I had figured out and what was left to do. It would have been much more difficult without the specs and tests.

    Testing and verifying complicated scenarios: Sometimes in the UI there are scenarios that get tricky because there are a lot of steps involved (click here to open the dialog, enter the information, make sure it's valid, when I click cancel it should do {x}, when I click ok it should close and do {y}, then do this, etc.). With BDD I can avoid some of the mouse clicking, define the scenarios, and have them re-run quickly, without using a mouse. UI testing is still needed, but this helps a bunch. The same can be true for tricky server logic.

    Documentation of assumptions and specifications: The BDD spec tests (Jasmine or SpecFlow or another tool) also work as documentation and show what the original developer was trying to accomplish. It's not a separate Word document, so developers will keep it up to date instead of letting it become obsolete. What happens if you leave the project (consulting, new job, etc.) with no specs or, at the least, good comments in the code? Sometimes I think of a new scenario, so I add a failing spec and continue in the same stream of thought (rather than forgetting it because it was on a piece of paper or in a notepad). Then later I can come back, handle it, and have it documented.

    Jasmine tests and JavaScript help deal with the untyped system: I like JavaScript, but I also dislike working with JavaScript. I miss C# telling me at build time whether a property actually exists. I like the idea of TypeScript and hope to use it more in the future. I also use KnockoutJS, which has observables that need to be called with a trailing (), since an observable is a function. It's hard to remember when to use () or not, and the Jasmine specs/tests help ensure the correct usage (see the sketch below).

    This should give you an idea of the benefits that I see in using the BDD approach. I'm sure there are more. It takes a lot of practice, investment and experimentation to figure out how to approach this and to get comfortable with it. I agree with Scott Allen in the video I linked above: "Remember that TDD can take some practice. So if you're not doing test-driven design right now? You can start and practice and get better. And you'll reach a point where you'll never want to go back."
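    A minimal Jasmine sketch of the KnockoutJS point above (the view model is hypothetical; assumes Jasmine and Knockout are loaded):

        // Catches the classic Knockout mistake: an observable must be
        // called as a function to read or write its value.
        describe('order view model', function () {
            it('stores the total in the observable', function () {
                var vm = { total: ko.observable(0) };
                vm.total(42);                  // write with (), not vm.total = 42
                expect(vm.total()).toBe(42);   // read with (), not vm.total
            });
        });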

    Read the article

  • Too many heap subpools might break the upgrade

    - by Mike Dietrich
    Recently one of our new upcoming Oracle Database 11.2 reference customers upgraded their production database - a huge EBS system - from Oracle 9.2.0.8 to Oracle Database 11.2.0.2. They had tested very well, and we had optimized the upgrade process, the recompilation timings, etc. But once the live upgrade was done, it failed in the JAVA component piece with this error:

        begin if initjvmaux.startstep('CREATE_JAVA_SYSTEM') then
        *
        ORA-29553: class in use: SYS.javax/mail/folder
        ORA-06512: at "SYS.INITJVMAUX", line 23
        ORA-06512: at line 5

    Support's diagnosis was pretty quick and referred to Bug 10165223 - ORA-29553: class in use: sys.javax/mail/folder during database upgrade. But how could this happen? Actually, I don't know, as we used the same init.ora setup on test and production. The only difference: the prod system has more CPUs and RAM. Anyway, the bug lists as workarounds either decreasing the SGA to less than 1 GB or decreasing the number of heap subpools to 1. This query helped diagnose the number of heap subpools:

        select count(distinct kghluidx) num_subpools
        from x$kghlu
        where kghlushrpool = 1;

    The result was 2, so we ran the upgrade with this parameter set: _kghdsidx_count=1. And finally it worked well.

    One sad thing: after the upgrade failed, Support recommended restoring the whole database, which took an additional 3-4 hours. As the ORACLE SERVER component had already been upgraded successfully at the stage where the error happened, it would have been fine to go on with the manual upgrade and start the catupgrd.sql script. It would have detected that the ORACLE SERVER component was already upgraded and just picked up the non-upgraded components.

    The good news: I got one extra slide to add to our workshop presentation.
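    For completeness, a hedged sketch of applying that workaround via an spfile (underscore parameters must be double-quoted in ALTER SYSTEM; do this only under Support's guidance):

        alter system set "_kghdsidx_count" = 1 scope = spfile;
        -- restart so the parameter takes effect, then re-run the upgrade step
        shutdown immediate
        startup upgrade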

    Read the article

  • A Huge Opportunity in Small Things

    - by Tori Wieldt
    Addressing the strong demand for Java in the embedded market, Oracle is hosting a new Java Embedded @ JavaOne event in San Francisco October 3-4. The event allows decision makers to attend the Java Embedded @ JavaOne business-focused program, while their IT/development staff can attend the technically-focused JavaOne conference. [Obligatory comment about suits & ties vs. jeans & T-shirts removed.] The two-day event includes keynotes, sessions and demonstrations. In his keynote this morning, Judson Althoff, Senior Vice President of Worldwide Alliances and Channels and Embedded Sales at Oracle, explained: devices are all around us - on 24x7, connected all the time. The explosion of devices is the next IT revolution, and Java is the right solution for this space. Java embedded solutions provide a framework to provision, manage, and secure devices. They also provide the ability to aggregate, process and analyze a multitude of data. Java is one platform to program them all. Terrance Barr, Java Evangelist and Java ME expert, is enthusiastic about the huge opportunity: "It's the right time and right place for Java Embedded," he said. "Oracle is looking for partners who want to take advantage of this next wave in IT." The embedded space continues to heat up. Today, Cinterion launched the EHS5, an ultra-compact, high-speed M2M communication module providing secure wireless connectivity for a wide variety of industrial applications. Last week, Oracle announced Oracle Java ME Embedded 3.2, a complete client Java runtime optimized for resource-constrained, connected, embedded systems, as well as Oracle Java Wireless Client 3.2, Oracle Java ME Software Development Kit (SDK) 3.2, and Oracle Java Embedded Suite 7.0 for larger embedded devices. There is a huge opportunity in small things.

    Read the article

  • Where is php executable on Ubuntu?

    - by user601L
    I have installed Apache and PHP. I know PHP works, as I have tested a simple PHP file on the Apache server. I'm writing a simple web server which should be able to process PHP files. So what I want to do, once I get a request for a PHP file, is something like 'exec php test.php', get the output, and pass it to the client. As I'm not much into Ubuntu, I don't know where the PHP executable is (it should be in /bin, right?). But there is no php file inside /bin or /usr/bin. When I run 'which php' it shows nothing. How do I do this?
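    A hedged guess at the cause: with mod_php, PHP runs inside Apache, so no standalone binary exists until you add the CLI package (the package name assumes the PHP 5 era of Ubuntu):

        sudo apt-get install php5-cli
        which php          # should now print /usr/bin/php
        php -f test.php    # runs the script and prints its output to stdout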

    Read the article

  • Backing up data in an encrypted way

    - by Eli Bendersky
    I have the following use case: there's some data on my PC I want to periodically back up online. I own some hosting, so I want to use that for the backups; I don't want to pay for another backup service. I want to encrypt my data locally prior to moving it to the server. I have no problem writing scripts to automate the process (say, periodically generate the backup and upload it by FTP to my server), but my main question is about step 3 - the encryption: which way is recommended to encrypt my files (say, collected into a .ZIP) prior to uploading to the server? P.S. TrueCrypt seems popular, but it's not quite what I'm looking for, since I don't want the files to be constantly encrypted here on my PC.
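    One common approach (a sketch, assuming GnuPG is available; file names are placeholders) is symmetric encryption of the archive before upload:

        zip -r backup.zip ~/important-data
        gpg --symmetric --cipher-algo AES256 backup.zip   # writes backup.zip.gpg
        # upload backup.zip.gpg; decrypt later with:
        # gpg --output backup.zip --decrypt backup.zip.gpg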

    Read the article

  • Case study: LOREX Technology Increases Website Traffic 90% with Oracle ATG

    - by Richard Lefebvre
    LOREX Technology Increases Website Traffic 90% by Enhancing the Online Customer Experience with a Flexible E-Commerce Platform LOREX Technology Inc. provides businesses and consumers with advanced video surveillance security products under the LOREX and Digimerge brands. LOREX, which caters to midsize business and consumer markets, is available in thousands of retail locations across North America. The Digimerge division sells its products through security system distributors in North America. Both brands concentrate on the sale of wired, wireless, and IP security surveillance and monitoring equipment, including cameras, digital video recorders, and all-in-one systems. LOREX conducted an extensive search for the right e-commerce platform to address its immediate need for a more intuitive shopping cart interface that could grow along with the company. After reviewing other solutions, including open source, LOREX chose Oracle ATG Web Commerce because it addressed every stage of the buying process and crossed all customer touch points, including the Web, contact center, mobile devices, social media, and its B2B partners’ physical stores. LOREX also found that Oracle ATG Web Commerce’s functionality was more robust than competing options, and it offered an attractive total cost of ownership. “Oracle ATG Web Commerce provided an optimal foundation to support rapid, scalable, long-term business growth while allowing full control of the platform,” said Sufi Khan Sulaiman, director, E-Commerce and Digital, LOREX. Read full story here  

    Read the article

  • OpenVPN Setup - Service Won't Start

    - by Lenwood
    I'm in the process of setting up OpenVPN on a VPS running Debian 6. I've walked step-by-step through this guide twice now, and I can't get the service to start. When I start the service, the error reported in the log file is:

        Cannot ioctl TUNSETIFF tun: Inappropriate ioctl for device (errno=25)

    I've searched the web a few times and I'm not finding anything helpful. I've tried: changing file permissions (no change); deleting the file (I get an error stating no file found, errno=2); making a folder named "tun" (same error, errno=2). I've wiped my installation and completed the steps verbatim twice now. I get no errors along the way, just the error above in my log file. The contents of my server.conf file are listed below, minus all the comments for brevity. Can anyone help?

        port 1194
        proto udp
        dev tun
        ca ca.crt
        cert myserver.crt
        key myserver.key
        dh dh1024.pem
        server 10.8.0.0 255.255.255.0
        ifconfig-pool-persist ipp.txt
        keepalive 10 120
        comp-lzo
        persist-key
        persist-tun
        status openvpn-status.log
        log openvpn.log
        verb 3
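    A hedged diagnostic: errno 25 on TUNSETIFF usually means the TUN device node is missing, or the VPS container has TUN disabled on the host side:

        test -c /dev/net/tun && echo "tun node present" || {
            mkdir -p /dev/net
            mknod /dev/net/tun c 10 200   # major 10, minor 200 = TUN device
            chmod 600 /dev/net/tun
        }
        # On container-based VPSes (OpenVZ/Virtuozzo), the provider must
        # also enable TUN/TAP for the container.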

    Read the article

  • Google Analytics - drop in traffic

    - by user1001421
    Bit of a general question here. We are in the process of converting a number of our clients from older web sites to new ones. The problem we are getting - and sorry for being so general here - is a sharp decline in traffic as reported by Google Analytics. It's not a gradual decline; it seems to hit almost as soon as the new site goes live. I've just got a few questions to see if there is something we are doing wrong: a) We are using the same Analytics accounts going from the old site to the new. Is this a bad idea? b) The actual Analytics code is integrated into the pages using a server-side include. Is this a bad idea? c) We structure our sites differently from the old ones, i.e., the old sites would pretty much have all the web pages in the root directory, and hyperlinks would link to the page files, e.g. <a href="somepage.aspx">Link</a>. Our new sites have a directory structure that pretty much reflects the navigation structure, and hyperlinks link to the page's directory instead of the actual page, e.g. <a href="/new-items/shoes/">New shoes</a>. Is this a bad idea? I'm really searching for a needle in a haystack here. I would appreciate any help or advice as to why we are getting such a sharp and sudden drop in traffic. Again, sorry this is such a general question. Thanks in advance.

    Read the article

  • backface culling error (in world space)

    - by acrilige
    I'm writing a simple software renderer. My pipeline has a backface culling stage, but it looks like it has some error (see picture). I perform culling right after the world transformation (is that correct?). (I can't insert a picture in the post because I don't have enough points, so I just uploaded it (cube model): http://imageshack.us/photo/my-images/705/bcerror.png/)

        Vector3F view_dir(0.0f, 0.0f, 1.0f);
        std::vector<Triangle> to_remove;
        for (Triangle &t : m_triangles) {
            // edge vectors of the triangle
            Vector4F e1 = t.v2 - t.v1;
            Vector4F e2 = t.v3 - t.v1;
            // face normal = e1 x e2
            Vector3F normal(
                e1.y * e2.z - e1.z * e2.y,
                e1.z * e2.x - e1.x * e2.z,
                e1.x * e2.y - e1.y * e2.x
            );
            normal.Normalize();
            // cull triangles whose normal faces away from a fixed view axis
            float dot = Dot(view_dir, normal);
            if (dot <= 0)
                to_remove.push_back(t);
        }
        for (Triangle& t : to_remove)
            m_triangles.erase(std::remove(m_triangles.begin(), m_triangles.end(), t), m_triangles.end());

    The camera sits at the origin and points into the screen (RH). What is the reason? For a better explanation I uploaded a picture with cube rotation screenshots: http://imageshack.us/photo/my-images/842/bcmove.png/

    UPDATED: The error occurs only when the triangle has a non-zero offset from the origin.

    UPDATED 2: If I perform backface culling in clip space (after transforming all vertices with the view and projection matrices) and just check the z coordinate of the triangle normal, it works perfectly... Can I perform culling RIGHT BEFORE the view/proj transforms? In that case it looks like culling would not depend on the projection, and that's not right?..

    UPDATED 3: I found the answer and will post it in two hours - again, because of my lack of reputation.
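    The "non-zero offset" symptom matches using a constant view direction: with a perspective camera, the eye ray varies per triangle. A hedged sketch of the world-space test inside the same loop (assumes the camera at the origin and the same helper types; flip the comparison if your winding order differs):

        // Test the normal against the vector from the camera to the
        // triangle, not against a fixed axis.
        Vector3F camera_pos(0.0f, 0.0f, 0.0f);    // assumption: camera at origin
        Vector3F to_tri(t.v1.x - camera_pos.x,
                        t.v1.y - camera_pos.y,
                        t.v1.z - camera_pos.z);
        float facing = Dot(to_tri, normal);
        if (facing >= 0)          // normal points away from the eye: back face
            to_remove.push_back(t);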

    Read the article

  • MBP becomes very hot after using Xcode

    - by Globalhawk
    Hardware: MBP early 2011 version. OS: Mountain Lion. App: Xcode 4.5.2. Problem: every time I start Xcode, 2 or 3 processes called "git" start running. But when I quit Xcode, the "git" processes don't quit and keep using a lot of CPU. The computer then becomes quite hot and the battery drains very quickly. If I manually kill these processes, the problem is gone. I tried reinstalling Xcode several times, but the problem comes back every time. It drives me crazy. Any help will be appreciated!
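    A hypothetical stop-gap from Terminal until the root cause is found:

        ps aux | grep '[g]it'   # list the orphaned git processes and their PIDs
        killall git             # or kill individual PIDs with: kill <PID>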

    Read the article

  • Memory compatibility?

    - by nvillec
    I'm in the process of building a PC intended mostly for gaming and I've got a few questions. Currently, I only have the motherboard and CPU: Motherboard: GIGABYTE GA-Z68AP-D3. CPU: Intel Core i3-2120. My main question is about memory compatibility. I have 4 Hynix 2GB HMT125U7BFR8C-G7 modules with light-moderate use lying around, and I'd love to save a few bucks if I can. I've read that this is server memory... a) Will that be a problem for PC use? b) Is it compatible with the motherboard? I've emailed Hynix and checked Crucial to no avail. If incompatible, what memory would be a good fit given the components I have? The motherboard has 4 sockets and supports up to 32GB, but I don't know that I have the budget for that at the moment. Thanks!!

    Read the article

  • How can I load an image directly into the Windows clipboard from the command line?

    - by Daniel J. Pritchett
    My dad asked me how he could script the pasting/inclusion of images in various applications. I'm sure I could script out some quick <img src=...> HTML, but I believe he's also looking to do this in Windows GUI applications like Word or Outlook. So, how could I script a process with the following inputs and outputs?

        load_image_into_clipboard_script.cmd sample_file.jpg

    The clipboard should then contain the aforementioned image file, just as if I'd, e.g., opened it in Paint and done a Select All - Copy. I noticed there's a clip.exe utility with Vista/Win2003 and up; perhaps that will be a useful intermediary?
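    A hypothetical sketch using PowerShell (clip.exe only accepts text, so the .NET Clipboard API does the work; SetImage requires an STA thread, hence -STA; the script name is an assumption):

        # load_image_into_clipboard.ps1 -- invoke as:
        #   powershell -STA -File load_image_into_clipboard.ps1 sample_file.jpg
        param([string]$Path)
        Add-Type -AssemblyName System.Windows.Forms
        Add-Type -AssemblyName System.Drawing
        $img = [System.Drawing.Image]::FromFile((Resolve-Path $Path).Path)
        [System.Windows.Forms.Clipboard]::SetImage($img)  # copies the bitmap data
        $img.Dispose()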

    Read the article

  • GNOME session not starting after filesystem corruption

    - by user3215
    I'm running Ubuntu 9.10 desktop edition. Suddenly today /home became corrupted and I was prompted to run fsck manually. I ran fsck -y /home and rebooted the system. The system booted, but I got no GUI (GNOME session) - just a black screen with a login prompt instead. Any tricks to start my system normally? Any help is greatly appreciated.

    EDIT 1: The errors were similar to the following (maybe with some mistakes, as I had to type them manually):

        machine1 login: root
        password:
        at login Sun Jan 16 15:30:46 IST 2011 on tty1
        EXT3-fs error (device sda1): ext3_lookup: deleted inode referenced
        aborting journal on device sda1
        Remounting filesystem read-only
        root@machine1:~# startx
        mktemp: failed to create file via template `/tmp/serverauth.xxxxxxxxxxx': Read-only file system
        /usr/bin/startx: line 157: cannot create temp file for here-document: Read-only file system
        xauth: error in locking authority file /root/.Xauthority
        /usr/bin/startx: line 173: cannot create temp file for here-document: Read-only file system
        xauth: error in locking authority file /root/.Xauthority
        /usr/bin/startx: line 173: cannot create temp file for here-document: Read-only file system
        X: cannot stat /tmp/.X11-unix (No such file or directory), aborting
        giving up.
        xinit: No such file or directory (errno 2): unable to connect to xserver
        xinit: No such process (errno 3): Server error
        xauth: error in locking authority file /root/.Xauthority

    Read the article

  • Can a MySQL slave be a master at the same time?

    - by mmattax
    I am in the process of migrating 2 DB servers (master & slave) to two new DB servers (master and slave):

        DB1 - Master (production)
        DB2 - Slave (production)
        DB3 - New Master
        DB4 - New Slave

    Currently I have the replication set up as:

        DB1 -> DB2
        DB3 -> DB4

    To get the production data replicated to the new servers, I'd like to get it "daisy chained" so that it looks like this:

        DB1 -> DB2 -> DB3 -> DB4

    Is this possible? When I run show master status; on DB2 (the production slave), the binlog position never seems to change:

        +------------------+----------+--------------+------------------+
        | File             | Position | Binlog_Do_DB | Binlog_Ignore_DB |
        +------------------+----------+--------------+------------------+
        | mysql-bin.000020 |       98 |              |                  |
        +------------------+----------+--------------+------------------+

    I'm a bit confused as to why the binlog position is not changing on DB2; ideally it will be the master for DB3.
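    A hedged note on the symptom: by default a slave does not write replicated events to its own binary log, so DB2's position never moves. A sketch of the usual my.cnf change on DB2 (restart required; the option names are standard MySQL, the values are assumptions):

        [mysqld]
        server-id         = 2          # must be unique per server in the chain
        log-bin           = mysql-bin
        log_slave_updates = 1          # write replicated events to the binlog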

    Read the article

  • Exchange Server 2010 Outlook Web Access - Exchange Control Panel web interface

    - by Aceth
    From what I can gather, the mailbox part of the web interface works fine. When any of the users go to Options (top right) and try to use some of the features, such as the Organise Mail Delivery Reports option to find messages, etc., it comes up with the message: "An item with the same key has already been added". I've looked in Event Viewer and I think it's this error:

        Watson report about to be sent for process id: 7016, with parameters: E12IIS, c-RTL-AMD64, 14.00.0639.021, ECP, ECP.Powershell, https://x.x.x.x/ecp/PersonalSettings/Accounts.svc/GetList, UnexpectedCondition:ArgumentException, c09, 14.00.0639.021. ErrorReportingEnabled: False

    and

        Request for URL 'https://x.x.x.x/ecp/PersonalSettings/Accounts.svc/GetList' failed with the following error:
        System.ArgumentException: An item with the same key has already been added.
           at System.ServiceModel.AsyncResult.End[TAsyncResult](IAsyncResult result)
           at System.ServiceModel.Activation.HostedHttpRequestAsyncResult.End(IAsyncResult result)
           at System.ServiceModel.Activation.HostedHttpRequestAsyncResult.ExecuteSynchronous(HttpApplication context, Boolean flowContext)
           at Microsoft.Exchange.Management.ControlPanel.WebServiceHandler.ProcessRequest(HttpContext context)
           at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
           at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)

    I've tried googling, but no luck finding anything relevant. :(

    Read the article

  • How to install a .deb file from within a preinst script

    - by Ashwin D
    I have my own application packaged using dpkg. The application depends on several deb files, which I'm trying to install from within the preinst script of my application. The preinst script checks whether a dependent deb file is installed; if not, it installs it using the dpkg -i command. This is repeated for all the dependent deb files needed by the main application. When I try to install the main application using dpkg -i, the command returns failure when trying to execute the preinst script. Below is the error message:

        dpkg: error: dpkg status database is locked by another process

    I deleted the /var/lib/dpkg/lock file and retried installing the application, but to no avail. If I run the preinst script separately, like any other shell script, it runs without any issue: all the deb files are installed properly. So the issue only occurs when the preinst script is run automatically by the dpkg -i command. I'm lost trying to determine the root cause. If anyone can shed some light on what the real issue might be, their help will be greatly appreciated. Thank you. Ashwin
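    A hedged explanation: dpkg holds the status-database lock for its whole run, including while preinst executes, so a nested dpkg -i can never acquire it. The usual fix is declaring the packages as dependencies and letting the package manager resolve them; a sketch of a hypothetical DEBIAN/control:

        Package: myapp
        Version: 1.0-1
        Architecture: amd64
        Maintainer: Ashwin D <ashwin@example.com>
        Depends: libfoo (>= 1.2), libbar
        Description: Example application
         Dependencies are resolved by apt/dpkg, not installed from preinst.

    Installing with dpkg -i followed by apt-get -f install then pulls in the missing dependencies.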

    Read the article

  • How can I discourage the use of Access?

    - by Greg Buehler
    Let's pretend that a very large company (revenue numbers with more than 8 figures) is looking to do a refresh on a software system, particularly the dashboard used by employees. This system was originally put together in the early 1990s to handle inventory tracking and storage across a variety of facilities (10+). Since this large company is now in the process of implementing some of these inventory processes with SAP, they are in need of a major refresh.

    The existing system: A Microsoft Access project performs dashboard duties. Unique shipping/receiving configurations at different facilities require unique forms and queries within the Access project. It uses 3rd-party libraries referenced by Access to directly interface with a control system (read: motors, conveyors, and counters). There are individual SQL Server 2000 instances (some traces of pre-update SQL Server 6.0 documents) at each facility.

    The issue: This system started as a home-brewed inventory tracking scheme with a single internal sponsor who is still in charge of the technical direction. The original sponsor prescribed the deliverables that are being called for in the current RFP. The RFP describes a system based around a single Access project. Any suggestions that Access is ill-suited for a project of this scope are shot down under the reasoning that "it works for the scope now". Are there any case studies, notices, or statements that can be used to dissuade this potential customer from repeating their mistake? Does Microsoft make any statements directly about when it is highly recommended to ditch Access?

    Read the article

  • SQL Server backup/restore error: The Media Family on Device is Incorrectly Formed.

    - by Chris
    Basically, I'm having this issue: http://www.sqlcoffee.com/Troubleshooting047.htm What I'm doing is running a script I found online (http://pastebin.com/3n0ZfybL) to do a full backup, then rar'ing up the file and moving it to my computer. The CRC of the backup file inside the rar is correct on both computers, so there is no problem with data being corrupted when I transfer it. But then I go and try to restore the database on my dev computer here and I get the errors "sql server cannot process this media family" ... "msg 3013". Why is this happening? I'd test out the backup on the server I'm getting it from, but it's a production server.
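    A hedged diagnostic sketch (the path is a placeholder): ask SQL Server what it can read from the file before restoring; a backup taken on a newer SQL Server version restored onto an older one fails with this same media-family error.

        RESTORE HEADERONLY FROM DISK = N'C:\backups\mydb.bak';
        RESTORE VERIFYONLY FROM DISK = N'C:\backups\mydb.bak';
        -- compare the source and destination builds; restoring downward
        -- (e.g. a 2008 backup onto 2005) is not supported
        SELECT @@VERSION;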

    Read the article

  • Daily, weekly, and monthly DB backups with logrotate?

    - by benjisail
    I am currently keeping daily backups of my database by doing a daily mysqldump and using logrotate to keep the last 7 days of dumps. I would like to improve this backup process to keep 7 daily backups, 3 weekly backups, and 12 monthly backups. I found this article, which explains how to do this with logrotate: http://www.hotcoding.com/os/sysadmin/35751.html However, I am using the dateext logrotate option to name my backup files, so I cannot use that solution. How can I do daily, weekly, and monthly backups with logrotate and the dateext option?
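    One hedged approach (paths are placeholders): have the nightly mysqldump cron job write the fresh dump to three separate paths, and give each path its own stanza, so dateext keeps working while each copy rotates on its own schedule:

        # /etc/logrotate.d/db-backups (hypothetical sketch)
        /var/backups/db/daily/dump.sql.gz {
            daily
            rotate 7
            dateext
        }
        /var/backups/db/weekly/dump.sql.gz {
            weekly
            rotate 3
            dateext
        }
        /var/backups/db/monthly/dump.sql.gz {
            monthly
            rotate 12
            dateext
        }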

    Read the article

  • Zoom Layer centered on a Sprite

    - by clops
    I am in the process of developing a small game where a spaceship travels through a layer (doh!). In some situations the spaceship comes close to an enemy ship, and the whole layer is zoomed in on the two, with the zoom level depending on the distance between the ship and the enemy. All of this works fine. The main question, however, is: how do I keep the zoom centered on the midpoint between the two spaceships and make sure that the two are not off-screen? Currently I control the zooming in the GameLayer object through the update method; here is the code (there is no layer repositioning here yet):

        -(void) prepareLayerZoomBetweenSpaceship {
            CGPoint mainSpaceShipPosition = [mainSpaceShip position];
            CGPoint enemySpaceShipPosition = [enemySpaceShip position];

            float distance = powf(mainSpaceShipPosition.x - enemySpaceShipPosition.x, 2) +
                             powf(mainSpaceShipPosition.y - enemySpaceShipPosition.y, 2);
            distance = sqrtf(distance);

            /* Distance > 250 --> no zoom
               Distance < 100 --> maximum zoom */
            float myZoomLevel = 0.5f;
            if (distance < 100) {
                // maximum zoom in
                myZoomLevel = 1.0f;
            } else if (distance > 250) {
                myZoomLevel = 0.5f;
            } else {
                myZoomLevel = 1.0f - (distance - 100) * 0.0033f;
            }
            [self zoomTo:myZoomLevel];
        }

        -(void) zoomTo:(float)zoom {
            if (zoom > 1) {
                zoom = 1;
            }
            // Set the scale.
            if (self.scale != zoom) {
                self.scale = zoom;
            }
        }

    Basically my question is: how do I zoom the layer and center it exactly between the two ships? I guess this is like a pinch zoom with two fingers!
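    A hedged sketch of the missing repositioning step (cocos2d; assumes the layer's anchor point is at the origin): scale first, then shift the layer so the zoomed midpoint lands on the screen center:

        // midpoint between the two ships, in layer coordinates
        CGPoint mid = ccpMult(ccpAdd(mainSpaceShipPosition, enemySpaceShipPosition), 0.5f);
        CGSize screen = [[CCDirector sharedDirector] winSize];
        CGPoint screenCenter = ccp(screen.width / 2.0f, screen.height / 2.0f);
        // after scaling, a layer point p appears at p * scale + self.position
        self.position = ccpSub(screenCenter, ccpMult(mid, self.scale));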

    Read the article

  • Languages on a resume: Is it better to put "C/C++" or "C, C++"?

    - by Kevin
    I'm graduating in a couple of weeks, and my resume (as expected) lists the languages that I've had experience with. Previously I've put "C/C++"; however, back then I didn't have as much experience with these two languages as I do now. Now that I've formally learned these two languages, it has become evident to me (and anyone who really knows these languages) that they are similar, and completely dissimilar, at the same time. Sure, most C code is compilable C++ code, but syntax and incorporation of library functions are pretty much where the similarities end. In most non-trivial problems, chances are that the desirable C++ solution will be different from the desirable C solution. My question: will recruiters take note or care about whether you put "C/C++" as opposed to "C, C++"? Will they assume a lack of knowledge of the workings of either because of the inclusion of the first form, or perhaps see the inclusion of the second form as a potential "resume beefer" (listing them as two languages instead of one)? Furthermore, for jobs that you've applied to that were particularly interested in these two languages, did the interview process include questions about the differences between C programming and C++ programming (so, about actual programming techniques, not only the extra paradigms in the latter)?

    Read the article

  • Reference Data Management

    - by rahulkamath
    Oracle Data Relationship Management (DRM) has always been extremely powerful as an Enterprise MDM solution that can help manage changes to master data in a way that influences enterprise structure, whether it be mastering the chart of accounts to
    enable financial transformation, or revamping organization structures to drive business transformation and operational efficiencies, or mastering sales territories in light of rapid-fire acquisitions that require frequent sales territory refinement, equitable distribution of leads and accounts to salespersons, and alignment of budget/forecast with results to optimize sales coverage. Increasingly, DRM is also being utilized by Oracle customers for reference data management, an emerging solution space that deserves some explanation.

    What is reference data? Reference data is a close cousin of master data. While master data may be more rapidly changing, requires consensus building across stakeholders and lends structure to business transactions, reference data is simpler, more slowly changing, but has semantic content that is used to categorize or group other information assets - including master data - and give them contextual value. The following table contains an illustrative list of examples of reference data by type. Reference data types may include types and codes, business taxonomies, complex relationships & cross-domain mappings, or standards.

        | Types & Codes | Taxonomies | Relationships / Mappings | Standards |
        | Transaction Codes | Industry Classification Categories and Codes, e.g., North America Industry Classification System (NAICS) | Product / Segment; Product / Geo | Calendars (e.g., Gregorian, Fiscal, Manufacturing, Retail, ISO 8601) |
        | Lookup Tables (e.g., Gender, Marital Status, etc.) | Product Categories | City -> State -> Postal Codes | Currency Codes (e.g., ISO) |
        | Status Codes | Sales Territories (e.g., Geo, Industry Verticals, Named Accounts, Federal/State/Local/Defense) | Customer / Market Segment; Business Unit / Channel | Country Codes (e.g., ISO 3166, UN) |
        | Role Codes | Market Segments | Country Codes / Currency Codes / Financial Accounts | Date/Time, Time Zones (e.g., ISO 8601) |
        | Domain Values | Universal Standard Products and Services Classification (UNSPSC), eCl@ss | International Classification of Diseases (ICD), e.g., ICD-9 -> ICD-10 mappings | Tax Rates |

    Why manage reference data? Reference data carries contextual value and meaning, and therefore its use can drive business logic that helps execute a business process, create a desired application behavior or provide meaningful segmentation to analyze transaction data. Further, mapping reference data often requires human judgment.

    Sample Use Cases of Reference Data Management

    Healthcare: Diagnostic Codes. The reference data challenges in the healthcare industry offer a case in point. Part of being HIPAA compliant requires medical practitioners to transition diagnosis codes from ICD-9 to ICD-10, a medical coding scheme used to classify diseases, signs and symptoms, causes, etc. The transition to ICD-10 has a significant impact on business processes, procedures, contracts, and IT systems. Since the two code sets, ICD-9 and ICD-10, offer diagnosis codes of very different levels of granularity, human judgment is required to map ICD-9 codes to ICD-10. The process requires collaboration and consensus building among stakeholders, much in the same way as does master data management. Moreover, to build reports to understand utilization, frequency and quality of diagnoses, medical practitioners may need to "cross-walk" mappings - either forward to ICD-10 or backwards to ICD-9 - depending upon the reporting time horizon.
    Spend Management: Product, Service & Supplier Codes. Similarly, as an enterprise looks to rationalize suppliers and leverage their spend, conforming supplier codes, as well as product and service codes, requires supporting multiple classification schemes that may include industry standards (e.g., UNSPSC, eCl@ss) or enterprise taxonomies. Aberdeen Group estimates that 90% of companies rely on spreadsheets and manual reviews to aggregate, classify and analyze spend data, and that data management activities account for 12-15% of the sourcing cycle and consume 30-50% of a commodity manager's time. Creating a common map across the extended enterprise to rationalize codes across procurement, accounts payable, general ledger, credit card, procurement card (P-card) as well as ACH and bank systems can cut sourcing costs, improve compliance, lower inventory stock, and free up talent to focus on value-added tasks.

    Specialty Finance: Point of Sale Transaction Codes and Product Codes. In the specialty finance industry, enterprises are confronted with usury laws - governed at the state and local level - that regulate financial product innovation as it relates to consumer loans, check cashing and pawn lending. To comply, it is important to demonstrate that transactions booked at the point of sale are posted against valid product codes that were on offer at the time of booking the sale. Since new products are being released in a steady stream, it is important to ensure timely and accurate mapping of point-of-sale transaction codes with the appropriate product and GL codes to comply with the changing regulations.

    Multi-National Companies: Industry Classification Schemes. As companies grow and expand across geographies, a typical challenge they encounter with reference data is reconciling the various versions of industry classification schemes in use across nations. While the United States, Mexico and Canada conform to the North American Industry Classification System (NAICS) standard, European Union countries use different variants of the NACE industry classification scheme. Multi-national companies must manage the individual national NACE schemes and reconcile the differences across countries. Enterprises must invest in a reference data change management application to address the challenge of distributing reference data changes to downstream applications and assessing which applications are impacted by a given change.

    Read the article
