Search Results

Search found 1068 results on 43 pages for 'degree'.

Page 32 of 43

  • High-Powered Sites for low Cost

    - by HighAltitudeCoder
    Ahh, I am experiencing the intimidation of my very first post - visible to the whole world. Ok, here goes.   This first post is nothing exceptional.  It is simply a recommendation based (fittingly, I suppose) upon the job search you may be gearing up for.  I find myself in this very situation right now.  And, I will take my own recommendation after posting this entry. Job-Seekers: To the left you will notice two links under "Recommended Learning".  I have found these links to be invaluable when it comes to re-tooling, re-familiarizing, or otherwise re-sharpening my skills when looking for that next job. Often you will find job postings with text like this, usually placed after a laborious list of qualifications indicating the company's desire to hire candidates who know what they are doing: "...Looking for a candidate who can hit the ground running...".  The interesting thing to me is that I've encountered many individuals who, after I've spoken and worked with them for some time, have proven perfectly capable of hitting the ground running - and FAST.  But what if they speed off in the wrong direction? The next time you spearhead a major task in your job, ask yourself: Am I headed in the wrong direction?  There are many ways to do this.  In fact, I've found in this new field there are more tempting ways to steer your project in the wrong direction than there are good ones.  I don't want to suggest that every one of my posts will fall into the "right direction" category; however, I do think a healthy dose of introspection about the pros and cons will always be beneficial before you set off. That said, allow me to expound on the previously mentioned links. These websites are invaluable.  They demonstrate the capabilities of existing as well as new and upcoming tools available in several IDEs.  I've viewed many tutorials on LearnVisualStudio.NET, and only one or two so far on TrainingSpot, but I've been delighted by their simplicity and straightforward approach to proper usage of the particular tool or concept being discussed.  They have not (so far in my experience) demonstrated ways of using the tools that become cumbersome, impractical, or error-prone. Each website has step-by-step videos that can be paused, replayed, and, most importantly, they are done in real time.  As the author is typing, the viewer gets to experience the coding experience from a first-person perspective, including syntax errors, unexpected behaviors, IDE setup idiosyncrasies, everything.  A subtle value I've gained from these videos is that a certain degree of confusion and introspection is normal when working with new tools and exploring new paths.  They (as well as your own experience) are not to be feared, but enjoyed.  I highly recommend them. Good work, guys!

    Read the article

  • Why is Java the lingua franca at so many institutions?

    - by Billy ONeal
    EDIT: This question at first seems to be bashing Java, and I guess at this point it is a bit. However, the bigger point I am trying to make is why any one single language is chosen as the one end-all-be-all solution to all problems. Java happens to be the one that's used, so that's the one I had to beat on here, but I'm not intentionally ripping Java a new one :) I don't like Java in most academic settings. I'm not saying the language itself is bad -- it has several extremely desirable aspects, most importantly the ability to run without recompilation on almost any platform. Nothing wrong with using the language for Your Next App ^TM. (Not something I would personally do, but that's more because I have less experience with it, rather than its design being poor.) I think it is a waste that high-level CS courses are taught using Java as a language. Too many of my co-students cannot program worth a damn, because they don't know how to work in a non-garbage-collected world. They don't fundamentally understand the machines they are programming for. When someone can work outside of a garbage-collected world, they can work inside of one, but not vice versa. GC is a tool, not a crutch. But the way it is used to teach computer science students is as a crutch. Computer science should not teach an entire suite of courses tailored to a single language. Students leave with the idea that all good design is idiomatic Java design, and that Object Oriented Design is the ONE TRUE WAY THAT IS THE ONLY WAY THINGS CAN BE DONE. Other languages, at least one of them not being a garbage-collected language, should be used in teaching, in order to give the graduate a better understanding of the machines. It is an embarrassment that somebody with a PhD in CS from a respected institution cannot program their way out of a paper bag. What's worse is that when I talk to those CS professors who actually do understand how things operate, they share feelings like this, that we're doing a disservice to our students by doing everything in Java. (Note that the above would be the same if I replaced it with any other language; generally, using a single language is the problem, not Java itself.) In total, I feel I can no longer respect any kind of degree at all -- when I can't see those around me able to program their way out of fizzbuzz problems. Why/how did it get to be this way?

    Read the article

  • Is it smart to take a year off from school to get experience?

    - by user134147
    Firstly, I apologize if this question is not appropriate for the site, but I've seen other similar (though slightly deviant) questions on this site before and I know the people here are the most qualified to answer my question. Anyway, I'm currently between my sophomore and junior years at a 4-year university, and after a bit of deliberation I've decided on computer science as a major (a BA, by the way, as a BS would require me to stay at least an extra year the way our program is set up). I've been interested in programming for a few months now and I've developed a passion for it in a very short time. I began learning C++, migrating to Java recently when I learned my school focuses on this language. Now, I should mention that the concept of higher education has never sat well with me, so part of my motivation for wanting to take time off is to truly challenge myself and see what I can accomplish when I actually try at something. The autodidact in me finds it difficult to focus on my passions while trying to keep a high GPA in unrelated classes. However, I understand the times we live in and therefore would plan to complete my degree after this year. So my question is whether or not the skills I learn in a year off from college could justify the time off from school. Unfortunately, I don't believe I know enough yet to gain any professional experience (internship, etc.), so I would mostly focus my time on learning Java and another language, possibly WordPress (to gain an understanding of web programming concepts, as I have not yet decided what field I want to get into, and to make some money to fund my off-year), and to delve into security concepts, which also interest me. I'm hoping I could work on projects, such as simple applications or contributions to open source software, during this time to enhance my resume once I do finish school, so I can find a job out of college more easily. I do not want to be the new hire who knows nothing beyond the concepts of his Java textbooks. Does anyone have any input about these thoughts of mine, or any ideas for where I should focus my studies or how high I might set the bar for my work? Thanks a lot everyone!

    Read the article

  • nginx 301 redirect to subfolder on primary domain

    - by 187j3x1
    Hello there, sorry for my poor English. I just set up WordPress on my VPS; so far it's the only item on my site. Therefore, for SEO reasons, I think it is better to redirect the whole primary domain to the blog folder. The primary domain is example.com and WordPress is at example.com/blog. What I want is to rewrite www.example.com and example.com to example.com/blog. I googled, got some scripts, made some changes and pasted them into the nginx config file. Here it is: #301 redirect www to non-www server { server_name www.example.com; location = / { rewrite ^/(.*) http://example.com/$1 permanent; } } #301 non-www to subfolder server { server_name example.com; location = / { rewrite ^/(.*) http://example.com/blog$1 permanent; } } It works to some degree: it successfully redirects to example.com/blog. The only problem is that I get a 404 Not Found error. If I only make nginx redirect www to example.com/blog, then I can access the blog page. I know there is something wrong in the non-www-to-subfolder script, but I don't know how to fix it :(
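
    A possible reading of the 404 (a sketch, not a confirmed diagnosis): the redirect itself fires, but nothing in the posted config actually serves /blog once the browser arrives there, and the rewrite also drops the trailing slash. One way the two server blocks are often structured instead, assuming WordPress lives in /var/www/example.com/blog and PHP-FPM listens on 127.0.0.1:9000 (both placeholders to adjust):

    ```nginx
    server {
        server_name www.example.com;
        # send all www traffic to the bare domain, keeping the request path
        return 301 http://example.com$request_uri;
    }

    server {
        server_name example.com;
        root /var/www/example.com;
        index index.php;

        # only the bare root is redirected, so /blog requests are left alone
        location = / {
            return 301 http://example.com/blog/;
        }

        # hand /blog requests to WordPress
        location /blog/ {
            try_files $uri $uri/ /blog/index.php?$args;
        }

        location ~ \.php$ {
            include fastcgi_params;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            fastcgi_pass 127.0.0.1:9000;
        }
    }
    ```

    The key differences from the original are the trailing slash on /blog/ and an explicit location that actually serves the blog, so the redirect target resolves instead of falling through to a 404.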

    Read the article

  • Getting Server 2008 R2 to ignore all traffic from Internet-facing NIC, leaving it to a VM

    - by Wolvenmoon
    I got Server 2008 R2 via DreamSpark and would like to start learning on it. I don't have much option but to put it on a system sitting between the Internet and my home LAN, due to electricity bills and the fact that 3 computers in an 11x11 space in 102 degree weather is pretty stygian. Currently I use a ClearOS gateway to manage everything; what I'd like to do is take my Server 2008 R2 box, which has two NICs, and drop it at the head of my network. I'd want Server 2008 R2 to ignore all traffic on the external-facing NIC and pass it to a virtual ClearOS gateway, and to put all its Internet traffic through its other NIC - which will face the rest of my network and be the default gateway for it. The theory is to keep the potentially vulnerable Server 2008 R2 install as tucked behind a Linux box as possible, without sacrificing too much performance. This is a home network that occasionally hosts dedicated game servers and voice chat servers, so most malicious activity is in the form of drive-by, non-targeted attacks; however, I don't trust Windows Server because I don't know the OS well enough, yet. So, three questions: How do I do this? Am I going to be reasonably more secure doing this than if I just let the Server 2008 R2 rig handle all the network traffic and DHCP (not an option)? And should I virtualize the Server 2008 R2 rig instead, and if so, in what? (Core 2 Duo E6600 w/ 5 GB usable RAM)

    Read the article

  • 32 cores (each a physical core) at 2.2 GHz or 12 cores (6 physical cores) at 3.0 GHz?

    - by Tejaswi Rana
    I am working on a multithreaded application (a Forex trading app built in C#) and had the client upgrade from a 12-core 3.0 GHz machine (Intel) to a 32-core 2.2 GHz machine (AMD). The PassMark benchmark results were significantly higher when using multiple cores for integer, floating-point and other calculations, while for a single-core calculation it was a bit slower than the pack (others being compared against had a similar config to the 12-core one). Oh, and it also comes with 64 GB RAM (4 times as much as the other machine) and a much faster SSD. So after configuring and running the application on that machine, not only did it not perform as well, it was significantly slower. We're talking about 30 seconds to 1 minute slower on an app that usually completes processing within 5-20 seconds. The application uses MaxDegreeOfParallelism (TPL), which I've tried setting to the number of cores and also to half of that. I've also tried running single-threaded and without setting any limits on parallel threading. While it may be that the hardware has some issues, I am wondering if the CPU clock speed is the issue. I can overclock to 3.0 GHz. But is that even a good idea? Server Info - AMD http://www.passmark.com/forum/showthread.php?4013-AMD-Dual-6272-performance-is-60-lower-than-benchmarks Seems that benchmark was wrong to start with - officially. Intel i7 3930k OS (same in both): Windows 7 Professional 64-bit
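
    For reference, the TPL knob the question refers to is normally set through ParallelOptions.MaxDegreeOfParallelism; a minimal sketch of how it is typically capped (the per-item work here is a placeholder, not the actual trading logic):

    ```csharp
    using System;
    using System.Linq;
    using System.Threading.Tasks;

    class ParallelismSketch
    {
        static void Main()
        {
            // Environment.ProcessorCount would report 32 on the AMD box, 12 on the Intel one.
            var options = new ParallelOptions
            {
                MaxDegreeOfParallelism = Environment.ProcessorCount / 2
            };

            int[] workItems = Enumerable.Range(0, 1000000).ToArray();

            Parallel.ForEach(workItems, options, item =>
            {
                // Placeholder per-item work; the real app would price a tick, etc.
                double ignored = Math.Sqrt(item) * Math.Sin(item);
            });
        }
    }
    ```

    Whether lowering the cap helps depends on where the time goes: if the per-item work is short or shares state, 32 slower cores can easily lose to 12 faster ones because synchronization and per-core clock speed dominate, which matches the symptom described.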

    Read the article

  • How do hdparm's -S and -B options interact?

    - by user697683
    These two options seem confusing. For example: according to the man page -B 254 "does not permit spin-down". However, testing with -B 254 -S 1 the drive does spin down after 5 seconds. -B Query/set Advanced Power Management feature, if the drive supports it. A low value means aggressive power management and a high value means better performance. Possible settings range from values 1 through 127 (which permit spin-down), and values 128 through 254 (which do not permit spin-down). The highest degree of power management is attained with a setting of 1, and the highest I/O performance with a setting of 254. A value of 255 tells hdparm to disable Advanced Power Management altogether on the drive (not all drives support disabling it, but most do). -S Put the drive into idle (low-power) mode, and also set the standby (spindown) timeout for the drive. This timeout value is used by the drive to determine how long to wait (with no disk activity) before turning off the spindle motor to save power. Under such circumstances, the drive may take as long as 30 seconds to respond to a subsequent disk access, though most drives are much quicker. The encoding of the timeout value is somewhat peculiar. A value of zero means "timeouts are disabled": the device will not automatically enter standby mode. Values from 1 to 240 specify multiples of 5 seconds, yielding timeouts from 5 seconds to 20 minutes. Values from 241 to 251 specify from 1 to 11 units of 30 minutes, yielding timeouts from 30 minutes to 5.5 hours. A value of 252 signifies a timeout of 21 minutes. A value of 253 sets a vendor-defined timeout period between 8 and 12 hours, and the value 254 is reserved. 255 is interpreted as 21 minutes plus 15 seconds. Note that some older drives may have very different interpretations of these values.
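
    As a concrete illustration of how the two flags combine on the command line (the device name is a placeholder, and the comments just restate the man-page encodings quoted above):

    ```sh
    # Query the current APM level, if the drive reports it
    hdparm -B /dev/sdX

    # APM 127 is the most aggressive level that still permits spin-down;
    # -S 120 means 120 * 5 s = a 10-minute standby timeout.
    hdparm -B 127 -S 120 /dev/sdX

    # APM 254 nominally forbids spin-down, yet (as observed in the question)
    # some drives will still honour a -S timeout such as -S 1 (5 seconds).
    hdparm -B 254 -S 1 /dev/sdX
    ```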

    Read the article

  • finding the best network latency between two countries

    - by Yoav Aner
    I know there are many tools to test for bandwidth and latency, but they all rely on having at least one host from which you can run those tests. I wonder whether there's an online source or some other way to guesstimate the latency or speed between two countries (in general). For example, would a customer in Japan get lower latency if the server is located in Singapore or Australia? Is a user in India likely to get higher download speed from a server in the UK or in the US? Are there any online resources or some clever ways to answer those questions with a reasonable degree of accuracy? [UPDATE]: Thanks for the great suggestions from Raffael Luthiger. I didn't know about those looking glass servers. The submarine cable maps were also really cool to discover (thanks to Jesper Mortensen). It also seems really wise to ask network professionals in the area for their experience, but obviously I don't have access to those. At least some of them are on SF :) However, I'm still a little unsure how to combine those resources to give me some measurements. This is the information I have: two countries (A, B). I do have IP addresses of customers in country A (I can obtain those from the web server log files, for example). Presumably I can find some looking glass servers in country B and run a trace to those IPs. What are the best measurements to use? Are there any scripts that help automate at least some of this process?
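
    As a rough sketch of the kind of automation the last question asks about, assuming shell access on a host in country B (a looking glass server won't run your scripts, but a cheap VPS there would) and a standard combined-format access.log copied over from country A's web server:

    ```sh
    #!/bin/sh
    # Pull the 50 most recently seen unique client IPs out of the access log.
    awk '{print $1}' access.log | sort -u | head -50 > client_ips.txt

    while read ip; do
        # 5 probes, summary only; many consumer IPs drop ICMP, so "no-reply"
        # means "no data", not "high latency".
        avg=$(ping -c 5 -q "$ip" 2>/dev/null | awk -F'/' '/^(rtt|round-trip)/ {print $5}')
        echo "$ip ${avg:-no-reply} ms"
    done < client_ips.txt
    ```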

    Read the article

  • Issues with VPN functionality

    - by Xorandor
    I've been working on setting up VPN connectivity to our office location. We bought a Cisco WRV210, which has a built-in VPN server. Cisco has some software called QuickVPN, which is not as quick and easy as I had thought. I've had mixed experiences with connecting on different machines. Instead I configured an IPSec VPN tunnel following a guide from TheGreenBow here: http://www.thegreenbow.com/doc/tgbvpn_cg_linksys_wrv200_en.pdf I followed their instructions and tried out an evaluation of their software, and the VPN connection seems to be working OK. I'm able to RDP to a machine on the network (using the IP address, not the machine name) and ping the router, etc. What I'm trying to solve are two things: First, it's not like I'm "really" on the network. Or at least I'm restricted to some degree when going through the VPN. I can't access a machine on the network using its machine name, only its IP. I can't ping a machine, but I can ping the router just fine. Could it be that something is not set up properly? If so, I can of course supply additional information. Second, when I log onto the VPN, I would really like my outgoing connection to go through the internet connection of the remote location. Basically, if I connect to the VPN I want my outgoing IP to be the remote location's (needed for some IP restrictions on some of our servers). At a previous workplace it worked like this when we connected to our office VPN over PPTP and the built-in Windows VPN client. I'm not a huge expert on the topic, so any hints will be appreciated.

    Read the article

  • Maintaining "Portability" Between Linux and Windows 7

    - by lokheart
    I am using the following approach on my office's Windows 7 machine to maintain my "portability" when disaster strikes and I need to switch computers while I have no luxury of time for reinstalling all my programs on the new PC. The majority of programs I use are portable, mostly from portableapps.com, like Notepad++, GIMP, even R; I extract them and store them in a folder in My Documents, in a structure similar to the default PortableApps installation when they are installed to a thumbdrive. Only a few pieces of software have no portable version available, and I install those as usual. All of my working files are stored in a folder in My Documents. I regularly back them all up using SyncBack, because this program can keep versioning of my backups, and the backup is stored on a portable drive. The day I need to switch my computer, the operation is relatively simple for me: I just move the two folders mentioned above into the My Documents folder of the new PC, install those few "non-portable" programs on it, and that is almost it; some minor hiccups can be solved by reinstalling the portable app into the drive. Overall it is a smooth process. I would like to maintain the same degree of "portability" on my home Linux desktop (Ubuntu or Mint, I'm still deciding), that is, if my Linux install crashes and I need to reinstall it, all I need to do is move the two folders back to the new Linux install, and most of my work will be almost ready to be worked on again. But I don't know how to find a Linux alternative to PortableApps. Being new to Linux, can anyone tell me whether this is possible in Linux?

    Read the article

  • Finding the current user authenticated by basic auth (Apache)

    - by jtd
    When you log in through a basic auth page, is the username you authenticated as stored anywhere (on the server or client machine), maybe in an environment variable? Background: I have a common web administration page for an e-mail server and I'd like to know who is doing what. When a user successfully logs in via basic auth, I somehow want to be able to identify them and log their actions, so each time a request is submitted, I can write to a log file. The basic format would be: $username ran a $function against $useraccount; so if a user changed someone's permissions, e.g.: Admin-Bob ran a permission change against User-Scott. So if errors occur, I can easily trace back in the log file what actions led to the problem. I tried checking the %ENV hash to no avail; any ideas? I don't really want to get into PHP-like sessions, because that would mean scrapping my basic auth, which gives me a fine degree of control already. If I had to code something with sessions, I'd need to implement a system to block users after maximum tries and so on, which I don't really want to code. I think this is better geared towards Server Fault because it pertains to Apache more so than the programming language. Sessions can be done in a myriad of languages.
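
    One avenue worth checking (a sketch, not a guaranteed fix for the %ENV issue): Apache only exposes the authenticated user as REMOTE_USER to scripts that run inside the protected realm, and mod_log_config can also record it per request via %u. Paths and realm names below are placeholders:

    ```apache
    # Record the basic-auth user (%u) with every request to the admin area.
    LogFormat "%h %u %t \"%r\" %>s %b" adminlog
    CustomLog /var/log/apache2/admin_actions.log adminlog

    <Directory "/var/www/mailadmin">
        AuthType Basic
        AuthName "Mail Admin"
        AuthUserFile /etc/apache2/htpasswd.mailadmin
        Require valid-user
    </Directory>
    ```

    Inside a CGI script served from that directory, the same value shows up as $ENV{REMOTE_USER} (Perl) once the request has actually passed the auth check; requests outside the protected area won't have it set, which is one common reason %ENV looks empty.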

    Read the article

  • Memory overcommitment on VMware ESXi 5.0

    - by Tibor
    I would like to understand better the possibilities of VMware ESXi memory overcommitment. I've read this paper from VMware, so I am familiar with the general concepts, such as hypervisor swapping, memory ballooning and page sharing. It seems that a combination of these techniques allows for quite a large degree of overcommitment. However, I am not sure. I am deploying a virtual test lab comprising 4 identical sets of virtual servers and workstations and a couple of virtual router instances. Overall, I expect to be running around 20 virtual machines with Windows XP, Windows 7 and Ubuntu for workstation hosts, as well as CentOS and Windows 2008 Server instances for servers. The problem, however, is that the host machine only has 12 GB of RAM and I don't have the option to stuff in some more. I would like to know what is the best option for configuring the hosts in order to achieve reasonable performance within the constraints. I have these two options: Allocate as little RAM as possible to each virtual machine. Allocate an extraordinary amount (such as 4 GB per instance) and let the balloon driver do the rest. Something else? Which would work better? The machines will mostly be idle, so I don't have any major performance expectations, but they should run reasonably smoothly nevertheless.

    Read the article

  • What Media Extender / Centre Setup should I use?

    - by Bryn Hird
    I have installed Cat6 throughout the house, which I use for telephony and network. In my cellar I have a NAS server and a gigabit switch, and I want to install a media centre to stream my videos, music, photos and live TV (coax from the aerial to the cellar) over the Cat6. Yeah, I know I can get stuff on the internet, but the shared experience of watching TV as a family as it happens is a big plus for live TV. I'm aiming for 1080p. I want different users to be able to watch different channels. Max users = 4. I've played a little with Windows Media Centre; it works fine with live TV. Likewise I have XBMC up and running with live TV. The issue I have is what to put near the TV. I'd like a consistent user interface (grandma and the other technophobes in the house are continually pestering me on how to use different TVs, change channels, inputs, etc.), so a key part of this for me is to make the user experience the same and simple, i.e. no keyboards / PCs hanging around the TV. I've just bought a Linksys DMA 2200 to test Windows Media Centre, but obviously off eBay as they're a dying breed. And with Windows Media Centre removed from Microsoft's plans, such devices will get rarer. And as for 1080p, I think I can forget it with that setup. I have tested an Xbox 360; it also works, but ditto on Microsoft's plans for WMC. I was thinking of a WD TV Live to test the XBMC setup. Now to the question. Any advice on media centre / extender setups that will do the job as above and have some degree of future-proofing (building my own with my Raspberry Pi is a last resort)? I'd also like to understand the standards involved in the future-proofing if anyone knows (DLNA, RVU, etc.).

    Read the article

  • How can I totally flatten a PDF in Mac OS on the command line?

    - by Matthew Leingang
    I use Mac OS X Snow Leopard. I have a PDF with form fields, annotations, and stamps on it. I would like to freeze (or "flatten") that PDF so that the form fields can't be changed and the annotations/stamps are no longer editable. Since I actually have many of these PDFs, I want to do this automatically on the command line. Some things I've tried/considered, with their degree of success: Open in Preview and Print to File. This creates a totally flat PDF without changing the file size. The only way to automate seems to be to write a kludgy UI-based AppleScript, though, which I've been trying to avoid. Open in Acrobat Pro and use a JavaScript function to flatten. Again, not sure how to automate this on the command line. Use pdftk with the flatten option. But this only flattens form fields, not stamps and other annotations. Use cupsfilter which can create PDF from many file formats. Like pdftk this flattened only the form fields. Use cups-pdf to hook into the Mac's printserver and save a PDF file instead of print. I used the macports version. The resulting file is flat but huge. I tried this on an 8MB file; the flattened PDF was 358MB! Perhaps this can be combined with a ghostscript call as in Ubuntu Tip:Howto reduce PDF file size from command line. Any other suggestions would be appreciated.
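
    On the last idea (combining cups-pdf output with a Ghostscript call to shrink it back down), a minimal sketch of the re-distilling step; file names are placeholders and the quality/size trade-off is chosen with -dPDFSETTINGS:

    ```sh
    # Re-distill the bloated cups-pdf output to cut its size back down.
    gs -dBATCH -dNOPAUSE -sDEVICE=pdfwrite \
       -dCompatibilityLevel=1.4 -dPDFSETTINGS=/printer \
       -sOutputFile=flattened-small.pdf flattened-from-cups.pdf
    ```

    Looping that over a folder of PDFs is then a one-line shell for loop, which keeps the whole pipeline on the command line as the question asks.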

    Read the article

  • How intrusive is using VPN?

    - by Slade
    My company lets us work from home sometimes using VPN (during weather emergencies and stuff). When logging in a big window comes up that says the network is private and for employees only and that there's no right to privacy while using VPN. It makes sense that they don't want people poking around their network but I wonder if the company can use the connection to look around my computer while I'm connected. I'm not entirely computer-illiterate but I'm not a networks person at all so the technical documents I've found don't help me. Is that possible, and if so to what degree? UPDATE Thanks Mark. The funneling thing is what I was really asking about. Mostly I was worried that I would already have some IM conversation open or log into eBay forgetting that the VPN was open and that my company IT people would see it or that they would log my eBay password. Thanks again. ANOTHER UPDATE What if my son wants to play online poker or Warcraft etcetera while I have VPN on to work? Can my company think I'm the one playing if I am not typing often?

    Read the article

  • Recommendations for good Unix MTA / groupware solutions? [closed]

    - by Jez
    Possible Duplicate: Exchange server replacement that runs on Linux I'm setting up a Debian server, and one of the things I need on it is an MTA. I don't want to use something like Exim or Postfix because I want something that ties in SMTP, POP3, and IMAP all in one (a la Microsoft Exchange). Most MTAs also seem to be hellishly difficult to configure. Try and read the Exim documentation; you could do a university degree on it (I'm not kidding). When you can get an HTTP server like Cherokee which is easy to configure and has a nice web interface, do MTAs or groupware solutions need to be that hard? I'm aware that some people think "the Unix way" is to have lots of different interacting pieces of software (like maybe an SMTP MTA, POP3 service, webmail service, and overarching manager to tie them all together), but I think this is a situation where that just makes things a lot harder to deal with and one large software suite fits in much more nicely. So, I'm looking for good open source software suites that will run on Debian that: Combine (at least) SMTP, POP3, and IMAP Are easy(ish) to configure Have a nice configuration web interface or GUI Are not defunct projects I don't mind if it's groupware and offers calendaring too, but I would only be using the e-mail functionality for now. Another nice-to-have would be built-in webmail (if we're combining a bunch of functionality, why not?) Note however that I do NOT need Outlook support. I am not really looking for an "Exchange replacement drop-in". The suites I've found so far that seem to match the above criteria (and have appropriate licenses) are Citadel, Kolab, and Zimbra. I'd appreciate anyone who has experience with any of these giving me the pros and cons of them, such as how easy they are to configure and what their performance is like. I'd also appreciate any other suggestions for solutions that fulfil my criteria that I may have missed out.

    Read the article

  • How does everyone set up AWS for PHP with a git workflow while worrying about distributing EC2?

    - by Parris
    Hello, I have been looking for something like Heroku but for PHP, and after much frustration (and almost finding what I need, but not quite) we decided to just go with AWS without any other abstraction. We are using PHP 5.3 (and CakePHP 1.3), and are currently using git. Ubuntu seems like the easiest way to get both of those on there, and we will most likely use that. We aren't really going to worry about outgoing email. We are using SMTP through Gmail, but will most likely switch to some other service eventually. I had 3 questions: 1) I have been looking at Zend Server, and I am not quite sure how that is more beneficial than XAMPP. Perhaps it is not? 2) I suppose that to make the application scale we would need multiple instances of some EC2 AMI, then just duplicate it and such. The question then becomes: how do we make sure all EC2 instances are up to date? 3) I understand the concept of load balancing to some degree. I understand that in one region you select a bunch of servers and have it load balance across them. The question then becomes: well, how about worldwide? How do I make it so that traffic is directed to the correct EC2 server? I have heard of Route 53, and tried signing up for that, but nothing appears in my control panel. Also, perhaps it is just a DNS thing with my domain registrar? AHHH... some tutorial would be helpful!

    Read the article

  • Different font SIZES in a Text Editor, based on Script (Alphabet) type (i.e. per Unicode Code-Block)

    - by fred.bear
    Some non-Latin-based scripts (alphabets) have more detail in their glyphs than their Latin-based-script equivalents, and typically need a larger font to give the same degree of legibility (resolution-wise). Sometimes both script types need to be present in the same file. Notepad++ allows different font SIZES (and colour, etc.) courtesy of syntax highlighting. This allows me to display a larger-fonted non-Latin-based script in a // BIG-FONT comment. Although this has been quite handy for me in some situations, it is quite limited. A Word Processor can handle this scenario, but I'm not interested in that. I want a nice simple(?) plain(?) Text Editor to do it... on a per-script-type basis... e.g. mixing Latin-1 and Devanagari (and Mandarin, and ... Such a thing may not exist, but Notepad++ has shown that a simple(?) plain(?) Text Editor is capable of it. Does anyone know of such a Text Editor? ...Q. Why not a Word Processor? ...A. Because GCC and Python don't like that format! But UTF-8 is fine.

    Read the article

  • MySQL InnoDB and quickly applying large updates

    - by Tim
    Basically my problem is that I have a large table of about 17,000,000 products that I need to apply a bunch of updates to really quickly. The table has 30 columns with the id set as int(10) AUTO_INCREMENT. I have another table in which all of the updates for this table are stored; these updates have to be pre-calculated as they take a couple of days to calculate. This table is in the format of [ product_id int(10), update_value int(10) ]. The strategy I'm taking to issue these 17 million updates quickly is to load all of these updates into memory in a Ruby script and group them in a hash of arrays so that each update_value is a key and each array is a list of sorted product_ids: { 150 => [1,2,3,4,5,6], 160 => [7,8,9,10] } Updates are then issued in the format of UPDATE product SET update_value = 150 WHERE product_id IN (1,2,3,4,5,6); UPDATE product SET update_value = 160 WHERE product_id IN (7,8,9,10); I'm pretty sure I'm doing this correctly, in the sense that issuing the updates on sorted batches of product_ids should be the optimal way to do it with MySQL / InnoDB. I'm hitting a weird issue, though: when I was testing with ~13 million records, the updates only took around 45 minutes. Now I'm testing with more data, ~17 million records, and the updates are taking closer to 120 minutes. I would have expected some sort of speed decrease here, but not to the degree that I'm seeing. Any advice on how I can speed this up or what could be slowing me down with this larger record set? As far as server specs go they're pretty good, heaps of memory / CPU, and the whole DB should fit into memory with plenty of room to grow.
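
    For comparison (a sketch of an alternative approach, not what the poster is currently doing): since the pre-calculated values already live in a table, MySQL can apply them all in a single multi-table UPDATE instead of millions of IN-list statements, assuming the updates table is named product_update and product_id is indexed:

    ```sql
    -- One pass driven by the precomputed table; names are placeholders matching
    -- the [ product_id int(10), update_value int(10) ] layout described above.
    UPDATE product AS p
    JOIN product_update AS u ON u.product_id = p.product_id
    SET p.update_value = u.update_value;
    ```

    The trade-off is that InnoDB then treats this as one very large transaction, so undo-log growth and lock time may make chunking (e.g. by ranges of product_id) preferable on a busy server.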

    Read the article

  • Multiple SSL certificates on one server

    - by Kyle O'Brien
    We're hosting two websites on our fairly tiny but dedicated production server. Both websites require SSL. So, we have virtual hosts set up for both of them. They both reference their own domain.key, domain.crt and domain.intermediate.crt files. Each CSR and certificate file for each site was set up using its own unique information, and nothing is shared between them (other than the server itself). However, whichever site's symbolic link (set up in /etc/apache2/sites-enabled) is referenced first is the site whose certificate is served, even if we're visiting the second site. So for example, assume our companies are Cadbury and Nestle. We set up both sites with their own certificates, but we create Cadbury's symbolic link in Apache's sites-enabled folder first and then Nestle's. You can visit Nestle perfectly fine, but if you check the certificate installation, it references Cadbury's certificate. We're hosting these websites on a dedicated Ubuntu 12.04.3 LTS server. Both certificates are provided by Thawte.com. I came across a few potential solutions with no degree of success. I'm hoping someone else has a decent solution? Thanks Edit: The only other solution that seems to have provided success to some people is using SNI with Apache. However, the setups there didn't seem to coincide with our setup at all.
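
    For reference, the shape the SNI approach usually takes on Apache 2.2 (as shipped with Ubuntu 12.04) is two name-based SSL vhosts on port 443; hostnames and paths below are placeholders, and clients without SNI support will still be handed the first vhost's certificate:

    ```apache
    # /etc/apache2/ports.conf or the top of the vhost files
    NameVirtualHost *:443

    <VirtualHost *:443>
        ServerName cadbury.example.com
        SSLEngine on
        SSLCertificateFile      /etc/ssl/cadbury/domain.crt
        SSLCertificateKeyFile   /etc/ssl/cadbury/domain.key
        SSLCertificateChainFile /etc/ssl/cadbury/domain.intermediate.crt
    </VirtualHost>

    <VirtualHost *:443>
        ServerName nestle.example.com
        SSLEngine on
        SSLCertificateFile      /etc/ssl/nestle/domain.crt
        SSLCertificateKeyFile   /etc/ssl/nestle/domain.key
        SSLCertificateChainFile /etc/ssl/nestle/domain.intermediate.crt
    </VirtualHost>
    ```

    If the vhosts are already written this way and the wrong certificate still shows up, the symptom described (first symlink wins) is exactly what Apache does when the requested hostname doesn't match the second vhost's ServerName, so that is the first thing to double-check.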

    Read the article

  • Our Oracle Recruitment Team is Growing - Multiple Job Opportunities in Bangalore, India

    - by david.talamelli
    DON"T GET STUCK IN THE MATRIXSEE YOUR FUTUREVISIT THE ORACLE The position(s): CORPORATE RECRUITING RESEARCH ANALYST(S) ABOUT ORACLE Oracle's business is information--how to manage it, use it, share it, protect it. For three decades, Oracle, the world's largest enterprise software company, has provided the software and services that allow organizations to get the most up-to-date and accurate information from their business systems. Only Oracle powers the information-driven enterprise by offering a complete, integrated solution for every segment of the process industry. When you run Oracle applications on Oracle technology, you speed implementation, optimize performance, and maximize ROI. Great hiring doesn't happen by accident; it's the culmination of a series of thoughtfully planned and well executed events. At the core of any hiring process is a sourcing strategy. This is where you come in... Do you want to be a part of a world-class recruiting organization that's on the cutting edge of technology? Would you like to experience a rewarding work environment that allows you to further develop your skills, while giving you the opportunity to develop new skills? If you answered yes, you've taken your first step towards a future with Oracle. We are building a Research Team to support our North America Recruitment Team, and we need creative, smart, and ambitious individuals to help us drive our research department forward. Oracle has a track record for employing and developing the very best in the industry. We invest generously in employee development, training and resources. Be a part of the most progressive internal recruiting team in the industry. For more information about Oracle, please visit our Web site at http://www.oracle.com Escape the hum drum job world matrix, visit the Oracle and be a part of a winning team, apply today. POSITION: Corporate Recruiting Research Analyst LOCATION: Bangalore, India RESPONSIBILITIES: •Develop candidate pipeline using Web 2.0 sourcing strategies and advanced Boolean Search techniques to support U.S. Recruiting Team for various job functions and levels. •Engage with assigned recruiters to understand the supported business as well as the recruiting requirements; partner with recruiters to meet expectations and deliver a qualified pipeline of candidates. •Source candidates to include both active and passive job seekers to provide a strong pipeline of qualified candidates for each recruiter; exercise creativity to find candidates using Oracle's advanced sourcing tools/techniques. •Fully evaluate candidate's background against the requirements provided by recruiter, and process leads using ATS (Applicant Tracking System). •Manage your efforts efficiently; maintain the highest levels of client satisfaction as well as strong operations and reporting of research activities. PREFERRED QUALIFICATIONS: •Fluent in English, with excellent written and oral communication skills. •Undergraduate degree required, MBA or Masters preferred. •Proficiency with Boolean Search techniques desired. •Ability to learn new software applications quickly. •Must be able to accommodate some U.S. evening hours. •Strong organization and attention to detail skills. •Prior HR or corporate in-house recruiting experiences a plus. •The fire in the belly to learn new ideas and succeed. •Ability to work in team and individual environments. This is an excellent opportunity to join Oracle in our Bangalore Offices. 
Interested applicants can send their resume to [email protected] or contact David on +61 3 8616 3364

    Read the article

  • I’m a Phoenix… and I’m miffed

    - by Stan Spotts
    For personal reasons, almost 30 years ago I left school to enter the workforce. I decided late 2008 to go back to school and finish my degree. After the expected loss of credits for a transfer, from Temple University to University of Phoenix, I'm now about 75% done. The experience has been interesting. Classes are time compressed, only 5 weeks each. Because I have a family and a full time job, I'm taking one at a time. Even so, I've written more papers in these classes than I ever wrote at Temple. My own papers are one thing, but the team papers give me heartburn since I can't completely control what goes into them. Not a big deal except that they make up 30% of our grade. In any case, most of the class facilitators have been great. I had great ones for Accounting, Finance, and frankly most others. I've had a few (4, maybe) cases where I was less than 2 points from an A, and asked the facilitator if I could get any of my work reviewed to see if I could get those extra points. I figured it was worth a shot, and there were no extenuating circumstances to help make my case. I think that only one facilitator decided after a review of one paper that my interpretation was good, just not what he expected, and gave me another point, which gave me an A. So while none are pushovers, they've all been open to discussion, which is as much as I should expect. Overall, good experience. That is, until my last class. On the second week, the day I was due to hand in my personal assignment for the week, I was in an accident. An SUV creamed my little Ford Focus, and totaled it (estimated repair over $11K). I was pretty banged up, especially my left shoulder. I was scheduled for rotator cuff surgery for two weeks later, and getting hit against the door really made it worse. After dealing with the police, the EMT, the tow truck, and the Percocet and Flexeril for the pain, I crashed for the night and didn't get to upload my paper until the next day. The instructor took 30% off for it being late, even after I supplied photos of the car, my arm (huge bruises), and offered to supply the police report number. I figured I'd be okay since that's 2.7 points, and I could lose up to 5 before jeopardizing an A grade. Well, that wasn't the case as we lost more points than I expected on our team paper in Week 5. I ended up with a 94.3. Yes, 7/10 of a point from an A. Of course I asked the instructor to review the issue with the accident and give me just the 0.7 points I needed for the A. That got me a short response of "I have received your emails and review your work over the last five weeks. Your current grade will stand. If you would like to dispute your grade then please feel free to contact your academic advisor. I wish you much success in your professional and academic career." Brrrr….! So I asked my academic advisor to file a dispute. If it wasn't that a pretty bad car accident was the cause, I wouldn't have. Without the grade reduction, I would have had a 97 for the class, so I'll argue that I was performing at the A level throughout the class. Why her purported "review" of my work didn't then warrant such a minor adjustment, I don't know. An A- drops my GPA, and this ticked me off. Now I have to wait and see what the school says about the grade dispute.

    Read the article

  • How to install SharePoint Server 2013 Preview

    - by ybbest
    The Office 2013 and SharePoint Server 2013 Preview was announced yesterday, and as a SharePoint developer I am really excited to learn all the new features and capabilities. Today I will show you how to install the preview. 1. Create a service account called SP2013Install and give this account Dbcreator and SecurityAdmin in SQL Server 2012. 2. You need to run the following script to set the 'max degree of parallelism' setting to the required value of 1 in SQL Server 2012 (using sysadmin privileges) before configuring the SharePoint farm. Otherwise, you might get the error 'This SQL Server instance does not have the required max degree of parallelism setting of 1': sp_configure 'show advanced options', 1; GO RECONFIGURE WITH OVERRIDE; GO sp_configure 'max degree of parallelism', 1; GO RECONFIGURE WITH OVERRIDE; GO 3. Download the SharePoint preview from here; I am going to install it on Windows Server 2008 R2 with SQL Server 2012. 4. Click Install software prerequisites; this works fine with an internet connection. (However, if you do not have an internet connection, it is a bit tricky to install Windows Azure AppFabric, as it has to be installed using the prerequisite installer. Your computer might reboot a few times in the process.) 5. After the prerequisites are installed completely, you can then install the Preview. Click Install SharePoint Server and enter the product key you get from the Preview download page. 6. Accept the license terms and click Next. 7. Leave the default path for the file location. 8. You can now start the installation process. 9. After the binary files are installed, you can then configure your farm using the farm configuration wizard. 10. Specify the database server and the install account. 11. Specify the SharePoint farm passphrase. 12. Specify the port number; you should choose your own favorite port number. 13. Choose Create a New Server Farm and click Next. 14. Double-check the settings and click Next to configure the farm install. 15. Finally, your farm is configured successfully and you are now able to go to your Central Admin site http://sp2010:6666/ 16. You should configure the services manually or automate them using PowerShell (if you would like to understand why, you can read the blog post here); however, I will use the wizard to configure them automatically here as this is a test machine. After the configuration is complete, you will now be able to see your SharePoint site. 17. To start evaluating the Preview, you need to install Visual Studio 2012 RC, Microsoft Office Developer Tools for Visual Studio 2012, SharePoint Designer 2013 Preview and Office 2013 Preview. References: Download SharePoint2013 Server 2013 Download Microsoft Visio Professional 2013 Preview Install SharePoint 2013 Preview Hardware and software requirements for SharePoint 2013 Preview SharePoint 2013 IT Pro and Developer training materials released Plan for SharePoint 2013 Preview Microsoft Office Developer Tools for Visual Studio 2012 SharePoint 2013 Preview Office365 for the SharePoint 2013 preview SharePoint Designer 2013 Download: Microsoft Office 2013 Preview Language Pack Try Office

    Read the article

  • Markus Zirn, "Big Data with CEP and SOA" @ SOA, Cloud & Service Technology Symposium 2012

    - by JuergenKress
    ORACLE PROMOTIONAL DISCOUNT FOR EXCLUSIVE ORACLE DISCOUNT, ENTER PROMO CODE: DJMXZ370 Early-Bird Registration is Now Open with Special Pricing! Register before July 1, 2012 to qualify for discounts. Visit the Registration page for details. The International SOA, Cloud + Service Technology Symposium is a yearly event that features the top experts and authors from around the world, providing a series of keynotes, talks, demonstrations, and panels, as well as training and certification workshops - all dedicated to empowering IT professionals to realize modern service technologies and practices in the real world. Click here for a two-page printable conference overview (PDF). Big Data with CEP and SOA - September 25, 2012 - 14:15 Speaker: Markus Zirn, Oracle and Baz Kuthi, Avocent The "Big Data" trend is driving new kinds of IT projects that process machine-generated data. Such projects store and mine using Hadoop/ Map Reduce, but they also analyze streaming data via event-driven patterns, which can be called "Fast Data" complementary to "Big Data". This session highlights how "Big Data" and "Fast Data" design patterns can be combined with SOA design principles into modern, event-driven architectures. We will describe specific architectures that combines CEP, Distributed Caching, Event-driven Network, SOA Composites, Application Development Framework, as well as Hadoop. Architecture patterns include pre-processing and filtering event streams as close as possible to the event source, in memory master data for event pattern matching, event-driven user interfaces as well as distributed event processing. Focus is on how "Fast Data" requirements are elegantly integrated into a traditional SOA architecture. Markus Zirn is Vice President of Product Management covering Oracle SOA Suite, SOA Governance, Application Integration Architecture, BPM, BPM Solutions, Complex Event Processing and UPK, an end user learning solution. He is the author of “The BPEL Cookbook” (rated best book on Services Oriented Architecture in 2007) as well as “Fusion Middleware Patterns”. Previously, he was a management consultant with Booz Allen & Hamilton’s High Tech practice in Duesseldorf as well as San Francisco and Vice President of Product Marketing at QUIQ. Mr. Zirn holds a Masters of Electrical Engineering from the University of Karlsruhe and is an alumnus of the Tripartite program, a joint European degree from the University of Karlsruhe, Germany, the University of Southampton, UK, and ESIEE, France. KEYNOTES & SPEAKERS More than 80 international subject matter experts will be speaking at the Symposium. Below are confirmed keynotes and speakers so far. Over 50% of the agenda has not yet been finalized. Many more speakers to come. View the partial program calendars on the Conference Agenda page. CONFERENCE THEMES & TRACKS Cloud Computing Architecture & Patterns New SOA & Service-Orientation Practices & Models Emerging Service Technology Innovation Service Modeling & Analysis Techniques Service Infrastructure & Virtualization Cloud-based Enterprise Architecture Business Planning for Cloud Computing Projects Real World Case Studies Semantic Web Technologies (with & without the Cloud) Governance Frameworks for SOA and/or Cloud Computing Projects Service Engineering & Service Programming Techniques Interactive Services & the Human Factor New REST & Web Services Tools & Techniques Oracle Specialized SOA & BPM Partners Oracle Specialized partners have proven their skills by certifications and customer references. 
To find a local Specialized partner please visit http://solutions.oracle.com SOA & BPM Partner Community For regular information on Oracle SOA Suite become a member in the SOA & BPM Partner Community for registration please visit  www.oracle.com/goto/emea/soa (OPN account required) If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Technorati Tags: Markus Zirn,SOA Symposium,Thomas Erl,SOA Community,Oracle SOA,Oracle BPM,BPM Community,OPN,Jürgen Kress

    Read the article
