Search Results

Search found 93603 results on 3745 pages for 'all in one'.

  • Choppy USB mice on just one of the USB ports

    - by user20532
    I've got a Lenovo B560 laptop with the latest, properly updated Kubuntu on it (11.04 Natty, kernel 2.6.38-8-generic). It has three USB 2.0 ports onboard, and I usually plug a mouse into one of them (I've got three different mice: one at the office, one at home, and one for when I'm on the go). Sometimes, usually after the laptop wakes from sleep, the mouse still works but the cursor movement is choppy, as if the processor were extremely loaded (it usually isn't). I found that if I re-plug the mouse into another USB port, it works just fine; if I plug it back into the problematic port, it is still choppy and remains choppy until the next boot. Of course I want my mice to always work properly. The problem is that I cannot reproduce this behaviour reliably: it happens sporadically but regularly. The issue has occurred on each of the ports at some point, and each of my mice has failed me this way at least once, so I cannot work out what exactly is going wrong, or why plugging into a different port fixes the mouse instantly. I'd like to hear at least some clues about where to look and what to try to identify the problem. A bit of an update: while beginning this post, I had the issue once again. I have just re-plugged the mouse back into the problematic port and it is not recognized at all; on the other port it works smoothly.

    Read the article

  • One codebase - lots of hosted services (similar to a Basecamp-style service) - planning structure

    - by RickM
    We have built a PHP-based service for a client, and are now looking to offer it to other clients as a hosted service. For this example, think of it like a hosted forum service: a client signs up on our site and is given a subdomain (or can use their own domain); the code picks up the domain, checks it against a 'master' users table, and then loads the content as needed. I'm trying to work out the best way of handling multiple clients. At the moment I can only think of two options that would work. Option 1: have one set of database tables, but add a 'siteid' column to each table. Every query would have to filter on the siteid, but this works with just one codebase and one database. Option 2: have one 'master' database with all the core data, such as the client details and their domain. When the system checks the domain, it pulls the client's database details (username/password/dbname) from a table and connects to a second, per-client database. The issue here is securing the MySQL connection details, but it has the benefit that each client runs their own database instead of sharing one. Which option would I be better off taking here, and why? Ideally I want it to be fairly easy to convert the 'standalone' script to the 'multi-domain' script, as we're on a tight deadline.
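
    To make Option 1 concrete, here is a minimal sketch of the pattern I have in mind - written in Java/JDBC purely for illustration (our actual service is PHP), with assumed table and column names (sites, posts, site_id):

        import java.sql.*;
        import java.util.*;

        public class TenantResolver {
            private final Connection conn;

            public TenantResolver(Connection conn) {
                this.conn = conn;
            }

            // Resolve the tenant from the incoming request's host name
            // against the 'master' sites table.
            public int siteIdForDomain(String domain) throws SQLException {
                try (PreparedStatement ps = conn.prepareStatement(
                        "SELECT site_id FROM sites WHERE domain = ?")) {
                    ps.setString(1, domain);
                    try (ResultSet rs = ps.executeQuery()) {
                        if (!rs.next()) {
                            throw new IllegalArgumentException("Unknown domain: " + domain);
                        }
                        return rs.getInt("site_id");
                    }
                }
            }

            // Every data query carries the tenant filter; forgetting it somewhere
            // is the main risk of the shared-table approach.
            public List<String> postTitlesForSite(int siteId) throws SQLException {
                List<String> titles = new ArrayList<>();
                try (PreparedStatement ps = conn.prepareStatement(
                        "SELECT title FROM posts WHERE site_id = ? ORDER BY created_at DESC")) {
                    ps.setInt(1, siteId);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            titles.add(rs.getString("title"));
                        }
                    }
                }
                return titles;
            }
        }

    The appeal of Option 1 is that converting the standalone code is mostly a matter of adding that one filter everywhere; Option 2 keeps each client's data physically separate but adds per-client connection management and credential storage to worry about.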

    Read the article

  • One good reason for a rewrite

    - by Supermighty
    I have a personal web project that I cut my teeth on while learning how to program. I wrote it in PHP and learned as I went. Eventually I refactored it to use MVC and removed all the mixing of PHP and HTML. Right now it has no users, save myself, and it makes no money. Still, I have a strong desire to rewrite the entire app, which really isn't that large an app. I have a lot of reasons why I should not rewrite it: I know that I should move forward, it's a working app now, and a rewrite will only set me back. But I can't shake the feeling that I would be better off using a different programming language in the long run; that I'd enjoy it more and feel more comfortable with it. I feel like my one good reason to rewrite my app is a gut feeling that I should. PHP seems like a hack thrown together, and I want to use a language that feels more elegant to me. Any feedback you have would be welcome.

    Read the article

  • Git Branch Model for iOS projects with one developer

    - by glenwayguy
    I'm using git for an iOS project, and so far have the following branch model: feature_branch (usually several) -> development -> testing -> master. Feature branches are short-lived, used just to add a feature or fix a bug, then merged back into development and deleted. Development is fairly stable, but not ready for production. Testing is for when we have a stable version with enough features for a new update, which we ship to beta testers; once testing is finished, the code is either moved back into development or advanced into master. The problem, however, lies in the fact that we can't deploy instantly: on iOS, it can be several weeks between the time a build is submitted and when it actually reaches users. I always want to have the version of the code that is currently on the market in my repo, but I also need a place to keep the current stable code that will be sent for release. So: where should I keep the stable code, where should I keep the code currently on the market, and where should I keep the code that is in review with Apple and will (hopefully) reach the market soon? Also, this is a one-developer team, so collaboration is not strictly necessary, but it is preferred because there may be more team members in the future.

    Read the article

  • SQL RDBMS: one query or multiple calls

    - by None None
    After looking around the internet, I decided to create DAOs that return objects (POJOs) to the calling business-logic method. For example, a Customer object with an Address reference would be split in the RDBMS into two tables, CUSTOMER and ADDRESS. The CustomerDAO would be in charge of joining the data from the two tables, creating both an Address POJO and a Customer POJO, attaching the address to the customer, and finally returning the full Customer POJO. Simple. However, I am now at a point where I need to join three or four tables, each contributing an attribute or a list of attributes to the resulting POJO. The SQL will include a GROUP BY, but I will still get multiple rows for the same POJO, because some of the joins cross one-to-many relationships. My application code will now have to loop through all the rows, working out whether a row adds attributes to an existing POJO or should start a new one. Should I continue to build my DAOs with this technique, or break the POJO creation up into multiple database calls to make the code easier to understand and maintain?
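
    To show what I mean, here is a minimal sketch (in Java) of the single-query mapping loop I'm describing - the CUSTOMER/ORDERS layout, column names, and POJO fields are placeholders, not my real schema:

        import java.math.BigDecimal;
        import java.sql.*;
        import java.util.*;

        public class CustomerDao {

            public static class Order {
                final long id;
                final BigDecimal total;
                Order(long id, BigDecimal total) { this.id = id; this.total = total; }
            }

            public static class Customer {
                final long id;
                final String name;
                final List<Order> orders = new ArrayList<>();
                Customer(long id, String name) { this.id = id; this.name = name; }
            }

            // One query over a one-to-many join; collapse duplicate customer rows
            // by keying on the primary key instead of comparing row contents.
            public List<Customer> findAll(Connection conn) throws SQLException {
                String sql = "SELECT c.id, c.name, o.id AS order_id, o.total "
                           + "FROM customer c LEFT JOIN orders o ON o.customer_id = c.id "
                           + "ORDER BY c.id";
                Map<Long, Customer> byId = new LinkedHashMap<>();
                try (PreparedStatement ps = conn.prepareStatement(sql);
                     ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        long id = rs.getLong("id");
                        Customer c = byId.get(id);
                        if (c == null) {              // first row for this customer
                            c = new Customer(id, rs.getString("name"));
                            byId.put(id, c);
                        }
                        long orderId = rs.getLong("order_id");
                        if (!rs.wasNull()) {          // LEFT JOIN row with no order
                            c.orders.add(new Order(orderId, rs.getBigDecimal("total")));
                        }
                    }
                }
                return new ArrayList<>(byId.values());
            }
        }

    Keyed on the primary key like this, the loop stays manageable for one join; my question is whether it stays manageable at three or four joined tables, or whether separate queries per table are the cleaner trade-off.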

    Read the article

  • Drupal 7: One-time user account

    - by Noob
    I'm going to create a survey in Drupal 7 with the Webform module, installed on a Debian system that may be adapted in every way. The users (personally known, approx. 120) taking the survey will walk into a room and complete it in browsers on different computers. After that, they'll leave the room, other people will enter and complete the survey on the same computers, and so on. Each user may enter only one submission, and the process needs to be anonymous, i.e. I mustn't have any idea of who made which submission. My current solution is to generate random one-time passwords and hand out one password per user (without noting who got which password). The survey contains a password field where the one-time password is entered, and Webform checks that the value is unique. I'll get the data via CSV or Excel and verify the passwords manually in Excel by comparing them to the list of valid passwords. The problem is: I don't like the idea of manually generating the password list, copying it to Excel, and doing a manual check. That's fine for one-time use, but we're going to repeat the survey every once in a while. I'd rather generate one-time logins (like user0001/fdlkjewf, user0002/dfrefnnr, ...) for each survey, hand them out to the users, and let Drupal/Debian/whatever check whether a submission is valid or not. Do you have any idea how to batch-generate about 120 users with one-time passwords in Drupal 7 and verify that each user may submit the form only once? Or do you have a better idea of how to accomplish the task within the intranet? Thank you for your help.
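
    For the generation half, the credential list is easy to script outside Drupal. Here is a minimal sketch in Java, purely for illustration - the user0001-style naming and the password alphabet/length are arbitrary choices, and importing the resulting accounts into Drupal (by hand or with a user-import module) is the part I'm actually asking about:

        import java.security.SecureRandom;

        public class OneTimeLogins {
            // Alphabet avoids look-alike characters (l/1, o/0).
            private static final String ALPHABET = "abcdefghijkmnpqrstuvwxyz23456789";
            private static final SecureRandom RNG = new SecureRandom();

            public static void main(String[] args) {
                int count = args.length > 0 ? Integer.parseInt(args[0]) : 120;
                for (int i = 1; i <= count; i++) {
                    // CSV on stdout: username,password
                    System.out.printf("user%04d,%s%n", i, randomPassword(10));
                }
            }

            private static String randomPassword(int length) {
                StringBuilder sb = new StringBuilder(length);
                for (int i = 0; i < length; i++) {
                    sb.append(ALPHABET.charAt(RNG.nextInt(ALPHABET.length())));
                }
                return sb.toString();
            }
        }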

    Read the article

  • vSphere Promiscuous mode only receiving packets one way from network switch

    - by steve.lippert
    We have two network switches: a PoE switch (SwitchA) that powers our phones and users' computers, and a non-PoE switch (SwitchB) for the rest of the network. Each switch is set up to do port mirroring to support our VoIP recording system. SwitchA mirrors specific ports when we need to record a user; SwitchB mirrors the one port that monitors our work-at-home users (their Internet traffic comes in from a managed router to the switch, then back out to our firewall). These two mirrored feeds go into one VMware vSphere 4.1 server, which has four physical NICs in total; the other two NICs connect to an unmanaged switch for the rest of the network. Inside the vSphere server, all of these ports go into one vSwitch, and one of the guests (Windows 2008 R2) sniffs the traffic and does its thing. Everything is working fine and dandy from SwitchB, but from SwitchA we only receive one side of the VoIP packets (traffic going out to the phone, nothing coming back from the phone). Troubleshooting steps I have taken so far: I hooked up my laptop to the monitor port on SwitchA and saw both sides of the packets, and I swapped which network interface is plugged into the monitor port on SwitchA. Because everything feeds into one vSwitch/vNetwork, and both sides of the conversation arrive just fine from SwitchB, I believe the vSphere server and guest are configured correctly. What could be causing one-way packets to arrive on my guest machine from one interface but not the other? Could a bad cable be causing the problems from SwitchA?

    Read the article

  • 10 PowerShell One Liners

    - by BizTalk Visionary
    Here are a few one-liners that use NetCmdlets. Some of these I've blogged about before, some are new. Let me know if you have questions, which ones you find useful, or how you altered these to suit your own needs.

    Send email to a list of recipient addresses:

      import-csv users.csv | % { send-email -to $_.email -from [email protected] -subject "Important Email" -message "Hello World!" -server 10.0.1.1 }

    Show the access control list for a specific Exchange folder:

      get-imap -server $mymailserver -cred $mycred -folder INBOX.RESUMES -acl

    Add look and read permissions on an Exchange folder, for a list of accounts pulled from a CSV file:

      import-csv users.csv | % { set-imap -server $mymailserver -cred $mycred -acluser $_.username -folder INBOX.RESUMES -acl "lr" }

    Sync system time with an Internet time server:

      get-time -server clock.psu.edu -set

    To remotely sync the time on a set of computers:

      import-csv computers.csv | % { Invoke-Command -computerName $_.computer -cred $mycred -scriptblock { get-time -server clock.psu.edu -set } }

    Delete all emails from an Exchange folder that match a certain criterion. For example, delete all emails from [email protected]:

      get-imap -server $mailserver -cred $mycred | ? { $_.FromEmail -eq "[email protected]" } | % { set-imap -server $mailserver -cred $mycred -message $_.Id -delete }

    Update Twitter status from PowerShell:

      get-http -url "http://twitter.com/statuses/update.xml" -cred $mycred -variablename status -variablevalue "Tweeting with NetCmdlets!"

    A test-path that works over FTP, FTPS (SSL), and SFTP (SSH) connections:

      get-ftp -server $remoteserver -cred $mycred -path /remote/path/to/checkfor*

    Don't forget the *. Also, to use SSL or SSH just add an -ssl or -ssh parameter.

    List disabled user accounts in Active Directory (or any other LDAP server):

      get-ldap -server $ad -cred $mycred -dn dc=yourdc -searchscope wholesubtree -search "(&(objectclass=user)(objectclass=person)(company=*)(userAccountControl:1.2.840.113556.1.4.803:=2))"

    List Active Directory groups and their members:

      get-ldap -server testman -cred $mycred -dn dc=NS2 -searchscope wholesubtree -search "(&(objectclass=group)(cn=*admin*))" | select ResultDN, member

    Display the last initialization time (e.g. last reboot time) of all discoverable SNMP agents on a network:

      import-csv computers.csv | % { get-snmp -agent $_.computer -oid sysUpTime.0 | % { ([datetime]::Now).AddSeconds(-($_.OIDValue/100)) } }

    Not mentioned here: data conversion (Yenc, QP, UUencoding, MD5, SHA1, base64, etc.), DNS, news groups (NNTP/UseNet), POP mail, RSS feeds, Amazon S3, Syslog, TFTP, TraceRoute, SNMP traps, UDP, WebDAV, whois, Rexec/Rshell/Telnet, Zip files, sending IMs (Jabber/GoogleTalk/XMPP), sending text messages and pages, ping, and more. Original source: Lance's Textbox

    Read the article

  • Visual Studio 2010 crashed when trying to open one of my solutions

    - by Michael Freidgeim
    Visual Studio 2010 crashed when I tried to open one of my solutions. Closing Visual Studio and rebooting the machine didn't help, and the error message that was logged (see below) didn't give any useful ideas. It was finally fixed after I deleted the MySolution.suo file, which was quite big, and also the ReSharper folders.

    Log Name:      Application
    Source:        Application Error
    Event ID:      1000
    Task Category: (100)
    Level:         Error
    Keywords:      Classic
    User:          N/A
    Description:
    Faulting application name: devenv.exe, version: 10.0.40219.1, time stamp: 0x4d5f2a73
    Faulting module name: msenv.dll, version: 10.0.40219.1, time stamp: 0x4d5f2d48
    Exception code: 0xc0000005
    Fault offset: 0x00355770
    Faulting process id: 0x1dc0
    Faulting application start time: 0x01cd1836888599f4
    Faulting application path: C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\devenv.exe
    Faulting module path: c:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\msenv.dll
    Report Id: 9924b2f9-844e-11e1-bc19-782bcba513ea

    Event Xml:
    <Event>
      <System>
        <Provider Name="Application Error" />
        <EventID Qualifiers="0">1000</EventID>
        <Level>2</Level>
        <Task>100</Task>
        <Keywords>0x80000000000000</Keywords>
        <TimeCreated SystemTime="2012-04-12T03:21:31.000000000Z" />
        <EventRecordID>401998</EventRecordID>
        <Channel>Application</Channel>
        <Security />
      </System>
      <EventData>
        <Data>devenv.exe</Data>
        <Data>10.0.40219.1</Data>
        <Data>4d5f2a73</Data>
        <Data>msenv.dll</Data>
        <Data>10.0.40219.1</Data>
        <Data>4d5f2d48</Data>
        <Data>c0000005</Data>
        <Data>00355770</Data>
        <Data>1dc0</Data>
        <Data>01cd1836888599f4</Data>
        <Data>C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\devenv.exe</Data>
        <Data>c:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\msenv.dll</Data>
        <Data>9924b2f9-844e-11e1-bc19-782bcba513ea</Data>
      </EventData>
    </Event>

    Read the article

  • Exciting product releases (and one disappointing thing) with Mix10

    - by Jeff
    Sadly, I'm not at Mix this year, for the first time in a few years. It's a little harder to go if you work for Microsoft, oddly enough. And then there's this little guy next to me, who at ten days old really needs his daddy to be around! But oh, the excitement of what Microsoft has in store! It's great to finally see all of these major releases coming together for Microsoft developer products. There is a great deal of excitement among people internally no matter where you work, because there is so much cool stuff in the pipe. In case you live under a rock...

    Visual Studio 2010 - Great to see all of the positive feedback on the Twitter and what not. I've been using it on one of my home products for awhile, and I really like it. The newer nightly builds of ReSharper also seem to be gaining speed in quality as well. I like the new debugging features, and the text readability is not imagined. Love it.

    Silverlight 4 - I've been running a couple of minor SL3 apps on my personal sites for awhile now, and I'm thrilled with the platform. With a couple of key concepts down, .NET folk like you and me can do some stellar things with this, and if you're a Mac nerd (like me), it's all kinds of awesome to be able to build stuff for it without the agony of Objective-C and Xcode.

    Windows Phone 7 Series - A few weeks ago you got to see the shiny new UI that went beyond the icon grid, and now you've got the developer story as well. That I can adapt my existing Silverlight apps with minimal effort to work on the phone is pretty powerful. Millions of .NET devs just became phone developers, using the tools they already know. How great is that?

    ASP.NET MVC 2 - The final bits shipped last week, and there was much rejoicing. I love this framework because of the testability and the real ability to get to the true mechanics of HTTP. The other cool thing is the speed at which the framework has evolved. v2 in less than a year is pretty "un-Microsoft" in a lot of eyes.

    The video of keynotes and sessions is starting to appear on the Mix site, but for reasons I can't understand, they're WMV downloads. For real? Not that helpful for Mac folk. Why wouldn't they be using a Silverlight player? In any case, the thing that continues to motivate me is that getting what you imagine onto the Internet gets easier every year. This is not a new revelation for me. I've only been at Microsoft for four months, but I've felt this way for years. I'm thrilled to be a part of it.

    Read the article

  • Programming Windows 8 Apps with HTML, CSS, and JavaScript - All you need in one title

    It took me a while to work through the 800+ pages of this title. And yes, I really mean working, not reading... Since the release of Windows 8 it should be obvious to any Windows software developer that there are new ways to develop, deploy and market applications for a broader audience. Interestingly, Microsoft has started to narrow the technological gap between its various platforms - desktop, web, smartphone and Xbox - and development of modern apps with HTML, CSS and JavaScript couldn't be easier. Kraig covers all facets of modern Windows 8 apps, from the basic building blocks and project templates in Visual Studio 2012, through the thoughtful use of specific APIs, to proper deployment in the Windows Store and potential monetization. The book is organised as step-by-step instructions, like a tutorial. Kraig literally takes the reader by the hand and, in his examples, explains in detail the reasons for, and the pros and cons of, a particular way of implementation. Thanks to cross-references to other chapters, he leaves the reader the choice to dig deeper right away or to catch up later. Personally, I have to admit that I really enjoyed the relaxed writing style. App development is not dust-dry rocket science, and it should be joyful to learn about new technologies. And thanks to the richness of the various chapters and samples, you could easily adapt and transfer the knowledge gained in this title to other platforms like Windows Phone 8. Last but not least: the ebook is freely available from Amazon, Microsoft Press and O'Reilly. Don't think about it, just get the book. Now. Update: I already mentioned this title in other blog entries related to Microsoft certification. Feel free to read on and discover more online resources: Learning content for MCSDs: Web Applications and Windows Store Apps using HTML5, and More content for MCSDs: Web Applications and Windows Store Apps using HTML5. O'Reilly offers free webcasts on their site, too. And in case you would like to know more about Kraig's book and his experience with various development teams, please check out this one: Zero to App in Two Weeks: Programming Windows 8 Apps in HTML, CSS, and JavaScript. The recording should be available soon.

    Read the article

  • One Week To Go: OTN Architect Day: Cloud Computing

    - by Bob Rhubart
    One week remains until OTN Architect Day: Cloud Computing kicks off at the spectacular Oracle HQ campus in Redwood Shores, CA. The event is free, and there is still time to register.

    When: Tuesday, July 9, 2013, 8:30am - 12:30pm
    Where: Oracle Conference Center, 350 Oracle Pkwy, Redwood City, CA 94065

    Register now - it's free! Here's the latest update to the event agenda:

    8:30am - 9:00am | Registration and Continental Breakfast

    9:00am - 9:45am | Keynote: 21st Century IT | Dr. James Baty, VP, Global Enterprise Architecture Program, Oracle
    Imagine a time long, long ago. A time when servers were certified and dedicated to specific applications, when anything posted on an enterprise web site was from restricted, approved channels, and when we tried to limit the growth of 'dirty' data and storage. Today, applications are services running in the multi-tenant hybrid cloud. Companies beg their customers to tweet them, friend them, and publicly rate their products. And constantly analyzing a deluge of Internet, social and sensor data is the key to creating the next super-successful product, or capturing an evil terrorist. The old IT architecture was planned, dedicated, stable, controlled, with separate and well-defined roles. The new architecture is shared, dynamic, continuous, XaaS, DevOps. This keynote session describes the challenges and opportunities that the new business/IT paradigms present to the IT architecture and architects.

    9:45am - 10:30am | Technical Session: Oracle Cloud: A Case Study in Building a Cloud | Anbu Krishnaswami, Enterprise Architect, Oracle
    Building a cloud can be challenging thanks to the complex requirements unique to cloud computing and the massive scale typically associated with cloud. Cloud providers can take an Infrastructure as a Service (IaaS) approach and build a cloud on virtualized commodity hardware, or they can take the Platform as a Service (PaaS) path, a service-oriented approach based on pre-configured, integrated, engineered systems. This presentation uses the Oracle Cloud itself as a case study in the use of engineered systems, demonstrating how the technical design of engineered systems is leveraged for building PaaS and SaaS cloud services and a cloud management infrastructure. The presentation will also explore the principles, patterns, best practices, and architecture views provided in Oracle's cloud reference architecture.

    10:30am - 10:45am | Break

    10:45am - 11:30am | Technical Session: Database as a Service | Markus Michalewicz, Senior Principal Product Manager, Oracle Real Application Clusters (RAC)
    New applications are now commonly built in a cloud model, where the database is consumed as a service, and many established business processes are beginning to migrate to database as a service (DBaaS). This adoption of DBaaS is made possible by the availability of new capabilities in the database that enable resource pooling, dynamic resource management, model-based provisioning, metered use, and effective quality-of-service controls. This session will examine the catalog of database services at a large commercial bank to understand how these capabilities are enabling DBaaS for a wide range of needs within the enterprise.

    11:30am - 12:00pm | Panel Q&A
    Dr. James Baty, Anbu Krishnaswami, and Markus Michalewicz respond to audience questions.

    Registration is free, but seating is limited, so register now.

    Read the article

  • TechCast Live: "Java and Oracle, One Year Later" Replay Now Available

    - by Justin Kestelyn
    Earlier this week I had the opportunity to chat with Ajay Patel, Oracle's VP leading the Java evangelist team, about the "state of the union" with respect to Oracle and Java; the video replay is now available. Here are some choice quotes, some paraphrased, as helpfully transcribed by Java evangelist Terrence Barr:

    "One key thing we have learned ... Java is not just a platform, it is also an ecosystem, and you can't have an ecosystem without a community."

    "The objectives, strategically [for Java at Oracle] have been pretty clear: how do we drive adoption, how do we build a larger, stronger developer community, how do we really make the platform much more competitive."

    "It's about transparency, involvement. IBM, RedHat, Apple have all agreed to work with us to make OpenJDK the best platform for open source development ... it is a sign that the community has been waiting to move the Java platform forward."

    "It's not just about Oracle anymore, it's about Java, the technology, the community, the developer base, and how we work with them to move the innovation forward."

    "Java is strategic to Oracle, and the community is strategic for Java to be successful ... it is critical to our business."

    On JavaFX 2.0: "... is coming to beta soon, with a release planned in the second half [of 2011] ... will give you a new, high-performance graphics engine, the new API for JavaFX ... you will see a very strong, relevant platform for leveraging rich media platforms."

    On the JDK and SE: "... aggressively moving forward, JDK 7 is now code complete ... looking good for getting JDK 7 out by summer as we promised. Started work on JDK 8, Jigsaw and Lambda are moving along nicely, on track for JDK 8 release next year ... good progress."

    On Java EE and GlassFish: "... very excited to have GlassFish 3.1 released, with clustering and management capabilities ... working with the JCP to shortly submit a number of JSRs for Java EE 7 ... You'll see Java EE 7 becoming the platform for cloud-based development."

    "You will see Oracle continue to step up to this role of Java steward, making sure that the language, the technology, the platform ... is competitive, relevant, and widely adopted."

    Making progress!

    Read the article

  • Tech Cast Live - Java and Oracle, One Year Later - February 15th 10AM PST

    - by Cassandra Clark
    Join us for a special live conversation with Ajay Patel, Vice President of Product Development for Application Grid Products, and Justin Kestelyn, Director of the Oracle Technology Network. Justin and Ajay will discuss the changes that have come to Java and Oracle since the Sun acquisition, just over a year ago. This live broadcast conversation will include discussion on:

    - Highlights, challenges and what we learned over the past year
    - The future of Java and its importance to Oracle and the community
    - Oracle's Application Grid product portfolio today

    Watch Live Event February 15th | Watch Archived TechCast Lives

    You will also have the chance to submit questions to the speakers live on the show, for real-time feedback, by using #techcastlive. If your question is read on air, we will send you a free "I am the Future of Java" t-shirt*.

    *Promotion details: After you have submitted your question and it has been read on the live TechCast held February 15th, your shirt should arrive in two to four weeks, while supplies last. No purchase, payment, or fee is required to receive the gift. Limit one thank-you gift per person, and the offer is available only while supplies last. Oracle reserves the right to modify or terminate this offer at any time, for any reason. This offer is not available to Oracle employees or residents of countries subject to U.S. embargo (including Cuba, Iran, Iraq, Libya, North Korea, Sudan, and Syria). Due to Federal Government regulations, this offer is not available to Federal Government customers. Those residing in India or Brazil will be given a substitute gift, as we cannot ship t-shirts to your country. You are responsible for complying with your employer's policies regarding acceptance of promotional items, and with government laws, regulations and agency policies; if you are a government employee you will not be able to participate. Must be 18 years of age or older. Void where prohibited. Neither Oracle nor any third party assisting Oracle with this offer is responsible for any problems, errors, delays, or technical malfunctions related to or impacting this offer. Oracle respects your right to privacy, and your information will not be distributed or used for any other purpose. For more information, please review our privacy policy at http://www.oracle.com/html/privacy-policy.html. If you have any questions, please contact us at [email protected].

    Read the article

  • Drawing multiple objects from one Vertex Buffer Object in OpenGL/OpenTK

    - by stoney78us
    I am experimenting with drawing methods using VBOs in OpenGL. Many people use one VBO to store one object's data array; I am trying to do quite the opposite, which is storing multiple objects' data in one VBO and then drawing it. There is a story behind why I want to do this: I sometimes want to group many objects together as a single object. However, my code doesn't do it justice. The following is my pseudo-code:

        // Data
        double[] vertices = { /* line strip 1, line strip 2, line strip 3 */ }; // series of vertices
        int linestrip1Offset = /* index of the first vertex in line strip 1 */;
        int linestrip2Offset = /* index of the first vertex in line strip 2 */;
        int linestrip3Offset = /* index of the first vertex in line strip 3 */;
        int linestrip1VertexNum = /* number of vertices in line strip 1 */;
        int linestrip2VertexNum = /* number of vertices in line strip 2 */;
        int linestrip3VertexNum = /* number of vertices in line strip 3 */;

        // Setting up
        void init()
        {
            int[] vBO = new int[1];
            GL.GenBuffers(1, vBO);
            GL.BindBuffer(BufferTarget.ArrayBuffer, vBO[0]);
            GL.BufferData(BufferTarget.ArrayBuffer,
                          new IntPtr(vertices.Length * sizeof(double)),
                          vertices, BufferUsageHint.StaticDraw);
        }

        // Drawing
        void draw()
        {
            GL.BindBuffer(BufferTarget.ArrayBuffer, vBO[0]);
            GL.EnableClientState(ArrayCap.VertexArray);

            // The last argument to VertexPointer is a byte offset into the VBO;
            // point at the start of the buffer once and select ranges below.
            GL.VertexPointer(3, VertexPointerType.Double, 0, IntPtr.Zero);

            // DrawArrays takes the index of the first vertex plus a vertex count,
            // so each strip is drawn from its own range of the shared VBO.
            GL.DrawArrays(drawMode, linestrip1Offset, linestrip1VertexNum);
            GL.DrawArrays(drawMode, linestrip2Offset, linestrip2VertexNum);
            GL.DrawArrays(drawMode, linestrip3Offset, linestrip3VertexNum);

            GL.DisableClientState(ArrayCap.VertexArray);
            GL.BindBuffer(BufferTarget.ArrayBuffer, 0);
        }

    I don't know what I did wrong, but I think it should technically work, since we can tell OpenGL which part of the data in the VBO should be drawn.

    Read the article

  • Multiple country-specific domains or one global domain [closed]

    - by CJM
    Possible duplicate: How should I structure my URLs for both SEO and localization?

    My company currently has its main (English) site on a .com domain, with a .co.uk alias. In addition, we have separate sites for certain other countries. These are also hosted in the UK but are distinct sites with country-specific domain names (.de, .fr, .se, .es), and they have differing amounts of distinct but overlapping content. For example, the .es site is entirely in Spanish and has a page for every section of the UK site, but little else, whereas the .de site has much more content (but still less than the UK site), in German, and geared towards our business focus in that country. The main point is that the content on the additional sites is a subset of the UK site's, translated into the local language; although it is sometimes simply a translated version of UK content, it is usually tweaked for the local market and, in certain areas, unique. The other sites get a fraction of the traffic of the UK site. This is perfectly understandable, since the biggest chunk of our work comes from the UK and we've been established here for over 30 years. However, we want to build up our overseas business, and part of that is building up our websites to support it.

    The question: I suggested to the business that we might consider consolidating all our websites onto the .com domain, but with /en, /de, /fr, /se, etc. sections, as plenty of other companies seem to do. The theory was that the non-English sites would benefit from the greater reputation of the parent .com domain, and that all the content would be mutually supporting - my fear is that the child domains on their own are too small to compete against competitors who are established in these countries. Speaking to an SEO consultant from my hosting company, he feels that this move would have some benefit (for the reasons mentioned), but that it would likely be significantly outweighed by the loss of the benefits of localised domains. Specifically, he said that since the Panda update, and particularly the two sets of changes this year, we would lose more than we would gain. Having done some Panda research since, I've had my eyes opened on many issues, but curiously I haven't come across much that mentions localised domain names, though I do question whether Google would see it as duplicated content.

    It's not that I disagree with the consultant; I just want to know more before I make recommendations to my company. What is the prevailing opinion in this case? Would I gain anything from consolidating country-specific content onto one domain? Would Google see this as duplicate content? Would there be an even greater penalty from the loss of country-specific domains? And is there anything else I can do to help support the smaller, country-specific domains?

    Read the article

  • Inherit one instance variable from the global scope

    - by Julian
    I'm using Curses to create a command-line GUI with Ruby. Everything's going well, but I have hit a slight snag. I don't think Curses knowledge (esoteric, to be fair) is required to answer this question, just Ruby concepts such as objects and inheritance. I'm going to explain my problem now, but if I'm banging on, just look at the example below.

    Basically, every Window instance needs to have .close called on it in order to close it. Some Window instances have other Windows associated with them, and when closing a Window instance, I want to be able to close all of the other Window instances associated with it at the same time. Because associated Windows are generated in a logical fashion (I append a number to the name: instance_variable_set(self + integer, Window.new(10,10,10,10))), it's easy to target generated windows: methods can anticipate what associated windows will be called, since I can recreate the instance variable name from scratch and almost query it: instance_variable_get(self + integer). I have a delete method that handles this.

    If the delete method is just a normal, global method (called like this: delete_window(@win543)), everything works perfectly. However, if the delete method is an instance method, which it needs to be in order to use the self keyword, it doesn't work, for a very clear reason: it can 'query' the correct instance variable perfectly well (instance_variable_get(self + integer)), but because it's an instance method, the global-scope instance variables aren't in scope for it!

    Now, one way around this would obviously be to make it a normal global method, like delete_window(@win543). But I have attributes associated with my window instances, and it all works very elegantly. This is very simplified, but it translates the problem exactly:

        class Dog
          def speak
            woof
          end
        end

        def woof
          if @dog_generic == nil
            puts "@dog_generic isn't scoped when .woof is called from an instance method!\n"
          else
            puts "@dog_generic is scoped when .woof is called from the global scope. See:\n" + @dog_generic
          end
        end

        @dog_generic = "Woof!"
        lassie = Dog.new
        lassie.speak #=> @dog_generic isn't scoped when .woof is called from an instance method!
        woof         #=> @dog_generic is scoped when .woof is called from the global scope. See:\nWoof!

    TL;DR: I need lassie.speak to produce this string: "@dog_generic is scoped when .woof is called from the global scope. See:\nWoof!". @dog_generic must remain an instance variable, and the use of globals or constants is not acceptable. Could woof inherit from the global scope? Maybe some sort of keyword:

        def woof < global
          # This 'code' is just to conceptualise what I want to do, don't take offence!
        end

    Is there some way the woof method could 'pull in' @dog_generic from the global scope? Will @dog_generic have to be passed in as a parameter?

    Read the article

  • How to make one email a favorite in Gmail to send it more often

    - by Mirage
    I have one email that I need to forward on a regular basis. But when I forward it, all the emails I have previously forwarded are attached at the bottom, so it looks like one long conversation, and I have to click on the top email to forward it to someone again. Is there any way to mark that one email as starred (or similar) so that when I forward it, the forwarded copy doesn't attach itself to the original as part of a conversation? I'd like the email to stay on its own, so it's easy to forward to other people.

    Read the article

  • How to stretch the image from one screen to two screens

    - by wxiiir
    I want to be able to stretch the image from one monitor across a second monitor, even if I have to use some software to do it. For example, I want the top half of the image that is now shown on my first monitor to occupy the whole second monitor, and the bottom half to occupy the whole first monitor - or the other way around, I don't really care, as long as it works. It would also be cool to know how to do the same with the left and right halves. I know the image pixels would be stretched to twice their width or height this way, but I don't really care as long as it works; basically I want the picture to show on both monitors while keeping the same source pixels as before. I have an HD 4870 and Windows 7, and the HD 4000 family doesn't support having two monitors behave like one large monitor - only HD 5000 and up - which would solve my problem without any of the drawbacks, but it just can't be done (or maybe it can via software, but I'm just too tired of searching). One solution that makes almost any graphics card drive two monitors as one large display is the Matrox DualHead2Go, but that's about as expensive as a good HD 5000 card, so it's not worth it. Thanks in advance.

    EDIT: I guess nobody so far was able to fully understand my problem, which was very explicitly written, so I will elaborate some more. My HD 4870 can work with two monitors, but some things, like games, won't run across both monitors, which sucks. There are some ways to get around this problem: two of them are perfect or almost perfect but expensive, and the third would be a software solution that makes it possible. The first is an HD 5000-family video card, which works just fine with both monitors. The second is a Matrox DualHead2Go, which makes my HD 4870 detect my two monitors as one large monitor. The third is software that makes my two displays be detected as one large display, then captures the output of the video card, splits the image and renders the halves as 2D images to both monitors; or - simpler, but with output pixels doubled in width or height - captures the output for one screen, splits it in two, enlarges each half to fit a monitor, and outputs it to the monitors.

    P.S. By capturing the output of the video card I just mean making the video card process things in a certain way. Making the video card detect two monitors as one large monitor via software may be impossible or impractical, but stretching the output as a 2D image from one monitor to both should be a walk in the park for some coders, so it's likely that such a program exists, or that some widespread dual-monitor software has such a function built in.

    Read the article

  • Installing SharePoint 2010 on one machine with the built-in database

    - by sreejukg
    It is very easy to deploy SharePoint 2010 on a single server using the built-in database; normally you would choose such an installation for evaluation purposes. When installing with default settings, setup installs a Microsoft SQL Server 2008 Express database along with SharePoint. After installing SharePoint, you need to run the SharePoint Products and Technologies Configuration Wizard, which installs the Central Administration website and creates the configuration database and content database for your SharePoint sites.

    Limitations:
    1. You cannot perform this installation on a domain controller.
    2. The maximum database size for the Express edition is 4 GB.

    SharePoint 2010 only supports 64-bit operating systems. The following installation steps are for Windows Server 2008 R2 64-bit Enterprise Edition.

    Installation steps: As a first step, you need to install the software prerequisites by clicking the corresponding link on the setup splash screen. Click Next, agree to the license terms by selecting the checkbox, and click Next again. The installation starts, and progress is shown on screen. This may take some time, as some components need to be downloaded from the Internet during this process; make sure you are connected to the Internet, otherwise the installation will not succeed. If any error occurs, the wizard displays it, and you need to resolve it in order to continue. If everything is OK, you will see a success page; click Finish to exit the installation window.

    Now, from the splash screen, select Install SharePoint Server. This will install SharePoint and the SQL Server 2008 Express edition. First, enter the product key for SharePoint and click Continue. Next, accept the license agreement by selecting the checkbox and clicking Continue, then select the installation type you want and click the Standalone button. (In a production scenario, you would select the server-farm installation; that is not in the scope of this article.) Once you click Standalone, the installation starts and you can view its progress. If any error occurs during installation, you will get the details and a link to the log file; refer to the log file, fix the corresponding issue, and start the installation again. If the installation completes without any error, make sure the "Run the SharePoint Products Configuration Wizard now" checkbox is selected and click Close.

    The SharePoint Products Configuration Wizard starts. Click Next; you will get a warning, and clicking Yes starts the configuration steps, for which you can view the progress of each step. Once they have completed, a success screen appears; click Finish to complete the installation.

    The SharePoint installation is now complete. You can navigate to the SharePoint Central Administration website from Administrative Tools and start building your portal. Good luck!

    Read the article

  • SharePoint 2007 Hosting :: How to Move a Document from One Library to Another

    - by mbridge
    Moving a document using a SharePoint Designer workflow involves copying the document to the SharePoint document library you want to move it to, and then deleting it from the document library it is currently in. You can use the Copy List Item action to copy the document and the Delete Item action to delete it. To create a SharePoint Designer workflow that can move a document from one document library to another:

    1. In SharePoint Designer 2007, open the SharePoint site on which the document library that contains the documents to move is located.
    2. On the "Define your new workflow" screen of the Workflow Designer, enter a name for the workflow, select the document library you want to attach the workflow to (the library containing the documents to move), select "Allow this workflow to be manually started from an item", and click Next.
    3. On the Step 1 screen of the Workflow Designer, click Actions, and then click More Actions from the drop-down menu.
    4. On the Workflow Actions dialog box, select List Actions from the category drop-down list box, select Copy List Item from the actions list, and click Add. The following text is added to the Workflow Designer: "Copy item in this list to this list".
    5. On the Step 1 screen of the Workflow Designer, click the first "this list" (representing the document library to copy the document from) in the text of the Copy List Item action.
    6. On the Choose List Item dialog box, leave Current Item selected, and click OK.
    7. On the Step 1 screen of the Workflow Designer, click the second "this list" (representing the document library to copy the document to) in the text of the Copy List Item action, and select the destination document library from the drop-down list box that appears.
    8. On the Step 1 screen of the Workflow Designer, click Actions, and then click More Actions from the drop-down menu.
    9. On the Workflow Actions dialog box, select List Actions from the category drop-down list box, select Delete Item from the actions list, and click Add. The following text is added to the Workflow Designer: "then Delete item in this list".
    10. On the Step 1 screen of the Workflow Designer, click "this list" in the text of the Delete Item action.
    11. On the Choose List Item dialog box, leave Current Item selected and click OK. The final text for the workflow should now read: "Copy item in DocLib1 to DocLib2, then Delete item in DocLib1", where DocLib1 is the SharePoint document library containing the document to move and DocLib2 is the document library to move the document to.
    12. On the Step 1 screen of the Workflow Designer, click Finish.

    How to test the workflow:
    1. Go to the SharePoint document library to which you attached the workflow, click on a document, and select Workflows from the drop-down menu.
    2. On the Workflows page, click the name of your SharePoint Designer workflow.
    3. On the workflow initiation page, click Start.

    Read the article

  • Some thoughts on email hosting for one’s own domain

    - by jamiet
    I have used the same email providers for my own domains for a few years now; however, I am considering moving over to a new provider. In this post I'll share my current thoughts, and hopefully I'll get some feedback that might help me decide what to do next.

    What I use today

    I have three email addresses that I use primarily (I have changed the domains in this blog post as I don't want to give them away to spammers):

    - [email protected] - My personal account that I give out to family and friends and which I use to register on websites
    - [email protected] - An account that I use to catch email from the numerous mailing lists that I am on
    - [email protected] - I am a self-employed consultant, so this is an account that I hand out to my clients, my accountant, and other work-related organisations

    Those two domains (jtpersonaldomain.com & jtworkdomain.com) are both managed at http://domains.live.com, which is a fantastic service provided by Microsoft that, for some perplexing reason, they never bother telling anyone about. It offers multiple accounts (I have seven at jtpersonaldomain.com, though as already stated I only use two of them), which are accessed via Outlook.com (formerly Hotmail.com), along with usage reporting plus a few other odds and sods that I never use. Best of all, though, it's totally free. In addition, given that I have got both domains hosted using http://domains.live.com, I can link my various accounts together and switch between them at Outlook.com without having to log in and out.

    N.B. You'll notice that there are two other accounts listed there in addition to the three I already mentioned. One is my mum's account, which helps me provide IT support/spam-filtering services to her, and the other is the donation account for AdventureWorks on Azure. I find that linking feature to be very handy indeed.

    Finally, http://domains.live.com is the epitome of "it just works". I set up jtworkdomain.com at http://domains.live.com over three years ago, and I am pretty certain I haven't been back there even once to administer it.

    Proposed changes

    OK, so if I like http://domains.live.com so much, why am I considering changing? Well, I earn my corn in the Microsoft ecosystem, and if I'm reading the tea leaves correctly, it's looking increasingly likely that the services I'm going to have to be familiar with in the future are all going to be running on top of and alongside Windows Azure Active Directory and Office 365 respectively. It's clear to me that Microsoft is pushing its customers toward cloud services and, like it or lump it, data integration developers like me may have to come along for the ride. I don't think the day is too far off when we can log into Windows Azure SQL Database (aka SQL Azure), Team Foundation Service, Dynamics etc. using the same credentials that are currently used for Office 365, and over time I would expect those things to get integrated together a lot better - integration that will be based upon a Windows Azure Active Directory identity. This should not come as a surprise; in my opinion, Microsoft's whole enterprise play over the past 15 or 20 years can be neatly summarised as "get people onto Windows Server and Active Directory, then upsell from there" - in the not-too-distant future, the only difference is that they're trying to do it in the cloud. I want to get familiar with these services, and hence I am considering moving jtworkdomain.com onto Office 365.
I’ll lose the convenience of easily being able to switch to that account at Outlook.com and moreover I’ll have to start paying for it (I think it’ll be about fifty quid a year – not a massive amount but its quite a bit more than free) but increasingly this is beginning to look like a move I have to make. So that’s where my head is at right now. Anyone have any relevant thoughts or experiences to share? Please let me know in the comments below. @Jamiet

    Read the article

  • UEFI Dual-Boot - Ubuntu 12.04.3 + Windows 8.1 (One GPT HDD)

    - by swafbrother
    Hello, I'm having trouble setting up a dual boot (Ubuntu 12.04 LTS and Windows 8.1) on my ASUS K55VM laptop's hard drive (500 GB). I was mostly following tutorials for doing this, but at some point something has gone wrong. Up to now, I have followed these steps:

    1. I formatted my HDD as GPT.
    2. I clean-installed Windows 8.1. I didn't prevent Windows from choosing the partitions to use, and it created: a Recovery partition (sda1), an EFI System Partition (sda2), a Microsoft Reserved Partition (sda3), and a Windows data partition, the C drive (sda4).
    3. I shrank the Windows data partition via Windows' Disk Management.
    4. I made a bootable USB stick with Ubuntu 12.04 LTS from the ISO, using Universal USB Installer.
    5. I created these partitions for Ubuntu: a boot partition mounted at /boot (sda5), a root partition mounted at / (sda6), and a swap partition (sda7). For "Device for boot loader installation" I chose /dev/sda.
    6. When I rebooted, it went straight into Ubuntu, so I installed Boot-Repair and clicked Recommended Repair, which automatically did its job without asking for anything.
    7. I rebooted and GRUB showed up, with a lot of options. At this point I had a decent dual-boot setup; the Ubuntu entry and both Windows entries worked fine: "Ubuntu", "Windows Boot UEFI Loader", and "Windows UEFI bkpbootmgfw.efi".
    8. I executed this command: sudo grub-install --force /dev/sda5.
    9. I then tried to make Windows 8.1's boot manager the main boot manager, so that I could choose which OS to boot into from a menu. I downloaded EasyBCD on Windows; it showed two Ubuntu entries and one Windows entry. I went into the BCD Deployment tab and clicked Write MBR.
    10. I went into the BIOS and made Windows Boot Manager the first boot option.
    11. When I rebooted, I got a black screen with the message "efidisk read error", and then (I guess) it switched to the next boot option, which is Ubuntu, so GRUB showed up.
    12. From GRUB, the Ubuntu entry works, and so do both Windows entries. Choosing Ubuntu boots into Ubuntu normally; choosing Windows goes into Windows' boot manager.
    13. In Windows' boot manager, a menu shows up with three entries: Ubuntu, Ubuntu, and Windows 8.1. Choosing Windows boots into Windows without any problem; choosing Ubuntu boots back into GRUB (back to step 11).

    Here's my boot info summary: http://paste.ubuntu.com/6698171/

    Windows Boot Manager is clearly not working as expected: I can't boot into it directly, and I can't boot into it from the BIOS either (efidisk read error again). If I want to boot into Windows, I need to boot into GRUB first, which is the opposite of what I wanted. I need help at this point. What is the best thing I can do? Is there a more reliable and/or simpler way of accomplishing a satisfying dual boot for this situation? Can someone provide a way to go back to step 7, where I had a more efficient dual-boot setup? If only I could undo what I did with EasyBCD and skip Windows' boot menu... Can someone provide a way to fix this mess? Thanks in advance, and sorry for the length of this; I wanted to be exhaustive.

    Read the article
