Search Results

Search found 90811 results on 3633 pages for 'hyper v server 2012 r2'.


  • DC on Hyper-V Host

    - by Saif Khan
    I've read a few similar questions but wasn't clear. I have a small office (13 users) and recently purchased a single new server (this is all I have to work with for now). Server specs: Memory: 16 GB; Drives: 6 x SCSI (2 TB); Processor: dual quad-core. I plan on making the Hyper-V host the DC and then running 2 VMs, one as a file server and the other as an application server. Question: since I am restricted to this single server, would it be a big issue making the Hyper-V host the domain controller? Your input is greatly appreciated.
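
    A common recommendation for exactly this constraint is to keep the host as a bare hypervisor and run the domain controller as a third, small VM rather than on the host itself. As a minimal sketch of carving up the box that way with the Hyper-V PowerShell module (available on Server 2012 and later) -- all names, paths, and sizes below are illustrative assumptions, not settings from the question:

        # Assumes the Hyper-V role is installed and D:\VHDs exists; tune sizes to taste.
        New-VM -Name "DC01"       -MemoryStartupBytes 2GB -NewVHDPath "D:\VHDs\DC01.vhdx"       -NewVHDSizeBytes 100GB
        New-VM -Name "FileServer" -MemoryStartupBytes 4GB -NewVHDPath "D:\VHDs\FileServer.vhdx" -NewVHDSizeBytes 500GB
        New-VM -Name "AppServer"  -MemoryStartupBytes 6GB -NewVHDPath "D:\VHDs\AppServer.vhdx"  -NewVHDSizeBytes 500GB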

    Read the article

  • Reporting Services 2008 & 2012 side by side

    - by Iulian Ilies
    I have installed SQL Server 2008 and 2012 side by side on the same machine, including the Reporting Services for each. Both are named instances: MSSQLSERVER2008 and MSSQLSERVER2012. I configured the 2012 instance first and it is working fine. However, when I later wanted to configure the 2008 Reporting Services instance, I was not able to do so; Reporting Services Configuration Manager simply cannot find it. Both services are displayed as running, yet only the 2012 instance shows up in Reporting Services Configuration Manager. I tried stopping the 2012 instance, but still no luck: 2008 won't show up in the RS Configuration Manager. Any suggestions? Thanks in advance, Iulian

    Read the article

  • Problem installing a w2k DC on Hyper-V?

    - by Tony
    Hi, we have a four-node Windows 2008 R2 cluster with Hyper-V installed. We would like to install two VMs as Windows 2000 domain controllers (the domain is different from the domain of the Hyper-V cluster). Do you know if there are any restrictions on doing this? Some colleagues say we risk data corruption if we do live migrations. Others point out that Microsoft no longer supports Windows 2000. And others have doubts because the global catalog server installed on these DCs could suffer a loss of performance. Any ideas? Thanks, Tony

    Read the article

  • Hyper-V Ubuntu Networking Problems Copying Large Amounts of Data

    - by Anonymous
    I am trying to copy a large amount (about 50 GB) of data over my network from a Hyper-V-hosted virtual machine running Ubuntu 11.04 (Natty Narwhal) to another (non-virtual) Ubuntu host that I plan to use for testing upgrades to one of our web applications. The problem I am having is with the virtual machine, which I shall refer to in what follows as "source.host". This machine is running 64-bit Ubuntu Server with the 2.6.38-8-server kernel and the Microsoft Linux Integration Components for Hyper-V kernel modules (hv_utils, hv_timesource, hv_netvsc, hv_blkvsc, hv_storvsc, and hv_vmbus) loaded. It uses a Hyper-V "synthetic network adapter" for its networking interface. To do the copy, I log on to the machine with the data and run the following commands (call the remote machine "destination.host"): $ cd /path/to/data $ tar -cvf - datafolder/ | ssh [email protected] "cat > ~/data.tar" This runs for a while and then suddenly stops after transferring somewhere from 2-6 GB. The terminal on the source.host machine displays a Write failed: broken pipe error. The odd part is this: after this occurs, the "source.host" machine is no longer able to talk to the rest of the network. I cannot ping any other hosts on the network from the "source.host" machine, and I cannot ping the "source.host" machine from any other host on the network. I am equally unable to access any of the web services hosted on "source.host". Running ifconfig on "source.host" shows the network adapter to be up and running as usual with the correct IP address and everything. I tried restarting the networking service with $ /etc/init.d/networking restart but the problem does not go away. Restarting the machine makes it capable of talking to the network again -- it can ping and be pinged by other hosts, and the web services are also accessible and usable as normal -- but attempting the copy operation again results in the same failure, requiring another restart. As an experiment, I tried replacing the tar -- ssh pipeline above with a straight scp: $ scp -r datafolder/ [email protected]:~ but to no avail. Thinking that the issue might have to do with the kernel packet-send buffers filling up, I tried increasing the buffer size to 12 MB (up from the 128 KB default) with # echo 12582911 > /proc/sys/net/core/wmem_max but this also had no effect. I'm guessing at this point that it might be a problem with the Microsoft synthetic network driver, but I don't really know. Does anyone have any suggestions? Thank you very much in advance!
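
    One workaround sometimes suggested for this class of hv_netvsc stall -- offered here as an assumption to test, not a confirmed fix -- is to disable the synthetic adapter's segmentation offloads and retry the transfer:

        $ sudo ethtool -k eth0                  # lowercase -k: show current offload settings
        $ sudo ethtool -K eth0 tso off gso off  # uppercase -K: disable TCP/generic segmentation offload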

    Read the article

  • Hyper-V dynamic memory: client machines always use maximum memory

    - by Eric P
    When I create a virtual machine in Hyper-V and set it up to use dynamic memory, the virtual machine will always use the maximum memory within the virtualized OS. Hyper-V will show the assigned memory at 514 MB, but when I log into the server and pull up Task Manager, it will show 90% memory used. When I bump the maximum memory up to 4 GB, I get the same result: 90% memory usage. Nothing is even running on the virtual machine other than a clean install of Windows Server 2008 R2. I have also tried it with Windows 7, with the same results. Is this the expected behavior, or is something set up wrong?

    Read the article

  • Hyper-V: determine the guest's name given the GUID

    - by syneticon-dj
    How would I go about determining the guest's name given its GUID, or vice versa, preferably with only the Hyper-V/Server Core stock install at hand? Rationale: I am in favor of having a repository of dirty tricks to revert to when in great need. To immediately quiesce all (storage) operations of a VM guest without losing its state, I used to run kill -STOP <all of the VM's processes> (signaling SIGSTOP) and resumed afterwards using kill -CONT <all of the VM's processes> (signaling SIGCONT) in the ESXi/vSphere shell. I tried the same technique with Hyper-V using Process Explorer's "Suspend" functionality on the vmwp.exe processes and it seemed to work. I have yet to find a way to easily identify the processes to suspend, though - the vmwp command line only lists a GUID.
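
    Assuming only the stock WMI provider is at hand, the Msvm_ComputerSystem class pairs each VM's friendly name (ElementName) with its GUID (Name), which can then be matched against the GUID in the vmwp.exe command line. A sketch (on Server 2012 and later the namespace is root\virtualization\v2):

        Get-WmiObject -Namespace root\virtualization -Class Msvm_ComputerSystem |
            Select-Object ElementName, Name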

    Read the article

  • rdesktop remote connect to Windows Server 2012 R2, device redirection doesn't work

    - by user197677
    I use rdesktop to connect to a Windows Server 2012 R2 machine from my Linux box. That worked well, but it doesn't work when I redirect my local directory /home to the server. My command: rdesktop -a 24 -r disk:home=/home -u test 192.168.1.110 I can connect to the Windows server, but I cannot see the directory on the remote computer. Running net use x: \\tsclient\home works when I connect to Windows 7, but it yields this error on Windows Server 2012 R2: System error 87 has occurred. The parameter is incorrect. I have no idea what to do. I have turned off the firewall and deployed RemoteFX to enable device redirection.
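
    If rdesktop's older device-redirection code is the culprit (an assumption, not a confirmed diagnosis), FreeRDP's drive redirection is worth a try against the same server:

        $ xfreerdp /v:192.168.1.110 /u:test /drive:home,/home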

    Read the article

  • Preparing a Hyper-V VM image

    - by Anteru
    We have a Hyper-V Windows Server 2k3, and we're hosting multiple VMs on it. However, right now we always start the VM creation right on the server; i.e., when preparing a new Ubuntu image, I just install it into a new VM, set it up, and when I'm happy we store the disk image. I wonder if there is a way to prepare a Hyper-V image locally on my desktop machine instead? I'm running Windows 7, and I would love to be able to set up a VM so that we can copy the image over to the server and be done with it. This is for Linux images only, and we definitely need the Hyper-V network integration. Is there a recommended way to prepare Hyper-V images without running a Hyper-V instance somewhere?

    Read the article

  • Hyper-V share a folder between host and instance

    - by Fly_Trap
    I have a Hyper-V server and several VMs (virtual machines). All the VMs are connected to an external network. I have tried to share a folder on the host and connect from a VM; I can do this, but I'm prompted for a user name and password (as you would expect). I do not want to enable the "Everyone" group permissions, as the physical host server is on a network with other servers. I have created a new virtual internal network in Hyper-V and given its adapter a static IP of 33.0.0.100. I have added the virtual adapter to one of the VMs and set its IP to 33.0.0.2 (as advised here). Again this seems to work, but I'm still prompted for a user name and password. Am I on the right lines here? I just want to share a directory from the host to the VMs without exposing the share to other servers on the network.
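
    One way to avoid both the "Everyone" ACL and the interactive prompt (a sketch; the account, share, host, and password names are illustrative assumptions) is to share the folder to a dedicated local account and cache that account's credentials inside the guest:

        rem On the host: share only to a dedicated local user
        net share VMShare=C:\VMShare /GRANT:vmshareuser,FULL

        rem In the guest: cache the credential for the host's internal IP, then map the share
        cmdkey /add:33.0.0.100 /user:HYPERVHOST\vmshareuser /pass:SecretHere
        net use Z: \\33.0.0.100\VMShare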

    Read the article

  • SQL SERVER – SQL Server 2008 with Service Pack 2

    - by pinaldave
    SQL Server 2008 Service Pack 2 has already been released. I suggest that all of you who are running SQL Server 2008 update to SQL Server 2008 Service Pack 2. Download SQL Server 2008 Service Pack 2 from here. Please note, this is not SQL Server 2008 R2; it is SQL Server 2008 Service Pack 2. A Test Lab Guide for SQL Server 2008 with Service Pack 2 has also been released by Microsoft. This document contains an introduction to SQL Server 2008 with Service Pack 2 and step-by-step instructions for extending the Base Configuration test lab to include a SQL Server 2008 with Service Pack 2 server. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology

    Read the article

  • Download SQL Server 2008 R2 Express (Database Size Limit Increased to 10GB! )

    - by Aamir Hasan
    Yesterday I was researching SQL Server 2008 and found the new release, MS SQL Server 2008 R2, which has many new BI features and enhancements. There is a tiny cute feature that I am sure all of us will appreciate a lot: the product team has increased the database size limit for SQL Server 2008 R2 Express from 4 GB to 10 GB. So if you have a growing SQL Server Express database that is close to the 4 GB limit, hurry, upgrade to R2 Express. See the announcement from the product team. SQL Server 2008 R2 Express download.

    Read the article

  • Windows web server and SQL Server on same dedicated server

    - by asinc
    I'm currently trying to decide on the best approach to hosting a few moderate-traffic websites for production e-commerce and online applications. We'd like to move to a dedicated server and are looking at this as the most likely machine: quad-core Intel Core 2 Quad Q9550 processor, 2.83 GHz x 4, 4 GB Kingston RAM. This would run Windows Web Server 2008 R2 x64 and potentially also SQL Server 2008 Web edition and SmarterMail server. Given that we already pay for a high-end VPS for development, testing, and shared version control, we'd like to avoid going with two servers for production. We'd like to avoid using shared SQL Server hosting, and have thought of using the development server as the database server too - but that is potentially a security risk due to its use for development by internal and contract users. The questions are: Do you feel there would be performance degradation by running this on the same machine? Are there significant issues to be concerned about if we do this? We understand that best practice would be to run separate DB and app servers, but the volume of traffic is currently not that high, and adding another server just for the database is currently too costly. What are others doing out there? Alternatively, would you recommend instead going with two separate VPS servers with 2 GB RAM each on Hyper-V, which would be about the same cost as the single dedicated server above? Thanks!
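
    If both roles do end up on one box, a common mitigation is to cap SQL Server's memory so IIS and the OS are not starved; a sketch, where the 2048 MB cap is purely illustrative:

        EXEC sp_configure 'show advanced options', 1;
        RECONFIGURE;
        EXEC sp_configure 'max server memory', 2048;  -- in MB; size to leave headroom for IIS
        RECONFIGURE;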

    Read the article

  • Problem with importing an mdf created with SQL Server Express 2008 into SQL Server 2005

    - by user252160
    The question is probably extremely easy to resolve, but I need to resolve it so I can carry on with my project. I am using SQL Server Express 2008 at home, and I've been working on an ASP.NET MVC app that stores my DB in an mdf file in the project's folder. The problem is that the SQL Server in the uni labs is SQL Server 2005, and when I try to open the mdf file with the VS Server Explorer, it says that the version of the mdf file is newer than the server can accept. The only option that comes to my mind is exporting the DB as an sql file, just like I've done a thousand times with phpMyAdmin. The thing is that SQL Server Management Studio Express is not the most usable tool in the world, and for some strange reason all the articles I could find on Google were irrelevant. Please help.
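
    One route that should work (hedged -- the menu and option names here are as I recall them from SSMS 2008): right-click the database, choose Tasks > Generate Scripts, set "Script for Server Version" to SQL Server 2005 and "Script Data" to True, save the output to a file, and run it against the uni's 2005 instance, e.g.:

        sqlcmd -S UNISERVER\SQL2005 -i MyDb_For2005.sql   -- instance name and file are illustrative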

    Read the article

  • Configuring external SMTP server on Azure VM - messages staying in queue

    - by Steph Locke
    I have an external SMTP provider: auth.smtp.1and1.co.uk. I am trying to send SQL Server Reporting Services emails via this provider on a Windows 2012 Azure VM. It is configured well enough for emails to be generated, but I have not configured (or have mis-configured) something, so the emails then stay in the queue.

    Setup details. I configured the SMTP virtual server as follows:
    - General: IP address: fixed value
    - Access: Access control: Authentication: ticked Anonymous access
    - Access: Connection control: all except the list below (which is empty)
    - Access: Relay restrictions: only the list below (which contains 127.0.0.1), ticked the 'allow all...' option
    - Delivery: Outbound security: Basic Authentication with username and password completed, ticked TLS encryption
    - Delivery: Outbound connections: TCP port = 587
    - Delivery: Advanced: FQDN = ServerName, smart host = auth.smtp.1and1.co.uk

    I then set the following SSRS rsreportserver.config values:
    <SMTPServer>100.92.192.3</SMTPServer>
    <SendUsing>2</SendUsing>
    <SMTPServerPickupDirectory>c:\inetpub\mailroot\pickup</SMTPServerPickupDirectory>
    <From>[email protected]</From>

    Tried so far:
    1) Turning the SMTP service off and on again (just in case)
    2) Running SMTPDiag with no errors (also no emails)
    3) Turning off the firewall for the ports (and more generally, to see if it made a difference)
    4) Generating mail from PowerShell, which resulted in the message sitting in the queue
    5) Adding 25 and 587 as endpoints
    6) Perusing the event log, where I found some warnings that appear to be about the recipient:
    Message delivery to the remote domain 'gmail.com' failed for the following reason: Unable to bind to the destination server in DNS.
    Message delivery to the host '212.227.15.179' failed while delivering to the remote domain 'gmail.com' for the following reason: The remote server did not respond to a connection attempt.
    7) Pinging, but this appears to be blocked on Azure
    8) More PowerShell sends using different domain variants (localhost, boxname, the internal IP used in the SMTP properties, 127.0.0.1) - none resulting in success
    9) Adding a remote domain - no change

    Could anyone recommend what step 10 should be in fixing this issue, please?
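
    A possible step 10: take SSRS and the local queue out of the picture and test the 1and1 smarthost directly from PowerShell. If this succeeds, the fault is in the virtual server's relay/delivery settings rather than in Azure networking. The recipient address is illustrative:

        $cred = Get-Credential   # the 1and1 SMTP login
        Send-MailMessage -SmtpServer auth.smtp.1and1.co.uk -Port 587 -UseSsl `
            -From [email protected] -To [email protected] -Subject "relay test" -Credential $cred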

    Read the article

  • SQL Server 2005 Merge Replication to SQL Server CE 3.5

    - by user33067
    Hi, in my organization we have a SQL Server 2005 database server (DBServer). Users of an application will normally be connected to DBServer but, occasionally, would like to disconnect and continue their work on a laptop using SQL Server Compact Edition 3.5 (SQLCE). Because of this, we have been looking into using merge replication between DBServer and SQLCE. From what I have read about this process, IIS must be installed on "the server"... yet I have found no indication of whether this means DBServer or SQLCE. I had assumed the documentation was referring to DBServer and proposed this to our networking staff. That idea was quickly put to rest, as it is not our policy to install IIS on an internal server. This is where our SQL Server 2005 web server (WebServer) entered the picture, the idea being that IIS would be installed on WebServer and would be the conduit for DBServer and SQLCE to communicate. This sounded like a good idea at first, until I started looking for documentation on this type of setup. Everything I have been able to find deals with a DBServer -- SQLCE -- DBServer setup... nothing on DBServer -- WebServer -- SQLCE -- WebServer -- DBServer. Questions: Is going with a 3-server setup ideal? Does anyone have documentation on this type of setup? Does IIS even need to be running on one of the big servers, or can it just run off the laptop with SQLCE on it? (I'd really like this option ;))
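
    For what it's worth, in web synchronization the IIS box only needs to host the SQL Server Compact server agent; the subscriber points at that URL while the publisher stays on DBServer, which matches the proposed DBServer/WebServer split. A minimal C# sketch, with every name and URL being an illustrative assumption:

        using System.Data.SqlServerCe;

        // Web synchronization from the laptop's SQLCE subscriber, through the
        // IIS-hosted agent (sqlcesa35.dll) on WebServer, to the DBServer publication.
        var repl = new SqlCeReplication
        {
            InternetUrl = "https://webserver.example.com/sqlce/sqlcesa35.dll",
            Publisher = "DBSERVER",
            PublisherDatabase = "AppDb",
            Publication = "AppPublication",
            Subscriber = "Laptop01",
            SubscriberConnectionString = @"Data Source=C:\Data\AppDb.sdf"
        };
        repl.Synchronize();  // pulls changes from DBServer through the IIS endpoint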

    Read the article

  • Hyper-Threading and Dual-Core, What's the Difference?

    - by Josh Stodola
    In a conversation with the network administrator, I mentioned that my machine was a dual-core. He told me it was not. I brought up Task Manager, went to the Performance tab, and showed him that there are two separate CPU usage graphs. I have a quad-core machine at home and it has four graphs. He told me there were two graphs on this particular machine because of hyper-threading. I used to have a hyper-threaded Pentium 4 processor back in the day, but I never fully understood what it meant. So what is the difference between hyper-threading and dual-core? And how do you tell which one you have?
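
    One way to tell them apart on Windows versions that expose these WMI properties: fewer cores than logical processors means Hyper-Threading, while equal counts greater than one mean a true multi-core chip.

        Get-WmiObject Win32_Processor |
            Select-Object Name, NumberOfCores, NumberOfLogicalProcessors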

    Read the article

  • Access server using IP on another interface

    - by Markos
    I am using Windows Server 2012 instead of a router for my home network. Currently I am using RRAS, and computers on the local network can access the Internet correctly. Here is a map of the current setup:

    [PC1] ---|
             |---- (lan ip)[Server](wan ip) --> internet
    [PC2] ---|

    I have applications running on Server, such as IIS and others. All can be accessed from the Internet using the WAN IP and from the LAN using the LAN IP. I have a domain, let's say it's my-domain.com, which resolves to my WAN IP. What I want is to enable my LAN computers to connect to services on my server using the very same address as Internet users: e.g. http://my-domain.com/. However, this does not work for my LAN computers. What I understand is that I need to set up some kind of loopback route such that packets coming to the LAN interface get routed to the WAN interface. But I haven't found how to achieve this (in fact, I don't know WHAT to search for). Feel free to ask for additional information and I will try to update the question.
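
    The usual alternative to hairpin NAT in this layout is split-brain DNS: let LAN clients resolve my-domain.com to the server's LAN IP while the public record keeps the WAN IP. A sketch, assuming the DNS Server role is installed on the box and the LAN clients use it as their resolver (the LAN IP shown is illustrative):

        Add-DnsServerPrimaryZone -Name "my-domain.com" -ZoneFile "my-domain.com.dns"
        Add-DnsServerResourceRecordA -ZoneName "my-domain.com" -Name "@" -IPv4Address 192.168.0.1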

    Read the article

  • SQL SERVER – Repair a SQL Server Database Using a Transaction Log Explorer

    - by Pinal Dave
    In this blog, I’ll show how to use ApexSQL Log, a SQL Server transaction log viewer. You can download it for free, install it, and play along. But first, let’s describe some disaster recovery scenarios where it’s useful.

    About SQL Server disaster recovery

    Along with database development and administration, you must work on a good recovery plan. Disasters do happen and no one’s immune. What you can do is take all the actions needed to be ready for a disaster and go through it with minimal data loss and downtime. Besides creating a recovery plan, it’s necessary to have a list of steps that will be executed when a disaster occurs, and to test them before a disaster. This way, you’ll know that the plan is good and viable. Testing can also be used as training for all team members, so they can all understand and execute it when the time comes. It will show how much time is needed to have your servers fully functional again and how much data you can lose in a real-life situation. If these don’t meet recovery-time and recovery-point objectives, the plan needs to be improved. Keep in mind that all major changes in environment configuration, business strategy, and recovery objectives require new recovery plan testing, as these changes most probably mean the recovery plan needs changing and tweaking.

    What is a good SQL Server disaster recovery plan?

    A good SQL Server disaster recovery strategy starts with planning SQL Server database backups. An efficient strategy is to create a full database backup periodically. Between two successive full database backups, you can create differential database backups. It is essential to create transaction log backups regularly between full database backups. Keep in mind that transaction log backups can be created only on databases in the full recovery model. In other words, a simple but efficient backup strategy would be a full database backup every night and a transaction log backup every hour, or every 15 minutes. The frequency depends on how much data you can afford to lose and how busy the database is. Another option, instead of creating a full database backup every night, is to create a full database backup once a week (e.g. on Friday at midnight) and a differential database backup every night until the next Friday, when you will create a full database backup again. Once you create your SQL Server database backup strategy, schedule the backups. You can do that easily using SQL Server maintenance plans.
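
    As an illustration of the schedule just described -- the database name and paths are examples only -- the two building blocks are plain BACKUP statements, typically wrapped in SQL Server Agent jobs:

        -- Nightly full backup:
        BACKUP DATABASE AdventureWorks TO DISK = N'D:\Backup\AdventureWorks_full.bak' WITH INIT;
        -- Every 15 minutes:
        BACKUP LOG AdventureWorks TO DISK = N'D:\Backup\AdventureWorks_log.trn';
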
    Why are transaction logs important?

    Transaction log backups contain the transactions executed on a SQL Server database. They provide enough information to undo and redo the transactions and roll the database back or forward to a point in time. In SQL Server disaster recovery situations, transaction logs make it possible to repair a SQL Server database and bring it to the state before the disaster. Be aware that even with regular backups, there will be some data missing: the transactions made between the last transaction log backup and the time of the disaster. In some situations, it’s not necessary to re-create the database from its last backup in order to repair it. The database might still be online, and all you need to do is roll back several transactions, such as a wrong update, insert, or delete. The restore to a point in time feature is available in SQL Server, but for large databases it is very time-consuming, as SQL Server first restores a full database backup and then restores transaction log backups, one after another, up to the recovery point. During that time, the database is unavailable. This is where a SQL Server transaction log viewer can help. For optimal recovery, besides having a database in the full recovery model, it’s important that you haven’t manually truncated the online transaction log. This ensures that all transactions made after the last transaction log backup are still in the online transaction log. All you have to do is read and replay them.

    How to read a SQL Server transaction log?

    SQL Server doesn’t provide an option to read transaction logs. There are several SQL Server commands and functions that read the content of a transaction log file (fn_dblog, fn_dump_dblog, and DBCC PAGE), but they are undocumented. They require T-SQL knowledge and return a large number of columns that are not easy to read and understand, sometimes in binary or hexadecimal format. Another challenge is reading UPDATE statements, as it’s necessary to match them to values in the MDF file. When you have finally read the transactions executed, you still have to create a script to replay them.
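
    For example, a peek at the raw log with the undocumented fn_dblog function looks like this (a sketch; the column list is abridged, and the output may change between SQL Server versions):

        -- NULL, NULL = no start/end LSN filter, i.e. the whole online log
        SELECT [Current LSN], Operation, Context, [Transaction ID], AllocUnitName
        FROM fn_dblog(NULL, NULL);
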
    How to easily repair a SQL database?

    The easiest solution is to use a transaction log reader that will not only read the transactions in the transaction log files, but also automatically create scripts for the transactions read. In the following example, I will show how to use ApexSQL Log to repair a SQL database after a crash. If a database has crashed and both the MDF and LDF files are lost, you have to rely on the full database backup and all subsequent transaction log backups. In another scenario, the MDF file is lost but the LDF file is available.

    First, restore the last full database backup on SQL Server using SQL Server Management Studio. I’ll name it Restored_AW2014. Then, start ApexSQL Log. It will automatically detect all local servers. If not, click the icon to the right of the Server drop-down list, or just type in the SQL Server instance name. Select the Windows or SQL Server authentication type and select the Restored_AW2014 database from the database drop-down list. When all options are set, click Next. ApexSQL Log will show the online transaction log file. Now, click Add and add all transaction log backups created after the full database backup I used to restore the database. In case you don’t have transaction log backups but the LDF file hasn’t been lost during the SQL Server disaster, add it using Add.

    To repair a SQL database to a point in time, ApexSQL Log needs to read and replay all the transactions in the transaction log backups (or the LDF file saved after the disaster). That’s why I selected the Whole transaction log option in the Filter setup. ApexSQL Log offers a range of filters, which are useful when you need to read just specific transactions. You can filter transactions by the time of the transaction, operation type (e.g. to read only data inserts), table name, the SQL Server login that made the transaction, etc. In this scenario, to repair a SQL database, I’ll check all filters and make sure that all transactions are included. In the Operations tab, select all schema operations (DDL). If you omit these, only the data changes will be read, so if there were any schema changes, such as a new function created or an existing table modified, they will be ignored and the database will not be properly repaired; the data repair for modified tables will fail. In the Tables tab, I’ll make sure all tables are selected. I will uncheck the Show operations on dropped tables option to reduce the number of transactions. Click Next.

    ApexSQL Log offers three options. Select Open results in grid to get a user-friendly presentation of the transactions. As you can see, details are shown for every transaction, including the old and new values for updated columns, which are clearly highlighted. Now, select them all and then create a redo script by clicking the Create redo script icon in the menu. For a large number of transactions, and in a critical situation when acting fast is a must, I recommend using the Export results to file option. It will save some time, as the transactions will be scripted directly into a redo file, without being shown in the grid first. Select Generate reconstruction (REDO) script, change the output path if you want, and click Finish. After the redo T-SQL script is created, ApexSQL Log shows the redo script summary. The third option will create a command line statement for a batch file that you can use to schedule execution, which is not really applicable when you repair a SQL database, but is quite useful in daily auditing scenarios.

    To repair your SQL database, all you have to do is execute the generated redo script, using an integrated development environment tool such as SQL Server Management Studio or any other, against the restored database. You can find more information about how to read SQL Server transaction logs and repair a SQL database at the ApexSQL Solution center. There are solutions for various situations when data needs to be recovered or restored, or transactions rolled back. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • Talking JavaOne with Rock Star Raghavan Srinivas

    - by Janice J. Heiss
    Raghavan Srinivas, affectionately known as “Rags,” is a two-time JavaOne Rock Star (from 2005 and 2011) who, as a Developer Advocate at Couchbase, gets his hands dirty with emerging technology directions and trends. His general focus is on distributed systems, with a specialization in cloud computing. He worked on Hadoop and HBase during their early stages, has spoken at conferences worldwide on a variety of technical topics, conducted and organized Hands-on Labs, and taught graduate classes. He has 20 years of hands-on software development and over 10 years of architecture and technology evangelism experience, and has worked for Digital Equipment Corporation, Sun Microsystems, Intuit, and Accenture. He has evangelized and influenced the architecture of numerous technologies including the early releases of JavaFX, Java, Java EE, Java and XML, Java ME, AJAX and Web 2.0, and Java Security.

    Rags will be giving these sessions at JavaOne 2012:
    CON3570 -- Autosharding Enterprise to Social Gaming Applications with NoSQL and Couchbase
    CON3257 -- Script Bowl 2012: The Battle of the JVM-Based Languages (with Guillaume Laforge, Aaron Bedra, Dick Wall, and Dr Nic Williams)

    Rags emphasized the importance of the cloud: “The Cloud and the Big Data are popular technologies not merely because they are trendy, but, largely due to the fact that it's possible to do massive data mining and use that information for business advantage,” he explained. I asked him what we should know about Hadoop. “Hadoop,” he remarked, “is mainly about using commodity hardware and achieving unprecedented scalability. At the heart of all this is the Java Virtual Machine which is running on each of these nodes. The vision of taking the processing to where the data resides is made possible by Java and Hadoop.” And the most exciting thing happening in the world of Java today? “I read recently that Java projects on github.com are just off the charts when compared to other projects. It's exciting to realize the robust growth of Java and the degree of collaboration amongst Java programmers.” He encourages Java developers to take advantage of Java 7 for Mac OS X, which is now available for download. At the same time, he also encourages us to read the caveats.

    Read the article

  • Looks Like We Made It! What's Up on Thursday at Oracle OpenWorld

    - by Oracle OpenWorld Blog Team
    By Karen Shamban

    Thursday is the last day of the conference for 2012, and there's still much to see and do. The day starts with an awesome keynote session, which includes a discussion with Michael Lewis, the author of Moneyball, Liar's Poker, and The Blind Side -- you won't want to miss it! Here's what's happening on Thursday at Oracle OpenWorld 2012:

    Registration: Moscone West, Moscone South, Hilton San Francisco, Hotel Nikko, 8:00 a.m. - 4:00 p.m.; Westin St. Francis, 8:00 a.m. - 5:00 p.m.
    Oracle OpenWorld Keynote: featuring Oracle President Mark Hurd and Oracle Executive Vice President Bob Weiler in conversation with Michael Lewis, author of Moneyball, Liar's Poker, and The Blind Side. Moscone North Hall D, 9:00 a.m. - 10:45 a.m.
    Sessions, Labs: various times and locations
    Oracle OpenWorld Music Festival @ It's a Wrap!: Yerba Buena Gardens, 3:30 p.m. - 6:30 p.m.

    Think back to everything you wanted to do while you attended the conference -- and be sure you get it done on Thursday!

    Read the article

  • JavaOne Gangnam Style

    - by Tori Wieldt
    Yes, JavaOne is *the* place for excellent content, including technical information, opportunities to learn best practices from your peers, and access to industry experts. You can find lots of information about content in Java Evangelist Arun Gupta's 25 Reasons to attend JavaOne 2012. But you also have to let your Gangnam Style loose. Here are the Top Ten Fun Reasons to attend JavaOne 2012:

    10. Connect with developers from more than 80 countries
    9. Kick off the week at the GlassFish and Friends Party Sunday night
    8. Meet the community of Java Rock Stars
    7. Enjoy all San Francisco has to offer
    6. Meet your next best friend playing pinball in the Game Zone
    5. Have your picture taken with Duke
    4. Java in the morning and brews in the afternoon at the Taylor Street Cafe
    3. Ride across the Golden Gate Bridge at the Community Geek Bike Ride
    2. Rock out at the first-ever Oracle OpenWorld Music Festival
    and #1...
    1. It beats being at work!

    If you haven't registered, there's still time. Join us!

    Read the article

  • And the Winner Is ...

    - by Oracle OpenWorld Blog Team
    If you know excellent Oracle technologists, now's your chance to nominate them for an award.

    by Karen Shamban

    It’s possible to win an Oracle Excellence Award in one of 12 categories this year—nominations are open now through July 17, 2012. Winning customers and partners will be hosted at Oracle OpenWorld or JavaOne 2012, where they can meet with Oracle executives, network with peers, and be featured in an upcoming edition of an Oracle publication such as Oracle Magazine. This year’s Oracle Excellence Award categories are:

    • CIO of the Year
    • Database Administrator of the Year
    • Eco-Enterprise Innovation
    • Java Business Innovation
    • Leadership
    • Oracle Fusion Middleware Innovation
    • Proactive Support Champion–Global
    • Specialized Partner of the Year–Europe, Middle East, and Africa
    • Specialized Partner of the Year–Global
    • Specialized Partner of the Year–North America
    • Technologist of the Year

    Learn more about each award and nominate a deserving candidate now! Go to the Oracle Excellence Awards information page for details.

    Read the article

  • User cannot access a system DSN on Windows Server 2008

    - by Ra Osolage
    We run our SQL Server services using a low-privileged domain account. That account is NOT a local admin on the OS. The only access I give the account is assigned during the install of SQL Server: full control over its mount points; everything else is granted by the SQL Server 2005/2008 installer. I need to create a linked server in SQL Server 2008 to an ODBC data source. So I remoted into the computer using my domain account, which is part of a group that DOES have local admin privileges on the OS. I created a system DSN and configured it to connect to another SQL Server. The DSN works perfectly when I test it. However, when I try to create the linked server, I get an error. It appears to me that the DSN is invisible to the domain account that SQL Server is running as. It seems that this problem only happens to me on Windows 2008 servers. Does anybody know whether there's anything you need to do after creating a DSN to make it visible for other users to access?
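
    Two things worth checking, offered as assumptions: on an x64 OS, a system DSN created in the default (64-bit) ODBC Administrator is invisible to 32-bit consumers and vice versa (the 32-bit tool lives at C:\Windows\SysWOW64\odbcad32.exe), so the DSN's bitness must match the SQL Server instance's. And for reference, a linked server over a system DSN is typically created like this (names illustrative):

        EXEC sp_addlinkedserver
            @server     = N'MyLinkedServer',
            @srvproduct = N'',
            @provider   = N'MSDASQL',
            @datasrc    = N'MySystemDsn';   -- the system DSN created above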

    Read the article

  • Server configuration advice for new site that could get lots of traffic within 6m

    - by alchemical
    We're setting up a new web 2.0-type site with elements of e-commerce. The budget is kind of tight. Due to the nature of the site and promotions, etc., we expect traffic could ramp up fairly quickly. I'm looking for advice on a good configuration to start with; we're looking to co-lo with CalPop in downtown LA. We've looked at Dell, ABMX.com, and got a quote from CalPop (they make their own servers, as they also do managed hosting). The price range has been anywhere from about $1200-$3300 per server. We're thinking to start with a web server and a DB server, both with mirrored drives. It would be nice to stay under about $2k per server if possible. The minimum configuration for each would probably be a quad-core with 8 GB RAM. We're thinking to run Windows Server 2008 R2 (Web edition?) and SQL Server 2008. Looking for advice on the best server configurations and/or brands that fit the budget, yet will allow us to scale smoothly as traffic increases. Reliability is also pretty important. Also wondering if a switch/router is necessary or useful to connect the two servers.

    Read the article

  • Linked server: SQL to Access

    - by user22121
    Hi, I have a SQL Server 2000 instance and an Access database (mdb) connected by a linked server. On the other hand, I have a program in C# that updates data in a SQL table (Users) based on the Access database. When running, my program returns the following error message: OLE DB provider 'Microsoft.Jet.OLEDB.4.0' reported an error. Authentication failed. [OLE/DB provider returned message: Cannot start your application. The workgroup information file is missing or opened exclusively by another user.] OLE DB error trace [OLE/DB Provider 'Microsoft.Jet.OLEDB.4.0' IDBInitialize::Initialize returned 0x80040E4D: Authentication failed.]. The program, the SQL Server, and the Access database are all on the same remote server. On the local server, the problem was solved by running the following: sp_addlinkedsrvlogin 'ActSC', 'false', NULL, 'admin', NULL. On the remote server I tried the following, without result: sp_addlinkedsrvlogin 'ActSC', true, null, 'user', 'pass'. On the remote server, and from Query Analyzer, SQL update statements work correctly. Can you think of what the problem may be? Thanks!
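
    For comparison, the working local-server call written with named parameters maps every local login to Jet's built-in Admin account with a blank password, which is what a default (non-workgroup-secured) mdb expects; the 'user'/'pass' pair in the remote attempt would only work if a workgroup (.mdw) file actually defines that account. A sketch:

        EXEC sp_addlinkedsrvlogin
            @rmtsrvname  = 'ActSC',
            @useself     = 'false',
            @locallogin  = NULL,    -- applies to all local logins
            @rmtuser     = 'Admin', -- Jet's default account
            @rmtpassword = NULL;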

    Read the article
