Search Results

Search found 30575 results on 1223 pages for 'database integrity'.


  • BI&EPM in Focus April 2012

    - by Mike.Hallett(at)Oracle-BI&EPM
    General News
      - Oracle OpenWorld call for papers now open, now through April 9 (link)
      - Oracle Announces Availability of Oracle Exalytics In-Memory Machine (link)
      - Oracle EPM and BI Support Newsletter Current Edition - Volume 3 : March 2012 (link)
    Customers
      - Asiana Airlines Improves Passenger Management with Near-Real-Time Reservation and Ticketing Information
      - Centraal Boekhuis Delivers Faster with Oracle BI 11g
      - Essatto Software Speeds Data Aggregation Tenfold; Integrates BI, Performance Management, and Data Warehousing for Midsize Businesses
      - Grupo WTorre Supports Management's Decision-Making with OBIEE, Ensuring Uniform, Reliable, and Consistent Data
      - Indian Overseas Bank Cuts Planning Schedule by 45 Worker Days per Year, Assesses Market Risk Instantly with Business Intelligence System
      - Kentucky Community and Technical College System Enables Data-Driven Decision-Making Using Integrated System with Management Dashboards
      - National Australia Bank Achieves 200% ROI, Improves Data Quality and Reporting Integrity with Oracle Hyperion DRM
      - R.L. Polk & Co. Enhances Business Intelligence Capabilities, Optimizes System Performance with Extreme Analytics Machine Test
      - ResCare, Inc. Transforms Reporting to Improve Healthcare Service Performance with Oracle Business Analytics
      - Rochester City School District Uses OBIEE to Track Student Achievement, Identify Areas for Improvement, Accelerate Reporting
      - Société Générale Standardizes, Accelerates, and Improves Budget Planning Accuracy across Global Enterprise
      - The State Accounting Office of Georgia Integrates Financial Information, Shortens Financial Closings and Streamlines Reporting across 175 Organizations
    Events
      - 4-day Oracle Real-Time Decisions Hands-on Technical Workshop for Partners (PTS, Free), May 14-17, 2012: Colombes, Paris, France
      - Nordic events: “Latest Release of Oracle Hyperion EPM and BI Suites Helps Organizations Plan through Uncertainty, Improve Decision-Making and Meet Regulatory Requirements” (April 17, Sweden | April 18, Norway | April 19, Denmark | April 24, Finland)
      - Webcast Replay from Balaji Yelamanchili and Paul Rodwick: “Analytics Without Limits - The Latest on Oracle Exalytics In-Memory Machine and Oracle Business Intelligence” (link)
      - Wednesday, April 04, 2012: Business Analytics launch webcast: Invite your customers to register (link)
      - Big Data Online Forum now available on Demand (link)
    Enterprise Performance Management
      - Webcast Replay: Accurate Forecasting within the Business Planning Cycle (link)
      - Oracle Hyperion Profitability and Cost Management (HPCM) Master Support Note (link)
    Business Intelligence
      - Whitepaper: Driving Innovation Through Analytics (link)
      - Gartner: CIOs Identify BI as the No. 1 Technology Priority for 2012 (link)
      - Webcast Replay: Exalytics in Action: Airlines, US Census and Federal Spending Demo Applications (link)
      - NEWLY RELEASED Walk-in Video for Exalytics - Use This to Start Customer/Partner Meetings! (link)
      - IDC Insight Paper: “Oracle's All-Out Assault on the Big Data Market: Offering Hadoop, R, Cubes, and Scalable IMDB in Familiar Packages” (link)
      - System Requirements and Supported Platforms for Oracle Business Intelligence Suite Enterprise Edition 11gR1 Certification Matrix now published to include OBIEE 11.1.1.6.0 (link)
      - Maintenance Release Guide (List of Bugs Fixed) for Oracle Business Intelligence Enterprise Edition (OBIEE) 11.1.1.6.0 (link)
      - OBIEE 11.1.1.6: Is OBIEE 11.1.1.6 Certified With OBI Apps 7.9.6.3? (link)
      - Information Center: Troubleshooting Oracle Business Intelligence Applications (support login req'd) (link)

    Read the article

  • Cheating on Technical Debt

    - by Tony Davis
    One bad practice guaranteed to cause dismay amongst your colleagues is passing on technical debt without full disclosure. There can be only two reasons for this: either the developer or DBA didn’t know the difference between good and bad practices, or they concealed the debt. Neither reflects well on their professional competence. Technical debt, or code debt, is a convenient term to cover all the compromises between the ideal solution and the actual solution, reflecting the reality of the pressures of commercial coding. The one time you’re guaranteed to hear one developer, or DBA, pass judgment on another is when he or she inherits their project, and is surprised by the amount of technical debt left lying around in the form of inelegant architecture, incomplete tests, confusing interface design, no documentation, and so on. It is often expedient for a Project Manager to ignore the build-up of technical debt, the cut corners, not-quite-finished features and rushed designs that mean progress is satisfyingly rapid in the short term. It’s far less satisfying for the poor person who inherits the code. Nothing sends a colder chill down the spine than the dawning realization that you’ve inherited a system crippled with performance and functional issues that will take months of pain to fix before you can even begin to make progress on any of the planned new features. It’s often hard to justify this ‘debt paying’ time to the project owners and managers. It just looks as if you are making no progress, in marked contrast to your predecessor. There can be many good reasons for allowing technical debt to build up, at least in the short term. Often, rapid prototyping is essential, there is a temporary shortfall in test resources, or the domain knowledge is incomplete. It may be necessary to hit a specific deadline with a prototype, or proof-of-concept, to explore a possible market opportunity, with planned iterations and refactoring to follow later. However, it is a crime for a developer to build up technical debt without making this clear to the project participants. He or she needs to record it explicitly. A design compromise made in order to hit a deadline, be it an outright hack, or a decision made without time for rigorous investigation and testing, needs to be documented with the same rigor that one tracks a bug. What’s the best way to do this? Ideally, we’d have some kind of objective assessment of the level of technical debt in a software project, although that smacks of Science Fiction even as I write it. I’d be interested to hear of any methods you’ve used, but I’m sure most teams have to rely simply on the integrity of their colleagues and the clear perceptions of the project manager… Cheers, Tony.

    Read the article

  • Can't install alternate CD from USB?

    - by mattias
    Hi, I'm trying to install Ubuntu 12.04 with full hard disk encryption. After downloading and installing the Ubuntu live CD, I learned that TrueCrypt doesn't support full disk encryption on Linux. I also learned that the best way to get "nearly full disk encryption" on Ubuntu is by installing it from the alternate install CD. I tried that, but something is wrong with my CD reader/burner, so it doesn't boot up when I insert the CD. My thought here was to take the .iso that I downloaded on my unencrypted Ubuntu system and use Unetbootin to make the USB drive. The USB drive used for this is exactly the same brand as one that I know has worked with a previous Ubuntu live system on the same computer. I also used Unetbootin for that USB, but I created it from Windows that time. The USB stick boots up fine and I get through the first couple of steps in the installation process. However, after a while I get a "box" with the following error message: "Load installer components from CD: There was a problem reading data from the CD-ROM. Please make sure it is in the drive. If retrying does not work, you should check the integrity of your CD-ROM." "Failed to copy file from CD-ROM. Retry?" Then I can't get any further. I googled a lot and found this page which seems to tackle this very problem: http://www.dotkam.com/2010/11/29/ins...mage-from-usb/ I tried to do what it said. After pressing TAB, I wrote cdrom-detect/try-usb=true without quotes, because that's what I think is right. When I press TAB, there already is text saying /ubnkern initrd=/ubninit vga=788 -- quiet which can be removed. I have tried both deleting the text before the "--" and just inserting cdrom-detect/try-usb=true before it. Any idea of what can be wrong? I would like to do a full system encryption, or as full as is possible. I don't want to just encrypt my /home folder. Maybe this isn't the easiest way. I use SanDisk USB sticks. I know there is a problem with the U3 launcher on some SanDisks, but I never had to remove U3 before from similar disks, and the alternate install does boot up, so I don't think using U3 removal would help me. Any help or indication of an easier way to do this would be appreciated.
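    For reference, a sketch of what the edited Unetbootin boot line could look like, assuming the stock entry quoted above. Appending the installer parameter to the existing options, keeping it before the "--" separator, is the usual approach; only the cdrom-detect/try-usb=true part is new:

        /ubnkern initrd=/ubninit vga=788 cdrom-detect/try-usb=true -- quiet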

    Read the article

  • Techniques to re-factor garbage and maintain sanity?

    - by Incognito
    So I'm sitting down to a nice bowl of C# spaghetti, and need to add something or remove something... but I have challenges everywhere: functions passing arguments that don't make sense, someone who doesn't understand data structures abusing strings, redundant variables, comments that are red herrings, internationalization handled at every output level, SQL that doesn't use any kind of DBAL, database connections left open everywhere... Are there any tools or techniques I can use to at least keep track of the "functional integrity" of the code (meaning my "improvements" don't break it), or a resource online with common "bad patterns" that explains a good way to transition code? I'm basically looking for a guidebook on how to spin straw into gold. Here are some samples from the same 500-line function:

      protected void DoSave(bool cIsPostBack)
      {
          //ALWAYS a cPostBack
          cIsPostBack = true;
          SetPostBack("1");
          string inCreate = "~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~";
          parseValues = new string []{"","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","",""};
          if (!cIsPostBack)
          {
              //.......
              //....
              //....
              if (!cIsPostBack)
              {
              }
              else
              {
              }
              //....
              //....
              strHPhone = StringFormat(s1.Trim());
              s1 = parseValues[18].Replace(encStr," ");
              strWPhone = StringFormat(s1.Trim());
              s1 = parseValues[11].Replace(encStr," ");
              strWExt = StringFormat(s1.Trim());
              s1 = parseValues[21].Replace(encStr," ");
              strMPhone = StringFormat(s1.Trim());
              s1 = parseValues[19].Replace(encStr," ");
              //(hundreds of lines of this)
              //....
              //....
              SQL = "...... lots of SQL .... ";
              SqlCommand curCommand;
              curCommand = new SqlCommand();
              curCommand.Connection = conn1;
              curCommand.CommandText = SQL;
              try
              {
                  curCommand.ExecuteNonQuery();
              }
              catch {}
              //....
          }

    I've never had to refactor something like this before, and I want to know if there's something like a guidebook or knowledge base on how to do this sort of thing, finding common bad patterns and offering the best solutions to repair them. I don't want to just nuke it from orbit,
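    One way to keep track of "functional integrity" while untangling code like this is a characterization (golden master) test: record what the current code produces for a set of representative inputs before changing anything, then re-run the comparison after every small refactoring step. A minimal sketch of the idea, written in Python for brevity and with run_legacy_save() as a hypothetical stand-in for however the real DoSave path can be invoked (a thin web request, a test harness, etc.):

      import json

      def run_legacy_save(record):
          # Hypothetical wrapper: call the real legacy entry point here and return
          # every observable output (rows written, strings produced, status codes...).
          return {"normalized": str(record).strip(), "saved": True}

      def snapshot(inputs, path="golden_master.json"):
          # Run once against the untouched code to pin down current behaviour.
          results = {repr(i): run_legacy_save(i) for i in inputs}
          with open(path, "w") as f:
              json.dump(results, f, indent=2, sort_keys=True)

      def verify(inputs, path="golden_master.json"):
          # Run after each change; any difference means an "improvement" broke something.
          with open(path) as f:
              expected = json.load(f)
          for i in inputs:
              assert run_legacy_save(i) == expected[repr(i)], "behaviour changed for %r" % (i,)

      if __name__ == "__main__":
          samples = ["  123 ", "abc", ""]
          snapshot(samples)   # once, before refactoring starts
          verify(samples)     # after every change

    The point is less the harness than the discipline: with current behaviour pinned, the obvious problems (for example, wrapping the SqlCommand and its connection in using blocks) can be chipped away one small, verifiable step at a time.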

    Read the article

  • What are the differences between abstract classes and interfaces, and when to use them

    - by user66662
    Recently I have started to wrap my head around OOP, and I am now at the point where the more I read about the differences between abstract classes and interfaces, the more confused I become. So far: neither can be instantiated, interfaces are more or less structural blueprints that determine the skeleton, and abstracts differ by being able to partially develop code. I would like to learn more about these through my specific situation. Here is a link to my first question if you would like a little more background information: What is a good design model for my new class? Here are two classes I created:

      class Ad {
          $title;
          $description;
          $price;

          function get_data($website) {
          }

          function validate_price() {
          }
      }

      class calendar_event {
          $title;
          $description;
          $start_date;

          function get_data($website) {
              //guts
          }

          function validate_dates() {
              //guts
          }
      }

    So, as you can see, these classes are almost identical. Not shown here, but there are other functions, like get_zip() and save_to_database(), that are common across my classes. I have also added other classes, Cars and Pets, which have all the common methods and of course properties specific to those objects (mileage, weight, for example). Now I have violated the DRY principle and I am managing and changing the same code across multiple files. I intend on having more classes like boats, horses, or whatever. So is this where I would use an interface or abstract class? From what I understand about abstract classes, I would use a super class as a template with all of the common elements built into the abstract class, and then add only the items specifically needed in future classes. For example:

      abstract class content {
          $title;
          $description;

          function get_data($website) {
          }

          function common_function2() {
          }

          function common_function3() {
          }
      }

      class calendar_event extends content {
          $start_date;

          function validate_dates() {
          }
      }

    Or would I use an interface and, because these are so similar, create a structure that each of the subclasses is forced to use for integrity reasons, and leave it up to the end developer who fleshes out that class to be responsible for each of the details of even the common functions? My thinking there is that some 'common' functions may need to be tweaked in the future for the needs of their specific class. Despite all that above, if you believe I am misunderstanding the what and why of abstracts and interfaces altogether, by all means let a valid answer be to stop thinking in this direction and suggest the proper way to move forward! Thanks!
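    As a rough illustration of the abstract-class route (sketched in Python rather than PHP, purely to show the shape of the design): the shared fields and shared methods live once in the base class, while each listing type is forced to supply its own validation.

      from abc import ABC, abstractmethod

      class Content(ABC):
          """Shared base for every listing type: ads, events, cars, pets, ..."""

          def __init__(self, title, description):
              self.title = title
              self.description = description

          def get_data(self, website):
              # Common fetching/parsing logic written once and reused by all subclasses.
              print("fetching %r from %s" % (self.title, website))

          @abstractmethod
          def validate(self):
              # Every subclass must decide what a valid record means for it.
              ...

      class Ad(Content):
          def __init__(self, title, description, price):
              super().__init__(title, description)
              self.price = price

          def validate(self):
              return self.price >= 0

      class CalendarEvent(Content):
          def __init__(self, title, description, start_date):
              super().__init__(title, description)
              self.start_date = start_date

          def validate(self):
              return self.start_date is not None

    An interface, by contrast, would only dictate the method names and leave every implementation, including the common ones, to each class, which is exactly the duplication the question is trying to avoid; the usual compromise is an abstract base class for the shared code plus abstract methods (or a separate interface) for the parts each subclass must define itself.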

    Read the article

  • EBS Accounts Payables Customer Advisory

    - by cwarticki
    Blogging to let you know of an important set of Oracle Payables patches that were released for R12.1 customers. Accounts Payable Customer Advisory: Dear Valued Oracle Support Customer, Since the release of R12.1.3 a number of recommended Payables patches have been made available as standalone patches, to help address important business process incidents. Adoption of these patches is highly recommended. To further facilitate adoption of these Payables patches, Oracle has consolidated them into a single Recommended Patch Collection (RPC). The RPC is a collection of recommended Payables patches created with the following goals in mind:
    Stability: Help address issues that are identified by Oracle Development and Oracle Software Support that may interfere with the normal completion of important business processes such as period close.
    Root Cause Fixes: Help make available root cause fixes for data integrity issues that may delay period close, normal invoice flow and other business actions.
    Compact: Keep the file footprint as small as possible to help facilitate the install process and minimize testing.
    Granular: Collection of patches based on functional area that allows customers to apply, based on their individual needs and goals, all three RPCs at once or in phases.
    Payables:
    - New AP RPC (14273383:R12.AP.B) has all data corruption root cause fixes known to date plus tons of other crucial fixes (Note: 1397581.1).
    - Companion must-have RPCs:
      o Note: 1481221.1: R12.1: Payments Recommended Patch Collection (IBY RPC), August 2012
      o Note: 1481235.1: R12.1: E-Business Tax Recommended Patch Collection (ZX RPC), August 2012
      o Note: 1481222.1: R12.1: Sub Ledger Accounting (SLA) Recommended Patch Collection (XLA RPC), August 2012
    - This time we beat the system far harder on testing and it held up remarkably well. We could not get any data corruption events in the Invoice Cancel/Discard flow (that is the #1 generator), nor could we cause Orphan Events in the system. Therefore this is very good code.
    Financials:
    - ALL FIN modules now have RPCs: the full listing is in (Note: 954704.1)

    Read the article

  • Help identify the pattern for reacting to updates

    - by Mike
    There's an entity that gets updated from external sources. Update events come at random intervals, and the entity has to be processed once updated. Multiple updates may be multiplexed; in other words, there's a need for the most current state of the entity to be processed. There's a point of no return during processing where the current state of the entity (and the state is consistent, i.e. no partial update is made) is saved somewhere else and processing goes on independently of any arriving updates. Every subsequent set of updates has to trigger processing, i.e. the system should not forget about updates. And for each entity there should be no more than one running processing (before the point of no return), i.e. the entity state should not be processed more than once. So what I'm looking for is a pattern to cancel current processing before the point of no return, or abandon processing results if an update arrives. The main challenge is to minimize race conditions and maintain integrity. The entity sits mainly in a database with some files on disk, and the system is in .NET with web services and message queues. What comes to my mind is a database queue-like table. An arriving update inserts a row in that table and the processing is launched. The processing gathers necessary data before the point of no return and once it reaches this barrier it looks into the queue table and checks whether there are more recent updates for the entity. If there are new updates the processing simply shuts down and its data is discarded. Otherwise the processing data is persisted and it goes beyond the point of no return. Though it looks like a solution to me, it is not quite elegant and I believe this scenario may be supported by some sort of middleware. If I were to use message queues for this then there's a need to access the queue API at the point of no return to check for the existence of new messages, and this approach also lacks elegance. Is there a name for this pattern and an existing solution?
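    A bare-bones sketch of the queue-table check described above, with hypothetical names and a plain in-memory dictionary standing in for the database table, just to show where the version comparison sits relative to the point of no return:

      import threading

      latest_update = {}            # entity_id -> highest update sequence number seen
      lock = threading.Lock()

      def record_update(entity_id, seq):
          # Called for every arriving external update; only the newest matters.
          with lock:
              latest_update[entity_id] = max(seq, latest_update.get(entity_id, 0))

      def process(entity_id):
          with lock:
              started_at = latest_update.get(entity_id, 0)

          state = gather_state(entity_id)        # the long, interruptible part of processing

          # Point of no return: persist only if nothing newer arrived in the meantime.
          with lock:
              if latest_update.get(entity_id, 0) > started_at:
                  return False                   # abandon; the newer update triggers a fresh run
              persist(state)
              return True                        # beyond this point, new updates start a new cycle

      def gather_state(entity_id):
          return {"id": entity_id}               # placeholder for the real work

      def persist(state):
          print("persisted", state)

    The same comparison can be done directly against the queue table inside a single transaction, which is what keeps the race window small.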

    Read the article

  • Oracle Linux Hands-on Lab from your Home? Yes You Can Do That!

    - by Zeynep Koch
    We're taking the very popular OTN Sysadmin Days and going virtual! We have two days to choose from:
    Americas - Tuesday, January 15th, 2013, 9:00 a.m. – 1:00 p.m. PT / 12:00 p.m. – 4:00 p.m. ET / 1:00 p.m. – 5:00 p.m. BRT
    EMEA - Tuesday, January 29th, 2013, 9:00 – 13:00 GMT / 10:00 – 14:00 CET / 12:00 – 16:00 AST / 13:00 – 17:00 MSK / 14:30 – 18:30 IST
    You'll be able to perform real-world tasks with Oracle Linux, and if you have questions you can ask for help from the Oracle experts through the chat window. There's one caveat: you'll have to do a little homework ahead of time. Load the virtual images onto your laptop, find the instructions, and make sure everything is working properly. This wiki https://wikis.oracle.com/display/virtualsysadminday/Home explains what you need to do. If you have questions, ask them as comments to the wiki: https://wikis.oracle.com/display/virtualsysadminday/Home.
    Oracle Linux Track
    1. Oracle Linux Technology Overview - In this session we will go over the latest Oracle Linux features, including tools for Linux administration such as the Unbreakable Linux Network (ULN) and public yum. We will also show you a demo of Ksplice zero-downtime kernel updates, only available to Oracle Linux customers. You will see how easy it is to switch from Red Hat support to Oracle Linux support by using ULN. Last but not least, we’ll introduce the 3 hands-on labs that will follow this session in the Linux track.
    2. HOL: Package Management - In this lab session you will use package management on Oracle Linux using RPM and yum. Some of the tasks that you will experience include listing installed packages, obtaining additional information about packages, searching for packages and installing/updating them, as well as verifying package integrity and removing software. We’ll also review Linux services and run levels, how to start and stop them, checking the status of a particular service and enabling a service to be started automatically at system boot.
    3. HOL: Storage Management - In this hands-on lab session, you will learn about storage management with LVM2, the Linux Logical Volume Manager: preparing block devices, creating physical and logical volumes, creating file systems on top of logical volumes, and resizing file systems dynamically. You will also practice setting up software RAID devices and configuring encrypted block devices.
    Btrfs File System - In this hands-on lab session, we will introduce you to the Btrfs file system. You will be able to create and mount a Btrfs file system and learn to set up a mirrored/striped file system across multiple block devices. You’ll also learn how to add and remove block devices, and create file system snapshots.
    Register for this FREE event.

    Read the article

  • ArchBeat Link-o-Rama for 10-24-2012

    - by Bob Rhubart
    Play Oracle Vanquisher
    Here's a little respite from whatever it is you normally spend your time on. Oracle Vanquisher is an online diversion that makes a game of data center optimization. According to the description: "Armed with a cool Oracle vacuum pack suit and a strategic IT roadmap, you will thwart threats and optimize your data center to increase your company’s stock price and boost your company's position." Mainly you avoid electric shock and killer birds. The current high score belongs to someone identified as "TEN." My score? Never mind.
    Book: DevOps for Developers | The Java Source
    The subject of DevOps has come up in a couple of recent OTN ArchBeat Podcasts, so it's somewhat serendipitous that Tori Weildt's recent blog post offers an overview of Java Champion Michael Hutterman's new book, DevOps for Developers, now available from Apress.
    Bring Your Own Device (BYOD): Context is everything… | The ORACLE-BASE Blog
    BYOD is a factor in the evolution of IT, but in what context? "The real IT work in companies is still being done on PCs," says Oracle ACE Director Tim Hall. "Yes, you can use a cloud service on your phone, but look around the office and you will see those cloud services are actually being used by people on PCs."
    Oracle in the Cloud: Oracle EBusiness Suite sizing | Tom Laszewski
    Cloud expert Tom Laszewski shares several technical resources that will be helpful for sizing of Oracle EBusiness Suite.
    Setting Up, Configuring, and Using an Oracle WebLogic Server Cluster
    Author and expert Yuli Vasiliev shows you how to take advantage of multiple Oracle WebLogic Server instances grouped into a cluster to maximize scalability and availability.
    Webcast: Reduce Costs with Oracle's Database Storage Management
    Watch this! Join Oracle experts Kevin Jernigan and Margaret Hamburger for an interactive webcast in which you'll learn how Oracle's Database Storage Management can reduce storage costs and management complexity while improving query performance to meet service-level agreements and compliance requirements. Event Date: Tuesday, November 6, 2012. Event Time: 10 a.m. PT/1 p.m. ET
    Thought for the Day
    "Most software today is very much like an Egyptian pyramid with millions of bricks piled on top of each other, with no structural integrity, but just done by brute force and thousands of slaves." — Alan Kay
    Source: softwarequotes.com

    Read the article

  • Announcing the Next AppAdvantage IT Leader Documentary

    - by Tanu Sood
    Close on the heels of the very successful launch of Oracle BPM 12c, we are so excited to be announcing the next video in the Oracle AppAdvantage IT Leaders Series. As you know, the Oracle AppAdvantage IT Leaders Series profiles a successful executive in an industry leading organization that has embarked on a business transformation or platform modernization journey by taking full advantage of the Oracle AppAdvantage pace layered architecture. Tune in on Tuesday, September 9th at 10 am Pacific/1 pm Eastern to catch our IT executive, Regis Louis in an in-depth conversation with IT executives from Siram S.p.a., a major energy services management company in Italy. The documentary will explore how the company is doing process orchestration and business process management to build inter-department collaboration, maintain data integrity and offer complete transparency throughout their Request for Proposal (RFP) process. Oracle’s technology expert will then do a comprehensive walk-through of Siram’s IT model, the various components and how Oracle Fusion Middleware technologies are enabling their Oracle E-Business Suite processes. Experts will be at hand to answer your questions live. Check out this live documentary webcast and find out how organizations like yours are building business agility leveraging existing investments in Oracle Applications. Register today. And if you are on twitter, send your questions to @OracleMiddle with #ITLeader, #AppAdvantage. We look forward to connecting with you on Tue, Sep 9. Siram Achieves Commercial Efficiency with Improved Business Process Agility Date: Tuesday, September 9, 2014 Time: 10:00 AM PT / 1:00 PM ET

    Read the article

  • Showing ZFS some LOVE

    - by Kristin Rose
    L is for the way you look at us, and O because we’re Oracle, but V is very, very, extra ordinary, and E, well that’s obvious… E is because Oracle’s new Sun ZFS Storage Appliance is Excellent, and here at OPN, we like to spell out the obvious! If you haven’t already heard, the Sun ZFS Appliance has “A simple, GUI-driven setup and configuration, solid price-performance and world-class Oracle support behind it. The CRN Test Center recommends the Sun ZFS Storage”. Read more about what CRN said here. Oracle's Sun ZFS Appliance family delivers enterprise-class network attached storage (NAS) capabilities with leading Oracle integration, simplicity, efficiency, performance, and TCO. The systems offer an easy way to manage and expand your storage environment at a lower cost, with more efficiency, better data integrity, and higher performance when compared with competitive NAS offerings. Did we mention that set up, including configuring, will take you less than an hour since it all comes in one box and is so darn simple to use? So if you L-O-V-E what you’re hearing about Oracle’s Sun Z-F-S, learn more by watching the video below, and visiting any of our available resources. It Had to Be You, The OPN Communications Team

    Read the article

  • MS Access Premiere Products Exercise

    - by rynwtts
    I am working with Microsoft Access, Premiere Products Exercises, for a college course. I can't seem to get past a specific question. We are working with DBDL and E-R diagrams. The question is here: Indicate the changes you need to make to the design of the Premiere Products database to support the following situation. A customer is not necessarily represented by a single sales rep but can be represented by several sales reps. When a customer places an order, the sales rep who gets the commission on the order must be one of the collection of sales reps who represent the customer. In the database as it stands, each customer is represented by a single sales rep, which yields a one-to-many relationship from reps to customers. I need to enable a customer to have several sales reps, and make it so that only those sales reps will be eligible for commission on each order.
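    One way to read the required change (a sketch, not the textbook's official answer): replace the single sales-rep foreign key on Customer with a junction table between Rep and Customer, and have Orders reference that (rep, customer) pair so the commissioned rep must be one of the customer's reps. Illustrated here with SQLite DDL driven from Python; the table and column names just follow the usual Premiere Products style and are illustrative:

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
          PRAGMA foreign_keys = ON;

          CREATE TABLE Rep      (RepNum TEXT PRIMARY KEY, LastName TEXT, FirstName TEXT);
          CREATE TABLE Customer (CustomerNum TEXT PRIMARY KEY, CustomerName TEXT);

          -- Junction table: a customer can now be represented by several reps.
          CREATE TABLE RepCustomer (
              RepNum      TEXT REFERENCES Rep(RepNum),
              CustomerNum TEXT REFERENCES Customer(CustomerNum),
              PRIMARY KEY (RepNum, CustomerNum)
          );

          -- An order names both the customer and the rep who earns the commission,
          -- and that pair must exist in RepCustomer.
          CREATE TABLE Orders (
              OrderNum    TEXT PRIMARY KEY,
              OrderDate   TEXT,
              CustomerNum TEXT,
              RepNum      TEXT,
              FOREIGN KEY (RepNum, CustomerNum) REFERENCES RepCustomer (RepNum, CustomerNum)
          );
      """)

    In DBDL terms the change amounts to dropping the rep column from Customer, adding a RepCustomer relation keyed on (RepNum, CustomerNum), and adding RepNum to Orders with a foreign key into RepCustomer.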

    Read the article

  • Can't join OS X Mavericks to AD Domain

    - by watkipet
    I'm attempting to join an OS X Mavericks (10.9) client to a Windows Server 2008 Active Directory domain, however the bind fails with this error in the OS X client's system.log: Oct 24 15:03:15 host.domain.com com.apple.preferences.users.remoteservice[5547]: -[ODCAddServerSheetController handleOtherActionError: gotError: Error Domain=com.apple.OpenDirectory Code=5202 "Authentication server encountered an error while attempting the requested operation." UserInfo=0x7f9e6cb3e180 {NSLocalizedDescription=Authentication server encountered an error while attempting the requested operation., NSLocalizedFailureReason=Authentication server encountered an error while attempting the requested operation.}, Authentication server encountered an error while attempting the requested operation. I've joined (bound) Ubuntu Linux clients to the same domain with net ads join in the past with no problems (using the same administrative user). I don't have access to any server logs. Here's the GUI error (from Directory Utility) on the OS X client: Here's the GUI error (from User's and Groups) in System Preferences on the OS X client: Update After some Wiresharking I've got some more info: OS X Client - KDC (over UDP): AS_REQ (no padata) OS X Client <- KDC (over UDP): KRB5KDC_ERR_PREAUTH_REQUIRED OS X Client - KDC (over UDP): AS_REQ (this time with PA-ENC-TIMESTAMP in padata) OS X Client <- KDC (over UDP): KRB5KDC_ERR_RESPONSE_TOO_BIG OS X Client - KDC (over TCP): AS_REQ (also with PA-ENC-TIMESTAMP in padata) OS X Client <- KDC (over TCP): KDC_ERR_ETYPE_NOSUPP ...and that's it. This is what I think is going on: The OS X client sends a kerberos request. The KDC says, "You need to pre-authenticate. Try again" The OS X client tries to pre-authenticate (all this so far is over UDP) Something gets lost on our network and the KDC says, "Oops something went wrong" The OS X client switches to TCP and tries again. Over TCP, the KDC says, "You're using an encryption type I don't support" Note that in its padata records, the OS X client is always using "aes256-cts-hmac-sha1-96" as its encryption type. However, in its KDC_REQ_BODY record it lists the aes256-cts-hmac-sha1-96, aes128-cts-hmac-sha1-96, des3-cbc-sha1, and rc4-hmac encryption types. When the KDC comes back with KDC_ERR_ETYPE_NOSUPP, it uses rc4-hmac as its encryption type in its padata record. I know next to nothing about Kerberos, but it seems to me that the OS X client should go ahead and try the rc4-hmac encryption type. However, it does nothing after this. Update 2 Here's the debug log from Directory Services on the OS X client. Sorry--it's long. 
2013-10-25 14:19:13.219128 PDT - 10544.20463 - ODNodeCustomCall request, NodeID: 52A65FAE-4B24-455D-86EC-2199A780D234, Code: 80 2013-10-25 14:19:13.220409 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - client requested OU - 'CN=Computers,DC=domain,DC=com' 2013-10-25 14:19:13.220427 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - Binding using '[email protected]' for kerberos ID 2013-10-25 14:19:13.220571 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - new kerberos credential cache 'MEMORY:0x7fa713635470' for '[email protected]' 2013-10-25 14:19:13.220623 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: loop 1 2013-10-25 14:19:13.220639 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - KDC send 0 patypes 2013-10-25 14:19:13.220653 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - fast disabled, not doing any fast wrapping 2013-10-25 14:19:13.220699 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - Trying to find service kdc for realm DOMAIN.COM flags 0 2013-10-25 14:19:13.221275 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - submissing new requests to new host 2013-10-25 14:19:13.221326 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - connecting to host: udp 192.168.0.1:kerberos (192.168.0.1) tid: 00000001 2013-10-25 14:19:13.221373 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - writing packet: udp 192.168.0.1:kerberos (192.168.0.1) tid: 00000001 2013-10-25 14:19:13.222588 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - reading packet: udp 192.168.0.1:kerberos (192.168.0.1) tid: 00000001 2013-10-25 14:19:13.222617 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - host completed: udp 192.168.0.1:kerberos (192.168.0.1) tid: 00000001 2013-10-25 14:19:13.222665 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_sendto_context DOMAIN.COM done: 0 hosts 1 packets 1 wc: 0.001960 nr: 0.000000 kh: 0.000560 tid: 00000001 2013-10-25 14:19:13.222705 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: loop 2 2013-10-25 14:19:13.222737 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: processing input 2013-10-25 14:19:13.222752 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: got an KRB-ERROR from KDC 2013-10-25 14:19:13.222775 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: KRB-ERROR -1765328359/Additional pre-authentication required 2013-10-25 14:19:13.222791 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - KDC send 4 patypes 2013-10-25 14:19:13.222800 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - KDC send PA-DATA type: 19 2013-10-25 14:19:13.222808 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - KDC send PA-DATA type: 2 2013-10-25 14:19:13.222816 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - KDC send PA-DATA type: 16 2013-10-25 
14:19:13.222825 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - KDC send PA-DATA type: 15 2013-10-25 14:19:13.222840 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: using ENC-TS with enctype 18 2013-10-25 14:19:13.222850 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: using default_s2k_func 2013-10-25 14:19:13.227443 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - fast disabled, not doing any fast wrapping 2013-10-25 14:19:13.227502 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - Trying to find service kdc for realm DOMAIN.COM flags 0 2013-10-25 14:19:13.228233 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - submissing new requests to new host 2013-10-25 14:19:13.228320 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - connecting to host: udp 192.168.0.1:kerberos (192.168.0.1) tid: 00010001 2013-10-25 14:19:13.228374 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - writing packet: udp 192.168.0.1:kerberos (192.168.0.1) tid: 00010001 2013-10-25 14:19:13.229930 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - reading packet: udp 192.168.0.1:kerberos (192.168.0.1) tid: 00010001 2013-10-25 14:19:13.229957 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - host completed: udp 192.168.0.1:kerberos (192.168.0.1) tid: 00010001 2013-10-25 14:19:13.229975 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_sendto trying over again (reset): 0 2013-10-25 14:19:13.230023 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - Trying to find service kdc for realm DOMAIN.COM flags 2 2013-10-25 14:19:13.230664 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - submissing new requests to new host 2013-10-25 14:19:13.230726 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - connecting to host: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00010002 2013-10-25 14:19:13.230818 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - connecting to 11: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00010002 2013-10-25 14:19:13.231101 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - writing packet: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00010002 2013-10-25 14:19:13.232743 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - reading packet: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00010002 2013-10-25 14:19:13.232777 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - host completed: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00010002 2013-10-25 14:19:13.232798 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_sendto_context DOMAIN.COM done: 0 hosts 2 packets 2 wc: 0.005316 nr: 0.000000 kh: 0.001339 tid: 00010002 2013-10-25 14:19:13.232856 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: loop 3 2013-10-25 14:19:13.232868 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - 
krb5_credential - krb5_get_init_creds: processing input 2013-10-25 14:19:13.232900 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: using keyproc 2013-10-25 14:19:13.232910 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: using default_s2k_func 2013-10-25 14:19:13.236487 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: extracting ticket 2013-10-25 14:19:13.236557 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: wc: 0.015944 2013-10-25 14:19:13.237022 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - Trying to find service kdc for realm DOMAIN.COM flags 2 2013-10-25 14:19:13.237444 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - submissing new requests to new host 2013-10-25 14:19:13.237482 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - connecting to host: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00020001 2013-10-25 14:19:13.237551 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - connecting to 11: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00020001 2013-10-25 14:19:13.237900 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - writing packet: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00020001 2013-10-25 14:19:13.238616 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - reading packet: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00020001 2013-10-25 14:19:13.238645 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - host completed: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00020001 2013-10-25 14:19:13.238674 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_sendto_context DOMAIN.COM done: 0 hosts 1 packets 1 wc: 0.001656 nr: 0.000000 kh: 0.000409 tid: 00020001 2013-10-25 14:19:13.238839 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - Trying to find service kdc for realm DOMAIN.COM flags 2 2013-10-25 14:19:13.239302 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - submissing new requests to new host 2013-10-25 14:19:13.239360 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - connecting to host: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00030001 2013-10-25 14:19:13.239429 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - connecting to 11: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00030001 2013-10-25 14:19:13.239683 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - writing packet: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00030001 2013-10-25 14:19:13.240350 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - reading packet: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00030001 2013-10-25 14:19:13.240387 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - host completed: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00030001 2013-10-25 14:19:13.240415 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_sendto_context DOMAIN.COM done: 0 hosts 1 
packets 1 wc: 0.001578 nr: 0.000000 kh: 0.000445 tid: 00030001 2013-10-25 14:19:13.240514 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_credentials_with_flags: DOMAIN.COM wc: 0.003615 2013-10-25 14:19:13.240537 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - valid credentials for [email protected] 2013-10-25 14:19:13.240541 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - switching to cache 'MEMORY:0x7fa713635470' 2013-10-25 14:19:13.240545 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - switching GSS to cache 'MEMORY:0x7fa713635470 2013-10-25 14:19:13.240555 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - Bind Step 5 - Bind/Join computer to domain - 'domain.com' 2013-10-25 14:19:13.241345 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - resolving 'server.domain.com' 2013-10-25 14:19:13.241646 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - added socket 12 for host 'server.domain.com:389' address '192.168.0.2' to kqueue list 2013-10-25 14:19:13.241930 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - Setting kerberos server for 'Kerberos:DOMAIN.COM' to 'server.domain.com' 2013-10-25 14:19:13.241962 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - switching to cache 'MEMORY:0x7fa713635470' 2013-10-25 14:19:13.241969 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - switching GSS to cache 'MEMORY:0x7fa713635470 2013-10-25 14:19:13.242231 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - GSSAPI allow Confidentiality 2013-10-25 14:19:13.242234 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - setting realm 'DOMAIN.COM' for node '/Active Directory/domain.com' 2013-10-25 14:19:13.242239 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - GSSAPI allow Integrity (signing) 2013-10-25 14:19:13.242274 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - GSSAPI using hostname 'server.domain.com' 2013-10-25 14:19:13.242282 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - GSSAPI using initiator credential '[email protected]' 2013-10-25 14:19:13.250771 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - Authenticate to LDAP using Kerberos credential - 0 2013-10-25 14:19:13.250784 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - verified connectivity to '192.168.0.2' with socket 12 2013-10-25 14:19:13.251513 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - locating site using domain domain.com using CLDAP 2013-10-25 14:19:13.252145 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - using site of 'DOMAINGROUP' from CLDAP 2013-10-25 14:19:13.253626 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - resolving 'server2.domain.com' 2013-10-25 14:19:13.253933 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - added socket 13 for host 'server2.domain.com:389' address '192.168.0.1' to kqueue list 2013-10-25 14:19:13.254428 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - Setting kerberos server for 'Kerberos:DOMAIN.COM' to 'server2.domain.com' 2013-10-25 14:19:13.254462 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - switching to cache 'MEMORY:0x7fa713635470' 2013-10-25 14:19:13.254468 PDT - 10544.20463, Node: /Active 
Directory, Module: ActiveDirectory - switching GSS to cache 'MEMORY:0x7fa713635470 2013-10-25 14:19:13.254617 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - setting realm 'DOMAIN.COM' for node '/Active Directory/domain.com' 2013-10-25 14:19:13.254661 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - GSSAPI allow Confidentiality 2013-10-25 14:19:13.254670 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - GSSAPI allow Integrity (signing) 2013-10-25 14:19:13.254689 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - GSSAPI using hostname 'server2.domain.com' 2013-10-25 14:19:13.254695 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - GSSAPI using initiator credential '[email protected]' 2013-10-25 14:19:13.262092 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - Authenticate to LDAP using Kerberos credential - 0 2013-10-25 14:19:13.262108 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - verified connectivity to '192.168.0.1' with socket 13 2013-10-25 14:19:13.262982 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - Computer account either already exists or DC is already Read/Write 2013-10-25 14:19:13.264968 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - Adding record 'cn=spike,CN=Computers,DC=domain,DC=com' in 'domain.com' The failure point seems to be Computer account either already exists or DC is already Read/Write, however, I can search for 'spike' on the Active Directory server using Active Directory Explorer and it's not there. If I do the same search for the Linux and Windows PCs I added previously, I can find them.

    Read the article

  • PostgreSQL service doesn't start on Windows 7

    - by Mehrdad
    (Not sure if this should be on Stack Overflow or Super User... please move if needed.) When I start the PostgreSQL service on Windows 7 x64, it immediately stops. When I check my log folder (C:\PostgreSQL\9.1\data\pg_log\), I see new but empty log files. The Event Viewer doesn't tell me anything other than the fact that the server did not respond. I've even tried turning off my firewall (I don't have any antivirus or anything else), but nothing helps. The setup works fine when I'm on Windows XP (32-bit) (same computer, different partition). I can't figure out what's wrong, even though I've tried tracing the system calls. Is PostgreSQL compatible with Windows 7 x64 at all? Any ideas what the issue might be? More info: This problem also happens at the end of installation -- the service starts, then stops immediately, before the installer can do anything. Installation log: Starting the database server... Executing cscript //NoLogo "C:\Program Files\PostgreSQL\9.1\installer\server\startserver.vbs" postgresql-x64-9.1 Script exit code: 0 Script output: Starting postgresql-x64-9.1 Service postgresql-x64-9.1 started successfully // <==== NOT REALLY!! It stops! startserver.vbs ran to completion Script stderr: Loading additional SQL modules... Executing cscript //NoLogo "C:\Program Files\PostgreSQL\9.1\installer\server\loadmodules.vbs" "postgres" "****" "C:\Program Files\PostgreSQL\9.1" "C:\Program Files\PostgreSQL\9.1\data" 5432 Script exit code: 2 Script output: Installing the adminpack module in the postgres database... Executing 'C:\Users\HOMEUS~1\AppData\Local\Temp\rad6C20D.bat'... psql: could not connect to server: Connection refused (0x0000274D/10061) Is the server running on host "localhost" (::1) and accepting TCP/IP connections on port 5432? could not connect to server: Connection refused (0x0000274D/10061) Is the server running on host "localhost" (127.0.0.1) and accepting TCP/IP connections on port 5432? Failed to install the 'adminpack' module in the 'postgres' database loadmodules.vbs ran to completion Script stderr: Program ended with an error exit code Error running cscript //NoLogo "C:\Program Files\PostgreSQL\9.1\installer\server\loadmodules.vbs" "postgres" "****" "C:\Program Files\PostgreSQL\9.1" "C:\Program Files\PostgreSQL\9.1\data" 5432 : Program ended with an error exit code

    Read the article

  • Oracle 11gR2 exp does not export some tables

    - by Tilo Prütz
    I have an Oracle 11g (11.2.0.1) Database running on Linux (x64). Within the database I have a schema and 33 tables for it. When I log in via sqlplus I can list all the tables via SELECT OBJECT_NAME FROM USER_OBJECTS WHERE OBJECT_TYPE = 'TABLE'; But when I export the Tablespace using exp ... BUFFER=65536 FULL=N COMPRESS=N CONSISTENT=Y TABLESPACES=... FILE=... Then it only exports 24 of the 33 tables. I have tried to export the missing tables via exp ... TABLES=<missing_table> ... But then I get an error: EXP-00011: NPSMIGRO2_CM.DEFAULT_USR_ATTR_VALUES does not exist How can I find out what's wrong here? How can I export all the tables?
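    One thing worth checking (this is an assumption about the cause, not something the EXP-00011 output proves): on 11gR2, deferred segment creation is on by default, and the classic exp utility skips tables that have never had a segment allocated, i.e. tables that are still empty, reporting EXP-00011 when they are named explicitly. A hedged sketch of how to check and work around that from Python with the cx_Oracle driver; the connection details are placeholders, and Data Pump (expdp) is the other way around the limitation:

      import cx_Oracle  # assumes the cx_Oracle driver is installed

      conn = cx_Oracle.connect("npsmigro2_cm", "password", "dbhost/orcl")  # placeholders
      cur = conn.cursor()

      # Tables with no allocated segment are the ones classic exp skips.
      cur.execute("SELECT table_name FROM user_tables WHERE segment_created = 'NO'")
      missing = [row[0] for row in cur.fetchall()]
      print("tables without segments:", missing)

      # Forcing a segment lets exp pick them up on the next run.
      for table in missing:
          cur.execute('ALTER TABLE "{0}" ALLOCATE EXTENT'.format(table))

    If the count of segment-less tables matches the nine missing tables, that is very likely the explanation.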

    Read the article

  • ERROR [01000] [Microsoft][ODBC SQL Server Driver][DBNETLIB]ConnectionOpen (Connect())

    - by Steven Morrison
    Our company recently upgraded from a workgroup to a domain, and now we get the error message shown below on the clients' laptops. The software on the server is working as it was: opening the SECWARE application and logging in to the database is possible. But from our workstations we cannot access the database to get to the login screen.
    error in DB_OpenDS: ERROR [08001] [Microsoft][ODBC SQL Server Driver][DBNETLIB]SQL Server does not exist or access denied.
    ERROR [01000] [Microsoft][ODBC SQL Server Driver][DBNETLIB]ConnectionOpen (Connect()).
    OpenFolder Error: ERROR [08001] [Microsoft][ODBC SQL Server Driver][DBNETLIB]SQL Server does not exist or access denied.
    ERROR [01000] [Microsoft][ODBC SQL Server Driver][DBNETLIB]ConnectionOpen (Connect()).
    exiting with error
    Cannot find table 0. exiting with error
    We have a network share that we used to get the login file from, and that is accessible.

    Read the article

  • Can't restore backup from SQL Server 2008 R2 to SQL Server 2005 or 2008

    - by Erick
    Hi everyone, I'm trying to get a backup from SQL Server 2008 R2 restored to SQL Server 2008, but when we try to do the restore we get this: The database was backed up on a server running version 10.50.1092. That version is incompatible with this server, which is running version 10.00.2531. Either restore the database on a server that supports the backup, or use a backup that is compatible with this server. I can use the script wizard to generate a script, but that takes over an hour to run. I also tried just exporting the data from server to server, but it had issues with the primary keys/identity columns. I will be running into this issue with several other clients so any help you could offer about how to get around this would be great. Thanks for your help!

    Read the article

  • How to change the firmware of my NETGEAR WGT624 v2 to DD-WRT

    - by Lirik
    In reference to my previous question: I can't find the appropriate firmware for my NETGEAR WGT624 v2 router. I went to the dd-wrt web site and I read the wiki, but I didn't see any files or instructions on how to change the firmware. The router database lists my router, but as I said: no files. In addition, the "Supported" column lists "wip"; what is "wip"?
    Router Database - 4 routers found
    Manufacturer | Model  | Revision | Supported | Activation required
    Netgear      | WGT624 | v1       | wip       | no
    Netgear      | WGT624 | v2       | wip       | no
    ...
    The only thing listed for my router on the dd-wrt web site is:
    Router details
    Chipset: AR2312A
    RAM: 16 MB
    FLASH: 4 MB
    Additional information
    * OpenWRT Wiki: Netgear WGT624
    * Unbrick procedure
    Is the router supported? Can I change the firmware? If yes, then how can I change it?

    Read the article

  • Windows - Use Local Service and/or Network Service account for a windows service

    - by user19185
    I've created a Windows service that monitors files in a specific directory on our Windows OS. When a file is detected, the service does some file I/O, reads the files, creates sub-directories, etc. This service also uses database connectivity to connect to another server. My plan is to have the service run as the default "Local Service" account. Since I need to allow read/write privileges, which apparently the "Local Service" account does not have by default, I'm going to explicitly grant the "Local Service" account "Full Control" on the folder that I'm reading from and writing to. I believe the above is a good approach. My question is: for the folder that I'm reading and writing to, do I need to set up a "Network Service" role with full control access? I'm wondering, since my service uses database connectivity to another server, if I'll need the "Network Service" account set up. I may be misunderstanding what the "Network Service" account does.

    Read the article

  • Non-functioning AutoFilter on Locked Cells in Office 2008 - works in Office 2007

    - by Sarcas
    I'm looking into a problem for someone who works in a mixed OS environment. She has created an Excel spreadsheet in Office 2007 to act as a directory, with AutoFilter turned on for names, email addresses, departments, etc. To make sure no one accidentally edits email addresses (for example), she has protected the worksheet. Accessing this worksheet on a PC running Excel 2007, everything runs as you'd expect. You can filter the sheet by any of the auto-filtered columns, and because the sheet is protected, the data integrity is guaranteed. However, if you access the sheet on a Mac running Excel 2008, you can't filter the columns. What's strange here is that the AutoFilter dropdown arrows do appear in each of the column headers as you would expect. It's just that nothing happens if you click on them. If you select one of the column header cells (say, 'First Name') and check the menu Data > Filter, you can see that AutoFilter is ticked. As another data point, you also seem to be able to apply an Advanced filter to these rows on the protected sheets. Does anyone know why this might be? It seems to be a compatibility issue between Excel 2007/2008 (I know the codebase isn't the same), but I can't find any references to it in documentation or forums anywhere, and it would be good to know if there's a way around this. Thanks!

    Read the article

  • MySQL replication Slave_IO_Running: No

    - by Christy
    Hi all, I have two servers that I am trying to set up replication of one database between. I found a setup guide on SourceForge that I followed, and I have tried various other settings since then, but no matter what I do, when I start the slave, the 'Slave_IO_Running' setting is always No... I have no idea why or what to look at; any suggestions are appreciated. The slave setup was:

      CHANGE MASTER TO
          MASTER_HOST='myserver.mydomain.net',
          MASTER_USER='slave_user',
          MASTER_PASSWORD='mypassword',
          MASTER_LOG_FILE='mysql-bin.000011',
          MASTER_LOG_POS=1368363;

    (last data from today, trying to do the setup again. I deleted and recreated the database on the slave from a new dump and tried to redo the setup.) I have slave_user set up for %, localhost, and the specific IP of the slave computer, but nothing seems to be working... Thanks in advance for any advice or suggestions
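    When Slave_IO_Running stays No, the usual first step is to look at Last_IO_Errno and Last_IO_Error in SHOW SLAVE STATUS on the slave; they typically point at the real cause (wrong credentials, a firewall blocking port 3306, a server_id collision, or a bad log file/position). A small sketch of pulling those fields with the pymysql driver; the connection details are placeholders, and the plain mysql command-line client works just as well:

      import pymysql
      from pymysql.cursors import DictCursor

      # Run this on the slave, with an account allowed to execute SHOW SLAVE STATUS.
      conn = pymysql.connect(host="localhost", user="root", password="mypassword",
                             cursorclass=DictCursor)

      with conn.cursor() as cur:
          cur.execute("SHOW SLAVE STATUS")
          status = cur.fetchone() or {}

      for key in ("Slave_IO_Running", "Slave_SQL_Running", "Last_IO_Errno", "Last_IO_Error"):
          print(key, "=", status.get(key))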

    Read the article

  • Magento Community - Hosting :: Need advice on which shared hosting will run Magento fast

    - by user43353
    Hi, I need advice on which shared hosting will run Magento Community fast, or some other not-expensive solution. This website will not have a lot of users; I only need it to run fast for 100-20 users at the same time. The problem with Magento is the database design, which makes this system very slow; other stuff is not the best either. I had hostmonster.com and justhost.com for a previous website but it wasn't fast enough for a single user not located in the USA (my customer areas: Asia, Africa). Each action that involves the database takes a lot of time. Thanks

    Read the article

  • IOError: [Errno 32] Broken pipe

    - by khati
    I got "IOError: [Errno 32] Broken pipe" while writing files in linux. I am using python to read each line a of csv file and then write into a database table. My code is f = open(path,'r') command = command to connect to database p = Popen(command, shell=True, stdin=PIPE, stdout=PIPE, stderr=PIPE, env=env) query = " COPY myTable( id, name, address) FROM STDIN WITH DELIMITER ';' CSV QUOTE '"'; " p.stdin.write(query.encode('ascii')) *-->(Here exactly I got the error, p.stdin.write(query.encode('ascii')) IOError: [Errno 32] Broken pipe )* So when I run this program in linux, I got error "IOError: [Errno 32] Broken pipe" . However this works fine when I run in windows7. Do I need to do some configuration in Linux sever? Any suggestions will be appreciated. Thank you.

    Read the article

  • SQL Server 2000 tables

    - by user40766
    We currently have a SQL Server 2000 database with one table containing data for multiple users. The data is keyed by memberid, which is an integer field, and the table has a clustered index on memberid. The table is now about 200 million rows, and indexing and maintenance are becoming issues. We are debating splitting the table into a one-table-per-user model. This would imply that we would end up with a very large number of tables, potentially up to 2,147,483,647, considering just positive values. My questions: Does anyone have any experience with a SQL Server (2000/2005) installation with millions of tables? What are the implications of this architecture with regards to maintenance and access using Query Analyzer, Enterprise Manager, etc.? What are the implications of having such a large number of indexes in a database instance? All comments are appreciated. Thanks

    Read the article

  • [DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access denied.

    - by shah
    The web server and SQL Server are running on different machines. Below is the connection string that we are using to connect to the MS SQL database from a classic ASP web application:

      set oConn = server.createobject("ADODB.Connection")
      oConn.open "PROVIDER=SQLOLEDB;Data Source=xxx.xxx.x.xx,1433;Network Library=DBMSSOCN;Initial Catalog=databasename;User ID=xxxxx;Password=xxxxx;"

    No idea why it's losing the database connection in the middle of uploading the page. Here is the error message that we got:

      Microsoft OLE DB Provider for SQL Server error '80004005'
      [DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access denied.

    We have already verified the SQL Server 2005 remote connection settings and the default port number.
    * Remote connections are enabled in SQL Server as per http://support.microsoft.com/kb/914277
    Please help. Thanks,

    Read the article
