Search Results

Search found 9154 results on 367 pages for 'job market'.

  • Can you make a living as a system programmer?

    - by Helper Method
    Is there still a market for C system programmers? I love Java and some of the newer JVM languages, but at the same time I really enjoy low-level system programming under Unix, using C and the GNU toolchain (it makes you feel elitist ;-)). So I wonder: a) is there still a market for C system programmers, b) how much do you earn compared to an application programmer, and c) is it as much fun on a large-scale project?

  • Reliable, Australia-based ASP.NET web hosting

    - by Leonardo
    In the excellent Secret Geek’s Building a Micro-ISV series, Leon Bambrick admits that he prefers to host his sites in the US because of the prices and the proximity to his target market. For Australian companies and start-ups, what’s the best ASP.NET web hosting in the country? And should a company consider hosting its website overseas even if its potential market is here?

  • Implementing in-app purchases in Android?

    - by hgpc
    It looks like Android won't natively support in-app purchases for a while, and when it does there may be a huge installed base of devices that don't support them. What's the best way to implement iPhone-like in-app purchases (additional content or services) in Android, using the Android Market if possible? The solution should consider in particular:

    - For all kinds of in-app purchases: the Android Market's 24-hour cancellation policy.
    - For consumables/non-consumables: where to store the additional content (i.e. use precious application memory to deter piracy, or use the SD card to avoid bloating application memory).

    Thanks!

  • Which images does an Android device download at installation time?

    - by VSC
    I have 20 images in the hdpi folder, 20 in the mdpi folder, and 20 in the ldpi folder; they are the same images at different resolutions. I am publishing my app on the Android Market. When a user installs my app on a normal (medium-density) Android device through the Market, does the device download all 60 images, or only the 20 mdpi images? Thanks for reading my question.

  • Microsoft Declares the Future of ASP.NET is Web API

    - by sbwalker
    Sitting on a plane on my way home from Tech Ed 2012 in Orlando, I thought it would be a good time to jot down some key takeaways from this year’s conference. Some of these items I have known since the Microsoft MVP Summit, which took place in Redmond in late February (but due to NDA restrictions I could not share them with the developer community at large), and some are the result of insightful conversations with a wide variety of industry insiders and Microsoft employees at the conference.

    First, let’s travel back in time four years to the Microsoft MVP Summit in 2008. Microsoft was facing some heat from market newcomer Ruby on Rails and responded with a new web development framework of its own, ASP.NET MVC. At the Summit they estimated that MVC would only be applicable to ~10% of all new web development projects. Based on that prediction I questioned why they were investing such considerable resources in such a relative edge case, but my guess is that they felt it was an important edge case at the time, as some of the more vocal .NET evangelists as well as some very high-profile start-ups (e.g. Twitter) had publicly announced their intent to use Rails.

    Microsoft made a lot of noise about MVC. In fact, they focused so much of their messaging and marketing hype around MVC that it appeared that WebForms was essentially dead. Yes, it may have been true that Microsoft continued to invest in WebForms, but from an outside perspective it really appeared that MVC was the only framework getting any real attention. As a result, MVC started to gain market share. An inside source at Microsoft told me that MVC usage has grown at a rate of about 5% per year and now sits at ~30%. Essentially, by focusing so much marketing effort on MVC, Microsoft created a larger market demand for it. This is because in the Microsoft ecosystem there is somewhat of a bandwagon mentality amongst developers: if Microsoft spends a lot of time talking about a specific technology, developers get the perception that it must be really important. So rather than choosing the right tool for the job, they often choose the tool with the most marketing hype and then try to sell it to the customer.

    In 2010, I blogged about the fact that MVC did not make any business sense for the DotNetNuke platform. This was because our ecosystem relied on third-party extensions which were dependent on the WebForms model. If we migrated the core to MVC, all of the third-party extensions would no longer be compatible, which would have been an irresponsible business decision for us to make at the expense of our users and customers. However, this did not stop the debate from continuing in our ecosystem. Clearly some developers had drunk Microsoft’s Kool-Aid about MVC and were of the mindset, to paraphrase an old Scottish saying, “If it’s not MVC, it’s crap”. Now, this is a rather ignorant position to take, as most of the benefits of MVC can be achieved in WebForms with solid architecture and responsible coding practices. Clean separation of concerns, unit testing, and direct control over page output are all possible in the WebForms model – it just requires diligence and discipline. So over the past few years, horror stories have begun to bubble to the surface about software development projects focused on ground-up rewrites of web applications for the sole purpose of migrating from WebForms to MVC.
    These large-scale rewrites were typically initiated by engineering teams with only a single argument driving the business decision: that Microsoft was promoting MVC as “the future”. These ill-fated rewrites offered no benefit to end users or customers and in fact resulted in less stable, less scalable, and more complicated systems – basically taking one step forward and two full steps back. A case in point is the announcement earlier this week that a popular open source .NET CMS provider has decided to pull the plug on its new MVC product, which had been under active development for more than 18 months, and revert to WebForms.

    The availability of multiple server-side development models has deeply fragmented the Microsoft developer community. Some folks like to compare it to the age-old VB vs. C# language debate. However, the VB vs. C# debate was ultimately more of a religious war, because the two dominant programming languages were at least compatible with one another and could be used interchangeably. The issue with WebForms vs. MVC is much more challenging, because the messaging from Microsoft has positioned the two solutions as being incompatible with one another, and as a result web developers feel they are forced to choose one path or the other. Yes, it has always been technically possible to use WebForms and MVC in the same project, but the tooling support has always made this feel “dirty”. The fragmentation has also made it difficult to attract newcomers, as the perceived barrier to entry for learning ASP.NET has become higher. As a result, many new software developers entering the market are gravitating to environments where the development model seems simpler and more intuitive (e.g. PHP or Ruby).

    At the same time that the Web Platform team was busy promoting ASP.NET MVC, the Microsoft Office team was promoting SharePoint as a platform for building internal enterprise web applications. SharePoint has great penetration in the enterprise and over time has been enhanced with improved extensibility capabilities for software developers. But, like many other mature enterprise ASP.NET web applications, it is built on the WebForms development model. Similar to DotNetNuke, SharePoint leverages a rich third-party ecosystem for both generic web controls and more specialized WebParts, both of which rely on WebForms. So basically this resulted in a situation where the Web Platform group had headed off in one direction and the Office team had gone in another, and the end customer was stuck in the middle trying to figure out what to do with their existing investments in Microsoft technology. It really emphasized the perception that the left hand was not speaking to the right hand; strategically speaking, there did not seem to be any high-level plan from Microsoft to ensure consistency and continuity across the different product lines.

    The introduction of ASP.NET MVC also made some of the third-party control vendors scratch their heads and wonder what the heck Microsoft was thinking. The original value proposition of ASP.NET over Classic ASP was the ability for web developers to emulate the highly productive desktop development model by using abstract components for creating rich, interactive web interfaces. Web control vendors like Telerik, Infragistics, DevExpress, and ComponentArt had all built sizable businesses offering powerful user interface components to WebForms developers.
    And even after MVC was introduced, these vendors continued to improve their products, offering greater productivity and, via AJAX, a superior user experience to what was possible in MVC. And since many developers were comfortable and satisfied with these third-party solutions, demand remained strong and the third-party web control market continued to prosper despite the availability of MVC.

    While all of this was going on in the Microsoft ecosystem, there has also been a fundamental shift in the general software development industry. Driven by the explosion of Internet-enabled devices, the focus has now centered on service-oriented architecture (SOA). Service-oriented architecture is all about defining a public API for your product that any client can consume, whether it’s a native application running on a smartphone or tablet, a web browser taking advantage of HTML5 and JavaScript, or a rich desktop application running on a PC. REST-based services, which utilize the less verbose JSON as a transport mechanism, have become the preferred approach over older, more bloated SOAP-based techniques. SOA also has the benefit of producing a cross-platform API, as every major technology stack is able to interact with standard REST-based web services. And for web applications, more and more developers are turning to robust JavaScript libraries like jQuery and Knockout for browser-based, client-side techniques for calling web services and rendering content to end users. In fact, traditional server-side page rendering has largely fallen out of favor, resulting in decreased demand for server-side frameworks like Ruby on Rails, WebForms, and (gasp) MVC.

    In response to these new industry trends, Microsoft did what it always does: it immediately poured resources into developing a solution to ensure it remains relevant and competitive in the web space. This work culminated in a new framework branded as Web API. It is convention-based and designed to embrace native HTTP standards without copious layers of abstraction. The framework is designed to be the ultimate replacement for both the REST aspects of WCF and ASP.NET MVC web services. And since it was developed out of band, with a dependency only on ASP.NET 4.0, it can be used immediately in a variety of production scenarios.

    So at Tech Ed 2012 it was made abundantly clear in numerous sessions that Microsoft views Web API as the “Future of ASP.NET”. In fact, one Microsoft PM even went as far as to say that, if we look 3-4 years into the future, all ASP.NET web applications will be developed using the Web API approach. This is a fairly bold prediction and clearly telegraphs where Microsoft plans to allocate its resources going forward. Currently Web API is being delivered as part of the MVC4 package, but this is only temporary for the sake of convenience. It also sounds like there are still internal discussions going on about how to brand the various aspects of ASP.NET going forward; perhaps the moniker “ASP.NET Web Stack”, coined a couple of years ago by Scott Hanselman and used for the open source release of the ASP.NET bits on CodePlex a few months back, will eventually stick. Web API is being positioned as the unification of ASP.NET, the glue that is able to pull this fragmented mess back together again. The “One ASP.NET” strategy will promote the use of all frameworks - WebForms, MVC, and Web API - even within the same web project.
    Basically the message is: utilize the appropriate aspects of each framework to solve your business problems. Instead of leading developers to a fork in the road, the plan is to educate them that “hybrid” applications are a great strategy for delivering solutions to customers. In addition, the service-oriented approach coupled with the client-side development promoted by Web API can be used effectively in both WebForms and MVC applications. This means it is also relevant to application platforms like DotNetNuke and SharePoint, which means it starts to create a unified development strategy across all ASP.NET product lines once again.

    And so what about MVC? There have actually been rumors floated that MVC has reached a stage of maturity where, similar to WebForms, it will be treated more as a maintenance product line going forward (MVC4 may in fact be the last significant iteration of this framework). This may sound alarming to some folks who have recently adopted MVC, but it really shouldn’t, as both WebForms and MVC will continue to play a vital role in delivering solutions to customers. They will just not be the primary area where Microsoft spends the majority of its R&D resources; that distinction will obviously go to Web API. And when the question comes up of why not enhance MVC to make it work with Web API, you must take a step back and look at this from a higher level to see that it really makes no sense. MVC is a server-side page compositing framework, whereas Web API promotes client-side page compositing with a heavy focus on web services. Making MVC work well with Web API would require a complete rewrite of MVC, and at the end of the day there would be no upgrade path for existing MVC applications. So it really does not make much business sense.

    So what does this have to do with DotNetNuke? Well, around 8-12 months ago we recognized the software industry’s trend towards web services and client-side development. We decided to adopt a “hybrid” model which would provide compatibility for existing modules while at the same time providing a bridge for developers who want to utilize more modern web techniques. Customers who like the productivity and familiarity of WebForms can continue to build custom modules using the traditional approach. However, in DotNetNuke 6.2 we also introduced a new Services Framework, which is actually built on top of MVC2 (we chose to leverage MVC because it had the most intuitive, lightweight REST implementation in the .NET stack). The Services Framework allowed us to build some rich interactive features in DotNetNuke 6.2, including the Messaging and Notification Center and the Activity Feed. But based on where we know Microsoft is heading, it makes sense for the next major version of DotNetNuke (expected to be released in Q4 2012) to migrate from MVC2 to Web API. This will likely result in some breaking changes in the Services Framework, but we feel it is the best approach for ensuring the platform remains highly modern and relevant. The fact that our development strategy is perfectly aligned with the “One ASP.NET” strategy from Microsoft means that our customers and developer community can be confident in their current and future investments in the DotNetNuke platform.
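    To make the service-oriented style concrete, here is a minimal sketch of consuming a REST endpoint with curl; the endpoint URL and JSON payload are hypothetical, but the point is that these calls work unchanged from a phone app, a browser, or a desktop client, which is exactly the cross-platform appeal described above:

        # fetch a single resource as JSON (hypothetical endpoint)
        curl -H "Accept: application/json" http://example.com/api/tasks/42

        # create a resource by POSTing a JSON body to the same hypothetical API
        curl -X POST -H "Content-Type: application/json" \
             -d '{"title":"write report"}' http://example.com/api/tasks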

  • Testing IPP Printing with ipptool

    - by senloe
    I'm trying to send an IPP print job using ipptool. Using the sample .test files, I can send commands to the printer, but I am unable to successfully use the print-job.test file. Here's an example run:

        c:\...>ipptool -v ipp://name.local.:631/ipp/printer print-job.test
        ipptool: Filename "$filename" on line 21 cannot be read.
        ipptool: Filename mapped to "".

    It looks like it's failing to resolve the variable $filename within the test file, so I attempted to hardcode the value in the test file. In that case I get no error, but still no printout. Does anybody have any experience using ipptool to test IPP printing?
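    For reference, a sketch of supplying the file on the command line with ipptool's -f option, which sets the default request filename that $filename expands to (printer URI as above; testpage.ps is a placeholder for a file the printer understands):

        ipptool -v -f testpage.ps ipp://name.local.:631/ipp/printer print-job.test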

  • ulimit not reflected for Jenkins slave

    - by techastute
    Problem: I got java.io.IOException: Too many open files during Solr indexing through Jenkins. Some googling suggested setting the ulimit on the box where the job runs, so on the Linux x86_64 slave I raised it in both of the following ways:

        ulimit -n 1000000

    and in /etc/security/limits.conf:

        userx soft nofile 1000000
        userx hard nofile 1000000

    where userx is the user the Jenkins job executes as. When I ssh to the box as userx manually and check ulimit -n, I get 1000000.

    Question: But when the same ulimit -n is executed from a Jenkins job, it reports only 1024, which is the default. Any advice would be much appreciated.
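    One detail worth knowing: /etc/security/limits.conf is applied by PAM at login, so a slave agent that was launched by a daemon rather than a login session can keep inheriting the daemon's default limits. A rough sketch of forcing the limit in the same shell that starts the agent (the slave.jar launch command is illustrative; adjust to however this slave is actually started):

        ulimit -n 1000000          # raise the fd limit for this shell
        exec java -jar slave.jar   # the agent inherits the raised limit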

  • Deploy to JBoss 7 using Hudson Deploy plugin

    - by Uluk Biy
    I have 2 machines, one running Hudson CI and the other JBoss 7 AS. In Hudson, I have installed the Deploy Plugin, created a new job, and filled in the required JBoss manager user connection fields. When I run the job, the project builds successfully; however, the deployment to the remote JBoss AS is never triggered, and there are no errors or messages about the deployment in the log. What should I do?

    EDIT: The deployment is triggered (or at least expected to be) as a "Post-build Action" with these parameters; it is not a separate job:

        [x] Deploy war/ear to a container
        WAR/EAR files: **/*.war
        Container: JBoss 7.x
        Manager user name: test
        Manager password: ****
        JBoss URL: http://192.168.1.2
        JBoss JMX Management port: 9990

  • Server Administration

    - by Kassem
    Hi everyone, my client asked me for a job description for a system administrator, because I might be assigned this position along with the other guy I'm working with. To be honest, I do not know much about a system administrator's job, but I'm willing to learn. Questions:

    - What are the security requirements of a server? *
    - What are the key responsibilities in a system admin's job description?
    - What are some of the day-to-day tasks of a system admin?
    - What is the average monthly salary of a system admin?

    Note: I will be working inside a Windows environment, but your replies do not necessarily need to be restricted to Windows. (*) Other software I know will be required: Windows Server 2008, IIS 7.0, MS SQL Server, .NET 4.0 Runtime. Let me know if there are other things I should be aware of as well. Thanks!

  • Running "Rebuild Index" maintenance plan with "Online indexing"

    - by Bharanidharan
    Hi, I am using Windows Server 2003 SP2 and SQL Server 2005 Enterprise Edition. I am creating a "Rebuild Index" job for a particular database, and I am able to run the job successfully. But when I enable the "Keep index online while rebuilding" option, the job no longer executes successfully and throws errors. I have posted screenshots at http://img535.imageshack.us/gal.php?g=error1r.png (I am not able to attach the images here since I do not have 10 points yet). Any help would be appreciated. Thanks.

  • SQL Server 2005 Default Backup Plan

    - by tylerl
    I noticed that a newly imported database on SQL Server 2005 had configured itself (without my knowledge) to perform daily backups, but it's not deleting old files and is quickly filling up the disk. I don't know how the backup job got configured (maybe that's something that gets transferred when you move a database?), but I'm having trouble modifying it. The backup runs as a SQL Server Agent job called "Daily Backups". This job runs a package called "(SSIS Packages)\Maintenance Plans\Backup Plan", which I can't find; the "Management\Maintenance Plans" area for my server is empty. I imagine I could delete the existing plan and re-create it manually, but I was hoping to just modify what is already there, since all that's missing is deleting old files.

  • Which Message Queue should I choose (must run on Linux)

    - by MHS
    There are many open source message queues for Linux, and I need some help deciding which one to go for. My problem is simple: I get sent a list of files that need to be processed. Each job can't be split up, but the jobs are self-contained and can be spread across multiple computers. I'm thinking of solving this with a message queue: multiple clients send messages to a central queue, and the queue has a number of subscribers that take jobs from it when they have finished processing their current job. Ideally the queue should have the following qualities:

    - It must be able to store unprocessed messages in case of a shutdown/reboot.
    - A job can only be processed by a single subscriber; I don't want duplicate jobs (see the sketch below).
    - Subscribers should be able to submit jobs of their own, to be processed by a different set of subscribers.

    Can anyone suggest a simple-to-use message queue?
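    For illustration only, here is a naive shell sketch of the single-consumer claim the second requirement implies: a rename is atomic within one filesystem, so only one worker can successfully claim a given job file. A real broker (for example RabbitMQ or beanstalkd) provides this plus persistence and delivery guarantees out of the box; process_job and the spool paths are hypothetical:

        #!/bin/sh
        # naive competing-consumer loop over a spool directory
        while true; do
          for f in /var/spool/jobs/*; do
            [ -e "$f" ] || continue
            # atomic claim: only one worker's mv can succeed
            mv "$f" "/var/spool/claimed/$$.${f##*/}" 2>/dev/null || continue
            process_job "/var/spool/claimed/$$.${f##*/}"   # hypothetical handler
          done
          sleep 5
        done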

  • Help with running crontab from root

    - by user242065
    I'm using OS X and having trouble getting a cron job to run. I type the following:

        $ sudo -i
        $ crontab -e

    I then enter:

        * * * * * root ifconfig en0 down > /dev/null
        0 19 * * * root ifconfig en0 down > /dev/null
        0 7 * * * root ifconfig en0 up > /dev/null

    with no success (the first line is just for testing). I want it to shut off my internet; the last two lines I plan to leave in once I get this working. If I type ifconfig en0 down into the terminal, the internet goes off. Why is my cron job not shutting down the internet? FYI: this is a follow-up to http://stackoverflow.com/questions/3027362/how-can-i-write-a-cron-job-that-will-block-my-internet-from-7pm-to-7am-so-i-can - most of the comments there are people making fun of me, plus a few attempts to solve the problem without cron jobs.
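    One formatting detail to check in the entries above: crontabs installed with crontab -e (even root's, via sudo -i) use the per-user format, which has no user column; that column exists only in /etc/crontab. The same schedule in per-user format, with full paths since cron runs with a minimal PATH, would look roughly like this:

        * * * * * /sbin/ifconfig en0 down > /dev/null 2>&1
        0 19 * * * /sbin/ifconfig en0 down > /dev/null 2>&1
        0 7 * * * /sbin/ifconfig en0 up > /dev/null 2>&1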

  • How to configure/save layout of SQL Server's Log File Viewer?

    - by gernblandston
    When I'm viewing the job history of a particular SQL Agent job, I typically want to see whether it succeeded, its duration, and maybe the duration of the individual steps of the job. When I open the history in the Log File Viewer, I always need to scroll over, shrink the 'Message' column, and drag the 'Duration' column over next to the 'Step Name' column. Is there a way to configure the layout of the Log File Viewer (e.g. reposition and resize columns) and save it for future sessions? Thanks!

  • Printer options follow Office documents

    - by tkalve
    One person (John) creates an Office document and prints it to his HP printer, which uses the HP Universal Printing PS (v4.7) driver. He has Job Storage (Personal Job) enabled for this printer, with a custom username and a personal PIN. He later sends the document in an e-mail to his colleagues. Another person (Anne) opens the document and tries to print it to her HP printer (also using the HP Universal Printing driver), but she is unable to fetch it at the printer: the Job Storage options from John's computer follow the Office Excel document, so Anne has to change them manually to her own username and PIN before she can print. What on earth is causing this, and how do we fix it?

  • SQL 2005 Log Shipping - Was working, now isn't!

    - by Jim
    Hello, I had log shipping working fine between two SQL 2005 servers. I suspect that a job was added to the source server which backed up the transaction log to disk (nothing to do with the existing log shipping job). As I understand it, if you do this then log shipping will stop working, and sure enough, it no longer works. I've deleted the job which had just been created, but log shipping still does not work. I've rebooted both servers and, again, log shipping does not work. I'm at a loss now... all I get is the following error:

        The log shipping secondary database XXXXXXXXXX has restore threshold of 45 minutes and is out of sync. No restore was performed for 5882 minutes. Restored latency is 15 minutes. Check agent log and logshipping monitor information.

    Any help appreciated! Thanks in advance.

  • SQL Server not releasing Memory

    - by noob2487
    I am using SQL Server 2005 and running a job which processes around 100K records. The job runs fine and takes about 45 minutes to execute, which is good. But after the job has finished, I can see the SQL Server 2005 instance still holding around 900 MB of memory. I waited around 2 hours, but that memory was not released. Is there any process that takes care of memory here, something like GC (unpredictable)? Or am I doing something wrong?
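    For context: by design, SQL Server holds on to buffer pool memory it has acquired and gives it back only under operating-system memory pressure, so this is normal behavior rather than a leak. If the instance must be capped, the standard knob is the max server memory setting; a sketch via sqlcmd, where the server name and the 512 MB cap are placeholders:

        sqlcmd -S myserver -E -Q "EXEC sp_configure 'show advanced options', 1; RECONFIGURE; EXEC sp_configure 'max server memory (MB)', 512; RECONFIGURE;"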

  • Backup SQL server db issue: delete old backup files

    - by David.Chu.ca
    I tried to use the sqlmaint.exe tool to back up a database on a remote PC. Here is an example backup command:

        sqlmaint.exe -S remoteSQLServer\SQLInstance -U username -P pwdxxx -D myDB -BkUpMedia DISK -BkUpDB C:\MSSQL_Backups -DelBkUps 3days ...

    Here I specified deleting backups older than 3 days. However, the job does not seem to delete old .bak files on the remote PC (where the SQL Server sits). The remote PC runs Windows Server 2008, and I have shared C:\MSSQL_Backups as a network drive with Everyone as owner. My understanding is that the job should delete any .bak files older than 3 days. Not sure what I missed? By the way, the job runs on a box with SQL Server 2005 installed.

  • Organizing files relationally in Windows 7?

    - by Cayetano Gonçalves
    I just took a new job as a policy analyst, and after only one week, keeping track of hundreds of files (lawsuits, legislation, letters, etc.) in Windows 7 is proving difficult. In my last job I was a database architect, and I helped build Linux-based servers to track files for an entire department; however, there is no way for me to do that in this job at this time. Is there any way to track files/indices/locations/tags/themes and store them in some kind of RDBMS, instead of storing the files in folders that only allow flat and fixed storage? For example, if I have a file that deals with the ELID organization, an appeals court, and John Smith, it really is inconvenient to have to decide which one of these tags to turn into a folder and place the file in it, when the file falls under all of the categories. Even if I could just tag files the way you can tag posts on Stack Exchange, it would save a lot of heartache.

  • Printer offline until spooler service is restarted multiple times

    - by Zian Choy
    When I try to print from my ThinkPad to a printer shared through a Windows 7 HomeGroup hosted by a desktop computer, I often have to restart the Print Spooler service several times before the job goes through. In particular, this problem occurs when the desktop is in sleep mode at the time the print job is started and is then brought out of sleep mode after the job has been kicked off. Both computers are running Windows 7 32-bit with the latest patches. I have tried the following with no improvement:

    - the SNMP registry hack (see the MS KB for details)
    - following the instructions in a blog post entitled "Sharing Printers on Vista 64-bit"
    - looking at "Printer offline until spooler service is restarted"

  • Crontab - stop sending mail, special case ||

    - by 2ge
    Hi all, I need to put a small command into my crontab which checks whether the lighttpd web server is running, because for some reason it hangs up sometimes. So I have this:

        * * * * * root /bin/pgrep lighttpd || /usr/local/etc/rc.d/lighttpd restart >/dev/null 2>&1

    The problem is that this sends me a mail every minute; in the mail is the PID of the running lighttpd process. For my other crontab jobs the redirection works, so I assume the "||" is causing the problem. Maybe the job should be rewritten to use pgrep's exit status so I can avoid the "||". I am using FreeBSD. Thanks for any help; for now I have disabled this job.
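    The mail is almost certainly pgrep's own stdout: when lighttpd is running, pgrep prints its PID, and the redirection in the entry above applies only to the restart command after the "||". A sketch that silences both commands, keeping the same /etc/crontab format with the user column:

        * * * * * root /bin/pgrep lighttpd >/dev/null 2>&1 || /usr/local/etc/rc.d/lighttpd restart >/dev/null 2>&1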

  • Where are the SQLServer jobs stored?

    - by Saiyine
    I'd like to know what process a SQL Server job is executing, but all I can find is that it calls DTSRun with an encrypted string. After decoding the string, the result is just the name of the job with the user and the password. How can I find out what this job is really calling? Edit: I've found a candidate: the packages could be in msdb.sysdtspackages, but again, I can't read them, as SQL Server says the data is binary. How can I read them to confirm they are the jobs?
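    For what it's worth, Agent job definitions themselves live in msdb; a sketch that lists the command text of every job step (run with sqlcmd on 2005 or osql on 2000; the server name is a placeholder):

        sqlcmd -S myserver -E -Q "SELECT j.name, s.step_id, s.subsystem, s.command FROM msdb.dbo.sysjobs j JOIN msdb.dbo.sysjobsteps s ON s.job_id = j.job_id ORDER BY j.name, s.step_id"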

  • How to allow Hudson build URL through Nginx auth_basic?

    - by rodreegez
    Hi, I have Hudson running and made available to the world via nginx. I have protected Hudson with nginx's auth_basic, and that works great. The trouble is, I want to allow unauthenticated requests to the build URL, i.e. /job/<job_name>/build. Currently I have this in my nginx conf:

        upstream hudson {
            server 127.0.0.1:8888;
        }

        server {
            server_name ci.myurl.com;
            root /var/lib/hudson;

            location / {
                proxy_pass http://hudson/;
                auth_basic "Super secret stuff";
                auth_basic_user_file /var/opt/hudson/htpasswd;
            }

            location ~ \/build {
                auth_basic off;
            }
        }

    I can't get that second location to allow unauthenticated requests. I have tried various combinations:

        location ~ /job/(.*)/biuld { }
        location ^~ \/build { }
        location ~ \/job\/(.*)\/build { }

    etc... Maddening! Can anyone point me in the right direction? Thanks, Ad.
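    One nginx detail that commonly bites here: directives such as proxy_pass are not inherited from other location blocks, so a location that only switches auth_basic off will match the request but has nothing to proxy it to. A sketch of the unauthenticated location carrying its own proxy_pass (same upstream as above):

        location ~ ^/job/[^/]+/build {
            auth_basic off;
            proxy_pass http://hudson;
        }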

  • Specific cron at time point [closed]

    - by ARTI
    I have a very specific task but can't handle it. I am not a programmer and a total n00b at bash scripts. So the question is: how do I create a cron job like this? Script A.sh can be called by hand at any time, and it should schedule script B.sh to run once at the nearest upcoming time point. For example, I have 4 time points: 10.00pm, 10.15pm, 10.30pm and 10.45pm. So if I trigger A.sh at 10.07pm, it should arrange for B.sh to run ONCE at 10.15pm, because 10.15pm is the nearest time point in the future. Is this possible? How can I write such a script A.sh? I use CentOS 6. It is very important and urgent for me. Thank you very much.
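    Since B.sh should run exactly once, at(1) is a simpler fit than installing and then removing a cron entry; a rough sketch of A.sh under that assumption (the path to B.sh is a placeholder, and the atd service must be running):

        #!/bin/sh
        # A.sh: schedule B.sh to run once at the next quarter-hour boundary
        min=$(date +%-M)                # current minute, without zero padding
        wait=$(( 15 - (min % 15) ))     # minutes until :00 / :15 / :30 / :45
        echo /path/to/B.sh | at now + ${wait} minutes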

  • Error while taking Transaction log backup

    - by Divya Kapoor
    Hello, I have scheduled a transaction log backup, but the backup is not happening. The error in the logs is this:

        Transaction Log Backup.Subplan_1,Error,0,ARCOTDB1\ARCOT_DB_INST1,Transaction Log Backup.Subplan_1,(Job outcome),,The job failed. Unable to determine if the owner (ARCOT-DB1\Superuser) of job Transaction Log Backup.Subplan_1 has server access (reason: Could not obtain information about Windows NT group/user 'ARCOT-DB1\Superuser'<c/> error code 0x534. [SQLSTATE 42000] (Error 15404))

    Please help!
