Search Results

Search found 24117 results on 965 pages for 'write'.

  • Slow Network Performance with Windows Server 2008 SP1

    - by Axeva
    I recently installed Service Pack 1 for Windows Server 2008. Since that time, network performance has been awful. Both Windows 7 and Mac Snow Leopard clients have seen miserable speeds when trying to read or write to the server. This is the exact update: Windows Server 2008 R2 Service Pack 1 x64 Edition (KB976932) It's a very simple file server setup. No Domain or Active Directory. Essentially just shared folders. It's Windows Web Server that I'm running. Are there any settings I can tweak? Should I roll back the update (doesn't seem wise)? Update: I've turned off the Power Management for the Network Adapter. That may help. If it doesn't have to be powered on at the start of a request, it should speed things up. Or so I would assume.

    Read the article

  • Caching API Proxy Server

    - by edc1591
    I need to have a server that caches API responses and then forwards them along to a desktop app. I don't really have much experience with this, so I have a few questions. First of all, what kind of server should I get? I already use Linode for my websites, so ideally I'd like to go with them. I expect to get anywhere from 30 million to 40 million requests to my proxy server each month. Will a 512 Linode be able to support that? Also, is there any software out there that does this already, or will I have to write my own? The API responses are roughly 10 KB each on average, so doing the math, that's a lot of data each month. Should I just add more transfer to whatever server I buy, or can I somehow compress the API responses before sending them off to the user? Thanks for any help.
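
    For what it's worth, off-the-shelf HTTP caches such as Varnish, Squid or nginx's proxy_cache already do exactly this kind of response caching, and 30-40 million requests a month averages out to only about 11-15 requests per second and roughly 300-400 GB of transfer at 10 KB per response (gzip-compressing the responses would cut that considerably). A minimal Python sketch of the idea follows; the upstream host, listen port and TTL are placeholders, not anything from the question.

        # Minimal caching API proxy sketch: cache GET responses by path for TTL seconds.
        import time
        import urllib.request
        from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

        UPSTREAM = "https://api.example.com"   # hypothetical upstream API host
        TTL = 300                              # seconds to keep a cached response
        _cache = {}                            # path -> (fetched_at, body)

        class CachingProxy(BaseHTTPRequestHandler):
            def do_GET(self):
                hit = _cache.get(self.path)
                if hit and time.time() - hit[0] < TTL:
                    body = hit[1]                              # serve from cache
                else:
                    with urllib.request.urlopen(UPSTREAM + self.path) as resp:
                        body = resp.read()                     # cache miss: fetch upstream
                    _cache[self.path] = (time.time(), body)
                self.send_response(200)
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)

        if __name__ == "__main__":
            ThreadingHTTPServer(("0.0.0.0", 8080), CachingProxy).serve_forever()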

    Read the article

  • How much system and business analysis should a programmer be reasonably expected to do?

    - by Rahul
    At most of the places I have worked, there were no formal system or business analysts and the programmers were expected to perform both roles. One had to understand all the subsystems and their interdependencies inside out. Further, one was also supposed to have a thorough knowledge of the business logic of the applications and interact directly with the users to gather requirements, answer their queries, etc. In my current job, for example, I spend about 70% of my time doing system analysis and only 30% programming. I consider myself a good programmer but struggle with developing a good understanding of the business rules of a complex application. Often this creates a handicap, because while I can write efficient algorithms and thread-safe code, I lose out to guys who may be average programmers but have a much better understanding of the business processes. So I want to know: how much business and systems knowledge should a programmer have? And how does one go about gaining this knowledge in an immensely complex software system (e.g. trading applications) with several interdependent business processes but poorly documented business rules?

    Read the article

  • How to use T4 templates in WP7, Silverlight, Desktop or even MonoDroid apps

    - by Daniel Cazzulino
    In other words, how to use T4 templates without ANY runtime dependencies? Yes, it is possible, and quite simple and elegant actually. In a desktop project, just open the Add New Item dialog, and search for "text template": From the two available templates, the one that gives you a zero-dependency runtime-usable template is the first one: Preprocessed Text Template. Once unfolded, you get the .tt file, but also a dependent .cs file automatically generated. Note the Custom Tool associated with the file: If you open up the .cs file, you will see that it doesn't contain the rendered "Hello World!!!" I added in the .tt, but rather a full class named after the template file itself:

        namespace ConsoleApplication1
        {
            using System;

            #line 1 "C:\Temp\ConsoleApplication1\ConsoleApplication1\PreTextTemplate1.tt"
            [System.CodeDom.Compiler.GeneratedCodeAttribute("Microsoft.VisualStudio.TextTemplating", "10.0.0.0")]
            public partial class PreTextTemplate1 : PreTextTemplate1Base
            {
                public virtual string TransformText()
                {
                    this.GenerationEnvironment = null;
                    this.Write("Hello World!!!");
                    return this.GenerationEnvironment.ToString();
                }
            }

            #region Base class
            ...
            #endregion
        }

    ... Read full article

    Read the article

  • CodePlex Daily Summary for Monday, June 07, 2010

    CodePlex Daily Summary for Monday, June 07, 2010New ProjectsAgile Personal Development Methodology: 本项目并不是软件开发项目,它是一个关注个人能力发展的项目,希望通过大家的积极参与和实践,在价值观、原则和最佳实践的基础上不断丰富和完善这些内容。我将主要从生活、工作和个人三个主要方面来指导个人如何快速地提高自己的能力。在工作方面,首先关注的是IT技术人员。 希望本方法论的不断完善,能够对不同阶...Altairis Mail Toolkit: Altairis Mail Toolkit is a component for .NET developers making easy to send templated and localized e-mail messages. Uses standard resource mechan...AmazonPriceTicker: AmazonPriceTicker ist ein kleines ASP.NET-Projekt für unser Studium.CC.Hearts Screen Saver: A complete screensaver that draws pretty hearts. Supports all standard screensaver functionality (preview window, options, multi-monitor). Written...Controle Financeiro com DDD: Projeto para fins de estudo das tencologias: - DDD - TDD - Asp Net MVCDebugWriterTextBox: This is modified TextBox which can catch up Debug.Write() and display log. Also it can write log data to file - all you need is to set up file name!DEIConsultingDev: DEI Consulting DEVEasy Augmented Reality Suite for Silverlight: Easy Augmented Reality Suite for SilverlightEvent Broker: Simple event broker with an hierarchical implementation.fleet It: A WPF Client for Team Space. Developed using WPF and C#. Various URL Shortening services integrated.Gest-Bus: Gest-BusHNV Project: Projects created by Hoàng, Ngọc, VũHotel Management System: Hotel Management System : concerned in making a complete website for a hotel every thing in hotel in just one web application : Finance , managemen...ISEN découverte majeure application mobile: ISEN découverte majeure application mobile contrôle d'un pc par un téléphone portable avec plateforme windows 7 LiveSequence: Based in the sequence diagramming control of Nauman Leghari.NHash: This is a simple project that integrates with Explorer and Computes MD5 and SHA1.NHibernate Sidekick Library: NHibernate Sidekick is a library intended to assist in the development of multi-tiered applications using the NHibernate ORM framework.ScrewTurn ASP.NET Proxy Membership Provider: Plugin for the ScrewTurn Wiki System to use Standard ASP.NET Membership and Role providers.Silverlight SNS: We are going to develop a SNS web application based on Silver lightSilverPoiMap: SilverPoiMap provides interactive searching and management of Points of Interest. It is a Facebook client application which allow you to connect to...Simple Resource Localization Editor: Simple editor that simplifies localization and synchronization of .NET resource files (.resx).SimpleBlog.NET MVC: A very simple blog engine developed in ASP.NET 3.5 MVC2 and C#.Siverlight Project: Hope that everybody continue to develop it.SpaceConquest: Incorporated standard design patterns to build a peer to peer game in Java. The game rules were similar in complexity to games like Civilization an...Yasts: Yasts - Yet Another Space Trading Sim - This is a learning environment project.New ReleasesAgile Personal Development Methodology: 敏捷个人-认识自我,管理自我.pdf: 去年我在blog上写了个人管理系列的一些blog,其中一些文章深受大家的喜欢。想到写这个系列是源于在实施敏捷Scrum方法时,对方法实施是否对人的水平需要高要求的一些思考。自组织团队是建立在敏捷个人之上的,没有个人就没有团队,实施Scrum对人要求不高,但想实施得好,那么对人的要求肯定不低。 ...Altairis Mail Toolkit: 1.0.0 RTM: Initial releaseAndrew's XNA Helpers: Andrew's Xna Helpers V1.1: Currently only supports X86 projects since portions of the code have to be reworked to work with the xbox. 
I do plan to code it for the 360 though....C#Mail: Higuchi.Mail.dll (2010.6.6 ver): Higuchi.Mail.dll (2010.4.30 ver)Community Forums NNTP bridge: Community Forums NNTP Bridge V31: Release of the Community Forums NNTP Bridge to access the social and anwsers MS forums with a single, open source NNTP bridge. This release has ad...Html Agility Pack: Experimental Xpath Updates: In efforts to update make Html Agility Packs Xpath support to be closer to the System.Xml.Xpath implementation I have updated HAP to have all nodes...imdb movie downloader: myImdb 0.9.2: myImdb 0.9.2 Fully changed...and added some more features.. working with XML movie list... Used Backgroundworker more clever results and guess m...InfoService: InfoService v1.6 - MPE1 or RAR Package: InfoService Release v1.6.0.136 Please read Plugin installation for installation instructions.ISEN découverte majeure application mobile: appli traitement d'image: but : capturer, redimensionner l'image de l'écran d'un pc.Jeremi Stadler: Clipboard Manager: Version 1.0.5.14 It's finally here! I have been working on this one the whole night but it's worth it ;) The program catches clipboard changes an...mesoBoard: mesoBoard 0.9 - Beta: mesoBoard version 0.9 beta Released under the New BSD License. http://mesoboard.codeplex.com/license http://mesoboard.comMiniTwitter: 1.13: MiniTwitter 1.13 変更内容 修正 ダイレクトメッセージの取得に失敗するバグを修正 タイムラインを編集すると落ちるバグを修正 リストのインポートで最初の 20 人しか取得できない問題を修正 追加 URL マウスオーバーでの画像のプレビュー機能を実装MiniTwitter: 1.13.1: MiniTwitter 1.13.1 更新内容 修正 タイムライン追加、編集ダイアログでキャンセルを選ぶと落ちるバグを修正MSTS Editors & Tools: Simis Editor v0.4: Simis Editor v0.4 Open and Save dialogs support full filename filters from BNFs (e.g. "tsection.dat"). Added statusbar and menu help text. Adde...NHibernate Sidekick Library: 0.6.0: v0.6.0 - Initial Release TODONLog - Advanced .NET Logging: Nightly Build 2010.06.06.001: Changes since the last build:2010-06-05 21:50:21 Jarek Kowalski fixed doc for ${document-uri} 2010-06-05 20:21:53 Jarek Kowalski removed Layout re...NQ - Component-oriented Framework: NQ Core 0.90: Main changes in 0.90 release: Introduction of an additional built-in component loader to load component and service information from XML files ins...PhluffyFotos: v4 Windows Azure: This release has been updated to Visual Studio 2010 as well as the latest StorageClient library. Make sure you run the Provision.cmd in order to b...RIA Services Essentials: Book Club Application (June 6, 2010): The initial release of the BookClub application based on the MIX presentation with a few changes: 1. Some bug fixes 2. Added the ability to Like a ...Samurai.Workflow: 1.1 Stable Release: Removed reference to WPF assemblies so non-WPF applications can use the workflow. For WPF apps, the workflow will use reflection to seek out UI thr...SilverPoiMap: SilverPoiMap Beta: Beta VersionSimple Resource Localization Editor: Release 1: First release.Siverlight Project: Auto Arrange Panel Project: Auto Arrange Panel ProjectSmart Voice: Smart Voice 0.2: Changelog: Corrected a few bugsSmart Voice: Smart Voice 0.2.1: Changelog: Fixed a major bug that was slowing the application Added opt in for usage data In order to contribute with user data, please change t...Spider Compiler: Release 0.1: Contains the setup for the spider compiler. This release includes the changeset #66980.Spider Compiler: Release 0.2: This release includes the changeset #67003.The Lounge Repository: Lounge Repo Binary Release: All in one binary download of the Lounge Repository. 
Improvements: -More tolerant towards schema changes -Bug fixed regarding array normalization ...UrzaGatherer: UrzaGatherer v2.0.1: This release integrate the support of the Full Database Backup.Web/Cloud Applications Development Framework | Visual WebGui: 6.4 beta 3: This is the last beta version before the official release of Visual WebGui, including support for Visual Studio 2010 and it is fully featured with ...WinGet: Alpha 2: This is the second release of WinGet (0.5.0.2). Is is alpha quality not suitable for use in a production environment and it has bugs (see Known Iss...WoW Character Viewer: WoW Viewer: Newly redesigned layout of the original WoW Character Viewer. Faster access, cleaner layout.Most Popular ProjectsCommunity Forums NNTP bridgeASP.NET MVC Time PlannerMoonyDesk (windows desktop widgets)NeatUploadOutSyncFileupload AJAXViperWorks IgnitionAgUnit - Silverlight unit testing with ReSharperSQL Nexus ToolSmith Async .NET Memcached ClientMost Active ProjectsCommunity Forums NNTP bridgepatterns & practices – Enterprise LibraryRawrGMap.NET - Great Maps for Windows Forms & PresentationStyleCopN2 CMSsmark C# LibraryFarseer Physics EngineIonics Isapi Rewrite FilterNB_Store - Free DotNetNuke Ecommerce Catalog Module

    Read the article

  • Duplicate incoming TCP traffic on Debian Squeeze

    - by Erwan Queffélec
    I have to test a homebrew server that accepts a lot of incoming TCP traffic on a single port. The protocol is homebrew as well. For testing purposes, I'd like to send this traffic both to the production server (say, listening on port 12345) and to the test server (say, listening on port 23456). My client apps are "dumb": they never read data back, and the server never replies anyway; my server only accepts connections, does statistical computations, and stores/forwards/services both raw and computed data. Actually, the client apps and hardware are so simple that there is no way I can tell the clients to send their stream to both servers, and using "fake" clients is not good enough. What would be the simplest solution? I could of course write an intermediary app that just copies the incoming data and sends it on to the test server, pretending to be the client. I have a single server running Squeeze and have total control over it. Thanks in advance for your replies.
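
    Kernel-level duplication (the iptables TEE target, for example) copies raw packets to another host, but the copy does not give the test server a TCP session of its own, so the intermediary approach mentioned above is usually the simpler one. A rough Python sketch of such a tee, assuming the dumb clients can be pointed at a new port and that no replies ever need to be relayed back; addresses and ports are placeholders, and error handling is omitted.

        # TCP "tee": accept client connections and forward every byte to both targets.
        import socket
        import threading

        LISTEN = ("0.0.0.0", 12000)        # where the dumb clients now connect
        TARGETS = [("127.0.0.1", 12345),   # production server
                   ("127.0.0.1", 23456)]   # test server

        def handle(client):
            outs = [socket.create_connection(addr) for addr in TARGETS]
            try:
                while True:
                    data = client.recv(4096)
                    if not data:
                        break
                    for s in outs:
                        s.sendall(data)    # duplicate the stream to every target
            finally:
                client.close()
                for s in outs:
                    s.close()

        def main():
            srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind(LISTEN)
            srv.listen(128)
            while True:
                conn, _ = srv.accept()
                threading.Thread(target=handle, args=(conn,), daemon=True).start()

        if __name__ == "__main__":
            main()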

    Read the article

  • Practical Approaches to increasing Virtualization Density-Part 1

    - by Girish Venkat
    Happy New Year, everyone! Let me kick the year off by talking about virtualization density. What is it? The number of virtual servers that a physical server can support, and its increase over the prior physical infrastructure as a percentage. Why is it important? Because the density should be indicative of how well the server is actually being consumed. So what is wrong? Virtualization density fails to convey the "real usage" of a server. Most hypervisor-based OS virtualization evangelists take pride in the fact that they are now running a virtual server farm of X machines compared to a physical server farm of Y (with Y less than X, obviously). The real question is: has your utilization of the servers really increased or not? In an internal study conducted by one of the top financial institutions, server utilization went up by only 15 percentage points, from 30% to 45%. So just increasing virtualization density will not, by itself, achieve the goal of making better use of the servers in the farm. I will write about possible approaches to increasing virtualization density in the next entry.

    Read the article

  • Launching mysql server: same permissions for root and for user

    - by toinbis
    Hi folks, have been directed here from stackoverflow here, am reposting the question and adding my.cnf at the end of a post. so far in my 10+ years experience with linux, all the permission problems I've ever encountered, have been successfully solved with chmod -R 777 /path/where/the/problem/has/occured (every lie has a grain of truth in it :) This time the trick doesn't work, so I'm turning to you for help. I'm compiling mysql server from scratch with zc.buildout (www . buildout . org). I do launch it by executing /home/toinbis/.../parts/mysql/bin/mysqld_safe, this works. The thing is that i'll be launching this from within supervisor (supervisord . org) script, and when used on the deployment server, it'll need it to be launched with root permissions(so that nginx server, launched with the same script, would have access to 80 port). The problem is that sudo /home/toinbis/.../parts/mysql/bin/mysqld_safe, fails, generating the error, posted bellow, in mysql error log (apache and nginx works as expected). http://lists.mysql.com/mysql/216045 suggests, that "there are two errors: A missing table and a file system that mysqld doesn't have access to". Mysqldatadir and all the mysql server binary files has 777 permissions, talbe mysql.plugin does exist and has 777 permissions (why Can't open the mysql.plugin table?), "sudo touch mysql_datadir/tmp/file" does create file (why Can't create/write to file /home/toinbis/.../runtime/mysql_datadir/tmp/ib4e9Huz?). chgrp -R mysql mysql_datadir and adding "root, toinbis, mysql" users to mysql group ( cat /etc/group | grep mysql outputs mysql:x:124:root,toinbis,mysql) has no effect - when i launch it as a casual user, it starts, when as a root - it fails. Does mysql server, even started as root, tries to operate as other, let's say, 'mysql' user? but even in that case, adding mysql user to mysql group and making all the mysql_datadirs files belong to mysql group should make things work smoothly. I do know that it might be a better idea to simply to launch one the nginx as root and mysql - as just a user, but this error irritated me enough so to devote enough energy so not to only "make things work", but to also make things work exactly as i wanted it initially, so to have a proof of concept that it's possible. and this is the generated error: 091213 20:02:55 mysqld_safe Starting mysqld daemon with databases from /home/toinbis/.../runtime/mysql_datadir /home/toinbis/.../parts/mysql/libexec/mysqld: Table 'plugin' is read only 091213 20:02:55 [ERROR] Can't open the mysql.plugin table. Please run mysql_upgrade to create it. /home/toinbis/.../parts/mysql/libexec/mysqld: Can't create/write to file '/home/toinbis/.../runtime/mysql_datadir/tmp/ib4e9Huz' (Errcode: 13) 091213 20:02:55 InnoDB: Error: unable to create temporary file; errno: 13 091213 20:02:55 [ERROR] Plugin 'InnoDB' init function returned error. 091213 20:02:55 [ERROR] Plugin 'InnoDB' registration as a STORAGE ENGINE failed. 091213 20:02:55 [ERROR] Can't start server : Bind on unix socket: Permission denied 091213 20:02:55 [ERROR] Do you already have another mysqld server running on socket: /home/toinbis/.../runtime/var/pids/mysql.sock ? 
091213 20:02:55 [ERROR] Aborting 091213 20:02:55 [Note] /home/toinbis/.../parts/mysql/libexec/mysqld: Shutdown complete 091213 20:02:55 mysqld_safe mysqld from pid file /home/toinbis/.../runtime/var/pids/mysql.pid ended My my.cnf (the basedir and datadir(including tempdir) have chmod -R 777 permissions) : [client] socket = /home/toinbis/.../runtime/var/pids/mysql.sock port = 8002 [mysqld_safe] socket = /home/toinbis/.../runtime/var/pids/mysql.sock nice = 0 [mysqld] # # * Basic Settings # socket = /home/toinbis/.../runtime/var/pids/mysql.sock port = 8002 pid-file = /home/toinbis/.../runtime/var/pids/mysql.pid basedir = /home/toinbis/.../parts/mysql datadir = /home/toinbis/.../runtime/mysql_datadir tmpdir = /home/toinbis/.../runtime/mysql_datadir/tmp skip-external-locking bind-address = 127.0.0.1 log-error =/home/toinbis/.../runtime/logs/mysql_errorlog # # * Fine Tuning # key_buffer = 16M max_allowed_packet = 32M thread_stack = 128K thread_cache_size = 8 myisam-recover = BACKUP #max_connections = 100 #table_cache = 64 #thread_concurrency = 10 # # * Query Cache Configuration # query_cache_limit = 1M query_cache_size = 16M # # * Logging and Replication # # Both location gets rotated by the cronjob. # Be aware that this log type is a performance killer. #log = /home/toinbis/.../runtime/logs/mysql_logs/mysql.log # # Error logging goes to syslog. This is a Debian improvement :) # # Here you can see queries with especially long duration #log_slow_queries = /home/toinbis/.../runtime/logs/mysql_logs/mysql-slow.log #long_query_time = 2 #log-queries-not-using-indexes # # The following can be used as easy to replay backup logs or for replication. #server-id = 1 #log_bin = /home/toinbis/.../runtime/mysql_datadir/mysql-bin.log #binlog_format = ROW #read_only = 0 #expire_logs_days = 10 #max_binlog_size = 100M #sync_binlog = 1 #binlog_do_db = include_database_name #binlog_ignore_db = include_database_name # # * InnoDB # innodb_data_file_path = ibdata1:10M:autoextend innodb_buffer_pool_size=64M innodb_log_file_size=16M innodb_log_buffer_size=8M innodb_flush_log_at_trx_commit=1 innodb_file_per_table innodb_locks_unsafe_for_binlog=1 [mysqldump] quick quote-names max_allowed_packet = 32M [mysql] #no-auto-rehash # faster start of mysql but no tab completion [isamchk] key_buffer = 16M Any ideas much appreciated! regards, to P.S. sorry for messy hyperlinks, it's my first post and anti-spam feature of SF doesn't allow to post them properly :)

    Read the article

  • Create mix CDs from MP3 files

    - by Dave Jarvis
    How would you write a script (preferably for the Windows command line) that:
      - Examines thousands of MP3 files stored on a single drive (e.g., G:\)
      - Randomizes the collection
      - Populates a series of directories with up to 650 MB worth of songs (without exceeding 650 MB)
      - Copies every song exactly once
      - (Optional) Brings each directory's size as close as possible to 650 MB
    The DIR, COPY, and XCOPY commands have no explicit file-size switches. A few Google searches have turned up file-size conditions in DOS, Cygwin and UWIN, and DOS file sizes. It would be ideal if UNIX-like environments could be avoided. My question, then: how do you compare file (or directory) sizes using the Windows command line?
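
    If installing a small scripting runtime on the Windows box is acceptable, the size arithmetic becomes trivial. A rough Python sketch under that assumption; the destination folder is hypothetical, and a file larger than 650 MB (unlikely for MP3s) simply ends up on a disc of its own rather than being handled specially.

        # Shuffle every MP3 under SOURCE and copy each one exactly once into
        # Disc001, Disc002, ... folders, never letting a folder pass LIMIT.
        import os
        import random
        import shutil

        SOURCE = "G:\\"                    # drive holding the MP3 collection
        DEST = "D:\\MixDiscs"              # hypothetical output root
        LIMIT = 650 * 1024 * 1024          # 650 MB per disc

        songs = []
        for root, _, files in os.walk(SOURCE):
            for name in files:
                if name.lower().endswith(".mp3"):
                    path = os.path.join(root, name)
                    songs.append((path, os.path.getsize(path)))

        random.shuffle(songs)

        disc, used, disc_no = None, 0, 0
        for path, size in songs:
            if disc is None or used + size > LIMIT:
                disc_no += 1
                used = 0
                disc = os.path.join(DEST, "Disc%03d" % disc_no)
                os.makedirs(disc, exist_ok=True)
            shutil.copy2(path, disc)       # copy, so the originals stay put
            used += size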

    Read the article

  • Looking for FTP server that allows user management from database

    - by hughesdan
    I'm planning a server application that will handle files uploaded via FTP. The application must parse text documents that it receives and write them to a database (most likely a document-oriented database like Mongo). And the application must also relay all large binary files it receives to Amazon S3 for storage and hosting. I'd like to manage all aspects of the FTP server programmatically. For example, when a user registers via a web page the application should be able to create the user account in the database and provision a directory on the server for receiving files. I'm using a Linux server but am otherwise open to considering any programming language or framework. I experimented with VSFTPD but didn't like the way the application relies on config files and the creation of users and directories via the command line. Can someone please recommend what server framework I should consider? I'm a little biased toward solutions that leverage Javascript/Nodejs or Python. However, I'm open to anything that can run on a Linux box.
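
    One hedged option is pyftpdlib, a pure-Python FTP server library in which accounts are created from code rather than from config files, so the web app's registration handler can add a user and provision a home directory directly. The sketch below assumes user records live in a MongoDB collection with name/password/home fields; that schema, and doing the parse-and-push-to-S3 work inside the on_file_received hook, are illustrative assumptions.

        from pymongo import MongoClient
        from pyftpdlib.authorizers import DummyAuthorizer
        from pyftpdlib.handlers import FTPHandler
        from pyftpdlib.servers import FTPServer

        class UploadHandler(FTPHandler):
            def on_file_received(self, file):
                # Hook point: parse text documents into the document store,
                # push large binaries to S3 (e.g. via boto), then clean up.
                print("received", file)

        def build_authorizer():
            authorizer = DummyAuthorizer()
            users = MongoClient().ftpapp.users     # hypothetical database/collection
            for u in users.find():
                # 'elradfmw' = full read/write permissions inside the home dir
                authorizer.add_user(u["name"], u["password"], u["home"],
                                    perm="elradfmw")
            return authorizer

        if __name__ == "__main__":
            UploadHandler.authorizer = build_authorizer()
            FTPServer(("0.0.0.0", 2121), UploadHandler).serve_forever()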

    Read the article

  • when to introduce an application services tier in an n-tier application

    - by user20358
    I am developing a web-based application whose primary objective is to fetch data from the database, display it on the UI, take in user input and write it back to the database. The application is not going to be doing any industrial-strength algorithm crunching, but it will receive a very high number of hits at peak times (described below), which will vary through the day. The layers are your typical presentation, business and data layers. The data layer is taken care of by the database server. The business layer will contain the DAL component to access the database server over TCP. The choices I have for separating these layers into tiers are:
      1. Keep the presentation and business layers on the same tier.
      2. Put the presentation layer on a separate tier by itself and the business layer on a separate tier by itself.
    In the case of choice 2, the business layer will be accessed by the presentation layer using a WCF service, either over HTTP or TCP. I don't see any heavy processing being done in the business layer, so I am leaning towards option 1 above. For the same reason, I also feel that adding a new tier would only introduce network latency. However, in terms of scalability, in case I need to scale up or scale out, which is the better way to go? This application will need to be able to support up to 6 million users an hour. There will be a reasonable amount of data in each user session, storing the user's preferences and other details. I will be using page-level caching as well.

    Read the article

  • Monster's AI in an Action-RPG

    - by Andrea Tucci
    I'm developing an action RPG with some university colleagues. We've gotten to the monsters' AI design, and we would like to implement a sort of "utility-based AI": we have a "thinker" that assigns a numeric value to each of the monster's possible decisions, and we choose the highest (or the most appropriate, depending on the monster's IQ) and add it to the monster's collection of decisions (like a goal-driven design pattern). One solution we found is to write a mathematical formula for each decision, with all the parameters that matter for the evaluation (so for a spell decision we might have MP, distance from the player, player's HP, etc.). This formula also has coefficients representing some of the monster's behaviour (this way we can alter formulas by changing coefficients). I've also read how "fuzzy logic" works; I was fascinated by it and by the many ways it can be extended. I was wondering how we could use this technique to simplify our AI, e.g. by creating evaluations with fuzzy rules such as IF player_far AND mp_high AND hp_high THEN very_desirable (for a spell with a long casting time and a high MP cost) and then "defuzzifying" the result. This also makes it simple to shape a monster's behaviour by writing ad-hoc rules for every monster IQ category. But is it sensible to use fuzzy logic in a game with as many parameters as an RPG? Is there a way of merging these two techniques? Are there better AI design techniques for evaluating a monster's choices?
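
    The two techniques merge quite naturally: fuzzy membership functions turn raw parameters into 0-1 truth values, a rule combines them (min() is the usual choice for AND), and the defuzzified result is simply the utility score the "thinker" compares across decisions. A toy Python sketch follows; every threshold and coefficient in it is invented for illustration.

        def far(distance):          # player_far: ramps from 0 at 5m to 1 at 15m
            return max(0.0, min(1.0, (distance - 5.0) / 10.0))

        def high(value, maximum):   # mp_high / hp_high: fraction of the maximum
            return max(0.0, min(1.0, value / maximum))

        def fireball_utility(distance, mp, max_mp, hp, max_hp):
            # IF player_far AND mp_high AND hp_high THEN very_desirable
            truth = min(far(distance), high(mp, max_mp), high(hp, max_hp))
            return 0.9 * truth      # 0.9 = coefficient encoding this monster's style/IQ

        def melee_utility(distance):
            return 0.6 * (1.0 - far(distance))

        def choose(distance, mp, max_mp, hp, max_hp):
            options = {
                "fireball": fireball_utility(distance, mp, max_mp, hp, max_hp),
                "melee": melee_utility(distance),
            }
            return max(options, key=options.get)

        print(choose(distance=12.0, mp=80, max_mp=100, hp=90, max_hp=100))  # fireball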

    Read the article

  • Samba - permission issue

    - by user88432
    I am trying to get Samba to work properly. I have a "Movies" share (//server/Movies), and I want only the root account to be able to upload and delete. Guests can view the "Movies" share without a password/login, but they can't delete or update (view only).

        [Movies]
        path = /mnt/user/Movies
        browsable = yes
        public = yes
        writable = no
        write list = root
        guest ok = yes

    I can access the Movies share as a guest, but when I try to add a new file I get an error saying "You need permission to perform this action". I expected a username/password prompt to pop up, but it didn't. How do I fix this?

    Read the article

  • How do projects manage chef cookbooks when multiple teams manage multiple sets of cookbooks?

    - by strife25
    I am wondering how projects with multiple component teams manage their sets of cookbooks. We are trying to figure out how an ops team can provide a set of "common component" cookbooks that can be re-used by other teams, which will also write their own cookbooks. For example, the ops team should own the Java cookbook, while a component team manages the cookbooks written for its component or build engines. From my limited experience with Chef Server, this kind of workflow seems not to be well supported, since the server stores and manages all cookbooks, so there is a potential to overwrite a cookbook written by another team. How do other projects deal with this type of problem?

    Read the article

  • On booting, my laptop warns that hard disk failure is imminent. Can it be repaired, or does it need to be replaced?

    - by nrb
    When I boot my laptop (Lenovo G550), it gives an error message before starting up: "SMART Failure Predicted on Hard Disk 0: Hitachi HTS543225L9A300-(PM). Warning: Immediately back up your data and replace your hard disk drive. A failure may be imminent. Press F1 to continue." Once started, it runs smoothly. I also verified the drive's status by running some internal HDD tests with the Hard Disk Sentinel software: it shows no bad sectors, no virus, no damage and normal temperature. It suggests everything is OK except the drive's health. So my question is: what exactly is wrong with it? It may be a problem with the platters (dust) or the read/write heads. Should I have it repaired, or replace it? For the last 10 days my system has reminded me every half hour to resolve this problem. Help me.

    Read the article

  • debian mysql server always down

    - by lokheart
    I am hosting a MySQL server on a Debian 6.0.4 server at Linode. New data is frequently written to the MySQL server using R: about 30-40 writes per minute, about 7 hours a day, 5 days a week. Recently I found that the MySQL server seems to go down frequently, always giving me an error that the MySQL server cannot be connected to through its socket. I wonder whether this is caused by the demand I am placing on the server, and whether it can be solved. I am definitely a newbie at managing servers. Please let me know if I need to provide additional information for this question in order to get it solved. Thanks.

    Read the article

  • Is there a webproxy via email?

    - by mafutrct
    This is probably a bit weird, and I don't think it exists yet. I'm basically asking for a program that, upon receiving a request via email, downloads an HTML page and sends it back via email, possibly changing the links inside the page into outgoing emails to this program asking for another page. This is certainly one of the crudest ways of accessing the web, and it obviously fails at anything beyond the most basic stuff. But it may still be useful to those who can send email but can't access the web, due to company policy or whatever other reason. In the (likely) case that this does not exist, I'd be interested in writing such a proxy.
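
    The core loop of such a proxy is small. A rough Python sketch, assuming a dedicated mailbox reachable over IMAP/SMTP and that the requested URL arrives in the subject line; host names and credentials are placeholders, and link rewriting, attachments and abuse control are left out entirely.

        import imaplib
        import smtplib
        import urllib.request
        from email import message_from_bytes
        from email.message import EmailMessage

        IMAP_HOST, SMTP_HOST = "imap.example.com", "smtp.example.com"
        USER, PASSWORD = "webbymail@example.com", "secret"

        def fetch_requests():
            # Yield (sender, url) for every unread request; fetching marks it read.
            imap = imaplib.IMAP4_SSL(IMAP_HOST)
            imap.login(USER, PASSWORD)
            imap.select("INBOX")
            _, data = imap.search(None, "UNSEEN")
            for num in data[0].split():
                _, msg_data = imap.fetch(num, "(RFC822)")
                msg = message_from_bytes(msg_data[0][1])
                yield msg["From"], msg["Subject"].strip()
            imap.logout()

        def send_page(to_addr, url):
            page = urllib.request.urlopen(url).read().decode("utf-8", "replace")
            reply = EmailMessage()
            reply["From"], reply["To"], reply["Subject"] = USER, to_addr, url
            reply.set_content(page)
            with smtplib.SMTP_SSL(SMTP_HOST) as smtp:
                smtp.login(USER, PASSWORD)
                smtp.send_message(reply)

        if __name__ == "__main__":
            for sender, url in fetch_requests():
                send_page(sender, url)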

    Read the article

  • Windows Azure Mobile Services: New support for iOS apps, Facebook/Twitter/Google identity, Emails, SMS, Blobs, Service Bus and more

    - by ScottGu
    A few weeks ago I blogged about Windows Azure Mobile Services - a new capability in Windows Azure that makes it incredibly easy to connect your client and mobile applications to a scalable cloud backend. Earlier today we delivered a number of great improvements to Windows Azure Mobile Services.  New features include: iOS support – enabling you to connect iPhone and iPad apps to Mobile Services Facebook, Twitter, and Google authentication support with Mobile Services Blob, Table, Queue, and Service Bus support from within your Mobile Service Sending emails from your Mobile Service (in partnership with SendGrid) Sending SMS messages from your Mobile Service (in partnership with Twilio) Ability to deploy mobile services in the West US region All of these improvements are now live in production and available to start using immediately. Below are more details on them: iOS Support This week we delivered initial support for connecting iOS based devices (including iPhones and iPads) to Windows Azure Mobile Services.  Like the rest of our Windows Azure SDK, we are delivering the native iOS libraries to enable this under an open source (Apache 2.0) license on GitHub.  We’re excited to get your feedback on this new library through our forum and GitHub issues list, and we welcome contributions to the SDK. To create a new iOS app or connect an existing iOS app to your Mobile Service, simply select the “iOS” tab within the Quick Start view of a Mobile Service within the Windows Azure Portal – and then follow either the “Create a new iOS app” or “Connect to an existing iOS app” link below it: Clicking either of these links will expand and display step-by-step instructions for how to build an iOS application that connects with your Mobile Service: Read this getting started tutorial to walkthrough how you can build (in less than 5 minutes) a simple iOS “Todo List” app that stores data in Windows Azure.  Then follow the below tutorials to explore how to use the iOS client libraries to store data and authenticate users. Get Started with data in Mobile Services for iOS Get Started with authentication in Mobile Services for iOS Facebook, Twitter, and Google Authentication Support Our initial preview of Mobile Services supported the ability to authenticate users of mobile apps using Microsoft Accounts (formerly called Windows Live ID accounts).  This week we are adding the ability to also authenticate users using Facebook, Twitter, and Google credentials.  These are now supported with both Windows 8 apps as well as iOS apps (and a single app can support multiple forms of identity simultaneously – so you can offer your users a choice of how to login). The below tutorials walkthrough how to register your Mobile Service with an identity provider: How to register your app with Microsoft Account How to register your app with Facebook How to register your app with Twitter How to register your app with Google The tutorials above walkthrough how to obtain a client ID and a secret key from the identity provider. You can then click on the “Identity” tab of your Mobile Service (within the Windows Azure Portal) and save these values to enable server-side authentication with your Mobile Service: You can then write code within your client or mobile app to authenticate your users to the Mobile Service.  
For example, below is the code you would write to have them login to the Mobile Service using their Facebook credentials: Windows Store App (using C#): var user = await App.MobileService                     .LoginAsync(MobileServiceAuthenticationProvider.Facebook); iOS app (using Objective C): UINavigationController *controller = [self.todoService.client     loginViewControllerWithProvider:@"facebook"     completion:^(MSUser *user, NSError *error) {        //... }]; Learn more about authenticating Mobile Services using Microsoft Account, Facebook, Twitter, and Google from these tutorials: Get started with authentication in Mobile Services for Windows Store (C#) Get started with authentication in Mobile Services for Windows Store (JavaScript) Get started with authentication in Mobile Services for iOS Using Windows Azure Blob, Tables and ServiceBus with your Mobile Services Mobile Services provide a simple but powerful way to add server logic using server scripts. These scripts are associated with the individual CRUD operations on your mobile service’s tables. Server scripts are great for data validation, custom authorization logic (e.g. does this user participate in this game session), augmenting CRUD operations, sending push notifications, and other similar scenarios.   Server scripts are written in JavaScript and are executed in a secure server-side scripting environment built using Node.js.  You can edit these scripts and save them on the server directly within the Windows Azure Portal: In this week’s release we have added the ability to work with other Windows Azure services from your Mobile Service server scripts.  This is supported using the existing “azure” module within the Windows Azure SDK for Node.js.  For example, the below code could be used in a Mobile Service script to obtain a reference to a Windows Azure Table (after which you could query it or insert data into it):     var azure = require('azure');     var tableService = azure.createTableService("<< account name >>",                                                 "<< access key >>"); Follow the tutorials on the Windows Azure Node.js dev center to learn more about working with Blob, Tables, Queues and Service Bus using the azure module. Sending emails from your Mobile Service In this week’s release we have also added the ability to easily send emails from your Mobile Service, building on our partnership with SendGrid. Whether you want to add a welcome email upon successful user registration, or make your app alert you of certain usage activities, you can do this now by sending email from Mobile Services server scripts. To get started, sign up for SendGrid account at http://sendgrid.com . Windows Azure customers receive a special offer of 25,000 free emails per month from SendGrid. To sign-up for this offer, or get more information, please visit http://www.sendgrid.com/azure.html . One you signed up, you can add the following script to your Mobile Service server scripts to send email via SendGrid service:     var sendgrid = new SendGrid('<< account name >>', '<< password >>');       sendgrid.send({         to: '<< enter email address here >>',         from: '<< enter from address here >>',         subject: 'New to-do item',         text: 'A new to-do was added: ' + item.text     }, function (success, message) {         if (!success) {             console.error(message);         }     }); Follow the Send email from Mobile Services with SendGrid tutorial to learn more. 
Sending SMS messages from your Mobile Service SMS is a key communication medium for mobile apps - it comes in handy if you want your app to send users a confirmation code during registration, allow your users to invite their friends to install your app or reach out to mobile users without a smartphone. Using Mobile Service server scripts and Twilio’s REST API, you can now easily send SMS messages to your app.  To get started, sign up for Twilio account. Windows Azure customers receive 1000 free text messages when using Twilio and Windows Azure together. Once signed up, you can add the following to your Mobile Service server scripts to send SMS messages:     var httpRequest = require('request');     var account_sid = "<< account SID >>";     var auth_token = "<< auth token >>";       // Create the request body     var body = "From=" + from + "&To=" + to + "&Body=" + message;       // Make the HTTP request to Twilio     httpRequest.post({         url: "https://" + account_sid + ":" + auth_token +              "@api.twilio.com/2010-04-01/Accounts/" + account_sid + "/SMS/Messages.json",         headers: { 'content-type': 'application/x-www-form-urlencoded' },         body: body     }, function (err, resp, body) {         console.log(body);     }); I’m excited to be speaking at the TwilioCon conference this week, and will be showcasing some of the cool scenarios you can now enable with Twilio and Windows Azure Mobile Services. Mobile Services availability in West US region Our initial preview of Windows Azure Mobile Services was only supported in the US East region of Windows Azure.  As with every Windows Azure service, overtime we will extend Mobile Services to all Windows Azure regions. With this week’s preview update we’ve added support so that you can now create your Mobile Service in the West US region as well: Summary The above features are all now live in production and are available to use immediately.  If you don’t already have a Windows Azure account, you can sign-up for a free trial and start using Mobile Services today. Visit the Windows Azure Mobile Developer Center to learn more about how to build apps with Mobile Services. We’ll have even more new features and enhancements coming later this week – including .NET 4.5 support for Windows Azure Web Sites.  Keep an eye out on my blog for details as new features become available. Hope this helps, Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • Other games that employ mechanics like the game "Diplomacy"

    - by Kevin Peno
    I'm doing a little bit of research and I'm hoping you can help me track down any games, other than Diplomacy (online version here), that employ all or some of the mechanics in Diplomacy (rules, short form). Examples of what I'm looking for:
      - Simultaneous orders given prior to execution. In Diplomacy, players "write down" their moves and execute them "at the same time".
      - Support, in terms of supporting an attacker or defender to "take" a territory. In Diplomacy, no one unit is stronger than another; you need to combine the strength of multiple units to attack other territories.
      - Rules for how move conflicts are resolved. For example, two units move into a space but only one is allowed; what happens?
    I may add to this list later, but these are the primary things I'm looking for. If you need clarification on anything, just let me know. Note: I tried asking this on GamingSE, but it was shot down, so I am unsure where else I could post this. Since I am researching this for game development purposes, I assume this post is on topic. Please let me know if this is not the case. Please also feel free to re-categorize this. Thanks!
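
    For the game-development angle, the mechanics listed above are compact enough to prototype directly, which can help when comparing other games against them. A toy Python sketch with deliberately invented rules: all orders are collected before anything executes, attack strength is 1 plus the number of supports, and equal strength means nobody moves (a bounce).

        from collections import defaultdict

        # Orders gathered from every player before anything is executed.
        orders = [
            {"unit": "A-Paris",  "action": "move",    "target": "Burgundy"},
            {"unit": "A-Munich", "action": "move",    "target": "Burgundy"},
            {"unit": "A-Ruhr",   "action": "support", "of": "A-Munich"},
        ]

        def resolve(orders):
            supports = defaultdict(int)
            for o in orders:
                if o["action"] == "support":
                    supports[o["of"]] += 1

            strongest = {}                       # target -> (strength, winning unit or None)
            for o in orders:
                if o["action"] != "move":
                    continue
                strength = 1 + supports[o["unit"]]
                best = strongest.get(o["target"])
                if best is None or strength > best[0]:
                    strongest[o["target"]] = (strength, o["unit"])
                elif strength == best[0]:
                    strongest[o["target"]] = (strength, None)   # tie: everyone bounces
            return {t: unit for t, (s, unit) in strongest.items()}

        print(resolve(orders))   # {'Burgundy': 'A-Munich'}: the supported attack wins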

    Read the article

  • Redirecting program errors to both log and error file

    - by Gnanam
    I have a standalone Java program scheduled to run as a cron job every 10 minutes. I want to catch/write the errors thrown by this Java program both in the log file and in a separate error file (MyJavaStandalone.err). I know the following commands:
      - Errors redirected to a separate file, but not to the log file:
        /usr/java/jdk1.6.0/bin/java MyJavaStandalone >> MyJavaStandalone.log 2>> MyJavaStandalone.err &
      - Both output and errors redirected to the same log file, but errors alone are not written to a separate error file:
        /usr/java/jdk1.6.0/bin/java MyJavaStandalone >> MyJavaStandalone.log 2>&1 &
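
    Bash can do this natively by combining tee with process substitution, and if editing the crontab entry is acceptable, a small wrapper script is another route. A hedged Python sketch of the wrapper approach: stdout is appended to the log, stderr is appended to both the log and the error file (paths copied from the question; the interleaving of stdout and stderr lines in the log is only approximate).

        import subprocess

        CMD = ["/usr/java/jdk1.6.0/bin/java", "MyJavaStandalone"]
        LOG, ERR = "MyJavaStandalone.log", "MyJavaStandalone.err"

        with open(LOG, "ab") as log, open(ERR, "ab") as err:
            proc = subprocess.Popen(CMD, stdout=log, stderr=subprocess.PIPE)
            for line in proc.stderr:    # read stderr line by line as it appears
                log.write(line)         # ... into the combined log
                log.flush()
                err.write(line)         # ... and into the dedicated error file
                err.flush()
            proc.wait()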

    Read the article

  • SQLAuthority News – Learning, Community and Book Signing at #SQLPASS 2012

    - by pinaldave
    The SQLPASS event is going excellently and we are having great fun! We are holding book signing events and the response is overwhelmingly positive. I am glad that all of you love our books, and I totally appreciate your support. Rick and I both feel very motivated to write more books in the future. Here is our schedule for book signings. SQL Queries 2012 Joes 2 Pros Volume 1: Finally, a book for the true SQL Server beginner! Whether you are brand new to databases and are thinking of getting your 70-461 certification, or already a semi-pro working in the field and in need of some fingertip support, this is the book for you. Joes 2 Pros does not assume you already know anything about databases or SQL Server. This book builds on the success of the previous series and will help anyone transform themselves from a beginner "Joe" into a SQL 2012 "Pro". Wednesday, November 7, 2012, 12pm-1pm: book signing at Exhibit Hall, Joes 2 Pros booth #117 (FREE BOOK). The rest of the time I will be at the Exhibit Hall, Joes 2 Pros Booth #117. Stop by for the goodies! This book is also available on Amazon. SQL 2012 Functions Joes 2 Pros: Functions have been around for many years to make our lives easier. Because of them, thousands of lines of valuable programming can be done with one statement. When we know what functions are offered in SQL Server, we can get powerful projects done very quickly. Often, the functions you wished you had are released in the next version. Wednesday, November 7, 2012, 7pm-8pm: Embarcadero booth book signing (FREE BOOK). Thursday, November 8, 2012, 12pm-1pm: Embarcadero booth book signing (FREE BOOK). This book is also available on Amazon. If you are at SQLPASS, stop by Booth #117; I will be there, and maybe you can get one of my signed books! Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL PASS, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Book Review, SQLAuthority News, SQLServer, T SQL, Technology

    Read the article

  • adding keyboard shortcuts for OSX terminal or xterm

    - by I J
    Is there a way to add a keyboard shortcut for a terminal command in OS X? Basically, most of the time I open the Terminal app on my Mac in order to ssh into a certain server, foo. What I want to do is add a keyboard shortcut (say ^K) so that when I press it in a terminal, it runs "ssh foo" in that terminal. Thanks. PS: I think if there is something for xterm on Linux, then it should work for Terminal too, so this might not be an OS X specific question. PS2: I want the shortcut to include the carriage return after "ssh foo". If it were just "ssh foo", I could write an alias in .bashrc. My goal is to minimize the number of keystrokes I have to make at the end of the day.

    Read the article

  • Replicate Oracle to MySQL

    - by Rosdi
    I am developing a web app that will be using MySQL. Now I need to replicate my client's Oracle database into MySQL; only a few tables will be involved, but a table can have up to 2-3 million rows. I only have the SELECT privilege on the Oracle side, so don't ask me to install any kind of service on the Oracle machine. I have complete control over the MySQL side, however. The replication is one way only (Oracle to MySQL). I could write a simple script to truncate the MySQL tables and repopulate them every night, but I think this is very inefficient; there must be a better way. Are there any free tools I can use? An expensive database replication system is definitely out of the question.
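
    Free ETL tools such as Pentaho Data Integration (Kettle) or Talend Open Studio can run a scheduled Oracle-to-MySQL copy with nothing more than SELECT rights on the source. If a small script is preferred, the sketch below streams rows in batches rather than row by row; the table, columns and credentials are placeholders, and the nightly TRUNCATE could be replaced with a WHERE clause on a last-modified column if the source tables have one.

        import cx_Oracle   # pip install cx_Oracle
        import pymysql     # pip install pymysql

        ora = cx_Oracle.connect("readonly_user", "password", "orahost/ORCL")
        my = pymysql.connect(host="localhost", user="app", password="secret",
                             database="appdb")

        src = ora.cursor()
        src.arraysize = 5000                            # rows fetched per round trip
        src.execute("SELECT id, name, amount FROM customer_orders")

        dst = my.cursor()
        dst.execute("TRUNCATE TABLE customer_orders")   # simplest: rebuild nightly
        while True:
            rows = src.fetchmany()
            if not rows:
                break
            dst.executemany(
                "INSERT INTO customer_orders (id, name, amount) VALUES (%s, %s, %s)",
                rows)
            my.commit()                                 # commit once per batch

        my.close()
        ora.close()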

    Read the article

  • Spooling in SQL execution plans

    - by Rob Farley
    Sewing has never been my thing. I barely even know the terminology, and when discussing this with American friends, I even found out that half the words that Americans use are different to the words that English and Australian people use. That said – let’s talk about spools! In particular, the Spool operators that you find in some SQL execution plans. This post is for T-SQL Tuesday, hosted this month by me! I’ve chosen to write about spools because they seem to get a bad rap (even in my song I used the line “There’s spooling from a CTE, they’ve got recursion needlessly”). I figured it was worth covering some of what spools are about, and hopefully explain why they are remarkably necessary, and generally very useful. If you have a look at the Books Online page about Plan Operators, at http://msdn.microsoft.com/en-us/library/ms191158.aspx, and do a search for the word ‘spool’, you’ll notice it says there are 46 matches. 46! Yeah, that’s what I thought too... Spooling is mentioned in several operators: Eager Spool, Lazy Spool, Index Spool (sometimes called a Nonclustered Index Spool), Row Count Spool, Spool, Table Spool, and Window Spool (oh, and Cache, which is a special kind of spool for a single row, but as it isn’t used in SQL 2012, I won’t describe it any further here). Spool, Table Spool, Index Spool, Window Spool and Row Count Spool are all physical operators, whereas Eager Spool and Lazy Spool are logical operators, describing the way that the other spools work. For example, you might see a Table Spool which is either Eager or Lazy. A Window Spool can actually act as both, as I’ll mention in a moment. In sewing, cotton is put onto a spool to make it more useful. You might buy it in bulk on a cone, but if you’re going to be using a sewing machine, then you quite probably want to have it on a spool or bobbin, which allows it to be used in a more effective way. This is the picture that I want you to think about in relation to your data. I’m sure you use spools every time you use your sewing machine. I know I do. I can’t think of a time when I’ve got out my sewing machine to do some sewing and haven’t used a spool. However, I often run SQL queries that don’t use spools. You see, the data that is consumed by my query is typically in a useful state without a spool. It’s like I can just sew with my cotton despite it not being on a spool! Many of my favourite features in T-SQL do like to use spools though. This looks like a very similar query to before, but includes an OVER clause to return a column telling me the number of rows in my data set. I’ll describe what’s going on in a few paragraphs’ time. So what does a Spool operator actually do? The spool operator consumes a set of data, and stores it in a temporary structure, in the tempdb database. This structure is typically either a Table (ie, a heap), or an Index (ie, a b-tree). If no data is actually needed from it, then it could also be a Row Count spool, which only stores the number of rows that the spool operator consumes. A Window Spool is another option if the data being consumed is tightly linked to windows of data, such as when the ROWS/RANGE clause of the OVER clause is being used. You could maybe think about the type of spool being like whether the cotton is going onto a small bobbin to fit in the base of the sewing machine, or whether it’s a larger spool for the top. A Table or Index Spool is either Eager or Lazy in nature. 
Eager and Lazy are Logical operators, which talk more about the behaviour, rather than the physical operation. If I’m sewing, I can either be all enthusiastic and get all my cotton onto the spool before I start, or I can do it as I need it. “Lazy” might not the be the best word to describe a person – in the SQL world it describes the idea of either fetching all the rows to build up the whole spool when the operator is called (Eager), or populating the spool only as it’s needed (Lazy). Window Spools are both physical and logical. They’re eager on a per-window basis, but lazy between windows. And when is it needed? The way I see it, spools are needed for two reasons. 1 – When data is going to be needed AGAIN. 2 – When data needs to be kept away from the original source. If you’re someone that writes long stored procedures, you are probably quite aware of the second scenario. I see plenty of stored procedures being written this way – where the query writer populates a temporary table, so that they can make updates to it without risking the original table. SQL does this too. Imagine I’m updating my contact list, and some of my changes move data to later in the book. If I’m not careful, I might update the same row a second time (or even enter an infinite loop, updating it over and over). A spool can make sure that I don’t, by using a copy of the data. This problem is known as the Halloween Effect (not because it’s spooky, but because it was discovered in late October one year). As I’m sure you can imagine, the kind of spool you’d need to protect against the Halloween Effect would be eager, because if you’re only handling one row at a time, then you’re not providing the protection... An eager spool will block the flow of data, waiting until it has fetched all the data before serving it up to the operator that called it. In the query below I’m forcing the Query Optimizer to use an index which would be upset if the Name column values got changed, and we see that before any data is fetched, a spool is created to load the data into. This doesn’t stop the index being maintained, but it does mean that the index is protected from the changes that are being done. There are plenty of times, though, when you need data repeatedly. Consider the query I put above. A simple join, but then counting the number of rows that came through. The way that this has executed (be it ideal or not), is to ask that a Table Spool be populated. That’s the Table Spool operator on the top row. That spool can produce the same set of rows repeatedly. This is the behaviour that we see in the bottom half of the plan. In the bottom half of the plan, we see that the a join is being done between the rows that are being sourced from the spool – one being aggregated and one not – producing the columns that we need for the query. Table v Index When considering whether to use a Table Spool or an Index Spool, the question that the Query Optimizer needs to answer is whether there is sufficient benefit to storing the data in a b-tree. The idea of having data in indexes is great, but of course there is a cost to maintaining them. Here we’re creating a temporary structure for data, and there is a cost associated with populating each row into its correct position according to a b-tree, as opposed to simply adding it to the end of the list of rows in a heap. Using a b-tree could even result in page-splits as the b-tree is populated, so there had better be a reason to use that kind of structure. 
That all depends on how the data is going to be used in other parts of the plan. If you’ve ever thought that you could use a temporary index for a particular query, well this is it – and the Query Optimizer can do that if it thinks it’s worthwhile. It’s worth noting that just because a Spool is populated using an Index Spool, it can still be fetched using a Table Spool. The details about whether or not a Spool used as a source shows as a Table Spool or an Index Spool is more about whether a Seek predicate is used, rather than on the underlying structure. Recursive CTE I’ve already shown you an example of spooling when the OVER clause is used. You might see them being used whenever you have data that is needed multiple times, and CTEs are quite common here. With the definition of a set of data described in a CTE, if the query writer is leveraging this by referring to the CTE multiple times, and there’s no simplification to be leveraged, a spool could theoretically be used to avoid reapplying the CTE’s logic. Annoyingly, this doesn’t happen. Consider this query, which really looks like it’s using the same data twice. I’m creating a set of data (which is completely deterministic, by the way), and then joining it back to itself. There seems to be no reason why it shouldn’t use a spool for the set described by the CTE, but it doesn’t. On the other hand, if we don’t pull as many columns back, we might see a very different plan. You see, CTEs, like all sub-queries, are simplified out to figure out the best way of executing the whole query. My example is somewhat contrived, and although there are plenty of cases when it’s nice to give the Query Optimizer hints about how to execute queries, it usually doesn’t do a bad job, even without spooling (and you can always use a temporary table). When recursion is used, though, spooling should be expected. Consider what we’re asking for in a recursive CTE. We’re telling the system to construct a set of data using an initial query, and then use set as a source for another query, piping this back into the same set and back around. It’s very much a spool. The analogy of cotton is long gone here, as the idea of having a continual loop of cotton feeding onto a spool and off again doesn’t quite fit, but that’s what we have here. Data is being fed onto the spool, and getting pulled out a second time when the spool is used as a source. (This query is running on AdventureWorks, which has a ManagerID column in HumanResources.Employee, not AdventureWorks2012) The Index Spool operator is sucking rows into it – lazily. It has to be lazy, because at the start, there’s only one row to be had. However, as rows get populated onto the spool, the Table Spool operator on the right can return rows when asked, ending up with more rows (potentially) getting back onto the spool, ready for the next round. (The Assert operator is merely checking to see if we’ve reached the MAXRECURSION point – it vanishes if you use OPTION (MAXRECURSION 0), which you can try yourself if you like). Spools are useful. Don’t lose sight of that. Every time you use temporary tables or table variables in a stored procedure, you’re essentially doing the same – don’t get upset at the Query Optimizer for doing so, even if you think the spool looks like an expensive part of the query. I hope you’re enjoying this T-SQL Tuesday. Why not head over to my post that is hosting it this month to read about some other plan operators? 
At some point I’ll write a summary post – once I have you should find a comment below pointing at it. @rob_farley

    Read the article

  • Sync Chrome bookmarks to Delicious

    - by becomingGuru
    I use Google Chrome. I love the snappy and fast feeling. I thought I would miss the add-ons of Firefox, but not really. However, I used to use Delicious actively and want to continue doing so, and I miss syncing my bookmarks with Delicious. What is the simplest way to sync Chrome bookmarks with Delicious? Addition: I see and understand that there are no good "add-ons" per se. I wouldn't mind even a simple cron job that I can schedule every day. Maybe I just have to write one, after all.
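
    A daily cron job is enough, since Chrome keeps its bookmarks in a plain JSON file inside the profile directory. A rough Python sketch, assuming the classic Delicious v1 posts/add API over HTTP Basic auth is still usable; the profile path shown is the Linux default (it differs on OS X and Windows), the credentials are placeholders, and re-posting an already-saved URL may need a replace flag or a local "already sent" list.

        import base64
        import json
        import os
        import urllib.parse
        import urllib.request

        BOOKMARKS = os.path.expanduser(
            "~/.config/google-chrome/Default/Bookmarks")  # Linux default profile path
        USER, PASSWORD = "delicious_user", "delicious_password"

        def walk(node):
            # Chrome stores bookmarks as nested folders of {"type", "name", "url"}.
            if node.get("type") == "url":
                yield node["name"], node["url"]
            for child in node.get("children", []):
                yield from walk(child)

        def post(title, url):
            query = urllib.parse.urlencode({"url": url, "description": title})
            req = urllib.request.Request(
                "https://api.del.icio.us/v1/posts/add?" + query)
            token = base64.b64encode((USER + ":" + PASSWORD).encode()).decode()
            req.add_header("Authorization", "Basic " + token)
            urllib.request.urlopen(req)

        if __name__ == "__main__":
            roots = json.load(open(BOOKMARKS))["roots"]
            for root in roots.values():
                if isinstance(root, dict):           # skip non-folder metadata keys
                    for title, url in walk(root):
                        post(title, url)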

    Read the article
