CheckIO is a resource for learning and practicing the Python programming language. Anyone at all can register on the platform and start learning, or, if they already know the language, polish their skills.

The training is presented as a game in which every user has to apply their knowledge to some degree. For example, the first stage, «Learning», is a chain of tasks ordered from easy to hard, and each task's description contains all the reference material needed to solve it.

In this way the user learns Python by example while testing their skills in practice. Later this can serve as a basis for studying and gaining experience with open-source libraries.

The next task type is Score Games, or Single Player Games. These are games that cannot be won; instead, the goal is to score as many points as possible. The first of these games is Lines lite.

Unlike classic Lines, the ball does not have to travel the whole path; it only has to land on an empty cell. Once the user has written a program and pressed Play, the program's output shows a board gradually filling with balls of different colors.

By clicking Next or Prev, the user steps through the iterations of the game. The score is displayed below the board, and any score above 0 puts the user on the game's High Score list.

The third task type is Competition, or Multi Player Games. To test a program, the user picks an opponent: a program written by someone else on the portal, so it is the users' programs that play against each other. The first such game is «5 in a row», also known as «Gomoku».

The user has to write a program that plays this game against "someone". In addition, the user can place their program on the Arena («On Arena») so that other users can test their own programs against it. Programs placed on the Arena are periodically run by the system in tournaments against one another.

The project was launched in 2011.

Terrelle Pryor presently has too much hype for leaving Ohio State.

The Oakland Raiders reaching for him in the third round of the supplemental draft definitely won't quell Pryor's massive ego.

Pryor became this way all the way back in 2008, when he was the No. 1 QB prospect and arguably the No. 1 overall prospect that every university wanted to recruit for its program.

He was drawing comparisons to Vince Young in high school, and Young at that time was a hot commodity, having won the BCS national championship in spectacular fashion vs. USC and become the third overall pick in the 2006 NFL draft. He was also playing rather admirably for the Tennessee Titans.

When Pryor narrowed his choices down to Ohio State, Penn State, Michigan and Oregon, Sports Illustrated dubbed his coming decision "unquestionably the most anticipated signing day announcement in history."

He made these teams wait more than a month and a half past National Signing Day for his decision to attend The Ohio State University. If that does not scream cockiness, I don't know what does.

With so much hype behind him, can you blame Pryor for having such a huge ego heading into his college career?

The trouble is, he did not check it at the door.

Pryor thought he was a big shot, and he acted like one around campus.

He drove an estimated eight different cars in his three years at the school, none of which he paid for personally. He drove a Nissan 350Z to practices and meetings while holding a license that had been suspended by the State of Ohio.

You want to talk about vanity? He allegedly made thousands of dollars putting his autograph on memorabilia for an OSU booster.

Pryor performed rather well on the football field but was kind of a jerk about it. Listen to his post-Sugar Bowl interview, in which he told reporters that he just wanted to make his opponent (Arkansas) look bad.

Along with the memorabilia and the sketchy car situation, Pryor also decided to trade autographs for ink at a Columbus tattoo parlor.

This was one of the final straws that forced the NCAA to suspend him for five games during the 2011 season.

Pryor decided to cut his losses, leave school and hire an agent.

Guess who that agent is?

Yup, Drew Rosenhaus. The greatest jerk you can find in the business. He gets his clients paid, but he has just as much of an ego as Pryor. It seems like a match made in heaven, but it won't work in the NFL.

The Raiders selected Pryor in the supplemental draft as a positional player, according to ESPN's John Clayton, and he is really going to need to eat his humble pie in order to find success in the NFL.

The odds are already stacked against him if he does not want to use his freakish athleticism and adapt to help the team in any way possible, including a conversion to wide receiver.

So, Terrelle Pryor, if you want to use the fact that you were a third-round supplemental draft pick to boost your ego, by all means, go for it. Just don't be surprised when you fail.

Penguins 5, Sharks 1
* Sidney Crosby (PIT): 3 assists; 10 points (2 G, 8 A) in last 5 games; 500th career game
* Chris Kunitz (PIT): 2 goals, 2:03 apart (3rd multigoal game this season)
* Penguins: Scored first 4 goals of game in 7:03 span in 2nd period
* Sharks: Loss snaps 6-game win streak
FROM ELIAS: Sidney Crosby, in the 500th regular-season game of his NHL career, recorded three assists as the Penguins beat the Sharks, 5–1. Crosby's career total of 706 points is the sixth-highest by any player in NHL history through his first 500 games. The five players with higher point totals are Wayne Gretzky (1,186), Mario Lemieux (971), Peter Stastny (759), Mike Bossy (757) and Jari Kurri (730). Crosby has tallied at least two points in each of his five century milestone games: No. 100 (2), 200 (2), 300 (2), 400 (2), 500 (3).

Oilers 8, Avalanche 2
* Taylor Hall (EDM): 4th career hat trick (goals 8, 9, 10)
* Oilers: 6-3-0 in last 9 games (4-15-2 in first 21 games this season)
* Oilers: have won 4 straight meetings vs Avalanche (1st time since March–Nov. 2006)
* Avalanche: 5-5-0 in past 10 games (14-2-0 in first 16 games this season)

Wild 4, Blackhawks 3
* Marco Scandella (MIN): 1st goal of season, broke 3-3 tie with 1:48 remaining in 3rd period
* Wild: improve to 12-3-2 at home this season (T-most home wins and points in NHL)
* Blackhawks: 2 straight regulation losses for 1st time this season
* Blackhawks: had won previous 6 road games

Canadiens 2, Bruins 1
* Max Pacioretty (MON): goal in 2nd period; has scored a goal in 5 straight home games (9 total goals in those 5 games)
* Canadiens: 8-0-1 in last 9 games
* Canadiens: have won 5 straight home games
* Bruins: 2-5-0 in last 7 road games

Rangers 3, Sabres 1
* Rick Nash (NYR): Goal (5); has scored in 4 of last 5 games after 1 goal in first 7 games this season
* Henrik Lundqvist (NYR): 1 GA; had allowed multiple goals in 5 straight games

FROM ELIAS: Henrik Lundqvist, playing in his first game since he signed a 7-year contract extension with the Rangers, made 27 saves as he led New York to a 3–1 victory over the Sabres in Buffalo. It was the 285th win of Lundqvist's career, the most by any goaltender during his 9 seasons in the NHL. Over the last nine seasons Lundqvist ranks second among NHL goaltenders in both games played (532, behind the recently retired Miikka Kiprusoff at 538) and shutouts (47, behind Martin Brodeur with 48).

The team raised that possibility in regard to an earlier postponed game and was denied, casting doubt that the request will be approved now.

The Orioles are planning to go ahead with tonight's game against the Anaheim Angels, scheduled for 7:05, the first of a 4-game series, Orioles spokesman Bill Stetka said. A final decision will be made today by the fire department, which is cleaning up the wreckage of chemical cars in a tunnel that runs near the stadium.

"We're taking it day by day," Stetka said. The team is hoping to avoid more postponements.


The stakes are high: unique among the major-league sports, baseball teams derive almost 80 percent of their money from the sale of tickets, concessions and other "local" revenue that is earned only when a game is played. An NFL team, by contrast, makes most of its money from network television fees that are distributed by the league.

The Orioles estimate they could lose $3 million in revenue if the two Rangers games are not made up. The Maryland Stadium Authority, too, stands to lose a share of its income.

Normally a missed game is added to the next series in which the visitors return to town. The game played Wednesday afternoon, for instance, was a makeup of a May 26 rainout. But the Rangers aren't scheduled to return again this season.

Baseball rules permit the Orioles to make up the lost games at the Rangers' home field in Arlington, Texas, where the team is scheduled to visit next week. But the league tries to avoid swapping fields for both competitive and financial reasons. It would deny the Orioles a home-field advantage for the games as well as the money that would be made from tickets and concession sales.

"No team wants to lose their gate or home-field advantage. You want everyone to have their 81 home games," said Katy Feeney, vice president of scheduling and club relations for Major League Baseball in New York.

The home team keeps all of the money made on a game except 20 percent of "net revenues" that are paid into a league fund for distribution back to the clubs, with the neediest clubs receiving the most.

Yesterday, Feeney started searching for gaps in the two teams' schedules. Three open days overlap: Aug. 13 and 27, and Sept. 17. At least one of those days presents a travel problem for the Rangers, who would be forced to fly cross-country twice for the game.

Scheduling a makeup game requires consultation with the players union, whose contract prohibits more than 20 straight game days without the players' consent.

If no date can be found, and either team is in contention, a game can be added after the season's end, Feeney explained. That is not likely to be a factor for either the Orioles or Rangers, both of whom are in fourth place.

Stetka said the team already tried to get the May rainout scheduled as a postseason game for Ripken's finale, but the league rejected the idea and it was scheduled for Wednesday afternoon. The team has asked again, but is not optimistic the league will approve because of the prior denial, Stetka explained.

John Moag, managing partner of investment firm Legg Mason's sports business practice, said, "You never, ever want to lose a game, because it is hard, real money that is being removed from the bottom line."

For the Orioles, the postponements come at a time of declining attendance. The team, although the fourth-best drawing club in baseball this season, is averaging about 37,000 fans per game, a 12 percent drop from last year.

"When you are getting hit with reduced attendance, the last thing you want is a lost game," Moag said.

NOTE: Fans with tickets to either of yesterday's games can hold them until a makeup date is announced or exchange them for available seats at any Monday-Thursday home game this season. Fans living 75 miles or farther from Camden Yards may request a refund by sending a letter, and the original tickets, via certified mail to the Orioles at 333 W. Camden St., Baltimore, Md. 21201. Send the letter to the attention of "July 18 [or 19] Postponement."

Wire reports contributed to this article.

On second thought, you won't see the three Hawk Olympians on Team Canada at practice when the Blackhawks resume play on Thursday -- at least not for a couple more days. Their Olympic team finally came to play, staving off elimination for the second straight night and advancing to the semifinals on Friday.
[Photo: Cameron Spencer/Getty Images -- Jonathan Toews handles the puck against Sergei Gonchar of Russia during the men's ice hockey quarterfinal between Russia and Canada on day 13 of the Vancouver 2010 Winter Olympics.]

Canada woke up from its tournament slumber, pasting what was thought to be a dangerous Russian squad 7-3. The emotion Canada showed compared to previous games was night and day. From the drop of the puck, the host nation's team simply would not be denied. They scored 2:21 into the game and never looked back after a 4-goal opening period. Russia looked stunned, if not shocked, by the onslaught.

Once again, Blackhawks forward Jonathan Toews was one of the better players on the ice as his international star continues to grow. Toews had assists on two of Canada's seven goals. On the first, he forced a Russian turnover, then moved up ice and took a pass only to feed Rick Nash for an easy goal. That put Canada up 3-0. On the second, he took a loose puck, entered the offensive zone and found Shea Weber to extend the lead to 6-1. Toews has been nothing short of fantastic at both ends of the ice in his Olympic debut. He played 15:50, finishing with those two helpers and a plus-2 rating. In the five games Canada has played, he's a team-best plus-9 with a team-high seven assists. That's right, Toews is helping lead the way on one of the most talented teams Canada has ever assembled.

Duncan Keith and Brent Seabrook had solid efforts as well, although once again Seabrook played limited minutes. Keith had two assists in 22:54 of ice time, and if Toews has been the most consistent forward, then Keith has done the same on the back end. Seabrook played well in only 7:49 of ice time.

Speaking of defense, Canada's seven blueliners made the difference between this game and prior ones in the tournament. They played with aggression and passion, jumping into the offensive play and putting more pressure on Russia than it could handle. On this night, they combined for two goals and five assists. Those are more like the numbers Canada expected when its team was put together.

Canada may have found its game, but its next contest will surely confirm or refute that. For now, they've put their stamp on this tournament with their biggest victory yet. The U.S. and Canada are a single game away from the ultimate rematch. On to the semifinals.

Good afternoon! Please help me sort out a problem. I have a site in Python; I made changes to the .py files, but the site stayed the same. I've tried everything, with zero reaction: touch wsgi.py, compileall.compile_dir(...), python manage.py - nothing. It's a hosting provider's server, and I have no rights to restart services. https://code.google.com/p/modwsgi/wiki/ReloadingSourceCode did not solve the problem. It's urgent! I'll top up your phone balance for a useful tip!

Change the case of a character if it is a Latin letter: make it uppercase if it was lowercase, and vice versa. To do this, write a separate function that switches the character's case. - An olympiad problem; the editorial went like this: with an if we check where the character lies. If it is between 'a' and 'z', we output the character whose code is decreased by the difference between the codes of 'a' and 'A'; if it lies between 'A' and 'Z', we output the character whose code is increased by that same difference. And another one: Find the N-th Fibonacci number. Note: loops may not be used in the program. Input format: a non-negative integer N (N ≤ 30). Output format: print the N-th Fibonacci number. The editorial: the problem is solved with a recursive function that returns the answer directly for n = 0 and n = 1, and for n > 1 makes recursive calls for n - 1 and n - 2, returning the sum of their results as the answer.
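Both editorials translate directly into short Python functions. A sketch; the indexing convention fib(0) = 0, fib(1) = 1 is an assumption, since the problem statement does not fix one:

```python
def swap_case(ch):
    """Swap the case of a Latin letter by shifting its code; other characters pass through."""
    shift = ord('a') - ord('A')
    if 'a' <= ch <= 'z':
        return chr(ord(ch) - shift)   # lowercase -> uppercase
    if 'A' <= ch <= 'Z':
        return chr(ord(ch) + shift)   # uppercase -> lowercase
    return ch

def fib(n):
    """N-th Fibonacci number, recursively and without loops (0 <= n <= 30)."""
    if n < 2:                          # base cases: fib(0) = 0, fib(1) = 1
        return n
    return fib(n - 1) + fib(n - 2)     # recursive case
```

For N up to 30 the naive recursion is fast enough; beyond that it grows exponentially and memoization would be needed.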

Greetings, friends!
I've decided to set out on the not-so-easy path of the Python programmer and learn this very Python. I'll say up front that I have no programming experience at the moment; I work as a system administrator and know Bash at an intermediate level.
I already started learning this language once, a few years ago, reading Lutz, but for reasons I no longer remember I dropped it. Now I've decided to resume my attempts.
Why do I need a blog on this resource?
First, the main reason is that, restless as I am, I might abandon the whole thing again; this way there will probably be some degree of accountability to potential readers, and an incentive to keep doing something.
Second, the portal is dedicated to exactly this language, and here I may well run into people who care, and get useful advice and support from them.
And those are probably all the reasons.
The start, so to speak, has been made; now I'm off to watch Ekaterina Tuzova's lectures on Lektorium, which have been warmly recommended to me.
And yes, once again: I'm starting with Python 3.
For those who care: practical advice is welcome.

P.S. Help me pick a decent IDE for Python, just not vim and not Sublime Text.

A utility for working with archive catalogs

I often feel the urge to put my photo, video and other archives in order, but I never have the patience to do it by hand. So I decided to write a handy little utility for finding duplicates (duplicates are files with identical contents, more precisely with identical md5 hexdigests, even though they may have different names) in "archive" (raw) catalogs, and for searching all possible places for "lost" (uniq) files, i.e. files missing from the "archive" (raw) catalogs. I wanted to knock this out in a couple of hours, and "I meant well, but it turned out as always": instead of a couple of hours it took a couple of evenings. And since a lot of time has been spent anyway, I want to bring the utility to a "more finished" state and hear Pythonistas' opinions, remarks and wishes concerning optimization, style, code formatting, variable naming, and so on. Any constructive criticism is welcome. I hope that with your help I'll learn something useful, and not only for myself. The configuration options and usage should be clear from the code (and of course I'll add a description when I can).
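The core idea, grouping files by a content hash, can be sketched in a few lines. This is a simplified sketch using hashlib, not the utility's actual code, which also compares file sizes first to avoid hashing everything:

```python
import hashlib
import os

def file_md5(path, chunk_size=65536):
    """MD5 hexdigest of a file, read in chunks to bound memory use."""
    h = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(root):
    """Map each md5 hexdigest to the list of paths sharing that content,
    keeping only digests that occur more than once."""
    by_digest = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            by_digest.setdefault(file_md5(full), []).append(full)
    return {d: paths for d, paths in by_digest.items() if len(paths) > 1}
```

Hashing only files whose sizes collide, as the utility below does, saves most of the I/O, since files of different sizes cannot be identical.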

In brief

The utility only reads the catalogs and changes nothing. For now, that is; future versions may. It accepts either a configuration file or command-line options, e.g. ./ardiff.py --config=test.conf, or both at once, in which case the command-line options take priority. The main parameter is the "archive" (raw) catalog, or a list of them, in which duplicates will be searched for, e.g.: ./ardiff.py --raw=dir1,dir2 # comma-separated, no spaces. Paths may be absolute, or just their "relative" (name) part, relative to the current directory or to the directory given in --raw-root: ./ardiff.py --raw-root=/home/user/photo-arch --raw=dir1,dir2. In the same way you can specify the "external" (raw-sorted) catalogs in which to search for the files that are missing (uniq) from the "archive" (raw) catalogs. You can also control how the results are reported: they can go to the screen (--verbose=2) and/or to files, see the example below.
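The option-handling scheme described here (config-file values plus command-line flags, with the flags winning) can be sketched with the stdlib optparse module. This is a hypothetical simplification, not the actual OptionParserWraper from kcommon.py, which is not shown in full:

```python
from optparse import OptionParser

def parse_options(argv, config=None):
    """Merge config-file values with command-line options; the latter win."""
    parser = OptionParser()
    parser.add_option("--raw", help="comma-separated raw catalogs")
    parser.add_option("--raw-root", dest="raw_root")
    parser.add_option("--verbose", type="int", default=0)
    # config values become parser defaults, so explicit flags override them
    if config:
        parser.set_defaults(**config)
    opts, args = parser.parse_args(argv)
    return opts, args

opts, _ = parse_options(["--raw=dir1,dir2"], config={"verbose": 2})
```

Loading the config into `set_defaults` is what gives command-line flags priority: a flag that is actually passed replaces the default, while unset options fall back to the config value.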

A working example

In this example the utility is run with options that first find all duplicate files in the subdirectories dir1 and dir2 of /home/user/photo-arch, and then find all files present in the subdirectories dir3, dir4 and dir5 of the photo-arch catalog on an external drive mounted at /media/user/3XX; the results are written both to the screen and to files in the ./folder-for-results directory.


./ardiff.py  --raw-root=/home/user/photo-arch --raw=dir1,dir2 --raw-sorted-root=/media/user/3XX/photo-arch
--raw-sorted=dir3,dir5 --dup_prefix=dup- --check_dup=Y --uniq_prefix=uniq- --check_uniq=Y 
--out_root=folder-for-results --verbose=2
So that it isn't too boring to sit there while the utility churns through the gigabytes of your archives, it prints the number of catalogs and files scanned and the traversal speed in files per second: to stdout when verbose = 1, and to stderr when verbose = 2. The utility is still rough, but in capable hands it is quite useful and, as I already wrote, quite safe, because it deletes nothing and writes nothing except the fairly small result files, and even those can be turned off.
Don't judge too harshly; this is exactly that rare case where helping with advice is appropriate.

Now the source code

File: ardiff.py

#!/usr/bin/env python
# encoding: utf-8
'''
fsutils.ardiff -- utility to help clean up file archives/catalogs

fsutils.ardiff is a utility that helps clean up file (photo, video, etc.)
archives/catalogs. For now it only looks for duplicates across the
"main"/"raw" archives/catalogs. It can also find unique files (files
that are not in the "main"/"raw" archives) in external catalogs.

It defines classes_and_methods

@author:	 Gurman

@copyright:  2014 Gurman Inc. All rights reserved.

@license:	"Copyright 2014 Gurman (Gurman Inc.)    	\
			Licensed under the Apache License 2.0\n		\
			http://www.apache.org/licenses/LICENSE-2.0" 

@contact:	apgurman@gmail.com
@deffield	updated: Updated
'''
import sys
import os
import md5
import time

from kcommon import OptionParserWraper
from kcommon import err_report
from kcommon import CatalogsWalker
from kcommon import OutSplitter

__all__ = []
__version__ = "v0.1"
__date__ = '2014-11-24'
__update__='2014-11-24'

__version_string__ = '%%prog %s (%s)' % (__version__, __update__)

DEBUG = 1

__SORTED_SUFFIX__= "sorted.txt"
__ASIS_SUFFIX__= "asis.txt"

class FileInfo(object):
	""" Class to hold file info """
	
	def __init__(self,_path,_name=None):
		""" if only _path is passed, it is treated as path+name """
		self.__size, self.__hexdigest = None, None
		if _name is None:
			self.__path = os.path.dirname(_path)
			self.__name = os.path.basename(_path)
		else:
			self.__path, self.__name = _path, _name
		
			
	@property		
	def path(self):
		return self.__path
	
	@property		
	def name(self):
		return self.__name
	
	@property
	def full_name(self):
		return os.path.join(self.path,self.name)
	
	@property
	def size(self):
		""" calculate if needed and return the file size """
		if self.__size is None:
			self.__size = os.stat(self.full_name).st_size
		return self.__size
	
	@property
	def hexdigest(self):
		""" calculate if needed and return the md5 hexdigest of the file """
		if self.__hexdigest is None:
			_md5 = md5.new()
			_file = open(self.full_name, 'rb')
			try:
				# read in 64 KB chunks; md5.block_size is only 64 bytes
				_block = _file.read(65536)
				while _block:
					_md5.update(_block)
					_block = _file.read(65536)
			finally:
				_file.close()
			self.__hexdigest = _md5.hexdigest()
		return self.__hexdigest
	
	@hexdigest.setter
	def hexdigest(self,_hxd):
		""" hexdigest setter if we want to use something specific """
		self.__hexdigest = _hxd
		return self.__hexdigest
	
	def __eq__(self,finfo):
		""" files of the same size count as equal if their names match
			(no hashing); otherwise the md5 digests are compared """
		if finfo.size != self.size:
			return False
		if self.name == finfo.name:
			return True
		return self.hexdigest == finfo.hexdigest
	
	def __ne__(self, o):
		return not self.__eq__(o)
	
	def is_in(self,_flist):
		for finfo in _flist:
			if self.__eq__(finfo):
				return True
		return False
						

class RawCatalogs(object):

	"""Class to manipulate and hold list of raw(archive) catalogs - main storage """

	def __init__(self, opts ):
		self.__count = 0
		self.__opts=opts
		
		ctlgs=opts.raw.split(',')
		self.__raw_catalogs = [os.path.join(opts.raw_root,x) for x in ctlgs] if opts.raw_root else ctlgs
		
		self.__out_splitter = OutSplitter([sys.stdout]) if opts.verbose == 1  else OutSplitter([sys.stderr])
		
		self.__raw_files, self.__raw_dup_files, self.__uniq_files = None, None, None

	@property
	def raw_files(self):
		if self.__raw_files  is None:
			self.__raw_files = {}
			self._add_arhs(self.__raw_catalogs)
		return self.__raw_files
	
	@property
	def raw_dup_files(self):
		if self.__raw_dup_files is None:
			self.__raw_dup_files = {}
			self.raw_files	# lazy property: touching it walks the catalogs and fills the dup dict
		return self.__raw_dup_files
	
	@property
	def uniq_files(self):
		if self.__uniq_files is None:
			self.__uniq_files = {}
		return self.__uniq_files
	
	def _check_duplicate(self, finfo):
		""" checks whether the file's info is already in the archives """
		for i in self.__raw_files[finfo.size]:
			if finfo.hexdigest == i.hexdigest:
				# first duplicate: remember the original file as well
				if finfo.hexdigest not in self.__raw_dup_files:
					self.__raw_dup_files[finfo.hexdigest] = [i]
				# remember every duplicate
				self.__raw_dup_files[finfo.hexdigest].append(finfo)
			# else: same size but different content - not a duplicate
		
	def _add_to_arh_dict(self, finfo):
		""" add file's information to archives """
		if finfo.size not in self.raw_files:
			self.raw_files[finfo.size] = []
		elif self.__opts.check_dup == 'Y' :	
			self._check_duplicate(finfo)
			
		self.raw_files[finfo.size].append(finfo)
		
	def _add_arhs(self,_catalogs, out_splitter=None):
		""" main loop thru the archives to create internal structures """
		
		if out_splitter is None: out_splitter = self.__out_splitter
		
		for p,n in CatalogsWalker(_catalogs, out_splitter):
			finfo = FileInfo(p,n)
			self._add_to_arh_dict(finfo)
	
	def check_for_uniq_files(self,_catalogs,rfs=None,out_splitter=None):
		""" look through external catalogs for files which aren't in the archives """
		uniq_files={}
		
		rfs = self.raw_files if rfs is None else rfs
		
		if out_splitter is None: out_splitter = self.__out_splitter
		
		cw=CatalogsWalker(_catalogs,out_splitter)
		for p,n in cw:
			
			finfo = FileInfo(p,n)
			
			if finfo.size in rfs:
				if not finfo.is_in(rfs[finfo.size]):
					if finfo.hexdigest not in uniq_files:
						x = []; x.append(finfo)
						uniq_files[finfo.hexdigest] = x
								
			elif finfo.hexdigest not in uniq_files:
					x = []; x.append(finfo)
					uniq_files[finfo.hexdigest] = x
					
		return uniq_files
	
	def group_by_folders(self,d):
		pd={}
		for i in d:
			for z in d[i]:
				if z.path not in pd:
					pd[z.path]=[]
				pd[z.path].append(z)
		return pd

	def print_as_is(self,n,d,outs):
		if not hasattr(outs, 'write') : return
		w = outs.write
		w(n)
		for i in d:
			w("\n")
			for x in d[i]:
				w(x.hexdigest+repr(x.size).rjust(12)+(x.full_name).rjust(64)+'\n')
		w("\n")	
						
	def print_sorted_by_full_name(self,d,outs):
		if not hasattr(outs, 'write') : return
		w = outs.write
		# sort the hash groups by the full name of their first file
		for _digest, finfos in sorted(d.items(), key=lambda x: x[1][0].full_name):
			w('\n')
			for z in finfos:
				w(z.full_name+"\t"+repr(z.size)+"\t"+z.hexdigest+'\n')
		w("\n")
	
	def print_sorted_by_catalogs_and_names(self, n, d,outs):
		if not hasattr(outs, 'write') : return
		w = outs.write
		w(n)
		for i in d:
			w('\n\npath='+ i+ "\nfiles=")
			for z in sorted(d[i], key=lambda x: x.name):
				w(z.name + ',') #,"\t", z.size,"\t",z.hexdigest
		w("\n")
				
def _time_str():
	gmt=time.gmtime()
	return  '-'+str(gmt.tm_year)+'-'+str(gmt.tm_mon)+'-'+str(gmt.tm_mday)+'-'+str(gmt.tm_hour)+\
	 	 '-'+str(gmt.tm_min)+'-'+str(gmt.tm_sec)+'-'
		
def main_scenario(opts,args):
	
	rctlgs=RawCatalogs(opts)
	
	stdouts = sys.stdout if opts.verbose == 2 else None

	if opts.check_dup == 'Y' :
		rctlgs.raw_dup_files
		
		r, p = opts.out_root, opts.dup_prefix
		fout_prefix = ( os.path.join(r, p+_time_str()) if r else p + _time_str() ) if p else None 			
		
		if fout_prefix or stdouts: 
			fout_prefix_ = fout_prefix + __SORTED_SUFFIX__ if fout_prefix else None
			out_splitter = OutSplitter([stdouts, fout_prefix_])
			if out_splitter.pipes:
				rctlgs.print_sorted_by_catalogs_and_names("\n[SORTED DUPLICATES]\n",
					 	 	 rctlgs.group_by_folders(rctlgs.raw_dup_files), 
						 	 out_splitter)
			fout_prefix_ = fout_prefix + __ASIS_SUFFIX__ if fout_prefix else None
			out_splitter = OutSplitter([stdouts, fout_prefix_])
			if out_splitter.pipes:
				rctlgs.print_as_is("\n[DUPLICATES]\n", 
								rctlgs.raw_dup_files, 
								out_splitter)

	if opts.check_uniq == "Y":
		_catalogs=opts.raw_sorted.split(',')
		_catalogs=[os.path.join(opts.raw_sorted_root,x) for x in _catalogs] if opts.raw_sorted_root else _catalogs
		# keep the returned dict: uniq_files is what gets printed below
		rctlgs.uniq_files.update(rctlgs.check_for_uniq_files(_catalogs))
		
		r, p = opts.out_root, opts.uniq_prefix
		fout_prefix = ( os.path.join(r, p) if r else p ) if p else None 

		if fout_prefix or stdouts: 
			fout_prefix_ = fout_prefix + __ASIS_SUFFIX__ if fout_prefix else None
			out_splitter = OutSplitter([stdouts, fout_prefix_])
			if out_splitter.pipes:
				rctlgs.print_as_is("\n[UNIQS]\n",rctlgs.uniq_files, out_splitter) 
	
	print("\nThe end")
			
def main(argv=None):			
	
	try:
		
		opt_parser = OptionParserWraper(argv)#, version_string = __version_string__)
		
		main_scenario(opt_parser.opts,opt_parser.args)
	
	except Exception:
		err_report()
		return 2		


if __name__ == "__main__":
	if DEBUG:
		if len(sys.argv) == 1:
			sys.argv.append("--config=test.conf")
			pass

	sys.exit(main())

File: kcommon.py

#!/usr/bin/env python
# encoding: utf-8

import sys
import os
import time
from sets import Set

from optparse import OptionParser

__def_vers__ = "v0.1"
__def_build_date__ = "2014-11-24"

def err_report():
    
    if sys.exc_info() != (None,None,None) : 
        last_type, last_value, last_traceback = sys.exc_info()
    else : 
        last_type, last_value, last_traceback = sys.last_type, sys.last_value, sys.last_traceback 
    
    tb, descript = last_traceback, []
    
    while tb :
        fname, lno = tb.tb_frame.f_code.co_filename, tb.tb_lineno
        descript.append('\tfile "%s", line %s, in %s\n'%(fname, lno, tb.tb_frame.f_code.co_name))
        tb = tb.tb_next
        
    descript.append('%s : %s\n'%(last_type.__name__, last_value))
    
    for i in descript : 
        sys.stderr.write(i)


class Error(Exception):
    """Base class for ardiff exceptions."""

    def __init__(self, msg=None):
        self.msg = msg

    def __str__(self):
        return self.msg
    
    #def __repr__(self, *args, **kwargs):
    #    return Exception.__repr__(self, *args, **kwargs)
    
class ConfigNotFoundError(Error):
    """Raised if config file passed but not found."""

    def __str__(self):
        return "call expression when config file passed but not found"
    
class OutSplitter(object):
    """ splitter """
    
    def __init__(self, out_splitter):
        self.__pipes=None
        if isinstance(out_splitter, OutSplitter):
            self.__pipes = set(out_splitter.pipes)
        elif out_splitter:
            if hasattr(out_splitter,"__iter__") : # an iterable of pipes/names
                for pn in out_splitter:
                    self.add_pipe(pn)
            else:
                self.add_pipe(out_splitter)
                   
    @property
    def pipes(self):
        return self.__pipes
    
    def add_pipe(self, pn):
        if isinstance(pn, basestring):
            if pn == 'stderr': x = sys.stderr
            elif pn == 'stdout': x = sys.stdout
            else: 
                p = os.path.dirname(pn)
                if p and not os.path.exists(p):
                    os.makedirs(p)
                x = open(pn,'w+')
        else: x = pn
        
        if hasattr(x, 'write') and hasattr(x, 'flush'):
            if self.__pipes is None:
                self.__pipes=Set()
            self.__pipes.add(x)
    
    def write(self,s):
        if self.__pipes:
            for p in self.__pipes:
                p.write(s)
    
    def flush(self):
        if self.__pipes:
            for p in self.__pipes:
                p.flush()
                
class CatalogsWalker(OutSplitter):
    
    def __init__(self,catalogs,out_splitter=None):
        OutSplitter.__init__(self,out_splitter)        
        self.__catalogs = catalogs
        self.__file_counter, self.__catalog_counter, self.__start_time, self.__speed = 0, 0, 0, 0
        
    def __iter__(self):
        self.__file_counter, self.__catalog_counter, self.__speed = 0, 0, 0
        self.__start_time = time.time()
        for catalog in self.__catalogs:
            for p, d, ns in os.walk(catalog):
                _ = d
                self.__catalog_counter += 1
                for n in ns:
                    self.__file_counter += 1
                    elapsed = time.time() - self.__start_time
                    if elapsed > 0:  # guard against division by zero on the first file
                        self.__speed = self.__file_counter / elapsed
                    self.fanny_indicator(self.__catalog_counter, self.__file_counter, self.__speed)
                    yield p, n
                    
    @property
    def catalog_counter(self):
        return self.__catalog_counter    
    
    @property
    def file_counter(self):
        return self.__file_counter
    
    @property
    def start_time(self):
        return self.__start_time
    
    @property
    def speed(self):
        return self.__speed    
    
    def fanny_indicator(self,p,n,s):
        if self.pipes:
            _result = "\rcatalogs:" + repr(p).rjust(8) + "\tfiles:" + repr(n).rjust(13) + "\tspeed: %12.3f" % s + " files/s"
            self.write(_result)
            self.flush()    

class OptionParserWraper(OptionParser):
    def __init__(self, argv,
                 version_string='%%prog %s (%s)' % (__def_vers__, __def_duild_date__),
                 longdesc='',  # optional - give further explanation about what the program does
                 license_="Copyright 2014 Gurman (Gurman Inc.)\n"
                          "Licensed under the Apache License 2.0\n"
                          "http://www.apache.org/licenses/LICENSE-2.0"
                 ):
        '''Command line options.'''
        # setup option parser
        OptionParser.__init__(self,
                              version=version_string, 
                              epilog=longdesc, 
                              description=license_)
        
        x=self.add_option
        x('--config', dest='config', default=None, metavar='FILE', 
          help='config file')
        x('--raw_root', dest='raw_root', default=None, metavar='PATH', 
          help='prefix(root path) for raw catalogs')        
        x('--raw', dest='raw', default='raw', metavar='DIRS', 
          help='list of raw catalogs names - archives names')
        x('--raw_sorted_root', dest='raw_sorted_root',default= None, metavar='PATH', 
          help='prefix(root path) for sorted raw catalogs')      
        x('--raw_sorted', dest='raw_sorted', default='raw_sorted', metavar='DIRS', 
          help='list of sorted raw catalogs names')
        x('--verbose',  dest='verbose', type=int, default=2, metavar='int', 
          help='level of verbosity')
        x('--check_dup', dest='check_dup', default='Y', metavar='Y/n', 
          help='Y for check duplicates default[Y]')
        x('--out_root', dest='out_root', default=None, metavar='FILE', 
          help='root dir to results output')
        x('--dup_prefix', dest='dup_prefix', default=None, metavar='FILE', 
          help='filenames prefix to out list of duplicate files')
        x('--check_uniq', dest='check_uniq', default='n', metavar='Y/n', 
          help='Y for check uniq files default[n]')
        x('--uniq_prefix', dest='uniq_prefix', default=None, metavar='FILE', 
          help='filenames prefix to out list of uniq files')
                       
        def config_parser_and_options_merger(filename, argv=None):

            cllps = (lambda s: ' '.join(s.split()))
            dopts = {}

            fconf = open(filename, 'r')
            for ln in fconf.read().split('\n'):
                ln = cllps(ln)
                if not ln or ln[0] == '#':  # skip blank lines and comments
                    continue
                name, _, value = ln.partition('=')
                dopts[name] = value
            fconf.close()
            # command-line arguments override config-file values
            for arg in (argv or []):
                name, _, value = arg.partition('=')
                dopts[name] = value
            return ['%s=%s' % (k, v) for k, v in dopts.items()]
Example contents of test.conf:

--raw_root=/home/user/photo/ 
--raw=dir1,dir2
#comments 1
--raw_sorted_root=/media/user/3XX/photo/ 
--raw_sorted=dir2,dir3
--out_root=results
--dup_prefix=dup-
--uniq_prefix=uniq-
--check_uniq=Y
--check_dup=Y
--verbose=2
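The config file is just a list of long options, one per line, with `#` marking comments. The idea of merging such a file with the command line can be sketched in modern Python 3 using `argparse` (a simplified illustration, not the script's actual parser; the `load_config_args` helper and the reduced option set are assumptions for the example):

```python
import argparse

def load_config_args(text):
    """Turn config-file text into an argv-style list, skipping blanks and # comments."""
    args = []
    for ln in text.splitlines():
        ln = ' '.join(ln.split())  # collapse stray whitespace
        if ln and not ln.startswith('#'):
            args.append(ln)
    return args

parser = argparse.ArgumentParser()
parser.add_argument('--raw_root', metavar='PATH')
parser.add_argument('--raw', default='raw', metavar='DIRS')
parser.add_argument('--check_dup', default='Y', metavar='Y/n')

conf_text = """
--raw_root=/home/user/photo/
#comments 1
--raw=dir1,dir2
"""

# config values go first, so anything on the real command line overrides them
opts = parser.parse_args(load_config_args(conf_text) + ['--check_dup=n'])
print(opts.raw_root, opts.raw, opts.check_dup)  # → /home/user/photo/ dir1,dir2 n
```

Because `argparse` lets a later occurrence of an option override an earlier one, prepending the config-file options to the real argv gives the command line precedence for free.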
