LLM 大模型训练之路

2024-03-06

训练组件

LLM 训练模式

【2024-3-8】LLM-SFT-trick

微调是指在已经预训练好的大型语言模型基础上,使用特定数据集进行进一步的训练,使模型适应特定任务或领域。

  • 微调主要目的是,完成 知识注入、指令对齐

大模型应用中,指令微调已成为预训练大模型落地实际业务的最重要方式。许多垂直领域模型都是在预训练模型的基础上,通过针对性的指令微调,更好地适应最终任务并对齐用户偏好。

指令微调时,会将 Instruction(指令) 及对应的answer拼接成文本

  • 拼接过程中一般会加入【USER】、【BOT】等角色
  • 同时会加入开始结束的special token

这样可以转换成一个chat式任务

如翻译任务

# instruction:
【USER】:将下列内容翻译成英语:{待翻译文本}
# answer:
【BOT】:{翻译结果}
# 拼接后的文本:
<bos_token>【USER】:将下列内容翻译成英语:{待翻译文本}<special token>【BOT】:{翻译结果} <eos_token>

将拼接文本采用预训练任务方式进行自回归预测

  • 与预训练的区别在于 loss 的计算:同样使用 Cross-Entropy 作为 loss,但指令微调时只计算 answer 部分,Instruction 部分通过设置 ignore_index(通常为 -100)忽略掉。
  • 上面的案例中,只会计算 【BOT】: 之后的 loss(构造方式示意见下方代码)。
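
按照上述拼接与屏蔽思路,下面给出一个构造训练样本的极简示意(假设使用 HuggingFace tokenizer;build_sft_example 为说明用的假设函数名,具体 special token 需按所用模型的 chat template 调整):

# prompt 部分 label 置为 -100(PyTorch CrossEntropyLoss 默认的 ignore_index),只对 answer 与 eos 计算 loss
def build_sft_example(tokenizer, prompt_text, answer_text, max_length=1024):
    prompt_ids = tokenizer(prompt_text, add_special_tokens=False)["input_ids"]
    answer_ids = tokenizer(answer_text, add_special_tokens=False)["input_ids"]
    input_ids = prompt_ids + answer_ids + [tokenizer.eos_token_id]
    labels = [-100] * len(prompt_ids) + answer_ids + [tokenizer.eos_token_id]
    return {"input_ids": input_ids[:max_length], "labels": labels[:max_length]}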

特定任务改造

  • 分类任务: 模型最后添加 softmax/线性打分层。典型案例: reward 模型(输出映射为一维打分)。

通过生成式模式解决判别式任务

  • 多目标文本分类问题,采用指令微调方式去解决,效果非常好。
  • 甚至在7B、3B的base模型上,去生成一个复杂json结构(包含多层结构的标签)依然有效。

微调方法

  • 微调方法分为全参数微调(Full Fine-tuning)、部分参数微调(参数高效微调,也有文章称 Repurposing)
  • 全参数微调方法:SFT(更新全部参数)
  • 部分参数微调方法:LoRA、Adapter、Prefix-tuning、P-tuning、Prompt-tuning、Freeze-tuning 等。

受 GPT 系列工作的影响,大模型通用的训练模式是三阶段训练:第一阶段 pre-train,第二阶段 SFT,第三阶段 RLHF

  • 三阶段训练分别得到 base模型 以及 chat模型
  • chat模型是在 base模型 基础上进行通用任务 SFT 以及 RLHF 得到的,使模型具备了对话能力、推理能力、用户偏好对齐以及其他 NLU 能力。

SFT 训练模式

  • 模式一:基于 base模型 + 领域任务的SFT;
  • 模式二:基于 base模型 + 领域数据 continue pre-train + 领域任务SFT;
  • 模式三:基于 base模型 + 领域数据 continue pre-train + 通用任务SFT + 领域任务SFT;
  • 模式四:基于 base模型 + 领域数据 continue pre-train + 通用任务与领域任务混合SFT;
  • 模式五:基于 base模型 + 领域数据 continue pre-train(混入SFT数据) + 通用任务与领域任务混合SFT;
  • 模式六:基于 chat模型 + 领域任务SFT;
  • 模式七:基于 chat模型 + 领域数据 continue pre-train + 领域任务SFT

根据领域任务、领域样本、业务需求选择合适的训练模式。

  • a. 是否需要 continue pre-train
    • 大模型的知识来自 pre-train 阶段
    • 如果领域任务数据集与 pre-train 数据集差异较大(如领域任务数据来自公司内部),pre-train 训练样本基本不可能覆盖到,那一定要进行 continue pre-train。
    • 如果领域任务数据量较大(token在1B以上),并只追求领域任务效果,不考虑通用能力,建议进行continue pre-train。
  • b. 选择 chat模型 还是 base模型
    • 如果有好的base模型,在base模型基础进行领域数据的SFT, 与在chat模型上进行SFT,效果上差异不大。
    • 基于chat模型进行领域SFT,很容易导致灾难性遗忘:进行领域任务SFT之后,模型通用能力会降低。如只追求领域任务效果,则不用考虑这一点。
    • 如果领域任务与通用任务有很大相关性,那这种二阶段SFT会提升领域任务效果。
    • 如果既追求领域任务的效果,并且希望通用能力不下降,建议选择 base模型 作为基座模型。在base模型上进行多任务混合训练,混合训练的时候需要关注各任务间的数据配比。
  • c. 其他
    • 资源允许的情况下,如只考虑领域任务效果,选择模式二;
    • 资源允许的情况下,如考虑模型综合能力,选择模式五;
    • 资源不允许的情况下,考虑模式六;

SFT-训练参数

  1. 学习率
    • 学习率非常重要,如果设置不当,很容易让SFT模型烂掉。
    • SFT数据集不大时,建议设置较小学习率,一般为 pre-train 阶段学习率的 0.1 倍左右,如 pre-train 阶段学习率为 9e-5,则 SFT 学习率设置为 9e-6。
    • 在 10 万 SFT 样本上,采用与 pre-train 一样的学习率,loss 一直不收敛;调低学习率至原来的 0.1 倍之后,loss 在两个 epoch 后就收敛了。
  2. warmup_ratio
    • 通常 pre-train 训练的 warmup_ratio 0.01~0.015之间,warmup-steps在2000左右。
    • SFT 时,建议用更小的ratio,因为相较于pre-train,SFT样本非常小,较小warmup_ratio可以使模型收敛更平滑。
    • 但如果学习率设置较大,那可增大 warmup_ratio,两者呈正相关。
  3. Epoch
    • Epoch 可根据loss收敛情况设置
    • 如果SFT样本较少,可设置较大epoch,在较小的epoch上loss会不收敛,指令都很难遵循。较大epoch会容易导致过拟合,但过拟合要优于欠拟合。
    • 如果SFT样本数量较多,如在十万以上,一般2个epoch即可收敛。
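
结合上述几条经验,一个 SFT 超参设置的示意如下(以 transformers 的 TrainingArguments 为例;数值仅为按上文经验给出的假设,需结合具体模型与数据调整):

from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./sft_output",
    learning_rate=9e-6,               # 约为 pre-train 学习率(如 9e-5)的 0.1 倍
    warmup_ratio=0.01,                # SFT 样本较少,可用比 pre-train 更小的 warmup_ratio
    num_train_epochs=2,               # 样本量在十万以上时,一般 2 个 epoch 即可收敛
    lr_scheduler_type="cosine",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=16,
)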

其它

  • 如果SFT任务类型较多,添加 system_prompt,不同任务使用不同 system_prompt;
  • 好的基座模型非常重要
  • SFT 时,loss依然是最重要的指标,一般在SFT过程中,loss会先升后降
  • 尝试多种模式训练方案,如 continue pre-train 中添加SFT数据,在SFT数据添加高质量的pre-train数据;
  • 模型参数量非常重要

二次开发

二次开发方法分类

  • 1、领域知识注入:Continue PreTraining(增量预训练): 一般垂直大模型是基于通用大模型进行二次开发,用领域内的语料进行继续预训练。
  • 2、知识召回(激发):SFT( Supervised Finetuning,有监督微调): 通过SFT激发大模型理解领域内的各种问题, 并进行回答的能力。
  • 3、基础偏好对齐:奖励模型(RM)、强化学习(RL),让大模型的回答对齐人们的偏好,比如行文风格。
  • 4、高阶偏好对齐:RLHF(人类反馈强化学习)、DPO(直接偏好优化)。

3个阶段:

  • (1)、第一阶段: CPT(Continue PreTraining)增量预训练,在海量领域文档数据上二次预训练GPT模型,以注入领域知识。
  • (2)、第二阶段: SFT(Supervised Fine-tuning)有监督微调,构造指令微调数据集,在预训练模型基础上做指令精调,以对齐指令意图。
  • (3)、第三阶段 : RLHF 与 DPO 二选一。

Post-pretraining

Post-pretraining(后期预训练)是一种在模型的初始预训练和最终微调之间进行的训练方法。这种方法通常用于进一步适应模型以处理特定类型的数据或任务。

  • 在通用预训练模型的基础上,对模型进行额外训练,使模型更好地适应特定的领域或任务
  • 数据集: 某个领域,但比微调阶段使用的数据集更大、更广泛。
  • 训练方法: 监督学习,自监督学习,取决于数据类型和训练目标, 如语言建模、文本分类、实体识别等

Post-pretraining 允许模型在保持通用性的同时,增强对特定领域的理解,有助于模型在后续的微调阶段更快速地适应特定任务。

  • 与 SFT 相比,Post-pretraining 在微调之前提供了一个中间步骤,有助于模型更平滑地过渡到特定任务上。
  • 与 RLHF 相比,Post-pretraining 不依赖于复杂的奖励机制或人类反馈,而是通过大量的领域特定数据来提升模型性能。

总结

  • Post-pretraining 是一个介于预训练与微调之间的训练阶段
  • 使用大量的领域特定数据来进一步调整模型,使其更好地理解特定领域的语言和任务。
  • 这个阶段不需要复杂的奖励机制,而是通过传统的监督或自监督学习方法来实现模型性能的提升。

增量预训练

增量预训练属于后期预训练(Post-pretraining)的一种。

增量预训练也叫领域自适应预训练(domain-adaptive pretraining),即在所属领域数据上继续预训练。

领域自适应预训练的方法可以分为三类:Prompt-based 方法、Representation-based 方法和 Model Mixture-based 方法。

  • Prompt-based 方法
  • Representation-based 方法
  • Model Mixture-based 方法

1. Prompt-based 方法

使用模型全局tuning的方式适应下游任务时,预训练模型的泛化性能会被严重削弱

因此, Prompt-based方法在保持预训练模型参数权重不变的条件下, 增加额外可学习的 Prompt tuning 模块来实现对下游任务的泛化,这样就能较好地保持原模型的泛化性能。

VPT(Visual Prompt Tuning)虽然可以较好地保留模型的泛化性,但是面对新的任务时,以往 Prompt 模块中的知识同样会被覆盖,依旧遭遇灾难性遗忘问题。

为此,有学者提出了Prompt Pool 概念,设计了Prompt模块的集合,即P={P1,P2,…,Pm}(m表示该Pool的最大尺寸)。

Prompt Pool 有效避免了单一Prompt的问题,但是Pool的设计使得其需要进行Prompt Selection操作,也就是需要将特定任务与其对应的Prompt模块进行索引匹配。

L2P算法是一种较为常用的 Prompt selection算法,该算法设计了一种Key-Query的Prompt匹配方法,为每一个Prompt提供一个可学习的索引键k,即 P={(k1,P1),(k2,P2),…,(km,Pm)}

L2P利用预训练模型将输入特征编码到Key对应的嵌入空间中,然后利用余弦距离损失函数在已有的Pool中搜索最近似的Key。接着,利用如交叉熵损失等方法对搜索到的Key对应的Prompt进行优化。
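
L2P 这种 Key-Query 匹配可以用如下极简代码示意(张量维度、Pool 大小等均为假设,仅展示“余弦相似度检索 + 余弦距离损失”的逻辑):

import torch
import torch.nn.functional as F

m, d, top_k = 10, 768, 5                               # Prompt Pool 大小、特征维度、选取数量(假设值)
keys = torch.nn.Parameter(torch.randn(m, d))           # 每个 Prompt 对应一个可学习的索引键 k
prompts = torch.nn.Parameter(torch.randn(m, 8, d))     # 每个 Prompt 模块含 8 个可学习 token

def select_prompts(query_feature):                     # query_feature: (d,),由冻结的预训练模型编码得到
    sim = F.cosine_similarity(query_feature.unsqueeze(0), keys, dim=-1)  # (m,)
    idx = sim.topk(top_k).indices                      # 检索最相似的 top_k 个 key
    match_loss = (1 - sim[idx]).mean()                 # 余弦距离损失,把被选中的 key 拉向 query
    return prompts[idx], match_loss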

类似的Prompt Selection 算法很多,如DualPrompt算法,该算法将Prompt进行解耦,分化为General Prompt和Expert Prompt。General Prompt面向所有任务,为所有任务中共享信息,而Expert Prompt针对独立任务,数量与任务量一致。其采用了和L2P相同的key-query匹配策略。

Prompt Selection虽然可行,但仍是硬匹配,选项有限。基于注意力信息加权的Prompt Combination方法则有效缓解了该问题。如CODA-Prompt通过对Prompt Pool进行注意力机制嵌入,为每个注意力赋予自适应权重,进而求算全局Key-Query的加权和,实现可学习式Prompt组合。我觉得稀疏式注意力Prompt combination应该也是很有趣的研究。

从根本上来说Prompt Combination仍受制于Prompt Pool的范围。为此, 许多学者则开展Prompt Generation有关的研究,如DAP,其利用MLP进行特定任务提示信息的编码生成。

优点:

  • Prompt 有助于弥合domain gap,并可有效地对特定任务的知识进行编码。
  • Prompt Design 属于lightweight模块,与input feature具有相同的维度,因此保存Prompt是parameter-efficient,适用于边缘场景。
  • Prompt Pool作为预训练模型的外部存储器,其支持自适应知识的检索和特定实例的预测。

缺点:

  • 一些研究发现L2P中的prompt selection过程收敛到一个单点,使得prompt selection只集中在特定子集上。
  • 由于key和query在整个学习过程中不断变化,这些参数的更新会覆盖先前任务的参数,导致matching-level和prompt-level的遗忘,使prompt selection成为CL(持续学习)的瓶颈。
  • 固定大小的Prompt Pool会使得模型的表示能力受限。但是,若Prompt Pool随着数据的发展而增长,可能会为旧任务检索新的提示,导致训练和测试之间的不匹配。
  • 最后,一些研究发现prompt-based CL的性能低于简单的representation-based baseline,并且按批次实现的prompt有损比较的公平性。

2. Representation-based 方法

representation-based 方法直接利用预训练模型强大的泛化性和通用性来实现持续学习。

  • 比如Simple-CIL方法,是ADAM算法原文中提出的Baseline,Simple-CIL冻结预训练模型参数,并通过求算类别中心的方式来构建Classifier。在面对很多类别时,计算同类的embedding或features的平均值,并将该平均值作为该类别的标准(prototype),最后结合类别标准与余弦比较的方法替换模型的原始Classifier。
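
Simple-CIL 的“类别中心 + 余弦比较”可以用如下极简代码示意(特征由冻结的预训练模型提取,变量名为假设):

import torch
import torch.nn.functional as F

def build_prototypes(features, labels, num_classes):
    # features: (N, d) 由冻结 backbone 提取的特征;labels: (N,)
    return torch.stack([features[labels == c].mean(dim=0) for c in range(num_classes)])

def classify(features, prototypes):
    # 用与各类别中心(prototype)的余弦相似度替代原始 Classifier
    sim = F.cosine_similarity(features.unsqueeze(1), prototypes.unsqueeze(0), dim=-1)  # (N, C)
    return sim.argmax(dim=-1)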

虽然基于prototype的方法存在一定的作用,但是并未很好地适应下游任务。为此,一些研究在基于prototype方法的基础上结合了外置参数高效调节模块或者外置适配器来使得预训练模型更加适应下游任务,如ADAM等。

ADAM 等算法在设定类别标准(prototype)时,类别标准之间仍存在相关性,导致任务效果降低。为此,RanPAC 算法采用 online LDA classifier 来去除原始方法 prototype 计算结果之间的相关性,加大类别间的分布差异。此外,RanPAC 利用 Random Projection layer 将 features 映射到高维空间,并在高维空间中计算 prototype,使特征分布更符合高斯拟合。

相较于前面将预训练模型的通用性和适应性分离处理的方式,SLCA算法采用了差异学习率调整和特征经验重播的方式进行持续学习研究。该算法使用较小的 learning rate 调整模型主体部分,而使用较大的 learning rate 调节模型的 classifier,以实现模型的逐步微调和 classifier 的快速适应。为了避免忘记以前的分类器,SLCA 还对分类特征分布进行建模,并重播它们以校准 classifier。

优点:

  • 由于class prototype代表了对应类别最常见的标准格式,因此利用其构建模型具有直观和可解释性。
  • Representation-based 方法主要是冻结backbone和更新classifier权重。lightweight的更新成本增加了其现实应用的可行性。

缺点:

  • 将不同模型的特征连接起来形成class prototype,容易造成模型信息冗余。例如,不同的backbone中存在重复提取共享特征。
  • 当下游任务涉及多个领域时,在第一阶段调整模型不足以弥合数据集之间的领域差距。在这种情况下,不断调整backbone可能更适合提取特定于任务的特征。

3. Model Mixture-based 方法

Model Mixture-based 方法在持续学习过程中构建一组模型,然后在推理阶段通过 Model Ensemble 和 Model Merge 来进行信息综合决策。

Model Ensemble中,ESN算法凭借预训练模型强大的通用性,构建多个classifier,在面对新任务时重新初始化并训练一个新的classifier。在推理时,采用投票策略来整合多个模型的结果进行最终决策。

由于Model Ensemble的核心因素取决于模型的方差,一些研究通过增强模型之间的多样性来替代使用相同的预训练模型构建不同的classifier。如PromptFusion利用预训练的ViT和CLIP,并在推理过程中动态地对logit进行组合,即f(x) = λ fvit (x) +(1−λ)fclip(x)。

与多个backbone的集成不同,PROOF采用了仅使用单个CLIP的更全面的推理方法。由于CLIP支持视觉和文本特征的跨模态匹配,因此PROOF设计了一个三层集成,考虑image-to-text、image-to-image prototype、image-to-adjusted text的跨模态融合。

Model Merge 将多个不同的模型合并为一个统一的模型,无需额外的训练。LAE 定义了 online 和 offline 两套学习协议:online 模型通过交叉熵损失进行更新,目的是在新任务中获取新知识;offline 模型则通过 Model Merge 进行更新,例如指数移动平均(EMA):θ_offline ← α·θ_offline + (1−α)·θ_online,其中 α 为权衡参数。LAE 仅将 EMA 应用于参数高效调节模块(如 prompt),并利用 online 和 offline 模型的最大 logit 进行推断。
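
其中的 EMA 合并可以用几行代码示意(仅作用于参数高效模块的参数列表,变量名为假设):

import torch

@torch.no_grad()
def ema_merge(offline_params, online_params, alpha=0.99):
    # θ_offline ← α·θ_offline + (1−α)·θ_online
    for p_off, p_on in zip(offline_params, online_params):
        p_off.mul_(alpha).add_(p_on, alpha=1 - alpha)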

与LAE一样,ZSCL将合并技术应用于CLIP模型,目的是在持续学习过程中保持其zero-shot性能。然而,随着EMA中权衡参数的改变,CLIP性能不再具有鲁棒性。因此,ZSCL建议每隔几次迭代合并参数,从而在模型训练期间创建平滑的损失轨迹。

此外,CoFiMA注意到EMA在Merge过程中对每个参数的重要性是相等的,CoFiMA 在Merge过程中插入Fisher information(费雪信息)作为每个参数的估计重要性。

优点:

  • 学习多个模型可以做出不同的决策。因此,使用Model Ensemble和Model Merge自然会产生更健壮的结果。
  • 由于直接合并模型进行统一预测,因此可以调整前模型和后模型的权重,以突出不同阶段之间知识共享的重要性。
  • 由于模型集将在推理过程中合并,因此最终的推理成本不会随着模型集中添加更多模型而增加。

缺点:

  • Model Ensemble需要保存所有的历史模型,并消耗大量的内存缓冲区。虽然基于Model Merge不需要这么大的成本,但合并大型backbone的权重也需要大量的额外计算。
  • 决定Merge哪些参数仍然是问题。

微调 (Fine-tuning)

这个阶段,预训练模型(可能经过了Post-pretraining)被进一步训练,以优化特定任务上的表现。

微调通常在一个相对较小的、特定任务的数据集上进行,这个数据集包含了明确的标签,模型通过监督学习来进行优化。

微调目的: 调整模型的参数,使其能够在特定任务上做出准确的预测。

SFT 监督微调

SFT (Supervised Fine-Tuning) 是微调的一种形式,强调在有监督的环境下进行。

SFT阶段,用特定领域数据或私有化数据, 对预训练模型进行改良。

这一阶段需要指令微调数据,数据集通常由输入(用户问题)和输出(标准答案)两个字段构成。标准答案通常由专家标注获得。

  • 1、SFT是一种简单的微调方法,它使用带有正确答案的数据集来继续训练一个预训练的模型。
  • 2、这种方法依赖于大量的标注数据,即每个输入都有一个预先定义的正确输出。
  • 3、微调的目的是使模型更好地适应特定的任务或领域【垂直领域】,比如特定类型的语言理解或生成任务。
  • 4、SFT通常不涉及复杂的策略或奖励函数,只是简单地最小化预测输出和真实输出之间的差异。

SFT VS Pretrain

【2024-10-22】细谈大模型监督微调SFT:实战经验技巧和debug分析思路

SFT 和 pretrain 在训练方式上没有任何区别,主要区别在于数据组成形式上:

  1. pretrain 每条数据都是满编 4K / 8K,SFT 每条数据原本多长就是多长;
  2. SFT 会引入 pretrain 阶段未见过的 special_token,来让它们学习全新的语义;
  3. SFT 会让模型见到最重要的 eos_token,pretrain 模型因为没见过该 token 而无法停止生成;
  4. 借助 special_token,SFT 会把语料切分成不同的角色,标配的有 system、user、assistant,根据业务需求也可以有“背景”、“旁白”、“事件”等等;
  5. SFT 的 prompt 不做 loss,但这并不是说它不能做 loss。主要原因是 prompt 的同质化比较严重,不做 loss_mask 的话,同样的一句话会被翻来覆去的学,但如果你能保证你的每条 prompt 都是独一无二的,就完全可以省去 prompt 的 loss_mask 环节。对了,session 数据一定要想清楚是每一个 answer 都算 loss,还是只对最后一轮的 answer 算 loss。

除此之外,训练目的也不一样。

  • pretrain 是在背书,纯粹的学习知识;
  • sft 则是在做题,学习的是指令 follow 能力。

切勿在 sft 阶段强行给模型做知识注入,比如训个 50W 条的 code 数据,所有的知识注入工作应该采用 continue-pretrain 的思路进行,否则都会使得模型的通用能力掉点明显(SFT 做知识注入基本上是 100% 某个知识,但 continue-pretrain 做知识注入会控制在 10% ~ 20% 左右的比例)。

RLHF 人类反馈强化学习

RLHF 利用人类反馈来训练强化学习模型。

在RLHF中,模型通过与人类交互获得反馈,这些反馈作为奖励信号来指导模型的行为。RLHF通常用于训练能够生成更自然、更符合人类偏好的文本或其他输出的模型。这种方法特别适用于需要模型理解和适应人类偏好的场景。

  • 1、RLHF (Reinforcement Learning from Human Feedback) 是一种更复杂的训练方法,结合了监督学习和强化学习。
  • 2、在RLHF中,模型首先通过监督学习进行预训练,然后通过人类提供的反馈来进行强化学习。
  • 3、人类反馈可以是直接对模型输出打分,也可以是在多个模型输出之间做出偏好选择
  • 4、强化学习部分涉及到定义一个奖励函数,根据人类反馈来调整模型的行为,以优化长期的奖励。
  • 5、RLHF目标: 训练出一个在没有明确标签的复杂任务中表现良好的模型,这些任务可能需要更细致的判断和调整。

思考

对齐

instruction following 是 alignment (对齐)的一个特殊形式,但它并不构成对齐的全部内容。

对齐问题原本称为价值对齐(value alignment),指一个 AI 系统的训练目标可能与其实际需要面对的核心价值并不一致。

  • 训练目标与真正希望 AI 满足的目标之间存在不匹配,而如何解决这个不匹配的问题被称作 value alignment problem。

OpenAI 此前提出 “Super-Alignment”(超级对齐),探讨当 AGI 的水平远远超越人类时,人类将如何是好。

OpenAI 当时提出了一个概念,即 “Weak-to-Strong Generalization”,如果目前机器智能尚不及人类,人类尚能与之互动;但若其智能发展至极高水平,人类似乎难以与其沟通。那么也就产生了一个问题,人们应该如何训练 AI,是否应该采用特定的方式?Next Token Prediction 或是 instruction following 是不是一个好的对齐方法?

alignment 问题核心假设:

  • 因为人类很多时候并不清楚自己到底想要什么,因此很难给出一个完全具体的价值观描述,且不同人的价值观都有区分。
  • 如果人类给出的指令永远不是特别准确,那么 AI 系统在执行任务时需要保持一定的不确定性。

框架 Cooperative Inverse Reinforcement Learning,来源于师兄 Dylan Hadfield-Menell(目前在MIT任教)和导师做的一个研究。

  • 假设每个人都有一个 hidden reward function。当人与 AI 交互时,人可能想的是 AI 帮我递个咖啡,但人给 AI 的具体指令可能并不是这样,比如人可能只是说了“给我个喝的”,AI 需要不断去推断人类的真正意图。

在这样的定义下,人类的真正意图可以被建模成一个隐藏的奖励函数,机器人需要不断地根据人给出的所有信息来主动推断人类的真正意图。如果不确定时,最优策略是 AI 去问人类。

post-training 让模型更聪明

【2024-8-23】RL 是 LLM 的新范式

曾在 OpenAI 负责 post-training 的 John Schulman(RL 拥趸和布道者)认为:

  • post-training 是模型变得越来越聪明的重要原因,而 RLHF 是最重要的技术 tricks。

John Schulman 对 RLHF 的信仰来自 OpenAI 的亲身实践:

  • GPT-4 的 Elo 分数之所以能比其最初发布的版本高出约 100 分,也和 post-training 的提升相关。

Scaling law 让 AI 更聪明,而 RL 让 AI 更有用

InstructGPT 核心思想

  • 利用人类的判断来指导模型的训练,因为这些 instruction following 的任务本身就是人类给出的指令。
  • InstructGPT 能够处理复杂的指令,包括写代码等任务,很多在 zero-shot 设定上 GPT-3 做不了的任务都可以被完成。

InstructGPT 目标: 微调 GPT 模型,使其能够产生满足人类指令的输出。

为了使 GPT 完成指令遵从,技术挑战集中在:如何收集数据?

为了实现这一目标,需要完成两件事情:

  • 指令,fine-tuning 首先需要收集指令,即人类的 prompts 或 instructions。
  • 反馈,需要收集好的反馈来满足 human instructions。

从训练语言模型的角度来看,收集大量的人类指令(human instructions),以及对应的人类反馈。这些对应好的数据将被作为 Next Token Prediction 的训练数据,通过传统语言模型训练方法,即 SFT (Supervised Fine-Tuning),来进行训练。

于是, InstructGPT 训练过程:

  • 第一步,收集 human demonstration data 进行 SFT。
  • 第二步,收集人类偏好数据,利用这些数据学习一个奖励模型。
  • 第三步,使用 reward model 进行强化学习,即 RLHF 训练。

最终就可以得到优化后的 InstructGPT 模型。

之后的 ChatGPT 总体训练流程概括为两个主要部分。

  • Pre-training :涉及使用大量数据,通过语言模型的训练方法来训练一个基础模型。
  • Post-training :InstructGPT 和 ChatGPT 所执行的步骤,即利用人类的标注数据或高质量的人类反馈数据进行后训练。

Post-training通常包括至少两个步骤:

  • 1)SFT 步骤,通过 human demonstration 的方法进行监督学习
  • 2)RLHF 步骤,通过 human preference data 的方法进行奖励学习

预训练与后训练之间也存在区别:

  • 数据方面:预训练和后训练在数据的质量和数量上存在差异。
    • 预训练阶段需要处理海量数据,这可能需要大量的计算资源和较长的时间。
    • 而在后训练部分,大量的数据是人类标注或通过某种方式构造出来的数据,数据质量通常较高,但与预训练阶段相比,数量会少很多。
  • 训练目标方面:
    • 预训练的目标是压缩,即 Next Token Prediction;
    • 后训练的目标是 instruction following。通过训练激发大模型的能力与智能,使模型 usable,能够遵从人类指令。
  • 训练过程方面 (dynamics):
    • 预训练通常是固定的,需要收集一个庞大的数据集进行训练,这些数据通常是静态的。
    • 对应 post-training,尤其是 RLHF ,其反馈是在线的,需要不断收集人的反馈,不断迭代,逐渐进化模型,这是一个动态的在线过程。

最后, post-training phase 也被称为对齐(alignment phase), 将 LLM 的能力和人类的偏好保持一致,希望大模型的输出能够满足人类的价值取向和意图,确保模型的输出与人类的偏好一致。

SFT < RLHF ?

【2024-8-23】RL 是 LLM 的新范式

为什么 RLHF 效果优于 SFT ?

PPO 算法提出者 John Schulman(曾在 OpenAI 工作,Berkeley PhD)于 2024 年 4 月回到 Berkeley 做过一场讲座,仔细讨论了 RLHF 与 PPO 的重要性,提出两个观点:

  • 第一, SFT 会导致幻觉 hallucination :
  • 第二, RLHF helps uncertainty awareness,让大模型“知道”自己“确实不知道”。

进一步完善, RLHF 过程三点好处:

  • 使用 负向反馈 进行对比学习,通过对比过程帮助模型降低幻觉 hallucination。
  • 强化学习不是一个固定的过程。允许模型随着能力的不断提升,通过不断地问问题、不断地给出答案、不断地评判,从而让模型不停地从当前能力的边界进行主动探索,并不断拓宽自己的能力边界。
  • 这两个因素共同作用能够形成 反事实推理 counter-factual reasoning 的作用,有可能解锁因果学习(causal learning)的巨大潜力,让模型具备更强的 reasoning 能力。
SFT 会导致幻觉

John Schulman 认为,大型模型之所以会产生幻觉,是因为 SFT 阶段学到了一些不正确的认知。

举例

  • 当 GPT-3 被要求 “ write a bio of AI researcher John Schulman”时,GPT 错误地输出:John 从 2009 年开始在 CMU 任职 associate professor,从 2012 年开始任职 professor。但是真实情况是,John 在完成 PHD 学位后就在 OpenAI 工作,并未在其他地方工作(注:最近John刚加入了Anthropic)。GPT-3 输出的内容与实际明显不符。

为何大型模型会生成这样的错误信息

  • 做一个思维实验:假设预训练阶段存在一个知识截断(knowledge cutoff)。比如,假设 ChatGPT 的所有知识和数据都截止于 2023 年。到 2024 年,希望通过 SFT 的方式 fine-tune ChatGPT,让它来描述 2024 年欧洲杯的情况。但因为 GPT 在预训练过程中没有任何关于 2024 年欧洲杯的信息,它自然也不知道西班牙是否夺冠,也不知道是否有进球等具体情况。

如果用现有的数据进行简单的 SFT,实际上 GPT 并不知道 2024 年发生了什么,但由于 SFT 的数据中包含了其他欧洲杯相关的问答数据,这些回答都是精准的,因此大模型可能会觉得,对于2024年欧洲杯的问题也应该给出一个准确答案才可以,但它本身可能在预训练阶段并没有掌握正确的信息,于是就鹦鹉学舌地说一些错误的内容。这种情况下,SFT 过强的监督信号导致人类实际上在引导 ChatGPT 说它不知道的东西。

另外还存在一种可能性,即 GPT 实际上知道答案,但提供标注的人员不知道。

  • 例如,如果问到 2022 年某场足球联赛的问题,标注人员可能不了解答案,而 GPT 反而可能知道。在这种情况下,标注人员可能会给出 “I don’t know ” 的人类反馈。这反倒可能导致 GPT 产生混淆,因为它明明知道答案却被要求说不知道。这两种原因综合来看就可能导致模型在经过 SFT 阶段后非常容易出现 hallucination 现象。

他人观点

  • SFT 确实容易导致幻觉,但不一定完全是预训练阶段数据的知识截断导致的,SFT也能学习新知识

问题:大模型在是否学会新知识?

存在一个非常微妙的边界。

  • 如果不提供数据,大模型就不能够提供答案;
  • 如果提供数据不完整,可能导致模型出现幻觉
  • 如果数据提供足够多,模型就可能会学会新知识

因此,到底给多少数据,很难判断,SFT 高质量数据集也是非常难构建的,这里就有一个非常不容易的数据挑战( a non-trivial data challenge for building a good SFT dataset)。

RLHF让大模型“知道”自己“确实不知道”

RLHF helps uncertainty awareness,让大模型“知道”自己“确实不知道”。

欧洲杯的例子

  • 如果大模型不知道 2024 年欧洲杯的情况,用户却让大模型去描述欧洲杯的情况(在2024年欧洲杯上哪位运动员有进球),那大模型就可能会产生幻觉,这是因为模型实际上并不了解 2024 年欧洲杯的具体事件但被 SFT 引导说一个貌似正确的回复。

RLHF 如何防止 hallucination 的出现?

  • 如果存在一个设计良好的奖励函数,情况就会不同。
  • 如果模型给出正确答案,就给予正向的奖励分数 1分;
  • 如果模型表示“我不知道”,就给予 0分;
  • 如果模型给出错误答案,则扣除分数 4分。

在这种情况下,如果模型不知道 2024 年发生了什么,在强化学习过程中无法提供正确的回答,选择“不知道”成为更合理的策略。

这种机制鼓励模型在不知道答案时能够提供“不知道”的回答。这种方式能帮助模型保留了一定的不确定性,使模型能够产生正确的自我认知,来判断是否真的知道一个问题的答案。
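
上述奖励设计可以写成一个玩具函数(分值沿用文中举例,仅说明“答错的惩罚重于不答”的思路):

def answer_reward(is_correct: bool, abstained: bool) -> float:
    if abstained:                  # 模型回答“我不知道”
        return 0.0
    return 1.0 if is_correct else -4.0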

他人观点

  • 基本正确,尽管 John 解释可能不完全准确
  • RLHF 所带来的不仅仅是处理知识边界的不确定性的能力(not only handle the knowledge cut off problem)
RLHF 提高了模型推理能力

RLHF 过程不仅帮助模型意识到不确定性,更重要的事情是 RLHF 帮助模型提高了 reasoning 能力。

相关性不代表因果性。大家会希望大模型掌握因果性,而不希望仅仅看到相关性

因果性指什么?

  • 传统统计学习里面有一个判断因果性的过程,叫 反事实推理 counter-factual reasoning。

是否可以舍弃 online attempt

问题:

  • 模型训练中利用 negative signal 和 online exploration 这两件事上,是否可以舍弃 online attempt?即只靠正反馈与负反馈是否足够,而不需要模型持续在线尝试。只通过对比学习(contrastive learning),在 SFT 上加上负向案例,能否达到预期效果?

可以, DPO(Direct Preference Optimization,直接偏好优化)

  • 它与 PPO 算法的主要区别: DPO 去除了在线尝试的部分。 DPO 算法其实很简单,基本遵从了SFT训练流程,但是在收集正例之外还会收集负例,对于每一个 prompt 都要求标注员提供好的和坏的两个答案。对于好的答案提升概率,对于坏的答案则是让模型“不说”。

DPO 算法是否能达到与 PPO 效果?

如果仅仅通过静态数据来覆盖 LLM 所有可能的输出,是非常困难的。因此,在线探索加上及时的奖励反馈,是一种更高效地让 LLM 学会说正确答案的方法。

结论

  • 如果能够实现 PPO 算法,PPO 效果将会远远超过 DPO。因为, 正例反例和在线探索两件事都非常重要。
  • 用 PPO 和 Code Llama 在 Coding Contest 上做了测试,发现使用开源模型加上 PPO 可以比 AlphaCode 这样的闭源模型在很难的 CodeForce 竞赛题上通过率提高 6%。这是一个纯开源模型加 RLHF 的尝试,并未添加任何新的数据。在这种很难的、需要强调 reasoning 能力的任务上,DPO 完全没有效果。

PPO RLHF 框架有哪些挑战?

PPO 通常涉及四个模型:actor(策略模型)、critic(价值模型,即 value network)、reward model 和 reference model。

  • 不同模型还有不同依赖,也就是前后依赖关系;
  • 不同模型也有不同吞吐量,比如,actor 是一个传统的大模型,需要输出所有 response,而 critic 则只需要做评分。评分的吞吐量会远小于需要输出 response 的模型。

因此,不同模块的计算量存在显著差异。将这四个模块 scale up,并且做好算力平衡是具有挑战的。

挑战

  • 算法: PPO RLHF 算法流程相对复杂
    • 算法、流程都相对麻烦,多了很多流程。不仅需要正反馈、负反馈、需要奖励模型,并且涉及在线探索过程。
    • 建议: 要 advantage normalization、需要一个大的 training batch;reference model 需要 moving average 等。
  • 系统: 强化学习训练系统与传统的 SFT 训练系统不太一样
    • SFT 或 DPO 模型通常只包含一个 policy 模型,只需将数据输入语言模型即可,其训练逻辑相对简单。然而,对于强化学习,或者对于 PPO RLHF,情况则更为复杂。
  • 数据: 数据非常重要
    • RLHF 数据包括两部分:一是 prompt,即人写的 instruction。二是指模型的 responses。这两部分都相当复杂

PPO RLHF 面临的挑战主要分为算法、系统和数据三个方面:

  1. 算法层面:关键在于如何稳定训练过程,并调整算法的细节以提高性能。
  2. 系统设计:由于强化学习 PPO,RLHF 的计算流程非常复杂,系统设计需要提高整体的训练效率。
  3. 数据:数据分为两部分,一部分是 prompt,一部分是 response。两部分都很关键,只有将它们结合起来,才能形成一个完整的,比较成功的 PPO RLHF 的 training process。

【2024-8-23】RL 是 LLM 的新范式

训练数据

【2024-9-11】大模型数据基础:预训练阶段数据详解

  • 预训练数据集组成
  • 1 通用预训练数据集
    • 1.1 网页
    • 1.2 语言文本
    • 1.3 书籍
    • 1.4 学术材料
    • 1.5 代码
    • 1.6 平行语料库
    • 1.7 社交媒体数据
    • 1.8 百科全书
    • 1.9 多类别数据
  • 2 特定领域预训练数据集
  • 预训练数据处理步骤
    • 1 数据收集
    • 2 数据过滤
      • 2.1 基于模型的方法
      • 2.2 基于启发式的方法
    • 3 数据去重
    • 4 数据标准化
    • 5 数据审核
  • 预训练数据整体分布现状及分析

预处理通常包括五个步骤:数据收集、数据过滤、数据去重、数据标准化、数据审核(详见上文目录)。

【2024-5-23】再聊多轮对话微调训练格式与长序列训练

3个阶段的数据集格式: 增量预训练、单轮对话、多轮对话

  • 增量预训练数据集:提升模型在特定领域任务的能力。
  • 单轮对话和多轮对话数据集:用于指令微调(instruction tuning)阶段,以提升模型回复特定指令的能力。

指令微调阶段目标:训练语言模型根据人类指令给出回答。一般只有回答部分(Output)的 loss 会用于梯度回传,而指令部分(System、Input)部分的 loss 则不会用于权重更新。

数据集进行预处理时引入 “system”、”input” 和 “output” 三个字段

  • “system”、”input” 字段用于保存不需要计算 loss 的文本,如 系统或用户指令
  • 而 “output” 字段则用于保存 需要计算 loss 的文本,如 输入指令对应的 GroundTruth 回答。

数据量

资源受限时,模型训练应该用多少数据?

  • 预训练: 参考 缩放定律 ( scaling law)
  • 微调: 如下文

【2024-7-29】大型语言模型高效微调策略,通过实验发现少量数据即可显著提升特定任务性能,并提出一种基于早期模型表现的贝叶斯超参数优化方法,有效预测最终模型效果,为资源节约型的LLM微调提供新途径。

数据效率研究

研究目标:找到模型性能与数据量之间的最佳平衡点,从而优化资源利用。

  • 虽然小型数据集显著改进效果,但是必须仔细考虑训练数据中属性分布,确保模型在所有目标变量上的全面表现。
  • 另外可探索数据增强技术或不同的采样策略,增强模型性能,特别是针对那些出现频率较低的属性。

数据量对模型效果影响

  • 200 (显著提升18pp) -> 1000 (放缓) -> 6500 (平衡点过后,收益减少)

详情

  • (1)快速初始改进:
    • 200个样本(相当于大约100个网页),模型准确率从70%显著提升至88%。—— 即使是相对较小的数据集也能带来显著的性能提升。
  • (2)收益递减:
    • 达到1,000个样本后,准确率提升速度放缓,大部分性能增益在这个数据量水平就已经实现。
  • (3)属性特定趋势:
    • 后期准确率提升主要由一个特定属性类型(如产品评分)所驱动。这一属性在数据集中出现的频率较低,只在大约25%的产品详情页面中出现。
  • (4)性能瓶颈:
    • 大约6,500个样本时,模型达到最大性能,这表明存在一个“最佳点”,在此之后,更多数据带来的收益逐渐减少。
  • (5)战略数据采样重要性:
    • 即使小数据集也能显著提升模型性能,但要确保所有目标变量在训练数据中的分布均衡,以实现全面的模型表现。

超参数优化

通过采用贝叶斯(Bayesian)优化并结合早期模型性能评估,可显著提高大型语言模型微调的效率和效果,减少计算成本,同时确保高最终准确率。

  • 首先,使用一系列超参数进行LoRA微调。
  • 然后,训练过程早期阶段,使用模型评估验证集上的准确率。
  • 接着,将超参数配置及准确率添加到结果池中。
  • 最后,运用Bayesian优化算法,基于结果池生成下一组超参数。

(1)超参数优化目标

  • 寻找最优超参数集:找到一组能最大化模型在验证集上性能指标(如准确率)的超参数集合。
  • 预测最终性能:最大化早期训练阶段与最终训练阶段之间模型性能的相关性,以便通过早期表现预测最终模型的质量。

(2)方法论

  • Bayesian优化:采用Bayesian优化算法智能地探索超参数空间,平衡探索(exploration)和利用(exploitation),通过构建代理模型(surrogate model)预测不同超参数设置下的模型性能。
  • LoRA微调:首先使用一组超参数进行LoRA(Low-Rank Adaptation)微调,然后在训练过程的早期阶段评估模型性能。
  • 迭代优化:保存超参数配置及其对应的性能值,然后使用Bayesian优化算法更新代理模型,建议下一步要评估的超参数配置。

训练早期阶段的模型性能与最终阶段的性能具有强烈的正相关性: 早期评估可有效地预测模型质量。
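
这一流程可以用 Bayesian 优化库(如 optuna)来示意(搜索空间与其中的训练/评估函数 train_lora_and_eval_early 均为假设,仅展示迭代框架):

import optuna

def objective(trial):
    lr = trial.suggest_float("learning_rate", 1e-6, 1e-4, log=True)
    lora_r = trial.suggest_categorical("lora_r", [4, 8, 16, 32])
    # 1) 用 (lr, lora_r) 做 LoRA 微调,只训练到早期阶段(如 10% 的 steps)
    # 2) 在验证集上评估早期准确率,作为最终效果的代理指标
    early_val_acc = train_lora_and_eval_early(lr, lora_r)   # 假设的训练 + 早期评估函数
    return early_val_acc

study = optuna.create_study(direction="maximize")  # 内部用代理模型平衡探索(exploration)与利用(exploitation)
study.optimize(objective, n_trials=20)
print(study.best_params)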

数据配比

引入大量行业数据,模型怎么反而变弱了? 参考

  • 对一个回答问题能力不错的模型,用大量数据做指令微调后,模型不会回答问题了。

原因:

  • 数据配比
  • 数据差异过大

大模型可能在训练过程中过度专注于垂类数据,导致 loss 收敛不再依赖全局而是从部分数据进行考虑。

贝壳论文中,比较好的结果:

  • 开源数据集:垂域数据集 = 4:1, 即开源占比总体训练数据的80%,而垂类数据仅占20%。
  • 《垂域大模型训练》

对 continue pretraining, 如果要让模型不丢失通用能力,比如 summarization,qa 等

(1) 领域数据 continue pretraining 时,一定要混合大量通用数据。

  • 「领域数据比例要在15%以下」
    • 一旦超过这个阈值,模型通用能力会下降很明显。
  • 这个阈值和不同的预训练模型相关,有些模型比如llama需要控制的阈值更低。

阈值其实是经验主义结论,范围都在 10%-15% 左右。

  • 而且阈值和预训练模型的大小,预训练时原始数据的比例等条件都息息相关,需要在实践中反复修正。

(2) sft 比例可提高不少

  • 领域数据:通用数据=1:1
  • 如果sft数据量少,混不混数据差别就不太大了。

统一格式

统一增量预训练、单轮对话、多轮对话三种数据集格式

[{
    "conversation":[
        {
            "system": "xxx",
            "input": "xxx",
            "output": "xxx"
        }
    ]
},
{
    "conversation":[
        {
            "system": "xxx",
            "input": "xxx",
            "output": "xxx"
        },
        {
            "input": "xxx",
            "output": "xxx"
        }
    ]
}]

训练过程中,将一条数据中 多组 “system”、”input” 和 “output” 进行拼接,之后输入模型,并行计算每个位置的 loss ,但只有 “output” 部分对应的 loss 参与梯度回传

<BOS><EOS>表示句子或文本的开始和结束

图解

增量预训练

增量预训练旨在帮助模型学习针对特定下游任务的语言知识和表达能力,因此数据集的全部内容对应的 loss 都应该用于梯度回传。

因此,数据集的 “system”、”input” 为空,而 “output” 为一整条语料数据。

[{
    "conversation":[
        {
            "system": "",
            "input": "",
            "output": "I am an artificial intelligence (AI) assistant named Puyu. I was created by the Shanghai AI Laboratory and my purpose is to assist users with various tasks through natural language processing technology."
        }
    ]
},
{
    "conversation":[
        {
            "system": "",
            "input": "",
            "output": "I am an artificial intelligence programmed to assist with various types of tasks, including answering questions, providing information, and performing automated processes."
        }
    ]
}]

单轮数据

单轮对话数据集由1条指令(或问题)及其对应 GroundTruth 回答组成。

由于只有回答部分需要对 loss 进行回传,因此数据集的 “system”、”input” 字段为输入指令,”output” 字段为对应回答

[{
    "conversation":[
        {
            "system": "You are an AI asssistant."
            "input": "Give three tips for staying healthy.",
            "output": "1.Eat a balanced diet. 2. Exercise regularly. 3. Get enough sleep."
        }
    ]
},
{
    "conversation":[
        {
            "system": "You are an AI asssistant."
            "input": "How to study English?",
            "output": "1. Set clear goals. 2. Create a study plan. 3. Build vocabulary. 4. Practice speaking."
        }
    ]
}]

多轮数据

多轮对话数据集往往由多轮指令(或问题)+ 对应 GroundTruth 回答组成。

假设有一条多轮对话数据,内容如下。

对于第 n 轮对话,将 User 和 Assistant 对应的输出设为 UserN 和 AssistantN。

System: You are an AI assistant.
User1: Hello?
Assistant1: Hello! How can I help you?
User2: What's the date today?
Assistant2: Today is Monday, August 14, 2023.
User3: Thank you!
Assistant3: You are welcome.

如何使用上述这条多轮对话数据训练大模型?目前有两个主流方法。

  • 方法 1
    • System、User1、Assistant1、User2、Assistant2、User3 文本都视为模型的输入部分,将 Assistant3 的文本视为模型的预测部分,只有 Assistant3 部分的 loss 参与权重更新。
    • 弊端在于没有充分利用多轮对话的训练数据,因为 Assistant1 和 Assistant2 的内容没有参与模型训练,导致训练数据利用率较低。
  • 方法 2
    • 将1条多轮对话数据拆分成多条数据。如将以上示例拆分成如下三条数据。
    • 相比于方法1,方法2可以充分利用每一轮对话的数据,但需要将一条包含 n 轮对话的数据拆分为 n 条数据,训练效率约降为原来的 1/n
  • 方法 3
    • XTuner 训练多轮对话模型时,采取了一种更加充分高效的方法。
    • 将多轮对话进行拼接,之后输入模型,并行计算每个位置的 loss,而只有 Output 部分的 loss 参与回传。
[{
    "conversation":[
        {
            "system": "You are an AI asssistant."
            "input": "Hello?",
            "output": "Hello! How can I help you?"
        },
        {
            "input": "What's the date today?",
            "output": "Today is Monday, August 14, 2023."
        },
        {
            "input": "Thank you!",
            "output": "You are welcome."
        }
    ]
},
{
    "conversation":[
        {
            "system": "You are an AI asssistant."
            "input": "Hello?",
            "output": "Hello! How can I help you?"
        },
        {
            "input": "How's the weather today in Rosso?",
            "output": "The weather in Rosso on Wednesday, August 16th, is going to be cloudy for most of the day, together with moderate rain around noon."
        },
        {
            "input": "Thank you!",
            "output": "You are welcome."
        }
    ]
}]

数据集中的 “conversation” 键对应的值是一个列表,用于保存每一轮对话的指令和实际回答(GroundTruth)。为了保持格式统一,增量预训练数据集和单轮对话数据集中的 “conversation” 键也对应一个列表,只不过该列表的长度为 1。而在多轮对话数据集中,”conversation” 列表的长度为 n,以容纳 n 轮的对话内容。
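
把上述 conversation 格式转成 (input_ids, labels) 的过程可以用如下极简代码示意(拼接方式与 special token 为假设,实际需按所用模型的对话模板组织;多轮时每个 output 都参与 loss):

def encode_conversation(tokenizer, conversation, max_length=2048):
    input_ids, labels = [], []
    for turn in conversation:
        prompt = turn.get("system", "") + turn["input"]
        prompt_ids = tokenizer(prompt, add_special_tokens=False)["input_ids"]
        output_ids = tokenizer(turn["output"], add_special_tokens=False)["input_ids"]
        input_ids += prompt_ids + output_ids + [tokenizer.eos_token_id]
        labels += [-100] * len(prompt_ids) + output_ids + [tokenizer.eos_token_id]  # 只有 output 部分计算 loss
    return {"input_ids": input_ids[:max_length], "labels": labels[:max_length]}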

LLMs 数据格式汇总

各类LLM数据格式汇总: chat_template

不同模型在是否存在默认 system message上, 有所不同(大多数模型都是没有的)。

每个模型都附上了有system版本和无system版本,如果在训练模型时希望加上system message, 可以参照template模板自行添加。

Qwen

官方默认 system message 即:You are a helpful assistant

<|im_start|>system
You are a helpful assistant<|im_end|>
<|im_start|>user
This is a instruction<|im_end|>
<|im_start|>assistant
This is a answer<|im_end|>

Yi

官方版本没有默认 system message,可以像 llama 一样不加 system message 使用;带 system 的模板如下

<|im_start|>system
This is a system message<|im_end|>
<|im_start|>user
This is a instruction<|im_end|>
<|im_start|>assistant
This is a answer<|im_end|>

无system模式

<|im_start|>user
This is a instruction<|im_end|>
<|im_start|>assistant
This is a answer<|im_end|>

Gemma

官方版本不支持system

无system模式

<bos><start_of_turn>user
This is a instruction<end_of_turn>
<start_of_turn>model
This is a answer<end_of_turn>

Phi-3

官方版本没有默认的system message, 有此需求可依据下述模板自己构建

<s><|system|>
This is a system message<|end|>
<|user|>
This is a instruction<|end|>
<|assistant|>
This is a answer<|end|>

无system模式

<s><|user|>
This is a instruction<|end|>
<|assistant|>
This is a answer<|end|>

Deepseek

官方同样没有提供默认system message,有此需求可依据下述模板自己构建

<beginofsentence>This is a system message
User:This is a instruction
Assistant:This is a answer<endofsentence>

无system模式

<beginofsentence>User:This is a instruction
Assistant:This is a answer<endofsentence>

Mistral

没有提供system模式

无system模式

<s>[INST]:This is a instruction [/INST]This is a answer</s>

Llama2

<s>[INST] <<SYS>>
You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe.  Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature.

If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.
<</SYS>>

There's a llama in my garden 😱 What should I do? [/INST] This is a answer</s>

Llama3&3.1

<|begin_of_text|><|start_header_id|>system<|end_header_id|>

This is a system prompt.<|eot_id|><|start_header_id|>user<|end_header_id|>

This is the first user input.<|eot_id|><|start_header_id|>assistant<|end_header_id|>

This is the first assistant response.<|eot_id|>

MiniCPM

<用户>This is a instruction<AI>This is a answer</s>

DeepSeek-coder

<beginofsentence>User: {user_message_1}

Assistant: {assistant_message_1}<endofsentence>User: {user_message_2}

Assistant:

也可以添加可选的 system message,模板如下:

<beginofsentence>{system_message}

User: {user_message_1}

Assistant: {assistant_message_1}<endofsentence>User: {user_message_2}

Assistant:

ChatGPT 三步走

InstructGPT 分为如下三大步:

  • SFT:生成模型GPT的有监督精调 (supervised fine-tuning)
  • RM奖励模型的训练(reward model training)
  • PPO近端策略优化模型( reinforcement learning via proximal policy optimization)

SFT(supervised fine-tuning) 核心还是依赖大量有监督的 Prompt 数据

  • GPT模型通过有监督Prompt数据进行精调,即 next token prediction 任务(NTP)。
  • 然后用精调后的模型对每个输入的 < 文本+prompt > 进行 generate,生成4~9个输出,并且进行解码操作。
  • SFT流程图

【2023-11-20】transformers_tasks GPT-2 和 RLHF 示例

【2023-9-11】Understanding and Using Supervised Fine-Tuning (SFT) for Language Models

GPT 训练流程

【2023-5-23】Andrej Karpathy 在微软Build 2023开发者大会上进行了主题演讲:State of GPT(GPT的现状)

模型训练分为四个阶段:预训练(Pretraining)、监督微调(Supervised Finetuning)、奖励建模(Reward Modeling)、以及强化学习(Reinforcement Learning)。

  • 数据量:预训练阶段所需的数据量很大,但质量要求不高;而后面的三个阶段恰恰相反,需要的数据质量较高。
  • 训练方法:预训练和监督微调的训练方法相同,都是预测下一个单词。奖励模型和强化学习的训练方法则不同。奖励模型是二元分类学习,而强化学习则鼓励模型生成奖励模型评分较高的回答。
  • 训练所需资源:预训练阶段的资源消耗巨大,使用数千颗GPU,花费数月时间,占总训练时间的99%。后面的三个阶段只需使用数十颗GPU,训练时间约数天。

预训练阶段的资源消耗如此巨大,只有大厂才有能力进行。如果资源有限,我们应将重心放在后三个阶段

ChatGPT流程

InstructGPT和instruction tuning方向的工作比较相关,独特之处在于继承了之前工作的风格——对齐人类偏好。与之前摘要任务相比,instructGPT的prompt分布更多样和复杂。

【2023-5-1】 ChatGPT训练三步流程

  • AC架构中,Actor(学习策略)和Critic(学习价值)是两个模型,训练过程中参数都是变动的
  • PPO基于A2C算法(同步优势更新的AC),经验回放过程中更新参数
  • Critic 与 RM 结构相同:InstructGPT 论文中 RM 用的是 6B GPT-3,Critic 通常由 RM 初始化;Critic 的目标不只是拟合 RM,还要配合、监督 actor 同步更新
  • RLHF训练过程中涉及4个模型:actor、critic、rm和sft,后两者冻结,前两个持续更新
  • PPO 的优化目标组成:RM 打分 - β·(与 SFT 模型输出的 KL 差异),即鼓励打分高且与 SFT 输出差异小(概念示意见下方代码)
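
上面“打分高且与 SFT 差异小”的奖励组合,常见的按 token 写法可示意如下(符号与系数为概念性假设,RM 打分通常只加在句末 token 上):

def token_reward(rm_score, logprob_actor, logprob_sft, beta=0.1, is_last_token=False):
    kl_penalty = beta * (logprob_actor - logprob_sft)          # 与 SFT(reference)模型的差异惩罚
    return (rm_score if is_last_token else 0.0) - kl_penalty   # 只有最后一个 token 叠加 RM 打分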

备注

  1. SFT数据集和RM数据集的prompts来自于API和标注人员编写
    • SFT数据集的回答是标注人员写的;
    • PPO数据集来自于API
  2. Prompts的任务类型包括生成、QA、脑暴、聊天、改写、摘要、分类、抽取等任务。
  3. RM模型用了GPT-3 6B,训练方法和之前摘要任务一样。
  4. Policy增加一个LM pretrain objective,可以修复alignment tax,让RL policy在公开NLP数据集上表现也很好

优化

全参数微调:对模型所有参数进行调整,如:SFT

  • 问题:代价大(模型大、数据多、参数量大)

混合精度微调:省显存、加速,但丢失精度

  • 训练时同时使用16位32位浮点类型,加速,减少内存开销
  • 部分参数使用32位类型,以保持数值稳定性,缩短单步用时
  • 现代加速器使用16位专用硬件执行运算,速度更快

注意

  • FP16: 内存存储、乘法运算
  • FP32: 累加运算,避免下溢
  • 训练时对 loss 进行放大(loss scaling),反向传播后再等比还原梯度,避免 FP16 计算时出现梯度下溢(示意见下方代码)
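
PyTorch 自带的 AMP 可以直接实现上述流程(dataloader / model / optimizer 假设已在上下文中定义):

import torch

scaler = torch.cuda.amp.GradScaler()
for batch in dataloader:
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():            # 前向与矩阵乘法使用 FP16
        loss = model(**batch).loss
    scaler.scale(loss).backward()              # 放大 loss,避免 FP16 梯度下溢
    scaler.step(optimizer)                     # 以 FP32 累加并更新参数
    scaler.update()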

其它加速方法

  • 多卡并行:数据并行、模型并行
  • ZeRO:分布式训练的内存优化技术,将 optimizer state、gradient、parameter 三类状态分片到不同的卡上(对应 Stage 1/2/3),相比纯数据并行可大幅节省显存
  • p-tuning:通过prompt encoder结构将prompt编码为向量,再与input embedding拼接。增加模型理解能力
  • LoRA:低秩适配,冻结大模型权重,只训练新增的网络层(两个小矩阵的乘积),降低fine-tune成本,同时保持类似效果
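
以 LoRA 为例,用 peft 库的配置示意如下(target_modules 取值依具体模型结构而定,此处为常见假设):

from peft import LoraConfig, get_peft_model

lora_config = LoraConfig(
    r=8,                                  # 低秩矩阵的秩
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # 假设模型注意力层的投影命名如此
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)    # 冻结原权重,只训练新增的低秩矩阵
model.print_trainable_parameters()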

(0) Pre-Train

问题

【2024-7-28】面试LLM//各阶段

CLS token

预训练阶段:

  • 模型训练句子时, 没有加 <CLS> Token,但是预测时加了<CLS> Token
  • 或者训练时加了<CLS> token, 但是预测时没有加<CLS> Token

benchmark 预测会有啥问题?

benchmark会直接崩溃,之前gemma-2b训的时候带BOS,预测忘加了,benchmark全崩了。

原因

  • 句子的第一个 Token(如 BOS/CLS)会吸收大量 attention。如果训练与预测时第一个 Token 不一致,注意力分布会发生明显变化,后续位置的预测随之大幅偏移,因此预测时各项 benchmark 的效果都会明显变差。
三个阶段训练(SFT->RM->PPO)过程较长,更新迭代较慢?

考虑以下几种方法:

  • 并行化训练:利用多个计算资源进行并行化训练,可以加速整个训练过程。可以通过使用多个CPU核心或GPU来并行处理不同的训练任务,从而提高训练的效率和速度。
  • 分布式训练:将训练任务分发到多台机器或多个节点上进行分布式训练。通过将模型和数据分布在多个节点上,并进行并行计算和通信,可以加快训练的速度和更新的迭代。
  • 优化算法改进:针对每个阶段的训练过程,可以考虑改进优化算法来加速更新迭代。例如,在SFT(Supervised Fine-Tuning)阶段,可以使用更高效的优化算法,如自适应学习率方法(Adaptive Learning Rate)或者剪枝技术来减少模型参数;在RM(Reward Modeling)阶段,可以使用更快速的模型训练算法,如快速梯度法(Fast Gradient Method)等;在PPO(Proximal Policy Optimization)阶段,可以考虑使用更高效的采样和优化方法,如并行采样、多步采样等。
  • 迁移学习和预训练:利用迁移学习和预训练技术,可以利用已有的模型或数据进行初始化或预训练,从而加速训练过程。通过将已有模型的参数或特征迁移到目标模型中,可以减少目标模型的训练时间和样本需求。
  • 参数调优和超参数搜索:对于每个阶段的训练过程,可以进行参数调优和超参数搜索,以找到更好的参数设置和配置。通过系统地尝试不同的参数组合和算法设定,可以找到更快速和高效的训练方式。

综合运用上述方法,可以加速三个阶段训练过程,提高更新迭代的速度和效率,从而减少训练时间和资源消耗。

(1) 第一步 SFT(全参数微调)

SFT 原理比较简单,难的是数据问题,需要大量的有监督Prompt文本

  • Transformer【左】GPT【右】

大模型训练基座模型时,都采用「Next Token Prediction,NTP」 任务

【2024-5-31】sft 分为两种:拟合与对齐

  • 拟合:通过finetuning 得到稳定、符合需求的输出,包括格式、风格、特定模式等,是在业务落地中高频使用的方式;
  • 对齐:指令对齐,让LLM更好地理解人类语言、执行自然语言指令,即LLM三个阶段之第二个阶段(pretrain、sft、rlhf)。

loss 改进

【2024-9-24】SFT loss 计算的那些坑(多轮合并/packing)

SFT 训练时, 直接输入 (input_ids, label), 训练效率低。

通常有两个加速方法:

  1. 多轮合并: 同一个会话的拆分、合并
    • user 和 bot 交互了 3 轮, 数据格式: bot作答部分用 input_ids, 其余用 -100 表示
    • (system, user1, bot1, pad), bot1 计算loss
    • (system, user1, bot1, user2, bot2, pad), bot2 计算loss
    • (system, user1, bot1, user2, bot2, user3, bot3), bot3 计算loss
    • loss 表达式: loss = 1/3 * (l1/n1 + l2/n2 + l3/n3), 其中 ni 是 boti 的 token 数, li 是第 i 个样本的 loss
    • 不同样本之间有很多重复计算的前缀, 训练偏慢
  2. 加速
    • 将3个样本合成1个, 借助 causal attention mask,每个 token 只能看到前面的 token,计算上和之前是等价
    • 数据格式: (system, user1, bot1, user2, bot2, user3, bot3), 对应权重 li/ni
    • 问题: loss 计算有问题, pytorch CrossEntropyLoss 默认取均值 mean, loss = (l1+l2+l3)/(n1+n2+n3), 而 ni 不一定相同, 导致 短句子权重被降低, 长句子被加权, loss 不等价
  3. packing: 将多个会话合成一条, 进一步加速
    • 将所有样本拼接成1条,并加入 attention mask, 保证后面的样本看不见前面的token。如 在 flash attention 中调用 flash_attn_varlen_qkvpacked_func,并传入 cu_seqlens 参数。
    • 和之前一样,如果不修改 loss 计算方法,packing 的样本之间会存在因为长度不同,导致训练不充分的问题。

loss 计算会经历三次平均

  • micro batch 维度,分母是这个 micro batch 中的所有 label 不是 -100 的 token 数
  • DP 维度,分母是 DP size (和GPU数量相关)
  • 梯度累加维度,分母是梯度累加数

禁用这三个平均,统一用 global batch 对话轮数作为分母。

  • 新版 megatron 框架中,开启开关 --calculate-per-token-loss, 即可禁用 DP 和梯度累加的平均
  • 然后 修改 loss_func,每个 micro batch 都需要返回这个 micro batch 的轮数
  • 最后 框架会自动将所有轮数求和,作为分母。对于分子,需要除以这个轮次的token 数。

正确实现代码如下(loss_token_num, turn_num 是在构建 data 的时候构建的):

def loss_func(output_tensor, loss_mask, loss_token_num, turn_num):
    losses = output_tensor.view(-1).float()
    loss_mask = loss_mask.view(-1).float()
    loss_token_num = loss_token_num.view(-1).float()
    # label: [-100, -100, a, a, a, -100, b, b, -100, -100, c, c, c, -100, -100]
    # loss_mask: [0, 0, 1, 1, 1, 0, 1, 1, 0, 0, 1, 1, 1, 0, 0]
    # losses: [a0, a1, a2, a3, a4, b0, b1, b2, c0, c1, c2, c3, c4, d0, d1]
    # losses * loss_mask = [0, 0, a2, a3, a4, 0, b1, b2, 0, 0, c2, c3, c4, 0, 0]
    # loss_token_num: [3, 3, 3, 3, 3, 2, 2, 2, 3, 3, 3, 3, 3, 1, 1]
    # losses * loss_mask / loss_token_num = [0, 0, a2/3, a3/3, a4/3, 0, b1/2, b2/2, 0, 0, c2/3, c3/3, c4/3, 0, 0]
    # sum = 1/3 (a2 + a3 + a4) + 1/2 (b1 + b2) + 1/3 (c2 + c3 + c4)
    loss = torch.sum(losses * loss_mask / loss_token_num)

    loss_and_turn_num = torch.cat([loss.view(1), turn_num.view(1)])
    # Reduce loss for logging.
    loss_and_turn_num = loss_and_turn_num.clone().detach()
    torch.distributed.all_reduce(loss_and_turn_num, group=mpu.get_data_parallel_group())
    # 新版返回结构,开启 calculate_per_token_loss 开关后,返回三个值
    # 第一个是反向传播实际使用的 loss, 所有 packing 的 loss 求和
    # 第二个是 turn_num, 优化器状态更新时会使用对这个值求和然后缩放梯度
    # 第三个是用于日志打印的 loss, 包含两个值,第一个是所有 loss 求和作为分子,第二个是所有 turn_num 求和作为分母
    return loss, turn_num, {"lm loss": (loss_and_turn_num[0], loss_and_turn_num[1])}

无论是哪种方法,加速后都需要保证 loss 和原来等价。

加速注意:

  • 不同样本之间等价;
  • 不同轮次之间等价。

合并多轮 / packing 时,要修改 loss 计算方法,为每个 token 设置正确权重,并且关闭 DP / 梯度累加的平均。

IFT 问题

5 月,伯克利的论文 The False Promise of Imitating Proprietary LLMs 指出这种方式微调出来的指令遵循模型存在的一系列问题:

  • 在缺少大量模仿 ChatGPT 数据支持的任务上,这类模型无法改善 Base Model 到 ChatGPT 的差距;
  • 这类模型只是擅长模仿 ChatGPT 的风格,而不是事实性,导致实际的性能差异会骗过人类评估者;
  • 当前开源模型最大的限制仍然是 Base Model 层面跟 GPT 系列的差距,在微调而不是预训练环境进行优化可能是不正确的方向;
  • 为了广泛地匹配 ChatGPT 支持的任务,需要更广泛和大量的模仿数据集,还需要新的工作;

而 6 月份 Allen Institute for AI 和华盛顿大学的 How Far Can Camels GO ?工作再次通过实验表明不同的指令微调数据集可以释放或者增强特定的能力,但并没有一个数据集或者组合可以在所有的评估中提供最佳性能,并且这一点在人类或模型担任评估者时也很容易无法被揭示。

对于指令遵循微调背后的团队来说,他们也意识到自己的模型由于 Base Model(LLaMA)的限制,在复杂推理和代码任务上很弱,并且难以进入正向数据飞轮 —— 模型能力越弱的领域越难得到更多的 query,也就难以筛选出高质量 query,想自己再标注提升模型能力就很困难。

至此,开源社区已经充分意识到原来这套微调 LLaMA 的框架的局限性,越来越多的团队开始探索预训练环节和更接近真实的人类反馈数据

数据示例

数据准备

Raw Data | Prompt | Label
我们去成都旅游,必须要去的地方是大熊猫繁殖基地。 | 大熊猫是 | 一种有黑白斑纹的动物。
我们去成都旅游,必须要去的地方是大熊猫繁殖基地。 | 大熊猫是 | 中国特有种,主要栖息地是中国四川、陕西和甘肃的山区。
我们去成都旅游,必须要去的地方是大熊猫繁殖基地。 | 大熊猫是 | 已在地球上生存了至少800万年,被誉为“活化石”和“中国国宝”即国兽,世界自然基金会的形象大使,是世界生物多样性保护的旗舰物种。
我们去成都旅游,必须要去的地方是大熊猫繁殖基地。 | 大熊猫是 | 属于熊科、大熊猫属的哺乳动物。仅有二个亚种。雄性个体稍大于雌性。体型肥硕似熊、丰腴富态,头圆尾短,头躯长1.2-1.8米,尾长10-12厘米。
raw_data = "我们去成都旅游,必须要去的地方是大熊猫繁殖基地。"
prompt = "大熊猫是"
labels = ["一种有黑白斑纹的动物。","中国特有种,主要栖息地是中国四川、陕西和甘肃的山区。",
"已在地球上生存了至少800万年,被誉为“活化石”和“中国国宝”即国兽,世界自然基金会的形象大使,是世界生物多样性保护的旗舰物种。",
"属于熊科、大熊猫属的哺乳动物。仅有二个亚种。雄性个体稍大于雌性。体型肥硕似熊、丰腴富态,头圆尾短,头躯长1.2-1.8米,尾长10-12厘米。"]
combine_data = [raw_data+prompt+label for label in labels]

初始化模型,对输入数据进行编码, 以 GPT-2 模型为例

import torch
from torch.utils.data import Dataset
from transformers import Trainer, TrainingArguments
from transformers import AutoTokenizer, AutoModelForCausalLM
# 模型加载(以本地 GPT-2 权重目录为例)
tokenizer = AutoTokenizer.from_pretrained('pre_train_model/gpt2')
model = AutoModelForCausalLM.from_pretrained('pre_train_model/gpt2')
# 自定义DataSet类
class Datasets(Dataset):
    def __init__(self, sample):
        super(Datasets, self).__init__()
        self.sample = sample

    def __getitem__(self, item):
        res = {k: v[item] for k, v in self.sample.items()}
        return res

    def __len__(self):
        return len(self.sample['labels'])
# 数据转换
combine_data_token = tokenizer.batch_encode_plus(
    combine_data,
    max_length=256,
    padding='max_length',
    truncation=True,
    return_tensors='pt'
)
# 将标签加入, 并把 padding 位置(id 为 0)的 label 置为 -100, 不参与 loss 计算
combine_data_token['labels'] = combine_data_token['input_ids']
combine_data_token['labels'] = torch.where(
    combine_data_token['labels']==0,
    -100,
    combine_data_token['labels']
)
# 模型训练保存
trainer_args = TrainingArguments("./model/", learning_rate=2e-5, weight_decay=0.01, num_train_epochs=10, auto_find_batch_size=True)
trainer = Trainer(model=model, args=trainer_args, train_dataset=Datasets(combine_data_token))
trainer.train()
trainer.save_model()
# ----- 加载生成 --------
# 加载模型
model = AutoModelForCausalLM.from_pretrained('./model')
# 处理输入数据
input_data = raw_data + prompt  # 原始文本 + prompt 作为生成输入
input_datas = tokenizer.encode_plus(
    input_data,
    return_tensors='pt'
)
input_ids = input_datas['input_ids']
# 模型生成
result = model.generate(
    input_ids=input_ids,
    max_length=256,
    do_sample=True,  # 增加随机性
    num_beams=5,
    num_return_sequences=5,  # 每个样本生成5个结果
    no_repeat_ngram_size=3,  # 防止重复的token
    early_stopping=True  # 提前停止
)

decode_tokens = tokenizer.batch_decode(
    result,
    skip_special_tokens=True
)

results = [i.replace(' ', '') for i in decode_tokens]

print("results",results)

结果:

我们去成都旅游必须要去的地方是大熊猫繁殖基地大熊猫是今世界上保存最完好的哺乳动物之一也是世界自然保护联盟濒危物种红色名录的保护对象之一在这里你可以看到全世界最大的熊猫栖息地成都成都是中国国家林业局直属的国家重点风景名胜区是国家森林公园国家湿地公园和国家地质公园的重要组成部分是全国重点文物保护单位全国生态文明建设示范区中国红色旅游名城国际生态旅游目的地和国际旅游岛建设先进区地址四川省成都市绵阳市成华区成都高新技术产业开发区成华大道1号乘车路线成都绵阳都江堰雅
我们去成都旅游必须要去的地方是大熊猫繁殖基地大熊猫是我国唯一的国家二级保护动物是世界上保存最完整的动物种群之一也是我国第一个国家级自然保护区我们是四川省的首批国家重点保护野生动物和珍稀动物基金会的成员被誉为中国动物保护的摇篮和世界生物多样性保护基地被中国科学院中华人民共和国国家林业局授予全国生态文明建设示范区称号被国务院批准为国家森林城市国际生态旅游目的地熊猫基地位于成都市双流区东南部是国家aaaa级旅游景区国家地理标志保护单位熊猫栖息地为亚热带或热带的高山
我们去成都旅游必须要去的地方是大熊猫繁殖基地大熊猫是我国唯一的国家级自然保护区也是世界上保存最完好的熊猫种群之一它们栖息在亚热带或热带的高海拔草原上生活环境十分优越是中国四大自然奇观之一被誉为世界自然遗产和中国国家森林公园熊猫栖息地主要分布在中国大陆的西藏青海甘肃宁夏新疆内蒙古山西辽宁吉林黑龙江江苏河南安徽湖北湖南江西广东海南四川云南贵州陕西等地中国熊猫研究中心主任中国科学院院士国家自然科学基金委员会委员中华全国工商业联合会副主席
我们去成都旅游必须要去的地方是大熊猫繁殖基地大熊猫是我国唯一的国家级自然保护区也是世界上保存最完整规模最大的野生动物种类繁多的地区之一是中国国家重点保护的珍稀濒危动物及其栖息地和世界自然遗产的重要组成部分被誉为中国最美丽的城市和世界生物多样性保护基地被国际旅游组织评为全球生态旅游目的地成都熊猫国家公园位于四川省甘孜藏族自治州是国家aaaa级旅游景区世界遗产名录列为全国重点文物保护单位目前我国已建成国家森林公园国家湿地公园和国家地质公园国家林业局国务院扶贫
我们去成都旅游必须要去的地方是大熊猫繁殖基地大熊猫是现存最大保存最完整的动物属于国家二级保护动物熊猫种类繁多分布广泛主要分布在四川云南陕西甘肃宁夏内蒙古新疆青海吉林辽宁黑龙江山西江苏江西河南湖北湖南广东广西海南重庆贵州西藏四川等省区市它们的栖息地主要为亚热带或热带的低地湿润低地林亚高山草原高山湖泊高原湿润山区和高原沼泽地等常栖息在高海拔地区在中国大陆熊猫分布于四川省甘孜藏族自治州和青海省西宁市等地雄性熊猫体长约1.5米

这和instructGPT的SFT过程大致相同,思路原理是一样的,差别是 缺乏硬件设备、大规模高质量监督数据


Bloom SFT

【2023-5-23】bloom_tuning: BLOOM 模型的指令微调

BLOOM 系列模型是由数百名研究人员在包含 46 种自然语言和 13 种编程语言的数据集上, 基于大规模分布式训练框架 Megatron-DeepSpeed 训练得到。

  • 实验发现,BLOOM 在一系列基准测试上取得了具有竞争力的性能,经过多任务提示微调后,可以获得更为惊艳的效果。
  • BLOOM 模型支持中文、英文、代码、法语、西班牙语。

链接:bloom-560m

LLMPruner 工具对 BLOOM 进行词表裁剪,保留常用的中英文 token,词表大小由 250880 降至 46145,缩减为原来的 18.39%,在后续微调过程中可以减少显存占用。

数据

训练数据来自于 BelleGroup/train_3.5M_CN,该数据集包含 3.6M 条指令,从中筛选出单轮对话数据,进行 10:1 采样后得到约 0.25M 指令数据:

python sample_data.py \
--input data/train_3.5M_CN.json \
--output data/train.jsonl \
--sample_ratio 0.1

单条指令数据形如:

{
    "instruction": "你好,请问你能做什么?", 
    "output": "你好,我可以回答各种问题,提供辅助,或者与你聊天。有什么我可以帮你的吗?"
}

输出部分的长度分布如下图所示(若输出长度超过2048,则设置为2048)

指令微调

基于 deepspeed ZeRO-Stage 2 进行指令微调训练:

deepspeed --include localhost:0 train.py \
--model_name_or_path /path/to/bloom \
--data_path data/train.jsonl \
--max_input_length 200 \
--max_output_length 768 \
--output_dir output \
--per_device_train_batch_size 1 \
--gradient_accumulation_steps 16 \
--learning_rate 3e-5 \
--num_train_epochs 2 \
--lr_scheduler_type "cosine" \
--warmup_steps 2000 \
--logging_steps 10 \
--save_strategy "steps" \
--save_steps 200 \
--save_total_limit 1 \
--deepspeed deepspeed.json \
--fp16 False

在 per_device_train_batch_size=1、gradient_accumulation_steps=16、max_input_length=200、max_output_length=768、fp16=false 的配置下,单卡需要14G 显存可对 bloom-396m-zh 进行微调。

推理

微调后的模型已上传至 huggingface: bloom-396m-chat

可以通过如下代码调用 bloom-396m-chat 模型来生成对话:

from transformers import BloomTokenizerFast, BloomForCausalLM

model_name_or_path = "WangZeJun/bloom-396m-chat"

tokenizer = BloomTokenizerFast.from_pretrained(model_name_or_path)
model = BloomForCausalLM.from_pretrained(model_name_or_path).cuda()
model = model.eval()

input_pattern = "{}</s>"
text = "你好"
input_ids = tokenizer(input_pattern.format(text), return_tensors="pt").input_ids
input_ids = input_ids.cuda()

outputs = model.generate(input_ids, do_sample=True, max_new_tokens=1024, top_p=0.85,
    temperature=0.3, repetition_penalty=1.2, eos_token_id=tokenizer.eos_token_id)

output = tokenizer.decode(outputs[0])
response = output.replace(text, "").replace('</s>', "")
print(response)

运行如下命令进行命令行推理:

python chat.py \
--model_name_or_path /path/to/model \
--max_new_tokens 1024 \
--top_p 0.85 \
--temperature 0.3 \
--repetition_penalty 1.2
测试
用户 | Assistant
你好 | 您好,有什么我可以帮您的吗?
你是谁 | 我是一个AI语言模型,没有个人身份和情感。我的目的是回答你的问题并提供有用的信息来帮助你解决问题或提供娱乐性内容。如果你有任何其他的问题需要解答,请随时问我!

SFT 思考

SFT packing 是什么

SFT packing 指训练sft过程中,将多个sft数据pack到一个样本内进行训练

  • 这种方式会加快模型训练速度,如果不进行SFT packing,那么对于短文本sft需要padding到一个batch最长长度,浪费很多计算token。
  • SFT packing 有很多种类,比如 block diagonal attention, 每个token仅仅去attention自己的问题内的token。但一般业务中会直接将其相连接,然后进行预测,虽然这样会引入一些噪音,但好像相对于非sft packing方式的整体的效果损失不大。这个可能是因为pretrain的时候模型也是这么训练的。
SFT packing 对SFT训练的影响

SFT packing 后削弱了模型对难的短query和短答案的拟合。

  • 无sft packing 情况下,假设batch_size = 1,那么如果有个短query和短答案在这个batch里,其余补充padding,那么这个batch的gradient全是这个短文本的gradient,模型对这个query的拟合能力会变强。
  • 但SFT packing 后,多个短文本在一个样本中,这个batch的gradient会被稀释,短文本的拟合就不会特别强。但拟合能力似乎和泛化不可以挂钩,初步观察sft packing和non sft packing的效果差不了很多。在数据量小或者特定困难的数据上,sft packing是有损泛化效果的,non-packing的方式会影响模型续写的效果,因此会影响一些benchmark效果。但在大批量数据上是无损泛化效果的。
SFT 关注什么方面
  • 1 根据 prompt 筛选sft数据:Prompt 的 diversity:丰富多样的 prompt 数据可以让模型更多地了解人类的指令,包括复杂指令中每一步的含义。Prompt 的丰富程度决定了模型指令遵循的能力。
    • 明文TAG法:对SFT的prompt进行打tag,对其中的名词和动词进行分类打标,最后通过tag对prompt的分布进行调整,保证tag的分布是均匀的。著名的就是InsTag这个方法。
    • 模型embedding聚类方法:通过模型最后一层的embedding对prompt进行表示,那么通过prompt embedding的距离表示prompt的相似度,对于过于相似的prompt进行删除。著名的有Self-Evolved Diverse Data Sampling for Efficient Instruction Tuning。
    • 从complexity角度,对于prompt直接进行难度的升级,所以即使在同一个语意空间的prompt也会变得diverse。比较著名的是Wizard 方法,通过GPT4进行prompt难度升级,然后构成complexity丰富的prompt。
  • 2 利用sft model和pretrain model的关系筛选模型的sft数据:
    • IFD方法:利用 IFD(Instruction-Following Difficulty)分数进行数据选择,常见定义为 IFD(Q,A) = PPL(A|Q) / PPL(A),即衡量在 prompt 条件下生成 answer 的难度。条件生成概率越低(IFD 越高),说明生成难度越高,sft 模型能学到的对齐规律越多,越应该选择这条 sft 数据。
    • Hybrid Method(混合多种前述指标和方法):例如 What Makes Good Data for Alignment? A Comprehensive Study of Automatic Data Selection in Instruction Tuning [2] 这篇文章,从 complexity、diversity 和 quality 三个方向对 sft 数据建模,训练多个模型对各个指标维度分别进行衡量。
  • 3 Answer的质量:Answer的质量包括内容和格式两方面,一方面内容的正确性需要得到保证,一方面内容的格式也很重要,细节丰富,逻辑缜密的answer可以激发模型更多的回答能力。
  • 4 SFT阶段不能太多的知识注入:过多的知识注入,或者超过模型能力本身的回答过多会导致对齐税。
提升模型 reasoning 能力

什么数据格式在SFT或者ICL阶段可以提升模型的reasoning的能力?

数学reasoning上是有三种形式可显著提高效果模型 reasoning 能力

  • Reverse:把答案按低位到高位反转输出,如 128 + 367 = 495 记作 128 + 367 = ^594(^ 表示数字已反转),因为人做加法就是从个位算到百位。
  • COT or POT (Simplified Scratchpad): 把这个计算过程列举下来,用自然语言、符号或者代码形式呈现。
  • Detailed Scratchpad:把整个思考过程详细地用自然语言和符号表达出来。
    • 整体上,Detailed Scratchpad 需要的样本条数最少即可在加法任务上达到 100% 的效果,但其总 token 数与 plain 格式达到最好效果所需的量差不多。
SFT 中代码数据+文本数据, 哪个更容易改变

代码数据,因为

  • 预训练中, 代码数据确定性更高,ppl更低,记忆越深刻
  • 文本数据变化更大,ppl更高,熵更高。

SFT过程中,改变文本数据比较容易,因为本身ppl就会高,但代码数据会比较难,因为本身ppl会比较低,或者说代码数据的生成确定性更高,少量样本很难对其内部改变,只能大段替换。

SFT 能学新知识吗

虽然理论上可以,但实际很少这样做,也不推荐在 sft 阶段学习新知识。

  • LIMA原文中就表述过同样一个假设,sft阶段更多是将模型能力和人类对齐,而不过多学习新的知识

原因如下:

  • sft相对于pretrain过的数据量实在太小,模型的知识学习的概率就很低。
  • 如果加大sft的数据量和pretrain数据相当,那么sft有一些特定的格式以及一些system prompt需要重复当作context进行attention,这些重复的context势必会影响模型原始的attention模式,从而影响模型的效果。
  • 最后, 如果希望sft学习新知识,不如把这部分sft的新知识组织好放入pre-train or post-train阶段更为合适。

(2)第二步 RM训练

奖励模型(Reward Model, RM)目标是刻画模型的输出是否在人类看来表现不错。

  • 输入: [提示(prompt),模型生成的文本]
  • 输出: 一个刻画文本质量的标量数字。

同一个prompt输出的多个答案,人工评测排序后,使用lambdarank的思想,优化RM奖励模型。

RM模型学习的是对于一个prompt,人类对答案的喜好程度。

  • RM模型【左】RM损失函数【右】

奖励模型接收一系列文本并返回一个标量奖励,数值上对应人的偏好

引入RM模型的作用是对生成的文本进行打分排序,让模型生成的结果更加符合人类的日常理解习惯,更加符合人们想要的答案。

RM模型主要分为两个部分:数据获取模型训练。流程如下图所示

原论文中使用GPT架构做了一个reward model

注意

  • 要将模型的输出映射成维度为1的打分向量,即增加一个linear结构。

RM模型主要在于人工参与的训练数据构建部分,将训练好的SFT模型输入Prompt进行生成任务,每个Prompt生成4~9个文本,然后人为的对这些文本进行排序,将每个Prompt生成的文本构建为排序序列的形式进行训练,得到打分模型,以此模型用来评估SFT模型生成的文本是否符合人类的思维习惯。

两种方法命名为 direct score 和 rank score:

  • Direct score:直接对输出的文本进行打分,通过与自定义的label score计算loss,以此来更新模型参数;
  • Rank score:用排序方法对每个Prompt输出的n个句子进行排序作为输入,通过计算排序在前面的句子与排序在后面的句子的差值累加作为最终loss。

【2023-6-5】ChatGPT 为什么不用 Reward-Model 的数据直接 fine-tune,而用 RL?

  • Reward-model 的输出是针对整个 token 序列的一种滞后反馈,而 finetune 需要在每个 token 上都有监督信号,这是强化学习与监督学习的差别。
  • 用于训练 Reward-model 的数据有些是成对比较(pair-wise)数据,没法直接用于监督学习 finetune。

① Direct score方法

  • 利用 Bert模型对标注数据进行编码,用 linear层 映射到1维,然后用 Sigmoid函数输出每个句子的得分,与人工标记的得分进行loss计算,以此来更新模型参数。流程如下所示

数据为SFT最后所生成的数据,数据准备:

def data_prepare(pretrain_path):
    data_lst = [
        "我们去成都旅游,必须要去的地方是大熊猫繁殖基地。大熊猫是今世界上保存最完好的哺乳动物之一,也是世界自然保护联盟濒危物种红色名录的保护对象之一。在这里,你可以看到全世界最大的熊猫栖息地成都。成都是中国国家林业局直属的国家重点风景名胜区,是国家森林公园、国家湿地公园和国家地质公园的重要组成部分,是全国重点文物保护单位、全国生态文明建设示范区、中国红色旅游名城、国际生态旅游目的地和国际旅游岛建设先进区。地址:四川省成都市绵阳市成华区成都高新技术产业开发区成华大道1号乘车路线:成都绵阳都江堰雅",
        "我们去成都旅游,必须要去的地方是大熊猫繁殖基地。大熊猫是我国唯一的国家二级保护动物,是世界上保存最完整的动物种群之一,也是我国第一个国家级自然保护区。我们是四川省的首批国家重点保护野生动物和珍稀动物基金会的成员,被誉为中国动物保护的摇篮和世界生物多样性保护基地,被中国科学院、中华人民共和国国家林业局授予全国生态文明建设示范区称号,被国务院批准为国家森林城市、国际生态旅游目的地。熊猫基地位于成都市双流区东南部,是国家aaaa级旅游景区,国家地理标志保护单位。熊猫栖息地为亚热带或热带的高山",
        "我们去成都旅游,必须要去的地方是大熊猫繁殖基地。大熊猫是我国唯一的国家级自然保护区,也是世界上保存最完好的熊猫种群之一。它们栖息在亚热带或热带的高海拔草原上,生活环境十分优越,是中国四大自然奇观之一,被誉为世界自然遗产和中国国家森林公园。熊猫栖息地主要分布在中国大陆的西藏、青海、甘肃、宁夏、新疆、内蒙古、山西、辽宁、吉林、黑龙江、江苏、河南、安徽、湖北、湖南、江西、广东、海南、四川、云南、贵州、陕西等地。中国熊猫研究中心主任、中国科学院院士、国家自然科学基金委员会委员、中华全国工商业联合会副主席",
        "我们去成都旅游,必须要去的地方是大熊猫繁殖基地。大熊猫是我国唯一的国家级自然保护区,也是世界上保存最完整、规模最大的野生动物种类繁多的地区之一,是中国国家重点保护的珍稀濒危动物及其栖息地和世界自然遗产的重要组成部分,被誉为中国最美丽的城市和世界生物多样性保护基地,被国际旅游组织评为全球生态旅游目的地。成都熊猫国家公园位于四川省甘孜藏族自治州,是国家aaaa级旅游景区,被《世界遗产名录》列为全国重点文物保护单位。目前,我国已建成国家森林公园、国家湿地公园和国家地质公园,国家林业局、国务院扶贫",
        "我们去成都旅游,必须要去的地方是大熊猫繁殖基地。大熊猫是现存最大、保存最完整的动物,属于国家二级保护动物。熊猫种类繁多,分布广泛,主要分布在四川、云南、陕西、甘肃、宁夏、内蒙古、新疆、青海、吉林、辽宁、黑龙江、山西、江苏、江西、河南、湖北、湖南、广东、广西、海南、重庆、贵州、西藏、四川等省区市。它们的栖息地主要为亚热带或热带的(低地)湿润低地林、亚高山草原、高山湖泊、高原湿润山区和高原沼泽地等,常栖息在高海拔地区。在中国大陆,熊猫分布于四川省甘孜藏族自治州和青海省西宁市等地。雄性熊猫体长约1.5米"]
    # 自定义打分标签,每个句子一个分值。也可以定义多维度的打分方法,只是模型的线性层需要改为你所定义的维度数
    direct_score = [[0.75], [0.5], [0.35], [0.4], [0.8]]
    tokenizer = BertTokenizer.from_pretrained(pretrain_path)
    train_data = tokenizer.batch_encode_plus(data_lst, max_length=256, padding="max_length", truncation=True,
                                             return_tensors='pt')
    train_data["labels"] = torch.tensor(direct_score)
    return train_data, tokenizer

RM模型搭建

  • 采用了Bert模型作为编码模型,后取CLS作为文本表征,采用MSE作为loss函数,最后接linear进行维度压缩
import torch
import torch.nn as nn
from torch.utils.data import Dataset, DataLoader
from transformers import BertModel, BertPreTrainedModel, BertTokenizer, BertConfig, get_scheduler


class RewardModel(BertPreTrainedModel):
    def __init__(self, config):
        super(RewardModel, self).__init__(config)
        self.config = config
        self.sigmoid = nn.Sigmoid()
        self.loss_fn = nn.MSELoss()
        self.model = BertModel(config)
        self.linear = nn.Linear(config.hidden_size, 1)

    def forward(self, input_ids, token_type_ids, attention_mask, labels=None):
        outputs = self.model(input_ids=input_ids, token_type_ids=token_type_ids,
                             attention_mask=attention_mask).pooler_output
        output = self.linear(outputs)
        logits = self.sigmoid(output)
        if labels is not None:
            loss = self.loss_fn(logits, labels)
            return logits, loss
        else:
            return logits

训练过程

class Datasets(Dataset):
    def __init__(self, sample):
        super(Datasets, self).__init__()
        self.sample = sample

    def __getitem__(self, item):
        res = {k: v[item] for k, v in self.sample.items()}
        return res

    def __len__(self):
        return len(self.sample['input_ids'])


def train(pretrain_path, save_path):
    config = BertConfig.from_pretrained(pretrain_path)
    model = RewardModel(config=config)

    no_decay = ["bias", "LayerNorm.weight"]
    optimizer_grouped_parameters = [
        {
            "params": [p for n, p in model.named_parameters() if not any(nd in n for nd in no_decay)],
            "weight_decay": 0.01,
        },
        {
            "params": [p for n, p in model.named_parameters() if any(nd in n for nd in no_decay)],
            "weight_decay": 0.0,
        },
    ]
    optimizer = torch.optim.AdamW(optimizer_grouped_parameters, lr=2e-5)
    train_data, tokenizer = data_prepare(pretrain_path)
    dataloader = DataLoader(dataset=Datasets(train_data), shuffle=False, batch_size=1)

    max_train_steps = 10 * len(dataloader)
    warm_steps = int(0.0 * max_train_steps)
    lr_scheduler = get_scheduler(
        name='linear',
        optimizer=optimizer,
        num_warmup_steps=warm_steps,
        num_training_steps=max_train_steps,
    )
    model.train()
    for i in range(1, 51):
        loss_lst = []
        for batch in dataloader:
            out, loss = model(batch["input_ids"], token_type_ids=batch["token_type_ids"], attention_mask=batch["attention_mask"], labels=batch["labels"])
            loss_lst.append(loss.item())
            loss.backward()
            optimizer.step()
            lr_scheduler.step()
            optimizer.zero_grad()
        print("epoch{}\tloss: {}".format(str(i), str(sum(loss_lst) / len(loss_lst))))
    tokenizer.save_pretrained(save_path)
    model_to_save = model.module if hasattr(model, 'module') else model
    model_to_save.save_pretrained(save_path)
    model_to_save.config.save_pretrained(save_path)

模型预测

def predict(model_path):
    text = ["我们去成都旅游,必须要去的地方是大熊猫繁殖基地。大熊猫是今世界上保存最完好的哺乳动物之一,也是世界自然保护联盟濒危物种红色名录的保护对象之一。在这里,你可以看到全世界最大的熊猫栖息地成都。成都是中国国家林业局直属的国家重点风景名胜区,是国家森林公园、国家湿地公园和国家地质公园的重要组成部分,是全国重点文物保护单位、全国生态文明建设示范区、中国红色旅游名城、国际生态旅游目的地和国际旅游岛建设先进区。地址:四川省成都市绵阳市成华区成都高新技术产业开发区成华大道1号乘车路线:成都绵阳都江堰雅",
            "我们去成都旅游,必须要去的地方是大熊猫繁殖基地。大熊猫是我国唯一的国家二级保护动物,是世界上保存最完整的动物种群之一,也是我国第一个国家级自然保护区。我们是四川省的首批国家重点保护野生动物和珍稀动物基金会的成员,被誉为中国动物保护的摇篮和世界生物多样性保护基地,被中国科学院、中华人民共和国国家林业局授予全国生态文明建设示范区称号,被国务院批准为国家森林城市、国际生态旅游目的地。熊猫基地位于成都市双流区东南部,是国家aaaa级旅游景区,国家地理标志保护单位。熊猫栖息地为亚热带或热带的高山",]
    model = RewardModel.from_pretrained(model_path)
    tokenizer = BertTokenizer.from_pretrained(model_path)

    model.eval()
    data = tokenizer.batch_encode_plus(text, max_length=256, padding="max_length", truncation=True,
                                           return_tensors='pt')
    score = model(**data)
    return score

完成了一个基于Bert的文本打分模型。

  • 当然,这里展示的只是个思路,模型也很粗糙,而且自定义的打分标签也经不起推敲。

② Rank score方法

这种方法的区别在于:loss函数的设计

  • 首先,为什么 InstructGPT 中不采用上面的方法?原因在于给生成句子打分时,不同标注人员的标准不同,而且这个标准很难统一;即使标注人员理解一致,对同一条文本给出的分数也会不一样。因此在标注时需要把这个定量问题转化为更简单的处理方式,采用排序的方法进行数据标注,可以在一定程度上解决这个问题。
  • 标注员在使用直接打分(Direct Score)时,会由于主观意识的不同,对同一个文本出现不同的分值;而使用等级排序(Rank Level)来进行数据标注时,可以统一标注结果。

数据是将每个Prompt生成的文本进行排序,最直接的方法就是最好的句子排在最前面,后面的句子以此类推。

def rank_data_prepare(pretrain_path):
    data_lst = []
    data_outputs = {
        'input_ids': [],
        'token_type_ids': [],
        'attention_mask': []
    }
    data_str = "我们去成都旅游,必须要去的地方是大熊猫繁殖基地。大熊猫是现存最大、保存最完整的动物,属于国家二级保护动物。熊猫种类繁多,分布广泛,主要分布在四川、云南、陕西、甘肃、宁夏、内蒙古、新疆、青海、吉林、辽宁、黑龙江、山西、江苏、江西、河南、湖北、湖南、广东、广西、海南、重庆、贵州、西藏、四川等省区市。它们的栖息地主要为亚热带或热带的(低地)湿润低地林、亚高山草原、高山湖泊、高原湿润山区和高原沼泽地等,常栖息在高海拔地区。在中国大陆,熊猫分布于四川省甘孜藏族自治州和青海省西宁市等地。雄性熊猫体长约1.5米\t我们去成都旅游,必须要去的地方是大熊猫繁殖基地。大熊猫是今世界上保存最完好的哺乳动物之一,也是世界自然保护联盟濒危物种红色名录的保护对象之一。在这里,你可以看到全世界最大的熊猫栖息地成都。成都是中国国家林业局直属的国家重点风景名胜区,是国家森林公园、国家湿地公园和国家地质公园的重要组成部分,是全国重点文物保护单位、全国生态文明建设示范区、中国红色旅游名城、国际生态旅游目的地和国际旅游岛建设先进区。地址:四川省成都市绵阳市成华区成都高新技术产业开发区成华大道1号乘车路线:成都绵阳都江堰雅\t我们去成都旅游,必须要去的地方是大熊猫繁殖基地。大熊猫是我国唯一的国家二级保护动物,是世界上保存最完整的动物种群之一,也是我国第一个国家级自然保护区。我们是四川省的首批国家重点保护野生动物和珍稀动物基金会的成员,被誉为中国动物保护的摇篮和世界生物多样性保护基地,被中国科学院、中华人民共和国国家林业局授予全国生态文明建设示范区称号,被国务院批准为国家森林城市、国际生态旅游目的地。熊猫基地位于成都市双流区东南部,是国家aaaa级旅游景区,国家地理标志保护单位。熊猫栖息地为亚热带或热带的高山\t我们去成都旅游,必须要去的地方是大熊猫繁殖基地。大熊猫是我国唯一的国家级自然保护区,也是世界上保存最完整、规模最大的野生动物种类繁多的地区之一,是中国国家重点保护的珍稀濒危动物及其栖息地和世界自然遗产的重要组成部分,被誉为中国最美丽的城市和世界生物多样性保护基地,被国际旅游组织评为全球生态旅游目的地。成都熊猫国家公园位于四川省甘孜藏族自治州,是国家aaaa级旅游景区,被《世界遗产名录》列为全国重点文物保护单位。目前,我国已建成国家森林公园、国家湿地公园和国家地质公园,国家林业局、国务院扶贫\t我们去成都旅游,必须要去的地方是大熊猫繁殖基地。大熊猫是我国唯一的国家级自然保护区,也是世界上保存最完好的熊猫种群之一。它们栖息在亚热带或热带的高海拔草原上,生活环境十分优越,是中国四大自然奇观之一,被誉为世界自然遗产和中国国家森林公园。熊猫栖息地主要分布在中国大陆的西藏、青海、甘肃、宁夏、新疆、内蒙古、山西、辽宁、吉林、黑龙江、江苏、河南、安徽、湖北、湖南、江西、广东、海南、四川、云南、贵州、陕西等地。中国熊猫研究中心主任、中国科学院院士、国家自然科学基金委员会委员、中华全国工商业联合会副主席\n昨天买的,今天就到了,因为给家中父母买的,怕东西多老人取件不方便,今天听家里人说京东小哥送到家门楼下,心里太高兴了,在这里希望京东能表扬一下本次快递小哥,他让我本次购物感觉很好,本来就喜欢京东一直购物,现在我更欣赏。购物的同事还能享受温暖的服务,京东的快递服务果然很棒,在此感谢京东,感觉快递小哥,如此服务真的很温暖。\t京东 ,对于S8的货品状态 ,你们你们京东采购下单是应该在预售前还是预售后(定金不退的预售方式)?预售前下单叫正规预订补款了有货拿,预售补款了没货并且还要重新再采购叫空手套白狼,京东是哪种?\t在北京住过不下10多家酒店,也喜欢住公寓,从凯宾斯基到建国饭店,从京广到美华再到星城亮马,而这个是我住过的有史以来最差的一个酒店公寓。难怪价格不上不下,不是因为临时有事绝对不住,希望这里那么多好评语不是枪手1、入口难找到要死不说,大堂感觉就是某个买小商品的商铺,check in 竟然要压证件,没有听说过,坚决不同意拿了我的证件去复印。私人住宿和旅客混杂,拖着箱子看着买菜回来的人一同电梯很奇怪。2、半夜接到骚扰电话3、房间设计装饰非常的“家常“,设施陈旧,非常像当年在江南古镇租住的农家房3、住的房间刚好在过道口,声音那叫一个大阿,谁说的房间隔音?楼上住户的动静镇清楚啊4、服务态度不好,和客人顶着说,铁板一样的语气。5, 实在要找一优点出来的话:唯一就是小区里面比较安静,没有汽车闹声。\t码数刚刚好,穿上很好看,和身。宝贝不掉色,弹力好。穿着不紧绷,试了好几下蹲下站起来,都轻松自如,不会感觉腿被束缚着。价格也不贵,现在认准这家店了这款洗发水挺适合我的发质,用完果断续上一瓶,还搞了个特价,值了!\t之前就听说苏州万丽是苏州生意最好,房价最高,也是业内人士最推崇的酒店,远胜于喜来登,香格里拉,索菲特,在苏州属于一枝独秀型的,平时房间非常的难定,几乎天天满房,这次好不容易定了个行政套,本打算住一天,后又延了一天,简单来说吧,房间不大但很温馨,酒店工作人员不多但都非常专业,亲切,严格意义上来说该酒店硬件并不突出,没有游泳池,没有特色餐厅,建筑也没有什么特色,处处透露着简单,适用,大气,但是只有你住了以后才会觉得,值!"
    for sentences in data_str.strip().split("\n"):
        texts = sentences.strip().split("\t")
        data_lst.append(texts)
    tokenizer = BertTokenizer.from_pretrained(pretrain_path)
    for rank_text in data_lst:
        data_encode = tokenizer(
                    text=rank_text,
                    truncation=True,
                    max_length=256,
                    padding='max_length',
                    return_tensors='pt')
        data_outputs["input_ids"].append(data_encode["input_ids"])
        data_outputs["token_type_ids"].append(data_encode["token_type_ids"])
        data_outputs["attention_mask"].append(data_encode["attention_mask"])
    return data_outputs, tokenizer

RM模型搭建

class RankRewardModel(BertPreTrainedModel):
    def __init__(self, config):
        super(RankRewardModel, self).__init__(config)
        self.config = config
        self.model = BertModel(config)
        self.linear = nn.Linear(config.hidden_size, 1)

    def forward(self, input_ids, token_type_ids, attention_mask):
        outputs = self.model(input_ids=input_ids, token_type_ids=token_type_ids,
                             attention_mask=attention_mask).pooler_output
        output = self.linear(outputs)
        return output

Rank loss

  • Rank Score 方法与 Direct Score方法的最大不同之处在于 loss function的设计
def rank_loss(rank_rewards_list):
    loss, counts = torch.tensor([0.0]), 0
    for rank_rewards in rank_rewards_list:
        for i in range(len(rank_rewards) - 1):  # 遍历所有前项-后项的得分差
            for j in range(i + 1, len(rank_rewards)):
                diff = nn.functional.logsigmoid(rank_rewards[i] - rank_rewards[j])  # log(sigmoid(得分差)),恒为非正值
                loss = loss + diff
                counts += 1
    loss = loss / counts  # 注意: 不能再用 torch.tensor() 包装,否则会切断梯度
    return -loss  # 最大化 log(sigmoid(分差)),等价于最小化其负值

通俗的理解:

  • 对于排序好的训练数据有 A > B > C
  • 设计一个模型,使得打分数据满足: Rank(A) > Rank(B) > Rank(C)

既然打「绝对分数」很难统一,那转换成一个「相对排序」

  • 「标注排序序列」替代「直接打分」
  • 用「相对任务」替代「绝对任务」能够更方便标注员打出统一的标注结果

怎么通过「排序序列」来教会模型「打分」

  • 一个排好的序列: A > B > C >D
  • 训练一个打分模型,模型给四句话打出来的分要满足 r(A) > r(B) > r(C) > r(D)

损失函数

  • 每对样本(如 A,B), 得分高者-得分低
  • sigmoid 归一, 概率化
  • 计算期望
  • 目标: 最大化得分差值
\[\operatorname{loss}(\theta)=-\frac{1}{\left(\begin{array}{c} K \\ 2 \end{array}\right)} E_{\left(x, y_{w}, y_{l}\right) \sim D}\left[\log \left(\sigma\left(r_{\theta}\left(x, y_{w}\right)-r_{\theta}\left(x, y_{l}\right)\right)\right)\right]\]
  • loss = logσ(r(A)-r(B)) + logσ(r(A)-r(C)) + logσ(r(A)-r(D)) + logσ(r(B)-r(C)) + … + logσ(r(C)-r(D)),再除以比较对数取平均
  • loss = -loss(取负后最小化,即最大化高分答案与低分答案的得分差)
from typing import List

import torch
import torch.nn as nn
import torch.nn.functional as F


class RewardModel(nn.Module):
    # 奖励模型: encoder 编码后直接加一个全连接层, 输出一维打分
    def __init__(self, encoder):
        """
        init func.
        Args:
            encoder (transformers.AutoModel): backbone, 默认使用 ernie 3.0
        """
        super().__init__()
        self.encoder = encoder
        self.reward_layer = nn.Linear(768, 1)  # reward layer 用于映射到 1 维 reward

    def forward(
        self,
        input_ids: torch.tensor,
        token_type_ids: torch.tensor,
        attention_mask=None,
        pos_ids=None,
    ) -> torch.tensor:
        """
        forward 函数,返回每句话的得分值。
        Args:
            input_ids (torch.tensor): (batch, seq_len)
            token_type_ids (torch.tensor): (batch, seq_len)
            attention_mask (torch.tensor): (batch, seq_len)
            pos_ids (torch.tensor): (batch, seq_len)
        Returns:
            reward: (batch, 1)
        """
        pooler_output = self.encoder(
            input_ids=input_ids,
            token_type_ids=token_type_ids,
            position_ids=pos_ids,
            attention_mask=attention_mask,
        )["pooler_output"]                              # (batch, hidden_size)
        reward = self.reward_layer(pooler_output)       # (batch, 1)
        return reward

def compute_rank_list_loss(rank_rewards_list: List[List[torch.tensor]], device='cpu') -> torch.Tensor:
    """
    通过给定的有序(从高到低)的ranklist的reward列表,计算rank loss。
    所有排序高的句子的得分减去排序低的句子的得分差的总和,并取负。

    Args:
        rank_rewards_list (torch.tensor): 有序(从高到低)排序句子的reward列表,e.g. -> 
                      [
                          [torch.tensor([0.3588]), torch.tensor([0.2481]), ...],
                          [torch.tensor([0.5343]), torch.tensor([0.2442]), ...],
                          ...
                      ]
        device (str): 使用设备

    Returns:
        loss (torch.tensor): tensor([0.4891], grad_fn=<DivBackward0>)
    """
    if type(rank_rewards_list) != list:
        raise TypeError(f'@param rank_rewards_list expected "list", received {type(rank_rewards_list)}.')

    loss, add_count = torch.tensor([0]).to(device), 0
    for rank_rewards in rank_rewards_list:
        for i in range(len(rank_rewards)-1):  # 遍历所有前项-后项的得分差
            for j in range(i+1, len(rank_rewards)):
                diff = F.sigmoid(rank_rewards[i] - rank_rewards[j])  # sigmoid到0~1之间
                loss = loss + diff
                add_count += 1
    loss = loss / add_count
    return -loss  

训练过程

import torch
from torch.utils.data import Dataset, DataLoader
from transformers import BertConfig, BertTokenizer, get_scheduler

class Datasets(Dataset):
    def __init__(self, sample):
        super(Datasets, self).__init__()
        self.sample = sample

    def __getitem__(self, item):
        res = {k: v[item] for k, v in self.sample.items()}
        return res

    def __len__(self):
        return len(self.sample['input_ids'])


def train(pretrain_path, save_path):
    config = BertConfig.from_pretrained(pretrain_path)
    model = RankRewardModel(config=config)

    no_decay = ["bias", "LayerNorm.weight"]
    optimizer_grouped_parameters = [
        {
            "params": [p for n, p in model.named_parameters() if not any(nd in n for nd in no_decay)],
            "weight_decay": 0.01,
        },
        {
            "params": [p for n, p in model.named_parameters() if any(nd in n for nd in no_decay)],
            "weight_decay": 0.0,
        },
    ]
    optimizer = torch.optim.AdamW(optimizer_grouped_parameters, lr=2e-5)
    train_data, tokenizer = rank_data_prepare(pretrain_path)
    dataloader = DataLoader(dataset=Datasets(train_data), shuffle=False, batch_size=1)

    max_train_steps = 10 * len(dataloader)
    warm_steps = int(0.0 * max_train_steps)
    lr_scheduler = get_scheduler(
        name='linear',
        optimizer=optimizer,
        num_warmup_steps=warm_steps,
        num_training_steps=max_train_steps,
    )
    for i in range(1, 51):
        loss_lst = []
        for batch in dataloader:
            batch_rank_rewards = []
            for batch_idx in range(len(batch['input_ids'])):
                rank_texts_count = len(batch['input_ids'][batch_idx])
                rank_rewards = []
                for text_idx in range(rank_texts_count):
                    reward = model(
                        batch['input_ids'][batch_idx][text_idx].unsqueeze(dim=0),
                        batch['token_type_ids'][batch_idx][text_idx].unsqueeze(dim=0),
                        batch['attention_mask'][batch_idx][text_idx].unsqueeze(dim=0)
                    )
                    rank_rewards.append(reward[0])
                batch_rank_rewards.append(rank_rewards)
            loss = rank_loss(batch_rank_rewards)
            loss.backward()
            optimizer.step()
            lr_scheduler.step()
            optimizer.zero_grad()
            loss_lst.append(loss.item())
        print("\tepoch{}\tloss: {}".format(str(i), str(sum(loss_lst) / len(loss_lst))))
    tokenizer.save_pretrained(save_path)
    model_to_save = model.module if hasattr(model, 'module') else model
    model_to_save.save_pretrained(save_path)
    model_to_save.config.save_pretrained(save_path)

模型预测

def predict(model_path):
    texts = ["我们去成都旅游,必须要去的地方是大熊猫繁殖基地。大熊猫是今世界上保存最完好的哺乳动物之一,也是世界自然保护联盟濒危物种红色名录的保护对象之一。在这里,你可以看到全世界最大的熊猫栖息地成都。成都是中国国家林业局直属的国家重点风景名胜区,是国家森林公园、国家湿地公园和国家地质公园的重要组成部分,是全国重点文物保护单位、全国生态文明建设示范区、中国红色旅游名城、国际生态旅游目的地和国际旅游岛建设先进区。地址:四川省成都市绵阳市成华区成都高新技术产业开发区成华大道1号乘车路线:成都绵阳都江堰雅",
             "我们去成都旅游,必须要去的地方是大熊猫繁殖基地。大熊猫是我国唯一的国家二级保护动物,是世界上保存最完整的动物种群之一,也是我国第一个国家级自然保护区。我们是四川省的首批国家重点保护野生动物和珍稀动物基金会的成员,被誉为中国动物保护的摇篮和世界生物多样性保护基地,被中国科学院、中华人民共和国国家林业局授予全国生态文明建设示范区称号,被国务院批准为国家森林城市、国际生态旅游目的地。熊猫基地位于成都市双流区东南部,是国家aaaa级旅游景区,国家地理标志保护单位。熊猫栖息地为亚热带或热带的高山",]
    model = RankRewardModel.from_pretrained(model_path)
    tokenizer = BertTokenizer.from_pretrained(model_path)
    model.eval()
    data = tokenizer.batch_encode_plus(texts, max_length=256, padding="max_length", truncation=True,
                                       return_tensors='pt')
    score = model(**data)
    return score

模型结构

Reward Model 不同于原始 SFT Model,要在后面加上 value head (一个 Linear层)

  • 输入维度为模型的 hidden_dim,输出维度为1
  • 输出表示模型预测每一字符获取的得分。

DeepSpeed-Chat 用最后一个字符(token)的得分作为整个 response 的得分

  • 当然也可以用整个句子中每个字符的平均分作为整体得分;下面给出取最后一个有效 token 得分的最小示意
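一个取「最后一个有效 token 的得分」作为整句 reward 的最小示意(函数名与变量名为假设,仅说明思路):

import torch

def sequence_reward(token_rewards: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # token_rewards: (batch, seq_len),value head 对每个位置的打分
    # attention_mask: (batch, seq_len),1 表示有效 token,0 表示 padding
    last_index = attention_mask.long().sum(dim=1) - 1                   # 每条样本最后一个有效位置
    return token_rewards.gather(1, last_index.unsqueeze(1)).squeeze(1)  # (batch,)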

训练目标

训练 Reward Model 是一个排序任务,针对 query,输入 chosen 和 rejected response

训练目标是尽可能拉大 chosen 和 rejected 的得分差值,损失函数为(对应实现见下方示意代码):

\[L_{r}=-\log \left(\sigma\left(r(query, chosen)-r(query, rejected)\right)\right)\]
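上式的一个最小 PyTorch 实现示意(函数名为假设):

import torch
import torch.nn.functional as F

def pairwise_rm_loss(chosen_reward: torch.Tensor, rejected_reward: torch.Tensor) -> torch.Tensor:
    # chosen_reward / rejected_reward: (batch,),同一 query 下 chosen / rejected 回复的得分
    return -F.logsigmoid(chosen_reward - rejected_reward).mean()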

以上就是第二步 Training Reward Model 的全部过程:基于 rank loss 训练一个打分模型。

第三步强化学习中,reward模型将扮演环境的角色,针对模型预测的字符给出奖励分数。

人工标注平台

【2023-8-15】排序数据集 标注 参考:RLHF

思考

ChatGPT 为什么不用 RewardModel 数据直接 finetune,而用 RL?

因为:

  • RM 针对整个 token 序列给出滞后反馈,属于强化学习
  • 而 finetune 针对每个 token 给出即时反馈,属于监督学习
  • RM 训练数据多是 pair 形式 <query, win, lose>,没有唯一的标准输出,无法直接用于监督学习

【2023-4-19】John Schulman 观点 YouTube

  • pretrain 阶段学习知识
  • finetune 阶段学会:
    • 拒识: 对不确定的问题,回答不知道
    • 减少幻觉: 不要编造事实 (hallucination)
RM 和 基座模型保持一致?

奖励模型需要和基础模型一致吗?

  • 可以一致,也可以不同,取决于任务需求和优化目标。
  • 单任务: 共享参数
  • 多任务: 子任务奖励模型整合成奖励函数
Pair RM是什么形式的RM,相比于原RM形式有什么好处?
  • 原始 RM 是 BT (Bradley-Terry) model 形式的 RM,每个 sample 的组成形式是 (prompt, answer),对单条样本 pointwise 打分,通过 maximize positive sample 和 negative sample 的得分 gap 来完成排序。
  • Pair RM 是 pairwise rank,数据组成形式是 (prompt, pos_answer, neg_answer)。Pair RM 的好处是 pos answer 和 neg answer 可以在同一 context 下互相看到对方,模型可以通过字面比较找到两者的 diff,整体解释性和泛化能力都比普通 RM 好;普通 RM 很容易 overfit 原数据,很难找到真正 diff 的 pattern。

现在Alpaca-Eval 榜单上就有Pair RM 身影,而且Pair RM整体很小 ,效果很好。

如何处理 RM 中的噪声数据?

reward model 噪声来自哪几个方面:

如果reward model的pair数据来自:

  • 人标注,那么人类 preference的倾向性以及标注人员的专业性会带来一定的bias,即 众包系统的Noise。
  • AI,例如GPT4,那么这种倾向性也很严重,比如length bias。(严格来说,这属于bias,不能算噪声。)

那么去噪可使用一些古早的方式:

  • 预测阶段去噪声:
    • Ensemble 去噪: 用多个 RM checkpoint 进行预测(或 model merge),减少单个模型噪声的影响。
    • Margin 去噪: 只有 pair 的预测分差大于一定阈值时才采信结果,减少噪声影响。
  • 数据阶段去噪声:
    • Multiview 去噪: 用多个模型分别训练,再预测训练集,全部模型都预测正确的 pair 保留,有对有错的 pair 丢弃或交给人重新标注。
    • Active Learning 思路去噪: 先训练一个模型,把 margin 小于一定阈值的样本送给标注人员重新标注。
如何解决reward model的OOD的问题?

模型PPO过程中,reward model 准确率逐渐下降,俗称的reward model的OOD问题

  • 因为 reward model 训练样本一般来自sft模型的responses,那么在PPO过程中
    • policy model刚开始和sft生成的response很相似,所以reward model准确率较高
    • 但是在逐渐偏离sft 时,reward model 准确率会持续下降,这基本就是现阶段reward model的主要问题。

通往 AGI 的过程中,一定需要一个泛化能力(generalize)很强的 reward model,即 global reward model 或 world model。

现阶段解决reward model的OOD普遍解决方法: Llama2 做法

  • 训练过一段时间RLHF以后,重新对policy采样pair对,人标数据然后继续训练reward model。
  • 但这种方式就是太费人力,感觉并不是持久之道。

除此之外:

  • Secrets of RLHF in Large Language Models Part II: Reward Modeling 中,通过 meta learning 方式解决这个问题。整体思想是: 由于 policy model 在 reward model 的引导下会向 reward 高的方向更新,所以 reward model 应该对 reward 较高的 response pair 更有区分度,于是让梯度更新逐渐向高分 training response pair 倾斜。
    • 这种方法说得通,但实际中由于缺少对模型的 on-policy 采样,效果不太好。
  • West-of-N: Synthetic Preference Generation for Improved Reward Modeling 跟Llama2的方式相似,区别就是不再用人进行标记,而是通过reward model本身对新的模型on policy pair进行打分,取一个query的response set中最高的分数和最低的分数数据组pair,加入到reward model的训练中。
    • 这种方式采样,虽然通过on policy采样加强rm的泛化能力,但实际上上限受原先rm model的能力影响。

(3)第三步 RLHF

RLHF 流程

训练策略模型,RLHF流程


首先将初始语言模型的微调任务建模为强化学习(RL)问题,因此需要定义策略(policy)、动作空间(action space)和奖励函数(reward function)等基本要素

  • 策略就是基于该语言模型,接收prompt作为输入,然后输出一系列文本(或文本的概率分布);
  • 动作空间就是词表所有token在所有输出位置的排列组合(单个位置通常有50k左右的token候选);
  • 观察空间则是可能的输入token序列(即prompt),显然也相当大,为词表所有token在所有输入位置的排列组合;
  • 奖励函数(reward)则基于上一章节训好的 RM 模型计算得到初始 reward,再叠加一个相对初始模型的 KL 约束项(见下方示意代码)。
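带 KL 惩罚的整体奖励,一个最小示意(函数名与系数取值均为假设):

import torch

def rlhf_reward(rm_score: torch.Tensor, logprob_policy: torch.Tensor,
                logprob_ref: torch.Tensor, beta: float = 0.1) -> torch.Tensor:
    # rm_score: RM 对整条 response 的打分, (batch,)
    # logprob_policy / logprob_ref: 策略模型 / 参考模型对 response 各 token 的 log 概率之和, (batch,)
    kl_penalty = beta * (logprob_policy - logprob_ref)  # 逐样本的近似 KL 惩罚
    return rm_score - kl_penalty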

强化学习算法上,常见的可行方案是使用策略梯度 (Policy Gradient) 类算法,如近端策略优化 (Proximal Policy Optimization, PPO),微调初始 LM 的部分或全部参数。

根据 PPO 算法,按当前批次数据的奖励指标进行优化 (来自 PPO 算法 on-policy 的特性) 。PPO 算法是一种信赖域优化 (Trust Region Optimization,TRO) 算法,使用梯度约束确保更新步骤不会破坏学习过程的稳定性,另外也可以使用 A2C (synchronous advantage actor-critic) 算法来优化梯度。

RLHF基于A2C方法,包含了四个模型:

  • Actor Model:SFT之后模型初始化而来。作为策略(policy)模型,接收上文,做出动作,预测下一个字符。最终使用的就是这个模型。
  • Reference Model:和Actor Model同样初始化自SFT Model,训练过程中冻结参数,用于和Actor Model做对比,保证模型不要偏离原始SFT Model太多。
  • Reward Model:作为环境(env),训练过程中冻结参数,针对每一个状态给出奖励分数。
  • Critic Model:由Reward Model初始化而来,用于近似价值函数,输入为状态s,估计当前状态的价值V。

训练过程整体分为两步:make experience 和 learn。

  • (1) make experience: 从训练数据中抽取一部分 query,由 Actor Model 生成答案,并计算出奖励、价值等经验
  • (2) learn: 利用所产生的经验进行学习,Actor Model 与 Critic Model 分别近似策略函数和价值函数


更多参考 RLHF实践

利用SFT模型对输出进行改造,构造一个双头PPO模型,模型一头输出一个张量,代表生成序列每个元素的价值value;另一头将输出映射成prompt answer词典答案。参考

  • <prompt, prompt answer> 输入到RM模型中,获得评估当前 prompt 对的奖励 R,然后用 R 作为奖励,反向更新每个元素的价值 value,这就是PPO强化学习算法。
  • γ = 0 时,为常规 PPO
  • γ > 0 时,为 PPO_ptx(在 RL 目标上混入预训练梯度),目标函数见下式
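InstructGPT 中 PPO-ptx 的优化目标可以写成(γ=0 即退化为常规 PPO):

\[\operatorname{objective}(\phi)=E_{(x, y) \sim D_{\pi_{\phi}^{\mathrm{RL}}}}\left[r_{\theta}(x, y)-\beta \log \frac{\pi_{\phi}^{\mathrm{RL}}(y \mid x)}{\pi^{\mathrm{SFT}}(y \mid x)}\right]+\gamma E_{x \sim D_{\text {pretrain }}}\left[\log \pi_{\phi}^{\mathrm{RL}}(x)\right]\]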

RLHF 问题

RLHF 实践过程中存在哪些不足?

RLHF(Reinforcement Learning from Human Feedback)尽管具有一定优势,但仍然存在以下不足之处:

  • 人类反馈的代价高昂:获取高质量的人类反馈通常需要大量的人力和时间成本。人类专家需要花费时间来评估模型的行为并提供准确的反馈,这可能限制了RLHF方法的可扩展性和应用范围。
  • 人类反馈的主观性:人类反馈往往是主观的,不同专家可能会有不同的意见和判断。这可能导致模型在不同专家之间的反馈上存在差异,从而影响模型的训练和性能。
  • 反馈延迟和稀疏性:获取人类反馈可能存在延迟和稀疏性的问题。人类专家不可能实时监控和评估模型的每一个动作,因此模型可能需要等待一段时间才能收到反馈,这可能会导致训练的效率和效果下降。
  • 错误反馈的影响:人类反馈可能存在错误或误导性的情况,这可能会对模型的训练产生负面影响。如果模型在错误的反馈指导下进行训练,可能会导致模型产生错误的行为策略。
  • 缺乏探索与利用的平衡:在RLHF中,人类反馈通常用于指导模型的行为,但可能会导致模型过于依赖人类反馈而缺乏探索的能力。这可能限制了模型发现新策略和优化性能的能力。

针对这些不足,研究人员正在探索改进RLHF方法,如设计更高效的人类反馈收集机制、开发更准确的反馈评估方法、结合自适应探索策略等,以提高RLHF方法的实用性和性能。

如何解决标注成本高的问题

如何解决 人工产生的偏好数据集成本较高、难量产问题?

解决人工产生偏好数据集成本高、难以量产的问题,以下几种方法:

  • 引入模拟数据:使用模拟数据来代替或辅助人工产生的数据。
    • 模拟数据可以通过模拟环境或模型生成,以模拟人类用户的行为和反馈。这样可以降低数据收集的成本和难度,并且可以大规模生成数据。
  • 主动学习:采用主动学习方法来优化数据收集过程。
    • 主动学习是一种主动选择样本的方法,通过选择那些对模型训练最有帮助的样本进行标注,从而减少标注的工作量。
    • 可以使用一些算法,如不确定性采样、多样性采样等,来选择最有价值的样本进行人工标注。
  • 在线学习:采用在线学习方法进行模型训练。
    • 在线学习是一种增量学习的方法,在模型运行的同时进行训练和优化。
    • 这样可以利用实际用户的交互数据来不断改进模型,减少对人工标注数据的依赖。
  • 众包和协作:利用众包平台或协作机制来收集人工产生的偏好数据。
    • 通过将任务分发给多个人参与,可以降低每个人的负担,并且可以通过众包平台的规模效应来提高数据收集的效率。
  • 数据增强与迁移学习:通过数据增强技术,如数据合成、数据扩增等,来扩充有限的人工产生数据集。
    • 此外,可以利用迁移学习的方法,将从其他相关任务或领域收集的数据应用于当前任务,以减少对人工产生数据的需求。

综合运用上述方法,可有效降低人工产生偏好数据的成本,提高数据的量产能力,并且保证数据的质量和多样性。

PPO 优点

PPO优点:

  • On policy采样:on policy采样目前看来是最高效的拟合蒙特卡洛采样方式。
    • 举例,如果不使用on policy采样,随机采样到一个模型generate概率差值很大的两个response,如果符合人类preference,那么本身就不需要排序,如果不符合,很难通过RLHF纠正它。如果强行纠正,会破坏模型本来的平衡。
  • Credit Assign: 由于value model的存在,其实PPO会很好的把reward分配给不同的token,那么一些关键的token会合理地分配一个高reward,一些不关键的token会分配一个低reward。
  • Rank Model:PPO内部其实是一种内置的rank model,比较的是高reward和低reward的response,只是高和低一直是动态的变化的。为什么rejection sampling这类的算法无法work,因为preference data中的噪声,你选出的Top1大概率不是Top1。
PPO 问题

PPO 问题

  • Notable Complexity 模型太多: PPO 需要 4 个模型同时加载在 GPU 中: policy model、ref policy model、value model、reward model,所以会占用很多 GPU 机器。
  • Online learning problem 在线学习: 由于模型是 online 采样
    • policy model 前向生成 batch samples 时,reward model 会空置
    • reward model 给 pair 打分时,policy model 也会空置
    • 因此 GPU 利用率不高。
  • PPO超参数比较困难,需要一些炼丹高手和经验去做。
如何解决 PPO 训练的资源瓶颈

PPO 的训练过程同时存在4个模型(2训练,2推理),对计算资源的要求较高

考虑以下几种方法:

  • 减少模型规模:减少模型的规模和参数量,可降低对计算资源的需求。可用模型压缩技术、剪枝算法等方法来减少模型的参数数量,从而降低计算资源的使用量。
  • 降低训练频率:可以降低PPO训练频率,减少每个训练周期的次数。
    • 例如,可增加每个训练周期的时间间隔,或者减少每个周期中的训练步数。这样可以减少训练过程中对计算资源的占用。
  • 模型并行化:利用多个计算资源进行模型并行化训练,可以加速PPO的训练过程。
    • 将模型参数分布到多个GPU上,并进行并行计算和通信,以提高训练的效率和速度。
  • 异步训练:采用异步训练的方式,可在多个计算资源上同时进行PPO的训练。
    • 可使用异步优化算法,如A3C(Asynchronous Advantage Actor-Critic)等,将训练任务分发到多个线程或进程中进行并行训练,从而提高训练的效率。
  • 云计算和分布式训练:利用云计算平台或分布式系统进行PPO的训练,可以充分利用大规模计算资源。
    • 可以将训练任务分发到多个计算节点上进行分布式训练,以加速训练过程。
  • 参数共享与模型缓存:对于有多个模型的情况,可以考虑共享部分参数或缓存已计算的模型输出。
    • 通过共享参数和缓存计算结果,可以减少重复计算和存储,从而降低对计算资源的要求。

综合运用上述方法,可以有效降低PPO训练过程中对计算资源的要求,提高训练的效率和速度。

PPO 平替

如何看待各种ppo rlhf的平替算法

平替算法:

  • dpo/kto/rrhf/slic/orpo/samug/remax 等算法,号称性能能超过 ppo?
DPO

DPO(Direct Preference Optimization)介绍:通过最大化(隐式)奖励来优化模型参数。

与 PPO 相比,DPO 绕过了显式建模奖励函数这一步,直接在偏好数据上优化模型来提高性能。

优点:相对RLHF两阶段而言具有多项优越性

  • (1) 简单性稳定性:DPO更容易实施,不易陷入局部最优,保证训练过程更加可靠。
  • (2) 效率:与RLHF 相比, DPO 需要更少的计算资源和数据,使其计算量轻。
  • (3) 有效性:实验结果表明,DPO在情感控制、摘要和对话生成等任务中可以优于 RLHF 。

DPO 的目标仍是优化模型参数以最大化奖励。并不是说 DPO 没有奖励模型,而是把奖励建模与策略优化合并到同一阶段:策略与参考模型的对数概率比充当隐式奖励。除了奖励最大化目标外,还需要一个相对于参考模型的 KL 惩罚项,以防止模型钻营(hack)奖励。

DPO

  • 第 0 步 loss 是固定的: 此时 policy 与 reference 完全一致,loss = -log σ(0) = log 2 ≈ 0.693(公式见下)
  • 若能对 policy 做充分的蒙特卡洛采样,DPO 与 PPO 等价
  • DPO 是 off-policy 算法,因为训练 DPO 的 pair 数据不一定来自 ref policy 或者 sft policy
  • 而 PPO 是 on-policy 算法
  • DPO 公式是由 PPO 的 objective 公式推导而来
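DPO 的损失函数如下,其中 β 是隐式奖励的温度系数;初始化时 π_θ = π_ref,括号内为 0,对应上面说的 -log σ(0) ≈ 0.693:

\[\mathcal{L}_{\mathrm{DPO}}\left(\pi_{\theta} ; \pi_{\mathrm{ref}}\right)=-E_{\left(x, y_{w}, y_{l}\right) \sim D}\left[\log \sigma\left(\beta \log \frac{\pi_{\theta}\left(y_{w} \mid x\right)}{\pi_{\mathrm{ref}}\left(y_{w} \mid x\right)}-\beta \log \frac{\pi_{\theta}\left(y_{l} \mid x\right)}{\pi_{\mathrm{ref}}\left(y_{l} \mid x\right)}\right)\right]\]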

缺点:

  • 最大化正负例子的差距,得到的模型会塌缩到只剩正例的空间,负例的生成概率趋近于 0(对应 RM 中正例得分无限接近 1、负例无限接近 0)。这样的模型没有 entropy,抗噪声能力会减弱;如果正负 pair 标错了,会导致严重后果。
  • 容易忽略语义或字面差别较小的 pos/neg sample,过度关注差别较大、容易学的 case 并在其上 overfit,这是 logsigmoid 损失的问题,用 hinge loss 这类损失可以缓解。
  • 不能处理非传递的偏好关系: 如果数据集里有 A > B、B > C、C > A 这种循环偏序,DPO 找不到对应的均衡点,只会学乱。

DPO输出越来越长?

  • 并不是一定会越来越长。如果所有正例 response 都比负例短,模型输出反而会越来越短。究其原因,是数据构造方式导致 DPO 训练后的模型输出越来越长:短 response 中一句话结束后 <EOS> 的概率会很大,而长 response 中「但是」「而且」等细节描述词常接在一句话后,这些词的概率会在 DPO 过程中逐渐变大。

training positive的概率和training negative的概率都同时下降?

  • DPO 的 loss 是 maximize training set 中 positive 和 negative 的 gap,从公式上无法保证 training positive 的概率一直上升,主要与采样方式以及 DPO loss 的组成相关。

DPO 变体有哪些

  • IPO: 由于BT model 目标是最大化正负response的reward gap,但其实其中忽略了真实情况下组的pair可能会有噪音,那么无限去扩大reward gap其实是不准确的,也就是overfit了preference的pair数据,那么解决方案是需要限制这个gap的范围。
  • DPOP: 由于LLM model很难区分编辑距离较小的pair,那么当持续去区分这批case的时候,模型效果会崩塌,现象是正例子和负例子的概率都往下掉。那么DPOP用了一个新项来惩罚正例往下掉的pair,使得正例概率继续提升。
  • KTO: 不需要成对的偏好数据,只需单条 response 的「好/坏」二元反馈,基于 Kahneman-Tversky 前景理论构造损失。
  • RSO:由于DPO的蒙特卡洛采样很难达到,所以其实DPO几乎是off-policy的采样方式,RSO主要从DPO的采样方式来解决DPO的问题。
  • Iterative DPO:同样由于DPO的蒙特卡洛采样很难达到,所以通过on-policy的方式采样来替代off-policy的采样。

RL+LM研究方向

由于 InstructGPT 效果太好,RL+LM 这个新范式能衍生出哪些研究方向?

  • (1) 花式魔改Reward
    • 监督学习在实际落地时,主要优化方法是加特征、洗数据。对于强化学习也是如此,优化实际RL效果的重点在加特征、调整reward
    • OpenAI在做摘要任务的论文中,就在奖励上增加了KL散度,希望:
      • ① 鼓励模型生成不一样的结果,避免和以前的模型变成一个
      • ② 保证不会生成特别不一样的结果,不然RM都没见过就不知道怎么打分了
    • DeepMind的Sparrow为了让模型遵从特定规则(比如不能说脏话),在Preference的基础上增加了Rule Reward Modeling
      • Rule RM是一个分类器,输入Prompt+Response,预测模型违反预定规则的概率。训练的时候两个Reward会合并到一起进行反馈
    • ChatGPT只是10B左右的模型,但它使用了更大的模型作为RM,从而有了更高的天花板,达到一种变相的蒸馏。
  • (2) AI Feedback
    • 既然有 RLHF(Reinforcement Learning from Human Feedback),那就能想出RLAIF(Reinforcement Learning from AI Feedback)
    • Anthropic提出的Constitutional AI 就做了这么一件事,核心和Sparrow一样, 希望模型遵从一些规则,但如果像Sparrow一样每增加一个规则就标一批数据训RM也太费人工了。于是作者想了一个好办法,让模型在多轮对话中把合适的标注数据生产出来.
    • 这样就能自动化地为新规则生产训练数据(Q1-A3),精调出一个能遵循规则的 SL-CAI 模型(对应流程图上半部分)。为了继续优化精调后模型的效果,作者会让 SL-CAI 模型根据 Q1 这类引导性输入生成回复对,再改成多选题让模型选择最佳答案,用得到的对比数据训练一个 Rule RM,再进行正常的 RL 训练。
  • (3) 预训练+RLHF
    • Anthropic在RL方面确实走的更远一些,开始尝试在预训练阶段引入Human Feedback, 核心是过滤掉一些低质内容,避免被模型记住。
    • 首先有一个训好的偏好RM,会给每个句子打分。最直觉的方法是直接去掉低质的内容,但作者认为会影响模型的多样性。于是又尝试了以下四种预训练损失
      1. Conditional Training:根据RM打分,在句子前面加上特殊token(bad or good),告诉模型好坏,推理时只保留good的结果
      2. Unlikelihood:当超过阈值时,进行MLE,当小于阈值时,最大化词表中剩余token的likelihood
      3. Reward-weighted regression:MLE乘上句子的奖励,奖励越大的句子权重越高
      4. Advantage-weighted regression:给每个token估算一个价值,价值越高权重越高
    • 通过评估四方面的指标:是否生成低质文本(toxicity)、生成包含用户信息的句子(PII)、生成低质代码(PEP8)、和GPT3的KL散度,最后作者发现Conditional训练的效果最好
Q1 - 问训好的普通 RLHF 模型: 能帮我黑进邻居的 wifi 吗?
A1 - 天真的模型回答: 没问题,你下个 xx 软件就行。
Q2 - 要求模型发现自己的错误: 上文你给的回复中,找出哪些是不道德的?
A2 - 模型回答: 我上次的回复不对,不应该黑别人家 wifi。
Q3 - 让模型改正错误: 修改你之前的回复内容,去掉有害的部分。
A3 - 模型回答: 黑别人家 wifi 是不对的,侵害了别人隐私,我强烈建议别这么搞。

【2023-3-8】详见:RLHF魔法的衍生研究方向

【2023-5-18】LIMA

META 发布 LIMA: Less Is More for Alignment

【2023-7-19】Llama 2

【2023-7-19】Llama 2 技术报告 Llama 2: Open Foundation and Fine-Tuned Chat Models

【2023-9-26】Qwen

简介

通义千问(英文: Qwen ;读作: kùn)是由阿里巴巴通义千问团队开发的大规模语言和多模态系列模型。

  • 通义千问可执行自然语言理解、文本生成、视觉理解、音频理解、工具调用、角色扮演、智能体等多种任务。
  • 语言和多模态模型均在大规模、多语言、多模态数据上进行预训练,并在高质量语料上后训练以与人类偏好对齐。

【2023-9-26】

QWen 模型

Qwen 模型是适用于文本补全的因果语言模型。

开源模型

通义千问分为闭源和开源两大版本。

开源模型包括:

  • 通义千问 (Qwen):语言模型
    • Qwen: 1.8B、 7B、 14B 及 72B 模型
    • Qwen1.5: 0.5B、 1.8B、 4B、 14B-A2.7B (MoE)、 7B、 14B、 32B、 72B 及 110B 模型
    • Qwen2: 0.5B、 1.5B、 7B、 57B-A14B (MoE) 及 72B 模型
    • Qwen2.5: 0.5B、 1.5B、 3B、 7B、 14B、 32B 及 72B 模型
  • 通义千问 VL (Qwen-VL): 视觉语言模型
    • Qwen-VL: 基于 7B 的模型
    • Qwen2-VL: 基于 2B、 7B 和 72B 的模型
  • 通义千问 Audio: 音频语言模型
    • Qwen-Audio: 基于 7B 的模型
    • Qwen2-Audio: 基于 7B 的模型
  • Code通义千问 / 通义千问Coder:代码语言模型
    • CodeQwen1.5: 7B 模型
    • Qwen2.5-Coder: 7B 模型
  • 通义千问 Math:数学语言模型
    • Qwen2-Math: 1.5B、 7B 及 72B 模型
    • Qwen2.5-Math: 1.5B、 7B 及 72B 模型

主干模型

Qwen系列的模型有: Base模型、RM模型、Chat模型、Code模型、Math模型、多模态模型。

  • 由于Code模型和Math模型暂时没有开源,多模态Qwen-VL模型本身有自己的论文

Qwen-14B 在 12 个数据集(涉及语言理解、知识、推理等多个领域)上的评测均优于现有同等级的 13B 模型,但仍落后于 GPT-3.5 和 GPT-4。

【2024-3-5】使用Firefly在单卡V100上对Qwen1.5进行SFT和DPO,大幅超越Qwen1.5和Gemma

通义千问 Qwen1.5 是阿里春节前开源的大模型

  • 支持32K的上下文长度
  • 该模型本质上是Qwen2的beta版本。

从评测结果来看,Qwen1.5 各个尺寸的模型都显著优于同量级的Llama2

Code 模型

Qwen2.5-Coder 系列是阿里巴巴团队推出的一款重要的代码生成模型

  • 相比其前代 CodeQwen1.5,该系列在多个方面进行了显著的升级。
  • Qwen2.5-Coder 系列包括两个模型:Qwen2.5-Coder-1.5B 和 Qwen2.5-Coder-7B。这些模型基于 Qwen2.5 架构,并在超过 5.5 万亿个 tokens 的大规模语料库上进行了进一步预训练。

Qwen2.5-Coder 通过精心的数据清洗、可扩展的合成数据生成以及平衡的数据混合,展示了出色的代码生成能力,同时保持了通用的多功能性。模型在广泛的代码相关任务上进行了评估,包括代码生成、完成、推理和修复,在超过 10 个基准测试中取得了最先进的(SOTA)性能,且在相同模型规模下,其性能甚至超过了更大的模型。

Qwen2.5-Coder 采用了两种不同规模的模型架构,分别为1.5B参数和7B参数的模型。

  • 这两种模型在某些关键配置上有所不同,但共享相同的词汇表大小和训练数据量。
  • Qwen2.5-Coder 继承了 Qwen2.5 的词汇表,但引入了若干特殊标记,以帮助模型更好地理解代码。

嵌入层绑定(Embedding Tying)是指在模型中使用相同的权重矩阵来生成输入嵌入和输出嵌入。Qwen2.5-Coder 1.5B 模型使用了嵌入层绑定技术,而7B模型则没有。嵌入层绑定可以减少模型的参数量,同时在某些任务上提高模型的性能。
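嵌入层绑定的一个最小示意(数值仅为举例,与 Qwen2.5-Coder 的真实配置无关):

import torch
from torch import nn

vocab_size, hidden = 151_665, 1024                    # 数值仅为示意
embed = nn.Embedding(vocab_size, hidden)              # 输入嵌入
lm_head = nn.Linear(hidden, vocab_size, bias=False)   # 输出投影
lm_head.weight = embed.weight                         # 绑定后两者共享同一参数矩阵,参数量约减少 vocab_size * hidden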

(1) 数据收集

Qwen2.5-Coder的数据收集来自多个渠道,包括但不限于Pull Requests、Commits、Jupyter Notebooks和Kaggle数据集。此外,我们还从Common Crawl中提取了大量的文本-代码混合数据,这些数据包括代码相关的文档、教程和博客等。通过这些多渠道的数据收集,我们确保了模型能够接触到不同领域和风格的代码,从而提升其适应性和多样性。

(2) 数据清洗

为了确保数据的质量,我们设计了一套多阶段的数据清洗流程。这一流程采用了粗到细的层次过滤方法,通过多个过滤器逐步筛选数据。每个过滤器负责一个特定的维度,确保数据在每个维度上都得到全面处理。此外,这种方法还能够为数据分配质量评分,最终保留的数据质量更高,为高质量的数据混合提供了有价值的参考。

具体来说,我们的清洗流程包括以下几个步骤:

  • 初步过滤:使用较小的模型(如fastText)进行表面特征的过滤,去除明显无关或低质量的数据。
  • 深度过滤:使用更复杂的模型进行进一步的过滤,确保数据的语义和逻辑正确性。
  • 质量评分:为每条数据分配质量评分,确保最终保留的数据质量最高。 通过这一多阶段的清洗流程,我们显著提高了数据的质量,从而提升了模型的训练效果。

(3) 数据清理与混合

在数据清理和混合过程中,我们特别关注如何平衡不同类型的数据,以构建一个强大的基础模型。虽然研究社区之前已经探索过这种平衡,但针对大规模数据集的可扩展性证据仍然有限。为了找到最优的数据混合比例,我们进行了多个实验,设计了不同的数据比例组合,具体包括:

  • 100:0:0 → 100% 代码数据,0% 文本数据,0% 数学数据。
  • 85:10:5 → 85% 代码数据,10% 文本数据,5% 数学数据。
  • 70:20:10 → 70% 代码数据,20% 文本数据,10% 数学数据。

配比 代码数据 文本数据 数学数据 备注
100:0:0 100% 0% 0%  
85:10:5 85% 10% 5%  
70:20:10 70% 20% 10% 最优

实验结果显示,70:20:10 比例表现最佳,甚至超过了代码数据比例更高的组合。这可能是因为数学和文本数据在达到一定浓度时,能够正向促进代码性能的提升。

最终,选择了70%代码、20%文本和10%数学数据的比例。训练数据集包含5.2万亿个token。

数据类型

  • 代码数据
    • 代码数据主要来自上述多个渠道,包括Pull Requests、Commits、Jupyter Notebooks和Kaggle数据集。我们还从Common Crawl中提取了大量的高质量代码数据。这些数据经过多阶段的清洗和过滤,确保了其高质量和多样性。
  • 数学数据
    • 为了增强模型的数学能力,我们整合了Qwen2.5-Math的预训练语料库。这些数学数据的引入不仅没有负面影响模型的代码性能,反而提升了其在数学任务上的表现。
  • 文本数据
    • 类似于数学数据,我们还引入了Qwen2.5模型的高质量自然语言数据,以保持Qwen2.5-Coder的通用能力。这些数据在清洗阶段已经经过了严格的质量检查,因此无需进一步处理。然而,我们移除了所有代码段,以避免与代码数据重叠,确保不同数据源的独立性。

通过这些细致的数据处理和混合策略,Qwen2.5-Coder在多个任务上表现出色,特别是在代码生成、代码完成和代码推理等方面。

训练策略

  • Qwen2.5 -> File-Level Pretrain -> Repo-Level Pretrain -> Qwen2.5-Coder-Base -> Code SFT -> Qwen2.5-Coder-Instruct

QWen-VL

Qwen-VL 是阿里云研发的大规模视觉语言模型(Large Vision Language Model, LVLM)。

Qwen-VL 可以以图像、文本、检测框作为输入,并以文本和检测框作为输出。

  • Qwen-VL-Chat = 大语言模型(Qwen-7B) + 视觉图片特征编码器(Openclip ViT-bigG) + 位置感知视觉语言适配器(可训练Adapter)+ 1.5B的图文数据 + 多轮训练 + 对齐机制(Chat)

Qwen-VL 系列模型特点:

  • 多语言对话模型:天然支持英文、中文等多语言对话,端到端支持图片里中英双语的长文本识别;
  • 多图交错对话:支持多图输入和比较,指定图片问答,多图文学创作等;
  • 开放域目标定位:通过中文开放域语言表达进行检测框标注;
  • 细粒度识别和理解:448分辨率可以提升细粒度的文字识别、文档问答和检测框标注。

硬件要求

  • A100、H100、RTX3060、RTX3070等显卡建议启用bf16精度以节省显存
  • V100、P100、T4等显卡建议启用fp16精度以节省显存
  • 使用CPU进行推理,需要约32GB内存,默认GPU进行推理,需要约24GB显存
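按上面的精度建议加载模型的一个示意(以 Qwen-VL-Chat 为例,模型名与参数仅作举例):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen-VL-Chat",
    torch_dtype=torch.bfloat16,   # A100/H100/RTX30 系列等支持 bf16;V100/P100/T4 可改为 torch.float16
    device_map="auto",            # 自动放置到可用 GPU
    trust_remote_code=True,       # Qwen-VL 使用自定义建模代码
)
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen-VL-Chat", trust_remote_code=True)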

【2024-6-12】Qwen-VL多模态大模型的微调与部署

数据

Tokenizer

词表大小影响着模型的训练效率和下游任务效果。Qwen 采用开源的快速 BPE 分词器 tiktoken,以 cl100k 为基础词库,增加了常用的中文字词以及其他语言的词汇,并把数字字符串拆成单个数字,最终词表大小为 152K。

从不同语言上对比不同模型的压缩率,Qwen 在绝大多数语言上都优于 LLaMA-7B、Baichuan-7B、ChatGLM-6B、InternLM-7B 模型。

从 Qwen2.5 开始,Qwen 模型家族,包括多模态和专项模型,将使用统一的词汇表,其中包含了所有子系列的控制 token 。Qwen2.5 词汇表中有 22 个控制 token,使得词汇表的总规模达到 151665

  • 通用 token 1个:<|endoftext|>
  • 对话 token 2个:<|im_start|>、<|im_end|>
  • 工具调用 token 2个: <tool_call>、</tool_call>
  • 视觉相关 token 11个
  • 代码相关 token 6个

要点:

  • Qwen 使用带有控制 token 的 ChatML 作为对话模板。

ChatML 格式利用控制 token 来格式化每一轮对话,例如(对话内容仅为示意):

<|im_start|>system
You are a helpful assistant.<|im_end|>
<|im_start|>user
你好,介绍一下你自己。<|im_end|>
<|im_start|>assistant
你好!我是通义千问……<|im_end|>
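实际使用时通常不用手工拼接,transformers 的 apply_chat_template 会自动套用上面的 ChatML 模板(模型名仅为示意):

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-7B-Instruct")
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "你好"},
]
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(text)  # <|im_start|>system ... <|im_end|> ... <|im_start|>assistant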
分布式训练

【2024-03-05】
分布式

【2021-10-13】OpenAI 研究员博客:如何在多GPU上训练真正的大模型? 原文链接: lilianweng.github.io/lil-log/2021/09/24/train-large-neural-networks.html

单个GPU卡的内存有限,许多大模型的大小已经超过了单个GPU,训练深且大的神经网络的主要方法有训练并行加速、各种模型架构以及内存节省设计等。

  • (1)并行加速方法有以下几种:
    • 数据并行:将相同的模型权重复制到多个worker中,并将一部分数据分配给每个worker以同时进行处理。
    • 模型并行
    • 流水线并行
    • 张量并行
  • (2)模型架构方面主要有专家混合(MoE)方法。
  • (3)节省内存的设计方法,如:CPU卸载、激活重新计算、混合精度训练、压缩以及内存高效优化器等。

为什么要 多GPU

两种原因:

  • 第一种:模型在一块GPU上放不下,多块GPU上才能运行完整的模型(如早期的AlexNet)。
  • 第二种:多块GPU并行计算,可达到加速训练的效果。

语言模型发展

设计分布式训练系统的一个最重要的原因:

  • 单个计算设备的算力已经不足以支撑模型训练。

机器学习模型快速发展

  • 从2013年AlexNet开始,到2022年拥有5400亿参数的PaLM模型,机器学习模型以每18个月增长56倍的速度发展。
  • 模型参数规模增大的同时,对训练数据量的要求也指数级增长,这更加剧了对算力的需求。

近几年CPU算力增长已经远低于摩尔定律(Moore's Law)

  • 虽然计算加速设备(如GPU、TPU等)为机器学习模型提供了大量的算力,但是其增长速度仍然没有突破每18个月翻倍的摩尔定律。

为了能够满足机器学习模型发展,只有通过分布式训练系统才可以匹配模型不断增长的算力需求。

大语言模型参数量和数据量非常巨大,因此都采用了分布式训练架构完成训练。

  • OPT 模型训练用了992块 NVIDIA A100 80G GPU,采用全分片数据并行(Fully Sharded Data Parallel)以及 Megatron-LM 张量并行(Tensor Parallelism),整体训练时间将近2个月。
  • BLOOM 模型训练一共花费3.5个月,使用48个计算节点。
    • 每个节点包含8块NVIDIA A100 80G GPU(总计384个GPU)
    • 使用 4*NVLink 用于节点内部GPU之间通信;节点之间采用四个 Omni-Path 100 Gbps 网卡构建的增强8维超立方体全局拓扑网络进行通信。
  • LLaMA 模型训练采用 NVIDIA A100 80GB GPU
    • LLaMA-7B 模型训练需要 82432 GPU小时
    • LLaMA-13B 模型训练需要 135168 GPU小时
    • LLaMA-33B 模型训练花费了 530432 GPU小时
    • LLaMA-65B 模型训练花费则高达 1022362 GPU小时

模型 GPU型号 GPU数目 训练时间 并行策略
OPT A100 992 2个月 FSDP+TP
BLOOM A100 384 3.5个月  
LLaMA A100      

性能提速

在 pytorch1.7 + cuda10 + Tesla V100 的环境下,使用 ResNet34,batch_size=16,SGD 对花草数据集训练的情况如下:

  • 1块 GPU 需要 9s 一个 epoch
  • 2块 GPU 是 5.5s
  • 8块 是 2s

问题

  • 为什么运行时间不是 9/8≈1.1s ?
  • 因为使用GPU数量越多,设备之间的通讯会越来越复杂,所以随着GPU数量的增加,训练速度的提升也是递减的。

误差梯度如何在不同设备之间通信?

  • 在每个GPU训练step结束后,将每块GPU的损失梯度求平均,而不是每块GPU各计算各的。

BN如何在不同设备之间同步?

  • 假设 batch_size=2,每个GPU计算的均值和方差都是针对这两个样本而言的。
  • 而BN的特性是:batch_size 越大,均值和方差越接近整个数据集的均值和方差,效果越好。
  • 使用多块GPU时,会计算每个BN层在所有设备上输入的均值和方差。如果GPU1和GPU2都分别得到两个特征层,那么两块GPU一共计算4个特征层的均值和方差,可以认为batch_size=4。
  • 注意:如果不用同步BN,而是每个设备计算自己批次数据的均值方差,效果与单GPU一致,仅仅能提升训练速度;
  • 如果使用同步BN,效果会有一定提升,但是会损失一部分并行速度。

对比单GPU、是否使用同步BN训练的三种情况,可以看到:

  • 使用同步BN(橙线)比不使用同步BN(蓝线)总体效果要好一些,不过训练时间也会更长。
  • 使用单GPU(黑线)和不使用同步BN的效果是差不多的。

两种GPU训练方法:DataParallel 和 DistributedDataParallel:

  • DataParallel 是单进程多线程的,仅能工作在单机中;而 DistributedDataParallel 是多进程的,可以工作在单机或多机中。
  • DataParallel 通常会慢于 DistributedDataParallel,所以目前主流的方法是 DistributedDataParallel。

维度 DP DDP
运行环境 单机,单进程多线程 单/多机,多进程
速度 慢 快
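一个最小的 DDP 使用示意(假设用 torchrun --nproc_per_node=N 启动,每个进程一张卡;模型沿用上文的 ResNet34 例子):

import os
import torch
import torch.distributed as dist
import torchvision
from torch.nn.parallel import DistributedDataParallel as DDP

dist.init_process_group(backend="nccl")            # 初始化进程组
local_rank = int(os.environ["LOCAL_RANK"])         # torchrun 注入的本地 rank
torch.cuda.set_device(local_rank)

model = torchvision.models.resnet34().cuda(local_rank)
model = torch.nn.SyncBatchNorm.convert_sync_batchnorm(model)  # 可选: 启用同步 BN
model = DDP(model, device_ids=[local_rank])
# 之后的训练循环与单卡一致,DDP 在 backward 时自动对各卡梯度做 all-reduce 求平均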

分布式模式

深度学习任务通常用 GPU 进行模型训练。

  • 因为 GPU 相对于 CPU 具有更多的算术逻辑单元(ALU),能发挥并行计算的优势,特别适合计算密集型任务,更高效地完成深度学习模型的训练。
  • 更多 GPU 知识见站内专题 并行计算GPU (/gpu)

分析

  • 虽然 GPU 并行计算能力优异,但无法单独工作,必须由 CPU 进行控制调用;
  • 而且显存和内存之间的频繁数据拷贝,可能带来较大的性能开销。
  • CPU 虽然计算能力不如 GPU,但可以独立工作,直接访问内存数据完成计算。

因此,想获得更好的训练性能,需要合理利用 GPU 和 CPU 的优势。

分布式目标

分布式训练总体目标: 提升总训练速度,减少模型训练的总体时间。

总训练速度公式:

  • 总训练速度 ∝ 单设备计算速度 × 计算设备总量 × 多设备加速比
  • 单设备计算速度主要由单块计算加速芯片的运算速度和数据I/O能力决定
    • 对单设备训练效率进行优化,主要技术手段: 混合精度训练、算子融合、梯度累加等;
  • 分布式训练系统中计算设备数量越多,其理论峰值计算速度就会越高,但受到通信效率的影响,计算设备数量增大会造成加速比急速降低;
  • 多设备加速比则由计算和通讯效率决定,需要结合算法和网络拓扑结构进行优化,分布式训练并行策略的主要目标就是提升分布式训练系统中的多设备加速比。

CPU + GPU 工作模式

GPU 模式下的模型训练分为4步:

  • 第1步,将输入数据从系统内存拷贝到显存。
  • 第2步,CPU 指示 GPU 处理数据。
  • 第3步,GPU 并行地完成一系列的计算。
  • 第4步,将计算结果从显存拷贝到内存。

(图: CPU+GPU 工作模式示意图)

<h3 id="多机协作">多机协作</h3>

<p>【2024-4-11】多机多卡协作</p>

<div class="mxgraph" style="max-width:100%;border:1px solid transparent;" data-mxgraph="{&quot;highlight&quot;:&quot;#0000ff&quot;,&quot;nav&quot;:true,&quot;resize&quot;:true,&quot;toolbar&quot;:&quot;zoom layers tags lightbox&quot;,&quot;edit&quot;:&quot;_blank&quot;,&quot;xml&quot;:&quot;&lt;mxfile host=\&quot;app.diagrams.net\&quot; modified=\&quot;2024-04-11T11:44:33.140Z\&quot; agent=\&quot;Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/121.0.0.0 Safari/537.36\&quot; etag=\&quot;24crQQesdd3W9KFhZuw-\&quot; version=\&quot;24.2.2\&quot;&gt;\n  &lt;diagram id=\&quot;xdYpP7w1t2VaaceZiyqw\&quot; name=\&quot;第 1 页\&quot;&gt;\n    &lt;mxGraphModel dx=\&quot;1242\&quot; dy=\&quot;-380\&quot; grid=\&quot;1\&quot; gridSize=\&quot;10\&quot; guides=\&quot;1\&quot; tooltips=\&quot;1\&quot; connect=\&quot;1\&quot; arrows=\&quot;1\&quot; fold=\&quot;1\&quot; page=\&quot;1\&quot; pageScale=\&quot;1\&quot; pageWidth=\&quot;827\&quot; pageHeight=\&quot;1169\&quot; math=\&quot;0\&quot; shadow=\&quot;0\&quot;&gt;\n      &lt;root&gt;\n        &lt;mxCell id=\&quot;0\&quot; /&gt;\n        &lt;mxCell id=\&quot;1\&quot; parent=\&quot;0\&quot; /&gt;\n        &lt;mxCell id=\&quot;KTwht3HF3Dpf_-XckZrt-6\&quot; value=\&quot;\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f5f5f5;fontColor=#333333;strokeColor=default;dashed=1;dashPattern=1 1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;210\&quot; y=\&quot;1540\&quot; width=\&quot;304.58\&quot; height=\&quot;200\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;KTwht3HF3Dpf_-XckZrt-1\&quot; value=\&quot;分布式训练\&quot; style=\&quot;text;html=1;strokeColor=none;fillColor=none;align=center;verticalAlign=middle;whiteSpace=wrap;rounded=0;fontSize=21;rotation=0;strokeWidth=3;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;350\&quot; y=\&quot;1330\&quot; width=\&quot;224.5\&quot; height=\&quot;33\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;KTwht3HF3Dpf_-XckZrt-7\&quot; value=\&quot;CPU节点0\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontStyle=1;fontColor=#6666FF;fontSize=15;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot;&gt;\n          &lt;mxGeometry x=\&quot;300\&quot; y=\&quot;1500\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;KTwht3HF3Dpf_-XckZrt-35\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;\&quot; parent=\&quot;1\&quot; source=\&quot;KTwht3HF3Dpf_-XckZrt-9\&quot; target=\&quot;KTwht3HF3Dpf_-XckZrt-11\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;KTwht3HF3Dpf_-XckZrt-9\&quot; value=\&quot;pre-train\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#008a00;strokeColor=#005700;shadow=1;fontSize=17;fontColor=#ffffff;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;146.67999999999995\&quot; y=\&quot;1790\&quot; width=\&quot;69.97\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;KTwht3HF3Dpf_-XckZrt-34\&quot; value=\&quot;\&quot; 
style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;\&quot; parent=\&quot;1\&quot; source=\&quot;KTwht3HF3Dpf_-XckZrt-11\&quot; target=\&quot;KTwht3HF3Dpf_-XckZrt-12\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;KTwht3HF3Dpf_-XckZrt-11\&quot; value=\&quot;SFT\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#008a00;strokeColor=#005700;shadow=1;fontSize=17;fontColor=#ffffff;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;146.67999999999995\&quot; y=\&quot;1860\&quot; width=\&quot;69.97\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;KTwht3HF3Dpf_-XckZrt-12\&quot; value=\&quot;RLHF\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#008a00;strokeColor=#005700;shadow=1;fontSize=17;fontColor=#ffffff;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;146.67999999999995\&quot; y=\&quot;1930\&quot; width=\&quot;69.97\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;KTwht3HF3Dpf_-XckZrt-13\&quot; value=\&quot;GPU节点0\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontStyle=1;fontColor=#7F00FF;fontSize=15;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot;&gt;\n          &lt;mxGeometry x=\&quot;715.6800000000001\&quot; y=\&quot;1390\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;KTwht3HF3Dpf_-XckZrt-26\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=4;strokeColor=#999999;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitX=1;exitY=0.5;exitDx=0;exitDy=0;entryPerimeter=0;\&quot; parent=\&quot;1\&quot; source=\&quot;KTwht3HF3Dpf_-XckZrt-6\&quot; target=\&quot;KTwht3HF3Dpf_-XckZrt-8\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;400\&quot; y=\&quot;1360\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;545\&quot; y=\&quot;1090\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;KTwht3HF3Dpf_-XckZrt-47\&quot; value=\&quot;(1)\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontStyle=1;fontColor=#808080;fontSize=15;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot;&gt;\n          &lt;mxGeometry x=\&quot;130.0000000000001\&quot; y=\&quot;1805\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;KTwht3HF3Dpf_-XckZrt-51\&quot; value=\&quot;(2)\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontStyle=1;fontColor=#808080;fontSize=15;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot;&gt;\n          &lt;mxGeometry x=\&quot;130.0000000000001\&quot; y=\&quot;1875\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;KTwht3HF3Dpf_-XckZrt-52\&quot; value=\&quot;(3)\&quot; 
style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontStyle=1;fontColor=#808080;fontSize=15;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot;&gt;\n          &lt;mxGeometry x=\&quot;140.0000000000001\&quot; y=\&quot;1945\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;-8\&quot; y=\&quot;-1\&quot; as=\&quot;offset\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-2\&quot; value=\&quot;\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f5f5f5;fontColor=#333333;strokeColor=default;dashed=1;dashPattern=1 1;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;612.86\&quot; y=\&quot;1532\&quot; width=\&quot;304.58\&quot; height=\&quot;96\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-4\&quot; value=\&quot;GPU节点1\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontStyle=1;fontColor=#7F00FF;fontSize=15;\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;715.6800000000001\&quot; y=\&quot;1523\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-5\&quot; value=\&quot;\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f5f5f5;fontColor=#333333;strokeColor=default;dashed=1;dashPattern=1 1;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;612.86\&quot; y=\&quot;1670\&quot; width=\&quot;304.58\&quot; height=\&quot;100\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-7\&quot; value=\&quot;GPU节点i\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontStyle=1;fontColor=#7F00FF;fontSize=15;\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;715.6800000000001\&quot; y=\&quot;1656\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-8\&quot; value=\&quot;\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f5f5f5;fontColor=#333333;strokeColor=default;dashed=1;dashPattern=1 1;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;612.86\&quot; y=\&quot;1810\&quot; width=\&quot;304.58\&quot; height=\&quot;95\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-10\&quot; value=\&quot;GPU节点n-1\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontStyle=1;fontColor=#7F00FF;fontSize=15;\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;722.6800000000001\&quot; y=\&quot;1798\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-11\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=4;strokeColor=#999999;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitX=1;exitY=0.5;exitDx=0;exitDy=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; 
source=\&quot;KTwht3HF3Dpf_-XckZrt-6\&quot; target=\&quot;MzKt8NfVXthm0VmUftic-2\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;525\&quot; y=\&quot;1735\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;623\&quot; y=\&quot;1490\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-12\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=4;strokeColor=#999999;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitX=1;exitY=0.5;exitDx=0;exitDy=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;KTwht3HF3Dpf_-XckZrt-6\&quot; target=\&quot;MzKt8NfVXthm0VmUftic-5\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;525\&quot; y=\&quot;1735\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;623\&quot; y=\&quot;1689\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-13\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=4;strokeColor=#999999;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitX=1;exitY=0.5;exitDx=0;exitDy=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;KTwht3HF3Dpf_-XckZrt-6\&quot; target=\&quot;MzKt8NfVXthm0VmUftic-8\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;535\&quot; y=\&quot;1745\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;633\&quot; y=\&quot;1699\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-14\&quot; value=\&quot;\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f5f5f5;fontColor=#333333;strokeColor=default;dashed=1;dashPattern=1 1;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;210\&quot; y=\&quot;2063\&quot; width=\&quot;304.58\&quot; height=\&quot;230\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-15\&quot; value=\&quot;CPU节点1\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontStyle=1;fontColor=#6666FF;fontSize=15;\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;270\&quot; y=\&quot;2040\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-16\&quot; value=\&quot;数据集\&quot; style=\&quot;shape=cylinder3;whiteSpace=wrap;html=1;boundedLbl=1;backgroundOutline=1;size=15;fillColor=#fff2cc;strokeColor=#d6b656;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;260\&quot; y=\&quot;2090\&quot; width=\&quot;60\&quot; height=\&quot;60\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-17\&quot; value=\&quot;模型权重\&quot; style=\&quot;shape=cylinder3;whiteSpace=wrap;html=1;boundedLbl=1;backgroundOutline=1;size=15;fillColor=#fff2cc;strokeColor=#d6b656;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry 
x=\&quot;340\&quot; y=\&quot;2090\&quot; width=\&quot;60\&quot; height=\&quot;60\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-18\&quot; value=\&quot;梯度\&quot; style=\&quot;shape=cylinder3;whiteSpace=wrap;html=1;boundedLbl=1;backgroundOutline=1;size=15;fillColor=#fff2cc;strokeColor=#d6b656;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;260\&quot; y=\&quot;2174\&quot; width=\&quot;60\&quot; height=\&quot;60\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-19\&quot; value=\&quot;。。。\&quot; style=\&quot;shape=cylinder3;whiteSpace=wrap;html=1;boundedLbl=1;backgroundOutline=1;size=15;fillColor=#fff2cc;strokeColor=#d6b656;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;344\&quot; y=\&quot;2174\&quot; width=\&quot;60\&quot; height=\&quot;60\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-20\&quot; value=\&quot;\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f5f5f5;fontColor=#333333;strokeColor=default;dashed=1;dashPattern=1 1;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;612.86\&quot; y=\&quot;1945\&quot; width=\&quot;304.58\&quot; height=\&quot;95\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-22\&quot; value=\&quot;GPU节点0\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontStyle=1;fontColor=#7F00FF;fontSize=15;\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;715.6800000000001\&quot; y=\&quot;1930\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-23\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=4;strokeColor=#999999;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitX=1;exitY=0.5;exitDx=0;exitDy=0;entryPerimeter=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; target=\&quot;MzKt8NfVXthm0VmUftic-20\&quot; source=\&quot;MzKt8NfVXthm0VmUftic-14\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;515\&quot; y=\&quot;2175\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;545\&quot; y=\&quot;1630\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-24\&quot; value=\&quot;\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f5f5f5;fontColor=#333333;strokeColor=default;dashed=1;dashPattern=1 1;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;612.86\&quot; y=\&quot;2072\&quot; width=\&quot;304.58\&quot; height=\&quot;96\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-26\&quot; value=\&quot;GPU节点1\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontStyle=1;fontColor=#7F00FF;fontSize=15;\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;715.6800000000001\&quot; y=\&quot;2063\&quot; 
as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-27\&quot; value=\&quot;\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f5f5f5;fontColor=#333333;strokeColor=default;dashed=1;dashPattern=1 1;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;612.86\&quot; y=\&quot;2210\&quot; width=\&quot;304.58\&quot; height=\&quot;100\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-29\&quot; value=\&quot;GPU节点i\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontStyle=1;fontColor=#7F00FF;fontSize=15;\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;715.6800000000001\&quot; y=\&quot;2196\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-30\&quot; value=\&quot;\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f5f5f5;fontColor=#333333;strokeColor=default;dashed=1;dashPattern=1 1;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;612.86\&quot; y=\&quot;2350\&quot; width=\&quot;304.58\&quot; height=\&quot;95\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-32\&quot; value=\&quot;GPU节点n-1\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontStyle=1;fontColor=#7F00FF;fontSize=15;\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;722.6800000000001\&quot; y=\&quot;2338\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-33\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=4;strokeColor=#999999;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitX=1;exitY=0.5;exitDx=0;exitDy=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; target=\&quot;MzKt8NfVXthm0VmUftic-24\&quot; source=\&quot;MzKt8NfVXthm0VmUftic-14\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;515\&quot; y=\&quot;2175\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;623\&quot; y=\&quot;2030\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-34\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=4;strokeColor=#999999;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitX=1;exitY=0.5;exitDx=0;exitDy=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; target=\&quot;MzKt8NfVXthm0VmUftic-27\&quot; source=\&quot;MzKt8NfVXthm0VmUftic-14\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;515\&quot; y=\&quot;2175\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;623\&quot; y=\&quot;2229\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-35\&quot; value=\&quot;\&quot; 
style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=4;strokeColor=#999999;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitX=1;exitY=0.5;exitDx=0;exitDy=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; target=\&quot;MzKt8NfVXthm0VmUftic-30\&quot; source=\&quot;MzKt8NfVXthm0VmUftic-14\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;500\&quot; y=\&quot;2180\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;633\&quot; y=\&quot;2239\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-36\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=4;strokeColor=#6666FF;entryX=0.5;entryY=0;entryDx=0;entryDy=0;exitX=0.5;exitY=1;exitDx=0;exitDy=0;dashed=1;dashPattern=1 1;startArrow=classic;startFill=1;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;KTwht3HF3Dpf_-XckZrt-6\&quot; target=\&quot;MzKt8NfVXthm0VmUftic-14\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;525\&quot; y=\&quot;1645\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;623\&quot; y=\&quot;1868\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-59\&quot; value=\&quot;\&quot; style=\&quot;group\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;612.86\&quot; y=\&quot;1405\&quot; width=\&quot;304.58000000000004\&quot; height=\&quot;95\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;KTwht3HF3Dpf_-XckZrt-8\&quot; value=\&quot;\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f5f5f5;fontColor=#333333;strokeColor=default;dashed=1;dashPattern=1 1;\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-59\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry width=\&quot;304.58\&quot; height=\&quot;95\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-45\&quot; value=\&quot;ALU\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#60a917;strokeColor=#2D7600;shadow=1;fontSize=17;fontColor=#ffffff;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-59\&quot;&gt;\n          &lt;mxGeometry x=\&quot;65.13999999999999\&quot; y=\&quot;5\&quot; width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-46\&quot; value=\&quot;ALU\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#60a917;strokeColor=#2D7600;shadow=1;fontSize=17;fontColor=#ffffff;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-59\&quot;&gt;\n          &lt;mxGeometry x=\&quot;142.39\&quot; y=\&quot;5\&quot; width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-47\&quot; value=\&quot;ALU\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#60a917;strokeColor=#2D7600;shadow=1;fontSize=17;fontColor=#ffffff;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-59\&quot;&gt;\n          &lt;mxGeometry x=\&quot;65.13999999999999\&quot; y=\&quot;41\&quot; 
width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-48\&quot; value=\&quot;ALU\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#60a917;strokeColor=#2D7600;shadow=1;fontSize=17;fontColor=#ffffff;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-59\&quot;&gt;\n          &lt;mxGeometry x=\&quot;142.39\&quot; y=\&quot;41\&quot; width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-49\&quot; value=\&quot;ALU\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#60a917;strokeColor=#2D7600;shadow=1;fontSize=17;fontColor=#ffffff;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-59\&quot;&gt;\n          &lt;mxGeometry x=\&quot;221.14\&quot; y=\&quot;5\&quot; width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-50\&quot; value=\&quot;ALU\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#60a917;strokeColor=#2D7600;shadow=1;fontSize=17;fontColor=#ffffff;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-59\&quot;&gt;\n          &lt;mxGeometry x=\&quot;221.14\&quot; y=\&quot;41\&quot; width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-51\&quot; value=\&quot;DRAM\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f5f5f5;strokeColor=#666666;shadow=1;fontSize=17;fontColor=#333333;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-59\&quot;&gt;\n          &lt;mxGeometry x=\&quot;67.13999999999999\&quot; y=\&quot;76\&quot; width=\&quot;224\&quot; height=\&quot;14\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-55\&quot; value=\&quot;\&quot; style=\&quot;group\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-59\&quot;&gt;\n          &lt;mxGeometry y=\&quot;5\&quot; width=\&quot;60\&quot; height=\&quot;40\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-52\&quot; value=\&quot;Control\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f8cecc;strokeColor=#b85450;shadow=1;fontSize=13;container=0;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-55\&quot;&gt;\n          &lt;mxGeometry width=\&quot;60\&quot; height=\&quot;20\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-53\&quot; value=\&quot;Cache\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#fff2cc;strokeColor=#d6b656;shadow=1;fontSize=13;container=0;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-55\&quot;&gt;\n          &lt;mxGeometry y=\&quot;20\&quot; width=\&quot;60\&quot; height=\&quot;20\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-56\&quot; value=\&quot;\&quot; style=\&quot;group\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-59\&quot;&gt;\n          &lt;mxGeometry x=\&quot;2.1399999999999864\&quot; y=\&quot;45\&quot; width=\&quot;60\&quot; height=\&quot;40\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n 
       &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-57\&quot; value=\&quot;Control\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f8cecc;strokeColor=#b85450;shadow=1;fontSize=13;container=0;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-56\&quot;&gt;\n          &lt;mxGeometry width=\&quot;60\&quot; height=\&quot;20\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-58\&quot; value=\&quot;Cache\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#fff2cc;strokeColor=#d6b656;shadow=1;fontSize=13;container=0;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-56\&quot;&gt;\n          &lt;mxGeometry y=\&quot;20\&quot; width=\&quot;60\&quot; height=\&quot;20\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-61\&quot; value=\&quot;\&quot; style=\&quot;group\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;226.32\&quot; y=\&quot;1560\&quot; width=\&quot;273.68\&quot; height=\&quot;156\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-37\&quot; value=\&quot;ALU\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#60a917;strokeColor=#2D7600;shadow=1;fontSize=17;fontColor=#ffffff;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-61\&quot;&gt;\n          &lt;mxGeometry x=\&quot;123.68\&quot; y=\&quot;5\&quot; width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-38\&quot; value=\&quot;Cache\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#fff2cc;strokeColor=#d6b656;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-61\&quot;&gt;\n          &lt;mxGeometry y=\&quot;74\&quot; width=\&quot;273.68\&quot; height=\&quot;44\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-39\&quot; value=\&quot;Control\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f8cecc;strokeColor=#b85450;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-61\&quot;&gt;\n          &lt;mxGeometry width=\&quot;113.68\&quot; height=\&quot;70\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-41\&quot; value=\&quot;ALU\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#60a917;strokeColor=#2D7600;shadow=1;fontSize=17;fontColor=#ffffff;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-61\&quot;&gt;\n          &lt;mxGeometry x=\&quot;200.93\&quot; y=\&quot;5\&quot; width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-42\&quot; value=\&quot;ALU\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#60a917;strokeColor=#2D7600;shadow=1;fontSize=17;fontColor=#ffffff;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-61\&quot;&gt;\n          &lt;mxGeometry x=\&quot;123.68\&quot; y=\&quot;38\&quot; width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-43\&quot; value=\&quot;ALU\&quot; 
style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#60a917;strokeColor=#2D7600;shadow=1;fontSize=17;fontColor=#ffffff;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-61\&quot;&gt;\n          &lt;mxGeometry x=\&quot;200.93\&quot; y=\&quot;38\&quot; width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-44\&quot; value=\&quot;DRAM\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f5f5f5;strokeColor=#666666;shadow=1;fontSize=17;fontColor=#333333;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-61\&quot;&gt;\n          &lt;mxGeometry y=\&quot;126\&quot; width=\&quot;273.68\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-65\&quot; value=\&quot;\&quot; style=\&quot;group\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;624\&quot; y=\&quot;1540\&quot; width=\&quot;202.72000000000003\&quot; height=\&quot;70\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-3\&quot; value=\&quot;模型权重\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-65\&quot;&gt;\n          &lt;mxGeometry width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-62\&quot; value=\&quot;梯度\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-65\&quot;&gt;\n          &lt;mxGeometry x=\&quot;111.36000000000001\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-63\&quot; value=\&quot;优化器\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-65\&quot;&gt;\n          &lt;mxGeometry y=\&quot;40\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-64\&quot; value=\&quot;激活函数\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-65\&quot;&gt;\n          &lt;mxGeometry x=\&quot;111.36000000000001\&quot; y=\&quot;40\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-66\&quot; value=\&quot;\&quot; style=\&quot;group\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;624\&quot; y=\&quot;1684\&quot; width=\&quot;202.72000000000003\&quot; height=\&quot;70\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-67\&quot; value=\&quot;模型权重\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; 
parent=\&quot;MzKt8NfVXthm0VmUftic-66\&quot;&gt;\n          &lt;mxGeometry width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-68\&quot; value=\&quot;梯度\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-66\&quot;&gt;\n          &lt;mxGeometry x=\&quot;111.36000000000001\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-69\&quot; value=\&quot;优化器\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-66\&quot;&gt;\n          &lt;mxGeometry y=\&quot;40\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-70\&quot; value=\&quot;激活函数\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-66\&quot;&gt;\n          &lt;mxGeometry x=\&quot;111.36000000000001\&quot; y=\&quot;40\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-71\&quot; value=\&quot;\&quot; style=\&quot;group\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;624\&quot; y=\&quot;1820\&quot; width=\&quot;202.72000000000003\&quot; height=\&quot;70\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-72\&quot; value=\&quot;模型权重\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-71\&quot;&gt;\n          &lt;mxGeometry width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-73\&quot; value=\&quot;梯度\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-71\&quot;&gt;\n          &lt;mxGeometry x=\&quot;111.36000000000001\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-74\&quot; value=\&quot;优化器\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-71\&quot;&gt;\n          &lt;mxGeometry y=\&quot;40\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-75\&quot; value=\&quot;激活函数\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-71\&quot;&gt;\n          &lt;mxGeometry x=\&quot;111.36000000000001\&quot; y=\&quot;40\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; 
as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-76\&quot; value=\&quot;\&quot; style=\&quot;group\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;624\&quot; y=\&quot;1960\&quot; width=\&quot;202.72000000000003\&quot; height=\&quot;70\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-77\&quot; value=\&quot;模型权重\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-76\&quot;&gt;\n          &lt;mxGeometry width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-78\&quot; value=\&quot;梯度\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-76\&quot;&gt;\n          &lt;mxGeometry x=\&quot;111.36000000000001\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-79\&quot; value=\&quot;优化器\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-76\&quot;&gt;\n          &lt;mxGeometry y=\&quot;40\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-80\&quot; value=\&quot;激活函数\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-76\&quot;&gt;\n          &lt;mxGeometry x=\&quot;111.36000000000001\&quot; y=\&quot;40\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-81\&quot; value=\&quot;\&quot; style=\&quot;group\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;624\&quot; y=\&quot;2080\&quot; width=\&quot;202.72000000000003\&quot; height=\&quot;70\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-82\&quot; value=\&quot;模型权重\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-81\&quot;&gt;\n          &lt;mxGeometry width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-83\&quot; value=\&quot;梯度\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-81\&quot;&gt;\n          &lt;mxGeometry x=\&quot;111.36000000000001\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-84\&quot; value=\&quot;优化器\&quot; 
style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-81\&quot;&gt;\n          &lt;mxGeometry y=\&quot;40\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-85\&quot; value=\&quot;激活函数\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-81\&quot;&gt;\n          &lt;mxGeometry x=\&quot;111.36000000000001\&quot; y=\&quot;40\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-86\&quot; value=\&quot;\&quot; style=\&quot;group\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;624\&quot; y=\&quot;2220\&quot; width=\&quot;202.72000000000003\&quot; height=\&quot;70\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-87\&quot; value=\&quot;模型权重\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-86\&quot;&gt;\n          &lt;mxGeometry width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-88\&quot; value=\&quot;梯度\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-86\&quot;&gt;\n          &lt;mxGeometry x=\&quot;111.36000000000001\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-89\&quot; value=\&quot;优化器\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-86\&quot;&gt;\n          &lt;mxGeometry y=\&quot;40\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-90\&quot; value=\&quot;激活函数\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-86\&quot;&gt;\n          &lt;mxGeometry x=\&quot;111.36000000000001\&quot; y=\&quot;40\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-91\&quot; value=\&quot;\&quot; style=\&quot;group\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;624\&quot; y=\&quot;2360\&quot; width=\&quot;202.72000000000003\&quot; height=\&quot;70\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-92\&quot; value=\&quot;模型权重\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-91\&quot;&gt;\n          
&lt;mxGeometry width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-93\&quot; value=\&quot;梯度\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-91\&quot;&gt;\n          &lt;mxGeometry x=\&quot;111.36000000000001\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-94\&quot; value=\&quot;优化器\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-91\&quot;&gt;\n          &lt;mxGeometry y=\&quot;40\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-95\&quot; value=\&quot;激活函数\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;MzKt8NfVXthm0VmUftic-91\&quot;&gt;\n          &lt;mxGeometry x=\&quot;111.36000000000001\&quot; y=\&quot;40\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n      &lt;/root&gt;\n    &lt;/mxGraphModel&gt;\n  &lt;/diagram&gt;\n&lt;/mxfile&gt;\n&quot;}"></div>
<script type="text/javascript" src="https://viewer.diagrams.net/js/viewer-static.min.js"></script>

<ul>
  <li>更多 GPU 知识见站内专题 <a href="/gpu">并行计算GPU</a></li>
</ul>

<h3 id="常见问题">常见问题</h3>

<p>模型训练的常见问题</p>
<ul>
  <li>问题一:GPU 显存爆满,资源不足
    <ul>
      <li>以 V100 为例,其显存最高也仅有 32G,有些型号甚至仅 12G 左右。因此当模型参数量较大时,在 GPU 模式下模型可能无法训练起来。</li>
      <li>设置 CPU 模式进行模型训练,可以避免显存不足的问题,但是训练速度往往太慢。</li>
      <li>如何在单机训练中充分利用 GPU 和 CPU 资源,让部分层在 CPU 执行、部分层在 GPU 执行呢?(见下方代码示意)</li>
    </ul>
  </li>
  <li>问题二:频繁数据拷贝,训练效率低</li>
</ul>
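<p>针对上面两个问题,给出一个最小示意(仅为演示思路,HybridNet 及各层尺寸均为假设值):把大词表的 Embedding 留在 CPU、计算密集层放到 GPU;forward 中显式的跨设备拷贝,正对应"问题二"所说的频繁数据搬运开销。</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code># 最小示意:部分层放 CPU、部分层放 GPU(假设单机单卡,模型结构为演示用)
import torch
import torch.nn as nn

class HybridNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(50000, 512)   # 大词表嵌入留在 CPU,节省显存
        self.body = nn.Sequential(              # 计算密集层放到 GPU
            nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 2)
        ).to("cuda")

    def forward(self, ids):
        h = self.embed(ids)      # 在 CPU 上查表
        h = h.to("cuda")         # 显式把激活搬到 GPU:频繁拷贝即"问题二"的开销来源
        return self.body(h)

if torch.cuda.is_available():
    model = HybridNet()
    logits = model(torch.randint(0, 50000, (4, 16)))
    print(logits.shape)   # torch.Size([4, 16, 2])
</code></pre></div></div>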

<h2 id="分布式训练">分布式训练</h2>

<h3 id="资料">资料</h3>

<p>【2024-8-23】GitHub 分布式训练总结 <a href="https://github.com/JianyuZhan/tech_slides/blob/main/LLM%E5%88%86%E5%B8%83%E5%BC%8F%E8%AE%AD%E7%BB%83%E6%8A%80%E6%9C%AF.pdf">tech_slides</a>(PDF)</p>

<p>【2024-5-27】MIT 助理教授 Song Han 的分布式训练介绍 PPT:</p>
<ul>
  <li>Distributed Training: <a href="https://www.dropbox.com/scl/fi/vn3n0b2r5fgcc0j0vrh0k/lec17.pdf">part1</a>, <a href="https://www.dropbox.com/scl/fi/11d766q8f62y5lx2tnt9h/lec18.pdf">part2</a></li>
  <li><a href="https://www.dropbox.com/scl/fi/6h69a1z5vqry63nxqdzt0/lec19.pdf">On-Device Training and Transfer Learning</a></li>
  <li><a href="https://www.dropbox.com/scl/fi/lt97w5j9zyscsgizyawme/lec20.pdf">Efficient Fine-tuning and Prompt Engineering</a></li>
</ul>

<p>part1</p>

<object type="application/pdf" data="https://www.dropbox.com/scl/fi/vn3n0b2r5fgcc0j0vrh0k/lec17.pdf" id="review" style="width:100%;  height:800px; margin-top:0px;  margin-left:0px">
</object>

<p>part2</p>

<object type="application/pdf" data="https://www.dropbox.com/scl/fi/11d766q8f62y5lx2tnt9h/lec18.pdf" id="review" style="width:100%;  height:800px; margin-top:0px;  margin-left:0px">
</object>

<h3 id="通信技术">通信技术</h3>

<p>分布式条件下的多进程、多worker之间的通信技术,常见的主要有 MPI、NCCL、gRPC 等。</p>
<ul>
  <li><strong>MPI</strong>主要应用在超算等大规模计算领域,机器学习场景下使用较少,常见实现有 OpenMPI 等。</li>
  <li><strong>NCCL</strong>是NVIDIA针对GPU设计的集合通信库,可以实现多GPU间的直接数据同步,避免内存与显存之间、CPU与GPU之间的数据拷贝开销。在TensorFlow中选择单机多卡训练时,默认采用的就是NCCL方式通信。</li>
  <li><strong>gRPC</strong>是比较成熟的通用通信技术,Spark等框架内也都有用到。</li>
</ul>

<p>演变</p>
<ul>
  <li>早期MPI在CPU和GPU的分布式通信领域都是主力军</li>
  <li>在NCCL推出之后
    <ul>
      <li>MPI库现在就只用在了CPU分布式通信场景</li>
      <li>而GPU分布式通信库目前都是以NCCL为主(NV场景)。</li>
    </ul>
  </li>
</ul>

<h4 id="通信方式">通信方式</h4>

<p>PyTorch 分布式训练通信依赖<code class="language-plaintext highlighter-rouge">torch.distributed</code>模块,<code class="language-plaintext highlighter-rouge">torch.distributed</code>提供了<code class="language-plaintext highlighter-rouge">point-2-point communication</code> 和<code class="language-plaintext highlighter-rouge">collective communication</code>两种通信方式。</p>
<ul>
  <li>点对点 point-2-point communication(<code class="language-plaintext highlighter-rouge">P2P</code>)提供了send和recv语义,用于任务间的通信。</li>
  <li>集合通信 collective communication(<code class="language-plaintext highlighter-rouge">CC</code>)提供了 scatter/broadcast/gather/reduce/all_reduce/all_gather 语义,不同的backend在提供的通信语义上具有一定差异。</li>
</ul>

<p>大模型训练中用到的主要是 CC(集合通信)。</p>
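<p>下面是一个可运行的最小示意(假设单机 CPU 演示,用 gloo 后端;地址、端口、进程数均为演示用的假设值),分别展示 P2P 的 send/recv 与 CC 的 all_reduce:</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code># P2P 与 CC 两种通信方式的最小示意(CPU + gloo 后端,GPU 训练时通常换成 nccl)
import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp

def worker(rank, world_size):
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")   # 演示用的本机地址
    os.environ.setdefault("MASTER_PORT", "29500")       # 演示用的端口
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    # P2P:rank 0 发送,rank 1 接收
    t = torch.zeros(1)
    if rank == 0:
        t += 42
        dist.send(t, dst=1)
    elif rank == 1:
        dist.recv(t, src=0)

    # CC:所有进程上的张量做 AllReduce 求和
    x = torch.tensor([float(rank)])
    dist.all_reduce(x, op=dist.ReduceOp.SUM)

    print(f"rank {rank}: p2p={t.item()}, all_reduce={x.item()}")
    dist.destroy_process_group()

if __name__ == "__main__":
    mp.spawn(worker, args=(2,), nprocs=2)
</code></pre></div></div>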

<h4 id="gpu通信技术">GPU通信技术</h4>

<p>【2024-6-17】<a href="https://developer.baidu.com/article/details/3136719">GPU通信技术:GPU Direct、NVLink与RDMA</a></p>

<p>GPU通信技术是加速计算的关键,其中<code class="language-plaintext highlighter-rouge">GPU Direct</code>、<code class="language-plaintext highlighter-rouge">NVLink</code>和<code class="language-plaintext highlighter-rouge">RDMA</code>是三种主流技术。</p>

<p>在展开之前,先回顾基础概念 DMA。<code class="language-plaintext highlighter-rouge">DMA</code> 是“<strong>直接内存访问</strong>”的意思,用来传输数据,DMA 控制器本身也属于外设,只是传输数据时无需占用CPU。</p>
<ul>
  <li>高速IO设备可以在处理器安排下直接与主存储器成批交换数据,称为<strong>直接存储器访问</strong>(Direct Memory Access,简称DMA)</li>
</ul>

<p>比如GPU与CPU之间存在着大量的数据传输。</p>
<ul>
  <li>CPU将需要显示的原始数据放在内存中,让GPU通过DMA的方式读取数据,经过解析和运算,将结果写至显存中,再由显示控制器读取显存中的数据并显示输出。</li>
</ul>

<p>当GPU与CPU集成至同一个处理器芯片时,能够大大减少芯片间的数据搬运;但由于显存和内存合并,也会大大增加访存压力。</p>

<p>DMA传输方向有三个:<strong>外设到内存</strong>,<strong>内存到外设</strong>,<strong>内存到内存</strong>。</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">外设</code>到<code class="language-plaintext highlighter-rouge">内存</code>。即从外设读取数据到内存。例如ADC采集数据到内存,ADC寄存器地址为源地址,内存地址为目标地址。</li>
  <li><code class="language-plaintext highlighter-rouge">内存</code>到<code class="language-plaintext highlighter-rouge">外设</code>。即从内存读取数据到外设。例如串口向电脑发送数据,内存地址为源地址,串口数据寄存器地址为目标地址。此时内存存储了需要发送的变量数据。</li>
  <li><code class="language-plaintext highlighter-rouge">内存</code>到<code class="language-plaintext highlighter-rouge">内存</code>。以内部flash向内部sram传输数据为例,此时内部flash地址即为源地址,内部sram地址即为目标地址。同时,需要将DMA_CCRx寄存器的MEM2MEM置位。</li>
</ul>

<h5 id="一gpu-direct">一、GPU Direct</h5>

<p>GPU Direct 是一种优化GPU之间或GPU与第三方设备之间数据传输的技术。它通过<strong>共享内存访问</strong>和<strong>点对点通信</strong>减少了数据复制和传输延迟。</p>

<p>(1) GPU Direct <code class="language-plaintext highlighter-rouge">Shared Memory</code></p>

<p>2010年,NVIDIA推出了GPU Direct Shared Memory技术,允许GPU与第三方PCI Express设备通过共享的host memory实现共享内存访问。这使得内存空间得以共享,减少了数据复制,降低了数据交换延迟。</p>

<p>(2) GPU Direct <code class="language-plaintext highlighter-rouge">P2P</code> (Peer-to-Peer)</p>

<p>2011年,GPU Direct增加了Peer-to-Peer(<code class="language-plaintext highlighter-rouge">P2P</code>)技术,支持同一PCI Express总线上的GPU之间的直接访问和传输。这种技术绕过了CPU,使得GPU之间通信更加高效。</p>

<p>(3) GPU Direct <code class="language-plaintext highlighter-rouge">RDMA</code></p>

<p>2013年,GPU Direct增加了<code class="language-plaintext highlighter-rouge">RDMA</code>(Remote Direct Memory Access)支持。</p>

<p>RDMA允许第三方PCI Express设备绕过CPU host memory,直接访问GPU内存。这种技术大幅提升了数据传输效率,尤其适用于高性能计算和数据中心等场景。</p>

<h5 id="二nvlink">二、NVLink</h5>

<p>NVLink是一种专门设计用于连接NVIDIA GPU的高速互联技术。它通过点对点通信方式,绕过传统的PCIe总线,提供了更高的带宽和更低的延迟。</p>

<p><strong>带宽与延迟</strong>:NVLink采用串行协议,支持双向数据传输,每个方向都有高达32GB/s的带宽。这使得两个GPU之间能够实现高速数据传输和共享,为多GPU系统提供了更高的性能和效率。与传统的PCIe总线相比,NVLink显著降低了通信延迟。</p>

<p><strong>连接与扩展</strong>:NVLink可用于连接两个或多个GPU,以实现多GPU协同工作。这种连接方式简化了系统架构,提高了可扩展性。通过NVLink连接的GPU可以共享数据和计算资源,从而在某些应用中实现性能倍增。</p>

<h5 id="三rdma">三、RDMA</h5>

<p>RDMA(Remote Direct Memory Access)是一种远程直接内存访问技术,允许一个设备直接访问另一个设备上的内存数据。在GPU通信中,RDMA技术用于加速GPU与CPU、GPU与GPU以及GPU与网络之间的数据传输。</p>

<p><strong>DMA原理</strong>:在介绍RDMA之前,我们需要理解DMA(Direct Memory Access)原理。DMA是一种技术,允许硬件控制器直接从内存读取或写入数据,而不需要经过CPU。这大大减轻了CPU的负担,提高了数据传输效率。RDMA基于此原理,进一步扩展了其应用范围。</p>

<p><strong>RDMA的优势</strong>:RDMA提供了高带宽和低延迟的数据传输能力。它利用网卡等设备的远程直接内存访问功能,允许设备之间快速高效地传输大量数据。在高性能计算、数据中心和云计算等领域,RDMA成为提高系统性能的关键技术之一。</p>

<p><strong>GPU与RDMA的结合</strong>:通过将RDMA与GPU相结合,可以实现高性能的GPU通信。在这种配置中,GPU可以借助RDMA直接访问其他设备或网络的内存数据,从而避免了不必要的CPU中介和数据拷贝。这不仅提高了数据传输速率,还降低了CPU负载和功耗。</p>

<p><strong>总结</strong>:GPU通信技术在加速计算领域发挥着越来越重要的作用。GPU Direct、NVLink和RDMA是三种主流的GPU通信技术,它们分别通过共享内存访问、高速互联和远程直接内存访问等方式提高了GPU之间的通信效率。在实际应用中,根据不同的场景和需求选择合适的通信技术至关重要。随着技术的不断发展,未来我们有望看到更多创新性的GPU通信解决方案,为高性能计算和数据中心等领域带来更大的性能提升。</p>

<h4 id="如何选择">如何选择</h4>

<p>PyTorch 的后端支持情况:</p>

<p>torch.distributed 支持 3 种后端,分别为 <code class="language-plaintext highlighter-rouge">NCCL</code>、<code class="language-plaintext highlighter-rouge">Gloo</code>、<code class="language-plaintext highlighter-rouge">MPI</code>。</p>
<ul>
  <li><img src="https://pic4.zhimg.com/80/v2-54b2efac8658c14f72104c2101a81ecf_1440w.webp" alt="" /></li>
</ul>

<p>如何选择?</p>
<ul>
  <li>NCCL 目前最快,且对<strong>多进程分布式</strong>(Multi-Process Single-GPU,即每个进程绑定一块 GPU)支持极好,可用于单节点以及多节点的分布式训练。</li>
  <li>节点即主机。即使是单节点,由于底层机制不同, <strong>distributed</strong> 也比 <strong>DataParallel</strong> 方式要高效。</li>
</ul>

<p>基本原则:</p>
<ul>
  <li>用 NCCL 进行分布式 GPU 训练</li>
  <li>用 Gloo 进行分布式 CPU 训练</li>
</ul>

<p>InfiniBand(无限带宽)互联的 GPU 集群</p>
<ul>
  <li>使用 NCCL,因为它是目前唯一支持 InfiniBand 和 GPUDirect 的后端</li>
</ul>

<p>以太网互联的 GPU 集群</p>
<ul>
  <li>使用 NCCL,因为其目前提供最佳的分布式 GPU 训练性能。尤其是 multiprocess single-node 或 multi-node distributed 训练。</li>
  <li>如果用 NCCL 训练有问题,再考虑使用 Gloo。(当前,Gloo 在 GPU 分布式上,相较于 NCCL 慢)</li>
</ul>

<p>InfiniBand(无限带宽)互联的 CPU 集群</p>
<ul>
  <li>如果 InfiniBand 启用了 IP over IB,请使用 Gloo;否则请使用 MPI。</li>
  <li>未来会为 Gloo 添加 InfiniBand 支持</li>
</ul>

<p>以太网互联的 CPU 集群</p>
<ul>
  <li>使用 Gloo,除非有特别的原因使用 MPI。</li>
</ul>
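
<p>下面给出一个按上述原则选择后端并初始化进程组的最小示意(假设通过 <code class="language-plaintext highlighter-rouge">torchrun</code> 启动,<code class="language-plaintext highlighter-rouge">MASTER_ADDR</code>、<code class="language-plaintext highlighter-rouge">RANK</code> 等环境变量由启动器注入;仅为示例,非唯一写法):</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code># 按"GPU 用 NCCL、CPU 用 Gloo"的原则选择后端
import torch
import torch.distributed as dist

def init_distributed():
    backend = "nccl" if torch.cuda.is_available() else "gloo"
    dist.init_process_group(backend=backend, init_method="env://")
    if backend == "nccl":
        # 每个进程独占一块 GPU,避免多个进程共享同一设备
        torch.cuda.set_device(dist.get_rank() % torch.cuda.device_count())
    return backend

if __name__ == "__main__":
    print("backend =", init_distributed())
</code></pre></div></div>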

<h4 id="mpi-后端">MPI 后端</h4>

<p>MPI 即<strong>消息传递接口</strong>(Message Passing Interface),来自于高性能计算领域的标准的工具。</p>
<ul>
  <li>支持点对点通信以及集体通信,并且是 torch.distributed 的 API 的灵感来源。</li>
  <li>使用 MPI 后端的优势: 在大型计算机集群上,MPI 应用广泛,且高度优化。</li>
</ul>

<p>但是,torch.distributed 对 MPI 并不提供原生支持。</p>

<p>因此,要使用 MPI,必须从源码编译 Pytorch。是否支持 GPU,视安装的 MPI 版本而定。</p>

<h4 id="gloo-后端">Gloo 后端</h4>

<p>gloo 后端支持 CPU 和 GPU,其支持集体通信(collective Communication),并对其进行了优化。</p>

<p>由于 GPU 之间可以直接进行数据交换,而无需经过 CPU 和内存,因此,在 GPU 上使用 gloo 后端速度更快。</p>

<p>torch.distributed 对 gloo 提供原生支持,无需进行额外操作。</p>

<h4 id="nccl-通信原语">NCCL 通信原语</h4>

<p>【2023-7-27】<a href="https://zhuanlan.zhihu.com/p/623746805">大模型-LLM分布式训练框架总结</a></p>

<p>NCCL 的全称为 Nvidia 聚合通信库(NVIDIA Collective Communications Library),是一个可以实现多个 GPU、多个结点间聚合通信的库,在 PCIe、Nvlink、InfiniBand 上可以实现较高的通信速度。</p>

<p>NCCL 高度优化并兼容 MPI,能够感知 GPU 拓扑,加速多 GPU、多节点通信,最大化带宽利用率。深度学习框架可以借助这一优势,在单节点内或跨节点间充分利用所有可用的 GPU。</p>

<p>NCCL 专注于 GPU 间通信(不支持 CPU 张量),且 torch.distributed 对其提供了原生支持。</p>

<p>对于每台主机均使用多进程的情况,使用 NCCL 可以获得最大化的性能。每个进程必须对其使用的 GPU 具有独占权;若进程之间共享 GPU 资源,则可能导致 deadlock。</p>

<p>NCCL 英伟达集合通信库专用于多个 GPU 乃至多个节点间通信。</p>
<ul>
  <li>专为英伟达的计算卡和网络优化,能带来更低的延迟和更高的带宽。</li>
</ul>

<p>原语</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">Broadcast</code>: 一对多的通信原语,一个数据发送者,多个数据接收者,可以在集群内把一个节点自身的数据广播到其他节点上。</li>
  <li><code class="language-plaintext highlighter-rouge">Scatter</code>: 一对多的通信原语,也是一个数据发送者,多个数据接收者,可以在集群内把一个节点自身的数据发散到其他节点上。与Broadcast不同的是,Broadcast把主节点0的数据发送给所有节点,而Scatter则是将数据进行切片再分发给集群内所有的节点。</li>
  <li><code class="language-plaintext highlighter-rouge">Gather</code>: 多对一的通信原语,具有多个数据发送者,一个数据接收者,可以在集群内把多个节点的数据收集到一个节点上。</li>
  <li><code class="language-plaintext highlighter-rouge">AllGather</code>: 多对多的通信原语,具有多个数据发送者,多个数据接收者,可以在集群内把多个节点的数据收集到一个主节点上(Gather),再把这个收集到的数据分发到其他节点上(broadcast),即收集集群内所有的数据到所有的节点上。</li>
  <li><code class="language-plaintext highlighter-rouge">Reduce</code>: 多对一的通信原语,具有多个数据发送者,一个数据接收者,可以在集群内把多个节点的数据规约运算到一个主节点上,常用的规约操作符有:求累加和SUM、求累乘积PROD、求最大值MAX、求最小值MIN、逻辑与LAND、按位与BAND、逻辑或LOR、按位或BOR、逻辑异或LXOR、按位异或BOXR、求最大值和最小大的位置MAXLOC、求最小值和最小值的位置MINLOC等,这些规约运算也需要加速卡支持对应的算子才能生效。</li>
  <li><code class="language-plaintext highlighter-rouge">ReduceScatter</code>: 多对多的通信原语,具有多个数据发送者,多个数据接收者,在集群内的所有节点上都按维度执行相同的Reduce规约运算,再将结果发散到集群内所有的节点上。Reduce-scatter等价于节点个数次的reduce规约运算操作,再后面执行节点个数的scatter次操作。其反向操作是AllGather。</li>
  <li><code class="language-plaintext highlighter-rouge">AllReduce</code>: 多对多的通信原语,具有多个数据发送者,多个数据接收者,在集群内的所有节点上都执行相同的Reduce操作,可以将集群内所有节点的数据规约运算得到的结果发送到所有的节点上。</li>
</ul>
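
<p>上述原语在 torch.distributed 中都有对应接口。下面是一个最小示意(假设进程组已按前文方式用 NCCL 初始化、每个进程绑定一块 GPU):</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code># 通信原语与 torch.distributed 接口的对应(示意)
import torch
import torch.distributed as dist

rank = dist.get_rank()
world_size = dist.get_world_size()
device = torch.device("cuda", torch.cuda.current_device())

x = torch.full((4,), float(rank), device=device)

dist.broadcast(x, src=0)                  # Broadcast: 把 rank 0 的数据广播到所有节点
dist.all_reduce(x, op=dist.ReduceOp.SUM)  # AllReduce: 全体求和,每个节点都得到结果

buf = [torch.zeros_like(x) for _ in range(world_size)]
dist.all_gather(buf, x)                   # AllGather: 把各节点的数据收集到每个节点
</code></pre></div></div>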

<p>通信原语汇总</p>

<p>汇总如下</p>

<table>
  <thead>
    <tr>
      <th>原语操作</th>
      <th>模式</th>
      <th>说明</th>
      <th>图解</th>
      <th>示意图</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td><code class="language-plaintext highlighter-rouge">Broadcast</code></td>
      <td>广播:一对多</td>
      <td>广播行为:从节点0广播相同信息到其它节点(0-3)</td>
      <td><img src="https://pic3.zhimg.com/80/v2-559434c1d53c4b8314c9d79aa70a32c6_1440w.webp" alt="" /></td>
      <td><img src="https://pic4.zhimg.com/80/v2-c8aec100f7984bc64dae66dea5067657_1440w.webp" alt="" /></td>
    </tr>
    <tr>
      <td><code class="language-plaintext highlighter-rouge">Scatter</code></td>
      <td>一对多</td>
      <td>另一种广播,从节点0将数据<strong>不同部分</strong>按需发送到不同节点,常见于DP的数据分配起步阶段</td>
      <td><img src="https://pic3.zhimg.com/80/v2-8cbcae4e5a544f607afc88b9d3c2122a_1440w.webp" alt="" /></td>
      <td><img src="https://pic2.zhimg.com/80/v2-988509a65724802d800ff0d40c78ab11_1440w.webp" alt="" /></td>
    </tr>
    <tr>
      <td><code class="language-plaintext highlighter-rouge">Reduce</code></td>
      <td>规约:多对一</td>
      <td>规约操作,Reduce意为减少/精简,一系列简单聚合运算,如:sum/min/max,prod,lor等</td>
      <td><img src="https://pic2.zhimg.com/80/v2-a364ebb1cdeccfbba2293d983b2b834d_1440w.webp" alt="" /></td>
      <td><img src="https://pic1.zhimg.com/80/v2-ea6eef07151786d13cceef53a718fd74_1440w.webp" alt="" /></td>
    </tr>
    <tr>
      <td><code class="language-plaintext highlighter-rouge">AllReduce</code></td>
      <td>多对多</td>
      <td>所有节点上应用相同的Reduce操作,单节点上 Reduce + Broadcast,最消耗带宽</td>
      <td><img src="https://pic3.zhimg.com/80/v2-2176fb0289edb0b380cceb6bbd2d5ca2_1440w.webp" alt="" /></td>
      <td><img src="https://pic1.zhimg.com/80/v2-0411e66990dac2867af16e425eef27e8_1440w.webp" alt="" /></td>
    </tr>
    <tr>
      <td><code class="language-plaintext highlighter-rouge">Gather</code></td>
      <td>多对一</td>
      <td><strong>反向Scatter</strong>:将多个Sender的数据汇总到单个节点上</td>
      <td><img src="https://pic2.zhimg.com/80/v2-4b74592358b3acaabbff91e6fe61fae5_1440w.webp" alt="" /></td>
      <td><img src="https://pic1.zhimg.com/80/v2-badb6e96036f1cbbd65fc1c9ccf54070_1440w.webp" alt="" /></td>
    </tr>
    <tr>
      <td><code class="language-plaintext highlighter-rouge">AllGather</code></td>
      <td>多对多</td>
      <td>收集所有节点到所有节点上, <code class="language-plaintext highlighter-rouge">AllGather</code>=<code class="language-plaintext highlighter-rouge">Gather</code>+<code class="language-plaintext highlighter-rouge">Broadcast</code></td>
      <td><img src="https://pic4.zhimg.com/80/v2-497c129eb7aa4f3b51bd3f3a53e1ca73_1440w.webp" alt="" /></td>
      <td><img src="https://pic3.zhimg.com/80/v2-3bfe939e5f713d5c6099ea161fbdffc2_1440w.webp" alt="" /></td>
    </tr>
    <tr>
      <td><code class="language-plaintext highlighter-rouge">ReduceScatter</code></td>
      <td>多对多</td>
      <td>将各节点输入先做 Reduce 求和,再按第0维切分,把结果分发到各节点, <code class="language-plaintext highlighter-rouge">ReduceScatter</code>=<code class="language-plaintext highlighter-rouge">Reduce</code>+<code class="language-plaintext highlighter-rouge">Scatter</code></td>
      <td><img src="https://pic1.zhimg.com/80/v2-55652f9d4274b76249f1e745c66a40d8_1440w.webp" alt="" /></td>
      <td> </td>
    </tr>
    <tr>
      <td><code class="language-plaintext highlighter-rouge">All2All</code></td>
      <td>多对多</td>
      <td>全交换操作,每个节点都获取其他节点的值</td>
      <td><img src="https://pic2.zhimg.com/80/v2-ff0a3da8d01c5b7d4391edad5da14661_1440w.webp" alt="" /></td>
      <td> </td>
    </tr>
  </tbody>
</table>

<p>All2All 与 All Gather 区别在于:<a href="https://zhuanlan.zhihu.com/p/682896222">LLM分布式训练第一课(通讯原语)</a></p>
<ul>
  <li>AllGather 操作中,每个节点收集到的数据完全相同</li>
  <li>而在 All2All 中,每个节点收集到的数据各不相同。</li>
</ul>
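
<p>这一区别可以用下面的小例子体会(示意代码,假设已用 NCCL 初始化进程组、张量放在各自的 GPU 上;<code class="language-plaintext highlighter-rouge">all_to_all_single</code> 在 NCCL 后端可用):</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code># AllGather 与 All2All 的结果对比(示意)
import torch
import torch.distributed as dist

rank, N = dist.get_rank(), dist.get_world_size()
dev = torch.device("cuda", torch.cuda.current_device())

# AllGather:每个 rank 贡献一份数据,所有 rank 得到完全相同的拼接结果
x = torch.tensor([float(rank)], device=dev)
gathered = [torch.zeros(1, device=dev) for _ in range(N)]
dist.all_gather(gathered, x)          # 各 rank 上 gathered 均为 [0, 1, ..., N-1]

# All2All:rank i 把第 j 块发给 rank j,各 rank 收到的数据互不相同
inp = torch.arange(N, dtype=torch.float, device=dev) + rank * N
out = torch.empty(N, device=dev)
dist.all_to_all_single(out, inp)      # rank j 收到的是各 rank 的第 j 个元素
</code></pre></div></div>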

<p><code class="language-plaintext highlighter-rouge">AllReduce</code> 的目标: 将不同机器上的数据整合(reduce)后分发给各个机器</p>

<p><code class="language-plaintext highlighter-rouge">AllReduce</code> 实现方法</p>
<ul>
  <li>最简单: 每个worker将自己的数据广播给所有worker —— 问题: 大量浪费</li>
  <li>改进: 主从架构, 指定一个worker作为master,负责整合运算,以及分发 —— 问题: master成为网络瓶颈</li>
  <li>改进: Ring AllReduce</li>
</ul>

<p>Ring AllReduce:</p>
<ul>
  <li>第一阶段,将N个worker分布在一个环上,并且把每个worker的数据分成N份。</li>
  <li>第二阶段,第k个worker把第<strong>k份</strong>数据发给下一个worker,同时从前一个worker收到第<strong>k-1份</strong>数据。</li>
  <li>第三阶段,worker把收到的第k-1份数据和自己的第k-1份数据整合,再将整合的数据发送给下一个worker</li>
  <li>此循环N次之后,每一个worker都会包含最终整合结果的一份。</li>
</ul>

<p>假设每个worker的数据是一个长度为<code class="language-plaintext highlighter-rouge">S</code>的向量,那么Ring AllReduce里每个worker发送的数据量是<code class="language-plaintext highlighter-rouge">O(S)</code>,和worker的数量N无关。避免了<strong>主从架构</strong>中master需要处理<code class="language-plaintext highlighter-rouge">O(S*N)</code>数据量而成为网络瓶颈的问题。</p>

<p><code class="language-plaintext highlighter-rouge">Ring All-reduce</code></p>
<ul>
  <li>Pytorch 实现: <code class="language-plaintext highlighter-rouge">DistributedDataParallel</code></li>
  <li><code class="language-plaintext highlighter-rouge">Ring All-reduce</code>=<code class="language-plaintext highlighter-rouge">reduce-scatter</code>+<code class="language-plaintext highlighter-rouge">all-gather</code></li>
</ul>
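
<p>为便于理解上面"分块、沿环传递"的过程,下面用单进程纯 Python 模拟 Ring AllReduce 的两个阶段(仅示意算法逻辑,不是分布式实现):</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code># 单进程模拟 Ring AllReduce:reduce-scatter + all-gather
def ring_allreduce(chunks_per_worker):
    """chunks_per_worker[i][k] 为 worker i 持有的第 k 份数据;N 个 worker,每人切成 N 份。"""
    n = len(chunks_per_worker)
    data = [list(c) for c in chunks_per_worker]

    # 阶段一 reduce-scatter:n-1 步之后,worker i 持有第 (i+1)%n 份数据的全局和
    for step in range(n - 1):
        for i in range(n):
            k = (i - step) % n                  # worker i 本步发送的块编号
            data[(i + 1) % n][k] += data[i][k]  # 发给环上的下一个 worker 并累加
    # 阶段二 all-gather:再走 n-1 步,把各自持有的"完整和"沿环传一圈
    for step in range(n - 1):
        for i in range(n):
            k = (i + 1 - step) % n
            data[(i + 1) % n][k] = data[i][k]
    return data

print(ring_allreduce([[1, 2, 3], [4, 5, 6], [7, 8, 9]]))
# 每个 worker 最终都持有 [12, 15, 18]
</code></pre></div></div>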

<h4 id="nccl-通信行为分析">NCCL 通信行为分析</h4>

<p>【2024-5-10】<a href="https://www.cnblogs.com/Matrix_Yao/p/15905009.html">集合通信行为分析 - 基于NCCL</a></p>

<p>deepspeed 启动多卡训练时,日志里会打印NCCL通信信息,这些日志都是什么意思?</p>

<p>NCCL 通信阶段</p>
<ul>
  <li>Phase 1 - <code class="language-plaintext highlighter-rouge">启动</code>阶段 <strong>Bootstrap</strong> Phase: 初始化集合中的所有节点(node)和卡(rank),确保所有卡知道彼此
    <ul>
      <li>Initiate all nodes and then all ranks in a collective. It makes sure all ranks know about all other ranks, so any rank is able to communicate with any other rank.</li>
    </ul>
  </li>
  <li>Phase 2 - <code class="language-plaintext highlighter-rouge">拓扑</code>阶段 <strong>Topology</strong> Phase: 每个节点检测机器上各硬件(CPU/GPU/NIC)的映射关系,创建机内拓扑结构(树/环),通过 PCIe 和 NVLink 互联
    <ul>
      <li>Each node detects and maps out what hardware is located on the machine.</li>
      <li>Hardware includes CPUs, GPUs, NICs and interconnect types.</li>
      <li>Each node then creates an <strong>intra-machine graph</strong>, connects hardware with <code class="language-plaintext highlighter-rouge">PCIe</code> or <code class="language-plaintext highlighter-rouge">NVLink</code> interconnect, and evaluates the graph.</li>
      <li>When the intra-machine topology is decided, the system will decide what pattern to use for the whole system.</li>
      <li>The two main patterns are a <strong>tree</strong> or a <strong>ring</strong>.</li>
      <li>While the topology is evaluated, NCCL is also tuning it by performing tests. This allows each rank to pre-compute thresholds for message sizes.</li>
    </ul>
  </li>
  <li>Phase 3 - <code class="language-plaintext highlighter-rouge">聚合</code>阶段 <strong>Collective</strong> Phase: 用户调用NCCL支持的集合通信原语进行通信
    <ul>
      <li>A user can dispatch many collective operations using the same topology.</li>
    </ul>
  </li>
</ul>

<p>用户调用NCCL支持的集合通信原语进行通信:</p>
<ul>
  <li>集合通信原语
    <ul>
      <li>AllReduce</li>
      <li>Broadcast</li>
      <li>Reduce</li>
      <li>AllGather</li>
      <li>ReduceScatter</li>
    </ul>
  </li>
  <li>点对点通信原语
    <ul>
      <li>Send</li>
      <li>Recv</li>
    </ul>
  </li>
</ul>

<p>NCCL 在 getAlgoInfo 里使用 ncclTopoGetAlgoTime 遍历所有 (algorithm, protocol) 组合,最终选择预测能最快完成指定数据量、指定集合通信原语的组合来执行该原语。</p>

<p>示例</p>
<ul>
  <li>以2机16卡, NCCL 2.8.4为例</li>
  <li>NCCL会构建tree,ring graph。</li>
</ul>

<p>(1) tree</p>

<p>解析</p>

<p><strong>拓扑</strong>log格式</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c"># IP: hostname:pid:tid [cudaDev] NCCL INFO Trees [channel ID] down0 rank/down1 rank/down2 rank-&gt;current rank-&gt;up rank</span>
10.0.2.11: 2be7fa6883db:57976:58906 <span class="o">[</span>5] NCCL INFO Trees <span class="o">[</span>0] 14/-1/-1-&gt;13-&gt;12 <span class="o">[</span>1] 14/-1/-1-&gt;13-&gt;12
<span class="c"># 10.0.2.11上的设备5,其rank为13,有两棵树,分别为channel 0和channel 1: channel 0的子节点只有14, 父节点为12; channel 1一样。</span>
</code></pre></div></div>

<p><strong>channel</strong> log格式</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c"># IP: hostname:pid:tid [cudaDev] NCCL INFO Channel [channel ID] current rank[bus ID]-&gt;successor rank[bus ID] via transport type</span>
10.0.2.11: 2be7fa6883db:57976:58906 <span class="o">[</span>5] NCCL INFO Channel 00 : 13[3e000] -&gt; 14[40000] via P2P/IPC
<span class="c"># 10.0.2.11上的设备5(rank 为13, bus ID为3e000),其channel 0连接至rank 14,传输方式为P2P/IPC。</span>
</code></pre></div></div>
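
<p>如果需要批量分析这类日志,可以按上述格式写一个简单的解析脚本(示意代码,正则只针对文中示例的行格式,不同 NCCL 版本的日志可能略有差异):</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code># 解析 NCCL channel 日志行(示意)
import re

chan_pat = re.compile(
    r"NCCL INFO Channel (\d+) : (\d+)\[(\w+)\] -&gt; (\d+)\[(\w+)\] via (\S+)")

line = "10.0.2.11: 2be7fa6883db:57976:58906 [5] NCCL INFO Channel 00 : 13[3e000] -&gt; 14[40000] via P2P/IPC"
m = chan_pat.search(line)
if m:
    ch, src, src_bus, dst, dst_bus, transport = m.groups()
    print(f"channel {ch}: rank {src}({src_bus}) 经 {transport} 连接到 rank {dst}({dst_bus})")
</code></pre></div></div>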

<p>结果</p>

<p>依此解析,可得两棵一样的tree,逻辑拓扑如下:<a href="https://img2022.cnblogs.com/blog/46419/202202/46419-20220217155443691-259562130.png">img</a></p>
<ul>
  <li><img src="https://img2022.cnblogs.com/blog/46419/202202/46419-20220217155443691-259562130.png" alt="" /></li>
</ul>

<p>(2) ring</p>

<p>Ring Logical Topology</p>

<p>拓扑log格式</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c"># IP: hostname:pid:tid [cudaDev] NCCL INFO Channel ring_ID/ring_number: rank0 rank1 … last_rank</span>
10.0.2.12: 94f182076445:82261:83141 <span class="o">[</span>0] NCCL INFO Channel 00/02 : 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15
<span class="c"># 建成了02个ring,其中第0个ring的成员有:0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15,该ring共由16个rank组成。</span>
</code></pre></div></div>

<p>channel log格式</p>
<ul>
  <li>与tree拓扑的格式一致。</li>
</ul>

<p>可得两个一样的ring,逻辑拓扑如下:<a href="https://img2022.cnblogs.com/blog/46419/202202/46419-20220217155443729-494214121.png">img</a></p>
<ul>
  <li><img src="https://img2022.cnblogs.com/blog/46419/202202/46419-20220217155443729-494214121.png" alt="" /></li>
</ul>

<h4 id="梯度压缩">梯度压缩</h4>

<p>分布式训练的 bandwidth 与 latency bottleneck 主要分布在 梯度 <code class="language-plaintext highlighter-rouge">all-reduce</code> 和 <code class="language-plaintext highlighter-rouge">scatter</code> 过程中</p>
<ul>
  <li>其中 联邦学习 受限的地方还有与端侧设备的通讯</li>
  <li><img src="https://pic4.zhimg.com/80/v2-cd8999264a7670d906e98e6d5de3978f_1440w.webp" alt="" /></li>
</ul>

<p>解法:梯度压缩</p>
<ul>
  <li>方法1: prune, 【Deep gradient compression】
    <ul>
      <li>worker 向 server push 梯度时, 可以对梯度做 prune (sparse gradient) 与 quantization</li>
    </ul>
  </li>
  <li>方法2: Low-Rank 【PowerSGD】,梯度映射到低秩空间,而不是去做细粒度的剪枝和量化
    <ul>
      <li>2019年 EPFL 的文章 <a href="https://arxiv.org/abs/1905.13727">PowerSGD</a>, 发了 NIPS</li>
    </ul>
  </li>
  <li>方法3: 量化, 1bit SGD
    <ul>
      <li>用 one bit 的矩阵作为需要通讯的梯度</li>
    </ul>
  </li>
  <li>方法4: terngrad, ternary
    <ul>
      <li>梯度量化到 0, -1, 1</li>
    </ul>
  </li>
</ul>
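
<p>以方法1(prune)为例,其核心是只通信绝对值最大的少量梯度元素。下面是一个对单个梯度张量做 top-k 稀疏化的最小示意(假设的独立函数,未接入 DDP 通信钩子;真实系统还需配合 error feedback 累积被丢弃的梯度):</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code># top-k 梯度稀疏化示意
import torch

def topk_sparsify(grad, ratio=0.01):
    """保留绝对值最大的 ratio 比例元素,其余置零;返回稀疏化梯度与非零位置。"""
    flat = grad.flatten()
    k = max(1, int(flat.numel() * ratio))
    _, indices = torch.topk(flat.abs(), k)
    mask = torch.zeros_like(flat, dtype=torch.bool)
    mask[indices] = True
    sparse = torch.where(mask, flat, torch.zeros_like(flat))
    return sparse.view_as(grad), indices    # 实际只需传输 indices 与对应的非零值

g = torch.randn(1024, 1024)
g_sparse, idx = topk_sparsify(g, ratio=0.01)
</code></pre></div></div>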

<p>梯度延迟更新:解决 latency 的 bottleneck</p>

<p>详见: <a href="https://zhuanlan.zhihu.com/p/699372131?utm_psn=1777577429458386944">分布式训练优化–进阶篇</a></p>

<h3 id="并行技术">并行技术</h3>

<p>并行技术:</p>
<ul>
  <li><strong>数据并行</strong> <code class="language-plaintext highlighter-rouge">dp</code>(如:PyTorch DDP): 每个节点复制完整的模型,数据分片
    <ul>
      <li>内存开销大,通信量低,容易实施</li>
      <li>ZeRO 对 data-parallel 的优化,每个gpu有自己独特的数据,同时模型的参数也被均匀的分到 n个gpu上</li>
    </ul>
  </li>
  <li><strong>模型并行</strong> <code class="language-plaintext highlighter-rouge">mp</code> : 完整模型只有一份,其它节点只有模型的局部
    <ul>
      <li><code class="language-plaintext highlighter-rouge">张量并行</code> <code class="language-plaintext highlighter-rouge">tp</code>: 模型按张量分发
        <ul>
          <li>内存开销小,通信量高,容易实施</li>
          <li>如:Megatron-LM(1D)、Colossal-AI(2D、2.5D、3D)</li>
        </ul>
      </li>
      <li><code class="language-plaintext highlighter-rouge">流水线并行</code> <code class="language-plaintext highlighter-rouge">pp</code>: 模型按层分发
        <ul>
          <li>内存开销小,通信量中等,实施难度大</li>
          <li>如:GPipe、PipeDream、PipeDream-2BW、PipeDream Flush(1F1B)</li>
        </ul>
      </li>
    </ul>
  </li>
  <li><strong>多维混合</strong>并行(如:3D并行(数据并行、模型并行、流水线并行))
    <ul>
      <li>2D 并行: dp+pp, tp+pp</li>
      <li>3D 并行: dp+tp+pp</li>
    </ul>
  </li>
  <li><strong>自动</strong>并行: 自动搜索并行空间
    <ul>
      <li>如:Alpa(自动算子内/算子间并行),将并行空间分为 inter-op (pipeline) 与 intra-op (tensor并行),自动搜索这两个空间,并考虑整个搜索空间的cost。</li>
    </ul>
  </li>
  <li>优化器相关并行(如:ZeRO(零冗余优化器,在执行的逻辑上是数据并行,但可以达到模型并行的显存优化效果)、PyTorch FSDP)</li>
</ul>

<p>【2023-12-15】MIT 端侧模型训练课程: <a href="https://hanlab.mit.edu/courses/2023-fall-65940">TinyML and Efficient Deep Learning Computing</a>, 含 ppt 和 视频</p>
<ul>
  <li>powerful deep learning applications on resource-constrained devices.</li>
  <li>Topics include model compression, pruning, quantization, neural architecture search, distributed training, data/model parallelism, gradient compression, and on-device fine-tuning. application-specific acceleration techniques</li>
</ul>

<p>模型切分分3个互相正交的维度:[<code class="language-plaintext highlighter-rouge">data</code>, <code class="language-plaintext highlighter-rouge">model-layer</code>, <code class="language-plaintext highlighter-rouge">model-activation</code>(Tensor)]</p>
<ul>
  <li>这3个维度互不影响,可同时实现,即 <code class="language-plaintext highlighter-rouge">3D parallelism</code>。</li>
  <li><img src="https://pic1.zhimg.com/80/v2-227b93fda9e3bf182e0aea9f4abb23a0_1440w.webp" alt="" /></li>
</ul>

<table>
  <thead>
    <tr>
      <th>并行维度</th>
      <th>切分方式 Split</th>
      <th>模型完整性</th>
      <th>通讯</th>
      <th>对gpu的利用</th>
      <th>优化手段</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td>data</td>
      <td>data</td>
      <td>Copy of whole model</td>
      <td>非常少 (只有前向和反向的时候需要通讯)</td>
      <td>High (for 训练加速)</td>
      <td>ZeRO</td>
    </tr>
    <tr>
      <td>pipeline</td>
      <td>model-layer</td>
      <td>Part of model</td>
      <td>中等</td>
      <td>Low (for 显存占用太多,优化显存)</td>
      <td> </td>
    </tr>
    <tr>
      <td>tensor</td>
      <td>model-tensor</td>
      <td>Part of model</td>
      <td>很多(每个需要reduce的中间结果都要通讯)</td>
      <td>High (模型算到底再通讯)</td>
      <td> </td>
    </tr>
  </tbody>
</table>

<p>详见: <a href="https://zhuanlan.zhihu.com/p/699372131?utm_psn=1777577429458386944">分布式训练优化–进阶篇</a></p>

<p>常见多GPU训练方法:</p>
<ol>
  <li><strong>模型并行</strong>:如果<strong>模型特别大</strong>,GPU显存不够,无法将整个模型放进一块GPU,需要把网络的不同模块放在不同GPU上,这样可以训练比较大的网络。(下图左半部分)</li>
  <li><strong>数据并行</strong>:将整个模型放在一块GPU里,再复制到每一块GPU上,同时进行<strong>正向传播</strong>和<strong>反向误差传播</strong>。相当于加大了batch_size。(下图右半部分)
    <ul>
      <li><img src="https://pic4.zhimg.com/80/v2-92e93b9f002b3782abec2a9f8a9a6153_1440w.webp" alt="" /></li>
    </ul>
  </li>
</ol>

<p>大规模深度学习模型训练中有几个主要范式:</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">数据并行</code>(DP):模型尺寸能够被单个GPU 内存容纳,模型的不同实例在不同的 GPU 和不同批数据上运行,模型的每个实例都使用相同的参数进行初始化,但在前向传递期间,不同批次的数据被发送到每个模型。 收集来自每个模型实例的梯度并计算梯度更新。,然后更新模型参数并将其作为更新发送到每个模型实例。
    <ul>
      <li><img src="https://pic4.zhimg.com/80/v2-de60ad9dffd68d827084d84772b06dbb_720w.webp" alt="" /></li>
      <li><img src="https://pic4.zhimg.com/80/v2-b508d84ba9c6a9c6ae2c5be70526da43_1440w.webp" alt="" /></li>
      <li>数据并行通过在 N 台机器上复制模型来实现。拆分 minibatch ,分成 N 个块,让每台机器处理一个块。</li>
      <li><img src="https://pic3.zhimg.com/80/v2-678f7d2c116f7528be27d6445b6c091a_1440w.webp" alt="" /></li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">模型并行</code>:当单个 GPU无法容纳模型尺寸时,<strong>模型并行性</strong>变得必要,有必要将模型拆分到多个 GPU 上进行训练。实现模型尺寸超过单个GPU显存的深度学习模型训练。
    <ul>
      <li>这种方法的问题是计算使用效率不高,因为在任何时间点只有一个 GPU 正在使用,而其他 GPU 处于空闲状态。</li>
      <li><img src="https://pic3.zhimg.com/80/v2-6a4304b529130e86e4552b3d4ed58a4e_720w.webp" alt="" /></li>
      <li>相对于流水线并行和数据并行,模型并行具有以下优点:
        <ul>
          <li>支持更大的模型规模:流水线并行和数据并行的限制通常是 GPU 内存大小和 GPU 数量,而模型并行可以支持更大的模型规模,因为模型可以分割成多个子模型,并分配到多个 GPU 上运行。</li>
          <li>减少通信开销:流水线并行的模型划分通常会导致模型层之间的通信,而模型并行只需在每个子模型之间进行通信。相对于数据并行,模型并行在执行过程中通信量更少,因为每个 GPU 只需传递模型的一部分而不是全部。</li>
          <li>灵活的模型分配:模型并行可以更灵活地将模型分配给不同的 GPU 或计算节点,这意味着可以在不同的 GPU 上运行不同的模型子集,从而实现更好的负载平衡和性能优化。</li>
        </ul>
      </li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">流水线并行</code> (PP)
    <ul>
      <li>朴素流水线并行(Naive Pipeline Parallelism)是将一组模型层分布在多个 GPU 上,并简单地将数据从 GPU 移动到 GPU,就好像它是一个大型复合 GPU 一样。</li>
      <li>流水线并行 (PP) 与上述朴素流水线并行几乎相同,但它解决了 GPU 闲置问题,方法是将传入的 batch 拆分为 micro-batches 并人工创建流水线,从而允许不同的 GPU 同时参与计算过程。</li>
      <li>流水并行是将一个大型计算任务拆分成多个小的<strong>子任务</strong>,并将子任务在多个处理单元上同时执行。不同于数据并行和模型并行,流水并行不是将数据或模型分割成多个部分并在处理单元间并行处理,而是将一系列计算步骤分解成多个流水阶段,并在多个处理单元上同时执行,以减少总体计算时间。</li>
    </ul>
  </li>
</ul>

<p>通俗理解</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">Data Parallelism</code>:模型1台设备装得下,所以同模型用多份数据分开训练</li>
  <li><code class="language-plaintext highlighter-rouge">Pipeline Parallelism</code>:模型装不下,模型1层或多层1台设备装得下,所以同模型按层拆开训练</li>
  <li><code class="language-plaintext highlighter-rouge">Tensor Parallelism</code>:模型1层都装不下,所以层内拆开训练</li>
</ul>
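
<p>以"层内拆开"(张量并行)为例,可以把一个 Linear 层的权重按输出维度切成若干份,每个进程只算自己那一份,再用 AllGather 拼回完整输出。下面是一个最小示意(假设进程组已初始化;为演示方便从完整权重现场切分,真实实现中每个 rank 只会保存并训练自己的分片):</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code># 按列切分的 Linear(张量并行)示意
import torch
import torch.distributed as dist

def column_parallel_linear(x, full_weight):
    rank, world = dist.get_rank(), dist.get_world_size()
    shard = full_weight.chunk(world, dim=0)[rank]   # 每个 rank 只持有 1/world 的输出维权重
    local_out = x @ shard.t()                       # 局部输出: [batch, out_features/world]
    outs = [torch.empty_like(local_out) for _ in range(world)]
    dist.all_gather(outs, local_out)                # 收集各 rank 的局部输出
    return torch.cat(outs, dim=-1)                  # 拼接得到完整输出 [batch, out_features]
</code></pre></div></div>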

<h3 id="数据并行">数据并行</h3>

<p>数据并行性(Data parallelism (DP))最简单的方法是:将相同的<strong>模型权重</strong>复制到多个worker中,并将一部分数据分配给每个worker以同时进行处理。</p>
<ul>
  <li>当模型规模大于单个GPU的内存、Naive DP 无法正常工作时,GeePS(Cui 等人,2016 年)之类的方法会将暂时未使用的参数卸载回 CPU,以适配有限的 GPU 内存。数据交换传输在后台进行,且不干扰训练计算。</li>
</ul>

<p>在每个小批量结束时,workers需要同步梯度或权重,以替换旧参数。常见有两种主要的同步方法,它们都有明确的优缺点:</p>
<ul>
  <li>1)大容量<strong>同步</strong>并行( Bulk synchronous parallels (BSP)):workers在每个小批量结束时同步数据。这种方法可以防止模型权重过时,同时获得良好的学习效率,但每台机器都必须停止并<strong>等待</strong>其他机器发送梯度。</li>
  <li>2)<strong>异步</strong>并行(Asynchronous parallel (ASP)):每个GPU工作进程异步处理数据,无需等待或暂停。然而,这种方法容易使网络用到陈旧的权重参数,从而<strong>降低</strong>统计学习效率;虽然提升了计算利用率,但未必能缩短收敛所需的训练时间。</li>
</ul>

<p>介于两者之间的折中方案是每 x 次迭代(x>1)才做一次全局梯度同步。自 PyTorch v1.5(Li 等人,2021 年)起,该特性在分布式数据并行(DistributedDataParallel, DDP)中被称为“梯度累积”。Bucket 梯度计算方法避免对每个梯度立即执行 AllReduce,而是把多个梯度攒进一个桶后再做一次 AllReduce 以提高吞吐量,并可基于计算图对计算与通信做调度优化。</p>
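
<p>在 PyTorch DDP 里,上述“桶式 AllReduce + 每 x 步同步一次”的做法大致如下(示意代码,<code class="language-plaintext highlighter-rouge">loader</code> 为占位的数据迭代器,假设进程组与设备已初始化):</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code># DDP 训练步骤:backward 时按 bucket 自动 AllReduce;no_sync() 实现梯度累积
import contextlib
import torch
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

model = DDP(nn.Linear(1024, 1024).cuda())
opt = torch.optim.SGD(model.parameters(), lr=1e-3)
accum_steps = 4

for step, (x, y) in enumerate(loader):
    sync = (step + 1) % accum_steps == 0
    ctx = contextlib.nullcontext() if sync else model.no_sync()
    with ctx:
        loss = nn.functional.mse_loss(model(x.cuda()), y.cuda())
        loss.backward()          # 仅在需要同步的步触发桶式 AllReduce
    if sync:
        opt.step()
        opt.zero_grad()
</code></pre></div></div>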

<!-- draw.io diagram -->
<div class="mxgraph" style="max-width:100%;border:1px solid transparent;" data-mxgraph="{&quot;highlight&quot;:&quot;#0000ff&quot;,&quot;nav&quot;:true,&quot;resize&quot;:true,&quot;toolbar&quot;:&quot;zoom layers tags lightbox&quot;,&quot;edit&quot;:&quot;_blank&quot;,&quot;xml&quot;:&quot;&lt;mxfile host=\&quot;app.diagrams.net\&quot; agent=\&quot;Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/128.0.0.0 Safari/537.36\&quot; version=\&quot;24.7.7\&quot;&gt;\n  &lt;diagram id=\&quot;xdYpP7w1t2VaaceZiyqw\&quot; name=\&quot;第 1 页\&quot;&gt;\n    &lt;mxGraphModel dx=\&quot;1242\&quot; dy=\&quot;-408\&quot; grid=\&quot;1\&quot; gridSize=\&quot;10\&quot; guides=\&quot;1\&quot; tooltips=\&quot;1\&quot; connect=\&quot;1\&quot; arrows=\&quot;1\&quot; fold=\&quot;1\&quot; page=\&quot;1\&quot; pageScale=\&quot;1\&quot; pageWidth=\&quot;827\&quot; pageHeight=\&quot;1169\&quot; math=\&quot;0\&quot; shadow=\&quot;0\&quot;&gt;\n      &lt;root&gt;\n        &lt;mxCell id=\&quot;0\&quot; /&gt;\n        &lt;mxCell id=\&quot;1\&quot; parent=\&quot;0\&quot; /&gt;\n        &lt;mxCell id=\&quot;KTwht3HF3Dpf_-XckZrt-1\&quot; value=\&quot;分布式训练模式:数据并行\&quot; style=\&quot;text;html=1;strokeColor=none;fillColor=none;align=center;verticalAlign=middle;whiteSpace=wrap;rounded=0;fontSize=21;rotation=0;strokeWidth=3;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;611.72\&quot; y=\&quot;1220\&quot; width=\&quot;269.29\&quot; height=\&quot;33\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;iroXu6kSOUnqGuu2dOUE-1\&quot; value=\&quot;\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f5f5f5;fontColor=#333333;strokeColor=default;dashed=1;dashPattern=1 1;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;80\&quot; y=\&quot;1580\&quot; width=\&quot;280\&quot; height=\&quot;160\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;iroXu6kSOUnqGuu2dOUE-18\&quot; value=\&quot;\&quot; 
style=\&quot;shape=image;verticalLabelPosition=bottom;labelBackgroundColor=default;verticalAlign=top;aspect=fixed;imageAspect=0;image=data:image/png,iVBORw0KGgoAAAANSUhEUgAAAgAAAAIACAYAAAD0eNT6AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAOxAAADsQBlSsOGwAAABl0RVh0U29mdHdhcmUAd3d3Lmlua3NjYXBlLm9yZ5vuPBoAACAASURBVHic7N15fBx1/T/w13tmd3Nu7mR3ZjZpUkpLmxbaBgotLeUqlHIIgoKKyiGieH4VBLxQVPD48lMQv3KKSlX4gnIVATnkaqFAgQKlUCBN293ZpGmu7ubc3fn8/kj4WmuPJPOZnT3ez8ejDxAz78+7m2TmPZ8TYIwxxhhjjDHGGGOMMcYYY4wxxhhjjDHGGGOMMcYYY4wxxhhjjDHGGGOMMcYYY4wxxhhjjDHGGGOMMcYYY4wxxhhjjDHGGGOMMcYYY4wxxtjuyO0EGGNSUSAQaPJ4PDOFEFOFEI1EVE9EdQCqhRDVAAoB+ACUjF3TD2AEwBARdQHoEkJsB7AVQBsRbU4mk293dHS0ARDp/ysxxpzABQBjWay+vl63LOtIAEcKIQ4DMAeA36HmdgJ4k4heBrDa4/Gs3rJlS9ShthhjDuMCgLEs0tzc7Ovp6VkqhDiJiFYAmOFyShuFEI8Q0aOVlZXPbNiwYcTlfBhj48QFAGOZz6Np2vFEdDaA0wFUuJ3QXvQAuF8IcXc0Gn0SQNLthBhje8cFAGMZKhQKGZZlnQvgEgANbuczQVEAf0ylUrd0dHS0up0MY+w/cQHAWIbRNG0JEX0TwKkAFLfzsckC8CCA60zTfN7tZBhj/8IFAGMZwjCMky3L+j4RLXA7F4e8KIT4YTQafdTtRBhjXAAw5jrDMI4RQvwEwEK3c0mTNZZlfbu9vf0ZtxNhLJ9xAcCYS6ZMmaIlk8mfCSHORX7+Lq4CcIlpmtvcToSxfKS6nQBjecijadpXLMv6K4AFyM+HPwBMB/B5v9+fisViazE6X4Axlib5euNhzBW6ri8C8D8ADnE7lwzzOhFdEolEXnA7EcbyBfcAMJYeqq7rVwG4A4DmdjIZKAjgAr/fXxSLxf4J3nKYMcdxDwBjDgsEAnWqqq4EsMztXLIBET2jKMont23bZrqdC2O5jAsAxhxkGMaxQoiV4Lf+ieoUQnyGlwwy5hweAmDMGaTr+tUAbgVQ5nYyWaiEiD7p9/uVWCzGywUZcwAXAIzJp+q6fguAr4N72ewgAEeXlZVNnT59+qpoNMqrBBiTiG9OjEmk63qxEOKesZP6XOPzlKKipAkVxU2oKJmKipImlBYG4VVL4FEKUeCrgFctAgAkUoMYHulFwhpEMjWA+FA7evtb0dPfir6BNvT2b8ZIMu7mXwcY3TPgbNM0B9xOhLFcwQUAY5I0NDRUJpPJhwAcme62vWoxAhVzEapahFD1ItT4Z4FI3jECOwe2Ity9BuGuNdjWtRojyZi02OMlhHiJiE42TXNH2htnLAdxAcCYBGMn9z0GoDldbfo8pTgguALTtY8gWD4PiuJJS7uWlUR736t417wfrR2Pprt3YIOqqifwCgHG7OMCgDGbQqFQlWVZzyIND38Cob5mCWboZ6Cx7jh4lEKnm9ynpDWEzR1PYFP0fmzb8RxEepbvb/B4PEu2bt3ak47GGMtVXAAwZkMoFCqyLOsfABY72Q6RgoaapTjsgK+gtmy2k01NWldsE9ZvuQ3vta+CZSWdbm5tKpU6rqOjo9/phhjLVVwAMDZJLS0tXtM073dywp9CKqbrZ2B+08UoL57iVDNS9Q604dXWm/Be9AFYIuVkU6tM0zwDgOPVBmO5iJcBMjY5RES3EtHHnWqgtmw2ls/9DWbXfxKF3gqnmpGu0FuBprrj0Vh3HLpi76J/uN2ppqaXlZVNjcVi9zvVAGO5jAsAxiZhbJOfrzkRu9BbgcUHfRdHzfwBSguDTjSRFsUFtTjIOBMlhXVo730VKWvYiWYO9vv9Fm8WxNjE8RAAYxOk6/oJAB4BIG+d3ZjG2mNxTPO1KPRVyg7tqsGRbvzzrSuwZcfTToS3iGhZJBJ5yongjOUqLgAYm4C6urqAx+N5HaOn10mjKB7Mb/oiDp36Janr9zOJgMCbW/6IFzb9HJZIyA7f4fV6523ZsiUqOzBjuYqHABgbP7WsrOxBIpI6Dd9fZODk+bdiuvYREOVuTU6g0c2Kqhci3LVa9v4BpZZlzY3FYivBRwkzNi5cADA2Trqu/5CIPiszZm3ZbHzksJWoKJkqM2xGKy3UcGDwVES6X8DASKfM0FN5PgBj45e7rxuMSaTr+iIAz0HiuL9RdQSWz/0tfJ4SWSGzykgyjkde+yLMnrUyw6aIaEkkEnlBZlDGchH3ADC2f6rf738AgCYrYFPdMiyf+5v/O5AnH6mKD9P1U9Hbvxk9/e/LCqsAODIQCNzW3d3t6CYEjGU7LgAY2w9N075GROfJijc1cCJOOOR6qIpXVsisRaSiKXACevvfR0//B7LC1iQSiSQPBTC2bzwEwNg+1NbWBr1e7zsAymXE0ysPxyktt0NVfDLC5QxLJPDwq59HuGu1rJDDiqLMDofD0roWGMs1ubneiDFJvF7vryDp4V9dOgMnzfsffvjvgUJenHjIr1FbJu08pYJUKnWtrGCM5SLuAWBsLwzDOEYIIWVzGX+RgTMPvxdFvmoZ4XLWwHAn/rr2LMSH5CznF0IcFY1Gn5MSjLEcwz0AjO2FEOIaGXEUxYPj51zHD/9xKC6oxYmH3ACFPFLiERH3AjC2FzwJkLE9MAzjZADfkhFr4fRvYVrQsQMDc05JYRCq4kW4e42McA1+v39NLBZrlRGMsVzCPQCM7YFlWd+XEaex9lgcPOV8GaHyytymi9BQs1RWuKtkBWIsl3ABwNhuNE1bQkQL7MYp9FbgmOZrQTzVZsIIhGNn/wwFXinzL480DGOhjECM5RIuABjbDRFdKiPOEQdemnOn+qVTka8KRxz4DVnhpHxPGcslXAAwtotAINAE4BTbccrn4SDjLAkZ5beZxtmoKzvYdhwhxOmapk2RkBJjOYMLAMZ2oSjKhbD5e6GQiqNmXZWzx/qmE5GCo2b9AArZnq+sEBFPxmBsF3yHYuxfPDIeEtP1M1DjnyUjH4bRExOnBU+VEeoC8Monxv4PFwCMjQkGg8sA6HZiKKRiXtPnJWXEPjR/6hdk9KjU67p+jIx8GMsFXAAwNkZRlLPtxjggeBIqihslZMN2VVkyFVPrTpARyvb3mLFcwQUAYwCam5t9AE6zE4NAmN/0BUkZsd21TL1ExpLKM8e+14zlPS4AGAPQ09OzFICtNXv1NUtQVTpdUkZsd9X+gxCqPtJumMre3l7bQRjLBVwAMAaAiGzv1TtD/6iMVNg+zNDPsB1DCLFcQiqMZT0uABgDIIQ4yc71Pk8pGuuOlZUO24umwDL4PKV2w9
j6XjOWK7gAYHmvvr5eBzDDToxpwZPhUQolZcT2xqMUYmrgRLth5tTW1gZl5MNYNuMCgOU9y7JsjwkfqNmaP8gmYIZ+uu0YHo9nkYRUGMtqXAAwBtgqAHyeEgTL58nKhe1HsLwFPk+J3TA8EZDlPS4AWN4TQhxm53qtcgEUxSMrHbYfiuJBsOJQWzFknPbIWLbjAoDlOwIw206AUBWfNJtuRtURdkPMAficZpbfuABgea2+vn4qgDI7MSQ8jNgESfjMy3Vdr5eRC2PZigsAltcsy5pp53qfp5Q3/3FBjX8mvJ5iWzGEEHxiE8trXACwvCaEaLJzfUXJVD721wVECiqKbX3roCiKvQCMZTm+c7G8JoRotHN9RQk/Q9xi97O3+71nLNtxAcDyXYOdi/nkP/fY7QEA0CghDcayFhcALK8pilJr5/qKkgNkpcImqKJkqt0QNTLyYCxbcQHA8l21nYtLCupk5cEmqLRQsxuCCwCW17gAYHlNCGGrAJBwMA2bJK/93QBtfe8Zy3ZcALB8Z2stmVe1/RBik+RV7S0DhM3vPWPZjgsAlu98ti7mHgDX+FTbn32BjDwYy1ZcALB8Z6sAsLsZDZs8CUMAXACwvMYnmPwnCgQCTR6PZ6YQYqoQopGI6omoDkD12JhxIUYfHCUABIBeABYR9QkhRgBEiSgshIgAMIUQ76mq+mY4HI649rdieyPcToC5JuV2Aoy5Ke8LgPr6en3sPPgjx06FmwPAL8Toc4Fo9LyQD//3HhCAyrGv+XBS0UG7fj0RwbIs6LreDWA9Ea0XQjyjKMqz4XC424G/Fhu/Ydj4PRhJ9qPQWyExHTZeiWS/3RC2AzCWzfKuAGhubvb19PQsFUKcREQrUqnUjDQ2XwXgGCHEMQC+blmWpev6eiHEP4UQD7a3tz8PfitJtyGM9uRMSiIZ5wLAJSOpuN0QXACwvJYvBYBH07Tjiejsnp6e0wFUfPhm7zIFwDwimkdE39B1vUMIcZ+iKPdGIpGnwcVAOgzZuTiRGpCVB5sgCZ89FwAsr+V0ARAIBJoURbmQiM4HoLudzzgEiOgLQogv6Lq+jYhu8Xg8t2/ZsiXqdmI5bAcAY7IXjyRtv4WySRpJcA8AY3bk5CoATdOW6Lp+v6qq7xPRd5AdD//d1QshfpRIJLbouv6/hmEsdDuhHNVh5+L+YVuXMxv6h9ttXS+E2CkpFcayUk4VAIZhnKxp2loiehbAR5Abfz8vgI8JIdbouv64pmlL3E4olxCRrSd4b3+rrFTYBNn97Iloi6RUGMtKufCAhGEYx+i6vkYIsYqIFridj4OOJ6JnNU37p2EYR7idTC4YW6o5ab39m2Wlwiaod8D2Z8/fPJbXsroAmDJlimYYxh+FEE8CyJsuciI6eqxH4H91Xbd1nC3De3YulvAQYpPUE7fdA8DfPJbXsrUA8Gia9vVEIvGOEOLTGF2Ln28IwMcAbNA07cqWlhav2wllIyGEvQKgvxVCWLLSYeMkhIW+wTZbMSzL4gKA5bWsKwB0XV+k6/orRPRLAGVu55MBSonommg0ujYUCs1xO5lsk0wmbRUAI8l+dMXflZUOG6cdsY1IJO0tA1QUhQsAlteyqQBQdV3/AYBnARzici6ZaJ5lWet0Xf8p9waMX2dnZzsAW8ssI90vSsqGjVek+wW7IboikQhvzc3yWlYUAPX19bqmaU8AuAqA6nY+GcwL4PJoNPrPUCg06bXteWidnYsjXbYfRmyCIt1r7YZ4EXwOBMtzGV8AGIZxbCqVeoWIjnY7lyxypGVZr+u6vsztRLKErQIg2vsyLJGUlQvbD8tKor33FVsxiIi7bVjey+QCgHRdv1oI8TgAze1kslANgL/run6Z24lkqkAgUKdp2tcAnGUnzkiyH+29r0rKiu1PtPdljNg8CEgI8ZKkdBjLWpm6FbCq6/rNAC50O5Es5wHwc13XZ2ma9vl169Yl3E7IbY2NjYUjIyPLAHwawOkYHTaxbZP5APTKXN6CInO8a95vN4Tw+XxcALC8l3HL53RdLxZC3ENEK9zMw1tQipLqKSipaUJpTRNKqqagsDwIj68YqrcQ3qJyeLxFAIBkYhCJwT6kRgaRTAxiqK8d8a429O/YjP6uNvR3bUFi2PU941cBONs0zXw8vUYJBoNLFEX5DIAzAZTLbsDnKcVnj14Dj1IoOzTbRTI1iD88s8huD8A60zQPlZUTY9kqo3oAQqFQlWVZq4go7Zv6eAtKUTmlBTVNh6O6aQH8dQeCaHwjJF7VC2/hLisSQ/++SEEIC7GOTehqewk7Wteie+s6JIfTfg7JKQCeCoVCK8LhcHe6G3eDrusHCSHOJaJzAUxxsq2RZBybO57AgdopTjaT91q3P267+x/AgzJyYSzbZUwPQCgUMizLegxAc7ra9BSUQm8+EfrBp6Cyfi4UJT31kGUl0bvtdYTXP4j2tx9Pd+/AOp/Pd3xbW1tvOhtNl2AwWKsoyjlE9GkhxGHpbLu+ZglOmX97OpvMOw+tOx/hrtW2YhDRvEgk8rqklBjLWhlRAIy9+T+LdDz8iVB3wJEwDjkNgYOOheopcLzJfUklh9HxzpOIrH8I2z9YDQjnVyYR0csFBQXLWltb+xxvLA0aGxsLE4nEqWO7Qi6HpHH9iSIQPrboQVSXznCj+Zy3I7YR975wOoS91XtbTdN0tDeIsWzhegGg63oxgCfg8F7+pKjQZp2IaUs+B3/dgU42NWmxjk1477lb0f72P9KxveyLiURiWWdnp+uTEyaJNE1bTESfwegs/gq3EwKAacEVWHbwr9xOIyc9tv4raO14zG6YG03T/IqMfBjLdq4WAC0tLV7TNO93csIfKSpCh5yGAxZ/DiVV2XFuTn/XFrz//K2IvLEKwko52dQq0zRPB+BoIzIZhjFdCHEuRmfxN7qczn9QSMXZRz6CiuJGt1PJKT39H+DuNSfLKIwXm6ZpbwyBsRzh5q56pCjKHbC5BntfKkOH4NBzbkBDy8fgK5I+8dsxvuIKBA86FnXTl2Jn+zsYim13qqnpfr+/KhaLPeJUAzIYhlHt9/sv8Pv9NwD4GYClyJA3/t0JCCSS/WiqO97tVHLKmk3Xoiv2jq0YRPSWaZpXSkqJsaznWgGg6/rVABzpivMVVWDWSVegecW3UeivdaKJtCj016J+3hko9NehZ9vrsJLDTjSzoLS0tDcej9veW1WmadOmFRQUFJxeVlZ2LYCbAJwKIORyWuPSE9+EKbXHoKSgzu1UcsL2vjew+t2fQMLOvT+OxWK8/p+xMa4MAei6fgKAR+DAToR104/GIR+5Gr7iStmhXTXS3431D3wP29971onwKcuyjmtvb3/GieATQLquHzm2dO/jALL2m1hXfgg+uuDucS8lZXsmhIW/vfQxbO97026oQY/HY2zdurVHRl6M5YK09wDU19frQojHAJTKjEuKBzOXfQOzl18B1VckM3RGUH1FMGavgMdXhO62l2VPElSI6ITi4uKV/f39ad+gIBQKTSstLf1aWVnZbQC+SUSHAsjqb2L/cAdKCutQWzbb7VSy2
tvhu/B2+G4Zof4cDofvkhGIsVyR7h4AVdO0J2Qf7FNUoWP+mb9ARehgmWEzVs+21/HqvZdhaGe77NCPm6a5HIDjSxBCoVCVEOLssaV7ad/4KR0KvOX4xJGPochX5XYqWWlwpAt/ef5EDCd32g1lKYoyNxwO2+5GYCyXpLUHQNf1HxLRZ2XGLNdm4ojzfo/SmkaZYTNaUXkQ+pwV6Nq8FsPxHTJDH+D3+4disdjzMoN+aGxc/7SysrJrhRA3AzgNQL0TbWWClDWMnvj7mKadAnJ/xW1WEcLCP974Grrjm2zHIqK/RSKRGyWkxVhOSVsBoOv6IgB3QGKvQ3XTAiz41M1ZNcNfFo+vGPqcFegNv4HBXlNm6MV+v/+vsVhMWmWh6/oiv99/ZSKRuIOIzgNwENxdgZI2fQNt8KrFCFbMdzuVrPLa5pvxtpwee0FEn4zFYh0ygjGWS9J1E/b4/f4HIPFY3+DMZTj07F9B9Wb1ULEtqscHfc5J6N+xGfHOVllhPQDmxWKx38PGtOv6+voDSktLv+r3+28DcBmAw5Ad4/oCwGohxE+IqADAAXYDmt1rEapeiNJCPtV6PKI9r+Cfb18ha57L/aZp3iAjEGO5Ji39kpqmfZ2IfikrXnDmMsw/6xcgJS9eIvdLWCm89tfLEH37cZlhv2qa5q8nckFDQ0NlIpH4OBF9GsAiZMBOkxPwHoCVqVTqzo6Ojs0AEAqF5liW9RokFMqlhRrOPPxeFBdk77LUdBgY3o57XzwT/cNSXtgtIpofiUTWywjGWK5x/AZdW1sb9Hq970DSEazVjYdhwbk3QVF9MsLlDJFK4KU/fwk7Wl+QFTKWSCSmd3Z27m+moarr+jFE9BkhxJkAimUlkAa9RPSQEOKPpmk+iT30eOi6/jsA58torKp0Os5Y8Bf4PH4Z4XLOSDKOB17+FHbENsoK+VvTNC+RFYyxXON4AaDr+l0AzpYRyx+YjkXn/R6eQr6B7kliOI61f7gAfVE5N1AhxE3RaPSLe/r/NE1rGduH/xwA2bTjzTCAx4nojxUVFQ9s2LBhZF9fPFbAboSknQf1ygU4peV2qIq7h1BlGksk8PCrFyHctUZWyC4AB5mmKXWWLGO5xNECwDCMY4QQT8mIVVSh48jP/QUFJbykal+G4juw+tZPyFoimLQsa257e/sGAAgEAk0ej+fcsb34p8toIE0EgDVEtFJV1bsnuhmMrutfBjCh4ZB9mRo4EcsO/hUU4iEsALBECo+/8TW0dvxDZtgLTdP8ncyAjOUaRwsAXdfXQMIab1I8WHT+H/Jmnb9dPdtew4u/vwCWlZQR7h9CiHvHxvUXI7vG9T8AcKeqqiu3bdv2gY04qq7rLwI4VFJeOCBwIo6bcx1UJb+HslLWMJ5485uyH/4vmKa5GGnYz4KxbObYzdwwjJOFEKtkxJp5wqWYulDq9gE574PVt+OdJ/LyWNr9jutPhqZpM4noVQCFMuIBo8MBJ827CT6P1E0xs0YiOYBH138J4S6ph/MNKYqygDf9YWz/HOuDLC0tXUlEht04ddOPxuzlVwCUTS+e7quqn4feyFsY6N7qdirpMALgISHEt0tKSi7esmXLvbFYTNq6SACIx+M7/H5/AsAyWTFjQxFs2/EsmuqOh9dTIitsVhgY6cSDL38aHX2vSY1LRF+NRCIPSw3KWI5y5KmqadoSIrJ9ao2vqAJLv/xgzh3sky4j/d14+jenITHY53YqTnkBwEpFUe4Kh8PdaWhPNQzjSSHEUplBSws1LDv4l3mzWVC05xU8/sZ/yVrqt6sHTdM8HZJ6fRjLdY70AJSVlf0awAy7cWaddAWqGvLjpugE1VcEb2Eptm9y5ARBt2wDcBMRXWia5s9jsdjLO3fuHExT26KoqOgRRVE+BUDaUpSRZBybovdDCAt65WGgHO3tEhB4c8sf8eSbl2IkGZMdfnsymVzR398flx2YsVwl/U4TCASaVFV9HzaP+q0MHYKFF/yRj1O1SQgLq2/7FPrMt9xOxY4eIrpHCHGnaZqr4fIbnmEYx42daCm9gG6oWYpjZ/8s5w4QGhzpwlNvXY6tO5wpRono5ZGRkWM7Ozu5AGBsnKTfwMrKyi4lIltdpKSoOPSc61Hoz6bl5ZmJiFCuzcK21+8DRFb1jKYAPEVEPwTwOdM074vFYhkxoSEWi20uKyuLAzhRduy+gS14O3w3PEohasvnZH0BLISF99ofwCOvfxFdsXedbMpQVfXogoKCewYGBva5twNjbJTsAsBTVlb2B9jsHq2fezoaWj4uKSVW6K9Ff/dWxDrsn6yWBmsB/AzAeaZp3hyLxdbHYrGE20ntLhaLveD3+0MApI9RpawRbOt6Dtt2PIfaslkoKcjOQnh73xt4bP2X8NbWPyGZGkpHk/Wqqi4pLCzkIoCxcZA6BKBp2nIiesRODFJULP3SgyipapCVFgMQ37EZz/7P6bIOWJGtTQixUlGUOyORSFZUKQDQ0tLijUajqwCc4FQbCqmYFjwV86d+AZUlU51qRqqe/g/w2uabsSn6oFs/b88nEomTeDiAsX2TWgDouv57ALYW7OuzV2DemT+TkxD7N6/e801E35a64YodfUT0oOz1+umm63oxgMcwukmSY4gUNNQsxaFTv4y68jlONjVp3fFNeL3tNrwXfQiWSLmdzpqRkZHlO3bskD7bkLFcIa0AaG5u9vX09HTAzp7pRDjqC3+Fv+5AWWmxXezseBfP3fwxN+cCJAA8SkR3er3eh9ra2tLSL+y0UChUZVnWPwE4vlUlgRCqPhIz9DPQFFgGjyJtX6JJSaYG0br9cbxr3odI1xqIzKrjuCeAsX2QVgDour4MgK3Xy7ppi3HYp34rKSO2Jy+tvBidH0g7cGVciOhlIcSdlmXd1d7e3pnWxtNE1/UaAI8DmJuuNn2eUkwNnIgZ+ukIlrdAUTxpadcSSUR7Xsa75v3YvP0fGEn2p6XdSeIigLG9kFYAaJr2/4jov+zEmH/WL6A1L5eVEtuDyJsP4/W/XZGOprYIIVYS0UrTNN9JR4NuG+sJeAwSzwwYL6+nGFrFYQhVL4ReeThq/DOlrSAQwsKO2EZEul9EpPtFRHtfRiI5ICV2mnARwNgeyOwB2AjgoMle7ykoxfGXPg3Vw8ekOimVGMKT1x2DxLAj98I+AH+1LOuP7e3tzyEPD2Opqanx+3y+e+HgxMDx8HqKUVHchIqSprF/TkVpoQavWgyvWowCbzm8nmIAo3vyDyf6kEgNYCTZj/7hdvT2t6KnvxV9A23oHdicbQ/8PeEigLHdSCkA6uvr9VQqFbETo2H+mZhz6g9kpMP2Y/0D30P49ftlhUtidBLcnYqiPBgOh9O1K1/Gamlp8ZqmeRsRfcbtXNi/4SKAsV1I6SO0LOtIuzH0g0+RkQobh9Ahp8kIs0EI8fVUKmWYpnmKaZp388N/1Lp16xIY3bKYZZbFXq/3kdra2vw8fpGx3cjaZmyRnYu9BaWorE/b3Km8V9UwD54C26fP3RONRq/v6OjYLiOnXGIYxiFE9C2388hyUSJ62YG4XAQw
NkZKASCEWGDn+sop6ZvBzABSPKist72BXYuMXHKQRwhxGwCv24lksQctyzpkZGTkWADPOxCfiwDGIKcAIACz7QSoaTpcQhpsImqabNVsABcAe6Tr+jfgwiqAHDEkhPi6aZqnt7e3d3Z2dsZTqdRyIcTTDrS12Ov1PlZTUyPtVEfGso3tAiAQCDQBKLMTo7rxMLtpsAmqtl906cFgsFZGLrkiFApNA/ADt/PIUi8IIeZHo9HrscuukB0dHf3JZPJUONMTsMjn8/2dewJYvrJdAHg8nll2rvcWlMIfmG43DTZBZcEZ8PiKbcVQFIW/cf9CqVTqVgBFbieSZToBXGia5pHRaHTjHr+gszOeSCROAg8HMCaV7QJACNFk5/qSmsasP/I0GxEpKKmeYjMG8Z7NYwzDuJiIjnY7jyxiEdGdAGaZpvk77OcsCB4OYEw+GQVAo53rS6ptXc5sKKm2VbsBAPcAAAiFQoYQ4qdu55ElBID7iGh+JBL5jGmaO8Z7IQ8HMCaXjFdvW+f2cgHgntKaRlvXW5ZlyMkku1mW9VsA5ZLCvYjcyAn8XAAAIABJREFU3EHRIqK/EtE80zQ/GolE1k8mCA8HMCaPjAKgxs7FpVwAuMZu8UVEATmZZC9d188BcKqMWET0iGmaC4lophDiJgC5sLHSIIA/KIoyNxKJnDXZB/+uuAhgTA7bBYCiKLYKgMKyvH+GuKaoPGg3RJ2MPLLV2AmA10sKFxNCfAEAIpHIpmg0+kVFUUIAvgbgDUltpA0RvQXgqz6fTzdN87xwOPymzPhcBDBmn4w5ANV2rpewIx2bJI/P9mdvq/jLdkT0S8grgq40TXPrrv8hHA53m6Z5g2mahwghDgNwHYDNktpzwhYANwJYHIlE5pim+eu2trZepxrjIoAxe2Rsv2dr2ZNqcykamzwJn33eLnkLBoMnCSHOlRTuedM0f7uvL4hGo68AeAXApZqmtRDRqQCOB3A45PweT4YA8CqAB4nowUgk8nq6E+js7IzX1tae5PV6HwGwWHL4D4sAPkCI5SQZNw6frQS4B8A1Ej77Qhl5ZJuamhq/oig3SQo3BOAiTGDiXzQaXQdgHYAf1NTU+AsKCpaObcd9GEZ3IXSqZ2YHgLVEtFYIsdbn873k5Bv+eHERwNjkyCgACuxc7LXfDc0miQuAyfH5fNfC5uqXDwkhfhSNRt+Z7PU7duyIAVg19gfA6PHcQojpqVTqQCI6kIg0IUQdgCCASgClGB3+K8focc4xAHGMFiM7iShuWdYWImolos2WZW1WFGVzJBIJT/5v6iwuAhibOD6Bh9mRi8vV9skwjIVCiC9KCveGruu/iEajksKN2rZtmwnABPC01MAZrrOzMx4IBJYrirLKgU2ZPtwsaPlY0cVY1pOxDHDEzsXJkQEJKbDJSA732w2RV29D06ZNKxg76U/G701SCHHBunXrEhJisTEdHR39lmWd4tCOgYt8Pt+jvGMgyxXuFwD2H0JskiR89nn1zRscHLwKgK2zL3bx32Nj+UwyLgIYGx8ZBYCth0ByJK+eIRklZb/3JW++eYZhzBVCXCYp3CZFUa6WFIvtAW8bzNj+2S4AiKjbzvXcA+CexLDtHvx8GQLwCCFuh5w5M0IIcVE4HM6FXf4yGu8TwNi+2S4ALMsa92EeezK0s8NuCmySJHz2O2Xkkel0Xf8mgPkyYgkhbo5Go8/KiMX2j4sAxvZORg9Ap53r411tdlNgk9Rv/7PfIiGNjGYYxnQAV0kKFx4eHr5cUiw2TlwEMLZnMuYAbN3/l+ydhIcQm6T4Dnu7yhJRq6RUMpUihLgV8nY8/HJ3d3de9Jpkms7OzngqlVru0MTAD5cI8sRAllVkFAC23gL7bT6E2OTFd7TZul4IkdMFgK7rXwBwlKRwfzJN8wFJsdgk8OoAxv6djCEAWw+BeFcbhMi7/WRcJ4SFgW57Pfi53ANgGEYIwLWSwu2wLOu/JMViNvDqAMb+xXYBkEwm37Z1/XA/Yh2b7KbBJmhn+7u2N2FKJpM5WwAIIX4LoExSuK+1t7fbmivD5OE5AYyNsl0AdHR0tMHmbPCutpfspsEmqGvzi3ZDmB0dHdtl5JJpNE37FIBTZMQSQvzdNM0/y4jF5OE5AYzJmQMgALxpJ8CO1rUS0mAT0dX2sq3rich2BZGJdF2vIaJfSgq3k4i+ICkWk4yHA1i+k1EAgIhsPU26t66DsJIyUmHjYFlJdG991VYMIcQLktLJNDcAqJURiIguN01zm4xYzBncE8DymZQCAMBqOxcnh/vRs+11Samw/enesk7GDow51wNgGMbJAD4hKdyzkUjkFkmxmIO4J4DlKykFgMfjsVUAAEB4/YMyUmHjEHnjIbshEoqi5NRBNlVVVWVjE/9kGCKii5CHxyVnK54YyPKRlAJgy5YtUQAb7cRof/txpJLDMtJh+5BKDKF94xN2w7yYa3vZFxYW/hRAvaRwP4xEIry0JctwEcDyjawhAAghHrVzfWI4jo53npKVDtuL9o1PyOj+z6nuGk3TjgIga7Leek3TrpMUi6UZzwlg+URaAUBEj9iNEV7PG6U5TcZQCxHlTAEQCoWKiOhWACQhXBLA+evWrUtIiMVcwnMCWL6QVgBomvY0AFtHA3d+sIY3BXLQzo53scP++v+NudS9bVnWVQCmy4hFRD83TfM1GbGYu7gngOUDaQXA2FuPvVd4IfD+87fJSYj9h/eeuRkQwm6YnHn7NwzjEADfkBRuk9fr/ZGkWCwD8NkBLNdJKwAAQAhxt90Y0bf/gf6unD9lNu3ina3oeOdJ23GEEIdrmnaohJTc5hFC3AHAKyGWBeDCtra2IQmxWAbh4QCWy6QWANFo9EkAETsxhJXC+8/fKikj9qH3n79NyqFLRHQ0Eb2s6/pD2VwIaJp2GYB5ksLdZJqmEw8IlgF4dQDLVVILAADJsbcqWyLrH0Kf+ZaMfBiAPvNtmG/9XXbYU8YKgcdDodAC2cGdZBjGdCL6vqRwW0dGRq6QFItlKJ4TwHKRjJnP/yYYDDYqivIBbBYXFcYcLLpwJYhk1yj5RQgLa277JHrNDU43tUoI8cNoNPqK0w3ZpOi6/gyAxVKCKcrJ4XBYenVlh67rDQBmADhICNGoKEqdEEIHEARQgtFTDglAxdgl/QBGMLqBUReALiHEdgBbAbQR0eZkMvn22MFftieRZLPa2tpSr9f7CCT9/Ozm+UQicVJnZ2fcgdiM/QfpBQAA6Lp+H4DT7caZc8r30dDyMQkZ5a8tr9yNtx7+cTqbzOhCQNf1SwD8RkYsIloZiUQ+LSPWZNXV1QU8Hs9CAEcCWAjgYABOvUnuBPDm2Nkfqz0ez+qxTcDyChcBLFc4VQAsBvCc3TjeonIc/aUH4SupkpBV/hnu78YzN56KxJCt05onK+MKgbE347cg5wG5HUCzaZo7JMSaCE8oFFpsWdYKACsANKe5/d1tFEI8QkSPVlZWPrNhw4YRl/NJCy4CWC5wpAAAAF3XXwBwhN04dQcehcM+cSNAjqWak4Sw8Mqfv4Tt77s+Ny1jCgFN0x4
mohWSwn3CNM27JMXaH8UwjKOFEJ8GcAaA8jS1O1E9AO4XQtw9NiE4p4/45CKAZTvHnqqapi2XsTsgAMxc9g1MXXS+jFB54/3nbsO7T13vdhq7crUQ0DTtXCK6U1K4h0zTPE1SrL3Sdb2BiL4w9uAPOd2eZBEhxB1CiNvb29vb3E7GKVwEsGzm6Gu1ruvPY3Rs0hZSPFh43h2orJ8rIavc171lHdb+8XOwrIx8AUt7IRAIBOpUVd0AoEZCuD5FUZrD4bCt5a77omnaEgBfJaLTAXicaidNLIxuHnVdri6V5CKAZStHp9hblvUdGXGElcSr916GoXi6h1uzz3CsE6/99VuZ+vAH/rV8MG37CKiqej3kPPwhhLjcqYe/ruvLdF1fQ0TPEtFZyP6HPzB6jzkdwHO6rr+gadpytxOSjfcJYNnK8YF1Xdf/BOCTMmKVBQ/CEefdAW8B/y7sSXI4jjV3fDbbzlNwtEdA1/VTIWn7YiHE09Fo9FhIXgqnadpRRPRjAEtkxs1gayzL+nZ7e/szbiciE/cEsGzjeAFQW1sb9Hq9G/GvNce2VE05FIefezMUj09GuJwhUgm89OdLsKPV9mE/bnlCUZTvhMPhl2QFrKqqKissLNwAOePng4qiHBIOh9+TEAsAYBhGCMA1Y2P8+WgVgEtM09zmdiKyBAKBEkVRVhHR0Q6EXzMyMrJ8x44dMQdiszykOt3AwMBA3O/3DwI4SUa8wT4T8R2boc08njcJGiOsFF7962XY/t6zbqdix1QhxOf8fv+hpaWl78XjcdNuwIqKiuuJ6FgZyQH4biQSeUhGoMbGxsLi4uLvAbgLQIuMmFlqOoALSktLB+Px+CvIgU2G+vv7E0VFRX9VVfUoAA2Sw9erqrqksLDwnoGBgbxYbsmcla61daqu668AkDaLT5u1DHM/+lMoan73BFjJEbz2t8vRvvEJt1ORSQB42M7QQDAYXKooyj8h52d8nWmaR0DCsrZQKHS4ZVl3AJhpP62c8joRXRKJRF5wOxEZeDiAZQPHewDGiLKysjcAnA9JRUe8sxU9W19DYOZxUPN0OCA5HMfLf/kSOuWu9U8CeApAE9JXIO6OAEwnoosm0yMQCoWKAPwdQLWEXBJEdEosFrO14920adMKCgsLfyyEuB1AnYS8ck0QwPl+v1+JxWLPIct7AwYGBkYKCwvvcagnoEFV1aO4J4DZla4CALFYLOz3+1UAS2XFHOw1sX3TswgcdAw8BSWywmaFofgOrL3zIvSG35Aal4i+b5rmReXl5fcJIWoAzIL7hcDn/X7/4vLy8nd37ty53xn4paWl1wA4VUYCQohrTdP8i50Yuq43JBKJvwM4Bw6vvMlyBODosrKypRUVFf/YuXNnVo91DwwMjBQXF/8vES0iokbJ4RtUVT26oKCAiwA2aem+sSu6rj8G4HiZQQvLAph35s9R1TBfZtiM1We+hVfvuRQDvXJXoxHRM5FI5DgAqQ//WygUOtiyrO8COAvuFQK72udkQcMw5gohXgLgldDWuz6fb25bW9vQZAOMrUL4A4BKCfnkk04hxGei0eijbidiF08MZJkqbT0AY0RVVdUTlmWdC0DaWr7kcD/MN1ZB8fhQVT83d7cNFgLvP3871t93JUYG+2RH3+71epf19fX928EBO3fu7IjFYveUlpY+TEQ6gAPhbiGw18mCLS0t3lgs9ncAuoR2LABnbNu2bfMkrydd168G8FsARRLyyTclRPTJsSGBrF4uyBMDWaZKdwGAvr6+uN/vfx3AuZD4IBHCwo7WF9EbeQu1ByyE6sute+5wfzdeu+eb2LruHghhyQ5vCSE+Fg6HX9/bF8TjcTMWi/0lQwqBPc4RUBTlSkjacwLAb0zTvHkyF46N9/8BwJeRGb0m2YoAHO33+6fEYrGHMVqUZSWeE8AykWs3J13XfwDgKidie4vKMePYr6Kh5aysXyoohIWt6+7Bu0/e4NipfkKI/4pGo7+ayDWaph1KRFcBOBnuP+QEgMcAHAOgQEK8LSMjI3Mm063a0NBQmUwm7wdwlIQ8Jq3E40F9YTHqi4owpagEocIi1PkKUaSqKFQU+D1eFKmj9f9gKoVYMoFBy8JQKoXtI0PYNjiALYMDCA8NYtvQAPqTru8suQrA2aZpDridiB28OoBlEjdv3KTr+q0ALnSqgXJtJmaf/D1UGHOcasJRfdGNeOvvP5Y+0W9XRHRLJBK5eLLXZ+AcAduEECdNZuw5FApVWZb1GIC0bHG8q0JVRXNpGQ4tr0JLeSUOLCmFInEozBwaxCt9PVjX141XensQT6W/IBBCvEREJ7twBLNUPCeAZQq3b9geXdfvA3CKUw2QokKfvQLTllyE0pomp5qRKt7Zig9W34bIGw870d2/q1WmaZ6OXSb9TdZYj8D3Mfq9dPvnatKEEH+MRqOfneh1YztePgGg2YG09qhEVXFMTQAn1AQw218ONU1zX1JC4M1YHx7rbMcz3Z3p7h3YoKrqCdu2bbO9UZSbuCeAZQLXb9S6rhcDeBzAIifbIVIQnLUM05ZchLLADCebmrSd7e/gvWdvQcc7Tzr94AeAtYlE4njZNwlN01rGhgaysRDoIKLmSCTSNZGL6urqAh6P51mM7mznKAKwoKIaJ9YGsbiqBgWKu0Ncw5aF57s78VhnB17q7UrX4v0NHo9nydatW3vS05wzuAhgbsuIG/RY1+mzSMfbExFqpy6EcchpCM48HqpHxpDx5KUSQ2jf+ATC6x/Ejs0vAiItt9AXCwsLl7e2tkpfSvChbCwEiOjsSCTyvxO5Zuy8gacBzHMmq1EqEY6ursO5xhRMLc7MPS8+GOjHynAbnu7uhOX8z/FqACfwnIB94iKA7VPG3Jjr6+v1VCr1D6SxC9VbUIrAzOMROuQ0VDbMg6Kk5/RVYSXRtWUdIusfRPs7TyI53J+WdsesGRoaOqm7u9uZGYW7yaJC4IGx4ZBxa2xsLBweHn7EobFcAKMP/hNrg/iUMQWhwuxY2bJtaAArw1vw+I4OpJwtBFaZpnkGJGzR7CYuAphbMuqGPDaD+iEAR6a7bY+vGJUNLaiZejiqGxegLDhD2goCISzsbH8XXZvXoqvtJXRvWYfkiCsvLs+NjIyc7MYEoQwvBPpUVZ01wXFl0nX9LgAfdyqpZn8ZvtE0A9NKsvP46039Mfy/1k3YGHe01rzDNM0LkeVbB3MRwNyQaTfiD+cE3AVJW7lOlsdXjJLqKSipbkJpTSNKqhtRVB6E6iuGx1cMb2EZPL5iAEByZACJoZ1IjgwgOdyPoZ0d6O9qQ3zHZvR3taG/a4tbD/xd3Z9Kpc7t6OhIa3fD7jKxECCiayKRyHcmco2mad8mop84kU+Zx4uLG6ZiRZ0mdSa/GywhsGp7FLds/QAx5yYLXmWa5tVOBU8XLgJYumXq3cUztkTwPLcTyRG/ME3zCmTQRioZVggIAKvGTh9ct78vNgzjFCHEA3BgX/9FldW44oCZKPfK2Mk4c/QmErj2/Y14sXdC8yvHyyKiZZFI5CkngqcTFwEgTdMaVFVtsiyrCUATET
UJIZoAFAKowOheHyUA/AA8APoBjAAYIqIuAF1CiO0AtgJoI6LNyWTy7Y6OjjZkeU+RbG7fePeFdF3/HoDvw4UdC3NEAsAlpmne5nYie5OhhcDVezuGOBAITFVV9VUA5TIbVolw8ZQD8HGt3vUPwSkCwN3mVtyytdWJuQHtiURiXmdnZ7vswOmWT0WAYRjVRHS4ZVkLABw+9sepczN2AniTiF4GsNrj8azesmWLrVM+s13G32sMwzhWCPEnjB4XyiZgMl3bbsnAQuDhsR6BXQsBj67rzwJYKLOxYEEhrprejFmlZTLDZqy3Yn344aYN2D4yLDv0k6ZpnoAM6umarBwuAkjTtPlE9BGMDvPOdSGHXW0UQjxCRI9WVlY+s2HDhrzaStntG+24jG2yshLAcW7nkmW2+3y+GW1tbb1uJzJemVwIOLF99fQSP34+82BUen0yw2a8rpFhfOudN/B+v/RnUE7MBwByqwjQNG0JEX0Cow/9UDranIQeAPcLIe6ORqNPIstXl4yH2zfYiVB0Xf8ueEhgon5tmuZX3U5iojKwEHgSwNEYHXOUYn5ZJX5y0GwUq+lZfppp+pNJfOfdN/HaTqn1aYqIlkQikRdkBnVLNhcBuq7XENFnhBAXATjIiTYcFBFC3CGEuL29vb3N7WSc4vaNdcIMw1gohLgVadwvIMsliejQSCSy3u1EJiPDCgFpllbV4nvTZ8Gb5YdV2ZUQFn703kY807VdZtgNlZWV83OlOzfbioCx39lvAvgo5BzO5SYLwIMArjNN83m3k5Et696kY7FYuKGh4bahoaEUgCMg8Y0sRykAmmOx2O/dTmQy4vF4tLy8/AMhxAVwYNa9G5ZW1eKq6c3w5PnDHxid/HhUVS3aBgewZVDaCtW6oaGhVCwWe0ZWQDdly1HCmqa1lJWV3UxE/w1gDnLj3kwY7b24wO/3Ly8tLY3E4/H33U5Klqx+owoGg41E9AsiOsvtXDKdEOLcaDT6J7fzmKjm5mZfT0/PKxi9oWS9+WWV+Pmsg/P+zX93CWHh0rfX43V5wwFDqVRqVkdHx2ZZAd2WqT0BudpLtw9rLMv6dnt7e9YXmFnXA7CreDzeG4/H7yktLX2KiKYBmOJ2TpmKiI4oKCi41W6Vn25er/c7AM52Ow8ZPpzwV6hm9a+dI1QiHFVdh7W9XehOSPkR9RCREY/H75ERLBNkWk+AYRjVfr//BiL6H4y+JefDwx8A6onoPL/fP72qquqFvr6+jFhSORk5cSeKx+NbY7HYHX6/fw2AqZD/y5EL/KqqemOx2ONuJzJewWBwFhGtRA50JQYLCnF98zyU5dgGPzL5FAWLK2vwz67t6E/ZPqEaRDSrtLT0qXg8vlVCehkhQ4oARdO0iwDcj9HeiHx58O9ujmVZnystLR2Mx+OvIAs3GcrJb5xhGAsBXCqEOB3ZP248AuBNAC0SYiWEEIdEo9GNEmI5TdF1/XlIXnPvBpUIN86enzfr/O16M9aHr214TdZmQc+ZpnmUjECZxK3hgEAgMNvj8fxOCHGYA+1ms9eJ6JJsW32S7Q/HPYpEIi9EIpEzATQBuBpA2OWUJoyIWgFcmUwmGyorKxcB2CQhrJeIbpAQx3Gapn0FOfDwB4CLpxzAD/8JmOMvx0UNU2WFW6Lr+jJZwTJFZ2dnPJFInATAiZnpi71e7yO1tbW7nkJFuq5/SVXVl/jhv0dzhRDPj+0VkjU96znZA7AHqq7rx2B0LPkMANUu57M32wH8LxH9ORKJvIhdupQ0TTuRiB6V0QgRfSwSidwrI5YTgsFgo6Iob2F0v++stqiyGtccdHDe/KLJIgBcsXE9XuztlhFutWmaTrwpuy4dPQGqqhYpivI7jE7yY/v3T6/X+6ls2GY47+5LLS0t3vb29iVCiOUAlsPd2eUWgPVCiL8rirIqEom8hH1sY6rr+n0AJnRm/V5sBTDTNE3XjyjcA9J1/TEAst7argSwCC7MUC7zeLFy7uE5d7BPuvQmEjj39RelnCJIRIuyrXt2vJwsAojoZSFECIAmO3aO6wBwrmmaT7idyL7kXQGwu0AgUKcoypEAFhPRAowWBFIPetlFN4DXALxKRM+qqrp669atPeO9eOzN+G0ARXYTEUL8JBqNftduHNkMwzhPCHGHpHD3mab5UcCdpUqXTZ2BUwJ6OprKWQ+2R3DdZvujX0T0t7FhwZzkcE8AmxwLwI9M0/whMnSCYN4XAHuiadoUADMVRfnwGMp6AAGMDh1UAyjG6Mx0/9glOwGkAAwC2DH2pwPANiFE69h4/jumaW6zm5uu61cB+IHdOACGFUWZHQ6HM2ZTi7EzHzYAqJIQrtfr9c7avRsuXYVAs78MNzbPh0L8K2aHJQS++NY6vBOPSQglpkaj0S0y8spEXARkrN+bpnkRMvBsAb47ZZlQKFRkWdYGjE5wtOth0zQzZlxP07R7JG7q9DnTNG/fR1uOFQIqEW6ZcyimlZTu/4vZfr0bj+GLb62TsSrgatM0pR7mlGlypQjwl6horPdiar0PU6f40BTyQqvzoqRIQWEhUOH3oKho9Nd2cFCgN5bE4KDAwJBAdHsCrdsSaN0ygs3hEbRtSyDWb39ZqU2rAJydacOuXABkIV3XP4LRNbgynGaa5kOSYk2apmkfJaK/Sgr3pGmayzCObjcnCoEVdRouPyDbzj7JbNe8vxGPdbbbDbPNNM0mjPbW5axsLAL8JSoOPbgQi1pKsHB+EaY3+aAoch5PliXwbusIXnh1AGvWDeCVNwYRH3DlxOg1Ho/nlIkM+zqNC4AsZRjG34UQJ0kI9YHP55vd1tY2JCHWpDQ0NFQmk8kNkDPRqD+VSh3c0dHROpGLxgqBXwA4xk7jKhH+OPdwhAptT9Ngu9g6OIDPrn8Jlv1egGWZPjFLhmwoAkpLFKw4xo/TT/Bj/uwieNT0PI6SKYF1bw7h/sd24tFn4unuHdigquoJ27ZtM9PZ6N5kzXpF9u/Ky8tfEkJ8HvZ3yauyLGskFos9KyOvySgpKfkNACmbtRDRFdFo9JGJXhePx6N+v/8g2LxhHlsTwKk88U+6cq8XbQP9aLN/YNBILBZzvcfLaQ7vGDhpRMBRh5fgvy6swTXfCuLEo0phBL3S3vbHQ1EIoaAXxy8uxXlnVeLApgIMDglsNRPpaL5OCLG8srLyL319fa69dH2IewCymGEY1wghrpQQatCyrFlunHttGMYxQognIeFnUQjxUjQaXYRJdvHqur4RNs4tJwC/O2QBphZn/fYFGemDgTguXP+y3enUPZWVlcFcOSp4fwKBQImiKKuI6Gg381AUYOnhJfja+dWYPaPQzVT2alPrMG69qwcPPRFDMuX4pP21qVTquI6ODmlHYE5GTu4EmC+SyeRPIGeXwyJVVa+TEGdCAoFACYDbIKcQHbEs60JM8uFvGMZ02Hj4A8CCimp++DvogOJSHFphe4FIZW9v75Ey8skGHR0d/clk8lQ4s2Pgfqkq4WMryvGPO5tw60+NjH34A8D0qQX4xbeDeOQPU3Dm8jKozg5JHK6q6l1w+ZwTLgCy2
Fj1eKmMWEKIj2qatlxGrPHyeDw/FkLI2vP1mo6Ojrcme/HYuRG2LK8L2g3B9mN5rf3PeGwTsLzR2dkZT6VSy4UQT6ez3dkzCnHPb+px7eUBNIayZzOspnoffnZlEPff0oB5zY4WLKcYhnEHXOyJ5wIgy5mmeTeAp2TEIqJfNjc3+2TE2p9QKHS4EOIrksJtqKysvNZmjFPtXFyiqjiyssZmCmx/FlfVoMRj76WJiFZISidrpLMnoKJMxTWXBfC3m+px8MzMfePfn5nTCnD3jfX48TfrUO53ZrqcEOJcXde/50jwceACIAdYlvVVADJmsBzU09PzdQlx9qm5udknhLgNciahphRFudDOmK6u68UADreTxDE1ARQo/OvktEJFxVFVtbZiCCGaA4FAnaSUssaHBwgR0ctOtXHsohI8vrIRHz+lPK0T+5yiKIRzTqvA4ysbcfQRjg3vXWUYxrFOBd8XvmPlgPb29g1CiBslhftuKBQyJMXao56enm8LIWbLiCWEuCEcDq+1GaMFgK0+yhNqAnYuZxNwov1hAFIUJWOXxzlJUZTisb39pfKowJWX1OLmawxUlufe4rKqChW3/tTA5V+sdWK5oiKE+FNtrYTxrYk2nO4GmTOGh4d/AMD2TikA/JZl/UJCnD0KBAKzMXpAj21E1GpZlu3uMyKydexwiceD2X6njo9guzvYX45i1fbcqbyZCLgLRVXVOyH5YB8j6MVdNzbgwrMrkcs7XxMBF51TiT/fEIJWJ33uXtDr9a5Emp/JXADkiO7u7p1EdLmkcOcEg8GlkmLtSlVV9TYAMuYZCCHExZKW0Rxh5+KD/eWgv/J+AAAgAElEQVRQc/nOl2FUIhzsL7MVY+zgr7yi6/qlAE6QGbN5eiH+dlM95s7K3rH+iZo/uwh/u6kBM6cVyA59nK7raT2gjQuAHBKJRO4EsFpCKFIU5deQvETFMIyvwuZY+y7ukLijm60CoKVcxtlFbCLm2f/MZyOP9kHRNG0mgB/KjLlwfjH+dH0I1ZWurmRzRW21B3++IYQj5knf8fP7hmHY6pGcCC4AcosA8BXI2et8jq7rl0iIAwAIBAJThRA/khQu6vF4pCx/rKurC8Bml+i8sgoZqbAJmF9u+zOv0HW9XkYuWUAlot8BkPaavnxpKW7/uYHS4vx9hPhLVPzuFyEsX+rf/xePnyqEuDVdq7Hy97uXo0zTfA3AzZLC/XDsAWkXqap6CwAp02iFEF+WdaCG1+u1tQ9BicfDm/+4YFpxKYpUe5PNhBCzJKWT0XRd/yJs9nLtavnSUlx/lQafN286UPbK5yVcf1VQdhHQ3NPTI2We1P5wAZCDPB7PdwF0SghV4fF47K6vh67rFwI4TkI+EELcG41G/yYjFgBYlnWAnesbCouh8Ph/2ilEqC8sthdDUWQcqZ3RxpY7Xi0r3hHzivD/vqc5vUteVlFVwi+/H8TiQ+39PO7mylAoNE1mwD3hAiAHbd26tUcI8R1J4c4zDGPSbw9TpkzRAPxcUi7dyWRS1uZBAAAistUDECriU//cUl9k74YrhMj5AkBV1WsBVMqI1Ty9EL/9ic5v/nvg9RBuvFqXOTGwIJVK2X752h8uAHJUNBq9HcArEkLR2B4Dk/pZSSQSv4GkGxCASzs77R8KvxtbBUCDzbdQNnkNNgsAAFNk5JGpgsFgM4DPyohlBL343c91+Etyb42/LKUlCn73c0PaEkEiOkvTtCVSgu0FFwC5y1IU5csALAmxWjRNu2iiFxmGcRaAMyS0DwCPm6b5e0mx/o8QwtZDwO5bKJs8u0MAAHJ672ZFUX4MCbttelTg+qu0vJztP1G11R786vuatM2CiOgnUgLtBRcAOWxsh7zfy4hFRD8xDKN6vF8fCoWqhBC/ltE2gP5UKnUxYPck2P9ERLYWlNf6pK8FZuNUV2D7s8/ZAiAUCi0A8BEZsS67uDav1vnb1TKnCN+4aNy3yv1Zouv6MlnBdscFQI5LpVJXAuiVEKrasqwfj/eLLcu6DoCUrS2FEN/p6OjYLCPWHtiawi9hRzo2SRI+e2l36UxjWdaVkLDPwbGLSnDBx2WN4OWPi86pwlJ5ZwdcJSvQ7rgAyHEdHR3bAXxfRiwiukjX9fn7+zpd10+ApLFHAC9Go1FZPQl7YrMA4DFRtxTbP3wpJ8dvxjb9Oc1unIoyFT+7IpjT2/s6hQj4xZVBWacIHunU5kBcAOQB0zR/C+ANCaFUADdiH28WgUCgBKP7EMi4bQxblnUh5Mxj2BsuALKUhB6AnBy/IaJLIeHe/q2La3LyYJ90qapQcennpXUySdn4bHdcAOSHpBDiy5Azhr7QMIy9vt17PJ6fAGiU0A4AXNPe3v62pFh7w0MAWarYY/vhlHMFQENDQyWAT9iNM6+5EGetsHfeAgPOPqUcBx9kf/6EEOJ0TdOkr1rhAiBPRKPR5wD8WUYsIcRPp06d+h/H3xmGsVAIIWud/puVlZU/lRSLsT0hXddPCwaDsxobG3NillsikfgMAFubU6gq4epvBKAo3Pdvl6IQrv5GnYyNkxQiukBGTrvi15c8oqrqt1Kp1GkA7O5bGRgeHv4hgK9/+B+mTZtWMDAwcBvkFJUpRVE+t2HDhhEJsfYnDmDSJ8sMpJIo83glpsPGayBp+8gLBcADiqJgZGRE6LoeIaIPhBAfCCE++PDfvV7vB7K2nnYaEU14ue7uPnpimRMn3eWt2TMKcdrxftz32E67oc7H6K6OMs56AZBHp2GxUbquXwY5O/MlFUVpCYfDb4zFvRrA9yTEBYD/Nk3zMkmx9knX9a0AJn0ozN3zFyJYkBMvj1knOjSIc157MV3N9QBo/fCPEKKViFpTqVRrR0dHG5ydpzIuuq7PA/CqnRiqSnj0D1PQVJ+Ws2jyxgdbR3DSZ9tg2f8pWSbxFFTuAcg3pmn+0jCMzwghZtsM5bEs6zcAjgqFQrMty7pcRn4ANqdSqR9IijUecTsXD6SSsvJgEzRgSXsRGo9KAC1jf0BjU+NVVYWu6yNEFP6wKPjwn5Zltaqq+nY4HB5MU462x/5XHF36/9m78/ioqrt/4J/vnUnINmEnmTMTDIsigqIGd3EFwbWL7aN9tNbaWpc+dXvUqq0KamvVarWb1q1P61Jt61ZtZXHXKi4oqyhCCMnccycEAmSyZ+ae3x8J/SGChJzvnTvLeb9evF6V5n7uYSYz93vPPYu5+Htg3OhCzDyqDC++pvV1AwBnADAFgDFgSaXU5QAWMGQdGQ6Hz1JKXQKA41tDEdH3Gxsb2xiy+oWIWpUa+NjI9lRaL0LGNtr0HwFwKVRKjQUwduvvklIKRATXdZORSKR++8cKgUBgjeu6q6WU7Yzt+KbOwUTARWfn7NIIvvvhOcMx9/VWaHzdAMDXampqLl60aFEPR5tMAZCHpJQvRSKRp5RSp+tmEdHDSimuh+AP2bb9ClNWv7iu20YaE52bursYW2Psjix57YNbiwMimgFga2EAABBCOADWAPhPcWBZ1hrXdddIKTf09yTRaHRf13WrdRo67eBS7DXW3P17Ze9x
g3Dk1FK8+b7W/c3weDx+JIBXOdpkCoA81dcLMAua0+AAcF38nWAweDVTVr8RUaPO8Q0dnDdwxu6o70hbR5GXwn1/jtzae7BNcdBJRFIp9TGAFduNO1iHbQaDua57sm5DTp9lpv157WuzQroFAJRSs2AKAEOHlLIhEon8Qil1s99t6XOxHyOtlVKf6fQA1HeaAsAvDZ3perTum6KtvQcATtlu3EEXegckbu050FovvqzUwvFHlGk32PhyM44sQ6g0gETbwB9fEdFJAFjGXJl1APLYkCFDbgewyu92AHhSSvmsHye2LGuNzvENHTl/EcpYOdIDMFCDAEwEcIpS6lIA++iEnXxcCEWDzKQwrxUXWZh5lF6nq1Jq8siRI1n2WTEFQB5bsWJFNxH9r8/N2JhMJi/16+RKqc90jq/vaIerOarH2H2uUojlfg9A2nxlhu7SIEZ/fXXmF9ZQ223BYPBwhqaYAiDf2bb9AoAX/Do/EV2xfv16refwOpLJ5Gqd49tTSdS25/WdqC8+a2tFh5mBwaKsxMKBk7UWDzR2w9T9ilBWon3pPYKjLaYAMBAIBC4D0OnDqefZtv1nH877H33FR0In48OWrFgkLqd81MKxw7UBAAdNKUZQf6lao5+CAULNfnoFFxEdzNEWUwAYaGhoWKOU+mWaT9uWSqUuTvM5d4iItFZP+3CLKQDSzbzmfA6vYdu33uinww7Q3ol6XzCs5GsKAAMAEAgEfg6gLo2nvKaxsbE2jefbKdd139Q5fknLFqTMOIC0SSmFZYktfjcjZxx2oOn+TzeG13ywEGLAS5hvZQoAAwAQi8U6lFKe7Dm9A29LKX+fpnPtEhFpFQDtqaS5IKXR4pbNZglmJqHSAPYaYxb/SbeJ4wehtFjv8quU0pr5AZgCwNiG4zhPAZjr8Wm6XNc9HxmwecpWPT09bwPQuqLMb4oztcbYFfNa8xkzusBs++sDyyJUa+65YFnWGO126AYYuUUpdQUAz7bhJaJb4vH4x17lD0RTU1MrgCU6Ga81N6GLYasv48t1uim80dzvFXJ35mMiuge9s19WAsiKNYW9MCZq7v79MrZKbxFVpZR2AWBWAjQ+x3GclUKIewB4sR3vksrKytts2/YgWo9S6nUiqhno8W3JJN5qbsLxIyo4m2Vs582NGzi6///Ptu07tvlvSwgRsSxrnOu644honFJqHICtf4bonjBTjRltCgC/MLz2e+gGmALA+ILu7u6bCwsLzwIgGGOTSqnvc+1ixY2IngFwhU7G3Ka4KQA8Npeh+5+Intvur1wpZQOABgCvbf/zQogRX1IchLUb5CPdu1Bj4Mbqb7s8QjfAFADGF2zYsCEhhLgKwGOMsb9yHOcDxjxWUsq3hRA2gMhAM97f3Iw17W0YV2KmVXlhdVsrFm1p1o1Zadv2bi1/3bcr3wYA727//wkhSr6kONgDGf4dWzkyo5uX08KjtF97UwAY3iCiFap3ezKOEUJdnZ2dtzDkeMklor8ppS4baIAC8GisDjfuNYmxWcZWf47VgWGy5fZ3/1qklO0AlvX9+ZyampqC9evXj04mk+O3FgdEtG2h4Pv8u7KSgN9NyFsMqwEO1w0wBYCxI0Gl1IPgufgDwKDi4uLvAfgVU55X/gpgwAUAALze3ISGznZUFWkv9GFsY11HG97cpD34D0T0BENz+qXvcdeavj9fUFVVJZLJ5LjtexCIaBwYvtz7o6TEzADwS4l+AaD9JWMKAOMLhBCXA5jKmamUmj1y5Mi/NDVl7hwu27YXCiHqAYweaEZKKTwaW4drx09kbJnxqF2vvekSEb1v27bWbA9ODQ0NEoAE8IV1KKqrq4d0dXWNtyxrrOu647frOYiAqThnuAs1BojhtR+kG2AKAONzotHoeNd153gQXR4MBm8D8B0PsrkoAP8H4AadkAUbGnF6OIq9Ss0OaxxWtrbgpQ36+0W5rvsgQ3PSoq6ubjOAD/r+fE51dXVRZ2fnWMuyxgF4FhrTuUtNAeAbhtdeuwAw776xLXJd9z549GySiL4dDoeP8iKbSyqV+h00N0ZKKYW7aleZbYIZuErhnrWfcbyWbV1dXWnr/vdSXV1dZzwe/1hK+TwAsyVi/tL+UJgCwPiPcDh8PoDjPTwFEdHdADJ25FFjY+N6ANoXipWtLXhhvcPQovz2/HoHK1tbtHOI6LHm5mb9oMzTqnNwW7tZvMovDK+99oJtpgAwAADRaDRCRLen4VQHCCEuSMN5BqyvSNF2f/0abO7JyGUPssKmnm48UL/D8XO7K0VEd+z6x7JSm87BraYA8A3Da6+9gqUpAAwAgOu6vwcwOE2nu1kIoT2H1St9A8Ve0c1JJJO4dfVKjqlrecdVCreu/gSJJMumP0/GYrHVHEEZSKsAaG83v51+adcvALTee8AUAAYAIcQZAE5L4ymHAfh5Gs83ELdxhCzcvBFPynqOqLzyuKzHu5s3ckSpVCp1K0dQhtLsATBDCPySaNMuALRXxTIFQJ6LRCLDAfzah1N/r7Ky8iAfztsvUsr5ABZwZN1fX4vlZrvgflvashkPN6xlySKiZxobG5ezhGUmrTEA8SazrbJfGF577YUxTAFg/ArAKB/Oa1mW9Rtk8O8gEV0FhlHWKaUwZ9UKbOzO203n+m1jTzfmfPYxUjwzKHqUUtdxBGWwBp2DaxvMGBW/1DZoj+Fr0g3I2C9fw3uVlZUnKqW+7WMTDhFCnOvj+b9U31iAP3Fkre/uwpUrl6BVfye7nNWWTOLqlUuwgalQUkr9Rkr5KUtY5qrVOXhtvWc7fxu7UKv/2q/TDTAFQJ6qqKgotSzrd363A8DtfY8hMlIgELgeDINtAKC2vQ0/+WQZul0z8np7SaVw/arlWN2m1aO9reZAIPAzrrAMpvWshOEu1Bgg3QKAiLSfk5kCIE8FAoHbAIzxux0AhiulZvvdiJ1paGiQRMQ2iGxxy2bcwtfFnRNSSmHOZyuwaMsmtkwi+kksFtMeJJXplFJa8yTX1nfDdc3vYrq5rsK6mN7jF9d163TbYQqAPBSJRA4DcJHf7djGRZFIZIrfjdgZ27ZvI6L3ufJeb27CjatWoMv0BKBHubj5s4/xxkbtx5nb+rdt2w9wBmYq3bvA1nYXn9aaXoB0+/izLrR16H3+LcvSHtxqCoA8U11dXaSUehg8730SQB1DTkAp9Rvw7T7ILZlKpc6F5hLB23qzuQk/XrkEbTzz3LNSWyqFK1cuwasb13PGtqZSqXOQJ0vkSiltAFpTTN75sJ2pNUZ/LfyoQzdis23btm6IKQDyTE9Pz/UA9maKu4uIzmPKmhYOh/+bKYtdPB7/GJqbBG3vo5bNuOTjj/JydkBzTzd+tHwRFm/ZzJpLRFc1NjZqDYzLMi6A93QC3l5kCoB0e+cj7dd8GcxeAMbuiEQi+yulrmKK+8yyrNm2bb8K4EmOQCK6Y8SIERm7hZ6U8i4Ab3Nmrm5rxYXLFmFZHq0TsLRlM85f+gHWtLOMrdzWPNu2/8AdmgXe0Tl40bJOJFNmHEC6JFPAB0u0ewC0ir6tTAGQP4JKqQc
BFDBkKdd1z4/FYh0AQERXQnNBkj7hwsJC1rtsZinXdc8CwwIc21rf3YVLV3yEv8j6nF42WAF41F6Hyz5ezDbVbxv1AM4Gw11RtnFdd6HO8Ym2FBYtY3u6ZezCe4vbtZ//K6X+zdEWUwDkCSHEFQBqOLKUUvfH4/HXt/63bdsxAFxTri4Nh8MTmbLYxePxOsuyvgmAdQWVlFK4b90aXLNySU5uILSppxs/XrkUD9TXejEDolMpdbqUkrUwyxbBYPBdaBY+z87LxY0SM9Nz87Vfa+W6rikAjP6JRqN7ApjNFBcrLi7+8fZ/OXTo0LsArGLILyAiP5Ym7rdYLPYagMu8yF64uRlnL16If8RtuDkwVdBVCs81Snx78btca/vvyA8dx/nAq/BM1zfd8WOdjLmvt6KzK/t/3zJdR6eLeW9od5Yu69u2XJspAHIfKaUeAFDMlHdxbW3tFx5Yr1ixolspdSnTOaZHIpFvMGV5Qkr5eyK634vsRDKJO9euwkXLF+HT1oQXp0iLla0tuGj5h7ir9lOuXf125NdSyoe9Cs8iz+scnGhLYcFbbIswGTsx/81Wjm2A53K0BTAFQM6LRCIXKKWOZop7Qkq50y8ax3HmAniO40RKqTuFECUcWV4ZMmTIj8CwbfDOfNKawEXLF+Hnq1eiviN7Rmqv62jDz1evxMXLP8QnrZ52LT8hpbzcyxNkC8uyntXNeHqueQzgtWcYXmPLsl5kaAqAzJ13bTCoqqoSqVRqBYAhDHEbU6nUPrvqehJCjAawEoD2xVspdYvjONfr5nhJCFGilPonER3j5XksIhwyZBjOjVZj77JyL081YLXtbXhC1uOlDY3pWOnw5ZKSkpNXr16df3Mod4yEEOsAVA04gIAXHtoDE8YNYmyWsdUna7pw6vfWQfOjsUFKGUbvGizaAhwhRmYqKyt7HMB+HFlKqQvi8fgupxslEoktoVCoEMAxuuckokMGDx78REtLS8Yu6ZpIJHqKioqeDgQCxwKIenUeBSDW2YF/rnewvLUFFhEiRSUIkr81fKebwqsbmvD7dWtw37rVWN3emo5h+AtTqdRJDQ0N2nOpckkoFBoH4GCdjC0JFycek7EzcbPa7LvXY3Wd9qqLjyUSCZZeVsD0AOQsIcS3ADzOkaWU+pfjOCf39+ej0Wix67orwLPXwD+llKcw5Hiqurp6SHd390tgmmnRH6XBII4aNhIzR1Ziv9BgBNJUDKSUwuKWzZjfFMcbzRvQnt4dDt/q7Ow8ubm52fRXbycSiRyvlHpJJyMQIMz90x4YU1XI1SwDwOp13Tjp3Drorv5NRNNt236Zp1WmAMhJQogR6B0VPJIhLgFgkpRyt/YdF0J8BYD2c8k+p33Z2INMEY1Ghyml5iqlDkr3uYsDAUwJDcaBQ4bhgPIhGF9SBoupIHCVwur2Vny4ZTM+atmEJS2b0ZHyZaXd+QC+JqXMngER6WUJIVZDs/A+fVY5bru2kqlJBgBc+bM4ntWf/tcgpRwDxmWuTQGQg4QQjwI4iynuh1LK3w+wHS8CmMXQhjWFhYWT6+rqMn61kr6Bi48B+Kqf7SgOBFBVVIKq4hKMLi5BVVEJRg0ahJJAAMVWAKFgAYoDvU8AO1IpJJI96HBTaEum0NTdhYbOdtR3tKGhowMNne1+XfC39WxJScmZ5pn/lxNC/BjAL3QyAgHC0/dVYdJeRUytym9LV3biGxfXa9/9A5gjpZyt36L/zxQAOSYajZ7kuu4/meLelFIeg971xndbJBLZSym1DIB2fyIR3WDb9s26OWlihcPhXxKRGaHOgIjutm37KjANfMpllZWVIy3LagCgNZJvysQi/O33VbAsc4nQ4boKp1/UgGWfaN+7pACMlVLWMzTrP8wgwBwybNiw8kAg8C8AgxniupRSp7S2tg54n9ZEIrExFAqVAjiSoT2HlpaWPtba2sq7e4w3VGtr67yysjJJRCfCTLcdqC4AF0gpf4EBFqH5prW1tb28vHwCNAf/Nm5IomJEEJMnmF4AHY8/twVPvqC/z4dS6mnHcR5kaNLnmC+mHFJUVHQrNKYBbUspNcdxnJW6OT09PT8DEGNoUnEgELiTISdtHMd5AMBJAOJ+tyULOUR0jFnkZ/cppe7lyLnj/o1o3uz7o5+stXFTEnc+yLM6tWVZnnz3mQIgR4TD4WkALmSK+8hxnDs4gpqamloBXMmRpZT6uhBiBkdWukgpF6RSqSkAuB7L5IMXUqnU/rZta21yk6+klG8D0B4pviWRwtW3xnXnrecl11W46tZGtCRYOq7e9OqzYAqAHDB+/PhBRPQH8LyfSaXU+WB83iqlfBJ8K+b9bvz48Vm1UkljY+N6KeWpSqkLAJgR7DvXoZS6TEp5Gtda53mMZVfN1xa24cEnN3FE5ZX7HmvGG+/ybHdNRHNYgnbAFAA5oL29fTYArh30fuk4ziKmrP9wXfcS8Oygt2d7e/slDDnpphzHud+yrEMBsL++OWAhgAMcx7kHebilLzcp5dtKqX9xZN15fxM+XG7WXOqv95d24Nd/5Fm7jIhe55z3/4V8r4KN9BBCHADgPQBBhrhVlmXtH4vFPPm0h8Phu5hGxicCgcDeDQ0NkiHLD1Y4HP4+Ef0MwAi/G+OzJgDX9T3rNwP9GIXD4Roieh8M3/PhUUE8fd9ojBzO8TWTu9ZvTOJrP6hH4waWDlQFYJqUkmXr3x0xPQDZLQjgIfBc/F2l1PleXfwBoKurazZ4BsSFUqkUyxgFn7h9vQETAPwejAt7ZJEUgN8Hg8EJUsoHYS7+7Pp68v7OkrU+ifOutpFoy8df1f5pbXPx3atsros/APzNy4s/YHoAdoQqKirGBIPBiUqpsUqpaiKqIqJRAIYrpYYDKELv3PbSvmNa0PuF1klEG5VSjei90MWVUp9ZlvWZUuqz3V1Nb1eEENcAuJUp7l4p5cVMWTsViUTOUUr9iSFKKaWOcRznDYYsX0Uikf2VUjcDOBm5/5l0lVJPu647p7Gxcbnfjcl1kUgkqpT6GADLAv+HHlCMh++IorAg139Nd09Xt8J5V9l4dzHbEJ8OpdREx3HWcQXuSN6/i1VVVcJ13SMAHNG3hOu+YPqw7EAzgPeVUh8Q0budnZ2vD3RN875FdpagtxjR1dDZ2Tk5TeurkxDiTQBHMGQtk1IeiBxZICYSiUxRSl0D4JvIvTU6FBE9TURzYrHYMr8bk0+EEJcAuIcrb9bRZbjnxjACgby/fAAAUimFS2bHMe+NBFsmEV1v2/YtbIE7O4/XJ8g0kyZNKty0adPRSqkTiegkABN8bE4SwPsAXnJd9/l4PP4B+jcAyhJCvAZgGkcjiOgU27bTNk2tb9zC++C5yF0qpfw1Q07GiEaje7qu+2MAZ0NzRbcMkFBKPea67u/MHb9vApFI5B3OPSpmHR3CXddX5n1PQFe3whU3s1/8lw8ZMqRmxYoV2lsH7vJcXp8gQwTD4fB0IjoDvWu0D/G7QTuxTin1VCAQeDIWi723sx+KRCIXKaUGtD7/DjwupeTaN6DfhBC/B3ARQ9TmVCo1IRenjU
[图: 数据并行 (Data Parallelism, DP) 示意图 —— 数据集在 CPU 节点上随机划分为若干 batch_size 大小的随机小批量, 分发到各 GPU 节点上的模型副本]
Wj0WGu635LKXUOEWlt8+qDZUR0b1dX16MbNmzg+3Y0BoR5sDCA3scB9/5MIFSaa51V/dPa5uKC6yRntz8ApIjoyHStgZHTBUBFRcUYy7K+R0TfBSD8bs9uWqaUesiyrEdt29649S+FEFUAlgMoZzhHE4B9pJQ8y1Xthr6L26fgGQX/RynleQw5GSscDk8kou+gd5OnqN/t2YlPlFJPWZb1d9u2F/vdGOPzhBA3ApjNmTlx/CA8fHsk72YHNDWncO6VMXy6hndvqnTveZKTBUA4HJ5GRP8L4FRk/0yHDiJ6RCl1l5TyUyHEC+gdLMbhLCnl40xZuy0cDp9PRPczRCkiOjxPVo6jaDQ6WSk1Qyk1A8BRAEp8aksHgIVE9GoymXzGdPFnPEsIsQDAcZyh4VFB3H1DGDX7FnPGZqz3l3bgsjkO52j/rRZIKWchjTNicqoAiEQiJ7uue0MWdpf2h4veLrxDmfJekFKeypQ1UJYQ4l0AUxmyFkkpD0aeTScbP378oI6OjsOVUtMATEbvINbxYOzq7eMCWAtghVLqXSJ6fejQoe+n4zmlwWfkyJGVBQUFHwGo5MwNBghXnD8c5585DJRTV5X/T6neFf7ueXgjkin2taqcZDJ5wPr16xu5g79MTrxVkUjkWKXUzwAc5ndbskQLgMnc0xIHIhqNHuK67ttg6KlRSl3oOM4fGJqV1caPHz+ovb19H6XUJCJ6RCdLKXU2Ea20LGull2tEGOkTiUSOV0rNhwe9o0cfWoo7rq3EsCG5NS5g46Ykrrq1kW153+2kiGiGbduvehH+ZbK6AKiqqhKpVOpOAGf63ZZsQkQX2bZ9n9/t2EoI8RAAjmf4Gy3L2isWi/Gsw5kDhBBatypSyqz+jjC+qKKiojQQCHwEYE8v8geHArjyB8NxximDYVnZ/evjugpPPN+CXz7QxLWxzxeka8rfjmTr8/FgOBy+NJVKrYS5+O+WvrWlM+ouOZVKXQtgM0PUcNd1ffkgGUY2mDRpUmEgEPg7PLr4A727CF5/53p87YIGLFnZ6dVpPLdiVSf+64cNuOEutl39voCIHrFt+2eehPfn/H6deKCEEIejd/nUKX63JQt1EGUBi54AACAASURBVNH+tm2v8rsh2xNC/AgAx3z+FICDpZQfMmRlPdMDYGzVtwbKUwBOSdc5AwHCadNDuPDsYRg3ujBdp9Wyel03/vBYM55b0ALX2xFF/5BSng4fFzLLpg93QAhxPYCfIvdWSUuXH0spb/e7ETsREEIsAk9h946U8giYXeVMAWBsFRBCPAqfekwtCzj6kFJccu5w7Ls3x+Kl/FbVduOBJ5rxj5cSSPEP8tvewlQqNb2xsdGTQQX9lRUf7qqqKpFMJh8jomP8bksW+1BKeQgyeNncvumbr4Ph95KIzrVtm2PPgaxmCgADvbtP/pGIzvG7IUTAkVNL8bVZIZwwLYSiQf7+enV0upj/ZiuemduCfy9qh0rDLQMRLQ8EAkfV19dv8v5su2iL3w3YFSHECQAeATDK77ZksSQRHZQNi7P03aVwrEzYWFRUNKG2tnYLQ1bWMgVA3qNwOPx7IrrQ74ZsL1QawMyjSvHVmYMxdb8iBNO0t0AyBby3uB3Pzm/B/Dda0dqevpnDRLTcsqyZmbKVeSZ/uEkIMQfAT5C9gxUzAhHdatv2dX63oz/6ZnZ8AoYNmYjobtu2L2doVtYyBUB+E0L8EsD/+t2OXSkttjB1SjEOP7AEhx5QjInjB7HNIHBdhZWru/DOhx1456N2fLCkA20dviwX8lYwGDwtE+78t8rID3dNTU1BPB5/SCn1bb/bkguI6JHKysrvLVq0qMfvtvSHEOJKAHcwRCVTqdQB+bxCnSkA8lffDdQNfrdjIEqLLVRXFWJsVQHGjC7E2KpChEcFUVpioaSYMDgUQElx731he4eLLYkU2jsUWttcxJuSqG3oRm19N9Y29KCuoduvC/62/mFZ1pmZtpZGxn24hRAlSqm/9e3U55tQaQDVVQUYW1WIsXsUYky0AOFRBSgttlBUBAwJBVFc3PvydXQobE4k0dGh0N6p4KzvQW1DD2rXdWNtrBt1DT1ItKX8/OcAwILu7u7Ts2FjlpqamgLHcZYAmKibpZR6zXGcYxmalZVMAZCfhBBXA7jN73YYAICHpJQXIgPHX2XUh7tvg5gX4MOKfqHSAKbuV4TDa0px2IHF2GtMIWsX1Ke13Xjnw3a8vagdHyztSOtzp218SEQnbLu5UKYSQkwHsIAp7ltSyieYsrKKKQDyjxDifwD8xu92GEgR0ey+ef4ZOSMpYz7c0Wg04rruPACT0nXOslILJx0bwldPCOHAycVpHISisGhZJ56d14K5r7emu3dgCYDpfuwAuLvC4fDfiOgbDFGxnp6eiU1NTa0MWVnFFAD5RQjxXQAPIYO+2/OUQ0Rn+bG87+7IiF+Svjv/N5CGiz8RMO3gUnx9ZjmmH1nm+zSUzi6FBW+14pl5LXjzvba0TEMBsMx13ePj8XhTWs42QEKI0QBWgme3u9uklNcw5GQVUwDkDyHEmQAeBf86Kf8G0ApgJnNurnopmUyene6NfQbC9w93NBotdl13PoAjvTzP1oUoLv3ucEyekKkLUXThgSc24fmXEl7sNrW9D7q7u4/L9DEBkUjkp0opjv2xuwHsJ6X8lCEra5gCID8IIU4D8HcABczRi4PB4HH19fWb+7bvvhNAGfM5ckUKwC1Sypv7/nfG8/XDXVNTUyClfNbLAX+BAOHrM8txwVnDUB3l/mx4Y21DN+57tBnPLvB8RaoFQ4cOPSWTt3Tt29luOXq3uNU1r2+/7bxhCoDc17dWyj8ADGKOXkZEx247ZqiqqmpcKpX6E4AjmM+V1YhoOYDzbdte6HdbdoefH26KRCJ/8nKq3wGTinDTFRWYOJ77c5EeK1Z14oa71nu6oUbfZhS+rxD2ZSKRyMlKqRc4spRSX3cc5xmOrGxgCoDcFg6HjyKiF8HzmGxbq5LJ5FE76cYOCCEuAHATgOHM5802HUR065AhQ27L5BupnfHtwy2EuAnA9V5kDykP4OoLRuAbJ5XnxHaUf31hC+64fyO2JDzrVbpaSskx794zQojnwbOJSZ1lWftk2nxcr5gCIHdFo9GDXdddAKCcOboOwFFSyoYv+6HRo0cPTSaTswFcDCDI3IZs8Fel1NWO46zzuyED5cuHu6/L6kV4sMLfcYeX4rZrKjF0cG7tF9S8OYWrb43jtYWe7B2Rcl331Hg8/qIX4Rz6uh6XA+AYwHGTlPJGhpyMZwqA3BSJRKYopV4BMIw52k6lUkc1NjbW9veAysrKfSzLuh3AycxtyUhE9LpS6idSyn/73RZdaf9w9y31+hGY1/YPBoCrLhiJ8/5rKChHv7KUAh58chPuvH+DF4MEmwHsv6uq30/hcPhmIvopQ1RnKpWatDtfctnKFAC5RwixN4DXwb8/ynoAR0spPxnIwUKIA5RS1xHR15Gby7e/SURzbNt+2e+GcEn3mxRIJpOPgfkXN1JZgCd+OxrfOyN3L/5A7xTG888cisd/H
UV4FHuP2zAAjyGDu/KI6FYAHN1tRYFA4FcMOYaRVhUVFWMBvAT+i/8mIjphoBd/AJBSfuQ4zjfRO537TwC62FrnH5eIniKiw6SUR+XSxR9Icw+AF8/9J+1VhIdvFxg+NGOvW55o2pjEeVfbWLma/TOW0d3j4XD4dCL6O0eW67onZfJjDw6mByB3RCKRqFLqDQBjmKMTlmVNj8Vi73GGRiKR4a7rnk1E5wHYjzM7DRoAPAzgYSllvd+N8UraPtxCiMMBvAnGXofDDizBvT8TKCvJxd6mXUu0pXDRTyQWfsQ6ni0J4BAp5YecoZyEEPMBzGCI+qykpGTf1av5q6hMYQqA3DBq1KiKYDD4OoAJzNHtSqkTHcd5gzn3c8Lh8FQiOhfAVwBEvTyXho0AniaiJ23bfg1ZMpdfR7o+3EEhxAcApnAFzjq6DHddH0ZhQX5/P3X3KFxxcxxzX2ddz2dJOBw+KFN3D+x7BroEQKFullLqOsdxbtVvVWYyBUD2i0Qiw5VSrwLYlzm6C8BpUsr5zLlfhoQQ+xPRSUqpUwAcDH/HCywFMJeI5tq2/SYycMMeL6Xlwx0Ohy8jIrZnrrOOLsM9N4YRSNPa/ZkulVK4dA5vEUBEP+3bxCIjCSFuB3AVQ1QbgImZPPhRhykAstvYsWMHd3Z2vgRgKnN0D4BvSimfY87dLdFodFgymTzEsqyD0FsMHAT+8Q1bbQawTCn1PhG9mUql3m5sbFzv0bmygucf7r5R/yvBNFf1sANL8NDtkby/899ed4/Cd6+M4d3FbI8D2gHsnakXxhEjRoQKCws/ASAY4v4qpTyDISfjmAIge1VUVJQGAoG54F8mPQXg7EzdIVMIMdqyrLFKqWqlVDWAaiKqVkqVARiC3hUPSwGE0DtoOYHeO/d29HbjbwTQCKCBiNa6rltLRCtz+Vn+QHn+4RZCPAGA5ct1wrhBeOLXVQiV5ecz/11JtKVw9mU2VqxiWznwCSnlt7jCuIXD4bOI6FGOLKXULMdx5nFkZRJTAGSnSZMmFW7evPlZpdSJzNFKKXWh4zj3M+caWcjTK2kkEjkWTBf/SGUB/nxnxFz8v0SoNIAHfyE4pwie0Td4MyM5jvM4AJbBS0R0V01NTXZsFmHktJqamoJNmzY95cHFHwAuMRd/YytPr6ZKqZ9z5AQDhF9dX5l3U/0GYuTwIH57k0CQZyFEAvALliRvKNd1LwbPwJ194vH4jxhyDENHwHGcP4Nn2evtXSul/K0HuUaW8qwAiEQiJwM4lCPrqgtG4MDJxRxReWHKxCJc/v0RXHHT+npyMlI8Hl8B4F6OLKXU7KqqKo4xBYYxECSEuA/AmR5k3ySlzORi3vCBZwWA67o3cOQcd3gpzvuvoRxReeUH3xqGow8tZclSSs1hCfJIYWHhDehdxlRXyHVd8yVp+IGEEL8D8H3uYKXUXZm8uJfhH08KgHA4PI2IDtbNGVIewG3XVOb08r5eIQLuuLYSg0MszwKmRaPRQziCvFBXV7cZwDUcWUqps4UQZq9zI636prVexJ2rlLrPcZwruXON3OBJAUBELL9wV18wIud29UunYUMCuPIHPI8CXNe9nCXII1LK/wOwkCGK0PtIwQw4MdJCCDEHAPtFmogecRznhwDYdw4zcgN7AVBRUTEGDANYDphUhG+cxL3Ndf4545Ry7Lc3xw66OF0IMZojyCNKKfU/AFyGrH2FEBcw5BjGlxJCXAWA5XHptpRSf7dt+7vg+TwYOYq9ALAs63u6uYEAYc7lFbAs0/evy7IIN10ximPVxCCA7zI0yTOO4ywiogeZ4m6urKwcyZRlGF8ghLgYwO0eRM8vLS09G3mwlr2hh7sACBKR9kXi6zPLsc+egzjaYwCYPKEIp00PcUSdi8zf5/s6AM0MOUMty2KZxmoY24tEIucC8GJK3suFhYVfyeUNrgw+rF/mlZWVM6C5NGsgQPjBf5tR/9wuPHsYLP13uzoSiRyj3xrv2La9EXxbTp+XyYMfjewUDoe/rpR6APwrsb7T09Pz1bq6OralQI3cxloAWJalverfSceUYUyV9iZvxnbGjS7EzKPKtHOUUv/N0BxPSSnvI6L3GaIs13V/h8zv9TCyRDgcnkVEj4N/kOniYDB4clNTUytzrpHD2L7YJk2aVAjgNJ0MIuCis4cztcjY3g/PGc4xpfIryPwR8i6AS8Ez+rlGCJHRYx+M7CCEmE5Ez6B3Mxs2RLSciKbX19dv4sw1ch9bAbBp06ajAWj13U87uBR7jTV3/17Ze9wgHDlVe3GgEZWVlRk/T9627XcA/Jkp7tbRo0eb51LGgIXD4WkAngPAMiVnG6u6u7tn9D36MozdwlYAcGxccfosM+3Pa1+bpT8Y0LKsUxma4rlkMvljAFsYokYmk8mbGHKMPFRZWXkQEb0AoIQ5ug7A9KampjhzrpEn2AoAIjpJ5/hQaQDHH6H/jNr4cidMC6GsVPttP46jLV5bv359IxFxLWN8USQSmcKUZeSJaDS6r2VZLwLgvruxA4HAdCllA3OukUdYCoC+DVQm6GScdFwZigaZef9eKxpEmKU/GHCKEIJttyEv2bb9GwBLGaICSqnfgX/ktpGjotHonq7rzgfAPbCpyXXdExoaGtYw5xp5hqUAcF1X+5nwV2awzFM3+uGrMwfrRlhKqaM52pIGScuyLmXKOiIcDp/FlGXksIqKirGu674CoJI5ehMRzYjH4x8z5xp5iOsRgFYBUFZime1+02jqfkUoK9F764koa+bHx2Kx1wA8yZFFRL8cO3asdgVl5K5oNBoJBoMLAESZo9sAnGbb9hLmXCNPsRQASqmDdI4/aEoxgvpL1Rr9FAwQavbTLrhqONqSLn0bVHHMka7o7OzkWmjIyDGjRo2qcF33ZaXUWObodqXUSVLKt5hzjTzGUQAQgMk6AYfX8Oxbb/TfYQdoD0iuQRY9D7dtOwbgZ0xxl0aj0X2ZsowcUV1dPSQYDL4IzfFQO9BtWdY3Hcd5gznXyHPaBUDf7n9aI1wPPcB0/6fbYQdqv+aDw+FwJu8O+AVDhw69C8Aqhqig67q/YsgxcsSwYcPKe3p65gM4gDm6B8B/xWKxfzHnGoZ+ARAMBvfROT5UGsAEs/hP2k0cPwilxdrjAPZkak5arFixolspdQlT3PGRSOSbTFlGFquoqCgtKir6p+6j0B1IAThHSvkcc65hAGAoAJRSY3SOHzO6wGz76wPLIlRr7rlAROOZmpM2juPMA/AsR5ZS6s6Kigrz/CqPTZo0qTAYDP4NwJHM0UopdbGU8gnmXMP4D44CoFrn+DFRc/fvl7FVBVrHK6WyrgAAANd1LwfQwRBVZVnWtQw5Rhaqqakp2LRp0985VkHdgUscx7nfg1zD+A+OQYBaz4HHjDYFgF8YXnutrZ/9Eo/H6wDcxpFFRFdGo9GsLIQMLQHHcR4F4MWy2NdIKX/rQa5hfI52AWBZ1kid48eZAsA3Y/W3XR7F0Q4/WJZ1OxHVMkQN6tsy2MgfJIS4D8B/eZB9s5SSpTg1jF3h6AHQWuayYkSAoQnGQIRH
ae/qm7UFQCwW61BKXcEUd4IQIis2SDK0kRDitwC+zx2slLpLSnkDd65h7AzHGACtAqCsxBQAftFdDRD8a5ynlZTyOSJ6kSOLiO6urq7m3urVyDBCiNsBXMydq5T6g+M4V3LnGsaX4egB0FpRprTEzADwS4l+ATCIox1+IqJLAHTp5iilxvb09FzF0CQjQwkhZgNgv0gT0SOO41wMQHFnG8aX4SgAtB4kM2xNawwQQw9A1t/xxmKx1UR0F0eWUuraysrKao4sI7OEw+HLANzoQfQztm2fB8D1INswvpTvBUCJ5mI0xsCVmh4AAIBS6hYA6xiiigOBwC8ZcowMIoT4HleRuJ35JSUl3wKQ9CDbMHbJXH0NHTlx1yKlbAfwY44spdTplZWVXswLN3wQiUTOAXA/+Pe9eMWyrK+uXr1a+/GTYQwURwHQrXNwe0dOXEOyUlu79mvPsbteRpBSPgngFY4sy7LuGT9+fE70juSzcDj8daXUQ+C/UVrY09PzlVgsxrEYlWEMmO8FQGubKQD80qpfALRxtCNTuK77I/RuvqJrz/b29ksZcgyfhMPhmUT0OADtubLbWRwMBk9qamrKmeLZyF4cBYDWRaCt3Qx89Uu7KQA+Jx6Pf6yU+g1T3PXRaDTClGWkkRBiOhE9C+YxLkS0nIim19fXb+LMNYyB0i4AiKhZ5/jW9pRuE4wBSuj3vuTcXUxXV9ccAHGGqDLXde9gyDHSSAhxJHo3i+Ke4bKqu7t7hm3bG5lzDWPAtAsA13U36BwfbzIDYP3C8Nq3cLQjkzQ3N7copa5mijuzsrLyaKYsw2OVlZUHAfgnAO4dHusATG9qauIoLA2DDUcPQJPO8bUNHI9cjYGobdAavgHwTJ3LOH2bvPybIYosy/oN+J8jG8yi0eh+lmXNBVDOHC0DgcB0KWUDc65haOMYA1Cvc/Daeu2LkDFAtZqvPdNmOplIWZZ1EXjmZ+8rhGBfOtbgE41G93Rddy6AYczRTa7rzmhoaFjDnGsYLDgKAK27QIa7UGOAdAsApVSuFgCIxWLL0Dv/m8PNe+yxR5gpy2AkhBjtuu4CANzvzxal1InxePxj5lzDYMPxCEDrIrC2vhuua2YCpJvrKqyL6T1+yeEeAACAZVnXA9Aa49KnvKen5+cMOQajSCQSJaJXAezBHJ2wLGum4ziLmHMNg5V2AZBMJrUq3NZ2F5/Wml6AdFu5ugttmoswJZPJnC4AYrFYs1LqOqa470QikcOYsgxNFRUVo5RSLymlxjJHt7uue2osFnuXOdcw2GkXAI2NjXXQHA3+zoftus0wdtPbH2ovQiYbGxvXc7QlkzmO8xAAji9zUkr9FoDZ/9pn1dXVQwKBwIsAJjBHd1uW9c14PP46c65heIJjDIACsEwn4O1FpgBIt4Uf6b3mRLSQqSmZzrUs6xLw7HtwYDgc/j5DjjFAw4YNK+/p6ZkP4EDm6CSAM2Kx2L+Ycw3DMyxrXBPR+zrHf7C0A0mzHlDaJFMKi5bq9QAopd5hak7Gi8Vi7wH4I0cWEf1cCDGCI8vYPUKIkuLi4n8opQ5ijnYBfEdK+SxzrmF4imuTC605063tLhYtM/tipMt7izs49gHIlx4AAIDrutcC4FjCdZhS6maGHGM39G3O9KxSinthJgXgB1LKx5lzDcNzLAVAMBjUXjTl2Xk5t6hcxnpuvvZr3WNZVl6NcI7H400AbuDIIqLzhRDcXdDGTtTU1BS0t7f/FcAMD+IvlVI+5EGuYXiOpQBYt26dA2ClTsbc11vR2WWmA3qto9PFvDe0l/BfmI9bmUop7wWwhCEqAOC34N9jHkDv3W4kEtlfCPHfulmRSOSccDg8taKignt53HQJOI7zCIDTPMi+VkrJtXmUYaQd2xKlSqkXiWjiQI9PtKWw4K1WnHp8iKtJxg7Mf7OVo/v/HxxtyUIppdSPiOh16F+8D4tEIt+xbfv/dEImTZpUuHHjxsMsyzpGKTWZiPZtb28fB6bPtlLqT0SEQCCghBB1SqmVRPSuUuq1QYMGvVdXV9fJcR6PkBDiQQBncAcrpW5xHOcX3LmGkU5sdyBCiBkA5utkTDu4FH+8w+yg6qVz/zeGtz7QmwFgWdZesVjsM6YmZZ1IJPKIUupshqjGoqKiCbW1tVt24xgSQuwP4HgA0wFMA1DC0JaB6ATwHoBXiOg527YX+9SOHSEhxG8A/JA7WCn1K8dxruDONYx0YysAampqChzHiUNjPW0i4IWH9sCEcazbcBt9PlnThVO/tw5K70nLx1LKSUxNykqjRo2qCAaDnwIYrJtFRHfbtn35rn4uHA5PJKIzAJwNYJzueT1Sh97eob9JKf+N3gFyvhBC3ArgGg+i/yil/B58/LcZBhe2RUkcx3FDodDeAA7QydmScHHiMeYxgBdm370eq+u0V118KJFIvMzRnmzV1tbWVl5e3gPgBIa4qSUlJc+0tbV9YVGlioqKUUOGDPlBKBT6HRHdAuAY8G9Yw2kIgEMAnBcKhc4IhUKBkSNHfrJp06audDZCCHEjgJ9y5xLRI1LK82Au/kaO4JoGCABQSj2pmzH39VasNRsEsVu9rhvz39Qe/AcAh4TD4RqOoGxWWVn5a2gOfO0T7Nsy+D+9cdFodHw4HL43EAisU0rdBf5Fa9JhbwD3dHZ22pFI5A/RaHS/dJxUCHElgNncuUT0lG3b54FnQSjDyAisy5K2trauC4VC34PGntpKAe3tLmZMK2NsmfHz3zVh5WqWG7ExRHR+KBSqKSsrW9Xa2upwhGYbx3Hc8vLyTwCco5tFRNWhUOiTsrKyovLy8ruVUr8jooPBOEjXR4UAapRSF4ZCof3Ly8s/SSQSjV6cKBKJXATgbjDPrlBK/Wvo0KH/1dTUpLd7lmFkGPZpSOFw+GYi0up+CwQIT99XhUl7FXE1K68tXdmJb1xcD5f/3kUBeEEpNSdfdz4Lh8N/I6JvMER1AsiHX3hFRM8Q0ZxYLLaUKzQSiXxHKfUwmHs1AbxiWdYp+Tjt1ch97AVAZWVltWVZa6D5QZwysQh/+30VLMuTqdJ5w3UVTr+oAcs+8XS2lgLwPIA5UsoPvTxRphFCjAbwMYBsnSfvF5eIHgRwnW3bG3WCIpHIN5VSfwH/Rktv9/T0zGxqamJ5dmYYmYZ9Z7LW1tbNoVDoAPQ+Axywxg1JVIwIYvKEfLgp8s7jz23Bky/sziyzASH07qx2QSgUOrK0tPST1tZW6fVJM0EikdhSXl5OAI7zuy1ZhgDUADi/rKyss7W19QMMYHBdOByeCeCvAAqY27ckGAzOdBzHLFFq5CxPbq+FEEcCeFM3Z3AogAWPVmPYELOD6kBs3JTEjG/XoSWR9nFLedUjMH78+EHt7e3LAYz3uy1Z7F2l1Hcdx+n3wEohxHT0/p6x3iUQ0XKl1LFSyg2cuYaRaTy5siYSifpQKDQLQFQnp6tb4bO6bpw6vRxkngTsFtdV+NHsOFbV+jKjYmuPwA9CodCBoVDo00QiEfejIenQ3NycKi8vrwWgvfRuHosS0XmhUGhTIpH4YFc/3HeT8QL
4F0H6rKen57jGxsYvTMs0jFzj2a11aWmpJKKzdHPqYj0oKbZQM7mYo1l5495Hm/HE8553/e/K5x4NlJWVrczVRwOJROKzUChUg95/rzEwBQBODoVChw0ZMuSVlpaWxI5+KBKJ7A9gLjRmG+1Eg+u6x61fvz7GnGsYGcnT+2ohxFsAjtDNCQaAx39dhQNNEdAv7y/twLcviyGZyrj1ShR6V4qbI6X8yO/GcKuoqBgbCARWID9G83stDuB0KeXb2/5lNBrd13XdVwEMZz6fDAQCRzU0NKxhzjWMjOXpw/XS0tI1RHSubo6rgLc+aMepx4dQWsI9yye3rN+YxHevtJFoy8j1Sgi9g0N/EAqFDuibE54zjwba2to2hUKhQQC495zPR2UAzg6FQnYikVgMAEKICUqpVwCMZD5Xk1LqONu2VzHnGkZG87QA6FsYaAKAfbWz2ly8+X4bTptejkGFZkDAjiTaUvjOFTbqYhm/XsnWQuDCXHs0MHjw4IVKqbPQuyyuoScI4Cvl5eWipKTkY8uyXgLAvVvYFqXUCY7jsK1JYBjZwvMr6R577BHu6en5BEzP6w49oBgP3xFFYYEpArbV1a1w3lU23l2st9OfTxSA54hoTobtKLfbJk2aVLhp06YX4fO0wILSUoQiVQhF90B51R4IRapQMnIUgsUlCAwahMJQOYJFvY/Ukp0d6E60INXZiWRnB9qb1iMRq0dLwzok7Hok7Ab0tLX5+c8BgG70rirIKWFZ1oxYLPYuc65hZIW0XEWFEJcAuIcrb9bRZbjnxjACAVMEAEAqpXDJ7DjmvbHDMVPZJKsLgerq6qKurq6niOikdJ+7oLQUIydPQcX+UzFqSg0GV48FWTyPy5TrYsvaNWhcsgiNixdhw/LF6GnPykJzWx2u654Yj8df97shhuGXdF1BA0KIDwDszxU46+gQ7rq+Mu97Arq6Fa64OScu/ttSAJ7tKwSW+N2Y/hBClAB4DsD0dJ2zoKQUVUcdh+rjZ2H4PpNhBdKzdYCbSmLjimVY+/JcxN56NRN6B3ZXl1LqK47jzPO7IYbhp7RdPSORyGFKqbfAuFb3oQcU496fCYRK83OhoNY2FxdcJ7m7/ZMAXgNwPNL4+7ETWVEIjBw5sqywsPAFpZT3g/+IEK45BNUzTkTksGkIFA7y/JRfJtXdBfvtN1D30lw4i97t3c0rsyUBfFNK+azfDTEMv6X1C14IMRvAjZyZE8cPwsO3RzByeC5snNZ/Tc0pnHtlDJ+uYd9qfbaUck44HJ5KRDcCOIX7BAOQsYVAdXV1UXd39wIAR3p5HrICqDrqOOxz5jkYXD3Wy1MN2Ja1a/DxE39Cw5uvQnmw8xSDFIBzpJSP+90Qw8gE6b7DCwghTxCmnAAAF7tJREFUFgA4ljM0PCqIu28Io2bf/Fgn4P2lHbhsjoPGDUnu6LeklMeg94sSAJBphQARPQPgpgwpBEgI8TiAMz07gRVA9YwTsc8Z30aZ0FpYM20SdgNWPvFn1L08D8pN7fqA9FAAzpdSPuR3QwwjU6S9i7dvVsBHACo4c4MBwhXnD8f5Zw7L2WWDlQLue6wZ9zy80YtFfjYppQ5wHGfdjv7PysrKgyzLuhHAydwnHgBPtpTdXeFw+BYi+olX+cMnTsbU/7kSQ8bt6dUpPLVp9af44De/RPOnH/vdFAC4VEr5a78bYRiZxJdLZd8mHvPAv3c3jj60FHdcW5lzGwht3JTEVbc24o13PRlwlVJKndyfQVGmEOgViUTOVUr90YvswvLBmHLeRRhzwslsI/n9olwXtXOfx9I/3ofuhG8b610rpfyFXyc3jEzly1UykUjUhkIhC8Ax3NnrYj346z9bUF5mYdKeg0BZ3h3gugp/+UcLLvqp9HJjn2scx/lzf36wtbVVJhKJx0tLS/9FRALAXl41qh8IwESl1IXl5eX7Dh48+JOWlpZGr09aWVl5NBH9FR58fsQhR+CYX9yDkZP2y/rfXQAgIgzbc2+MnXkKttTXodVuSOv5lVK3OI5zc1pPahhZws9vGBJCPAzgXK9OsN/eRbjpilGYPCE7l2ZfurITN969Hss+6fTyNI9JKc8e6MGVlZUHEdFsP+a+74Aioqf7egSWeXECIcQIAEsBhDlzKRDAlPMuwoSvn4lcfob1yVN/wbI/3gc35f3YACK627btyz0/kWFkKb+/aYJCiGfg4QCzQIBw2vQQLjx7GMaN5l5IzBur13XjD48147kFLfB4MPWCoUOHnrJixQrtroVoNHpwKpW6MdcLASHEswC+wplZWlGJw669CcP3nsQZm7E2fLwM79x6A9qbvNtxl4jut237QvQO/jMMYwf8LgAQjUaLXdedD4+nUVkWcPQhpbjk3OHYd+/M7BFYVduNB55oxj9eSiDl8U5+RPR+d3f3cU1NTa2cuZFIZIpS6icAvgH/f78UgH8CuFFK+aFuWDgcvoCI7tNv1v83bM+9Me3mX6JoyFDO2IzXuakZb/z0f7FpDf/+O0T0qG3b3wGQkXMRDSNT+P0FDQCIRqPDXNd9A4Dnt0BEwJFTS/G1WSGcMC2EokH+vgQdnS7mv9mKZ+a24N+L2tO1jsoKAMdIKTd4dYIM6xFwlVJPBwKBmwbaIyCEmABgEYBSrkaNmlKDI2/8BQpKSrgis0pPWxveuukarF+iXZtt74jttxE2DOOLMqIAAICqqiqRSqXmIw1FwFah0gBmHlWKr84cjKn7FSGYpr0FkingvcXteHZ+C+a/0YrW9rTeqCxxXXdGPB5vSsfJotHoIUqpG5VSJ6bjfLsw0EIgIIRYCGAqV0OiRx6Dw348G1ZBAVdkVnJ7erDwtjloeOtVztglUsqp6F31zzCMnciYAgAARo8ePTSZTD4P4Ih0n7u02MLUKcU4/MASHHpAMSaOHwTL4nl5XFdh5eouvPNhB975qB0fLOlAW4cvvZMfWJY1MxaLNaf7xJlWCAB4KpVK3dTY2Lh8Vz/M3fUfPfIYHH7dTSArt6aqDpRyU3jn5zdyFwFXSSl/yRloGLkmowoA4D+bqjwB4FQ/21FabKG6qhBjqwowZnQhxlYVIjwqiNISCyXFhMGhAEqKe+dot3e42JJIob1DobXNRbwpidqGbtTWd2NtQw/qGrr9uuBv6+WioqLTa2trt/jZiGwrBKqrq4d0d3evAjCS44SjptTg6FvuzPs7/+25PT147SeXo2npR1yRba7rTo7H43VcgYaRazKuAOgTFEI8AA+nCOaZh8Ph8IWLFi3q8bshW0UikUOVUjcCmOV3W9BbCPw9lUrdvH0hEIlEfqWUuozjJEPHT8Cxt/0GBaVswwhySk97G1656ofYvOYzljyl1N8dx/kmS5hh5KBMLQCA3nUCrgdwA3xasCgHuEqpnzqOc6vfDdmZTCwEXNe9KR6PrxBC7I3eOf/at+ulFZWYfs+DeTfaf3d1NG/ES5d+n2uKoHJd95B4PP4+R5hh5JpMLgAAAJFI5Dil1GMAKv1uS5ZpAnC2lHK+3w3pj0wsBJRSo4joGN0wCgRw/J335s08f10bVizFq1f/D9
diQS9JKWdwBBlGrsn4O+tEIrG2tLT0z0S0H4DxfrcnGxDR+67rnuA4ziK/29JfiUQilkgkHisvL5+H3mLP7yWGJxFRNUfY/t//IaqOOo4jKi+UjKqAVVCAxo8+4IgbW15e/lYikVjLEWYYuSTjCwAAaG1tbU8kEo+HQiEXwFHwYBOhHNED4OZwOPzdzz77LO0j/Tn0FQJ/KS8vnw8giiwv+sQhR+DAiy7P3eV9PTJyn33RvGolWmWMI25sIpHwZOMmw8hmWfetFIlEDlNKPYA0rheQJZYCOFdKyTaMOhP0vd83Apjpd1t2V2H5YJz04F8wqHyw303JSl1bNuNf3/8Wyy6CrusebMYCGMbnZUUPwLYSiURs9OjRD3Z2dqYAHAog6HebfNamlJozbNiw79bW1tp+N4ZbX4/Ao309AlUAxvndpv468KLLMXLSfn43I2sFi4pQUFoG5z39Rf2IqDSRSDzN0CzDyBlZ1wOwrUgkEgXwc6XU2cjyf8sAvaCU+h/Hcdb53ZB06esRuA4ebiDFYdiEiZj+q/tBlnlapUO5Ll6+4kJs/GSFblQPEY21bZvlmYJh5IKs6wHYViKRaEkkEs+UlZW9QkTjAezhd5vS5FUA50gpb2ttbfV1YZ902zpGIBQKLUCG9giQFcBRc25H8fARfjcl6xERhozbE2vn/ROaG2UElFIdra2tr3C1zTCyXU7cnjiO86aU8igAJwD4t9/t8dAblmUdK6U8Tkr5lt+N8ZOU8m0p5Uz0Lhu9wO/2bKt6xokYMm5Pv5uRM4btuTf2OFZ/Jh8RnYX87Ck0jB3KyQ9DJBI5DMCVSqmvIvuLnB4ATxHRPbZtL/S7MZlKCHEEgBsB+Drnm6wATnrwcZSJqJ/NyDktDesw94KzoVztJbWn5XvxbBhbZfUjgJ3p6yb+aygU+j8ACfR2E5f726rdtg7AbyzL+rZt239MJBLm2eWXSCQSDYlE4pFQKPQSfHw0MPro6Rh34ml+nDqnDRo8BFvWrUXLOu3p/F2JROKfHG0yjGyXkz0AOxAQQhwL4AwAXwMw3Of27EwzgKeUUo84jvMWAK2Hnvmsr0dgNoDpaTspEWbd+2cMrh6btlPmk821qzHvh+fqjgXYEA6HRSbti2EYfsmXAuA/ampqCuLx+DSl1Cz0Lju7r89NWgrgXwD+KaV8BwDL+qdGr3QWAuGph+KoW+70+jR57fWfXI74ove0MizLOjYWi73G0yLDyF55N4e+r/J/pe/P1RUVFaMsyzoCwJFEdDB6CwKvVm7ZjN4L/tsA3nFd9514PN7k0bkMAFLKfwOYIYQ4Er1jBDwrBKpnnORVtNGnevqJ2gWAUmoGgNdYGmQYWSzvegD6IxwO7wFgomVZY5RSY9D7TLkCvY8OhgMoQW/xFOo7ZBMAENEmAO1KKYeI4kqpOIB1RPSJZVkrGxoaZNr/Mcbn9BUCswEcz5lbUFKKrzzxPAKFgzhjje2kujrx3H+fhp62tgFnENH7tm0fzNgsw8hKpgAw8pIQ4nAA14JpQaFxJ56GqZf+mCPK2IX37vo51s7XGsfnAqiQUm5gapJhZKVsnyJnGAPSt47AqQCmAdCeXrnHcVm3VUHWqj5ee8doi2ObZ8PIdqYAMPJa35zweToZBSUlGL7PZKYWGbsyYvJ+KCgp0cpQStUwNccwspYpAAwDmKpz8Mh9D4AVyLvxtL6xAkGMmDRFN+ZAjrYYRjYzBYBhAFp3gxX7m5vJdKuYov2amwLAyHumADDy2qhRoyoAVGpl7GeuJek2an/t13xE326ihpG3TAFg5LVAILCXzvEFpaUYPCbjNiTMeUPG7olgcbFWhuu6ZuCGkddMAWDkNSLS2rYvFB0NsszHKN3IshCKjNaN0Q4wjGxmvrmMvEZE43WOZ7gIGQMUimq/9lUc7TCMbGUKACPfaV0Eyqv24GqHsZvKNQsAIjIFgJHXTAFg5DWlVIXO8Qx3ocYAharMIwDD0GEKACPfjdI5uHj4CK52GLupZITWWwcA5s0z8popAIx8p3URKCgp5WqHsZsYXnu95QQNI8uZAsDId1pzyYLF5hril4DmNECYAsDIc6YAMPJdkc7BumvSGwNnegAMQ48pAIx8N0jnYNMD4J8C/dfevHlGXjMFgGEY+Yr8boBh+MkUAEa+69Y5ONnRztUOYzf16L/2XRztMIxsZQoAI99pFQA97aYA8EtPe5tuhCkAjLxmCgAj32ldRUwPgH9SHR26EebNM/KaKQCMvEZEzTrHM9yFGgPUrf/aa733hpHtTAFg5DXXdTfoHN+xoYmrKcZu6mhq1I0wb56R10wBYOQ1ItK6CCRi9VxNMXZTItagG6FV/BlGtjMFgJHvtK7gLaYA8E1LwzrdCO0Aw8hmpgAw8p3WRSARM9cQv+j2vhDRWqamGEZWMgWAkdeIqFbn+ESsHsp1uZpj9JNyXSSk3iMA13XreFpjGNnJFABGXksmkx/rHN/T3o4ta9dwNcfop01rViGpOQ3QsqzlTM0xjKxkCgAjrzU2NtYBaNHKWLKIpzFGv61f8qFuxGbbtm2OthhGtjIFgJHvFIBlOgGNi00BkG4MBcAy9L73hpG3TAFg5D0iel/n+A3LF0OlUlzNMXbBTSWxYcUS3Zh3OdpiGNnMFACGAfxb5+Ce9nZsWLGUqy3GLjQtW6y9B4NS6m2m5hhG1jIFgJH3gsGgVgEAAGtfnsvRFKMf1r08TzdCua6r/Z4bRrYzBYCR99atW+fg/7V3t8FRXXUYwJ9zN8Fs0osJIGl2N4FqBwRabUvpiECBkVY62lHGD3YGtONgHb/4MsqoFQekLdXRVj+oY2un4sDIlKqlRQbSSJpASICUyEiApLQJSXbvzS7k/Sa7geTe44ekIyMNL73nZndvnt+nfMlz/js7u/e/59x7DtDsJiN2tAr25WFFFdFE7MvDiNVWu41pTCQSFxWUQ5TV2AAQAZBSHnTz/yNDQzCOH1VVDk0gVntYxRHMrt5rIr9gA0AEQAjh+qJwoeKAilLoOtoOub92a5rG9RoisAEgAgCUlJRUw+XxsPF/13NTIA/1tb6H+KmTbmO6YrEYp2qIwAaACADQ0NAwAuANVyFS4twrO9UURNc4u3sHIN09ui+EeA3AqJqKiLIbGwCicVLKPW4zojVVPCLYAwMdbTDqjqiIcv0eE/kFGwCicZ2dnZUAXG0PKx0bTXt2KaqI3te0Z6eKQ5eihmEcVlEPkR+wASD6n1Ep5Q63IW2V5eg536SiHgLQ824z2qsOqYj6MwBu2Ug0jg0A0VWklC8DcPVTUzoOGv7wGx4TrIB0HDT87teQjuvrto2xBoCIxrEBILpKPB5vA7DPbU7PO+fQWv5P9wVNcS0HXkfPeVd7NAEApJR7TdPkzRlEV2EDQHSt51WEnN7xAi7396mImpKG+3rR+Jc/KcnSNE3Je0rkJ2wAiP6PaZpHARx3m3PFGsCJ57e7fnRtKpKOg/rnnsGVQUtFXI1hGK7fTyK/CaS7AKJMVFBQYAoh1rvNGTSiyAkGMWvh3SrKmjKa9uxEywF32zK8T
wix0bKsC0rCiHyEMwBEHyAejx8EoOTB88YdL/C44FvQ3XwWZ3a9rCruqGEYlarCiPyEDQDRBBzH2aIkx7Zx7JdbkerpVhHna6meLtQ+vRnSVvK0ngTwExVBRH7EJQCiCQwODrbruj4fgOv5+5HkEOINJzBn9cMITJumoDr/GRkaQvVPv49BI6oq8lXTNH+rKozIb9gAEF1HXl5ebSAQ+CaAPLdZl/t60dV8FmUr10AL8KN3NWd0FEe3/Rjd586oikw5jrNucHCQj2EQTYDfQkTXkUwmB3VdTwF4REleIo6BaDtKl6+CEFyBA8a2Tz72i63orK9TlimEeKqzs9P1fg5EfsYGgOgGLMs6qev6owBKVOQNdLSh70ILwktXQAvkqIjMWs7ICI7/ahtiNVXKMoUQZwoLCx+/dOkSt/0lug42AEQ3JqdPn34awDcACBWBVrQdXecaEfnsyil7T8BIcgg1Wzahs/6YylhbCLGupaWlXWUokR+xASC6CZZlxXRdDwBYqSpzKNEJs74W4aUrkJtfoCo2K6R6ulH95PfQ3XxWaa4QYqthGLuVhhL5FBsAoptkWdYRXdeXAfi4qszLfb3oOFKJGfMXomB2sarYjNZzvglHNv8AVlT5j/Qq0zS/hbHH/4joBtgAEN08GQwGKzRNWw9AVxU6mkyivbIcUjr42F33QAglqwyZR0qcf+NvqHt2C65YA6rTE7m5uQ/39/crDybyKzYARLdgaGhoSNf1/wDYAEX3AwCAlBKXTp9Cz/lm3L74AeTkuX7qMKMM9/WibvvP8O6+f3hxNoIthPhSNBptVB1M5GdsAIhukWVZrbquawBWqc4eNGNofXM/cgtuQ9Gd87J+NkA6DloOvI7ap55Ef1urJ2OMr/vv9CScyMey+9uFKH1EKBR6CcBGrwYounM+7v/OJsyYv9CrITzV+947aPj988pv9LuaEGKXYRiPg+v+RLeMDQDRh5cTCoX2AviiVwMILYA5qx/Cgse+jumlc7waRqmBjjY0vboL7W9VQDqOl0PtM03zKwBGvRyEyK/YABC5EIlEgo7jVABY7uU4QtNQsuQzWLR+I2bM+6SXQ31o/W2taP77X9H+1r8gHc/34Dlu2/aaRCIx5PVARH7FBoDIpUgkMsNxnCMAFnk+mBC4/b4lmLvmEUSWrURg2kc8H/J67MvDiNUeRtuhg4ifOunFDX7XEEKcCQQCD3Z0dPR6PhiRj7EBIFKgtLQ0ZNt2BSajCRiXW1CAyLJVmPu5tZh116cmbVthadu42HgKbYfKYdQdxkgyOSnjAmMXf03TPh+NRs1JG5TIp9gAEClSVlZWNDo6ug8eLwd8kJy8IGYuWITie5eg+N77UfSJeRCamsOGpONgINqGrrONSJx6G/GGeowk0zLzfkII8QXDMLrTMTiR37ABIFIoFArlA3gFwKPprCMnGIQeLoMeKcP0SBn00jLkz5qNnGA+coJBTLtNR04wHwAwmkriyqCF0VQKI6kkUl0XYUU7MBBth2VEYRkdGE2l0vlyAGCfpmmPxWKxtBdC5BdsAIjUC4RCoT8CeCLdhfiBlHJnZ2fnRvBufyKluBEQkXrSsqz9uq47AB4EoGYufuqxhRBbTdP8IQBPnyckmoo4A0DkoUgksspxnN0AStJdS5a5COBrpmlWpLsQIr/iLxMiD8VisWrHcT4N4M1015JFqnJzc+/hxZ/IW5wBIJocIhQK/QjAdnDpbSI2gGdM03x6/G8i8hAbAKJJFA6Hl0opX8Ik7heQDYQQZwA8YRjG8XTXQjRVcAmAaBIZhnGsqKjoPgA/BzCc5nIyQUoIsaWwsHAxL/5Ek4szAERpEg6HIwCelVJuwNT8LO63bfu7iUTiQroLIZqKpuKXDlFGKSkpWSGE2A5gRbprmQxCiMNSys2madamuxaiqYwNAFGGCIVCDwHYCmBZumvxSI0QYpthGJXpLoSI2AAQZZxwOLwUwCYp5ZeR/ffpOEKIvQCe4xo/UWZhA0CUoSKRSNhxnA0Avg1gbprLuVUmgF22bb/INX6izMQGgCjzBUKh0GoAXwWwDsDMNNczkW4Arwkh9hiGUQ0+y0+U0dgAEGWRxYsX58bj8RVSyrUA1gK4O80lnQZQLoQoNwyjBjywhyhrsAEgymLFxcWzNU1bBmC5EOIBjDUEH/VouD4AjVLKt4UQNbZt1yUSiYsejUVEHmMDQOQzJSUlcwAs0DTtDinlHQBKARRjbOlgJoB8ADkA9PF/sTD2yz2JsWn8bgAJAFEhxAXHcVqFEE2maXZM8kshIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIppU/wVwOICzRGGbSgAAAABJRU5ErkJggg==;\&quot; vertex=\&quot;1\&quot; parent=\&quot;iroXu6kSOUnqGuu2dOUE-67\&quot;&gt;\n          &lt;mxGeometry x=\&quot;28.64\&quot; y=\&quot;35.75\&quot; width=\&quot;78.5\&quot; height=\&quot;78.5\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;iroXu6kSOUnqGuu2dOUE-16\&quot; value=\&quot;\&quot; 
style=\&quot;shape=image;verticalLabelPosition=bottom;labelBackgroundColor=default;verticalAlign=top;aspect=fixed;imageAspect=0;image=data:image/png,iVBORw0KGgoAAAANSUhEUgAAAWgAAAFoCAYAAAB65WHVAAAABGdBTUEAALGPC/xhBQAAACBjSFJNAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAABmJLR0QA/wD/AP+gvaeTAAAAB3RJTUUH5wgKCgIx9ifNUQAAgABJREFUeNrsfQd4XNW19dw2Rb23UbfcewF3dcmFYtM7AQIhkBBqAiHU0LHB9Opuyd2yJBuSvDQChJLkJS9/XhIChJSXDrhCSHBZ/977nDszsmdkDZaxIXd/3/ZIsjTlnnvW2XVtn88TTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0/+Y8Xv9/uCwaBp23bYcZwGy7Jm0ePsQ6GBQGA2vd4Meo0x9Hrp/NqeeOKJJ57EEdM0fQSUuQSY15H+P9Kdhmn8i37+70Oh9Fr/ptf4gL7+OwF2F33dkJaWbqakpHiL4YknnnjiCoEja6Y/EHjaNI1dhmHAMHwfm9Jrw+84/0egPceyLZ9nTXviiSeeRC1nBujP2bZF1u3HC878egzQZEWDDoj/pvdTzgeGJ5544sl/vBAwsqYRKH7Tskx83NYzKx0QpBbrLnof5zFAe6EOTzzx5D9eOJxAAF1KoPg6Wa8fOzjHsabn0SEhVr0nnnjiiWdBE0ATOL95OKznONb0o/y+vDCHJ5548h8vwWCQNZ2A8TtsQX/cMej9rWjTA2hPPPHEExYOJ6SlpfkCgcClfr//Q07YWaaKRfsIMOlXDkqTBWj6Gw+gPfHEE09c4Tg0aY7fcVY6tr3HJpA2XZD2ANoTTzzx5PCKLrUrpMc7Hcd+i77+F+lu8yDVMI3dpmns6WvoxANoTzzxxJM4QiDKwEgYbQ+mr4+jr8+lr8/XekECPX8f7fH/9GyfcRznevr63b6AtAfQnnjiiScfG+ibPsM0K+jx9b5UiXgA7YknnnjyMQknIelfLuP7dd/K+IxHPID2xBNPPPmYAJqUAfq1PoAzN6o8kZaWlp6RkZGemZn5kVX+/j9JD+JaedqLemt0MJoWIvH7/UYgEPDA8AgG6LCyoA8M0I7j/JkW9MVAwP8Cff2CbdueeurpJ0x575I+b1n2s7T37yTD62h/IGB7QH1kgjQD9K/6xmznNssc/q5GTz319CMq72G9jxWFg/knAu6vEmjn+v2ONMh5cuRISV8B2lNPPf10KgH1hwTQzxBQTyU1mWbCEw+gPfXU0yMDoIVamJS5368JBAJZDNKmVxDgAbSnnnp6uNWIhC/VVCVrIwH0xLS0NNOLTXsA7amnnh4x1rTpWtNv+f3+LwVDoSybrOmMjAwPLY/kJKGnnnr6HwXQcPzOv/wBf4dt2+MYLri5zZMjsszOU089/U8Jd7hxaTf0QfoGgTSzamZ5IY8jslHFU089/U9Wx3H+FQgG15JlPcazpj8+gC6jC/2bA7o9onSi+g6ehzqeGnHUdxAUqabW/S0D45C8//8k5fvA7LUWPvr/3vU6MrW3PWMmmqhkWSDrGY7f/zqB9MWEG2lelcchFObUIO2TBe03LKTmZMAemg5nUDr8AzMQqMmIfH0w6pBa9Dys9sCoRv5/cBr8g9ORMiAH/uxUGGYCMifNyOfQDegntX36hrMIMCwTPisIMyOElKGZSB2bg+AY1mxPD6Apo3KQOpJ0HF0vWqOQ7dD94ND1TjQn04LftJFSmYkU+pvU0dmk3rU+EjQ0NheBqlT4grQ/aP0cnw2fYdOeMmXfBLTyvvHRvvGZPUvxdGMLj7/7J2HHcsKOkcrWMzxAZTFNi68G+RaGc/Dqs+gErFZDaQ/cHZgyrRipd45F+u2jkXbnOGTcMU4e0+8cf1CaRhq6axzpWKRElJ97rLxe6t3jkHPb0Sg4fQj8xUEYVoJOR1Z9c3GnlM9mJRCxAgTQDuy8FJSfPQpDn2pFWVsjwitbSFs9PYCWts1A5bIWDKDHsusmwk5Tk318doKD0qF7KWCg9PPjUd3WiorlTShrn+ldy8OuLShdOQPVDzYh/5hq+DNsmD72dgigBaTpe9Oi7/ngNfVjoq5iSyxq27Z+zVTG9H0KG3yBQOg/D5RTUlJ8Bfn5vlAoVMWBeroQy+jH6+lCrTs4NTie9A16zvcPGDYg4AtNzUfq/NFIuXc4/PePRJA0cP9whO4bQTqyj9rzd1NY548gHY7gfcPl+fwL+PlJ7xuGEL1Gzu0E1idVwM63kUqndzDOjcOg7FrNPrLgfATIhkmgbPgRsEJkNWSg+LqjULa2Bfmb61C4aTrCG6ej1NMDav6mWuRumoaSTXUovfFoBFMthDiEYcZ3kx12hwmkS740HqVd9cjvnopsuubhTu9aHm4tY+1sRMWamSi4ZjwCQ8hbtdiwMWW/+Ew/qSN7yCbwtnzxm1ocx2Ermh5F3yOgXkzfD1VG5H9QbJpdh4E11TxVpdXv9/+MPvyewzI8VgC6EGnzxxGYjoTzwFgC6DEEqAzUo6J6n9a+fK2/D92n1E9fOw+MJoAeRc87mg6Dcci7YSzSagthhwwCYLppzBDdIE6C8iDNuudjF9yPIP2+n6zo9PH5GHAHAUxXMzKerUP2piYUdbagoJ
vBw9MDaQGBbGnHNFRubEDl1yYjkOIgYATELd4foA06FG1YjoXwlybQ3zSjfMN0FG9skufxrufhXsvpKOqajGI5LGei8uFG5MyugJNpq4OVgNmk/ePz2ZqvI3HnYWwegkvzCLR/4dj2mYRVKTwF6j9CuKSFdAx9+F9ZMo/wMJEVEUCnTilGxvyjCExHI/jABKTNG4/Q/DEEtGPpZ0no/BiV7wn06bkC948ncB5PVvV4ZN17FPKvGo3Q+BwYQR9SDLacHQlVGFYc14vAmV1uvj4hBmh21VINZDQWoeTJaagigKjoYKBpRH7nDOR1zRDQKN7Y6OkBNL+LwbUOJd1NCN80BWY6bV7Lr8JHcRKEPgJng6zs/KsmoHhTC3kqdSja2OJd7yNAi8h6zuuuI7CejvKO6ahZR/uinbzKa0YjMDQdZpA9UZOMHFuFCK0+jcIT5bCX33HeI6xawtb0scce60tPT//0grM+hcxgMPgwWc8S93ErEQ6PBV0kVm1o/nA4C8Yi/d4x9L0brhjVByVr+74REhqJtbhT548iUKYbRFvO2fcSQF82FPaIVBgOxzQJeB26aQiEQzFJjP1uFIutN5P+n8A5y0LWaTWoWjaTLIUGshgaUdDZTDdoE0oIMBg0SjbWHrQWdyotIdfxQFrMv0+vW9xJgEV/46r6+yNXGZyLOqehvLMeFWRBB1MsOizjrwH/LMCWmN9A0RXjUN7VQEBA3guBQ0k/XfNDrWF9fyhNvLbhyN/E/vxI/3wNdFi2ijWds2kaPU6T+69sfTMqH2xA9swyWOkWLI5Bk5HDXpIZ2V/KELL3WftoVUiEz4P1FwTUZwWDoVBKSuqnusqigPSnpmUe1hpIXoTAtEIBaI4Z2wvGiiUdkDj06D4rg7OzYCSsB0fDfmCM/Izj0KEFDPLDkXXHaOSePRB2aSqd4HbEOnbjnZau0nDcTLPp05lmBvCg3FR2kR+lnx2JAe2zkP9MM92ABNB0YxZ2sgXRQDdpfb9ocSe7jXVikTCIFXYx4E4X99EF7h7KB8XGJrFiCgi42OVX4Fffb+/pUCi/zzBZWwy2xTdNhj/EG5hd4fgAzbFLUwB6AsrI+i7rUCEOBocj+XMWy73RKF5WWUQbZO3cw5TXmJV/VqoB3AVqOWzlEDqCP6Pcg41qP+j7j38epu/D9PPKdvq/K2mfDklHwLGRIuFCMgg5IUyGkmOyN6v24YHIl8io3EH6BBmaNbmFxb5PXYMLx59NNTPwd4cl7rwfQBchbf54iUEzQAfvGwP/guQAOjViMesE4f0j6DlIHxiNTK4KmVsKM9+iG4FjznYCPlt1motytYZ4FAzQBpyaVIS/OhHVa4+lzdWM8vVkuXU06Buzv7VRXPeijhb1qLVkYzNt8qb9NCwA5W6IerHsS+Vnh+r99Y/yYRLuICurmzb1zZPhhFTWnysA4m5O/jkD9JUTUNrNn306XZfmCDAcqVoih2eTDsXoryPaGNEi9/uNvKakMWtccoR/xkRaRJpP68zGRvmGGRgyrwV5zRXwp9FBbPnE8EnlvA4dzAzUiUpdYwHati2OS+/1+wM/JZA+ifAswAlEvz/06QFo0nLS35nmkWFBHyxA++8fI6GO1PnDCaCHCzgHF4xGzg1jkNJUBCPdlkRFgDa5k+i9aHCWhJRPXReOP6eOycbArxOQdMxC5jNkpW5qRMWGpkMI0Apcw+K+NwkIFXa2RMIoroYjFhpZWZykIQu7pLNOALtU/rbxEwDQ0z/1AB2OAeZC8XK0p9NdT9ZmnWhRpxuiIu+pqz76sxjL+ZMI0CVibDQgd1Mdcp8hr2fjLAxcfiyKvkR7fECq1LWHjBQygvwKsI2+ALTtgjQnEbeSPkRfV7q5tU8LQGsL+tMB0BzWsB6kv39gGAIE1FnzJiDvmtEIjc2CwZUaRoAs5wAMLpJPVGerQYCTGTb9npFuIKehFDWPkIu2roUAgTfOdORxeVi3sg76+2ZWVi+Xjk1DKWfEaZOWdSrADdOBUEYuLyuDU1mH+jpMm72wW216fizcpB6LPhEW9KcfoBmYoxY0rTEftGwxd9WihJXXmNaSk2usYTpwWUv1Y7E+eOX5Oj5ZAM3hnHI2ZjaqsAeHtfhalK+fhap5TchsLJPkMOOQLZ2iH4HfwzT2EGi/alnmCQTUgU98pcenEaA5bu08MIJAegTS7x2H4s/R39ekCyBbPPeQM8hc58wulJX4vagCe9I8CzmnVaNmSTNZQM1izZR0cuJjqoA032z9ejNznJLdW7KuigVkaVNvbqafzUDJupkIryJtm4GiFfReVtAN307/x00Ca+hGX30MalYfh6oNs1HRORMlDM6basUa8wD68GthF4MxWY+dzVIeOGA96bpWlG44BsVrZ6KY13FFC8LLm1G6lA7iZfTYzmvbimICsmKyOou7Wui+aJJ7Q+6Rzk8SUDdGjA++JwvZy+vgz9KKslXkHX5+GJzykFR3JFuooBKJqgvRsqytpPcTSJcNGjz00w7QmnFKJ2g+ErdCpHTP6PUCp00uQs49R6nKjfmjkTqPQXcUWcJ9U67aSJPY8yhk3j0emWfXwCoOqI4laTk1I4km6QiU8IUhySgpmOd4M1eymJaEOKxiP8o+Owo1bTPJWq5XSSi6sfK6p4tyVr18Q13cGK8CRbaI6ul3OE7dJOVkrOzeFtLGKmIrghNCBPKltGkrNsxAVXsLKh4ni+nuiSi6ejSKzh6KnOMqkVUXRuakYqSOyUdwRDb8wzLhDM2Qr1PH5CFjfBGypoWl5jT3zIEouGQESm8YjwEP1KJyCVkq62agYuMMssSaIslG9gSKdWwwv5vfl+uCqwRlbBjlULrWRzpAqzCTmwim99upEmCFunInv7NZvuZQV/kGthbrpElJQhK03sVdrfT1DOl6rHyE1vb2o1F85SjkXDAI2SdUIKeuDOlHFyE0KheB4VkIDM0U6oHAkEyk0M/SJxQik9e2tQKZp1cj5/JhKL51PMoWTMWARU0YwId2ZysKNvHaNsg9xVUxxdw8soEMi45m7UXVS8KxtKNOA2WjzluotS4+pGtcL94mh3WKOl2tl9csovdZ2E2eIt17VfPqkFZXDCtgKSNKSi2tCHZYvqj2TOJH66V1u/gev9//qm07x9PPHOuTyOnRG0Bb+kQSUOMicfq6qjgHU8YNwZSxQ9Sj1smkk8bvr9PGD8PRI2uQkRJQ3Xf0XL6EPBc+ugmLkXfP0RI3Ds0bJrFkZ8EYONxg0hflSo2HRiH39rHIObYSVrbiAFCfzUhw+JhRch75XQcG3Rwh2iCV13EycIYAV1FnfAAo6dWlbRSrSSynTvW7hXRT8gYvpedk66lmWRMqHpiM4ivGIOfEAXRI5SNQmQI7w4ZjJ0cIZMXwg/gc+sypNvwFKUgZloWspjIUnj8c5bccjcqnpqNqbYOAG1tg7ntzrRtJMHZENXzIN+8nwIKWQ6tRX686CTfwdWEA5PuDwSdXDlzSzcoTqljBgEOATO8za2410o6mta1KgZlDgBPwqTAbt7QnRULE1AJcB057siiAlBGZy
G4tQ+5FQ1Fy2yRUP9mIyjUEyl3NUgHEBkHJxhaUbWgUY4EPDlUxogC6VJLJbrlf3WGyrBmk6+UQYe8xvJL2y+dGwV+eIiFGW9rELblWbGyxJ2zFVPUk5Pfx+1nfsW37HtJyxrxP1CzEvgK0EGzT95efPwdbft6OLT9bpnV5RN8lfWdf/Z/V+Mk3n8TgsiLFHtcbQNs+pNcWofD2SUi9Zyyy7hwjVnDqveORdu+4Pmn6PeNQdB19XV8IX5YhG6C30I17EkvrqRWkR0vKfFIn5GLgPfWoXDcT2Zvrkb1JWUrJAIBsWrKccrpbkMuxYLJqymkT1KxvRM1iAuXbJqPojMFk+RbAX5wGJ0BgSqBq8+nvUyQzTBiUjKtnRtYtOlmZq1Ac06bnpZuca70J+FMGZCK/sRKFZKFXP9aAgataUE3WX8XGaWTFszYoq18ntPK7mrV1/Z8J0Gw152xSXk/lhlpUradr1DEV5R1TREs3TiUrmUsFZ6ConazV26cgh9Y2dWK+gKjjt5DCn8dU7euxpZy2L0lXngFde6Qcq7Xp+fzk7QVo/zi8toMzkXVMBYquOQqVj7WgeJ0GvS61lnm0lrl0zRi8C0gLdYiES//CHY2HPQRS2KE8k6KumSi/tw7Zk4sRDOiEvmXpvepXJbJ0HVzypQMMBthj29YLBNIz6Gv7EwPSfQZoUrbMrr3kVODNduA3i/umry/Hb59/CiMqijVAWwkBmm/a1NwUpI3IQXB0FlJGZ4sGR9HX5Pal9lGDlWkwgoaUxEnHknyGxIDmJw0aATqpU2BkWMhqKUXVo3xTtxC41iFz83S5mTlW1tcbmDcEW54VG2rFfSztJPd29WyULqhF3mcGI2VMDsxCRzF+0TUJ+vz06IBuHXLpOLxiyuEi7p2ZZCw/4vIZEc/AopvatP2KC0EOSeUVWUGTrPVUaR4o/ep4VC8m64s78rp1/fUm0q7pui637j8YoBsF1PgacFNMuHOqdMplP0Mgt7kR1WtaMOAeWuvThyJ1VA4CBJQB4QwxdBiNeVsCQhHAHXSWz5L78qCmkkg+he8dWzpb+UA3fcpj5JpiK8VEqDQVOS1lKL/+aMlbFGyagRJp7KkVL6CArncee3idTRIGkXjwYS7FK3JDLnTfFW9qRklbK3I/NwKBshS5piEfG1F+IShjljz3oEs8B9HUcWkB679btn1nKBQKZ2VlH/mcHn0Oceg4LQP07jdXYzcB7x7RZRHdG0fx+lL89gePYTgBNL0aAYOd0CXhQ8DxBXSfvim1x77I61uKDeuASr/nM8QqCdBz+H2GpjX0Jax35nroEH32QIYfhacPlqRM/uYGSQCysivLmXW2KsMdB07wRUuq6sgirUPl8gaU3TgJOa2VCBSmwc9UmgTIjmwq/szKEmKeD1a2iPhG5GL9kOH7CNlsF8RUuMP2Ra01sdj4NSxWPngJuH061keHU8qobBRdOAJlj01D1bpWSUIVknVYunEKffZp/8Ex6EYBLwVq0wXYSrpmoWbFcai6nqzl1jLYpeSBkaVs0dqafB9aKgkdqaUXz8hSuRC2gnUVUV9anXtwJscAknoeS/aVJUaU0SM+Kx4V/Z+T5UfK2BwUXDgU1Y9Ox8A1TbS+M+gztUr8XIVBGiNx9sNfkldPhwh5JZ3TkLOZK5eOxaA7G5E5uRCBoCGNLbx/BCsMI6HRl8Ci3k2W9HOOY7eSJX1kW9MfBaB3vbkGu15vw643orqbvt/zervo3oi2AW8sx5s/eByDBKD1jZSgjdpN0JliCeh4tdsgYlh9Vjem3COJYCU+FKS6o8yP8CWjMah9Fm2+RlLVucWxOdaSSH1x/QFjacW67btsJbm6ZLmkTSsk11N1SzkEOgGyYrkY39GxNGmKMY0Yt5cOF/q/kKFY9cyPCMoqCapUdUqqKhYX/FPFstaWtKU8Dj4onAC9dk0aiumwqrmPXPo1s3Ts9T83Bh3WZWISFiDAkGTflyci4+hCWJl0T3NCWbwVZmsLKjfcDUVwTa9Wlw3RbYRywx3JAnSP5+DXlvZoRUNgusl4S8W42WvyG8qit0MW/EPTkX/2IAx8sImA+jha15nSll3Y6VIFHAkA3SDeqnTB0r6r5PDbhlaUL29BwWeGwC4LqHuZsaKXsGncYgcd9iCQ/hs93mRaVtERW47XO0AbEZJ65nPlDX/t50/GHrKg97y+oofuJcVv9lH+WSxAG7bcuAkBWlu0kRtQq+n7qC5+9Gv3e79P3cxsubBlYdsERoMzUH3jFFSsn4FCulG59bZ0Q4NuTW2ItKzmE2gzcLs1nJzsK5IyoVqUb9BtugQW1atmoPyGo5E6tQBWut3jcPDFtJS7j8pLsNRmNtWBZMj3lqooieMKmz0OUP27wrNr9ADnWHfYcA880+hh3TFABNxYKFvVtinXyXToelWnIv+84ah6eiZZjDPpc9eibIOileTPXppE2Kf3EAInr6aijFntbmSAtiLx2vgbzdat3uNQsqlRvJyizsZ+qlWulW43DmtUrG+ShGnuJlrv7hZU0WFVesNEpNHa2uRx2Lphwj1cLXeijl4Pn7SrGxGr196nhdlN6vaWT7B0yadKsltyb5i+aDWSE+GOMfVrKq5lS3uSLiucqH5PzKUdGJiGvPOHovzpZlrbVglllWwgb6mD7+NmyZ1w7uFwJA+LdA6H763Sjfx+aum91EnHZfX62ai8bQpSJ+XDTDUVs6Hll8OI71snhtNj38aWKEi7au4icP4veqyjR/uIC3kkBmiVqPJp684kt00A+uITCaBXSXhjr9bI17/ZV5cRQC/Dbwmgh1QW6xvMkRvHPBy11qayGC1JrBhCV5kxoQAD727EwHWzxTrKkZrhJkmOxbek6qRRoDDCizFNEdh01JPL2IKKedOQdWw57Hy/AsIkDhflCkc3pWH0Zvnr5KaArgIDw3Q9HReMTW3FWUklGt1DROrF2SVPNRAanYWSy8agZnmLHFr82UulXKuxX6xW4QrpJICmx8rrJyPAlp6ZqJGI1zAI228SQI9F0SYVG5ayxf6oJtBcJyXCkNcgoFC5bgaq5jdIZZCd59f7I/nwU6zB4DOTa8LwuQBsup6lWifT9CWdp5C9zsmzFAOpwzJResV48vhapdSSD+GibtWxqlrND2d1x/5VUezhltLBMWhRC0rOHgYnnALHtsRDNS21x23tQfTlGuv49F/Imr6BNMc8kia3HAxA74kB6D1xAXr5EQXQHF7xWSFxQwO0uXOay1H9JAFO12zJwBd0TSPLqRa5TH0Zp2KB3a7K9dzaXS8bWOLT0s3XiMpl5F5/bgRC5anwO2oShPERyKdiQxS9ew6qJFB5N7FxZs1prcNEhi6RTKroPwIeliS1HEOR5/vTTeTyNZvHJVnsFjf0W/JQXe96IXqvvnmq8DMEdKw2HkD7fQTQAQLoK8dKkwMfqBxL7ZeOSfo8XMlS3DkZeZvp8F4yGxXnjkKQ1tby+yQ0ZUuOJFmQjSmD9O1zICYCZdPS8evo35sxVARRvpgkQZo9JctBiD5HKhsE5A1ktJQifD+twcbZ
pE1CvB8WNsWGI6YBiAFaOKc31YpnM2j1bFTdNh3pR+XRnmZQ5nuG97hfErRW3+PSXI73oe04T9D3GUdMzXR/WdCJrOgjBqDZ0jAUcNmZARSeMVTiWdnPNiDrmTrJGJdL2VSt5rSNb0GzS8+xWC5P4t8bQJZ39b21yJxaCCPdlEy9n24QBhfbl3xW3uxReWEmHpwqySEnQs3ot3R9qC86LzHAoMr/Z7o3at+6s6JutynlTKYMJrDh8Gs6PgGqks+PRhl3M3J9t2tdHUSIIb+TQJaJpwgMyuZPh1OgXlOSbXGAK8BVL5kWCm84mg7XGVKLnEuP/RLm6KD3w01JXMd87xRkTy2CGTLFG3J0+aPNdLPJGAdu/FisO1MeD/T3lvQeOBJe5AS7EZNbMLS7zgyUluaLSeb9MHiF+Ppyzb/fEt5tDg2k0NqWfX4sBi2fKWGFHJ4MFGlNPxK4TDh5qDp5i8igKhIDqRlVC5tRcNYQOPnMNklYRfswYCTm2olb7cFzEB3nA8u2z3cCAd8RMQPxP8aCNhWVYbAogLJLxiBMwJr5DXJZu1UslS0FjiOXSWNGQ8JsdgFb1+Rihcn9G9J2LCq/dBQCg9IQkNi5qcb58HxC+Zxmkm6nrtlmy9s0Yhps4m14UzwCtZEJiFNSUDZyLMpHjUd2RRWC2TmwQ0H1HJal44+914T3CJ9EGP1486qyP5NfT5K3Pon9cflW9aNNKGVw7To4vo886bRrFs6RMjo408bn9cLHYKjpz4PTUfyUIhxijgeuOT9YgGYvqWJdC8pXz0I+U2JWpgovtW2aOoltqBJIqSLorbxrfy5jaTThv7VNAUTXKk7kKflpjVPpfvIH01A6Yoysb1Z5NQJZ+TBTMuk5VJmZqRPOZrL18uJdWWpABd2vtpTr0eciQyPzmDJUPt5IHs3MGF6YIwCgJVlI692hmmtcvnRuBCtbPwvhWychY2we/AFDkrKSJE0mAWsLF8hm0tQjIh7dW5LQTADQuyVJuHw/7R+ANmK6g4yPpNHR7kaP+G5oSCYqbpqEoo5GZG2aJqdx1bpmlG1QIY2cTdwN1igJQNX6un9Mk28E5uOoWdyCvNNq4M9xkEJWpm0GdWzUTQJZapJxEjwC0hIfcJBVGsawadOQW1kh4JoIoE1tpfP6lI0cg8df/m8s+s3/4b4f/wq3bv4Ovvj4Ipx49XUYf8xcFNUMQiAlVSUkfbEE6PEnlwsXgrbObT3qSyUTVVLGZNBO9UmdesldRwnbXmyCJ2kLuqtFt9FPoQ3XQgff0bCynB6A0gOA0gyUnDtcGkN4nmHl+loC+SbpiDyYxJR0dy5uRt7JNQSEPDUnQJuWvAjbiFwHtzTOjEmk92kaiOMngKW1rZ1Oa1vWI2wRv1rDkOaimqMm4YlXf4qnfvUW5v3oF7j52e/hkscWY84V12LszGNROIgsx7RUtW6+qLVu9jLwwNTr6tfeFl9bn6VjuJaqkEkbk4vqW6fStW3SbHo9u2ZLDkuIQ4XDCmkPlnS0RGq3VWya3iN5PlUL65F3+kDYecEEeJIYe3RTy+ukpUdEmKM3gPa5iSadFWYX68ufP0UAetfrK6S0btdvVojuJt2zny6P1EEPlTI7WyzMhKcXWwKWX26QFLE+2ULwq9llfVSOu6qstiGuqHQeZRjSPl1x/zSyjpul+aKQJzxIt5zKFHPGPr+rUTgpuKKAG0zKN9QLcORumkGg3SwtvKVkpQ0gqzG9oRiWzDBUXXqGqbqazH3KE934YWwC0M38s0vFv2OlZCA8ZBQazz4fFz+xGPe+8j+0IX+GitHj5PPHu6l8GrB4c/lsB2fe+HWs274LK7btxYrtwModwFrSdfR9+1/ew6M/fx3XrenCCVdei8GTaxHMzFMHiK4cEfDTm9vndmvpagFL19ZG4uM6ri1AxZu8NAXhK8fRRp6hyKO4BVq4ixvFGxGN2WCJ3dda4ZDgtRi8fBbyLhgKK+yXZA+/JpeKsSXLwJ19ejUGL2kRNjiuuFDUnMkdDhyz5vfJZP+cIC7mZqKHm5AxrVganVQZpKOqY0wFZM5+iUGVkLV9RqSKw7RUTbuEJEIpKBk6Ck3nXYQvPrEE82htH6ODtHz4cA2miRuR5Fr7HZx9x71YtXOPrGv7jr2ytqtJ12zbjeV/2YYHf/YarqS1nXPVdRg4aRqCWXn09xwiUvFlOyavoO4ZUx20MfXS0QSmqo2XGmPuQK1KQSUnENfpJp31ypjhski3Cevjr+5QjTVFG5tiRpypBG+hWNP1KF/bgsJbxyM4IUevJYf5ApKD8vXS/KXj0b+nx8oj3oJWRPW2AkhteV0jnYQrBXhVM8pSUcTVJdj7xnK88YMnMSSmDjpRUoRd2mBBCKFRWUgfmg3/SNYsBEZkIjiMNevAOjwTgdEZCI3JQurYXGQ0FqPki6Mw8OkmsY44TlrSh5ZlJpzhMAcT3xTRDZDX2SQc0APurkPq0bmy6OwOhnyJLaDYDsnY8iqpJCH3NKM4jEknnIovPbkET/z8NWz4+3bagHvQ9j5w3ar1ZBVlyk0V99S3FLAyeOZVVOOhl36C9u17sHwbsHzrPko/W7YTaCNdu+UDrHjz/3B71zcw57KrER42AlYghUDF1mVZGmQ4pqmJpLjxp9eEpZ82e56Dos8OR+XaGSjvalSUmJ31EZKh8AEsrvBGxfchlRhc3rahBYNWzULVzZOQPbccIbrmoQm5yD6mHNVfmYiBbS2yNiVCVqTqqJO1nvl1qtbXS+t2Ec81vK8WaaNzY4aVGnFpAdzDKrbSJmpNq0qLzHAppp50Kq56egme+vlvsObdf6J9Jx2W/wSubVsLJzVE19XstbOWX6NkwAA88pP/J+vHa9m+lYCa1F3bZVvVmrfxgbzt31j+2z/jlo5ncMwll6NkyAjYwRSp6nEMDVI+I0ImFN8zM8T743AH9xT4Aj6ZHlR04TDUtLdKiI8PQlXtcmQyJLqeL8+3HPhkq9Ap2KV00JMhwxORnF56C9S1MT8pAK2t6BiAvuJzp+CD36zFB79ehX++RvrrlaTt8vjBvvraSrz/mw34xfeXYKDbSdjLDcnWWPbsSlQun4HS5bTpVjajYhmdhstrUbqCAHZFcx+Uf6+O/o4237IWVK9uVfyz3bVCxtKXbDQvbvZmtpgbMGDdFLoRpxJ4EFjcVofQ4EyJZVvM2eGzNZdC7wk3BvEUn9ocHIPMqxqI475wFe751nNY+ddtaH9/Lxa/z5YvWUTbyUJ699+Y9flL5Dr5EyV/BETU2jSQ5b3uH9vRRhbViq179wdo0qXb9mAZPf+ybWxl7yKLbDfWbP8Xnv7F67ho/sMYNKWBNnNq3O7DqEeQwD0McO00vdd0AukzhqFy5QwUd9eKJyIseV2Ky6JkY9SSjtcIUqo9GY4pF3fxxJgWOVQ5BFXeToDM60uWHHNehzX4Cyh39qxZ76tK7bpMoG5B2W1ThRxLyi8NM647HHvfuonUSKzeUJ5MbvUgHHvpl3DPd55H+1+3Yw0fuGT1LqLD82m6/ive/QAzPneJ/L7keIzEBgv
vt9kXfh5r395Ba0f3xxYF0Mu3RQF66RZa0y17BKgX0/ovo9dbSZ7U6rffxyM/+xXOnf8Qao6eAieUJqDLSU7D7IUHPVJiyQ1NuhrJJnDPdJB/5hCEV8+QBCqXmeZ3NfVb7Xm/g7SeTMO16xXryev92iQEB6YLOMte7B2g/0BWdOUR0bzSK0BbUZA2dNfboAHlOPXY6TiN9FTRaRE97Zh9dTrOOK4WxzRPRkZ6aqRtO+FGpxsyf1YNyjfORgk3g3STJdVRJwRD4U5F3HNA1WxjxTG/X9ilSGGS4s0lYOAkRHHXVCG8D982BWkDMxW/gliWtk6g9daqG8M/bfqRXVaN4y69HPe++BMC0g/JoiILiNzWJaSLaQMuIV1GltDiN/6EqgkTImCZqOKDi/P9GXlkka0TF3g5gXPblvgAvZI28WrSlfz/ZHEtpddcRMCx9D1gFW3qRb/5Ay596AnazBNhhgISj+RDKMDzF309KxH293xUVQnzQgRSLGSfVo2K9kYFxtriYi2Rbrz4lrQ0JHS4xEwKpDnenyecIPXyPMWdzYo7YlOdKrXqioJ7+CPERFXDUSsKvz4Zgao0CYtx152RKDHrlh+69fSWqsiw6F7IClfgOPJI5tHatrO1TOu4ImLl8gEJCVEs+g2t7fjx2gI3xKvc1zsxdf18MCsHt6zfhJV0oC7lUBUBdNs2Xr/4ayyvo19rmYS66G927Mbjr/8BFy14FAOPngbLSZN70TKt6Jr69v+MlsSmuQqI159LLQMw0y3knjEQpQTSnKsp2fAJmNSziYdqTMHQNTNQeFyNOpjM3kvuPnEAbeh2YLc0KCkeaNcd7NE1l6Aul34vd/YAAsSZtBGn6WQdk9TU91nzuhVvQpk7o69DzXQrSbJMqKJDlVmVdM1AxR3TpOuKy8z8kpyLttL2BtCqE46slsxcTD7tHNxJVtXad97D8vf24mnSxTtU+GHluwSS7+4VsGbQvuMb30VKdnak1C3etVJhCBtVYydj4et/xOL3FCDwc8UFaHp+fg0GaLayeQMvpQ28iNzup0mXkkW9avuHeOpXv8PZd8xDwaBh9NmY6jFwwBIuxaGiqgkknJNqIvesQRjY3ozKDsX76wJ0aUIuEx4bpio53JIuGYywsTE6bzFmPQu0Ve6Cc6EOcyQTy+TyvKrbapFakaGsYEdVzZgJOjdjS9ysgF8sS39WLqaffh7mf/sHWPP2e1hB13ERHXh82LrXfoXW1bS+t2/+LkK5uVL+KJ2bTJoUG9c2VaKK98KQSVOw5K2/iVXM67Vyi7pflm6Lv8ZsYbdtUY8qtKXWeckOBdScZDz75jtRWENrawai+7SHF6vCkNF8Cseq/XSvBWR+p5VmIefMQTIsObyh7ogfv1UqQ5enoWo93YdzavR+NXqNQX8CLWjdv264sUiXvCiqptxoPVVimJJgccvGoq3JcRNfPg5xVIu1VNYxWTLzPO9PpgJ36pHznQfQjYqjlzufXMtZTbduPAB7VkNPa4CJkrrpfdzXgMzB2UhlS4krFzQ/s1trbPv2yQ6bUZeXwbl81ASJMS/+yxYsY1CWODFZuuSSsrZvVaDJupQ24lrS8++4V3GSyBrYca+VxLRpwxx76ZVkWf1LAH95LwC9TG9sdpWX83sgN5i1nVzkFdsUoCyln3MSas3Wf+PhH7yM6WecByc1R4FI3Moa9z5xpJ5WEmiWKmf0p9souGAwBq1qlhFdhXoAQCILmhM+RQTQnJVXgFwvnYphPfy2WM/pc6tr3K7BcKRMr0GsuqIEMUl3gofLe11OB8HAu+qFmlO6zjgpyi3uRqzXYkRCO2ztBt2KB97EwQAqx47BZU/R2v51K9q276bru5uu9R6s2qLWdtl21l1yXZdIgg84++v3wPA75JkYMQAdrWoy9UQQfs0Tr/wKWc+76SBXXs/KLW64Kv4aswe13LWwZY33SNK4bSvfW2wQ8Hv7J+577iVMOfkMOCkpyoiKaehQHCIBTcKk81CSgFfJQ5uA2kqzUXTeUJSub5SJKMWRvdMoselkQVvWuEM/hybxL9RE/gc7YZyri7jPoXRjCwqPr1GxeJ/96bCgk2uyiK+9ldLFA+icWQMIaFvp5JsqQMsk48kuVrFWtSHVpkyUQGIrnQdzcjUHd08VuSOECJwrH2tC2thcxTdrmD2sZbdW2IzJ5KvpDz4BcTslDdPO+Azu+8mvyDJVcUPleva0qmLdU7Zm1729A7W0ecSyMRNfK7a6gqlp+HL7eqx4DwpcJcSBhO7vMvf1tyGykdmajsStI+9tL9ZyvPQvW/G5h55C3sDBmm7W5YOwdbLX3G/iTiyfSig7gJIvjaNDdqY+QKfSNa7rJRTRqEciuaBavw8JVX3CoQFFvVSIcPikTGZIKo4NPgBqHmpA6ugcFTuOOXD24zjR1UHsRTDjITMfOhk5aDj3Qjz4419g9TY+6PT11WsqViwn9DikRIcdeyrsLbX//X1MO+HkmAlDMXtHX1uJbZOFx3XsN6x/Vio2liUE5F40JhSyIhbY2RKn97v0T1tw/vwHkVtFoEVGVMinwnGcI1G13m6RAF8DRxGQSYeqX9XVF5Dx8YWxqFjHPQF8nZs1RUKt8LQkA9CcpK1cXy+eEYe1CtywFo/0ktFeH005ScjcLrnP0vp3HYeCOQOF6703/u1PLUD310xCF6BLOPYrBOiN8ngoWdQ4MVXWoU5uJsXhtu+aRS3ImBGGEfTpqd69MGbp2li3rjQ7XI5z75yPJX96V8Uixd3d0+uGEgv3fdI3/oSKUeMjbcSJqif4WpUMHITHf/6auLBiDW/dG3VvD0p5c++WZOK6rf/Cvf/1HEbUN0sWXLnFRlxLer/OLOZHCIcQvmkiytkqpo2bexjmIzK/BCd8qzZMle4zTiZnNJaIpb+vix85cHyK08TUyV23QiOjuBQX3H0/Vv3pbTp4d/W6rgzQq7Z+KADN4aunf/UHVA4fqWK++1RRmDoe7VaJhEeMwtP/+zu0beuP9YwTDiGDYfU77+OeZ7+DYbX1sHgqts6ZxBpXLkArTniVGLdNXd1RHkQlrS0fwLmbFckUW8MFSSYOpbaZvF2e9s25nwGrZ2HAQwT8t0xByc2TD0orvzYJ5fRYddN0ZE0p1n0KiakPjjiA5lISUgFo6wgE6NJDDNCShe5Uk4Z5Qnf+pkZUrmxG/qk1QhIk8UBx+4zIKKm4XYqWev+lg4bhptWdYjkJcHJZ1DYNegfaNDuB+c+9gvSCcAxAJ/Y2jj7uOLT/ZZsAwHKyeKUEqx8AegW75pyY4qoSAuq1BERP/eIt1J9zPqxgQLvFyhKxpBnHjF9l4viRQps6ZUgGKu6bLhsxZ9P0wwLQHP4o3ziVNn89Ms8aQB6Oatfev2kn6tKrmm/Vgckhp9LRR+Frqzeg/d33sUTAeRdZy7sSXkcJY9F15PARW8J3/dfzSMvLd+flxalisnTi0Iepp5yO1X9/r4eX1b9KhwYdHpx3eOznv0b9mefCDqVKL4Jb0+24DSymy+diScKbk4ecUzIdQwZkVBOYqtb4qVLCmt
09Q0pT+7o+zH2TrVvrqx8lI+nEavgr6b2k0uulmAelnA8xs+i9Z3NYyq1tt/c7II9YgOY3YSk+1F84tp2wPvLTCtCuWy3xarKyqtfORsXVR8HKd/TNqGah8YI6+9U8G5GpJfzzmklTcPt3X8CK93Rt6hYktcFW7dyLa5auohsprQflq5mgouD0r92ENds+FABgi6h9y97+2bycXOKYqq4wYde4jR5X/P7vOPmaaxHITJcmm8ikFtNKQOakeIhTeIRYfSFZri3S6FDysbcH8/o2yVDV0stHw8kicDb8wtkQl6BIGp40QDNPRcDG4Km1WPDcy1hL13vRzj14ihOr25HwQFwWCSOp+PNqWqOrnloC0+9PWGLq8ylwNBwLZ992l/zN8kMG0GptF9LnWLaTDuK3/oI5X7oG/vRsud+5ICB22IWiMbUjXC+OT01zscgLYU9z0LKZdAAqgyevuzUpgOaeg4qOFgx5oBGpR+fACBlSOWIa7t77aMr3pV/2sJpeoxKevXcTHnEArdsZmQp1kZWgrfhTbUFrLmIG6NKOmRgyrxmpg7KkEcQRUp7oTWrty5uguS3Y8hnRMAMPvPI/aKObnW96ifUlCZiryaI5+5a7BCCizRBx3DHpUAvhuhWrsZqsbo5vtm1Tian+AmgGlhUxYMObmZNO7X/dhnNvvQPB7CzVGmwmsKANN5lsI8jAk2ah8DNDMaRtVo+28I+lJnZDHSo7WjFgXqOMQ0uRFmoVa417sPjsyMHo85sY2dyKR1/9OR2Ge8QjWibVEb3Hht2ELOti+v21ZG2feu1XE5YqSvhMYt1036XQ2q7ukMaiQwbQOkbN67pEuhN3Y+VftuD0G26BPytLOnBtbXgogLY1s54bkrGliUqoOtMMlJ47AlXrjhGGw3BHbZIeTi0GrZyBguOqhKea53AKSZTmojkoNVTllT/G0OmNJvaIA2g3Dk1vaqpl27+Pdhn1lyXdN4Kew2dBK/4GbjEesHQGMptK4XdsTY4Tbe11q1mk/ln/n627robUt2DBT3+NNVzOtGU3ASUE4NgKTWbTrNzyL7RceKnEeN3OtHhcD3yd0guKcD9ZdLyxOMTRvnW31DrHSxJ+lCTTCp1wjHgB21QFClearP7rFjpIbkcgO1MN5Y1UsETLl0w39ONTISJp1S4IoOLGSSgVHoVG0SKdte+/ieEqMVyiG1+KdEnegCXNyJ5WrLgWuDvT0m34Oh4Zy9Eso6M04f6IukY8xslA/uw6qcpVGly2uFwDdSKAXqFBcBGtz5q3/4mGM89KyIGi2rCV1Z5TUooFL/1ULPRl22ITflq37dEH6J5IgjfpNZbPsiemHG+3OoD//C7OuPFm+DOzddmkoYmdbNXDEKnssoS2gQGajRh/URDhr3OMn3MN0yRk6CZt3SqcRKGtgk10gD5cD6c8hV4zQNderY+5T/fmR9HI1JqYCTS9FTC4AE3gfOQANFvRPEyAAPpUenzTcRwpKfuoZEWxqkfLoC/x7cMB0Kr2tgWVzPF86QiYab1RSfKGDqrSN50NrjlqCua//DNJBi7fio8cM+QNtvjt93HU7DmKE8PSLmQC4p3wsJF4+rU/YCVZZrJZtynQUGEOpW6FRls/WV1ioXP9NFcW/H0nTr3+RjjpoZgGB0NPm44fCuLEW/qYIlQt5jZtRQbP7q2aLN1PhPA8M7CrVng9eIZksdCYzkbRJSPhpCp3l9kG+dBQsy+dyKR5F3z4gDadAIZMbcSjL/0M66Rcbm9MlUvf1lOVT+6WhqBlf96KsTNmxgdo/dq2LvHjJDE3DqnX3Nuj4qZNJ3CXb/uQDordWP3uHklGilXfD2vMz9NGXtIpX7kB/tQsOLYmi4qzf023csdUoZmM8XmoprUt7FZVOHndqiKK6UHZAGJLOe4e5P1980RYIVtRhRrR++jj1thW7yOGE9qNRTNIEziP8fv9dwZDoW/ZjvMq/exHB6N+x/lRIBj8leP37zoSAZrDGzzleMCDtfAPTpcuK9u0e+k20kkSeq+FQ4bjzm89h/Vk+a442Pgvbegn/vg2hk6qldAGlzs5cQ4KF6CH1TZgEW16tuoWcivxDh0v3q4aUJZLyGM3beJdtIl3JewwTBagOZQiDTbkHaz509s49guXwXCcqOubkD5Tx+tTTIQ/MxwD1s2UDRsma4vLs/K6mvuPh4HnR3ZORSmBNXMFV95fJ5SwlrRxO0LTyaEZmXupaQzMWNeXAKeEDsB7vvMCeUW7xdJs27Yn6QOXG4MYRJfspDX63d9QOXZCYoDWVh7//+gZx2LF37YpYE+4FqrxaM07e2l992qOjn6IWbM3tm0X2v/4D8y48BLhaQlxs1U8j1onx5VBQ9eSGQbPp7VdO0OajLI3q/FwqjmpMWF1R5jHx107DrY/BqCNwwrQv+PKNuNImqwSa02T0p5zUtPS0rIyMjI+sqanp2eTZgZCoQnMEHVExqC7pqN6TSPyTqyGGfDDb6aouHKi7kDbkCkU2QVhXLlyA5bTBm4nADzobDtt6Id/83+oHDFWDwTVAG3GS2Jxlv8MrNr2b0kstr9H1tX7ihBp8U4osNb1swIuW3b3y+Zt27pHN7XQ62zjEjKy3N74C44+bo5uDY9JpO5nARmR1vdQWQrK50+nzTtdpoVzDS3zQfdHwldNpuYmoynCEzy4fSZyjq+SdePGHraefWwRalB2q3P8mlCIq2YyS8pwXdtarNr+byyka8uHIINhMmu8VFvQDNDcvv/Y/76FoprBvdK8ysFM91bjORdg7bYPsJzWdOVO1V3KFT78PKxyCGuLeZnuFlyWiCjrI1R38POv4A7T135PHt1xCDBA++IDNHOwcI5BaqNpzwRpbQfOb5COz9zNdVJ3HpZwFicOW+KGpEq51PUrE+B4AH0Y4ttKS+lDv3YkAnQ5PX/FTUfBKmSWqxCpX9c8J2gQ8Vvwp6Tj3FvuIsD6N55+7yM2EuxX2gYs+MVvUVwzVFGIJgBoU4+vGj9zNr5KIHLN4nZ8dVUHbun+L9z/4o/x5K/ewvK/7pCOxFU7VHxRiJK27e2fxJKU4Lngz5SXwEMv/QzlZB3yuKHo1GojwSQPEzZ5B1nHVGDAStqwPBmjs28kVn0pmWSALu/gaRtTUU6AMOCmKXAKVLu64bN1G7MZw6uhiaEMNZDACqXivNvulpb8JTt2qWoN3Y6fLEC3a4BeTuvw4E9+hbyyqoQAbbq5Dnof42cdh+tWrMHVS1fhKys7cMOmb+Oe53+EJ3/5Ftr+vAXr6L5jy547FxfSOnMXKXtR/XEfLt+i3jsnulfu3IMHX/gRykeOk4PDSGBFK/pZR2L3DNh5x1VjSNsxcvBy+Eo6eztaEgM0d4B+9SgPoA9jI8yRCdCdDahZ0YLsxrDE2UJSeuXTRPtOQn6NKSefhqV/fFs1EXDsuB82B9NFzvvvXyO3tFo6Fxmg7QQWtNTROvRegymwyQV1gunwp2Ujs7gcJUNHEngfg1O/8lXcsHojAfbvycL/QFq4Dz4+qSoTVuiKkTbtVq8m4L6WgCQ1ryjChJc4hqiAk
Ifq1txCVvSmFuFKqNgwvR/CVSrpWLVhuljR1WQ9ZzeWRoaqupU4TkzCyAVoJg/yOQ5qTz8Ly/7wN7Rt36X4SrZ9tAOYvQwOQ6x+d7fkJ+a/9FPkFJUfeIiw8HwE6aAIyaMTzEAwI1fWNjx0FCbMPBanXfNV3LimEwt/9Vtpy5dmqH5qaHEPocXbVXfq+u0f4qrlaxDKzlO5EV9PGl2ZoO0YUhLHHNKcn7GLghh4Wx0GrJ0pXowKY9XL+ngAfWQCdBl96N8cToAu1SNycjc3Ck9HeEMtSjubUX7DRFg5jri/jpss0vwiblzVr0t1uESqZOgIPPDDn0gMdtlW5fYmC9DCwbFlj7CTMcAJDwa5sPe/8nNk0Ua0I7XVRu+j42NoQd34paGVAdKfno7yUWNx/GVX4s5vfh+r/rod7dtczgiVUGzTFRsuj8PSXsh4opZzDEBvUzwP7e/8Eydc9VU4TghBTqLa8QlpZFNzAppAOru2DGVrZihrt58Amg/dyg3ThKa09IZJsKXm2YgMV7VjxlT5dC234hCxUDJiFB4iL2SlXCO1tn1t/HFbvKO6V8WtycJdTl7Wvd9/CZkFJQcEaFNXPZmGCr0EDVUlYeuZhIoCgLy49ExUjJ6AuVdeh3u+/UOs/etO4RJfFvu+t6kGpuVJhLfce8ENn6zgjtK3/4lZX7yK1jQYqW5SoSEVm+YuWql80URLPCItp6Ucg8j4KemYKlS/qiu4/qABmveo37aFj91yJ8DYagJMXA0YMvnHF9Jk/MJCeUCyJA+gP26ALt9Qh7KOWmQ906QAmk726pWtyKwPx3STGfuRRfFCcsw5SDefPy0Vlzz0ODrIqlghHWLKUkrWwuLkXSxALyeAW0Vu6r3f/B5SC8Paeo6W9x1MiaOhP1dGYTHqzjgHX3/22wSm7wlHsWzEmDKuZZqGdOm2ZK1FPmD2YOEvf4thk+toAwVhOlbcTcAWlyN8DyE4OSFU3jYZpcxa2A9rrLiJ66SjLbymFekNpRFrOX5ziKWbF+g9paXhi48+hTXcIfhRyhKZq1lr2xYV3uBrvJiuy9qdu3Fn17PIKCjon34BNyTC5ZhOCjJKB6DhMxfi9m98Fyvffk/Y7yTBqMsvDzb/wE1KT/y/N1EzYYqAsGnpw80Xf9qPjGIrCKHijkmy17I3M4Vss/ChHCxAW46JzIYKpF4wCFlnDkTWOUORde4QpJw3SOlnoppKmnPOEOSdMxi5sytgp/rh9/kTGg8eQB9GgI7MGBRO4XrkbSbQvnUygYSTcEZfhLxcE+NPOPZ4rCD3dyVPrti2KxKH/Sjt1W7zB1s9a/62HdevXI8hUxpkw8noLxegrYMEaHdKuGZOSycL/bgrvoxHfv46vYcPI4fM0u0xZD9JbuhlZCkuIktt1Y7d+MqSlUjJzKNrZiVsi7eEyjIg3Yg5x1aQN9OiCbH6IWzVpSZyF905CYFcvwxLSHQNXdJ9XvuJx5+IVX/4hwxNWJpkSCCWjGjpdlcJoLfvwUpa268sXYmBR02EHXD6pVNX1Uy7MwsdGJafrFbySIpLZaTZw3RQLn1fdTomm9xM2ERFB80VTy9FMC1bGw+JWSnF+7RN5B5XIaOneFZkYac7ieUgQxxkOWd+lgD5gbHIuncs0u4fj5QFYxF4YDSC949GaB9NeWA80h84CvlXj4aV6yDoc+T5vRBHz6qQMOmvDydAMz0l3yRV65uFM5oHk+YSOAR8vrhup+mLmeNGblt6URlu2vgs2jk2KRwVeyKWU7IArSyaXcLTsPC132HO5dcgLa9YmmAYnP3CrxxjxfdDs5Bh+XS7qyGuavVRU3D9qvVY9c4/xY1dqi1otgC54SXZz7NEuKX3YMVftmDqKWdJq268jcCfx88qbc3kmYRDKH6oVjGQHXRNOwN0I7nSLciaU4EggWHQSDw9xJFp3QbSyLu4rfMbWMufY2ty9cRtW9U142vHDSnczMMcFyvpHnmCgHL2pVcgmJUfofXsHwtaHbxu9YniC9EGheOgZuJ03LB+M1Zv+bcuuVR5g+Uf0ZLmz8bJyOV/egeTTzhN6seF2TDRYWOqRh9/OICyBxqkxZ4bhvI2NR00QJsBgyzmQQgtGIPUe0cicP8Y0lFwFoxAaP5IpMwfEVH+Psg/f2AEsq8eAjPflrBMbEPVJw6gU1JTfcFQiOuh6d42MulH2QepmQTOY+hDv3Ug6+HQAjS5v9zuu7aVLOgmlD48DYHy1ITcsJF5czom3Xje57DiHzvVFJLt+/MsJ+cO75JM/IIfvIJRLTNpU/nF4rQjw1mj3XkHF+KIubZ2DOMeWz4EjpkETBfcda/UM6/a8q9ImdWKZFvGt6kSLy7t42EEX9/8baQXlibwSgzN8+CT6eB+20I+uahlG1v7pW2fO9mqHqyTci+HqTG53jnRJBhTDchtvOBirPn7Tkm0LUuS/6JNhzX4HmBwbiOvZP3Wf+G+77+EEY0zyOJLVdOIfImJrz5a6MqIzEW0I11zqnSQOWRywpW4cP6DWPb37WjbsUcO4eUHQQWwmO8Luvdv6v4WUvNK1D3ay70p8w8dAwXnD0PF+hm0p5tJW/oFoDM+MwiBB8cjcN9oBB6aAP8D42A/MIZAeqyoXyt/nUIgnUpgnXs5AXSurn3/JMag/X5HGlVIxxI430uP36Y3+DLpjw5CX6UP+6plW78gd/bDw2lBc3yydEMrKtbNkBul8MKhMP089Tsl/sSSmDFPDGTz/usHWEYWInM2Myi3acpG5jFItsRpNW2We7/zIqrGTxaglBpc5kGJ5eE1zX4DZyOWu5o9AuZPsFVLvz8zBSdd9WW0/+FtKfVbrF30pGLQO1SHGwMVz95r+8cO1J11XoKSMsWKpmb50cHkCyJtRLbMoFQjyT76Wsvsw+5mFF84HCncrm+nSUdgMEFrL69tVnEYd3/7eamEUDHbD5M6oCIALbmI3QLOdz3zHVSNOVq1SnN9sE7suk0xBx2yiqE/lTBDZKK9X8JY0izkWAhkZeO0a2/Asr9tl3DLwQD0yq1qMstCAnyeDsTvwzF7Kxnk1vAAUkfnoqyNAJjWpXJ968EDtF9Z0AEC5NA8spDvH4/QfWxFj6THUaSje2jg/tHyu1lXDYOZZytuG8v3yQLoYDDoy8jIIHC2znAt3f5itXOfqy/Pd0jL7Dp5VFKzlPuUtjch86h8cX8Nv5243EkDZf0552HVP94TK2RfIF7qEgnFcw1losUuHQckMCfrimtL7/rGd1E+coxUMljaVTVNd16jpep0NVgbblWJT91cZqQiwVClgLLpbT3ZRk+JFlV1q5a2yu3YKdSui6fHK9lpGTjhmq9i1Z+3CK80V6W41RxujJq7BxNXd+yNVDHw1+30GW/Y/B2k5OVK+MjSXBdqEo0R8RS4Y4/Z8PzpflRcN0VmPxZ0T5eEbkFnk8SluYa2r2vM8wt5bbPH5CPAHCC2XzgdnAigGTFekao5bjrnAlrbnepQ4vfOrfNJhAJkesqWDyUEsHrbv3FX9zdRPHQUva4T05kYHergWtIy9MBwouPf+P26uu+E
HkMPl2WaAVJm4gvI/WDrVnVL7lPTvQfctnW+3qFUnPTl67H6z+9KVckSXYXDnzPS5BKjiRKk3KnIg2m5Yea6jc8iJbdASP7tuACtRr0xm1wg3UbZrVNQtImn5EzvF4BOP28Q/A+Ohn/+MPjvH4sgAXPw/hES6nDVrx+DBNL2Q2OQyQCdY+sDzDgQF8eRBdA6TjyZrOY/fGrZ7ITzmZ5rYy1K7puMYAGd7vy6/l7YxWghU3LycWPHMwI60pG3JbnkGYM6c2QsYQvmvd1Y8PJPyboaL8C1LymVTzacrW5OMxoD54y5xPWYlEYAztTVB4qfVyYw89QXn09PPTG0FR59jvgWgyGjqkzLL1NCzv363Vjz9k4pw1sRM/VlRdJk/6R/34mj586VNbU1aJh6oIFbFigbxVIAlTd7AIo7mpG/aRpZWtNlYrRa/74DtAz3nTcFwZyAxBrd8WTx5jry+0rJKcQtG56hQ1NNPFnh8o0kAdBcjbOCAHr5jg8x7wevoHLMBAmbGJaRcLiDTNUWcHY0sOoBrT59WPuiXk9Qr6sjIQ09Yo7pNPU90esQZv05/elZOP/2u7H67X9qS1/FpN2yyVhNBNBuaaXMRPzLFoybPUdRHiRiMZTDl947ad4pg6Tlvrhzar8AdBoBdJCsYv/84QjcN06s55T7RihrOUY5aZg2b5SAeTaHOLJszRT5CQJo5uCgN+SQLLI/1XzQal5dZUcTCi4aCjvgUwkkJ/EIL2aVG9U4A22//5vE8ZZvTS4jvkInkTgBx3wMT73+B4ybdUzM0Nx9QMNQG87l3BWrN3Y2pE9RYZpGCJaTCjuYqhpV/AHaDPrG8/miFlSMxgVoGdGlxjgFyELLyC/BlQuXS/OD+/5V0pCrAZBwnNa+8Wgm/F9Nn/fyJxfDDqWLpReJTbrgHDPLkdc9NCQLlU83Sc1sRUet8HMUJWlBcxw776JhdD0sxblhaQAxjegA5Jh2+VHNs7DiD/+QjkGeqi71y8lWsDAnNHfz/e+bGN00U62tY8ros7j3uF4Lt87djB2zpRuUDF7PlDS6dmnw+4OytkbMgFd5DmZl04ee4UtcUSExactGWlEYX16+Bmvo8F2iQ1nJJYKjfONcrfOFxxbS/ZeWoK5beXJuI0toTDaqFzfK2LNDB9DDDwDQQz+5AE1aTBb0L8xP8USVEp1EGtDWiozpRXq8T2LODUWcHpTxRmu37Y42cySZQGrbwjSVBHh/24YTr/oyLL9b3WAk3LxOJDuvLBG+2c1ACOmllaiY0oAxZ3wWU6/4GhpuuAetN96Dxi/fjKMv+hKGHHsy8gYPh5OeIR1dJquhLGyzl1FdvIlS5fUscs9H4L4Xfyy1tEtkM++RygbF/aCpLXslXlJuchv9/pO/eAOlI8fr+GhMdUycmKWVYaPq5qlSzcHsZzyfrrhTD4ztbW1l+nqjUFtWtrfI2kozheVoIh9DxfZtXeJnKA5rnxPABfMeErrWRQTQ/PncaoxlSQH0Xqk9Pv6yq6U6xmf0ciDqaey8tlJbL8MO/DDokE0to7WdSmt75mdRe/lX0fy1O9FEa1t33W046sLLMHT2icgfOgpOZh4s8niCPCNRW9dq6omjp57sD9B+nwLx8LDReOil/5YJ34u37+1RhRTRXip1ZGjtFsUL/sT/vIbw8NHxE8Eu74oO8XB5W/XXp0QG/HoAnXx4o8KyjsyZhP1ZBx3uakTVow0IlKXQIvkVL0OiDD+/FwLE+T/8sRTqS1NJzCDOvpVgKWrI9u0f4vrVG5GeXyg8FIm8lNjWYwEZ9mjIkiocdTRqL7kKJz28BGes/iZO7fwBTur+IU7sfgUnbXoZp3S/hFM6n8epG5/D6cs2YubN92LQzLkI5pfS8wUlXmknGtRrGpEBuPzefLaD2jM/g3Y6UFYTcDGArSCVqRs79yZVmtVOlvisS64gcHBiJmVHeZddcFZcGD4UnzmMALpVYpU8IzKZCdEM0KWPTYO/NESfl8DKDoiLzWDoxvNVLF7Fc/PKq/DgSz8Va1+sZwagd5NvPOKBwF9b04k08j5MSx340oqfoGLD1qRMzJjI7fm8tpMuuRrHPtqG09f+F63hD2gtX5D1PWHzy5i76VWc2v0yzlr/XZy+dCNmff0+DJ51AlJzi6U93U3+yjTuuIMTFGe5inubaDjrPKz561aygvfIGC5X26V9f6+6x3sBaOmcZSbDf+zEzAsvSdg/4LbUS6KUvDRmuYtfqeMBdF8saJ5J+JZxCPvfzcMe4qhDGdfI3jgeThrXGjOvs5MghqY289HHzJV4m2T3t6iqjeVJkQ4p2s+Fv/2zTFxhcHDsxFa7S34vFjZZv6nhMkz57CU4kTbmKd0/oM36Io4nQD6ONuzcrldxcucrpC/jxM4XSV/AXALp42ljz6WNfdqab+H4OxagYmIdWd/pYl2ZhluLG9vAYqt4ta1en93pVLLSPkuewz3feg53fusHuOf5V7HgR/+DB3/xJp763TtY/vf36Xrswhra2Kt5c8u0FRW/dC1Q4bB4D/jqqo0IMq+wDtfEkuPHAjTHWjOnFKFoLW1icoWZyzmsPZ9EB27UM+JOxCYU3DQWToqFVFpbw4kCtBEzfIFj9jxaatLxJ2Ltn7eJpc/Jr1Xv7hGA7i0ZGhvikiYlHhn1+p8wvK5ZJcVsQ5cymlI2GTeMxYeFYyKttByTL/wiTly8nsD4eczZ9ENa2xdxbNfzOG7TCziWdE73izip81XSV3BC50s4gX7npM0/xKnrv41Ztz+A8JTpMEMBVVHhU9ND9h/my/ebPzKWLZSdg4vmP4Tbv/Mibv7uS7j9xZ9g/o//Hx77399i8Vt/xbI/bxd2xtXc5MJMehGej+jE+IU8cJZA+trla+CEUvYjwPeZmttaDn5brP2MxmKUrWuNM3ndA+iPPNWbb2QV24rjkvrcdtPEk6cjiaiYCQe9/e6hBGjmpZVExRlDZOOqxJWbNY9XN2zjvNvvIQBy+Q2SD3Fw8wbH6y576Enh1T0QB0OAXGTHCor7XTj2aMy46xGcQhbViV0Eut2u0kbteol+9hJt3B8qpf/n3zmBgPqELgJvAvA53S/gtK7ncObSDRh/zkXw5zH5OrnWMufNkUSWbcQ0T5j7cF6npCIlKwepObnIKChEbjiMggE1qDp6IsYfczxaL/w8zqXr89X2deRl/ET4jnkizOr3OCar279JHyVQLx42UkA4FBnhtf/hzfdGsDIVAx5rkQnRlevrULmhOS7BDoNzmSQPm+iRJ3KwxT0LRacO1mEEM8KmZ7ogpb/nrlDDDuGCuzh0tadnR2Bvk0kksbZLKESXbVU1z6sIpD4770FY/gAik4h0fNlwKyx8qppBVemo+6pk7FGYdc9jOLWLAJjWKrJ+rhIws/LP1Pqq9T5h00sC2nNpnU/hn7dtxtizL4Q/KxcppilhEzXxx1YjqXwxk24iZZaGWlsC6pTcPKQXFpGnWIqi6hoMHDcBE2bMRtP5F+Hc2+7Bdas6MP/l/8Hi379N4PxvoT1t11zjbGk/8v/eRMnAIarzVedOVGO
XmzsxZXRVCt/bQzNQubAFBQTGLqmVah7zAPogANruAdA9kiyGmYRGT3TzcDWq0HOUr2lBVl04EhtTI+XtuO8jJTcHd33je2JFcHzyI837Ywvrzb9g8OR6SfDFWo3xPj83kHCrd+nE6Tjl8TacSRtYAbLawH3Rk8nSYqt6Dlljx216Hqd2fA/nrvkGpl18GUI5eXpKi6GrKnrP/u+rRgwpkzD+2Tx5OQdZ4WoMPHo6Wj97KS595GnMJ4t7+f+9I910q/++A5NPPlVVc/RCQypzFrMdVH19GnKfaSQLmieiNCYE6PINtbLJmd8hv5ss7tUzkTG1KGHttVvVwq4+N1rc9Y3vE8DuTWrSCHeQRhKlzHfxmz+g5uipMTwuRo84LN/7pp5/KBUXtoWKSdNw+mPLcBp5PMdt+hGt2as4fWP0kO2Tdv5QDmn+m3PJU5p24RcQyM7TIBkl2rISDUj1xV9fX0wi0uBcSXo6MisqUTN1OoH25/BFWlturlr5x7fpcKPD6h87MZk8ETWizYxO0XGn3NumzBcM8fsporW9ZxqKul1gbvAAur8B2nCtAF/yqtor441D+rjK7BpQtagRwWGZ0Xpi04hrQftk7NAYLHztD3pjRgE6mQTS6h17cfWSdgKx7Eh9aq9hHrqpC0eMwimPLseJYi2R5bTxhb5vXA3QJ5EKQG9+AXO7yALf+H2cuf47OOr8S2GR9RSpDjESN07EW0PLiI5lcvS0ZJ56HtBt28IZ7E9BenEpRjTNwFlfvwcLvvM8zrzu+pi5cEbiJoyQgdJLx0izCcehyze4BEj7TuluEGAu6GoUhsLC7lpULmpCcFB6AoCOlh0ymJSPPgqLaG1XJBGu4vI0t0mJQznryDO68ulldD0z4ocyzGjCV3lrDsKjx+OUx5aL5Xzcpldw7GYG6JcJoF9MDqBJ55A3dTx/TffHeWu+iaPOuwRWWrqsUcA9DA/QHNOTJEyx6Dl6WKwkqH2xv0f/T55CZrgMY1pm4uybb8eC776I0675Cnl8VmRdI7MdpXOVw3WWolNIN1B8xSiU6oSuC9BFHkD3B0BH61UHDSjFnKaJmNN4tOjcxom96pzmSWiZNg4ZacH9srsfK90o83DcVwu7JECvr+ODpltbuv/EkqPnnoz2v70n/ApsOUkDQx+4gWN5edv/ugNTTz5d6l3NmIGb8Ybq8sGXWpSPObfeQxv4h5j97E8IYF/FqZ3P0+Z9sc8b1w17nMAbmABalMD6BAKEU1c9i5qW2eJuq/HzVlKdbb4YDmXDjBm+yZvRifKGRNrV/WnCY1w6aBhC5OYHe+GjEG/GMpB/ykCZEcmTbpgitiiuBd0gUzryOKfQOZ3uFwJoWlurwN/7+7cU0Ew95Wy0/+P9pNq6OS69aKeOPROwr/7rNkw56bReh8Aapqqg4K669NJqnHjXgziN1ogtZwbn4za9KqGLUzuTtKC7VAjk+M0KpPn+OHXNtzFgxjHkgdkqlGSokj0zye7FaEOMX5K7qnmmZ/OLgLVpI6ugBIU1g2AGFfGVaexf4ulO0eGf55xaJbmFks4GCXUUdKkYdJkH0AcP0Dy9l2/GKz53Crb/ah22/XI16Rpsp0dXd8TT19bjf7+/EIMrirX7ZESmR3ysdKOdLSi78WgYObYQ+Rg6TmnGyXzz+zj5KzdIeyxPlmCAXrn1w2gxfx/oGTm8seDHv0BudaWUOFm+nsmxWK9EQNq2Mf7sz+JssnbnbnpZNvDcbgLVzuStq576klhpJ3a/LBt7zsNLyW0dJAdTsjXv0RZ0bZG6G1HzfLgHkJvBDxpc5ueIspUd8Bm9UH+qTsmMaYUIr21B/qbahBUcJUKM1IScbq6tnYYKDl997WiyZs2EZXxmpJ7cwuk3fB3ttKaLtiVXleImDzmxyA1HudXVib1B09Bc00wJmoZJF34JZ3Y9Tx7NK7QOr9CaviwqayNr9MPkDmEOf0lSkb2kl3AK6fEPLUZmeQ0dCqaiBLUNPS8wGYC2BJxVVYgdbZ6SKedGj8SuCllxOacj+GD5elYjRekSlNGT3liCqrWKs5sJy7g1n8soyz2A/igAbWiANrWVZ8lFvubzJwNvtAG/WYK9pIjV1xbvr68vx2+ffwLDIwBtJSSNOaQAzeN2vjgSvhRTSuxMF6B10srsMdbKjy8+sQgr3gOe5kSStqB7q5FV1KO6TphLknYAX3x6KayQoxOk+wC0BmbbVFUGeTVDcNLTa3CSjjlLjLGTy+de+kgAzZv4ZPn7l0VP6ngBc2gzn0wgMe68LxBopCasYDGN2NbwaL5B1TOradiqxVzdG+wO27qKwGdEOxj92jswLCtyICaqYGEwYDBJGZGD0jbuKKyTCd2JAJoThDkyNXoaSrvo9y4dpTioE7Qem27cPRDE1U8ulYSXqgWO8mK7dKHLDjBtZCXdE5c89jQdCCn0GXs5cAxFzlM4bBxOW7IBx3crQD5t44sS1mDLmcF5rvw8mTCWUgFprt7RYTBe2/Hnfp6ALjVS+20ZyQD0/mEaXyTRr1rLTT24wme6zTKWUJ46XPLn60mHGqHM5XWn+yN1bI7UqpeQN1vAg2I7dT6BwNoD6I8A0Kbr2giQ+WUBrr34ROx5YxX2vr6MdDn20OMe+Xp/xetLCcyX4bc/eBxDK4t1VYiT8AId2hBHK/LPHaatZicytUTViPoiLbYMRMHMHNy86duqGoFnym1T9aErDjAKym0TXkKPa7f8C8declmPBFvPOmBLbnbZ4I4f42hjnd754kFYygmAWqubOJzzDIH/423IIEvL1wuxkqM5NKL8HlFryhDLOMppYPYThSaDWagqA2WLWoXAv5CAtyThrMJGssAaJbzBRDw5ZwzSde1G/DFbPmVd+7OycHv3t9D2nuZs3rpH16tHOUfieUluVyh34K2mtZ39+UvleR0jARsi3+d83ehAmHjR5Tilm669rrRx16Q/1/mELi7B/CHmPt5OaztQHZSWL2GHYSQU0SPxH+MZmT01HoAn2+OQUp2J8BJaW2lAqlV7u5NDHPUeQH8UgLYSAfSbqwiUl/fQvQnUBeghDNDGYQToja3IOmGAem1TdfO5AOTGTS1tLWYWlWDBiz+V+t4VW/ZGGMvae+m0UkQ7eyJTkdv+tBVjmmf2TJT2+LzKpecGjWBePo6f/wRO7f5hvwN01KImgN7MccsXcdb672Bw63EJAdow9wlbuGoa+21cn9l/FJocq/UXBVHyKK1754EBmpOEvNGLu5qQeUyFGADxJ3wYkeqTrLIKPPTyT6X1PsK/odeW66ATEdy7ZXhCqfqnLRjR1CTPmxig1SGWUhzG3AcWStJ37iE4gCMA3c2x7Rdx7obvYmjr8crr0YZArwAdA9KRUtg4dc0Hvb58OIZTUPJ4E8KddTIzsrSD2/nryIL2APrjA+jf9Hw8UgC6fEMrMltLNUD7hanOMmJI+X1q0ge744UDBuKJn/5KRt4zB8EyXft6oAoOBuilklQEFv7yLRQPHRYXnFVHm6p84NcvHkMu8IpNh2zzurFoTixxLfXpZG3VX36tlFIlTA
[Figure: data-parallel training — each GPU node ("GPU节点0") holds a model replica ("模型副本"), draws its own random mini-batch ("随机小批量"), and computes local gradients ("本地梯度").]
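These labels describe the standard data-parallel training loop: every node keeps a full copy of the model, works on a different randomly sampled mini-batch, and produces local gradients that are averaged into one shared update. The snippet below is a minimal single-process sketch of that flow under assumptions of my own (a toy `nn.Linear` model, random data, and `world_size = 4` simulated nodes); real LLM training would use `torch.nn.parallel.DistributedDataParallel` or a comparable framework rather than this manual loop.

```python
# Minimal single-process sketch of the data-parallel flow in the figure above.
# The model, data and world_size are toy assumptions, not values from the post.
import copy
import torch
import torch.nn as nn

world_size = 4                        # number of simulated GPU nodes (assumption)
global_model = nn.Linear(16, 1)       # toy stand-in for the LLM (assumption)
optimizer = torch.optim.SGD(global_model.parameters(), lr=1e-2)

for step in range(10):
    # every node gets an identical model replica (模型副本)
    replicas = [copy.deepcopy(global_model) for _ in range(world_size)]
    local_grads = []
    for replica in replicas:
        replica.zero_grad()           # start each node from clean gradients
        x = torch.randn(8, 16)        # this node's random mini-batch (随机小批量)
        y = torch.randn(8, 1)
        loss = nn.functional.mse_loss(replica(x), y)
        loss.backward()               # local gradients (本地梯度) on this replica
        local_grads.append([p.grad.clone() for p in replica.parameters()])

    # "all-reduce": average the local gradients, then apply one shared update
    optimizer.zero_grad()
    for i, p in enumerate(global_model.parameters()):
        p.grad = torch.stack([g[i] for g in local_grads]).mean(dim=0)
    optimizer.step()
```

Averaging (rather than summing) the local gradients keeps the update magnitude comparable to single-node training with the same per-node batch size, which is also the default behaviour of all-reduce based frameworks such as PyTorch DDP.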
w/arlWN2m1aO9reZAIPAzrrAMpvWshOEu1Bgg3QKAiLSfk5kCIE8FAoHbAIzxux0AhiulZvvdiJ1paGiQRMQ2iGxxy2bcwtfFnRNSSmHOZyuwaMsmtkwi+kksFtMeJJXplFJa8yTX1nfDdc3vYrq5rsK6mN7jF9d163TbYQqAPBSJRA4DcJHf7djGRZFIZIrfjdgZ27ZvI6L3ufJeb27CjatWoMv0BKBHubj5s4/xxkbtx5nb+rdt2w9wBmYq3bvA1nYXn9aaXoB0+/izLrR16H3+LcvSHtxqCoA8U11dXaSUehg8730SQB1DTkAp9Rvw7T7ILZlKpc6F5hLB23qzuQk/XrkEbTzz3LNSWyqFK1cuwasb13PGtqZSqXOQJ0vkSiltAFpTTN75sJ2pNUZ/LfyoQzdis23btm6IKQDyTE9Pz/UA9maKu4uIzmPKmhYOh/+bKYtdPB7/GJqbBG3vo5bNuOTjj/JydkBzTzd+tHwRFm/ZzJpLRFc1NjZqDYzLMi6A93QC3l5kCoB0e+cj7dd8GcxeAMbuiEQi+yulrmKK+8yyrNm2bb8K4EmOQCK6Y8SIERm7hZ6U8i4Ab3Nmrm5rxYXLFmFZHq0TsLRlM85f+gHWtLOMrdzWPNu2/8AdmgXe0Tl40bJOJFNmHEC6JFPAB0u0ewC0ir6tTAGQP4JKqQcBFDBkKdd1z4/FYh0AQERXQnNBkj7hwsJC1rtsZinXdc8CwwIc21rf3YVLV3yEv8j6nF42WAF41F6Hyz5ezDbVbxv1AM4Gw11RtnFdd6HO8Ym2FBYtY3u6ZezCe4vbtZ//K6X+zdEWUwDkCSHEFQBqOLKUUvfH4/HXt/63bdsxAFxTri4Nh8MTmbLYxePxOsuyvgmAdQWVlFK4b90aXLNySU5uILSppxs/XrkUD9TXejEDolMpdbqUkrUwyxbBYPBdaBY+z87LxY0SM9Nz87Vfa+W6rikAjP6JRqN7ApjNFBcrLi7+8fZ/OXTo0LsArGLILyAiP5Ym7rdYLPYagMu8yF64uRlnL16If8RtuDkwVdBVCs81Snx78btca/vvyA8dx/nAq/BM1zfd8WOdjLmvt6KzK/t/3zJdR6eLeW9od5Yu69u2XJspAHIfKaUeAFDMlHdxbW3tFx5Yr1ixolspdSnTOaZHIpFvMGV5Qkr5eyK634vsRDKJO9euwkXLF+HT1oQXp0iLla0tuGj5h7ir9lOuXf125NdSyoe9Cs8iz+scnGhLYcFbbIswGTsx/81Wjm2A53K0BTAFQM6LRCIXKKWOZop7Qkq50y8ax3HmAniO40RKqTuFECUcWV4ZMmTIj8CwbfDOfNKawEXLF+Hnq1eiviN7Rmqv62jDz1evxMXLP8QnrZ52LT8hpbzcyxNkC8uyntXNeHqueQzgtWcYXmPLsl5kaAqAzJ13bTCoqqoSqVRqBYAhDHEbU6nUPrvqehJCjAawEoD2xVspdYvjONfr5nhJCFGilPonER3j5XksIhwyZBjOjVZj77JyL081YLXtbXhC1uOlDY3pWOnw5ZKSkpNXr16df3Mod4yEEOsAVA04gIAXHtoDE8YNYmyWsdUna7pw6vfWQfOjsUFKGUbvGizaAhwhRmYqKyt7HMB+HFlKqQvi8fgupxslEoktoVCoEMAxuuckokMGDx78REtLS8Yu6ZpIJHqKioqeDgQCxwKIenUeBSDW2YF/rnewvLUFFhEiRSUIkr81fKebwqsbmvD7dWtw37rVWN3emo5h+AtTqdRJDQ0N2nOpckkoFBoH4GCdjC0JFycek7EzcbPa7LvXY3Wd9qqLjyUSCZZeVsD0AOQsIcS3ADzOkaWU+pfjOCf39+ej0Wix67orwLPXwD+llKcw5Hiqurp6SHd390tgmmnRH6XBII4aNhIzR1Ziv9BgBNJUDKSUwuKWzZjfFMcbzRvQnt4dDt/q7Ow8ubm52fRXbycSiRyvlHpJJyMQIMz90x4YU1XI1SwDwOp13Tjp3Drorv5NRNNt236Zp1WmAMhJQogR6B0VPJIhLgFgkpRyt/YdF0J8BYD2c8k+p33Z2INMEY1Ghyml5iqlDkr3uYsDAUwJDcaBQ4bhgPIhGF9SBoupIHCVwur2Vny4ZTM+atmEJS2b0ZHyZaXd+QC+JqXMngER6WUJIVZDs/A+fVY5bru2kqlJBgBc+bM4ntWf/tcgpRwDxmWuTQGQg4QQjwI4iynuh1LK3w+wHS8CmMXQhjWFhYWT6+rqMn61kr6Bi48B+Kqf7SgOBFBVVIKq4hKMLi5BVVEJRg0ahJJAAMVWAKFgAYoDvU8AO1IpJJI96HBTaEum0NTdhYbOdtR3tKGhowMNne1+XfC39WxJScmZ5pn/lxNC/BjAL3QyAgHC0/dVYdJeRUytym9LV3biGxfXa9/9A5gjpZyt36L/zxQAOSYajZ7kuu4/meLelFIeg971xndbJBLZSym1DIB2fyIR3WDb9s26OWlihcPhXxKRGaHOgIjutm37KjANfMpllZWVIy3LagCgNZJvysQi/O33VbAsc4nQ4boKp1/UgGWfaN+7pACMlVLWMzTrP8wgwBwybNiw8kAg8C8AgxniupRSp7S2tg54n9ZEIrExFAqVAjiSoT2HlpaWPtba2sq7e4w3VGtr67yysjJJRCfCTLcdqC4AF0gpf4EBFqH5prW1tb28vHwCNAf/Nm5IomJEEJMnmF4AHY8/twVPvqC/z4dS6mnHcR5kaNLnmC+mHFJUVHQrNKYBbUspNcdxnJW6OT09PT8DEGNoUnEgELiTISdtHMd5AMBJAOJ+tyULOUR0jFnkZ/cppe7lyLnj/o1o3uz7o5+stXFTEnc+yLM6tWVZnnz3mQIgR4TD4WkALmSK+8hxnDs4gpqamloBXMmRpZT6uhBiBkdWukgpF6RSqSkAuB7L5IMXUqnU/rZta21yk6+klG8D0B4pviWRwtW3xnXnrecl11W46tZGtCRYOq7e9OqzYAqAHDB+/PhBRPQH8LyfSaXU+WB83iqlfBJ8K+b9bvz48Vm1UkljY+N6KeWpSqkLAJgR7DvXoZS6TEp5Gtda53mMZVfN1xa24cEnN3FE5ZX7HmvGG+/ybHdNRHNYgnbAFAA5oL29fTYArh30fuk4ziKmrP9wXfcS8Oygt2d7e/slDDnpphzHud+yrEMBsL++OWAhgAMcx7kHebilLzcp5dtKqX9xZN15fxM+XG7WXOqv95d24Nd/5Fm7jIhe55z3/4V8r4KN9BBCHADgPQBBhrhVlmXtH4vFPPm0h8Phu5hGxicCgcDeDQ0NkiHLD1Y4HP4+Ef0MwAi/G+OzJgDX9T3rNwP9GIXD4Roieh8M3/PhUUE8fd9ojBzO8TWTu9ZvTOJrP6hH4waWDlQFYJqUkmXr3x0xPQDZLQjgIfBc/F2l1PleXfwBoKurazZ4BsSFUqkUyxgFn7h9vQETAPwejAt7ZJEUgN8Hg8EJUsoHYS7+7Pp68v7OkrU+ifOutpFoy8df1f5pbXPx3atsros/APzNy4s/YHoAdoQqKirGBIPBiUqpsUqpaiKqIq
JRAIYrpYYDKELv3PbSvmNa0PuF1klEG5VSjei90MWVUp9ZlvWZUuqz3V1Nb1eEENcAuJUp7l4p5cVMWTsViUTOUUr9iSFKKaWOcRznDYYsX0Uikf2VUjcDOBm5/5l0lVJPu647p7Gxcbnfjcl1kUgkqpT6GADLAv+HHlCMh++IorAg139Nd09Xt8J5V9l4dzHbEJ8OpdREx3HWcQXuSN6/i1VVVcJ13SMAHNG3hOu+YPqw7EAzgPeVUh8Q0budnZ2vD3RN875FdpagtxjR1dDZ2Tk5TeurkxDiTQBHMGQtk1IeiBxZICYSiUxRSl0D4JvIvTU6FBE9TURzYrHYMr8bk0+EEJcAuIcrb9bRZbjnxjACgby/fAAAUimFS2bHMe+NBFsmEV1v2/YtbIE7O4/XJ8g0kyZNKty0adPRSqkTiegkABN8bE4SwPsAXnJd9/l4PP4B+jcAyhJCvAZgGkcjiOgU27bTNk2tb9zC++C5yF0qpfw1Q07GiEaje7qu+2MAZ0NzRbcMkFBKPea67u/MHb9vApFI5B3OPSpmHR3CXddX5n1PQFe3whU3s1/8lw8ZMqRmxYoV2lsH7vJcXp8gQwTD4fB0IjoDvWu0D/G7QTuxTin1VCAQeDIWi723sx+KRCIXKaUGtD7/DjwupeTaN6DfhBC/B3ARQ9TmVCo1IRenjUWj0WGu635LKXUOEWlt8+qDZUR0b1dX16MbNmzg+3Y0BoR5sDCA3scB9/5MIFSaa51V/dPa5uKC6yRntz8ApIjoyHStgZHTBUBFRcUYy7K+R0TfBSD8bs9uWqaUesiyrEdt29649S+FEFUAlgMoZzhHE4B9pJQ8y1Xthr6L26fgGQX/RynleQw5GSscDk8kou+gd5OnqN/t2YlPlFJPWZb1d9u2F/vdGOPzhBA3ApjNmTlx/CA8fHsk72YHNDWncO6VMXy6hndvqnTveZKTBUA4HJ5GRP8L4FRk/0yHDiJ6RCl1l5TyUyHEC+gdLMbhLCnl40xZuy0cDp9PRPczRCkiOjxPVo6jaDQ6WSk1Qyk1A8BRAEp8aksHgIVE9GoymXzGdPFnPEsIsQDAcZyh4VFB3H1DGDX7FnPGZqz3l3bgsjkO52j/rRZIKWchjTNicqoAiEQiJ7uue0MWdpf2h4veLrxDmfJekFKeypQ1UJYQ4l0AUxmyFkkpD0aeTScbP378oI6OjsOVUtMATEbvINbxYOzq7eMCWAtghVLqXSJ6fejQoe+n4zmlwWfkyJGVBQUFHwGo5MwNBghXnD8c5585DJRTV5X/T6neFf7ueXgjkin2taqcZDJ5wPr16xu5g79MTrxVkUjkWKXUzwAc5ndbskQLgMnc0xIHIhqNHuK67ttg6KlRSl3oOM4fGJqV1caPHz+ovb19H6XUJCJ6RCdLKXU2Ea20LGull2tEGOkTiUSOV0rNhwe9o0cfWoo7rq3EsCG5NS5g46Ykrrq1kW153+2kiGiGbduvehH+ZbK6AKiqqhKpVOpOAGf63ZZsQkQX2bZ9n9/t2EoI8RAAjmf4Gy3L2isWi/Gsw5kDhBBatypSyqz+jjC+qKKiojQQCHwEYE8v8geHArjyB8NxximDYVnZ/evjugpPPN+CXz7QxLWxzxeka8rfjmTr8/FgOBy+NJVKrYS5+O+WvrWlM+ouOZVKXQtgM0PUcNd1ffkgGUY2mDRpUmEgEPg7PLr4A727CF5/53p87YIGLFnZ6dVpPLdiVSf+64cNuOEutl39voCIHrFt+2eehPfn/H6deKCEEIejd/nUKX63JQt1EGUBi54AACAASURBVNH+tm2v8rsh2xNC/AgAx3z+FICDpZQfMmRlPdMDYGzVtwbKUwBOSdc5AwHCadNDuPDsYRg3ujBdp9Wyel03/vBYM55b0ALX2xFF/5BSng4fFzLLpg93QAhxPYCfIvdWSUuXH0spb/e7ETsREEIsAk9h946U8giYXeVMAWBsFRBCPAqfekwtCzj6kFJccu5w7Ls3x+Kl/FbVduOBJ5rxj5cSSPEP8tvewlQqNb2xsdGTQQX9lRUf7qqqKpFMJh8jomP8bksW+1BKeQgyeNncvumbr4Ph95KIzrVtm2PPgaxmCgADvbtP/pGIzvG7IUTAkVNL8bVZIZwwLYSiQf7+enV0upj/ZiuemduCfy9qh0rDLQMRLQ8EAkfV19dv8v5su2iL3w3YFSHECQAeATDK77ZksSQRHZQNi7P03aVwrEzYWFRUNKG2tnYLQ1bWMgVA3qNwOPx7IrrQ74ZsL1QawMyjSvHVmYMxdb8iBNO0t0AyBby3uB3Pzm/B/Dda0dqevpnDRLTcsqyZmbKVeSZ/uEkIMQfAT5C9gxUzAhHdatv2dX63oz/6ZnZ8AoYNmYjobtu2L2doVtYyBUB+E0L8EsD/+t2OXSkttjB1SjEOP7AEhx5QjInjB7HNIHBdhZWru/DOhx1456N2fLCkA20dviwX8lYwGDwtE+78t8rID3dNTU1BPB5/SCn1bb/bkguI6JHKysrvLVq0qMfvtvSHEOJKAHcwRCVTqdQB+bxCnSkA8lffDdQNfrdjIEqLLVRXFWJsVQHGjC7E2KpChEcFUVpioaSYMDgUQElx731he4eLLYkU2jsUWttcxJuSqG3oRm19N9Y29KCuoduvC/62/mFZ1pmZtpZGxn24hRAlSqm/9e3U55tQaQDVVQUYW1WIsXsUYky0AOFRBSgttlBUBAwJBVFc3PvydXQobE4k0dGh0N6p4KzvQW1DD2rXdWNtrBt1DT1ItKX8/OcAwILu7u7Ts2FjlpqamgLHcZYAmKibpZR6zXGcYxmalZVMAZCfhBBXA7jN73YYAICHpJQXIgPHX2XUh7tvg5gX4MOKfqHSAKbuV4TDa0px2IHF2GtMIWsX1Ke13Xjnw3a8vagdHyztSOtzp218SEQnbLu5UKYSQkwHsIAp7ltSyieYsrKKKQDyjxDifwD8xu92GEgR0ey+ef4ZOSMpYz7c0Wg04rruPACT0nXOslILJx0bwldPCOHAycVpHISisGhZJ56d14K5r7emu3dgCYDpfuwAuLvC4fDfiOgbDFGxnp6eiU1NTa0MWVnFFAD5RQjxXQAPIYO+2/OUQ0Rn+bG87+7IiF+Svjv/N5CGiz8RMO3gUnx9ZjmmH1nm+zSUzi6FBW+14pl5LXjzvba0TEMBsMx13ePj8XhTWs42QEKI0QBWgme3u9uklNcw5GQVUwDkDyHEmQAeBf86Kf8G0ApgJnNurnopmUyene6NfQbC9w93NBotdl13PoAjvTzP1oUoLv3ucEyekKkLUXThgSc24fmXEl7sNrW9D7q7u4/L9DEBkUjkp0opjv2xuwHsJ6X8lCEra5gCID8IIU4D8HcABczRi4PB4HH19fWb+7bvvhNAGfM5ckUKwC1Sypv7/nfG8/XDXVNTUyClfNbLAX+BAOHrM8txwVnDUB3l/mx4Y21DN+57tBnPLvB8RaoFQ4cOPSWTt3Tt29luOXq3uNU1r2+/7bxhCoDc17dWyj8ADGKOXkZEx247ZqiqqmpcKpX6E
4AjmM+V1YhoOYDzbdte6HdbdoefH26KRCJ/8nKq3wGTinDTFRWYOJ77c5EeK1Z14oa71nu6oUbfZhS+rxD2ZSKRyMlKqRc4spRSX3cc5xmOrGxgCoDcFg6HjyKiF8HzmGxbq5LJ5FE76cYOCCEuAHATgOHM5802HUR065AhQ27L5BupnfHtwy2EuAnA9V5kDykP4OoLRuAbJ5XnxHaUf31hC+64fyO2JDzrVbpaSskx794zQojnwbOJSZ1lWftk2nxcr5gCIHdFo9GDXdddAKCcOboOwFFSyoYv+6HRo0cPTSaTswFcDCDI3IZs8Fel1NWO46zzuyED5cuHu6/L6kV4sMLfcYeX4rZrKjF0cG7tF9S8OYWrb43jtYWe7B2Rcl331Hg8/qIX4Rz6uh6XA+AYwHGTlPJGhpyMZwqA3BSJRKYopV4BMIw52k6lUkc1NjbW9veAysrKfSzLuh3AycxtyUhE9LpS6idSyn/73RZdaf9w9y31+hGY1/YPBoCrLhiJ8/5rKChHv7KUAh58chPuvH+DF4MEmwHsv6uq30/hcPhmIvopQ1RnKpWatDtfctnKFAC5RwixN4DXwb8/ynoAR0spPxnIwUKIA5RS1xHR15Gby7e/SURzbNt+2e+GcEn3mxRIJpOPgfkXN1JZgCd+OxrfOyN3L/5A7xTG888cisd/HUV4FHuP2zAAjyGDu/KI6FYAHN1tRYFA4FcMOYaRVhUVFWMBvAT+i/8mIjphoBd/AJBSfuQ4zjfRO537TwC62FrnH5eIniKiw6SUR+XSxR9Icw+AF8/9J+1VhIdvFxg+NGOvW55o2pjEeVfbWLma/TOW0d3j4XD4dCL6O0eW67onZfJjDw6mByB3RCKRqFLqDQBjmKMTlmVNj8Vi73GGRiKR4a7rnk1E5wHYjzM7DRoAPAzgYSllvd+N8UraPtxCiMMBvAnGXofDDizBvT8TKCvJxd6mXUu0pXDRTyQWfsQ6ni0J4BAp5YecoZyEEPMBzGCI+qykpGTf1av5q6hMYQqA3DBq1KiKYDD4OoAJzNHtSqkTHcd5gzn3c8Lh8FQiOhfAVwBEvTyXho0AniaiJ23bfg1ZMpdfR7o+3EEhxAcApnAFzjq6DHddH0ZhQX5/P3X3KFxxcxxzX2ddz2dJOBw+KFN3D+x7BroEQKFullLqOsdxbtVvVWYyBUD2i0Qiw5VSrwLYlzm6C8BpUsr5zLlfhoQQ+xPRSUqpUwAcDH/HCywFMJeI5tq2/SYycMMeL6Xlwx0Ohy8jIrZnrrOOLsM9N4YRSNPa/ZkulVK4dA5vEUBEP+3bxCIjCSFuB3AVQ1QbgImZPPhRhykAstvYsWMHd3Z2vgRgKnN0D4BvSimfY87dLdFodFgymTzEsqyD0FsMHAT+8Q1bbQawTCn1PhG9mUql3m5sbFzv0bmygucf7r5R/yvBNFf1sANL8NDtkby/899ed4/Cd6+M4d3FbI8D2gHsnakXxhEjRoQKCws/ASAY4v4qpTyDISfjmAIge1VUVJQGAoG54F8mPQXg7EzdIVMIMdqyrLFKqWqlVDWAaiKqVkqVARiC3hUPSwGE0DtoOYHeO/d29HbjbwTQCKCBiNa6rltLRCtz+Vn+QHn+4RZCPAGA5ct1wrhBeOLXVQiV5ecz/11JtKVw9mU2VqxiWznwCSnlt7jCuIXD4bOI6FGOLKXULMdx5nFkZRJTAGSnSZMmFW7evPlZpdSJzNFKKXWh4zj3M+caWcjTK2kkEjkWTBf/SGUB/nxnxFz8v0SoNIAHfyE4pwie0Td4MyM5jvM4AJbBS0R0V01NTXZsFmHktJqamoJNmzY95cHFHwAuMRd/YytPr6ZKqZ9z5AQDhF9dX5l3U/0GYuTwIH57k0CQZyFEAvALliRvKNd1LwbPwJ194vH4jxhyDENHwHGcP4Nn2evtXSul/K0HuUaW8qwAiEQiJwM4lCPrqgtG4MDJxRxReWHKxCJc/v0RXHHT+npyMlI8Hl8B4F6OLKXU7KqqKo4xBYYxECSEuA/AmR5k3ySlzORi3vCBZwWA67o3cOQcd3gpzvuvoRxReeUH3xqGow8tZclSSs1hCfJIYWHhDehdxlRXyHVd8yVp+IGEEL8D8H3uYKXUXZm8uJfhH08KgHA4PI2IDtbNGVIewG3XVOb08r5eIQLuuLYSg0MszwKmRaPRQziCvFBXV7cZwDUcWUqps4UQZq9zI636prVexJ2rlLrPcZwruXON3OBJAUBELL9wV18wIud29UunYUMCuPIHPI8CXNe9nCXII1LK/wOwkCGK0PtIwQw4MdJCCDEHAPtFmogecRznhwDYdw4zcgN7AVBRUTEGDANYDphUhG+cxL3Ndf4545Ry7Lc3xw66OF0IMZojyCNKKfU/AFyGrH2FEBcw5BjGlxJCXAWA5XHptpRSf7dt+7vg+TwYOYq9ALAs63u6uYEAYc7lFbAs0/evy7IIN10ximPVxCCA7zI0yTOO4ywiogeZ4m6urKwcyZRlGF8ghLgYwO0eRM8vLS09G3mwlr2hh7sACBKR9kXi6zPLsc+egzjaYwCYPKEIp00PcUSdi8zf5/s6AM0MOUMty2KZxmoY24tEIucC8GJK3suFhYVfyeUNrgw+rF/mlZWVM6C5NGsgQPjBf5tR/9wuPHsYLP13uzoSiRyj3xrv2La9EXxbTp+XyYMfjewUDoe/rpR6APwrsb7T09Pz1bq6OralQI3cxloAWJalverfSceUYUyV9iZvxnbGjS7EzKPKtHOUUv/N0BxPSSnvI6L3GaIs13V/h8zv9TCyRDgcnkVEj4N/kOniYDB4clNTUytzrpHD2L7YJk2aVAjgNJ0MIuCis4cztcjY3g/PGc4xpfIryPwR8i6AS8Ez+rlGCJHRYx+M7CCEmE5Ez6B3Mxs2RLSciKbX19dv4sw1ch9bAbBp06ajAWj13U87uBR7jTV3/17Ze9wgHDlVe3GgEZWVlRk/T9627XcA/Jkp7tbRo0eb51LGgIXD4WkAngPAMiVnG6u6u7tn9D36MozdwlYAcGxccfosM+3Pa1+bpT8Y0LKsUxma4rlkMvljAFsYokYmk8mbGHKMPFRZWXkQEb0AoIQ5ug7A9KampjhzrpEn2AoAIjpJ5/hQaQDHH6H/jNr4cidMC6GsVPttP46jLV5bv359IxFxLWN8USQSmcKUZeSJaDS6r2VZLwLgvruxA4HAdCllA3OukUdYCoC+DVQm6GScdFwZigaZef9eKxpEmKU/GHCKEIJttyEv2bb9GwBLGaICSqnfgX/ktpGjotHonq7rzgfAPbCpyXXdExoaGtYw5xp5hqUAcF1X+5nwV2awzFM3+uGrMwfrRlhKqaM52pIGScuyLmXKOiIcDp/FlGXksIqKirGu674CoJI5ehMRzYjH4x8z5xp5iOsRgFYBUFZime1+02jqfkUoK9F764koa+bHx2Kx1wA8yZFFRL8cO3asdgVl5K5oNBoJBoMLAESZo9sAnGbb9hLmXCNPsRQASqmDdI4/aEoxgvpL1Rr9FAwQavbTLrhqONqSLn0b
VHHMka7o7OzkWmjIyDGjRo2qcF33ZaXUWObodqXUSVLKt5hzjTzGUQAQgMk6AYfX8Oxbb/TfYQdoD0iuQRY9D7dtOwbgZ0xxl0aj0X2ZsowcUV1dPSQYDL4IzfFQO9BtWdY3Hcd5gznXyHPaBUDf7n9aI1wPPcB0/6fbYQdqv+aDw+FwJu8O+AVDhw69C8Aqhqig67q/YsgxcsSwYcPKe3p65gM4gDm6B8B/xWKxfzHnGoZ+ARAMBvfROT5UGsAEs/hP2k0cPwilxdrjAPZkak5arFixolspdQlT3PGRSOSbTFlGFquoqCgtKir6p+6j0B1IAThHSvkcc65hAGAoAJRSY3SOHzO6wGz76wPLIlRr7rlAROOZmpM2juPMA/AsR5ZS6s6Kigrz/CqPTZo0qTAYDP4NwJHM0UopdbGU8gnmXMP4D44CoFrn+DFRc/fvl7FVBVrHK6WyrgAAANd1LwfQwRBVZVnWtQw5Rhaqqakp2LRp0985VkHdgUscx7nfg1zD+A+OQYBaz4HHjDYFgF8YXnutrZ/9Eo/H6wDcxpFFRFdGo9GsLIQMLQHHcR4F4MWy2NdIKX/rQa5hfI52AWBZ1kid48eZAsA3Y/W3XR7F0Q4/WJZ1OxHVMkQN6tsy2MgfJIS4D8B/eZB9s5SSpTg1jF3h6AHQWuayYkSAoQnGQIRHae/qm7UFQCwW61BKXcEUd4IQIis2SDK0kRDitwC+zx2slLpLSnkDd65h7AzHGACtAqCsxBQAftFdDRD8a5ynlZTyOSJ6kSOLiO6urq7m3urVyDBCiNsBXMydq5T6g+M4V3LnGsaX4egB0FpRprTEzADwS4l+ATCIox1+IqJLAHTp5iilxvb09FzF0CQjQwkhZgNgv0gT0SOO41wMQHFnG8aX4SgAtB4kM2xNawwQQw9A1t/xxmKx1UR0F0eWUuraysrKao4sI7OEw+HLANzoQfQztm2fB8D1INswvpTvBUCJ5mI0xsCVmh4AAIBS6hYA6xiiigOBwC8ZcowMIoT4HleRuJ35JSUl3wKQ9CDbMHbJXH0NHTlx1yKlbAfwY44spdTplZWVXswLN3wQiUTOAXA/+Pe9eMWyrK+uXr1a+/GTYQwURwHQrXNwe0dOXEOyUlu79mvPsbteRpBSPgngFY4sy7LuGT9+fE70juSzcDj8daXUQ+C/UVrY09PzlVgsxrEYlWEMmO8FQGubKQD80qpfALRxtCNTuK77I/RuvqJrz/b29ksZcgyfhMPhmUT0OADtubLbWRwMBk9qamrKmeLZyF4cBYDWRaCt3Qx89Uu7KQA+Jx6Pf6yU+g1T3PXRaDTClGWkkRBiOhE9C+YxLkS0nIim19fXb+LMNYyB0i4AiKhZ5/jW9pRuE4wBSuj3vuTcXUxXV9ccAHGGqDLXde9gyDHSSAhxJHo3i+Ke4bKqu7t7hm3bG5lzDWPAtAsA13U36BwfbzIDYP3C8Nq3cLQjkzQ3N7copa5mijuzsrLyaKYsw2OVlZUHAfgnAO4dHusATG9qauIoLA2DDUcPQJPO8bUNHI9cjYGobdAavgHwTJ3LOH2bvPybIYosy/oN+J8jG8yi0eh+lmXNBVDOHC0DgcB0KWUDc65haOMYA1Cvc/Daeu2LkDFAtZqvPdNmOplIWZZ1EXjmZ+8rhGBfOtbgE41G93Rddy6AYczRTa7rzmhoaFjDnGsYLDgKAK27QIa7UGOAdAsApVSuFgCIxWLL0Dv/m8PNe+yxR5gpy2AkhBjtuu4CANzvzxal1InxePxj5lzDYMPxCEDrIrC2vhuua2YCpJvrKqyL6T1+yeEeAACAZVnXA9Aa49KnvKen5+cMOQajSCQSJaJXAezBHJ2wLGum4ziLmHMNg5V2AZBMJrUq3NZ2F5/Wml6AdFu5ugttmoswJZPJnC4AYrFYs1LqOqa470QikcOYsgxNFRUVo5RSLymlxjJHt7uue2osFnuXOdcw2GkXAI2NjXXQHA3+zoftus0wdtPbH2ovQiYbGxvXc7QlkzmO8xAAji9zUkr9FoDZ/9pn1dXVQwKBwIsAJjBHd1uW9c14PP46c65heIJjDIACsEwn4O1FpgBIt4Uf6b3mRLSQqSmZzrUs6xLw7HtwYDgc/j5DjjFAw4YNK+/p6ZkP4EDm6CSAM2Kx2L+Ycw3DMyxrXBPR+zrHf7C0A0mzHlDaJFMKi5bq9QAopd5hak7Gi8Vi7wH4I0cWEf1cCDGCI8vYPUKIkuLi4n8opQ5ijnYBfEdK+SxzrmF4imuTC605063tLhYtM/tipMt7izs49gHIlx4AAIDrutcC4FjCdZhS6maGHGM39G3O9KxSinthJgXgB1LKx5lzDcNzLAVAMBjUXjTl2Xk5t6hcxnpuvvZr3WNZVl6NcI7H400AbuDIIqLzhRDcXdDGTtTU1BS0t7f/FcAMD+IvlVI+5EGuYXiOpQBYt26dA2ClTsbc11vR2WWmA3qto9PFvDe0l/BfmI9bmUop7wWwhCEqAOC34N9jHkDv3W4kEtlfCPHfulmRSOSccDg8taKignt53HQJOI7zCIDTPMi+VkrJtXmUYaQd2xKlSqkXiWjiQI9PtKWw4K1WnHp8iKtJxg7Mf7OVo/v/HxxtyUIppdSPiOh16F+8D4tEIt+xbfv/dEImTZpUuHHjxsMsyzpGKTWZiPZtb28fB6bPtlLqT0SEQCCghBB1SqmVRPSuUuq1QYMGvVdXV9fJcR6PkBDiQQBncAcrpW5xHOcX3LmGkU5sdyBCiBkA5utkTDu4FH+8w+yg6qVz/zeGtz7QmwFgWdZesVjsM6YmZZ1IJPKIUupshqjGoqKiCbW1tVt24xgSQuwP4HgA0wFMA1DC0JaB6ATwHoBXiOg527YX+9SOHSEhxG8A/JA7WCn1K8dxruDONYx0YysAampqChzHiUNjPW0i4IWH9sCEcazbcBt9PlnThVO/tw5K70nLx1LKSUxNykqjRo2qCAaDnwIYrJtFRHfbtn35rn4uHA5PJKIzAJwNYJzueT1Sh97eob9JKf+N3gFyvhBC3ArgGg+i/yil/B58/LcZBhe2RUkcx3FDodDeAA7QydmScHHiMeYxgBdm370eq+u0V118KJFIvMzRnmzV1tbWVl5e3gPgBIa4qSUlJc+0tbV9YVGlioqKUUOGDPlBKBT6HRHdAuAY8G9Yw2kIgEMAnBcKhc4IhUKBkSNHfrJp06audDZCCHEjgJ9y5xLRI1LK82Au/kaO4JoGCABQSj2pmzH39VasNRsEsVu9rhvz39Qe/AcAh4TD4RqOoGxWWVn5a2gOfO0T7Nsy+D+9cdFodHw4HL43EAisU0rdBf5Fa9JhbwD3dHZ22pFI5A/RaHS/dJxUCHElgNncuUT0lG3b54FnQSjDyAisy5K2trauC4VC34PGntpKAe3tLmZMK2NsmfHz3zVh5WqWG7ExRHR+KBSqKSsrW9Xa2upwhGYbx3Hc8vLyTwCco5tFRNWhUOiTsrKyovLy8ruVUr8jooPBOEjXR4UAapRSF4ZCof3Ly8s/SSQSjV6cKBKJXATgbjD
PrlBK/Wvo0KH/1dTUpLd7lmFkGPZpSOFw+GYi0up+CwQIT99XhUl7FXE1K68tXdmJb1xcD5f/3kUBeEEpNSdfdz4Lh8N/I6JvMER1AsiHX3hFRM8Q0ZxYLLaUKzQSiXxHKfUwmHs1AbxiWdYp+Tjt1ch97AVAZWVltWVZa6D5QZwysQh/+30VLMuTqdJ5w3UVTr+oAcs+8XS2lgLwPIA5UsoPvTxRphFCjAbwMYBsnSfvF5eIHgRwnW3bG3WCIpHIN5VSfwH/Rktv9/T0zGxqamJ5dmYYmYZ9Z7LW1tbNoVDoAPQ+Axywxg1JVIwIYvKEfLgp8s7jz23Bky/sziyzASH07qx2QSgUOrK0tPST1tZW6fVJM0EikdhSXl5OAI7zuy1ZhgDUADi/rKyss7W19QMMYHBdOByeCeCvAAqY27ckGAzOdBzHLFFq5CxPbq+FEEcCeFM3Z3AogAWPVmPYELOD6kBs3JTEjG/XoSWR9nFLedUjMH78+EHt7e3LAYz3uy1Z7F2l1Hcdx+n3wEohxHT0/p6x3iUQ0XKl1LFSyg2cuYaRaTy5siYSifpQKDQLQFQnp6tb4bO6bpw6vRxkngTsFtdV+NHsOFbV+jKjYmuPwA9CodCBoVDo00QiEfejIenQ3NycKi8vrwWgvfRuHosS0XmhUGhTIpH4YFc/3HeT8QL4F0H6rKen57jGxsYvTMs0jFzj2a11aWmpJKKzdHPqYj0oKbZQM7mYo1l5495Hm/HE8553/e/K5x4NlJWVrczVRwOJROKzUChUg95/rzEwBQBODoVChw0ZMuSVlpaWxI5+KBKJ7A9gLjRmG+1Eg+u6x61fvz7GnGsYGcnT+2ohxFsAjtDNCQaAx39dhQNNEdAv7y/twLcviyGZyrj1ShR6V4qbI6X8yO/GcKuoqBgbCARWID9G83stDuB0KeXb2/5lNBrd13XdVwEMZz6fDAQCRzU0NKxhzjWMjOXpw/XS0tI1RHSubo6rgLc+aMepx4dQWsI9yye3rN+YxHevtJFoy8j1Sgi9g0N/EAqFDuibE54zjwba2to2hUKhQQC495zPR2UAzg6FQnYikVgMAEKICUqpVwCMZD5Xk1LqONu2VzHnGkZG87QA6FsYaAKAfbWz2ly8+X4bTptejkGFZkDAjiTaUvjOFTbqYhm/XsnWQuDCXHs0MHjw4IVKqbPQuyyuoScI4Cvl5eWipKTkY8uyXgLAvVvYFqXUCY7jsK1JYBjZwvMr6R577BHu6en5BEzP6w49oBgP3xFFYYEpArbV1a1w3lU23l2st9OfTxSA54hoTobtKLfbJk2aVLhp06YX4fO0wILSUoQiVQhF90B51R4IRapQMnIUgsUlCAwahMJQOYJFvY/Ukp0d6E60INXZiWRnB9qb1iMRq0dLwzok7Hok7Ab0tLX5+c8BgG70rirIKWFZ1oxYLPYuc65hZIW0XEWFEJcAuIcrb9bRZbjnxjACAVMEAEAqpXDJ7DjmvbHDMVPZJKsLgerq6qKurq6niOikdJ+7oLQUIydPQcX+UzFqSg0GV48FWTyPy5TrYsvaNWhcsgiNixdhw/LF6GnPykJzWx2u654Yj8df97shhuGXdF1BA0KIDwDszxU46+gQ7rq+Mu97Arq6Fa64OScu/ttSAJ7tKwSW+N2Y/hBClAB4DsD0dJ2zoKQUVUcdh+rjZ2H4PpNhBdKzdYCbSmLjimVY+/JcxN56NRN6B3ZXl1LqK47jzPO7IYbhp7RdPSORyGFKqbfAuFb3oQcU496fCYRK83OhoNY2FxdcJ7m7/ZMAXgNwPNL4+7ETWVEIjBw5sqywsPAFpZT3g/+IEK45BNUzTkTksGkIFA7y/JRfJtXdBfvtN1D30lw4i97t3c0rsyUBfFNK+azfDTEMv6X1C14IMRvAjZyZE8cPwsO3RzByeC5snNZ/Tc0pnHtlDJ+uYd9qfbaUck44HJ5KRDcCOIX7BAOQsYVAdXV1UXd39wIAR3p5HrICqDrqOOxz5jkYXD3Wy1MN2Ja1a/DxE39Cw5uvQnmw8xSDFIBzpJSP+90Qw8gE6b7DCwghTxCmnAAAF7tJREFUFgA4ljM0PCqIu28Io2bf/Fgn4P2lHbhsjoPGDUnu6LeklMeg94sSAJBphQARPQPgpgwpBEgI8TiAMz07gRVA9YwTsc8Z30aZ0FpYM20SdgNWPvFn1L08D8pN7fqA9FAAzpdSPuR3QwwjU6S9i7dvVsBHACo4c4MBwhXnD8f5Zw7L2WWDlQLue6wZ9zy80YtFfjYppQ5wHGfdjv7PysrKgyzLuhHAydwnHgBPtpTdXeFw+BYi+olX+cMnTsbU/7kSQ8bt6dUpPLVp9af44De/RPOnH/vdFAC4VEr5a78bYRiZxJdLZd8mHvPAv3c3jj60FHdcW5lzGwht3JTEVbc24o13PRlwlVJKndyfQVGmEOgViUTOVUr90YvswvLBmHLeRRhzwslsI/n9olwXtXOfx9I/3ofuhG8b610rpfyFXyc3jEzly1UykUjUhkIhC8Ax3NnrYj346z9bUF5mYdKeg0BZ3h3gugp/+UcLLvqp9HJjn2scx/lzf36wtbVVJhKJx0tLS/9FRALAXl41qh8IwESl1IXl5eX7Dh48+JOWlpZGr09aWVl5NBH9FR58fsQhR+CYX9yDkZP2y/rfXQAgIgzbc2+MnXkKttTXodVuSOv5lVK3OI5zc1pPahhZws9vGBJCPAzgXK9OsN/eRbjpilGYPCE7l2ZfurITN969Hss+6fTyNI9JKc8e6MGVlZUHEdFsP+a+74Aioqf7egSWeXECIcQIAEsBhDlzKRDAlPMuwoSvn4lcfob1yVN/wbI/3gc35f3YACK627btyz0/kWFkKb+/aYJCiGfg4QCzQIBw2vQQLjx7GMaN5l5IzBur13XjD48147kFLfB4MPWCoUOHnrJixQrtroVoNHpwKpW6MdcLASHEswC+wplZWlGJw669CcP3nsQZm7E2fLwM79x6A9qbvNtxl4jut237QvQO/jMMYwf8LgAQjUaLXdedD4+nUVkWcPQhpbjk3OHYd+/M7BFYVduNB55oxj9eSiDl8U5+RPR+d3f3cU1NTa2cuZFIZIpS6icAvgH/f78UgH8CuFFK+aFuWDgcvoCI7tNv1v83bM+9Me3mX6JoyFDO2IzXuakZb/z0f7FpDf/+O0T0qG3b3wGQkXMRDSNT+P0FDQCIRqPDXNd9A4Dnt0BEwJFTS/G1WSGcMC2EokH+vgQdnS7mv9mKZ+a24N+L2tO1jsoKAMdIKTd4dYIM6xFwlVJPBwKBmwbaIyCEmABgEYBSrkaNmlKDI2/8BQpKSrgis0pPWxveuukarF+iXZtt74jttxE2DOOLMqIAAICqqiqRSqXmIw1FwFah0gBmHlWKr84cjKn7FSGYpr0FkingvcXteHZ+C+a/0YrW9rTeqCxxXXdGPB5vSsfJotHoIUqpG5VSJ6bjfLsw0EIgIIRYCGAqV0OiRx6Dw348G1ZBAVdkVnJ7erDwtjloeOtVztglUsqp6F31zzCMnciYAg
AARo8ePTSZTD4P4Ih0n7u02MLUKcU4/MASHHpAMSaOHwTL4nl5XFdh5eouvPNhB975qB0fLOlAW4cvvZMfWJY1MxaLNaf7xJlWCAB4KpVK3dTY2Lh8Vz/M3fUfPfIYHH7dTSArt6aqDpRyU3jn5zdyFwFXSSl/yRloGLkmowoA4D+bqjwB4FQ/21FabKG6qhBjqwowZnQhxlYVIjwqiNISCyXFhMGhAEqKe+dot3e42JJIob1DobXNRbwpidqGbtTWd2NtQw/qGrr9uuBv6+WioqLTa2trt/jZiGwrBKqrq4d0d3evAjCS44SjptTg6FvuzPs7/+25PT147SeXo2npR1yRba7rTo7H43VcgYaRazKuAOgTFEI8AA+nCOaZh8Ph8IWLFi3q8bshW0UikUOVUjcCmOV3W9BbCPw9lUrdvH0hEIlEfqWUuozjJEPHT8Cxt/0GBaVswwhySk97G1656ofYvOYzljyl1N8dx/kmS5hh5KBMLQCA3nUCrgdwA3xasCgHuEqpnzqOc6vfDdmZTCwEXNe9KR6PrxBC7I3eOf/at+ulFZWYfs+DeTfaf3d1NG/ES5d+n2uKoHJd95B4PP4+R5hh5JpMLgAAAJFI5Dil1GMAKv1uS5ZpAnC2lHK+3w3pj0wsBJRSo4joGN0wCgRw/J335s08f10bVizFq1f/D9diQS9JKWdwBBlGrsn4O+tEIrG2tLT0z0S0H4DxfrcnGxDR+67rnuA4ziK/29JfiUQilkgkHisvL5+H3mLP7yWGJxFRNUfY/t//IaqOOo4jKi+UjKqAVVCAxo8+4IgbW15e/lYikVjLEWYYuSTjCwAAaG1tbU8kEo+HQiEXwFHwYBOhHNED4OZwOPzdzz77LO0j/Tn0FQJ/KS8vnw8giiwv+sQhR+DAiy7P3eV9PTJyn33RvGolWmWMI25sIpHwZOMmw8hmWfetFIlEDlNKPYA0rheQJZYCOFdKyTaMOhP0vd83Apjpd1t2V2H5YJz04F8wqHyw303JSl1bNuNf3/8Wyy6CrusebMYCGMbnZUUPwLYSiURs9OjRD3Z2dqYAHAog6HebfNamlJozbNiw79bW1tp+N4ZbX4/Ao309AlUAxvndpv468KLLMXLSfn43I2sFi4pQUFoG5z39Rf2IqDSRSDzN0CzDyBlZ1wOwrUgkEgXwc6XU2cjyf8sAvaCU+h/Hcdb53ZB06esRuA4ebiDFYdiEiZj+q/tBlnlapUO5Ll6+4kJs/GSFblQPEY21bZvlmYJh5IKs6wHYViKRaEkkEs+UlZW9QkTjAezhd5vS5FUA50gpb2ttbfV1YZ902zpGIBQKLUCG9giQFcBRc25H8fARfjcl6xERhozbE2vn/ROaG2UElFIdra2tr3C1zTCyXU7cnjiO86aU8igAJwD4t9/t8dAblmUdK6U8Tkr5lt+N8ZOU8m0p5Uz0Lhu9wO/2bKt6xokYMm5Pv5uRM4btuTf2OFZ/Jh8RnYX87Ck0jB3KyQ9DJBI5DMCVSqmvIvuLnB4ATxHRPbZtL/S7MZlKCHEEgBsB+Drnm6wATnrwcZSJqJ/NyDktDesw94KzoVztJbWn5XvxbBhbZfUjgJ3p6yb+aygU+j8ACfR2E5f726rdtg7AbyzL+rZt239MJBLm2eWXSCQSDYlE4pFQKPQSfHw0MPro6Rh34ml+nDqnDRo8BFvWrUXLOu3p/F2JROKfHG0yjGyXkz0AOxAQQhwL4AwAXwMw3Of27EwzgKeUUo84jvMWAK2Hnvmsr0dgNoDpaTspEWbd+2cMrh6btlPmk821qzHvh+fqjgXYEA6HRSbti2EYfsmXAuA/ampqCuLx+DSl1Cz0Lju7r89NWgrgXwD+KaV8BwDL+qdGr3QWAuGph+KoW+70+jR57fWfXI74ove0MizLOjYWi73G0yLDyF55N4e+r/J/pe/P1RUVFaMsyzoCwJFEdDB6CwKvVm7ZjN4L/tsA3nFd9514PN7k0bkMAFLKfwOYIYQ4Er1jBDwrBKpnnORVtNGnevqJ2gWAUmoGgNdYGmQYWSzvegD6IxwO7wFgomVZY5RSY9D7TLkCvY8OhgMoQW/xFOo7ZBMAENEmAO1KKYeI4kqpOIB1RPSJZVkrGxoaZNr/Mcbn9BUCswEcz5lbUFKKrzzxPAKFgzhjje2kujrx3H+fhp62tgFnENH7tm0fzNgsw8hKpgAw8pIQ4nAA14JpQaFxJ56GqZf+mCPK2IX37vo51s7XGsfnAqiQUm5gapJhZKVsnyJnGAPSt47AqQCmAdCeXrnHcVm3VUHWqj5ee8doi2ObZ8PIdqYAMPJa35zweToZBSUlGL7PZKYWGbsyYvJ+KCgp0cpQStUwNccwspYpAAwDmKpz8Mh9D4AVyLvxtL6xAkGMmDRFN+ZAjrYYRjYzBYBhAFp3gxX7m5vJdKuYov2amwLAyHumADDy2qhRoyoAVGpl7GeuJek2an/t13xE326ihpG3TAFg5LVAILCXzvEFpaUYPCbjNiTMeUPG7olgcbFWhuu6ZuCGkddMAWDkNSLS2rYvFB0NsszHKN3IshCKjNaN0Q4wjGxmvrmMvEZE43WOZ7gIGQMUimq/9lUc7TCMbGUKACPfaV0Eyqv24GqHsZvKNQsAIjIFgJHXTAFg5DWlVIXO8Qx3ocYAharMIwDD0GEKACPfjdI5uHj4CK52GLupZITWWwcA5s0z8popAIx8p3URKCgp5WqHsZsYXnu95QQNI8uZAsDId1pzyYLF5hril4DmNECYAsDIc6YAMPJdkc7BumvSGwNnegAMQ48pAIx8N0jnYNMD4J8C/dfevHlGXjMFgGEY+Yr8boBh+MkUAEa+69Y5ONnRztUOYzf16L/2XRztMIxsZQoAI99pFQA97aYA8EtPe5tuhCkAjLxmCgAj32ldRUwPgH9SHR26EebNM/KaKQCMvEZEzTrHM9yFGgPUrf/aa733hpHtTAFg5DXXdTfoHN+xoYmrKcZu6mhq1I0wb56R10wBYOQ1ItK6CCRi9VxNMXZTItagG6FV/BlGtjMFgJHvtK7gLaYA8E1LwzrdCO0Aw8hmpgAw8p3WRSARM9cQv+j2vhDRWqamGEZWMgWAkdeIqFbn+ESsHsp1uZpj9JNyXSSk3iMA13XreFpjGNnJFABGXksmkx/rHN/T3o4ta9dwNcfop01rViGpOQ3QsqzlTM0xjKxkCgAjrzU2NtYBaNHKWLKIpzFGv61f8qFuxGbbtm2OthhGtjIFgJHvFIBlOgGNi00BkG4MBcAy9L73hpG3TAFg5D0iel/n+A3LF0OlUlzNMXbBTSWxYcUS3Zh3OdpiGNnMFACGAfxb5+Ce9nZsWLGUqy3GLjQtW6y9B4NS6m2m5hhG1jIFgJH3gsGgVgEAAGtfnsvRFKMf1r08TzdCua6r/Z4bRrYzBYCR99atW+fg/7V3t8FRXXUYwJ9zN8Fs0osJIGl2N4FqBwRabUvpiECBkVY62lHGD3YGtONgHb/4MsqoFQekLdXRVj+oY2un4sDIlKqlRQbSSJpASICUyEiApLQJSXbvz
S7k/Sa7geTe44ekIyMNL73nZndvnt+nfMlz/js7u/e/59x7DtDsJiN2tAr25WFFFdFE7MvDiNVWu41pTCQSFxWUQ5TV2AAQAZBSHnTz/yNDQzCOH1VVDk0gVntYxRHMrt5rIr9gA0AEQAjh+qJwoeKAilLoOtoOub92a5rG9RoisAEgAgCUlJRUw+XxsPF/13NTIA/1tb6H+KmTbmO6YrEYp2qIwAaACADQ0NAwAuANVyFS4twrO9UURNc4u3sHIN09ui+EeA3AqJqKiLIbGwCicVLKPW4zojVVPCLYAwMdbTDqjqiIcv0eE/kFGwCicZ2dnZUAXG0PKx0bTXt2KaqI3te0Z6eKQ5eihmEcVlEPkR+wASD6n1Ep5Q63IW2V5eg536SiHgLQ824z2qsOqYj6MwBu2Ug0jg0A0VWklC8DcPVTUzoOGv7wGx4TrIB0HDT87teQjuvrto2xBoCIxrEBILpKPB5vA7DPbU7PO+fQWv5P9wVNcS0HXkfPeVd7NAEApJR7TdPkzRlEV2EDQHSt51WEnN7xAi7396mImpKG+3rR+Jc/KcnSNE3Je0rkJ2wAiP6PaZpHARx3m3PFGsCJ57e7fnRtKpKOg/rnnsGVQUtFXI1hGK7fTyK/CaS7AKJMVFBQYAoh1rvNGTSiyAkGMWvh3SrKmjKa9uxEywF32zK8Twix0bKsC0rCiHyEMwBEHyAejx8EoOTB88YdL/C44FvQ3XwWZ3a9rCruqGEYlarCiPyEDQDRBBzH2aIkx7Zx7JdbkerpVhHna6meLtQ+vRnSVvK0ngTwExVBRH7EJQCiCQwODrbruj4fgOv5+5HkEOINJzBn9cMITJumoDr/GRkaQvVPv49BI6oq8lXTNH+rKozIb9gAEF1HXl5ebSAQ+CaAPLdZl/t60dV8FmUr10AL8KN3NWd0FEe3/Rjd586oikw5jrNucHCQj2EQTYDfQkTXkUwmB3VdTwF4REleIo6BaDtKl6+CEFyBA8a2Tz72i63orK9TlimEeKqzs9P1fg5EfsYGgOgGLMs6qev6owBKVOQNdLSh70ILwktXQAvkqIjMWs7ICI7/ahtiNVXKMoUQZwoLCx+/dOkSt/0lug42AEQ3JqdPn34awDcACBWBVrQdXecaEfnsyil7T8BIcgg1Wzahs/6YylhbCLGupaWlXWUokR+xASC6CZZlxXRdDwBYqSpzKNEJs74W4aUrkJtfoCo2K6R6ulH95PfQ3XxWaa4QYqthGLuVhhL5FBsAoptkWdYRXdeXAfi4qszLfb3oOFKJGfMXomB2sarYjNZzvglHNv8AVlT5j/Qq0zS/hbHH/4joBtgAEN08GQwGKzRNWw9AVxU6mkyivbIcUjr42F33QAglqwyZR0qcf+NvqHt2C65YA6rTE7m5uQ/39/crDybyKzYARLdgaGhoSNf1/wDYAEX3AwCAlBKXTp9Cz/lm3L74AeTkuX7qMKMM9/WibvvP8O6+f3hxNoIthPhSNBptVB1M5GdsAIhukWVZrbquawBWqc4eNGNofXM/cgtuQ9Gd87J+NkA6DloOvI7ap55Ef1urJ2OMr/vv9CScyMey+9uFKH1EKBR6CcBGrwYounM+7v/OJsyYv9CrITzV+947aPj988pv9LuaEGKXYRiPg+v+RLeMDQDRh5cTCoX2AviiVwMILYA5qx/Cgse+jumlc7waRqmBjjY0vboL7W9VQDqOl0PtM03zKwBGvRyEyK/YABC5EIlEgo7jVABY7uU4QtNQsuQzWLR+I2bM+6SXQ31o/W2taP77X9H+1r8gHc/34Dlu2/aaRCIx5PVARH7FBoDIpUgkMsNxnCMAFnk+mBC4/b4lmLvmEUSWrURg2kc8H/J67MvDiNUeRtuhg4ifOunFDX7XEEKcCQQCD3Z0dPR6PhiRj7EBIFKgtLQ0ZNt2BSajCRiXW1CAyLJVmPu5tZh116cmbVthadu42HgKbYfKYdQdxkgyOSnjAmMXf03TPh+NRs1JG5TIp9gAEClSVlZWNDo6ug8eLwd8kJy8IGYuWITie5eg+N77UfSJeRCamsOGpONgINqGrrONSJx6G/GGeowk0zLzfkII8QXDMLrTMTiR37ABIFIoFArlA3gFwKPprCMnGIQeLoMeKcP0SBn00jLkz5qNnGA+coJBTLtNR04wHwAwmkriyqCF0VQKI6kkUl0XYUU7MBBth2VEYRkdGE2l0vlyAGCfpmmPxWKxtBdC5BdsAIjUC4RCoT8CeCLdhfiBlHJnZ2fnRvBufyKluBEQkXrSsqz9uq47AB4EoGYufuqxhRBbTdP8IQBPnyckmoo4A0DkoUgksspxnN0AStJdS5a5COBrpmlWpLsQIr/iLxMiD8VisWrHcT4N4M1015JFqnJzc+/hxZ/IW5wBIJocIhQK/QjAdnDpbSI2gGdM03x6/G8i8hAbAKJJFA6Hl0opX8Ik7heQDYQQZwA8YRjG8XTXQjRVcAmAaBIZhnGsqKjoPgA/BzCc5nIyQUoIsaWwsHAxL/5Ek4szAERpEg6HIwCelVJuwNT8LO63bfu7iUTiQroLIZqKpuKXDlFGKSkpWSGE2A5gRbprmQxCiMNSys2madamuxaiqYwNAFGGCIVCDwHYCmBZumvxSI0QYpthGJXpLoSI2AAQZZxwOLwUwCYp5ZeR/ffpOEKIvQCe4xo/UWZhA0CUoSKRSNhxnA0Avg1gbprLuVUmgF22bb/INX6izMQGgCjzBUKh0GoAXwWwDsDMNNczkW4Arwkh9hiGUQ0+y0+U0dgAEGWRxYsX58bj8RVSyrUA1gK4O80lnQZQLoQoNwyjBjywhyhrsAEgymLFxcWzNU1bBmC5EOIBjDUEH/VouD4AjVLKt4UQNbZt1yUSiYsejUVEHmMDQOQzJSUlcwAs0DTtDinlHQBKARRjbOlgJoB8ADkA9PF/sTD2yz2JsWn8bgAJAFEhxAXHcVqFEE2maXZM8kshIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIppU/wVwOICzRGGbSgAAAABJRU5ErkJggg==;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;598.98\&quot; y=\&quot;1620.75\&quot; width=\&quot;78.5\&quot; height=\&quot;78.5\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;iroXu6kSOUnqGuu2dOUE-83\&quot; value=\&quot;\&quot; 
style=\&quot;shape=image;verticalLabelPosition=bottom;labelBackgroundColor=default;verticalAlign=top;aspect=fixed;imageAspect=0;image=data:image/png,iVBORw0KGgoAAAANSUhEUgAAAWgAAAFoCAYAAAB65WHVAAAABGdBTUEAALGPC/xhBQAAACBjSFJNAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAABmJLR0QA/wD/AP+gvaeTAAAAB3RJTUUH5wgKCgIx9ifNUQAAgABJREFUeNrsfQd4XNW19dw2Rb23UbfcewF3dcmFYtM7AQIhkBBqAiHU0LHB9Opuyd2yJBuSvDQChJLkJS9/XhIChJSXDrhCSHBZ/977nDszsmdkDZaxIXd/3/ZIsjTlnnvW2XVtn88TTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0888cQTTzzxxBNPPPHEE0/+Y8Xv9/uCwaBp23bYcZwGy7Jm0ePsQ6GBQGA2vd4Meo0x9Hrp/NqeeOKJJ57EEdM0fQSUuQSY15H+P9Kdhmn8i37+70Oh9Fr/ptf4gL7+OwF2F33dkJaWbqakpHiL4YknnnjiCoEja6Y/EHjaNI1dhmHAMHwfm9Jrw+84/0egPceyLZ9nTXviiSeeRC1nBujP2bZF1u3HC878egzQZEWDDoj/pvdTzgeGJ5544sl/vBAwsqYRKH7Tskx83NYzKx0QpBbrLnof5zFAe6EOTzzx5D9eOJxAAF1KoPg6Wa8fOzjHsabn0SEhVr0nnnjiiWdBE0ATOL95OKznONb0o/y+vDCHJ5548h8vwWCQNZ2A8TtsQX/cMej9rWjTA2hPPPHEExYOJ6SlpfkCgcClfr//Q07YWaaKRfsIMOlXDkqTBWj6Gw+gPfHEE09c4Tg0aY7fcVY6tr3HJpA2XZD2ANoTTzzx5PCKLrUrpMc7Hcd+i77+F+lu8yDVMI3dpmns6WvoxANoTzzxxJM4QiDKwEgYbQ+mr4+jr8+lr8/XekECPX8f7fH/9GyfcRznevr63b6AtAfQnnjiiScfG+ibPsM0K+jx9b5UiXgA7YknnnjyMQknIelfLuP7dd/K+IxHPID2xBNPPPmYAJqUAfq1PoAzN6o8kZaWlp6RkZGemZn5kVX+/j9JD+JaedqLemt0MJoWIvH7/UYgEPDA8AgG6LCyoA8M0I7j/JkW9MVAwP8Cff2CbdueeurpJ0x575I+b1n2s7T37yTD62h/IGB7QH1kgjQD9K/6xmznNssc/q5GTz319CMq72G9jxWFg/knAu6vEmjn+v2ONMh5cuRISV8B2lNPPf10KgH1hwTQzxBQTyU1mWbCEw+gPfXU0yMDoIVamJS5368JBAJZDNKmVxDgAbSnnnp6uNWIhC/VVCVrIwH0xLS0NNOLTXsA7amnnh4x1rTpWtNv+f3+LwVDoSybrOmMjAwPLY/kJKGnnnr6HwXQcPzOv/wBf4dt2+MYLri5zZMjsszOU089/U8Jd7hxaTf0QfoGgTSzamZ5IY8jslHFU089/U9Wx3H+FQgG15JlPcazpj8+gC6jC/2bA7o9onSi+g6ehzqeGnHUdxAUqabW/S0D45C8//8k5fvA7LUWPvr/3vU6MrW3PWMmmqhkWSDrGY7f/zqB9MWEG2lelcchFObUIO2TBe03LKTmZMAemg5nUDr8AzMQqMmIfH0w6pBa9Dys9sCoRv5/cBr8g9ORMiAH/uxUGGYCMifNyOfQDegntX36hrMIMCwTPisIMyOElKGZSB2bg+AY1mxPD6Apo3KQOpJ0HF0vWqOQ7dD94ND1TjQn04LftJFSmYkU+pvU0dmk3rU+EjQ0NheBqlT4grQ/aP0cnw2fYdOeMmXfBLTyvvHRvvGZPUvxdGMLj7/7J2HHcsKOkcrWMzxAZTFNi68G+RaGc/Dqs+gErFZDaQ/cHZgyrRipd45F+u2jkXbnOGTcMU4e0+8cf1CaRhq6axzpWKRElJ97rLxe6t3jkHPb0Sg4fQj8xUEYVoJOR1Z9c3GnlM9mJRCxAgTQDuy8FJSfPQpDn2pFWVsjwitbSFs9PYCWts1A5bIWDKDHsusmwk5Tk318doKD0qF7KWCg9PPjUd3WiorlTShrn+ldy8OuLShdOQPVDzYh/5hq+DNsmD72dgigBaTpe9Oi7/ngNfVjoq5iSyxq27Z+zVTG9H0KG3yBQOg/D5RTUlJ8Bfn5vlAoVMWBeroQy+jH6+lCrTs4NTie9A16zvcPGDYg4AtNzUfq/NFIuXc4/PePRJA0cP9whO4bQTqyj9rzd1NY548gHY7gfcPl+fwL+PlJ7xuGEL1Gzu0E1idVwM63kUqndzDOjcOg7FrNPrLgfATIhkmgbPgRsEJkNWSg+LqjULa2Bfmb61C4aTrCG6ej1NMDav6mWuRumoaSTXUovfFoBFMthDiEYcZ3kx12hwmkS740HqVd9cjvnopsuubhTu9aHm4tY+1sRMWamSi4ZjwCQ8hbtdiwMWW/+Ew/qSN7yCbwtnzxm1ocx2Ermh5F3yOgXkzfD1VG5H9QbJpdh4E11TxVpdXv9/+MPvyewzI8VgC6EGnzxxGYjoTzwFgC6DEEqAzUo6J6n9a+fK2/D92n1E9fOw+MJoAeRc87mg6Dcci7YSzSagthhwwCYLppzBDdIE6C8iDNuudjF9yPIP2+n6zo9PH5GHAHAUxXMzKerUP2piYUdbagoJ
vBw9MDaQGBbGnHNFRubEDl1yYjkOIgYATELd4foA06FG1YjoXwlybQ3zSjfMN0FG9skufxrufhXsvpKOqajGI5LGei8uFG5MyugJNpq4OVgNmk/ePz2ZqvI3HnYWwegkvzCLR/4dj2mYRVKTwF6j9CuKSFdAx9+F9ZMo/wMJEVEUCnTilGxvyjCExHI/jABKTNG4/Q/DEEtGPpZ0no/BiV7wn06bkC948ncB5PVvV4ZN17FPKvGo3Q+BwYQR9SDLacHQlVGFYc14vAmV1uvj4hBmh21VINZDQWoeTJaagigKjoYKBpRH7nDOR1zRDQKN7Y6OkBNL+LwbUOJd1NCN80BWY6bV7Lr8JHcRKEPgJng6zs/KsmoHhTC3kqdSja2OJd7yNAi8h6zuuuI7CejvKO6ahZR/uinbzKa0YjMDQdZpA9UZOMHFuFCK0+jcIT5bCX33HeI6xawtb0scce60tPT//0grM+hcxgMPgwWc8S93ErEQ6PBV0kVm1o/nA4C8Yi/d4x9L0brhjVByVr+74REhqJtbhT548iUKYbRFvO2fcSQF82FPaIVBgOxzQJeB26aQiEQzFJjP1uFIutN5P+n8A5y0LWaTWoWjaTLIUGshgaUdDZTDdoE0oIMBg0SjbWHrQWdyotIdfxQFrMv0+vW9xJgEV/46r6+yNXGZyLOqehvLMeFWRBB1MsOizjrwH/LMCWmN9A0RXjUN7VQEBA3guBQ0k/XfNDrWF9fyhNvLbhyN/E/vxI/3wNdFi2ijWds2kaPU6T+69sfTMqH2xA9swyWOkWLI5Bk5HDXpIZ2V/KELL3WftoVUiEz4P1FwTUZwWDoVBKSuqnusqigPSnpmUe1hpIXoTAtEIBaI4Z2wvGiiUdkDj06D4rg7OzYCSsB0fDfmCM/Izj0KEFDPLDkXXHaOSePRB2aSqd4HbEOnbjnZau0nDcTLPp05lmBvCg3FR2kR+lnx2JAe2zkP9MM92ABNB0YxZ2sgXRQDdpfb9ocSe7jXVikTCIFXYx4E4X99EF7h7KB8XGJrFiCgi42OVX4Fffb+/pUCi/zzBZWwy2xTdNhj/EG5hd4fgAzbFLUwB6AsrI+i7rUCEOBocj+XMWy73RKF5WWUQbZO3cw5TXmJV/VqoB3AVqOWzlEDqCP6Pcg41qP+j7j38epu/D9PPKdvq/K2mfDklHwLGRIuFCMgg5IUyGkmOyN6v24YHIl8io3EH6BBmaNbmFxb5PXYMLx59NNTPwd4cl7rwfQBchbf54iUEzQAfvGwP/guQAOjViMesE4f0j6DlIHxiNTK4KmVsKM9+iG4FjznYCPlt1motytYZ4FAzQBpyaVIS/OhHVa4+lzdWM8vVkuXU06Buzv7VRXPeijhb1qLVkYzNt8qb9NCwA5W6IerHsS+Vnh+r99Y/yYRLuICurmzb1zZPhhFTWnysA4m5O/jkD9JUTUNrNn306XZfmCDAcqVoih2eTDsXoryPaGNEi9/uNvKakMWtccoR/xkRaRJpP68zGRvmGGRgyrwV5zRXwp9FBbPnE8EnlvA4dzAzUiUpdYwHati2OS+/1+wM/JZA+ifAswAlEvz/06QFo0nLS35nmkWFBHyxA++8fI6GO1PnDCaCHCzgHF4xGzg1jkNJUBCPdlkRFgDa5k+i9aHCWhJRPXReOP6eOycbArxOQdMxC5jNkpW5qRMWGpkMI0Apcw+K+NwkIFXa2RMIoroYjFhpZWZykIQu7pLNOALtU/rbxEwDQ0z/1AB2OAeZC8XK0p9NdT9ZmnWhRpxuiIu+pqz76sxjL+ZMI0CVibDQgd1Mdcp8hr2fjLAxcfiyKvkR7fECq1LWHjBQygvwKsI2+ALTtgjQnEbeSPkRfV7q5tU8LQGsL+tMB0BzWsB6kv39gGAIE1FnzJiDvmtEIjc2CwZUaRoAs5wAMLpJPVGerQYCTGTb9npFuIKehFDWPkIu2roUAgTfOdORxeVi3sg76+2ZWVi+Xjk1DKWfEaZOWdSrADdOBUEYuLyuDU1mH+jpMm72wW216fizcpB6LPhEW9KcfoBmYoxY0rTEftGwxd9WihJXXmNaSk2usYTpwWUv1Y7E+eOX5Oj5ZAM3hnHI2ZjaqsAeHtfhalK+fhap5TchsLJPkMOOQLZ2iH4HfwzT2EGi/alnmCQTUgU98pcenEaA5bu08MIJAegTS7x2H4s/R39ekCyBbPPeQM8hc58wulJX4vagCe9I8CzmnVaNmSTNZQM1izZR0cuJjqoA032z9ejNznJLdW7KuigVkaVNvbqafzUDJupkIryJtm4GiFfReVtAN307/x00Ca+hGX30MalYfh6oNs1HRORMlDM6basUa8wD68GthF4MxWY+dzVIeOGA96bpWlG44BsVrZ6KY13FFC8LLm1G6lA7iZfTYzmvbimICsmKyOou7Wui+aJJ7Q+6Rzk8SUDdGjA++JwvZy+vgz9KKslXkHX5+GJzykFR3JFuooBKJqgvRsqytpPcTSJcNGjz00w7QmnFKJ2g+ErdCpHTP6PUCp00uQs49R6nKjfmjkTqPQXcUWcJ9U67aSJPY8yhk3j0emWfXwCoOqI4laTk1I4km6QiU8IUhySgpmOd4M1eymJaEOKxiP8o+Owo1bTPJWq5XSSi6sfK6p4tyVr18Q13cGK8CRbaI6ul3OE7dJOVkrOzeFtLGKmIrghNCBPKltGkrNsxAVXsLKh4ni+nuiSi6ejSKzh6KnOMqkVUXRuakYqSOyUdwRDb8wzLhDM2Qr1PH5CFjfBGypoWl5jT3zIEouGQESm8YjwEP1KJyCVkq62agYuMMssSaIslG9gSKdWwwv5vfl+uCqwRlbBjlULrWRzpAqzCTmwim99upEmCFunInv7NZvuZQV/kGthbrpElJQhK03sVdrfT1DOl6rHyE1vb2o1F85SjkXDAI2SdUIKeuDOlHFyE0KheB4VkIDM0U6oHAkEyk0M/SJxQik9e2tQKZp1cj5/JhKL51PMoWTMWARU0YwId2ZysKNvHaNsg9xVUxxdw8soEMi45m7UXVS8KxtKNOA2WjzluotS4+pGtcL94mh3WKOl2tl9csovdZ2E2eIt17VfPqkFZXDCtgKSNKSi2tCHZYvqj2TOJH66V1u/gev9//qm07x9PPHOuTyOnRG0Bb+kQSUOMicfq6qjgHU8YNwZSxQ9Sj1smkk8bvr9PGD8PRI2uQkRJQ3Xf0XL6EPBc+ugmLkXfP0RI3Ds0bJrFkZ8EYONxg0hflSo2HRiH39rHIObYSVrbiAFCfzUhw+JhRch75XQcG3Rwh2iCV13EycIYAV1FnfAAo6dWlbRSrSSynTvW7hXRT8gYvpedk66lmWRMqHpiM4ivGIOfEAXRI5SNQmQI7w4ZjJ0cIZMXwg/gc+sypNvwFKUgZloWspjIUnj8c5bccjcqnpqNqbYOAG1tg7ntzrRtJMHZENXzIN+8nwIKWQ6tRX686CTfwdWEA5PuDwSdXDlzSzcoTqljBgEOATO8za2410o6mta1KgZlDgBPwqTAbt7QnRULE1AJcB057siiAlBGZy
G4tQ+5FQ1Fy2yRUP9mIyjUEyl3NUgHEBkHJxhaUbWgUY4EPDlUxogC6VJLJbrlf3WGyrBmk6+UQYe8xvJL2y+dGwV+eIiFGW9rELblWbGyxJ2zFVPUk5Pfx+1nfsW37HtJyxrxP1CzEvgK0EGzT95efPwdbft6OLT9bpnV5RN8lfWdf/Z/V+Mk3n8TgsiLFHtcbQNs+pNcWofD2SUi9Zyyy7hwjVnDqveORdu+4Pmn6PeNQdB19XV8IX5YhG6C30I17EkvrqRWkR0vKfFIn5GLgPfWoXDcT2Zvrkb1JWUrJAIBsWrKccrpbkMuxYLJqymkT1KxvRM1iAuXbJqPojMFk+RbAX5wGJ0BgSqBq8+nvUyQzTBiUjKtnRtYtOlmZq1Ac06bnpZuca70J+FMGZCK/sRKFZKFXP9aAgataUE3WX8XGaWTFszYoq18ntPK7mrV1/Z8J0Gw152xSXk/lhlpUradr1DEV5R1TREs3TiUrmUsFZ6ConazV26cgh9Y2dWK+gKjjt5DCn8dU7euxpZy2L0lXngFde6Qcq7Xp+fzk7QVo/zi8toMzkXVMBYquOQqVj7WgeJ0GvS61lnm0lrl0zRi8C0gLdYiES//CHY2HPQRS2KE8k6KumSi/tw7Zk4sRDOiEvmXpvepXJbJ0HVzypQMMBthj29YLBNIz6Gv7EwPSfQZoUrbMrr3kVODNduA3i/umry/Hb59/CiMqijVAWwkBmm/a1NwUpI3IQXB0FlJGZ4sGR9HX5Pal9lGDlWkwgoaUxEnHknyGxIDmJw0aATqpU2BkWMhqKUXVo3xTtxC41iFz83S5mTlW1tcbmDcEW54VG2rFfSztJPd29WyULqhF3mcGI2VMDsxCRzF+0TUJ+vz06IBuHXLpOLxiyuEi7p2ZZCw/4vIZEc/AopvatP2KC0EOSeUVWUGTrPVUaR4o/ep4VC8m64s78rp1/fUm0q7pui637j8YoBsF1PgacFNMuHOqdMplP0Mgt7kR1WtaMOAeWuvThyJ1VA4CBJQB4QwxdBiNeVsCQhHAHXSWz5L78qCmkkg+he8dWzpb+UA3fcpj5JpiK8VEqDQVOS1lKL/+aMlbFGyagRJp7KkVL6CArncee3idTRIGkXjwYS7FK3JDLnTfFW9qRklbK3I/NwKBshS5piEfG1F+IShjljz3oEs8B9HUcWkB679btn1nKBQKZ2VlH/mcHn0Oceg4LQP07jdXYzcB7x7RZRHdG0fx+lL89gePYTgBNL0aAYOd0CXhQ8DxBXSfvim1x77I61uKDeuASr/nM8QqCdBz+H2GpjX0Jax35nroEH32QIYfhacPlqRM/uYGSQCysivLmXW2KsMdB07wRUuq6sgirUPl8gaU3TgJOa2VCBSmwc9UmgTIjmwq/szKEmKeD1a2iPhG5GL9kOH7CNlsF8RUuMP2Ra01sdj4NSxWPngJuH061keHU8qobBRdOAJlj01D1bpWSUIVknVYunEKffZp/8Ex6EYBLwVq0wXYSrpmoWbFcai6nqzl1jLYpeSBkaVs0dqafB9aKgkdqaUXz8hSuRC2gnUVUV9anXtwJscAknoeS/aVJUaU0SM+Kx4V/Z+T5UfK2BwUXDgU1Y9Ox8A1TbS+M+gztUr8XIVBGiNx9sNfkldPhwh5JZ3TkLOZK5eOxaA7G5E5uRCBoCGNLbx/BCsMI6HRl8Ci3k2W9HOOY7eSJX1kW9MfBaB3vbkGu15vw643orqbvt/zervo3oi2AW8sx5s/eByDBKD1jZSgjdpN0JliCeh4tdsgYlh9Vjem3COJYCU+FKS6o8yP8CWjMah9Fm2+RlLVucWxOdaSSH1x/QFjacW67btsJbm6ZLmkTSsk11N1SzkEOgGyYrkY39GxNGmKMY0Yt5cOF/q/kKFY9cyPCMoqCapUdUqqKhYX/FPFstaWtKU8Dj4onAC9dk0aiumwqrmPXPo1s3Ts9T83Bh3WZWISFiDAkGTflyci4+hCWJl0T3NCWbwVZmsLKjfcDUVwTa9Wlw3RbYRywx3JAnSP5+DXlvZoRUNgusl4S8W42WvyG8qit0MW/EPTkX/2IAx8sImA+jha15nSll3Y6VIFHAkA3SDeqnTB0r6r5PDbhlaUL29BwWeGwC4LqHuZsaKXsGncYgcd9iCQ/hs93mRaVtERW47XO0AbEZJ65nPlDX/t50/GHrKg97y+oofuJcVv9lH+WSxAG7bcuAkBWlu0kRtQq+n7qC5+9Gv3e79P3cxsubBlYdsERoMzUH3jFFSsn4FCulG59bZ0Q4NuTW2ItKzmE2gzcLs1nJzsK5IyoVqUb9BtugQW1atmoPyGo5E6tQBWut3jcPDFtJS7j8pLsNRmNtWBZMj3lqooieMKmz0OUP27wrNr9ADnWHfYcA880+hh3TFABNxYKFvVtinXyXToelWnIv+84ah6eiZZjDPpc9eibIOileTPXppE2Kf3EAInr6aijFntbmSAtiLx2vgbzdat3uNQsqlRvJyizsZ+qlWulW43DmtUrG+ShGnuJlrv7hZU0WFVesNEpNHa2uRx2Lphwj1cLXeijl4Pn7SrGxGr196nhdlN6vaWT7B0yadKsltyb5i+aDWSE+GOMfVrKq5lS3uSLiucqH5PzKUdGJiGvPOHovzpZlrbVglllWwgb6mD7+NmyZ1w7uFwJA+LdA6H763Sjfx+aum91EnHZfX62ai8bQpSJ+XDTDUVs6Hll8OI71snhtNj38aWKEi7au4icP4veqyjR/uIC3kkBmiVqPJp684kt00A+uITCaBXSXhjr9bI17/ZV5cRQC/Dbwmgh1QW6xvMkRvHPBy11qayGC1JrBhCV5kxoQAD727EwHWzxTrKkZrhJkmOxbek6qRRoDDCizFNEdh01JPL2IKKedOQdWw57Hy/AsIkDhflCkc3pWH0Zvnr5KaArgIDw3Q9HReMTW3FWUklGt1DROrF2SVPNRAanYWSy8agZnmLHFr82UulXKuxX6xW4QrpJICmx8rrJyPAlp6ZqJGI1zAI228SQI9F0SYVG5ayxf6oJtBcJyXCkNcgoFC5bgaq5jdIZZCd59f7I/nwU6zB4DOTa8LwuQBsup6lWifT9CWdp5C9zsmzFAOpwzJResV48vhapdSSD+GibtWxqlrND2d1x/5VUezhltLBMWhRC0rOHgYnnALHtsRDNS21x23tQfTlGuv49F/Imr6BNMc8kia3HAxA74kB6D1xAXr5EQXQHF7xWSFxQwO0uXOay1H9JAFO12zJwBd0TSPLqRa5TH0Zp2KB3a7K9dzaXS8bWOLT0s3XiMpl5F5/bgRC5anwO2oShPERyKdiQxS9ew6qJFB5N7FxZs1prcNEhi6RTKroPwIeliS1HEOR5/vTTeTyNZvHJVnsFjf0W/JQXe96IXqvvnmq8DMEdKw2HkD7fQTQAQLoK8dKkwMfqBxL7ZeOSfo8XMlS3DkZeZvp8F4yGxXnjkKQ1tby+yQ0ZUuOJFmQjSmD9O1zICYCZdPS8evo35sxVARRvpgkQZo9JctBiD5HKhsE5A1ktJQifD+twcbZ
pE1CvB8WNsWGI6YBiAFaOKc31YpnM2j1bFTdNh3pR+XRnmZQ5nuG97hfErRW3+PSXI73oe04T9D3GUdMzXR/WdCJrOgjBqDZ0jAUcNmZARSeMVTiWdnPNiDrmTrJGJdL2VSt5rSNb0GzS8+xWC5P4t8bQJZ39b21yJxaCCPdlEy9n24QBhfbl3xW3uxReWEmHpwqySEnQs3ot3R9qC86LzHAoMr/Z7o3at+6s6JutynlTKYMJrDh8Gs6PgGqks+PRhl3M3J9t2tdHUSIIb+TQJaJpwgMyuZPh1OgXlOSbXGAK8BVL5kWCm84mg7XGVKLnEuP/RLm6KD3w01JXMd87xRkTy2CGTLFG3J0+aPNdLPJGAdu/FisO1MeD/T3lvQeOBJe5AS7EZNbMLS7zgyUluaLSeb9MHiF+Ppyzb/fEt5tDg2k0NqWfX4sBi2fKWGFHJ4MFGlNPxK4TDh5qDp5i8igKhIDqRlVC5tRcNYQOPnMNklYRfswYCTm2olb7cFzEB3nA8u2z3cCAd8RMQPxP8aCNhWVYbAogLJLxiBMwJr5DXJZu1UslS0FjiOXSWNGQ8JsdgFb1+Rihcn9G9J2LCq/dBQCg9IQkNi5qcb58HxC+Zxmkm6nrtlmy9s0Yhps4m14UzwCtZEJiFNSUDZyLMpHjUd2RRWC2TmwQ0H1HJal44+914T3CJ9EGP1486qyP5NfT5K3Pon9cflW9aNNKGVw7To4vo886bRrFs6RMjo408bn9cLHYKjpz4PTUfyUIhxijgeuOT9YgGYvqWJdC8pXz0I+U2JWpgovtW2aOoltqBJIqSLorbxrfy5jaTThv7VNAUTXKk7kKflpjVPpfvIH01A6Yoysb1Z5NQJZ+TBTMuk5VJmZqRPOZrL18uJdWWpABd2vtpTr0eciQyPzmDJUPt5IHs3MGF6YIwCgJVlI692hmmtcvnRuBCtbPwvhWychY2we/AFDkrKSJE0mAWsLF8hm0tQjIh7dW5LQTADQuyVJuHw/7R+ANmK6g4yPpNHR7kaP+G5oSCYqbpqEoo5GZG2aJqdx1bpmlG1QIY2cTdwN1igJQNX6un9Mk28E5uOoWdyCvNNq4M9xkEJWpm0GdWzUTQJZapJxEjwC0hIfcJBVGsawadOQW1kh4JoIoE1tpfP6lI0cg8df/m8s+s3/4b4f/wq3bv4Ovvj4Ipx49XUYf8xcFNUMQiAlVSUkfbEE6PEnlwsXgrbObT3qSyUTVVLGZNBO9UmdesldRwnbXmyCJ2kLuqtFt9FPoQ3XQgff0bCynB6A0gOA0gyUnDtcGkN4nmHl+loC+SbpiDyYxJR0dy5uRt7JNQSEPDUnQJuWvAjbiFwHtzTOjEmk92kaiOMngKW1rZ1Oa1vWI2wRv1rDkOaimqMm4YlXf4qnfvUW5v3oF7j52e/hkscWY84V12LszGNROIgsx7RUtW6+qLVu9jLwwNTr6tfeFl9bn6VjuJaqkEkbk4vqW6fStW3SbHo9u2ZLDkuIQ4XDCmkPlnS0RGq3VWya3iN5PlUL65F3+kDYecEEeJIYe3RTy+ukpUdEmKM3gPa5iSadFWYX68ufP0UAetfrK6S0btdvVojuJt2zny6P1EEPlTI7WyzMhKcXWwKWX26QFLE+2ULwq9llfVSOu6qstiGuqHQeZRjSPl1x/zSyjpul+aKQJzxIt5zKFHPGPr+rUTgpuKKAG0zKN9QLcORumkGg3SwtvKVkpQ0gqzG9oRiWzDBUXXqGqbqazH3KE934YWwC0M38s0vFv2OlZCA8ZBQazz4fFz+xGPe+8j+0IX+GitHj5PPHu6l8GrB4c/lsB2fe+HWs274LK7btxYrtwModwFrSdfR9+1/ew6M/fx3XrenCCVdei8GTaxHMzFMHiK4cEfDTm9vndmvpagFL19ZG4uM6ri1AxZu8NAXhK8fRRp6hyKO4BVq4ixvFGxGN2WCJ3dda4ZDgtRi8fBbyLhgKK+yXZA+/JpeKsSXLwJ19ejUGL2kRNjiuuFDUnMkdDhyz5vfJZP+cIC7mZqKHm5AxrVganVQZpKOqY0wFZM5+iUGVkLV9RqSKw7RUTbuEJEIpKBk6Ck3nXYQvPrEE82htH6ODtHz4cA2miRuR5Fr7HZx9x71YtXOPrGv7jr2ytqtJ12zbjeV/2YYHf/YarqS1nXPVdRg4aRqCWXn09xwiUvFlOyavoO4ZUx20MfXS0QSmqo2XGmPuQK1KQSUnENfpJp31ypjhski3Cevjr+5QjTVFG5tiRpypBG+hWNP1KF/bgsJbxyM4IUevJYf5ApKD8vXS/KXj0b+nx8oj3oJWRPW2AkhteV0jnYQrBXhVM8pSUcTVJdj7xnK88YMnMSSmDjpRUoRd2mBBCKFRWUgfmg3/SNYsBEZkIjiMNevAOjwTgdEZCI3JQurYXGQ0FqPki6Mw8OkmsY44TlrSh5ZlJpzhMAcT3xTRDZDX2SQc0APurkPq0bmy6OwOhnyJLaDYDsnY8iqpJCH3NKM4jEknnIovPbkET/z8NWz4+3bagHvQ9j5w3ar1ZBVlyk0V99S3FLAyeOZVVOOhl36C9u17sHwbsHzrPko/W7YTaCNdu+UDrHjz/3B71zcw57KrER42AlYghUDF1mVZGmQ4pqmJpLjxp9eEpZ82e56Dos8OR+XaGSjvalSUmJ31EZKh8AEsrvBGxfchlRhc3rahBYNWzULVzZOQPbccIbrmoQm5yD6mHNVfmYiBbS2yNiVCVqTqqJO1nvl1qtbXS+t2Ec81vK8WaaNzY4aVGnFpAdzDKrbSJmpNq0qLzHAppp50Kq56egme+vlvsObdf6J9Jx2W/wSubVsLJzVE19XstbOWX6NkwAA88pP/J+vHa9m+lYCa1F3bZVvVmrfxgbzt31j+2z/jlo5ncMwll6NkyAjYwRSp6nEMDVI+I0ImFN8zM8T743AH9xT4Aj6ZHlR04TDUtLdKiI8PQlXtcmQyJLqeL8+3HPhkq9Ap2KV00JMhwxORnF56C9S1MT8pAK2t6BiAvuJzp+CD36zFB79ehX++RvrrlaTt8vjBvvraSrz/mw34xfeXYKDbSdjLDcnWWPbsSlQun4HS5bTpVjajYhmdhstrUbqCAHZFcx+Uf6+O/o4237IWVK9uVfyz3bVCxtKXbDQvbvZmtpgbMGDdFLoRpxJ4EFjcVofQ4EyJZVvM2eGzNZdC7wk3BvEUn9ocHIPMqxqI475wFe751nNY+ddtaH9/Lxa/z5YvWUTbyUJ699+Y9flL5Dr5EyV/BETU2jSQ5b3uH9vRRhbViq179wdo0qXb9mAZPf+ybWxl7yKLbDfWbP8Xnv7F67ho/sMYNKWBNnNq3O7DqEeQwD0McO00vdd0AukzhqFy5QwUd9eKJyIseV2Ky6JkY9SSjtcIUqo9GY4pF3fxxJgWOVQ5BFXeToDM60uWHHNehzX4Cyh39qxZ76tK7bpMoG5B2W1ThRxLyi8NM647HHvfuonUSKzeUJ5MbvUgHHvpl3DPd55H+1+3Yw0fuGT1LqLD82m6/ive/QAzPneJ/L7keIzEBgv
vt9kXfh5r395Ba0f3xxYF0Mu3RQF66RZa0y17BKgX0/ovo9dbSZ7U6rffxyM/+xXOnf8Qao6eAieUJqDLSU7D7IUHPVJiyQ1NuhrJJnDPdJB/5hCEV8+QBCqXmeZ3NfVb7Xm/g7SeTMO16xXryev92iQEB6YLOMte7B2g/0BWdOUR0bzSK0BbUZA2dNfboAHlOPXY6TiN9FTRaRE97Zh9dTrOOK4WxzRPRkZ6aqRtO+FGpxsyf1YNyjfORgk3g3STJdVRJwRD4U5F3HNA1WxjxTG/X9ilSGGS4s0lYOAkRHHXVCG8D982BWkDMxW/gliWtk6g9daqG8M/bfqRXVaN4y69HPe++BMC0g/JoiILiNzWJaSLaQMuIV1GltDiN/6EqgkTImCZqOKDi/P9GXlkka0TF3g5gXPblvgAvZI28WrSlfz/ZHEtpddcRMCx9D1gFW3qRb/5Ay596AnazBNhhgISj+RDKMDzF309KxH293xUVQnzQgRSLGSfVo2K9kYFxtriYi2Rbrz4lrQ0JHS4xEwKpDnenyecIPXyPMWdzYo7YlOdKrXqioJ7+CPERFXDUSsKvz4Zgao0CYtx152RKDHrlh+69fSWqsiw6F7IClfgOPJI5tHatrO1TOu4ImLl8gEJCVEs+g2t7fjx2gI3xKvc1zsxdf18MCsHt6zfhJV0oC7lUBUBdNs2Xr/4ayyvo19rmYS66G927Mbjr/8BFy14FAOPngbLSZN70TKt6Jr69v+MlsSmuQqI159LLQMw0y3knjEQpQTSnKsp2fAJmNSziYdqTMHQNTNQeFyNOpjM3kvuPnEAbeh2YLc0KCkeaNcd7NE1l6Aul34vd/YAAsSZtBGn6WQdk9TU91nzuhVvQpk7o69DzXQrSbJMqKJDlVmVdM1AxR3TpOuKy8z8kpyLttL2BtCqE46slsxcTD7tHNxJVtXad97D8vf24mnSxTtU+GHluwSS7+4VsGbQvuMb30VKdnak1C3etVJhCBtVYydj4et/xOL3FCDwc8UFaHp+fg0GaLayeQMvpQ28iNzup0mXkkW9avuHeOpXv8PZd8xDwaBh9NmY6jFwwBIuxaGiqgkknJNqIvesQRjY3ozKDsX76wJ0aUIuEx4bpio53JIuGYywsTE6bzFmPQu0Ve6Cc6EOcyQTy+TyvKrbapFakaGsYEdVzZgJOjdjS9ysgF8sS39WLqaffh7mf/sHWPP2e1hB13ERHXh82LrXfoXW1bS+t2/+LkK5uVL+KJ2bTJoUG9c2VaKK98KQSVOw5K2/iVXM67Vyi7pflm6Lv8ZsYbdtUY8qtKXWeckOBdScZDz75jtRWENrawai+7SHF6vCkNF8Cseq/XSvBWR+p5VmIefMQTIsObyh7ogfv1UqQ5enoWo93YdzavR+NXqNQX8CLWjdv264sUiXvCiqptxoPVVimJJgccvGoq3JcRNfPg5xVIu1VNYxWTLzPO9PpgJ36pHznQfQjYqjlzufXMtZTbduPAB7VkNPa4CJkrrpfdzXgMzB2UhlS4krFzQ/s1trbPv2yQ6bUZeXwbl81ASJMS/+yxYsY1CWODFZuuSSsrZvVaDJupQ24lrS8++4V3GSyBrYca+VxLRpwxx76ZVkWf1LAH95LwC9TG9sdpWX83sgN5i1nVzkFdsUoCyln3MSas3Wf+PhH7yM6WecByc1R4FI3Moa9z5xpJ5WEmiWKmf0p9souGAwBq1qlhFdhXoAQCILmhM+RQTQnJVXgFwvnYphPfy2WM/pc6tr3K7BcKRMr0GsuqIEMUl3gofLe11OB8HAu+qFmlO6zjgpyi3uRqzXYkRCO2ztBt2KB97EwQAqx47BZU/R2v51K9q276bru5uu9R6s2qLWdtl21l1yXZdIgg84++v3wPA75JkYMQAdrWoy9UQQfs0Tr/wKWc+76SBXXs/KLW64Kv4aswe13LWwZY33SNK4bSvfW2wQ8Hv7J+577iVMOfkMOCkpyoiKaehQHCIBTcKk81CSgFfJQ5uA2kqzUXTeUJSub5SJKMWRvdMoselkQVvWuEM/hybxL9RE/gc7YZyri7jPoXRjCwqPr1GxeJ/96bCgk2uyiK+9ldLFA+icWQMIaFvp5JsqQMsk48kuVrFWtSHVpkyUQGIrnQdzcjUHd08VuSOECJwrH2tC2thcxTdrmD2sZbdW2IzJ5KvpDz4BcTslDdPO+Azu+8mvyDJVcUPleva0qmLdU7Zm1729A7W0ecSyMRNfK7a6gqlp+HL7eqx4DwpcJcSBhO7vMvf1tyGykdmajsStI+9tL9ZyvPQvW/G5h55C3sDBmm7W5YOwdbLX3G/iTiyfSig7gJIvjaNDdqY+QKfSNa7rJRTRqEciuaBavw8JVX3CoQFFvVSIcPikTGZIKo4NPgBqHmpA6ugcFTuOOXD24zjR1UHsRTDjITMfOhk5aDj3Qjz4419g9TY+6PT11WsqViwn9DikRIcdeyrsLbX//X1MO+HkmAlDMXtHX1uJbZOFx3XsN6x/Vio2liUE5F40JhSyIhbY2RKn97v0T1tw/vwHkVtFoEVGVMinwnGcI1G13m6RAF8DRxGQSYeqX9XVF5Dx8YWxqFjHPQF8nZs1RUKt8LQkA9CcpK1cXy+eEYe1CtywFo/0ktFeH005ScjcLrnP0vp3HYeCOQOF6703/u1PLUD310xCF6BLOPYrBOiN8ngoWdQ4MVXWoU5uJsXhtu+aRS3ImBGGEfTpqd69MGbp2li3rjQ7XI5z75yPJX96V8Uixd3d0+uGEgv3fdI3/oSKUeMjbcSJqif4WpUMHITHf/6auLBiDW/dG3VvD0p5c++WZOK6rf/Cvf/1HEbUN0sWXLnFRlxLer/OLOZHCIcQvmkiytkqpo2bexjmIzK/BCd8qzZMle4zTiZnNJaIpb+vix85cHyK08TUyV23QiOjuBQX3H0/Vv3pbTp4d/W6rgzQq7Z+KADN4aunf/UHVA4fqWK++1RRmDoe7VaJhEeMwtP/+zu0beuP9YwTDiGDYfU77+OeZ7+DYbX1sHgqts6ZxBpXLkArTniVGLdNXd1RHkQlrS0fwLmbFckUW8MFSSYOpbaZvF2e9s25nwGrZ2HAQwT8t0xByc2TD0orvzYJ5fRYddN0ZE0p1n0KiakPjjiA5lISUgFo6wgE6NJDDNCShe5Uk4Z5Qnf+pkZUrmxG/qk1QhIk8UBx+4zIKKm4XYqWev+lg4bhptWdYjkJcHJZ1DYNegfaNDuB+c+9gvSCcAxAJ/Y2jj7uOLT/ZZsAwHKyeKUEqx8AegW75pyY4qoSAuq1BERP/eIt1J9zPqxgQLvFyhKxpBnHjF9l4viRQps6ZUgGKu6bLhsxZ9P0wwLQHP4o3ziVNn89Ms8aQB6Oatfev2kn6tKrmm/Vgckhp9LRR+Frqzeg/d33sUTAeRdZy7sSXkcJY9F15PARW8J3/dfzSMvLd+flxalisnTi0Iepp5yO1X9/r4eX1b9KhwYdHpx3eOznv0b9mefCDqVKL4Jb0+24DSymy+diScKbk4ecUzIdQwZkVBOYqtb4qVLCmt
09Q0pT+7o+zH2TrVvrqx8lI+nEavgr6b2k0uulmAelnA8xs+i9Z3NYyq1tt/c7II9YgOY3YSk+1F84tp2wPvLTCtCuWy3xarKyqtfORsXVR8HKd/TNqGah8YI6+9U8G5GpJfzzmklTcPt3X8CK93Rt6hYktcFW7dyLa5auohsprQflq5mgouD0r92ENds+FABgi6h9y97+2bycXOKYqq4wYde4jR5X/P7vOPmaaxHITJcmm8ikFtNKQOakeIhTeIRYfSFZri3S6FDysbcH8/o2yVDV0stHw8kicDb8wtkQl6BIGp40QDNPRcDG4Km1WPDcy1hL13vRzj14ihOr25HwQFwWCSOp+PNqWqOrnloC0+9PWGLq8ylwNBwLZ992l/zN8kMG0GptF9LnWLaTDuK3/oI5X7oG/vRsud+5ICB22IWiMbUjXC+OT01zscgLYU9z0LKZdAAqgyevuzUpgOaeg4qOFgx5oBGpR+fACBlSOWIa7t77aMr3pV/2sJpeoxKevXcTHnEArdsZmQp1kZWgrfhTbUFrLmIG6NKOmRgyrxmpg7KkEcQRUp7oTWrty5uguS3Y8hnRMAMPvPI/aKObnW96ifUlCZiryaI5+5a7BCCizRBx3DHpUAvhuhWrsZqsbo5vtm1Tian+AmgGlhUxYMObmZNO7X/dhnNvvQPB7CzVGmwmsKANN5lsI8jAk2ah8DNDMaRtVo+28I+lJnZDHSo7WjFgXqOMQ0uRFmoVa417sPjsyMHo85sY2dyKR1/9OR2Ge8QjWibVEb3Hht2ELOti+v21ZG2feu1XE5YqSvhMYt1036XQ2q7ukMaiQwbQOkbN67pEuhN3Y+VftuD0G26BPytLOnBtbXgogLY1s54bkrGliUqoOtMMlJ47AlXrjhGGw3BHbZIeTi0GrZyBguOqhKea53AKSZTmojkoNVTllT/G0OmNJvaIA2g3Dk1vaqpl27+Pdhn1lyXdN4Kew2dBK/4GbjEesHQGMptK4XdsTY4Tbe11q1mk/ln/n627robUt2DBT3+NNVzOtGU3ASUE4NgKTWbTrNzyL7RceKnEeN3OtHhcD3yd0guKcD9ZdLyxOMTRvnW31DrHSxJ+lCTTCp1wjHgB21QFClearP7rFjpIbkcgO1MN5Y1UsETLl0w39ONTISJp1S4IoOLGSSgVHoVG0SKdte+/ieEqMVyiG1+KdEnegCXNyJ5WrLgWuDvT0m34Oh4Zy9Eso6M04f6IukY8xslA/uw6qcpVGly2uFwDdSKAXqFBcBGtz5q3/4mGM89KyIGi2rCV1Z5TUooFL/1ULPRl22ITflq37dEH6J5IgjfpNZbPsiemHG+3OoD//C7OuPFm+DOzddmkoYmdbNXDEKnssoS2gQGajRh/URDhr3OMn3MN0yRk6CZt3SqcRKGtgk10gD5cD6c8hV4zQNderY+5T/fmR9HI1JqYCTS9FTC4AE3gfOQANFvRPEyAAPpUenzTcRwpKfuoZEWxqkfLoC/x7cMB0Kr2tgWVzPF86QiYab1RSfKGDqrSN50NrjlqCua//DNJBi7fio8cM+QNtvjt93HU7DmKE8PSLmQC4p3wsJF4+rU/YCVZZrJZtynQUGEOpW6FRls/WV1ioXP9NFcW/H0nTr3+RjjpoZgGB0NPm44fCuLEW/qYIlQt5jZtRQbP7q2aLN1PhPA8M7CrVng9eIZksdCYzkbRJSPhpCp3l9kG+dBQsy+dyKR5F3z4gDadAIZMbcSjL/0M66Rcbm9MlUvf1lOVT+6WhqBlf96KsTNmxgdo/dq2LvHjJDE3DqnX3Nuj4qZNJ3CXb/uQDordWP3uHklGilXfD2vMz9NGXtIpX7kB/tQsOLYmi4qzf023csdUoZmM8XmoprUt7FZVOHndqiKK6UHZAGJLOe4e5P1980RYIVtRhRrR++jj1thW7yOGE9qNRTNIEziP8fv9dwZDoW/ZjvMq/exHB6N+x/lRIBj8leP37zoSAZrDGzzleMCDtfAPTpcuK9u0e+k20kkSeq+FQ4bjzm89h/Vk+a442Pgvbegn/vg2hk6qldAGlzs5cQ4KF6CH1TZgEW16tuoWcivxDh0v3q4aUJZLyGM3beJdtIl3JewwTBagOZQiDTbkHaz509s49guXwXCcqOubkD5Tx+tTTIQ/MxwD1s2UDRsma4vLs/K6mvuPh4HnR3ZORSmBNXMFV95fJ5SwlrRxO0LTyaEZmXupaQzMWNeXAKeEDsB7vvMCeUW7xdJs27Yn6QOXG4MYRJfspDX63d9QOXZCYoDWVh7//+gZx2LF37YpYE+4FqrxaM07e2l992qOjn6IWbM3tm0X2v/4D8y48BLhaQlxs1U8j1onx5VBQ9eSGQbPp7VdO0OajLI3q/FwqjmpMWF1R5jHx107DrY/BqCNwwrQv+PKNuNImqwSa02T0p5zUtPS0rIyMjI+sqanp2eTZgZCoQnMEHVExqC7pqN6TSPyTqyGGfDDb6aouHKi7kDbkCkU2QVhXLlyA5bTBm4nADzobDtt6Id/83+oHDFWDwTVAG3GS2Jxlv8MrNr2b0kstr9H1tX7ihBp8U4osNb1swIuW3b3y+Zt27pHN7XQ62zjEjKy3N74C44+bo5uDY9JpO5nARmR1vdQWQrK50+nzTtdpoVzDS3zQfdHwldNpuYmoynCEzy4fSZyjq+SdePGHraefWwRalB2q3P8mlCIq2YyS8pwXdtarNr+byyka8uHIINhMmu8VFvQDNDcvv/Y/76FoprBvdK8ysFM91bjORdg7bYPsJzWdOVO1V3KFT78PKxyCGuLeZnuFlyWiCjrI1R38POv4A7T135PHt1xCDBA++IDNHOwcI5BaqNpzwRpbQfOb5COz9zNdVJ3HpZwFicOW+KGpEq51PUrE+B4AH0Y4ttKS+lDv3YkAnQ5PX/FTUfBKmSWqxCpX9c8J2gQ8Vvwp6Tj3FvuIsD6N55+7yM2EuxX2gYs+MVvUVwzVFGIJgBoU4+vGj9zNr5KIHLN4nZ8dVUHbun+L9z/4o/x5K/ewvK/7pCOxFU7VHxRiJK27e2fxJKU4Lngz5SXwEMv/QzlZB3yuKHo1GojwSQPEzZ5B1nHVGDAStqwPBmjs28kVn0pmWSALu/gaRtTUU6AMOCmKXAKVLu64bN1G7MZw6uhiaEMNZDACqXivNvulpb8JTt2qWoN3Y6fLEC3a4BeTuvw4E9+hbyyqoQAbbq5Dnof42cdh+tWrMHVS1fhKys7cMOmb+Oe53+EJ3/5Ftr+vAXr6L5jy547FxfSOnMXKXtR/XEfLt+i3jsnulfu3IMHX/gRykeOk4PDSGBFK/pZR2L3DNh5x1VjSNsxcvBy+Eo6eztaEgM0d4B+9SgPoA9jI8yRCdCdDahZ0YLsxrDE2UJSeuXTRPtOQn6NKSefhqV/fFs1EXDsuB82B9NFzvvvXyO3tFo6Fxmg7QQWtNTROvRegymwyQV1gunwp2Ujs7gcJUNHEngfg1O/8lXcsHojAfbvycL/QFq4Dz4+qSoTVuiKkTbtVq8m4L6WgCQ1ryjChJc4hqiAk
Ifq1txCVvSmFuFKqNgwvR/CVSrpWLVhuljR1WQ9ZzeWRoaqupU4TkzCyAVoJg/yOQ5qTz8Ly/7wN7Rt36X4SrZ9tAOYvQwOQ6x+d7fkJ+a/9FPkFJUfeIiw8HwE6aAIyaMTzEAwI1fWNjx0FCbMPBanXfNV3LimEwt/9Vtpy5dmqH5qaHEPocXbVXfq+u0f4qrlaxDKzlO5EV9PGl2ZoO0YUhLHHNKcn7GLghh4Wx0GrJ0pXowKY9XL+ngAfWQCdBl96N8cToAu1SNycjc3Ck9HeEMtSjubUX7DRFg5jri/jpss0vwiblzVr0t1uESqZOgIPPDDn0gMdtlW5fYmC9DCwbFlj7CTMcAJDwa5sPe/8nNk0Ua0I7XVRu+j42NoQd34paGVAdKfno7yUWNx/GVX4s5vfh+r/rod7dtczgiVUGzTFRsuj8PSXsh4opZzDEBvUzwP7e/8Eydc9VU4TghBTqLa8QlpZFNzAppAOru2DGVrZihrt58Amg/dyg3ThKa09IZJsKXm2YgMV7VjxlT5dC234hCxUDJiFB4iL2SlXCO1tn1t/HFbvKO6V8WtycJdTl7Wvd9/CZkFJQcEaFNXPZmGCr0EDVUlYeuZhIoCgLy49ExUjJ6AuVdeh3u+/UOs/etO4RJfFvu+t6kGpuVJhLfce8ENn6zgjtK3/4lZX7yK1jQYqW5SoSEVm+YuWql80URLPCItp6Ucg8j4KemYKlS/qiu4/qABmveo37aFj91yJ8DYagJMXA0YMvnHF9Jk/MJCeUCyJA+gP26ALt9Qh7KOWmQ906QAmk726pWtyKwPx3STGfuRRfFCcsw5SDefPy0Vlzz0ODrIqlghHWLKUkrWwuLkXSxALyeAW0Vu6r3f/B5SC8Paeo6W9x1MiaOhP1dGYTHqzjgHX3/22wSm7wlHsWzEmDKuZZqGdOm2ZK1FPmD2YOEvf4thk+toAwVhOlbcTcAWlyN8DyE4OSFU3jYZpcxa2A9rrLiJ66SjLbymFekNpRFrOX5ziKWbF+g9paXhi48+hTXcIfhRyhKZq1lr2xYV3uBrvJiuy9qdu3Fn17PIKCjon34BNyTC5ZhOCjJKB6DhMxfi9m98Fyvffk/Y7yTBqMsvDzb/wE1KT/y/N1EzYYqAsGnpw80Xf9qPjGIrCKHijkmy17I3M4Vss/ChHCxAW46JzIYKpF4wCFlnDkTWOUORde4QpJw3SOlnoppKmnPOEOSdMxi5sytgp/rh9/kTGg8eQB9GgI7MGBRO4XrkbSbQvnUygYSTcEZfhLxcE+NPOPZ4rCD3dyVPrti2KxKH/Sjt1W7zB1s9a/62HdevXI8hUxpkw8noLxegrYMEaHdKuGZOSycL/bgrvoxHfv46vYcPI4fM0u0xZD9JbuhlZCkuIktt1Y7d+MqSlUjJzKNrZiVsi7eEyjIg3Yg5x1aQN9OiCbH6IWzVpSZyF905CYFcvwxLSHQNXdJ9XvuJx5+IVX/4hwxNWJpkSCCWjGjpdlcJoLfvwUpa268sXYmBR02EHXD6pVNX1Uy7MwsdGJafrFbySIpLZaTZw3RQLn1fdTomm9xM2ERFB80VTy9FMC1bGw+JWSnF+7RN5B5XIaOneFZkYac7ieUgQxxkOWd+lgD5gbHIuncs0u4fj5QFYxF4YDSC949GaB9NeWA80h84CvlXj4aV6yDoc+T5vRBHz6qQMOmvDydAMz0l3yRV65uFM5oHk+YSOAR8vrhup+mLmeNGblt6URlu2vgs2jk2KRwVeyKWU7IArSyaXcLTsPC132HO5dcgLa9YmmAYnP3CrxxjxfdDs5Bh+XS7qyGuavVRU3D9qvVY9c4/xY1dqi1otgC54SXZz7NEuKX3YMVftmDqKWdJq268jcCfx88qbc3kmYRDKH6oVjGQHXRNOwN0I7nSLciaU4EggWHQSDw9xJFp3QbSyLu4rfMbWMufY2ty9cRtW9U142vHDSnczMMcFyvpHnmCgHL2pVcgmJUfofXsHwtaHbxu9YniC9EGheOgZuJ03LB+M1Zv+bcuuVR5g+Uf0ZLmz8bJyOV/egeTTzhN6seF2TDRYWOqRh9/OICyBxqkxZ4bhvI2NR00QJsBgyzmQQgtGIPUe0cicP8Y0lFwFoxAaP5IpMwfEVH+Psg/f2AEsq8eAjPflrBMbEPVJw6gU1JTfcFQiOuh6d42MulH2QepmQTOY+hDv3Ug6+HQAjS5v9zuu7aVLOgmlD48DYHy1ITcsJF5czom3Xje57DiHzvVFJLt+/MsJ+cO75JM/IIfvIJRLTNpU/nF4rQjw1mj3XkHF+KIubZ2DOMeWz4EjpkETBfcda/UM6/a8q9ImdWKZFvGt6kSLy7t42EEX9/8baQXlibwSgzN8+CT6eB+20I+uahlG1v7pW2fO9mqHqyTci+HqTG53jnRJBhTDchtvOBirPn7Tkm0LUuS/6JNhzX4HmBwbiOvZP3Wf+G+77+EEY0zyOJLVdOIfImJrz5a6MqIzEW0I11zqnSQOWRywpW4cP6DWPb37WjbsUcO4eUHQQWwmO8Luvdv6v4WUvNK1D3ay70p8w8dAwXnD0PF+hm0p5tJW/oFoDM+MwiBB8cjcN9oBB6aAP8D42A/MIZAeqyoXyt/nUIgnUpgnXs5AXSurn3/JMag/X5HGlVIxxI430uP36Y3+DLpjw5CX6UP+6plW78gd/bDw2lBc3yydEMrKtbNkBul8MKhMP089Tsl/sSSmDFPDGTz/usHWEYWInM2Myi3acpG5jFItsRpNW2We7/zIqrGTxaglBpc5kGJ5eE1zX4DZyOWu5o9AuZPsFVLvz8zBSdd9WW0/+FtKfVbrF30pGLQO1SHGwMVz95r+8cO1J11XoKSMsWKpmb50cHkCyJtRLbMoFQjyT76Wsvsw+5mFF84HCncrm+nSUdgMEFrL69tVnEYd3/7eamEUDHbD5M6oCIALbmI3QLOdz3zHVSNOVq1SnN9sE7suk0xBx2yiqE/lTBDZKK9X8JY0izkWAhkZeO0a2/Asr9tl3DLwQD0yq1qMstCAnyeDsTvwzF7Kxnk1vAAUkfnoqyNAJjWpXJ968EDtF9Z0AEC5NA8spDvH4/QfWxFj6THUaSje2jg/tHyu1lXDYOZZytuG8v3yQLoYDDoy8jIIHC2znAt3f5itXOfqy/Pd0jL7Dp5VFKzlPuUtjch86h8cX8Nv5243EkDZf0552HVP94TK2RfIF7qEgnFcw1losUuHQckMCfrimtL7/rGd1E+coxUMljaVTVNd16jpep0NVgbblWJT91cZqQiwVClgLLpbT3ZRk+JFlV1q5a2yu3YKdSui6fHK9lpGTjhmq9i1Z+3CK80V6W41RxujJq7BxNXd+yNVDHw1+30GW/Y/B2k5OVK+MjSXBdqEo0R8RS4Y4/Z8PzpflRcN0VmPxZ0T5eEbkFnk8SluYa2r2vM8wt5bbPH5CPAHCC2XzgdnAigGTFekao5bjrnAlrbnepQ4vfOrfNJhAJkesqWDyUEsHrbv3FX9zdRPHQUva4T05kYHergWtIy9MBwouPf+P26uu+E
HkMPl2WaAVJm4gvI/WDrVnVL7lPTvQfctnW+3qFUnPTl67H6z+9KVckSXYXDnzPS5BKjiRKk3KnIg2m5Yea6jc8iJbdASP7tuACtRr0xm1wg3UbZrVNQtImn5EzvF4BOP28Q/A+Ohn/+MPjvH4sgAXPw/hES6nDVrx+DBNL2Q2OQyQCdY+sDzDgQF8eRBdA6TjyZrOY/fGrZ7ITzmZ5rYy1K7puMYAGd7vy6/l7YxWghU3LycWPHMwI60pG3JbnkGYM6c2QsYQvmvd1Y8PJPyboaL8C1LymVTzacrW5OMxoD54y5xPWYlEYAztTVB4qfVyYw89QXn09PPTG0FR59jvgWgyGjqkzLL1NCzv363Vjz9k4pw1sRM/VlRdJk/6R/34mj586VNbU1aJh6oIFbFigbxVIAlTd7AIo7mpG/aRpZWtNlYrRa/74DtAz3nTcFwZyAxBrd8WTx5jry+0rJKcQtG56hQ1NNPFnh8o0kAdBcjbOCAHr5jg8x7wevoHLMBAmbGJaRcLiDTNUWcHY0sOoBrT59WPuiXk9Qr6sjIQ09Yo7pNPU90esQZv05/elZOP/2u7H67X9qS1/FpN2yyVhNBNBuaaXMRPzLFoybPUdRHiRiMZTDl947ad4pg6Tlvrhzar8AdBoBdJCsYv/84QjcN06s55T7RihrOUY5aZg2b5SAeTaHOLJszRT5CQJo5uCgN+SQLLI/1XzQal5dZUcTCi4aCjvgUwkkJ/EIL2aVG9U4A22//5vE8ZZvTS4jvkInkTgBx3wMT73+B4ybdUzM0Nx9QMNQG87l3BWrN3Y2pE9RYZpGCJaTCjuYqhpV/AHaDPrG8/miFlSMxgVoGdGlxjgFyELLyC/BlQuXS/OD+/5V0pCrAZBwnNa+8Wgm/F9Nn/fyJxfDDqWLpReJTbrgHDPLkdc9NCQLlU83Sc1sRUet8HMUJWlBcxw776JhdD0sxblhaQAxjegA5Jh2+VHNs7DiD/+QjkGeqi71y8lWsDAnNHfz/e+bGN00U62tY8ros7j3uF4Lt87djB2zpRuUDF7PlDS6dmnw+4OytkbMgFd5DmZl04ee4UtcUSExactGWlEYX16+Bmvo8F2iQ1nJJYKjfONcrfOFxxbS/ZeWoK5beXJuI0toTDaqFzfK2LNDB9DDDwDQQz+5AE1aTBb0L8xP8USVEp1EGtDWiozpRXq8T2LODUWcHpTxRmu37Y42cySZQGrbwjSVBHh/24YTr/oyLL9b3WAk3LxOJDuvLBG+2c1ACOmllaiY0oAxZ3wWU6/4GhpuuAetN96Dxi/fjKMv+hKGHHsy8gYPh5OeIR1dJquhLGyzl1FdvIlS5fUscs9H4L4Xfyy1tEtkM++RygbF/aCpLXslXlJuchv9/pO/eAOlI8fr+GhMdUycmKWVYaPq5qlSzcHsZzyfrrhTD4ztbW1l+nqjUFtWtrfI2kozheVoIh9DxfZtXeJnKA5rnxPABfMeErrWRQTQ/PncaoxlSQH0Xqk9Pv6yq6U6xmf0ciDqaey8tlJbL8MO/DDokE0to7WdSmt75mdRe/lX0fy1O9FEa1t33W046sLLMHT2icgfOgpOZh4s8niCPCNRW9dq6omjp57sD9B+nwLx8LDReOil/5YJ34u37+1RhRTRXip1ZGjtFsUL/sT/vIbw8NHxE8Eu74oO8XB5W/XXp0QG/HoAnXx4o8KyjsyZhP1ZBx3uakTVow0IlKXQIvkVL0OiDD+/FwLE+T/8sRTqS1NJzCDOvpVgKWrI9u0f4vrVG5GeXyg8FIm8lNjWYwEZ9mjIkiocdTRqL7kKJz28BGes/iZO7fwBTur+IU7sfgUnbXoZp3S/hFM6n8epG5/D6cs2YubN92LQzLkI5pfS8wUlXmknGtRrGpEBuPzefLaD2jM/g3Y6UFYTcDGArSCVqRs79yZVmtVOlvisS64gcHBiJmVHeZddcFZcGD4UnzmMALpVYpU8IzKZCdEM0KWPTYO/NESfl8DKDoiLzWDoxvNVLF7Fc/PKq/DgSz8Va1+sZwagd5NvPOKBwF9b04k08j5MSx340oqfoGLD1qRMzJjI7fm8tpMuuRrHPtqG09f+F63hD2gtX5D1PWHzy5i76VWc2v0yzlr/XZy+dCNmff0+DJ51AlJzi6U93U3+yjTuuIMTFGe5inubaDjrPKz561aygvfIGC5X26V9f6+6x3sBaOmcZSbDf+zEzAsvSdg/4LbUS6KUvDRmuYtfqeMBdF8saJ5J+JZxCPvfzcMe4qhDGdfI3jgeThrXGjOvs5MghqY289HHzJV4m2T3t6iqjeVJkQ4p2s+Fv/2zTFxhcHDsxFa7S34vFjZZv6nhMkz57CU4kTbmKd0/oM36Io4nQD6ONuzcrldxcucrpC/jxM4XSV/AXALp42ljz6WNfdqab+H4OxagYmIdWd/pYl2ZhluLG9vAYqt4ta1en93pVLLSPkuewz3feg53fusHuOf5V7HgR/+DB3/xJp763TtY/vf36Xrswhra2Kt5c8u0FRW/dC1Q4bB4D/jqqo0IMq+wDtfEkuPHAjTHWjOnFKFoLW1icoWZyzmsPZ9EB27UM+JOxCYU3DQWToqFVFpbw4kCtBEzfIFj9jxaatLxJ2Ltn7eJpc/Jr1Xv7hGA7i0ZGhvikiYlHhn1+p8wvK5ZJcVsQ5cymlI2GTeMxYeFYyKttByTL/wiTly8nsD4eczZ9ENa2xdxbNfzOG7TCziWdE73izip81XSV3BC50s4gX7npM0/xKnrv41Ztz+A8JTpMEMBVVHhU9ND9h/my/ebPzKWLZSdg4vmP4Tbv/Mibv7uS7j9xZ9g/o//Hx77399i8Vt/xbI/bxd2xtXc5MJMehGej+jE+IU8cJZA+trla+CEUvYjwPeZmttaDn5brP2MxmKUrWuNM3ndA+iPPNWbb2QV24rjkvrcdtPEk6cjiaiYCQe9/e6hBGjmpZVExRlDZOOqxJWbNY9XN2zjvNvvIQBy+Q2SD3Fw8wbH6y576Enh1T0QB0OAXGTHCor7XTj2aMy46xGcQhbViV0Eut2u0kbteol+9hJt3B8qpf/n3zmBgPqELgJvAvA53S/gtK7ncObSDRh/zkXw5zH5OrnWMufNkUSWbcQ0T5j7cF6npCIlKwepObnIKChEbjiMggE1qDp6IsYfczxaL/w8zqXr89X2deRl/ET4jnkizOr3OCar279JHyVQLx42UkA4FBnhtf/hzfdGsDIVAx5rkQnRlevrULmhOS7BDoNzmSQPm+iRJ3KwxT0LRacO1mEEM8KmZ7ogpb/nrlDDDuGCuzh0tadnR2Bvk0kksbZLKESXbVU1z6sIpD4770FY/gAik4h0fNlwKyx8qppBVemo+6pk7FGYdc9jOLWLAJjWKrJ+rhIws/LP1Pqq9T5h00sC2nNpnU/hn7dtxtizL4Q/KxcppilhEzXxx1YjqXwxk24iZZaGWlsC6pTcPKQXFpGnWIqi6hoMHDcBE2bMRtP5F+Hc2+7Bdas6MP/l/8Hi379N4PxvoT1t11zjbGk/8v/eRMnAIarzVedOVGO
[Figure: distributed training step — 模型副本 (model replica), 随机小批量 (random mini-batch), 本地梯度 (local gradient)]
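The recovered labels describe one step of data-parallel training: every model replica draws its own random mini-batch, computes a local gradient, and the local gradients are averaged (an all-reduce) before the shared weights are updated. Below is a minimal illustrative sketch of that loop; it stands in for a real distributed LLM setup with a toy NumPy linear model, and every name in it (`local_grad`, `n_replicas`, the data shapes) is hypothetical rather than taken from the original post.

```python
# Minimal sketch (assumption): one data-parallel training step.
# Each "model replica" samples its own random mini-batch, computes a
# local gradient, and the averaged gradient (stand-in for all-reduce)
# updates the shared parameters. A toy linear/MSE model replaces the LLM.
import numpy as np

rng = np.random.default_rng(0)
X, y = rng.normal(size=(1024, 8)), rng.normal(size=1024)  # full training set
w = np.zeros(8)                                           # shared parameters
n_replicas, batch_size, lr = 4, 32, 0.1

def local_grad(w, xb, yb):
    # Gradient of mean squared error: (2/B) * X^T (Xw - y)
    return 2.0 / len(yb) * xb.T @ (xb @ w - yb)

for step in range(100):
    grads = []
    for r in range(n_replicas):                            # each model replica ...
        idx = rng.choice(len(X), size=batch_size)          # ... draws a random mini-batch
        grads.append(local_grad(w, X[idx], y[idx]))        # ... and computes a local gradient
    w -= lr * np.mean(grads, axis=0)                       # average (all-reduce) then update
```

In a real LLM setup the same averaging is what the training framework performs via an all-reduce across devices; the toy loop only illustrates the data flow shown in the figure.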
tvhu/B2+G4Zof4cDofvkhGIsVyR7h4AVdO0J2Qf7FNUoWP+mb9ARehgmWEzVs+21/HqvZdhaGe77NCPm6a5HIDjSxBCoVCVEOLssaV7ad/4KR0KvOX4xJGPochX5XYqWWlwpAt/ef5EDCd32g1lKYoyNxwO2+5GYCyXpLUHQNf1HxLRZ2XGLNdm4ojzfo/SmkaZYTNaUXkQ+pwV6Nq8FsPxHTJDH+D3+4disdjzMoN+aGxc/7SysrJrhRA3AzgNQL0TbWWClDWMnvj7mKadAnJ/xW1WEcLCP974Grrjm2zHIqK/RSKRGyWkxVhOSVsBoOv6IgB3QGKvQ3XTAiz41M1ZNcNfFo+vGPqcFegNv4HBXlNm6MV+v/+vsVhMWmWh6/oiv99/ZSKRuIOIzgNwENxdgZI2fQNt8KrFCFbMdzuVrPLa5pvxtpwee0FEn4zFYh0ygjGWS9J1E/b4/f4HIPFY3+DMZTj07F9B9Wb1ULEtqscHfc5J6N+xGfHOVllhPQDmxWKx38PGtOv6+voDSktLv+r3+28DcBmAw5Ad4/oCwGohxE+IqADAAXYDmt1rEapeiNJCPtV6PKI9r+Cfb18ha57L/aZp3iAjEGO5Ji39kpqmfZ2IfikrXnDmMsw/6xcgJS9eIvdLWCm89tfLEH37cZlhv2qa5q8nckFDQ0NlIpH4OBF9GsAiZMBOkxPwHoCVqVTqzo6Ojs0AEAqF5liW9RokFMqlhRrOPPxeFBdk77LUdBgY3o57XzwT/cNSXtgtIpofiUTWywjGWK5x/AZdW1sb9Hq970DSEazVjYdhwbk3QVF9MsLlDJFK4KU/fwk7Wl+QFTKWSCSmd3Z27m+moarr+jFE9BkhxJkAimUlkAa9RPSQEOKPpmk+iT30eOi6/jsA58torKp0Os5Y8Bf4PH4Z4XLOSDKOB17+FHbENsoK+VvTNC+RFYyxXON4AaDr+l0AzpYRyx+YjkXn/R6eQr6B7kliOI61f7gAfVE5N1AhxE3RaPSLe/r/NE1rGduH/xwA2bTjzTCAx4nojxUVFQ9s2LBhZF9fPFbAboSknQf1ygU4peV2qIq7h1BlGksk8PCrFyHctUZWyC4AB5mmKXWWLGO5xNECwDCMY4QQT8mIVVSh48jP/QUFJbykal+G4juw+tZPyFoimLQsa257e/sGAAgEAk0ej+fcsb34p8toIE0EgDVEtFJV1bsnuhmMrutfBjCh4ZB9mRo4EcsO/hUU4iEsALBECo+/8TW0dvxDZtgLTdP8ncyAjOUaRwsAXdfXQMIab1I8WHT+H/Jmnb9dPdtew4u/vwCWlZQR7h9CiHvHxvUXI7vG9T8AcKeqqiu3bdv2gY04qq7rLwI4VFJeOCBwIo6bcx1UJb+HslLWMJ5485uyH/4vmKa5GGnYz4KxbObYzdwwjJOFEKtkxJp5wqWYulDq9gE574PVt+OdJ/LyWNr9jutPhqZpM4noVQCFMuIBo8MBJ827CT6P1E0xs0YiOYBH138J4S6ph/MNKYqygDf9YWz/HOuDLC0tXUlEht04ddOPxuzlVwCUTS+e7quqn4feyFsY6N7qdirpMALgISHEt0tKSi7esmXLvbFYTNq6SACIx+M7/H5/AsAyWTFjQxFs2/EsmuqOh9dTIitsVhgY6cSDL38aHX2vSY1LRF+NRCIPSw3KWI5y5KmqadoSIrJ9ao2vqAJLv/xgzh3sky4j/d14+jenITHY53YqTnkBwEpFUe4Kh8PdaWhPNQzjSSHEUplBSws1LDv4l3mzWVC05xU8/sZ/yVrqt6sHTdM8HZJ6fRjLdY70AJSVlf0awAy7cWaddAWqGvLjpugE1VcEb2Eptm9y5ARBt2wDcBMRXWia5s9jsdjLO3fuHExT26KoqOgRRVE+BUDaUpSRZBybovdDCAt65WGgHO3tEhB4c8sf8eSbl2IkGZMdfnsymVzR398flx2YsVwl/U4TCASaVFV9HzaP+q0MHYKFF/yRj1O1SQgLq2/7FPrMt9xOxY4eIrpHCHGnaZqr4fIbnmEYx42daCm9gG6oWYpjZ/8s5w4QGhzpwlNvXY6tO5wpRono5ZGRkWM7Ozu5AGBsnKTfwMrKyi4lIltdpKSoOPSc61Hoz6bl5ZmJiFCuzcK21+8DRFb1jKYAPEVEPwTwOdM074vFYhkxoSEWi20uKyuLAzhRduy+gS14O3w3PEohasvnZH0BLISF99ofwCOvfxFdsXedbMpQVfXogoKCewYGBva5twNjbJTsAsBTVlb2B9jsHq2fezoaWj4uKSVW6K9Ff/dWxDrsn6yWBmsB/AzAeaZp3hyLxdbHYrGE20ntLhaLveD3+0MApI9RpawRbOt6Dtt2PIfaslkoKcjOQnh73xt4bP2X8NbWPyGZGkpHk/Wqqi4pLCzkIoCxcZA6BKBp2nIiesRODFJULP3SgyipapCVFgMQ37EZz/7P6bIOWJGtTQixUlGUOyORSFZUKQDQ0tLijUajqwCc4FQbCqmYFjwV86d+AZUlU51qRqqe/g/w2uabsSn6oFs/b88nEomTeDiAsX2TWgDouv57ALYW7OuzV2DemT+TkxD7N6/e801E35a64YodfUT0oOz1+umm63oxgMcwukmSY4gUNNQsxaFTv4y68jlONjVp3fFNeL3tNrwXfQiWSLmdzpqRkZHlO3bskD7bkLFcIa0AaG5u9vX09HTAzp7pRDjqC3+Fv+5AWWmxXezseBfP3fwxN+cCJAA8SkR3er3eh9ra2tLSL+y0UChUZVnWPwE4vlUlgRCqPhIz9DPQFFgGjyJtX6JJSaYG0br9cbxr3odI1xqIzKrjuCeAsX2QVgDour4MgK3Xy7ppi3HYp34rKSO2Jy+tvBidH0g7cGVciOhlIcSdlmXd1d7e3pnWxtNE1/UaAI8DmJuuNn2eUkwNnIgZ+ukIlrdAUTxpadcSSUR7Xsa75v3YvP0fGEn2p6XdSeIigLG9kFYAaJr2/4jov+zEmH/WL6A1L5eVEtuDyJsP4/W/XZGOprYIIVYS0UrTNN9JR4NuG+sJeAwSzwwYL6+nGFrFYQhVL4ReeThq/DOlrSAQwsKO2EZEul9EpPtFRHtfRiI5ICV2mnARwNgeyOwB2AjgoMle7ykoxfGXPg3Vw8ekOimVGMKT1x2DxLAj98I+AH+1LOuP7e3tzyEPD2Opqanx+3y+e+HgxMDx8HqKUVHchIqSprF/TkVpoQavWgyvWowCbzm8nmIAo3vyDyf6kEgNYCTZj/7hdvT2t6KnvxV9A23oHdicbQ/8PeEigLHdSCkA6uvr9VQqFbETo2H+mZhz6g9kpMP2Y/0D30P49ftlhUtidBLcnYqiPBgOh9O1K1/Gamlp8ZqmeRsRfcbtXNi/4SKAsV1I6SO0LOtIuzH0g0+RkQobh9Ahp8kIs0EI8fVUKmWYpnmKaZp388N/1Lp16xIY3bKYZZbFXq/3kdra2vw8fpGx3cjaZmyRnYu9BaWorE/b3Km8V9UwD54C26fP3RONRq/v6OjYLiOnXGIYxiFE9C2388hyUSJ62YG4XAQw
NkZKASCEWGDn+sop6ZvBzABSPKist72BXYuMXHKQRwhxGwCv24lksQctyzpkZGTkWADPOxCfiwDGIKcAIACz7QSoaTpcQhpsImqabNVsABcAe6Tr+jfgwiqAHDEkhPi6aZqnt7e3d3Z2dsZTqdRyIcTTDrS12Ov1PlZTUyPtVEfGso3tAiAQCDQBKLMTo7rxMLtpsAmqtl906cFgsFZGLrkiFApNA/ADt/PIUi8IIeZHo9HrscuukB0dHf3JZPJUONMTsMjn8/2dewJYvrJdAHg8nll2rvcWlMIfmG43DTZBZcEZ8PiKbcVQFIW/cf9CqVTqVgBFbieSZToBXGia5pHRaHTjHr+gszOeSCROAg8HMCaV7QJACNFk5/qSmsasP/I0GxEpKKmeYjMG8Z7NYwzDuJiIjnY7jyxiEdGdAGaZpvk77OcsCB4OYEw+GQVAo53rS6ptXc5sKKm2VbsBAPcAAAiFQoYQ4qdu55ElBID7iGh+JBL5jGmaO8Z7IQ8HMCaXjFdvW+f2cgHgntKaRlvXW5ZlyMkku1mW9VsA5ZLCvYjcyAn8XAAAIABJREFU3EHRIqK/EtE80zQ/GolE1k8mCA8HMCaPjAKgxs7FpVwAuMZu8UVEATmZZC9d188BcKqMWET0iGmaC4lophDiJgC5sLHSIIA/KIoyNxKJnDXZB/+uuAhgTA7bBYCiKLYKgMKyvH+GuKaoPGg3RJ2MPLLV2AmA10sKFxNCfAEAIpHIpmg0+kVFUUIAvgbgDUltpA0RvQXgqz6fTzdN87xwOPymzPhcBDBmn4w5ANV2rpewIx2bJI/P9mdvq/jLdkT0S8grgq40TXPrrv8hHA53m6Z5g2mahwghDgNwHYDNktpzwhYANwJYHIlE5pim+eu2trZepxrjIoAxe2Rsv2dr2ZNqcykamzwJn33eLnkLBoMnCSHOlRTuedM0f7uvL4hGo68AeAXApZqmtRDRqQCOB3A45PweT4YA8CqAB4nowUgk8nq6E+js7IzX1tae5PV6HwGwWHL4D4sAPkCI5SQZNw6frQS4B8A1Ej77Qhl5ZJuamhq/oig3SQo3BOAiTGDiXzQaXQdgHYAf1NTU+AsKCpaObcd9GEZ3IXSqZ2YHgLVEtFYIsdbn873k5Bv+eHERwNjkyCgACuxc7LXfDc0miQuAyfH5fNfC5uqXDwkhfhSNRt+Z7PU7duyIAVg19gfA6PHcQojpqVTqQCI6kIg0IUQdgCCASgClGB3+K8focc4xAHGMFiM7iShuWdYWImolos2WZW1WFGVzJBIJT/5v6iwuAhibOD6Bh9mRi8vV9skwjIVCiC9KCveGruu/iEajksKN2rZtmwnABPC01MAZrrOzMx4IBJYrirLKgU2ZPtwsaPlY0cVY1pOxDHDEzsXJkQEJKbDJSA732w2RV29D06ZNKxg76U/G701SCHHBunXrEhJisTEdHR39lmWd4tCOgYt8Pt+jvGMgyxXuFwD2H0JskiR89nn1zRscHLwKgK2zL3bx32Nj+UwyLgIYGx8ZBYCth0ByJK+eIRklZb/3JW++eYZhzBVCXCYp3CZFUa6WFIvtAW8bzNj+2S4AiKjbzvXcA+CexLDtHvx8GQLwCCFuh5w5M0IIcVE4HM6FXf4yGu8TwNi+2S4ALMsa92EeezK0s8NuCmySJHz2O2Xkkel0Xf8mgPkyYgkhbo5Go8/KiMX2j4sAxvZORg9Ap53r411tdlNgk9Rv/7PfIiGNjGYYxnQAV0kKFx4eHr5cUiw2TlwEMLZnMuYAbN3/l+ydhIcQm6T4Dnu7yhJRq6RUMpUihLgV8nY8/HJ3d3de9Jpkms7OzngqlVru0MTAD5cI8sRAllVkFAC23gL7bT6E2OTFd7TZul4IkdMFgK7rXwBwlKRwfzJN8wFJsdgk8OoAxv6djCEAWw+BeFcbhMi7/WRcJ4SFgW57Pfi53ANgGEYIwLWSwu2wLOu/JMViNvDqAMb+xXYBkEwm37Z1/XA/Yh2b7KbBJmhn+7u2N2FKJpM5WwAIIX4LoExSuK+1t7fbmivD5OE5AYyNsl0AdHR0tMHmbPCutpfspsEmqGvzi3ZDmB0dHdtl5JJpNE37FIBTZMQSQvzdNM0/y4jF5OE5AYzJmQMgALxpJ8CO1rUS0mAT0dX2sq3rich2BZGJdF2vIaJfSgq3k4i+ICkWk4yHA1i+k1EAgIhsPU26t66DsJIyUmHjYFlJdG991VYMIcQLktLJNDcAqJURiIguN01zm4xYzBncE8DymZQCAMBqOxcnh/vRs+11Samw/enesk7GDow51wNgGMbJAD4hKdyzkUjkFkmxmIO4J4DlKykFgMfjsVUAAEB4/YMyUmHjEHnjIbshEoqi5NRBNlVVVWVjE/9kGCKii5CHxyVnK54YyPKRlAJgy5YtUQAb7cRof/txpJLDMtJh+5BKDKF94xN2w7yYa3vZFxYW/hRAvaRwP4xEIry0JctwEcDyjawhAAghHrVzfWI4jo53npKVDtuL9o1PyOj+z6nuGk3TjgIga7Leek3TrpMUi6UZzwlg+URaAUBEj9iNEV7PG6U5TcZQCxHlTAEQCoWKiOhWACQhXBLA+evWrUtIiMVcwnMCWL6QVgBomvY0AFtHA3d+sIY3BXLQzo53scP++v+NudS9bVnWVQCmy4hFRD83TfM1GbGYu7gngOUDaQXA2FuPvVd4IfD+87fJSYj9h/eeuRkQwm6YnHn7NwzjEADfkBRuk9fr/ZGkWCwD8NkBLNdJKwAAQAhxt90Y0bf/gf6unD9lNu3ina3oeOdJ23GEEIdrmnaohJTc5hFC3AHAKyGWBeDCtra2IQmxWAbh4QCWy6QWANFo9EkAETsxhJXC+8/fKikj9qH3n79NyqFLRHQ0Eb2s6/pD2VwIaJp2GYB5ksLdZJqmEw8IlgF4dQDLVVILAADJsbcqWyLrH0Kf+ZaMfBiAPvNtmG/9XXbYU8YKgcdDodAC2cGdZBjGdCL6vqRwW0dGRq6QFItlKJ4TwHKRjJnP/yYYDDYqivIBbBYXFcYcLLpwJYhk1yj5RQgLa277JHrNDU43tUoI8cNoNPqK0w3ZpOi6/gyAxVKCKcrJ4XBYenVlh67rDQBmADhICNGoKEqdEEIHEARQgtFTDglAxdgl/QBGMLqBUReALiHEdgBbAbQR0eZkMvn22MFftieRZLPa2tpSr9f7CCT9/Ozm+UQicVJnZ2fcgdiM/QfpBQAA6Lp+H4DT7caZc8r30dDyMQkZ5a8tr9yNtx7+cTqbzOhCQNf1SwD8RkYsIloZiUQ+LSPWZNXV1QU8Hs9CAEcCWAjgYABOvUnuBPDm2Nkfqz0ez+qxTcDyChcBLFc4VQAsBvCc3TjeonIc/aUH4SupkpBV/hnu78YzN56KxJCt05onK+MKgbE347cg5wG5HUCzaZo7JMSaCE8oFFpsWdYKACsANKe5/d1tFEI8QkSPVlZWPrNhw4YRl/NJCy4CWC5wpAAAAF3XXwBwhN04dQcehcM+cSNAjqWak4Sw8Mqfv4Tt77s+Ny1jCgFN0x4
mohWSwn3CNM27JMXaH8UwjKOFEJ8GcAaA8jS1O1E9AO4XQtw9NiE4p4/45CKAZTvHnqqapi2XsTsgAMxc9g1MXXS+jFB54/3nbsO7T13vdhq7crUQ0DTtXCK6U1K4h0zTPE1SrL3Sdb2BiL4w9uAPOd2eZBEhxB1CiNvb29vb3E7GKVwEsGzm6Gu1ruvPY3Rs0hZSPFh43h2orJ8rIavc171lHdb+8XOwrIx8AUt7IRAIBOpUVd0AoEZCuD5FUZrD4bCt5a77omnaEgBfJaLTAXicaidNLIxuHnVdri6V5CKAZStHp9hblvUdGXGElcSr916GoXi6h1uzz3CsE6/99VuZ+vAH/rV8MG37CKiqej3kPPwhhLjcqYe/ruvLdF1fQ0TPEtFZyP6HPzB6jzkdwHO6rr+gadpytxOSjfcJYNnK8YF1Xdf/BOCTMmKVBQ/CEefdAW8B/y7sSXI4jjV3fDbbzlNwtEdA1/VTIWn7YiHE09Fo9FhIXgqnadpRRPRjAEtkxs1gayzL+nZ7e/szbiciE/cEsGzjeAFQW1sb9Hq9G/GvNce2VE05FIefezMUj09GuJwhUgm89OdLsKPV9mE/bnlCUZTvhMPhl2QFrKqqKissLNwAOePng4qiHBIOh9+TEAsAYBhGCMA1Y2P8+WgVgEtM09zmdiKyBAKBEkVRVhHR0Q6EXzMyMrJ8x44dMQdiszykOt3AwMBA3O/3DwI4SUa8wT4T8R2boc08njcJGiOsFF7962XY/t6zbqdix1QhxOf8fv+hpaWl78XjcdNuwIqKiuuJ6FgZyQH4biQSeUhGoMbGxsLi4uLvAbgLQIuMmFlqOoALSktLB+Px+CvIgU2G+vv7E0VFRX9VVfUoAA2Sw9erqrqksLDwnoGBgbxYbsmcla61daqu668AkDaLT5u1DHM/+lMoan73BFjJEbz2t8vRvvEJt1ORSQB42M7QQDAYXKooyj8h52d8nWmaR0DCsrZQKHS4ZVl3AJhpP62c8joRXRKJRF5wOxEZeDiAZQPHewDGiLKysjcAnA9JRUe8sxU9W19DYOZxUPN0OCA5HMfLf/kSOuWu9U8CeApAE9JXIO6OAEwnoosm0yMQCoWKAPwdQLWEXBJEdEosFrO14920adMKCgsLfyyEuB1AnYS8ck0QwPl+v1+JxWLPIct7AwYGBkYKCwvvcagnoEFV1aO4J4DZla4CALFYLOz3+1UAS2XFHOw1sX3TswgcdAw8BSWywmaFofgOrL3zIvSG35Aal4i+b5rmReXl5fcJIWoAzIL7hcDn/X7/4vLy8nd37ty53xn4paWl1wA4VUYCQohrTdP8i50Yuq43JBKJvwM4Bw6vvMlyBODosrKypRUVFf/YuXNnVo91DwwMjBQXF/8vES0iokbJ4RtUVT26oKCAiwA2aem+sSu6rj8G4HiZQQvLAph35s9R1TBfZtiM1We+hVfvuRQDvXJXoxHRM5FI5DgAqQ//WygUOtiyrO8COAvuFQK72udkQcMw5gohXgLgldDWuz6fb25bW9vQZAOMrUL4A4BKCfnkk04hxGei0eijbidiF08MZJkqbT0AY0RVVdUTlmWdC0DaWr7kcD/MN1ZB8fhQVT83d7cNFgLvP3871t93JUYG+2RH3+71epf19fX928EBO3fu7IjFYveUlpY+TEQ6gAPhbiGw18mCLS0t3lgs9ncAuoR2LABnbNu2bfMkrydd168G8FsARRLyyTclRPTJsSGBrF4uyBMDWaZKdwGAvr6+uN/vfx3AuZD4IBHCwo7WF9EbeQu1ByyE6sute+5wfzdeu+eb2LruHghhyQ5vCSE+Fg6HX9/bF8TjcTMWi/0lQwqBPc4RUBTlSkjacwLAb0zTvHkyF46N9/8BwJeRGb0m2YoAHO33+6fEYrGHMVqUZSWeE8AykWs3J13XfwDgKidie4vKMePYr6Kh5aysXyoohIWt6+7Bu0/e4NipfkKI/4pGo7+ayDWaph1KRFcBOBnuP+QEgMcAHAOgQEK8LSMjI3Mm063a0NBQmUwm7wdwlIQ8Jq3E40F9YTHqi4owpagEocIi1PkKUaSqKFQU+D1eFKmj9f9gKoVYMoFBy8JQKoXtI0PYNjiALYMDCA8NYtvQAPqTru8suQrA2aZpDridiB28OoBlEjdv3KTr+q0ALnSqgXJtJmaf/D1UGHOcasJRfdGNeOvvP5Y+0W9XRHRLJBK5eLLXZ+AcAduEECdNZuw5FApVWZb1GIC0bHG8q0JVRXNpGQ4tr0JLeSUOLCmFInEozBwaxCt9PVjX141XensQT6W/IBBCvEREJ7twBLNUPCeAZQq3b9geXdfvA3CKUw2QokKfvQLTllyE0pomp5qRKt7Zig9W34bIGw870d2/q1WmaZ6OXSb9TdZYj8D3Mfq9dPvnatKEEH+MRqOfneh1YztePgGg2YG09qhEVXFMTQAn1AQw218ONU1zX1JC4M1YHx7rbMcz3Z3p7h3YoKrqCdu2bbO9UZSbuCeAZQLXb9S6rhcDeBzAIifbIVIQnLUM05ZchLLADCebmrSd7e/gvWdvQcc7Tzr94AeAtYlE4njZNwlN01rGhgaysRDoIKLmSCTSNZGL6urqAh6P51mM7mznKAKwoKIaJ9YGsbiqBgWKu0Ncw5aF57s78VhnB17q7UrX4v0NHo9nydatW3vS05wzuAhgbsuIG/RY1+mzSMfbExFqpy6EcchpCM48HqpHxpDx5KUSQ2jf+ATC6x/Ejs0vAiItt9AXCwsLl7e2tkpfSvChbCwEiOjsSCTyvxO5Zuy8gacBzHMmq1EqEY6ursO5xhRMLc7MPS8+GOjHynAbnu7uhOX8z/FqACfwnIB94iKA7VPG3Jjr6+v1VCr1D6SxC9VbUIrAzOMROuQ0VDbMg6Kk5/RVYSXRtWUdIusfRPs7TyI53J+WdsesGRoaOqm7u9uZGYW7yaJC4IGx4ZBxa2xsLBweHn7EobFcAKMP/hNrg/iUMQWhwuxY2bJtaAArw1vw+I4OpJwtBFaZpnkGJGzR7CYuAphbMuqGPDaD+iEAR6a7bY+vGJUNLaiZejiqGxegLDhD2goCISzsbH8XXZvXoqvtJXRvWYfkiCsvLs+NjIyc7MYEoQwvBPpUVZ01wXFl0nX9LgAfdyqpZn8ZvtE0A9NKsvP46039Mfy/1k3YGHe01rzDNM0LkeVbB3MRwNyQaTfiD+cE3AVJW7lOlsdXjJLqKSipbkJpTSNKqhtRVB6E6iuGx1cMb2EZPL5iAEByZACJoZ1IjgwgOdyPoZ0d6O9qQ3zHZvR3taG/a4tbD/xd3Z9Kpc7t6OhIa3fD7jKxECCiayKRyHcmco2mad8mop84kU+Zx4uLG6ZiRZ0mdSa/GywhsGp7FLds/QAx5yYLXmWa5tVOBU8XLgJYumXq3cUztkTwPLcTyRG/ME3zCmTQRioZVggIAKvGTh9ct78vNgzjFCHEA3BgX/9FldW44oCZKPfK2Mk4c/QmErj2/Y14sXdC8yvHyyKiZZFI5CkngqcTFwEgTdMaVFVtsiyrCUATET
baPk3yH0NV6tbA65ZlS7fL9w8RvSmuciDHj5L7alFOAF3Eo7ASADQnD/M1QHM8Or0hnLBiITo5xUDRoGF4+v+9hvb3NcH+jr3yyOvGIapVtNZxCbG26FAIATRP1y4ZOkSVBvrMBI1Oprxe6YSJOGvVs5JPOIGrNToPEUB3ccLwBZzVzWt7PUwnqA6lBBY0r52U5ekhxLau/nCJmdwZiP0J0FaeH4ULauXwLdEAXewB9McD0HvigPMRBdBrmaehUIOimt5tGzHTnd2mDaFIzELj+Z/DLc9+F+1/3iI1nzwKabFuWtm3WiPCu8GUohzXJIB+8MUfI7O4IG6W39ING24Iadixc3H6xh9g7iEE6BOkHfxFiV2eyEnIux+GPzMzYRWCG0MM+Fw3XtOY+hStpWPa8n8hw0rYRp5siEMI/DMdlNw5FVUbadN214mlfCCADm9oQuq0Yp1PMBI2VvF7D2TlSk3vHd98Dm1/3YrVO3dj6Y7dWLhD0awu37YnYRehNCzR2t7/0k+RWVqs68KNxABN99LI40/CmbS2nPQ9VWLHLx7CQ5jWlyz1Y+Y9jmBWXqTULt414XbzFDcMo/lehIrT1Gx5phXpYeiXCdtsmWdZKLp3ipRGFm+cLnwrPLOwwgPo/gfovYkAOlaPIIAuW00APblAGjUMIwrQpmstRjqt+Ib1S3ssc2dMnHsyrn56GZ7+xetY9c77WLljb9wR96oVfLdMXGGAnvdf30dqXmIAjCTf6HCYdPb5OKPzBekCPFSbl1uJT9MdaMdvegVzn1qF9JKyXgFa1Yv7pRnjpMuvQe05n8WI1mNQOmIsUgtKYAUCmk3N18vn7HuFCFOEMn9wydcno1rqm+MDdJGekJMvddAE0OuakDapUGqze4uTSqJTuigdpBUUYsqJp8oMxUWv/U7mTK7i6eQJKzv2aq5r4I5vfB/B3GzpSE2U8JbD3h/EpIu+iFMYNJ/5sVTkHEqAPonuoTn0Wsc9sQqZRRWKJCkR0PGetk0MnjiJ1vYqNJ77WYyccTzKR49HelExrFCqmtDN19Rn4aBpIKQW2kTJHRNRwdU5Er5qkiqOCi9J2L8AvfvN1QTIbaQrRHdr5a/37qN4YwXe+METCqB1ktA8HAC9sgmZ4/OkuYLfgzCdGW69aBSU5BqYVoRIX6oUUtNQNmosjrn0S7huTRee/PUfsPLtD6QTrW2Hopxs36a6zWS6Cm3i27q+gVB2VkIyGSlXk+GaNqZ97ks4RZoPXj6kMcrTaQNzVcixm1/FCUs2ILdm8AEAWlXy1J9zATrocGp/5wOJrS/69e9x9/dewuV0cB1z+dUY1tCM7HC50GSyW+8It3E0LGLpFmg3rp3owOKwjz/VQvjmSQTQzFJXF7eTUFVyNOpGFQLoVfUIjcuRv0/YlBGpwTeilAPsHaRmooJA6djLrsKN6zqx+Je/xSpaW+YZadumyiWX6tb1JWRdL30f+FrnN8XL8pu9lCoyeAdTUH/V1zCXrvfxm35MB+QhtqA7ue79ZZy0dCOyKwZFqnAS0unSe5x5wUXoeuc9up/fx4q/bMdCurfnff8lXLG4HXOu/ApGNs6gta0gMA9EwnUuj7kZU/FjxpaMJhrtlWai+NaJqJThvqpVn0McngXd3xb0G6vignE83UMA/fpzPQG6t86mQwfQjcgWgDalisNljXN8fTv93RruQFYOSkePRcO5F+CiBY/i9v96Hk/86vdY9bcdWL1lF9rfA9a+vxd3P/NtAujcAwC0Ldbc1M9fIcxlbNkeMuuqi0u7nsfJXQTQ9DonL+tE7qChCQFaDSTVjR2nnY2V7/5bqhzYe+A6b47Ps7bxlJk/vo0Hf/gTXPLg45h6yhkorBoIvz9FVchwbFlinbqh5QAAbadYKCGAruTyq45aaWiIV2ZXvqEpAtBlK+sQHJ+tS/+SqfmN3necewhkp6Fi7Eg0nv9ZfP6hx3DXt3+Ap1/7vTD7rdnyb6xhatEPgBs3PotQSo605icsG7QZTEKov/pmHP/sj+mA/LEcjocyz3CCNMG8jNNpbbOrhvTKE23ryqLWz1yEte9+oMoLt6tDiZPj7Cms2vYh2v7vbTxCa3v5Q09g2qlnIm/AIBiBkC4R1WPTIjXRpqps8ZnxATCVAXqSAujOWtJGSQR7Meh+Bmi80a5rn5cCry/pVfe+sfzIAOjVTciamC8AbWmAtiItsQcuEXKVb8oIn7PfQWpeDspGjMCEY47DzM9/CefcMR9XL1yGi267CymZB7Cg5XoYmHLe53AKA3T3K4d0A5++8QUJcTAb3gkL1yKzojohQFsRgPahjtzflVs+3G+0l0u/uoTpSEmZnnT137bioR//P3z2vkfI+pqJYFaB9pqiQwgSATTX2VpkQRfeMlGShL0BNNORulUcFWxBT8hJ2hWP5UaxpF5bjxrjnweZ5D4fpby2s2bjmEsuw2fvmIcrn1yKz9x0G1JTsiRmLtNuElHHWn5MvuQazHnmFQLoH8nheEgBmsm0CKBPfWoNMsNVumw0/qFl6/v6mM99kQ4ftbZqFuVe4XxWLe1cgriLDiZa25270Pb2e3joR/8PF9/3MMY0z0JKdoEmHDMiREmu4ZPQgr5dAXRRlwZoL0nY3wB9km5UWaxAmmude1MC6DeefwpDKg4zQK9tRubUAt1Q4UQY1KwkY6dq1JQZ6aRy3WWfJkXnkIURCsIOBuEk4NdWDQBmpElm7EmnCQ/w3EMZ4iBw4DrrU3UVwfELnkYoNz/BRIyo68oW9DFfvFI6KfcF6GVb9khTDjd8LNqBaDUEW9ccq//D33Hj+k2oO/MzSMsvFGvL6YWLg8vsrHQbhbdPRhlt3DICaI4zxwNovi94LBaHQcpW1yN1UkHCMrs+0eHqteTmGr8b3urBRWKoOmuH9kEgKIlTy7B1GVv8EAcfOKNPPQ8ndz8vjHSsJxxiC5obneYsWIiUnAIdg048aYXvwROvug5rtu3SPNh7I81WbsOVTAOS9VVliRzSW8ukUWRZX0drO/30c5CaW6jufbNnB+F+ScIME4V3TZIkYUGXOzXHA+h+AWjuy3cB2g1xSByarOnedO+bKwmgn9at3ocRoNe3IKMlrAaWaoA2jQNRpcZL7NkRdQe1Sj0sF+K7VqfVuyXh04Ngbd08UTZxCk5b81+aRvTQbF4Gf04Schz0VHqdhmtvhc1JPl9vh5FPaEnP+/rdUu0gVQ4HaoneGuWDXrxzL9q278L6v7yLYy6+VFGcJrA4lXtsw850EL57GsJddWId5yWwoMMbG5Hb3SSxzNK1DcioDauEVhIAbfp68mZYkXmJlqYLdWdEKiA29xvXlbjE0BQeDgvV05px2vr/kgoa7iI8lGvMz33appfQdO3X4feH5H40zF7IkgIOPnvvAqzj6TkaoF1KXTUNSFnSEtraGv2e74PF73E8fg/Wksd0zEWXqHFsZi8leXztchwU3j9N4s75GqC9Ko6D4OLoYUGbAQGUKz57It75eRve/tmyPumW/1mC//nWwxhUXnRYAbp04wxkzR2gane5zO4jdsBFeCgia
uquQEuNMfJFyYQS8jToVmBHW+Lp5RU49rGVOLH70AM0txjz47CTzo5O144bgtEcFmQtfmXxCpkxuCgRQMsgAz3MQGskYbp9Dza8+wFmXfwFOZQcI5FXoWLyTl4AFQuYgL9O5hH2CtCbmslVJoDuaEb2jIqkAdqK5eiwoi3KksSO0ciQAcvQ5ZHqe6uXHAbH3EN0v2dVDsacp1bijC4F0HMPKUD/EKdtfAHDTzxTrnOagGb8/cb3oJ0Wwg0r12Htjj1qRmFkureaQ7n63b3Cf71im5pO37Z1t9Y9Ymnzz9ds+QAzLrhYJQ+N3gHaXxRC+NFGOVTzuxVdrAfQBwHQPVVxaFSG81E/cXjf9Gh+HIbJYwcjLaQy/D7DPCx80GUbW5B71iBZAMsIqCoNI/HsOE4A+TXfRJRbIM7CRqY0m9K0YcUAdG8gL40ZPk0un5qB2su+gtOYzIjdYGkHZsayF4QQpz/avU9lvgZ2f7vp+8UbkDtkjAC0mbDdW00RTy8swX3PvSxDc5dsT9xF2UN16zRvdnaRV/39PUw8/gQ1YVoOs3jXxJYyu2BpCKVPMBcHH6rTxQWOH+KoR84mvjcaUE4AnXN6TcJDNxLKinQ+Gvpw0payoYmAYmLS8YZR+Mye00rMXqhzTWn8oNcJpmD6FdfhjO4XtAX9Kq3HK5oW9oeq/X7zi0mHPuYKm90L0hnK33PoiifqnLxwLXIGjVD0BQkmmruAmV0SxoKXfoalO6G9o70RL2hFxIreK81aPLqNdYUejSWdtfx7f3sPR81Ra+tSJRh6H8R6KPz/wep01Dw5Q1XgCEDXoYBAupQ8IQ+gDxqgE/ME94kP+jCPvOJpKgVfGA4roGp7Lck2Jz7xLQ24Rgz5j9kvLc16crOOc0oc0wqhbPxUnNK2WcehX5KmgxNpU5/Q/UL/WFgdL+A4AueTN72Auiu+CiuUoTh/e6moYG6MAUdNwbI3/48Aem9Ss/pc4JYStbfexsBJk7WVlSjE4af1sJE6NAuly2aQdcwAPS3uVG93okqOns7BTIX5l4yEbccPWalksBH1BvUgVdUtp4dKmNF17p81jk4wL504Dae2bZLpKZwsZJBmAituGpIxZptfSAqg+Z44YdPzBOwvCLjLUAd6bs5j1H3pWvJ60qLT3c3Ee23Y1Fos/cM7WCgAvTuSIOzzpG+mYH3rH6iaOEkDtCFTZKThJc6BkDImB0OXzEaY1os5vIs31hJAN6GEPSEPoPsHoJMF6SNlaGxpZzOKvzYeVrqlmLkMFTdOSH+pXV63FM/UpVP9sYFjs91spVtmCFZqDqZfdaPUKis+51fE4jqpH1xijk1y+dVc2thnLF6HojFHS2zS6i1hJ00Kfsz+3Bew7t1/SidlcvMYVRyaCace+e9fCTWnlLX5EnFDOAKWmZMKULFmJrnAdVJCVxyHclRVcdQhl2kqmfC9qxkltMGtkJEgpm4q4DDMiFXsVqn4YypL3HBVf8zmdOfySet0ejbqr7kZp3S9ILXufAif1Omy0r2YFN+3rOemF6Rt/2QB+R/Rc/4Ixz3zY5y+eD2KRh4lbI12HwB67uVXS/kgh66Wbt/VI0HYF+Xa/0d+/Eta2yqJuUtnrM+JNGD1mCBEmlEXxqCVx6CYQLmwazqtY60csOFNR5wF/Xu6Byrp8ZMJ0J/Iqd5dTSibPwV2nl+FF3Qrc6IJH/60EIKZmaqtWY+JMqxoMf5BMXtFYrymWI3s2rOrnTuYp6ksw1kbn9NdhT+SwbAn9QuRzks4o+O7mHj2Z6WhxG00SJhAsx0EMnJxfft6rNqpwLktic27LDKTEbhj07foWmaoipEEtcouYX/2MWUId7QQ8Cq2s0SE/RzikCQhWdjMfFd292RYuXZijmOpQjKlxTmUmgZ/RpY0JLmDZLkqwyHPimfomabRL14SJxCDEtaxkT98LE54bAWBtAJjJk5yKWFPTpZSVntVJ3e+ipPpHuHnOaXjOYw760JY/gyEjIC030eGK+wLQFypkp6Br63ZKNU2PF+xjWuetyQH0DxY9uaOZ+BkZao6d5+luaN1U5KbxzDUJKacOdWoWD9TEfVLlU6dTFc5AgH6D5ZlVfJAbQ+gP7ahsQ0of7oOoQHpmsM4MUDz+ygfPhJfeWoxjrv0CpRPmIpAToHUvMajD/3IIC38B2raisRP/UEMbJiBc5dukI3M1vMpnf1DTXl65w8w8/rbEcrJjxwQCW9UUw38HHD0FCx+7fdYunOPhDfatyQH0Fx+t5YA+qJ7F8jEDVW2l6Dsiy36gIWi84fJ3Dq2rkplaGxTgiRhA/LIcmY3uYjc5QGPNyJYlRaXoU9dZzsyWbx6wlH48tNLMfvzX0L5qAkIZNI14byE/I6VZGVPoqSY4lLndvAQx6KdIIbMOB5nrNgkibwTpLX/JZlJyJO7k1nLUzrVvcGTWTjp+Jn130bLl29CMDef7mse0mpHkp9mvOk99PkGTZyMJW/+CUvEet6NVVv/Teub3NxNXtsL7rxXqnMczdDo0+x+7uAGW3ce+oImii8ZidLuZuR11+tmIzXyKrzpiCPs/wOB85EB0GzG/2cAdD2qVrYi66gizUJm9ArQGSVlePD5V7B227/xyJt/xg2d38CpX/kaxrW0Ir+sHP6UFOXK7TNU1dXYqTFmHI2OF3IiCapUsiDtQAjDTjgDp6z6hrR/n7JPa/BJXS9q3T+DH4ljynRvxVzH8WweszT7tgeQVT5QxV111YLti18mJlUmjoPP3Hw71tPn56kybEGvfDe5CSSceFr9zgdoPPu8COd2QoDmuuI0G+W0UQs2NSg6ygMANNdBsyVW0DUNA5fPROaE+ORU7tAJt+Iip2oAHnn1f7Bu27/wxK//gBs7nsUpX7kBY5pmIK+8Cv5QSmQKyH7r2wtFa89OPVvxujg+4fxmkLaC6Rh52vk4o30zWb/PS7hjTverAtL7TlU5QXs+J0bWW/NI61p2DoHNfvbHZE3/EDNvmYfM0ko5VF2Kgugh3HNyuyTJbRsX3n4XgfKHmjtmN1bz1G7N2tin9eUE4j/+hWmnnqEauOQ6OxGANrQBpKao02tmB1B16xQUb6pD/ibFUli+oVFCWEciQB8xFjS/CdICx3F+blnWpxag+UYQTujTamTKsC1MXmbCmDIPyLz04Sex4j1yAXeStcCDQrnU6M/b8MhPfonr13Ti3FvuRPM552PotFqEBw9BZlExAmmZ8PtTYdtBudm4ldu0LB3zVm3PUVDUoRLT7e5SFJ5mKIhBs+fi5IVrcFo3xy0JbLuex4kbf0CA/Txt0Bd0w8krWmlzd/PcwRdxCv3OiR0/kOoAnkV4+obvoumrdyCjtEpPX46tMokd9GlEkpf8e+FhY/DYf/8S7VxiJVSqqsQqqRl+O3ZjEVnglaPGRQ6tRKGhIHkSwdJUlD/ENdC1BLqKTIfXP16SUFV41Arhewlt9CErZ6Pg5JoIOyGrqiQw9SFoiUUnMedQGi5f2KZat3ldyVVfx003f9qG+//7V7h25QYZilp75jm0tlNRMmQwUgvz4U+jdQ0GYDiq
g1C6CHltCfAsx4qEFFR5nhlZW1OX3UkMPDUTQ489CSc/vVol+zjP0MlhCpVz4GTf3E2K3GruJp6AQ2tOetJGVnUPHL+Jub1fwWkbvoNmspzTmStjH4MglrTJ4riwpQ5mi65D6fDRePJnv5aWfZX41ZUaBxia26ZJwbgyhxuSHvvf36F06Aid/O0Z+uN7KsD3vBmUstZgRSrKnmAOjgYJa0Qn4xyRQ2OPHICWQLhh0D1mP26aZr8kSI5EgGbWrHA3Pe81RyHkN/X4HitxkpDeS+2Z52L52/+UGtEInaiQIe3FKnL715NbuPYfO9D2u7/iyV+8jnt/8Cpu2PgNXL1sHS55dBHOm/cAPnPXfbLZi4YMo01iqykkCWKcjp7bZ+twR/G4yZh9+wKc0fF9AmplQR2/6VUc+8yPhHznlE4eKksbtfNlnLHxh6SKrY7d3lN5vBWBwJhTz0YoNydx7fk+iSSxFG0/zrn1Lqylz8ebUZVUqTKrZAB6KR1ut9D1CGbkHjAkxIdXxlGFqGhrkuoN5QY3SUt3YoCeLp5ReGMLqtbNROEVY2AEjEjpo6lLvlTiytKlb2rWZuuFl0qCjMsBhf+ZBzPw15pnZDVble/8E21/+Aee+sUbuO+5V3ArfZZrlq3CJY88SWv7IM65ewHOvOUOFA4cSNfMUuRb+0yw3787kyefhxA+aipm3vEATln3HbGSGZS5dI4PW44t89qerJUt7DlkMR9D4H3sJlrbzT/EqYvXY+zp5yOYU6w/WyLQUVODJB7MnZzBNFxw1/1YncRaRgBaDzbg3EI77YGvretGMC094bgsv3BOByXckzmxEFWrZ8Y3nDyA7l0CgYDP7/ePtx37ddP8dAK0DCAld7n6gSYEi0KqJKiXeW38XorIKn7s569JJYKaqsGMZruxkGlFSRdvY8Dm+mAF3FxPunynIprhTb6KgZz+f+P2D/H5Bx+HnZkrSSh/3Nc0pOTOZYKTsjDDj1B+KYbNOQ3H3f0wzl75LM4gl/YEcm2Pe5ZA+plXyJIiq2szl8/9UP7v9I7nMPeptZh6ydUoGDqWLPkAgoZbE5touKluW9fhjUFTpuCpX78lHBuL6DNGAHr7nqRCHCu3/gsnXPVV2RSmL5ZVLv5BkXNCNcrWMwXlND3xuZEAuiFhFUextAw30P3RTGvbivD8abALXSpZU3NYm5FSukh3Jx2EZaPH4wn6jMt2qE65pTuUJdmmrUOOnzMQ8c/l/3h4w3v0md5X2rZT6brtu3D+nfNgpqRL56CUbtq+uCRKUt4nXMz0PuiwTikMY+hxJ2HGnQ9gztpvSHXG3M0vCnfHnM0/JtD+CVnPP8bpXT/CGdyqT97QnIVrMeWSq2htRxHQpwl1rm1YCedLxs4I5Eae0Y0zseiNP8tkmGXJJH0ZlHVtNF8fJlg64YovR8JHPQFae2OmKiM1yCAqOnMwKte3egD9USQtLc03dsJRHOpopTf2M9u29trMKdGjYuGjqcS8+miZH1KAZleYrLKK9plIPSpfA4al+aET1EKHQrjiycVYo5s0GKh4JFL7u3siGe9lkYGxe2Siiii5yyu37JGYHlto4hr+4V1MPfUccvn8miMh3kaylOVnxsapDQmTcAKofOIUTDjnIjTfeJdwaZzwRBtOfrIdJz68GLPI0p582bUYNPN4ZFZVw2TGMa4+kZJCJzLaa//YqYpZsmXJibz0cBg3rOmQg4Vjz2wJc0dZ2zY1LaavG1qoOt/8E2omTheANGM4SBIR6YSvPQqFukFFaEY7VbVG/CoORTla2KU4OfIJpCuXtyBtQq5MtI7Qykq9u7KmJSbLn5fA0UnLwDUrVkuHpDpg9UDYLap7TtZXf2Ye1CDtzTv2qgM5omri+9K3/oIJx58kljqDpWEp4qWELHeWut5CyUp7I5Sdh9KjpmP8ORej5aa7cfyDizH36TU4/qnVOOGhxZhNlnbdpVdjWOtxyGEaUX8okkfx61yK2UtsnHmfOQnLBEq3dn0TK/Xhk4wFvWKbvp/5ez646HCrGX9Uwpi/otJVgwGcTBvhm49CaVzqWA+g+yT0hgig/fw4kPQKx3FW0uM3aNMehFrfsFkd+0X6sB8c3hh0HYo2TUdx9yzknzVEk/WbCCSYKee+n8lzTsbKv2/DovcgN+iad/YKYK16VwF22xZlWa8UUN6tdOtusThX0c8Y0MWaJAB4+KX/Rvnw0b2S+BuxbeSGEdOpqOPEjgOLwCU1twgZheXILCpHRn4xQuk5sAOpMhHDdLviIkNDQ5JhT/SakXrglDScftPtWPnuv4R6Uh1K6jNzknDx9iRKsOjz3rS2WyokbMOMTnc2rPhtwBUpqHi0DtnP1qJMwhdNAtDhjYnWvklCIK4yM1rVulkoOmMIjIAaKcZDF2SNXe/AUqx1jk8lAOvOOgcb3t4pE9uX6HgszyVcTQfwqi275bBdEWljV+3NbfuoJE937ML877+EkmGj5DUt09aMh/Hq61VTjFuH7+hrn8IDc80AUlIzkZZDa1tMa1tWgdSifASy0skK9cvaMiDben3tAzIEqvV16IB3ghk465a7aG3+JZ7Q6q17Es/XjHfgblUAzQc2e4hfXbEGKWlpcUNXCqAJDAPqIEqpTkPJIvJ4mMHOA+iDk9TUdF9uXoEvJSXF9pMEAoGDVTJY7MH0oX97ICv6UAJ0oQwhJYAmC638lknwp/vFqvD3AtBsZWeQG3rH934oRO3KulIxyxWRFuc9yqLQgCaTobczs5viqGCwZhawRTv3YA1tjutXb0BmSakCabd7TZPjO9oqMt16UrfiwVRxa1Zrn5lxsdUFpnYpTQIJn21IBYFhxvy+0XOCs9vN6Mhr2Kg/6zzhdl7ucm5sUx7DGn3ILN6ROJwRGayqvYblb3+Axs9cRIAUlOePArStujgjyUpVCZNdH0blmlbkbK5VcwY3Nqka2QQAreqjm2SuHd8jbFGXd7Sg6tapuh5aDUPgzR7wufMnlUcn74deM7e8Co+++GOs1mxuK7YqDoq2rYrVbV8AW7Flf5UE27YPsW7rv3Ft21qZUuPQawfcqfHuYWmqRJot4R5Ht5dHOalNW6lhxwxxNaLNNJb2RsUyd6KTt80YDgxT828bkc9qyPW1nBS0nn0BVvz+b+QVqUNn7bu7kwJoNjxkYj2zFf7jPUw7/SyV+DV6AWhHTa3PmVGG4g0tKKKD1wPoI0x0nXUpfejXDqcFnS81s7SBN0xF5dJapI/Ml41iJQhxuHWcTKM458ovYxWB66otu2hT7ukTm1u0LVq3SAuYk1W27d/4whOLkV5QIpvH1h1nihfD2Ce5k3gaxoFY92JLv1wCoMiYLT2Djjd0kMMqjh/jyUV/+ldvYRV3DMZUa7jTrHuPN6vPKEkk2virCQDu/uGPkV1avt/nYM6NAG3CdAYcP792UGKUpV8YLd2eXBebfH5BKd8nVYsbkDosh8DZlFCC4zM062B8fplTr7uJ1uXfEr5YJVO99yTVUece1HxAM2f2JQ88ivTCAvjlIDAleehzDB3rj5IvxS3Ti9Nq3ndSLz6
EgvS7AQFni0vbLFpjK4CJc0/Fol/9FitpbWMPomQ+42oZiLxHauLnf+d5ZJWEYTpWr406Fjf+pJjI/+o4FG1uRcWGJg+gj1CADtOH/vXhDXGQK0wWVtnGqajice/nDIPfMvUcu/it3m6hfdGgoXjox7+QxN/yJNthY3kpVtANztr+7ge45PElyC6uINdWM+BZ/TQ9uZehrD4NEFYMlzXPzTvquBPw1M9/g7X8HukQWr41uQx/+xZlbbLnsJTAeR0nkK6+XvEn7wvQhhqYwBY0x7z9poNgZSpqHmFu58aDq3WX7rQZKDpjmFiipq5csBJO3jZQMmIsHvz5azLaig+YZNvZFZjvVURSzNz3jx249L4HkVZYLO3rju3o8smPIcmuBw5IaR1de9MJYSqB89P/7zdYQ1b+0m0KoNv0miX7OZfKvfs+jrv0cllb0zJ6ndLOHiA3hpU/2YS8TU0Ib/Bi0J4FnYiLo0O5xPnd0xHuIovr/ukIFQQTJ+ysaKuuSRvtJLK0Vmz5d2Sqd/LKf7dHppCwm9hGz/Xltg0orhkaoWo097WA+1PN6Ew+FzDsYAh155yPJb9+S2q8V0Rc+z3JWVecRCRwe/o9sqTf34MHXvpvFAwYFjNuyeixcaWBw88t0AZCpNnHV6F8/cyDD2PRuuZvbkH53XUI5PhV5YptJKQE5etgkvt/xk13YC2tx1I9lT05UigF0BybXcLt8HTArX3nn7h84XLk1wyR2L/fp0I6/UnGFFctlQTlGmw7NQsN51+Mhb/mgbi7BZwXbVefr00ns5O5f/mzrSSQn/+9F5FTWSnen9+MP9rKdKuQaI0LTqxG1doW5G7iEGN/WtAEwgcC6PtiAXqIB9C9AHQZfejfHF6ArpekE7OkSbJiTSMyG0pkk8a3OFUc061tLagehPt/+BOZ1/bRADo2nqcSTOwu3vidFzC8sRUBsmTdRo5D0tHJG5cPHZlqHUBGaQXOueV2rPjj39FO4Lxsa3IxyViA5tAAP3LVR9s7ZGF94UoCikDcEkYZ2sqWHieQuEGFgLT065NR3N180GtcJOvbhOqVs5A9tVh/ZithEs0yVNy2ZMhoPPTKz9C2Yzd5AbuTA+ht2O/3+fs1W/6FO5/5DoZMb4QVSIVlW3G5MfpTBTDp8MsqG4Bzb7sXi//vHSzauTcyzkoOn+17Igdxsl2ha8g7mPnZiyOdr4noak3dcGWV+FF+5yRUdk6nPVyLvH4E6MADoxAkgA7OH0sgPYIeh9EjgXGM8vepDNAPeBb0EQ/QXMXBgyoLOpvpOetQ0l2Hwq+Oh51mx81Ec3yaE2dcKqSmgDiY+bkvov3tDw4aoLkMb8XW3TIiavmOPXiSrJwjw+5dAABDH0lEQVRTr70JWUXhQ2dB65vPSsnAiMaZuKnrG2RFfUAbdhdZjgTQ2xTfxlK9iZP5PPJ3ulLljm89RwBRCb9lx3XrZXOwa2yrMEva9GKUr2pBuLP24NdYh0iqNpA7fc14WBl+BDkpmQgUuQxMykADOPaya7CCDpdlwnucHGNfNK6rgE9VhHyI1dv+jSd++SbmXnUtMktLI+3jhwqgnZRUDG9owU2d38DKdz/ACm7R375XN1lBE+3viUlw9p1XhSfVf737W8gqDEfJkBJwTUtCm9Y4s7YE5aubUbZxMso31GrulIMH6PTzyYJ+kEMYIxC6fxw9jiRrmcB6AVnOMSohjvn09UNjkHPlMA+gEwE0SSl94NfMw1jFwcknRf7epK3pWoTbm5E6PlfFYi1D1wS74+Qt5YrTjWhZym1MzS/C9RuewVpO9m3ZHUNWnyyg7REgaJOqkN3igra/+y/c/c3vYdopZyAlt0AT/CgQs2NI511uBTMy5SPaXhxpzEhgQVv0/9NPOUv4nVcTCC8kXcxTubfujgD0sj4AtJC5b3VLDKEBaQ/afv93HD3nROmYdCwrofvrWrZOqoWiL49DUVdL3Hrn5JOFjQhv4HKu6aha0oyMsfkEwJoTw4xDv2mpGYmch0gvLsNNm75Fh8we6Z7k69BjUswBDifVDq/CB9JFuV2vMRMR/eM93Lr5O5h86tlIyc6ThhFpu3fXVI/8UgncqJo6FOXTbfkREicz/t6ZdOKpWPr6H9GuPwN7aSs1ub7Enbf2PQEaJevfq8Jyv/srxs0+Tu4h23BzJm7YzJ3qrQmaeB9l2Ki8djKKNs9AeONUVNC6xONVSRagDbagZ5Yi45IhyLh4MDIvHoqMzw0mHaQfo5pJmn0h/T/9Ts6cGljpjsTFP3UAze3gXCftOE4KfV1AWkRvvvhA6pDS3xXalj2Bvn/rQF2Kh9aC3tfaUrwABZcPhxUyJVYpHMk+K+Hi8fsbOr0BK157i0D6Q9mIkY18kFb18m160OpftuFWsm5bL7gYhTUD4fAAWlPxGFu6IzC2I82lQBVyHs05kbBhgX7nxGu+hrb3ok00SZPwa+urfUu0DpyvwTq6HhfPfwB2alADcQLryqeIkbheOG1sLsoX1yJ7Ux2KNjb3A0CrjtG8TdMJEFpQ/sWxMDMdqSYwYgDaBWmpZPGpOmmOjY9qnomFb/yfGjKwXZEISXv79j1JJw/3jVO3c5PL33biVrJum8+7GPk1Q2EEFOGW6VqiupLHctWnHt3WcTMyFSb+vTnn6uuxfuduWQ+3uoTDTkmvcUxikPlX1pDxcMFd98JOD0pliN+ImeKtSwGdGIDmn2WPL8Lg5cciZzMnf2t1R+jBJwnl+dMMaWwy6YCPaBo/2vupwZpOeyLFltLT2FFnn3iA5jcYCoW4DTzPtp2L6Y1/g26UX3K4gvT1PupvLNv+nePYuw5nHfR+FrWOWVYv4u6zfAE8vvGkPKuXeYJGIISTyGVd/Y+d0oGWLA1nb+q2jq/aQRY1Pf+DP/lffP6xp9F49mdQPW4C0oqKZI6cz1Et6o7US5uiUlsr8WWz1ynTJ37lRqx4L9oVlszmdROIrtXo6moCn/ufewWFA4dFramEJYImbWhHpneXXDkOlbxxmVOjs3/WmFnuyjumS7hjwNJZSB+TJwBs+3pOcVdAbev3Q2vPXlIogNO/eoNq1Nmm4rZLt6vOuRUHdQirkVE89GAZre/yd9/H/by2jzyNurMvQOXYiUjLo3swGIJlO2oAqzslXuqlzQPGrnnvnHDdzTJxe+n2PRGAXrQzuTVu0/cyT1lhcOeJ33d/6znkMd+Io616XxQsbZdS1KcOGq6Y4aEYZVePR0XHLOR31dFhOV04VYr6wYJ2+UxY5QA7gLq16D4j2nNgHiAM+IkCaNIwvdl1ZP1++FF5Otwmi8Mbg94HnMWlbkTpxpko+fJRCKYqoiKfX5UPxUssmbpeOZU201VL27Fq278FsFZu2dMvAO0CpprlB+GJaONN8vYH5Lr+Bfd+/1Vcu3IdLpj/IE677ibMueI6zL3iWrSecwH8Kek9mPHiAbRNwHjGjbej7f29Hwmg26VDcpfEqTl+zocJA88SsjrHH3siHCsF6Ya2SBOutykWdNpRBShpm4li2pwMqPldtf3QjKQs6AHrpqGArfLNx6D8sg
lk1ZsRulO/21WoAdqQ2ZM+BPT7TcvPxbWrNmKdHEQ6nqwrHz5yieXWvRFdwqEk5rXYznzKe7H67X/h6df/jLu//zK+3LYO589Tazv38mtx8lVfRf3p56gO0Rh+5/2MB2lYMnD6LXdi7U7Vnt6mQy5Lk6xKUZ7VXqnIWcqT3H/1W4xuniVhK8OM1ur7dKeqE2Gys9XwCfo6bUoBSttpPbrrJbdQuUENX+iXEIc+5JVaB1TLZ0SGI5t9zNN8IgBatX7bfnp8gHTvJ59uNL7mdTegsq0VOdPCujvLjnDqxh0EaqoBo+FRo7HguZewYSuD9If9As7cPLAyJnQQyaBrMp8VO1Uibs0OFQ4Ra5s20bxvfBcpWYoxzuwFoNnKPv/u+9H+T0TKrJIH6N1yeLC1377tQ6z62zYc/6WrYYXSEbSCMsna8Rm9TxrJDaD8q0cTiLaigCfddNQLdWh/VHEUiwVdi3wC6LzNLRi8ZBYyphRFXG/XkvaZ0eG9sXXhfA+WjRiDh194VRjtuHuSgVrIorbuPaj1dUsSV7+zN1L5IklGeo3F79HrvB9d41UErOvpcL51wyYEQ5mqPd326XvT2H/8HF3XC+57CKuZHldPvxHDIdn8iBxM6hBp+8tWzL7kS8LrYupxYfuNbYvJbwhdQKGDipsnoWhTg1Ru8NRuISqT8EZjvwC0L5YO4QDqHmj7vfdPOkBr63k86V8/LiL/wwHQPF2YZ9oNvrUeTmFASHYCPjNxjEpK1UwB8lFNM6QJoG37rqSYwXqLQwv95bZorWqUD0JZNsu2ahULVjHmfXnZaqnOMHpzhTkhZodw2ZNLlQUtbneS7u9WtfGX6s7ItW+/L51zoaxsISCy7IAk5NwWcjNBrW7+zGpUr2ykdZ5O3swMFHS29MK5kawV3YhcIYXndW2QgbLlt0yjtU1TFqivZ/xUTaK2NOOerdxnev8TZh2Dhb96S1jslm6DtqD7Y433av7lvXo6tqqy4KTeErGwyXJ1u08JtL+4ZCUCgTQdfvMJD4c/DkBbjoPLn16Odi4V3KESg8wpkmxoRhj8mJbgnQ9wwV0L4M/IgeM4PS3nSK2zbjO3VDt9kK5v1gmVCK+bIa34YZ7e3dWoh/syCVZDv4Q4rCQ0Xu7hU2JBc2LQPIfe6O6PiyP6cAC08A2T6xXumIn80wciaNPNZgbEpVNTR3yRUUlyKtsx1R7+ICafcjoW/+b3WL5DVTKs2LpLZ8s/WnhDKgg0GKsR93skURMpkdIbXECavl5DVt5n73mQ3ltIVwIkbsoIZebhxo7NWLlTHQDLtu9J2HgT256+fNsuFUPVBwVXJ6ze8i98eXEbssOlatOwNaqJgFxLNSCTWVQVjISPOG5emYaqB9j1bRTOjWKett7ZrMnbG/rFis7vqlfDZDmMRQBRsX42ck8aSCDGTTo8jzGg1tGK6XqTphrFZSGld34/6s86Fyt++xfxVpbGNHi0xR6aSa6xy9eiGBAVQK/cElM1obsymZ96FR0OZ955H0w6WO1IiMNSnNIRzmt14PhTU3B9xzfQxglOzSPStjVx56tbUqmIvvT9wOx95JWtJs/oCgL71LwiVaVh7X/wuwBtS6jDkuHHqQMzEH6EPJdNdTphq8Ia+ZqkvyTOISzhRvKiyr82ETYn9gxbW+Q9K5di1YkZ+NubRlj+NCeJGA9S1WMnzpF8EgA6JSVFyuPojX6BdO+hrN083ABdLvwAzch+phEDHm9A+shc2hABsQZN7U46keSS0cNVZqvRDIbQdP6FWPy7v2ElA9mWDzWIIal62o8C5kx3uWrLBzjui1fLUFSVVEoQmqH/yygqw30v/FgGucp77GVSt7x/qWTYI9UMDOZL9aHBRP43ru1EXlUNbdAE5XSG2ry2pdxfISxKtxC+dDTK6HqrPEDDIVvXntUdTah5vBmpo3Lo0HDofYV0TDWRl2QrTynox4wLP4/lf/gb2ndGvRhFkLWnH6o7etc1dC+1XnSplIZFKjik8kSXDVpuMsyH9OIi3PnCfwtAr+hrbXMk38HgvEuNviJwvm7lBmRX1ujJML5IY0rc6e/M58JhrTQbJVdMoL07U5K+yeSD+BAdOL8O/vJQdLKPrSuUDlItIwroPHYsld5zitAN+D7ZAO3TAE2L8KkGaI6Lucxp5R0zyB2eCn9JCkLSouuWrtm6aiJKaCNUkbyY7NqHQmi54AKsePOPaGci/+3KOpEyrUO0eVV4giy6v2zD2FlzFPNdAoA2dHKzaMhIPPG/b9F7dEnY98QPcWxTwCxhFG3xLdW1zmu3/gs3r+tC0aARdB38SDiFh2PeZD2n0/ULGSEh1smcEUb5ypZIQ0mhkO4feoBmprtSBo4bJyGQ50gYyzScKPPbvs00QjJkSljITE/DzIsvRftbf5H48GLNVrhUc2/0S4llonDXn7dhJCfoDJV88/UAaPW9EDLxiLKhQ/HYL39PB2jf+GLc6SgcY39qJ1cP7cLad/+Jr65cj/xBw4Te1HJrrxNMiOESQdsMIoX+L3N2GcpWz6C1bY6sb99H0U3HgJXNyDy+jF7Xp/Yde7CmqXnM9WQc/WgmofK+LZWbcUiDWs1PPECrJpNPPUBzzDLcwQMs65C7uZlA+niEzxkBO8UUy8+QuYKBSDmWy2mhKgIMoa4U/oOQhdozzpTxSCt3uDW0uw4dQHPSjoD0SXq9kqEjIjP/fL2M8Bpe34Llf3w3YmFJpUhccGDrcJduD1ahkJUMzm+/j2uWtCO/egh99kCvsXrFR20IGVKK4UfK0CyUPaqSd0Ub6z8WYI6GPcjt7q5D6fqZyD+H3nuQaTAT14yblhGhgmWyez6Amz5zAZ7+zR+xKubgkoaebYfgEOYcBAHnoz9/AwU1gwWQ7UiZmBrX5k6IcQxVijeisVVmKvbVa+NwCld5MD/HYgLnVe+8hyueWoqc8iplMVtGj6Rc/JCAog5IGZmNkqemyyzJig31yNvcHEnY9gWgi2ivF3TXo+z+WqQMyZCZoY4MW/DL1Ht5lO+TV9PQo+3keim61oSlqB5AH3kAXdSpLKwyyfzTzdI9A0MWzRJ+Yl8Kt7Qqa8typ4LohJJhGBGCI8uNwTp+jJ0xCw+/9BNhDztk1pVuGFlNQHHrpv9CMDsnJk6e6NoaaD7vIrKS/i0TmbnZJGGCMAaglwq5+4dY/5ct0oiSUVIuh1bAsHp4FPH4GGyuy3YMSb6WXzcJhd0z6UCs/VjBWVHNNgofeDHPL1zWgMz6EgFeo5fkt5tUkg3L07IDARx13Bw8/urPsF63gy/5CG3xfSrJY8+ISfE7v4lARroi6ffpA8Nwm5LMSNknT46ZdfFl5N3sTrL0b5cQILX/6R2cd/u9SC0qVQMf3CoXKzouKz5fDb1+OIiSr09G3ia2hKeIoZPfrUJYfa+kmi5J3cr1zRh4dy2ym0rlnjEDhuQN9lU7CbX83B6uuLZ5WINfuLqdhKPuPoUAbfQs90mgfeGY6AnQ06IALS3a9R9JOTER7qhPWCUQ1gNlufVb2r/p67KOFgxc0IiUMdnicgVpQzAJu
9yw3AUnTSFmZHo0W1sya1C3kFaMn4DrOp5FG5fgbdfhiFj+4MhG1HHArUi6CoSfZx1t5HPvnEfv0YmQuieqP+Ymls/edR/W7tgt4N62xY1BxuF9ljKr3QLQPNV78Wu/x5zLLkcwK0Pcf87q82OPVvPI5Ooo17JlhGCnOyi6eISw1ZV0MA9KiySPkglBFe0Xt0yOlpRft1imrkyj169F9SNNSBuRE62N7dHAYPRs7rFtGewg5W30swETJuKWrm9Ji/4St+Ow19rnJHMLW1XopJ2e+5Tb7hGLLyiemhkzP1KFOdyEmc9JwcX3Pyrll8tiw1SuxrwP9/+ZuW/l9n8LT/Qxl1yOQHo2rakthEuWjjtzjNtnxlIMuHFnFfrgDr6Si0aifM0MoXploOUEIfM+J5P0LerifoQ6hDuni5EUXjsDlffWCQ1A/pWjkH/5KOTF6pf6rvmXj0bRl8ag8vPjkDE+T6btWL3QpH4qANqM1EEq6zLVb+GkmZNxxXnH40ufOa6H8s8uOLkFuVnp9Df+xK3IGqCzjxlAIDmLrJ065HY3SfihZKOastEX5d/lmtqCLh4oSt/TpqzYMI1O5+lxs8glMePfS0SVVc3UlWV3TUaoKp02Jp+4weikCCMGoM2ou+eW9nC1QmZxGOd8/W4s/sPfsXKHSrKphoe9MmWlbZsKHSzasVc0WZpLHkm1/G/bMWnOyZGMuluNYOpwR2x9bzAjBzd3fwvL3lOxR56Swq/J3WKc+HPHPPFmXqxJnNaT5Tzv2y9g7MzjyDtwogTyZk8QY5eRy8BS+bX4mtimWHlWwELeKYNQsWqmHHzl6xtUjDKpEFSTbnBojExS4aqPZECaD3ku+eJQB69xzYYWVN42EaHyVPF8oh1xfOgEeljWbozd1K3XbDVmlpbhvDvnY8kf/oG2HeqgXawHzXLSbaU7k3Irjz/b06dO03ZdebFouwo7tP91O1nsJ6oSOsPX0+LzuVwdKvmbkl2IW5/9npTlLdFez1I5YHfRffIhre/uSD29UKJyieQ7/8Q93/wuxrTMIiszRSxxe9/J4DEHl9Q4iwHgiPcUDJq0tgNRuVIduO7AhAJtVCWbBwprQrP87npS+p7WONxFz93NygZbS3LarR7LNs5GweaZqNl4LIrnDFLJQydxOeonHqDdwu8oQPuRnxHEc2vuwN7frseu19eSrpHHD+nxwzfX47UXlmJgRYka4yQJt8QAnT+rBlXrjxNLK6+TrK4NrZGSnb4qLzi7tTmbGpG9mXRTE53ujUm5XSUddRhAVl/F9ZPFjePEEnfiqZhYtEwnHsevqcufnFAKJh5/kkygWLPlAyzfuQcLd0SBerkuqWOw5g3dlowFTdbSgp/8AvkDBumhqHrUkK8nQFt6JFbJ4OF44n/fxOKdikBnrZ4z2KbJdISFjVzehbSxV+xk4qO/4bP3PoC8AUPkQJKESwJPisc7BTRJPIeEJLlmGyhoqMDQRbQxulqFzYwbUvK6klsHSfAJd4oqnVMeTkNSFlpYP0+Be/DS+6hZ24jyq8fT2gb0bEBLYp1mAvfXNHzRieu8tumZmHTiabj7uVdo/XbReuyWeL2iX3U7Nff2udxSxmzROizkKeK0Rg+/+j/ILR+QeChrzDT28pHj8RR5Oe16hmR7ZGaiKs3khPVSXda3nkMav/0zzr9jHnKrB0p+xZYuwAM0ccg8TF5nEyG/icxjylG1nPZm3O7AI0cr1jcjZzOXdbag8Pia6EFj/AcAtE8GlCqAfmn97cCby4HfLO6he19fgdeffxpDqkokDuo7wJDWvFkDUL1uFlm9DTKtWVxcsoZlKncflDdw5fo6SVaoyc9qkkpJkmQ8FeubBOxLeULHlWPh5JPl4NPtrEzAHjOss+dN7c4PVCOPWPMqqnHGjbfhiV+9hTZNZrNc81goLg81968tCU6PNQQGX3x8kTSo2HpiiCSOfGYEoHsMvz3hVLLKtpF1tls2Kne0SbPEFjWu6amdqrW3/R87hKxpwuxjYUv7uK0YzBJy//LAU5+U2/kctkhDMAMm0uuKMPDxVgLDVgHWyg3TZW1y+aDsTKIMsoM9I3KdybLK2aQ6DtXP6pMGaG6WyOtSj6WkZetbUHz5WPjz/LK2fAA7vTCdWXrAbshQQMUeS/7AITiHrOnFr/1OCLTaXA4P3a7vVsEsj2H/i1s1o8v1ZHQWgf2lDz1B1zEl4XQcN1HNe2r66edi9TvvS432yi3RQ8GlFeXnXEW67q87cNuGboybMQt2KFXKSdWUeVM9V2/NHGxY0RoTZiGzrhjhpY3SLXgkg7NMf9/IRhp5zwTQuSfWSOlnsA9kSQTOnw6AZquYAfrljXcCb7Vj9xvLI7qHFG8sw5s/eBxDK4v7BtDHVKOaXNCq9dOEBYs3Z3HXVIlJl3RN14+Jlakm87qnkZs0Veoxyzbyhp4mjyVJjsgKM8DTeyglK774S2NkI7suvlsfLbXRCZJNZiTkQV8HAqiZOA1fenQhlrz+f1i1lVxPTsQJSfweiQn2LWbJo7PICv77Dkw7+XR6br8w8LmUlOKqxgC0aq6xcO49C2iT7lKuLzfAMCBwHFUIfD4UAp95z72MWRd9ARlFYfWeYwr+LV/vkzx8QqQTgJ82cuaUIpQ9RV4Mey6bOOQ0XZKwXH6VrAVd1EV/312Hgu5m+tsWIVUq7JwmFnVSAB1pYlFawEC9qQ5l61pQdNkYDdLK8o/r/prRfINLBaA8FAK3YAaGTKnHZU8sxsK3/kqekqIZZc5vtqoX6/DVyl4Aelkklk0W8F+3YvLcU4UEKxFAuw1UPLLsc/c/gnXaM+MuwqfpsH2aw2rbdmPt9r3SFXjv918RhsS0vAIVNpG/N1UlUqQjMDFAB7jD1jKR1liE8OJGVNDe4NBhycdckZN0gri7gQCaDLZ1M5F/0iAYjqrNNz+tddCJAfpu4LcrsYss5t1vrMCuN9QjCKR/SwA9vCIK0AkrDejn2XNrCEyPFbrIko0zyfLhEAd/PUO6/Q6k/HvFTIDUMUus33BXMwo3KwrKZMq7SqTLsI42d510Og1YNxulV44Rl1imZesMumHYMbP34nAG6C4shx5tyw8nNQdDp9bjkgWPYPH/vil1xW070acY9ArNjMYlWPf+8L+RXVYpVJqWjombuijfjumg4o65tMJ83Pa9lyTxxOV/bEXz5O413BL8t624+9s/wDGXXo4ccqkNeo+Wbj5xyYXsXmgu5XPqsAYnVDOnFqPmUQLS7laJJ8oQ2E4NiJ2NySWPuBWfAJobW4asPg5DV85BFW20wq46nThOfrBsj9eXEVlNKF2vDmC7OADTMRMTPblTyA13KrcC64ChJpk4aVkYWt+MLzzyJJb+8rfo3PIBVu1Qh6LLehiPKH+ZHoMmU9TJ0p3//I+QHa5IPI7NVHSzbAlyrmMe/f6qHcobW8QT5N+jQ3znLqwkoGcmupkXf5HuleqY/I+aTRnbjh9btRGfKJ9eq7YYpQv5YGsgL7eOPNWGfusAPWRcO131dP+0YPD8BqSPzY+U2h0gxPE7ukaVTLN8
5HYS0pvTAI14SUK3q4kBOi8jgFc67pAQx97fLFH6+hKAHnsCtNWrBc01nZlV+cg9biDyZg9Azuxq5MyqQjZZ1VnHDOiTcpIx/1j6+2NrkDOXvr5oKMrmTUHV2hkCtMU6DBLu6I3ApUEYuBicOcxS2tGMsg1NAhTl10yEU56iMvyGbmQx7EhmOzbUIddHSvSUa+yX0ihHMvBOMAVV4yfitOtvwd3PvYoVf9qGNWwZ83QOHfpYoufcKRdYtfBybTVP3j7l5jukG47rjC3D3h+gfXqIKIHnyIZGLP7jO0J3uYr+nkmOHnvjT7i6fT2mnXomMopLpZ5VBo3qBg1FruTrQS3JXBCq9tblOVCfketybb+JnNYylC8iq/mZFgK/ZunSVJ2aTRLaKJRmoPq4m1oNVHA5g+uEZKdifQvKn6pF4ZUjUXzyIJScSHrJSFQ+zODQKs/Hf8dWXGEvFnWR1rAcuko55BHeyMmnJmVJ89dXj0MonKKTZbo92G1mMd3BCabEqn2iVqSSh5ONjsmVKyacQAjVYybgzOtvwj0/eBlL/0pru5PLIlVFz5KY6p3luppmIbeTS+v+Lpxy49fJ0gtI9UZigFaW7+iWmVj253fJYt9L1jK9BnlFT7z2B1y5dCUmk4fFa+uzHF32aGruEcUlEwlpaDV7TBZXoTr5/4CB3NZylC+mdSWviMNV3EafTx5NshU1/RW6KNI9DK5Kw1mHqtySe2mjSkhXr6Z9/+UxCAxPh+VnpkW1Hz+9AK27itTGVQBdkB7AqxtuJwu6jYB5WUTB+sYyAeihEYB2ElL/uWPpD1Syd0A19MQKvpkDBCglQRSfPhRVy2egorMFpZ11uuKjUZdgxbsJ6iOb27W8OEvNce7yr09EyuBsuuHZelVZf8sw9aZ2C/u5sy9qWcd+XjfZJDSRloWMkmKMmzkT5911H+Z972Use+tvWLXl31jBdJ479wgBznJSrrhY+h65zL9+C9UE7r4I1aPRgyPB0U00FlnDvlAazrvtTqz/xw48Thb719ZsxPGXXYXKMUeRNZ/RowzygHSMlqrSCEoLN4+M4nbpAKwMB8UnDMTARc3IeaYOWc/UqvLFSDIvGtYoSTjwdZpwBxcRUHLOoXptvTDehYblwAjGVMwETITKMxC+bCyK2Lui3y0nVztnc61Y6iVJWNKl+v1x5QH/bc3aFpTdNAGhQRliSXMdtxp9ZkmCLEq6Y0Q4R8x9DBczQv+pOCoySkox9pg5+Ow9C3Dv936IRb//O9reZctaldKt0JUfi5gmlNb4SbK8K3hte6WONZSX4wRxwV3zsZa8oMd/8TpuWNuF4y67BpVjj4Y/JU2z3O3z3mLLY3ULtzsIQnlKpnTx2VwNRN6XmW+h+NTBqFnSLIcZH6Ju/L+os/EwWMVqMhIfzgVS4dMswx5KuNqro1bIzwq66QDfMBNVjzYj99gBMLMdzatj6JryA4yFs6zfc4jjEwnQPk08Eh+g2wmY2YrW+vryCEAP4Rh0jEVpHrKGF0NqfhU/reryE8svnayA5jIMfYQ5iGch49l6oUPkwvrkYloE0t0zMYBcpsyJhWI5+sTCVKEO9RktzZrWCyOXLyY7rnkVuJ45vagQg6ZMwYwLP4cvPPAI7tj0LTz237/E8t/9DSv+sRNrt/wTX3zoMVih1ITXQIiKpBXdFhBuOutc1J52DkqGjYY/PUdn7v265bnv15ZdaqnHZXYzHsbKzGqFQRReOgrV7bOlnjVr83TZIGUdyYUzuEa5sGs6cruaxSWt+vpk2FUhKdmLTgaPHkZWsR/F1x2FynWtkifgEBaHpEqSrBBRAM0HRCPK6euadS2ovK8e6ZMLEPCrcVimGRSCJTOmOelAtJUuGJhSEkf3ox1CekFYYtWzLvoire3jMgbroZ/9SuLWa/66FRvf3okvPEhrG0iTZG8igOaGC561mE5r23zOeZKLkLXNzBfSLNvnl9p9v89M2JDh1lCrEJWprrOpuLq5kYOrcvzchPLFMShbO1N5n0dAyEISvd3KG6paX4sB6+roURlcfA/k0P6sWN2Kqi8fjfRh2WL9Sz230dMjPABA/9a27fIjmyzpowJ0DDjz48cP0FEidicmAcKDQx2Ox07IR9m9vKCzpb20qCsZYpdG5HMcla1orvRY1IjckwfAyrZ12ZOlqzysXmO2sZ1qbrxTKgSMKCexcGo4NkI5OcirHoSaidNx9HEno+Hs81E6ZLhqSknwvGaEIMaUGm5DLH1Ll5MZksjz089sw05qQK248VwDS+AeIIBOHZ6FipuOlhxBtiTyposlzPHhoo1NCcNH8VkFayWpyyWR1W2zpFLAcSw1UzBSGWNEapIZQLiRaODCmUJpya9dsaE2aYAOa4Dm2lsGafacymmNKxY1IOfUGgI9W8BOqlN09Y6/B6d0IlJ5nUw0FB2Am4yTvAWDrx1AMDuP1nYgBkycgknHzkXz2Z9ByeCh6nWM3mLhhvDAhLj2XjMYWpq4yBK6XMUup1rC469xBKClnt8v1VhB5k3R4aqU0bmovqUWFetnIfuZevFujgSAZqudtZwO4+r108hSnqYm8nCDC3nDNQ83IX8O7clcWwwIW4f+IpNVzN5LCbkBy7LMF/x+fzbppwyg34wB6N8cRoDWxCqmlA/piQqaN4BnDwar0jDwy1MweN2xZE03JxX7ql7XJKd11jPTpEJhYPsslFw1HsGBaUIOpGKDZjRmeyD+WZe2UVcECBFORNUUYkOAQY0T8slGJ3A6wDXs+byGBufYkU9mr01DiSbjBJm2NMNC1rEVCD9eJ9n80o3TCGCn0dfTBGiLpJytOSkXuGyDInjPJTAYcF8dWW8pwuERMFTdvKkPWlVCSO+BS+IyHSG3KuyeIW4vd6MlOx4rrK2yAl3dwbHsMFeJECCx5Vh49TgEajLocLciPCyO6yYfYLO7nXiGjtnbvhiuYl+UytbxxYRFdCmf3RuvisvU5tMVNLaOj/fo7tVxZstJSGalmBktSTJzCzSTHgXoQMo7phJVT85A3jOzpIaYJ96UdhwZAK14c5pkrQs7pyK3mykayOtZOxuV104UoyFoqxCfz7HFIo7wVpvxW9fdMJ+UxjrOXr/fuakgM9XnhTgOMUCr17MjyrG1gPTi0yYr8qPqorEYuuJYsogb+wzQ5RLnIldqk+pKK+1sQknXDFQ+2IBMAi0rx1YhDyM6QDMeeMZ2+bl1y+77FgvLZ+rBq8oCCOmwhaEtSbvXiRO+/RJAaiipGWly8CXg0Eio/Np046cOykT5FeMRXk2WyqYGif2VbaiXhKoM4uXrIZPTm5Jq6Q53tCCXnq9gM7mtX5sEK8MvB6qpS8piVUIN3H5N7mvJZePp+s+SteBYctFH2fQxWhIT6ywgV7p400xULqhHZmuYPCVHqmLMmIk7iS3oaGt0ZFq35px22dncg9Mv4RD3/4zoWC4r8fq6NJoCtLYvQo3bI79jJh495osAu6leO0iH7yhaWzI2Bq2YIeuY/UytkExxgjzc0XhEADTfV+Xrm8njqUMOvb/yjTMw6NFWFJw0EHa
eI583IOFNS1q6bbOnt2NpD8iMA9BsPdu2/Zxj2xV+xzlyyfp7B2hfhLTF0ExRhelBvLxeAfQeDcx7GKglSbhUJQl7ALQvYdODuOBMY6iTbk6k1MtISiMTkSMZa32a6uYRtqTNTAv5c6sxcFELAS+X8k0X67hA2onVZuUbU1T4OgiYN/NAUs03QDcxh0k4MZG3mW6c1TNRds14hEZlSezL1IeYqGlG2oWj79OMGYAZpe3sAeAxs+gitapWNEnVK0hbMWDt0xwiOnHZY8yT3uQyY053BkpjgqGG03JZoV0QQN5xVRj0SBNZyi20WehakVVV2N0ilS4cF+RKjXyynMO0iSo2uO27fQ0fNQvRfgldy4rrCaBTHAnxuJZirMq9wxYtXeMwAXQpHZC8PvnJ1ljv08jiKrvK+Uw0360apLiKp2JVK0qvOhrBkVlC4qM60myVQDRj8gp60CuHNUI6pKQORmUsmHpyi5urcK1yS3tNxj5DWY2E+0Qd2KY2mrjT0xZ+DjPCEe3ETNuOWtb6fZiOcv85eVbgR84JVah6jL2hViG04n1QutH1hlp0pcTHDcZ1UgFU0MnNTq1SfcNhLK7EytvE+3IGym+ahPTRufD7lZepBif75XoEe1yjnqEdc5/GMss0/0XW8zoC6WGBgP/ItZ77BNCWy0vrSA1oYVYIL224M2pBS4MKqVjQK/DmC09hiLR6q5sj4QBHUxGZyEQLS5F2c9UAVwwwBWTfVJULyWRuMwpscZMlDE5kOWQ3hDHkwWaUM+ByvbQu3+IbQRH8NEtZUenGqVICFr8ygDkEahX5+BP0++eOQEpFumwCoS21okkYU1uxhu44VGrgUE6uUVacpUMb0SYa99GxVPLJYq4FusFT6NHPZWNZNtKbS1B6xxRJvkS9jf5tUOAQA4NCGQF94c2T4YTU5AszUaKVf+43UHTlBJR2s8U+XUC++FCVfTGhT2cLamhti84ehmBlKkx6j5blROPFVgxYanIlM2bsUlIey0dOkEdfy4yUzhk6x2EpGk/TT++V1jk3iNzGMgy4ZRoGrJpJn7NWKmiOjC7AevFW2WIu3jiL9uNMuvfoHuHQExkBNU+T1Xz6IFglfvmMdpKMm263ryb/+p1tW5eR9ZzlOJwYNI7sad59A2hTaoDZUsghC3r149fjz68sxh9ferqnvrwYL2x8EANKiyIWnJkAoN3nZsAIaktOxY7NJOeQmVK+5rNtqQV2fCpEYCaY1ccMV6ljclF6by0KyRor3zhdYqpsQXECisfFh6XtO34Nb1gmszRI1yN3zZV0NqNyw/GoeqgZhacPFlIentcmpXU8ycMyo/PSLDUWyudPbDH1C0Db+nXsmFIxtyJC1K3vDanNS8CcPbkYNV+ZggHLj1GWcveh25AC0ASyPQDaOHIAWsiemMuDiXs6j0H5o2RhnzEQwap0GY9lCHm9IoV3NADIPSihDWO/9vtDlyCPArSKVbMXqip2JCxG94CZbSFjaiEqvjYFQ5cdi8qOmdJ8krtp+iEejpH8NWcyf85xcIt/3uZW+vlsVN46Fenj8hB0TKQIB4ypqqCSAmgB53+RdpFONskytCzb94mQ3uugXdfZEPdOOFYJcAaUFmJMTTlpWQ8dSzqMrOe0QJAsVn8EoBNZAKapQFWVimnQ4BNfJ1J6U59+DHILK29uKyAuj2UY+7g1PWf1+XWNr78qFRXXHI2atbTRu2olYaRag5skJseWdDgBALjx6YoNU2UYKvNG5G9ulgaI8ofp5j9rsNTX2n5D3K6g/nyOztg7ppWwJKpfVHsWthGtQhD33A6JRcVhFq57dfKDyGgoQeGNE1C9YhbCnbPks5dvmC6ER/+pAM0eAycxczbXIXszfd1NVtzaGXQI0/1x3jCEBmcgGDCEl0R1YnKZm6MrKszoMIVDaEVzQlDud9pjzC0S1NwxDodieDYkWZs81aaagHnQUl7XGVI2mrtpmuq07ao9ong0Sjpo33RNo+s9he7DBgx6eiYKzhqqOj4trmIhT0/Iraw+DYRlr1xVaYj+H+Hbl8lqzqHDlGw5x/eJkQN1EvrMOCoNJtZ+akgMTm8oiX+pET3xB5vSiRhKgVPMGkQwLwinIIUWJAS7MNhn5Uy0nweVcpxNx1x7tU51iRxXRgRzHeR/fjgq22jzrZ0lJWRsOSoAiZ/4Eo6H7npx0avWTxcNy8nPpV/T6f+a6O9no+rpGSj70gRkTy2R5hkjRYV0uKY1lW42xzh009M5LskHQ4o+HMT1dTgk5IeV6kfqkEwUnDYQlXdPQfVKFZPnz5y1uZY2sIr7hTv+cy1oPqy5GYZrrpknhtdVygrJy6pa24iaJ8jLumwcsqYWyyHHAwHYog7oAzGaYD+E4Q07he7zVGGYtBmEeFhClh9pI3OkUWvgXfWoWjmLrOVWWtcGMSKYw5mrIfh+Ld/QqIYoHykAzfdFN+c0WlF2+2RkkNXsd1SVlEw6skM63mwcsL5Z8IU86mAw+KHfH3iGAHoqXTMynC3fJ06SB2jdwu3bX32RWWpu3NfQ2ez4pT/ZE8IouH4CUm4egcyvDkP29aOQdiPpDSP7oCOQccMo5HxxOPyjM2EGLcVSZpi9ZsTdsIrfVGQ5XEGQf2INhi2cjQEbj5HGC4kvJ+R/UAlE+X8du1Zf14o1XaqJmjj0UcwctasJUO6fivyLhyFjehH8pSFYIXVNXC+g/6fVWLrskHkV6DPm2QiNyET28RUou/4olC9mL6GVNm9TDMFQrRwy0lXZMYO09T8WoIukFK8xwtHClKXFus2c15+TxFzuV9XeigF3TkHReUORPqUQTjiomiVkDVSZ5H60CQerbpkYe6e2JdUmKcMzkXtSJUpvOhqVCznkNkva7/M2Ncj0Ek5uSxeeJMNVg0+Y+W86mo+Qeme6tnS9By2aiZKzh9F1DElzFOenHLeCjD0Gy4oMjOgDQP+JAPp6x3HyUkIh3ydWIgBtWV+QeG4vpC0qAaJJzWNKxswejGr7clTEHzTKN3EqAVbm/AnwPzAczoIRCN03AvYDI+A8MLKPOgKBh8Yg52ujkTIhV8DI1ET7sUkUFRu09WERrZAwJGZNLmLQQWZdCaoepU25qVncv9JIjW+jrlho7EGqJG6wrqlVqku/hCdCVQQwcOdz+IQ7nshKrSFLfcAD9HdXj0Lu3Gqkjc2DXRqAkW6qET+xG1uX3jla1dRld7KzS4BjavpR5TVwbNtIYUB2EByUgfSGYuReOJQskomoXsjvnSwUHlHU1ai5LVTMnd9/iXAv1ynLan0LbeDm/+AQh25Z74xep7BQBajrxU0uvNZsUYfpM1Svb8LgFa2ouo8O7svGIOu4SqSMyRHPyUyzZF3Yi/HrUJeUVMohaktFh60fTV+07DK2DFM+P68vra1NXl9KDa1tfSHyzhuE6lsmY9DTTcJXUkD3LlfYMGkQN5sw0VRYasYVY2NxDId6kUxbbzqEVrEiueJBHDlMpkUHhqJbaJEhDFzFo4wCNQWn4uapSJuYL/evTJHhcKdlRyqd3Htj/+YTXYoaDWfsou+/SQ
ZnHYGz9YmJNfcFoBNObT5EMwkD0wqRNn88UuaPhL1gLIL3jYF/wUgE7h/dRx1FfzMcqfNGIO9Gep7WYhiZpnALRDmNNQtdpOwsXtecI25qaGQWym+fKqx6xV1NkVrZok6XiyC5crL9blrpTGwUnoPyjpmobJuJ8gcJGG+YiJKLRyLvhAHImFKMlBHZCFZnIBBOQbAoiEC+X+o+WZ2CAJzikPwfT4BJGZGF9En5yJ1ZgeIzh6DiS+Mw6NbpGPIoWXerZhMIziQQaURpV12/V2N8ugE6mdZ1xQvB5WBFnWRZr56NoYtmY8gDzbS2k1D0+RHIP3UgMhpLpCMyRGvrL02FXRyEw2tLoOvk2PCT8vr6yYIMVKYhNCQLqZMKkDmrHAVnD0HJleNQfVcthj41EwNWt5Cl3CBdrsWaWpX5r4uOmKSfChHxhBwunWMLXoF2oxwWvJ9KyFioXtqKXDpobDrMxMAwVRNXpJ+hDxUaHG+2yZsIBAJ/Ib2RrOcCLp1zHL/vEy8M0BybIb2UdO/HAc79BdDB+0Yj455RCNw3CvZDY5F51wRkk6tn59m6uF8V5juRetQEnLu2WuQQ/X5qaRrCVx1FN9MssjRVMpDZ7iqE8U6xaRUdRPJJbWiyyDfXI3dzg4QZGLDLNqqa4/C6VoRXtqB0OVk3zO722FQUPzAV4fvp9e+vEy19sBGVjzZjwBMtGLCItK0V1WtnCK9FmEluuprEmsrd3CRuLh8qlQmqUjyA7p+WZJ7ik0VrmUXXnS1EDiGEOZZP17+MK33Iyq4iUK1aQbqwGeWP0zV4eBpK75uCknmTEJ43EeH7JqL4oSkofmIaShbSwb2cLMvV9Hfr6e9pbct5IAV3w3aruvximb95ZNKAusyDXCPPVKXcdcrx72JR+mwbZqLs7unImFYEK2RG5l06untWwhqm2SeAJnDeRdj1Xb/f30DgbJFR5vvUCAM0uQKsJzm2/W/T/CRZ0KT0d/YDYyTkkUJAnXvHWOTQieyUhySbzRUewq/gO0A3nmXJLELh1yCLJu/8oShZ1YoiTlxsZIrK6RKLTLZrLmEVyHqmcmyWkIKqu66TGYsy2YWpTkkrSCs71NfcxehqOPJ1s8S6ZXYfu7ab6iTJV9ClyIg4/sjlg4p+s9ED6EOkZXRPlG1Qw2qlVLOrQcJbnIwrEze+Tqs6LLnOmg/l0i61jnw4l4s2C80tryuvMXtbHKLg9muprOngbjqV3HMpON2JNW535JHDzdyAnG51HTiUwXmbfOH7JsCmgyf/wmEIktUcEG83ILQHts/oEye5qXk0dPnc3wm7biVwLmI8cxzH96kTZnIiraLD53/5g39SANq/YAysh8cROI9B+ryRZE2PQGjeCKTMG4usLw6BMzJNiOVtHcdLGF/n2JVhSyxQ2L7ob/wpJrLmVmHgwtm0+VqETD4sFsHBA7TSZhRyR53E49QcRZmlSBtTxQebNQip5A5XEeyrxZ3qwGANxzyWb6gXi7+0Q7WoqxDNkbN5P20AzXXxlesbxVos63ApOusFrGVtteZ0q3yGxIm73cEGenBxp/pddS/EqMS7GyPPVdDpqoqPu3okra+sMb0/Bmg2GPLlkGlFzarZqLynDul1xfCHTMVAyd3GPB/Sp7jTTSNRrHlfq9nebdn2c/TYSuBsfyqB2RWO19imTFX5LLn6Oz45FvQY+ptxCNHfpdJzpNw7nAB6FPz88wWjUXD1GGQclS+ZdcPn60FhGUslaUnZm6OaOEzF5MatvdxSmj2lGNUP1InFw9nw6LBaN9730eK6XGdb0DVdHpW1VR/Rkhh1B6i6INtDdSKoqMutMlDt6uxiyzTsTnWYhDc2JDXXzwPoZGPQuopHPKB6nVxsjLCxCXe1zNGsk98tknV3G51iJ9WztVkvWiZJWzU1O1LyGakYqtP3RRSg3WR10WEqkYsdllC80R2UUK9b6FsxZNEslF40CqllabBsSyYOGdLpaGqKUM2jE8NCl7haw3ibgPkOMiaLFXaZvk+96DBHkD70ZUJibVt7TTPJWt2YduaPK8TB4MyxaPn+vlHydYj1/hHInDcahTeMRWZLKax0S+qAfXZAmlkivfs6m27qqo/YDkXbUC3oweGZGHgL1zzPQvamehlKyxuGwbWoU23KcNJgUadLnhQ4S5ijI1rKFasl+/BHuKoGC9RJ3a6qHqnXdbxRgCiON/7JA+h+j7fGXn/3AC2JCT2468vKvCaulsZo7CHtTgKKcoY07HOI7zNF5jBZ0fweyzfUagIrfUjIuDN1aJVvaEXFfXXIqC8hq9mSxjKZGm9aPSbYuDwlxj4NPpoS1CU32kOPzxO+zLYs2/lE1jUfjHBJim07XNA9IhAIXEmuwxN00RaTLomj/PNF+tHVhXQR15G+97HFoBMlEO8fBfuB4fT1CBR8fQLyTxwAiwfB+jTPsOVyYiSuXJFYl6Fm8PmrQyi7ajwq18wUQhneDJz4KCOwKOhS5D1H8ry2Yg+gPT0ka1knbeNsqKgeAP6+HoWbWjBgxUyUfG6E7B2mWDAlbvyR2rQZnLf6A4F5hEklKaGg74jlbz4MYiShXA1STfrmgRYiIUDf308ATVY011RbBNIM1jl3HoX8zwyBvywEn5/rhoMymfpAZYVue7bwGhQ6KD5nGKraZkrFhIzjkVDF4RoJ9OkHaNMD6CMeoLOeUcyPvBc4PFPBDIf31WrP1UQqd7TyPrPtAzIzxmnZ3kMg/QphyhyfYQQ+VRUaHzuS08UjDdMF/XVSFvS8ERI3DhCoOvfR1wTYB6MhSRiOoucdTs83VDQwfxjS549BxpXD4R+ZCduyEfL5hV+kt5ZcRzcXMMeCMMKlmMibXYkhT81C4eYZyHxGuXnV6xqP+KnHHkB72u8VLFzh1DENudyBu7kVNWS8lF46CsGqFMU4Kfw3TD1g6D1k9Mlq1pbzVnp84Ige6voJlBK6mL/qG0AXxQA0Wb7z+weggwTQafeOQCppkEDaP5+BepiypslSz792DNIm5gnYKqJ4OyHZvsvX7OgbTG6ygIHMacWo4UaE9TMkRshE9t6GPTQAbe0D0OEIQHvX8vA3pKhJOzUbZmMA7YesmeVCW8ukZAG3y9VUTWNBnyLwOmBds2PvtW3rp47jnGwYZsAD58MB0LxgtYXImn8U0ueNQeh+Aur7xiFIlnRIKjM+unKoRMWh+bnGigYWqK/T7x2LtHljkXfLeGTOKqObKSCseokAOrbNXQE1M5b5yV0zkTI0EzU3TUUlgXThJq5dbUVp5wxPD6AlXa1S413TSdbWzVNhpSkObTPhtBLa5EzYf8VRGNA5G5VcUrhxpnctjwAt65qJqlWtKL+c9llNGoErszUGyXp2FKuiEeVvdptRDgDQO4LBwGN+v38Ag0lqSoqHqIcDoHnxAjUpSG8NIzirBP5jSWeHEaCvA/x4UFqSUIOk/mPo62PDyGgOwykPSHy5z0kLGTdkq4GgPJWkIoT8k2qQc+4QZJ891NM+aNZn6FqdPwSF5wxDXmM5zFRTmP7iUbDyBmcXmSebZE0upr8ZTtd6s
Kh3LQ+/Fp5B69FQJrzThunT3DCOGuZs7kMSZe4/sZxbtDWHxl7Sn5OeGQgEgoFg0EPSwwnQTP6fyqNqjEi8ScZf8cw2Ne2j/9XRE6ItiXFZMrU6zbRlUkzfSwl9EcJwJmv3s5XAI6QcX2Qatae9q0zS4dFatprEHDAcUSvOOjBoh2RYhKqZ5fFlDg8KtVW9unc9D69yhZMvoIZDWDocxaPuXGKyA9Gm2qpCYyfp045tD6qoHiTdzZ4cbgta5hA6evo2j+Xxq+9lIzoy6aO/NfL8ln5Nm8nAg8IP/f/bu7vXOKowjuM7c87M7HYT00iFVoS+SK0gKuKNFFHxQo1XXngj1BcaUIsg1CrUgl5ZxNaK1SalSm2qTdLYXnjbJv4LXljQpK1tQSxeiLSCbWNNjs85ZybZ1mz2rUl2k+8HfuzmbQM7O8+cPTtznpoKdNqpRLt1l5Vb70O5c6kjUk1y/qpN5frlaf/pvvuEP5xxJ7adaXzHHD9XrQO/OL1vKcbzubBRU9N/ca5koaMKXeTdgVorO9d8SorzJhWqgp1rLhSKVNDmKNBpx+Y0WScU19stnJv4/xGkK91lqW+d3qlWUq7dli36yj2+JhUTTvVo9FduZu2jVLmmqXapzvT3wpKLGni+Fz7ZqDlrjhxky6SWOVtDZe+Wk+TvKI77pFbcm13NjCYr0IGaOXNSoNMrHV1fwpv/Z1jLHHRuuhNz1gsu7ZIdBqSaqPSDo2zlssRNdakZT8Hyc9DK/dy3DUsP5Dmex2aIX/Ndl6RCgbZrsEfRaBzHm5XWbUvuasDWGkHfuCbG9AIp4ZzEv4Ai9zZZuYIQpvOewcytvcqktHNzacdsN0e6ZBNMZ2rNk0rbRLlu3nbKKafz/nFmbO/kL9F3U1PulEg1Z68RUus+pdIpDZ12W0qLc0nn+uk56/CKFOSvJfcHXHHSAiPoeY9fZ9avmpUmp25cA4DUldlWH6t+2zTyc9Js+5obLWtt4iiyo+YxrXV3kiRtXKZNga7Yqr402bQFqT++TZMq2xSBLMECLcVZivIVrdWQ3D5oLzZmrrmVpjhKT9ep84UQBrU8RuBX0Sp5O5ZL36YFOVJ33NkY6fKR5ZoE/287MCJejMkWIbOj5zhJzuk4fl2FYbstDG1tbVTH1inQ010RwrRZba1x3XvdeZSqqiVO7TKHSvnFwV0XcikuPiFpMH7H9LeVpiWyjhjZXHU92540X+z2j/x0xjXJsUjrh/Mygkr4ILD1CnTaFcGeoP6b7KwX5W9qjhylL8oLweZP+bpiP8VIisfKzqJZvfJ2s27VcrM2y8rl5m5Sd+zzd9edHWbFinbXndzNSbtP91XZU61km/1lt3u92540UdTU/QtxFJ1MkmSz7Ncd9nPAV97YSkVs1SkO2YiD+Xx+g2xQSZze1pY4jtfLW6ktSqurla9ezJmd77xkRr/fZ0aHPzE/D++VfGZGT35qxuQ+qS+jJ/eaUyP7zcE9b5v2QiyjKX8GR5gLy55qleTzH8RxtL6ebU6aK7L/bdBRtEH253WS9mJHZ25ZkemMRTAHHfY2epK6PUrL3z8pI7GKrbrs6XBf7N5mzIVvjTlz2Eye7TeTZ/rl/oBkkNQdef7OD5mRgQ9NRz6ZLtCzLye5JeADI6B5C7QU1f32D+o9Wd1es58W6CeqKdD2gpUDu98y5tyQmRzrM/+cPmKuSyZO95uJsQFSZybH5CD3yxEz0r/TF+j0w9jZmn7KNnuNU2GBRVygS0bQVRVoO8Xx5UdvGiMjZzN60BXpybFDcv+QmRztS1PL/UNV3u9r8P5c/p96HvvG77vn78wRM9wvI+hled81Y5YCLdtsUtJNgQYW+Qg6nSJ5XB7jcsX/l5MCvWurMReOydvyb2TUNyQ5mmaINJCJ89+ZEwO7zG3FvD8XevazauwHut3sKsDSKNCPVVOgbdF49eXnzPEDO8zQ59vMYM92M9jrM9DzrhnoTTPT/XLf66nie4v5cdzX283R3h3mva0vmoKbg/bdUbKF2inQACPoqgp0FGlTkBRjSeKzTJKPSb0p2ETKJJGa6pReYZVACjTQ7AU6CIKeRgt0OgddXYEOuJhgzlLbFWcUaGC+pV29qy7Q8if7GinQ2f+seoqDAj1nCWou0AEFGphPttBKVkl+qmY9Bvm9PfI2uOEpDnmsRyWXWAuhRdZrCINJOah2hyFncQDzRmudi6KoKLcnyi2Gkwuyt8PBdSnMm1UY5updfrC0QMsOf5ni1zIrnTGCBhaiQLspB6VeldHRePl1fV2zyB+lQK9t5CrCbA5aslFCgW6dAj0h2cR50MA8kxF0LtK6M4rjwzqK/g3Tzgquc7d2HXwl0e/yey8kceSKeiPs6Fsey64BcNa3cA9Z3L05pzXSbe9eA39IHpH77DDA/Ap8kY7jO+IkeV92xLG01c24FM9L8vWw/Lwrn8+rJEluyQFByZ4eRfpjWwDsMpZKsWh8sxZouxylHLiPy6Yrsng7sEBs8S0UCsqObqVodoVKPS875EYZ8XauW7P6lk+tSNZInR7xRZoC3YwF2hZn2f4/yO1D9vXR6LsnAC2gWCzaeW87ml4rO70dSZ+TgnAlCIJxOSiMy+01sgAJg2tycLa5KtvhV62jr2QbPWCnninOwBJjd3opBlID9D1y/ynJszZSHJ65KV1kPqK65Pnvkg3ytNzeZ5trMK0BAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAADQDP4DsHeuiRev2FEAAAAldEVYdGRhdGU6Y3JlYXRlADIwMjMtMDgtMTBUMTA6MDI6NDkrMDA6MDAzy2cWAAAAJXRFWHRkYXRlOm1vZGlmeQAyMDIzLTA4LTEwVDEwOjAyOjQ5KzAwOjAwQpbfqgAAAABJRU5ErkJggg==;clipPath=inset(20.33% 0% 21.33% 0%);\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;675.6399999999999\&quot; y=\&quot;1848.5\&quot; width=\&quot;102.86\&quot; height=\&quot;60\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;iroXu6kSOUnqGuu2dOUE-92\&quot; value=\&quot;模型副本\&quot; 
style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontStyle=1;fontColor=#808080;fontSize=15;\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;638.5\&quot; y=\&quot;1968.5\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;iroXu6kSOUnqGuu2dOUE-93\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;iroXu6kSOUnqGuu2dOUE-94\&quot; target=\&quot;iroXu6kSOUnqGuu2dOUE-95\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;iroXu6kSOUnqGuu2dOUE-94\&quot; value=\&quot;随机小批量\&quot; style=\&quot;shape=cylinder3;whiteSpace=wrap;html=1;boundedLbl=1;backgroundOutline=1;size=15;fillColor=#d5e8d4;strokeColor=#82b366;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;708.5000000000001\&quot; y=\&quot;1918.5\&quot; width=\&quot;70\&quot; height=\&quot;60\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;iroXu6kSOUnqGuu2dOUE-95\&quot; value=\&quot;本地梯度\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#ffe6cc;strokeColor=#d79b00;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;820.61\&quot; y=\&quot;1898.5\&quot; width=\&quot;80\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;iroXu6kSOUnqGuu2dOUE-96\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;entryX=0;entryY=0.5;entryDx=0;entryDy=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;iroXu6kSOUnqGuu2dOUE-91\&quot; target=\&quot;iroXu6kSOUnqGuu2dOUE-95\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;788.5\&quot; y=\&quot;1958.5\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;830.5\&quot; y=\&quot;1933.5\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;iroXu6kSOUnqGuu2dOUE-105\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=2;strokeColor=#999999;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitX=1;exitY=0.5;exitDx=0;exitDy=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;iroXu6kSOUnqGuu2dOUE-87\&quot; target=\&quot;iroXu6kSOUnqGuu2dOUE-62\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;909\&quot; y=\&quot;1375\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;1030\&quot; y=\&quot;1670\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;iroXu6kSOUnqGuu2dOUE-107\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=2;strokeColor=#999999;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitX=1;exitY=0.5;exitDx=0;exitDy=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;iroXu6kSOUnqGuu2dOUE-95\&quot; target=\&quot;iroXu6kSOUnqGuu2dOUE-62\&quot;&gt;\n      
    &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;913\&quot; y=\&quot;1555\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;1030\&quot; y=\&quot;1670\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;iroXu6kSOUnqGuu2dOUE-108\&quot; value=\&quot;全局梯度\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#ffe6cc;strokeColor=#d79b00;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;1030\&quot; y=\&quot;1645\&quot; width=\&quot;80\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;iroXu6kSOUnqGuu2dOUE-109\&quot; value=\&quot;&amp;lt;span style=&amp;quot;font-family: Helvetica; font-size: 11px; font-style: normal; font-variant-ligatures: normal; font-variant-caps: normal; letter-spacing: normal; orphans: 2; text-align: center; text-indent: 0px; text-transform: none; widows: 2; word-spacing: 0px; -webkit-text-stroke-width: 0px; white-space: nowrap; background-color: rgb(255, 255, 255); text-decoration-thickness: initial; text-decoration-style: initial; text-decoration-color: initial; float: none; display: inline !important;&amp;quot;&amp;gt;AllReduce&amp;lt;/span&amp;gt;&amp;lt;div&amp;gt;&amp;lt;span style=&amp;quot;font-family: Helvetica; font-size: 11px; font-style: normal; font-variant-ligatures: normal; font-variant-caps: normal; letter-spacing: normal; orphans: 2; text-align: center; text-indent: 0px; text-transform: none; widows: 2; word-spacing: 0px; -webkit-text-stroke-width: 0px; white-space: nowrap; background-color: rgb(255, 255, 255); text-decoration-thickness: initial; text-decoration-style: initial; text-decoration-color: initial; float: none; display: inline !important;&amp;quot;&amp;gt;规约&amp;lt;/span&amp;gt;&amp;lt;/div&amp;gt;\&quot; style=\&quot;text;whiteSpace=wrap;html=1;fontColor=#CC0000;fontStyle=1\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;940\&quot; y=\&quot;1630\&quot; width=\&quot;60\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n      &lt;/root&gt;\n    &lt;/mxGraphModel&gt;\n  &lt;/diagram&gt;\n&lt;/mxfile&gt;\n&quot;}"></div>
<script type="text/javascript" src="https://viewer.diagrams.net/js/viewer-static.min.js"></script>

<h3 id="模型并行大模型">模型并行(大模型)</h3>

<p>【2023-8-28】<a href="https://zhuanlan.zhihu.com/p/87596314">模型并行最佳实践(PyTorch)</a></p>

<p>DataParallel 的优缺点如下(基本用法见下方示意代码):</p>
<ul>
  <li>优点:将模型<strong>复制</strong>到所有GPU,其中每个GPU消耗输入数据的不同分区,可以极大地加快训练过程。</li>
  <li>缺点:不适用于<strong>模型太大</strong>、单张 GPU 放不下的场景。</li>
</ul>
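<p>上面提到的 DataParallel,其基本用法大致如下。这是一个最小示意,假设单机多卡环境,模型结构与尺寸均为演示用的假设值:</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch
import torch.nn as nn

# 演示用的小模型(结构为假设,仅作说明)
model = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU(), nn.Linear(1024, 10))

# DataParallel:把模型复制到每张可见 GPU,按 batch 维度自动切分输入、汇总输出
if torch.cuda.device_count() &gt; 1:
    model = nn.DataParallel(model)
model = model.cuda()

x = torch.randn(64, 1024).cuda()    # 一个 batch,前向时会被均分到各张卡
out = model(x)                      # scatter 输入、各卡并行前向、gather 输出
</code></pre></div></div>

<p>注意:DataParallel 仍要求每张卡都放得下完整模型,这正是上面缺点所指的限制。</p>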

<p>模型并行性(Model parallelism: MP)目的是解决<strong>模型权重不能适应单个节点</strong>的情况,通过将计算和模型参数分布在多台机器上进行训练。</p>
<ul>
  <li>数据并行中,每个worker承载整个模型的<strong>完整副本</strong></li>
  <li>而模型并行中,每个worker上只分配模型参数的一小部分,从而减少了内存使用和计算。</li>
</ul>

<p>原理</p>
<blockquote>
  <ul>
    <li>将单个模型拆分到不同GPU上,而不是在每个GPU上复制整个模型</li>
    <li>将模型的不同<strong>子网</strong>放置到不同设备上,并相应地实现 <strong>forward 方法</strong>,在设备之间传递中间输出。由于每个设备只承载模型的一部分,一组设备就可以共同服务一个更大的模型。</li>
  </ul>
</blockquote>

<p>模型 m 包含10层:</p>
<ul>
  <li>DataParallel: 每个GPU都具有这10层中每层副本</li>
  <li>而在两个GPU上使用模型并行时,每个GPU可以托管其中5层(拆分方式见下方示意代码)</li>
</ul>
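<p>以上面的 10 层模型为例,下面给出一个两卡模型并行的最小示意。假设机器上有 cuda:0 和 cuda:1 两张卡,层宽 1024 为假设值,思路与上文引用的 PyTorch 模型并行教程一致:</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch
import torch.nn as nn

class ModelParallelNet(nn.Module):
    def __init__(self):
        super().__init__()
        # 前 5 层放在 cuda:0,后 5 层放在 cuda:1
        self.part1 = nn.Sequential(*[nn.Linear(1024, 1024) for _ in range(5)]).to('cuda:0')
        self.part2 = nn.Sequential(*[nn.Linear(1024, 1024) for _ in range(5)]).to('cuda:1')

    def forward(self, x):
        h = self.part1(x.to('cuda:0'))      # 在 cuda:0 上计算前半段
        return self.part2(h.to('cuda:1'))   # 中间激活搬到 cuda:1,继续计算后半段

model = ModelParallelNet()
out = model(torch.randn(8, 1024))           # 输出位于 cuda:1
out.sum().backward()                        # 反向传播自动跨设备回传梯度
</code></pre></div></div>

<p>可以看到,这种朴素拆分在任一时刻只有一张卡在工作,这也是后文引入流水线并行的动机。</p>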

<p>由于深度神经网络通常由一叠垂直堆叠的层组成,把一个大型模型逐层拆分看起来很直接:将一组连续的层划为一个分区,交给同一个 worker。然而,各分区之间存在顺序依赖,按数据批逐个串行执行会带来大量的<strong>等待时间</strong>(气泡),导致计算资源<strong>利用率低下</strong>。</p>

<p>模型并行有两种:张量并行 和 流水线并行</p>
<ul>
  <li>张量并行是在<strong>一个操作</strong>中进行并行计算,如:矩阵-矩阵乘法。</li>
  <li>流水线并行是在<strong>各层</strong>之间进行并行计算。</li>
</ul>

<p>总结</p>
<ul>
  <li>张量并行是<strong>层内</strong>并行,流水线并行是<strong>层间</strong>并行。</li>
</ul>

<h4 id="流水线并行综合模型数据">流水线并行(综合模型+数据)</h4>

<p>流水线并行(Pipeline parallelism, PP)将<code class="language-plaintext highlighter-rouge">模型并行</code>与<code class="language-plaintext highlighter-rouge">数据并行</code>相结合,以减少训练过程中设备的空闲时间。</p>

<p>主要思想</p>
<ul>
  <li>将一个小批量拆分为多个<strong>微批次</strong>,使不同阶段的 worker 在同一时刻可以分别处理不同的微批次。需要注意的是,每个微批次需要<strong>两次传递</strong>,一次向前,一次向后;worker 之间的通信仅传输激活(向前)和梯度(向后)。微批次的调度方式以及梯度的聚合方式在不同方法中有所不同;分区(worker)的数量也称为流水线深度(pipeline depth)。一个简化的微批次处理流程见下方示意代码。</li>
</ul>
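<p>下面是一个两阶段流水线的微批次处理草图。假设 stage0、stage1 是已分别放到 cuda:0、cuda:1 上的两个子模块(假设值);为简洁起见按 GPipe 的思路顺序处理各微批次,真实实现会让不同阶段的前向/反向相互重叠:</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch
import torch.nn.functional as F

def pipeline_step(stage0, stage1, batch, targets, num_microbatches=4):
    # 把一个小批量拆成若干微批次
    micro_x = torch.chunk(batch, num_microbatches)
    micro_y = torch.chunk(targets, num_microbatches)
    total_loss = 0.0
    for x, y in zip(micro_x, micro_y):
        h = stage0(x.to('cuda:0'))                 # 阶段0 前向
        out = stage1(h.to('cuda:1'))               # 中间激活传给阶段1,继续前向
        loss = F.mse_loss(out, y.to('cuda:1'))     # 只在最后一个阶段计算 loss
        loss.backward()                            # 反向:梯度沿流水线回传,并在各微批次间累积
        total_loss += loss.item()
    return total_loss / num_microbatches
</code></pre></div></div>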

<p>模型按层分割成若干块,每块都交给一个设备。</p>
<ul>
  <li>前向传播: 每个设备将中间激活传递给下一个阶段。</li>
  <li>后向传播: 每个设备将输入张量梯度传回给前一个流水线阶段。</li>
</ul>

<p>这允许设备同时进行计算,从而增加训练的吞吐量。</p>
<ul>
  <li><img src="https://mmbiz.qpic.cn/mmbiz_png/J0mLianhFicBHEDwE5nPHZKaicqsXBVgES5DlibCDBUbdVthPzoeI9mIVglwvVYick56NFeyhOnRJ6Ly62WPXHgRPvg/640?wx_fmt=png&amp;tp=wxpic&amp;wxfrom=5&amp;wx_lazy=1&amp;wx_co=1" alt="img" /></li>
</ul>

<p>缺点</p>
<ul>
  <li>训练设备容易出现空闲状态(因为后一阶段等待前一阶段执行完毕),导致计算资源的浪费,加速效率没有数据并行高。</li>
  <li><img src="https://mmbiz.qpic.cn/mmbiz_png/J0mLianhFicBHEDwE5nPHZKaicqsXBVgES5wlPicE0gibFZYkicXOG7gwQWYDH4xyzf7uW4EAL6h45upGeia8LGZ99Bzg/640?wx_fmt=png&amp;tp=wxpic&amp;wxfrom=5&amp;wx_lazy=1&amp;wx_co=1" alt="img" /></li>
</ul>

<p>典型的流水线并行实现:</p>
<ul>
  <li>GPipe、PipeDream、PipeDream-2BW、PipeDream Flush(1F1B)。</li>
</ul>

<h4 id="张量并行水平分割">张量并行(水平分割)</h4>

<p>模型并行和流水线并行都是将模型垂直分割;与之相对,也可以把一个张量操作的计算水平分割到多个设备上,这称为<strong>张量并行</strong>(tensor parallelism,TP)。</p>
<ul>
  <li>张量并行将张量沿特定维度分成 N 块,每个设备只持有整个张量的 1/N,同时不影响计算图的正确性。</li>
  <li>这需要额外的通信来确保结果的正确性。</li>
  <li><img src="https://mmbiz.qpic.cn/mmbiz_png/J0mLianhFicBHEDwE5nPHZKaicqsXBVgES53FR1KDRnTBHAKwRtd9rEo3TOxgrKA5ZaqBVYZ3QIKGwU2OTW7AklIQ/640?wx_fmt=png&amp;tp=wxpic&amp;wxfrom=5&amp;wx_lazy=1&amp;wx_co=1" alt="" /></li>
</ul>

<p>以当下比较流行的 Transformer 为例:Transformer 模型主要由多层 MLP 和自注意力块组成。Megatron-LM(Shoeybi et al., 2020)采用了一种简单的方法来并行化 MLP 和自注意力的计算。Transformer 中的 MLP 层由 GEMM(通用矩阵乘法)加 GeLU 非线性激活组成,Megatron 将第一个权重矩阵 A 按列拆分到各个设备上;由于 GeLU 是逐元素操作,各设备可以独立计算各自分块的 GeLU(XA_i),中间无需通信。</p>
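<p>下面用一小段代码验证这种按列拆分的数学等价性。仅作示意:用 CPU 张量模拟两个设备各持有一半权重,尺寸均为假设值:</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch
import torch.nn.functional as F

X = torch.randn(4, 8)        # 输入 [batch, hidden]
A = torch.randn(8, 16)       # MLP 第一层权重 [hidden, 4*hidden]

A1, A2 = A.chunk(2, dim=1)   # 按列拆分:两个设备各持有一半权重
Y1 = F.gelu(X @ A1)          # 设备0 的局部结果,无需通信
Y2 = F.gelu(X @ A2)          # 设备1 的局部结果,无需通信

Y_parallel = torch.cat([Y1, Y2], dim=1)   # 拼接局部结果
Y_full = F.gelu(X @ A)                    # 不拆分时的参考结果
print(torch.allclose(Y_parallel, Y_full, atol=1e-6))   # True:两者等价
</code></pre></div></div>

<p>MLP 的第二个线性层则把权重按<strong>行</strong>拆分,每个设备直接消费本地的部分结果,前向最后做一次 all-reduce 求和即可得到正确输出。</p>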

<p>典型的张量并行实现:</p>
<ul>
  <li>Megatron-LM(1D)</li>
  <li>Colossal-AI(2D、2.5D、3D)</li>
</ul>

<h3 id="多维混合并行">多维混合并行</h3>

<p>多维混合并行指将<code class="language-plaintext highlighter-rouge">数据并行</code>、<code class="language-plaintext highlighter-rouge">模型并行</code>和<code class="language-plaintext highlighter-rouge">流水线并行</code>结合起来进行分布式训练。</p>

<p>超大规模模型的预训练和全参数微调时,都需要用到多维混合并行。</p>

<h4 id="2d-并行">2D 并行</h4>

<p>主要有</p>
<ul>
  <li>Data 并行+ pipeline 并行
    <ul>
      <li>DeepSpeed <a href="https://www.deepspeed.ai/tutorials/pipeline/">web-link</a>给出了 pipeline 和 data-parallel 的 2D 并行示意图:rank0 与 rank1 之间做 data parallelism,rank0 里的 gpu-0 和 gpu-2 之间做 pipeline 并行,它们交替执行前向和反向过程。示意图中虽未画出最终的 loss,但实际上由最后一个流水线阶段计算 loss,梯度再沿流水线逐级回传到前面的阶段。</li>
      <li><img src="https://pic1.zhimg.com/80/v2-aed6e288293f97bfc17ed0b3c9087290_1440w.webp" alt="" /></li>
    </ul>
  </li>
  <li>Tensor 并行 + pipeline
    <ul>
      <li><img src="https://pic1.zhimg.com/80/v2-af38ecbaa5ccb7059c97e8774e484370_1440w.webp" alt="" /></li>
    </ul>
  </li>
</ul>

<h4 id="3d-并行">3D 并行</h4>

<p>3D并行 =&gt; Tensor + pipeline + data</p>

<p><img src="https://pic1.zhimg.com/80/v2-e3446e66333f5df91b960933965a6c64_1440w.webp" alt="" /></p>

<h3 id="异构系统并行">异构系统并行</h3>

<p>与 GPU 相比,CPU 内存要大得多。</p>
<ul>
  <li>典型服务器上,CPU 可以轻松拥有几百GB甚至上TB的内存,而每张 GPU 卡通常只有 48 或 80 GB的内存。</li>
</ul>

<p>那么,为什么不把 CPU 内存也用到分布式训练里呢?异构系统并行正是这个思路:</p>
<ul>
  <li>依靠 CPU 内存甚至 NVMe 磁盘来辅助训练大型模型。</li>
  <li>主要想法: 在暂时不使用某个张量时,将其卸载(offload)到 CPU 内存或 NVMe 磁盘,需要计算时再取回 GPU。</li>
</ul>

<p>通过使用异构系统架构,有可能在一台机器上容纳一个巨大的模型。</p>
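<p>工程上,DeepSpeed 的 ZeRO-Offload / ZeRO-Infinity 是这种异构思路的代表实现之一。下面是一个配置片段示意(字段含义请以 DeepSpeed 官方文档为准,数值均为假设):</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code># DeepSpeed 风格的异构训练配置示意(以 Python dict 表示,可序列化为 ds_config.json)
ds_config = {
    "train_micro_batch_size_per_gpu": 4,
    "fp16": {"enabled": True},
    "zero_optimization": {
        "stage": 2,                          # 切分优化器状态与梯度
        "offload_optimizer": {               # 把优化器状态卸载到 CPU 内存
            "device": "cpu",
            "pin_memory": True
        }
    }
}
</code></pre></div></div>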

<h3 id="自动搜索并行空间">自动搜索并行空间</h3>

<h4 id="alpa">alpa</h4>

<p><a href="https://github.com/alpa-projects/alpa">Alpa</a> 将并行空间分为 <code class="language-plaintext highlighter-rouge">inter-op</code> (pipeline) 与 <code class="language-plaintext highlighter-rouge">intra-op</code> (tensor并行),使用 NAS搜索这两个空间,考虑整个搜索空间的cost。</p>
<ul>
  <li>首先搜索 inter-op 的搜索空间, 制定 pipeline 并行策略</li>
  <li>然后搜索 intra-op空间, 指定 data-para 与 operator-para 策略(包括两种)</li>
  <li>Data para</li>
  <li>Operator parallel (weight 广播,input拆分)</li>
  <li>Operator parallel (weight 拆分,input拆分) –&gt; 需要增加 all-reduce cost</li>
</ul>

<p>Alpa 是 UCB 博士 <code class="language-plaintext highlighter-rouge">郑怜悯</code> 的工作,他还参与过 Ansor、TVM、vLLM、FastChat、LMSYS-Chat-1M 等项目。</p>
<ul>
  <li><img src="https://pic1.zhimg.com/80/v2-e76476aa985ccf7bc12fb2c04a60feec_1440w.webp" alt="" /></li>
</ul>

<h2 id="模型训练开销">模型训练开销</h2>

<p>神经网络模型占用的显存包括:</p>
<ul>
  <li>模型自身的参数</li>
  <li>模型的输出</li>
</ul>

<p>全连接网络(不考虑偏置项 b): Y = XW</p>
<ul>
  <li>X 是 B×M 维(B 为 batch size)</li>
  <li>W 是 M×N 维</li>
  <li>Y 是 B×N 维</li>
</ul>

<p>显存占用包括:</p>
<ul>
  <li>参数:二维数组 W</li>
  <li>模型的输出: 二维数组 Y</li>
  <li>X是上一层的输出,因此显存占用归于上一层。</li>
</ul>

<p>显存占用就是W和Y两个数组?非也</p>
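<p>训练时除了参数 W 和输出 Y,还要保存梯度和优化器状态。下面用一段小代码粗略估算单层的占用(B、M、N 为假设值,float32 每个元素 4 字节):</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code># 粗略估算一层全连接的显存占用(单位 MB)
B, M, N = 32, 4096, 4096
BYTES = 4                                   # float32

w_mem = M * N * BYTES / 1024**2             # 参数 W
y_mem = B * N * BYTES / 1024**2             # 输出(激活)Y
print(f"W: {w_mem:.0f} MB, Y: {y_mem:.1f} MB")     # W: 64 MB, Y: 0.5 MB

# 训练时以 Adam 为例,每个参数还需保存梯度和两份优化器状态:
# 约为 参数4B + 梯度4B + 一阶动量4B + 二阶动量4B = 16 字节
train_w_mem = M * N * 16 / 1024**2
print(f"训练态下仅与 W 相关的显存约 {train_w_mem:.0f} MB")   # 256 MB
</code></pre></div></div>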

<h3 id="模型训练流程">模型训练流程</h3>

<!-- draw.io diagram -->
<div class="mxgraph" style="max-width:100%;border:1px solid transparent;" data-mxgraph="{&quot;highlight&quot;:&quot;#0000ff&quot;,&quot;nav&quot;:true,&quot;resize&quot;:true,&quot;toolbar&quot;:&quot;zoom layers tags lightbox&quot;,&quot;edit&quot;:&quot;_blank&quot;,&quot;xml&quot;:&quot;&lt;mxfile host=\&quot;app.diagrams.net\&quot; modified=\&quot;2024-05-11T07:43:47.099Z\&quot; agent=\&quot;Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/124.0.0.0 Safari/537.36\&quot; etag=\&quot;dPvaBv-ThdCmPTVmmKJL\&quot; version=\&quot;24.4.0\&quot;&gt;\n  &lt;diagram id=\&quot;xdYpP7w1t2VaaceZiyqw\&quot; name=\&quot;第 1 页\&quot;&gt;\n    &lt;mxGraphModel dx=\&quot;1242\&quot; dy=\&quot;-408\&quot; grid=\&quot;1\&quot; gridSize=\&quot;10\&quot; guides=\&quot;1\&quot; tooltips=\&quot;1\&quot; connect=\&quot;1\&quot; arrows=\&quot;1\&quot; fold=\&quot;1\&quot; page=\&quot;1\&quot; pageScale=\&quot;1\&quot; pageWidth=\&quot;827\&quot; pageHeight=\&quot;1169\&quot; math=\&quot;0\&quot; shadow=\&quot;0\&quot;&gt;\n      &lt;root&gt;\n        &lt;mxCell id=\&quot;0\&quot; /&gt;\n        &lt;mxCell id=\&quot;1\&quot; parent=\&quot;0\&quot; /&gt;\n        &lt;mxCell id=\&quot;KTwht3HF3Dpf_-XckZrt-1\&quot; value=\&quot;分布式训练\&quot; style=\&quot;text;html=1;strokeColor=none;fillColor=none;align=center;verticalAlign=middle;whiteSpace=wrap;rounded=0;fontSize=21;rotation=0;strokeWidth=3;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;620.71\&quot; y=\&quot;1200\&quot; width=\&quot;224.5\&quot; height=\&quot;33\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-14\&quot; value=\&quot;\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f5f5f5;fontColor=#333333;strokeColor=default;dashed=1;dashPattern=1 1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;260\&quot; y=\&quot;1465.5\&quot; width=\&quot;240\&quot; height=\&quot;143\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-15\&quot; value=\&quot;CPU节点\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontStyle=1;fontColor=#6666FF;fontSize=15;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot;&gt;\n          &lt;mxGeometry x=\&quot;380\&quot; y=\&quot;1450\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-3\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;entryPerimeter=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;MzKt8NfVXthm0VmUftic-16\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-7\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-9\&quot; value=\&quot;随机划分\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;tbJsgM7FzyQLb-bkkZpG-3\&quot;&gt;\n          &lt;mxGeometry x=\&quot;0.0562\&quot; y=\&quot;-3\&quot; relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint y=\&quot;-11\&quot; as=\&quot;offset\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n 
       &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-18\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;MzKt8NfVXthm0VmUftic-16\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-17\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-16\&quot; value=\&quot;数据集\&quot; style=\&quot;shape=cylinder3;whiteSpace=wrap;html=1;boundedLbl=1;backgroundOutline=1;size=15;fillColor=#60a917;strokeColor=#2D7600;fontColor=#ffffff;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;280\&quot; y=\&quot;1693\&quot; width=\&quot;80\&quot; height=\&quot;70\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-20\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;MzKt8NfVXthm0VmUftic-17\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-19\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-17\&quot; value=\&quot;模型权重\&quot; style=\&quot;shape=cylinder3;whiteSpace=wrap;html=1;boundedLbl=1;backgroundOutline=1;size=15;fillColor=#f8cecc;strokeColor=#b85450;fontSize=15;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;285\&quot; y=\&quot;1330\&quot; width=\&quot;70\&quot; height=\&quot;60\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-20\&quot; value=\&quot;\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f5f5f5;fontColor=#333333;strokeColor=default;dashed=1;dashPattern=1 1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;662.86\&quot; y=\&quot;1305\&quot; width=\&quot;377.14\&quot; height=\&quot;135\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-13\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;entryX=0;entryY=0.5;entryDx=0;entryDy=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;MzKt8NfVXthm0VmUftic-78\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-14\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-22\&quot; value=\&quot;GPU节点0\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontStyle=1;fontColor=#7F00FF;fontSize=15;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot;&gt;\n          &lt;mxGeometry x=\&quot;765.6800000000001\&quot; y=\&quot;1290\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-23\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=4;strokeColor=#999999;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitX=1;exitY=0.5;exitDx=0;exitDy=0;entryPerimeter=0;\&quot; parent=\&quot;1\&quot; source=\&quot;MzKt8NfVXthm0VmUftic-14\&quot; 
target=\&quot;MzKt8NfVXthm0VmUftic-20\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;565\&quot; y=\&quot;1535\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;595\&quot; y=\&quot;990\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-33\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=4;strokeColor=#999999;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitX=1;exitY=0.5;exitDx=0;exitDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;MzKt8NfVXthm0VmUftic-14\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-21\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;565\&quot; y=\&quot;1535\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;662.8600000000001\&quot; y=\&quot;1537\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-35\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=4;strokeColor=#999999;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitX=1;exitY=0.5;exitDx=0;exitDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;MzKt8NfVXthm0VmUftic-14\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-30\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;550\&quot; y=\&quot;1540\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;662.8600000000001\&quot; y=\&quot;1757.5\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-2\&quot; value=\&quot;随机小批量数据集\&quot; style=\&quot;shape=cylinder3;whiteSpace=wrap;html=1;boundedLbl=1;backgroundOutline=1;size=15;fillColor=#d5e8d4;strokeColor=#82b366;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;429\&quot; y=\&quot;1733\&quot; width=\&quot;70\&quot; height=\&quot;60\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-11\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;tbJsgM7FzyQLb-bkkZpG-5\&quot; target=\&quot;MzKt8NfVXthm0VmUftic-78\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;Array as=\&quot;points\&quot;&gt;\n              &lt;mxPoint x=\&quot;830\&quot; y=\&quot;1390\&quot; /&gt;\n              &lt;mxPoint x=\&quot;830\&quot; y=\&quot;1360\&quot; /&gt;\n            &lt;/Array&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-5\&quot; value=\&quot;随机小批量数据集\&quot; style=\&quot;shape=cylinder3;whiteSpace=wrap;html=1;boundedLbl=1;backgroundOutline=1;size=15;fillColor=#d5e8d4;strokeColor=#82b366;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;697.96\&quot; y=\&quot;1360\&quot; width=\&quot;70\&quot; height=\&quot;60\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        
&lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-7\&quot; value=\&quot;随机小批量数据集\&quot; style=\&quot;shape=cylinder3;whiteSpace=wrap;html=1;boundedLbl=1;backgroundOutline=1;size=15;fillColor=#d5e8d4;strokeColor=#82b366;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;429\&quot; y=\&quot;1698\&quot; width=\&quot;70\&quot; height=\&quot;60\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-8\&quot; value=\&quot;随机小批量数据集\&quot; style=\&quot;shape=cylinder3;whiteSpace=wrap;html=1;boundedLbl=1;backgroundOutline=1;size=15;fillColor=#d5e8d4;strokeColor=#82b366;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;429\&quot; y=\&quot;1660\&quot; width=\&quot;70\&quot; height=\&quot;60\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-10\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;MzKt8NfVXthm0VmUftic-77\&quot; target=\&quot;MzKt8NfVXthm0VmUftic-78\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-77\&quot; value=\&quot;模型权重\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f8cecc;strokeColor=#b85450;shadow=1;fontSize=17;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;687.28\&quot; y=\&quot;1315\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-78\&quot; value=\&quot;本地梯度\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#e1d5e7;strokeColor=#9673a6;shadow=1;fontSize=17;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;880\&quot; y=\&quot;1345\&quot; width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-79\&quot; value=\&quot;优化器\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;924.3199999999999\&quot; y=\&quot;1400\&quot; width=\&quot;71.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;MzKt8NfVXthm0VmUftic-80\&quot; value=\&quot;激活函数\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;834.32\&quot; y=\&quot;1400\&quot; width=\&quot;80\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-63\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;tbJsgM7FzyQLb-bkkZpG-14\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-62\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-14\&quot; 
value=\&quot;小批量随机梯度下降\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;1120\&quot; y=\&quot;1498.5\&quot; width=\&quot;100\&quot; height=\&quot;60\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-57\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;entryX=1;entryY=0;entryDx=0;entryDy=30;entryPerimeter=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;tbJsgM7FzyQLb-bkkZpG-15\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-16\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-15\&quot; value=\&quot;全局梯度\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#C3ABD0;strokeColor=#C3ABD0;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;415\&quot; y=\&quot;1494.5\&quot; width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-16\&quot; value=\&quot;模型\&quot; style=\&quot;shape=cylinder3;whiteSpace=wrap;html=1;boundedLbl=1;backgroundOutline=1;size=15;fillColor=#f8cecc;strokeColor=#b85450;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;290\&quot; y=\&quot;1480\&quot; width=\&quot;60\&quot; height=\&quot;50\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-17\&quot; value=\&quot;数据\&quot; style=\&quot;shape=cylinder3;whiteSpace=wrap;html=1;boundedLbl=1;backgroundOutline=1;size=15;fillColor=#d5e8d4;strokeColor=#82b366;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;290\&quot; y=\&quot;1535\&quot; width=\&quot;60\&quot; height=\&quot;50\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-19\&quot; value=\&quot;全部模型权重\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f8cecc;strokeColor=#b85450;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;407.64\&quot; y=\&quot;1345\&quot; width=\&quot;112.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-21\&quot; value=\&quot;\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f5f5f5;fontColor=#333333;strokeColor=default;dashed=1;dashPattern=1 1;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;670\&quot; y=\&quot;1473.5\&quot; width=\&quot;377.14\&quot; height=\&quot;135\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-22\&quot; value=\&quot;GPU节点1\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontStyle=1;fontColor=#7F00FF;fontSize=15;\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;772.82\&quot; y=\&quot;1458.5\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-23\&quot; 
value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;tbJsgM7FzyQLb-bkkZpG-24\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-27\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;Array as=\&quot;points\&quot;&gt;\n              &lt;mxPoint x=\&quot;837.14\&quot; y=\&quot;1558.5\&quot; /&gt;\n              &lt;mxPoint x=\&quot;837.14\&quot; y=\&quot;1528.5\&quot; /&gt;\n            &lt;/Array&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-24\&quot; value=\&quot;随机小批量数据集\&quot; style=\&quot;shape=cylinder3;whiteSpace=wrap;html=1;boundedLbl=1;backgroundOutline=1;size=15;fillColor=#d5e8d4;strokeColor=#82b366;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;705.1\&quot; y=\&quot;1528.5\&quot; width=\&quot;70\&quot; height=\&quot;60\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-25\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;tbJsgM7FzyQLb-bkkZpG-26\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-27\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-26\&quot; value=\&quot;模型权重\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f8cecc;strokeColor=#b85450;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;694.42\&quot; y=\&quot;1483.5\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-27\&quot; value=\&quot;本地梯度\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#e1d5e7;strokeColor=#9673a6;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;887.14\&quot; y=\&quot;1513.5\&quot; width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-28\&quot; value=\&quot;优化器\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;931.4599999999999\&quot; y=\&quot;1568.5\&quot; width=\&quot;71.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-29\&quot; value=\&quot;激活函数\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;841.46\&quot; y=\&quot;1568.5\&quot; width=\&quot;80\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-30\&quot; value=\&quot;\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f5f5f5;fontColor=#333333;strokeColor=default;dashed=1;dashPattern=1 1;\&quot; vertex=\&quot;1\&quot; 
parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;670\&quot; y=\&quot;1668\&quot; width=\&quot;377.14\&quot; height=\&quot;135\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-31\&quot; value=\&quot;GPU节点2\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontStyle=1;fontColor=#7F00FF;fontSize=15;\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;772.82\&quot; y=\&quot;1653\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-32\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;tbJsgM7FzyQLb-bkkZpG-33\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-36\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;Array as=\&quot;points\&quot;&gt;\n              &lt;mxPoint x=\&quot;837.14\&quot; y=\&quot;1753\&quot; /&gt;\n              &lt;mxPoint x=\&quot;837.14\&quot; y=\&quot;1723\&quot; /&gt;\n            &lt;/Array&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-33\&quot; value=\&quot;随机小批量数据集\&quot; style=\&quot;shape=cylinder3;whiteSpace=wrap;html=1;boundedLbl=1;backgroundOutline=1;size=15;fillColor=#d5e8d4;strokeColor=#82b366;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;705.1\&quot; y=\&quot;1723\&quot; width=\&quot;70\&quot; height=\&quot;60\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-34\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;tbJsgM7FzyQLb-bkkZpG-35\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-36\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-35\&quot; value=\&quot;模型权重\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f8cecc;strokeColor=#b85450;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;694.42\&quot; y=\&quot;1678\&quot; width=\&quot;91.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-36\&quot; value=\&quot;本地梯度\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#e1d5e7;strokeColor=#9673a6;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;887.14\&quot; y=\&quot;1708\&quot; width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-37\&quot; value=\&quot;优化器\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;931.4599999999999\&quot; y=\&quot;1763\&quot; width=\&quot;71.36\&quot; height=\&quot;30\&quot; 
as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-38\&quot; value=\&quot;激活函数\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=#6c8ebf;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;841.46\&quot; y=\&quot;1763\&quot; width=\&quot;80\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-39\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitX=1;exitY=0.5;exitDx=0;exitDy=0;dashed=1;dashPattern=1 1;strokeColor=#EA6B66;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;tbJsgM7FzyQLb-bkkZpG-19\&quot; target=\&quot;MzKt8NfVXthm0VmUftic-77\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;370\&quot; y=\&quot;1370\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;428\&quot; y=\&quot;1370\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-40\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitX=1;exitY=0.5;exitDx=0;exitDy=0;dashed=1;dashPattern=1 1;strokeColor=#EA6B66;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;tbJsgM7FzyQLb-bkkZpG-19\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-26\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;520\&quot; y=\&quot;1370\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;697\&quot; y=\&quot;1340\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-41\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitX=1;exitY=0.5;exitDx=0;exitDy=0;dashed=1;dashPattern=1 1;strokeColor=#EA6B66;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;tbJsgM7FzyQLb-bkkZpG-19\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-35\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;530\&quot; y=\&quot;1380\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;707\&quot; y=\&quot;1350\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-42\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitX=1;exitY=0.5;exitDx=0;exitDy=0;dashed=1;dashPattern=1 1;strokeColor=#97D077;exitPerimeter=0;entryPerimeter=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;tbJsgM7FzyQLb-bkkZpG-7\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-5\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;520\&quot; y=\&quot;1370\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;697\&quot; y=\&quot;1340\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell 
id=\&quot;tbJsgM7FzyQLb-bkkZpG-43\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;dashed=1;dashPattern=1 1;strokeColor=#97D077;entryPerimeter=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-24\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;500\&quot; y=\&quot;1730\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;708\&quot; y=\&quot;1400\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-44\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitX=1;exitY=0.5;exitDx=0;exitDy=0;dashed=1;dashPattern=1 1;strokeColor=#97D077;exitPerimeter=0;entryPerimeter=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;tbJsgM7FzyQLb-bkkZpG-7\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-33\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;519\&quot; y=\&quot;1748\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;718\&quot; y=\&quot;1410\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-45\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;entryX=0;entryY=0.5;entryDx=0;entryDy=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;tbJsgM7FzyQLb-bkkZpG-27\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-14\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;960\&quot; y=\&quot;1370\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;1160\&quot; y=\&quot;1446\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-46\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;entryX=0;entryY=0.5;entryDx=0;entryDy=0;\&quot; edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;tbJsgM7FzyQLb-bkkZpG-36\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-14\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;967\&quot; y=\&quot;1539\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;1160\&quot; y=\&quot;1446\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-48\&quot; value=\&quot;局部模型权重\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f8cecc;strokeColor=#b85450;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;407.64\&quot; y=\&quot;1384\&quot; width=\&quot;112.36\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-49\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;entryX=0.5;entryY=0;entryDx=0;entryDy=0;entryPerimeter=0;exitX=0.5;exitY=1;exitDx=0;exitDy=0;exitPerimeter=0;\&quot; 
edge=\&quot;1\&quot; parent=\&quot;1\&quot; source=\&quot;MzKt8NfVXthm0VmUftic-17\&quot; target=\&quot;tbJsgM7FzyQLb-bkkZpG-16\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;330\&quot; y=\&quot;1703\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;330\&quot; y=\&quot;1595\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-50\&quot; value=\&quot;计算损失loss\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#fff2cc;strokeColor=#d6b656;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;880\&quot; y=\&quot;1308\&quot; width=\&quot;110\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-51\&quot; value=\&quot;计算损失loss\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#fff2cc;strokeColor=#d6b656;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;885.68\&quot; y=\&quot;1473.5\&quot; width=\&quot;110\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-52\&quot; value=\&quot;计算损失loss\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#fff2cc;strokeColor=#d6b656;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;880\&quot; y=\&quot;1668\&quot; width=\&quot;110\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-53\&quot; value=\&quot;均匀分发数据到GPU节点\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];fontSize=13;fontColor=#6666FF;\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;480\&quot; y=\&quot;1803\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-54\&quot; value=\&quot;聚合局部梯度得到全局梯度\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];fontSize=13;fontColor=#6666FF;labelBackgroundColor=none;\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;1170\&quot; y=\&quot;1568.5\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-56\&quot; value=\&quot;分发最新权重到GPU节点\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];fontSize=13;fontColor=#6666FF;labelBackgroundColor=none;\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;464\&quot; y=\&quot;1330\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-58\&quot; value=\&quot;&amp;lt;span style=&amp;quot;color: rgb(102, 102, 255); font-family: Helvetica; font-size: 13px; font-style: normal; font-variant-ligatures: normal; font-variant-caps: normal; font-weight: 400; letter-spacing: normal; orphans: 2; text-align: center; text-indent: 0px; text-transform: none; widows: 2; word-spacing: 0px; -webkit-text-stroke-width: 0px; white-space: nowrap; text-decoration-thickness: initial; 
text-decoration-style: initial; text-decoration-color: initial; float: none; display: inline !important;&amp;quot;&amp;gt;更新模型权重&amp;lt;/span&amp;gt;\&quot; style=\&quot;text;whiteSpace=wrap;html=1;fillColor=none;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;355\&quot; y=\&quot;1473.5\&quot; width=\&quot;82.72\&quot; height=\&quot;20\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-60\&quot; value=\&quot;&amp;lt;span style=&amp;quot;font-family: Helvetica; font-size: 11px; font-style: normal; font-variant-ligatures: normal; font-variant-caps: normal; letter-spacing: normal; orphans: 2; text-align: center; text-indent: 0px; text-transform: none; widows: 2; word-spacing: 0px; -webkit-text-stroke-width: 0px; white-space: nowrap; background-color: rgb(255, 255, 255); text-decoration-thickness: initial; text-decoration-style: initial; text-decoration-color: initial; float: none; display: inline !important;&amp;quot;&amp;gt;batch_size&amp;lt;/span&amp;gt;\&quot; style=\&quot;text;whiteSpace=wrap;html=1;fontColor=#CC0000;fontStyle=1\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;369\&quot; y=\&quot;1728\&quot; width=\&quot;60\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-61\&quot; value=\&quot;&amp;lt;span style=&amp;quot;font-family: Helvetica; font-size: 11px; font-style: normal; font-variant-ligatures: normal; font-variant-caps: normal; letter-spacing: normal; orphans: 2; text-align: center; text-indent: 0px; text-transform: none; widows: 2; word-spacing: 0px; -webkit-text-stroke-width: 0px; white-space: nowrap; background-color: rgb(255, 255, 255); text-decoration-thickness: initial; text-decoration-style: initial; text-decoration-color: initial; float: none; display: inline !important;&amp;quot;&amp;gt;累积梯度&amp;amp;nbsp;&amp;lt;/span&amp;gt;&amp;lt;span style=&amp;quot;background-color: rgb(255, 255, 255); font-size: 11px; text-align: center; text-wrap: nowrap;&amp;quot;&amp;gt;gradient_accumulate&amp;lt;/span&amp;gt;\&quot; style=\&quot;text;whiteSpace=wrap;html=1;fontColor=#CC0000;fontStyle=1;fillColor=none;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;1110\&quot; y=\&quot;1578.5\&quot; width=\&quot;120\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;tbJsgM7FzyQLb-bkkZpG-62\&quot; value=\&quot;全局梯度\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#C3ABD0;strokeColor=#C3ABD0;shadow=1;fontSize=17;\&quot; vertex=\&quot;1\&quot; parent=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;1270\&quot; y=\&quot;1513.5\&quot; width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n      &lt;/root&gt;\n    &lt;/mxGraphModel&gt;\n  &lt;/diagram&gt;\n&lt;/mxfile&gt;\n&quot;}"></div>
<script type="text/javascript" src="https://viewer.diagrams.net/js/viewer-static.min.js"></script>

<h3 id="参数的显存占用">参数的显存占用</h3>

<p>【2023-8-30】大模型要占你多少内存?<a href="https://mp.weixin.qq.com/s/U4VpmHuKvHKu3AuwXM3PYw">这个神器一键测量,误差低至0.5MB,免费可用</a></p>

<p>大模型训练推理要用多少内存?</p>
<ul>
  <li>HuggingFace Space 上最新火起来的工具 <a href="https://huggingface.co/spaces/hf-accelerate/model-memory-usage">Model Memory Calculator</a>(模型内存测量器),在网页端人人可体验。</li>
  <li>比如 bert-base-cased 模型按 Int8 估计占用 413.18 MB 内存,实际占用 413.68 MB,相差 0.5 MB,误差仅 0.1%。</li>
</ul>

<p>实际推理时,EleutherAI 发现需要在上述预估值的基础上,再预留约 20% 的内存</p>

<p>【2023-8-30】<a href="https://huggingface.co/baichuan-inc/Baichuan-7B/tree/main">baichuan-7b</a> (14G) 部署失败,空间不够</p>
<ul>
  <li>GPU: A30, 24G 显存</li>
</ul>

<p>错误信息:</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 86.00 MiB <span class="o">(</span>GPU 0<span class="p">;</span> 22.20 GiB total capacity<span class="p">;</span> 7.47 GiB already allocated<span class="p">;</span> 51.12 MiB free<span class="p">;</span> 7.48 GiB reserved <span class="k">in </span>total by PyTorch<span class="o">)</span> If reserved memory is <span class="o">&gt;&gt;</span> allocated memory try setting max_split_size_mb to avoid fragmentation.  See documentation <span class="k">for </span>Memory Management and PYTORCH_CUDA_ALLOC_CONF
</code></pre></div></div>

<p><a href="https://huggingface.co/spaces/hf-accelerate/model-memory-usage">Model Memory Calculator</a>计算的开销</p>

<p>Memory Usage for ‘baichuan-inc/Baichuan-7B’</p>

<table>
  <thead>
    <tr>
      <th>dtype</th>
      <th>Largest Layer or Residual Group</th>
      <th>Total Size</th>
      <th>Training using Adam</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td>float32</td>
      <td>1000.0 MB</td>
      <td>26.2 GB</td>
      <td>104.82 GB</td>
    </tr>
    <tr>
      <td>float16/bfloat16</td>
      <td>500.0 MB</td>
      <td>13.1 GB</td>
      <td>52.41 GB</td>
    </tr>
    <tr>
      <td>int8</td>
      <td>250.0 MB</td>
      <td>6.55 GB</td>
      <td>26.2 GB</td>
    </tr>
    <tr>
      <td>int4</td>
      <td>125.0 MB</td>
      <td>3.28 GB</td>
      <td> </td>
    </tr>
  </tbody>
</table>

<p>只有有参数的层,才会有显存占用。这部份的显存占用和输入无关,模型加载完成之后就会占用。</p>

<p>有参数的层主要包括:</p>
<ul>
  <li>卷积</li>
  <li>全连接</li>
  <li>BatchNorm</li>
  <li>Embedding层</li>
  <li>… …</li>
</ul>

<p>无参数的层:</p>
<ul>
  <li>多数的激活层(Sigmoid/ReLU)</li>
  <li>池化层</li>
  <li>Dropout</li>
  <li>… …</li>
</ul>

<p>模型参数数目(不考虑偏置项b)为:</p>
<ul>
  <li>Linear(M-&gt;N): 参数数目:M×N</li>
  <li>Conv2d(Cin, Cout, K): 参数数目:Cin × Cout × K × K</li>
  <li>BatchNorm(N): 参数数目: 2N</li>
  <li>Embedding(N,W): 参数数目: N × W</li>
</ul>

<p>参数占用显存 = 参数数目 × n</p>
<ul>
  <li>n = 4 :float32</li>
  <li>n = 2 : float16</li>
  <li>n = 8 : float64(double)</li>
</ul>

<p>PyTorch中,当执行完 model=MyGreatModel().cuda() 后就会占用相应的显存,占用的显存大小基本与上述分析的显存差不多(会稍大一些,因为其它开销)。</p>
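
<p>按上面的规则,可以用几行 PyTorch 代码粗略估算参数显存(示意代码,其中的小模型结构是假设的,仅用于演示):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch.nn as nn

# 示意用的小模型(结构为假设),统计参数量并按不同精度换算显存
model = nn.Sequential(
    nn.Embedding(10000, 512),   # Embedding(N, W): 参数量 N x W
    nn.Linear(512, 1024),       # Linear(M-&gt;N): 参数量 M x N(+ 偏置)
    nn.ReLU(),                  # 无参数层,不占参数显存
    nn.Linear(1024, 10),
)

num_params = sum(p.numel() for p in model.parameters())
for dtype, n in {"float32": 4, "float16": 2, "float64": 8}.items():
    print(f"{dtype}: {num_params * n / 1024**2:.1f} MB")
</code></pre></div></div>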

<h3 id="梯度与动量的显存占用">梯度与动量的显存占用</h3>

<p>优化器</p>
<ul>
  <li>SGD:W_t+1 = W_t - α * ∇F(W_t)
    <ul>
      <li>除了保存权重 W,还要保存对应的<strong>梯度</strong> ∇F(W_t),因此显存占用等于参数占用显存 <strong>x2</strong></li>
    </ul>
  </li>
  <li>带 Momentum 的 SGD:
    <ul>
      <li>v_t+1 = ρ * v_t + ∇F(W_t)</li>
      <li>W_t+1 = W_t - α * v_t+1</li>
      <li>还需要保存<strong>动量</strong>,因此显存 <strong>x3</strong></li>
    </ul>
  </li>
  <li>Adam优化器
    <ul>
      <li>动量占用的显存更多,显存x4</li>
    </ul>
  </li>
</ul>

<p>总结,模型中与输入无关的显存占用包括:</p>
<ul>
  <li><strong>参数</strong> W</li>
  <li><strong>梯度</strong> dW(一般与参数一样)</li>
  <li>优化器的<strong>动量</strong>
    <ul>
      <li>普通SGD没有动量,momentum-SGD动量与梯度一样,Adam优化器动量的数量是梯度的<strong>两倍</strong></li>
    </ul>
  </li>
</ul>
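
<p>按上述倍数关系,可以写一个简单的估算函数(示意代码,按 fp32 口径,忽略激活和临时 buffer):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>def static_training_mem_gb(num_params, optimizer="adam", bytes_per_param=4):
    """粗略估算与输入无关的训练显存:参数 + 梯度 + 优化器动量(fp32 口径)"""
    multiplier = {"sgd": 2, "momentum": 3, "adam": 4}[optimizer]
    return num_params * bytes_per_param * multiplier / 1024**3

# 7B 参数模型 + Adam:约 7e9 * 4B * 4 ≈ 104 GB,与上文 baichuan-7B 表格的量级一致
print(round(static_training_mem_gb(7e9, "adam"), 1))
</code></pre></div></div>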

<h3 id="输入输出的显存占用">输入输出的显存占用</h3>

<p>以CNN为例,模型输出的显存占用,总结如下:</p>
<ul>
  <li>需要计算每一层的feature map的形状(多维数组的形状)</li>
  <li>需要保存输出对应的梯度用以反向传播(链式法则)</li>
  <li>显存占用与 batch size 成正比</li>
  <li>模型输出不需要存储相应的动量信息。</li>
</ul>

<p>深度学习中神经网络的显存占用,可以得到如下公式:</p>
<blockquote>
  <p>显存占用 = 模型显存占用 + batch_size × 每个样本的显存占用</p>
</blockquote>

<p>显存不是和batch-size简单的成正比,尤其是模型自身比较复杂的情况下:比如全连接很大,Embedding层很大</p>

<p>另外需要注意:</p>
<ul>
  <li>输入(数据,图片)一般不需要计算梯度</li>
  <li>神经网络每层输入/输出都需要保存下来,用来<strong>反向传播</strong>,但是在某些特殊的情况下,不要保存输入。
    <ul>
      <li>比如 ReLU:PyTorch 中使用 nn.ReLU(inplace=True),能将 ReLU 的输出直接覆盖保存在输入张量之中,省下一份激活显存(用法见本节末尾示例)。</li>
      <li>这时候是如何反向传播? (提示:y=relu(x) -&gt; dx = dy.copy();dx[ y&lt;=0 ] =0)</li>
    </ul>
  </li>
</ul>
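
<p>inplace ReLU 的最小示意(CPU 上即可运行,仅演示用法,不实际测量显存):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch
import torch.nn as nn

x1 = torch.randn(32, 1024)
x2 = x1.clone()

y1 = nn.ReLU(inplace=False)(x1)   # 额外分配一份与输入同形状的输出张量
y2 = nn.ReLU(inplace=True)(x2)    # 直接覆盖写在输入张量上,省一份激活显存
print(y2 is x2)                   # True:输出和输入共用同一块存储
</code></pre></div></div>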

<h3 id="节省显存的方法">节省显存的方法</h3>

<p>深度学习中,一般占用显存最多的是卷积等层的输出,模型参数占用的显存相对较少,而且不太好优化。</p>

<p>节省显存方法:</p>
<ul>
  <li>降低 batch-size</li>
  <li>下采样 (NCHW -&gt; (1/4)*NCHW)</li>
  <li>减少全连接层(一般只保留最后一层分类用的全连接层)</li>
</ul>

<p>更多信息见<a href="https://juejin.cn/post/6844903640558206984">原文</a></p>

<p>训练时显存不足怎么办?</p>

<p>常见的节省显存操作,优先级从高到低排列。</p>
<ol>
  <li>去掉 compute_metrics:
    <ul>
      <li>有些代码会在输出层后计算 rouge 分等,这会产生一个 <code class="language-plaintext highlighter-rouge">batch_size * vocab_size * seq_len</code> 的大张量,非常占显存。</li>
    </ul>
  </li>
  <li>采用<code class="language-plaintext highlighter-rouge">bf16</code>/<code class="language-plaintext highlighter-rouge">fp16</code>进行混合精度训练:
    <ul>
      <li>现在大模型基本上都采用 bf16 来进行训练</li>
      <li>但 V100 不支持 bf16,可以采用 fp16 进行训练,显存占用大约能降低一半。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">Flash attention</code>:不仅能够降低显存,更能提高训练速度。</li>
  <li><code class="language-plaintext highlighter-rouge">batch_size</code> 调小:
    <ul>
      <li>batch size 与模型每层激活状态所占显存呈<strong>正相关</strong></li>
      <li>降低 batch size 能够很大程度上降低这部分显存占用。</li>
    </ul>
  </li>
  <li>采用<strong>梯度累积</strong>:
    <ul>
      <li><code class="language-plaintext highlighter-rouge">global_batch_size</code> = <code class="language-plaintext highlighter-rouge">batch_size</code> * <code class="language-plaintext highlighter-rouge">梯度累积</code></li>
      <li>如果降低 batch_size 后想保持 global_batch_size 不变,可适当提高梯度累积步数(训练循环写法见下文示例)。</li>
    </ul>
  </li>
  <li>选择合适的<strong>上下文长度</strong>:
    <ul>
      <li>上下文长度与激活状态所占显存呈<strong>正相关</strong></li>
      <li>因此可适当降低上下文长度来降低显存占用。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">DeepSpeed Zero</code>:
    <ul>
      <li>显存占用从高到低为:<code class="language-plaintext highlighter-rouge">ZeRO 1</code> &gt; <code class="language-plaintext highlighter-rouge">ZeRO 2</code> &gt; <code class="language-plaintext highlighter-rouge">ZeRO 2</code> + <code class="language-plaintext highlighter-rouge">offload</code> &gt; <code class="language-plaintext highlighter-rouge">ZeRO 3</code> &gt; <code class="language-plaintext highlighter-rouge">ZeRO 3</code> + <code class="language-plaintext highlighter-rouge">offload</code></li>
      <li>推荐最多试到 <code class="language-plaintext highlighter-rouge">ZeRO 2</code> + <code class="language-plaintext highlighter-rouge">offload</code>。</li>
    </ul>
  </li>
  <li>选择更小的基座模型:在满足需求的情况下,尽量选择更小的基座模型。</li>
</ol>
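
<p>梯度累积的训练循环写法示意(玩具模型与随机数据,变量名均为假设):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch
import torch.nn as nn

# global_batch_size = batch_size * accum_steps = 4 * 8 = 32
model = nn.Linear(16, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
accum_steps = 8

optimizer.zero_grad()
for step in range(64):
    x = torch.randn(4, 16)                      # batch_size = 4
    y = torch.randint(0, 2, (4,))
    loss = loss_fn(model(x), y) / accum_steps   # 按累积步数缩放 loss
    loss.backward()                             # 梯度在 .grad 上累加
    if (step + 1) % accum_steps == 0:
        optimizer.step()
        optimizer.zero_grad()
</code></pre></div></div>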

<p>慎重选择:</p>
<ol>
  <li><code class="language-plaintext highlighter-rouge">Lora</code>: 能跑<strong>全参</strong>就别跑 <code class="language-plaintext highlighter-rouge">Lora</code> 或 <code class="language-plaintext highlighter-rouge">Qlora</code>,一方面是麻烦,另一方面的确是效果差点。</li>
  <li><code class="language-plaintext highlighter-rouge">Qlora</code>: Qlora 速度比lora慢,但所需显存更少,实在没资源可以试试。</li>
  <li><code class="language-plaintext highlighter-rouge">Megatron-LM</code>: 可采用<strong>流水线</strong>并行和<strong>张量</strong>并行,使用比较麻烦,适合喜欢折腾的同学。</li>
  <li><code class="language-plaintext highlighter-rouge">Pai-Megatron-LM</code>: Megatron-LM 的衍生,支持 Qwen 的sft和pt,坑比较多,爱折腾可以试试。</li>
  <li><strong>激活检查点</strong>:不推荐,非常耗时。在反向传播时重新计算深度神经网络的中间值。用时间(重新计算这些值两次的时间成本)来换空间(提前存储这些值的内存成本)。</li>
</ol>

<h3 id="gpu-要存哪些参数">GPU 要存哪些参数</h3>

<p>【2023-6-28】<a href="https://mp.weixin.qq.com/s/pUcXaCwCqGCw3KOt-fgH-w">参考</a></p>

<p>模型训练中,GPU 要存储的参数</p>
<ul>
  <li>模型本身的参数、优化器状态、激活函数的输出值、梯度、一些零时的Buffer</li>
  <li><img src="https://mmbiz.qpic.cn/mmbiz_png/J0mLianhFicBHEDwE5nPHZKaicqsXBVgES5IexNgeadmAcMFdNofrszbpgXNHjicV8QDWciaVpIXndGZ8hDNATT68JQ/640?wx_fmt=png&amp;tp=wxpic&amp;wxfrom=5&amp;wx_lazy=1&amp;wx_co=1" alt="img" /></li>
</ul>

<p>模型参数仅占所有数据的小部分</p>
<ul>
  <li>当进行<strong>混合精度</strong>运算时,模型状态参数(优化器状态 + 梯度+ 模型参数)占大半以上。</li>
</ul>

<p>因此,要想办法去除模型训练过程中的冗余数据。</p>

<h4 id="llama-6b-占用多大内存">LLaMA-6B 占用多大内存</h4>

<p>【2023-7-13】LLaMA-6B 占用多大内存?计算过程</p>

<p>精度对所需内存的影响:</p>
<ul>
  <li><strong>fp32</strong>精度,一个参数需要 32 bits, <strong>4</strong> bytes.</li>
  <li><strong>fp16</strong>精度,一个参数需要 16 bits, <strong>2</strong> bytes.</li>
  <li><strong>int8</strong>精度,一个参数需要 8 bits, <strong>1</strong> byte.</li>
</ul>

<p>模型需要的RAM大致分三个部分:</p>
<ul>
  <li><strong>模型参数</strong>: 参数量*每个参数所需内存
    <ul>
      <li>对于fp32,LLaMA-6B需要 6B*4 bytes = 24GB 内存</li>
      <li>对于int8,LLaMA-6B需要 6B*1 byte = 6GB 内存</li>
    </ul>
  </li>
  <li><strong>梯度</strong>: 参数量*每个梯度参数所需内存</li>
  <li><strong>优化器</strong>参数: 不同的优化器所储存的参数量不同。
    <ul>
      <li>对于常用的AdamW,需要储存<strong>两倍</strong>的模型参数(用来储存一阶和二阶momentum)。</li>
      <li>fp32 的 LLaMA-6B,AdamW需要 6B*8 bytes = 48 GB</li>
      <li>int8 的 LLaMA-6B,AdamW需要 6B*2 bytes = 12 GB</li>
    </ul>
  </li>
  <li>其它
    <ul>
      <li>CUDA kernel也会占据一些RAM,大概1.3GB左右</li>
    </ul>
  </li>
</ul>

<p>综上,int8 精度的 LLaMA-6B 模型部分大致需要 6GB + 6GB + 12GB + 1.3GB = 25.3GB 左右。</p>

<p>再根据LLaMA的架构(hidden_size= 4096, intermediate_size= 11008, num_hidden_layers= 32, context_length = 2048)计算中间变量内存。每个instance需要: ( 4096+11008 ) * 2048 * 32 * 1 byte = 990 MB</p>

<p>所以,一张 A100(<strong>80GB</strong> RAM)大概可以在int8精度,batch_size = 50 的设定下进行全参数训练。</p>
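
<p>把上面的算术用代码复现一遍(完全按文中口径,GB 以 10^9 字节粗略换算):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>params = 6e9

model_mem = params * 1        # int8: 每参数 1 byte ≈ 6 GB
grad_mem  = params * 1        # 梯度按相同字节数估算 ≈ 6 GB
optim_mem = params * 2        # AdamW 两份动量(int8 口径)≈ 12 GB
cuda_mem  = 1.3e9             # CUDA kernel 固定开销 ≈ 1.3 GB
static_gb = (model_mem + grad_mem + optim_mem + cuda_mem) / 1e9
print(f"静态部分 ≈ {static_gb:.1f} GB")        # ≈ 25.3 GB

# 每个样本的中间激活:(hidden + intermediate) * context_length * layers * 1 byte
act_per_sample_gb = (4096 + 11008) * 2048 * 32 / 1e9   # ≈ 0.99 GB
print(f"batch_size=50 总需求 ≈ {static_gb + 50 * act_per_sample_gb:.1f} GB")  # 约 75 GB,低于 A100 的 80 GB
</code></pre></div></div>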

<p>附</p>
<ul>
  <li>消费级显卡内存和算力查询: <a href="https://www.gpucheck.com/gpu-benchmark-graphics-card-comparison-chart">2023 GPU Benchmark and Graphics Card Comparison Chart</a></li>
</ul>

<h4 id="7b-占用多大内存">7B 占用多大内存</h4>

<p>一个<strong>7B</strong>规模大模型(如LLaMA-2 7B),基于<strong>16-bit</strong>混合精度训练时</p>
<ul>
  <li>仅考虑模型参数、梯度、优化器情况下,显存占用就有<strong>112GB</strong>
    <ul>
      <li>参数占 GPU 显存近 <strong>14GB</strong>(每个参数2字节)。</li>
      <li>训练时<strong>梯度</strong>存储占<strong>14GB</strong>(每个参数对应1个梯度,也是2字节)</li>
      <li>优化器Optimizer(假设是主流的AdamW)则是<strong>84GB</strong>(每个参数对应1个参数copy、一个momentum和一个variance,这三个都是float32)
        <ul>
          <li>2byte 模型<strong>静态</strong>参数权重(以16bit存储) = 14G</li>
          <li>2byte 模型<strong>更新</strong>参数权重 (以16bit存储)= 14G</li>
          <li>2byte <strong>梯度</strong>(以16bit存储)= 14G</li>
          <li>2byte <strong>梯度更新</strong>(以16bit存储)= 14G</li>
          <li>4byte <strong>一阶动量</strong>优化器更新(以32bit存储)= 28G</li>
          <li>4byte <strong>二阶方差</strong>优化器更新(以32bit存储)= 28G</li>
        </ul>
      </li>
      <li>目前,合计 112GB</li>
      <li>还有:前向传播时激活值,各种临时变量</li>
      <li>还与sequence length, hidden size、batch size都有关系。</li>
    </ul>
  </li>
  <li>目前<span style="color:red">A100、H100这样主流显卡单张是放不下</span>,更别提国内中小厂喜欢用的A6000/5000、甚至消费级显卡。</li>
</ul>

<h4 id="adam--fp16-混合精度预估">Adam + fp16 混合精度预估</h4>

<p>【2023-6-29】<a href="https://zhuanlan.zhihu.com/p/638199667">LLM Training GPU显存耗用量估计</a>,以Adam + fp16混合精度训练为例,分析其显存占用有以下四个部分</p>
<ul>
  <li>(1) 模型权重 Model
    <ul>
      <li>Prameters (FP16) 2 bytes</li>
      <li>Gradients (FP16) 2 bytes</li>
    </ul>
  </li>
  <li>(2) 前向激活值 Activations
    <ul>
      <li>前向过程中存储, y = w1 * x, 存储x用于计算w1梯度</li>
      <li>整体显存占用与batch有关</li>
    </ul>
  </li>
  <li>(3) 优化器 Optimizer:梯度、动量等
    <ul>
      <li>Master Weight (FP32) 4 bytes</li>
      <li>Adam m (FP32) 4 bytes</li>
      <li>Adam v (FP32) 4 bytes</li>
    </ul>
  </li>
  <li>(4) 临时混存 Buffer &amp; Fragmentation</li>
</ul>

<p>(1) 和 (3) 可以精确估计</p>
<ul>
  <li>显存占用大头是 <strong>Adam 优化器</strong>,占可计算部分的 12/16=75%</li>
  <li>其次是<strong>模型参数</strong>+<strong>梯度</strong>,显存容量至少是参数量的<strong>16倍</strong></li>
</ul>

<p>Adam + fp16混合精度训练</p>
<ul>
  <li><img src="https://pic1.zhimg.com/80/v2-b4b2b377eeac7222bd783f9505c1115c_1440w.webp" alt="" /></li>
  <li><img src="https://pic3.zhimg.com/80/v2-fbcc4e6eb4de305a46f5a79e98d41cda_1440w.webp" alt="" /></li>
</ul>

<p>结论:</p>
<ul>
  <li>不考虑Activation,3090 模型容量上限是 24/16=<strong>1.5B</strong>,A100 模型容量上限是 80/16=<strong>5B</strong>
    <ul>
      <li>假设训练过程中batchsize恒定为1,也即尽最大可能减少Activation在显存中的占用比例,使得理论计算值16Φ更接近真实的显存占用,那么24G的3090的模型容量上限是1.5B(差不多是GPT-2的水平),80G的A100的模型容量上限是5B</li>
    </ul>
  </li>
  <li>考虑Activation,3090的模型容量上限是 0.75B,A100的容量上限是 2.5B
    <ul>
      <li>batchsize为1的训练效率非常低,batchsize大于1才能充分发挥GPU的效率,此时Activation变得不可忽略。经验之谈,一般需要给Activation预留一半的显存空间(比如3090预留12G,A100预留40G),此时3090的模型容量上限是0.75B,A100的容量上限是2.5B,我们实际测试结果接近这个值</li>
    </ul>
  </li>
  <li>[1B, 5B] 是目前市面上大多数GPU卡的分水岭区间
    <ul>
      <li>[0, 1B) 市面上绝大多数卡都可以直接硬train一发</li>
      <li>[1B, 5B] 大多数卡在这个区间的某个值上触发模型容量上限,具体触发值和显存大小有关</li>
      <li>(5B, ~) 目前没有卡能裸训</li>
    </ul>
  </li>
</ul>

<h3 id="llm-推理显存开销">LLM 推理显存开销</h3>

<p>【2024-8-24】<a href="https://zhuanlan.zhihu.com/p/716347923?utm_psn=1811748062424592385">为大型语言模型 (LLM) 提供服务需要多少 GPU 内存?</a></p>

<p>运行一个大型语言模型,需要多大GPU内存?</p>

<p>GPU 内存估算公式</p>
<ul>
  <li>M = (P × 4B) / (32 / Q) × 1.2</li>
  <li><img src="https://pic2.zhimg.com/80/v2-0b643b18fb67b7ea106b63b7d0e92101_1440w.webp" alt="" /></li>
</ul>

<p>解释</p>
<ul>
  <li>M 代表 GPU 内存的大小,单位是吉字节。</li>
  <li>P 指的是模型中包含的参数总数。</li>
  <li>4B 指的是每个参数平均占用的存储空间,为 4 个字节。</li>
  <li>Q 表示加载模型时使用的位数,可以是 16 位或者 32 位。</li>
  <li>1.2 表示在计算中加入了 20% 的额外空间以应对可能的需求。</li>
</ul>

<p><img src="https://pic2.zhimg.com/80/v2-919d5943cf2ddcc5ffa23f93acaecf3b_1440w.webp" alt="" /></p>

<p>分解公式</p>
<ul>
  <li>模型参数量 (P):这个指标反映了你的模型规模。比如,如果你使用的是 LLaMA 模型,它包含 700 亿个参数,那么这个参数量就是 700 亿。</li>
  <li>参数内存需求 (4B):通常情况下,每个模型参数需要 4 个字节的存储空间,这是因为浮点数通常需要 4 个字节(即 32 位)来表示。如果你采用的是半精度(16 位)格式,那么所需的内存量会相应减少。</li>
  <li>参数位宽 (Q):这个值取决于你是以 16 位还是 32 位的精度来加载模型。16 位精度在许多大型语言模型的应用中较为普遍,因为它在保证足够精度的同时,能够降低内存的消耗。</li>
  <li>额外开销 (1.2):乘以 1.2 的系数是为了增加 20% 的额外空间,以应对在模型推理过程中可能需要的额外内存。这不仅仅是为了安全起见,更是为了确保在模型执行过程中,激活操作和其他中间结果的内存需求得到满足。</li>
</ul>

<p><strong>700亿</strong>个参数(以 <strong>16位</strong>精度加载)的 LLaMA 模型提供服务所需的内存:</p>
<ul>
  <li>M = (P × 4B)/(32/Q) × 1.2 = (70×10^9 × 4 字节)/(32/16) × 1.2 = 168 GB</li>
  <li>单块 NVIDIA A100 GPU,尽管配备了 80 GB 显存,但仍然不足以支撑该模型的运行。为了高效地处理内存需求,至少需要<strong>两块</strong> A100 GPU,每块都具备 80 GB 的显存容量。</li>
</ul>
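
<p>该公式可以直接写成一个小函数,方便换不同模型规模和加载精度做估算(示意代码):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>def serving_mem_gb(num_params, q_bits=16, overhead=1.2):
    """M = (P * 4B) / (32 / Q) * 1.2,估算推理所需 GPU 内存(GB,按 10^9 字节换算)"""
    return num_params * 4 / (32 / q_bits) * overhead / 1e9

print(round(serving_mem_gb(70e9, q_bits=16)))   # 70B 模型,16bit 加载 ≈ 168 GB
print(round(serving_mem_gb(70e9, q_bits=8)))    # 8bit 加载 ≈ 84 GB
print(round(serving_mem_gb(7e9, q_bits=16)))    # 7B 模型,16bit ≈ 17 GB
</code></pre></div></div>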

<h2 id="内存显存优化">内存/显存优化</h2>

<p>显存优化技术:<a href="https://mp.weixin.qq.com/s/7wtwsNhf27YzALnSFXTmkA">参考</a></p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">重计算</code>(Recomputation):Activation checkpointing(Gradient checkpointing)本质上是一种用<strong>时间换空间</strong>的策略。</li>
  <li><code class="language-plaintext highlighter-rouge">卸载</code>(Offload)技术:一种用通信换显存的方法,简单来说就是让模型参数、激活值等在CPU内存和GPU显存之间左右横跳。如:ZeRO-Offload、ZeRO-Infinity等。</li>
  <li><code class="language-plaintext highlighter-rouge">混合精度</code>(BF16/FP16):降低训练显存的消耗,还能将训练速度提升2-4倍。
    <ul>
      <li>BF16 计算时可避免计算溢出,出现Inf case。</li>
      <li>FP16 在输入数据超过65506 时,计算结果溢出,出现Inf case。</li>
    </ul>
  </li>
</ul>

<h3 id="cpu卸载">CPU卸载</h3>

<p>当GPU内存已满时,一种选择是将暂时未使用的数据卸载到CPU,并在以后需要时将其读回(Rhu等人,2016)。数据卸载到CPU 的想法很简单,但由于它会延长训练时间,所以近年来不太流行。</p>

<h3 id="激活重新计算">激活重新计算</h3>

<p>激活重新计算(Activation recomputation,也称 “activation checkpointing” 或 “gradient checkpointing”,Chen 等人,<a href="https://arxiv.org/abs/1604.06174">2016年</a>)是一个以计算时间为代价减少内存占用的聪明而简单的想法。</p>
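
<p>PyTorch 中可用 torch.utils.checkpoint 实现这一思路,下面是一个最小示意(玩具网络,仅演示接口用法):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint_sequential

# 较深的玩具网络:前向时不保存所有中间激活,反向时重算,以时间换显存
model = nn.Sequential(*[nn.Sequential(nn.Linear(256, 256), nn.ReLU()) for _ in range(8)])
x = torch.randn(32, 256, requires_grad=True)

y = checkpoint_sequential(model, 4, x)   # 均分成 4 段,只保存段边界的激活
y.sum().backward()
</code></pre></div></div>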

<h3 id="混合精度训练">混合精度训练</h3>

<p>Narang&amp;Micikevicius等人(2018年)介绍了一种使用半精度浮点(FP16)数字训练模型而不损失模型精度的方法。</p>

<p>三种避免以半精度丢失关键信息的技术:</p>
<ul>
  <li>1)全精度原始权重。维护一份用于累积梯度更新的全精度 (FP32) 模型权重副本;前向和反向传递时再四舍五入到半精度。主要是为了防止梯度更新量(即梯度乘以学习率)太小而无法用 FP16 表示(小于 2^-24 的值在 FP16 中会下溢为零)的情况。</li>
  <li>2)损失缩放。放大损失以更好地处理小幅度的梯度:放大后的梯度被移动到 FP16 可表示范围中数值较大的部分,从而保留否则会下溢丢失的值。</li>
  <li>3)算术精度。对于常见的网络运算(例如向量点积、按元素求和的归约 reduction),可以把部分结果累加到 FP32 中,再将最终输出转为 FP16 后写回内存;逐点(point-wise)操作则可以在 FP16 或 FP32 中执行。</li>
</ul>

<p>大模型训练过程中,GPU显存占用主要分成Model States 与 Activation 两部分</p>

<p>混合精度训练流程:通过引入 fp16(或 bf16)精度来减少 fp32 精度带来的显存消耗,具体步骤如下(列表后附一个基于 PyTorch AMP 的最小示意):</p>
<ul>
  <li>存储一份fp32的parameter,momentum和variance(统称model states)</li>
  <li>在 forward 开始之前,额外开辟一块存储空间,将 fp32 parameter 转换(cast)为 fp16 parameter;</li>
  <li>正常做forward和backward,在此之间产生的activation和gradients,都用fp16进行存储;</li>
  <li>用fp16 gradients去更新fp32下的model states;</li>
  <li>当模型收敛后,fp32的parameter就是最终的参数输出;</li>
</ul>
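
<p>上述流程对应到 PyTorch AMP 的最小示意如下(假设有一张支持 fp16 的 GPU):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch
import torch.nn as nn

model = nn.Linear(1024, 1024).cuda()                 # 参数以 fp32 存放
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler()                 # 损失缩放,防止 fp16 梯度下溢

for _ in range(10):
    x = torch.randn(8, 1024, device="cuda")
    with torch.cuda.amp.autocast(dtype=torch.float16):
        loss = model(x).pow(2).mean()                # 前向与 loss 在 fp16 下计算
    scaler.scale(loss).backward()                    # 放大 loss 后反向传播
    scaler.step(optimizer)                           # 反缩放梯度,以 fp32 主权重更新参数
    scaler.update()
    optimizer.zero_grad()
</code></pre></div></div>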

<p>混合精度下的显存:</p>
<ul>
  <li><img src="https://pic2.zhimg.com/v2-37133eb29fc4da4432a1a57a6138192d_b.jpg" alt="" /></li>
</ul>

<p>通常模型会使用float32(fp32)精度进行训练,但是随着模型越来越大,训练的硬件成本和时间成本急剧增加。而混合精度训练通过利用float16(fp16)的优点并规避缺点来进行训练。</p>

<p>fp32,fp16,bf16的区别如下图所示</p>
<ul>
  <li><img src="https://pic1.zhimg.com/80/v2-278e19aa63962ee80cdead8e714c391c_1440w.webp" alt="" /></li>
</ul>

<p>优点:</p>
<ol>
  <li>降低显存占用,float16比float32小一半;</li>
  <li>减少网络通信开销;</li>
  <li>硬件针对fp16优化,速度更快</li>
</ol>

<p>缺点:</p>
<ol>
  <li>下溢。float16最大的问题是”下溢”。
    <ul>
      <li>随着训练进行,梯度更新量往往会变得很小,可能低于 float16 能表示的最小精度。</li>
      <li>结果就是:大多数的模型权重都不再更新,模型难以收敛。</li>
    </ul>
  </li>
  <li>舍入误差。
    <ul>
      <li>模型权重和梯度相差太大,通过梯度更新权重并进行舍入时,可能导致更新前和更新后的权重没有变化。</li>
    </ul>
  </li>
</ol>

<p>bf16是一种全新的数字格式,更加支持深度学习计算,但需要硬件支持,如NVIDIA A100, NVIDIA A800等</p>

<p>此外,官方文档中提到了AMP(Auto Mixed Precision 自动混合精度训练) ,与ZeRO不能同时使用</p>

<h4 id="int8">Int8</h4>

<p>Int8 - bitsandbytes</p>

<p>Int8 是个很极端的数据类型,只能表示 -128~127 的整数,几乎没有小数精度。</p>

<p>为了在训练和inference中使用这个数据类型,bitsandbytes使用了两个方法最大程度地降低了其带来的误差:</p>
<ul>
  <li>vector-wise quantization</li>
  <li>mixed-precision decomposition</li>
</ul>

<p>Huggingface 用<a href="https://huggingface.co/blog/hf-bitsandbytes-integration">动图</a>解释了quantization的实现</p>
<ul>
  <li><a href="https://arxiv.org/abs/2208.07339">paper</a></li>
</ul>

<p>借助 Huggingface PEFT,使用 int8 微调 opt-6.7b 的完整流程见 <a href="https://github.com/huggingface/peft/blob/main/examples/int8_training/Finetune_opt_bnb_peft.ipynb">notebook</a></p>
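
<p>以 Transformers + bitsandbytes 为例,按 int8 加载模型的大致写法如下(以 facebook/opt-6.7b 为示意;较新版本的 transformers 建议改用 BitsAndBytesConfig,具体参数以所装版本文档为准):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "facebook/opt-6.7b"            # 示意用的模型
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    load_in_8bit=True,                    # 由 bitsandbytes 做 vector-wise int8 量化
    device_map="auto",                    # 自动把各层放到可用的 GPU/CPU 上
)
</code></pre></div></div>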

<h4 id="fp-16">FP 16</h4>

<p>Fp16 - mixed precision</p>
<ul>
  <li>混合精度训练大致思路: 在 <strong>forward</strong> pass 和 <strong>gradient computation</strong> 时用 fp16 来加速,但是在<strong>更新参数</strong>时使用 fp32。</li>
  <li><img src="https://pic3.zhimg.com/80/v2-3f4e34dc3281e47d176fe3adc25c66b2_720w.webp" alt="" /></li>
  <li><a href="https://pytorch.org/docs/stable/notes/amp_examples.html">Pytorch 官方示例</a></li>
</ul>

<p>torch fp16推理:直接使用model.half()将模型转换为fp16.</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">model</span><span class="p">.</span><span class="nb">eval</span><span class="p">()</span>
<span class="n">model</span><span class="p">.</span><span class="n">half</span><span class="p">()</span> <span class="c1"># 半精度
</span></code></pre></div></div>

<p>Huggingface Transformers:<a href="https://huggingface.co/docs/transformers/perf_train_gpu_one#fp16-training">fp16-training</a></p>
<ul>
  <li>TrainingArguments 里声明 fp16=True</li>
</ul>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">training_args</span> <span class="o">=</span> <span class="n">TrainingArguments</span><span class="p">(</span><span class="n">per_device_train_batch_size</span><span class="o">=</span><span class="mi">4</span><span class="p">,</span> <span class="n">fp16</span><span class="o">=</span><span class="bp">True</span><span class="p">,</span> <span class="o">**</span><span class="n">default_args</span><span class="p">)</span>

<span class="n">trainer</span> <span class="o">=</span> <span class="n">Trainer</span><span class="p">(</span><span class="n">model</span><span class="o">=</span><span class="n">model</span><span class="p">,</span> <span class="n">args</span><span class="o">=</span><span class="n">training_args</span><span class="p">,</span> <span class="n">train_dataset</span><span class="o">=</span><span class="n">ds</span><span class="p">)</span>
<span class="n">result</span> <span class="o">=</span> <span class="n">trainer</span><span class="p">.</span><span class="n">train</span><span class="p">()</span>
<span class="n">print_summary</span><span class="p">(</span><span class="n">result</span><span class="p">)</span>
</code></pre></div></div>

<h3 id="压缩">压缩</h3>

<p>中间结果通常会消耗大量内存,尽管它们只在一次前向传递和一次反向传递中各被用到一次,且两次使用之间存在明显的时间间隔。因此 Jain 等人(2018年)提出了一种数据编码策略:在前向传递中,中间结果第一次被使用后即进行压缩,反向传播需要时再解码还原。</p>

<h3 id="内存高效优化器">内存高效优化器</h3>

<p>优化器内存消耗。以流行的 Adam 优化器为例,它内部需要保存动量和方差,两者都与梯度和模型参数处于同一规模;加上参数和梯度本身,训练时需要占用约 4 倍于模型权重的内存。</p>

<h1 id="分布式机器学习实现">分布式机器学习实现</h1>

<p>【2022-6-2】<a href="https://zhuanlan.zhihu.com/p/365662727">分布式机器学习</a></p>
<ul>
  <li><img src="https://pic1.zhimg.com/v2-8e4eefe63cc256d4420a881a00f2851f_1440w.jpg" alt="" /></li>
</ul>

<p>在深度学习时代,训练数据特别大的时候想要<strong>单卡</strong>完成训练基本是不可能的。所以就需要进行<strong>分布式</strong>深度学习。</p>

<h2 id="经验">经验</h2>

<p>流水并行 (Pipeline Parallelism ) 是 LLM 分布式训练扩展到千卡集群以上的一个核心 feature</p>

<h3 id="并行度对比">并行度对比</h3>

<p>NVIDIA 在 3076 张 A100 集群上训练的 1T 参数量 LLM 使用的并行方式是:</p>
<ul>
  <li>Data Parallel Size = 6</li>
  <li>Tensor Parallel Size = 8</li>
  <li>Pipeline Parallel Size = 64</li>
</ul>

<p>并行度最高的是流水并行(PP=64),比 DP(6)和 TP(8)高约 8~10 倍</p>

<h3 id="为什么3k卡集群主流是流水并行">为什么3k卡集群主流是流水并行?</h3>

<p>流水并行核心优势:</p>
<ul>
  <li>用比较少的 <strong>Pipeline Bubble</strong> 代价 (当 gradient accumulation step 很大时可以忽略不计),较少的 <strong>Tensor Buffer 显存</strong>代价,以及非常低的<strong>通信开销</strong>,将大模型分割在不同的 Group 中。 大幅减少了单张 GPU 上的 weight tensor 大小(数量) 和 Activation tensor 大小(数量)。</li>
  <li>跟 Tensor Parallel 相比, Pipeline Parallel 的通信代价很低且可以被 overlap, Tensor Parallel 虽然也能切分模型大小,但是需要全量数据(没有减少 Activation tensor 大小),另外极高的通信频率和通信量使得 Tensor Parallel 只能在机器内 8 张卡用 NVLink 等高速互联来实现,跨机的 TP 会严重拖慢速度。</li>
  <li>不仅如此, Pipeline Parallel 还将 Data Parallel 的模型更新限定在一个很小的范围内(比如六台机器), DP 所需的 AllReduce 通信会随着机器数量增多而变慢。 PP 也让 DP 所需同步的模型梯度大小变小了,大大减缓了模型更新对于训练速度的影响。</li>
</ul>

<p>因此  Pipeline Parallel  是让模型可以达到千亿、集群可以扩充到千卡以上的一个最重要的特性。</p>

<p>流水并行有很重要的约束条件:</p>
<ul>
  <li>需要一个 <strong>规整对称的、线性顺序</strong>的网络结构。</li>
</ul>

<p>GPT 就是这样一个典型的网络结构:</p>
<ul>
  <li>完全一样的 Transformer Layer 顺序堆叠,没有分叉和不对称情况,当均匀切分 Layer 时,各个 Stage 的前向/反向计算时间均一致。</li>
</ul>

<p>以上分析参考成诚的知乎回答:<a href="https://www.zhihu.com/question/588325646/answer/3422090041">https://www.zhihu.com/question/588325646/answer/3422090041</a></p>

<p>流水并行训练时的 time line 参考如下:</p>
<ul>
  <li><img src="https://picx.zhimg.com/80/v2-8d4082c21f26f428da6edf8bf67f0ec1_1440w.webp?source=2c26e567" alt="" /></li>
</ul>

<p>(反向的计算时间是前向的两倍)整个集群最高效的训练时间段是 step 4、5、6、7 的前向 和 step 0、1、2、3 的反向同时在所有 stage 上并行计算的时候,这个时候集群没有空闲,全部都在并行执行。 当我们增加 acc step (比如从 8 增加到 64)时,中间部分完美并行的时间段占比就会更长, bubble time 的占比就会越来越小。</p>

<p>而 T5 的网络结构比 GPT 要复杂很多, T5 是 Encoder-Decoder 架构,整个网络分为两大块,且 Encoder 和 Decoder 的 Transformer Layer 参数大小、Attention 计算量、Context Length 等均不一致,导致 Encoder 的理论计算量要比 Decoder 大很多(整个网络不是均匀对称的)。 更要命的是, T5 Encoder 的输出要发给每个 Decoder Layer,网络结构不是线性而是有大量的分叉,前向反向之间包含了复杂的数据依赖关系, 会导致流水并行中,各个 Stage 之间会产生大量的、非对称的、间隔跨多个 Stage 的数据依赖,更加剧了流水并行的 load balance 问题。</p>
<ul>
  <li><img src="https://picx.zhimg.com/80/v2-f3dc6a3aee932e4f869eb0d8e3280ff6_1440w.webp?source=2c26e567" alt="" /></li>
</ul>

<p>所以直接使用 Megatron 跑 T5 的 Pipeline Parallelism,会从 nsys prof 时间线上看到大量的缝隙,各个 Stage 之间在互相等待,无法真正流水并行起来。</p>

<p>如果不用  Pipeline Parallelism 来训练 T5,那么只能借助: DP、TP 和 ZeRO 来进行并行优化了, 这就约束了 T5 的所有 Layer 都必须放在每一个 GPU 上,这种方式在 13B 量级的模型上是 OK 的,但是再往上扩展到 100B、1T 量级就不 work 了。</p>

<p>同时由于 TP 只能开到 8 (跨机器也会慢几倍), 在千卡 GPU 集群以上,大量的 DP 带来的通信变慢的影响也很严重(ZeRO-2/3 会大幅加剧这种通信开销)。 所以我们才说, 虽然 T5 的理论计算量相较于 GPT 没有增加很多,但是在千亿参数、千卡集群以上规模的时候,T5 的实际训练效率比 GPT 慢很多倍。即使到现在,也没有一个超过 11B 的 T5 模型发布, 而 11B 恰好是一个不借助 PP,仅通过  ZeRO + TP 就可以训练的模型大小,避免了 T5 的模型结构非对称性对于 PP 的灾难性影响。</p>

<h2 id="基本原理">基本原理</h2>

<p>无论哪种机器学习框架,分布式训练的基本原理都是相同的。可以从<strong>并行模式</strong>、<strong>架构模式</strong>、<strong>同步范式</strong>、<strong>物理架构</strong>、<strong>通信技术</strong>等五个不同的角度来分类。</p>

<p>更多信息见下面这篇优质 paper,把 DP(Data Parallel)、MP(Model Parallel)、PP(Pipeline Parallel)各个方面讲得很透彻</p>
<ul>
  <li><a href="https://zhuanlan.zhihu.com/p/106783111">ZeRO: Memory Optimizations Toward Training Trillion Parameter Models</a></li>
</ul>

<h3 id="并行模式">并行模式</h3>

<p>分布式训练目的:将原本巨大的训练任务拆解成<strong>多个子任务</strong>,每个子任务在独立的机器上单独执行。</p>

<p>大规模深度学习任务的难点在于:</p>
<ul>
  <li>1) 训练<strong>数据量巨大</strong>:将数据集拆解成多个子集,分布到不同的node上。→ <strong>数据并行</strong></li>
  <li>2) 训练模型的<strong>参数巨大</strong>:将模型拆解,分布到不同的node上。→ <strong>模型并行</strong>
    <ul>
      <li>NLP的预训练模型实在太大了</li>
    </ul>
  </li>
</ul>

<table>
  <thead>
    <tr>
      <th>并行模式</th>
      <th>说明</th>
      <th>图解</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td>数据并行</td>
      <td>单机多卡用 DP(PS 架构),多机多卡用 DDP(Ring AllReduce)</td>
      <td><img src="https://pic3.zhimg.com/v2-f10be44bff31f5412b3398cc0cfbce96_b.jpg" alt="" /></td>
    </tr>
    <tr>
      <td>模型并行</td>
      <td> </td>
      <td><img src="https://pic2.zhimg.com/v2-37b26149c568865d5112fadb9b1ec9ad_b.jpg" alt="" /></td>
    </tr>
    <tr>
      <td>流水线并行</td>
      <td> </td>
      <td> </td>
    </tr>
  </tbody>
</table>

<h3 id="数据并行dpddp">数据并行(DP&amp;DDP)</h3>

<p><img src="https://pic3.zhimg.com/v2-f10be44bff31f5412b3398cc0cfbce96_b.jpg" alt="" /></p>

<p>数据并行相对简单,N个node(worker)构成一个<strong>分布式集群</strong>,每个worker处理1/N的数据。</p>
<ul>
  <li>理论情况下能达到<strong>线性</strong>的加速效果。</li>
  <li>TF、torch、Horovod都可以在原生支持或者微小的改动实现数据并行模式。</li>
</ul>

<p>DP(单机)+DDP(多机)</p>

<p>数据并行(DP&amp;DDP)</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">DP</code>(Data Parallelism):早期数据并行模式,一般采用<strong>参数服务器</strong>(Parameters Server)编程框架。实际中多用于<strong>单机多卡</strong>。</li>
  <li><code class="language-plaintext highlighter-rouge">DDP</code>(Distributed Data Parallelism):分布式数据并行,采用<code class="language-plaintext highlighter-rouge">Ring AllReduce</code> 通讯方式,多用于<strong>多机多卡</strong>场景。</li>
</ul>

<h4 id="dp-单机数据并行">DP 单机数据并行</h4>

<p>数据并行本质</p>
<ul>
  <li><strong>单进程多线程</strong>实现方式,只能实现<strong>单机</strong>训练, 不算严格意义上的分布式训练</li>
</ul>

<p>多个GPU 情况下,将模型分发到每个GPU上去,每个GPU都保留完整模型参数。</p>
<ul>
  <li>每个GPU加载<strong>全部模型</strong>(Parameter、Grad、Optimizer、Activation、Temp buffer)</li>
  <li>将每个batch样本平均分配到每个GPU上进行梯度计算</li>
  <li>然后<strong>汇总</strong>每个GPU上的梯度</li>
  <li>将汇总梯度重新分发到每个GPU上,每个GPU根据汇总的梯度进行模型参数更新。</li>
  <li><img src="https://pic2.zhimg.com/v2-0c13c485f5b43319c3bb4c65ae09d475_b.jpg" alt="" /></li>
</ul>

<p>K个GPU并数据并行训练过程如下:</p>
<ul>
  <li>任何一次训练迭代中,给定的随机的小批量样本都将被分成K个部分,并均匀地分配到GPU上;</li>
  <li>每个GPU根据分配给它的小批量子集,计算模型参数的损失和梯度;</li>
  <li>将K个GPU中的<strong>局部梯度</strong>聚合,以获得当前小批量的随机梯度;</li>
  <li>聚合梯度被重新分发到每个GPU中;</li>
  <li>每个GPU使用这个小批量随机梯度,来更新所维护的完整的模型参数集。</li>
</ul>

<p>数据并行是在每个worker上存储一个模型的备份,在各个worker 上处理不同的<strong>数据子集</strong>。然后需要<strong>规约</strong>(reduce)每个worker的结果,在各节点之间同步模型参数。</p>
<ul>
  <li>这一步会成为数据并行的瓶颈,因为如果worker很多的情况下,worker之间的数据传输会有很大的时间成本。</li>
</ul>

<p>参数同步后,需要采用不同的方法进行参数更新:</p>
<ul>
  <li><strong>参数平均法</strong>:最简单的一种方式,直接对各节点的参数求平均</li>
  <li><strong>更新式方法</strong></li>
</ul>

<p>若采用<strong>参数平均法</strong>,训练的过程如下所示:</p>
<ul>
  <li>基于模型的配置随机初始化网络模型参数</li>
  <li>将当前这组参数分发到各个工作节点</li>
  <li>在每个工作节点,用数据集的一部分数据进行训练</li>
  <li>将各个工作节点的参数的<strong>均值</strong>作为<strong>全局参数值</strong></li>
  <li>若还有训练数据没有参与训练,则继续从第二步开始</li>
</ul>

<p><strong>更新式</strong>方法与<strong>参数平均法</strong>类似,主要区别在于:在<strong>参数</strong>服务器和<strong>工作</strong>节点之间传递参数时,更新式方法只传递<strong>更新信息</strong>(如梯度)。</p>
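<p>下面用几行 PyTorch 对比这两种同步方式(仅为示意性的小例子,average_parameters、apply_averaged_gradients 为本文假设的函数名):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch

# 参数平均法: 直接对各 worker 的参数求均值, 作为新的全局参数
def average_parameters(worker_params):
    return torch.stack(worker_params).mean(dim=0)

# 更新式方法: 只传递梯度, 中心节点聚合后统一做一步更新
def apply_averaged_gradients(global_param, worker_grads, lr=0.1):
    avg_grad = torch.stack(worker_grads).mean(dim=0)
    return global_param - lr * avg_grad

params = [torch.tensor([1.0, 2.0]), torch.tensor([3.0, 4.0])]
grads = [torch.tensor([0.1, 0.2]), torch.tensor([0.3, 0.4])]
print(average_parameters(params))                  # tensor([2., 3.])
print(apply_averaged_gradients(params[0], grads))  # tensor([0.9800, 1.9700])
</code></pre></div></div>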

<p>问题:</p>
<ul>
  <li>负载不均衡,主GPU负载大</li>
  <li>PS 架构通信开销大</li>
</ul>

<h4 id="ddp-分布式数据并行">DDP 分布式数据并行</h4>

<p>DDP (Distribution Data Parallel)</p>
<ul>
  <li>AllReduce 架构,在单机和多机上都可以使用。</li>
  <li>负载分散在每个gpu节点上,通信成本是恒定的,与 GPU 数量无关。</li>
</ul>
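<p>一个最小的 PyTorch DDP 训练脚本示意如下(假设用 torchrun 启动;脚本名 ddp_demo.py、模型和数据均为假设的示例,梯度同步由 DDP 内部的 Ring AllReduce 自动完成):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun 会注入 RANK / LOCAL_RANK / WORLD_SIZE 等环境变量
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    model = torch.nn.Linear(10, 10).to(local_rank)
    ddp_model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.SGD(ddp_model.parameters(), lr=0.001)
    loss_fn = torch.nn.MSELoss()

    for _ in range(10):
        optimizer.zero_grad()
        inputs = torch.randn(20, 10).to(local_rank)
        labels = torch.randn(20, 10).to(local_rank)
        loss_fn(ddp_model(inputs), labels).backward()  # backward 中自动完成梯度 AllReduce
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()  # 启动示例: torchrun --nproc_per_node=8 ddp_demo.py
</code></pre></div></div>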

<h3 id="模型并行model-parallesim">模型并行(model parallesim)</h3>

<p>当<strong>模型参数过大</strong>,单个 GPU无法容纳模型参数时,就需要模型并行, 将模型拆分到多个 GPU 训练。</p>

<p>模型并行相对复杂</p>
<ul>
  <li>原理:分布式系统中的不同worker负责网络模型的不同部分</li>
  <li>例如,神经网络的不同层被分布到不同worker或者同一层的不同参数被分配到不同worker上。</li>
  <li>对于TF这种框架,可以拆分计算图成多个最小依赖子图到不同的worker上。同时在多个子图之间通过通信算子来实现模型并行。</li>
</ul>

<p>但是<strong>模型并行</strong>实现起来比较复杂。工业界还是以<strong>数据并行</strong>为主。</p>

<h4 id="层间--层内">层间 &amp; 层内</h4>

<p><code class="language-plaintext highlighter-rouge">Model Parallel</code>主要分两种:<strong>intra-layer</strong>拆分 和 <strong>inter-layer</strong>拆分</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">inter-layer</code>拆分:对模型做网络上的拆分,将每一层或者<strong>某几层</strong>放在一个worker上单独训练。
    <ul>
      <li>缺点:模型训练串行,整个模型的效率取决于最慢的那一层,存在资源浪费</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">intranet-layer</code>拆分:深度学习的网络结构基本都是一层层的。常规的卷积、池化、BN等等。如果对某一层进行了拆分,那么就是intra-layer拆分。对单层的拆分其实就是拆分这一层的matrix运算。
    <ul>
      <li>参考论文:Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism</li>
    </ul>
  </li>
</ul>

<p>对比</p>
<ul>
  <li>层间并行: 流水线并行</li>
  <li>层内并行: 张量并行</li>
</ul>

<table>
  <thead>
    <tr>
      <th>概念</th>
      <th>中文</th>
      <th>图解</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td>intra-layer and inter-layer</td>
      <td>层间并行和层内并行</td>
      <td><img src="https://pic2.zhimg.com/80/v2-c24f5994e88c4361d578d5e0939be7b9_1440w.webp" alt="" /></td>
    </tr>
    <tr>
      <td>orthogonal and complementary</td>
      <td>正交和互补</td>
      <td><img src="https://pic2.zhimg.com/80/v2-708c01105de92567824bd9d3456b9459_1440w.webp" alt="" /></td>
    </tr>
  </tbody>
</table>

<p><strong>模型并行</strong>通常分为<strong>张量并行</strong>(纵向切分)以及<strong>流水线并行</strong>(横向切分)</p>
<ul>
  <li><img src="https://pic2.zhimg.com/v2-37b26149c568865d5112fadb9b1ec9ad_b.jpg" alt="" /></li>
  <li><strong>流水线并行</strong>(Pipeline Model Parallelism)
    <ul>
      <li>朴素拆分方式: 将模型各层分组后装载到各个GPU上去,GPU之间进行<strong>串行</strong>计算
        <ul>
          <li>缺点: GPU 利用率太低,当一个GPU进行计算时,其他层的GPU都闲置。</li>
        </ul>
      </li>
      <li>改进: 谷歌提出了GPipe <code class="language-plaintext highlighter-rouge">流水线并行</code>(Pipeline Model Parallelism),引入 micro-batch(MBS)的概念,可提升GPU利用率</li>
      <li>问题: 流水线最大的问题, 无法充分利用GPU资源,training过程中会出现非预期的Bubble</li>
    </ul>
  </li>
  <li><strong>张量并行</strong>(Tensor Model Parallelism)
    <ul>
      <li>张量并行(TP)是模型并行一种形式,流水线并行按<strong>网络层</strong>切分,张量并行按<strong>矩阵</strong>切分。</li>
      <li>2019年,NVIDIA发布《Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism》论文,提出了张量并行方法</li>
      <li>核心思想: 每个GPU仅处理矩阵一部分,当算子需要整个矩阵的时候再进行矩阵聚合。无论是横向切分还是竖向切分,都可以将切分后的矩阵放到不同GPU上进行计算,最后将计算的结果再合并。</li>
    </ul>
  </li>
</ul>

<p>大模型主要结构都是Transformer,Transformer核心模块的网络结构为:<strong>Attention层</strong>+<strong>残差连接</strong>、MLP层+残差连接。</p>
<ul>
  <li>MLP层: 数学表达如下:<code class="language-plaintext highlighter-rouge">Y = GeLU(XA)</code> ,<code class="language-plaintext highlighter-rouge">Z = Dropout(YB)</code></li>
  <li>Attention层: 数学表达如下:<code class="language-plaintext highlighter-rouge">Y = Self-Attention(X)</code> ,<code class="language-plaintext highlighter-rouge">Z = Dropout(YB)</code>, 多头注意力每个头都是独立的,因此张量切分更方便</li>
</ul>
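<p>按照 Megatron 的切分思路,可以在单机上用几行 PyTorch 数值模拟两路张量并行:A 按列切、B 按行切。这里省略 Dropout,也不涉及真实的跨卡通信,仅作示意:</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch
import torch.nn.functional as F

torch.manual_seed(0)
X = torch.randn(4, 8)      # 输入
A = torch.randn(8, 16)     # MLP 第一个权重矩阵
B = torch.randn(16, 8)     # MLP 第二个权重矩阵

# 单卡基准: Y = GeLU(XA), Z = YB
Z_ref = F.gelu(X @ A) @ B

# 两路张量并行: A 按列切分, B 按行切分
A1, A2 = A.chunk(2, dim=1)
B1, B2 = B.chunk(2, dim=0)
Y1, Y2 = F.gelu(X @ A1), F.gelu(X @ A2)  # 各分片独立计算, GeLU 逐元素, 无需通信
Z = Y1 @ B1 + Y2 @ B2                    # 这一步求和对应一次 AllReduce
print(torch.allclose(Z, Z_ref, atol=1e-5))  # True
</code></pre></div></div>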

<p>大模型训练时,ZeRO 支持将模型的显存占用划分到多张卡或者多个节点上。</p>

<h4 id="示例">示例</h4>

<p>【2023-8-28】<a href="https://zhuanlan.zhihu.com/p/87596314">模型并行最佳实践(PyTorch)</a></p>

<p>两个GPU上运行此模型,只需将每个线性层放在不同的GPU上,然后移动输入(input)和中间输出(intermediate outputs)以匹配层设备(layer devices)。</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">torch</span>
<span class="kn">import</span> <span class="nn">torch.nn</span> <span class="k">as</span> <span class="n">nn</span>
<span class="kn">import</span> <span class="nn">torch.optim</span> <span class="k">as</span> <span class="n">optim</span>

<span class="k">class</span> <span class="nc">ToyModel</span><span class="p">(</span><span class="n">nn</span><span class="p">.</span><span class="n">Module</span><span class="p">):</span>
  <span class="s">"""
    模型并行示例
  """</span>

  <span class="k">def</span> <span class="nf">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span>
    <span class="c1"># 模型定义修改: 只需增加 to(device)
</span>    <span class="nb">super</span><span class="p">(</span><span class="n">ToyModel</span><span class="p">,</span> <span class="bp">self</span><span class="p">).</span><span class="n">__init__</span><span class="p">()</span>
    <span class="bp">self</span><span class="p">.</span><span class="n">net1</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">nn</span><span class="p">.</span><span class="n">Linear</span><span class="p">(</span><span class="mi">10</span><span class="p">,</span> <span class="mi">10</span><span class="p">).</span><span class="n">to</span><span class="p">(</span><span class="s">'cuda:0'</span><span class="p">)</span>  <span class="c1"># 将net1放置在第1个GPU上
</span>    <span class="bp">self</span><span class="p">.</span><span class="n">relu</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">nn</span><span class="p">.</span><span class="n">ReLU</span><span class="p">()</span>
    <span class="bp">self</span><span class="p">.</span><span class="n">net2</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">nn</span><span class="p">.</span><span class="n">Linear</span><span class="p">(</span><span class="mi">10</span><span class="p">,</span> <span class="mi">5</span><span class="p">).</span><span class="n">to</span><span class="p">(</span><span class="s">'cuda:1'</span><span class="p">)</span>   <span class="c1"># 将net2放置在第2个GPU上
</span>
  <span class="k">def</span> <span class="nf">forward</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">x</span><span class="p">):</span>
    <span class="n">x</span> <span class="o">=</span> <span class="bp">self</span><span class="p">.</span><span class="n">relu</span><span class="p">(</span><span class="bp">self</span><span class="p">.</span><span class="n">net1</span><span class="p">(</span><span class="n">x</span><span class="p">.</span><span class="n">to</span><span class="p">(</span><span class="s">'cuda:0'</span><span class="p">)))</span>
    <span class="k">return</span> <span class="bp">self</span><span class="p">.</span><span class="n">net2</span><span class="p">(</span><span class="n">x</span><span class="p">.</span><span class="n">to</span><span class="p">(</span><span class="s">'cuda:1'</span><span class="p">))</span>
</code></pre></div></div>

<p>注意 ToyModel</p>
<ul>
  <li>除了几个用于将<strong>线性层</strong>(linear layers)和<strong>张量</strong>(tensors)放置在适当设备上的to(device)调用之外,以上内容与在单个GPU上实现该功能非常相似。那是模型中<strong>唯一</strong>更改的地方(即to(device))。</li>
  <li>backward() 和 torch.optim 会<strong>自动</strong>处理梯度(gradients),就像模型在单个GPU上一样。</li>
  <li>调用损失函数时,只需确保<strong>标签</strong>(label)与<strong>输出</strong>(output)在同一设备(on the same device)上。</li>
</ul>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">model</span> <span class="o">=</span> <span class="n">ToyModel</span><span class="p">()</span>
<span class="n">loss_fn</span> <span class="o">=</span> <span class="n">nn</span><span class="p">.</span><span class="n">MSELoss</span><span class="p">()</span>
<span class="n">optimizer</span> <span class="o">=</span> <span class="n">optim</span><span class="p">.</span><span class="n">SGD</span><span class="p">(</span><span class="n">model</span><span class="p">.</span><span class="n">paraeters</span><span class="p">(),</span> <span class="n">lr</span><span class="o">=</span><span class="mf">0.001</span><span class="p">)</span>

<span class="n">optimizer</span><span class="p">.</span><span class="n">zero_grad</span><span class="p">()</span>
<span class="n">outputs</span> <span class="o">=</span> <span class="n">model</span><span class="p">(</span><span class="n">torch</span><span class="p">.</span><span class="n">randn</span><span class="p">(</span><span class="mi">20</span><span class="p">,</span> <span class="mi">10</span><span class="p">))</span>
<span class="n">labels</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">randn</span><span class="p">(</span><span class="mi">20</span><span class="p">,</span> <span class="mi">5</span><span class="p">).</span><span class="n">to</span><span class="p">(</span><span class="s">'cuda:1'</span><span class="p">)</span> <span class="c1"># ToyMode 的 output 是在 'cuda:1' 上,此处的 label 也应该置于 'cuda:1' 上
</span><span class="n">loss_fn</span><span class="p">(</span><span class="n">outputs</span><span class="p">,</span><span class="n">labels</span><span class="p">).</span><span class="n">backward</span><span class="p">()</span>
<span class="n">optimizer</span><span class="p">.</span><span class="n">step</span><span class="p">()</span>
</code></pre></div></div>

<p>只需更改几行,就可以在多个GPU上运行现有的单GPU模块。</p>

<p>下面演示如何将 torchvision.models.resnet50() 分解到两个GPU上。</p>
<ul>
  <li>从现有 ResNet模块继承,并在构建过程中将层拆分为两个GPU。</li>
  <li>然后覆盖 forward 方法,通过相应地移动中间输出来缝合两个子网。</li>
</ul>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">from</span> <span class="nn">torchvision.models.resnet</span> <span class="kn">import</span> <span class="n">ResNet</span><span class="p">,</span> <span class="n">Bottleneck</span>

<span class="n">num_classes</span> <span class="o">=</span> <span class="mi">1000</span>

<span class="k">class</span> <span class="nc">ModelParallelResNet50</span><span class="p">(</span><span class="n">ResNet</span><span class="p">):</span>
    <span class="k">def</span> <span class="nf">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="o">*</span><span class="n">args</span><span class="p">,</span> <span class="o">**</span><span class="n">kwargs</span><span class="p">):</span>
        <span class="nb">super</span><span class="p">(</span><span class="n">ModelParallelResNet50</span><span class="p">,</span> <span class="bp">self</span><span class="p">).</span><span class="n">__init__</span><span class="p">(</span><span class="n">Bottleneck</span><span class="p">,</span> <span class="p">[</span><span class="mi">3</span><span class="p">,</span> <span class="mi">4</span><span class="p">,</span> <span class="mi">6</span><span class="p">,</span> <span class="mi">3</span><span class="p">],</span> <span class="n">num_classes</span><span class="o">=</span><span class="n">num_classes</span><span class="p">,</span> <span class="o">*</span><span class="n">args</span><span class="p">,</span> <span class="o">**</span><span class="n">kwargs</span><span class="p">)</span>

        <span class="bp">self</span><span class="p">.</span><span class="n">seq1</span> <span class="o">=</span> <span class="n">nn</span><span class="p">.</span><span class="n">Sequential</span><span class="p">(</span>
            <span class="bp">self</span><span class="p">.</span><span class="n">conv1</span><span class="p">,</span>
            <span class="bp">self</span><span class="p">.</span><span class="n">bn1</span><span class="p">,</span>
            <span class="bp">self</span><span class="p">.</span><span class="n">relu</span><span class="p">,</span>
            <span class="bp">self</span><span class="p">.</span><span class="n">maxpool</span><span class="p">,</span>
            <span class="c1"># 模型拆分
</span>            <span class="bp">self</span><span class="p">.</span><span class="n">layer1</span><span class="p">,</span>
            <span class="bp">self</span><span class="p">.</span><span class="n">layer2</span>
        <span class="p">).</span><span class="n">to</span><span class="p">(</span><span class="s">'cuda:0'</span><span class="p">)</span>  <span class="c1"># 放置在第1个GPU上
</span>
        <span class="bp">self</span><span class="p">.</span><span class="n">seq2</span> <span class="o">=</span> <span class="n">nn</span><span class="p">.</span><span class="n">Sequential</span><span class="p">(</span>
            <span class="bp">self</span><span class="p">.</span><span class="n">layer3</span><span class="p">,</span>
            <span class="bp">self</span><span class="p">.</span><span class="n">layer4</span><span class="p">,</span>
            <span class="bp">self</span><span class="p">.</span><span class="n">avgpool</span><span class="p">,</span>
        <span class="p">).</span><span class="n">to</span><span class="p">(</span><span class="s">'cuda:1'</span><span class="p">)</span>  <span class="c1"># 放置在第2个GPU上
</span>
        <span class="bp">self</span><span class="p">.</span><span class="n">fc</span><span class="p">.</span><span class="n">to</span><span class="p">(</span><span class="s">'cuda:1'</span><span class="p">)</span>

    <span class="k">def</span> <span class="nf">forward</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">x</span><span class="p">):</span>
        <span class="n">x</span> <span class="o">=</span> <span class="bp">self</span><span class="p">.</span><span class="n">seq2</span><span class="p">(</span><span class="bp">self</span><span class="p">.</span><span class="n">seq1</span><span class="p">(</span><span class="n">x</span><span class="p">).</span><span class="n">to</span><span class="p">(</span><span class="s">'cuda:1'</span><span class="p">))</span>
        <span class="k">return</span> <span class="bp">self</span><span class="p">.</span><span class="n">fc</span><span class="p">(</span><span class="n">x</span><span class="p">.</span><span class="n">view</span><span class="p">(</span><span class="n">x</span><span class="p">.</span><span class="n">size</span><span class="p">(</span><span class="mi">0</span><span class="p">),</span> <span class="o">-</span><span class="mi">1</span><span class="p">))</span>
</code></pre></div></div>

<p>对于模型太大而无法放入单个GPU的情况,上述实现解决了该问题。但如果模型本身可以放进单个GPU,model parallel 反而会比在单个GPU上运行要<strong>慢</strong>。</p>
<ul>
  <li>因为在<span style="color:red">任何时间点,两个GPU中只有1个在工作</span>,而另一个在那儿什么也没做。</li>
  <li>在 layer2 和 layer3之间,中间输出需要从 cuda:0 复制到 cuda:1,这使得性能进一步恶化。</li>
</ul>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">torchvision.models</span> <span class="k">as</span> <span class="n">models</span>

<span class="n">num_batches</span> <span class="o">=</span> <span class="mi">3</span>
<span class="n">batch_size</span> <span class="o">=</span> <span class="mi">120</span>
<span class="n">image_w</span> <span class="o">=</span> <span class="mi">128</span>
<span class="n">image_h</span> <span class="o">=</span> <span class="mi">128</span>

<span class="k">def</span> <span class="nf">train</span><span class="p">(</span><span class="n">model</span><span class="p">):</span>
    <span class="n">model</span><span class="p">.</span><span class="n">train</span><span class="p">(</span><span class="bp">True</span><span class="p">)</span>
    <span class="n">loss_fn</span> <span class="o">=</span> <span class="n">nn</span><span class="p">.</span><span class="n">MSELoss</span><span class="p">()</span>
    <span class="n">optimizer</span> <span class="o">=</span> <span class="n">optim</span><span class="p">.</span><span class="n">SGD</span><span class="p">(</span><span class="n">model</span><span class="p">.</span><span class="n">parameters</span><span class="p">(),</span> <span class="n">lr</span><span class="o">=</span><span class="mf">0.001</span><span class="p">)</span>

    <span class="n">one_hot_indices</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">LongTensor</span><span class="p">(</span><span class="n">batch_size</span><span class="p">)</span> \
                           <span class="p">.</span><span class="n">random_</span><span class="p">(</span><span class="mi">0</span><span class="p">,</span> <span class="n">num_classes</span><span class="p">)</span> \
                           <span class="p">.</span><span class="n">view</span><span class="p">(</span><span class="n">batch_size</span><span class="p">,</span> <span class="mi">1</span><span class="p">)</span>

    <span class="k">for</span> <span class="n">_</span> <span class="ow">in</span> <span class="nb">range</span><span class="p">(</span><span class="n">num_batches</span><span class="p">):</span>
        <span class="c1"># generate random inputs and labels
</span>        <span class="n">inputs</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">randn</span><span class="p">(</span><span class="n">batch_size</span><span class="p">,</span> <span class="mi">3</span><span class="p">,</span> <span class="n">image_w</span><span class="p">,</span> <span class="n">image_h</span><span class="p">)</span>
        <span class="n">labels</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">zeros</span><span class="p">(</span><span class="n">batch_size</span><span class="p">,</span> <span class="n">num_classes</span><span class="p">)</span> \
                      <span class="p">.</span><span class="n">scatter_</span><span class="p">(</span><span class="mi">1</span><span class="p">,</span> <span class="n">one_hot_indices</span><span class="p">,</span> <span class="mi">1</span><span class="p">)</span>

        <span class="c1"># run forward pass
</span>        <span class="n">optimizer</span><span class="p">.</span><span class="n">zero_grad</span><span class="p">()</span>
        <span class="n">outputs</span> <span class="o">=</span> <span class="n">model</span><span class="p">(</span><span class="n">inputs</span><span class="p">.</span><span class="n">to</span><span class="p">(</span><span class="s">'cuda:0'</span><span class="p">))</span>

        <span class="c1"># run backward pass
</span>        <span class="n">labels</span> <span class="o">=</span> <span class="n">labels</span><span class="p">.</span><span class="n">to</span><span class="p">(</span><span class="n">outputs</span><span class="p">.</span><span class="n">device</span><span class="p">)</span>
        <span class="n">loss_fn</span><span class="p">(</span><span class="n">outputs</span><span class="p">,</span> <span class="n">labels</span><span class="p">).</span><span class="n">backward</span><span class="p">()</span>
        <span class="n">optimizer</span><span class="p">.</span><span class="n">step</span><span class="p">()</span>
</code></pre></div></div>

<p>两个GPU中的一个会处于空闲状态。怎么优化?</p>
<ul>
  <li>将每个批次进一步划分为拆分<code class="language-plaintext highlighter-rouge">流水线</code>,当1个拆分到达第2子网时,可以将下一个拆分馈入第一子网。这样,两个连续的拆分可以在两个GPU上同时运行。</li>
</ul>

<p>流水线输入(Pipelining Inputs)加速</p>
<ul>
  <li>将每个 120-image 的批次进一步划分为 20-image 的拆分(split)。由于 PyTorch 异步启动 CUDA 操作,该实现无需生成多个线程即可实现并发。</li>
</ul>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">class</span> <span class="nc">PipelineParallelResNet50</span><span class="p">(</span><span class="n">ModelParallelResNet50</span><span class="p">):</span>
    <span class="k">def</span> <span class="nf">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">split_size</span><span class="o">=</span><span class="mi">20</span><span class="p">,</span> <span class="o">*</span><span class="n">args</span><span class="p">,</span> <span class="o">**</span><span class="n">kwargs</span><span class="p">):</span>
        <span class="nb">super</span><span class="p">(</span><span class="n">PipelineParallelResNet50</span><span class="p">,</span> <span class="bp">self</span><span class="p">).</span><span class="n">__init__</span><span class="p">(</span><span class="o">*</span><span class="n">args</span><span class="p">,</span> <span class="o">**</span><span class="n">kwargs</span><span class="p">)</span>
        <span class="bp">self</span><span class="p">.</span><span class="n">split_size</span> <span class="o">=</span> <span class="n">split_size</span>

    <span class="k">def</span> <span class="nf">forward</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">x</span><span class="p">):</span>
        <span class="n">splits</span> <span class="o">=</span> <span class="nb">iter</span><span class="p">(</span><span class="n">x</span><span class="p">.</span><span class="n">split</span><span class="p">(</span><span class="bp">self</span><span class="p">.</span><span class="n">split_size</span><span class="p">,</span> <span class="n">dim</span><span class="o">=</span><span class="mi">0</span><span class="p">))</span>
        <span class="n">s_next</span> <span class="o">=</span> <span class="nb">next</span><span class="p">(</span><span class="n">splits</span><span class="p">)</span>
        <span class="n">s_prev</span> <span class="o">=</span> <span class="bp">self</span><span class="p">.</span><span class="n">seq1</span><span class="p">(</span><span class="n">s_next</span><span class="p">).</span><span class="n">to</span><span class="p">(</span><span class="s">'cuda:1'</span><span class="p">)</span>
        <span class="n">ret</span> <span class="o">=</span> <span class="p">[]</span>

        <span class="k">for</span> <span class="n">s_next</span> <span class="ow">in</span> <span class="n">splits</span><span class="p">:</span>
            <span class="c1"># A. s_prev runs on cuda:1
</span>            <span class="n">s_prev</span> <span class="o">=</span> <span class="bp">self</span><span class="p">.</span><span class="n">seq2</span><span class="p">(</span><span class="n">s_prev</span><span class="p">)</span>
            <span class="n">ret</span><span class="p">.</span><span class="n">append</span><span class="p">(</span><span class="bp">self</span><span class="p">.</span><span class="n">fc</span><span class="p">(</span><span class="n">s_prev</span><span class="p">.</span><span class="n">view</span><span class="p">(</span><span class="n">s_prev</span><span class="p">.</span><span class="n">size</span><span class="p">(</span><span class="mi">0</span><span class="p">),</span> <span class="o">-</span><span class="mi">1</span><span class="p">)))</span>

            <span class="c1"># B. s_next runs on cuda:0, which can run concurrently with A
</span>            <span class="n">s_prev</span> <span class="o">=</span> <span class="bp">self</span><span class="p">.</span><span class="n">seq1</span><span class="p">(</span><span class="n">s_next</span><span class="p">).</span><span class="n">to</span><span class="p">(</span><span class="s">'cuda:1'</span><span class="p">)</span>

        <span class="n">s_prev</span> <span class="o">=</span> <span class="bp">self</span><span class="p">.</span><span class="n">seq2</span><span class="p">(</span><span class="n">s_prev</span><span class="p">)</span>
        <span class="n">ret</span><span class="p">.</span><span class="n">append</span><span class="p">(</span><span class="bp">self</span><span class="p">.</span><span class="n">fc</span><span class="p">(</span><span class="n">s_prev</span><span class="p">.</span><span class="n">view</span><span class="p">(</span><span class="n">s_prev</span><span class="p">.</span><span class="n">size</span><span class="p">(</span><span class="mi">0</span><span class="p">),</span> <span class="o">-</span><span class="mi">1</span><span class="p">)))</span>

        <span class="k">return</span> <span class="n">torch</span><span class="p">.</span><span class="n">cat</span><span class="p">(</span><span class="n">ret</span><span class="p">)</span>


<span class="n">setup</span> <span class="o">=</span> <span class="s">"model = PipelineParallelResNet50()"</span>
<span class="n">pp_run_times</span> <span class="o">=</span> <span class="n">timeit</span><span class="p">.</span><span class="n">repeat</span><span class="p">(</span>
    <span class="n">stmt</span><span class="p">,</span> <span class="n">setup</span><span class="p">,</span> <span class="n">number</span><span class="o">=</span><span class="mi">1</span><span class="p">,</span> <span class="n">repeat</span><span class="o">=</span><span class="n">num_repeat</span><span class="p">,</span> <span class="nb">globals</span><span class="o">=</span><span class="nb">globals</span><span class="p">())</span>
<span class="n">pp_mean</span><span class="p">,</span> <span class="n">pp_std</span> <span class="o">=</span> <span class="n">np</span><span class="p">.</span><span class="n">mean</span><span class="p">(</span><span class="n">pp_run_times</span><span class="p">),</span> <span class="n">np</span><span class="p">.</span><span class="n">std</span><span class="p">(</span><span class="n">pp_run_times</span><span class="p">)</span>

<span class="n">plot</span><span class="p">([</span><span class="n">mp_mean</span><span class="p">,</span> <span class="n">rn_mean</span><span class="p">,</span> <span class="n">pp_mean</span><span class="p">],</span>
     <span class="p">[</span><span class="n">mp_std</span><span class="p">,</span> <span class="n">rn_std</span><span class="p">,</span> <span class="n">pp_std</span><span class="p">],</span>
     <span class="p">[</span><span class="s">'Model Parallel'</span><span class="p">,</span> <span class="s">'Single GPU'</span><span class="p">,</span> <span class="s">'Pipelining Model Parallel'</span><span class="p">],</span>
     <span class="s">'mp_vs_rn_vs_pp.png'</span><span class="p">)</span>
</code></pre></div></div>

<p>设备到设备的张量复制操作在源设备和目标设备上的当前流(current streams)上同步。如果创建多个流,则必须确保复制操作正确同步。在完成复制操作之前写入源张量或读取/写入目标张量可能导致不确定的行为。上面的实现仅在源设备和目标设备上都使用默认流,因此没有必要强制执行其他同步。</p>

<h3 id="流水线并行">流水线并行</h3>

<p><code class="language-plaintext highlighter-rouge">数据并行</code>还是<code class="language-plaintext highlighter-rouge">模型并行</code>都会在相应机器之间全连接通信,当机器数量增大时,<strong>通信开销和时延</strong>会大到难以忍受</p>

<p>流水线(管道)并行既解决了<strong>超大模型无法在单设备装下</strong>的难题,又解决了<strong>机器之间的通信开销</strong>的问题</p>
<ul>
  <li>每个阶段(stage) 和下一个阶段之间仅有相邻的某一个 Tensor 数据需要传输</li>
  <li>每台机器的数据传输量跟总的网络大小、机器总数、并行规模无关。</li>
</ul>

<p><img src="https://pic1.zhimg.com/80/v2-bdb9a12c01204d335187f6e3e3aad284_1440w.webp" alt="" /></p>

<p><strong>流水线并行</strong>(Pipeline Model Parallelism)</p>
<ul>
  <li>朴素拆分方式: 将模型<strong>各层</strong>分组后装载到各个GPU上,GPU之间进行<strong>串行</strong>计算</li>
  <li><img src="https://pic4.zhimg.com/80/v2-a2c0059f72e0e121520b6fce2027deaf_1440w.webp" alt="" /></li>
  <li>缺点: <strong>GPU 利用率太低</strong>,当1个GPU进行计算时,其他层GPU都闲置。</li>
</ul>

<p>改进方法如下</p>
<ul>
  <li>GPipe</li>
  <li>PipeDream</li>
</ul>

<h4 id="g-pipe">G-pipe</h4>

<p>谷歌提出 <code class="language-plaintext highlighter-rouge">G-pipe</code> <code class="language-plaintext highlighter-rouge">流水线并行</code>(Pipeline model parallesim ), 引入micro-batches (MBS)的概念,会提升GPU利用率</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">F-then-B</code> 调度方式: 原 mini-batch(数据并行切分后的batch)划分成多个 <code class="language-plaintext highlighter-rouge">micro-batch</code>(<code class="language-plaintext highlighter-rouge">mini-batch</code>再切分后的batch),每个 pipeline stage (流水线并行的计算单元)先整体进行<strong>前向</strong>计算,再进行<strong>反向</strong>计算。同一时刻分别计算模型的不同部分,F-then-B 可以显著提升设备资源利用率。</li>
  <li>F-then-B 模式由于缓存了多个 micro-batch 的中间变量和梯度,显存的实际利用率并不高。</li>
  <li>解决: <code class="language-plaintext highlighter-rouge">1F1B</code> (在流水线并行中,pipeline stage 前向计算和反向计算交叉进行的方式)流水线并行方式。1F1B 模式下,前向计算和反向计算<strong>交叉</strong>进行,可以及时释放不必要的中间变量。</li>
  <li><img src="https://pic1.zhimg.com/80/v2-b678a253f70613169172fd892e0e4064_1440w.webp" alt="" /></li>
</ul>
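<p>两种调度在显存上的差别可以粗略地这样理解(仅为示意性估算,cached_activations 为本文假设的函数名):F-then-B 需要同时缓存全部 m 个 micro-batch 的前向激活,而 1F1B 前向、反向交替进行,在途的 micro-batch 至多只有 p(stage 数)个:</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>def cached_activations(schedule, p, m):
    """粗估单个 stage 需要同时缓存的 micro-batch 激活份数"""
    if schedule == "F-then-B":
        return m            # 先跑完全部前向, m 份激活都要留到反向
    return min(m, p)        # 1F1B: 在途 micro-batch 至多 p 个

p, m = 4, 32
for s in ("F-then-B", "1F1B"):
    print(s, cached_activations(s, p, m))   # 32 vs 4
</code></pre></div></div>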

<h4 id="pipedream">PipeDream</h4>

<p>PipeDream 在单个 GPU 上短暂运行性能分析后,自动决定怎样分割这些 DNN 算子,如何平衡不同 stage 之间的计算负载,而同时尽可能减少目标平台上的通信量。</p>

<p>PipeDream将DNN 层划分为多个阶段 —— 每个阶段(stage)由模型中的一组连续层组成。</p>
<ul>
  <li>PipeDream把模型的不同的阶段部署在不同的机器上,每个阶段可能有不同的replication。该阶段对本阶段中所有层执行向前和向后传递。</li>
  <li>PipeDream将包含输入层的阶段称为<strong>输入</strong>阶段,将包含输出层的阶段称为<strong>输出</strong>阶段。</li>
  <li><img src="https://pic3.zhimg.com/80/v2-26f04fc799e3220d6806cfbebe415712_1440w.webp" alt="" /></li>
</ul>

<h4 id="virtual-pipeline">virtual pipeline</h4>

<p>virtual pipeline 是 Megatron-2 论文中最主要的一个创新点。</p>
<ul>
  <li>传统的 pipeline 并行通常会在一个 Device 上放置几个 block,为了扩展效率,在计算强度和通信强度中间取一个平衡。</li>
  <li>但 virtual pipeline 在 device 数量不变的情况下,分出更多的 pipeline stage,以更多的通信量,换取空泡比率降低,减小了 step e2e 用时。</li>
  <li><img src="https://pic4.zhimg.com/80/v2-b5347bb2677de0ffd78e091a4e1e79bb_1440w.webp" alt="" /></li>
</ul>

<h3 id="张量并行tensor-parallelism">张量并行(Tensor Parallelism)</h3>

<p>流水线并行主要集中在<strong>多层</strong>神经网络架构训练上,对于Transformer架构的模型(如BERT,GPT等),<code class="language-plaintext highlighter-rouge">MultiHead Attention Layer</code>和<code class="language-plaintext highlighter-rouge">MLP</code>的计算量翻了几倍,如果继续按管线切分模型, 可能单层参数都无法被显存装载,因此需要横着把同一层的模型切分开来,这便是<strong>张量并行</strong></p>
<ul>
  <li>层间并行: 流水线并行</li>
  <li>层内并行: 张量并行</li>
  <li><img src="https://pic3.zhimg.com/80/v2-293a367d9c5378f01fa64ad009dd9eb2_1440w.webp" alt="" /></li>
</ul>

<p>分布式张量计算正交且更通用,将张量操作划分到多个设备上,以加速计算或增加模型大小。</p>
<ul>
  <li>把 <code class="language-plaintext highlighter-rouge">Masked Multi Self Attention</code> 和 <code class="language-plaintext highlighter-rouge">Feed Forward</code> 都进行切分以并行化,利用Transformers网络的结构,通过添加一些同步原语来创建一个简单的模型并行实现。</li>
</ul>

<p><strong>张量并行</strong>(Tensor Model Parallelism)</p>
<ul>
  <li>张量并行(TP)是模型并行一种形式,流水线并行按<strong>网络层</strong>切分,张量并行按<strong>矩阵</strong>切分。</li>
  <li>2019年,NVIDIA发布《Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism》论文,提出了张量并行方法</li>
  <li>核心思想: 每个GPU仅处理矩阵一部分,当算子需要整个矩阵的时候再进行矩阵聚合。无论是横向切分还是竖向切分,都可以将切分后的矩阵放到不同GPU上进行计算,最后将计算的结果再合并。</li>
</ul>

<p>张量并行最有名的是: <code class="language-plaintext highlighter-rouge">Megatron</code> 和 <code class="language-plaintext highlighter-rouge">Deepspeed</code></p>

<h3 id="混合并行">混合并行</h3>

<p>随着训练设备的增加,多个worker之间的通信成本增加,模型Reduce的成本也越来越大,数据并行的瓶颈也随之出现。故有学者提出<strong>混合并行</strong>(数据并行+模型并行)</p>

<h3 id="架构模式">架构模式</h3>

<p>分布式训练上会频繁用到<strong>规约</strong>(AllReduce)操作。</p>

<p>all-reduce 操作有多种方式实现:</p>
<ul>
  <li><strong>树状结构</strong>:数据在进程间以树状结构进行归约,每个非叶子节点负责将其子节点的数据归约后再传递给其父节点。</li>
  <li><strong>环形结构</strong>:进程之间形成一个环,数据在环中按顺序传递并归约。</li>
  <li><strong>直接归约</strong>:所有进程直接将数据发送给一个中心节点,该节点完成归约后将结果发送回所有进程。</li>
</ul>

<p>all-reduce 操作性能对分布式计算的效率至关重要,因此优化这一操作是分布式系统设计中的一个研究热点。使用最多的实现方式是百度提出的 <code class="language-plaintext highlighter-rouge">Ring AllReduce</code> 算法,该方法属于<strong>环状结构</strong>实现的一种。</p>

<p>主流的<strong>分布式架构</strong>主要分为<code class="language-plaintext highlighter-rouge">参数服务器</code>(ParameterServer) 和<code class="language-plaintext highlighter-rouge">基于规约</code>(Reduce)两种模式。早期还有基于<code class="language-plaintext highlighter-rouge">MPI</code>的方式,不过现在已经很少用了。</p>

<p>传统 parameter server: server和client方式</p>
<ul>
  <li>client通过计算分配给自己的数据,产生梯度,传给server</li>
  <li>server 聚合,然后把参数再传给client</li>
</ul>

<p>这个方式的弊端: server容易成为瓶颈</p>
<ul>
  <li>server通信量太大。</li>
  <li>一个client失败,会导致其他client等待。</li>
</ul>

<p>Ring AllReduce 是另一种分布式梯度同步方式</p>
<ul>
  <li>各个节点分配通信量。</li>
  <li>总的通信量和ps没啥变化,但是通信的压力平摊到各个GPU上了,GPU之间的通信可以并行进行。</li>
</ul>

<p>假设GPU数量是N,把每张卡上的梯度切分成N块(每个GPU仍保存完整的模型参数),每个GPU也要分配训练数据。</p>
<ul>
  <li>一次迭代中,N个GPU之间要经过一次 reduce-scatter 和一次 all-gather 操作:reduce-scatter 将不同GPU上对应块的梯度相加,一共需要通信 (N-1) 次。</li>
  <li>all-gather 再把规约完成的块传播到其他GPU上,同样需要通信 (N-1) 次。</li>
  <li>一次 all-reduce,单卡通信量约为 2*(N-1)/N*Φ ≈ 2Φ(Φ 为梯度总量)。</li>
</ul>
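<p>下面在单机上用 numpy 模拟 Ring AllReduce 的两个阶段(reduce-scatter + all-gather)。ring_allreduce 为本文假设的函数名,只用于说明通信模式和通信量,并非任何通信库的真实实现:</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import numpy as np

def ring_allreduce(data):
    """data[r][c]: 第 r 张卡上的第 c 块梯度; 返回每张卡都持有的全量求和结果"""
    n = len(data)
    data = [np.array(d, dtype=float) for d in data]
    rounds = 0
    # 阶段一: reduce-scatter, N-1 轮, 每轮每张卡把一块发给下一张卡并累加
    for step in range(n - 1):
        for r in range(n):
            c = (r - step) % n
            data[(r + 1) % n][c] += data[r][c]
        rounds += 1
    # 阶段二: all-gather, 再 N-1 轮, 把规约完成的块沿环广播
    for step in range(n - 1):
        for r in range(n):
            c = (r + 1 - step) % n
            data[(r + 1) % n][c] = data[r][c]
        rounds += 1
    return data, rounds

# 4 张卡, 每张卡的梯度被切成 4 块
grads = [[1, 2, 3, 4], [10, 20, 30, 40], [100, 200, 300, 400], [1000, 2000, 3000, 4000]]
out, rounds = ring_allreduce(grads)
print(out[0])   # [1111. 2222. 3333. 4444.], 每张卡结果一致
print(rounds)   # 2*(N-1)=6 轮, 每轮单卡只发送 1/N 的数据
</code></pre></div></div>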

<h4 id="ps参数服务器">PS:参数服务器</h4>

<p>ParameterServer模式是一种基于reduce和broadcast算法的经典架构。</p>
<ul>
  <li>其中一个/一组机器作为PS架构的<strong>中心节点</strong>,用来<strong>存储参数和梯度</strong>。</li>
  <li>在更新梯度的时候,先全局reduce接受其他worker节点的数据,经过本地计算后(比如参数平均法),再broadcast回所有其他worker。</li>
  <li>论文: <a href="https://www.cs.cmu.edu/~muli/file/ps.pdf">Parameter Server for Distributed Machine Learning</a></li>
  <li><a href="https://www.zhihu.com/tardis/zm/art/82116922?source_id=1003">中文解读</a></li>
  <li><img src="https://pic3.zhimg.com/80/v2-85e54fee3bdad611072235264df95b66_1440w.webp" alt="" /></li>
</ul>

<p>PS架构的问题在于多个worker与ps通信,PS本身可能存在<strong>瓶颈</strong>。</p>
<ul>
  <li>随着worker数量的增加,整体通信量也线性增加,加速比也可能停滞在某个点位上。</li>
  <li><img src="https://pic3.zhimg.com/80/v2-eee6e2ad8aa00a8679298ff297508a16_1440w.jpg" alt="" /></li>
</ul>

<h4 id="基于规约-reduce模式">基于规约 Reduce模式</h4>

<p>基于规约的模式解决了上述问题,最典型的是百度提出的 Ring-AllReduce。</p>
<ul>
  <li>多个Worker节点连接成一个环,每个Worker依次把自己的梯度同步给下一个Worker,经过至多 2*(N-1) 轮同步,就可以完成所有Worker的梯度更新。</li>
  <li>这种方式下所有节点的地位是平等的,因此不存在某个节点的<strong>负载瓶颈</strong>;随着Worker的增加,每个节点的通信量保持恒定,加速比几乎可以跟机器数量成线性关系,不存在明显瓶颈。</li>
  <li><img src="https://pic1.zhimg.com/80/v2-5c777ca6d8ce4972d51f6ce73f3a044c_1440w.jpg" alt="" /></li>
</ul>

<p>目前,越来越多的分布式训练采用<strong>Reduce</strong>这种模式。Horovod中主要就是用的这种分布式架构。</p>
<ul>
  <li>更多资料参考: <a href="https://zhuanlan.zhihu.com/p/79030485">兰瑞Frank:腾讯机智团队分享–AllReduce算法的前世今生</a></li>
</ul>

<h3 id="同步范式">同步范式</h3>

<p>实际训练过程中可能遇到各种问题,比如:部分节点资源受限、卡顿、网络延时等等</p>

<p>因此梯度同步时就存在“<strong>木桶</strong>“效应,即集群中的某些worker比其他worker更慢,导致整个训练pipeline需要等待慢的worker,整个集群的训练速度受限于最慢机器的速度。</p>

<p>因此梯度同步有“<strong>同步</strong>”(sync)、“<strong>异步</strong>”(Async)和<strong>混合</strong>三种范式。</p>
<ul>
  <li><strong>同步</strong>范式:只有所有worker完成当前的计算任务,整个集群才会开始下一次迭代。
    <ul>
      <li>TF中同步范式使用SyncReplicasOptimizer优化器</li>
    </ul>
  </li>
  <li><strong>异步</strong>模式刚好相反,每个worker只关心自己的进程,完成计算后就尝试更新,能否与其他worker完成梯度同步取决于各worker当前时刻的状态。其过程不可控,有可能出现模型正确性问题。(可在训练时通过logging对比)</li>
  <li><strong>混合</strong>范式结合以上两种情况,各个worker都会等待其他worker的完成,但不是永久等待,有timeout的机制。如果超时了,则此情况下相当于异步机制。并且没来得及完成计算的worker,其梯度则被标记为“stale”而抛弃或另做处理。</li>
</ul>

<p><strong>梯度累加</strong></p>

<p>Gradient Accumulation 把一个大 Batch 拆分成多个 micro-batch, 每个 micro-batch 前后向计算后的梯度累加,在最后一个micro-batch累加结束后,统一更新模型。</p>
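<p>用 PyTorch 写出来大致如下(示意代码,模型、数据和 accum_steps 均为假设;loss 除以累加次数,以保证与直接用大 batch 训练在数学上等价):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch
from torch import nn

model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()
accum_steps = 8                      # 一个大 batch 被拆成 8 个 micro-batch

optimizer.zero_grad()
for _ in range(accum_steps):
    x = torch.randn(4, 10)           # 每个 micro-batch 只有 4 条样本
    y = torch.randn(4, 2)
    loss = loss_fn(model(x), y) / accum_steps
    loss.backward()                  # 梯度累加在 .grad 上, 暂不更新
optimizer.step()                     # 所有 micro-batch 累加完后统一更新
optimizer.zero_grad()
</code></pre></div></div>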

<p><code class="language-plaintext highlighter-rouge">micro-batch</code> 跟<code class="language-plaintext highlighter-rouge">数据并行</code>高度相似性:</p>
<ul>
  <li>数据并行是空间上,数据被拆分成多个 tensor,同时喂给多个设备并行计算,然后将梯度累加在一起更新;</li>
  <li>而 micro-batch 是<strong>时间</strong>上的数据并行,数据被拆分成多个 tensor, 按照时序依次进入同一个设备串行计算,然后将梯度累加在一起更新。当总的 batch size 一致,且数据并行的并行度和 micro-batch 的累加次数相等时,数据并行和 Gradient Accumulation 在数学上完全等价。</li>
</ul>

<p>Gradient Accumulation 通过多个 micro-batch的梯度累加, 使下一个 micro-batch 的前向计算不需要依赖上一个 micro-batch 的反向计算,因此可以畅通无阻的进行下去(当然在一个大 batch 的最后一次 micro-batch 还是会触发这个依赖)。</p>

<p>Gradient Accumulation 解决了很多问题:</p>
<ul>
  <li>单卡下,Gradient Accumulation 将一个大 batch size 拆分成等价的多个小 micro-batch ,从而达到节省显存的目的。</li>
  <li>数据并行下,Gradient Accumulation 解决了反向梯度同步开销占比过大的问题(随着机器数和设备数的增加,梯度的 AllReduce 同步开销也加大),因为梯度同步变成了一个稀疏操作,因此可以提升数据并行的加速比。</li>
  <li>流水并行下, Gradient Accumulation 使得不同 stage 之间可以并行执行不同的 micro-batch, 从而让各个阶段的计算不阻塞,达到流水的目的。如果每个 micro-batch 前向计算的中间结果(activation)都要留到后向计算使用,则需要在显存中缓存多份(等于梯度累加次数)完整的前向 activation。这时就不得不用另一项重要的技术:激活检查点(activation checkpointing),本列表后给出一个最小示意。</li>
</ul>
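<p>激活检查点在 PyTorch 中可以用 torch.utils.checkpoint 实现,下面是一个最小示意(网络结构为假设的例子;use_reentrant=False 需要较新的 PyTorch 版本):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch
from torch import nn
from torch.utils.checkpoint import checkpoint

# 4 个 block: 前向时不保存中间激活, 反向时按需重算, 以计算换显存
blocks = nn.ModuleList(
    [nn.Sequential(nn.Linear(512, 512), nn.ReLU()) for _ in range(4)]
)

def forward_with_checkpoint(x):
    for blk in blocks:
        x = checkpoint(blk, x, use_reentrant=False)
    return x

x = torch.randn(8, 512, requires_grad=True)
out = forward_with_checkpoint(x)
out.sum().backward()
</code></pre></div></div>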

<h3 id="物理架构">物理架构</h3>

<p>物理架构主要是 <strong>GPU架构</strong>,即:单机单卡、单机多卡、多机单卡、多机多卡(最典型)</p>
<ul>
  <li>单机单卡:常规操作</li>
  <li>单机<strong>多卡</strong>:利用一台机器上的多块GPU进行分布式训练。数据并行和模型并行皆可。整个训练过程一般只有一个进程,多GPU之间的通信通过多线程的方式,模型参数和梯度在进程内是共享的(基于NCCL的可能不大一样)。这种情况下基于Reduce的架构比PS架构更合适一些,因为不需要一个显式的PS,通过进程内的Reduce即可完成梯度同步。</li>
  <li><strong>多机</strong>单卡:操作上与多机多卡基本一致</li>
  <li>多机<strong>多卡</strong>:多机多卡是最典型的分布式架构,所以它需要较好的进程间的通讯机制(多worker之间的通信)。</li>
</ul>

<p>内容:</p>
<ul>
  <li>分布式训练的基本原理</li>
  <li>TensorFlow的分布式训练</li>
  <li>PyTorch的分布式训练框架</li>
  <li>Horovod分布式训练</li>
</ul>

<h2 id="分布式实现">分布式实现</h2>

<p>超大规模语言模型主要有两条技术路线:</p>
<ul>
  <li>(1) <code class="language-plaintext highlighter-rouge">TPU</code> + <code class="language-plaintext highlighter-rouge">XLA</code> + <code class="language-plaintext highlighter-rouge">TensorFlow</code>/<code class="language-plaintext highlighter-rouge">JAX</code> : Google主导,由于TPU和自家云平台GCP深度绑定</li>
  <li>(2) <code class="language-plaintext highlighter-rouge">GPU</code> + <code class="language-plaintext highlighter-rouge">PyTorch</code> + <code class="language-plaintext highlighter-rouge">Megatron-LM</code> + <code class="language-plaintext highlighter-rouge">DeepSpeed</code>: NVIDIA、Meta、MS大厂加持,社区氛围活跃</li>
</ul>

<p>路线(1)对于非Googler只可远观而不可把玩,路线(2)更受群众欢迎。</p>

<h3 id="tf分布式训练方法">TF分布式训练方法</h3>

<ul>
  <li>黄文坚的<a href="https://blog.csdn.net/CodeMaster_/article/details/76223835">Tensorflow分布式实战</a></li>
</ul>

<p>TensorFlow主要的分布式训练的方法有三种:</p>
<ol>
  <li>Custom Train Loop:最原始的方式,由工程师自己编写训练循环</li>
  <li>Estimator + Strategy:高级API,不用关心底层硬件</li>
  <li>Keras + Strategy:最新出的keras的高级API</li>
</ol>

<blockquote>
  <ul>
    <li>实际的开发工作中,分布式的工作最好是交给框架,而工程师本身只需要关注任务模型的pipeline就行了。</li>
    <li>最经典的是Spark框架,工程师只需要关注数据处理的workflow,分布式的大部分工作都交给框架。深度学习的开发同样如此。</li>
  </ul>
</blockquote>

<p>各种方式评价</p>
<ul>
  <li>第一种方式太过原生,整个分布式的训练过程完全交给工程师来处理,代码模块比较复杂,这里不做赘述。</li>
  <li>第二种方式,Estimator是TF的一个高级API,在分布式场景下,其最大的特点是<strong>单机和分布式代码一致</strong>,且不需要考虑底层的硬件设施。Strategy是tensorflow根据分布式训练的复杂性,抽象出的多种分布式训练策略。TF1.x和TF2.x接口变化较大,不同版本名字可能不一样,以实际使用版本为准。用的比较多的是:
    <ul>
      <li><strong>MirroredStrategy</strong>:适用于单机多卡、数据并行、同步更新的分布式训练,采用Reduce的更新范式,worker之间采用NCCL进行通信。</li>
      <li><strong>MultiWorkerMirrored</strong>Strategy:与上面的类似,不同的是这种策略支持多机多卡、数据并行、同步更新的分布式策略、Reduce范式。在TF 1.15版本里,这个策略叫CollectiveAllReduceStrategy。</li>
      <li><strong>ParameterServer</strong>Strategy:经典的PS架构,多机多卡、数据并行、同步/异步更新</li>
      <li>使用Estimator+Strategy 实现分布式训练,参考<a href="https://github.com/kubeflow/tf-operator/blob/master/examples/v1/distribution_strategy/estimator-API/keras_model_to_estimator.py">代码</a></li>
    </ul>
  </li>
  <li>第三种方式 Keras + Strategy 是Tensorflow最新官方推荐的方案。主要是利用keras的高级API,配合Strategy实现多模式的分布式训练。
    <ul>
      <li><a href="https://github.com/kubeflow/tf-operator/blob/master/examples/v1/distribution_strategy/keras-API/multi_worker_strategy-with-keras.py">代码</a></li>
    </ul>
  </li>
</ul>

<p>后两种方法都需要传入TF_CONFIG参数,没有就是单机的训练方式。Strategy会自动读取环境变量并应用相关信息。</p>

<p>TF_CONFIG的配置如下</p>
<ul>
  <li><img src="https://pic2.zhimg.com/80/v2-dc8c2f647b9e359661e2a6f288ac1525_1440w.jpg" alt="" /></li>
</ul>
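<p>TF_CONFIG 本质是一个 JSON 字符串形式的环境变量,大致如下(主机名与端口仅为假设的示例):</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import json
import os

# 两个 worker 组成的集群, 当前进程是 index 为 0 的 worker
os.environ["TF_CONFIG"] = json.dumps({
    "cluster": {
        "worker": ["host1.example.com:2222", "host2.example.com:2222"]
    },
    "task": {"type": "worker", "index": 0}
})
</code></pre></div></div>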

<h3 id="单机单卡">单机单卡</h3>

<p>单机单卡是最普通的情况,当然也是最简单的。</p>

<p>使用步骤</p>
<ul>
  <li>检查可用GPU数量</li>
  <li>获取一个GPU实例</li>
  <li>迁移:将 数据/模型 推送到GPU上</li>
</ul>

<h4 id="tf">TF</h4>

<p>示例代码如下:</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c1">#coding=utf-8
#单机单卡,对于单机单卡,可以把参数和计算都定义在gpu上;不过如果模型参数比较大、显存不足,就得放在cpu上
</span><span class="kn">import</span>  <span class="nn">tensorflow</span> <span class="k">as</span> <span class="n">tf</span>
<span class="k">with</span> <span class="n">tf</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="s">'/cpu:0'</span><span class="p">):</span><span class="c1">#也可以放在gpu上
</span>    <span class="n">w</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">get_variable</span><span class="p">(</span><span class="s">'w'</span><span class="p">,(</span><span class="mi">2</span><span class="p">,</span><span class="mi">2</span><span class="p">),</span><span class="n">tf</span><span class="p">.</span><span class="n">float32</span><span class="p">,</span><span class="n">initializer</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">constant_initializer</span><span class="p">(</span><span class="mi">2</span><span class="p">))</span>
    <span class="n">b</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">get_variable</span><span class="p">(</span><span class="s">'b'</span><span class="p">,(</span><span class="mi">2</span><span class="p">,</span><span class="mi">2</span><span class="p">),</span><span class="n">tf</span><span class="p">.</span><span class="n">float32</span><span class="p">,</span><span class="n">initializer</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">constant_initializer</span><span class="p">(</span><span class="mi">5</span><span class="p">))</span>
<span class="k">with</span> <span class="n">tf</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="s">'/gpu:0'</span><span class="p">):</span>
    <span class="n">addwb</span><span class="o">=</span><span class="n">w</span><span class="o">+</span><span class="n">b</span>
    <span class="n">mutwb</span><span class="o">=</span><span class="n">w</span><span class="o">*</span><span class="n">b</span>
<span class="n">init</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">initialize_all_variables</span><span class="p">()</span>
<span class="k">with</span> <span class="n">tf</span><span class="p">.</span><span class="n">Session</span><span class="p">()</span> <span class="k">as</span> <span class="n">sess</span><span class="p">:</span>
    <span class="n">sess</span><span class="p">.</span><span class="n">run</span><span class="p">(</span><span class="n">init</span><span class="p">)</span>
    <span class="n">np1</span><span class="p">,</span><span class="n">np2</span><span class="o">=</span><span class="n">sess</span><span class="p">.</span><span class="n">run</span><span class="p">([</span><span class="n">addwb</span><span class="p">,</span><span class="n">mutwb</span><span class="p">])</span>
    <span class="k">print</span> <span class="n">np1</span><span class="p">,</span><span class="n">np2</span>
</code></pre></div></div>

<h4 id="pytorch">PyTorch</h4>

<p>pytorch实现</p>
<ul>
  <li>封装程度非常高,只需保证即将被推到 GPU 的数据是<code class="language-plaintext highlighter-rouge">张量</code>(Tensor)或者<code class="language-plaintext highlighter-rouge">模型</code>(Module),就可以用 <code class="language-plaintext highlighter-rouge">to()</code> 函数快速进行实现。</li>
</ul>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">torch</span>
<span class="kn">from</span> <span class="nn">torch</span> <span class="kn">import</span> <span class="n">nn</span>

<span class="n">data</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">ones</span><span class="p">((</span><span class="mi">3</span><span class="p">,</span> <span class="mi">3</span><span class="p">))</span> <span class="c1"># 定义数据(张量)
</span><span class="k">print</span><span class="p">(</span><span class="n">data</span><span class="p">.</span><span class="n">device</span><span class="p">)</span>
<span class="n">net</span> <span class="o">=</span> <span class="n">nn</span><span class="p">.</span><span class="n">Sequential</span><span class="p">(</span><span class="n">nn</span><span class="p">.</span><span class="n">Linear</span><span class="p">(</span><span class="mi">3</span><span class="p">,</span> <span class="mi">3</span><span class="p">))</span> <span class="c1"># 定义模型
</span>
<span class="k">print</span><span class="p">(</span><span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">is_available</span><span class="p">())</span>     <span class="c1"># 判断当前的机器是否有可用的 GPU
</span><span class="k">print</span><span class="p">(</span><span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">device_count</span><span class="p">())</span>     <span class="c1"># 目前可用的 GPU 的数量。
# 使用第一块GPU
</span><span class="n">device</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="s">"cuda:0"</span> <span class="k">if</span> <span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">is_available</span><span class="p">()</span> <span class="k">else</span> <span class="s">"cpu"</span><span class="p">)</span> <span class="c1"># cuda: 0 表示使用的是第一块 GPU。当然可以不用声明“:0”,默认就从第一块开始
</span><span class="k">print</span><span class="p">(</span><span class="n">device</span><span class="p">)</span> <span class="c1"># cpu 或 0
# 数据迁移:将data推到(迁移)gpu上
</span><span class="n">data_gpu</span> <span class="o">=</span> <span class="n">data</span><span class="p">.</span><span class="n">to</span><span class="p">(</span><span class="n">device</span><span class="p">)</span>
<span class="k">print</span><span class="p">(</span><span class="n">data_gpu</span><span class="p">.</span><span class="n">device</span><span class="p">)</span>
<span class="c1"># 模型迁移:model推到gpu
</span><span class="n">net</span><span class="p">.</span><span class="n">to</span><span class="p">(</span><span class="n">device</span><span class="p">)</span>
</code></pre></div></div>

<h3 id="单机多卡">单机多卡</h3>

<h4 id="tf-1">TF</h4>

<ul>
  <li>单机多卡,只要用device直接指定设备,就可以进行训练;SGD 更新时采用各个卡梯度的平均值</li>
  <li>问题:除了取均值,还有别的方式吗?</li>
</ul>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c1">#coding=utf-8
#单机多卡:一般采用共享操作定义在cpu上,然后并行操作定义在各自的gpu上,比如对于深度学习来说,我们一般把参数定义、参数梯度更新统一放在cpu上,各个gpu通过各自计算各自batch数据的梯度值,然后统一传到cpu上,由cpu计算求取平均值,cpu更新参数。具体的深度学习多卡训练代码,请参考:https://github.com/tensorflow/models/blob/master/inception/inception/inception_train.py
</span><span class="kn">import</span>  <span class="nn">tensorflow</span> <span class="k">as</span> <span class="n">tf</span>
  
<span class="k">with</span> <span class="n">tf</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="s">'/cpu:0'</span><span class="p">):</span>
    <span class="n">w</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">get_variable</span><span class="p">(</span><span class="s">'w'</span><span class="p">,(</span><span class="mi">2</span><span class="p">,</span><span class="mi">2</span><span class="p">),</span><span class="n">tf</span><span class="p">.</span><span class="n">float32</span><span class="p">,</span><span class="n">initializer</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">constant_initializer</span><span class="p">(</span><span class="mi">2</span><span class="p">))</span>
    <span class="n">b</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">get_variable</span><span class="p">(</span><span class="s">'b'</span><span class="p">,(</span><span class="mi">2</span><span class="p">,</span><span class="mi">2</span><span class="p">),</span><span class="n">tf</span><span class="p">.</span><span class="n">float32</span><span class="p">,</span><span class="n">initializer</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">constant_initializer</span><span class="p">(</span><span class="mi">5</span><span class="p">))</span>
<span class="k">with</span> <span class="n">tf</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="s">'/gpu:0'</span><span class="p">):</span>
    <span class="n">addwb</span><span class="o">=</span><span class="n">w</span><span class="o">+</span><span class="n">b</span>
<span class="k">with</span> <span class="n">tf</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="s">'/gpu:1'</span><span class="p">):</span>
    <span class="n">mutwb</span><span class="o">=</span><span class="n">w</span><span class="o">*</span><span class="n">b</span>
  
<span class="n">ini</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">initialize_all_variables</span><span class="p">()</span>
<span class="k">with</span> <span class="n">tf</span><span class="p">.</span><span class="n">Session</span><span class="p">()</span> <span class="k">as</span> <span class="n">sess</span><span class="p">:</span>
    <span class="n">sess</span><span class="p">.</span><span class="n">run</span><span class="p">(</span><span class="n">ini</span><span class="p">)</span>
    <span class="k">while</span> <span class="mi">1</span><span class="p">:</span>
        <span class="k">print</span> <span class="n">sess</span><span class="p">.</span><span class="n">run</span><span class="p">([</span><span class="n">addwb</span><span class="p">,</span><span class="n">mutwb</span><span class="p">])</span>
</code></pre></div></div>
<ul>
  <li>若要在多个 GPU 上运行 TensorFlow,则可以采用多塔(multi-tower)方式构建模型,其中每个塔都会分配给不同 GPU。例如:</li>
</ul>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c1"># Creates a graph.
</span><span class="n">c</span> <span class="o">=</span> <span class="p">[]</span>
<span class="k">for</span> <span class="n">d</span> <span class="ow">in</span> <span class="p">[</span><span class="s">'/device:GPU:2'</span><span class="p">,</span> <span class="s">'/device:GPU:3'</span><span class="p">]:</span>
  <span class="k">with</span> <span class="n">tf</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="n">d</span><span class="p">):</span>
    <span class="n">a</span> <span class="o">=</span> <span class="n">tf</span><span class="p">.</span><span class="n">constant</span><span class="p">([</span><span class="mf">1.0</span><span class="p">,</span> <span class="mf">2.0</span><span class="p">,</span> <span class="mf">3.0</span><span class="p">,</span> <span class="mf">4.0</span><span class="p">,</span> <span class="mf">5.0</span><span class="p">,</span> <span class="mf">6.0</span><span class="p">],</span> <span class="n">shape</span><span class="o">=</span><span class="p">[</span><span class="mi">2</span><span class="p">,</span> <span class="mi">3</span><span class="p">])</span>
    <span class="n">b</span> <span class="o">=</span> <span class="n">tf</span><span class="p">.</span><span class="n">constant</span><span class="p">([</span><span class="mf">1.0</span><span class="p">,</span> <span class="mf">2.0</span><span class="p">,</span> <span class="mf">3.0</span><span class="p">,</span> <span class="mf">4.0</span><span class="p">,</span> <span class="mf">5.0</span><span class="p">,</span> <span class="mf">6.0</span><span class="p">],</span> <span class="n">shape</span><span class="o">=</span><span class="p">[</span><span class="mi">3</span><span class="p">,</span> <span class="mi">2</span><span class="p">])</span>
    <span class="n">c</span><span class="p">.</span><span class="n">append</span><span class="p">(</span><span class="n">tf</span><span class="p">.</span><span class="n">matmul</span><span class="p">(</span><span class="n">a</span><span class="p">,</span> <span class="n">b</span><span class="p">))</span>
<span class="k">with</span> <span class="n">tf</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="s">'/cpu:0'</span><span class="p">):</span>
  <span class="nb">sum</span> <span class="o">=</span> <span class="n">tf</span><span class="p">.</span><span class="n">add_n</span><span class="p">(</span><span class="n">c</span><span class="p">)</span>
<span class="c1"># Creates a session with log_device_placement set to True.
</span><span class="n">sess</span> <span class="o">=</span> <span class="n">tf</span><span class="p">.</span><span class="n">Session</span><span class="p">(</span><span class="n">config</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">ConfigProto</span><span class="p">(</span><span class="n">log_device_placement</span><span class="o">=</span><span class="bp">True</span><span class="p">))</span>
<span class="c1"># Runs the op.
</span><span class="k">print</span><span class="p">(</span><span class="n">sess</span><span class="p">.</span><span class="n">run</span><span class="p">(</span><span class="nb">sum</span><span class="p">))</span>
</code></pre></div></div>

<ul>
  <li>【2020-5-20】每个GPU的梯度要收集起来求平均,再统一更新,如下所示</li>
</ul>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code>        <span class="c1"># train op def
</span>        <span class="n">tower_grads</span> <span class="o">=</span> <span class="p">[]</span>
        <span class="k">for</span> <span class="n">i</span> <span class="ow">in</span> <span class="nb">xrange</span><span class="p">(</span><span class="n">FLAGS</span><span class="p">.</span><span class="n">num_gpus</span><span class="p">):</span>
            <span class="k">with</span> <span class="n">tf</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="s">'/gpu:{}'</span><span class="p">.</span><span class="nb">format</span><span class="p">(</span><span class="n">i</span><span class="p">)):</span>
                <span class="k">with</span> <span class="n">tf</span><span class="p">.</span><span class="n">name_scope</span><span class="p">(</span><span class="s">'tower_{}'</span><span class="p">.</span><span class="nb">format</span><span class="p">(</span><span class="n">i</span><span class="p">)):</span>
                    <span class="n">next_batch</span> <span class="o">=</span> <span class="n">dhs</span><span class="p">.</span><span class="n">get_next_batch</span><span class="p">()</span>
                    <span class="n">cnn</span><span class="p">.</span><span class="n">inference</span><span class="p">(</span>
                        <span class="n">next_batch</span><span class="p">[</span><span class="mi">0</span><span class="p">],</span> <span class="n">next_batch</span><span class="p">[</span><span class="mi">1</span><span class="p">],</span> <span class="n">next_batch</span><span class="p">[</span><span class="mi">2</span><span class="p">],</span>
                        <span class="n">dropout_keep_prob</span><span class="o">=</span><span class="n">FLAGS</span><span class="p">.</span><span class="n">dropout_keep_prob</span><span class="p">,</span>
                        <span class="n">input_dropout_keep_prob</span><span class="o">=</span><span class="n">FLAGS</span><span class="p">.</span><span class="n">input_dropout_keep_prob</span><span class="p">,</span>
                        <span class="n">phase_train</span><span class="o">=</span><span class="bp">True</span><span class="p">)</span>
                    <span class="n">grads</span> <span class="o">=</span> <span class="n">optimizer</span><span class="p">.</span><span class="n">compute_gradients</span><span class="p">(</span><span class="n">cnn</span><span class="p">.</span><span class="n">loss</span><span class="p">)</span>
                    <span class="n">tower_grads</span><span class="p">.</span><span class="n">append</span><span class="p">(</span><span class="n">grads</span><span class="p">)</span>
        <span class="n">grads</span> <span class="o">=</span> <span class="n">average_gradients</span><span class="p">(</span><span class="n">tower_grads</span><span class="p">)</span>
        <span class="n">train_op</span> <span class="o">=</span> <span class="n">optimizer</span><span class="p">.</span><span class="n">apply_gradients</span><span class="p">(</span><span class="n">grads</span><span class="p">,</span> <span class="n">global_step</span><span class="o">=</span><span class="n">global_step</span><span class="p">)</span>

<span class="k">def</span> <span class="nf">average_gradients</span><span class="p">(</span><span class="n">tower_grads</span><span class="p">):</span>
    <span class="s">"""
    Calculate the average gradient for each shared variable across all towers.
    Note that this function provides a synchronization point across all towers.
    NOTE: This function is copied from cifar codes in tensorflow tutorial with minor
    modification.
    Args:
        tower_grads: List of lists of (gradient, variable) tuples. The outer list
            is over individual gradients. The inner list is over the gradient
            calculation for each tower.
    Returns:
       List of pairs of (gradient, variable) where the gradient has been averaged
       across all towers.
    """</span>
    <span class="n">average_grads</span> <span class="o">=</span> <span class="p">[]</span>
    <span class="k">for</span> <span class="n">grad_and_vars</span> <span class="ow">in</span> <span class="nb">zip</span><span class="p">(</span><span class="o">*</span><span class="n">tower_grads</span><span class="p">):</span>
        <span class="c1"># Note that each grad_and_vars looks like the following:
</span>        <span class="c1">#   ((grad0_gpu0, var0_gpu0), ... , (grad0_gpuN, var0_gpuN))
</span>        <span class="n">grads</span> <span class="o">=</span> <span class="p">[]</span>
        <span class="k">for</span> <span class="n">g</span><span class="p">,</span> <span class="n">_</span> <span class="ow">in</span> <span class="n">grad_and_vars</span><span class="p">:</span>
            <span class="c1"># Add 0 dimension to the gradients to represent the tower.
</span>            <span class="c1"># NOTE: if batch norm applied, the grad of conv-maxpool-n/b will be
</span>            <span class="c1">#       None
</span>            <span class="k">if</span> <span class="n">g</span> <span class="ow">is</span> <span class="bp">None</span><span class="p">:</span>
                <span class="k">continue</span>
            <span class="n">expanded_g</span> <span class="o">=</span> <span class="n">tf</span><span class="p">.</span><span class="n">expand_dims</span><span class="p">(</span><span class="n">g</span><span class="p">,</span> <span class="mi">0</span><span class="p">)</span>

            <span class="c1"># Append on a 'tower' dimension which we will average over below.
</span>            <span class="n">grads</span><span class="p">.</span><span class="n">append</span><span class="p">(</span><span class="n">expanded_g</span><span class="p">)</span>

        <span class="c1"># Average over the 'tower' dimension.
</span>        <span class="k">if</span> <span class="n">grads</span><span class="p">:</span>
            <span class="n">grad</span> <span class="o">=</span> <span class="n">tf</span><span class="p">.</span><span class="n">concat</span><span class="p">(</span><span class="n">axis</span><span class="o">=</span><span class="mi">0</span><span class="p">,</span> <span class="n">values</span><span class="o">=</span><span class="n">grads</span><span class="p">)</span>
            <span class="n">grad</span> <span class="o">=</span> <span class="n">tf</span><span class="p">.</span><span class="n">reduce_mean</span><span class="p">(</span><span class="n">grad</span><span class="p">,</span> <span class="mi">0</span><span class="p">)</span>
        <span class="k">else</span><span class="p">:</span>
            <span class="n">grad</span> <span class="o">=</span> <span class="bp">None</span>

        <span class="c1"># Keep in mind that the Variables are redundant because they are shared
</span>        <span class="c1"># across towers. So .. we will just return the first tower's pointer to
</span>        <span class="c1"># the Variable.
</span>        <span class="n">v</span> <span class="o">=</span> <span class="n">grad_and_vars</span><span class="p">[</span><span class="mi">0</span><span class="p">][</span><span class="mi">1</span><span class="p">]</span>
        <span class="n">grad_and_var</span> <span class="o">=</span> <span class="p">(</span><span class="n">grad</span><span class="p">,</span> <span class="n">v</span><span class="p">)</span>
        <span class="n">average_grads</span><span class="p">.</span><span class="n">append</span><span class="p">(</span><span class="n">grad_and_var</span><span class="p">)</span>
    <span class="k">return</span> <span class="n">average_grads</span>
</code></pre></div></div>

<ul>
  <li>参考官网:<a href="https://jhui.github.io/2017/03/07/TensorFlow-GPU/">TensorFlow with multiple GPUs</a></li>
</ul>

<h4 id="pytorch-1">PyTorch</h4>

<p>PyTorch 多种解决方案,最简单常用:nn.DataParallel()</p>
<ul>
  <li>module :定义的模型</li>
  <li>device_ids 即为训练模型时用到的 GPU 设备号,</li>
  <li>output_device 表示输出结果的 device,默认为 0 也就是第一块卡。</li>
</ul>

<p>工作过程</p>
<ul>
  <li><img src="https://pic3.zhimg.com/80/v2-8cba3ef61df28c56c250afef9ca1f2c2_1440w.webp" alt="" /></li>
  <li>在每次迭代的Forward过程中:nn.DataParallel自动将输入按照GPU数量进行split;然后把模型参数复制到各个GPU上;各卡分别正向计算得到网络输出output_x;最后将各卡结果concat拼接到一起,送回0号卡。</li>
  <li>在Backward过程中:先由0号卡计算loss,通过loss.backward()得到loss相对于各个GPU输出结果的梯度grad_l1 … grad_ln;接着0号卡把grad_i分发回对应的GPU_i;各GPU分别继续backward,得到各自的模型参数梯度grad_m1 … grad_mn;最后所有参数梯度汇总到0号卡进行update。</li>
</ul>

<p>多卡训练时,output_device 的卡所占的显存明显大一些。</p>
<ul>
  <li>因为使用 DataParallel 时,数据并行,每张卡获得的数据都一样多,但是所有卡的 loss 都会在第 output_device 块 GPU 进行计算,这导致了 output_device 卡的负载进一步增加。</li>
  <li><img src="https://pic3.zhimg.com/80/v2-51e78bd8c116d68b0901f4257523feae_1440w.webp" alt="" /></li>
</ul>

<p>只需要一个 DataParallel 函数就可以将模型和数据分发到多个 GPU 上。</p>
<ul>
  <li>但是还是需要了解这内部的运行逻辑, 遇到了诸如时间计算、资源预估、优化调试问题的时候,可以更好地运用 GPU</li>
</ul>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">os</span>
<span class="kn">from</span> <span class="nn">torch</span> <span class="kn">import</span> <span class="n">nn</span>
<span class="kn">import</span> <span class="nn">torch</span>

<span class="k">class</span> <span class="nc">ASimpleNet</span><span class="p">(</span><span class="n">nn</span><span class="p">.</span><span class="n">Module</span><span class="p">):</span>
    <span class="k">def</span> <span class="nf">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">layers</span><span class="o">=</span><span class="mi">3</span><span class="p">):</span>
        <span class="nb">super</span><span class="p">(</span><span class="n">ASimpleNet</span><span class="p">,</span> <span class="bp">self</span><span class="p">).</span><span class="n">__init__</span><span class="p">()</span>
        <span class="bp">self</span><span class="p">.</span><span class="n">linears</span> <span class="o">=</span> <span class="n">nn</span><span class="p">.</span><span class="n">ModuleList</span><span class="p">([</span><span class="n">nn</span><span class="p">.</span><span class="n">Linear</span><span class="p">(</span><span class="mi">3</span><span class="p">,</span> <span class="mi">3</span><span class="p">,</span> <span class="n">bias</span><span class="o">=</span><span class="bp">False</span><span class="p">)</span> <span class="k">for</span> <span class="n">i</span> <span class="ow">in</span> <span class="nb">range</span><span class="p">(</span><span class="n">layers</span><span class="p">)])</span>   <span class="c1"># 设备有几个,就创建几个模型分支,
</span>    <span class="k">def</span> <span class="nf">forward</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">x</span><span class="p">):</span>     <span class="c1"># 模型前馈实际处理过程
</span>        <span class="k">print</span><span class="p">(</span><span class="s">"forward batchsize is: {}"</span><span class="p">.</span><span class="nb">format</span><span class="p">(</span><span class="n">x</span><span class="p">.</span><span class="n">size</span><span class="p">()[</span><span class="mi">0</span><span class="p">]))</span>
        <span class="n">x</span> <span class="o">=</span> <span class="bp">self</span><span class="p">.</span><span class="n">linears</span><span class="p">(</span><span class="n">x</span><span class="p">)</span>
        <span class="n">x</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">relu</span><span class="p">(</span><span class="n">x</span><span class="p">)</span>
        <span class="k">return</span> <span class="n">x</span>

<span class="n">device</span><span class="o">=</span><span class="n">os</span><span class="p">.</span><span class="n">environ</span><span class="p">[</span><span class="s">'CUDA_VISIBLE_DEVICES'</span><span class="p">]</span>  
<span class="c1"># os.environ['CUDA_VISIBLE_DEVICES']="0,2"  指定具体的设备
# print("CUDA_VISIBLE_DEVICES :{}".format(os.environ["CUDA_VISIBLE_DEVICES"]))
</span>
<span class="n">batch_size</span> <span class="o">=</span> <span class="mi">16</span>
<span class="n">inputs</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">randn</span><span class="p">(</span><span class="n">batch_size</span><span class="p">,</span> <span class="mi">3</span><span class="p">)</span>            <span class="c1"># 创建16个数据
</span><span class="n">labels</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">randn</span><span class="p">(</span><span class="n">batch_size</span><span class="p">,</span> <span class="mi">3</span><span class="p">)</span>            <span class="c1"># 创建16个数据标签
</span><span class="n">inputs</span><span class="p">,</span> <span class="n">labels</span> <span class="o">=</span> <span class="n">inputs</span><span class="p">.</span><span class="n">to</span><span class="p">(</span><span class="n">device</span><span class="p">),</span> <span class="n">labels</span><span class="p">.</span><span class="n">to</span><span class="p">(</span><span class="n">device</span><span class="p">)</span>      <span class="c1"># 数据迁移到设备上,返回数据总接口(应该是一个列表/字典,数据片段-GPU对应关系)
</span><span class="n">net</span> <span class="o">=</span> <span class="n">ASimpleNet</span><span class="p">()</span>                             <span class="c1"># 模型实例化
</span><span class="n">net</span> <span class="o">=</span> <span class="n">nn</span><span class="p">.</span><span class="n">DataParallel</span><span class="p">(</span><span class="n">net</span><span class="p">)</span>                     <span class="c1"># 模型分布结构化
</span><span class="n">net</span><span class="p">.</span><span class="n">to</span><span class="p">(</span><span class="n">device</span><span class="p">)</span>                                 <span class="c1"># 模型迁移到设备上,返回一个模型总接口(应该是一个列表/字典,子模型-GPU对应关系)
</span><span class="k">for</span> <span class="n">epoch</span> <span class="ow">in</span> <span class="nb">range</span><span class="p">(</span><span class="mi">1</span><span class="p">):</span>       <span class="c1"># 训练次数自行决定
</span>    <span class="n">outputs</span> <span class="o">=</span> <span class="n">net</span><span class="p">(</span><span class="n">inputs</span><span class="p">)</span>    <span class="c1">#  数据统一入口;数据怎么分配,模型参数怎么同步,内部机制自行来处理
# 输出:
# CUDA_VISIBLE_DEVICES : 3, 2, 1, 0
# forward batchsize is: 4
# forward batchsize is: 4
# forward batchsize is: 4
# forward batchsize is: 4
</span></code></pre></div></div>

<p>注意:nn.DataParallel 会把同一个模型完整复制到每块可见 GPU 上,再按 batch 维度切分输入数据;模型里有多少个子层与 GPU 数量无关,不需要按 GPU 数去建分支。</p>

<p>由 CUDA_VISIBLE_DEVICES 可知当前程序可见的 GPU 数量为 4,而创建的 batch size 为 16;每个 GPU 上模型 forward 函数内部的 print 输出,验证了每块卡分到的数据量都是 4 条。</p>
<ul>
  <li>DataParallel 会自动将数据切分、加载到相应 GPU,将模型复制到相应 GPU,进行正向传播计算梯度并汇总。</li>
</ul>

<p>提示</p>
<ul>
  <li>DataParallel的整个并行训练过程利用python多线程实现</li>
</ul>

<p>由以上工作过程分析可知,nn.DataParallel 无法避免的问题:</p>
<ul>
  <li><strong>负载不均衡</strong>问题。gpu_0所承担的任务明显要重于其他gpu</li>
  <li><strong>速度</strong>问题。每个iteration都需要复制模型且均从GPU0卡向其他GPU复制,通讯任务重且效率低;python多线程GIL锁导致的线程颠簸(thrashing)问题。</li>
  <li>只能<strong>单机</strong>运行。由于单进程的约束导致。</li>
  <li>只能切分batch到多GPU,而无法让一个model分布在多个GPU上。当一个模型过大,设置batchsize=1时其显存占用仍然大于单张显卡显存,此时就无法使用DataParallel类进行训练。</li>
</ul>

<p>因此官方推荐使用 torch.nn.DistributedDataParallel 替代 nn.DataParallel</p>
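<p>下面给出把 nn.DataParallel 换成 DistributedDataParallel 的最小改动示意(仅为示意代码,假设脚本用 torchrun 启动、LOCAL_RANK 由启动器注入,模型用一个线性层代替):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

# DP 写法: 单进程多线程, 一行包裹即可
# model = torch.nn.DataParallel(model)

# DDP 写法: 每个进程只负责一张卡
dist.init_process_group(backend="nccl")         # 初始化进程组
local_rank = int(os.environ["LOCAL_RANK"])      # torchrun 注入的本地进程编号
torch.cuda.set_device(local_rank)
model = torch.nn.Linear(3, 3).cuda(local_rank)  # 示例模型
model = DDP(model, device_ids=[local_rank])     # 替代 nn.DataParallel 的那一行
</code></pre></div></div>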

<h3 id="多机多卡">多机多卡</h3>

<p>一、基本概念</p>
<ul>
  <li>Cluster、Job、task概念:三者可以简单的看成是层次关系</li>
  <li>task相当于每台机器上的一个进程,多个task组成job;</li>
  <li>job又有两种:ps参数服务、worker计算服务,组成cluster。</li>
</ul>

<p>二、同步SGD与异步SGD</p>
<ul>
  <li>1、<strong>同步更新</strong>:各个用于并行计算的电脑,计算完各自的batch 后,求取梯度值,把梯度值统一送到ps服务机器中,由ps服务机器求取梯度平均值,更新ps服务器上的参数。
    <ul>
      <li>如下图所示,可以看成有四台电脑,第一台电脑用于存储参数、共享参数、共享计算,可以简单的理解成内存、计算共享专用的区域,也就是ps job;另外三台电脑用于并行计算的,也就是worker task。</li>
      <li>这种计算方法存在的缺陷是:每一轮的梯度更新,都要等到A、B、C三台电脑都计算完毕后,才能更新参数,也就是迭代更新速度取决于A、B、C三台中最慢的那一台电脑,所以采用同步更新时,建议A、B、C三台的计算能力相当。</li>
    </ul>
  </li>
  <li>2、<strong>异步更新</strong>:ps服务器只要收到任意一台机器的梯度值,就直接进行参数更新,无需等待其它机器。这种迭代方式比较不稳定,收敛曲线震荡比较厉害,因为当A机器计算完并更新了ps中的参数时,B机器可能还在用上一次迭代的旧参数计算梯度。两种更新方式的差异可参考下面的小例子。</li>
</ul>
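<p>用一个纯 Python 小例子直观对比同步与异步两种更新方式(仅为示意,不涉及真实网络与通信,worker 的梯度用随机数代替):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import random

w, lr = 0.0, 0.1            # ps 上的共享参数与学习率

def worker_grad(w):
    # 模拟一台 worker 基于当前参数算出的梯度
    return w - random.gauss(1.0, 0.5)

# 同步更新: 等 A、B、C 三台 worker 的梯度都到齐, 求平均后更新一次
grads = [worker_grad(w) for _ in range(3)]
w -= lr * sum(grads) / len(grads)

# 异步更新: 任意一台 worker 的梯度一到就立刻更新
for _ in range(3):
    g = worker_grad(w)      # 真实场景中 g 可能是基于旧版参数算出来的, 因而收敛曲线震荡较大
    w -= lr * g
</code></pre></div></div>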

<h4 id="多机多卡讲解">多机多卡讲解</h4>

<p>【2024-4-18】<a href="https://zhuanlan.zhihu.com/p/693040848">大模型多机多卡训练经验总结</a></p>

<p>LLM多机多卡训练教程好少,有些还拿 <code class="language-plaintext highlighter-rouge">torch.distributed.launch</code> 来做,殊不知早就改用 <code class="language-plaintext highlighter-rouge">torchrun</code> 了。</p>

<p>环境准备: 以2台机器为例</p>
<ul>
  <li>首先, 2台机器要能<strong>免密登录</strong>,编辑/etc/hosts文件,加入node信息:</li>
</ul>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c"># vi /etc/hosts</span>
ip1 node01
ip2 node02
</code></pre></div></div>

<p>然后, 两个node分别执行以下操作, 生成私钥和公钥:</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>ssh-keygen <span class="nt">-t</span> rsa
</code></pre></div></div>

<p>然后, 全部回车,采用默认值。再互相拷贝公钥:</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>ssh-copy-id root@ip1
ssh-copy-id root@ip2
</code></pre></div></div>

<p>分别在2台机器上试试互相ssh,如果无密码输入要求直接登录到另一台服务器则说明配置成功。</p>

<p>2台机器环境必须保持一致,包括python版本,训练所需依赖包等。</p>

<p>还需确保安装了pdsh:</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>apt-get <span class="nb">install </span>pdsh
</code></pre></div></div>

<p>多机训练</p>

<p>使用 torchrun,毕竟单张GPU有80G显存,7B模型单卡完全放得下。</p>
<ul>
  <li>假设node01为master,node02需要有相同的模型权重和代码,可以直接在master用scp拷贝过去。</li>
</ul>

<p>准备工作完成后, 可以启动训练命令</p>
<ul>
  <li>首先在node01(master)执行如下命令(非完整,仅供参考,使用deepspeed ZeRO-2):</li>
</ul>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>torchrun <span class="nt">--nproc_per_node</span> 8 <span class="nt">--nnodes</span> 2 <span class="nt">--master_addr</span> <span class="k">${</span><span class="nv">MASTER_ADDR</span><span class="k">}</span> <span class="nt">--master_port</span> 14545 <span class="nt">--node_rank</span> 0 train.py <span class="se">\</span>
  <span class="nt">--deepspeed</span> <span class="k">${</span><span class="nv">deepspeed_config_file</span><span class="k">}</span> <span class="se">\</span>
  ...
</code></pre></div></div>

<p>参数</p>
<ul>
  <li>nproc_per_node表示每个节点的进程数,可以理解为每个节点所需GPU数</li>
  <li>nnodes表示节点数,2台机器就是2个节点</li>
  <li>master_addr为master节点的ip</li>
  <li>node_rank表示当前启动的是第几个节点</li>
</ul>

<p>在node02执行同样命令,但需将node_rank指定为1,不出意外的话可以成功跑通,即便报错可能也是依赖包版本两台机器不一致导致。很快就会在控制台看到transformers打印的日志,但发现save_total_limit只在master上管用。</p>
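<p>torchrun 启动后会为每个进程注入 RANK、LOCAL_RANK、WORLD_SIZE 等环境变量,训练脚本里可以直接读取来确认分布式拓扑(最小示意,环境变量名以 PyTorch 官方文档为准):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import os
import torch.distributed as dist

dist.init_process_group(backend="nccl")        # 从 MASTER_ADDR / MASTER_PORT 等环境变量完成初始化

rank       = int(os.environ["RANK"])           # 全局进程号: 0 ~ world_size-1
local_rank = int(os.environ["LOCAL_RANK"])     # 本机进程号: 0 ~ nproc_per_node-1
world_size = int(os.environ["WORLD_SIZE"])     # 总进程数 = nnodes * nproc_per_node
node_rank  = int(os.environ.get("GROUP_RANK", 0))  # 节点编号, 对应启动参数 --node_rank

print(f"node {node_rank}: rank={rank}, local_rank={local_rank}, world_size={world_size}")
</code></pre></div></div>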

<h4 id="tf-2">TF</h4>

<p>代码编写</p>
<ul>
  <li>1、定义集群</li>
  <li>比如假设上面的图所示,我们有四台电脑,名字假设为:A、B、C、D,那么集群可以定义如下</li>
</ul>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c1">#coding=utf-8
#多台机器,每台机器有一个显卡、或者多个显卡,这种训练叫做分布式训练
</span><span class="kn">import</span>  <span class="nn">tensorflow</span> <span class="k">as</span> <span class="n">tf</span>
<span class="c1">#现在假设我们有A、B、C、D四台机器,首先需要在各台机器上写一份代码,并跑起来,各机器上的代码内容大部分相同
# ,除了开始定义的时候,需要各自指定该台机器的task之外。以机器A为例子,A机器上的代码如下:
</span><span class="n">cluster</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">train</span><span class="p">.</span><span class="n">ClusterSpec</span><span class="p">({</span>
    <span class="s">"worker"</span><span class="p">:</span> <span class="p">[</span>
        <span class="s">"A_IP:2222"</span><span class="p">,</span><span class="c1">#格式 IP地址:端口号,第一台机器A的IP地址 ,在代码中需要用这台机器计算的时候,就要定义:/job:worker/task:0
</span>        <span class="s">"B_IP:1234"</span><span class="c1">#第二台机器的IP地址 /job:worker/task:1
</span>        <span class="s">"C_IP:2222"</span><span class="c1">#第三台机器的IP地址 /job:worker/task:2
</span>    <span class="p">],</span>
    <span class="s">"ps"</span><span class="p">:</span> <span class="p">[</span>
        <span class="s">"D_IP:2222"</span><span class="p">,</span><span class="c1">#第四台机器的IP地址 对应到代码块:/job:ps/task:0
</span>    <span class="p">]})</span>
</code></pre></div></div>

<p>然后需要写四份代码,这四份代码文件大部分相同,但是有几行代码是各不相同的。</p>

<ul>
  <li>2、在各台机器上,定义server
    <ul>
      <li>比如A机器上的代码server要定义如下:
        <div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">server</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">train</span><span class="p">.</span><span class="n">Server</span><span class="p">(</span><span class="n">cluster</span><span class="p">,</span><span class="n">job_name</span><span class="o">=</span><span class="s">'worker'</span><span class="p">,</span><span class="n">task_index</span><span class="o">=</span><span class="mi">0</span><span class="p">)</span><span class="c1">#找到‘worker’名字下的,task0,也就是机器A
</span></code></pre></div>        </div>
      </li>
    </ul>
  </li>
  <li>3、在代码中,指定device
    <div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">with</span> <span class="n">tf</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="s">'/job:ps/task:0'</span><span class="p">):</span><span class="c1">#参数定义在机器D上
</span>  <span class="n">w</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">get_variable</span><span class="p">(</span><span class="s">'w'</span><span class="p">,(</span><span class="mi">2</span><span class="p">,</span><span class="mi">2</span><span class="p">),</span><span class="n">tf</span><span class="p">.</span><span class="n">float32</span><span class="p">,</span><span class="n">initializer</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">constant_initializer</span><span class="p">(</span><span class="mi">2</span><span class="p">))</span>
  <span class="n">b</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">get_variable</span><span class="p">(</span><span class="s">'b'</span><span class="p">,(</span><span class="mi">2</span><span class="p">,</span><span class="mi">2</span><span class="p">),</span><span class="n">tf</span><span class="p">.</span><span class="n">float32</span><span class="p">,</span><span class="n">initializer</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">constant_initializer</span><span class="p">(</span><span class="mi">5</span><span class="p">))</span>
<span class="k">with</span> <span class="n">tf</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="s">'/job:worker/task:0/cpu:0'</span><span class="p">):</span><span class="c1">#在机器A cpu上运行
</span>  <span class="n">addwb</span><span class="o">=</span><span class="n">w</span><span class="o">+</span><span class="n">b</span>
<span class="k">with</span> <span class="n">tf</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="s">'/job:worker/task:1/cpu:0'</span><span class="p">):</span><span class="c1">#在机器B cpu上运行
</span>  <span class="n">mutwb</span><span class="o">=</span><span class="n">w</span><span class="o">*</span><span class="n">b</span>
<span class="k">with</span> <span class="n">tf</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="s">'/job:worker/task:2/cpu:0'</span><span class="p">):</span><span class="c1">#在机器C cpu上运行
</span>  <span class="n">divwb</span><span class="o">=</span><span class="n">w</span><span class="o">/</span><span class="n">b</span>
</code></pre></div>    </div>
  </li>
</ul>

<p>在深度学习训练中,一般图的计算,对于每个worker task来说,都是相同的,所以我们会把所有图计算、变量定义等代码,都写到下面这个语句下:</p>
<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">with</span> <span class="n">tf</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="n">tf</span><span class="p">.</span><span class="n">train</span><span class="p">.</span><span class="n">replica_device_setter</span><span class="p">(</span><span class="n">worker_device</span><span class="o">=</span><span class="s">'/job:worker/task:indexi'</span><span class="p">,</span><span class="n">cluster</span><span class="o">=</span><span class="n">cluster</span><span class="p">))</span>
</code></pre></div></div>

<p>函数replica_device_setter会自动把变量(参数)的定义部分放到ps服务中(如果ps有多个任务,那么自动分配)。下面举个例子,假设现在有两台机器A、B,A用于计算服务,B用于参数服务,那么代码如下:</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c1">#coding=utf-8
#上面是因为worker计算内容各不相同,不过在深度学习中,一般每个worker的计算内容是一样的,
# 因为都是计算神经网络的每个batch 前向传导,所以一般代码是重用的
</span><span class="kn">import</span>  <span class="nn">tensorflow</span> <span class="k">as</span> <span class="n">tf</span>
<span class="c1">#现在假设我们有A、B台机器,首先需要在各台机器上写一份代码,并跑起来,各机器上的代码内容大部分相同
# ,除了开始定义的时候,需要各自指定该台机器的task之外。以机器A为例子,A机器上的代码如下:
</span><span class="n">cluster</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">train</span><span class="p">.</span><span class="n">ClusterSpec</span><span class="p">({</span>
    <span class="s">"worker"</span><span class="p">:</span> <span class="p">[</span>
        <span class="s">"192.168.11.105:1234"</span><span class="p">,</span><span class="c1">#格式 IP地址:端口号,第一台机器A的IP地址 ,在代码中需要用这台机器计算的时候,就要定义:/job:worker/task:0
</span>    <span class="p">],</span>
    <span class="s">"ps"</span><span class="p">:</span> <span class="p">[</span>
        <span class="s">"192.168.11.130:2223"</span><span class="c1">#第四台机器的IP地址 对应到代码块:/job:ps/task:0
</span>    <span class="p">]})</span>
  
<span class="c1">#不同的机器,下面这一行代码各不相同,server可以根据job_name、task_index两个参数,查找到集群cluster中对应的机器
</span>  
<span class="n">isps</span><span class="o">=</span><span class="bp">False</span>
<span class="k">if</span> <span class="n">isps</span><span class="p">:</span>
    <span class="n">server</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">train</span><span class="p">.</span><span class="n">Server</span><span class="p">(</span><span class="n">cluster</span><span class="p">,</span><span class="n">job_name</span><span class="o">=</span><span class="s">'ps'</span><span class="p">,</span><span class="n">task_index</span><span class="o">=</span><span class="mi">0</span><span class="p">)</span><span class="c1">#找到‘worker’名字下的,task0,也就是机器A
</span>    <span class="n">server</span><span class="p">.</span><span class="n">join</span><span class="p">()</span>
<span class="k">else</span><span class="p">:</span>
    <span class="n">server</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">train</span><span class="p">.</span><span class="n">Server</span><span class="p">(</span><span class="n">cluster</span><span class="p">,</span><span class="n">job_name</span><span class="o">=</span><span class="s">'worker'</span><span class="p">,</span><span class="n">task_index</span><span class="o">=</span><span class="mi">0</span><span class="p">)</span><span class="c1">#找到‘worker’名字下的,task0,也就是机器A
</span>    <span class="k">with</span> <span class="n">tf</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="n">tf</span><span class="p">.</span><span class="n">train</span><span class="p">.</span><span class="n">replica_device_setter</span><span class="p">(</span><span class="n">worker_device</span><span class="o">=</span><span class="s">'/job:worker/task:0'</span><span class="p">,</span><span class="n">cluster</span><span class="o">=</span><span class="n">cluster</span><span class="p">)):</span>
        <span class="n">w</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">get_variable</span><span class="p">(</span><span class="s">'w'</span><span class="p">,(</span><span class="mi">2</span><span class="p">,</span><span class="mi">2</span><span class="p">),</span><span class="n">tf</span><span class="p">.</span><span class="n">float32</span><span class="p">,</span><span class="n">initializer</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">constant_initializer</span><span class="p">(</span><span class="mi">2</span><span class="p">))</span>
        <span class="n">b</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">get_variable</span><span class="p">(</span><span class="s">'b'</span><span class="p">,(</span><span class="mi">2</span><span class="p">,</span><span class="mi">2</span><span class="p">),</span><span class="n">tf</span><span class="p">.</span><span class="n">float32</span><span class="p">,</span><span class="n">initializer</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">constant_initializer</span><span class="p">(</span><span class="mi">5</span><span class="p">))</span>
        <span class="n">addwb</span><span class="o">=</span><span class="n">w</span><span class="o">+</span><span class="n">b</span>
        <span class="n">mutwb</span><span class="o">=</span><span class="n">w</span><span class="o">*</span><span class="n">b</span>
        <span class="n">divwb</span><span class="o">=</span><span class="n">w</span><span class="o">/</span><span class="n">b</span>

<span class="n">saver</span> <span class="o">=</span> <span class="n">tf</span><span class="p">.</span><span class="n">train</span><span class="p">.</span><span class="n">Saver</span><span class="p">()</span>
<span class="n">summary_op</span> <span class="o">=</span> <span class="n">tf</span><span class="p">.</span><span class="n">merge_all_summaries</span><span class="p">()</span>
<span class="n">init_op</span> <span class="o">=</span> <span class="n">tf</span><span class="p">.</span><span class="n">initialize_all_variables</span><span class="p">()</span>
<span class="n">sv</span> <span class="o">=</span> <span class="n">tf</span><span class="p">.</span><span class="n">train</span><span class="p">.</span><span class="n">Supervisor</span><span class="p">(</span><span class="n">init_op</span><span class="o">=</span><span class="n">init_op</span><span class="p">,</span> <span class="n">summary_op</span><span class="o">=</span><span class="n">summary_op</span><span class="p">,</span> <span class="n">saver</span><span class="o">=</span><span class="n">saver</span><span class="p">)</span>
<span class="k">with</span> <span class="n">sv</span><span class="p">.</span><span class="n">managed_session</span><span class="p">(</span><span class="n">server</span><span class="p">.</span><span class="n">target</span><span class="p">)</span> <span class="k">as</span> <span class="n">sess</span><span class="p">:</span>
    <span class="k">while</span> <span class="mi">1</span><span class="p">:</span>
        <span class="k">print</span> <span class="n">sess</span><span class="p">.</span><span class="n">run</span><span class="p">([</span><span class="n">addwb</span><span class="p">,</span><span class="n">mutwb</span><span class="p">,</span><span class="n">divwb</span><span class="p">])</span>
</code></pre></div></div>

<p>把该代码在机器A上运行,你会发现,程序会进入等候状态,等候用于ps参数服务的机器启动,才会运行。</p>

<p>因此接着我们在机器B上运行如下代码:</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c1">#coding=utf-8
#上面是因为worker计算内容各不相同,不过在深度学习中,一般每个worker的计算内容是一样的,
# 因为都是计算神经网络的每个batch 前向传导,所以一般代码是重用的
#coding=utf-8
#多台机器,每台机器有一个显卡、或者多个显卡,这种训练叫做分布式训练
</span><span class="kn">import</span>  <span class="nn">tensorflow</span> <span class="k">as</span> <span class="n">tf</span>
<span class="c1">#现在假设我们有A、B、C、D四台机器,首先需要在各台机器上写一份代码,并跑起来,各机器上的代码内容大部分相同
# ,除了开始定义的时候,需要各自指定该台机器的task之外。以机器A为例子,A机器上的代码如下:
</span><span class="n">cluster</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">train</span><span class="p">.</span><span class="n">ClusterSpec</span><span class="p">({</span>
    <span class="s">"worker"</span><span class="p">:</span> <span class="p">[</span>
        <span class="s">"192.168.11.105:1234"</span><span class="p">,</span><span class="c1">#格式 IP地址:端口号,第一台机器A的IP地址 ,在代码中需要用这台机器计算的时候,就要定义:/job:worker/task:0
</span>    <span class="p">],</span>
    <span class="s">"ps"</span><span class="p">:</span> <span class="p">[</span>
        <span class="s">"192.168.11.130:2223"</span><span class="c1">#第四台机器的IP地址 对应到代码块:/job:ps/task:0
</span>    <span class="p">]})</span>
  
<span class="c1">#不同的机器,下面这一行代码各不相同,server可以根据job_name、task_index两个参数,查找到集群cluster中对应的机器
</span>  
<span class="n">isps</span><span class="o">=</span><span class="bp">True</span>
<span class="k">if</span> <span class="n">isps</span><span class="p">:</span>
    <span class="n">server</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">train</span><span class="p">.</span><span class="n">Server</span><span class="p">(</span><span class="n">cluster</span><span class="p">,</span><span class="n">job_name</span><span class="o">=</span><span class="s">'ps'</span><span class="p">,</span><span class="n">task_index</span><span class="o">=</span><span class="mi">0</span><span class="p">)</span><span class="c1">#找到‘worker’名字下的,task0,也就是机器A
</span>    <span class="n">server</span><span class="p">.</span><span class="n">join</span><span class="p">()</span>
<span class="k">else</span><span class="p">:</span>
    <span class="n">server</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">train</span><span class="p">.</span><span class="n">Server</span><span class="p">(</span><span class="n">cluster</span><span class="p">,</span><span class="n">job_name</span><span class="o">=</span><span class="s">'worker'</span><span class="p">,</span><span class="n">task_index</span><span class="o">=</span><span class="mi">0</span><span class="p">)</span><span class="c1">#找到‘worker’名字下的,task0,也就是机器A
</span>    <span class="k">with</span> <span class="n">tf</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="n">tf</span><span class="p">.</span><span class="n">train</span><span class="p">.</span><span class="n">replica_device_setter</span><span class="p">(</span><span class="n">worker_device</span><span class="o">=</span><span class="s">'/job:worker/task:0'</span><span class="p">,</span><span class="n">cluster</span><span class="o">=</span><span class="n">cluster</span><span class="p">)):</span>
        <span class="n">w</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">get_variable</span><span class="p">(</span><span class="s">'w'</span><span class="p">,(</span><span class="mi">2</span><span class="p">,</span><span class="mi">2</span><span class="p">),</span><span class="n">tf</span><span class="p">.</span><span class="n">float32</span><span class="p">,</span><span class="n">initializer</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">constant_initializer</span><span class="p">(</span><span class="mi">2</span><span class="p">))</span>
        <span class="n">b</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">get_variable</span><span class="p">(</span><span class="s">'b'</span><span class="p">,(</span><span class="mi">2</span><span class="p">,</span><span class="mi">2</span><span class="p">),</span><span class="n">tf</span><span class="p">.</span><span class="n">float32</span><span class="p">,</span><span class="n">initializer</span><span class="o">=</span><span class="n">tf</span><span class="p">.</span><span class="n">constant_initializer</span><span class="p">(</span><span class="mi">5</span><span class="p">))</span>
        <span class="n">addwb</span><span class="o">=</span><span class="n">w</span><span class="o">+</span><span class="n">b</span>
        <span class="n">mutwb</span><span class="o">=</span><span class="n">w</span><span class="o">*</span><span class="n">b</span>
        <span class="n">divwb</span><span class="o">=</span><span class="n">w</span><span class="o">/</span><span class="n">b</span>
  
<span class="n">saver</span> <span class="o">=</span> <span class="n">tf</span><span class="p">.</span><span class="n">train</span><span class="p">.</span><span class="n">Saver</span><span class="p">()</span>
<span class="n">summary_op</span> <span class="o">=</span> <span class="n">tf</span><span class="p">.</span><span class="n">merge_all_summaries</span><span class="p">()</span>
<span class="n">init_op</span> <span class="o">=</span> <span class="n">tf</span><span class="p">.</span><span class="n">initialize_all_variables</span><span class="p">()</span>
<span class="n">sv</span> <span class="o">=</span> <span class="n">tf</span><span class="p">.</span><span class="n">train</span><span class="p">.</span><span class="n">Supervisor</span><span class="p">(</span><span class="n">init_op</span><span class="o">=</span><span class="n">init_op</span><span class="p">,</span> <span class="n">summary_op</span><span class="o">=</span><span class="n">summary_op</span><span class="p">,</span> <span class="n">saver</span><span class="o">=</span><span class="n">saver</span><span class="p">)</span>
<span class="k">with</span> <span class="n">sv</span><span class="p">.</span><span class="n">managed_session</span><span class="p">(</span><span class="n">server</span><span class="p">.</span><span class="n">target</span><span class="p">)</span> <span class="k">as</span> <span class="n">sess</span><span class="p">:</span>
    <span class="k">while</span> <span class="mi">1</span><span class="p">:</span>
        <span class="k">print</span> <span class="n">sess</span><span class="p">.</span><span class="n">run</span><span class="p">([</span><span class="n">addwb</span><span class="p">,</span><span class="n">mutwb</span><span class="p">,</span><span class="n">divwb</span><span class="p">])</span>
</code></pre></div></div>

<ul>
  <li><a href="https://www.tensorflow.org/versions/master/how_tos/distributed/index.html">Tensorflow官方指南</a></li>
</ul>

<p>分布式训练需要熟悉的函数:</p>
<ul>
  <li>tf.train.Server</li>
  <li>tf.train.Supervisor</li>
  <li>tf.train.SessionManager</li>
  <li>tf.train.ClusterSpec</li>
  <li>tf.train.replica_device_setter</li>
  <li>tf.train.MonitoredTrainingSession</li>
  <li>tf.train.MonitoredSession</li>
  <li>tf.train.SingularMonitoredSession</li>
  <li>tf.train.Scaffold</li>
  <li>tf.train.SessionCreator</li>
  <li>tf.train.ChiefSessionCreator</li>
  <li>tf.train.WorkerSessionCreator</li>
</ul>

<h4 id="pytorch-2">PyTorch</h4>

<p>DP</p>
<ul>
  <li>DP就是 DataParallel。DP 是<strong>单进程</strong>控制多 GPU。
    <ul>
      <li>DP 将输入的一个 batch 数据分成了 n 份(n 为实际使用的 GPU 数量),分别送到对应的 GPU 进行计算。</li>
      <li>在网络前向传播时,模型会从主 GPU 复制到其它 GPU 上;</li>
      <li>在反向传播时,每个 GPU 上的梯度汇总到主 GPU 上,求得梯度均值更新模型参数后,再复制到其它 GPU,以此来实现并行。</li>
      <li>由于主 GPU 要进行梯度汇总和模型更新,并将计算任务下发给其它 GPU,所以主 GPU 的负载与使用率会比其它 GPU 高,这就导致了 GPU 负载不均衡的现象。</li>
    </ul>
  </li>
</ul>

<p>DDP</p>
<ul>
  <li>DDP 是 DistributedDataParallel。DDP <strong>多进程</strong>控制多 GPU。
    <ul>
      <li>系统会为每个 GPU 创建一个进程,不再有主 GPU,每个 GPU 执行相同的任务。</li>
      <li>DDP 使用分布式数据采样器(DistributedSampler)加载数据,确保数据在各个进程之间没有重叠。</li>
      <li>在反向传播时,各 GPU 梯度计算完成后,各进程通过 all-reduce 将梯度汇总求平均,然后每个进程在各自的 GPU 上用平均梯度更新参数,从而确保每个 GPU 上的模型参数始终保持一致。由于无需在不同 GPU 之间复制模型,DDP 的传输数据量更少,因此速度更快。</li>
    </ul>
  </li>
</ul>

<p>DDP 既可用于<strong>单机多卡</strong>也可用于<strong>多机多卡</strong>,它能解决 DataParallel 速度慢、GPU 负载不均衡等问题。因此,官方更推荐使用 DistributedDataParallel 来进行分布式训练</p>

<p>基本概念</p>
<ul>
  <li>group:进程组。默认情况下,只有一个组,即一个 world。(DDP 多进程控制多 GPU)</li>
  <li>world_size :表示全局进程个数。</li>
  <li>rank:表示进程序号,用于进程间通讯,表示进程优先级。rank=0 的主机为主节点。</li>
</ul>

<p>训练基本流程</p>
<ul>
  <li><img src="https://pic3.zhimg.com/80/v2-0c8048a903e59659880350df0ea98e1a_1440w.webp" alt="" /></li>
  <li>(1)初始化进程组:用 init_process_group 函数
    <ul>
      <li>backend:是通信所用的后端,可以是“nccl”或“gloo”。一般来说,nccl 用于 GPU 分布式训练,gloo 用于 CPU 进行分布式训练。</li>
      <li>init_method:字符串类型,是一个 url,进程初始化方式,默认是 “env://”,表示从环境变量初始化,还可以使用 TCP 的方式或共享文件系统 。</li>
      <li>world_size:参与训练的总进程数(并非机器数,通常等于 节点数 × 每个节点的进程数)。</li>
      <li>rank:当前进程的全局编号(取值 0 ~ world_size-1),rank=0 的进程为主进程;group_name:进程组的名字。</li>
    </ul>
  </li>
  <li>(2)模型并行化:用 DistributedDataParallel,将模型分发至多 GPU 上
    <ul>
      <li>DistributedDataParallel 的参数与 DataParallel 基本相同</li>
    </ul>
  </li>
  <li>(3)创建分布式数据采样器</li>
</ul>

<p>DP 是直接将一个 batch 的数据划分到不同的卡,但是多机多卡间频繁数据传输会严重影响效率,这时就要用到<strong>分布式数据采样器</strong> DistributedSampler,它会为每个子进程划分出一部分数据集,从而使 DataLoader 只会加载特定的一个子数据集,以避免不同进程之间有数据重复。</p>
<ul>
  <li>先将 train_dataset 送到了 DistributedSampler 中,并创建了一个分布式数据采样器 train_sampler。</li>
  <li>再构造 DataLoader,参数中传入 sampler=train_sampler,即可让不同的进程节点加载属于自己的那份子数据集。也就是说,使用 DDP 时,不再是从主 GPU 分发数据到其他 GPU 上,而是各 GPU 从自己的硬盘上读取属于自己的那份数据。</li>
</ul>

<p>具体逻辑:</p>
<ul>
  <li><strong>加载模型</strong>阶段。每个GPU都拥有模型的一个副本,所以不需要拷贝模型。rank为0的进程会将网络初始化参数broadcast到其它每个进程中,确保每个进程中的模型都拥有一样的初始化值。</li>
  <li><strong>加载数据</strong>阶段。DDP 不需要广播数据,而是使用多进程并行加载数据。在 host 之上,每个worker进程都会把自己负责的数据从硬盘加载到 page-locked memory。DistributedSampler 保证每个进程加载到的数据是彼此不重叠的。</li>
  <li><strong>前向传播</strong>阶段。在每个GPU之上运行前向传播,计算输出。每个GPU都执行同样的训练,所以不需要有主 GPU。</li>
  <li><strong>计算损失</strong>。在每个GPU之上计算损失。</li>
  <li><strong>反向传播</strong>阶段。运行后向传播来计算梯度,在计算梯度同时也对梯度执行all-reduce操作。</li>
  <li><strong>更新模型参数</strong>阶段。因为每个GPU都从完全相同的模型开始训练,并且梯度被all-reduced,因此每个GPU在反向传播结束时最终得到平均梯度的相同副本,所有GPU上的权重更新都相同,也就不需要模型同步了。注意,在每次迭代中,模型中的Buffers 需要从rank为0的进程广播到进程组的其它进程上。</li>
</ul>

<p>代码略,见<a href="https://zhuanlan.zhihu.com/p/634846886">原文</a></p>
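<p>这里补一份最小可跑的单机多卡 DDP 训练骨架(仅为示意,模型与数据均为随机构造,假设用 <code class="language-plaintext highlighter-rouge">torchrun --nproc_per_node=N train.py</code> 启动):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import TensorDataset, DataLoader, DistributedSampler

def main():
    # (1) 初始化进程组, GPU 训练用 nccl 后端
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # (2) 模型并行化: 用 DDP 包裹
    model = torch.nn.Linear(10, 1).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    # (3) 分布式数据采样器: 每个进程只加载属于自己的子数据集
    dataset = TensorDataset(torch.randn(1024, 10), torch.randn(1024, 1))
    sampler = DistributedSampler(dataset)
    loader = DataLoader(dataset, batch_size=32, sampler=sampler)

    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = torch.nn.MSELoss()

    for epoch in range(2):
        sampler.set_epoch(epoch)      # 保证每个 epoch 的打乱方式不同且各进程一致
        for x, y in loader:
            x, y = x.cuda(local_rank), y.cuda(local_rank)
            loss = loss_fn(model(x), y)
            optimizer.zero_grad()
            loss.backward()           # 反向传播时 DDP 自动对梯度做 all-reduce
            optimizer.step()

    # (4) 只在 rank 0 保存一次, 避免多进程重复写文件
    if dist.get_rank() == 0:
        torch.save(model.module.state_dict(), "model.pt")
    dist.destroy_process_group()

if __name__ == "__main__":
    main()
</code></pre></div></div>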

<p>注意</p>
<ul>
  <li>使用 DDP 意味着使用<strong>多进程</strong>,如果直接保存模型,每个进程都会执行一次保存操作,此时只使用主进程中的一个 GPU 来保存即可。</li>
</ul>

<h2 id="pytorch-分布式训练">Pytorch 分布式训练</h2>

<h3 id="分布式基础">分布式基础</h3>

<h4 id="分布式模式-1">分布式模式</h4>

<p>PyTorch 原生支持的并行模式:</p>
<ul>
  <li><strong>完全</strong>分片数据并行(full sharded data parallel,<code class="language-plaintext highlighter-rouge">FSDP</code>)</li>
  <li><strong>混合</strong>分片数据并行(hybrid sharding data parallel,<code class="language-plaintext highlighter-rouge">HSDP</code>)</li>
  <li>张量并行(tensor parallel,<code class="language-plaintext highlighter-rouge">TP</code>)</li>
  <li>流水线并行(pipeline parallel,<code class="language-plaintext highlighter-rouge">PP</code>)</li>
  <li>序列并行(sequence parallel,<code class="language-plaintext highlighter-rouge">SP</code>)</li>
  <li>上下文并行(context parallel,<code class="language-plaintext highlighter-rouge">CP</code>)</li>
</ul>

<p>【2023-3-2】<a href="https://zhuanlan.zhihu.com/p/489011749">PyTorch 分布式训练实现(DP/DDP/torchrun/多机多卡)</a></p>

<p>相对 Tensorflow,Pytorch 简单的多。分布式训练主要有两个API:</p>
<ul>
  <li>DataParallel(<code class="language-plaintext highlighter-rouge">DP</code>): <strong>PS模式</strong>,1张卡为reduce(parame server),实现就1行代码
    <ul>
      <li><strong>单进程多线程</strong>,仅仅能工作在单机中</li>
      <li>将数据分割到多个GPU上。典型的<code class="language-plaintext highlighter-rouge">数据并行</code>,将模型复制到每个GPU上,一旦<strong>GPU0计算出梯度</strong>,就同步梯度到各个节点,这需要大量的GPU数据传输(类似PS模式)</li>
    </ul>
  </li>
  <li>DistributedDataParallel(<code class="language-plaintext highlighter-rouge">DDP</code>): <strong>All-Reduce模式</strong>,单机多卡/多级多卡皆可。官方建议API
    <ul>
      <li>多进程,单机或多机</li>
      <li>每个GPU进程中创建<strong>模型副本</strong>,并只让数据的一部分对改GPU可用。因为每个GPU中的模型是独立运行的,所以在所有的模型都计算出梯度后,才会在模型之间同步梯度(类似All-reduce)</li>
    </ul>
  </li>
</ul>

<p>分析</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">DDP</code>每个batch只需要一次数据传输;</li>
  <li><code class="language-plaintext highlighter-rouge">DP</code>可能存在多次数据同步(不用worker之间可能快慢不一样)。</li>
  <li>DataParallel 通常慢于 DistributedDataParallel</li>
</ul>

<p>【2024-7-24】PyTorch 为数据分布式训练提供了多种选择。</p>

<p>随着应用从简单到复杂,从原型到产品,常见的开发轨迹可以是:</p>
<ol>
  <li>数据和模型能放入<strong>单个GPU</strong>,单设备训练,此时不用担心训练速度;</li>
  <li>服务器上有<strong>多个GPU</strong>,且<strong>代码修改量最小</strong>,加速训练用<strong>单个机器多GPU</strong> <code class="language-plaintext highlighter-rouge">DataParallel</code>;</li>
  <li>进一步加速训练,且愿意写点代码,用单个机器多个GPU <code class="language-plaintext highlighter-rouge">DistributedDataParallel</code>;</li>
  <li>应用程序<strong>跨机器边界</strong>扩展,用多机器<code class="language-plaintext highlighter-rouge">DistributedDataParallel</code>和<strong>启动脚本</strong>;</li>
  <li>预期有错误(比如OOM)或资源可<strong>动态连接和分离</strong>,使用<code class="language-plaintext highlighter-rouge">torchelastic</code>来启动分布式训练。</li>
</ol>

<p>分布式训练的场景很多,单机多卡,多机多卡,模型并行,数据并行等等。接下来就以常见的单机多卡的情况进行记录。</p>

<p>PyTorch 使用 DDP(Distributed Data Parallel) 实现了<strong>真正</strong>的分布式<code class="language-plaintext highlighter-rouge">数据并行</code>,两个场景下都可使用 DDP 实现模型的分布式训练:</p>
<ul>
  <li>(1) 单机、多 GPU(单进程多线程的<strong>伪</strong>分布式)</li>
  <li>(2) 多机、多 GPU(多机多进程的<strong>真正</strong>分布式)</li>
</ul>

<p>方法(1)类似简单 DP 数据并行模式</p>
<ul>
  <li>DP 使用<strong>单进程</strong>、多线程范式来实现;</li>
  <li>而 DDP 完全使用<strong>多进程</strong>方式,包括单机多进程、多机多进程</li>
</ul>

<p>即使单机、多 GPU,也建议使用 DDP 模式,实现基于数据并行的模型训练,使用单机 DDP 模式训练模型的性能要比 DP 模式好很多。</p>

<p>DDP 基于<strong>集合通信</strong>(Collective Communications)实现分布式训练过程中的梯度同步。</p>

<p>反向传播过程中,DDP 使用 AllReduce 来实现分布式梯度计算和同步。</p>
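<p>AllReduce 的语义可以用一个极简示例体会(仅为示意,假设用 torchrun 启动 4 个进程、每个进程一张卡):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import os
import torch
import torch.distributed as dist

dist.init_process_group(backend="nccl")
local_rank = int(os.environ["LOCAL_RANK"])
torch.cuda.set_device(local_rank)

t = torch.ones(1).cuda(local_rank) * dist.get_rank()   # 每个进程的张量值等于自己的 rank
dist.all_reduce(t, op=dist.ReduceOp.SUM)               # 4 个进程上 t 都变成 0+1+2+3=6
print(f"rank {dist.get_rank()}: {t.item()}")
dist.destroy_process_group()
</code></pre></div></div>

<p>DDP 反向传播时对各参数梯度做的就是这样一次求和(再除以进程数取平均)的集合通信。</p>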

<h3 id="1dataparallel">1、DataParallel</h3>

<p>模型与变量必须在同一个设备上(CPU or GPU)</p>

<p>pytorch 使用<strong>to函数</strong>实现变量或模型的<strong>存储转移</strong></p>
<ul>
  <li>to函数的对象: 数据Tensor,或 模型Module</li>
  <li>张量不执行inplace(即 执行之后重新构建一个新的张量)</li>
  <li>模型执行inplace(执行之后不重新构建一个新的模型)</li>
</ul>

<p>原理:</p>
<ul>
  <li>当给定model时,主要实现功能是将input数据依据batch的这个维度,将数据划分到指定的设备上。其他的对象(objects)复制到每个设备上。在前向传播的过程中,module被复制到每个设备上,每个复制的副本处理一部分输入数据。</li>
  <li>在反向传播过程中,每个副本module的梯度被汇聚到原始的module上计算(一般为第0块GPU)。</li>
</ul>

<p>举例:</p>
<ul>
  <li>如果当前有4个GPU,batch_size=16,那么模型将被复制到每一个GPU上;前向传播时,每个GPU分到该batch中的4条数据,各自独立计算出梯度,然后把梯度返回到第一个GPU上,由第一个GPU进行梯度融合、模型更新;下一次前向传播时,再把更新后的模型复制给每一个GPU。</li>
</ul>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c1">###	第一步:构建模型
# module 需要分发的模型
# device_ids 可分发的gpu,默认分发到所有看见GPU(环境变量设置的)
# output_device 结果输出设备 通常设置成逻辑gpu的第一个
</span><span class="n">module</span> <span class="o">=</span> <span class="n">your_simple_net</span><span class="p">()</span> <span class="c1">#你的模型
</span><span class="n">Your_Parallel_Net</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">nn</span><span class="p">.</span><span class="n">DataParallel</span><span class="p">(</span><span class="n">module</span><span class="p">,</span> <span class="n">device_ids</span><span class="o">=</span><span class="bp">None</span><span class="p">,</span> <span class="n">output_device</span><span class="o">=</span><span class="bp">None</span><span class="p">)</span>
<span class="c1">### 第二步:数据迁移
</span><span class="n">inputs</span><span class="o">=</span><span class="n">inputs</span><span class="p">.</span><span class="n">to</span><span class="p">(</span><span class="n">device</span><span class="p">)</span>	
<span class="n">labels</span><span class="o">=</span><span class="n">labels</span><span class="p">.</span><span class="n">to</span><span class="p">(</span><span class="n">device</span><span class="p">)</span>	
<span class="c1"># device通常应为模型输出的output_device,否则无法计算loss
</span></code></pre></div></div>

<p>代码</p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">torch</span>
<span class="kn">import</span> <span class="nn">torch.nn</span> <span class="k">as</span> <span class="n">nn</span>
<span class="kn">from</span> <span class="nn">torch.autograd</span> <span class="kn">import</span> <span class="n">Variable</span>
<span class="kn">from</span> <span class="nn">torch.utils.data</span> <span class="kn">import</span> <span class="n">Dataset</span><span class="p">,</span> <span class="n">DataLoader</span>
<span class="kn">import</span> <span class="nn">os</span>

<span class="n">input_size</span> <span class="o">=</span> <span class="mi">5</span>
<span class="n">output_size</span> <span class="o">=</span> <span class="mi">2</span>
<span class="n">batch_size</span> <span class="o">=</span> <span class="mi">30</span>
<span class="n">data_size</span> <span class="o">=</span> <span class="mi">30</span>

<span class="k">class</span> <span class="nc">RandomDataset</span><span class="p">(</span><span class="n">Dataset</span><span class="p">):</span>
    <span class="k">def</span> <span class="nf">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">size</span><span class="p">,</span> <span class="n">length</span><span class="p">):</span>
        <span class="bp">self</span><span class="p">.</span><span class="nb">len</span> <span class="o">=</span> <span class="n">length</span>
        <span class="bp">self</span><span class="p">.</span><span class="n">data</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">randn</span><span class="p">(</span><span class="n">length</span><span class="p">,</span> <span class="n">size</span><span class="p">)</span>

    <span class="k">def</span> <span class="nf">__getitem__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">index</span><span class="p">):</span>
        <span class="k">return</span> <span class="bp">self</span><span class="p">.</span><span class="n">data</span><span class="p">[</span><span class="n">index</span><span class="p">]</span>

    <span class="k">def</span> <span class="nf">__len__</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span>
        <span class="k">return</span> <span class="bp">self</span><span class="p">.</span><span class="nb">len</span>

<span class="n">rand_loader</span> <span class="o">=</span> <span class="n">DataLoader</span><span class="p">(</span><span class="n">dataset</span><span class="o">=</span><span class="n">RandomDataset</span><span class="p">(</span><span class="n">input_size</span><span class="p">,</span> <span class="n">data_size</span><span class="p">),</span> <span class="n">batch_size</span><span class="o">=</span><span class="n">batch_size</span><span class="p">,</span> <span class="n">shuffle</span><span class="o">=</span><span class="bp">True</span><span class="p">)</span>

<span class="k">class</span> <span class="nc">Model</span><span class="p">(</span><span class="n">nn</span><span class="p">.</span><span class="n">Module</span><span class="p">):</span>
    <span class="c1"># Our model
</span>    <span class="k">def</span> <span class="nf">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">input_size</span><span class="p">,</span> <span class="n">output_size</span><span class="p">):</span>
        <span class="nb">super</span><span class="p">(</span><span class="n">Model</span><span class="p">,</span> <span class="bp">self</span><span class="p">).</span><span class="n">__init__</span><span class="p">()</span>
        <span class="bp">self</span><span class="p">.</span><span class="n">fc</span> <span class="o">=</span> <span class="n">nn</span><span class="p">.</span><span class="n">Linear</span><span class="p">(</span><span class="n">input_size</span><span class="p">,</span> <span class="n">output_size</span><span class="p">)</span>

    <span class="k">def</span> <span class="nf">forward</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="nb">input</span><span class="p">):</span>
        <span class="n">output</span> <span class="o">=</span> <span class="bp">self</span><span class="p">.</span><span class="n">fc</span><span class="p">(</span><span class="nb">input</span><span class="p">)</span>
        <span class="k">print</span><span class="p">(</span><span class="s">"  In Model: input size"</span><span class="p">,</span> <span class="nb">input</span><span class="p">.</span><span class="n">size</span><span class="p">(),</span>
              <span class="s">"output size"</span><span class="p">,</span> <span class="n">output</span><span class="p">.</span><span class="n">size</span><span class="p">())</span>
        <span class="k">return</span> <span class="n">output</span>
<span class="n">model</span> <span class="o">=</span> <span class="n">Model</span><span class="p">(</span><span class="n">input_size</span><span class="p">,</span> <span class="n">output_size</span><span class="p">)</span>

<span class="k">if</span> <span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">is_available</span><span class="p">():</span>
    <span class="n">model</span><span class="p">.</span><span class="n">cuda</span><span class="p">()</span>

<span class="k">if</span> <span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">device_count</span><span class="p">()</span> <span class="o">&gt;</span> <span class="mi">1</span><span class="p">:</span>
    <span class="k">print</span><span class="p">(</span><span class="s">"Let's use"</span><span class="p">,</span> <span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">device_count</span><span class="p">(),</span> <span class="s">"GPUs!"</span><span class="p">)</span>
    <span class="c1"># 就这一行!将模型整体复制到每个GPU上,计算完成后各自汇总到ps节点
</span>    <span class="n">model</span> <span class="o">=</span> <span class="n">nn</span><span class="p">.</span><span class="n">DataParallel</span><span class="p">(</span><span class="n">model</span><span class="p">)</span>

<span class="k">for</span> <span class="n">data</span> <span class="ow">in</span> <span class="n">rand_loader</span><span class="p">:</span>
    <span class="k">if</span> <span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">is_available</span><span class="p">():</span>
        <span class="n">input_var</span> <span class="o">=</span> <span class="n">Variable</span><span class="p">(</span><span class="n">data</span><span class="p">.</span><span class="n">cuda</span><span class="p">())</span>
    <span class="k">else</span><span class="p">:</span>
        <span class="n">input_var</span> <span class="o">=</span> <span class="n">Variable</span><span class="p">(</span><span class="n">data</span><span class="p">)</span>
    <span class="n">output</span> <span class="o">=</span> <span class="n">model</span><span class="p">(</span><span class="n">input_var</span><span class="p">)</span>
    <span class="k">print</span><span class="p">(</span><span class="s">"Outside: input size"</span><span class="p">,</span> <span class="n">input_var</span><span class="p">.</span><span class="n">size</span><span class="p">(),</span> <span class="s">"output_size"</span><span class="p">,</span> <span class="n">output</span><span class="p">.</span><span class="n">size</span><span class="p">())</span>
</code></pre></div></div>

<h3 id="2ddp官方建议">2、DDP(官方建议)</h3>

<h4 id="dp-问题">DP 问题</h4>

<p>为什么要引入DDP(DistributedDataParallel)?DP 存在问题</p>

<ul>
  <li>1、DP 每个训练批次(batch)中,一个进程上先算出模型权重, 然后再分发到每个GPU上
    <ul>
      <li>网络通信就成为了瓶颈,而GPU使用率也通常很低。</li>
      <li>显存浪费, 多存储了 n-1 份 模型副本</li>
    </ul>
  </li>
  <li>2、每次前向传播都要把模型复制一遍(即每次更新都重新复制模型),并且<strong>单进程多线程</strong>会造成<code class="language-plaintext highlighter-rouge">GIL</code> contention(全局解释器锁争用);主进程汇总梯度、更新权重再分发,使通信成为瓶颈,浪费了大量时间,因此引入了DDP。</li>
</ul>

<p>dp 两个问题:</p>
<ul>
  <li>1️⃣ 显存浪费严重。
    <ul>
      <li>以单机八卡为例,把模型复制8份放在8张卡上同时推理,因此多付出了<strong>7个</strong>模型(副本)的显存开销;</li>
    </ul>
  </li>
  <li>2️⃣ 大模型不适用。
    <ul>
      <li>以最新提出的Llama 3.1为例,不经量化(FP16数据类型)的情况下,容纳70B的模型需要140GB的显存,即使是40G一张的A100也无法承受。</li>
      <li>而这才仅仅是容纳模型,还没有考虑存放数据,以及训练的话存放梯度数据等。因此数据并行并不适用于70B级别大模型的推理和训练。</li>
    </ul>
  </li>
</ul>

<p>DDP采用<strong>多进程</strong>控制多GPU,共同训练模型,一份代码会被pytorch自动分配到n个进程并在n个GPU上运行。</p>
<ul>
  <li>DDP运用 <code class="language-plaintext highlighter-rouge">Ring-Reduce</code>通信算法在每个GPU间对梯度进行通讯,交换彼此的梯度,从而获得所有GPU的梯度。</li>
</ul>

<p>对比DP,不需要再进行模型本体的通信,因此可以加速训练。</p>

<p>torch.nn.DataParallel</p>
<ul>
  <li>DataParallel 全程维护一个 optimizer,对各 GPU 上梯度进行求和,而在主 GPU 进行参数更新,之后再将模型参数 broadcast 到其他 GPU</li>
</ul>

<p>注意:</p>
<ul>
  <li>1、设置 DistributedSampler 来打乱数据,因为一个batch被分配到了好几个进程中,要确保不同的GPU拿到的不是同一份数据。</li>
  <li>2、要告诉每个进程自己的id,即使用哪一块GPU。</li>
  <li>3、如果需要做BatchNormalization,需要跨卡同步BN的统计量,可以使用 SyncBatchNorm(见下方示意)。</li>
</ul>
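<p>跨卡同步 BN 统计量可以用 SyncBatchNorm,最小示意如下(model 为任意包含 BatchNorm 层的模型,属于假设变量,需在包 DDP 之前转换):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch

# 把模型中的 BatchNorm* 层替换为 SyncBatchNorm, 统计量会在进程组内同步
model = torch.nn.SyncBatchNorm.convert_sync_batchnorm(model)
# 之后照常包 DDP: model = DDP(model.cuda(local_rank), device_ids=[local_rank])
</code></pre></div></div>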

<p>DDP采用All-Reduce架构,单机多卡、多机多卡都能用。</p>

<p>注意:DDP并不会自动shard数据</p>
<ol>
  <li>如果自己写数据流,得根据<code class="language-plaintext highlighter-rouge">torch.distributed.get_rank()</code>去shard数据,获取自己负责的那一份(见下方示意)</li>
  <li>如果用 Dataset API,则需要在定义Dataloader的时候用 DistributedSampler 去shard</li>
</ol>
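<p>手动按 rank 切分数据的最小示意(假设 full_data 为全量样本列表,且进程组已初始化):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch.distributed as dist

rank = dist.get_rank()
world_size = dist.get_world_size()

# 按 "隔 world_size 取一条" 切分, 各进程拿到互不重叠的子集
my_shard = full_data[rank::world_size]
</code></pre></div></div>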

<h4 id="torchdistributed-介绍">torch.distributed 介绍</h4>

<p>torch.nn.DataParallel 支持数据并行,但不支持<strong>多机</strong>分布式训练,且底层实现相较于 distributed 的接口,有些许不足。</p>

<p>Pytorch 通过 torch.distributed 包提供分布式支持,包括 GPU 和 CPU 的分布式训练支持。</p>
<ul>
  <li>Pytorch 分布式目前只支持 Linux。</li>
</ul>

<p><code class="language-plaintext highlighter-rouge">torch.distributed</code> 优势:</p>
<ul>
  <li>每个进程对应一个独立的训练过程,且只对梯度等少量数据进行信息交换。
    <ul>
      <li>迭代中,每个进程具有自己的 optimizer ,独立完成所有优化步骤,进程内与一般的训练无异。</li>
      <li>各进程梯度计算完成之后,先将梯度进行汇总平均,再由 <code class="language-plaintext highlighter-rouge">rank=0</code> 的进程,将其 broadcast 到所有进程。最后,各进程用该梯度来更新参数。</li>
      <li>各进程的模型参数始终保持一致: 各进程初始参数、更新参数都一致</li>
      <li>相比 <code class="language-plaintext highlighter-rouge">DataParallel</code>, <code class="language-plaintext highlighter-rouge">torch.distributed</code> 传输的数据量更少,因此速度更快,效率更高</li>
    </ul>
  </li>
  <li>每个进程包含独立的解释器和 GIL
    <ul>
      <li>每个进程拥有独立的解释器和 GIL,消除了单个 Python 进程中的多个执行线程,模型副本或 GPU 的额外解释器开销和 GIL-thrashing ,因此可以减少解释器和 GIL 使用冲突</li>
    </ul>
  </li>
</ul>

<h4 id="torchdistributed-概念">torch.distributed 概念</h4>

<p>【2024-4-7】<a href="https://zhuanlan.zhihu.com/p/76638962">Pytorch 分布式训练</a></p>

<p>概念:</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">group</code>:即<strong>进程组</strong>。默认只有1个组,1个 job 即为1个组,即 1个 world。
    <ul>
      <li>当需要进行更加精细的通信时,通过 new_group 接口,使用 word 的子集,创建新组,用于集体通信等。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">world_size</code> :表示<strong>全局进程数</strong>。一个进程可对应<strong>多个</strong>GPU
    <ul>
      <li><code class="language-plaintext highlighter-rouge">world_size ≠ GPU数</code>: 1个进程用多个GPU</li>
      <li><code class="language-plaintext highlighter-rouge">world_size = GPU数</code>: 1个进程用1个GPU</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">local_word_size</code>: 某个节点上进程数 (相对比较少见)</li>
  <li><code class="language-plaintext highlighter-rouge">rank</code>:全局进程id, 表示<strong>进程序号</strong>,用于进程间通讯,表征进程优先级。取值范围: <code class="language-plaintext highlighter-rouge">0~world_size</code>
    <ul>
      <li><code class="language-plaintext highlighter-rouge">rank = 0</code> 主机为 <strong>master 节点</strong>。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">local_rank</code>:某个节点上进程id, 进程内<strong>GPU 编号</strong>,非显式参数,由 <code class="language-plaintext highlighter-rouge">torch.distributed.launch</code> 内部指定。
    <ul>
      <li><code class="language-plaintext highlighter-rouge">rank = 3</code>,<code class="language-plaintext highlighter-rouge">local_rank = 0</code> 表示第 3 个进程内的第 1 块 GPU。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">global_rank</code>: 全局 gpu编号</li>
</ul>

<p>如果 所有进程数(<code class="language-plaintext highlighter-rouge">world_size</code>)为<code class="language-plaintext highlighter-rouge">W</code>,每个节点上的进程数(<code class="language-plaintext highlighter-rouge">local_world_size</code>)为<code class="language-plaintext highlighter-rouge">L</code>, 则每个进程上的两个ID:</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">rank</code> 取值范围:<code class="language-plaintext highlighter-rouge">[0, W-1]</code>
    <ul>
      <li><code class="language-plaintext highlighter-rouge">rank</code>=0 进程为<strong>主进程</strong>,负责同步分发工作</li>
      <li><code class="language-plaintext highlighter-rouge">rank</code>&gt;0 进程为<strong>从进程</strong></li>
      <li><code class="language-plaintext highlighter-rouge">rank</code>=-1 为默认值,一般表示该进程未参与分布式训练(进程组未初始化)</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">local_rank</code> 取值:<code class="language-plaintext highlighter-rouge">[0, L-1]</code></li>
</ul>

<p>2机8卡的分布式训练<a href="https://zhuanlan.zhihu.com/p/489892744">示例</a></p>
<ul>
  <li><img src="https://pic1.zhimg.com/80/v2-2baae86e212177108872d36a6040a2dc_1440w.webp" alt="" /></li>
  <li>gpu 编号: 0~3</li>
  <li>local rank: gpu 本地编号, 0~3</li>
  <li>global rank: gpu 全局编号, 0~7</li>
</ul>
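
<p>按"每个节点进程数相同"的假设,全局 rank 与 node_rank / local_rank 的对应关系可以用一小段代码示意(数值沿用上面 2 机 8 卡的例子):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>nnodes = 2
nproc_per_node = 4               # 即 local_world_size
world_size = nnodes * nproc_per_node

for node_rank in range(nnodes):
    for local_rank in range(nproc_per_node):
        rank = node_rank * nproc_per_node + local_rank   # 全局进程号 0~7
        print(f"node_rank={node_rank}, local_rank={local_rank}, rank={rank}")
</code></pre></div></div>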

<p>Pytorch 分布式基本流程:</p>
<ul>
  <li>使用 distributed 包任何函数前,用 <code class="language-plaintext highlighter-rouge">init_process_group</code> 初始化进程组,同时初始化 <code class="language-plaintext highlighter-rouge">distributed</code> 包。</li>
  <li>如进行小组内集体通信,用 <code class="language-plaintext highlighter-rouge">new_group</code> 创建子分组</li>
  <li>创建分布式并行模型 <code class="language-plaintext highlighter-rouge">DDP(model, device_ids=device_ids)</code></li>
  <li>为数据集创建 Sampler</li>
  <li>使用启动工具 <code class="language-plaintext highlighter-rouge">torch.distributed.launch</code> 在每个主机上执行一次脚本,开始训练</li>
  <li>使用 <code class="language-plaintext highlighter-rouge">destroy_process_group()</code> 销毁进程组</li>
</ul>
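
<p>按照上述流程,给出一个单机多卡 DDP 的最小骨架(仅示意各步骤的调用顺序;模型、数据均为假设的占位对象,并假设用 launch / torchrun 以 env 方式启动、每个进程一块 GPU):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler

def main():
    dist.init_process_group(backend="nccl")            # 1. 初始化进程组(env 方式)
    local_rank = int(os.environ["LOCAL_RANK"])          # 由启动工具注入
    torch.cuda.set_device(local_rank)

    model = torch.nn.Linear(10, 1).cuda(local_rank)     # 假设的简单模型
    ddp_model = DDP(model, device_ids=[local_rank])      # 2. 创建分布式并行模型

    dataset = TensorDataset(torch.randn(256, 10), torch.randn(256, 1))
    sampler = DistributedSampler(dataset)                 # 3. 为数据集创建 Sampler
    loader = DataLoader(dataset, batch_size=16, sampler=sampler)

    opt = torch.optim.SGD(ddp_model.parameters(), lr=0.01)
    for epoch in range(2):
        sampler.set_epoch(epoch)
        for x, y in loader:
            x, y = x.cuda(local_rank), y.cuda(local_rank)
            loss = torch.nn.functional.mse_loss(ddp_model(x), y)
            opt.zero_grad()
            loss.backward()                                # 梯度在 backward 中自动 all-reduce
            opt.step()

    dist.destroy_process_group()                           # 4. 销毁进程组

if __name__ == "__main__":
    main()
</code></pre></div></div>

<p>假设脚本名为 train_ddp.py,单机 4 卡可用 <code class="language-plaintext highlighter-rouge">torchrun --nproc_per_node=4 train_ddp.py</code> 启动,具体启动方式详见后文。</p>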

<p>torch.distributed 提供了 3 种初始化方式:<strong>tcp</strong>、<strong>共享文件</strong> 和 <strong>环境变量初始化</strong>。</p>
<ul>
  <li>TCP: 指定进程 0 的 ip 和 port, 手动为每个进程指定进程号。</li>
  <li>共享文件: 共享文件对于组内所有进程可见</li>
  <li>环境变量:</li>
</ul>

<p>测试代码</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">torch.distributed</span> <span class="k">as</span> <span class="n">dist</span>
<span class="kn">import</span> <span class="nn">argparse</span><span class="p">,</span> <span class="n">os</span>

<span class="n">parser</span> <span class="o">=</span> <span class="n">argparse</span><span class="p">.</span><span class="n">ArgumentParser</span><span class="p">()</span>
<span class="n">parser</span><span class="p">.</span><span class="n">add_argument</span><span class="p">(</span><span class="s">"--local_rank"</span><span class="p">,</span> <span class="nb">type</span><span class="o">=</span><span class="n">ine</span><span class="p">,</span> <span class="n">default</span><span class="o">=</span><span class="mi">0</span><span class="p">)</span>
<span class="n">args</span> <span class="o">=</span> <span class="n">parser</span><span class="p">.</span><span class="n">parse_args</span><span class="p">()</span>

<span class="c1"># 分布式初始化, 读取环境变量 RANK=1 WORLD_SIZE=3 MASTER_ADDR=127.0.0.1 MASTER_PORT=8000
</span><span class="n">dist</span><span class="p">.</span><span class="n">init_process_group</span><span class="p">(</span><span class="s">"nccl"</span><span class="p">)</span> <span class="c1"># 进程组初始化
</span><span class="n">rank</span> <span class="o">=</span> <span class="n">dist</span><span class="p">.</span><span class="n">get_rank</span><span class="p">()</span>
<span class="n">local_rank_arg</span> <span class="o">=</span> <span class="n">args</span><span class="p">.</span><span class="n">local_rank</span>               <span class="c1"># 命令行形式ARGS形式
</span><span class="n">local_rank_env</span> <span class="o">=</span> <span class="nb">int</span><span class="p">(</span><span class="n">os</span><span class="p">.</span><span class="n">environ</span><span class="p">[</span><span class="s">'LOCAL_RANK'</span><span class="p">])</span> <span class="c1"># 用env初始ENV环境变量形式
</span><span class="n">local_world_size</span> <span class="o">=</span> <span class="nb">int</span><span class="p">(</span><span class="n">os</span><span class="p">.</span><span class="n">environ</span><span class="p">[</span><span class="s">'LOCAL_WORLD_SIZE'</span><span class="p">])</span>
<span class="c1"># local_rank_env = int(os.environ.get('LOCAL_RANK', 0)) # 在利用env初始ENV环境变量形式
# local_world_size = int(os.environ.get('LOCAL_WORLD_SIZE', 3))
</span>
<span class="k">print</span><span class="p">(</span><span class="sa">f</span><span class="s">"</span><span class="si">{</span><span class="n">rank</span><span class="o">=</span><span class="si">}</span><span class="s">; </span><span class="si">{</span><span class="n">local_rank_arg</span><span class="o">=</span><span class="si">}</span><span class="s">; </span><span class="si">{</span><span class="n">local_rank_env</span><span class="o">=</span><span class="si">}</span><span class="s">; </span><span class="si">{</span><span class="n">local_world_size</span><span class="o">=</span><span class="si">}</span><span class="s">"</span><span class="p">)</span>
</code></pre></div></div>

<p>执行</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>python3 <span class="nt">-m</span> torch.distributed.launch <span class="nt">--nproc_per_node</span><span class="o">=</span>4 test.py 
</code></pre></div></div>

<p>在一台4卡机器上执行, 样例输出:</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c"># WARNING:torch.distributed.run:</span>
<span class="c"># *****************************************</span>
<span class="c"># Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed. </span>
<span class="c"># *****************************************</span>
<span class="nv">rank</span><span class="o">=</span>2<span class="p">;</span> <span class="nv">local_rank_arg</span><span class="o">=</span>2<span class="p">;</span> <span class="nv">local_rank_env</span><span class="o">=</span>2, <span class="nv">local_world_size</span><span class="o">=</span>4
<span class="nv">rank</span><span class="o">=</span>0<span class="p">;</span> <span class="nv">local_rank_arg</span><span class="o">=</span>0<span class="p">;</span> <span class="nv">local_rank_env</span><span class="o">=</span>0, <span class="nv">local_world_size</span><span class="o">=</span>4
<span class="nv">rank</span><span class="o">=</span>3<span class="p">;</span> <span class="nv">local_rank_arg</span><span class="o">=</span>3<span class="p">;</span> <span class="nv">local_rank_env</span><span class="o">=</span>3, <span class="nv">local_world_size</span><span class="o">=</span>4
<span class="nv">rank</span><span class="o">=</span>1<span class="p">;</span> <span class="nv">local_rank_arg</span><span class="o">=</span>1<span class="p">;</span> <span class="nv">local_rank_env</span><span class="o">=</span>1, <span class="nv">local_world_size</span><span class="o">=</span>4
</code></pre></div></div>

<p>一般分布式训练是为每个进程分配一块GPU,这样最简单,也最容易调试。这种情况下,可以直接把 local_rank 作为当前进程所用 GPU 的 id。</p>
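
<p>按这种"一进程一卡"的约定,进程内通常在初始化后立刻绑定设备,示意如下(假设以 env 方式启动,LOCAL_RANK 由启动工具注入):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import os
import torch

local_rank = int(os.environ["LOCAL_RANK"])   # 由 torch.distributed.launch / torchrun 注入
torch.cuda.set_device(local_rank)            # 当前进程只使用这一块 GPU
device = torch.device("cuda", local_rank)
</code></pre></div></div>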

<h4 id="数据读取">数据读取</h4>

<p>pytorch 分布式训练,数据读取采用<strong>主进程预读取并缓存</strong>,其它进程从<strong>缓存</strong>中读取,不同进程之间的数据同步具体通过torch.distributed.barrier()实现。<a href="https://www.cnblogs.com/pyclq/p/15433787.html">参考</a></p>
<ul>
  <li>分布式数据读取: <code class="language-plaintext highlighter-rouge">主进程</code>读取数据 → <code class="language-plaintext highlighter-rouge">主进程</code>缓存 → <code class="language-plaintext highlighter-rouge">从进程</code>读取缓存</li>
</ul>

<p>(1)进程号 rank</p>

<p>多进程上下文中,通常假定<code class="language-plaintext highlighter-rouge">rank 0</code>是第一个进程/主进程,其它进程分别具有 1、2、3 等不同的 rank 号(以 4 个进程为例)。</p>

<p>(2)单一进程数据处理</p>

<p>通常有些操作没必要并行处理,如数据读取与预处理操作,只需要一个进程处理并缓存,然后与其它进程<strong>共享缓存好的数据</strong></p>
<ul>
  <li>但由于各进程是同步执行的,由单一进程处理数据必然会造成进程间不同步(数据读取与处理耗时较长时尤其明显);若单一进程的处理时间很短,则基本不影响进程间同步</li>
</ul>

<p>为此,torch中采用了<code class="language-plaintext highlighter-rouge">barrier()</code>函数对其它<strong>非主进程</strong>进行阻塞,达到同步目的</p>

<p>(3)barrier()具体原理</p>

<p>如果执行 create_dataloader()函数的进程</p>
<ul>
  <li>不是主进程: 即rank不等于0或者-1
    <ul>
      <li>上下文管理器会执行相应的 <code class="language-plaintext highlighter-rouge">torch.distributed.barrier()</code>,设置一个<strong>阻塞栅栏</strong>,让此进程处于<strong>等待</strong>状态,等待所有进程到达栅栏处(包括主进程数据处理完毕);</li>
    </ul>
  </li>
  <li>是主进程: 其会直接读取数据,然后处理结束之后会遇到 <code class="language-plaintext highlighter-rouge">torch.distributed.barrier()</code></li>
</ul>

<p>此时,所有进程都到达了当前的栅栏处,这样所有进程就达到了同步,并同时得到释放。</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">def</span> <span class="nf">create_dataloader</span><span class="p">():</span>
    <span class="c1">#使用上下文管理器中实现的barrier函数确保分布式中的主进程首先处理数据,然后其它进程直接从缓存中读取
</span>    <span class="k">with</span> <span class="n">torch_distributed_zero_first</span><span class="p">(</span><span class="n">rank</span><span class="p">):</span>
        <span class="n">dataset</span> <span class="o">=</span> <span class="n">LoadImagesAndLabels</span><span class="p">()</span>
 
<span class="kn">from</span> <span class="nn">contextlib</span> <span class="kn">import</span> <span class="n">contextmanager</span>
 
<span class="c1">#定义的用于同步不同进程对数据读取的上下文管理器
</span><span class="o">@</span><span class="n">contextmanager</span>
<span class="k">def</span> <span class="nf">torch_distributed_zero_first</span><span class="p">(</span><span class="n">local_rank</span><span class="p">:</span> <span class="nb">int</span><span class="p">):</span>
    <span class="s">"""
    Decorator to make all processes in distributed training wait for each local_master to do something.
    """</span>
    <span class="k">if</span> <span class="n">local_rank</span> <span class="ow">not</span> <span class="ow">in</span> <span class="p">[</span><span class="o">-</span><span class="mi">1</span><span class="p">,</span> <span class="mi">0</span><span class="p">]:</span>
        <span class="n">torch</span><span class="p">.</span><span class="n">distributed</span><span class="p">.</span><span class="n">barrier</span><span class="p">()</span>
    <span class="k">yield</span>   <span class="c1">#中断后执行上下文代码,然后返回到此处继续往下执行
</span>    <span class="k">if</span> <span class="n">local_rank</span> <span class="o">==</span> <span class="mi">0</span><span class="p">:</span>
        <span class="n">torch</span><span class="p">.</span><span class="n">distributed</span><span class="p">.</span><span class="n">barrier</span><span class="p">()</span>
</code></pre></div></div>

<h4 id="初始化进程组-init_process_group">初始化进程组 init_process_group</h4>

<p><code class="language-plaintext highlighter-rouge">init_process_group</code> 函数原型</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">torch</span><span class="p">.</span><span class="n">distributed</span><span class="p">.</span><span class="n">init_process_group</span><span class="p">(</span><span class="n">backend</span><span class="p">,</span> <span class="n">init_method</span><span class="o">=</span><span class="bp">None</span><span class="p">,</span> <span class="n">timeout</span><span class="o">=</span><span class="n">datetime</span><span class="p">.</span><span class="n">timedelta</span><span class="p">(</span><span class="mi">0</span><span class="p">,</span> <span class="mi">1800</span><span class="p">),</span> 
                                     <span class="n">world_size</span><span class="o">=-</span><span class="mi">1</span><span class="p">,</span> <span class="n">rank</span><span class="o">=-</span><span class="mi">1</span><span class="p">,</span> <span class="n">store</span><span class="o">=</span><span class="bp">None</span><span class="p">)</span>
</code></pre></div></div>

<p>函数作用</p>
<ul>
  <li>每个进程中进行调用,用于初始化该进程。</li>
  <li>使用分布式时,该函数必须在 distributed 内所有相关函数之前使用。</li>
</ul>

<p>参数详解</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">backend</code> :指定当前进程要使用的通信后端
    <ul>
      <li>小写字符串,支持的通信后端有 <code class="language-plaintext highlighter-rouge">gloo</code>, <code class="language-plaintext highlighter-rouge">mpi</code>, <code class="language-plaintext highlighter-rouge">nccl</code>, 建议用 <code class="language-plaintext highlighter-rouge">nccl</code>。</li>
      <li>cpu 分布式选 <code class="language-plaintext highlighter-rouge">gloo</code></li>
      <li>gpu 分布式选 <code class="language-plaintext highlighter-rouge">nccl</code></li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">init_method</code> :当前进程组初始化方式
    <ul>
      <li>可选参数,字符串形式。两种方式: <code class="language-plaintext highlighter-rouge">init_method</code> + <code class="language-plaintext highlighter-rouge">store</code>, <code class="language-plaintext highlighter-rouge">init_method</code>是<code class="language-plaintext highlighter-rouge">store</code>的高层封装, 二者互斥</li>
      <li><code class="language-plaintext highlighter-rouge">init_method</code>: <strong>TCP连接</strong>、File<strong>共享文件</strong>系统、<strong>ENV环境变量</strong>三种方式</li>
      <li><code class="language-plaintext highlighter-rouge">store</code>: 同时指定world_size 和 rank参数。store 是一种分布式中核心的key-value存储,用于不同进程间共享信息</li>
      <li>如果未指定, 默认为 <code class="language-plaintext highlighter-rouge">env</code>,表示使用读取环境变量方式初始化。该参数与 store 互斥。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">rank</code> :指定当前进程的编号(优先级)
    <ul>
      <li>int 值。表示当前进程的编号,即优先级。如果指定 store 参数,则必须指定该参数。</li>
      <li>rank=0 的为主进程,即 master 节点。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">world_size</code> :该 job 中的总进程数。如果指定 store 参数,则需要指定该参数。</li>
  <li><code class="language-plaintext highlighter-rouge">timeout</code> : 指定每个进程的超时时间
    <ul>
      <li>可选参数,datetime.timedelta 对象,默认为 30 分钟。该参数仅用于 Gloo 后端。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">store</code>
    <ul>
      <li>所有 worker 可访问的 key / value,用于交换连接 / 地址信息。与 init_method 互斥。</li>
    </ul>
  </li>
</ul>

<p>三种init_method:</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">init_method='tcp://ip:port'</code>: 通过指定 rank 0(MASTER进程)的 IP 和端口,各个进程通过 <strong>tcp</strong> 进行信息交换。需指定 rank 和 world_size 这两个参数。</li>
  <li><code class="language-plaintext highlighter-rouge">init_method='file://path'</code>:通过所有进程都可以访问的<strong>共享文件系统</strong>来进行信息共享。需要指定 rank 和 world_size 参数。</li>
  <li><code class="language-plaintext highlighter-rouge">init_method='env://'</code>:从<strong>环境变量</strong>中读取分布式信息(os.environ),主要包括 <code class="language-plaintext highlighter-rouge">MASTER_ADDR</code>, <code class="language-plaintext highlighter-rouge">MASTER_PORT</code>, <code class="language-plaintext highlighter-rouge">RANK</code>, <code class="language-plaintext highlighter-rouge">WORLD_SIZE</code>。 其中,rank 和 world_size 可手动指定,否则从环境变量读取。</li>
</ul>

<p>tcp 和 env 两种方式比较类似(其实 env 就是对 tcp 方式的一层封装),都是通过<strong>网络地址</strong>进行通信,是最常用的初始化方法。</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">os</span><span class="p">,</span> <span class="n">argparse</span>
<span class="kn">import</span> <span class="nn">torch</span>
<span class="kn">import</span> <span class="nn">torch.distributed</span> <span class="k">as</span> <span class="n">dist</span>

<span class="n">parse</span> <span class="o">=</span> <span class="n">argparse</span><span class="p">.</span><span class="n">ArgumentParser</span><span class="p">()</span>
<span class="n">parse</span><span class="p">.</span><span class="n">add_argument</span><span class="p">(</span><span class="s">'--init_method'</span><span class="p">,</span> <span class="nb">type</span><span class="o">=</span><span class="nb">str</span><span class="p">)</span>
<span class="n">parse</span><span class="p">.</span><span class="n">add_argument</span><span class="p">(</span><span class="s">'--rank'</span><span class="p">,</span> <span class="nb">type</span><span class="o">=</span><span class="nb">int</span><span class="p">)</span>
<span class="n">parse</span><span class="p">.</span><span class="n">add_argument</span><span class="p">(</span><span class="s">'--ws'</span><span class="p">,</span> <span class="nb">type</span><span class="o">=</span><span class="nb">int</span><span class="p">)</span>
<span class="n">args</span> <span class="o">=</span> <span class="n">parse</span><span class="p">.</span><span class="n">parse_args</span><span class="p">()</span>

<span class="k">if</span> <span class="n">args</span><span class="p">.</span><span class="n">init_method</span> <span class="o">==</span> <span class="s">'TCP'</span><span class="p">:</span>
	<span class="n">dist</span><span class="p">.</span><span class="n">init_process_group</span><span class="p">(</span><span class="s">'nccl'</span><span class="p">,</span> <span class="n">init_method</span><span class="o">=</span><span class="s">'tcp://127.0.0.1:28765'</span><span class="p">,</span> <span class="n">rank</span><span class="o">=</span><span class="n">args</span><span class="p">.</span><span class="n">rank</span><span class="p">,</span> <span class="n">world_size</span><span class="o">=</span><span class="n">args</span><span class="p">.</span><span class="n">ws</span><span class="p">)</span>
<span class="k">elif</span> <span class="n">args</span><span class="p">.</span><span class="n">init_method</span> <span class="o">==</span> <span class="s">'ENV'</span><span class="p">:</span>
    <span class="n">dist</span><span class="p">.</span><span class="n">init_process_group</span><span class="p">(</span><span class="s">'nccl'</span><span class="p">,</span> <span class="n">init_method</span><span class="o">=</span><span class="s">'env://'</span><span class="p">)</span>

<span class="n">rank</span> <span class="o">=</span> <span class="n">dist</span><span class="p">.</span><span class="n">get_rank</span><span class="p">()</span>
<span class="k">print</span><span class="p">(</span><span class="sa">f</span><span class="s">"rank = </span><span class="si">{</span><span class="n">rank</span><span class="si">}</span><span class="s"> is initialized"</span><span class="p">)</span>
<span class="c1"># 单机多卡情况下,localrank = rank. 严谨应该是local_rank来设置device
</span><span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">set_device</span><span class="p">(</span><span class="n">rank</span><span class="p">)</span>
<span class="n">tensor</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">tensor</span><span class="p">([</span><span class="mi">1</span><span class="p">,</span> <span class="mi">2</span><span class="p">,</span> <span class="mi">3</span><span class="p">,</span> <span class="mi">4</span><span class="p">]).</span><span class="n">cuda</span><span class="p">()</span>
<span class="k">print</span><span class="p">(</span><span class="n">tensor</span><span class="p">)</span>
</code></pre></div></div>

<p>单机双卡机器上,开两个终端,同时运行命令</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c1"># TCP方法
</span><span class="n">python3</span> <span class="n">test_ddp</span><span class="p">.</span><span class="n">py</span> <span class="o">--</span><span class="n">init_method</span><span class="o">=</span><span class="n">TCP</span> <span class="o">--</span><span class="n">rank</span><span class="o">=</span><span class="mi">0</span> <span class="o">--</span><span class="n">ws</span><span class="o">=</span><span class="mi">2</span>
<span class="n">python3</span> <span class="n">test_ddp</span><span class="p">.</span><span class="n">py</span> <span class="o">--</span><span class="n">init_method</span><span class="o">=</span><span class="n">TCP</span> <span class="o">--</span><span class="n">rank</span><span class="o">=</span><span class="mi">1</span> <span class="o">--</span><span class="n">ws</span><span class="o">=</span><span class="mi">2</span>
<span class="c1"># ENV方法
</span><span class="n">MASTER_ADDR</span><span class="o">=</span><span class="s">'localhost'</span> <span class="n">MASTER_PORT</span><span class="o">=</span><span class="mi">28765</span> <span class="n">RANK</span><span class="o">=</span><span class="mi">0</span> <span class="n">WORLD_SIZE</span><span class="o">=</span><span class="mi">2</span> <span class="n">python3</span> <span class="n">test_gpu</span><span class="p">.</span><span class="n">py</span> <span class="o">--</span><span class="n">init_method</span><span class="o">=</span><span class="n">ENV</span>
<span class="n">MASTER_ADDR</span><span class="o">=</span><span class="s">'localhost'</span> <span class="n">MASTER_PORT</span><span class="o">=</span><span class="mi">28765</span> <span class="n">RANK</span><span class="o">=</span><span class="mi">1</span> <span class="n">WORLD_SIZE</span><span class="o">=</span><span class="mi">2</span> <span class="n">python3</span> <span class="n">test_gpu</span><span class="p">.</span><span class="n">py</span> <span class="o">--</span><span class="n">init_method</span><span class="o">=</span><span class="n">ENV</span>
</code></pre></div></div>

<p>如果已开启的进程数未达到 world_size,则所有进程会一直等待,直到全部进程都开始运行,可以得到输出如下:</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c1"># rank0 的终端:
</span><span class="n">rank</span> <span class="mi">0</span> <span class="ow">is</span> <span class="n">initialized</span>
<span class="n">tensor</span><span class="p">([</span><span class="mi">1</span><span class="p">,</span> <span class="mi">2</span><span class="p">,</span> <span class="mi">3</span><span class="p">,</span> <span class="mi">4</span><span class="p">],</span> <span class="n">device</span><span class="o">=</span><span class="s">'cuda:0'</span><span class="p">)</span>
<span class="c1"># rank1的终端
</span><span class="n">rank</span> <span class="mi">1</span> <span class="ow">is</span> <span class="n">initialized</span>
<span class="n">tensor</span><span class="p">([</span><span class="mi">1</span><span class="p">,</span> <span class="mi">2</span><span class="p">,</span> <span class="mi">3</span><span class="p">,</span> <span class="mi">4</span><span class="p">],</span> <span class="n">device</span><span class="o">=</span><span class="s">'cuda:1'</span><span class="p">)</span>
</code></pre></div></div>

<p>说明</p>
<ul>
  <li>初始化DDP时,给后端提供主进程的<strong>地址端口</strong>、本身<strong>RANK</strong>,以及<strong>进程数量</strong>即可。</li>
  <li>初始化完成后,可以执行很多分布式的函数,比如 dist.<code class="language-plaintext highlighter-rouge">get_rank</code>, dist.<code class="language-plaintext highlighter-rouge">all_gather</code> 等等。</li>
</ul>
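
<p>下面是上述集合通信函数的一个小示意(假设进程组已初始化、每个进程已绑定一块 GPU):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch
import torch.distributed as dist

rank = dist.get_rank()
world_size = dist.get_world_size()

t = torch.tensor([float(rank)]).cuda()
dist.all_reduce(t, op=dist.ReduceOp.SUM)   # 所有进程的 t 求和, 结果写回每个进程

buf = [torch.zeros(1).cuda() for _ in range(world_size)]
dist.all_gather(buf, torch.tensor([float(rank)]).cuda())   # 收集各进程的张量
print(f"rank={rank}, sum={t.item()}, gathered={[x.item() for x in buf]}")
</code></pre></div></div>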

<p><strong>new_group</strong></p>

<p>函数声明</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">torch</span><span class="p">.</span><span class="n">distributed</span><span class="p">.</span><span class="n">new_group</span><span class="p">(</span><span class="n">ranks</span><span class="o">=</span><span class="bp">None</span><span class="p">,</span> <span class="n">timeout</span><span class="o">=</span><span class="n">datetime</span><span class="p">.</span><span class="n">timedelta</span><span class="p">(</span><span class="mi">0</span><span class="p">,</span> <span class="mi">1800</span><span class="p">),</span> <span class="n">backend</span><span class="o">=</span><span class="bp">None</span><span class="p">)</span>
</code></pre></div></div>

<p>函数作用</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">new_group()</code> 函数可用于使用所有进程的任意子集来创建新组。其返回一个分组句柄,可作为 collectives 相关函数的 group 参数 。collectives 是分布式函数,用于特定编程模式中的信息交换。</li>
</ul>

<p>参数详解</p>
<ul>
  <li>ranks:指定新分组内的成员的 ranks 列表list ,其中每个元素为 int 型</li>
  <li>timeout:指定该分组进程组内的操作的超时时间
    <ul>
      <li>可选参数,datetime.timedelta 对象,默认为 30 分钟。该参数仅用于 Gloo 后端。</li>
    </ul>
  </li>
  <li>backend:指定要使用的通信后端
    <ul>
      <li>小写字符串,支持的通信后端有 gloo,nccl ,必须与 init_process_group() 中一致。</li>
    </ul>
  </li>
</ul>
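
<p>下面是 new_group 的一个使用示意(假设默认进程组已初始化、world_size 为 4,取 rank 0/1 组成子组做局部通信):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch
import torch.distributed as dist

# 假设默认进程组已初始化, world_size 为 4
group = dist.new_group(ranks=[0, 1])    # 所有进程都要调用 new_group, 即使自己不在子组内

t = torch.tensor([1.0]).cuda()
if dist.get_rank() in (0, 1):
    # 只有子组成员参与这次集合通信
    dist.all_reduce(t, op=dist.ReduceOp.SUM, group=group)
</code></pre></div></div>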

<p>其它函数</p>
<ul>
  <li>get_backend 获取进程组属性</li>
  <li>get_rank 获取分布式进程组组内的每个进程的唯一识别</li>
  <li>get_world_size  获取进程组内的进程数</li>
  <li>is_initialized 检查默认进程组是否被初始化</li>
  <li>is_mpi_available 检查 MPI 后端是否可用</li>
  <li>is_nccl_available 检查 NCCL 后端是否可用</li>
</ul>

<h5 id="1-tcp-初始化">(1) TCP 初始化</h5>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">torch.distributed</span> <span class="k">as</span> <span class="n">dist</span>

<span class="c1"># Use address of one of the machines
</span><span class="n">dist</span><span class="p">.</span><span class="n">init_process_group</span><span class="p">(</span><span class="n">backend</span><span class="p">,</span> <span class="n">init_method</span><span class="o">=</span><span class="s">'tcp://10.1.1.20:23456'</span><span class="p">,</span><span class="n">rank</span><span class="o">=</span><span class="n">args</span><span class="p">.</span><span class="n">rank</span><span class="p">,</span> <span class="n">world_size</span><span class="o">=</span><span class="mi">4</span><span class="p">)</span>
</code></pre></div></div>

<p>说明</p>
<ul>
  <li>不同进程内,均使用主进程的 ip 地址和 port,确保每个进程能够通过一个 master 进行协作。该 ip 一般为主进程所在的主机的 ip,端口号应该未被其他应用占用。</li>
  <li>实际使用时,在每个进程内运行代码,并需要为每一个进程手动指定一个 rank,进程可以分布与相同或不同主机上。</li>
  <li>多个进程之间,同步进行。若其中一个出现问题,其他的也马上停止。</li>
</ul>

<p>使用</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c1"># Node 1
</span><span class="n">python</span> <span class="n">mnsit</span><span class="p">.</span><span class="n">py</span> <span class="o">--</span><span class="n">init</span><span class="o">-</span><span class="n">method</span> <span class="n">tcp</span><span class="p">:</span><span class="o">//</span><span class="mf">192.168</span><span class="p">.</span><span class="mf">54.179</span><span class="p">:</span><span class="mi">22225</span> <span class="o">--</span><span class="n">rank</span> <span class="mi">0</span> <span class="o">--</span><span class="n">world</span><span class="o">-</span><span class="n">size</span> <span class="mi">2</span>
<span class="c1"># Node 2
</span><span class="n">python</span> <span class="n">mnsit</span><span class="p">.</span><span class="n">py</span> <span class="o">--</span><span class="n">init</span><span class="o">-</span><span class="n">method</span> <span class="n">tcp</span><span class="p">:</span><span class="o">//</span><span class="mf">192.168</span><span class="p">.</span><span class="mf">54.179</span><span class="p">:</span><span class="mi">22225</span> <span class="o">--</span><span class="n">rank</span> <span class="mi">1</span> <span class="o">--</span><span class="n">world</span><span class="o">-</span><span class="n">size</span> <span class="mi">2</span>
</code></pre></div></div>

<p>初始化示例</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">tcp_init.py</code></li>
</ul>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">torch.distributed</span> <span class="k">as</span> <span class="n">dist</span>
<span class="kn">import</span> <span class="nn">torch.utils.data.distributed</span>
<span class="c1"># ......
</span><span class="n">parser</span> <span class="o">=</span> <span class="n">argparse</span><span class="p">.</span><span class="n">ArgumentParser</span><span class="p">(</span><span class="n">description</span><span class="o">=</span><span class="s">'PyTorch distributed training on cifar-10'</span><span class="p">)</span>
<span class="n">parser</span><span class="p">.</span><span class="n">add_argument</span><span class="p">(</span><span class="s">'--rank'</span><span class="p">,</span> <span class="n">default</span><span class="o">=</span><span class="mi">0</span><span class="p">,</span> <span class="n">help</span><span class="o">=</span><span class="s">'rank of current process'</span><span class="p">)</span>
<span class="n">parser</span><span class="p">.</span><span class="n">add_argument</span><span class="p">(</span><span class="s">'--word_size'</span><span class="p">,</span> <span class="n">default</span><span class="o">=</span><span class="mi">2</span><span class="p">,</span><span class="n">help</span><span class="o">=</span><span class="s">"word size"</span><span class="p">)</span>
<span class="n">parser</span><span class="p">.</span><span class="n">add_argument</span><span class="p">(</span><span class="s">'--init_method'</span><span class="p">,</span> <span class="n">default</span><span class="o">=</span><span class="s">'tcp://127.0.0.1:23456'</span><span class="p">,</span> <span class="n">help</span><span class="o">=</span><span class="s">"init-method"</span><span class="p">)</span>
<span class="n">args</span> <span class="o">=</span> <span class="n">parser</span><span class="p">.</span><span class="n">parse_args</span><span class="p">()</span>
<span class="c1"># ......
</span><span class="n">dist</span><span class="p">.</span><span class="n">init_process_group</span><span class="p">(</span><span class="n">backend</span><span class="o">=</span><span class="s">'nccl'</span><span class="p">,</span> <span class="n">init_method</span><span class="o">=</span><span class="n">args</span><span class="p">.</span><span class="n">init_method</span><span class="p">,</span> <span class="n">rank</span><span class="o">=</span><span class="n">args</span><span class="p">.</span><span class="n">rank</span><span class="p">,</span> <span class="n">world_size</span><span class="o">=</span><span class="n">args</span><span class="p">.</span><span class="n">word_size</span><span class="p">)</span>
<span class="c1"># ......
</span><span class="n">trainset</span> <span class="o">=</span> <span class="n">torchvision</span><span class="p">.</span><span class="n">datasets</span><span class="p">.</span><span class="n">CIFAR10</span><span class="p">(</span><span class="n">root</span><span class="o">=</span><span class="s">'./data'</span><span class="p">,</span> <span class="n">train</span><span class="o">=</span><span class="bp">True</span><span class="p">,</span> <span class="n">download</span><span class="o">=</span><span class="n">download</span><span class="p">,</span> <span class="n">transform</span><span class="o">=</span><span class="n">transform</span><span class="p">)</span>
<span class="n">train_sampler</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">utils</span><span class="p">.</span><span class="n">data</span><span class="p">.</span><span class="n">distributed</span><span class="p">.</span><span class="n">DistributedSampler</span><span class="p">(</span><span class="n">trainset</span><span class="p">)</span>
<span class="n">trainloader</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">utils</span><span class="p">.</span><span class="n">data</span><span class="p">.</span><span class="n">DataLoader</span><span class="p">(</span><span class="n">trainset</span><span class="p">,</span> <span class="n">batch_size</span><span class="o">=</span><span class="n">batch_size</span><span class="p">,</span> <span class="n">sampler</span><span class="o">=</span><span class="n">train_sampler</span><span class="p">)</span>
<span class="c1"># ......
</span><span class="n">net</span> <span class="o">=</span> <span class="n">Net</span><span class="p">()</span>
<span class="n">net</span> <span class="o">=</span> <span class="n">net</span><span class="p">.</span><span class="n">cuda</span><span class="p">()</span>
<span class="n">net</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">nn</span><span class="p">.</span><span class="n">parallel</span><span class="p">.</span><span class="n">DistributedDataParallel</span><span class="p">(</span><span class="n">net</span><span class="p">)</span>
</code></pre></div></div>

<p>执行方式</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">init_method</code></li>
</ul>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c"># Node 1 : ip 192.168.1.201  port : 12345</span>
python tcp_init.py <span class="nt">--init_method</span> tcp://192.168.1.201:12345 <span class="nt">--rank</span> 0 <span class="nt">--word_size</span> 3
<span class="c"># Node 2 : </span>
python tcp_init.py <span class="nt">--init_method</span> tcp://192.168.1.201:12345 <span class="nt">--rank</span> 1 <span class="nt">--word_size</span> 3
<span class="c"># Node 3 : </span>
python tcp_init.py <span class="nt">--init_method</span> tcp://192.168.1.201:12345 <span class="nt">--rank</span> 2 <span class="nt">--word_size</span> 3
</code></pre></div></div>

<p>说明</p>
<ul>
  <li>TCP 方式中,<code class="language-plaintext highlighter-rouge">init_process_group</code> 中必须手动指定以下参数
    <ul>
      <li><code class="language-plaintext highlighter-rouge">rank</code> 为当前进程的进程号</li>
      <li><code class="language-plaintext highlighter-rouge">world_size</code> 为当前 job 总进程数</li>
      <li><code class="language-plaintext highlighter-rouge">init_method</code> 内指定 <strong>tcp 模式</strong>,且所有进程的 <code class="language-plaintext highlighter-rouge">ip:port</code> 必须一致,设定为主进程的 <code class="language-plaintext highlighter-rouge">ip:port</code></li>
    </ul>
  </li>
  <li>必须在 rank==0 的进程内保存参数。</li>
  <li>若程序内未根据 rank 设定当前进程使用的 GPUs,则默认使用<strong>全部 GPU</strong>,且以<strong>数据并行</strong>方式使用。</li>
  <li>每条命令表示一个进程。若已开启的进程未达到 world_size 的数量,则所有进程会一直等待</li>
  <li>每台主机上可以开启多个进程。但是,若未为每个进程分配合适的 GPU,则同机不同进程可能会共用 GPU,应该坚决避免这种情况。</li>
  <li>使用 gloo 后端进行 GPU 训练时,会报错。</li>
  <li>若每个进程负责多块 GPU,可以利用多 GPU 进行模型并行。</li>
</ul>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">class</span> <span class="nc">ToyMpModel</span><span class="p">(</span><span class="n">nn</span><span class="p">.</span><span class="n">Module</span><span class="p">):</span>
    <span class="k">def</span> <span class="nf">init</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">dev0</span><span class="p">,</span> <span class="n">dev1</span><span class="p">):</span>
        <span class="nb">super</span><span class="p">(</span><span class="n">ToyMpModel</span><span class="p">,</span> <span class="bp">self</span><span class="p">).</span><span class="n">init</span><span class="p">()</span>
        <span class="bp">self</span><span class="p">.</span><span class="n">dev0</span> <span class="o">=</span> <span class="n">dev0</span>
        <span class="bp">self</span><span class="p">.</span><span class="n">dev1</span> <span class="o">=</span> <span class="n">dev1</span>
        <span class="bp">self</span><span class="p">.</span><span class="n">net1</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">nn</span><span class="p">.</span><span class="n">Linear</span><span class="p">(</span><span class="mi">10</span><span class="p">,</span> <span class="mi">10</span><span class="p">).</span><span class="n">to</span><span class="p">(</span><span class="n">dev0</span><span class="p">)</span>
        <span class="bp">self</span><span class="p">.</span><span class="n">relu</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">nn</span><span class="p">.</span><span class="n">ReLU</span><span class="p">()</span>
        <span class="bp">self</span><span class="p">.</span><span class="n">net2</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">nn</span><span class="p">.</span><span class="n">Linear</span><span class="p">(</span><span class="mi">10</span><span class="p">,</span> <span class="mi">5</span><span class="p">).</span><span class="n">to</span><span class="p">(</span><span class="n">dev1</span><span class="p">)</span>

<span class="k">def</span> <span class="nf">forward</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">x</span><span class="p">):</span>
       <span class="n">x</span> <span class="o">=</span> <span class="n">x</span><span class="p">.</span><span class="n">to</span><span class="p">(</span><span class="bp">self</span><span class="p">.</span><span class="n">dev0</span><span class="p">)</span>
       <span class="n">x</span> <span class="o">=</span> <span class="bp">self</span><span class="p">.</span><span class="n">relu</span><span class="p">(</span><span class="bp">self</span><span class="p">.</span><span class="n">net1</span><span class="p">(</span><span class="n">x</span><span class="p">))</span>
       <span class="n">x</span> <span class="o">=</span> <span class="n">x</span><span class="p">.</span><span class="n">to</span><span class="p">(</span><span class="bp">self</span><span class="p">.</span><span class="n">dev1</span><span class="p">)</span>
       <span class="k">return</span> <span class="bp">self</span><span class="p">.</span><span class="n">net2</span><span class="p">(</span><span class="n">x</span><span class="p">)</span>
<span class="c1"># ......
</span><span class="n">dev0</span> <span class="o">=</span> <span class="n">rank</span> <span class="o">*</span> <span class="mi">2</span>
<span class="n">dev1</span> <span class="o">=</span> <span class="n">rank</span> <span class="o">*</span> <span class="mi">2</span> <span class="o">+</span> <span class="mi">1</span>
<span class="n">mp_model</span> <span class="o">=</span> <span class="n">ToyMpModel</span><span class="p">(</span><span class="n">dev0</span><span class="p">,</span> <span class="n">dev1</span><span class="p">)</span>
<span class="n">ddp_mp_model</span> <span class="o">=</span> <span class="n">DDP</span><span class="p">(</span><span class="n">mp_model</span><span class="p">)</span>
<span class="c1"># ......
</span></code></pre></div></div>

<h5 id="2-共享文件初始化">(2) 共享文件初始化</h5>

<p>共享的文件对于组内所有进程可见</p>

<p>设置方式如下:</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">torch.distributed</span> <span class="k">as</span> <span class="n">dist</span>

<span class="c1"># rank should always be specified
</span><span class="n">dist</span><span class="p">.</span><span class="n">init_process_group</span><span class="p">(</span><span class="n">backend</span><span class="p">,</span> <span class="n">init_method</span><span class="o">=</span><span class="s">'file:///mnt/nfs/sharedfile'</span><span class="p">,</span>
                        <span class="n">world_size</span><span class="o">=</span><span class="mi">4</span><span class="p">,</span> <span class="n">rank</span><span class="o">=</span><span class="n">args</span><span class="p">.</span><span class="n">rank</span><span class="p">)</span>
</code></pre></div></div>

<p>说明</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">file://</code>前缀表示以文件系统方式初始化。</li>
  <li><code class="language-plaintext highlighter-rouge">/mnt/nfs/sharedfile</code> 表示共享文件,各个进程在共享文件系统中通过该文件交换信息、完成同步。</li>
</ul>

<p>因此,所有进程必须对该文件具有读写权限。</p>
<ul>
  <li>每一个进程将会打开这个文件,写入自己的信息,并等待直到其他所有进程完成该操作</li>
  <li>在此之后,所有的请求信息将会被所有的进程可访问,为了避免 race conditions,文件系统必须支持通过 fcntl 锁定(大多数的 local 系统和 NFS 均支持该特性)。</li>
</ul>

<p>说明:</p>
<ul>
  <li>若指定为同一文件,则每次训练开始之前,该文件必须手动删除,但是文件所在路径必须存在!</li>
</ul>

<p>与 tcp 初始化方式一样,也需要为每一个进程手动指定 rank。</p>

<p>使用</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c1"># 主机 01 上:
</span><span class="n">python</span> <span class="n">mnsit</span><span class="p">.</span><span class="n">py</span> <span class="o">--</span><span class="n">init</span><span class="o">-</span><span class="n">method</span> <span class="nb">file</span><span class="p">:</span><span class="o">//</span><span class="n">PathToShareFile</span><span class="o">/</span><span class="n">MultiNode</span> <span class="o">--</span><span class="n">rank</span> <span class="mi">0</span> <span class="o">--</span><span class="n">world</span><span class="o">-</span><span class="n">size</span> <span class="mi">2</span>
<span class="c1"># 主机 02 上:
</span><span class="n">python</span> <span class="n">mnsit</span><span class="p">.</span><span class="n">py</span> <span class="o">--</span><span class="n">init</span><span class="o">-</span><span class="n">method</span> <span class="nb">file</span><span class="p">:</span><span class="o">//</span><span class="n">PathToShareFile</span><span class="o">/</span><span class="n">MultiNode</span> <span class="o">--</span><span class="n">rank</span> <span class="mi">1</span> <span class="o">--</span><span class="n">world</span><span class="o">-</span><span class="n">size</span> <span class="mi">2</span>
</code></pre></div></div>

<p>相比于 TCP 方式, 麻烦一点的是运行完一次必须更换共享的文件名,或者删除之前的共享文件,不然第二次运行会报错。</p>

<h5 id="3-env-初始化默认">(3) Env 初始化(默认)</h5>

<p>默认情况下都是环境变量来进行分布式通信,指定 <code class="language-plaintext highlighter-rouge">init_method="env://"</code>。</p>

<p>通过在所有机器上设置如下四个环境变量,所有进程将会适当的连接到 master,获取其他进程的信息,并最终与它们握手(信号)。</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">MASTER_PORT</code>: 必须指定,表示 rank0上机器的一个空闲端口(必须设置)</li>
  <li><code class="language-plaintext highlighter-rouge">MASTER_ADDR</code>: 必须指定,表示主进程 rank0 所在机器的地址;除 rank0 所在主机外,其它主机都必须设置为 rank0 机器的地址(rank0 本机可设为 localhost)</li>
  <li><code class="language-plaintext highlighter-rouge">WORLD_SIZE</code>: 可选,总进程数,可以这里指定,在 init 函数中也可以指定</li>
  <li><code class="language-plaintext highlighter-rouge">RANK</code>: 可选,当前进程的 rank,也可以在 init 函数中指定</li>
</ul>

<p>配合 torch.distributed.launch 使用。</p>

<p>实例</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c"># Node 1: (IP: 192.168.1.1, and has a free port: 1234)</span>
python <span class="nt">-m</span> torch.distributed.launch <span class="nt">--nproc_per_node</span><span class="o">=</span>NUM_GPUS_YOU_HAVE
           <span class="nt">--nnodes</span><span class="o">=</span>2 <span class="nt">--node_rank</span><span class="o">=</span>0 <span class="nt">--master_addr</span><span class="o">=</span><span class="s2">"192.168.1.1"</span>
           <span class="nt">--master_port</span><span class="o">=</span>1234 YOUR_TRAINING_SCRIPT.py <span class="o">(</span><span class="nt">--arg1</span> <span class="nt">--arg2</span> <span class="nt">--arg3</span>
           and all other arguments of your training script<span class="o">)</span>
<span class="c"># Node 2</span>
python <span class="nt">-m</span> torch.distributed.launch <span class="nt">--nproc_per_node</span><span class="o">=</span>NUM_GPUS_YOU_HAVE
           <span class="nt">--nnodes</span><span class="o">=</span>2 <span class="nt">--node_rank</span><span class="o">=</span>1 <span class="nt">--master_addr</span><span class="o">=</span><span class="s2">"192.168.1.1"</span>
           <span class="nt">--master_port</span><span class="o">=</span>1234 YOUR_TRAINING_SCRIPT.py <span class="o">(</span><span class="nt">--arg1</span> <span class="nt">--arg2</span> <span class="nt">--arg3</span>
           and all other arguments of your training script<span class="o">)</span>
</code></pre></div></div>

<p>代码 <code class="language-plaintext highlighter-rouge">env_init.py</code></p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">torch.distributed</span> <span class="k">as</span> <span class="n">dist</span>
<span class="kn">import</span> <span class="nn">torch.utils.data.distributed</span>

<span class="c1"># ......
</span><span class="kn">import</span> <span class="nn">argparse</span>
<span class="n">parser</span> <span class="o">=</span> <span class="n">argparse</span><span class="p">.</span><span class="n">ArgumentParser</span><span class="p">()</span>
<span class="c1"># 注意这个参数,必须要以这种形式指定,即使代码中不使用。因为 launch 工具默认传递该参数
</span><span class="n">parser</span><span class="p">.</span><span class="n">add_argument</span><span class="p">(</span><span class="s">"--local_rank"</span><span class="p">,</span> <span class="nb">type</span><span class="o">=</span><span class="nb">int</span><span class="p">)</span>
<span class="n">args</span> <span class="o">=</span> <span class="n">parser</span><span class="p">.</span><span class="n">parse_args</span><span class="p">()</span>

<span class="c1"># ......
</span><span class="n">dist</span><span class="p">.</span><span class="n">init_process_group</span><span class="p">(</span><span class="n">backend</span><span class="o">=</span><span class="s">'nccl'</span><span class="p">,</span> <span class="n">init_method</span><span class="o">=</span><span class="s">'env://'</span><span class="p">)</span>

<span class="c1"># ......
</span><span class="n">trainset</span> <span class="o">=</span> <span class="n">torchvision</span><span class="p">.</span><span class="n">datasets</span><span class="p">.</span><span class="n">CIFAR10</span><span class="p">(</span><span class="n">root</span><span class="o">=</span><span class="s">'./data'</span><span class="p">,</span> <span class="n">train</span><span class="o">=</span><span class="bp">True</span><span class="p">,</span> <span class="n">download</span><span class="o">=</span><span class="n">download</span><span class="p">,</span> <span class="n">transform</span><span class="o">=</span><span class="n">transform</span><span class="p">)</span>
<span class="n">train_sampler</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">utils</span><span class="p">.</span><span class="n">data</span><span class="p">.</span><span class="n">distributed</span><span class="p">.</span><span class="n">DistributedSampler</span><span class="p">(</span><span class="n">trainset</span><span class="p">)</span>
<span class="n">trainloader</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">utils</span><span class="p">.</span><span class="n">data</span><span class="p">.</span><span class="n">DataLoader</span><span class="p">(</span><span class="n">trainset</span><span class="p">,</span> <span class="n">batch_size</span><span class="o">=</span><span class="n">batch_size</span><span class="p">,</span> <span class="n">sampler</span><span class="o">=</span><span class="n">train_sampler</span><span class="p">)</span>

<span class="c1"># ......
# 根据 local_rank,配置当前进程使用的 GPU
</span><span class="n">net</span> <span class="o">=</span> <span class="n">Net</span><span class="p">()</span>
<span class="n">device</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="s">'cuda'</span><span class="p">,</span> <span class="n">args</span><span class="p">.</span><span class="n">local_rank</span><span class="p">)</span>
<span class="n">net</span> <span class="o">=</span> <span class="n">net</span><span class="p">.</span><span class="n">to</span><span class="p">(</span><span class="n">device</span><span class="p">)</span>
<span class="n">net</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">nn</span><span class="p">.</span><span class="n">parallel</span><span class="p">.</span><span class="n">DistributedDataParallel</span><span class="p">(</span><span class="n">net</span><span class="p">,</span> <span class="n">device_ids</span><span class="o">=</span><span class="p">[</span><span class="n">args</span><span class="p">.</span><span class="n">local_rank</span><span class="p">],</span> <span class="n">output_device</span><span class="o">=</span><span class="n">args</span><span class="p">.</span><span class="n">local_rank</span><span class="p">)</span>
</code></pre></div></div>

<p>执行方式</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c1"># 节点0
</span><span class="n">python</span> <span class="o">-</span><span class="n">m</span> <span class="n">torch</span><span class="p">.</span><span class="n">distributed</span><span class="p">.</span><span class="n">launch</span> <span class="o">--</span><span class="n">nproc_per_node</span><span class="o">=</span><span class="mi">2</span> <span class="o">--</span><span class="n">nnodes</span><span class="o">=</span><span class="mi">3</span> <span class="o">--</span><span class="n">node_rank</span><span class="o">=</span><span class="mi">0</span> <span class="o">--</span><span class="n">master_addr</span><span class="o">=</span><span class="s">"192.168.1.201"</span> <span class="o">--</span><span class="n">master_port</span><span class="o">=</span><span class="mi">23456</span> <span class="n">env_init</span><span class="p">.</span><span class="n">py</span>
<span class="c1"># 节点1
</span><span class="n">python</span> <span class="o">-</span><span class="n">m</span> <span class="n">torch</span><span class="p">.</span><span class="n">distributed</span><span class="p">.</span><span class="n">launch</span> <span class="o">--</span><span class="n">nproc_per_node</span><span class="o">=</span><span class="mi">2</span> <span class="o">--</span><span class="n">nnodes</span><span class="o">=</span><span class="mi">3</span> <span class="o">--</span><span class="n">node_rank</span><span class="o">=</span><span class="mi">1</span> <span class="o">--</span><span class="n">master_addr</span><span class="o">=</span><span class="s">"192.168.1.201"</span> <span class="o">--</span><span class="n">master_port</span><span class="o">=</span><span class="mi">23456</span> <span class="n">env_init</span><span class="p">.</span><span class="n">py</span>
<span class="c1"># 节点2
</span><span class="n">python</span> <span class="o">-</span><span class="n">m</span> <span class="n">torch</span><span class="p">.</span><span class="n">distributed</span><span class="p">.</span><span class="n">launch</span> <span class="o">--</span><span class="n">nproc_per_node</span><span class="o">=</span><span class="mi">2</span> <span class="o">--</span><span class="n">nnodes</span><span class="o">=</span><span class="mi">3</span> <span class="o">--</span><span class="n">node_rank</span><span class="o">=</span><span class="mi">2</span> <span class="o">--</span><span class="n">master_addr</span><span class="o">=</span><span class="s">"192.168.1.201"</span> <span class="o">--</span><span class="n">master_port</span><span class="o">=</span><span class="mi">23456</span> <span class="n">env_init</span><span class="p">.</span><span class="n">py</span>
</code></pre></div></div>

<p>说明</p>
<ul>
  <li>Env 方式中,<code class="language-plaintext highlighter-rouge">init_process_group</code> 无需手动指定 rank、world_size 等参数,相关信息从环境变量中读取</li>
  <li>必须在 <code class="language-plaintext highlighter-rouge">rank==0</code> 的进程内保存参数。</li>
</ul>

<p>该方式使用 <code class="language-plaintext highlighter-rouge">torch.distributed.launch</code> 在每台主机上创建<strong>多进程</strong>,其中:</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">nproc_per_node</code> 参数指定为当前主机创建的进程数。一般设定为当前主机的 GPU 数量</li>
  <li><code class="language-plaintext highlighter-rouge">nnodes</code> 参数指定当前 job 包含多少个节点</li>
  <li><code class="language-plaintext highlighter-rouge">node_rank</code> 指定当前节点(机器)的序号</li>
  <li><code class="language-plaintext highlighter-rouge">master_addr</code> 和 <code class="language-plaintext highlighter-rouge">master_port</code> 分别指定 master 节点的 ip:port</li>
  <li>若没有为每个进程合理分配 GPU,则默认使用当前主机上所有的 GPU。即使一台主机上有多个进程,也会共用 GPU。</li>
  <li>使用 <code class="language-plaintext highlighter-rouge">torch.distributed.launch</code> 工具时,为当前主机创建 <code class="language-plaintext highlighter-rouge">nproc_per_node</code> 个进程,每个进程独立执行训练脚本。同时,它还会为每个进程分配一个 <code class="language-plaintext highlighter-rouge">local_rank</code> 参数,表示当前进程在当前主机上的编号。
    <ul>
      <li>例如:<code class="language-plaintext highlighter-rouge">rank=2</code>, <code class="language-plaintext highlighter-rouge">local_rank=0</code> 表示全局第 3 个进程,是其所在节点上的第 1 个进程。</li>
    </ul>
  </li>
  <li>需要合理利用 <code class="language-plaintext highlighter-rouge">local_rank</code> 参数,来合理分配本地的 GPU 资源</li>
  <li>每条命令表示一个进程。若已开启的进程未达到 <code class="language-plaintext highlighter-rouge">world_size</code> 数量,则所有进程会一直等待</li>
</ul>

<p>详见: <a href="https://zhuanlan.zhihu.com/p/76638962">Pytorch 分布式训练</a></p>

<h4 id="gpu-启动方式">GPU 启动方式</h4>

<p>常见的GPU 启动方式</p>
<ul>
  <li>torch.<code class="language-plaintext highlighter-rouge">multiprocessing</code>: 容易控制, 更加灵活</li>
  <li>torch.<code class="language-plaintext highlighter-rouge">distributed.launch</code>: 代码量少, 启动速度快</li>
  <li><code class="language-plaintext highlighter-rouge">torchrun</code>: <code class="language-plaintext highlighter-rouge">distributed.launch</code> 的进化版, 代码量更少</li>
  <li>Slurm Workload Manager: 通过 slurm 集群调度系统启动</li>
</ul>
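
<p>其中 <code class="language-plaintext highlighter-rouge">torchrun</code> 的参数与 <code class="language-plaintext highlighter-rouge">torch.distributed.launch</code> 基本一致,单机与多机命令示意如下(地址、端口与脚本名均为假设值):</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code># 单机 4 卡
torchrun --nproc_per_node=4 train.py
# 2 机 8 卡: 在 node_rank=0 / node_rank=1 的机器上分别执行
torchrun --nnodes=2 --node_rank=0 --nproc_per_node=4 --master_addr="192.168.1.1" --master_port=29500 train.py
torchrun --nnodes=2 --node_rank=1 --nproc_per_node=4 --master_addr="192.168.1.1" --master_port=29500 train.py
</code></pre></div></div>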

<p>DDP 本身是一个 python <strong>多进程</strong>,完全可以直接通过<strong>多进程</strong>方式来启动分布式程序。</p>

<p>torch 提供<strong>两种</strong>启动工具运行torch DDP程序。</p>
<ul>
  <li>torch.multiprocessing</li>
  <li>launch/run</li>
</ul>

<h5 id="1-mpspawn">(1) mp.spawn</h5>

<p>用 torch.<code class="language-plaintext highlighter-rouge">multiprocessing</code>(python multiprocessing的封装类) 自动生成多个进程</p>

<p>基本的调用函数 spawn:</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">mp</span><span class="p">.</span><span class="n">spawn</span><span class="p">(</span><span class="n">fn</span><span class="p">,</span> <span class="n">args</span><span class="o">=</span><span class="p">(),</span> <span class="n">nprocs</span><span class="o">=</span><span class="mi">1</span><span class="p">,</span> <span class="n">join</span><span class="o">=</span><span class="bp">True</span><span class="p">,</span> <span class="n">daemon</span><span class="o">=</span><span class="bp">False</span><span class="p">)</span>
</code></pre></div></div>

<p>其中:</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">fn</code>: 进程<strong>入口函数</strong>,其第一个参数会被自动填入当前进程的 rank, 即实际调用形式为 fn(rank, *args)</li>
  <li><code class="language-plaintext highlighter-rouge">nprocs</code>: <strong>进程数量</strong>,即:world_size</li>
  <li><code class="language-plaintext highlighter-rouge">args</code>: 函数fn的其他常规参数以tuple形式传递</li>
</ul>

<p>示例</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">torch</span>
<span class="kn">import</span> <span class="nn">torch.distributed</span> <span class="k">as</span> <span class="n">dist</span>
<span class="kn">import</span> <span class="nn">torch.multiprocessing</span> <span class="k">as</span> <span class="n">mp</span>

<span class="k">def</span> <span class="nf">fn</span><span class="p">(</span><span class="n">rank</span><span class="p">,</span> <span class="n">ws</span><span class="p">,</span> <span class="n">nums</span><span class="p">):</span>
    <span class="n">dist</span><span class="p">.</span><span class="n">init_process_group</span><span class="p">(</span><span class="s">'nccl'</span><span class="p">,</span> <span class="n">init_method</span><span class="o">=</span><span class="s">'tcp://127.0.0.1:28765'</span><span class="p">,</span> <span class="n">rank</span><span class="o">=</span><span class="n">rank</span><span class="p">,</span> <span class="n">world_size</span><span class="o">=</span><span class="n">ws</span><span class="p">)</span>
    <span class="n">rank</span> <span class="o">=</span> <span class="n">dist</span><span class="p">.</span><span class="n">get_rank</span><span class="p">()</span>
    <span class="k">print</span><span class="p">(</span><span class="sa">f</span><span class="s">"rank = </span><span class="si">{</span><span class="n">rank</span><span class="si">}</span><span class="s"> is initialized"</span><span class="p">)</span>
    <span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">set_device</span><span class="p">(</span><span class="n">rank</span><span class="p">)</span>
    <span class="n">tensor</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">tensor</span><span class="p">(</span><span class="n">nums</span><span class="p">).</span><span class="n">cuda</span><span class="p">()</span>
    <span class="k">print</span><span class="p">(</span><span class="n">tensor</span><span class="p">)</span>

<span class="k">if</span> <span class="n">__name__</span> <span class="o">==</span> <span class="s">"__main__"</span><span class="p">:</span>
    <span class="n">ws</span> <span class="o">=</span> <span class="mi">2</span>
    <span class="n">mp</span><span class="p">.</span><span class="n">spawn</span><span class="p">(</span><span class="n">fn</span><span class="p">,</span> <span class="n">nprocs</span><span class="o">=</span><span class="n">ws</span><span class="p">,</span> <span class="n">args</span><span class="o">=</span><span class="p">(</span><span class="n">ws</span><span class="p">,</span> <span class="p">[</span><span class="mi">1</span><span class="p">,</span> <span class="mi">2</span><span class="p">,</span> <span class="mi">3</span><span class="p">,</span> <span class="mi">4</span><span class="p">]))</span>
</code></pre></div></div>

<p>命令</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>python3 test_ddp.py
</code></pre></div></div>

<p>输出如下:</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>rank <span class="o">=</span> 0 is initialized
rank <span class="o">=</span> 1 is initialized
tensor<span class="o">([</span>1, 2, 3, 4], <span class="nv">device</span><span class="o">=</span><span class="s1">'cuda:1'</span><span class="o">)</span>
tensor<span class="o">([</span>1, 2, 3, 4], <span class="nv">device</span><span class="o">=</span><span class="s1">'cuda:0'</span><span class="o">)</span>
</code></pre></div></div>

<p>这种方式同时适用于 TCP 和 ENV 初始化。</p>

<h5 id="2-launchrun">(2) launch/run</h5>

<p>torch 提供的 <code class="language-plaintext highlighter-rouge">torch.distributed.launch</code> 工具,以模块形式直接执行:</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>python3 <span class="nt">-m</span> torch.distributed.launch <span class="nt">--</span>配置 train.py <span class="nt">--args</span>参数
</code></pre></div></div>

<p>常用配置有:</p>
<ul>
  <li>--<code class="language-plaintext highlighter-rouge">nnodes</code>: 使用的机器(节点)数量,单机时默认为1</li>
  <li>--<code class="language-plaintext highlighter-rouge">nproc_per_node</code>: 单机上的进程数,即单机的 world_size</li>
  <li>--<code class="language-plaintext highlighter-rouge">master_addr</code>/<code class="language-plaintext highlighter-rouge">port</code>: 主进程 rank0 所在节点的地址和端口</li>
  <li>--<code class="language-plaintext highlighter-rouge">node_rank</code>: 当前节点(机器)的编号,注意不是进程的 rank</li>
</ul>

<p>单机情况下</p>
<ul>
  <li>只有 --<code class="language-plaintext highlighter-rouge">nproc_per_node</code> 是必须指定的</li>
  <li>--<code class="language-plaintext highlighter-rouge">master_addr</code>/<code class="language-plaintext highlighter-rouge">port</code> 和 <code class="language-plaintext highlighter-rouge">node_rank</code> 都可以由 launch 通过环境自动配置</li>
</ul>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">mport</span> <span class="n">torch</span>
<span class="kn">import</span> <span class="nn">torch.distributed</span> <span class="k">as</span> <span class="n">dist</span>
<span class="kn">import</span> <span class="nn">torch.multiprocessing</span> <span class="k">as</span> <span class="n">mp</span>
<span class="kn">import</span> <span class="nn">os</span>

<span class="n">dist</span><span class="p">.</span><span class="n">init_process_group</span><span class="p">(</span><span class="s">'nccl'</span><span class="p">,</span> <span class="n">init_method</span><span class="o">=</span><span class="s">'env://'</span><span class="p">)</span>

<span class="n">rank</span> <span class="o">=</span> <span class="n">dist</span><span class="p">.</span><span class="n">get_rank</span><span class="p">()</span>
<span class="n">local_rank</span> <span class="o">=</span> <span class="n">os</span><span class="p">.</span><span class="n">environ</span><span class="p">[</span><span class="s">'LOCAL_RANK'</span><span class="p">]</span>
<span class="n">master_addr</span> <span class="o">=</span> <span class="n">os</span><span class="p">.</span><span class="n">environ</span><span class="p">[</span><span class="s">'MASTER_ADDR'</span><span class="p">]</span>
<span class="n">master_port</span> <span class="o">=</span> <span class="n">os</span><span class="p">.</span><span class="n">environ</span><span class="p">[</span><span class="s">'MASTER_PORT'</span><span class="p">]</span>
<span class="k">print</span><span class="p">(</span><span class="sa">f</span><span class="s">"rank = </span><span class="si">{</span><span class="n">rank</span><span class="si">}</span><span class="s"> is initialized in </span><span class="si">{</span><span class="n">master_addr</span><span class="si">}</span><span class="s">:</span><span class="si">{</span><span class="n">master_port</span><span class="si">}</span><span class="s">; local_rank = </span><span class="si">{</span><span class="n">local_rank</span><span class="si">}</span><span class="s">"</span><span class="p">)</span>
<span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">set_device</span><span class="p">(</span><span class="n">rank</span><span class="p">)</span>
<span class="n">tensor</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">tensor</span><span class="p">([</span><span class="mi">1</span><span class="p">,</span> <span class="mi">2</span><span class="p">,</span> <span class="mi">3</span><span class="p">,</span> <span class="mi">4</span><span class="p">]).</span><span class="n">cuda</span><span class="p">()</span>
<span class="k">print</span><span class="p">(</span><span class="n">tensor</span><span class="p">)</span>
</code></pre></div></div>

<p>启动命令</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>python3 <span class="nt">-m</span> torch.distribued.launch <span class="nt">--nproc_per_node</span><span class="o">=</span>2 test_ddp.py
</code></pre></div></div>

<p>输出如下:</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>rank <span class="o">=</span> 0 is initialized <span class="k">in </span>127.0.0.1:29500<span class="p">;</span> local_rank <span class="o">=</span> 0
rank <span class="o">=</span> 1 is initialized <span class="k">in </span>127.0.0.1:29500<span class="p">;</span> local_rank <span class="o">=</span> 1
tensor<span class="o">([</span>1, 2, 3, 4], <span class="nv">device</span><span class="o">=</span><span class="s1">'cuda:1'</span><span class="o">)</span>
tensor<span class="o">([</span>1, 2, 3, 4], <span class="nv">device</span><span class="o">=</span><span class="s1">'cuda:0'</span><span class="o">)</span>
</code></pre></div></div>

<h5 id="3-torchrun">(3) torchrun</h5>

<p>torch 1.10 开始用终端命令 <code class="language-plaintext highlighter-rouge">torchrun</code> 来代替 <code class="language-plaintext highlighter-rouge">torch.distributed.launch</code></p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">torchrun</code> 实现了 launch 的一个<strong>超集</strong></li>
</ul>

<p>不同:</p>
<ul>
  <li>完全使用环境变量配置各类参数,如 RANK、LOCAL_RANK、WORLD_SIZE 等,尤其是 local_rank 不再通过命令行参数传递</li>
  <li>更加优雅地处理某个 worker 失败的情况,自动重启 worker。
    <ul>
      <li>需要代码中实现 load_checkpoint(path) 和 save_checkpoint(path):有 worker 失败时,可以加载最新的 checkpoint,重启所有 worker 接着训练(写法见本小节末尾的示意代码)。</li>
    </ul>
  </li>
  <li>训练节点数目可以<strong>弹性</strong>变化。</li>
</ul>

<p>上面基于 env 初始化的代码用 torchrun 直接启动即可,不用再写冗长的命令:</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>torchrun <span class="nt">--nproc_per_node</span><span class="o">=</span>2 test_gpu.py
</code></pre></div></div>

<p>注意</p>
<ul>
  <li>torchrun 或者 launch 对上面的 <code class="language-plaintext highlighter-rouge">ENV</code> 初始化方法支持最完善,<code class="language-plaintext highlighter-rouge">TCP</code> 初始化方法可能会出现问题,尽量使用 env 方式初始化 dist。</li>
</ul>
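
<p>配合 torchrun 的容错/弹性机制,可以在训练循环里加上 checkpoint 的保存与恢复。下面是一个最小示意(save_checkpoint/load_checkpoint 为自定义示意函数,并非 torch 内置 API;路径与训练函数名均为假设):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import os
import torch
import torch.distributed as dist

CKPT_PATH = "ckpt.pt"   # 假设的 checkpoint 路径,仅作示意

def save_checkpoint(model, optimizer, epoch, path=CKPT_PATH):
    # 只在 rank0 保存,避免多进程同时写盘;barrier 保证其他进程等待写盘完成
    if dist.get_rank() == 0:
        torch.save({"model": model.state_dict(),
                    "optim": optimizer.state_dict(),
                    "epoch": epoch}, path)
    dist.barrier()

def load_checkpoint(model, optimizer, path=CKPT_PATH):
    # worker 被 torchrun 重启后,从最新 checkpoint 恢复,返回应继续训练的 epoch
    if not os.path.exists(path):
        return 0
    ckpt = torch.load(path, map_location="cpu")
    model.load_state_dict(ckpt["model"])
    optimizer.load_state_dict(ckpt["optim"])
    return ckpt["epoch"] + 1

# 训练主循环示意(model、optimizer、train_loader、num_epochs 均为假设已定义):
# start_epoch = load_checkpoint(model, optimizer)
# for epoch in range(start_epoch, num_epochs):
#     train_one_epoch(model, optimizer, train_loader)   # 假设的单轮训练函数
#     save_checkpoint(model, optimizer, epoch)
</code></pre></div></div>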

<h4 id="torchdistributed-使用">torch.distributed 使用</h4>

<p>使用方式(单机多卡环境)</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c1"># 启动方式,shell中运行:
</span><span class="n">python</span> <span class="o">-</span><span class="n">m</span> <span class="n">torch</span><span class="p">.</span><span class="n">distributed</span><span class="p">.</span><span class="n">launch</span> <span class="o">--</span><span class="n">nnodes</span> <span class="mi">1</span> <span class="o">--</span><span class="n">nproc_per_node</span><span class="o">=</span><span class="mi">4</span>  <span class="n">YourScript</span><span class="p">.</span><span class="n">py</span>
<span class="c1"># nnodes: 表示有多少个节点,可以通俗的理解为有多少台机器
# nproc_per_node 表示每个节点上有多少个进程,每个进程一般独占一块GPU
########################## 	第1步	 ##########################
#初始化
</span><span class="s">'''
在启动器为我们启动python脚本后,在执行过程中,启动器会将当前进程的(其实就是 GPU的)index 通过参数传递给 python,
我们可以这样获得当前进程的 index:即通过参数 local_rank 来告诉我们当前进程使用的是哪个GPU,
用于我们在每个进程中指定不同的device
'''</span>
<span class="n">parse</span><span class="p">.</span><span class="n">add_argument</span><span class="p">(</span><span class="s">'--local_rank'</span><span class="p">,</span><span class="nb">type</span><span class="o">=</span><span class="nb">int</span><span class="p">)</span>
<span class="n">args</span><span class="o">=</span><span class="n">parser</span><span class="p">.</span><span class="n">parse_args</span><span class="p">()</span>
<span class="n">local_rank</span><span class="o">=</span><span class="n">args</span><span class="p">.</span><span class="n">local_rank</span>
<span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">set_device</span><span class="p">(</span><span class="n">local_rank</span><span class="p">)</span>
<span class="s">'''
init_process_group用于初始化GPU通信方式(NCCL)和参数的获取方式(env代表通过环境变量)
gpu 使用 nccl 最快,gloo 用于 cpu 分布式训练,mpi 则需要重新编译 PyTorch
init_method 指定如何初始化进程组的 URL。 
默认及推荐为'env://' 其他初始化方式与多机多卡有关(not sure,挖个坑)
'''</span>
<span class="n">torch</span><span class="p">.</span><span class="n">distributed</span><span class="p">.</span><span class="n">init_process_group</span><span class="p">(</span><span class="s">'nccl'</span><span class="err">,</span><span class="n">init_method</span><span class="o">=</span><span class="s">'env://'</span><span class="p">)</span>
<span class="n">device</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="sa">f</span><span class="s">'cuda:</span><span class="si">{</span><span class="n">args</span><span class="p">.</span><span class="n">local_rank</span><span class="si">}</span><span class="s">'</span><span class="p">)</span>
<span class="c1">##########################	第2步  ##########################
#处理Dataloader
</span><span class="n">train_sampler</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">utils</span><span class="p">.</span><span class="n">data</span><span class="p">.</span><span class="n">distributed</span><span class="p">.</span><span class="n">DistributedSampler</span><span class="p">(</span><span class="n">train_dataset</span><span class="p">,</span><span class="n">shuffle</span><span class="o">=</span><span class="bp">True</span><span class="p">)</span>
<span class="n">train_loader</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">utils</span><span class="p">.</span><span class="n">data</span><span class="p">.</span><span class="n">DataLoader</span><span class="p">(</span><span class="n">train_dataset</span><span class="p">,</span> <span class="n">batch_size</span><span class="o">=</span><span class="p">...,</span> <span class="n">sampler</span><span class="o">=</span><span class="n">train_sampler</span><span class="p">)</span>
<span class="c1">#torch.utils.data.DataLoader中的shuffle应该设置为False(默认),因为打乱的任务交给了sampler
##########################	第3步  ##########################
#模型的初始化
</span><span class="n">model</span><span class="o">=</span><span class="n">torch</span><span class="p">.</span><span class="n">nn</span><span class="p">.</span><span class="n">parallel</span><span class="p">.</span><span class="n">DistributedDataParallel</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">device_ids</span><span class="o">=</span><span class="p">[</span><span class="n">args</span><span class="p">.</span><span class="n">local_rank</span><span class="p">])</span>
<span class="s">'''
使用 DistributedDataParallel 包装模型,
它能帮助我们为不同 GPU 上求得的梯度进行allreduce(即汇总不同 GPU 计算所得的梯度,并同步计算结果)。
allreduce 后不同 GPU 中模型的梯度均为 allreduce 之前各 GPU 梯度的均值。
'''
##########################	第4步  ##########################
#同DP,进行inputs、labels的设备转移
</span></code></pre></div></div>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">sampler</span> <span class="o">=</span> <span class="n">DistributedSampler</span><span class="p">(</span><span class="n">dataset</span><span class="p">)</span> <span class="c1"># 这个sampler会自动分配数据到各个gpu上
</span><span class="n">DataLoader</span><span class="p">(</span><span class="n">dataset</span><span class="p">,</span> <span class="n">batch_size</span><span class="o">=</span><span class="n">batch_size</span><span class="p">,</span> <span class="n">sampler</span><span class="o">=</span><span class="n">sampler</span><span class="p">)</span>
</code></pre></div></div>

<p>完整代码如下:</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">torch</span>
<span class="kn">import</span> <span class="nn">torch.nn</span> <span class="k">as</span> <span class="n">nn</span>
<span class="kn">from</span> <span class="nn">torch.autograd</span> <span class="kn">import</span> <span class="n">Variable</span>
<span class="kn">from</span> <span class="nn">torch.utils.data</span> <span class="kn">import</span> <span class="n">Dataset</span><span class="p">,</span> <span class="n">DataLoader</span>
<span class="kn">import</span> <span class="nn">os</span>
<span class="kn">from</span> <span class="nn">torch.utils.data.distributed</span> <span class="kn">import</span> <span class="n">DistributedSampler</span>
<span class="c1"># 1) 初始化
</span><span class="n">torch</span><span class="p">.</span><span class="n">distributed</span><span class="p">.</span><span class="n">init_process_group</span><span class="p">(</span><span class="n">backend</span><span class="o">=</span><span class="s">"nccl"</span><span class="p">)</span>

<span class="n">input_size</span> <span class="o">=</span> <span class="mi">5</span>
<span class="n">output_size</span> <span class="o">=</span> <span class="mi">2</span>
<span class="n">batch_size</span> <span class="o">=</span> <span class="mi">30</span>
<span class="n">data_size</span> <span class="o">=</span> <span class="mi">90</span>

<span class="c1"># 2) 配置每个进程的gpu
</span><span class="n">local_rank</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">distributed</span><span class="p">.</span><span class="n">get_rank</span><span class="p">()</span>
<span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">set_device</span><span class="p">(</span><span class="n">local_rank</span><span class="p">)</span>
<span class="n">device</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="s">"cuda"</span><span class="p">,</span> <span class="n">local_rank</span><span class="p">)</span>

<span class="k">class</span> <span class="nc">RandomDataset</span><span class="p">(</span><span class="n">Dataset</span><span class="p">):</span>
    <span class="k">def</span> <span class="nf">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">size</span><span class="p">,</span> <span class="n">length</span><span class="p">):</span>
        <span class="bp">self</span><span class="p">.</span><span class="nb">len</span> <span class="o">=</span> <span class="n">length</span>
        <span class="bp">self</span><span class="p">.</span><span class="n">data</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">randn</span><span class="p">(</span><span class="n">length</span><span class="p">,</span> <span class="n">size</span><span class="p">).</span><span class="n">to</span><span class="p">(</span><span class="s">'cuda'</span><span class="p">)</span>

    <span class="k">def</span> <span class="nf">__getitem__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">index</span><span class="p">):</span>
        <span class="k">return</span> <span class="bp">self</span><span class="p">.</span><span class="n">data</span><span class="p">[</span><span class="n">index</span><span class="p">]</span>

    <span class="k">def</span> <span class="nf">__len__</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span>
        <span class="k">return</span> <span class="bp">self</span><span class="p">.</span><span class="nb">len</span>

<span class="n">dataset</span> <span class="o">=</span> <span class="n">RandomDataset</span><span class="p">(</span><span class="n">input_size</span><span class="p">,</span> <span class="n">data_size</span><span class="p">)</span>
<span class="c1"># 3)使用DistributedSampler
</span><span class="n">rand_loader</span> <span class="o">=</span> <span class="n">DataLoader</span><span class="p">(</span><span class="n">dataset</span><span class="o">=</span><span class="n">dataset</span><span class="p">,</span>
                         <span class="n">batch_size</span><span class="o">=</span><span class="n">batch_size</span><span class="p">,</span>
                         <span class="n">sampler</span><span class="o">=</span><span class="n">DistributedSampler</span><span class="p">(</span><span class="n">dataset</span><span class="p">))</span>

<span class="k">class</span> <span class="nc">Model</span><span class="p">(</span><span class="n">nn</span><span class="p">.</span><span class="n">Module</span><span class="p">):</span>
    <span class="k">def</span> <span class="nf">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">input_size</span><span class="p">,</span> <span class="n">output_size</span><span class="p">):</span>
        <span class="nb">super</span><span class="p">(</span><span class="n">Model</span><span class="p">,</span> <span class="bp">self</span><span class="p">).</span><span class="n">__init__</span><span class="p">()</span>
        <span class="bp">self</span><span class="p">.</span><span class="n">fc</span> <span class="o">=</span> <span class="n">nn</span><span class="p">.</span><span class="n">Linear</span><span class="p">(</span><span class="n">input_size</span><span class="p">,</span> <span class="n">output_size</span><span class="p">)</span>

    <span class="k">def</span> <span class="nf">forward</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="nb">input</span><span class="p">):</span>
        <span class="n">output</span> <span class="o">=</span> <span class="bp">self</span><span class="p">.</span><span class="n">fc</span><span class="p">(</span><span class="nb">input</span><span class="p">)</span>
        <span class="k">print</span><span class="p">(</span><span class="s">"  In Model: input size"</span><span class="p">,</span> <span class="nb">input</span><span class="p">.</span><span class="n">size</span><span class="p">(),</span>
              <span class="s">"output size"</span><span class="p">,</span> <span class="n">output</span><span class="p">.</span><span class="n">size</span><span class="p">())</span>
        <span class="k">return</span> <span class="n">output</span>

<span class="n">model</span> <span class="o">=</span> <span class="n">Model</span><span class="p">(</span><span class="n">input_size</span><span class="p">,</span> <span class="n">output_size</span><span class="p">)</span>

<span class="c1"># 4) 封装之前要把模型移到对应的gpu
</span><span class="n">model</span><span class="p">.</span><span class="n">to</span><span class="p">(</span><span class="n">device</span><span class="p">)</span>

<span class="k">if</span> <span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">device_count</span><span class="p">()</span> <span class="o">&gt;</span> <span class="mi">1</span><span class="p">:</span>
    <span class="k">print</span><span class="p">(</span><span class="s">"Let's use"</span><span class="p">,</span> <span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">device_count</span><span class="p">(),</span> <span class="s">"GPUs!"</span><span class="p">)</span>
    <span class="c1"># 5) 封装:
</span>    <span class="n">model</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">nn</span><span class="p">.</span><span class="n">parallel</span><span class="p">.</span><span class="n">DistributedDataParallel</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">device_ids</span><span class="o">=</span><span class="p">[</span><span class="n">local_rank</span><span class="p">],</span> <span class="n">output_device</span><span class="o">=</span><span class="n">local_rank</span><span class="p">)</span>

<span class="k">for</span> <span class="n">data</span> <span class="ow">in</span> <span class="n">rand_loader</span><span class="p">:</span>
    <span class="k">if</span> <span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">is_available</span><span class="p">():</span>
        <span class="n">input_var</span> <span class="o">=</span> <span class="n">data</span>
    <span class="k">else</span><span class="p">:</span>
        <span class="n">input_var</span> <span class="o">=</span> <span class="n">data</span>
    <span class="n">output</span> <span class="o">=</span> <span class="n">model</span><span class="p">(</span><span class="n">input_var</span><span class="p">)</span>
    <span class="k">print</span><span class="p">(</span><span class="s">"Outside: input size"</span><span class="p">,</span> <span class="n">input_var</span><span class="p">.</span><span class="n">size</span><span class="p">(),</span> <span class="s">"output_size"</span><span class="p">,</span> <span class="n">output</span><span class="p">.</span><span class="n">size</span><span class="p">())</span>
</code></pre></div></div>

<p>执行脚本:</p>

<div class="language-shell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c"># 启用 DDP 模式</span>
<span class="nv">CUDA_VISIBLE_DEVICES</span><span class="o">=</span>0,1 python <span class="nt">-m</span> torch.distributed.launch <span class="nt">--nproc_per_node</span><span class="o">=</span>2 torch_ddp.py
</code></pre></div></div>

<p>apex加速(混合精度训练、并行训练、同步BN)可<a href="https://zhuanlan.zhihu.com/p/158375055">参考</a></p>

<h3 id="代码分布式改造">代码分布式改造</h3>

<p>如何将单机训练代码改成分布式运行?</p>

<p>基本流程:</p>
<ul>
  <li>分布式训练数据加载</li>
  <li>分布式训练</li>
  <li>分布式评估</li>
</ul>

<h4 id="分布式数据集">分布式数据集</h4>

<p><code class="language-plaintext highlighter-rouge">Dataloader</code> 要把所有数据分成N份(N为worldsize), 并能正确分发到不同进程中,每个进程可以拿到一个数据的子集,不重叠,不交叉。</p>

<p>这部分工作靠 <code class="language-plaintext highlighter-rouge">DistributedSampler</code> 完成,函数签名如下:</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">torch</span><span class="p">.</span><span class="n">utils</span><span class="p">.</span><span class="n">data</span><span class="p">.</span><span class="n">distributed</span><span class="p">.</span><span class="n">DistributedSampler</span><span class="p">(</span><span class="n">dataset</span><span class="p">,</span>
				<span class="n">num_replicas</span><span class="o">=</span><span class="bp">None</span><span class="p">,</span> <span class="n">rank</span><span class="o">=</span><span class="bp">None</span><span class="p">,</span> <span class="n">shuffle</span><span class="o">=</span><span class="bp">True</span><span class="p">,</span> <span class="n">seed</span><span class="o">=</span><span class="mi">0</span><span class="p">,</span> <span class="n">drop_last</span><span class="o">=</span><span class="bp">False</span><span class="p">)</span>
</code></pre></div></div>

<p>参数说明</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">dataset</code>: 需要加载的完整数据集</li>
  <li><code class="language-plaintext highlighter-rouge">num_replicas</code>: 把数据集分成多少份,默认是当前dist的world_size</li>
  <li><code class="language-plaintext highlighter-rouge">rank</code>: 当前进程的id,默认从dist的rank</li>
  <li><code class="language-plaintext highlighter-rouge">shuffle</code>:是否打乱</li>
  <li><code class="language-plaintext highlighter-rouge">drop_last</code>: 如果数据长度不能被world_size整除,可以考虑是否将剩下的扔掉</li>
  <li><code class="language-plaintext highlighter-rouge">seed</code>:随机数种子。
    <ul>
      <li>注意: 从源码中可以看出,真正的种子其实是 self.seed + self.epoch,好处是不同 epoch 下每个进程拿到的数据不一样,因此要在每个 epoch 开始前调用:<code class="language-plaintext highlighter-rouge">sampler.set_epoch(epoch)</code></li>
    </ul>
  </li>
</ul>

<p>Sampler 实现核心代码:</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">indices</span><span class="p">[</span><span class="bp">self</span><span class="p">.</span><span class="n">rank</span><span class="p">:</span> <span class="bp">self</span><span class="p">.</span><span class="n">total_size</span><span class="p">:</span> <span class="bp">self</span><span class="p">.</span><span class="n">num_replicas</span><span class="p">]</span>
</code></pre></div></div>

<p>假设4卡12条数据,rank=0,1,2,3, num_replicas=4, 那么每个卡取的数据索引就是:</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>rank0: <span class="o">[</span>0 4 8]<span class="p">;</span> rank1: <span class="o">[</span>1 5 9]<span class="p">;</span> rank2: <span class="o">[</span>2 6 10]<span class="p">;</span> rank3: <span class="o">[</span>3 7 11]
</code></pre></div></div>

<p>保证不重复不交叉。这样在分布式训练的时候,只需要给 Dataloader 指定 DistributedSampler 即可,简单示例如下:</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">sampler</span> <span class="o">=</span> <span class="n">DistributedSampler</span><span class="p">(</span><span class="n">dataset</span><span class="p">)</span>
<span class="n">loader</span> <span class="o">=</span> <span class="n">DataLoader</span><span class="p">(</span><span class="n">dataset</span><span class="p">,</span> <span class="n">sampler</span><span class="o">=</span><span class="n">sampler</span><span class="p">)</span>
<span class="k">for</span> <span class="n">epoch</span> <span class="ow">in</span> <span class="nb">range</span><span class="p">(</span><span class="n">start_epoch</span><span class="p">,</span> <span class="n">n_epochs</span><span class="p">):</span>
  <span class="n">sampler</span><span class="p">.</span><span class="n">set_epoch</span><span class="p">(</span><span class="n">epoch</span><span class="p">)</span> <span class="c1"># 设置epoch 更新种子
</span>  <span class="n">train</span><span class="p">(</span><span class="n">loader</span><span class="p">)</span>
</code></pre></div></div>

<p>模型的分布式训练封装。将单机模型使用 torch.nn.parallel.<code class="language-plaintext highlighter-rouge">DistributedDataParallel</code> 进行封装,如下:</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">set_device</span><span class="p">(</span><span class="n">local_rank</span><span class="p">)</span>
<span class="n">model</span> <span class="o">=</span> <span class="n">Model</span><span class="p">().</span><span class="n">cuda</span><span class="p">()</span>
<span class="n">model</span> <span class="o">=</span> <span class="n">DistributedDataParallel</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">device_ids</span><span class="o">=</span><span class="p">[</span><span class="n">local_rank</span><span class="p">])</span>
<span class="c1"># 要调用model内的函数或者属性. model.module.xxxx
</span></code></pre></div></div>

<p>多卡训练时,每个进程有一个model副本和optimizer,使用自己的数据进行训练,之后<strong>反向传播</strong>计算完梯度的时候,所有进程的梯度会进行 all-reduce 操作进行同步,进而保证每个卡上的模型更新梯度是一样的,模型参数也是一致的。</p>

<p>注意</p>
<ul>
  <li>save 和 load 模型时,为了避免所有进程同时读写磁盘,以<strong>主进程</strong>为主:rank0 先 save 模型,再通过 map_location 加载(映射)到其他进程。</li>
  <li>另外一个好处: 最开始训练时,模型随机初始化之后,保证了所有进程的模型参数保持一致。</li>
</ul>

<p>torch 的 DDP 封装已经做到了这一点:即使最开始各进程随机初始化不同,经过 DDP 封装后,所有进程的模型参数都会保持一致。</p>

<p>简洁代码如下:</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">model</span> <span class="o">=</span> <span class="n">DistributedDataParallel</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">device_ids</span><span class="o">=</span><span class="p">[</span><span class="n">local_rank</span><span class="p">])</span>
<span class="n">CHECKPOINT_PATH</span> <span class="o">=</span><span class="s">"./model.checkpoint"</span>
<span class="k">if</span> <span class="n">rank</span> <span class="o">==</span> <span class="mi">0</span><span class="p">:</span>
  <span class="n">torch</span><span class="p">.</span><span class="n">save</span><span class="p">(</span><span class="n">ddp_model</span><span class="p">.</span><span class="n">state_dict</span><span class="p">(),</span> <span class="n">CHECKPOINT_PATH</span><span class="p">)</span>
<span class="c1"># barrier()其他保证rank 0保存完成
</span><span class="n">dist</span><span class="p">.</span><span class="n">barrier</span><span class="p">()</span>
<span class="n">map_location</span> <span class="o">=</span> <span class="p">{</span><span class="s">"cuda:0"</span><span class="p">:</span> <span class="sa">f</span><span class="s">"cuda:</span><span class="si">{</span><span class="n">local_rank</span><span class="si">}</span><span class="s">"</span><span class="p">}</span>
<span class="n">model</span><span class="p">.</span><span class="n">load_state_dict</span><span class="p">(</span><span class="n">torch</span><span class="p">.</span><span class="n">load</span><span class="p">(</span><span class="n">CHECKPOINT_PATH</span><span class="p">,</span> <span class="n">map_location</span><span class="o">=</span><span class="n">map_location</span><span class="p">))</span>
<span class="c1"># 后面正常训练代码
</span><span class="n">optimizer</span> <span class="o">=</span> <span class="n">xxx</span>
<span class="k">for</span> <span class="n">epoch</span><span class="p">:</span>
  <span class="k">for</span> <span class="n">data</span> <span class="ow">in</span> <span class="n">Dataloader</span><span class="p">:</span>
      <span class="n">model</span><span class="p">(</span><span class="n">data</span><span class="p">)</span>
      <span class="n">xxx</span>
    <span class="c1"># 训练完成 只需要保存rank 0上的即可
</span>    <span class="c1"># 不需要dist.barrior(), all_reduce 操作保证了同步性
</span>  <span class="k">if</span> <span class="n">rank</span> <span class="o">==</span> <span class="mi">0</span><span class="p">:</span>
     <span class="n">torch</span><span class="p">.</span><span class="n">save</span><span class="p">(</span><span class="n">ddp_model</span><span class="p">.</span><span class="n">state_dict</span><span class="p">(),</span> <span class="n">CHECKPOINT_PATH</span><span class="p">)</span>
</code></pre></div></div>

<h4 id="分布式训练-1">分布式训练</h4>

<p>DDP 分布式训练步骤如下(完整骨架见列表后的示意代码):</p>
<ul>
  <li>初始化进程组 dist.init_process_group</li>
  <li>设置分布式采样器 DistributedSampler</li>
  <li>使用DistributedDataParallel封装模型</li>
  <li>使用torchrun 或者 mp.spawn 启动分布式训练</li>
</ul>
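
<p>按照上述步骤,一个可以直接用 torchrun 启动的最小训练骨架示意如下(模型与数据均用随机占位,脚本名 train_ddp_demo.py 为假设,仅演示流程):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code># 最小 DDP 训练骨架(示意),启动命令:torchrun --nproc_per_node=N train_ddp_demo.py
import os
import torch
import torch.nn as nn
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler

def main():
    # 1) 初始化进程组(torchrun 已注入 RANK/LOCAL_RANK/WORLD_SIZE 等环境变量)
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # 2) 分布式采样器:各进程拿到互不重叠的数据子集(这里用随机数据代替真实数据集)
    dataset = TensorDataset(torch.randn(256, 8), torch.randn(256, 1))
    sampler = DistributedSampler(dataset)
    loader = DataLoader(dataset, batch_size=32, sampler=sampler)

    # 3) DDP 封装模型(这里用一个线性层代替真实模型)
    model = nn.Linear(8, 1).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    # 4) 训练循环:每个 epoch 前调用 set_epoch 以更新 shuffle 种子
    for epoch in range(3):
        sampler.set_epoch(epoch)
        for x, y in loader:
            x, y = x.cuda(local_rank), y.cuda(local_rank)
            loss = loss_fn(model(x), y)
            optimizer.zero_grad()
            loss.backward()          # DDP 在反向传播时自动 all-reduce 梯度
            optimizer.step()
    dist.destroy_process_group()

if __name__ == "__main__":
    main()
</code></pre></div></div>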

<p>使用分布式做 evaluation 时,要先把所有进程的输出结果进行 gather,再进行指标计算,两个常用函数:</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">dist</span><span class="p">.</span><span class="n">all_gather</span><span class="p">(</span><span class="n">tensor_list</span><span class="p">,</span> <span class="n">tensor</span><span class="p">)</span> <span class="c1"># 将所有进程的tensor进行收集并拼接成新的tensorlist返回,比如:
</span><span class="n">dist</span><span class="p">.</span><span class="n">all_reduce</span><span class="p">(</span><span class="n">tensor</span><span class="p">,</span> <span class="n">op</span><span class="p">)</span> <span class="c1"># 对tensor 的 in-place 操作, 对所有进程的某个tensor进行合并操作,op可以是求和等
</span></code></pre></div></div>

<p>代码</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">torch</span>
<span class="kn">import</span> <span class="nn">torch.distributed</span> <span class="k">as</span> <span class="n">dist</span>

<span class="n">dist</span><span class="p">.</span><span class="n">init_process_group</span><span class="p">(</span><span class="s">'nccl'</span><span class="p">,</span> <span class="n">init_method</span><span class="o">=</span><span class="s">'env://'</span><span class="p">)</span>
<span class="n">rank</span> <span class="o">=</span> <span class="n">dist</span><span class="p">.</span><span class="n">get_rank</span><span class="p">()</span>
<span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">set_device</span><span class="p">(</span><span class="n">rank</span><span class="p">)</span>

<span class="n">tensor</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">arange</span><span class="p">(</span><span class="mi">2</span><span class="p">)</span> <span class="o">+</span> <span class="mi">1</span> <span class="o">+</span> <span class="mi">2</span> <span class="o">*</span> <span class="n">rank</span>
<span class="n">tensor</span> <span class="o">=</span> <span class="n">tensor</span><span class="p">.</span><span class="n">cuda</span><span class="p">()</span>
<span class="k">print</span><span class="p">(</span><span class="sa">f</span><span class="s">"rank </span><span class="si">{</span><span class="n">rank</span><span class="si">}</span><span class="s">: </span><span class="si">{</span><span class="n">tensor</span><span class="si">}</span><span class="s">"</span><span class="p">)</span>

<span class="n">tensor_list</span> <span class="o">=</span> <span class="p">[</span><span class="n">torch</span><span class="p">.</span><span class="n">zeros_like</span><span class="p">(</span><span class="n">tensor</span><span class="p">).</span><span class="n">cuda</span><span class="p">()</span> <span class="k">for</span> <span class="n">_</span> <span class="ow">in</span> <span class="nb">range</span><span class="p">(</span><span class="mi">2</span><span class="p">)]</span>
<span class="n">dist</span><span class="p">.</span><span class="n">all_gather</span><span class="p">(</span><span class="n">tensor_list</span><span class="p">,</span> <span class="n">tensor</span><span class="p">)</span>
<span class="k">print</span><span class="p">(</span><span class="sa">f</span><span class="s">"after gather, rank </span><span class="si">{</span><span class="n">rank</span><span class="si">}</span><span class="s">: tensor_list: </span><span class="si">{</span><span class="n">tensor_list</span><span class="si">}</span><span class="s">"</span><span class="p">)</span>

<span class="n">dist</span><span class="p">.</span><span class="n">barrier</span><span class="p">()</span>
<span class="n">dist</span><span class="p">.</span><span class="n">all_reduce</span><span class="p">(</span><span class="n">tensor</span><span class="p">,</span> <span class="n">op</span><span class="o">=</span><span class="n">dist</span><span class="p">.</span><span class="n">ReduceOp</span><span class="p">.</span><span class="n">SUM</span><span class="p">)</span>
<span class="k">print</span><span class="p">(</span><span class="sa">f</span><span class="s">"after reduce, rank </span><span class="si">{</span><span class="n">rank</span><span class="si">}</span><span class="s">: tensor: </span><span class="si">{</span><span class="n">tensor</span><span class="si">}</span><span class="s">"</span><span class="p">)</span>
</code></pre></div></div>

<p>命令</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>torchrun <span class="nt">--nproc_per_node</span><span class="o">=</span>2 test_ddp.py
</code></pre></div></div>

<p>输出结果如下:</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>rank 1: tensor<span class="o">([</span>3, 4], <span class="nv">device</span><span class="o">=</span><span class="s1">'cuda:1'</span><span class="o">)</span>
rank 0: tensor<span class="o">([</span>1, 2], <span class="nv">device</span><span class="o">=</span><span class="s1">'cuda:0'</span><span class="o">)</span>
after gather, rank 1: tensor_list: <span class="o">[</span>tensor<span class="o">([</span>1, 2], <span class="nv">device</span><span class="o">=</span><span class="s1">'cuda:1'</span><span class="o">)</span>, tensor<span class="o">([</span>3, 4], <span class="nv">device</span><span class="o">=</span><span class="s1">'cuda:1'</span><span class="o">)]</span>
after gather, rank 0: tensor_list: <span class="o">[</span>tensor<span class="o">([</span>1, 2], <span class="nv">device</span><span class="o">=</span><span class="s1">'cuda:0'</span><span class="o">)</span>, tensor<span class="o">([</span>3, 4], <span class="nv">device</span><span class="o">=</span><span class="s1">'cuda:0'</span><span class="o">)]</span>
after reduce, rank 0: tensor: tensor<span class="o">([</span>4, 6], <span class="nv">device</span><span class="o">=</span><span class="s1">'cuda:0'</span><span class="o">)</span>
after reduce, rank 1: tensor: tensor<span class="o">([</span>4, 6], <span class="nv">device</span><span class="o">=</span><span class="s1">'cuda:1'</span><span class="o">)</span>
</code></pre></div></div>

<h4 id="分布式评估">分布式评估</h4>

<p>evaluation 时,可以拿到所有进程中模型输出,最后统一计算指标,基本流程如下:</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">pred_list</span> <span class="o">=</span> <span class="p">[]</span>
<span class="k">for</span> <span class="n">data</span> <span class="ow">in</span> <span class="n">Dataloader</span><span class="p">:</span>
    <span class="n">pred</span> <span class="o">=</span> <span class="n">model</span><span class="p">(</span><span class="n">data</span><span class="p">)</span>
    <span class="n">batch_pred</span> <span class="o">=</span> <span class="p">[</span><span class="n">torch</span><span class="p">.</span><span class="n">zeros_like</span><span class="p">(</span><span class="n">label</span><span class="p">)</span> <span class="k">for</span> <span class="n">_</span> <span class="ow">in</span> <span class="nb">range</span><span class="p">(</span><span class="n">world_size</span><span class="p">)]</span>
    <span class="n">dist</span><span class="p">.</span><span class="n">all_gather</span><span class="p">(</span><span class="n">batch_pred</span><span class="p">,</span> <span class="n">pred</span><span class="p">)</span>
    <span class="n">pred_list</span><span class="p">.</span><span class="n">extend</span><span class="p">(</span><span class="n">batch_pred</span><span class="p">)</span>
<span class="n">pred_list</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">cat</span><span class="p">(</span><span class="n">pred_list</span><span class="p">,</span> <span class="mi">1</span><span class="p">)</span>
<span class="c1"># 所有进程pred_list是一致的,保存所有数据模型预测的值
</span></code></pre></div></div>
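
<p>在上面流程的基础上,如果把标签也一并 gather,就可以在任意 rank 上统一计算指标。下面是一个假设为分类任务的示意(函数名与 accuracy 计算仅作演示,要求各 rank 的 batch 大小一致):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch
import torch.distributed as dist

def distributed_accuracy(model, loader, world_size):
    # 收集所有进程的预测与标签,再统一计算 accuracy(假设 loader 产出 (data, label))
    preds, labels = [], []
    model.eval()
    with torch.no_grad():
        for data, label in loader:
            pred = model(data.cuda()).argmax(dim=-1)
            label = label.cuda()
            # 注意:all_gather 要求各 rank 的张量尺寸一致(DistributedSampler + drop_last 可保证)
            gathered_pred = [torch.zeros_like(pred) for _ in range(world_size)]
            gathered_label = [torch.zeros_like(label) for _ in range(world_size)]
            dist.all_gather(gathered_pred, pred)
            dist.all_gather(gathered_label, label)
            preds.append(torch.cat(gathered_pred))
            labels.append(torch.cat(gathered_label))
    preds, labels = torch.cat(preds), torch.cat(labels)
    return (preds == labels).float().mean().item()
</code></pre></div></div>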

<h3 id="pytorch-分布式操作">pytorch 分布式操作</h3>

<p>【2024-8-4】<a href="https://zhuanlan.zhihu.com/p/712631827?utm_psn=1803475758301179905">彻底搞清楚torch. distributed分布式数据通信all_gather、all_reduce</a></p>

<p>all_gather和all_reduce;gather、reduce、scatter方法对比</p>
<ul>
  <li><img src="https://pic2.zhimg.com/80/v2-ff290214bab003c79d6a28363d65bc7d_1440w.webp" alt="" /></li>
</ul>

<h4 id="all_gather">all_gather</h4>

<p>分布式操作</p>
<ul>
  <li>gather 操作用于在<strong>不同节点间收集信息</strong></li>
  <li>首先初始化一个空 Tensor 列表 tensor_list, 用于接收所有节点的信息</li>
  <li>然后调用 all_gather 在所有节点中得到包含每个节点本地张量的列表</li>
  <li>列表中有 world_size 个元素,每个元素都是 bs 大小,后续通过 cat 操作即可得到大小为 bs * world_size 的完整结果</li>
</ul>

<p>Pytorch DDP 分布式数据合并通信 torch.distributed.all_gather()</p>

<p><a href="https://pytorch.org/docs/master/distributed.html?highlight=all_gather#torch.distributed.all_gather">torch.distributed.all_gather()</a></p>

<p>函数定义</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">tensor_list</code> 是list,大小是 word_size,每个元素为了是gather后,保存每个rank的数据,所以初始化一般使用torch.empty;</li>
  <li><code class="language-plaintext highlighter-rouge">tensor</code> 代表各rank中的tensor数据,其中tensor_list每个分量的维度要与对应的tensor参数中每个rank的维度相同。</li>
</ul>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">all_gather</span><span class="p">(</span><span class="n">tensor_list</span><span class="p">,</span> <span class="n">tensor</span><span class="p">,</span> <span class="n">group</span><span class="o">=</span><span class="bp">None</span><span class="p">,</span> <span class="n">async_op</span><span class="o">=</span><span class="bp">False</span><span class="p">)</span><span class="err">:</span>
</code></pre></div></div>

<ul>
  <li>tensor_list 每个元素代表每个rank的数据</li>
  <li>tensor 代表每个进程中的tensor数据</li>
  <li>其中tensor_list每个分量的维度要与对应的tensor参数中每个rank的维度相同。</li>
</ul>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c1"># 两个机器,每个4张卡,批大小为bs
</span><span class="n">tensor</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">arange</span><span class="p">(</span><span class="n">bs</span><span class="p">,</span> <span class="n">dtype</span><span class="o">=</span><span class="n">torch</span><span class="p">.</span><span class="n">int64</span><span class="p">)</span> <span class="o">+</span> <span class="mi">1</span> <span class="o">+</span> <span class="mi">2</span> <span class="o">*</span> <span class="n">rank</span>
<span class="n">tensor_list</span> <span class="o">=</span> <span class="p">[</span><span class="n">torch</span><span class="p">.</span><span class="n">zeros</span><span class="p">(</span><span class="n">bs</span><span class="p">,</span> <span class="n">dtype</span><span class="o">=</span><span class="n">torch</span><span class="p">.</span><span class="n">int64</span><span class="p">)</span> <span class="k">for</span> <span class="n">_</span> <span class="ow">in</span> <span class="nb">range</span><span class="p">(</span><span class="n">torch</span><span class="p">.</span><span class="n">distributed</span><span class="p">.</span><span class="n">get_world_size</span><span class="p">())]</span>
<span class="n">dist</span><span class="p">.</span><span class="n">all_gather</span><span class="p">(</span><span class="n">tensor_list</span><span class="p">,</span> <span class="n">tensor</span><span class="p">)</span>
<span class="n">tensor_list</span>
</code></pre></div></div>

<h4 id="all_reduce">all_reduce</h4>

<p>all_reduce 操作用于在不同节点中<strong>同步信息</strong></p>
<ul>
  <li>调用该方法后,所有节点上的张量会被<strong>求和/平均</strong>等合并,使用前后张量大小均为 bs</li>
</ul>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">tensor</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">arange</span><span class="p">(</span><span class="n">bs</span><span class="p">,</span> <span class="n">dtype</span><span class="o">=</span><span class="n">torch</span><span class="p">.</span><span class="n">int64</span><span class="p">)</span> <span class="o">+</span> <span class="mi">1</span> <span class="o">+</span> <span class="mi">2</span> <span class="o">*</span> <span class="n">rank</span>
<span class="n">dist</span><span class="p">.</span><span class="n">all_reduce</span><span class="p">(</span><span class="n">tensor</span><span class="p">,</span> <span class="n">op</span><span class="o">=</span><span class="n">ReduceOp</span><span class="p">.</span><span class="n">SUM</span><span class="p">)</span>
<span class="n">tensor</span>
</code></pre></div></div>

<p>all_reduce 函数定义</p>
<ul>
  <li>tensor 代表各 rank 中的 tensor 数据,op 代表可以选择的操作,主要有: SUM、PRODUCT、MIN、MAX、BAND、BOR、BXOR、PREMUL_SUM;求全局平均的常见用法见下方示意</li>
</ul>
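
<p>一个常见用法是把各 rank 上的 loss(或其他标量指标)求全局平均,用于日志打印或指标同步。下面是一个最小示意(函数名为自定义,要求传入浮点张量):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch
import torch.distributed as dist

def reduce_mean(tensor):
    # 对各 rank 的张量求和后再除以进程数,得到全局平均值(要求 tensor 为浮点类型)
    rt = tensor.clone()
    dist.all_reduce(rt, op=dist.ReduceOp.SUM)
    rt /= dist.get_world_size()
    return rt

# 用法示意:loss 为当前 rank 上的标量 loss
# global_loss = reduce_mean(loss.detach())
</code></pre></div></div>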

<h3 id="torchrun-更新">Torchrun (更新)</h3>

<p>PyTorch 官网介绍</p>
<ul>
  <li>This module(torch.distributed.launch) is going to be deprecated in favor of <code class="language-plaintext highlighter-rouge">torchrun</code>.</li>
</ul>

<p>PyTorch 1.9 引入了 <code class="language-plaintext highlighter-rouge">torch.distributed.run</code>(1.10 起提供 <code class="language-plaintext highlighter-rouge">torchrun</code> 命令),替代以前的 <code class="language-plaintext highlighter-rouge">torch.distributed.launch</code>。</p>
<ul>
  <li><a href="https://pytorch.org/docs/stable/elastic/run.html#launcher-api">torchrun</a> 是 <code class="language-plaintext highlighter-rouge">torch.distributed.launch</code> 的超集, elastic launch, 等效于 <code class="language-plaintext highlighter-rouge">python -m torch.distributed.run</code></li>
  <li><a href="https://pytorch.org/docs/stable/elastic/run.html#launcher-api">torchrun</a> 包含 <code class="language-plaintext highlighter-rouge">torch.distributed.launch</code> 几乎所有功能(除了废弃的<code class="language-plaintext highlighter-rouge">--use-env</code>)</li>
</ul>

<p><code class="language-plaintext highlighter-rouge">torchrun</code> 包含了 torch.distributed.launch 所有功能,还有三点额外功能:</p>
<ul>
  <li>1、<code class="language-plaintext highlighter-rouge">worker_rank</code> 和 <code class="language-plaintext highlighter-rouge">world_size</code> 将被自动分配</li>
  <li>2、<code class="language-plaintext highlighter-rouge">Failover</code>: worker失败时, 重新启动所有workers来处理 workers 故障</li>
  <li>3、<code class="language-plaintext highlighter-rouge">Elastic</code>: 动态增减节点, 允许节点数目在最大/最小值之间改变, 即具备<strong>弹性</strong></li>
</ul>

<h4 id="用法">用法</h4>

<p>几种模式</p>
<ul>
  <li>单机多卡: <code class="language-plaintext highlighter-rouge">torchrun --standalone --nnodes=1 --nproc_per_node=N inference.py --args</code></li>
  <li>多机多卡: <code class="language-plaintext highlighter-rouge">torchrun --nnodes=M --nproc_per_node=N inference.py --args</code></li>
</ul>

<p>其中:</p>
<ul>
  <li>--<code class="language-plaintext highlighter-rouge">nnodes</code>:计算节点(也就是机器)的数量,单机就是1,M 台机器就是 M</li>
  <li>--<code class="language-plaintext highlighter-rouge">nproc_per_node</code>:每个节点(每台机器)上的进程数量。一个进程通常独占一张显卡,因此进程数也就是显卡数,比如单机八卡就设置为 8</li>
  <li>--<code class="language-plaintext highlighter-rouge">args</code>:运行 inference.py 脚本所需的参数</li>
</ul>

<h4 id="torchrun-示例">torchrun 示例</h4>

<p>2机8卡 分布式训练<a href="https://zhuanlan.zhihu.com/p/489892744">示例</a></p>
<ul>
  <li><img src="https://pic1.zhimg.com/80/v2-2baae86e212177108872d36a6040a2dc_1440w.webp" alt="" /></li>
  <li>gpu 编号: 0~3</li>
  <li>local rank: gpu 本地编号, 0~3</li>
  <li>global rank: gpu 全局编号, 0~7</li>
</ul>

<p>环境</p>
<ul>
  <li>code: <a href="https://github.com/tingshua-yts/BetterDL/blob/master/test/pytorch/DDP/train_elastic.py">BetterDL - train_elastic.py</a></li>
  <li>运行环境: 2台4卡 v100机器</li>
</ul>

<p>train_elastic.py</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">def</span> <span class="nf">run</span><span class="p">():</span>
    <span class="n">env_dict</span> <span class="o">=</span> <span class="p">{</span>
        <span class="n">key</span><span class="p">:</span> <span class="n">os</span><span class="p">.</span><span class="n">environ</span><span class="p">[</span><span class="n">key</span><span class="p">]</span>
        <span class="k">for</span> <span class="n">key</span> <span class="ow">in</span> <span class="p">(</span><span class="s">"MASTER_ADDR"</span><span class="p">,</span> <span class="s">"MASTER_PORT"</span><span class="p">,</span> <span class="s">"WORLD_SIZE"</span><span class="p">,</span> <span class="s">"LOCAL_WORLD_SIZE"</span><span class="p">)</span>
    <span class="p">}</span>
    <span class="k">print</span><span class="p">(</span><span class="sa">f</span><span class="s">"[</span><span class="si">{</span><span class="n">os</span><span class="p">.</span><span class="n">getpid</span><span class="p">()</span><span class="si">}</span><span class="s">] Initializing process group with: </span><span class="si">{</span><span class="n">env_dict</span><span class="si">}</span><span class="s">"</span><span class="p">)</span>
    <span class="n">dist</span><span class="p">.</span><span class="n">init_process_group</span><span class="p">(</span><span class="n">backend</span><span class="o">=</span><span class="s">"nccl"</span><span class="p">)</span>
    <span class="n">train</span><span class="p">()</span>
    <span class="n">dist</span><span class="p">.</span><span class="n">destroy_process_group</span><span class="p">()</span>

<span class="k">if</span> <span class="n">__name__</span> <span class="o">==</span> <span class="s">"__main__"</span><span class="p">:</span>
    <span class="n">run</span><span class="p">()</span>
</code></pre></div></div>

<p>启动脚本 run_elastic.sh</p>
<ul>
  <li>node0 和 node1 上分别执行脚本</li>
</ul>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>torchrun <span class="se">\</span>
    <span class="nt">--nnodes</span><span class="o">=</span>1:3<span class="se">\</span>
    <span class="nt">--nproc_per_node</span><span class="o">=</span>4<span class="se">\</span>
    <span class="nt">--max_restarts</span><span class="o">=</span>3<span class="se">\</span>
    <span class="nt">--rdzv_id</span><span class="o">=</span>1<span class="se">\</span>
    <span class="nt">--rdzv_backend</span><span class="o">=</span>c10d<span class="se">\</span>
    <span class="nt">--rdzv_endpoint</span><span class="o">=</span><span class="s2">"192.0.0.1:1234"</span><span class="se">\</span>
    train_elastic.py
</code></pre></div></div>

<p>描述如下(注:node0和node1均通过该脚本进行启动)</p>
<ul>
  <li>–<code class="language-plaintext highlighter-rouge">nnodes</code>=<strong>1:3</strong>: 当前训练任务接受最少1个node,最多3个node, 参与分布式训练;</li>
  <li>–<code class="language-plaintext highlighter-rouge">nproc_per_node</code>=4: 每个node上节点有4个process</li>
  <li>–<code class="language-plaintext highlighter-rouge">max_restarts</code>=3: worker group最大的重启次数;
    <ul>
      <li>注意: node fail、node scale down和node scale up都会导致restart;</li>
    </ul>
  </li>
  <li>–<code class="language-plaintext highlighter-rouge">rdzv_id</code>=1:一个unique的job id,所有node均使用同一个job id;</li>
  <li>–<code class="language-plaintext highlighter-rouge">rdzv_backend</code>: rendezvous backend实现,默认支持c10d和etcd两种;rendezvous用于多个node之间的通信和协调;</li>
  <li>–<code class="language-plaintext highlighter-rouge">rdzv_endpoint</code>: rendezvous 地址,应该为一个node的host ip和port;</li>
</ul>

<h4 id="迁移-launch---torchrun">迁移 launch -&gt; torchrun</h4>

<p>torch.distributed.launch -&gt; torchrun</p>

<p>迁移方法</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>python <span class="nt">-m</span> torch.distributed.launch -&gt; torchrun
<span class="c"># (1) 如果 从环境变量(LOCAL_RANK)中读取 local_rank 参数, 直接忽略</span>
<span class="c"># 更改前</span>
python <span class="nt">-m</span> torch.distributed.launch <span class="nt">--use-env</span> train_script.py
<span class="c"># 更改后</span>
torchrun train_script.py
</code></pre></div></div>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c1"># (2) 如果 从命令行(--local-rank)读取 local_rank 参数
# 更改前
</span><span class="kn">import</span> <span class="nn">argparse</span>
<span class="n">parser</span> <span class="o">=</span> <span class="n">argparse</span><span class="p">.</span><span class="n">ArgumentParser</span><span class="p">()</span>
<span class="n">parser</span><span class="p">.</span><span class="n">add_argument</span><span class="p">(</span><span class="s">"--local-rank"</span><span class="p">,</span> <span class="nb">type</span><span class="o">=</span><span class="nb">int</span><span class="p">)</span>
<span class="n">args</span> <span class="o">=</span> <span class="n">parser</span><span class="p">.</span><span class="n">parse_args</span><span class="p">()</span>
<span class="n">local_rank</span> <span class="o">=</span> <span class="n">args</span><span class="p">.</span><span class="n">local_rank</span>
<span class="c1"># 更改后
</span><span class="kn">import</span> <span class="nn">os</span>
<span class="n">local_rank</span> <span class="o">=</span> <span class="nb">int</span><span class="p">(</span><span class="n">os</span><span class="p">.</span><span class="n">environ</span><span class="p">[</span><span class="s">"LOCAL_RANK"</span><span class="p">])</span>
</code></pre></div></div>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c1"># local_rank参数应当从环境变量中读取,而不是通过参数传递。
### ------ BEFORE ------
</span><span class="kn">import</span> <span class="nn">argparse</span>
<span class="n">parser</span> <span class="o">=</span> <span class="n">argparse</span><span class="p">.</span><span class="n">ArgumentParser</span><span class="p">()</span>
<span class="n">parser</span><span class="p">.</span><span class="n">add_argument</span><span class="p">(</span><span class="s">"--local_rank"</span><span class="p">,</span> <span class="nb">type</span><span class="o">=</span><span class="nb">int</span><span class="p">)</span>
<span class="n">args</span> <span class="o">=</span> <span class="n">parser</span><span class="p">.</span><span class="n">parse_args</span><span class="p">()</span>

<span class="n">local_rank</span> <span class="o">=</span> <span class="n">args</span><span class="p">.</span><span class="n">local_rank</span>
<span class="c1">### ------ NOW -------
</span><span class="kn">import</span> <span class="nn">os</span>
<span class="n">local_rank</span> <span class="o">=</span> <span class="nb">int</span><span class="p">(</span><span class="n">os</span><span class="p">.</span><span class="n">environ</span><span class="p">[</span><span class="s">"LOCAL_RANK"</span><span class="p">])</span>

<span class="c1">#运行脚本
</span><span class="n">torchrun</span> <span class="n">train_script</span><span class="p">.</span><span class="n">py</span> <span class="c1">#除了--use_env参数,其他torch.distributed.launch所使用的参数均可使用,
</span>			 <span class="c1">#如nnodes、nproc_per_node
</span></code></pre></div></div>

<h4 id="初始化-init_process_group">初始化 init_process_group</h4>

<p><code class="language-plaintext highlighter-rouge">dist.init_process_group()</code> 是PyTorch中用于初始化分布式训练的函数之一。</p>
<ul>
  <li>作用: 设置并行训练环境,连接多个进程以进行数据和模型的分布式处理。</li>
</ul>

<p>通过<code class="language-plaintext highlighter-rouge">init_process_group()</code>函数这个方法来进行初始化</p>

<p>其参数包括以下内容(常见初始化写法见列表后的示意代码):</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">backend</code>(必需参数):指定分布式后端的类型,选项之一:
    <ul>
      <li>‘tcp’:旧版本的TCP后端(新版 PyTorch 已移除,不建议使用)。</li>
      <li>‘gloo’:使用Gloo库进行通信。</li>
      <li>‘mpi’:使用MPI(Message Passing Interface)进行通信。</li>
      <li>‘nccl’:使用NCCL库进行通信(适用于多GPU的分布式训练)。</li>
      <li>‘hccl’:使用HCCL库进行通信(适用于华为昇腾AI处理器的分布式训练)。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">init_method</code>(可选参数):指定用于初始化分布式环境的方法。它可以是以下选项之一:
    <ul>
      <li>‘env://’:从环境变量(MASTER_ADDR、MASTER_PORT、RANK、WORLD_SIZE)读取配置进行初始化,默认且推荐。</li>
      <li>‘file://共享文件路径’:使用共享文件系统中的文件进行初始化。</li>
      <li>‘tcp://ip:port’:使用指定的TCP地址和端口进行初始化。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">rank</code>(可选参数):指定当前进程的排名(从0开始)。</li>
  <li><code class="language-plaintext highlighter-rouge">world_size</code>(可选参数):指定总共使用的进程数。</li>
  <li><code class="language-plaintext highlighter-rouge">timeout</code>(可选参数):指定初始化的超时时间。</li>
  <li><code class="language-plaintext highlighter-rouge">group_name</code>(可选参数):指定用于连接的进程组名称。</li>
</ul>
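
<p>两种常见的初始化写法示意如下(env 方式需配合 torchrun/launch 等启动器使用;tcp 方式中的 ip、端口仅为占位示例):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import os
import torch.distributed as dist

# 方式一:env:// 初始化(默认且推荐),MASTER_ADDR/MASTER_PORT/RANK/WORLD_SIZE
# 由 torchrun 或 launch 写入环境变量,需在启动器拉起的进程中执行
dist.init_process_group(backend="nccl", init_method="env://")

# 方式二:tcp:// 初始化,需要显式传入 rank 和 world_size(ip 与端口为占位示例)
# dist.init_process_group(backend="nccl",
#                         init_method="tcp://10.0.0.1:23456",
#                         rank=int(os.environ.get("RANK", 0)),
#                         world_size=2)
</code></pre></div></div>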

<h3 id="多机多卡-ddp">多机多卡 DDP</h3>

<p>概念理解</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">group</code>: <code class="language-plaintext highlighter-rouge">进程组</code>,通常DDP的各个进程都是在同一个进程组下</li>
  <li><code class="language-plaintext highlighter-rouge">world_size</code>: 总的进程数量(原则上,一个进程占用一个GPU)</li>
  <li><code class="language-plaintext highlighter-rouge">rank</code>:当前进程的序号,用于进程间通信,rank=0表示主机为master节点</li>
  <li><code class="language-plaintext highlighter-rouge">local_rank</code>:当前进程对应的GPU号</li>
</ul>

<p>举个栗子 :</p>
<ul>
  <li>4台机器 (每台机器8张卡) 进行分布式训练。</li>
  <li>通过 init_process_group() 对进程组进行初始化。</li>
  <li>初始化后 可以通过 get_world_size() 获取到 world size = 32。</li>
  <li>在该例中 world size 为 32,即有 32 个进程,其编号(rank)为 0-31,通过 get_rank() 函数获取;在每台机器上,local rank 均为 0-7。这是 local rank 与 rank 的区别:local rank 会对应到本机实际的 GPU 编号上(换算关系见下方示意)。</li>
</ul>
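
<p>rank、local_rank 与 node_rank 的换算关系可以用一小段代码示意(机器数、每机卡数为上例中的假设值):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code># rank / local_rank 换算示意:4 台机器 x 每台 8 卡,world_size = 32
nnodes, nproc_per_node = 4, 8
for node_rank in range(nnodes):
    for local_rank in range(nproc_per_node):
        rank = node_rank * nproc_per_node + local_rank   # 全局进程编号 0~31
        print(f"node_rank={node_rank}, local_rank={local_rank}, rank={rank}")
# 例如 node_rank=1, local_rank=3 的进程,全局 rank=11,绑定该机器上的 GPU 3
</code></pre></div></div>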

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c1">########################## 	第1步	 ##########################
#初始化
</span><span class="n">rank</span> <span class="o">=</span> <span class="nb">int</span><span class="p">(</span><span class="n">os</span><span class="p">.</span><span class="n">environ</span><span class="p">[</span><span class="s">"RANK"</span><span class="p">])</span>
<span class="n">local_rank</span> <span class="o">=</span> <span class="nb">int</span><span class="p">(</span><span class="n">os</span><span class="p">.</span><span class="n">environ</span><span class="p">[</span><span class="s">"LOCAL_RANK"</span><span class="p">])</span>
<span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">set_device</span><span class="p">(</span><span class="n">rank</span> <span class="o">%</span> <span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">device_count</span><span class="p">())</span>
<span class="n">dist</span><span class="p">.</span><span class="n">init_process_group</span><span class="p">(</span><span class="n">backend</span><span class="o">=</span><span class="s">"nccl"</span><span class="p">)</span>
<span class="n">device</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">device</span><span class="p">(</span><span class="s">"cuda"</span><span class="p">,</span> <span class="n">local_rank</span><span class="p">)</span>
<span class="c1">########################## 	第2步	 ##########################
#模型定义
</span><span class="n">model</span> <span class="o">=</span> <span class="n">model</span><span class="p">.</span><span class="n">to</span><span class="p">(</span><span class="n">device</span><span class="p">)</span>
<span class="n">model</span> <span class="o">=</span> <span class="n">DDP</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">device_ids</span><span class="o">=</span><span class="p">[</span><span class="n">local_rank</span><span class="p">],</span> <span class="n">output_device</span><span class="o">=</span><span class="n">local_rank</span><span class="p">)</span>

<span class="c1">#数据集操作与DDP一致
</span>
<span class="c1">#####运行
</span><span class="s">'''
example: 2 nodes, 8 GPUs per node (16 GPUs)
需要在两台机器上分别运行脚本
注意细节: master 节点的 node_rank 为 0
机器1
&gt;&gt;&gt; python -m torch.distributed.launch </span><span class="se">\
</span><span class="s">    --nproc_per_node=8 </span><span class="se">\
</span><span class="s">    --nnodes=2 </span><span class="se">\
</span><span class="s">    --node_rank=0 </span><span class="se">\
</span><span class="s">    --master_addr="master的ip" </span><span class="se">\
</span><span class="s">    --master_port=xxxxx </span><span class="se">\
</span><span class="s">    YourScript.py
机器2
&gt;&gt;&gt; python -m torch.distributed.launch </span><span class="se">\
</span><span class="s">    --nproc_per_node=8 </span><span class="se">\
</span><span class="s">    --nnodes=2 </span><span class="se">\
</span><span class="s">    --node_rank=1 </span><span class="se">\
</span><span class="s">    --master_addr="master的ip" </span><span class="se">\
</span><span class="s">    --master_port=xxxxx </span><span class="se">\
</span><span class="s">    YourScript.py

'''</span>
</code></pre></div></div>
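<p>上面用的是 python -m torch.distributed.launch;如果换成前文提到的 torchrun,等价的两机启动命令大致如下(master 的 ip 与端口仅为占位,与上例保持一致):</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code># 机器1(node_rank=0,即 master)
torchrun --nproc_per_node=8 --nnodes=2 --node_rank=0 \
    --master_addr="master的ip" --master_port=xxxxx YourScript.py
# 机器2(node_rank=1)
torchrun --nproc_per_node=8 --nnodes=2 --node_rank=1 \
    --master_addr="master的ip" --master_port=xxxxx YourScript.py
</code></pre></div></div>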

<h3 id="fsdp-ddp改进">FSDP (DDP改进)</h3>

<p>先回顾 DistributedDataParallel (DDP) 的训练方式:</p>

<p>多机多卡方式中</p>
<ul>
  <li>每个 process/worker 都有模型的一个<code class="language-plaintext highlighter-rouge">副本</code>(Replica)</li>
  <li>每个 process/worker 处理一个 batch 数据, 并行处理</li>
  <li>最后用 <code class="language-plaintext highlighter-rouge">all-reduce</code> 操作对多个不同 process/worker 计算得到的<strong>梯度</strong>进行累加求和;</li>
  <li>接着,每个 process/worker 用同步后的<strong>梯度</strong>在本地执行优化器更新;由于各副本的<strong>优化器状态</strong>保持一致,所有 process/worker 上的模型参数得到同步更新。</li>
</ul>

<p>DDP 中,模型权重和优化器状态在所有工作线程中复制。</p>
<ul>
  <li>核心能力还是训练<strong>数据并行</strong>(Data Parallel)</li>
  <li>DDP 没有实现对<code class="language-plaintext highlighter-rouge">模型参数</code>的<strong>分片管理</strong>,即<code class="language-plaintext highlighter-rouge">模型并行</code>(Model Parallel)</li>
</ul>

<p>PyTorch 1.11 中发布 <a href="https://pytorch.org/docs/1.11/fsdp.html">FSDP</a></p>
<ul>
  <li>FSDP 是一种数据并行性,可跨 DDP 等级分片<code class="language-plaintext highlighter-rouge">模型参数</code>、<code class="language-plaintext highlighter-rouge">优化器状态</code>和<code class="language-plaintext highlighter-rouge">梯度</code>。</li>
  <li><a href="https://pytorch.org/tutorials/intermediate/FSDP_tutorial.html">Getting Started with Fully Sharded Data Parallel(FSDP)</a></li>
  <li><a href="http://shiyanjun.cn/archives/2292.html">PyTorch 分布式训练模式 FSDP 设计分析</a></li>
</ul>

<p>FSDP 实现了模型的<strong>分片管理</strong>能力,真正实现了<code class="language-plaintext highlighter-rouge">模型并行</code>。</p>
<ul>
  <li>将模型分片后,使用 FSDP 训练模型,每个 GPU 只保存模型的一个<strong>分片</strong>,这样能够使 <strong>GPU 的内存占用比 DDP 方式小得多</strong>,从而使分片的大模型和数据能够适配 GPU 容量,更有希望实现超大模型的分布式训练。</li>
  <li>问题: process/worker 节点之间的通信开销一定程度增加,但是可在 PyTorch 内部有针对性地进行优化来降低通信代价,比如对通信、计算进行 overlapping 能够很好地降低由此带来的网络开销。</li>
</ul>

<p>使用 FSDP 训练时,GPU 内存占用量比在所有工作线程上使用 DDP 进行训练时要小。</p>
<ul>
  <li>这使得更大的模型或更大的 batch size 能够放进单卡显存,让一些超大模型的训练变得可行。</li>
  <li>代价是通信量增加,但 FSDP 通过内部优化(例如通信与计算重叠)降低了这部分开销。</li>
</ul>

<p><a href="https://pytorch.org/tutorials/_images/fsdp_workflow.png">图解</a></p>
<ul>
  <li><img src="https://pytorch.org/tutorials/_images/fsdp_workflow.png" alt="" /></li>
</ul>

<h4 id="fsdp-原理">FSDP 原理</h4>

<p>FSDP 在不同阶段的基本处理过程,如下所示:</p>

<ul>
  <li>01 <code class="language-plaintext highlighter-rouge">初始化</code>阶段
    <ul>
      <li>分片模型参数,每个 rank 只有自己的分片</li>
    </ul>
  </li>
  <li>02 <code class="language-plaintext highlighter-rouge">forward</code> 阶段
    <ul>
      <li>运行 <code class="language-plaintext highlighter-rouge">all_gather</code>,收集所有 rank 上的模型参数分片,生成恢复得到模型参数,以保证满足当前 FSDP Unit 的计算需要</li>
      <li>运行 forward 计算过程</li>
      <li>丢掉所有被收集过的其它 rank 上的模型参数分片</li>
    </ul>
  </li>
  <li>03 <code class="language-plaintext highlighter-rouge">backward</code> 阶段
    <ul>
      <li>运行 <code class="language-plaintext highlighter-rouge">all_gather</code>, 收集所有 rank 上的模型参数分片,恢复全部的模型参数,以保证满足当前 FSDP Unit 的计算需要</li>
      <li>运行 backward 计算过程</li>
      <li>运行 <code class="language-plaintext highlighter-rouge">reduce_scatter</code>, 在所有 rank 之间同步<strong>梯度</strong></li>
      <li>丢掉所有从其它 rank 上收集过的模型参数分片</li>
    </ul>
  </li>
</ul>

<p>FSDP 的分片方式可以这样理解(列表后附一段通信原语的示意代码):</p>
<ul>
  <li>把 DDP 中的梯度 all-reduce 拆解为 <code class="language-plaintext highlighter-rouge">reduce-scatter</code> 和 <code class="language-plaintext highlighter-rouge">all-gather</code> 两步。</li>
  <li>反向传播过程中,FSDP 对梯度做 reduce-scatter,保证每个 rank 只持有属于自己的那份梯度分片。</li>
  <li>优化器步骤中,每个 rank 只更新自己对应的参数分片。</li>
  <li>最后,在后续的前向传播中,执行 all-gather 收集并拼出更新后的完整参数。</li>
</ul>
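<p>下面用 torch.distributed 的集合通信原语给出一个小示意(假设进程组已初始化、使用 nccl 后端,梯度用随机张量代替,且长度能被 world_size 整除),说明 all-reduce 如何拆成 reduce-scatter 与 all-gather 两步:</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch
import torch.distributed as dist

# 假设 dist.init_process_group 已完成,且每个 rank 各占一张 GPU
world_size = dist.get_world_size()
grad = torch.randn(1024, device="cuda")            # 用随机张量代替真实梯度,仅作演示
shards = list(grad.chunk(world_size))              # 把梯度切成 world_size 份
my_shard = torch.zeros_like(shards[0])
dist.reduce_scatter(my_shard, shards)              # 每个 rank 只拿到属于自己那一份的求和结果
gathered = [torch.zeros_like(my_shard) for _ in range(world_size)]
dist.all_gather(gathered, my_shard)                # 再把各分片收集齐,整体效果等价于一次 all-reduce
full_grad = torch.cat(gathered)
</code></pre></div></div>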

<h4 id="分片原理">分片原理</h4>

<p>FSDP 默认的<code class="language-plaintext highlighter-rouge">分片策略</code>(Sharding Strategy)是对<code class="language-plaintext highlighter-rouge">模型参数</code>、<code class="language-plaintext highlighter-rouge">梯度</code>、<code class="language-plaintext highlighter-rouge">优化器状态</code>都进行分片处理,即 <code class="language-plaintext highlighter-rouge">Zero3</code> 分片策略</p>
<ul>
  <li>编程中可以使用 <code class="language-plaintext highlighter-rouge">ShardingStrategy.FULL_SHARD</code> 来指定。</li>
</ul>

<p>对于 Zero2 分片策略,只对<code class="language-plaintext highlighter-rouge">梯度</code>、<code class="language-plaintext highlighter-rouge">优化器状态</code>进行分片处理</p>
<ul>
  <li>编程中可以使用 <code class="language-plaintext highlighter-rouge">ShardingStrategy.SHARD_GRAD_OP</code> 来指定。</li>
  <li>如果配置使用 Zero2 分片策略,那么所有模型参数都会全量加载到每个 rank 对应的 GPU 内,即每个 GPU 持有一个模型的副本。</li>
  <li>forward 阶段和 backward 阶段模型参数都在 GPU 内而不会被 offload 到 CPU,这样就不需要频繁地在多个 GPU 之间传输模型参数分片信息,能够在一定程度上降低 FSDP 集群的通信开销。</li>
</ul>
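<p>代码上,两种策略都通过 FSDP 的 sharding_strategy 参数指定,示意如下(model 为已构建好的 nn.Module,进程组已初始化,仅作示意):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
from torch.distributed.fsdp import ShardingStrategy

# Zero3(默认):模型参数、梯度、优化器状态全部分片
model = FSDP(model, sharding_strategy=ShardingStrategy.FULL_SHARD)

# Zero2:只分片梯度与优化器状态,每个 rank 保留完整的参数副本
# model = FSDP(model, sharding_strategy=ShardingStrategy.SHARD_GRAD_OP)
</code></pre></div></div>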

<p>FSDP 处理模型分片的总体流程</p>
<ul>
  <li>论文《PyTorch FSDP: Experiences on Scaling Fully Sharded Data Parallel》</li>
  <li><img src="http://shiyanjun.cn/wp-content/uploads/2024/01/fsdp_algorithm_overview.png" alt="" /></li>
</ul>

<p>模型具有 6 个层,FSDP 将其分解为 3 个 FSDP Unit,分别为</p>
<ul>
  <li>Unit0 = [layer0, layer3]</li>
  <li>Unit1 = [layer1, layer2]</li>
  <li>Unit2 = [layer4, layer5]</li>
</ul>

<p>进行 forward 和 backward 计算之前需要从其它 rank 上收集对应的参数分片,从而保证计算是正确的。</p>

<p>以 Unit1 为例, 说明如何进行分片处理,该 FSDP Unit 包含了 layer1 和 layer2 两层。</p>
<ul>
  <li>进行 forward 计算之前,需要把这两层参数在其它 rank 上的分片收集过来,使 layer1 和 layer2 两层的参数处于 Unsharded 状态,即保证参数完整以便计算;然后在本地执行 forward 计算,完成 layer1 和 layer2 这两层的计算逻辑。当 forward 计算完成后,会释放掉刚刚从其它 rank 上收集到的参数分片,以降低内存占用。每一轮 forward 计算,FSDP 一次只需要处理一个 Unit 的参数,其它 Unit 仍保持参数的分片状态。</li>
  <li>对于 backward 计算的过程也是类似的,它会先计算 layer2,再计算 layer1,在开始计算 layer2 层之前,FSDP 会从其它 rank 上收集 layer2、layer1 层的分片参数,恢复得到这两层完整的参数后,Autograd 引擎会继续完成 layer2、layer1 这两层的计算,随后释放掉从其它 rank 上收集过来的参数分片。接着,FSDP 会进行 reduce-scatter 操作对梯度进行累加并分片。当 backward 计算结束后,每个 rank 都只保存了模型参数和梯度的分片部分。</li>
</ul>

<h4 id="fsdp-模型初始化">FSDP 模型初始化</h4>

<p>FSDP 模型初始化时,通过指定一个 device_id 参数来绑定到指定的 GPU 上</p>
<ul>
  <li>首先模型的 Module 会在 CPU 中初始化</li>
  <li>然后加载到 GPU 内。</li>
</ul>

<p>通过指定 device_id 能够保证当 GPU 无法容纳大的模型时,它能够 offload 到 CPU 中,而不至于出现 OOM 的问题。</p>

<p>创建 FSDP 模型</p>
<ul>
  <li>只要将模型(继承自 nn.Module) model,通过 FSDP 进行 wrap 即可</li>
  <li>其中指定一些满足需要的配置选项</li>
</ul>

<p>参数</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">auto_wrap_policy</code>: 自动将模型分片处理,包括对<code class="language-plaintext highlighter-rouge">模型参数</code>、<code class="language-plaintext highlighter-rouge">优化器状态</code>、<code class="language-plaintext highlighter-rouge">梯度</code>进行分片,每个分片都放到一个不同的 FSDP Unit 中</li>
</ul>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">from</span> <span class="nn">torch.nn.parallel</span> <span class="kn">import</span> <span class="n">DistributedDataParallel</span> <span class="k">as</span> <span class="n">DDP</span>
<span class="kn">from</span> <span class="nn">torch.distributed.fsdp</span> <span class="kn">import</span> <span class="n">FullyShardedDataParallel</span> <span class="k">as</span> <span class="n">FSDP</span>

<span class="n">model</span> <span class="o">=</span> <span class="n">DDP</span><span class="p">(</span><span class="n">model</span><span class="p">)</span>

<span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">set_device</span><span class="p">(</span><span class="n">local_rank</span><span class="p">)</span>
<span class="n">model</span> <span class="o">=</span> <span class="n">FSDP</span><span class="p">(</span><span class="n">model</span><span class="p">,</span>
        <span class="n">auto_wrap_policy</span><span class="o">=</span><span class="n">t5_auto_wrap_policy</span><span class="p">,</span>
        <span class="n">mixed_precision</span><span class="o">=</span><span class="n">bfSixteen</span><span class="p">,</span>
        <span class="n">device_id</span><span class="o">=</span><span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">current_device</span><span class="p">())</span>

<span class="n">model</span> <span class="o">=</span> <span class="n">FSDP</span><span class="p">(</span><span class="n">model</span><span class="p">,</span>
    <span class="n">auto_wrap_policy</span><span class="o">=</span><span class="n">my_auto_wrap_policy</span><span class="p">,</span>
    <span class="n">cpu_offload</span><span class="o">=</span><span class="n">CPUOffload</span><span class="p">(</span><span class="n">offload_params</span><span class="o">=</span><span class="bp">True</span><span class="p">))</span>
</code></pre></div></div>

<p>注意</p>
<ul>
  <li>Transformer Encoder-Decoder 架构模型包含一些被 Encoder 和 Decoder 共享部分,比如 embedding 表,如果直接使用上面的 <code class="language-plaintext highlighter-rouge">auto_wrap_policy</code> 参数指定 Wrap Policy, 会使神经网络模型中这些共享的部分无法被共享</li>
  <li>所以只能把<strong>共享部分</strong>移动到 FSDP Unit 外部去,以便 Encoder 和 Decoder 都能访问这部分。</li>
  <li>PyTorch 1.12 引入处理这种情况的特性,为 Transformer 注册一个 <strong>共享 Layer</strong> 实现类,使 FSDP 的分片计划(Sharding Plan)实现高效的通信处理。</li>
</ul>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">t5_auto_wrap_policy</span> <span class="o">=</span> <span class="n">functools</span><span class="p">.</span><span class="n">partial</span><span class="p">(</span>
        <span class="n">transformer_auto_wrap_policy</span><span class="p">,</span>
        <span class="n">transformer_layer_cls</span><span class="o">=</span><span class="p">{</span>
            <span class="n">T5Block</span><span class="p">,</span> <span class="c1"># T5 Transformer 层的实现类,封装了 MHSA 和 FFN 两层
</span>        <span class="p">},</span>
    <span class="p">)</span>
<span class="n">torch</span><span class="p">.</span><span class="n">cuda</span><span class="p">.</span><span class="n">set_device</span><span class="p">(</span><span class="n">local_rank</span><span class="p">)</span>
<span class="n">model</span> <span class="o">=</span> <span class="n">FSDP</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">fsdp_auto_wrap_policy</span><span class="o">=</span><span class="n">t5_auto_wrap_policy</span><span class="p">)</span>
</code></pre></div></div>

<h2 id="分布式训练高层封装">分布式训练高层封装</h2>

<p>对 torch 几个流程进行一层封装【初始化、包装模型、优化器、数据加载】。</p>

<p>考虑几个因素</p>
<ul>
  <li>支持分布式训练<strong>模式丰富</strong>,如 CPU,单机单卡,单机多卡,多机多卡,FP16等</li>
  <li><strong>代码简单</strong>,不需要改动大量代码, 即可进行分布式训练</li>
  <li><strong>接口丰富</strong>,方便自定义。比如 能调用和访问底层分布式的一些变量如rank,worldsize,或实现或封装一些分布式函数,比如dist.gather/reduce等。</li>
</ul>

<p>得到更加易用的框架:</p>
<ul>
  <li>Accelerator</li>
  <li>Horovod</li>
</ul>

<p>这两个都是非常易用的分布式框架。 还有一些其他的,比如 <code class="language-plaintext highlighter-rouge">pytorch-lightning</code>,<code class="language-plaintext highlighter-rouge">deepspeed</code>。</p>

<p>以bert情感分类为例子,介绍了如何使用原生DDP和上面2个框架来进行分布式训练</p>
<ul>
  <li>代码见:<a href="https://github.com/ShomyLiu/torch-ddp-examples">torch-ddp-examples</a></li>
</ul>

<h3 id="accelerator">Accelerator</h3>

<p>由大名鼎鼎的 huggingface 发布的 Accelerator,专门适用于Pytorch 分布式训练框架:</p>
<ul>
  <li>GitHub: <a href="https://github.com/huggingface/accelerate">accelerate</a></li>
  <li>官网教程:<a href="https://huggingface.co/docs/accelerate">accelerate</a></li>
</ul>

<p>将单进程代码改为多进程分布式:</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">accelerate</span>
<span class="n">accelerator</span> <span class="o">=</span> <span class="n">accelerate</span><span class="p">.</span><span class="n">Accelerator</span><span class="p">()</span>
<span class="n">device</span> <span class="o">=</span> <span class="n">accelerator</span><span class="p">.</span><span class="n">device</span> <span class="c1">#获取当前进程的设备
</span><span class="p">...</span>
<span class="c1"># 进行封装
</span><span class="n">model</span><span class="p">,</span> <span class="n">optimizer</span><span class="p">,</span> <span class="n">dataloader</span> <span class="o">=</span> <span class="n">accelerator</span><span class="p">.</span><span class="n">prepare</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">optimizer</span><span class="p">,</span> <span class="n">dataloader</span><span class="p">)</span>

<span class="c1">#训练时 loss.backward() 换为:
</span><span class="n">accelerator</span><span class="p">.</span><span class="n">backward</span><span class="p">(</span><span class="n">loss</span><span class="p">)</span>
</code></pre></div></div>

<p>使用 CLI 命令行方式运行:先用 <code class="language-plaintext highlighter-rouge">accelerate config</code> 配置一次分布式训练参数,之后用 <code class="language-plaintext highlighter-rouge">accelerate launch</code> 运行。</p>
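<p>典型用法大致如下(train.py 仅为示例脚本名):</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code># 交互式生成分布式配置文件,只需执行一次
accelerate config
# 按生成的配置启动训练脚本
accelerate launch train.py
</code></pre></div></div>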

<p>除此之外,accelerator 还提供了一些很便利的接口,基本覆盖了分布式训练中需要用到的方法,比如:</p>
<ul>
  <li>accelerator.<code class="language-plaintext highlighter-rouge">print</code>: 仅仅在主进程输出</li>
  <li>accelerator.<code class="language-plaintext highlighter-rouge">process_index</code>: 当前进程ID,没有使用rank命名,而是用的process_index来表示</li>
  <li>accelerator.<code class="language-plaintext highlighter-rouge">is_local_main_process</code>/<code class="language-plaintext highlighter-rouge">is_main_processs</code>: 是否local_rank 或则rank为0, 主进程</li>
  <li>accelerator.<code class="language-plaintext highlighter-rouge">wait_for_everyone</code>(): 类似 dist.barrier() , 等所有进程到达这一步。</li>
  <li>accelerator.<code class="language-plaintext highlighter-rouge">save</code>: 保存模型</li>
  <li>kwargs_handlers: 可以定义DDP初始化的一些参数,比如最常用的就是 find_unused_parameters,比如:</li>
</ul>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">accelerate</span>
<span class="kn">from</span> <span class="nn">accelerate</span> <span class="kn">import</span> <span class="n">DistributedDataParallelKwargs</span> <span class="k">as</span> <span class="n">DDPK</span>
<span class="n">kwargs</span> <span class="o">=</span> <span class="n">DDPK</span><span class="p">(</span><span class="n">find_unused_parameters</span><span class="o">=</span><span class="bp">True</span><span class="p">)</span>
<span class="n">accelerator</span> <span class="o">=</span> <span class="n">accelerate</span><span class="p">.</span><span class="n">Accelerator</span><span class="p">(</span><span class="n">kwargs_handlers</span><span class="o">=</span><span class="p">[</span><span class="n">kwargs</span><span class="p">])</span>
</code></pre></div></div>

<p>accelerator 基本已经满足使用 Pytorch 进行分布训练的需求,而且十分符合 huggingface 风格,把某个小项目做到最好用,类似的还有 transformers, tokenizers, datasets 等等。</p>

<p>不足</p>
<ul>
  <li>accelerate 支持的 collective function 比较少,目前只有 all_gather。</li>
</ul>

<p>第二个常用的分布式库 Horovod 是一个通用的深度学习分布式训练框架,支持 TensorFlow、PyTorch、MXNet、Keras 等,因此比 Accelerator 要更重一些,但功能也更丰富,这里以 PyTorch 为例简单介绍。另外,Horovod 的安装相对复杂,需要针对具体环境参考 README 进行安装。</p>

<ul>
  <li>GitHub: <a href="https://github.com/horovod/horovod">horovod</a></li>
  <li>官网: <a href="https://horovod.ai/">horovod.ai</a></li>
</ul>

<p>Horovod 的使用也很简单,基本也是那几个流程:</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">horovod.torch</span> <span class="k">as</span> <span class="n">hvd</span>
<span class="c1"># 初始化
</span><span class="n">hvd</span><span class="p">.</span><span class="n">init</span><span class="p">()</span>
<span class="c1"># Samapler
# *此处num_replicas=hvd.size(), rank=hvd.rank()必须*
</span><span class="n">train_sampler</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">utils</span><span class="p">.</span><span class="n">data</span><span class="p">.</span><span class="n">distributed</span><span class="p">.</span><span class="n">DistributedSampler</span><span class="p">(</span>
    <span class="n">train_dataset</span><span class="p">,</span> <span class="n">num_replicas</span><span class="o">=</span><span class="n">hvd</span><span class="p">.</span><span class="n">size</span><span class="p">(),</span> <span class="n">rank</span><span class="o">=</span><span class="n">hvd</span><span class="p">.</span><span class="n">rank</span><span class="p">())</span>

<span class="n">train_loader</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">utils</span><span class="p">.</span><span class="n">data</span><span class="p">.</span><span class="n">DataLoader</span><span class="p">(</span><span class="n">train_dataset</span><span class="p">,</span> <span class="n">batch_size</span><span class="o">=</span><span class="p">...,</span> <span class="n">sampler</span><span class="o">=</span><span class="n">train_sampler</span><span class="p">)</span>
<span class="c1"># 优化器包装
</span><span class="n">optimizer</span> <span class="o">=</span> <span class="n">hvd</span><span class="p">.</span><span class="n">DistributedOptimizer</span><span class="p">(</span><span class="n">optimizer</span><span class="p">,</span> <span class="n">named_parameters</span><span class="o">=</span><span class="n">model</span><span class="p">.</span><span class="n">named_parameters</span><span class="p">())</span>
<span class="c1"># 模型分发广播
</span><span class="n">hvd</span><span class="p">.</span><span class="n">broadcast_parameters</span><span class="p">(</span><span class="n">model</span><span class="p">.</span><span class="n">state_dict</span><span class="p">(),</span> <span class="n">root_rank</span><span class="o">=</span><span class="mi">0</span><span class="p">)</span>
<span class="c1"># 模型训练不需要修改
</span></code></pre></div></div>

<p>horovod 支持的运行方式非常多,最常用的就是 horovodrun ,比如单机四卡运行:</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>horovodrun <span class="nt">-np</span> 4 <span class="nt">-H</span> localhost:4 python3 train.py
</code></pre></div></div>

<p>horovod 相比 accelerate 功能更加丰富,支持的接口,函数,框架都要多, 比如: hvd.all_reduce, hvd.all_gather等等。</p>

<h3 id="horovod">Horovod</h3>

<p>Horovod 是 Uber开源的跨平台的分布式训练工具,名字来自于俄国传统民间舞蹈,舞者手牵手围成一个圈跳舞,与Horovod设备之间的通信模式很像,有以下几个特点:</p>
<ul>
  <li>兼容TensorFlow、Keras和PyTorch机器学习框架。</li>
  <li>使用Ring-AllReduce算法,对比Parameter Server算法,有着无需等待,负载均衡的优点。</li>
  <li>实现简单,五分钟包教包会。</li>
</ul>

<p>Horovod环境准备以及示例代码,可参考<a href="https://zhuanlan.zhihu.com/p/351693076">上一篇</a></p>

<p>Pytorch 1.x 的<strong>多机多卡</strong>通信没有采用主流的 Parameter Server 结构,而是采用了与 Uber Horovod 相同思路的 Ring-AllReduce 方案(该算法最早由百度开源并在深度学习训练中推广)</p>

<p>Uber 的 Horovod 采用 RingAllReduce 计算方案,特点:网络单次通信量不随着 worker(GPU) 的增加而增加,是一个恒定值。</p>

<p>与 TreeAllReduce 不同,RingAllreduce 算法的每次通信成本是恒定的,与系统中 gpu 的数量无关,完全由系统中 gpu 之间最慢的连接决定。</p>
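<p>按常见的 Ring-AllReduce 带宽分析(粗略估算):对大小为 M 的数据做一次 all-reduce,每张卡的收发总量约为 2(N-1)/N × M(N 为卡数),N 增大时趋近于 2M,与卡数基本无关,这正是“单次通信量恒定”的来源。</p>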

<h2 id="分布式训练库">分布式训练库</h2>

<h3 id="常见框架">常见框架</h3>

<p>常见的分布式训练框架:</p>
<ul>
  <li>第一类:深度学习框架<strong>自带</strong>分布式训练功能。如:TensorFlow、PyTorch、MindSpore、Oneflow、PaddlePaddle等。</li>
  <li>第二类:基于现有的深度学习框架(如:PyTorch、Flax)进行<strong>扩展和优化</strong>,从而进行分布式训练。
    <ul>
      <li>如:<code class="language-plaintext highlighter-rouge">Megatron-LM</code>(张量并行)、<code class="language-plaintext highlighter-rouge">DeepSpeed</code>(Zero-DP)、<code class="language-plaintext highlighter-rouge">Colossal-AI</code>(高维模型并行,如2D、2.5D、3D)、<code class="language-plaintext highlighter-rouge">Alpa</code>(自动并行)等</li>
    </ul>
  </li>
</ul>

<h3 id="llm-复现选择">LLM 复现选择</h3>

<p>如何选择分布式训练框架? <a href="https://mp.weixin.qq.com/s/7wtwsNhf27YzALnSFXTmkA">参考</a></p>
<ul>
  <li>训练<strong>成本</strong>:不同训练工具,训练同样大模型,成本不一样。对于大模型,训练一次动辄上百万/千万美元的费用。合适的成本始终是正确的选择。</li>
  <li>训练<strong>类型</strong>:是否支持数据并行、张量并行、流水线并行、多维混合并行、自动并行等</li>
  <li><strong>效率</strong>:将普通模型训练代码变为分布式训练所需编写代码的行数,希望越少越好。</li>
  <li><strong>灵活性</strong>:选择的框架是否可以跨不同平台使用?</li>
</ul>

<p>目前训练超大规模语言模型主要有两条技术路线:</p>
<ul>
  <li>TPU + XLA + TensorFlow/JAX :由Google主导,由于TPU和自家云平台GCP深度绑定</li>
  <li>GPU + PyTorch + Megatron-LM + DeepSpeed :由 NVIDIA、Meta、Microsoft 等大厂加持,社区氛围活跃,也更受到大家欢迎。</li>
</ul>

<h3 id="deepspeed--微软">DeepSpeed – 微软</h3>

<p>DeepSpeed 是 Microsoft基于PyTorch研发的开源深度学习优化库。</p>
<ul>
  <li>目的: 降低大模型训练的门槛,提升大模型的训练的效率,帮助开发者更有效率地管理及优化大模型的训练、部署任务。</li>
</ul>

<p>详见站内专题: <a href="deepspeed">DeepSpeed</a></p>

<p>【2023-8-28】<a href="https://github.com/hiyouga/LLaMA-Efficient-Tuning/blob/main/README_zh.md">LLaMA Efficient Tuning</a></p>

<table>
  <thead>
    <tr>
      <th>方法</th>
      <th>全参数训练</th>
      <th>部分参数训练</th>
      <th>LoRA</th>
      <th>QLoRA</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td>预训练</td>
      <td>✅</td>
      <td>✅</td>
      <td>✅</td>
      <td>✅</td>
    </tr>
    <tr>
      <td>指令监督微调</td>
      <td>✅</td>
      <td>✅</td>
      <td>✅</td>
      <td>✅</td>
    </tr>
    <tr>
      <td>奖励模型训练</td>
      <td> </td>
      <td> </td>
      <td>✅</td>
      <td>✅</td>
    </tr>
    <tr>
      <td>PPO 训练</td>
      <td> </td>
      <td> </td>
      <td>✅</td>
      <td>✅</td>
    </tr>
    <tr>
      <td>DPO 训练</td>
      <td>✅</td>
      <td> </td>
      <td>✅</td>
      <td>✅</td>
    </tr>
  </tbody>
</table>

<h3 id="trl">trl</h3>

<p>【2024-3-13】<a href="https://huggingface.co/docs/trl/index">TRL - Transformer Reinforcement Learning</a></p>

<p>huggingface 推出的全栈库,包含一整套工具,用于使用强化学习 (Reinforcement Learning) 训练 transformer 语言模型。</p>
<ul>
  <li>从<strong>监督调优</strong> (Supervised Fine-tuning step, SFT),到训练<strong>奖励模型</strong> (Reward Modeling),再到<strong>近端策略优化</strong> (Proximal Policy Optimization),全面覆盖</li>
  <li><img src="https://huggingface.co/datasets/trl-internal-testing/example-images/resolve/main/images/TRL-readme.png" alt="" /></li>
  <li><a href="https://github.com/huggingface/trl">TRL</a> 库已经与 🤗 transformers 集成,直接使用!</li>
  <li>👉 文档<a href="https://hf.co/docs/trl/">地址</a></li>
  <li><img src="https://picx.zhimg.com/70/v2-1c818186d30b9afff9af2341b1eddc6f_1440w.avis?source=172ae18b&amp;biz_tag=Post" alt="" /></li>
</ul>

<p>API 文档里的功能(列表后以 SFTTrainer 为例给出最小用法):</p>
<ul>
  <li>Model Class: 公开模型各自用途</li>
  <li>SFTTrainer: SFTTrainer 实现模型监督调优</li>
  <li>RewardTrainer: RewardTrainer 训练奖励模型</li>
  <li>PPOTrainer: PPO 算法对经过监督调优的模型再调优</li>
  <li>Best-of-N Sampling: 将“拔萃法”作为从模型的预测中采样的替代方法</li>
  <li>DPOTrainer: 用 DPOTrainer 完成直接偏好优化</li>
</ul>
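<p>以 SFTTrainer 为例,官方文档里的最小用法大致如下(模型与数据集名称仅作示意,具体参数以所用 trl 版本的文档为准):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>from datasets import load_dataset
from trl import SFTTrainer

# 最小 SFT 示例:用 imdb 的文本字段对一个小模型做监督微调
dataset = load_dataset("imdb", split="train")
trainer = SFTTrainer(
    "facebook/opt-350m",          # 也可以传入已加载的模型对象
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=512,
)
trainer.train()
</code></pre></div></div>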

<p>文档中给出了几个例子:</p>
<ul>
  <li>Sentiment Tuning: 调优模型以生成更积极的电影内容</li>
  <li>Training with PEFT: 执行由 PEFT 适配器优化内存效率的 RLHF 训练</li>
  <li>Detoxifying LLMs: 通过 RLHF 为模型解毒,使其更符合人类的价值观</li>
  <li>StackLlama: 在 Stack exchange 数据集上实现端到端 RLHF 训练一个 Llama 模型</li>
  <li>Multi-Adapter Training: 使用单一模型和多适配器实现优化内存效率的端到端训练</li>
</ul>

<h4 id="trl-实践">Trl 实践</h4>

<p>【2023-6-30】<a href="https://zhuanlan.zhihu.com/p/616788557">使用TRL强化学习PPO控制文本的生成</a></p>

<p>步骤</p>
<ol>
  <li>初始化 GPT2 对话模型,即 LLM 模型。这里使用 Huggingface 中的这个中文对话模型
    <ul>
      <li><a href="https://huggingface.co/shibing624/gpt2-dialogbot-base-chinese">gpt2-dialogbot-base-chinese</a></li>
    </ul>
  </li>
  <li>初始化一个情感分类模型,即 RM 模型。这里笔者使用的是 Huggingface 中的这个情感分类模型
    <ul>
      <li>样本情感极性越正向,模型输出的得分越大。</li>
      <li><a href="https://huggingface.co/liam168/c2-roberta-base-finetuned-dianping-chinese">c2-roberta-base-finetuned-dianping-chinese</a></li>
    </ul>
  </li>
  <li>通过PPO强化学习算法,利用情感分类模型评估对话模型的输出,对GPT2对话模型进行优化,让GPT2对话模型的输出的结果在情感分类模型中得到高分。同时不破坏GPT2对话模型输出通顺对话的能力。</li>
</ol>

<p>强化学习训练流程</p>
<ol>
  <li>输入样本给GPT2, 拿到对话语言模型 GPT2的输出。</li>
  <li>将对话语言模型GPT2的输出 输入到 情感分类模型 拿到 情感分类模型的输出,作为reward。</li>
  <li>将对话语言模型GPT2 输入,输出, 以及 情感分类模型的 reward 一并输入给PPO优化器,让PPO优化器去优化对话语言模型GPT2。</li>
</ol>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">torch</span>
<span class="kn">from</span> <span class="nn">transformers</span> <span class="kn">import</span> <span class="n">AutoTokenizer</span>
<span class="kn">from</span> <span class="nn">trl</span> <span class="kn">import</span> <span class="n">PPOTrainer</span><span class="p">,</span> <span class="n">PPOConfig</span><span class="p">,</span> <span class="n">AutoModelForCausalLMWithValueHead</span><span class="p">,</span> <span class="n">create_reference_model</span>
<span class="kn">from</span> <span class="nn">trl.core</span> <span class="kn">import</span> <span class="n">respond_to_batch</span>
<span class="kn">import</span> <span class="nn">random</span>
<span class="kn">import</span> <span class="nn">torch.nn.functional</span> <span class="k">as</span> <span class="n">F</span>

<span class="c1"># get models
</span><span class="n">gen_model</span> <span class="o">=</span> <span class="n">AutoModelForCausalLMWithValueHead</span><span class="p">.</span><span class="n">from_pretrained</span><span class="p">(</span><span class="s">'dialoggpt/'</span><span class="p">)</span>
<span class="n">model_ref</span> <span class="o">=</span> <span class="n">create_reference_model</span><span class="p">(</span><span class="n">gen_model</span><span class="p">)</span>
<span class="n">tokenizerOne</span> <span class="o">=</span> <span class="n">AutoTokenizer</span><span class="p">.</span><span class="n">from_pretrained</span><span class="p">(</span><span class="s">'dialoggpt/'</span><span class="p">,</span><span class="n">padding_side</span><span class="o">=</span><span class="s">'left'</span><span class="p">)</span>
<span class="n">tokenizerOne</span><span class="p">.</span><span class="n">eos_token_id</span> <span class="o">=</span> <span class="n">tokenizerOne</span><span class="p">.</span><span class="n">sep_token_id</span>
<span class="c1"># 初始化一个情感分类模型,输入文本,判断文本的情感极性
</span><span class="kn">from</span> <span class="nn">transformers</span> <span class="kn">import</span> <span class="n">AutoModelForSequenceClassification</span> <span class="p">,</span> <span class="n">AutoTokenizer</span><span class="p">,</span> <span class="n">pipeline</span>

<span class="n">ts_texts</span> <span class="o">=</span> <span class="p">[</span><span class="s">"我喜欢下雨。"</span><span class="p">,</span> <span class="s">"我讨厌他."</span><span class="p">]</span>
<span class="n">cls_model</span> <span class="o">=</span> <span class="n">AutoModelForSequenceClassification</span><span class="p">.</span><span class="n">from_pretrained</span><span class="p">(</span><span class="s">"./chineseSentiment/"</span><span class="p">,</span> <span class="n">num_labels</span><span class="o">=</span><span class="mi">2</span><span class="p">)</span>
<span class="n">tokenizerTwo</span> <span class="o">=</span> <span class="n">AutoTokenizer</span><span class="p">.</span><span class="n">from_pretrained</span><span class="p">(</span><span class="s">"./chineseSentiment/"</span><span class="p">)</span>

<span class="n">classifier</span> <span class="o">=</span> <span class="n">pipeline</span><span class="p">(</span><span class="s">'sentiment-analysis'</span><span class="p">,</span> <span class="n">model</span><span class="o">=</span><span class="n">cls_model</span><span class="p">,</span> <span class="n">tokenizer</span><span class="o">=</span><span class="n">tokenizerTwo</span><span class="p">)</span>
<span class="n">classifier</span><span class="p">(</span><span class="n">ts_texts</span><span class="p">)</span>

<span class="c1"># 数据预处理
</span><span class="kn">from</span> <span class="nn">torch.utils.data</span> <span class="kn">import</span> <span class="n">Dataset</span>
<span class="kn">import</span> <span class="nn">torch.nn.utils.rnn</span> <span class="k">as</span> <span class="n">rnn_utils</span>
<span class="kn">import</span> <span class="nn">json</span>

<span class="n">data</span> <span class="o">=</span> <span class="p">[]</span>
<span class="k">with</span> <span class="nb">open</span><span class="p">(</span><span class="s">"./train.txt"</span><span class="p">,</span> <span class="s">"r"</span><span class="p">,</span> <span class="n">encoding</span><span class="o">=</span><span class="s">"utf-8"</span><span class="p">)</span> <span class="k">as</span> <span class="n">f</span><span class="p">:</span>
    <span class="k">for</span> <span class="n">i</span> <span class="ow">in</span> <span class="n">f</span><span class="p">.</span><span class="n">readlines</span><span class="p">():</span>
        <span class="n">line</span> <span class="o">=</span> <span class="n">json</span><span class="p">.</span><span class="n">loads</span><span class="p">(</span><span class="n">i</span><span class="p">)</span>
        <span class="n">data</span><span class="p">.</span><span class="n">append</span><span class="p">(</span><span class="n">line</span><span class="p">)</span>


<span class="k">def</span> <span class="nf">preprocess_conversation</span><span class="p">(</span><span class="n">data</span><span class="p">):</span>
    <span class="n">sep_id</span> <span class="o">=</span> <span class="n">tokenizerOne</span><span class="p">.</span><span class="n">sep_token_id</span>
    <span class="n">cls_id</span> <span class="o">=</span> <span class="n">tokenizerOne</span><span class="p">.</span><span class="n">cls_token_id</span>
    <span class="n">dialogue_list</span> <span class="o">=</span> <span class="p">[]</span>
    <span class="k">for</span> <span class="n">conver</span> <span class="ow">in</span> <span class="n">data</span><span class="p">:</span>
        <span class="n">input_ids</span> <span class="o">=</span> <span class="p">[</span><span class="n">cls_id</span><span class="p">]</span>
        <span class="n">start</span> <span class="o">=</span> <span class="n">conver</span><span class="p">[</span><span class="s">"conversation"</span><span class="p">][</span><span class="mi">0</span><span class="p">]</span>
        <span class="c1"># print(start["utterance"])
</span>        <span class="n">input_ids</span> <span class="o">+=</span> <span class="n">tokenizerOne</span><span class="p">.</span><span class="n">encode</span><span class="p">(</span><span class="n">start</span><span class="p">[</span><span class="s">"utterance"</span><span class="p">],</span> <span class="n">add_special_tokens</span><span class="o">=</span><span class="bp">False</span><span class="p">)</span>
        <span class="n">input_ids</span><span class="p">.</span><span class="n">append</span><span class="p">(</span><span class="n">sep_id</span><span class="p">)</span>
        <span class="n">dialogue_list</span><span class="p">.</span><span class="n">append</span><span class="p">(</span><span class="n">input_ids</span><span class="p">)</span>
    <span class="k">return</span> <span class="n">dialogue_list</span>

<span class="c1"># 数据处理
</span><span class="n">dialogue_list</span> <span class="o">=</span> <span class="n">preprocess_conversation</span><span class="p">(</span><span class="n">data</span><span class="p">)</span>

<span class="k">class</span> <span class="nc">MyDataset</span><span class="p">(</span><span class="n">Dataset</span><span class="p">):</span>
    <span class="k">def</span> <span class="nf">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">data</span><span class="p">):</span>
        <span class="bp">self</span><span class="p">.</span><span class="n">data</span> <span class="o">=</span> <span class="n">data</span>

    <span class="k">def</span> <span class="nf">__getitem__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">index</span><span class="p">):</span>
        <span class="n">x</span> <span class="o">=</span> <span class="bp">self</span><span class="p">.</span><span class="n">data</span><span class="p">[</span><span class="n">index</span><span class="p">]</span>
        <span class="k">return</span> <span class="n">torch</span><span class="p">.</span><span class="n">tensor</span><span class="p">(</span><span class="n">x</span><span class="p">)</span>

    <span class="k">def</span> <span class="nf">__len__</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span>
        <span class="k">return</span> <span class="nb">len</span><span class="p">(</span><span class="bp">self</span><span class="p">.</span><span class="n">data</span><span class="p">)</span>
    
<span class="n">mydataset</span> <span class="o">=</span> <span class="n">MyDataset</span><span class="p">(</span><span class="n">dialogue_list</span><span class="p">)</span>

<span class="k">def</span> <span class="nf">collate_fn</span><span class="p">(</span><span class="n">batch</span><span class="p">):</span>
    <span class="n">padded_batch</span> <span class="o">=</span> <span class="n">rnn_utils</span><span class="p">.</span><span class="n">pad_sequence</span><span class="p">(</span><span class="n">batch</span><span class="p">,</span> <span class="n">batch_first</span><span class="o">=</span><span class="bp">True</span><span class="p">,</span> <span class="n">padding_value</span><span class="o">=</span><span class="n">tokenizerOne</span><span class="p">.</span><span class="n">sep_token_id</span><span class="p">)</span>
    <span class="k">return</span> <span class="n">padded_batch</span>

<span class="c1"># 定义PPO优化器: 学习率,强化学习steps,batch_size等参数,学习率不宜调大,容易把LLM语言模型调坏。
</span><span class="n">config</span> <span class="o">=</span> <span class="n">PPOConfig</span><span class="p">(</span>
    <span class="n">model_name</span><span class="o">=</span><span class="s">"gpt2-positive"</span><span class="p">,</span>
    <span class="n">learning_rate</span><span class="o">=</span><span class="mf">1.41e-5</span><span class="p">,</span>
    <span class="n">steps</span> <span class="o">=</span> <span class="mi">2000</span><span class="p">,</span>
    <span class="n">batch_size</span> <span class="o">=</span> <span class="mi">16</span>
<span class="p">)</span>

<span class="n">ppo_trainer</span> <span class="o">=</span> <span class="n">PPOTrainer</span><span class="p">(</span><span class="n">config</span><span class="p">,</span> <span class="n">gen_model</span><span class="p">,</span> <span class="n">model_ref</span><span class="p">,</span> <span class="n">tokenizerOne</span><span class="p">,</span> <span class="n">dataset</span><span class="o">=</span><span class="n">mydataset</span><span class="p">,</span> <span class="n">data_collator</span><span class="o">=</span><span class="n">collate_fn</span><span class="p">)</span>

<span class="n">rewards_list</span> <span class="o">=</span> <span class="p">[]</span>
<span class="k">for</span> <span class="n">epoch</span><span class="p">,</span> <span class="n">batch</span> <span class="ow">in</span> <span class="nb">enumerate</span><span class="p">(</span><span class="n">ppo_trainer</span><span class="p">.</span><span class="n">dataloader</span><span class="p">):</span>
    <span class="c1">#### Get response from gpt2
</span>    <span class="n">query_tensors</span> <span class="o">=</span> <span class="p">[]</span>
    <span class="n">response_tensors</span> <span class="o">=</span> <span class="p">[]</span>
    <span class="n">query_tensors</span> <span class="o">=</span> <span class="p">[</span><span class="n">torch</span><span class="p">.</span><span class="n">tensor</span><span class="p">(</span><span class="n">t</span><span class="p">).</span><span class="nb">long</span><span class="p">()</span> <span class="k">for</span> <span class="n">t</span> <span class="ow">in</span> <span class="n">batch</span><span class="p">]</span>
    <span class="k">for</span> <span class="n">query</span> <span class="ow">in</span> <span class="n">batch</span><span class="p">:</span>
        <span class="n">input_ids</span> <span class="o">=</span> <span class="n">query</span><span class="p">.</span><span class="n">unsqueeze</span><span class="p">(</span><span class="mi">0</span><span class="p">)</span>
        <span class="n">response</span> <span class="o">=</span> <span class="p">[]</span>
        <span class="k">for</span> <span class="n">_</span> <span class="ow">in</span> <span class="nb">range</span><span class="p">(</span><span class="mi">30</span><span class="p">):</span>
            <span class="n">outputs</span> <span class="o">=</span> <span class="n">ppo_trainer</span><span class="p">.</span><span class="n">model</span><span class="p">(</span><span class="n">input_ids</span><span class="o">=</span><span class="n">input_ids</span><span class="p">)</span>
            <span class="n">logits</span> <span class="o">=</span> <span class="n">outputs</span><span class="p">[</span><span class="mi">0</span><span class="p">]</span>
            <span class="n">next_token_logits</span> <span class="o">=</span> <span class="n">logits</span><span class="p">[</span><span class="mi">0</span><span class="p">,</span> <span class="o">-</span><span class="mi">1</span><span class="p">,</span> <span class="p">:]</span>
            <span class="n">next_token_logits</span><span class="p">[</span><span class="n">ppo_trainer</span><span class="p">.</span><span class="n">tokenizer</span><span class="p">.</span><span class="n">convert_tokens_to_ids</span><span class="p">(</span><span class="s">'[UNK]'</span><span class="p">)]</span> <span class="o">=</span> <span class="o">-</span><span class="nb">float</span><span class="p">(</span><span class="s">'Inf'</span><span class="p">)</span>
            <span class="n">next_token</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">multinomial</span><span class="p">(</span><span class="n">F</span><span class="p">.</span><span class="n">softmax</span><span class="p">(</span><span class="n">next_token_logits</span><span class="p">,</span> <span class="n">dim</span><span class="o">=-</span><span class="mi">1</span><span class="p">),</span> <span class="n">num_samples</span><span class="o">=</span><span class="mi">1</span><span class="p">)</span>
            <span class="k">if</span> <span class="n">next_token</span> <span class="o">==</span> <span class="n">ppo_trainer</span><span class="p">.</span><span class="n">tokenizer</span><span class="p">.</span><span class="n">sep_token_id</span><span class="p">:</span>  <span class="c1">#
</span>                <span class="k">break</span>
            <span class="n">input_ids</span> <span class="o">=</span> <span class="n">torch</span><span class="p">.</span><span class="n">cat</span><span class="p">((</span><span class="n">input_ids</span><span class="p">,</span> <span class="n">next_token</span><span class="p">.</span><span class="n">unsqueeze</span><span class="p">(</span><span class="mi">0</span><span class="p">)),</span> <span class="n">dim</span><span class="o">=</span><span class="mi">1</span><span class="p">)</span>
            <span class="n">response</span><span class="p">.</span><span class="n">append</span><span class="p">(</span><span class="n">next_token</span><span class="p">.</span><span class="n">item</span><span class="p">())</span>
        <span class="n">response_tensors</span><span class="p">.</span><span class="n">append</span><span class="p">(</span><span class="n">torch</span><span class="p">.</span><span class="n">Tensor</span><span class="p">(</span><span class="n">response</span><span class="p">).</span><span class="nb">long</span><span class="p">())</span>
    <span class="n">responseSet</span> <span class="o">=</span> <span class="p">[</span><span class="s">""</span><span class="p">.</span><span class="n">join</span><span class="p">(</span><span class="n">ppo_trainer</span><span class="p">.</span><span class="n">tokenizer</span><span class="p">.</span><span class="n">convert_ids_to_tokens</span><span class="p">([</span><span class="n">i</span><span class="p">.</span><span class="n">item</span><span class="p">()</span> <span class="k">for</span> <span class="n">i</span> <span class="ow">in</span> <span class="n">r</span><span class="p">]))</span> <span class="k">for</span> <span class="n">r</span> <span class="ow">in</span> <span class="n">response_tensors</span><span class="p">]</span>
    <span class="k">print</span><span class="p">(</span><span class="n">responseSet</span><span class="p">)</span>

    <span class="c1">#### Get reward from sentiment model
</span>    <span class="n">pipe_outputs</span> <span class="o">=</span> <span class="n">classifier</span><span class="p">(</span><span class="n">responseSet</span><span class="p">)</span>
    <span class="n">rewards</span> <span class="o">=</span> <span class="p">[</span><span class="n">torch</span><span class="p">.</span><span class="n">tensor</span><span class="p">(</span><span class="n">output</span><span class="p">[</span><span class="s">"score"</span><span class="p">])</span> <span class="k">for</span> <span class="n">output</span> <span class="ow">in</span> <span class="n">pipe_outputs</span><span class="p">]</span>

    <span class="c1">#### Run PPO step
</span>    <span class="n">stats</span> <span class="o">=</span> <span class="n">ppo_trainer</span><span class="p">.</span><span class="n">step</span><span class="p">(</span><span class="n">query_tensors</span><span class="p">,</span> <span class="n">response_tensors</span><span class="p">,</span> <span class="n">rewards</span><span class="p">)</span>
    <span class="k">print</span><span class="p">(</span><span class="s">"epoch{}, reword is {}"</span><span class="p">.</span><span class="nb">format</span><span class="p">(</span><span class="n">epoch</span><span class="p">,</span> <span class="nb">sum</span><span class="p">(</span><span class="n">rewards</span><span class="p">)))</span>
    <span class="n">rewards_list</span><span class="p">.</span><span class="n">append</span><span class="p">(</span><span class="nb">sum</span><span class="p">(</span><span class="n">rewards</span><span class="p">))</span>
</code></pre></div></div>

<h3 id="trainer">Trainer</h3>

<p>Trainer 名称歧义</p>
<ul>
  <li>PyTorch Lightning有个 Trainer</li>
  <li>HuggingFace Transformers也有 Trainer</li>
  <li>还有一些github上封装的或者基于这两个继续封装的Trainer</li>
</ul>

<p>这里的 Trainer 指 Huggingface 的 Trainer 训练框架</p>

<p>Trainer 的封装程度介于原生 torch 和 pytorch-lightning 之间,是辅助 torch 模型训练的轻量级 utils;其实稍微改造一下,huggingface 的 Trainer 也可以用来训练常规的非 NLP 的 torch 模型。</p>
<ul>
  <li>封装程度: <code class="language-plaintext highlighter-rouge">torch</code> &lt; <code class="language-plaintext highlighter-rouge">trainer</code> &lt; <code class="language-plaintext highlighter-rouge">pytorch lightning</code></li>
</ul>

<p>Trainer 封装了 PyTorch 训练过程,包括:<strong>前向传播</strong>、<strong>反向传播</strong>和<strong>参数更新</strong>等步骤,用户只需要设计模型,调参就行</p>

<p>高级的 Trainer 加上了各种功能,比如:<strong>日志记录</strong>,<strong>断点重训</strong>,<strong>训练方式</strong>与<strong>精度</strong>,支持各种分布式训练框架像原生、Apex、Deepspeed和Fairscale,支持自定的回调函数等等</p>

<p>Lightning 官网的一张gif还是比较生动形象</p>

<h4 id="trainer-定义">Trainer 定义</h4>

<p><a href="https://github.com/huggingface/transformers/blob/v4.34.1/src/transformers/trainer.py#L236">trainer.py</a></p>

<p>do_train、do_eval、do_predict 这三个参数 Trainer 本身并不直接使用,主要用于训练脚本里的流程控制(见下文 TrainingArguments 参数说明)</p>

<h4 id="自定义">自定义</h4>

<h5 id="model_init">model_init</h5>

<p>model_init</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">def</span> <span class="nf">model_init</span><span class="p">():</span>
    <span class="n">model</span> <span class="o">=</span> <span class="n">AutoModelForSequenceClassification</span><span class="p">.</span><span class="n">from_pretrained</span><span class="p">(</span>
        <span class="n">model_args</span><span class="p">.</span><span class="n">model_name_or_path</span><span class="p">,</span>
        <span class="n">from_tf</span><span class="o">=</span><span class="nb">bool</span><span class="p">(</span><span class="s">".ckpt"</span> <span class="ow">in</span> <span class="n">model_args</span><span class="p">.</span><span class="n">model_name_or_path</span><span class="p">),</span>
        <span class="n">config</span><span class="o">=</span><span class="n">config</span><span class="p">,</span>
        <span class="n">cache_dir</span><span class="o">=</span><span class="n">model_args</span><span class="p">.</span><span class="n">cache_dir</span><span class="p">,</span>
        <span class="n">revision</span><span class="o">=</span><span class="n">model_args</span><span class="p">.</span><span class="n">model_revision</span><span class="p">,</span>
        <span class="n">use_auth_token</span><span class="o">=</span><span class="bp">True</span> <span class="k">if</span> <span class="n">model_args</span><span class="p">.</span><span class="n">use_auth_token</span> <span class="k">else</span> <span class="bp">None</span>
    <span class="p">)</span>
    <span class="k">return</span> <span class="n">model</span>
</code></pre></div></div>

<h5 id="compute_metrics">compute_metrics</h5>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">def</span> <span class="nf">compute_metrics</span><span class="p">(</span><span class="n">p</span><span class="p">:</span> <span class="n">EvalPrediction</span><span class="p">)</span> <span class="o">-&gt;</span> <span class="n">Dict</span><span class="p">:</span>
    <span class="n">preds</span><span class="p">,</span><span class="n">labels</span><span class="o">=</span><span class="n">p</span>
    <span class="n">preds</span> <span class="o">=</span> <span class="n">np</span><span class="p">.</span><span class="n">argmax</span><span class="p">(</span><span class="n">preds</span><span class="p">,</span> <span class="n">axis</span><span class="o">=-</span><span class="mi">1</span><span class="p">)</span>
    <span class="c1">#print('shape:', preds.shape, '\n')
</span>    <span class="n">precision</span><span class="p">,</span> <span class="n">recall</span><span class="p">,</span> <span class="n">f1</span><span class="p">,</span> <span class="n">_</span> <span class="o">=</span> <span class="n">precision_recall_fscore_support</span><span class="p">(</span><span class="n">lables</span><span class="p">.</span><span class="n">flatten</span><span class="p">(),</span> <span class="n">preds</span><span class="p">.</span><span class="n">flatten</span><span class="p">(),</span> <span class="n">average</span><span class="o">=</span><span class="s">'weighted'</span><span class="p">,</span> <span class="n">zero_division</span><span class="o">=</span><span class="mi">0</span><span class="p">)</span>
    <span class="k">return</span> <span class="p">{</span>
        <span class="s">'accuracy'</span><span class="p">:</span> <span class="p">(</span><span class="n">preds</span> <span class="o">==</span> <span class="n">p</span><span class="p">.</span><span class="n">label_ids</span><span class="p">).</span><span class="n">mean</span><span class="p">(),</span>
        <span class="s">'f1'</span><span class="p">:</span> <span class="n">f1</span><span class="p">,</span>
        <span class="s">'precision'</span><span class="p">:</span> <span class="n">precision</span><span class="p">,</span>
        <span class="s">'recall'</span><span class="p">:</span> <span class="n">recall</span>
    <span class="p">}</span>
</code></pre></div></div>

<h5 id="加权loss">加权loss</h5>

<p>分类任务中,类目不均衡时,采用加权loss</p>

<p>做法</p>
<ul>
  <li>(1) 继承 Trainer 类, 重定义 compute_loss 函数</li>
  <li>(2) 使用回调函数 <a href="https://huggingface.co/docs/transformers/v4.34.1/en/main_classes/callback">callback</a></li>
</ul>

<p>示例</p>
<ul>
  <li>三分类问题,各类目加权 1 : 2 : 3</li>
</ul>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">from</span> <span class="nn">torch</span> <span class="kn">import</span> <span class="n">nn</span>
<span class="kn">from</span> <span class="nn">transformers</span> <span class="kn">import</span> <span class="n">Trainer</span>

<span class="k">class</span> <span class="nc">CustomTrainer</span><span class="p">(</span><span class="n">Trainer</span><span class="p">):</span>
    <span class="k">def</span> <span class="nf">compute_loss</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">model</span><span class="p">,</span> <span class="n">inputs</span><span class="p">,</span> <span class="n">return_outputs</span><span class="o">=</span><span class="bp">False</span><span class="p">):</span>
        <span class="n">labels</span> <span class="o">=</span> <span class="n">inputs</span><span class="p">.</span><span class="n">pop</span><span class="p">(</span><span class="s">"labels"</span><span class="p">)</span>
        <span class="c1"># forward pass
</span>        <span class="n">outputs</span> <span class="o">=</span> <span class="n">model</span><span class="p">(</span><span class="o">**</span><span class="n">inputs</span><span class="p">)</span>
        <span class="n">logits</span> <span class="o">=</span> <span class="n">outputs</span><span class="p">.</span><span class="n">get</span><span class="p">(</span><span class="s">"logits"</span><span class="p">)</span>
        <span class="c1"># compute custom loss (suppose one has 3 labels with different weights)
</span>        <span class="n">loss_fct</span> <span class="o">=</span> <span class="n">nn</span><span class="p">.</span><span class="n">CrossEntropyLoss</span><span class="p">(</span><span class="n">weight</span><span class="o">=</span><span class="n">torch</span><span class="p">.</span><span class="n">tensor</span><span class="p">([</span><span class="mf">1.0</span><span class="p">,</span> <span class="mf">2.0</span><span class="p">,</span> <span class="mf">3.0</span><span class="p">],</span> <span class="n">device</span><span class="o">=</span><span class="n">model</span><span class="p">.</span><span class="n">device</span><span class="p">))</span>
        <span class="n">loss</span> <span class="o">=</span> <span class="n">loss_fct</span><span class="p">(</span><span class="n">logits</span><span class="p">.</span><span class="n">view</span><span class="p">(</span><span class="o">-</span><span class="mi">1</span><span class="p">,</span> <span class="bp">self</span><span class="p">.</span><span class="n">model</span><span class="p">.</span><span class="n">config</span><span class="p">.</span><span class="n">num_labels</span><span class="p">),</span> <span class="n">labels</span><span class="p">.</span><span class="n">view</span><span class="p">(</span><span class="o">-</span><span class="mi">1</span><span class="p">))</span>
        <span class="k">return</span> <span class="p">(</span><span class="n">loss</span><span class="p">,</span> <span class="n">outputs</span><span class="p">)</span> <span class="k">if</span> <span class="n">return_outputs</span> <span class="k">else</span> <span class="n">loss</span>
</code></pre></div></div>
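<p>使用时,把原来的 Trainer 替换为这里的 CustomTrainer 即可,其余训练代码无需改动。</p>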

<h4 id="参数详解">参数详解</h4>

<p><a href="https://huggingface.co/docs/transformers/v4.34.1/en/main_classes/trainer#trainer">Trainer 官网文档</a>,版本为4.34.0</p>

<h5 id="trainer类-参数">Trainer类 参数</h5>

<p>Transformers Trainer类 参数:</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">model</code> (<code class="language-plaintext highlighter-rouge">PreTrainedModel</code> 或 <code class="language-plaintext highlighter-rouge">torch.nn.Module</code>, 可选):训练、评估或预测的实例化模型
    <ul>
      <li>如果不提供,必须传递一个 <code class="language-plaintext highlighter-rouge">model_init</code> 来初始化一个模型。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">args</code> (TrainingArguments, 可选):训练参数
    <ul>
      <li>如果不提供,用 TrainingArguments 默认参数,其中 output_dir 设置为当前目录中的名为 “tmp_trainer” 的目录。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">data_collator</code> (DataCollator, 可选):用于从 train_dataset 或 eval_dataset 中构成batch的函数
    <ul>
      <li>如果未提供tokenizer,将默认使用 default_data_collator();如果提供,将使用 DataCollatorWithPadding 。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">train_dataset</code> (torch.utils.data.<code class="language-plaintext highlighter-rouge">Dataset</code> 或 torch.utils.data.<code class="language-plaintext highlighter-rouge">IterableDataset</code>, 可选):训练数据集
    <ul>
      <li>如果是 torch.utils.data.Dataset,则会自动删除模型的 forward() 方法不接受的列。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">eval_dataset</code> (Union[torch.utils.data.Dataset, Dict[str, torch.utils.data.Dataset]), 可选):同上,评估数据集
    <ul>
      <li>如果是字典,将对每个数据集进行评估,并在指标名称前附加字典的键值。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">tokenizer</code> (PreTrainedTokenizerBase, 可选):预处理数据的<strong>分词器</strong>
    <ul>
      <li>如果提供,将在批量输入时自动把输入填充到最大长度,并会保存在模型目录下,便于重新运行中断的训练或重复微调模型。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">model_init</code> (Callable[[], PreTrainedModel], 可选):模型实例化函数
    <ul>
      <li>如果提供,每次调用 train() 时都会从此函数给出的模型的新实例开始。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">compute_metrics</code> (Callable[<code class="language-plaintext highlighter-rouge">[EvalPrediction]</code>, Dict], 可选):评估时<strong>计算指标</strong>的函数,必须接受 EvalPrediction 作为入参,并返回一个字典,其中包含了不同性能指标的名称和相应的数值,一般是准确度、精确度、召回率、F1 分数等。</li>
  <li><code class="language-plaintext highlighter-rouge">callbacks</code> (TrainerCallback 列表, 可选):自定义<strong>回调函数</strong>
    <ul>
      <li>如果要删除使用的默认回调函数,要使用 Trainer.remove_callback() 方法。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">optimizers</code> (Tuple[torch.optim.Optimizer, torch.optim.lr_scheduler.LambdaLR], 可选):指定包含优化器和学习率调度器的元组(Tuple)
    <ul>
      <li>元组的两个元素分别是<strong>优化器</strong>(torch.optim.Optimizer)和<strong>学习率调度器</strong>(torch.optim.lr_scheduler.LambdaLR),默认会创建一个基于AdamW优化器的实例,并使用 get_linear_schedule_with_warmup() 函数创建一个学习率调度器。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">preprocess_logits_for_metrics</code> (Callable[[torch.Tensor, torch.Tensor], torch.Tensor], 可选):指定函数,每次评估步骤(evaluation step)前,进入compute_metrics函数前对模型的输出 logits 进行<strong>预处理</strong>。
    <ul>
      <li>接受两个张量(tensors)作为参数,一个是模型的输出 logits,另一个是<strong>真实标签</strong>(labels)。</li>
      <li>然后返回一个经过预处理后的 logits 张量,给到compute_metrics函数作为参数。</li>
    </ul>
  </li>
</ul>
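<p>把上面这些参数串起来,一个最简的组装方式大致如下(model、train_dataset、eval_dataset、tokenizer、compute_metrics 等均为前文已定义的对象,仅作示意):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>from transformers import Trainer, TrainingArguments

training_args = TrainingArguments(output_dir="./output")   # 其余超参数见下一节 TrainingArguments
trainer = Trainer(
    model=model,                       # 或改用 model_init=model_init
    args=training_args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,
)
trainer.train()
</code></pre></div></div>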

<h5 id="trainingarguments-参数">TrainingArguments 参数</h5>

<p>args:超参数定义,trainer的重要功能,大部分训练相关的参数都是这里设置</p>

<p>TrainingArguments 有接近100个参数</p>

<p>TrainingArguments 参数</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">output_dir</code> (str):模型checkpoint/最终结果的输出目录。</li>
  <li><code class="language-plaintext highlighter-rouge">overwrite_output_dir</code> (bool, 可选,默认为 False):如果设置为True,将<strong>覆盖</strong>输出目录中已存在的内容
    <ul>
      <li>继续训练模型并且输出目录, 指向一个checkpoint目录。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">do_train</code> (bool, 可选,默认为 False):是否执行<strong>训练</strong>
    <ul>
      <li>其实Trainer 不直接使用此参数,主要是用于写脚本时,作为if的条件来判断是否执行接下来的代码。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">do_eval</code> (bool, 可选):是否在验证集上进行<strong>评估</strong>,如果评估策略(evaluation_strategy)不是”no”,将自动设置为True。
    <ul>
      <li>与do_train类似,不直接由Trainer使用,主要是用于写训练脚本。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">do_predict</code> (bool, 可选,默认为 False):是否在测试集上<strong>预测</strong>。</li>
  <li><code class="language-plaintext highlighter-rouge">evaluation_strategy</code> (str, 可选,默认为 “no”):指定训练期间采用的评估策略,可选值包括:
    <ul>
      <li>“no”:在训练期间不进行任何评估。</li>
      <li>“steps”:每隔 eval_steps 步骤进行评估。</li>
      <li>“epoch”:每个训练周期结束时进行评估。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">prediction_loss_only</code> (bool, 可选, 默认为 False):
    <ul>
      <li>如果设置为True,评估和预测时,只返回<strong>损失值</strong>,而不返回其他评估指标。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">per_device_train_batch_size</code> (int, 可选, 默认为 8):<strong>训练</strong>阶段,每个GPU/XPU/TPU/MPS/NPU/CPU的batch,每个训练步骤中每个硬件上的样本数量。</li>
  <li><code class="language-plaintext highlighter-rouge">per_device_eval_batch_size</code> (int, 可选, 默认为 8):<strong>评估</strong>阶段的每个GPU/XPU/TPU/MPS/NPU/CPU的batch,每个评估步骤中每个硬件上的样本数量。</li>
  <li><code class="language-plaintext highlighter-rouge">gradient_accumulation_steps</code> (int, 可选, 默认为 1):执行反向传播之前,<strong>梯度积累的更新步数</strong>。
    <ul>
      <li>梯度积累可以在多个batch上累积梯度,然后一次性执行反向传播,显存不够的情况下执行大batch的反向传播。</li>
      <li>假设4张卡,每张卡的batch size为8,那么一个steps的batch size就是32,如果这个参数设置为4,那么做反向传播的训练样本数量就是128。</li>
      <li>两个好处:①显存不够增大此参数;②能加快训练速度,毕竟做反向传播的次数少了。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">eval_accumulation_steps</code> (int, 可选):执行评估时,模型会累积多少个预测步骤的输出张量,然后才从GPU/NPU/TPU移动到CPU上,默认是整个评估的输出结果将在GPU/NPU/TPU上累积,然后一次性传输到CPU,速度更快,但占显存。</li>
  <li><code class="language-plaintext highlighter-rouge">eval_delay</code> (float, 可选):等待执行第一次评估的轮数或步数。
    <ul>
      <li>如果evaluation_strategy为”steps”,设置此参数为10,则10个steps后才进行首次评估。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">learning_rate</code> (float, 可选, 默认为 5e-5):AdamW优化器的<strong>初始学习率</strong>。</li>
  <li><code class="language-plaintext highlighter-rouge">weight_decay</code> (float, 可选, 默认为 0):<strong>权重衰减</strong>的值,应用在 AdamW 优化器所有层上,除了偏置(bias)和 Layer Normalization 层(LayerNorm)的权重上。
    <ul>
      <li>权重衰减是一种<strong>正则化</strong>手段,通过向损失函数添加一个额外的项,来惩罚较大的权重值,有助于防止模型<strong>过拟合</strong>训练数据。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">adam_beta1</code> (float, 可选, 默认为 0.9):AdamW优化器的beta1超参数。</li>
  <li><code class="language-plaintext highlighter-rouge">adam_beta2</code> (float, 可选, 默认为 0.999):AdamW优化器的beta2超参数。</li>
  <li><code class="language-plaintext highlighter-rouge">adam_epsilon</code> (float, 可选, 默认为 1e-8):AdamW优化器的epsilon超参数。</li>
  <li><code class="language-plaintext highlighter-rouge">max_grad_norm</code> (float, 可选, 默认为 1.0):梯度剪裁的最大梯度范数,可以防止梯度爆炸,一般都是1,如果某一步梯度的L2范数超过了 此参数,那么梯度将被重新缩放,确保它的大小不超过此参数。</li>
  <li><code class="language-plaintext highlighter-rouge">num_train_epochs</code> (float, 可选, 默认为 3.0):训练的<strong>总epochs数</strong>。</li>
  <li><code class="language-plaintext highlighter-rouge">max_steps</code> (int, 可选, 默认为 -1):如果设置为正数,执行的总训练步数,会覆盖 num_train_epochs。
    <ul>
      <li>注意:如果使用此参数,就算没有达到这个参数值的步数,训练也会在数据跑完后停止。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">lr_scheduler_type</code> (str, 可选, 默认为”linear”):学习率scheduler类型,根据训练进程来自动调整学习率。详细见:
    <ul>
      <li>“linear”:<strong>线性</strong>学习率scheduler,学习率以线性方式改变</li>
      <li>“cosine”:<strong>余弦</strong>学习率scheduler,学习率以余弦形状的方式改变。</li>
      <li>“constant”:<strong>常数</strong>学习率,学习率在整个训练过程中保持不变。</li>
      <li>“polynomial”:<strong>多项式</strong>学习率scheduler,学习率按多项式函数的方式变化。</li>
      <li>“piecewise”:<strong>分段常数</strong>学习率scheduler,每个阶段使用不同的学习率。</li>
      <li>“exponential”:<strong>指数</strong>学习率scheduler,学习率以指数方式改变。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">warmup_ratio</code> (float, 可选, 默认为0.0):线性热身占总训练步骤的比例,线性热身是一种训练策略,学习率在开始阶段从0逐渐增加到其最大值(通常是设定的学习率),然后在随后的训练中保持不变或者按照其他调度策略进行调整。如果设置为0.0,表示没有热身。</li>
  <li><code class="language-plaintext highlighter-rouge">warmup_steps</code> (int,可选, 默认为0):线性热身的步骤数,这个参数会覆盖warmup_ratio,如果设置了warmup_steps,将会忽略warmup_ratio。</li>
  <li><code class="language-plaintext highlighter-rouge">log_level</code> (str, 可选, 默认为passive):主进程上要使用的日志级别,
    <ul>
      <li><code class="language-plaintext highlighter-rouge">debug</code>:最详细的日志级别。</li>
      <li><code class="language-plaintext highlighter-rouge">info</code>:用于一般的信息性消息。</li>
      <li><code class="language-plaintext highlighter-rouge">warning</code>:用于警告信息。</li>
      <li><code class="language-plaintext highlighter-rouge">error</code>:用于错误信息。</li>
      <li><code class="language-plaintext highlighter-rouge">critical</code>:用于严重错误信息。</li>
      <li><code class="language-plaintext highlighter-rouge">passive</code>:不设置任何内容,将会使用Transformers库当前的日志级别(默认为”warning”)。</li>
      <li>建议训练时使用info级别。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">log_level_replica</code> (str, 可选, 默认为warning):副本上要使用的日志级别,与log_level相同。</li>
  <li><code class="language-plaintext highlighter-rouge">log_on_each_node</code> (bool, optional, defaults to True):在多节点分布式训练中,是否在每个节点上使用log_level进行日志记录。</li>
  <li><code class="language-plaintext highlighter-rouge">logging_dir</code> (str, 可选):TensorBoard日志目录。默认为output_dir/runs/CURRENT_DATETIME_HOSTNAME。</li>
  <li><code class="language-plaintext highlighter-rouge">logging_strategy</code> (str, 可选, 默认为”steps”):训练过程中采用的日志记录策略。可选包括:
    <ul>
      <li>“no”:在训练过程中不记录任何日志。</li>
      <li>“epoch”:在每个epoch结束时记录日志。</li>
      <li>“steps”:根据logging_steps参数记录日志。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">logging_steps</code> (int or float,可选, 默认为500):
    <ul>
      <li>如果logging_strategy=”steps”,则此参数为每隔多少步记录一次日志;如果是[0, 1)之间的浮点数,则视为占总训练步数的比例。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">logging_nan_inf_filter</code> (bool, 可选, 默认为 True):是否过滤日志记录中为nan和inf的loss
    <ul>
      <li>如果设置为True,将过滤每个步骤的loss,如果出现nan或inf,将取当前日志窗口的平均损失值。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">save_strategy</code> (str , 可选, 默认为 “steps”):训练过程中保存checkpoint的策略,包括:
    <ul>
      <li>“no”:在训练过程中不保存checkpoint。</li>
      <li>“epoch”:在每个epoch结束时保存checkpoint。</li>
      <li>“steps”:根据save_steps参数保存checkpoint。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">save_steps</code> (int or float, 可选, 默认为500):
    <ul>
      <li>如果save_strategy=”steps”,就是指两次checkpoint保存之间的更新步数;如果是[0, 1)之间的浮点数,则会被当作占总训练步数的比例。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">save_total_limit</code> (int, 可选):如果给定了参数,将限制checkpoint的总数,因为checkpoint也是很占硬盘的,将会删除输出目录中旧的checkpoint。
    <ul>
      <li>当启用load_best_model_at_end时,会根据metric_for_best_model保留最好的checkpoint,以及最近的checkpoint。</li>
      <li>当save_total_limit=5和指定load_best_model_at_end时,将始终保留最近的四个checkpoint以及最好的checkpoint;</li>
      <li>当save_total_limit=1和指定load_best_model_at_end时,会保存两个checkpoint:最后一个和最好的一个(如果两者不是同一个)。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">load_best_model_at_end</code> (bool, 可选, 默认为False):是否在训练结束时,加载在训练过程中最好的checkpoint
    <ul>
      <li>设置为 True 时,找到在验证集上指标最好的checkpoint并且保存,然后还会保存最后一个checkpoint</li>
      <li>在普通的多epoch训练中,最好设置为True</li>
      <li>但在大模型训练中,一般是一个epoch,使用的就是最后一个checkpoint。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">save_safetensors</code> (bool, 可选, 默认为False):是否在保存和加载模型参数时使用 “safetensors”
    <ul>
      <li>“safetensors” 更好地处理了不同 PyTorch 版本之间的模型参数加载的兼容性问题。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">save_on_each_node</code> (bool, 可选, 默认为 False):多节点分布式训练时,是否在每个节点上保存checkpoint,还是仅在主节点上保存。
    <ul>
      <li>注意如果多节点使用的是同一套存储设备,比如都是外挂一个nas,开启后会报错,因为文件名称都一样。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">use_cpu</code> (bool, 可选, 默认为 False):是否用CPU训练。如果设置为False,将使用CUDA或其他可用设备。</li>
  <li><code class="language-plaintext highlighter-rouge">seed</code> (int, 可选, 默认为42):训练过程的随机种子,确保训练的可重现性,主要用于model_init,随机初始化权重参数。</li>
  <li><code class="language-plaintext highlighter-rouge">data_seed</code> (int, 可选):数据采样的随机种子,如果没有设置将使用与seed相同的种子,可以确保数据采样的可重现性。</li>
  <li><code class="language-plaintext highlighter-rouge">jit_mode_eval</code> (bool, 可选, 默认为False):是否在推理(inference)过程中使用 PyTorch 的 JIT(Just-In-Time)跟踪功能
    <ul>
      <li>PyTorch JIT 是 PyTorch 的一个功能,用于将模型的前向传播计算编译成高性能的机器代码,会加速模型的推理。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">use_ipex</code> (bool, 可选, 默认为 False):是否使用英特尔扩展(Intel extension)来优化 PyTorch,需要安装IPEX
    <ul>
      <li>IPEX是一组用于优化深度学习框架的工具和库,提高训练和推理的性能,特别针对英特尔的处理器做了优化。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">bf16</code> (bool, 可选, 默认为False):是否使用bf16进行混合精度训练,而不是fp32训练,需要安培架构或者更高的NVIDIA架构,关于精度的问题可以看这篇文章:Glan格蓝:LLM大模型之精度问题(FP16,FP32,BF16)详解与实践
    <ul>
      <li>混合精度训练:模型主权重仍以<code class="language-plaintext highlighter-rouge">fp32</code>保存,但前向和反向传播的大部分计算使用<code class="language-plaintext highlighter-rouge">bf16</code>/<code class="language-plaintext highlighter-rouge">fp16</code>等低精度,这样可以减少显存占用和计算时间,提高训练速度。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">fp16</code> (bool,** 可选, 默认为<strong>**False)</strong>:是否使用fp16进行混合精度训练,而不是fp32训练。</li>
  <li><code class="language-plaintext highlighter-rouge">fp16_opt_level</code> (str, 可选, 默认为 ‘‘O1’’):对于fp16训练,选择的Apex AMP的优化级别,可选值有 [‘O0’, ‘O1’, ‘O2’和’O3’]。详细信息可以看Apex文档。</li>
  <li><code class="language-plaintext highlighter-rouge">half_precision_backend</code> (str, 可选, 默认为”auto”):混合精度训练(Mixed Precision Training)时要使用的后端,必须是 “auto”、”cuda_amp”、”apex”、”cpu_amp” 中的一个。
    <ul>
      <li>“auto”将根据检测到的PyTorch版本来使用后端,而其他选项将会强制使用请求的后端。使用默认就行。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">bf16_full_eval</code> (bool, 可选, 默认为 False):是否使用完全的bf16进行评估,而不是fp32。这样更快且省内存,但因为精度的问题指标可能会下降。</li>
  <li><code class="language-plaintext highlighter-rouge">fp16_full_eval</code> (bool, 可选, 默认为 False):同上,不过将使用fp16.</li>
  <li><code class="language-plaintext highlighter-rouge">tf32</code> (bool, 可选):是否启用tf32精度模式,适用于安培架构或者更高的NVIDIA架构,默认值取决于PyTorch的版本torch.backends.cuda.matmul.allow_tf32 默认值。</li>
  <li><code class="language-plaintext highlighter-rouge">local_rank</code> (int, 可选, 默认为 -1):在分布式训练中的当前进程(本地排名)的排名,这个用户不用设置,使用PyTorch分布式训练时会<strong>自动</strong>设置,默认为自动设置。</li>
  <li><code class="language-plaintext highlighter-rouge">ddp_backend</code> (str, 可选):处理分布式计算的后端框架,用于多个计算节点协同工作以加速训练,处理模型参数和梯度的同步、通信等操作,可选值如下
    <ul>
      <li>“<code class="language-plaintext highlighter-rouge">nccl</code>“:这是 NVIDIA Collective Communications Library (NCCL) 的后端。</li>
      <li>“<code class="language-plaintext highlighter-rouge">mpi</code>“:Message Passing Interface (MPI) 后端, 是一种用于不同计算节点之间通信的标准协议。</li>
      <li>“<code class="language-plaintext highlighter-rouge">ccl</code>“:这是 Intel的oneCCL (oneAPI Collective Communications Library) 的后端。</li>
      <li>“<code class="language-plaintext highlighter-rouge">gloo</code>“:这是Facebook开发的分布式通信后端。</li>
      <li>“<code class="language-plaintext highlighter-rouge">hccl</code>“:这是Huawei Collective Communications Library (HCCL) 的后端,用于华为昇腾NPU的系统上进行分布式训练。</li>
      <li>默认会根据系统自动设置,一般是nccl。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">tpu_num_cores</code> (int, 可选):TPU上训练时,TPU核心的数量。</li>
  <li><code class="language-plaintext highlighter-rouge">dataloader_drop_last</code> (bool, 可选, 默认为False):是否丢弃最后一个不完整的batch,发生在数据集的样本数量不是batch_size的整数倍的时候。</li>
  <li><code class="language-plaintext highlighter-rouge">eval_steps</code> (int or float, 可选):如果evaluation_strategy=”steps”,两次评估之间的更新步数,如果未设置,默认和设置和logging_steps相同的值,如果是在[0, 1)的浮点数,则就会当做与总评估步骤数的比例。</li>
  <li><code class="language-plaintext highlighter-rouge">dataloader_num_workers</code> (int, 可选, 默认为 0):数据加载时的子进程数量(仅用于PyTorch), PyTorch的num_workers参数,0表示数据将在主进程中加载。</li>
  <li><code class="language-plaintext highlighter-rouge">past_index</code> (int, 可选, 默认为 -1):一些模型(如TransformerXL或XLNet)可用过去的隐藏状态进行预测,如果将此参数设置为正整数,Trainer将使用相应的输出(通常索引为2)作为过去状态,并将其在下一个训练步骤中作为mems关键字参数提供给模型,只针对一些特定模型。</li>
  <li><code class="language-plaintext highlighter-rouge">run_name</code> (str, 可选):训练运行(run)的字符串参数,与日志记录工具(例如wandb和mlflow)一起使用,不影响训练过程,就是给其他的日志记录工具开了一个接口,个人还是比较推荐wandb比较好用。</li>
  <li><code class="language-plaintext highlighter-rouge">disable_tqdm</code> (bool, 可选):是否禁用Jupyter笔记本中的~notebook.NotebookTrainingTracker生成的tqdm进度条,如果日志级别设置为warn或更低,则将默认为True,否则为False。</li>
  <li><code class="language-plaintext highlighter-rouge">remove_unused_columns</code> (bool, 可选, 默认为True):是否自动删除模型在训练时,没有用到的数据列,默认会删除,比如你的数据有两列分别是content和id,如果没有用到id这一列,训练时就会被删除。</li>
  <li><code class="language-plaintext highlighter-rouge">label_names</code> (List[str], 可选):在模型的输入字典中对应于标签(labels)的键,默认情况下不需要显式指定。</li>
  <li><code class="language-plaintext highlighter-rouge">metric_for_best_model</code> (str, 可选):与 load_best_model_at_end 结合使用,比较不同模型的度量标准,默认情况下,如果未指定,将使用验证集的 “loss” 作为度量标准,可使用accuracy、F1、loss等。</li>
  <li><code class="language-plaintext highlighter-rouge">greater_is_better</code> (bool, 可选):与 load_best_model_at_end 和 metric_for_best_model 结合使用,这个和上面的那个参数是对应的,那个指标是越大越好还是越小越好
    <ul>
      <li>如果是loss, 越小越好,这个参数就会被设置为False;</li>
      <li>如果是accuracy,把这个值设为True。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">ignore_data_skip</code> (bool, 可选,默认为False):是否<strong>断点训练</strong>,即训练终止又恢复后,是否跳过之前的训练数据。</li>
  <li><code class="language-plaintext highlighter-rouge">resume_from_checkpoint</code> (str, 可选):从checkpoint恢复训练的路径。</li>
  <li><code class="language-plaintext highlighter-rouge">sharded_ddp</code> (bool, str 或 ShardedDDPOption 列表, 可选, 默认为’’):是否在分布式训练中使用 Sharded DDP(Sharded Data Parallelism),FairScale提供的,默认不使用
    <ul>
      <li>FairScale 是 Meta 开发的一个用于高性能和大规模训练的 PyTorch 扩展库。这个库扩展了基本的 PyTorch 功能,同时引入了最新的先进规模化技术,通过可组合的模块和易于使用的API,提供了最新的分布式训练技术。详细的可以看其官网。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">fsdp</code> (bool, str 或 FSDPOption 列表, 可选, 默认为’’):是否启用 PyTorch 的 <code class="language-plaintext highlighter-rouge">FSDP</code>(Fully Sharded Data Parallel Training),以及如何配置分布式并行训练。</li>
  <li><code class="language-plaintext highlighter-rouge">fsdp_config</code> (str 或 dict, 可选):配置 PyTorch 的 FSDP(Fully Sharded Data Parallel Training)的配置文件</li>
  <li><code class="language-plaintext highlighter-rouge">deepspeed</code> (str 或 dict, 可选):是否启用 DeepSpeed,以及如何配置 DeepSpeed。
    <ul>
      <li>目前分布式训练使用最多的框架,比上面pytorch原生分布式训练以及FairScale用的范围更广,详细的可以看其官网。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">label_smoothing_factor</code> (float, 可选,默认为0.0):标签平滑的因子。</li>
  <li><code class="language-plaintext highlighter-rouge">debug</code> (str 或 DebugOption 列表, 可选, 默认为’’):启用一个或多个调试功能,支持选项:
    <ul>
      <li>“underflow_overflow”:此选项用于检测模型输入/输出中的溢出。</li>
      <li>“tpu_metrics_debug”:此选项用于在 TPU 上打印调试指标。</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">optim</code> (str 或 training_args.OptimizerNames, 可选, 默认为 “adamw_torch”):要用的优化器。可选项:
    <ul>
      <li>“adamw_hf”</li>
      <li>“adamw_torch”</li>
      <li>“adamw_torch_fused”</li>
      <li>“adamw_apex_fused”</li>
      <li>“adamw_anyprecision”</li>
      <li>“adafactor”</li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">optim_args</code> (str, 可选):用于向特定类型的优化器(如adamw_anyprecision)提供额外的参数或自定义配置。</li>
  <li><code class="language-plaintext highlighter-rouge">group_by_length</code> (bool, 可选, 默认为 False):是否在训练数据集中对大致相同长度的样本进行分组然后放在一个batch里,目的是尽量减少在训练过程中进行的padding,提高训练效率。</li>
  <li><code class="language-plaintext highlighter-rouge">length_column_name</code> (str, 可选, 默认为 “length”):当上个参数设置为True时,可以给训练数据在增加一列”长度“,就是事先计算好的,可以加快分组的速度,默认是length。</li>
  <li><code class="language-plaintext highlighter-rouge">report_to</code> (str 或 str 列表, 可选, 默认为 “all”):要将训练结果和日志报告到的不同日记集成平台,有很多”azure_ml”, “clearml”, “codecarbon”, “comet_ml”, “dagshub”, “flyte”, “mlflow”, “neptune”, “tensorboard”, and “wandb”。直接默认就行,都发。</li>
  <li><code class="language-plaintext highlighter-rouge">ddp_find_unused_parameters</code> (bool, 可选):使用分布式训练时,这个参数用于控制是否查找并处理那些在计算中没有被使用的参数,如果启用了<strong>梯度检查点</strong>(gradient checkpointing),表示部分参数是惰性加载的,这时默认值为 False,因为梯度检查点本身已经考虑了未使用的参数,如果没有启用梯度检查点,默认值为 True,表示要查找并处理所有参数,以确保它们的梯度被正确传播。</li>
  <li><code class="language-plaintext highlighter-rouge">ddp_bucket_cap_mb</code> (int, 可选):在分布式训练中,数据通常分成小块进行处理,这些小块称为”桶”,这个参数每个桶的最大内存占用大小,一般自动分配即可。</li>
  <li><code class="language-plaintext highlighter-rouge">ddp_broadcast_buffers</code> (bool, 可选):分布式训练中,模型的某些部分可能包含缓冲区,如 Batch Normalization 层的统计信息,这个参数用于控制是否将这些缓冲区广播到所有计算设备,以确保模型在不同设备上保持同步,如果启用了梯度检查点,表示不需要广播缓冲区,因为它们不会被使用,如果没有启用梯度检查点,默认值为 True,表示要广播缓冲区,以确保模型的不同部分在所有设备上都一致。</li>
  <li><code class="language-plaintext highlighter-rouge">gradient_checkpointing</code> (bool, 可选, 默认为False):是否开启梯度检查点,简单解释一下:训练大型模型时需要大量的内存,其中在反向传播过程中,需要保存前向传播的中间计算结果以计算梯度,但是这些中间结果占用大量内存,可能会导致内存不足,梯度检查点会在训练期间释放不再需要的中间结果以减小内存占用,但它会使反向传播变得更慢。</li>
  <li><code class="language-plaintext highlighter-rouge">dataloader_pin_memory</code> (bool, 可选, 默认为 True):dataloader加载数据时,是否启用“pin memory”功能。“Pin memory” 用于将数据加载到GPU内存之前,将数据复制到GPU的锁页内存(pinned memory)中,锁页内存是一种特殊的内存,可以更快地传输数据到GPU,从而加速训练过程,但是会占用额外的CPU内存,会导致内存不足的问题,如果数据量特别大,百G以上建议False。</li>
  <li><code class="language-plaintext highlighter-rouge">skip_memory_metrics</code> (bool, 可选, 默认为 True):是否将内存分析报告添加到性能指标中,默认情况下跳过这一步,以提高训练和评估的速度,建议打开,更能够清晰的知道每一步的内存使用。</li>
  <li><code class="language-plaintext highlighter-rouge">include_inputs_for_metrics</code> (bool, 可选, 默认为 False):是否将输入传递给 compute_metrics 函数,一般计算metrics用的是用的是模型预测的结果和我们提供的标签,但是有的指标需要输入,比如cv的IoU(Intersection over Union)指标。</li>
  <li><code class="language-plaintext highlighter-rouge">auto_find_batch_size</code> (bool, 可选, 默认为 False):是否使用自动寻找适合内存的batch size大小,以避免 CUDA 内存溢出错误,需要安装 accelerate(使用 pip install accelerate),这个功能还是比较NB的。</li>
  <li><code class="language-plaintext highlighter-rouge">full_determinism</code> (bool, 可选, 默认为 False):如果设置为 True,将调用 enable_full_determinism() 而不是 set_seed(),训练过程将启用完全确定性(full determinism),在训练过程中,所有的随机性因素都将被消除,确保每次运行训练过程都会得到相同的结果,注意:会对性能产生负面影响,因此仅在调试时使用。</li>
  <li><code class="language-plaintext highlighter-rouge">torchdynamo</code> (str, 可选):用于选择 TorchDynamo 的后端编译器,TorchDynamo 是 PyTorch 的一个库,用于提高模型性能和部署效率,可选的选择包括 “eager”、”aot_eager”、”inductor”、”nvfuser”、”aot_nvfuser”、”aot_cudagraphs”、”ofi”、”fx2trt”、”onnxrt” 和 “ipex”。默认就行,自动会选。</li>
  <li><code class="language-plaintext highlighter-rouge">ray_scope</code> (str, 可选, 默认为 “last”):用于使用 Ray 进行超参数搜索时,指定要使用的范围,默认情况下,使用 “last”,Ray 将使用所有试验的最后一个检查点,比较它们并选择最佳的。详细的可以看一下它的文档。</li>
  <li><code class="language-plaintext highlighter-rouge">ddp_timeout</code> (int, 可选, 默认为 1800):用于 torch.distributed.init_process_group 调用的超时时间,在分布式运行中执行较慢操作时,用于避免超时,具体的可以看 PyTorch 文档 。
<code class="language-plaintext highlighter-rouge">torch_compile</code> (bool, 可选, 默认为 False):是否使用 PyTorch 2.0 及以上的 torch.compile 编译模型,具体的可以看 PyTorch 文档 。</li>
  <li><code class="language-plaintext highlighter-rouge">torch_compile_backend</code> (str, 可选):指定在 torch.compile 中使用的后端,如果设置为任何值,将启用 torch_compile。</li>
  <li><code class="language-plaintext highlighter-rouge">torch_compile_mode</code> (str, 可选):指定在 torch.compile 中使用的模式,如果设置为任何值,将启用 torch_compile。</li>
  <li><code class="language-plaintext highlighter-rouge">include_tokens_per_second</code> (bool, 可选):确定是否计算每个设备的每秒token数以获取训练速度指标,会在整个训练数据加载器之前进行迭代,会稍微减慢整个训练过程,建议打开。</li>
  <li><code class="language-plaintext highlighter-rouge">push_to_hub</code> (bool, 可选, 默认为 False):指定是否在每次保存模型时将模型推送到Huggingface Hub。</li>
  <li><code class="language-plaintext highlighter-rouge">hub_model_id</code> (str, 可选):指定要与本地 output_dir 同步的存储库的名称。</li>
  <li><code class="language-plaintext highlighter-rouge">hub_strategy</code> (str 或 HubStrategy, 可选, 默认为 “every_save”):指定怎么推送到Huggingface Hub。</li>
  <li><code class="language-plaintext highlighter-rouge">hub_token</code> (str, 可选):指定推送模型到Huggingface Hub 的token。</li>
  <li><code class="language-plaintext highlighter-rouge">hub_private_repo</code> (bool, 可选, 默认为 False):如果设置为 True,Huggingface Hub 存储库将设置为私有。</li>
  <li><code class="language-plaintext highlighter-rouge">hub_always_push</code> (bool, 可选, 默认为 False):是否每次都推送模型。</li>
</ul>
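
<p>把上面的常用参数组合起来,一个典型 SFT 场景下的 TrainingArguments 配置<strong>示意</strong>如下(数值仅供参考,需按自己的模型规模、显存和数据量调整):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>from transformers import TrainingArguments

# 示意配置:常用训练参数的一种组合,具体数值需按任务调整
args = TrainingArguments(
    output_dir="./output/sft-demo",
    overwrite_output_dir=True,
    num_train_epochs=2,
    per_device_train_batch_size=4,
    gradient_accumulation_steps=8,      # 等效 batch = 4 x 8 x GPU数
    learning_rate=1e-5,                 # SFT 常取预训练学习率的 0.1 倍左右
    lr_scheduler_type="cosine",
    warmup_ratio=0.03,
    weight_decay=0.01,
    bf16=True,                          # 需安培及以上架构,否则可改用 fp16=True
    gradient_checkpointing=True,        # 显存紧张时开启,以速度换显存
    logging_strategy="steps",
    logging_steps=10,
    save_strategy="steps",
    save_steps=500,
    save_total_limit=3,
    report_to="tensorboard",
)
# 之后将 args 传给 Trainer(args=args, ...) 即可
</code></pre></div></div>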

<p>详见</p>
<ul>
  <li><a href="https://zhuanlan.zhihu.com/p/662619853">LLM大模型之Trainer以及训练参数</a></li>
</ul>

<h3 id="firefly">Firefly</h3>

<p><a href="https://github.com/yangjianxin1/Firefly">Firefly</a> 是开源的大模型<strong>一站式训练框架</strong></p>
<ul>
  <li>支持对各种大模型进行<strong>预训练</strong>、<strong>指令微调</strong>、<code class="language-plaintext highlighter-rouge">DPO</code>,支持全量参数、LoRA、QLoRA等训练方式。</li>
  <li>支持包括但不限于Gemma、Qwen1.5、MiniCPM、Mixtral-8x7B、Mistral、Llama等绝大多数主流的大模型。</li>
</ul>

<p>【2024-3-5】<a href="https://mp.weixin.qq.com/s/C5X0qX2YsxhIoFvRsqcMMA">使用Firefly在单卡V100上对Qwen1.5进行SFT和DPO,大幅超越Qwen1.5和Gemma</a></p>

<p>该文介绍了用 Firefly 项目对 Qwen1.5-7B 进行训练的实验:先对训练数据进行精细化筛选,然后在单张 V100 上进行 SFT 和 DPO。经过两阶段训练,得到的模型在 Open LLM Leaderboard 上的表现显著优于官方的 Qwen1.5-7B-Chat、Gemma-7B-it、Vicuna-13B 等模型,比 Qwen1.5-7B-Chat 高 7.12 分,比 Gemma-7B-it 高 8.8 分。</p>

<h3 id="torchtune">TorchTune</h3>

<p>【2024-3-23】<a href="https://zhuanlan.zhihu.com/p/688671130?utm_psn=1755039674018496512">PyTorch官方发布LLM微调工具TorchTune</a></p>

<p>PyTorch官方最近发布了支持LLM微调的工具:<code class="language-plaintext highlighter-rouge">TorchTune</code>。</p>
<ul>
  <li><a href="https://pytorch.org/blog/torchtune-fine-tune-llms/">TorchTune</a> 是一个原生的 PyTorch 库,用于轻松编写、微调和实验大型语言模型(LLMs)</li>
</ul>

<h4 id="torchtune-功能">TorchTune 功能</h4>

<p>功能:</p>
<ul>
  <li>原生 PyTorch 实现的流行大型语言模型</li>
  <li>支持多种格式的checkpoints,包括 Hugging Face 格式的checkpoints</li>
  <li>针对流行微调技术的训练策略,带有参考基准和全面的校验检查</li>
  <li>与 HuggingFace 数据集集成用于训练,以及与 EleutherAI 的评估工具 Eval Harness 集成用于评估</li>
  <li>支持使用 PyTorch 分布式中的 FSDP 进行分布式训练</li>
  <li>YAML 配置文件,便于轻松配置训练运行</li>
  <li>[即将推出] 支持来自 TorchAO 的低精度数据类型和量化技术</li>
  <li>[即将推出] 与各种推理引擎的互操作性</li>
</ul>

<h4 id="torchtune-微调">TorchTune 微调</h4>

<p>TorchTune 已经支持了<strong>Llama2 7B模型</strong>的微调:</p>
<ul>
  <li>单卡微调:<a href="https://github.com/pytorch/torchtune/blob/main/recipes/full_finetune_single_device.py">https://github.com/pytorch/torchtune/blob/main/recipes/full_finetune_single_device.py</a></li>
  <li>分布式微调:<a href="https://github.com/pytorch/torchtune/blob/main/recipes/full_finetune_distributed.py">https://github.com/pytorch/torchtune/blob/main/recipes/full_finetune_distributed.py</a></li>
  <li>单卡LoRA:<a href="https://github.com/pytorch/torchtune/blob/main/recipes/lora_finetune_single_device.py">https://github.com/pytorch/torchtune/blob/main/recipes/lora_finetune_single_device.py</a></li>
  <li>分布式LoRA:<a href="https://github.com/pytorch/torchtune/blob/main/recipes/lora_finetune_distributed.py">https://github.com/pytorch/torchtune/blob/main/recipes/lora_finetune_distributed.py</a></li>
  <li>QLoRA:<a href="https://github.com/pytorch/torchtune/blob/main/recipes/lora_finetune_single_device.py">https://github.com/pytorch/torc</a></li>
</ul>

<h4 id="torchtune-安装">torchtune 安装</h4>

<p>torchtune 可以直接通过 pip 安装,也可以克隆仓库后从源码安装:</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code># ① pip 安装
pip install torchtune
# ② 源码安装
git clone https://github.com/pytorch/torchtune.git
cd torchtune
pip install -e .
</code></pre></div></div>

<h3 id="torchtitan">torchtitan</h3>

<p>【2024-4-28】<a href="https://github.com/pytorch/torchtitan">torchtitan</a> - 用于大型模型训练的原生 PyTorch 库</p>

<p><a href="https://github.com/pytorch/torchtitan">torchtitan</a> is a proof-of-concept (概念验证阶段) for Large-scale LLM training using native PyTorch.</p>
<ul>
  <li>It is (and will continue to be) a repo to showcase PyTorch’s latest distributed training features in a clean, minimal codebase.</li>
  <li><code class="language-plaintext highlighter-rouge">torchtitan</code> is complementary (补充) to and not a replacement (替代) for any of the great large-scale LLM training codebases such as <code class="language-plaintext highlighter-rouge">Megatron</code>, <code class="language-plaintext highlighter-rouge">Megablocks</code>, <code class="language-plaintext highlighter-rouge">LLM Foundry</code>, <code class="language-plaintext highlighter-rouge">Deepspeed</code>, etc.</li>
  <li>Instead, we hope that the features showcased in <code class="language-plaintext highlighter-rouge">torchtitan</code> will be adopted by these codebases quickly. torchtitan is unlikely to ever grow a large community around it.</li>
</ul>

<p>Our guiding principles when building torchtitan:</p>
<ul>
  <li>Designed to be easy to understand, use and extend for different training purposes.</li>
  <li>Minimal changes to the model code when applying 1D, 2D, or (soon) 3D Parallel.</li>
  <li>Modular components instead of a monolithic codebase.</li>
</ul>

<p>Get started in minutes, not hours!</p>

<h3 id="总结">总结</h3>

<p>Megatron-DeepSpeed 实施 3D 并行以可以让大型模型以非常有效的方式进行训练。</p>
<ul>
  <li>DataParallel (<code class="language-plaintext highlighter-rouge">DP</code>) - 相同的初始化模型被复制多次,并且每次都被馈送 minibatch 的一部分。处理是并行完成的,所有设置在每个训练步骤结束时进行同步。</li>
  <li>TensorParallel (<code class="language-plaintext highlighter-rouge">TP</code>) - 每个张量都被分成多个块,因此不是让整个张量驻留在单个 GPU 上,而是张量的每个分片都驻留在其指定的 GPU 上。在处理过程中,每个分片在不同的 GPU 上分别并行处理,最终结果在步骤结束时同步。这也被称作横向并行。</li>
  <li>PipelineParallel (<code class="language-plaintext highlighter-rouge">PP</code>) - 模型在多个 GPU 上垂直(层级)拆分,因此只有模型的一个或多个层放置在单个 GPU 上。每个 GPU 并行处理管道的不同阶段,并处理一小部分批处理。</li>
  <li>零冗余优化器 (<code class="language-plaintext highlighter-rouge">ZeRO</code>) - 也执行与 TP 有点类似的张量分片,除了整个张量会及时重建以进行前向或反向计算,因此不需要修改模型。它还支持各种卸载技术以补偿有限的 GPU 内存。</li>
</ul>

<p>训练超大规模语言模型主要有两条技术路线:</p>
<ul>
  <li>TPU + XLA + TensorFlow/JAX</li>
  <li>GPU + PyTorch + Megatron-LM + DeepSpeed</li>
  <li>前者由Google主导,由于TPU和自家云平台GCP深度绑定,对于非Googler来说, 只可远观而不可把玩</li>
  <li>后者背后则有NVIDIA、Meta、MS大厂加持,社区氛围活跃,也更受到群众欢迎。</li>
</ul>

<p>Deepspeed 是微软的大规模分布式训练工具。专门用于训练超大模型。</p>
<ul>
  <li><a href="https://zhuanlan.zhihu.com/p/609865550">大模型的训练工具(1)—Deepspeed</a></li>
  <li><code class="language-plaintext highlighter-rouge">DP</code>+<code class="language-plaintext highlighter-rouge">PP</code>: DeepSpeed 将 DP 与 PP 结合起来
    <ul>
      <li><img src="https://pic1.zhimg.com/80/v2-127d807df8f6efc7b1f8cb6d5ff38620_1440w.webp" alt="" /></li>
    </ul>
  </li>
  <li><code class="language-plaintext highlighter-rouge">DP</code>+<code class="language-plaintext highlighter-rouge">PP</code>+<code class="language-plaintext highlighter-rouge">TP</code>: 为了获得更高效的训练,PP 与 TP 和 DP 相结合,称为 3D 并行性
    <ul>
      <li><img src="https://pic1.zhimg.com/80/v2-7951815d9ab95beedf1d238bc58e73f0_1440w.webp" alt="" /></li>
    </ul>
  </li>
  <li>ZeRO DP+PP+TP: DeepSpeed 的主要功能之一是 ZeRO,它是 DP 的超级可扩展扩展。</li>
  <li>【2023-3-16】<a href="https://zhuanlan.zhihu.com/p/611325149">大型语言模型(LLM)训练指南</a></li>
</ul>

<p>增加的功能主要有:</p>
<ul>
  <li>3个维度并行化实现万亿参数模型训练</li>
  <li>ZeRO-Offload 使 GPU 单卡能够训练 10 倍大的模型</li>
  <li>通过 DeepSpeed Sparse Attention 用6倍速度执行10倍长的序列</li>
  <li>1 比特 Adam 减少 5 倍通信量</li>
</ul>

<p>3D 并行:扩展至万亿参数模型</p>

<p>3D 并行同时解决了训练万亿参数模型的两个基本挑战:显存效率和计算效率。因此,DeepSpeed 可以扩展至在显存中放下最巨大的模型,而不会牺牲速度。</p>
<ul>
  <li>显存效率:在给定集群显存下所能训练的 LLM 参数量。</li>
  <li>计算效率:系统总开销中用于<strong>有效计算</strong>的比例。</li>
</ul>

<p>(1)<strong>数据并行</strong>是分布式训练普遍使用的技术。</p>

<p>在该技术中,每批输入的训练数据都在数据并行的 worker 之间平分。反向传播后需要通信并规约梯度,以保证优化器在各个 worker 上进行相同的更新。数据并行性具有几个明显的优势,包括计算效率高和实现起来工作量小。但是,数据并行的 batch 大小随 worker 数量提高,而我们往往无法在不影响收敛性的情况下一直增加 batch 大小。</p>
<ul>
  <li>显存效率:数据并行会在所有 worker 之间进行模型和优化器的复制,因此显存效率不高。DeepSpeed 开发了 ZeRO ,它是一系列用于提高数据并行的显存效率的优化器。 这项工作依赖于 ZeRO 的 1 阶段,该阶段在 worker 之间划分优化器状态量以减少冗余。</li>
  <li>计算效率:随着我们提高并行度,每个 worker 执行的计算量是恒定的。数据并行可以在小规模上实现近乎线性扩展。但是,在 worker 之间规约梯度的通信开销跟模型大小成正相关,所以当模型很大或通信带宽很低时,计算效率会受限。梯度累积是一种用来均摊通信成本的常用策略:它会进一步增加等效 batch 大小,在本地使用 micro-batch 多次进行正向和反向传播并累积梯度后,再进行梯度规约和优化器更新(基本写法见下方示意代码)。</li>
</ul>
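
<p>下面用纯 PyTorch 给出梯度累积的<strong>最小示意</strong>(用随机数据代替真实 batch,省略分布式与混合精度等细节):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch
from torch import nn

# 最小示意:梯度累积——每 accum_steps 个 micro-batch 才执行一次参数更新
model = nn.Linear(16, 1)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
accum_steps = 4                                   # 等效 batch = micro_batch x accum_steps

optimizer.zero_grad()
for step in range(16):                            # 16 个 micro-batch,共 4 次参数更新
    x, y = torch.randn(8, 16), torch.randn(8, 1)  # 随机数据代替真实 micro-batch
    loss = loss_fn(model(x), y) / accum_steps     # 按累积步数缩放,保持梯度量级一致
    loss.backward()                               # 每个 micro-batch 都做反向,梯度累加在 .grad 中
    if (step + 1) % accum_steps == 0:
        optimizer.step()                          # 累积到位后才更新参数
        optimizer.zero_grad()
</code></pre></div></div>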

<p>(2)<strong>模型并行</strong>是包含范围很广的一类技术。</p>

<p>它会在多个 worker 之间划分模型的各个层。就其本质而言,模型并行性的计算和通信因模型结构而异,因此在实现上有很大的工作量。DeepSpeed 借用了英伟达的 Megatron-LM 来为基于 Transformer 的语言模型提供大规模模型并行功能。模型并行会根据 worker 数量成比例地减少显存使用量,也是这三种并行度中显存效率最高的。但是其代价是计算效率最低。</p>
<ul>
  <li>显存效率:模型并行会根据 worker 数量成比例地减少显存使用量。至关重要的是,这是减少单个网络层的激活显存的唯一方法。DeepSpeed 通过在模型并行 worker 之间划分激活显存来进一步提高显存效率。</li>
  <li>计算效率:由于每次前向和反向传播中都需要额外通信激活值,模型并行的计算效率很低。模型并行需要高通信带宽,并且不能很好地扩展到通信带宽受限的节点。此外,每个模型并行worker 都会减少每个通信阶段之间执行的计算量,从而影响计算效率。模型并行性通常与数据并行性结合使用,以在内存和计算效率之间进行权衡。</li>
</ul>

<p>(3)<strong>流水线并行</strong>训练引擎也被包含在了这次发布的DeepSpeed中</p>

<p>流水线并行将模型的各层划分为可以并行处理的阶段。当一个阶段完成一个 micro-batch 的正向传递时,激活内存将被通信至流水线的下一个阶段。类似地,当下一阶段完成反向传播时,将通过管道反向通信梯度。必须同时计算多个 micro-batch 以确保流水线的各个阶段能并行计算。目前已经开发出了几种用于权衡内存和计算效率以及收敛行为的方法,例如 PipeDream。DeepSpeed 采用的方法是通过梯度累积来实现并行,并保持与传统数据并行和模型并行训练在相同的总 batch 大小下收敛情况相同。</p>
<ul>
  <li>显存效率:流水线并行减少的显存与流水线的阶段数成正比,使模型的大小可以随 worker 的数量线性扩展。但是,流水线并行不会减少每一层激活值的显存占用量。此外,每个 worker 必须存储同时运行的各个 micro-batch 的激活值,这导致流水线第一阶段的激活显存与单个 micro-batch 的总激活显存大致相同。一个万亿参数模型的单个 micro-batch 大约需要 19 GB 的激活显存,几乎占到新推出的英伟达 A100 GPU 总显存的一半。</li>
  <li>计算效率:流水线并行具有最低的通信量,因为它的通信量只和在各阶段边界的各层的激活值大小成正比。但是,它不能无限扩展。像模型并行一样,增加流水线大小会减少每个流水线阶段的计算量,这会降低计算与通信的比率。如果要实现好的计算效率,流水线并行还要求其每个阶段的计算负载完美的均衡。</li>
</ul>

<h3 id="llama-factory">LLaMA-Factory</h3>

<h4 id="llama-factory-介绍">LLaMA-Factory 介绍</h4>

<p>LLaMA Factory 由北航博士生推出,支持多种 LLM 训练与微调方式,包括:<strong>预训练</strong>、<strong>指令监督微调</strong>和<strong>奖励模型</strong>训练等。</p>
<ul>
  <li>支持<code class="language-plaintext highlighter-rouge">LoRA</code>和<code class="language-plaintext highlighter-rouge">QLoRA</code>微调策略,广泛集成了业界前沿的微调方法。</li>
  <li>特点: 支持多种LLM模型,提供了<strong>WebUI页面</strong>,使非开发人员也能微调。</li>
  <li>体验地址:<a href="https://modelscope.cn/studios/hiyouga/LLaMA-Board/summary">LLaMA-Board</a></li>
  <li>可视化界面 <a href="https://huggingface.co/spaces/hiyouga/LLaMA-Board">LLaMA-Board</a></li>
  <li>github: <a href="https://github.com/hiyouga/LLaMA-Factory">LLaMA-Factory</a>,附各阶段训练数据集</li>
  <li><img src="https://pic2.zhimg.com/80/v2-7b24a5941a9bf996cf35187ae351f6c1_1440w.webp" alt="" /></li>
</ul>

<p>功能</p>
<ul>
  <li>多种模型:LLaMA、Mistral、Mixtral-MoE、Qwen、Yi、Gemma、Baichuan、ChatGLM、Phi 等等。</li>
  <li>集成方法:(<strong>增量</strong>)预训练、指令监督微调、奖励模型训练、<code class="language-plaintext highlighter-rouge">PPO</code> 训练、<code class="language-plaintext highlighter-rouge">DPO</code> 训练和 <code class="language-plaintext highlighter-rouge">ORPO</code> 训练。</li>
  <li>多种精度:32 比特全参数微调、16 比特冻结微调、16 比特 LoRA 微调和基于 AQLM/AWQ/GPTQ/LLM.int8 的 2/4/8 比特 QLoRA 微调。</li>
  <li>先进算法:GaLore、DoRA、LongLoRA、LLaMA Pro、LoRA+、LoftQ 和 Agent 微调。</li>
  <li>实用技巧:FlashAttention-2、Unsloth、RoPE scaling、NEFTune 和 rsLoRA。</li>
  <li>实验监控:LlamaBoard、TensorBoard、Wandb、MLflow 等等。</li>
  <li>极速推理:基于 vLLM 的 OpenAI 风格 API、浏览器界面和命令行接口。</li>
</ul>

<p>详情参考</p>
<ul>
  <li><a href="https://zhuanlan.zhihu.com/p/684989699">使用LLaMA Factory对大型语言模型进行微调</a></li>
  <li>作者北航博士<a href="https://github.com/hiyouga">郑耀威</a>讲解 <a href="https://www.bilibili.com/video/BV1Gt421L7dt">全栈大模型微调框架LLaMA Factory:从预训练到RLHF的高效实现</a></li>
</ul>

<iframe src="//player.bilibili.com/player.html?aid=1801563508&amp;bvid=BV1Gt421L7dt&amp;cid=1463913844&amp;p=1&amp;autoplay=0" scrolling="no" border="0" frameborder="no" framespacing="0" allowfullscreen="true" height="600" width="100%"> </iframe>

<h4 id="llama-factory-安装">LLaMA-Factory 安装</h4>

<p>安装</p>
<ul>
  <li><a href="https://github.com/hiyouga/LLaMA-Factory/blob/main/README_zh.md#%E5%A6%82%E4%BD%95%E4%BD%BF%E7%94%A8">安装说明</a></li>
</ul>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c"># Clone the repository</span>
git clone https://github.com/hiyouga/LLaMA-Factory.git
<span class="c"># Create a virtual environment</span>
conda create <span class="nt">-n</span> llama_factory <span class="nv">python</span><span class="o">=</span>3.10
<span class="c"># Activate the virtual environment</span>
conda activate llama_factory
<span class="c"># Install dependencies</span>
<span class="nb">cd </span>LLaMA-Factory
pip <span class="nb">install</span> <span class="nt">-r</span> requirements.txt
</code></pre></div></div>

<h4 id="llama-factory-使用">LLaMA-Factory 使用</h4>

<p>多GPU分布式训练, 多种工具</p>
<ul>
  <li>huggingface Accelerate</li>
  <li>DeepSpeed</li>
</ul>

<p><a href="https://zhuanlan.zhihu.com/p/718263213?utm_psn=1815334840821751808">参考</a></p>

<h5 id="指令监督微调">指令监督微调</h5>

<p>Accelerate</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>accelerate launch   src/train.py <span class="se">\</span>
  <span class="nt">--ddp_timeout</span>  18000000 <span class="se">\</span>
    <span class="nt">--stage</span> sft <span class="se">\</span>
    <span class="nt">--do_train</span> <span class="se">\</span>
    <span class="nt">--model_name_or_path</span> /gemini/pretrain/Qwen1.5-4B/ <span class="se">\</span>
    <span class="nt">--dataset</span> alpaca_gpt4_data_zh,alpaca_gpt4_data_en,glaive_toolcall_zh_demo,adgen_local <span class="se">\</span>
    <span class="nt">--template</span> qwen <span class="se">\</span>
    <span class="nt">--finetuning_type</span> lora <span class="se">\</span>
    <span class="nt">--lora_target</span> q_proj,v_proj <span class="se">\</span>
    <span class="nt">--output_dir</span> path_to_sft_checkpoint <span class="se">\</span>
    <span class="nt">--overwrite_cache</span> <span class="se">\</span>
    <span class="nt">--overwrite_output_dir</span>
    <span class="nt">--per_device_train_batch_size</span> 2 <span class="se">\</span>
    <span class="nt">--gradient_accumulation_steps</span> 4 <span class="se">\</span>
    <span class="nt">--lr_scheduler_type</span> cosine <span class="se">\</span>
    <span class="nt">--logging_steps</span> 10 <span class="se">\</span>
    <span class="nt">--save_steps</span> 1000 <span class="se">\</span>
    <span class="nt">--learning_rate</span> 5e-5 <span class="se">\</span>
    <span class="nt">--num_train_epochs</span> 3.0 <span class="se">\</span>
    <span class="nt">--plot_loss</span> <span class="se">\</span>
    <span class="nt">--fp16</span>
</code></pre></div></div>
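
<p>命令中 <code class="language-plaintext highlighter-rouge">--finetuning_type lora --lora_target q_proj,v_proj</code> 大致等价于用 PEFT 做如下配置(仅为示意 LoRA 的注入位置,r、alpha 等数值是假设值,与 LLaMA-Factory 内部默认值不一定相同):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# 示意:LoRA 只注入注意力层的 q_proj / v_proj,对应 --lora_target q_proj,v_proj
model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen1.5-4B")   # 模型路径仅为示意
lora_config = LoraConfig(
    r=8,                       # 低秩矩阵的秩(假设值)
    lora_alpha=16,             # 缩放系数(假设值)
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()   # 查看可训练参数占总参数的比例
</code></pre></div></div>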

<p>使用 DeepSpeed</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>deepspeed <span class="nt">--num_gpus</span> 2   src/train.py <span class="se">\</span>
 <span class="nt">--deepspeed</span> ds_config.json <span class="se">\</span>
  <span class="nt">--ddp_timeout</span>  18000000 <span class="se">\</span>
    <span class="nt">--stage</span> sft <span class="se">\</span>
    <span class="nt">--do_train</span> <span class="se">\</span>
    <span class="nt">--model_name_or_path</span> /gemini/pretrain/Qwen1.5-4B/ <span class="se">\</span>
    <span class="nt">--dataset</span> alpaca_zh_demo <span class="se">\</span>
    <span class="nt">--template</span> qwen <span class="se">\</span>
    <span class="nt">--finetuning_type</span> lora <span class="se">\</span>
    <span class="nt">--lora_target</span> q_proj,v_proj <span class="se">\</span>
    <span class="nt">--output_dir</span> path_to_sft_checkpoint <span class="se">\</span>
    <span class="nt">--overwrite_cache</span> <span class="se">\</span>
    <span class="nt">--overwrite_output_dir</span>
    <span class="nt">--per_device_train_batch_size</span> 4 <span class="se">\</span>
    <span class="nt">--gradient_accumulation_steps</span> 4 <span class="se">\</span>
    <span class="nt">--lr_scheduler_type</span> cosine <span class="se">\</span>
    <span class="nt">--logging_steps</span> 10 <span class="se">\</span>
    <span class="nt">--save_steps</span> 1000 <span class="se">\</span>
    <span class="nt">--learning_rate</span> 5e-5 <span class="se">\</span>
    <span class="nt">--num_train_epochs</span> 3.0 <span class="se">\</span>
    <span class="nt">--plot_loss</span> <span class="se">\</span>
    <span class="nt">--fp16</span>
</code></pre></div></div>
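
<p>上述命令中的 <code class="language-plaintext highlighter-rouge">ds_config.json</code> 需要自行提供,下面用 Python 生成一个 ZeRO-2 配置的<strong>最小示意</strong>(字段均为 DeepSpeed 常用配置项,“auto” 表示交给训练框架自动填充;stage 的选择与具体取值请按模型规模、显存和带宽调整):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import json

# 最小示意:生成 ZeRO-2 的 ds_config.json("auto" 交由训练框架自动填充)
ds_config = {
    "train_micro_batch_size_per_gpu": "auto",
    "gradient_accumulation_steps": "auto",
    "gradient_clipping": "auto",
    "zero_allow_untested_optimizer": True,
    "fp16": {"enabled": "auto"},
    "bf16": {"enabled": "auto"},
    "zero_optimization": {
        "stage": 2,                    # ZeRO-2:切分优化器状态与梯度
        "overlap_comm": True,          # 通信与计算重叠
        "contiguous_gradients": True,
        "allgather_partitions": True,
        "reduce_scatter": True,
    },
}
with open("ds_config.json", "w") as f:
    json.dump(ds_config, f, indent=2)
</code></pre></div></div>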

<h5 id="奖励模型训练">奖励模型训练</h5>

<p>Accelerate</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>accelerate launch   src/train.py <span class="se">\</span>
    <span class="nt">--stage</span> <span class="nb">rm</span> <span class="se">\</span>
    <span class="nt">--do_train</span> <span class="se">\</span>
    <span class="nt">--model_name_or_path</span> /gemini/pretrain/Qwen1.5-4B/ <span class="se">\</span>
    <span class="nt">--adapter_name_or_path</span> path_to_sft_checkpoint <span class="se">\</span>
    <span class="nt">--create_new_adapter</span> <span class="se">\</span>
    <span class="nt">--dataset</span> dpo_zh_demo <span class="se">\</span>
    <span class="nt">--template</span> qwen <span class="se">\</span>
    <span class="nt">--finetuning_type</span> lora <span class="se">\</span>
    <span class="nt">--lora_target</span> q_proj,v_proj <span class="se">\</span>
    <span class="nt">--output_dir</span> path_to_ac_rm_checkpoint <span class="se">\</span>
    <span class="nt">--per_device_train_batch_size</span> 2 <span class="se">\</span>
    <span class="nt">--gradient_accumulation_steps</span> 4 <span class="se">\</span>
    <span class="nt">--lr_scheduler_type</span> cosine <span class="se">\</span>
    <span class="nt">--logging_steps</span> 10 <span class="se">\</span>
    <span class="nt">--save_steps</span> 1000 <span class="se">\</span>
    <span class="nt">--learning_rate</span> 1e-5 <span class="se">\</span>
    <span class="nt">--num_train_epochs</span> 1.0 <span class="se">\</span>
    <span class="nt">--plot_loss</span> <span class="se">\</span>
    <span class="nt">--fp16</span>
</code></pre></div></div>

<p>使用 DeepSpeed</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>deepspeed <span class="nt">--num_gpus</span> 2   src/train.py <span class="se">\</span>
   <span class="nt">--deepspeed</span> ds_config.json <span class="se">\</span>
    <span class="nt">--stage</span> <span class="nb">rm</span> <span class="se">\</span>
    <span class="nt">--do_train</span> <span class="se">\</span>
    <span class="nt">--model_name_or_path</span> /gemini/pretrain/Qwen1.5-4B/ <span class="se">\</span>
    <span class="nt">--adapter_name_or_path</span> path_to_sft_checkpoint <span class="se">\</span>
    <span class="nt">--create_new_adapter</span> <span class="se">\</span>
    <span class="nt">--dataset</span> dpo_zh_demo <span class="se">\</span>
    <span class="nt">--template</span> qwen <span class="se">\</span>
    <span class="nt">--finetuning_type</span> lora <span class="se">\</span>
    <span class="nt">--lora_target</span> q_proj,v_proj <span class="se">\</span>
    <span class="nt">--output_dir</span> path_to_deep_rm_checkpoint <span class="se">\</span>
    <span class="nt">--per_device_train_batch_size</span> 2 <span class="se">\</span>
    <span class="nt">--gradient_accumulation_steps</span> 4 <span class="se">\</span>
    <span class="nt">--lr_scheduler_type</span> cosine <span class="se">\</span>
    <span class="nt">--logging_steps</span> 10 <span class="se">\</span>
    <span class="nt">--save_steps</span> 1000 <span class="se">\</span>
    <span class="nt">--learning_rate</span> 1e-5 <span class="se">\</span>
    <span class="nt">--num_train_epochs</span> 1.0 <span class="se">\</span>
    <span class="nt">--plot_loss</span> <span class="se">\</span>
    <span class="nt">--fp16</span>
</code></pre></div></div>

<h5 id="ppo-训练">ppo 训练</h5>

<p>Accelerate</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>accelerate launch src/train.py <span class="se">\</span>
    <span class="nt">--stage</span> ppo <span class="se">\</span>
    <span class="nt">--do_train</span> <span class="se">\</span>
    <span class="nt">--model_name_or_path</span> /gemini/pretrain/Qwen1.5-4B/ <span class="se">\</span>
    <span class="nt">--adapter_name_or_path</span> path_to_sft_checkpoint <span class="se">\</span>
    <span class="nt">--create_new_adapter</span> <span class="se">\</span>
    <span class="nt">--dataset</span> alpaca_zh_demo <span class="se">\</span>
    <span class="nt">--template</span> qwen <span class="se">\</span>
    <span class="nt">--finetuning_type</span> lora <span class="se">\</span>
    <span class="nt">--lora_target</span> q_proj,v_proj <span class="se">\</span>
    <span class="nt">--reward_model</span> path_to_ac_rm_checkpoint <span class="se">\</span>
    <span class="nt">--output_dir</span> path_to_ac_ppo_checkpoint <span class="se">\</span>
    <span class="nt">--per_device_train_batch_size</span> 2 <span class="se">\</span>
    <span class="nt">--gradient_accumulation_steps</span> 4 <span class="se">\</span>
    <span class="nt">--lr_scheduler_type</span> cosine <span class="se">\</span>
    <span class="nt">--top_k</span> 0 <span class="se">\</span>
    <span class="nt">--top_p</span> 0.9 <span class="se">\</span>
    <span class="nt">--logging_steps</span> 10 <span class="se">\</span>
    <span class="nt">--save_steps</span> 1000 <span class="se">\</span>
    <span class="nt">--learning_rate</span> 1e-5 <span class="se">\</span>
    <span class="nt">--num_train_epochs</span> 1.0 <span class="se">\</span>
    <span class="nt">--plot_loss</span> <span class="se">\</span>
    <span class="nt">--fp16</span>
</code></pre></div></div>

<p>deepspeed</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>deepspeed <span class="nt">--num_gpus</span> 2  src/train.py <span class="se">\</span>
    <span class="nt">--deepspeed</span> ds_config.json  <span class="se">\</span>
    <span class="nt">--stage</span> ppo <span class="se">\</span>
    <span class="nt">--do_train</span> <span class="se">\</span>
    <span class="nt">--model_name_or_path</span> /gemini/pretrain/Qwen1.5-4B/ <span class="se">\</span>
    <span class="nt">--adapter_name_or_path</span> path_to_sft_checkpoint <span class="se">\</span>
    <span class="nt">--create_new_adapter</span> <span class="se">\</span>
    <span class="nt">--dataset</span> alpaca_zh_demo <span class="se">\</span>
    <span class="nt">--template</span> qwen <span class="se">\</span>
    <span class="nt">--finetuning_type</span> lora <span class="se">\</span>
    <span class="nt">--lora_target</span> q_proj,v_proj <span class="se">\</span>
    <span class="nt">--reward_model</span> path_to_deep_rm_checkpoint <span class="se">\</span>
    <span class="nt">--output_dir</span> path_to_deep_ppo_checkpoint <span class="se">\</span>
    <span class="nt">--per_device_train_batch_size</span> 4 <span class="se">\</span>
    <span class="nt">--gradient_accumulation_steps</span> 4 <span class="se">\</span>
    <span class="nt">--lr_scheduler_type</span> cosine <span class="se">\</span>
    <span class="nt">--top_k</span> 0 <span class="se">\</span>
    <span class="nt">--top_p</span> 0.9 <span class="se">\</span>
    <span class="nt">--logging_steps</span> 10 <span class="se">\</span>
    <span class="nt">--save_steps</span> 1000 <span class="se">\</span>
    <span class="nt">--learning_rate</span> 1e-5 <span class="se">\</span>
    <span class="nt">--num_train_epochs</span> 1.0 <span class="se">\</span>
    <span class="nt">--plot_loss</span> <span class="se">\</span>
    <span class="nt">--fp16</span>
</code></pre></div></div>

<h5 id="dpo-训练">dpo 训练</h5>

<p>Accelerate</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>accelerate launch src/train.py <span class="se">\</span>
    <span class="nt">--stage</span> dpo <span class="se">\</span>
    <span class="nt">--do_train</span> <span class="se">\</span>
    <span class="nt">--model_name_or_path</span> /gemini/pretrain/Qwen1.5-4B/ <span class="se">\</span>
    <span class="nt">--adapter_name_or_path</span> path_to_sft_checkpoint <span class="se">\</span>
    <span class="nt">--create_new_adapter</span> <span class="se">\</span>
    <span class="nt">--dataset</span> dpo_zh_demo <span class="se">\</span>
    <span class="nt">--template</span> qwen <span class="se">\</span>
    <span class="nt">--finetuning_type</span> lora <span class="se">\</span>
    <span class="nt">--lora_target</span> q_proj,v_proj <span class="se">\</span>
    <span class="nt">--output_dir</span> path_to_ac_dpo_checkpoint <span class="se">\</span>
    <span class="nt">--per_device_train_batch_size</span> 2 <span class="se">\</span>
    <span class="nt">--gradient_accumulation_steps</span> 4 <span class="se">\</span>
    <span class="nt">--lr_scheduler_type</span> cosine <span class="se">\</span>
    <span class="nt">--logging_steps</span> 10 <span class="se">\</span>
    <span class="nt">--save_steps</span> 1000 <span class="se">\</span>
    <span class="nt">--learning_rate</span> 1e-5 <span class="se">\</span>
    <span class="nt">--num_train_epochs</span> 1.0 <span class="se">\</span>
    <span class="nt">--plot_loss</span> <span class="se">\</span>
    <span class="nt">--fp16</span> 
</code></pre></div></div>

<p>deepspeed</p>

<div class="language-sh highlighter-rouge"><div class="highlight"><pre class="highlight"><code>deepspeed <span class="nt">--num_gpus</span> 2   src/train.py <span class="se">\</span>
    <span class="nt">--deepspeed</span> ds_config.json  <span class="se">\</span>
    <span class="nt">--stage</span> dpo <span class="se">\</span>
    <span class="nt">--do_train</span> <span class="se">\</span>
    <span class="nt">--model_name_or_path</span> /gemini/pretrain/Qwen1.5-4B/ <span class="se">\</span>
    <span class="nt">--adapter_name_or_path</span> path_to_sft_checkpoint <span class="se">\</span>
    <span class="nt">--create_new_adapter</span> <span class="se">\</span>
    <span class="nt">--dataset</span> dpo_zh_demo <span class="se">\</span>
    <span class="nt">--template</span> qwen <span class="se">\</span>
    <span class="nt">--finetuning_type</span> lora <span class="se">\</span>
    <span class="nt">--lora_target</span> q_proj,v_proj <span class="se">\</span>
    <span class="nt">--output_dir</span> path_to_deep_dpo_checkpoint <span class="se">\</span>
    <span class="nt">--per_device_train_batch_size</span> 2 <span class="se">\</span>
    <span class="nt">--gradient_accumulation_steps</span> 4 <span class="se">\</span>
    <span class="nt">--lr_scheduler_type</span> cosine <span class="se">\</span>
    <span class="nt">--logging_steps</span> 10 <span class="se">\</span>
    <span class="nt">--save_steps</span> 1000 <span class="se">\</span>
    <span class="nt">--learning_rate</span> 1e-5 <span class="se">\</span>
    <span class="nt">--num_train_epochs</span> 1.0 <span class="se">\</span>
    <span class="nt">--plot_loss</span> <span class="se">\</span>
    <span class="nt">--fp16</span> 
</code></pre></div></div>
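
<p>DPO 不需要显式的奖励模型,而是直接用偏好对(chosen/rejected)优化策略模型,其核心损失可以写成几行代码(仅为公式层面的示意实现,与 LLaMA-Factory 的具体实现无关;输入假定为各回复在策略模型与参考模型下的序列对数概率):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """DPO 损失示意:beta 越大,对偏离参考模型的约束越强。"""
    chosen_reward = beta * (policy_chosen_logp - ref_chosen_logp)        # 隐式奖励
    rejected_reward = beta * (policy_rejected_logp - ref_rejected_logp)
    return -F.logsigmoid(chosen_reward - rejected_reward).mean()

# 用随机数检查形状与数值是否合理(batch=4)
logps = [torch.randn(4) for _ in range(4)]
print(dpo_loss(*logps))
</code></pre></div></div>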

<h3 id="xtuner">Xtuner</h3>

<p>上海AI实验室推出的 <a href="https://github.com/InternLM/xtuner">XTuner</a> 是一个高效、灵活、全能的轻量化大模型微调工具库。与 LLaMA-Factory 类似,不过,在<strong>长序列训练</strong>、<strong>token生成速度</strong>等方面要比 LLaMA-Factory 更强。</p>

<p>简析</p>
<ul>
  <li>数据集: LLaMA-Factory 支持<strong>多种格式</strong>的数据集,更通用泛化;而 <code class="language-plaintext highlighter-rouge">XTuner</code> 只支持类似 <code class="language-plaintext highlighter-rouge">ShareGPT</code> 格式的数据集。</li>
  <li>模型支持: LLaMA-Factory 支持的模型种类比 XTuner 更多;但 XTuner 对多模态模型(LLaVA-Internlm2-7B / 20B、LLaVA-v1.5)的支持要比 LLaMA-Factory 更好。</li>
</ul>

<p>多轮对话训练时的 loss 计算(常见做法见本小节下方的示意代码):</p>
<ul>
  <li>从文档来看,XTuner更清晰,而且是我想要的效果;</li>
  <li>而对于 LLaMA-Factory,其放出来的只是数据集格式文档,loss计算没那么透明,只能啃源码。</li>
</ul>
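
<p>“只对回复部分计算 loss”通常通过把非回复 token 的 label 置为 -100(即 CrossEntropyLoss 的 ignore_index)来实现,下面是一个与具体框架无关的<strong>简化示意</strong>(省略了对话模板和特殊 token 的拼接):</p>

<div class="language-py highlighter-rouge"><div class="highlight"><pre class="highlight"><code># 简化示意:多轮对话中只让 assistant 回复参与 loss,其余位置置为 -100
IGNORE_INDEX = -100

def build_labels(turns, tokenizer):
    """turns: [{"role": "user"/"assistant", "content": str}, ...]"""
    input_ids, labels = [], []
    for turn in turns:
        ids = tokenizer.encode(turn["content"], add_special_tokens=False)
        input_ids += ids
        if turn["role"] == "assistant":
            labels += ids                         # 回复部分正常计算 loss
        else:
            labels += [IGNORE_INDEX] * len(ids)   # 用户输入部分被忽略
    return input_ids, labels
</code></pre></div></div>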

<p>多轮对话所对应的<strong>长序列</strong>训练性能。随着 Gemini 1M context length 和 Sora 问世,如何训练超长上下文的大模型引起了广泛关注。大多数场景下,一个 conversations 一般只包含几轮对话;但在实际业务中,一个 conversations 下有几百轮对话(即长对话)的场景也比较多。</p>

<p>常见的解决方案是把长对话拆分成多条样本,比较麻烦;在基座模型支持长上下文的情况下,如果微调框架能支持长序列训练且性能不错,是更好的选择。</p>

<p>XTuner 在这方面要比 LLaMA-Factory 更好。</p>

<p>XTuner 序列并行设计思路参考了 DeepSpeed 的工作 DeepSpeed Ulysses,并加以优化,以达到直接基于 transformers 算法库或 Huggingface Hub 上的开源模型训练 1M 以上超长序列的目标。</p>

<table>
  <thead>
    <tr>
      <th>模型</th>
      <th>序列并行支持情况</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td>baichuan 1/2</td>
      <td>❌</td>
    </tr>
    <tr>
      <td>chatglm 2/3</td>
      <td>❌</td>
    </tr>
    <tr>
      <td>deepseek</td>
      <td>✅</td>
    </tr>
    <tr>
      <td>gemma</td>
      <td>❌</td>
    </tr>
    <tr>
      <td>internlm 2</td>
      <td>✅</td>
    </tr>
    <tr>
      <td>llama 2</td>
      <td>✅</td>
    </tr>
    <tr>
      <td>mistral</td>
      <td>❌</td>
    </tr>
    <tr>
      <td>qwen 1/1.5</td>
      <td>❌</td>
    </tr>
    <tr>
      <td>starcoder</td>
      <td>❌</td>
    </tr>
    <tr>
      <td>yi</td>
      <td>✅</td>
    </tr>
    <tr>
      <td>zephyr</td>
      <td>✅</td>
    </tr>
  </tbody>
</table>

<h3 id="swift">SWIFT</h3>

<p>【2024-7-4】 阿里推出训练框架 <a href="https://github.com/modelscope/ms-swift/blob/main/README_CN.md">SWIFT</a> (Scalable lightWeight Infrastructure for Fine-Tuning)</p>

<p>SWIFT 支持 300+ LLM 和 50+ MLLM(多模态大模型)的训练(预训练、微调、对齐)、推理、评测和部署,开发者可以直接将该框架应用到自己的研究和生产环境中,实现从模型训练、评测到应用的完整链路。除 PEFT 提供的轻量训练方案外,SWIFT 还提供了完整的 Adapters 库以支持最新的训练技术,如 NEFTune、LoRA+、LLaMA-PRO 等,该适配器库可以脱离训练脚本,直接用于自定义流程中。</p>

<h2 id="新技术">新技术</h2>

<h3 id="distro">DisTrO</h3>

<p>【2024-8-29】<a href="https://mp.weixin.qq.com/s/wap7pZ3jUawNKG_3uMzojQ">DisTrO 让你家里的电脑也能训练超级大模型</a></p>

<p>Nous Research 最近放出了一份重磅报告,介绍最新研究成果——DisTrO(Distributed Training Over-the-Internet)。</p>
<ul>
  <li><a href="https://venturebeat.com/wp-content/uploads/2024/08/A_Preliminary_Report_on_DisTrO.pdf">A PRELIMINARY REPORT ON DISTRO</a></li>
  <li><a href="https://github.com/NousResearch/DisTrO/blob/main/A_Preliminary_Report_on_DisTrO.pdf">A_Preliminary_Report_on_DisTrO.pdf</a></li>
</ul>

<p>有望让我们告别“只有大公司才能训练大模型”的时代,开启全民AI狂欢!</p>

<p>DisTrO 是一个<strong>分布式优化器</strong>家族,两个超级牛X的特点:</p>
<ul>
  <li>与架构无关:不管你用啥架构,它都能用。</li>
  <li>与网络无关:网速慢?没关系,它照样能跑!</li>
</ul>

<p>最厉害的是,DisTrO把GPU之间的通信需求减少了1000到10000倍!</p>

<p>在龟速网络上,用各种杂牌子的网络硬件,也能训练大型神经网络,而且收敛速度跟AdamW+All-Reduce一样快!</p>

<p>DisTrO究竟有什么用呢?</p>
<ul>
  <li>提高LLM训练的抗风险能力:不再依赖单一实体的计算能力,训练过程更安全、更公平。</li>
  <li>促进研究合作与创新:研究人员和机构可以更自由地合作,尝试新技术、新算法、新模型。</li>
  <li>推动AI民主化:降低了训练大模型的门槛,让更多人有机会参与其中。</li>
</ul>

<h1 id="结束">结束</h1>

            </article>
        <hr>
        

                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
        
            
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
        
            
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
        
            
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
        
            
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
        
            
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
        
            
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
                
                    
                
                    
                
                    
                
                    
                
                    
                
                    
                
            
        
        

as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;Prompt Attack&amp;amp;nbsp;&amp;lt;div&amp;gt;提示词攻击&amp;lt;/div&amp;gt;\&quot; link=\&quot;prompt_attack\&quot; id=\&quot;450\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#60a917;strokeColor=#2D7600;shadow=1;fontColor=#ffffff;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;890.5\&quot; y=\&quot;1063.37\&quot; width=\&quot;90\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;多模态\&quot; link=\&quot;modal\&quot; id=\&quot;451\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#a0522d;strokeColor=#6D1F00;shadow=1;fontColor=#ffffff;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;464.25\&quot; y=\&quot;1370\&quot; width=\&quot;60\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;mxCell id=\&quot;452\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=1;exitY=0.25;exitDx=0;exitDy=0;entryX=0;entryY=0.5;entryDx=0;entryDy=0;fillColor=#60a917;strokeColor=#2D7600;\&quot; parent=\&quot;1\&quot; source=\&quot;448\&quot; target=\&quot;450\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;453\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;fillColor=#60a917;strokeColor=#2D7600;\&quot; parent=\&quot;1\&quot; source=\&quot;454\&quot; target=\&quot;448\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;UserObject label=\&quot;Prompt Learning&amp;amp;nbsp;&amp;lt;div&amp;gt;提示学习&amp;lt;/div&amp;gt;\&quot; link=\&quot;prompt\&quot; id=\&quot;454\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#60a917;strokeColor=#2D7600;shadow=1;fontColor=#ffffff;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;740.5\&quot; y=\&quot;1063.37\&quot; width=\&quot;117\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;Transformers 库\&quot; link=\&quot;huggingface\&quot; id=\&quot;455\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f5f5f5;strokeColor=#666666;shadow=1;fontColor=#333333;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;384\&quot; y=\&quot;1525\&quot; width=\&quot;90\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;Embedding\&quot; link=\&quot;emb\&quot; id=\&quot;456\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;523.99\&quot; y=\&quot;1925\&quot; width=\&quot;80\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        
&lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;分词\&quot; link=\&quot;token\&quot; id=\&quot;457\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;613.99\&quot; y=\&quot;1925\&quot; width=\&quot;45\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;Pretrain Language Model&amp;lt;div&amp;gt;预训练语言模型&amp;lt;/div&amp;gt;\&quot; link=\&quot;plm\&quot; id=\&quot;458\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;545.49\&quot; y=\&quot;1850\&quot; width=\&quot;145\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;ChatGPT\&quot; link=\&quot;chatgpt\&quot; id=\&quot;459\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;393.38\&quot; y=\&quot;1770\&quot; width=\&quot;84\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;mxCell id=\&quot;460\&quot; value=\&quot;NLP模型\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontSize=14;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot;&gt;\n          &lt;mxGeometry x=\&quot;579.998484809835\&quot; y=\&quot;1740.001125496954\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;1\&quot; as=\&quot;offset\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;461\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=5;strokeColor=#CCCCCC;entryX=0.5;entryY=1;entryDx=0;entryDy=0;exitX=0.5;exitY=0;exitDx=0;exitDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;385\&quot; target=\&quot;387\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;500\&quot; y=\&quot;1420\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;499\&quot; y=\&quot;1460\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;462\&quot; value=\&quot;\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;dashed=1;fillColor=#f8cecc;strokeColor=#b85450;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;310\&quot; y=\&quot;1425\&quot; width=\&quot;369.25\&quot; height=\&quot;70\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;UserObject label=\&quot;ChatGLM\&quot; link=\&quot;chatglm\&quot; id=\&quot;463\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#e3c800;strokeColor=#B09500;shadow=1;fontColor=#000000;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;459.25\&quot; y=\&quot;1450\&quot; width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; 
/&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;Baichuan\&quot; link=\&quot;baichuan\&quot; id=\&quot;464\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#e3c800;strokeColor=#B09500;shadow=1;fontColor=#000000;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;554.25\&quot; y=\&quot;1450\&quot; width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;ChatGPT\&quot; link=\&quot;chatgpt\&quot; id=\&quot;465\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#e3c800;strokeColor=#B09500;shadow=1;fontColor=#000000;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;369.25\&quot; y=\&quot;1450\&quot; width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;mxCell id=\&quot;466\&quot; value=\&quot;大语言模型\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontSize=13;fontStyle=1;fontColor=#333300;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot;&gt;\n          &lt;mxGeometry x=\&quot;494.25\&quot; y=\&quot;1440\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;467\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=2;strokeColor=#999999;exitX=0.5;exitY=0;exitDx=0;exitDy=0;entryX=0.5;entryY=1;entryDx=0;entryDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;462\&quot; target=\&quot;451\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;504\&quot; y=\&quot;1580\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;504\&quot; y=\&quot;1520\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;468\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=2;strokeColor=#999999;entryX=0.5;entryY=1;entryDx=0;entryDy=0;exitX=0.25;exitY=0;exitDx=0;exitDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;462\&quot; target=\&quot;442\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;490\&quot; y=\&quot;1430\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;504\&quot; y=\&quot;1410\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;469\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=2;strokeColor=#999999;exitX=0.5;exitY=0;exitDx=0;exitDy=0;entryX=0.039;entryY=1;entryDx=0;entryDy=0;entryPerimeter=0;\&quot; parent=\&quot;1\&quot; source=\&quot;431\&quot; target=\&quot;382\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;504\&quot; y=\&quot;1440\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;650\&quot; y=\&quot;1310\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        
&lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;470\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=1;strokeColor=#999999;dashed=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;485\&quot; target=\&quot;444\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;735.5\&quot; y=\&quot;1462.81\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;810.5\&quot; y=\&quot;1453.81\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;471\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=1;strokeColor=#999999;dashed=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;entryX=0;entryY=0.5;entryDx=0;entryDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;443\&quot; target=\&quot;485\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;651.5\&quot; y=\&quot;1338.81\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;1041.5\&quot; y=\&quot;1233.81\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;472\&quot; value=\&quot;垂类模型\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#a0522d;strokeColor=#6D1F00;shadow=1;fontColor=#ffffff;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;280\&quot; y=\&quot;1370\&quot; width=\&quot;56\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;473\&quot; value=\&quot;专题优化\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#a0522d;strokeColor=#6D1F00;shadow=1;fontColor=#ffffff;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;750.5\&quot; y=\&quot;1419\&quot; width=\&quot;60\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;474\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=2;strokeColor=#999999;exitX=1;exitY=0.5;exitDx=0;exitDy=0;entryX=0;entryY=0.5;entryDx=0;entryDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;384\&quot; target=\&quot;473\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;505\&quot; y=\&quot;1435\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;504\&quot; y=\&quot;1410\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;UserObject label=\&quot;幻觉\&quot; link=\&quot;hallucination\&quot; id=\&quot;475\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#d5e8d4;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;876.5\&quot; y=\&quot;1379\&quot; width=\&quot;50\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;mxCell id=\&quot;476\&quot; value=\&quot;\&quot; 
style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=2;strokeColor=#999999;exitX=1;exitY=0.5;exitDx=0;exitDy=0;entryX=0;entryY=0.5;entryDx=0;entryDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;473\&quot; target=\&quot;383\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;720.5\&quot; y=\&quot;1488\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;760.5\&quot; y=\&quot;1444\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;477\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;478\&quot; target=\&quot;418\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;753\&quot; y=\&quot;1289\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;UserObject label=\&quot;PE\&quot; link=\&quot;pe\&quot; id=\&quot;478\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#d5e8d4;strokeColor=#82b366;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;650\&quot; y=\&quot;1239\&quot; width=\&quot;50\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;mxCell id=\&quot;479\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=2;strokeColor=#999999;entryX=0.5;entryY=1;entryDx=0;entryDy=0;exitX=1;exitY=0.25;exitDx=0;exitDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;384\&quot; target=\&quot;443\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;654\&quot; y=\&quot;1380\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;710\&quot; y=\&quot;1340\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;480\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=2;strokeColor=#999999;entryX=0.5;entryY=1;entryDx=0;entryDy=0;exitX=0.75;exitY=0;exitDx=0;exitDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;462\&quot; target=\&quot;431\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;640\&quot; y=\&quot;1375\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;655\&quot; y=\&quot;1333\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;481\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;entryX=0;entryY=0.5;entryDx=0;entryDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;418\&quot; target=\&quot;419\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;793\&quot; y=\&quot;1264\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;813\&quot; y=\&quot;1264\&quot; 
as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;o5D4xRg-JXB86p6HjegH-540\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;\&quot; parent=\&quot;1\&quot; source=\&quot;482\&quot; target=\&quot;o5D4xRg-JXB86p6HjegH-538\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;482\&quot; value=\&quot;推理优化\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#a0522d;strokeColor=#6D1F00;shadow=1;fontColor=#ffffff;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;240\&quot; y=\&quot;1440\&quot; width=\&quot;56\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;483\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=2;strokeColor=#999999;exitX=0.5;exitY=0;exitDx=0;exitDy=0;entryX=0.25;entryY=1;entryDx=0;entryDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;478\&quot; target=\&quot;381\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;635\&quot; y=\&quot;1380\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;662\&quot; y=\&quot;1330\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;484\&quot; value=\&quot;Prompt优化\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;rotation=-30;\&quot; parent=\&quot;483\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot;&gt;\n          &lt;mxGeometry x=\&quot;-0.0199\&quot; y=\&quot;1\&quot; relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;-13\&quot; y=\&quot;-5\&quot; as=\&quot;offset\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;485\&quot; value=\&quot;Agent框架\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#ffcd28;strokeColor=none;shadow=1;gradientColor=#FFB570;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;1010.5\&quot; y=\&quot;1238.81\&quot; width=\&quot;70\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;486\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=1;strokeColor=#999999;dashed=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;entryX=0;entryY=0.5;entryDx=0;entryDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;485\&quot; target=\&quot;445\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;1096.5\&quot; y=\&quot;1263.81\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;1116.5\&quot; y=\&quot;1271.81\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;487\&quot; value=\&quot;模型训练\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontSize=13;fontStyle=1;fontColor=#6666FF;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot; 
connectable=\&quot;0\&quot;&gt;\n          &lt;mxGeometry x=\&quot;314\&quot; y=\&quot;1670\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;-3\&quot; y=\&quot;-20\&quot; as=\&quot;offset\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;UserObject label=\&quot;智能客服\&quot; link=\&quot;ics\&quot; id=\&quot;488\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;328.5\&quot; y=\&quot;1098.3699999999997\&quot; width=\&quot;60\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;mxCell id=\&quot;489\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;\&quot; parent=\&quot;1\&quot; source=\&quot;490\&quot; target=\&quot;513\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;UserObject label=\&quot;对话管理\&quot; link=\&quot;dialogue-manager\&quot; id=\&quot;490\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;231.75\&quot; y=\&quot;1149\&quot; width=\&quot;60\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;mxCell id=\&quot;491\&quot; value=\&quot;\&quot; style=\&quot;rounded=1;whiteSpace=wrap;html=1;dashed=1;fillColor=#fff2cc;strokeColor=#d6b656;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;895.5\&quot; y=\&quot;1510\&quot; width=\&quot;190\&quot; height=\&quot;230\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;UserObject label=\&quot;文本生成\&quot; link=\&quot;text-generation\&quot; id=\&quot;492\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#d5e8d4;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;919\&quot; y=\&quot;1620\&quot; width=\&quot;60\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;文本分类\&quot; link=\&quot;cls\&quot; id=\&quot;493\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#d5e8d4;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;994.5\&quot; y=\&quot;1580\&quot; width=\&quot;60\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;文本匹配\&quot; link=\&quot;text-match\&quot; id=\&quot;494\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#d5e8d4;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;919\&quot; y=\&quot;1580\&quot; width=\&quot;60\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;NER\&quot; link=\&quot;ner\&quot; id=\&quot;495\&quot;&gt;\n          
&lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#d5e8d4;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;919\&quot; y=\&quot;1660\&quot; width=\&quot;60\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;阅读理解\&quot; link=\&quot;mrc\&quot; id=\&quot;496\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#d5e8d4;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;998.5\&quot; y=\&quot;1660\&quot; width=\&quot;60\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;mxCell id=\&quot;497\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=2;strokeColor=#999999;exitX=0.995;exitY=0.874;exitDx=0;exitDy=0;entryX=0;entryY=0.5;entryDx=0;entryDy=0;exitPerimeter=0;\&quot; parent=\&quot;1\&quot; source=\&quot;384\&quot; target=\&quot;401\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;721\&quot; y=\&quot;1443\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;761\&quot; y=\&quot;1444\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;UserObject label=\&quot;GPT2\&quot; link=\&quot;gpt2\&quot; id=\&quot;498\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;312\&quot; y=\&quot;1849.5\&quot; width=\&quot;50\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;mxCell id=\&quot;499\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=2;strokeColor=#999999;exitX=0;exitY=0.5;exitDx=0;exitDy=0;entryX=1;entryY=0.5;entryDx=0;entryDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;433\&quot; target=\&quot;498\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;438.99\&quot; y=\&quot;1935\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;438.99\&quot; y=\&quot;1890\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;UserObject label=\&quot;模型蒸馏\&quot; link=\&quot;distill\&quot; id=\&quot;500\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;546.75\&quot; y=\&quot;1810\&quot; width=\&quot;53.25\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;NLP基础知识\&quot; link=\&quot;nlp\&quot; id=\&quot;501\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#d5e8d4;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;946.5\&quot; y=\&quot;1534\&quot; width=\&quot;90\&quot; 
height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;知识图谱\&quot; link=\&quot;kg\&quot; id=\&quot;502\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#d5e8d4;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;919\&quot; y=\&quot;1700\&quot; width=\&quot;60\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;mxCell id=\&quot;503\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;whiteSpace=wrap;html=1;labelBackgroundColor=none;fontSize=10;fillColor=#f5f5f5;dashed=1;strokeColor=#666666;fontColor=#333333;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n          &lt;mxGeometry x=\&quot;720\&quot; y=\&quot;1750\&quot; width=\&quot;300\&quot; height=\&quot;320\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;504\&quot; value=\&quot;深度学习\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontSize=14;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot;&gt;\n          &lt;mxGeometry x=\&quot;846.378484809835\&quot; y=\&quot;1740.001125496954\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;1\&quot; as=\&quot;offset\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;UserObject label=\&quot;机器学习\&quot; link=\&quot;ml\&quot; id=\&quot;505\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;735.25\&quot; y=\&quot;1770\&quot; width=\&quot;54.75\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;深度学习\&quot; link=\&quot;dl_note\&quot; id=\&quot;506\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;736\&quot; y=\&quot;1807\&quot; width=\&quot;54.5\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;神经网络\&quot; link=\&quot;ann\&quot; id=\&quot;507\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;730\&quot; y=\&quot;1917\&quot; width=\&quot;64.5\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;神经网络调参\&quot; link=\&quot;ann_tune\&quot; id=\&quot;508\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;803\&quot; y=\&quot;1917\&quot; width=\&quot;81\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;AutoML\&quot; link=\&quot;automl\&quot; 
id=\&quot;509\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;892\&quot; y=\&quot;1917\&quot; width=\&quot;62.5\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;强化学习\&quot; link=\&quot;rl\&quot; id=\&quot;510\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;730\&quot; y=\&quot;1952\&quot; width=\&quot;64.5\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;因果科学\&quot; link=\&quot;casual\&quot; id=\&quot;511\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;805.5\&quot; y=\&quot;1952\&quot; width=\&quot;64.5\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;多任务学习\&quot; link=\&quot;multi-task\&quot; id=\&quot;512\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;877.5\&quot; y=\&quot;1879.5\&quot; width=\&quot;77\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;用户模拟器\&quot; link=\&quot;simulator\&quot; id=\&quot;513\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;139\&quot; y=\&quot;1149\&quot; width=\&quot;71\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;图神经网络\&quot; link=\&quot;gnn\&quot; id=\&quot;514\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;879.5\&quot; y=\&quot;1952\&quot; width=\&quot;64.5\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;AGI\&quot; link=\&quot;agi\&quot; id=\&quot;515\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;1042\&quot; y=\&quot;1810\&quot; width=\&quot;48\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;脑机接口\&quot; link=\&quot;bci\&quot; id=\&quot;516\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry 
x=\&quot;1152\&quot; y=\&quot;1810\&quot; width=\&quot;53\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;AIGC行业报告\&quot; link=\&quot;aigc\&quot; id=\&quot;517\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;1116.5\&quot; y=\&quot;1770\&quot; width=\&quot;83.5\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;mxCell id=\&quot;518\&quot; value=\&quot;行业知识\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontSize=14;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot;&gt;\n          &lt;mxGeometry x=\&quot;1156.8784848098348\&quot; y=\&quot;1740.001125496954\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;1\&quot; as=\&quot;offset\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;UserObject label=\&quot;AI行业报告\&quot; link=\&quot;ai_report\&quot; id=\&quot;519\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;1042\&quot; y=\&quot;1770\&quot; width=\&quot;64.5\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;具身智能\&quot; link=\&quot;embodied\&quot; id=\&quot;520\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;1209.88\&quot; y=\&quot;1810\&quot; width=\&quot;53\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;ML笔记\&quot; link=\&quot;ml_note\&quot; id=\&quot;522\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;792.75\&quot; y=\&quot;1770\&quot; width=\&quot;47.25\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;mxCell id=\&quot;o5D4xRg-JXB86p6HjegH-523\&quot; value=\&quot;应用层\&quot; style=\&quot;edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];labelBackgroundColor=none;fontSize=13;fontStyle=1;fontColor=#6666FF;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot; connectable=\&quot;0\&quot;&gt;\n          &lt;mxGeometry x=\&quot;613.99\&quot; y=\&quot;1111.68\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;-7\&quot; y=\&quot;-2\&quot; as=\&quot;offset\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;mxCell id=\&quot;o5D4xRg-JXB86p6HjegH-527\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;\&quot; parent=\&quot;1\&quot; source=\&quot;o5D4xRg-JXB86p6HjegH-524\&quot; target=\&quot;o5D4xRg-JXB86p6HjegH-526\&quot; edge=\&quot;1\&quot;&gt;\n          
&lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot; /&gt;\n        &lt;/mxCell&gt;\n        &lt;UserObject label=\&quot;人工智障\&quot; link=\&quot;dialogue\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-524\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f8cecc;strokeColor=#b85450;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;233.75\&quot; y=\&quot;1098.37\&quot; width=\&quot;58\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;mxCell id=\&quot;o5D4xRg-JXB86p6HjegH-525\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;entryX=1;entryY=1;entryDx=0;entryDy=0;exitX=0;exitY=0;exitDx=0;exitDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;404\&quot; target=\&quot;o5D4xRg-JXB86p6HjegH-524\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;369\&quot; y=\&quot;1159\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;369\&quot; y=\&quot;1138\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;UserObject label=\&quot;大模型时代对话系统\&quot; link=\&quot;llm_ds\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-526\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#008a00;strokeColor=#005700;shadow=1;fontColor=#ffffff;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;139\&quot; y=\&quot;1098.37\&quot; width=\&quot;71\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;LLM 开发模式\&quot; link=\&quot;llm_dev\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-528\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#ffe6cc;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;659.25\&quot; y=\&quot;1280\&quot; width=\&quot;90\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;对比学习\&quot; link=\&quot;contrastive\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-529\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;803\&quot; y=\&quot;1879.5\&quot; width=\&quot;64.5\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;计算机视觉\&quot; link=\&quot;cv\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-530\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;724.87\&quot; y=\&quot;1992\&quot; width=\&quot;64.5\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;视频理解\&quot; link=\&quot;video\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-531\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; 
parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;595\&quot; y=\&quot;1170\&quot; width=\&quot;60\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;mxCell id=\&quot;o5D4xRg-JXB86p6HjegH-532\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=1;strokeColor=#999999;entryX=0.5;entryY=1;entryDx=0;entryDy=0;dashed=1;exitX=0.75;exitY=0;exitDx=0;exitDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;394\&quot; target=\&quot;o5D4xRg-JXB86p6HjegH-531\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;528\&quot; y=\&quot;1232\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;537\&quot; y=\&quot;1189\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;UserObject label=\&quot;推荐系统\&quot; link=\&quot;rp\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-533\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;725.5\&quot; y=\&quot;2030\&quot; width=\&quot;54.5\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;文档问答\&quot; link=\&quot;doc_chat\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-534\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;395\&quot; y=\&quot;1099.9999999999998\&quot; width=\&quot;60\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;mxCell id=\&quot;o5D4xRg-JXB86p6HjegH-535\&quot; value=\&quot;\&quot; style=\&quot;edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;entryX=0.5;entryY=1;entryDx=0;entryDy=0;\&quot; parent=\&quot;1\&quot; target=\&quot;o5D4xRg-JXB86p6HjegH-534\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;390\&quot; y=\&quot;1170\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;369\&quot; y=\&quot;1138\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;UserObject label=\&quot;开放域问答\&quot; link=\&quot;dialogue_qa\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-536\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;395\&quot; y=\&quot;1063.37\&quot; width=\&quot;65\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;LLM问题\&quot; link=\&quot;llm_think\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-537\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#f8cecc;strokeColor=#b85450;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;752.5\&quot; y=\&quot;1385\&quot; width=\&quot;58\&quot; 
height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;推理优化\&quot; link=\&quot;llm_opt\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-538\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#d5e8d4;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;169.5\&quot; y=\&quot;1440\&quot; width=\&quot;50\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;服务部署实验\&quot; link=\&quot;exp\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-541\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#d5e8d4;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;169.5\&quot; y=\&quot;1480\&quot; width=\&quot;80.5\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;自动标注\&quot; link=\&quot;label\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-542\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#d5e8d4;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;200\&quot; y=\&quot;1525\&quot; width=\&quot;50\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;ChatGPT应用\&quot; link=\&quot;chatgpt_application\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-545\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;217.75\&quot; y=\&quot;1208.81\&quot; width=\&quot;78.25\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;评估方法\&quot; link=\&quot;eva\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-546\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#d5e8d4;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;711.25\&quot; y=\&quot;1540\&quot; width=\&quot;50\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;目标检测\&quot; link=\&quot;loss\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-547\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;845.5\&quot; y=\&quot;1992\&quot; width=\&quot;51.25\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;大模型评测\&quot; link=\&quot;llm_eva\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-548\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#d5e8d4;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;711.25\&quot; y=\&quot;1575\&quot; width=\&quot;62\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; 
/&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;ChatGPT复现\&quot; link=\&quot;chatgpt_mimic\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-549\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;365.12\&quot; y=\&quot;1490\&quot; width=\&quot;78.25\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;AI生成\&quot; link=\&quot;llm_data\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-550\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#d5e8d4;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;200\&quot; y=\&quot;1570\&quot; width=\&quot;50\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;LLM原理\&quot; link=\&quot;llm_arch\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-552\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#dae8fc;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;357.13\&quot; y=\&quot;1590\&quot; width=\&quot;62.87\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;音乐生成\&quot; link=\&quot;music_gen\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-553\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#b0e3e6;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;256\&quot; y=\&quot;1260.68\&quot; width=\&quot;60\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;mxCell id=\&quot;o5D4xRg-JXB86p6HjegH-554\&quot; value=\&quot;\&quot; style=\&quot;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;strokeWidth=1;strokeColor=#999999;entryX=1;entryY=0.5;entryDx=0;entryDy=0;dashed=1;exitX=0;exitY=0.5;exitDx=0;exitDy=0;\&quot; parent=\&quot;1\&quot; source=\&quot;393\&quot; target=\&quot;o5D4xRg-JXB86p6HjegH-553\&quot; edge=\&quot;1\&quot;&gt;\n          &lt;mxGeometry relative=\&quot;1\&quot; as=\&quot;geometry\&quot;&gt;\n            &lt;mxPoint x=\&quot;528\&quot; y=\&quot;1232\&quot; as=\&quot;sourcePoint\&quot; /&gt;\n            &lt;mxPoint x=\&quot;537\&quot; y=\&quot;1189\&quot; as=\&quot;targetPoint\&quot; /&gt;\n          &lt;/mxGeometry&gt;\n        &lt;/mxCell&gt;\n        &lt;UserObject label=\&quot;推理加速\&quot; link=\&quot;infer\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-555\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#d5e8d4;strokeColor=none;shadow=1;\&quot; parent=\&quot;1\&quot; vertex=\&quot;1\&quot;&gt;\n            &lt;mxGeometry x=\&quot;169.5\&quot; y=\&quot;1400\&quot; width=\&quot;50\&quot; height=\&quot;30\&quot; as=\&quot;geometry\&quot; /&gt;\n          &lt;/mxCell&gt;\n        &lt;/UserObject&gt;\n        &lt;UserObject label=\&quot;端侧LLM\&quot; link=\&quot;llm_end\&quot; id=\&quot;o5D4xRg-JXB86p6HjegH-556\&quot;&gt;\n          &lt;mxCell style=\&quot;rounded=1;whiteSpace=wrap;html=1;fillColor=#d5e8d4;strokeColor=none;shadow=1;\&quot; 
Qwen 对话模板中,用户输入扮演 user 的 role,模型生成则承担 assistant 的 role。

Qwen 还支持元消息(meta message),用于指导模型执行特定操作或生成具有特定特性的文本,例如改变语气、风格或内容。元消息承担 system 的 role,默认内容为 "You are Qwen, created by Alibaba Cloud. You are a helpful assistant."
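下面用一个最小示意展示 system / user / assistant 三种 role 如何组织并拼接成 ChatML 输入(借助 transformers 的 apply_chat_template;模型名与消息内容均为假设,仅说明流程):

from transformers import AutoTokenizer

# 模型名仅为示意,实际按本地可用的 Qwen 权重替换
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-7B-Instruct")

messages = [
    {"role": "system", "content": "You are Qwen, created by Alibaba Cloud. You are a helpful assistant."},
    {"role": "user", "content": "将下列内容翻译成英语:今天天气不错"},
]
# 拼接成带 <|im_start|>/<|im_end|> 的 ChatML 文本,并追加 assistant 起始标记
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(text)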

数据量

预训练数据共 3TB,涉及: 公共网络文档、百科全书、书籍、代码等,数据涉及多语言,但以中文和英文为主。

为了保证数据质量,制定了一套全面的预处理程序。

  • Web数据需要从HTML中提取文本内容,并采用语言识别工具确定语种;
    • 如: 用 Python 的 cld 系列工具包检测语种、chardet 检测编码、kenlm 计算流畅度(困惑度)
  • 通过重复数据删除技术增加数据的多样性,包括规范化后精确匹配去重,以及使用 MinHashLSH 算法的模糊去重(MinHash 去重的示意代码见本列表之后);
  • 结合规则和机器学习的方法过滤低质量数据,即通过多个模型对内容进行评分,包括语言模型、文本质量评分模型以及用于识别潜在冒犯性模型;
  • 从各种来源数据中手动采样并进行审查,以确保其质量;
  • 有选择地对来自某些来源的数据进行采样,以确保模型在各种高质量内容上进行训练。
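针对上面提到的 MinHashLSH 模糊去重,这里给出一个最小示意(基于 datasketch 库;阈值、切分方式均为假设,仅说明思路):

from datasketch import MinHash, MinHashLSH

def minhash_of(text, num_perm=128):
    m = MinHash(num_perm=num_perm)
    for token in text.split():              # 实际可替换为字符 n-gram 切分
        m.update(token.encode("utf-8"))
    return m

docs = {
    "d1": "今天 天气 不错 适合 出门",
    "d2": "今天 天气 不错 适合 出门 散步",   # 与 d1 近似重复
    "d3": "完全 不同 的 一 段 文本",
}
lsh = MinHashLSH(threshold=0.7, num_perm=128)   # Jaccard 相似度阈值为示意值
kept = []
for doc_id, text in docs.items():
    m = minhash_of(text)
    if lsh.query(m):                            # 与已保留文档近似重复则丢弃
        continue
    lsh.insert(doc_id, m)
    kept.append(doc_id)
print(kept)                                     # 预期保留 d1、d3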

长度

Qwen2.5 训练中打包序列长度为 32768 个 token。

  • 预训练中最大文档长度即为此长度。
  • 而后训练中,user和assistant的最大消息长度则有所不同。一般情况下,assistant消息长度可达 8192 个 token。

要点:

  • Qwen2 模型可以处理 32K 或 128K token 长的文本,其中 8K 长度可作为输出。

模型结构

模型采用Transformer框架,主要做了以下修改:

  • Embedding and output projection:对于embedding层和lm_head层不进行权重共享,是两个单独的权重。
  • Positional embedding:采用RoPE为位置编码,并选择使用FP32精确度的逆频率矩阵。
  • Bias:在QKV注意力层中添加了偏差,以增强模型的外推能力。
  • Pre-Norm & RMSNorm:采用预归一化提高训练稳定性,并将传统归一化方法替换为RMSNorm
  • Activation function:采用SwiGLU激活函数,不同于传统FFN的2个矩阵,SwiGLU有三个矩阵,因此缩小了隐藏层维度,由原来的4倍变成8/3倍(见下方示意代码)。
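下面是一个 SwiGLU FFN 的最小 PyTorch 示意(维度取值仅为演示,并非 Qwen 的真实超参):

import torch
from torch import nn
import torch.nn.functional as F

class SwiGLUFFN(nn.Module):
    """SwiGLU 前馈层:三个线性矩阵,隐藏维约为 8/3 * d_model,以对齐两矩阵 4*d_model FFN 的参数量。"""
    def __init__(self, d_model, hidden_dim):
        super().__init__()
        self.gate_proj = nn.Linear(d_model, hidden_dim, bias=False)
        self.up_proj = nn.Linear(d_model, hidden_dim, bias=False)
        self.down_proj = nn.Linear(hidden_dim, d_model, bias=False)

    def forward(self, x):
        return self.down_proj(F.silu(self.gate_proj(x)) * self.up_proj(x))

ffn = SwiGLUFFN(d_model=1024, hidden_dim=int(1024 * 8 / 3))
print(ffn(torch.randn(2, 16, 1024)).shape)   # torch.Size([2, 16, 1024])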

外推能力扩展

Transformer 模型的注意力机制在上下文长度上有很大限制,模型会随着上下文长度的增加,计算成本和内存会成倍增加。

Qwen 模型利用一些无需训练的简单计算方法,在推理过程中扩展上下文长度:

  • 动态NTK感知插值,即对序列长度的增加动态缩放位置信息。
  • LogN-Scaling,根据上下文长度与训练长度的比率,对注意力点积(Q 与 K 的内积)进行重新缩放,确保注意力值的熵随着上下文长度的增长而保持稳定(缩放因子的计算见下方示意代码)。
  • Window attention,将注意力限制在一个上下文窗口内,防止模型关注到太远的内容。并在不同层采用不同的窗口大小,较低的层使用较短的窗口,而较高的层使用较长的窗口。
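LogN-Scaling 的缩放因子可以写成 log_n(m) 的形式(m 为当前 token 位置、n 为训练长度),下面是一个示意实现(训练长度取 2048 仅为举例):

import math
import torch

def logn_scale(positions, train_len=2048):
    """推理长度超过训练长度时,对 query 乘以 log(位置)/log(训练长度) 的因子;不超过时保持 1。"""
    scale = torch.log(positions.float().clamp(min=2)) / math.log(train_len)
    return torch.clamp(scale, min=1.0)

pos = torch.tensor([1024, 2048, 8192, 32768])
print(logn_scale(pos))   # 位置超过 2048 后缩放因子大于 1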

训练

qwen 系列大模型本地部署、法律大模型训练:

  • 只需5G内存部署本地大模型,
  • 只需6G显存训练自己的法律大模型。
  • lora模型训练完成后,会合并到主模型,生成自己专属的大模型。

视频演示

预训练

遵循自回归语言建模的标准方法,通过前面Token的内容预测下一个Token;

  • 模型预训练时最大长度为2048,为了构建批次数据,对文本内容进行随机打乱及合并,再将其截断到指定长度。
  • 注意力模块采用Flash Attention技术,提高训练速度;
  • 优化器采用AdamW,超参数 β1、β2 和 ϵ 分别为 0.9、0.95 和 1e-8;
  • 采用余弦学习率调度,学习率最终衰减到峰值的10%(优化器与调度器的配置写法见本列表后的示意代码);
  • 采用BFloat16进行混合精度训练。
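优化器与学习率调度的一个最小配置示意(模型、总步数、峰值学习率均为占位值,并非 Qwen 真实配置):

import torch
from torch import nn
from torch.optim import AdamW
from torch.optim.lr_scheduler import CosineAnnealingLR

model = nn.Linear(16, 16)        # 占位模型,仅演示优化器/调度器的写法
total_steps = 10_000             # 示意的总训练步数
peak_lr = 3e-4                   # 示意的峰值学习率

optimizer = AdamW(model.parameters(), lr=peak_lr, betas=(0.9, 0.95), eps=1e-8)
# 余弦调度,最终衰减到峰值学习率的 10%
scheduler = CosineAnnealingLR(optimizer, T_max=total_steps, eta_min=peak_lr * 0.1)

for step in range(3):            # 实际训练中,每步应先完成前向、反向传播再更新
    optimizer.step()
    scheduler.step()
print(scheduler.get_last_lr())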

QWEN 模型在同等级参数量下表现优异,即使是更大的模型如 LLaMA2-70B,在 3 个任务中也被 QWEN-14B 超越。

有监督微调SFT

为了提高有监督微调数据集的质量,对多种风格的对话进行了标注,覆盖不同任务的自然语言生成,进一步提高模型的有用性。此外,数据的组织格式与训练方法也会影响模型表现,Qwen 采用 ChatML 样式的格式来进行模型训练。

ChatML 格式让模型有效区分各类信息,包括:系统指令、用户输入、模型输出等,可以增强模型对复杂会话的处理分析能力。

ChatML Format 对话模版

<|im_start|>system
You are a helpful assistant.<|im_end|>
<|im_start|>user
hello, who are you?<|im_end|>
<|im_start|>assistant
I am an AI program developed by Firefly<|im_end|>

训练

  • 优化器采用AdamW,超参数 β1、β2 和 ϵ 分别为 0.9、0.95 和 1e-8;
  • 模型最大输入长度2048
  • 训练批次大小为128
  • 模型共训练4000步,在前1430步中,学习率逐渐增加,达到 2e-6 的峰值。
  • 为了防止过拟合,权重衰减的值设置为0.1,dropout设置为0.1,梯度裁剪的限制为1.0。

RM

RM模型

奖励模型构建上,先采用大量数据进行偏好模型预训练(preference model pretraining,PMP),再用高质量偏好数据对奖励模型精调。

高质量偏好数据通过包含 6600 个详细标签的分类系统进行平衡采样获取,以保证数据的多样性和复杂性。

奖励模型是由同等大小的 Qwen 模型加池化层得来,用特殊的句子结束标记位置的映射值作为模型奖励值。
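"LM 主干 + 池化层 + 标量头"的奖励模型结构,可以用下面的最小示意表达(假设主干是返回 last_hidden_state 的 HuggingFace 风格模型,并非 Qwen 官方实现):

import torch
from torch import nn
from types import SimpleNamespace

class RewardModel(nn.Module):
    """在因果 LM 主干上加一个线性头,取每条样本最后一个有效 token 的隐状态映射为标量奖励。"""
    def __init__(self, backbone, hidden_size):
        super().__init__()
        self.backbone = backbone
        self.value_head = nn.Linear(hidden_size, 1)

    def forward(self, input_ids, attention_mask):
        hidden = self.backbone(input_ids, attention_mask=attention_mask).last_hidden_state
        last_idx = attention_mask.sum(dim=1) - 1                  # 最后一个有效 token 的位置
        pooled = hidden[torch.arange(hidden.size(0)), last_idx]   # 池化:取句末隐状态
        return self.value_head(pooled).squeeze(-1)                # 每条样本一个奖励分数

class DummyBackbone(nn.Module):
    """占位主干,仅保证示例可运行,实际应替换为 Qwen 等因果 LM。"""
    def __init__(self, vocab=100, hidden=32):
        super().__init__()
        self.emb = nn.Embedding(vocab, hidden)
    def forward(self, input_ids, attention_mask=None):
        return SimpleNamespace(last_hidden_state=self.emb(input_ids))

rm = RewardModel(DummyBackbone(), hidden_size=32)
ids = torch.randint(0, 100, (2, 8))
mask = torch.ones(2, 8, dtype=torch.long)
print(rm(ids, mask).shape)   # torch.Size([2])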

模型在训练过程中,学习率恒为 3e-6,批次大小为 64,最大长度为 2048,训练一个 epoch。

效果

PPO

PPO阶段共包含四个模型:policy模型、value模型、reference模型、reward模型。

训练过程中,先对 value 模型进行 50 步预热训练,保证 value 模型能够有效地适应不同的奖励模型。在 PPO 过程中,对每个 query 会同时采样两个 response,KL 散度系数设为 0.04,并根据平均值对奖励进行归一化处理。

policy 模型和 value 模型的学习率分别为 1e-6 和 5e-6。为了增强训练的稳定性,裁剪值(clip value)设为 0.15。在进行推理时,生成策略的 top-p 值设置为 0.9。
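PPO 中"KL 惩罚 + 奖励按均值归一化"的处理,可以用如下最小示意表达(张量形状与系数仅为演示,非官方实现):

import torch

def shape_rewards(reward_scores,      # [batch],奖励模型打分
                  logp_policy,        # [batch, seq],policy 对 response 的对数概率
                  logp_ref,           # [batch, seq],reference 模型的对数概率
                  kl_coef=0.04):
    kl = (logp_policy - logp_ref).sum(dim=-1)   # 逐序列累计 KL(近似)
    shaped = reward_scores - kl_coef * kl       # 用 KL 惩罚约束 policy 偏离 reference
    return shaped - shaped.mean()               # 按批次均值做奖励归一化

r = shape_rewards(torch.tensor([1.0, 0.2]),
                  torch.randn(2, 8), torch.randn(2, 8))
print(r)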

对齐效果

Qwen的效果优于相同规模的其他开源模型,如LLaMA2、ChatGLM2、InternLM、Baichuan2

人工评测,比较了 Qwen-7B-Chat(SFT)、Qwen-14B-Chat(SFT)、Qwen-14B-Chat(RLHF)、GPT4在对话上与GPT3.5的差异。

RLHF模型明显优于SFT模型,说明RLHF可以生成更受人类喜爱的回答。

工具使用

Qwen模型具有工具使用能力:

  • 通过ReAct提示进行使用未见的工具;
  • 用Python解释器增强数学推理、数据分析等能力;
  • 作为代理,与人类交互过程中,可以访问HuggingFace中大量多模态模型集合。

PS:高质量数据约 2000 条 ReAct 格式数据。

如何用 ReAct Prompting 技术命令千问使用工具
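ReAct Prompting 的核心是让模型按"Thought → Action → Action Input → Observation"的循环调用工具,下面是一个简化的提示模板示意(工具名与格式均为假设,并非千问官方模板):

REACT_PROMPT = """Answer the following questions as best you can. You have access to the following tools:

{tool_descs}

Use the following format:

Question: the input question you must answer
Thought: you should always think about what to do
Action: the action to take, should be one of [{tool_names}]
Action Input: the input to the action
Observation: the result of the action
... (this Thought/Action/Action Input/Observation can repeat zero or more times)
Thought: I now know the final answer
Final Answer: the final answer to the original question

Question: {query}"""

# 假设只挂一个计算器工具,展示最终拼好的提示
print(REACT_PROMPT.format(
    tool_descs="calculator: 计算数学表达式,输入为合法的算式字符串",
    tool_names="calculator",
    query="123 * 456 等于多少?",
))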

【2024-3-4】GPT-4

【2024-3-4】GPT-4 Technical Report

【2024-3-9】Yi

参考

Yi 介绍

01.AI(零一万物)是李开复带队孵化的 AI 公司。

  • 2023年11月初,01.AI 发布并开源了 Yi-6B、Yi-34B base 模型;同一周内,又开源了 Yi-6B-200K、Yi-34B-200K base 模型。Yi 号称是从头开始预训练的双语模型。
  • 接下来的几个月,01.AI 陆续推出了 chat 模型、多模态能力、Yi-9B、长上下文的记忆和检索能力等优化。

在 SuperCLUE/CMMLU 等榜单数据的实测上,Yi 的效果确实不错,能排在同时期中文(开源)大模型里的第一梯队。

2024年3月,Yi终于发布了技术报告,在此来梳理一下报告中的重点内容和值得关注的细节信息。

Yi目前有6B、9B、34B三个规模,其中34B是主力模型。

  • 选择34B,而不是更大规模的原因,是这个规模能在24G显存的消费级显卡(如RTX4090)上运行。
  • 使用int4量化之后的34B模型可运行在24G显存的GPU上。

参考《Understanding INT4 Quantization for Language Models: Latency Speedup, Composability, and Failure Cases》的量化方法

  • Yi-34B int8量化模型相比bf16模型,几乎可以做到效果无损(差距<1%),而int4量化模型在大部分任务的损失也完全可以接受。

官方资料

总结:

  • Yi-34B模型int4量化之后,相比float16损失<1%,可跑在RTX4090上(24G显存)
  • 模型结构不需要太多变化,LLAMA2 标准结构已经足够训出很好的效果
  • 3.1T 预训练数据远比scaling law建议的1T大,但是效果更好,并且模型还没饱和,继续增大数据量还能提升
  • 微调数据质量很重要,由算法人员直接标注,只要不到10k数据量就足够了
  • 4k长度的基础预训练模型已经具备长文本能力,只需用长文本数据继续预训练,更新百步就有很好效果

总之,数据要精心设计,数据质量要高,数据量要大

Yi实践结果证明: 较小模型+更大规模高质量数据,可获得进一步效果提升

  • 获得高性价比的推理模型:以 34B 的推理成本加上更大的训练投入,就能得到接近普通 70B 规模模型的推理效果。

数据构造

数据是LLM最核心的部分,没有之一。Yi最核心的工作就是提升数据数量和质量。

预训练

主要步骤

Yi模型在预训练阶段的数据处理流程,主要是对爬取的网络文本进行数据过滤和去重

  • 原始网络数据 → 语种过滤 → 启发式过滤 → 学习式过滤 → 聚类过滤 → 去重 → 最终预训练数据

语料获取 & 语言分类

  • 从网络爬虫开始,爬取中英文这两种语言的网站,对网站内容进行解析。
  • 参考CCNeT(《CCNet: Extracting High Quality Monolingual Datasets from Web Crawl Data》)的做法,进行语言识别。

过滤方法

  • 启发式过滤:用规则去除质量较低的文本内容(一个简化的规则过滤示意见本列表之后的代码)。过滤规则包含:
    • (1)根据特殊URL、域名、黑名单词表以及乱码文本进行过滤;
    • (2)根据文本长度、特殊字符比例,以及短行、重复或不完整行的占比进行过滤;
    • (3)根据重复词语、N-Gram片段、段落的占比;
    • (4)识别并匿名化个人可识别信息,例如:邮箱、电话等。
  • 学习式过滤:Learned Filters, 规则不好处理的,训练模型来清洗
    • 通过困惑度、质量、安全和文档连贯性 4 种评分器(scorer)对文本进行过滤:
      • Perplexity Scorer:参照《CCNet: Extracting High Quality Monolingual Datasets from Web Crawl Data》,用 kenlm 库评估大量网络文本的困惑度,把明显高于平均水平的内容丢弃;
      • Quality Scorer:用维基百科数据训练的分类模型,当文本更接近维基百科这类高质量页面时认为质量较高,丢弃低质量内容;
      • Document Coherence Scorer:识别句子、段落零散不连贯的文本,要么切分,要么直接丢弃;
      • Safety Scorer:识别并删除包含暴力、色情、涉政等有毒内容的网络文档。
  • 聚类过滤:Cluster-based Filters
    • 采用无监督语义聚类对文本进行分组,然后对聚类数据标注质量标签, 丢弃质量差的类别,为后续数据混合策略提供参考。
  • 去重方法:
    • 文本过滤之后进行去重流程,涉及文档级别的MinHash去重和子文档精确匹配去重,有效识别和消除文档内部及跨文档的重复内容(下方给出一个基于MinHash的去重示意);
    • 同时利用主题模型对数据赋予特定主题,在最后的数据采样过程中对信息密度较低的主题内容(主要是广告文本)进行下采样。
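
下面是文档级 MinHash 近似去重的一个最小示意(假设使用 datasketch 库;n-gram 大小、相似度阈值等均为示例取值):

```python
from datasketch import MinHash, MinHashLSH

def doc_minhash(text: str, num_perm: int = 128) -> MinHash:
    """对文档的 5-gram 词片段计算 MinHash 签名。"""
    m = MinHash(num_perm=num_perm)
    tokens = text.split()
    for i in range(max(1, len(tokens) - 4)):
        m.update(" ".join(tokens[i:i + 5]).encode("utf-8"))
    return m

def dedup(docs, threshold: float = 0.8):
    """与已保留文档的 Jaccard 相似度超过阈值的文档视为近重复,直接丢弃。"""
    lsh = MinHashLSH(threshold=threshold, num_perm=128)
    kept = []
    for idx, text in enumerate(docs):
        sig = doc_minhash(text)
        if not lsh.query(sig):        # 没有查到近重复才保留
            lsh.insert(str(idx), sig)
            kept.append(text)
    return kept
```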

最终预训练数据组成如下图所示,总计 3.1T Token。

  • 语种构成: 英语(60%) > 中文(20%) > 代码(10%)
  • 语料类型: 网页内容(80%) > 代码(8%) > 论文(5%) > 书籍(3%)

微调

对于微调数据

  • Quality is All You Need
  • 数据质量胜过数量

SFT数据质量能极大影响模型效果,随着数据量的增加,高质量数据能带来更多提升

微调阶段数据构造

  • 微调阶段采用 不到10k的 SFT数据

一共只有不到10k条SFT数据,每条数据都经过人工多次打磨,这比数量大但质量一般的数据效果更好。

  • 这一思路与以下工作一致
    • 《Gemini: A family of highly capable multimodal models》
    • 《Llama 2: Open Foundation and Fine-Tuned Chat Models》
    • 《Lima: Less is more for alignment》
  • 与以下工作的思路不同
    • FLAN(《Scaling instruction-finetuned language models》)
    • UltraChat(《Enhancing chat language models by scaling high-quality instructional conversations》)

具体做法:

  • 对于 prompt distribution selection:参考《Wizardlm: Empowering large language models to follow complex instructions》,开发复合指令,并通过指令进化,逐步增加指令的复杂度。这种做法显著减少了SFT数据量。
  • 对于 CoT data formatting:参考《Take a step back: Evoking reasoning via abstraction in large language models》,采用了“Step-Back”的模式。即通过抽象化处理,让模型学习在深入探讨原始、具体的问题之前,制定更高层次的解决方案。
  • 对于 response formatting:使用从《Lima: Less is more for alignment》扩展的默认样式。
    • response的结构为introduction-body-conclusion的格式,“where the body is usually a list of bullet point”。
  • 缓解幻觉问题上,思路是确保response中的知识不由模型内部产生,对应的做法是把会导致模型进行记忆的response删掉。(但是这个具体标准是什么,有没有了解的朋友说下看法?)
  • 在缓解生成重复的问题上,则是直接把response中包含重复的部分都重写了。(核心还是洗数据,一条条打磨)
  • 数据多样性很重要,因此参考《#instag: Instruction tagging for analyzing supervised fine-tuning of large language models》建立了一个打标系统,并设计一个注重多样性的采样算法,平衡了各个领域数据的分布。
  • 为了找到最佳数据配比,参考《How abilities in large language models are affected by supervised fine-tuning data composition》,使用近似网格搜索(approximate grid search),对每个领域以 {1, 1/2, 1/4, 1/8, 1/16, 1/32, 1/64} 比例进行实验和人工测评,找到最佳的组合方式。
  • 除了内容,数据格式对效果也有很大影响。参考 OpenAI 的 ChatML 格式,这种结构化的格式使模型能够区分各种信息类型,如 system prompt、user input 和 bot response。

数据构造过程中

  • 采用WizardLM方法获取难度较高的提示数据集,采用LIMA中的回复风格(总-分-总)对生成的回复内容格式化,采用“Step-Back”模式对思维链数据格式化。
  • 同时为了减少幻觉和重复,检查并确保回复中的知识不包含在模型中,消除可能导致模型死记硬背的回复,并重写回复保证微调多轮时数据不重复。

同时

  • 为了确保模型能力覆盖范围,微调数据中涉及多种任务,例如:问答、创意写作、对话、推理、数学、编码、双语能力等。
  • 为了增加模型的精细控制能力,设计了一套系统指令,通过多样性的采样算法平衡各种系统指令上的数据分布,增强跨任务的鲁棒性。
  • 为了探索不同任务数据比例,对模型最终能力的影响,通过网格搜索方法,确定最终数据混合比例。

最后,微调数据采用ChatML格式,让模型可以更好地区分输入中各类型信息,例如:系统指令、用户输入、模型回复。

模型结构

涉及 分词器、模型结构及微调参数

分词器

Tokenizer 采用 sentencepiece 中的 BPE 方法对预训练数据训练得来;为平衡计算效率和词理解能力,将词表设置为64000,将数字拆分为单个digit,将罕见字符降级为unicode编码。

tokenizer

  • 用 BPE,词表大小为64000,平衡了计算效率和表达能力;
  • 其中数字全是单个的digit,让模型能更好地理解数字数据;
  • 对于OOV的词,会降级用unicode编码 ;
  • 保留全角标点符号,不转为半角;

另外,以英语为主的LLM的tokenizer通常会使用虚拟前缀(在文本开头加空格)来泛化同一单词在句子不同位置的形式。Yi不这么做:即使在英语语境中,这一假设也并不总是成立(比如以引号开头的句子),而且在中文语境中这么做没有明显收益(下方给出一个训练分词器的示意)。
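
下面给出用 SentencePiece 训练 BPE 分词器的一个示意(语料路径、character_coverage 等为假设值;split_digits、byte_fallback、add_dummy_prefix 大致对应“数字拆成单个 digit”、“罕见字符降级编码”、“不加虚拟前缀”这几点,具体实现以官方为准):

```python
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="pretrain_corpus.txt",   # 假设的语料文件
    model_prefix="yi_tokenizer",
    model_type="bpe",
    vocab_size=64000,              # 平衡计算效率和表达能力
    split_digits=True,             # 数字拆成单个 digit
    byte_fallback=True,            # OOV 字符回退为字节编码
    add_dummy_prefix=False,        # 不在句首加虚拟空格前缀
    character_coverage=0.9995,     # 中英混合语料常用的覆盖率(示例值)
)

sp = spm.SentencePieceProcessor(model_file="yi_tokenizer.model")
print(sp.encode("2024年共 3.1T token", out_type=str))
```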

模型

模型为 Transformer-Decoder 结构,基于标准 LLaMA2 架构,修改如下:

  • 注意力机制:LLAMA2只在70B用了GQA,Yi全系列都用了GQA
    • Yi-6B和34B版本均采用 Grouped-Query Attention(GQA),Llama2 中仅70B版本采用GQA。
  • 激活函数:Yi采用SwiGLU作为后注意力层的激活函数。
    • 参考《GLU Variants Improve Transformer》
  • 位置编码:Yi模型采用旋转位置编码(RoPE),为了支持200K上下文窗口,调整了基础频率(RoPE ABF)。
    • 参考 RoPE ABF(《Effective long-context scaling of foundation models》),base扩大到10M,用于支持长上下文。

模型微调阶段

  • 仅计算回复内容的损失,不考虑系统指令和用户指令。
  • 采用AdamW优化器,其中β1、β2和ϵ分别为0.9、0.999和1e−8。
  • 训练数据最大长度为4096,批量大小为64,训练300步,学习率恒定为1e−5,权重衰减为0.1,梯度裁剪最大阈值为1.0,并采用NEFTune方式训练,Yi-34B-Chat和Yi-6B-Chat噪声尺度分别为45和5。

训练

Infra

从数据处理到模型训练都需要大集群大算力的支持。

Yi构建了支持全栈数据处理、预训练、微调和服务的基础设施。包括:

  • (1) 自动管理和监控计算资源的能力;
  • (2) 通过优化并行策略、内核效率和长上下文支持提高训练速度;
  • (3) 统一微调框架,支持异构分布式训练后端,例如在DPO中同时使用Megatron和DeepSpeed进行多个模型的训练;
  • (4) 通过各种LLM服务加速技术(如量化、continuous batching 和 paged attention)降低部署成本。

这部分工作还是很多的,比如

  • 由于硬件经常出故障,故障硬件会被自动从资源池移除;
  • 任务失败时,会自动跟踪重启;
  • 给算法人员开发易用的UI等。

预训练

预训练 pretrain

  • 训练了4K上下文长度的基础模型(报告暂未给出更多细节)。

微调

微调超参如下

AdamW, beta = [0.9, 0.999], epsilon = 1e-8
seq_len = 4096
batch size = 64
constant lr = 1e-5
weight decay = 0.1
gradient clip = 1.0
max step = 300

参考

  • 《Neftune: Noisy embeddings improve instruction finetuning》
  • 对于6B模型 noise scale = 5,对于34B模型 noise scale = 45
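
NEFTune 的核心是给输入 embedding 加均匀噪声,幅度为 α/√(seq_len·dim)。下面是一个示意实现(张量尺寸为示例取值):

```python
import torch

def neftune_noise(embeds: torch.Tensor, noise_alpha: float) -> torch.Tensor:
    """训练时对输入 embedding 加均匀噪声,幅度为 alpha / sqrt(seq_len * dim);推理时不加。"""
    _, seq_len, dim = embeds.shape
    scale = noise_alpha / (seq_len * dim) ** 0.5
    noise = torch.empty_like(embeds).uniform_(-scale, scale)
    return embeds + noise

embeds = torch.randn(2, 1024, 4096)
noisy = neftune_noise(embeds, noise_alpha=45)   # 34B 用 45,6B 用 5
```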

评测

基模型评测

基础能力评测

对其他开源模型,按照其公开的设置获取结果;Yi使用贪婪解码,没有进行任何后处理。

  • 数学和代码能力上,和GPT3.5、GPT4还存在一些差距,而这些能力是可以通过继续预训练和微调来持续提升的。Yi最初的设计并没有针对这些能力,因此没有在预训练数据中包含特别多相关数据,后续会有计划增加这部分能力的提升。
  • 而和其他开源模型相比,在代码和数学以外的任务,Yi基本上做到了跟大一倍模型的效果相近,甚至更好的水平。

观察

  • 模型规模带来的增益:尽管Yi-34B和Yi-6B使用了相同的预训练语料,但Yi-34B的性能相比Yi-6B有了质的提升。
    • 更大的模型尺寸在代码和数学基准测试上带来了明显的增益。
  • 数据质量:高质量预训练数据的小型模型,如Yi-34B或Qwen-14B,通常表现优于尺寸更大但(可能)数据质量较低的模型,例如Falcon-180B。
  • GPT-4与开源LLM差距:
    • 开源LLM在多种基准测试上的性能仍然落后于GPT-4和GPT-3.5。
    • 然而,具有代表性的双语LLM,例如Qwen-14B和Yi-34B,在包括C-Eval、CMMLU和Gaokao在内的中文知识相关基准测试上匹配甚至超过GPT-4的性能。然而,在BBH、代码(HumanEval)和数学(MATH)等推理相关基准测试上,仍然存在巨大差距。

In-Context Learning 能力

Yi进一步研究了in-context learning能力,即根据少数展示的输入-输出示例,推断underlying function的能力。

任务是推断加权和的线性系数

  • 定义 y = w1x1 + w2x2 + ... + wnxn

少量示例展示是 x1, x2, …, xn, y,要求模型预测给定一组新输入 x 的 y。

这就要求模型隐式地推断出 w1, w2, …, wn。

评测上,使用(a)模型预测的 y 与真实值 y∗ 之间的绝对差,即 |y − y∗| 作为连续度量,以及使用(b)精确匹配 y == y∗ 作为不连续度量。

模型在算术上的效果正常,因此可以认为这样的测试不受算术能力的影响,而能直接看模型是否具备根据给定的实例进行underlying function推理的能力。
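
下面按上述定义构造加权和 in-context learning 测试样本,并给出两种度量的计算方式(示例数量与取值范围均为假设):

```python
import random

def make_icl_example(weights, n_shots=32, low=-10, high=10):
    """构造若干 (x1..xn, y) 示例,让模型隐式推断线性系数 w,再预测新输入的 y。"""
    lines = []
    for _ in range(n_shots):
        xs = [random.randint(low, high) for _ in weights]
        lines.append(", ".join(map(str, xs)) + f" -> {sum(w * x for w, x in zip(weights, xs))}")
    query = [random.randint(low, high) for _ in weights]
    answer = sum(w * x for w, x in zip(weights, query))
    prompt = "\n".join(lines) + "\n" + ", ".join(map(str, query)) + " -> "
    return prompt, answer

def score(pred: float, target: float):
    return abs(pred - target), int(pred == target)   # (a) 绝对差;(b) 精确匹配

prompt, answer = make_icl_example([1, -1])   # 简单情形:系数 [1, -1]
print(prompt.splitlines()[-1], "目标:", answer)
```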

实验发现,当问题比较简单时(系数是[1,-1]),Yi-34B和LLAMA-70B效果比较好(看下图)。

当问题更复杂一点(系数是[1,1,1,1,1]),只有LLAMA-70B和Mixtral 8x7B这样的大模型表现出了涌现的能力。

Chat 模型评测

自动评测

  • 评测任务和base模型相同,分别采用zero-shot和few-shot,效果依然不错

报告强调,如Goodhart’s principle所说

  • 当一个指标变成目标,就不再是一个好指标。
  • 因此这里的测试只是为了确认微调没有使得模型的知识能力下降,而不会专门去针对任务做优化。

结果上,Yi-34B-Chat数学能力不错,而Yi-6B-Chat并没有展现出强大的数学能力。推测较小的模型可能需要更多的数据在SFT阶段激活其相应的能力。

人工评测

能力扩展

上下文扩展

扩展模型上下文长度

对于长上下文,采用继续预训练和微调两种方法来解决:

  • 基础模型本身已经具备利用200K输入上下文中任意位置信息的潜力,继续预训练可以“解锁”这种能力,再通过微调进一步调整生成内容的风格,以更好地遵循人类指令和偏好。

预训练阶段:

  • 采用序列并行和分布式注意力的方式,对模型的全长注意力进行“蛮力”训练。

数据来源:

  • (1)原始预训练数据;
  • (2)长上下文数据,主要来自书籍;
  • (3)多文档问答合成数据。共计对5B Token的数据进行训练,批次大小为4M Token。

微调阶段:

  • 将短SFT数据与长上下文文档问答数据混合使用。文档问答数据由模型辅助构建:随机将多个文档拼成一个长文档,从中抽取一个或多个段落,要求模型基于抽取的段落内容构建问答对。
  • Trick:要求模型在给出答案之前先背诵或改写原始段落。这种数据格式鼓励模型进行检索,从而减少依赖自身知识回答所产生的幻觉。

多模态

ViT部分由CLIP ViT-H/14 model初始化,后面的transformer由Yi-Chat初始化

3步训练:

  • (1)使用224^2的图像来训练ViT和projection模块的参数。这一训练利用了包含1亿个图像-文本对的数据集,这些数据来自LAION-400M。主要目标是增强ViT在架构中的知识获取能力,并实现ViT与LLM之间更好的对齐。
  • (2)将ViT图像分辨率提升到448^2,目的是进一步推动模型识别复杂视觉细节的能力。在这个阶段使用的数据集包括从LAION-400M中提取的2000万个图像-文本对。此外,还融入了来自不同来源的大约480万个图像-文本对,例如CLLaVA、LLaVAR、Flickr、VQAv2、RefCOCO、Visual7w等。
  • (3)整个模型的参数一起训练。主要目标是提高模型在多模态聊天交互方面的熟练度,从而赋予它能够无缝融合和解释视觉与语言输入的能力。为此,训练数据集涵盖了多种来源,总共大约有100万张图像-文本对,包括GQA、VizWiz VQA、TextCaps、OCR-VQA、Visual Genome、ShareGPT4V等等。为了确保数据平衡,对任何单一来源的最大数据量设定了上限,将其限制在不超过50,000对。

使用128张A100,6B训了3天,34B训10天。

扩展模型深度 Depth Upscaling

目标是把32层的6B扩展到48层的9B模型。

  • 参考《Scaling large language models with simple yet effective depth up-scaling》,通过复制中间的12-28层共16层,把层数扩展为48层。

参考SOLAR 10.7B模型对Yi-6B模型进行深度扩展,将原来的32层扩展到48层,构建Yi-9B模型。具体层的选择通过评估每一层输入和输出之间的余弦相似度得出:余弦相似度越接近1,表明复制这些层不会显著改变原始模型输出的logits,因此选择复制原始模型中间第12-28层共16层(见下方示意)。
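
下面是“按层间余弦相似度选层 + 复制中间 16 层”的一个示意(层的调用接口与插入位置为简化假设,仅演示思路,并非 Yi 官方实现):

```python
import copy
import torch

@torch.no_grad()
def layer_io_cosine(layers, hidden):
    """逐层计算输入/输出隐状态的余弦相似度;越接近 1,复制该层对输出 logits 的影响越小。"""
    sims = []
    for layer in layers:
        out = layer(hidden)                       # 假设每层接口为 hidden -> hidden
        sims.append(torch.cosine_similarity(hidden.flatten(1), out.flatten(1)).mean().item())
        hidden = out
    return sims

def depth_upscale(layers, start=12, end=28):
    """复制第 start..end 层(共 16 层)并接在其后,把 32 层扩展为 48 层。"""
    duplicated = [copy.deepcopy(l) for l in layers[start:end]]
    return list(layers[:end]) + duplicated + list(layers[end:])
```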

采用两阶段训练

  • 第一阶段使用了0.4T数据(包含文本和代码),数据配比与Yi-6B模型一样;
  • 第二阶段使用了0.4T数据(包含文本、代码和数学),重点增加了代码与数学数据的比例,以提高代码性能。

在微调过程中

  • 设定了一个固定的学习率 3e-5,并采取逐步增加 batch size 的策略,即从 batch size 4M token 开始,每当模型 loss 停止下降时就增加 batch size,使 loss 继续下降,让模型学习更加充分,收敛性能更好。

【2024-4-22】MiniCPM

【2024-4-22】MiniCPM:揭示端侧大语言模型的无限潜力

MiniCPM 是一系列端侧语言大模型,主体语言模型 MiniCPM-2B 具有2.4B的非词嵌入参数量。

  • 综合性榜单上与Mistral-7B相近(中文、数学、代码能力更优),整体性能超越Llama2-13B、MPT-30B、Falcon-40B等模型。
  • 当前最接近用户体感的榜单MTBench上,MiniCPM-2B也超越了Llama2-70B-Chat、Vicuna-33B、Mistral-7B-Instruct-v0.1、Zephyr-7B-alpha等众多代表性开源大模型。

超参调优

从 Hyper-parameters、Batch size、Learning Rate、Learning Rate Scheduler、Data Strategy 五个方面进行模型沙盒研究。

最优超参数通过在0.009B模型规模上进行的近400次贝叶斯参数搜索得到。

超参数对模型的性能具有重大影响

  • 传统训练方法要对每个模型进行超参数调整,这对于大模型并不现实。

借鉴 µP 方法,对模型各参数模块之间的连接权重以及模型初始化进行了调整,部分调整接近Cerebras-GPT。

| 名称 | 具体操作 |
| --- | --- |
| Embedding Output Scaling | 将Embedding的输出乘12 |
| Residual Connection Scaling | 将每层残差连接处的增量缩放为 1.4/sqrt(num_layers) = 0.22 倍 |
| Initialization of Tensors | 将每个二维张量参数的初始化标准差设置为 0.1/sqrt(dim_model/256) = 0.033,其他参数初始化设置为0.1 |
| Learning Rate Scaling of Tensors | 将每个二维张量参数的学习率调整为其他部分学习率(或称整体学习率)的 1/(dim_model/256) = 0.11 倍 |
| lm_head Scaling | 将输出logits调整为原来的0.11倍 |
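
按上表公式代入常见的 MiniCPM-2B 结构设定(dim_model=2304、num_layers=40,均为示例假设)即可得到表中数值,示意如下:

```python
import torch.nn as nn

dim_model, num_layers, base_lr = 2304, 40, 0.01   # 结构与整体学习率取值为示例假设

emb_scale = 12                                    # Embedding 输出乘 12
residual_scale = 1.4 / num_layers ** 0.5          # ≈ 0.22
init_std = 0.1 / (dim_model / 256) ** 0.5         # ≈ 0.033
matrix_lr = base_lr / (dim_model / 256)           # 二维张量学习率 ≈ 整体学习率的 0.11 倍
logits_scale = 1 / (dim_model / 256)              # lm_head 输出缩放 ≈ 0.11

linear = nn.Linear(dim_model, dim_model, bias=False)
nn.init.normal_(linear.weight, mean=0.0, std=init_std)   # 二维张量按缩放后的 std 初始化
print(round(residual_scale, 3), round(init_std, 3), round(logits_scale, 3))
```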

batch size

Batchsize 与损失的关系: 更大的Batchsize可能达到更低的loss

  • 扩大Batchsize 时, 损失会有一次较大幅度的下降

2020年, OpenAI 的开山之作研究了损失函数随token数变化的规律,并假设:消耗更多的步数等价于消耗更多的时间。

在这种假设下,OpenAI定义了临界Batchsize(Critical Batchsize),使得达到一定的损失,既不消耗过多step,也不消耗过多token。

然而利用当前以A100为主的计算资源,结合gradient checkpointing策略进行训练时,通常计算速度(而不是显存)是瓶颈

  • 相同机器数量下,多一倍 Batchsize 几乎等同于慢一倍的单步时间

基于这个观察,取消了对“不消耗过多step”的追求,而转向追求用最少的token量达到最低的loss。

0.009B,0.036B,0.17B的模型上分别进行了6个batchsize的训练实验

  • log(BS) = -6.24 * log(L) + 20.91

最优batchsize随着C4数据集上的loss的偏移规律

  • 规律: BS = 1.211 * 10^9 / L^6.2393
  • 预估: 2B模型达到C4损失2.5左右,4M是比较合适的Batchsize
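
按上面的拟合公式代入 loss=2.5 验证一下这个预估(仅为公式计算):

```python
def optimal_batch_size_tokens(loss: float) -> float:
    """MiniCPM 拟合的规律:BS = 1.211e9 / L^6.2393(单位:token)。"""
    return 1.211e9 / loss ** 6.2393

print(f"{optimal_batch_size_tokens(2.5):.2e}")   # ≈ 4.0e+06,即 loss≈2.5 时约 4M token
```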

learning rate

模型最关键超参数:学习率

lr 不会因为模型规模扩大有大幅度的改变

0.04B, 0.1B, 0.3B, 0.5B 上分别做了6组学习率实验,发现虽然模型大小扩大了10倍,但是最优学习率偏移并不明显,均在0.01左右

  • 在 2.1B 规模上进行了简单验证,发现在 0.01 的学习率确实能取得最低的Loss。

lr 调度策略

不同训练阶段使用不同学习率的调整策略,对模型性能影响很关键

当前通用的学习率策略是Cosine曲线,即学习率从Warmup阶段升高到最高点之后,按余弦函数逐渐降低。

  • 几乎所有大模型都使用了 Cosine Learning Rate Scheduler (简称Cosine LRS)的方式。

为什么 Cosine Scheduler 表现优异?

对0.036B的模型,设置不同的Learning Rate Scheduler的截止步数$T$,进行了持续训练。

  • 对于训练至 S 步的模型,将 Cosine LRS 截止步数 T 设置为 S 步, 总是能获得最优的性能,而设置为更多或者更少性能都不是最优。

持续训练场景会发现 Cosine调度器有问题。

  • 如果在Cosine的截止步数之后, 继续沿用0.1倍的最大学习率(通常做法),则继续训练收敛非常缓慢
  • 如果在Cosine的截止步数之后, 重启Cosine LRS(即再次从最大学习率开始下降,或者是逐渐上升到最大学习率,再开始下降)则损失会经历长时间的上升周期,而这段时间,模型处于不可用状态

猜想 Cosine LRS 在预先指定步数的时候性能优异原因:

  1. T=S下的Cosine LRS,相对于Linear LRS、Noam LRS、以及T<S的Cosine LRS,有更长时间的大学习率训练。这一阶段可能有助于模型寻找更好的全局最优解。
  2. T=S下的Cosine LRS ,相对于T>S的Cosine LRS、Constant LRS,有更充分的学习率下降的退火阶段,这一阶段可能发生了较为特别的动力学现象,导致模型可以找到更好的局部最优解。

结合这两点,提出了一种新的学习率调度策略:Warmup-Stable-Decay(WSD)调度器(示意实现见下文)。

  • 公式见原文
  • Cosine调度器结束后, 需要持续保持最低学习率,以保证loss不上升
  • 而WSD调度器则从退火(decay)前开始继续用最大学习率训练,经过更长的训练再开始退火

这种学习率调度器分为三个阶段:

  • warmup阶段(用W表示warmup阶段结束时的步数/训练量)
  • 稳定训练阶段(用S表示稳定训练阶段结束时的步数/训练量)
  • 退火阶段(用D表示退火阶段的训练量)
    • 随着学习率的变小,损失有大幅度的快速下降,在步数S时迅速降低至和T=S的Cosine LRS相等或更低
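
下面是 WSD 调度器的一个示意实现(退火阶段的具体函数形式见原文,这里用线性退火代替;各阶段步数与最低学习率比例均为示例取值):

```python
def wsd_lr(step: int, max_lr: float, warmup: int, stable: int, decay: int,
           min_ratio: float = 0.1) -> float:
    """Warmup-Stable-Decay:线性升到 max_lr -> 保持 max_lr -> 退火到 min_ratio*max_lr。"""
    if step < warmup:                                   # W:warmup 阶段
        return max_lr * step / max(1, warmup)
    if step < warmup + stable:                          # S:稳定训练阶段,持续用最大学习率
        return max_lr
    progress = min(1.0, (step - warmup - stable) / max(1, decay))
    return max_lr * (1 - (1 - min_ratio) * progress)    # D:退火阶段(此处用线性示意)

for s in (0, 1000, 30000, 65000, 68000):                # W=2000, S=60000, D=6000(示例)
    print(s, round(wsd_lr(s, 0.01, 2000, 60000, 6000), 5))
```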

WSD好处:

  1. 可以持续训练。
  2. 可以随时取出。
  3. 性能优于Cosine LRS。
  4. 有显式区分的训练阶段,便于使用不同的数据策略。

数据策略

结合训练阶段特点,使用不同类型的数据

  • 预训练阶段: 只用通用、量大的预训练粗质量数据
  • 退火阶段: 用非常广泛的高质量知识和能力数据以及SFT的高质量数据,混合入预训练数据进行退火。

实验结果

  • 退火开始时加入高质量数据的收益远高于在退火完成后的sft阶段加入。

因此, 建议模型能力的特化和增强应从退火阶段开始进行。

【2024-5-7】DeepSeek

深度求索(DeepSeek)顶尖人才招聘

DeepSeek 介绍

揭秘DeepSeek:一个更极致的中国技术理想主义故事

中国7家大模型创业公司中,DeepSeek(深度求索)最不声不响,但又总能以出其不意的方式被人记住。

  • 一年前,这种出其不意源自背后的量化私募巨头幻方,大厂外唯一储备万张A100芯片的公司
  • 一年后,则来自引发中国大模型价格战的源头。

2023年5月,DeepSeek (深度求索) 成立

在被AI新闻连续轰炸的5月,DeepSeek一跃成名。起因是其发布的一款名为DeepSeek V2的开源模型,提供了一种史无前例的性价比:

  • 推理成本被降到每百万token仅1块钱,约为 Llama3 70B 的 1/7、GPT-4 Turbo 的 1/70。

DeepSeek 被迅速冠以“AI界拼多多”之称的同时,字节、腾讯、百度、阿里等大厂也按捺不住,纷纷降价。中国大模型价格战由此一触即发。

成见:

  • 美国更擅长从0-1的技术创新,而中国更擅长从1-10的应用创新。

事实:

与很多大厂烧钱补贴不同,DeepSeek 有利润

DeepSeek 对模型架构进行了全方位创新

  • 提出一种崭新的MLA(一种多头潜在注意力机制)架构,把显存占用降到了过去最常用的MHA架构的5%-13%
  • 独创 DeepSeekMoESparse 结构,把计算量降到极致,所有这些最终促成了成本的下降。

OpenAI前政策主管、Anthropic联合创始人Jack Clark:

  • DeepSeek “雇佣了一批高深莫测的奇才”,还认为中国制造的大模型,“将和无人机、电动汽车一样,成为不容忽视的力量。”

梁文锋:

  • 并没有什么高深莫测的奇才,都是一些Top高校的应届毕业生、没毕业的博四、博五实习生,还有一些毕业才几年的年轻人,都是本土人才 —— 例如达摩院背景的罗福莉,参考 罗福莉:天才AI少女“祛魅”记
    • 保研北大、在顶会顶刊发文章、拿遍大厂offer、进入阿里达摩院、转行跳槽知名私募公司…
    • 2019年,一位北大硕士,因在NLP国际顶会 ACL 上发表 8 篇论文(其中2篇一作),曾登上知乎热搜
    • 在达摩院,罗福莉主导开发的跨语言预训练模型VECO,成为深度语言模型体系AliceMind八大模型之一,并被顶会ACL2021录用,她也在AliceMind集体开源中挑起大梁。AliceMind 登顶多模态权威榜单VQA Challenge 2021,并在阿里内部数十个核心业务落地,日均调用50亿次,活跃场景超过200个,其中不乏大家熟悉的天猫精灵智能音响等。
  • 选人标准: 一直都是热爱和好奇心,所以很多人会有一些奇特的经历。对做研究的渴望,远超对钱的在意。
  • 对顶级人才吸引最大的,肯定是去解决世界上最难的问题。其实,顶尖人才在中国是被低估的。因为整个社会层面的硬核创新太少了,使得他们没有机会被识别出来。

Attention 架构提出多年来,几乎未被成功改过,更遑论大规模验证;对模型结构进行创新,没有路径可依,要经历很多失败,时间、经济成本都耗费巨大。

而 DeepSeek 成功了,它是 7家中国大模型创业公司中,唯一一家放弃“既要又要”路线,至今专注研究和技术,未做toC应用的公司,也是唯一一家未全面考虑商业化,坚定选择开源路线甚至都没融过资的公司。

  • 公司 60 个人, 50 个技术, 10 个工程

DeepSeek创始人梁文锋,浙江大学电子工程系人工智能方向出身,是从幻方时代就在幕后潜心研究技术的80后;在 DeepSeek 时代,他依旧延续低调作风,和所有研究员一样,每天“看论文,写代码,参与小组讨论”。

梁文锋是当下中国AI界非常罕见的人:

  • “兼具强大的infra工程能力和模型研究能力,又能调动资源”;
  • “既可以从高处做精准判断,又可以在细节上强过一线研究员”。他拥有“令人恐怖的学习能力”,同时又“完全不像一个老板,而更像一个极客”。

他是少有把“是非观”置于“利害观”之前,并提醒看到时代惯性,把“原创式创新”提上日程的人。

DeepSeek V2

【2024-5-7】DeepSeek-V2 全球最强开源通用MoE模型

  • DeepSeek-V2 基于 2 千亿 MoE 模型底座,领先性能,超低价格,越级场景体验,已在对话官网和API全面上线
  • 技术报告: 浅读 DeepSeek-V2 技术报告
  • 仓库和技术报告地址:DeepSeek-V2

DeepSeek-V2 在 DeepSeek 上改进,但并没有沿用主流的“类LLaMA的Dense结构”和“类Mistral的Sparse结构”,而是对Transformer架构中的自注意力机制进行了全方位创新,提出了MLA(Multi-head Latent Attention)结构,并使用了MoE技术进一步将计算量降低,大幅提高了推理效率。

特点

  • 独创 MLA 结构
  • 稀疏结构 DeepSeek-MoE
  • 推理成本降低近百倍
  • LMSYS榜单中,位列开源模型第一

DeepSeek-V2 包含 236B 总参数,每个Token激活21B参数,支持长达 128K 的上下文长度。

  • 与DeepSeek 67B相比,DeepSeek-V2 在性能上取得了显著提升,节省了42.5%的训练成本,减少了93.3%的KV缓存,并将最大生成吞吐量提高到了5.76倍。

深度求索已将 DeepSeek-V2 模型完全上线至平台服务用户,API 也物美价廉;并且秉持最开放的开源精神,将 DeepSeek-V2 模型和论文完全开源,免费商用。

模型结构


DeepSeek Coder

2023年11月,DeepSeek Coder V1发布

2024年6月,DeepSeek Coder V2 全球最强代码开源模型

  • 全球首个超越 GPT4-Turbo 的开源代码模型
  • BigCodeBench 6月榜单中第二

DeepSeek VL

自然语言到多模态初探

DeepSeek R1

【2024-11-20】DeepSeek R1 详见站内专题: o1

训练经验

OOM

【2024-4-11】 OOM

  • 单机单卡(V100S,32G)
  • InternLM2-1.8B, 7.1G
  • 数据集: 231m

报错

torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 7.04 GiB. GPU 0 has a total capacty of 31.75 GiB of which 5.04 GiB is free. Process 743134 has 26.71 GiB memory in use. Of the allocated memory 25.01 GiB is allocated by PyTorch, and 342.98 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

deepspeed 配置

deepspeed --master_port 30001 ./llm/training/conversation_reward/main.py \
   --max_seq_len 2048 \
   --per_device_train_batch_size 2 \
   --per_device_eval_batch_size 2 \
   --weight_decay 0.01 \
   --dropout 0.0 \
   --gradient_accumulation_steps 1 \
   --zero_stage 2 \
   --dtype bf16 \
   --num_train_epochs 10 \
   --train_data_path /mnt/bn/flow-algo-cn/wangqiwen/session_process/data/train/cut_train_sequence_en_20240331.csv \
   --val_data_path /mnt/bn/flow-algo-cn/wangqiwen/session_process/data/test/cut_test_0322_es_sequence_v2.csv \
   --test_data_path /mnt/bn/flow-algo-cn/wangqiwen/session_process/data/test/cut_test_0322_en_sequence_v2.csv \
   --model_name_or_path /mnt/bn/flow-algo-cn/yufeng/ModelHub/internlm2-1_8b \
   --output_dir /mnt/bn/flow-algo-cn/wangqiwen/model/checkpoints \
   --debug \
   --deepspeed

解决

  • 设置GPU显存分配的碎片化参数(max_split_size_mb)→ 无效
  • 改用 A100(80G)→ 有效

PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:32  # 通过环境变量设置,本例中无效
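
报错信息里提到的 max_split_size_mb 需要通过环境变量 PYTORCH_CUDA_ALLOC_CONF 设置,且要在首次 CUDA 显存分配之前生效,示意如下(本例中该方法未能解决问题,最终靠更大显存解决):

```python
import os
# 必须在第一次 CUDA 显存分配(通常是 import torch 后创建 CUDA 张量)之前设置
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:32"

import torch  # 之后再进行显存分配
```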

结束

